Welcome to this collection of Python notebooks and tools for building AI-powered agents that interact with OpenShift and virtualization environments. These agents can:
- List virtual machines (VMs)
- Create migration plans
- Interact with OpenShift
- Leverage Large Language Models (LLMs) for decision-making
The notebooks demonstrate how to implement AI agents using models like Llama 3.1 and orchestrate workflows using LangChain and LangGraph. They integrate with OpenShift and virtualization platforms, and use SQLite for state management.
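To make the SQLite state management concrete, here is a minimal sketch of how agent state might be checkpointed per conversation thread. It uses only the standard library; the table name and columns are illustrative assumptions, not the actual schema used by LangGraph or this repository.

```python
import json
import sqlite3

class StateStore:
    """Toy checkpoint store: one JSON state blob per (thread, step)."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS checkpoints "
            "(thread_id TEXT, step INTEGER, state TEXT, "
            "PRIMARY KEY (thread_id, step))"
        )

    def save(self, thread_id, step, state):
        # Serialize the state dict and upsert it for this step.
        self.conn.execute(
            "INSERT OR REPLACE INTO checkpoints VALUES (?, ?, ?)",
            (thread_id, step, json.dumps(state)),
        )
        self.conn.commit()

    def latest(self, thread_id):
        # Return the most recent checkpoint for a thread, if any.
        row = self.conn.execute(
            "SELECT state FROM checkpoints WHERE thread_id = ? "
            "ORDER BY step DESC LIMIT 1",
            (thread_id,),
        ).fetchone()
        return json.loads(row[0]) if row else None

store = StateStore()
store.save("demo", 1, {"messages": ["list VMs"]})
store.save("demo", 2, {"messages": ["list VMs", "found 3 VMs"]})
print(store.latest("demo"))  # latest checkpoint for thread "demo"
```

Persisting each step like this is what lets an agent workflow resume after an interruption instead of restarting from scratch.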
To get the most out of these notebooks, please ensure you have the following installed and set up:
- Ollama: A local LLM server for running language models. Download it from ollama.com/download.
- LLM Models: We use `llama3.1:latest` and `llama3.1:8b-instruct-fp16`. You can download them in either of two ways:
  - Direct download from Meta: visit llama.com/llama-downloads.
  - Using Ollama: run `ollama pull llama3.1:latest` and `ollama pull llama3.1:8b-instruct-fp16`.
Learn about Large Language Models using Llama 3.1. Understand how LLMs can be used for tasks like question answering and text generation.
Discover how LLMs can be extended using tools to solve tasks requiring real-time information or specialized capabilities.
Explore the ReAct (Reasoning + Acting) prompting framework. See how ReAct enables models to reason through problems, take actions, and adjust based on observations, creating a dynamic problem-solving loop.
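The ReAct loop described above can be sketched in a few lines. The "model" here is a scripted stand-in that emits Thought/Action/Observation steps deterministically; in the notebooks, those steps would come from Llama 3.1, and the tool names below (`list_vms`) are illustrative, not this repo's actual tools.

```python
def list_vms():
    # Stubbed tool; a real agent would query vSphere or OpenShift.
    return ["web-01", "db-01"]

TOOLS = {"list_vms": list_vms}

def scripted_model(transcript):
    # Stand-in for the LLM: picks the next step from the transcript.
    if "Observation:" not in transcript:
        return "Thought: I need the VM inventory.\nAction: list_vms"
    return "Final Answer: there are 2 VMs."

def react(question, max_steps=3):
    transcript = f"Question: {question}"
    for _ in range(max_steps):
        step = scripted_model(transcript)
        transcript += "\n" + step
        if step.startswith("Final Answer:"):
            return step.removeprefix("Final Answer: ").strip()
        # Reason -> act: run the named tool, feed the result back in.
        action = step.rsplit("Action: ", 1)[1].strip()
        transcript += f"\nObservation: {TOOLS[action]()}"
    return None

print(react("How many VMs are there?"))  # → there are 2 VMs.
```

The key design point is the feedback loop: each tool result is appended as an Observation, so the next model call can reason over what actually happened rather than over assumptions.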
Set up an AI-powered agent. Learn how to initialize the agent, load configurations, and connect to external services like OpenShift and the language model.
Introduce the ReAct agent powered by Llama 3.1. See how the agent can execute multiple tasks, make decisions, and provide workflow feedback.
Learn how to orchestrate multiple agents using LangChain, allowing them to collaborate to achieve complex goals.
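As a rough sketch of that orchestration pattern, the snippet below routes a task to one of several specialist agents, in the spirit of a supervisor graph. The agents are plain functions standing in for LLM-backed agents, and the keyword routing is a deterministic placeholder for what an LLM router would decide; none of these names come from this repository.

```python
def vsphere_agent(task):
    # Would call the virtualization tools in the real notebooks.
    return f"vsphere: handled '{task}'"

def openshift_agent(task):
    # Would call the OpenShift tools in the real notebooks.
    return f"openshift: handled '{task}'"

AGENTS = {"vm": vsphere_agent, "pod": openshift_agent}

def supervisor(task):
    # A real supervisor would ask the LLM which specialist fits;
    # here we route on a keyword so the example is deterministic.
    key = "vm" if "vm" in task.lower() else "pod"
    return AGENTS[key](task)

print(supervisor("List all VMs"))       # routed to the vSphere agent
print(supervisor("Restart the pods"))   # routed to the OpenShift agent
```

Keeping the router separate from the specialists is what lets new agents be added without touching the existing ones.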
Explore how AI agents can plan tasks using structured processes and tools, focusing on task breakdown and execution.
See how agents interact with virtualization platforms like VMware vSphere. Agents can list VMs, retrieve VM details, and create migration plans.
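To illustrate the list-then-plan flow, here is a hypothetical shape for a migration plan built from a VM inventory. The field names and the powered-off filter are assumptions for illustration only, not this repo's actual schema or policy.

```python
from dataclasses import dataclass, field

@dataclass
class VM:
    name: str
    power_state: str  # e.g. "poweredOn" / "poweredOff" (vSphere-style)

@dataclass
class MigrationPlan:
    source: str
    target: str
    vms: list = field(default_factory=list)

    def add(self, vm):
        # Simplification: only queue powered-off VMs; a real plan
        # would also support warm migration of running VMs.
        if vm.power_state == "poweredOff":
            self.vms.append(vm.name)

inventory = [VM("web-01", "poweredOn"), VM("db-01", "poweredOff")]
plan = MigrationPlan(source="vSphere", target="OpenShift")
for vm in inventory:
    plan.add(vm)
print(plan.vms)  # names of VMs queued for migration
```

An agent would produce a structure like this as its output, so downstream tools (or a human reviewer) get a machine-readable plan rather than free text.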
Discover how AI agents interact with OpenShift to manage resources such as pods, deployments, and nodes. Learn to use the OpenShift agent for handling workflows.
Build a migration workflow agent that integrates virtualization environments with OpenShift end to end.
The following directories contain code and configurations for the AI agents, services, state management, and utilities:
agent
prompt
schemas
services
state
utils
Contributions are welcome! If you have suggestions or improvements, please open an issue or submit a pull request.
This project is licensed under the MIT License.
Special thanks to the developers and community behind LangChain, LangGraph, and LLM models.
For questions or comments, please reach out via email: [email protected]