Generative AI With AI Agents & MCP For Developers - AD-TEAM - 11-12-2025

Generative AI With AI Agents & MCP For Developers
Published 5/2025
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
Language: English | Size: 24.78 GB | Duration: 22h 30m

Master Generative AI and the Model Context Protocol (MCP), and build cutting-edge AI Agent systems with Python & LLMs.

What you'll learn
- Understand the fundamentals of Generative AI and Large Language Models (LLMs)
- Design and build scalable Generative AI applications using Advanced Gen AI Application Architecture
- Master Retrieval-Augmented Generation (RAG) techniques for smarter applications
- Explore and leverage orchestration frameworks like LangChain and LlamaIndex
- Gain hands-on experience with LangChain Expression Language (LCEL) and its ecosystem
- Develop strong Prompt Engineering skills to optimize LLM outputs
- Build end-to-end Gen AI applications across multiple complexity levels (Beginner to Professional)
- Implement AI Agent and Multi-Agent systems for advanced automation
- Integrate Multimodal data (text, image, etc.) into Generative AI applications
- Learn LLMOps (Large Language Model Operations) for efficient deployment and management
- Deploy Generative AI applications to production using CI/CD pipelines
- Understand and implement the Model Context Protocol (MCP) for context-aware applications
- Fine-tune Large Language Models (LLMs) to fit custom project needs
- Work on real-world Generative AI projects to solidify practical knowledge

Requirements
- Basic understanding of Python programming
- Familiarity with fundamental concepts of machine learning (helpful but not mandatory)
- No prior experience with Generative AI or LLMs required
- Curiosity and willingness to learn cutting-edge AI technologies

Description
This hands-on course teaches you how to build professional-level Generative AI applications and intelligent, autonomous AI Agents using MCP (Model Context Protocol) and modern LLM frameworks. Whether you're an AI beginner or an experienced developer, this course takes you step by step through the tools, strategies, and architectures that power modern GenAI applications.

What You'll Learn:
- Introduction to Generative AI and its role in modern development
- Introduction to Large Language Models (LLMs) and how they power intelligent applications
- Generative AI Architecture Basics - understand the core components of a Gen AI application
- Advanced Gen AI Application Architecture for scalable and modular systems
- How to apply the Retrieval-Augmented Generation (RAG) technique for enhanced responses
- Choosing the Right Orchestration Framework for building LLM-powered apps
- LangChain - a modern framework for LLM orchestration
- LangChain Expression Language (LCEL) - build AI flows with clean, declarative syntax (see the short sketch after this list)
- Deep dive into the LangChain Ecosystem for agents, tools, memory, and chains
- Mastering Prompt Engineering - learn to craft optimal prompts for LLMs
- Level 1 Gen AI Applications - basic AI-powered tools and assistants
- LlamaIndex - an alternative to LangChain for RAG and LLM app orchestration
- LLMOps (Large Language Model Operations) - manage and monitor LLM apps
- Level 2 Gen AI Applications - build intermediate systems with memory, tools, and retrieval
- Develop Multimodal Gen AI Applications (text, image, audio integration)
- Build and deploy AI Agents & Multi-Agent Systems using orchestration frameworks
- Level 3 (Professional) Gen AI Applications - real-time, scalable, production-ready systems
- CI/CD for Gen AI - deploy your Gen AI apps with automated pipelines
- Understand and implement MCP (Model Context Protocol)
- Hands-on Projects - from AI assistants to autonomous agents and RAG-powered apps
- Fine-tuning LLMs for domain-specific use cases and better performance
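To make the LCEL item above concrete, here is a minimal sketch of an LCEL chain; it is not taken from the course. It assumes the langchain-core and langchain-openai packages and an OPENAI_API_KEY environment variable, and the prompt text and model name are purely illustrative.

```python
# Minimal LCEL sketch: prompt -> model -> output parser, composed with the | operator.
# Assumes: pip install langchain-core langchain-openai, and OPENAI_API_KEY in the environment.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

# Prompt template with a single input variable.
prompt = ChatPromptTemplate.from_template(
    "Explain {topic} to a developer in two sentences."
)

# Chat model wrapper (model name is illustrative).
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# Output parser that turns the chat message into a plain string.
parser = StrOutputParser()

# LCEL: the pipe operator composes runnables into a single chain.
chain = prompt | llm | parser

if __name__ == "__main__":
    print(chain.invoke({"topic": "Retrieval-Augmented Generation"}))
```

The pipe operator is the core LCEL idea: each stage is a runnable, and composing them yields another runnable that supports invoke, batch, and streaming.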
Overview

Section 1: Introduction to the Course
Lecture 1 Introduction to the Course & Content

Section 2: Introduction to Generative AI
Lecture 2 Introduction to Generative AI

Section 3: Introduction to Large Language Models (LLMs)
Lecture 3 Introduction to Large Language Models & Their Architecture
Lecture 4 In-Depth Intuition of the Transformer Architecture
Lecture 5 How LLMs Are Trained

Section 4: Introduction & Architecture of a Generative AI Application
Lecture 6 Basic Architecture Overview for Gen AI Applications
Lecture 7 Advanced Gen AI Application Architectures
Lecture 8 Multi-Level Architecture Exploration (Level 1, Level 2, Level 3)
Lecture 9 Preview of a Professional Gen AI Application

Section 5: LLMs & Frameworks for Generative AI
Lecture 10 Selecting the Right Foundation LLMs
Lecture 11 Comprehensive Tool Stack for Gen AI Applications
Lecture 12 Orchestration Frameworks for Scalable Solutions

Section 6: Retrieval-Augmented Generation (RAG) Technique
Lecture 13 Introduction to RAG and Key Concepts
Lecture 14 Important Concepts of RAG
Lecture 15 Core Components of RAG
Lecture 16 Addressing RAG Implementation Challenges

Section 7: Choosing Orchestration Frameworks for Application Development
Lecture 17 Choosing Orchestration Frameworks for Application Development

Section 8: LangChain - A Modern Orchestration Framework
Lecture 18 Overview of LangChain, Evolution, and Learning Path
Lecture 19 Connecting with Leading LLMs
Lecture 20 Prompt Templates for Integrating Logic into LLM Interactions
Lecture 21 Chains for Sequencing Instructions
Lecture 22 Output Parsers for Response Formatting
Lecture 23 Working with Custom Data (Data Loaders) & RAG Basic Concepts
Lecture 24 Different RAG Components
Lecture 25 Basic RAG Implementation with LCEL
Lecture 26 Memory Management in LangChain: Temporary and Permanent Memory

Section 9: LangChain Expression Language (LCEL)
Lecture 27 Introduction to LangChain Expression Language (LCEL) - Chains and Runnables
Lecture 28 Built-in Runnables in LCEL
Lecture 29 Built-in Functions in Runnables
Lecture 30 Combining LCEL Chains
Lecture 31 RAG Demo with LCEL

Section 10: LangChain Ecosystem
Lecture 32 Comprehensive Overview of the LangChain Ecosystem
Lecture 33 LangServe Demo
Lecture 34 LangGraph Demo
Lecture 35 LangSmith Demo

Section 11: Mastering Prompt Engineering
Lecture 36 Prompt Engineering

Section 12: Level 1 Application Development
Lecture 37 Introduction to Level 1 Applications
Lecture 38 Advanced Chatbot with Memory
Lecture 39 Key Data Extraction
Lecture 40 Sentiment Analysis Tool
Lecture 41 SQL-Based Question Answering Application
Lecture 42 PDF-Based Question Answering
Lecture 43 Basic Retriever Applications
Lecture 44 RAG Application

Section 13: Level 2 Application Development
Lecture 45 Introduction to Level 2 Applications
Lecture 46 Application for Converting Slang to Formal English
Lecture 47 Blog Post Generation Application
Lecture 48 Text Summarization with Split
Lecture 49 Text Summarization Tools
Lecture 50 Key Data Extraction from Product Reviews
Lecture 51 Interview Questions Creator Application
Lecture 52 Medical Chatbot Project

Section 14: LlamaIndex - An Alternative to LangChain (a minimal usage sketch follows this section)
Lecture 53 Introduction to LlamaIndex
Lecture 54 In-Depth Exploration of LlamaIndex
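As a companion to Section 14, here is a minimal LlamaIndex sketch; it is not from the course materials. It assumes the llama-index package, an OPENAI_API_KEY for the default embedding and LLM settings, and a local ./data folder of documents, all of which are illustrative assumptions.

```python
# Minimal LlamaIndex sketch: index local documents and query them (basic RAG).
# Assumes: pip install llama-index, OPENAI_API_KEY set, and a ./data folder with files.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load documents from disk.
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index over the documents.
index = VectorStoreIndex.from_documents(documents)

# The query engine retrieves relevant chunks and asks the LLM to synthesize an answer.
query_engine = index.as_query_engine()

if __name__ == "__main__":
    print(query_engine.query("What are the key points in these documents?"))
```

Compared with the LangChain/LCEL style shown earlier, LlamaIndex leans on higher-level, RAG-centric abstractions such as readers, indexes, and query engines.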
Section 15: Multimodal Gen AI Applications
Lecture 55 Overview of Multimodal LLM Applications
Lecture 56 Steps to Implement Multimodal LLM Applications
Lecture 57 Building Multimodal LLM Applications with LangChain & GPT-4o Vision

Section 16: Level 3 (Professional) Application Development
Lecture 58 Introduction to Level 3 Applications
Lecture 59 Project 1: Advanced RAG-Based Knowledge Management System
Lecture 60 Project 2: Medical Diagnostics Support Application

Section 17: Deploying Gen AI Applications with CI/CD for Production
Lecture 61 Complete CI/CD Deployment

Section 18: LLMOps - Large Language Model Operations
Lecture 62 What Is LLMOps?
Lecture 63 Why LLMOps Is Different from Traditional MLOps
Lecture 64 The Evolution from MLOps to LLMOps
Lecture 65 FastAPI for LLM Inference
Lecture 66 Setting Up MLflow on AWS for LLMOps
Lecture 67 Training Models with MLflow: A Hands-On Guide
Lecture 68 MLflow for Model Inference
Lecture 69 Dockerizing LLM Inference Services
Lecture 70 LLM Evaluation with MLflow and Dagshub
Lecture 71 Why We Need an LLMOps Platform
Lecture 72 Generative AI with Google Cloud (Vertex AI), an LLMOps Platform
Lecture 73 Vertex AI Hands-On on Google Cloud
Lecture 74 Vertex AI Local Setup - Run Gemini on a Local Machine
Lecture 75 RAG on Vertex AI with Vector Search and Gemini Pro
Lecture 76 LLM-Powered Application on Vertex AI
Lecture 77 Fine-Tuning a Foundation Model on Vertex AI
Lecture 78 Introduction to AWS Bedrock
Lecture 79 Hands-On AWS Bedrock
Lecture 80 End-to-End Project Using AWS Bedrock

Section 19: Fine-Tuning Large Language Models Using PEFT
Lecture 81 RAG vs. Fine-Tuning
Lecture 82 What Is Fine-Tuning?
Lecture 83 Fine-Tuning Meta Llama 2 on Custom Data

Section 20: AI Agents
Lecture 84 Introduction to AI Agents and Agentic Behaviors
Lecture 85 Multi-Agent Development with CrewAI
Lecture 86 Implementation of AI Agents Using LangChain
Lecture 87 Implementation of AI Agents Using LangGraph
Lecture 88 Implementation of AI Agents Using Phidata
Lecture 89 Implementation of AI Agents Using LangFlow
Lecture 90 Video Summarizer Agent
Lecture 91 Agentic RAG Using CrewAI

Section 21: Model Context Protocol (MCP) (a minimal server sketch appears at the end of this post)
Lecture 92 Introduction to MCP
Lecture 93 Setting Up an MCP Server on Cursor
Lecture 94 Implementing an AI Agent with MCP Using MCP-USE

Who this course is for:
- Developers and software engineers interested in building Generative AI applications
- Data scientists and machine learning engineers looking to integrate LLMs into real-world projects
- AI enthusiasts eager to explore cutting-edge concepts like AI Agents, MCP, RAG, and LLMOps
- Students and researchers who want practical experience in developing AI-powered applications
- Anyone curious about building end-to-end, production-ready Generative AI systems, from beginner to advanced levels

RapidGator
NitroFlare
DDownload
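Finally, as referenced from Section 21 of the outline above, here is a minimal sketch of an MCP server, assuming the official mcp Python SDK and its FastMCP helper; the server name and the single tool are illustrative and not from the course.

```python
# Minimal MCP server sketch using the official Python SDK's FastMCP helper.
# Assumes: pip install mcp  (server name and tool below are illustrative).
from mcp.server.fastmcp import FastMCP

# Create an MCP server with a human-readable name.
mcp = FastMCP("demo-tools")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    # Runs over stdio by default, which is what desktop MCP clients expect.
    mcp.run()
```

A client such as Cursor can then be pointed at this script through its MCP server settings, which is the kind of setup Lecture 93 walks through.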