The Dark Side of AI: Jailbreaking, Injections, Hallucinations & more - Printable Version

+- Softwarez.Info - Software's World! (https://softwarez.info)
+-- Forum: Library Zone (https://softwarez.info/Forum-Library-Zone)
+--- Forum: Video Tutorials (https://softwarez.info/Forum-Video-Tutorials)
+--- Thread: The Dark Side of AI: Jailbreaking, Injections, Hallucinations & more (/Thread-The-Dark-Side-of-AI-Jailbreaking-Injections-Hallucinations-more)
The Dark Side of AI: Jailbreaking, Injections, Hallucinations & more - AD-TEAM - 08-17-2025

The Dark Side of AI: Jailbreaking, Injections, Hallucinations & more
.MP4, AVC, 1920x1080, 30 fps | English, AAC, 2 Ch | 3h 3m | 934 MB
Instructor: Scott Kerr

Step over to the dark side and learn about the vulnerabilities, exploits, and unintended consequences that AI models such as LLMs suffer from, with hands-on prompting and exercises.

What you'll learn
Why Learn About the Dark Side of AI?

If we asked you to finish the sentence "AI is...", what would you say? You'd probably say something like "awesome" or "incredible". Well, here's what we think you'll say after you take this course: "AI is... dangerous!"

Don't get us wrong: AI is awesome. In fact, our instructors teach you plenty about just how awesome it is. But understanding AI's potential isn't enough; you need to grasp its pitfalls too. AI also has vulnerabilities that people can exploit, leading to unintended consequences. This section takes you on a fascinating deep dive into those risks: jailbreaking, prompt injections, hallucinations, prompt and data leakage, and much more. Through real-world demos, cutting-edge models like ChatGPT and DeepSeek, and research-backed insights, you'll see how these issues were discovered, how they show up in the wild, and why even seasoned AI Engineers and Prompt Engineers might be caught off guard.

This course is important for everyone who uses AI tools like ChatGPT. It isn't just eye-opening: it's essential.

Homepage

DDownload
RapidGator
NitroFlare