NEVER use ChatGPT again

Mudahar advises against using big tech AI tools that compromise privacy by requiring full system access, advocating instead for running AI models locally on personal hardware to maintain control and security. He showcases various local AI options and agentic tools that enable private, cost-effective AI use, while warning about the ethical considerations and risks of handing an AI control of your system.

In this video, Mudahar urges viewers to stop relying on ChatGPT and other big tech AI tools that require granting full access to your computer and personal data. He highlights the risks of downloading AI applications from large companies that can control your system and surveil your activities. Instead, he advocates for running AI models locally on your own hardware, which preserves privacy and control. He references his previous video, where he showed how to set up a local search engine using Docker and SearXNG, enabling ad-free, private internet searches without depending on Google.

Mudahar introduces several local AI options, such as Ollama and LM Studio, which allow users to download and run AI models directly on their machines. He explains the importance of understanding model sizes, measured in billions of parameters, and the VRAM requirements needed to run these models efficiently. For example, a common GPU like the RTX 3060 with 12GB of VRAM can handle certain models, but larger models require more powerful hardware. He also points viewers to Hugging Face, a repository of AI models, where they can find models suited to their hardware capabilities.

The video demonstrates how Mudahar runs a 9 billion parameter model locally, connected to his private search engine, to research a recent cybersecurity incident. He compares this with a larger 27 billion parameter model, noting that while bigger models offer higher quality responses, they demand significantly more GPU resources and run slower. Despite some limitations like occasional hallucinations and slower speeds compared to cloud-based AI, these local models provide sufficient functionality for many tasks without compromising privacy or incurring ongoing costs.

Mudahar also explores agentic AI tools like Hermes Agent, which let an AI perform complex tasks on your computer, such as researching topics, generating reports, and even managing files. He demonstrates how these agents can be connected to local AI models and internet search endpoints to automate workflows. However, he warns about the risks of giving an AI control over your system, showing how the agent refuses to execute dangerous commands like deleting all files. He emphasizes the need for caution and ethical use when deploying such powerful tools.

In conclusion, Mudahar stresses that AI is here to stay, but users should prioritize running AI locally to maintain privacy, security, and autonomy. He believes that big tech companies are monetizing AI in ways that limit access and exploit user data. By adopting local AI solutions, individuals can harness the power of artificial intelligence safely and cost-effectively. He encourages viewers, especially gamers and tech-savvy users, to explore these options and take control of their AI experience rather than relying on centralized services.