Tech Blog
2 MIN READ

JetBrains AI Assistant: Now Featuring Google Gemini and Local LLM Integration

JetBrains is taking AI-powered development to the next level! The latest update to the JetBrains AI Assistant introduces two exciting integrations: Google Gemini and Local Large Language Models (LLMs). This means developers can benefit from more versatile AI capabilities, enhancing productivity whether they prefer cloud-based assistance or need local, privacy-focused solutions.

Google Gemini: Advanced AI Support

Google Gemini, a cutting-edge AI model developed by Google DeepMind, is now integrated into JetBrains AI Assistant. Gemini provides a variety of powerful features, including smarter code suggestions, improved error analysis, and insightful project recommendations. Developers can expect enhanced contextual understanding and even more accurate suggestions across a wide range of programming languages and frameworks. The integration gives developers access to the latest AI models, capable of handling complex tasks effectively.

Local LLMs: Privacy and Control

JetBrains understands that many developers need privacy-focused options alongside cloud-based ones. With this update, Local LLMs are integrated into JetBrains tools, allowing developers to work with AI models on their own machines. These local models can be customized to suit specific needs, ensuring that data remains private while still benefiting from AI-driven assistance.

This addition makes JetBrains AI Assistant accessible for environments where data sensitivity is a concern or where internet access is limited. The local option provides similar coding assistance without relying on external servers, striking the right balance between AI power and privacy.
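To give a rough sense of what "local" means in practice, here is a minimal sketch that sends a prompt to a model served on your own machine by Ollama, one common local LLM runtime. The endpoint, model name, and prompt are illustrative assumptions for this sketch, not the AI Assistant's internal API; the point is simply that the request and response never leave localhost.

```python
# Minimal sketch: ask a locally served model (e.g. via Ollama) to explain a
# code snippet. Everything stays on localhost -- no external servers involved.
# Endpoint, model name, and prompt are illustrative assumptions only.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

payload = {
    "model": "codellama",  # any model you have pulled locally (assumed here)
    "prompt": "Explain what this Python snippet does:\n"
              "print(sum(x * x for x in range(10)))",
    "stream": False,       # ask for a single, complete JSON response
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    answer = json.loads(response.read())

print(answer["response"])  # the model's explanation, generated entirely on this machine
```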

Flexibility to Choose

One of the key benefits of this update is the flexibility it offers. Developers can now switch between cloud-based models such as Google Gemini and Local LLMs, based on their needs and preferences. Whether you need the cutting-edge capabilities of Google Gemini for large projects or want to maintain strict data privacy with Local LLMs, JetBrains AI Assistant puts you in control.

What This Means for Developers

With these integrations, JetBrains aims to cater to diverse developer needs, from those working on enterprise applications with strict data policies to those experimenting with the latest technologies. The improved AI Assistant can:

  • Deliver more precise code completions and suggestions.
  • Assist in debugging by offering intelligent explanations and potential fixes.
  • Help adapt to specific project contexts, especially with Local LLM customizations.

JetBrains’ focus remains on empowering developers by providing them with a tailored experience—whether through the power of cloud AI or the independence of local processing.

Try the Updated AI Assistant

These new capabilities are available across JetBrains IDEs, including IntelliJ IDEA, PyCharm, and more. Update to the latest version to explore the integrated power of Google Gemini and Local LLMs.

For more details, head over to the JetBrains AI Assistant page.
