The Power of Full Project Context #LLM
Alan Turing & The Turing Machine

I've tried integrating RAG into the DevoxxGenie plugin, but why limit myself to just some parts found through similarity search when I can go all out?

RAG is so June 2024 😂

Here's a mind-blowing secret: most of the latest features in the Devoxx Genie plugin were essentially 'developed' by the latest Claude 3.5 Sonnet large language model using the entire project code base as prompt context 🧠 🤯

It's like having an expert senior developer guiding the development process, suggesting 100% correct implementations for the following Devoxx Genie features:

  • Allow a streaming response to be stopped

  • Keep the selected LLM provider after closing the settings page

  • Auto-complete commands

  • Add files based on filtered text

  • Show file icons in list

  • Show plugin version number in settings page with GitHub link

  • Support for higher timeout values

  • Show progress bar and token usage bar

I quickly cancelled my OpenAI subscription and handed my credit card details to Anthropic...

Full Project Context

A Quantum Leap Beyond GitHub Copilot

Imagine having your entire project at your AI assistant's fingertips. That's now a reality with the latest version of the Devoxx Genie IDEA plugin together with cloud-based models like Claude 3.5 Sonnet.

BTW, how long will it take until we can do this with local models?!

Add full project to prompt

The latest version of the plugin allows you to add the full project to your prompt, so your entire codebase becomes part of the AI's context. This feature offers a depth of understanding that traditional code completion tools can only dream of.
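Conceptually, building such a prompt is just a matter of walking the project tree and concatenating each source file behind a header with its relative path. Here is a minimal sketch of the idea (an illustration only, not the plugin's actual implementation; the `.java`-only filter and the `===` header format are assumptions):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class ProjectContextBuilder {

    // Concatenate all Java source files under root into one prompt string,
    // prefixing each file with its relative path so the model can refer
    // to files by name when answering questions.
    static String buildContext(Path root) throws IOException {
        StringBuilder context = new StringBuilder();
        try (Stream<Path> paths = Files.walk(root)) {
            paths.filter(p -> p.toString().endsWith(".java"))
                 .sorted()
                 .forEach(p -> {
                     try {
                         context.append("=== ").append(root.relativize(p)).append(" ===\n");
                         context.append(Files.readString(p)).append("\n\n");
                     } catch (IOException e) {
                         // Skip unreadable files rather than failing the whole build
                     }
                 });
        }
        return context.toString();
    }
}
```

Prefixing each file with its path is what lets the model answer questions like "where is the streaming response stopped?" by pointing at a concrete file.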

Smart Model Selection and Cost Estimation

The language model dropdown is no longer just a list; it's your 'compass' for smart model selection 🤩 👇🏼

  • See available context window sizes for each cloud model

  • View associated costs upfront

  • Make data-driven decisions on which model to use for your project

The new language model dropdown

Visualizing Your Context Usage

Leverage the prompt cost calculator for precise budget management:

  • Track token usage with a progress bar

  • Get real-time updates on how much of the context window you're using
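Under the hood, a progress bar like this only needs a token estimate for the prompt, the model's window size, and a price per token. A rough sketch (the 4-characters-per-token heuristic and the price figure are illustrative assumptions, not DevoxxGenie's or Anthropic's actual numbers):

```java
public class ContextUsage {

    // Rough token estimate: ~4 characters per token is a common rule of
    // thumb for English text and code; real tokenizers differ per model.
    static int estimateTokens(String text) {
        return Math.max(1, text.length() / 4);
    }

    // Fraction of the model's context window a prompt consumes,
    // e.g. for driving a progress bar (0.0 to 1.0).
    static double windowUsage(String prompt, int contextWindowTokens) {
        return (double) estimateTokens(prompt) / contextWindowTokens;
    }

    // Estimated input cost in USD given a per-million-token price
    // (the caller supplies the price; treat it as configuration).
    static double estimateCostUsd(String prompt, double pricePerMillionTokens) {
        return estimateTokens(prompt) * pricePerMillionTokens / 1_000_000.0;
    }
}
```

For an exact count you would call the provider's own tokenizer, but a heuristic like this is cheap enough to update on every keystroke.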

Calculate token cost with Claude Sonnet 3.5
Calculate cost with Google Gemini 1.5 Flash

Cloud Models Overview

Via the plugin settings pages you can see the "Token Cost & Context Window" for all the available cloud models. In an upcoming release you will be able to edit this table. I should probably also support the local models' context windows... #PullRequestsAreWelcome

Token Cost & Context Window

Handling Massive Projects?

"But wait, my project is HUGE!" you might say 😅 Fear not. We've got options:

  1. Leverage Gemini's Massive Context:

Gemini's colossal 1 million token window isn't just big, it's massive. We're talking about the capacity to ingest approximately 30,000 lines of code in a single prompt. That's enough to digest many codebases, from the tiniest scripts to some decent big projects.
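A quick back-of-the-envelope check of that figure (the ~33 tokens-per-line ratio is an assumption implied by the numbers above; real tokenizers vary a lot with code style):

```java
public class WindowCapacity {
    public static void main(String[] args) {
        int contextWindow = 1_000_000; // Gemini's advertised 1M token window
        int tokensPerLine = 33;        // assumed average; depends heavily on code style
        // Integer division gives the lines of code that fit in one prompt
        System.out.println(contextWindow / tokensPerLine); // prints 30303
    }
}
```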

But if that's not enough you have more options...

BTW, Google will be releasing 2M and even 10M token windows in the near future

  2. Smart Filtering:

The new "Copy Project" plugin settings panel lets you:

  • Exclude specific directories

  • Filter by file extensions

  • Remove JavaDocs to slim down your context
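The filtering steps above can be sketched as follows (the directory names, extensions, and JavaDoc regex are illustrative assumptions, not the plugin's actual settings):

```java
import java.util.List;
import java.util.regex.Pattern;

public class ContextFilter {

    // Hypothetical defaults mirroring a "Copy Project" settings panel
    static final List<String> EXCLUDED_DIRS = List.of("build", "target", ".git");
    static final List<String> INCLUDED_EXTENSIONS = List.of(".java", ".kt");

    // Matches /** ... */ JavaDoc blocks, across line breaks (DOTALL)
    static final Pattern JAVADOC = Pattern.compile("/\\*\\*.*?\\*/", Pattern.DOTALL);

    // Decide whether a file (by project-relative path) goes into the context
    static boolean include(String relativePath) {
        for (String dir : EXCLUDED_DIRS) {
            if (relativePath.startsWith(dir + "/")) return false;
        }
        return INCLUDED_EXTENSIONS.stream().anyMatch(relativePath::endsWith);
    }

    // Remove JavaDoc comments to slim down the prompt
    static String stripJavaDocs(String source) {
        return JAVADOC.matcher(source).replaceAll("");
    }
}
```

Stripping JavaDocs is a cheap win because the model can usually infer intent from the code itself, while the comments often repeat it at length.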

  3. Selective Inclusion:

Right-click to add only the most relevant parts of your project to the context and/or clipboard.

You can also copy your project to the clipboard, allowing you to paste your project code into an external chat window. This is a useful technique for sharing and collaborating on code 👍🏼

👍🏼 Add Project Folders & Files using right-click

The Power of Full Context: A Real-World Example

The DevoxxGenie project itself, at about 70K tokens, fits comfortably within most high-end LLM context windows. This allows for incredibly nuanced interactions – we're talking advanced queries and feature requests that leave tools like GitHub Copilot scratching their virtual heads!

Conclusion: Stepping into the Future of Development

With Claude 3.5 Sonnet, Devoxx Genie isn't just another developer tool... it's a glimpse into the future of software engineering. As we eagerly await Claude 3.5 Opus, one thing is clear: we're witnessing a paradigm shift in AI-augmented programming.

Alan Turing, were he here today, might just say we've taken a significant leap towards AGI (for developers with Claude Sonnet 3.5)

Welcome to the cutting edge of AI-assisted development - welcome to DevoxxGenie 🚀

X Twitter - GitHub - IntelliJ MarketPlace

DevoxxGenie is fully open source

So the entire project’s codebase is given to Claude, which runs on Anthropic’s servers?

Barry van Someren

Java Application Hosting & Support | PostgreSQL & Kubernetes Administration | Empowering Web Development Agencies

11mo

Man, I love living in the future. Excellent work, Stephan!

Lize Raes

Software Engineer and Product Manager

11mo

Wow, this is impressive and next-level! Amazing how fast DevoxxGenie has developed, and one can see you have been intensively using it yourself, allowing you to bring exactly those features that developers need! Thanks a lot for this excellent work!

Tom Cools

DevRel at Timefold, Java Champion, BeJUG organizer, (Keynote) Conference Speaker

1y

What you have been doing with LLMs and DevoxxGenie is truly amazing! Congratz Stephan, looking forward to upgrading my plugin and trying this out! 😊
