LLM code assistance#
We will set up LLM code assistance in our editor so that the LLM can interact with our code directly. You are probably already using an LLM agent; integrating it into the editor eases the interaction and saves you copy & paste time.
Rider features its own AI assistant, JetBrains AI, which is integrated as well as GitHub Copilot is in Visual Studio Community. As of this writing, it requires a credit card for registration. Since not every student may have a credit card, I did not try it.
Another option is GitHub Copilot, for which a Rider extension exists. GitHub Copilot is free for educational use after you apply for a license through GitHub Education.
In this tutorial, I used a plugin called Continue. It lets you connect to many different LLM providers, which makes the setup more open and flexible than a single-provider assistant. The disadvantage is that it might not be as reliable as the official AI assistant, JetBrains AI.
Tip
If you don’t like the user experience of Continue, try the GitHub Copilot extension. I have not spent enough time to compare Copilot and Continue.
Let us first get an API key from Groq and then use it in the Code LLM extension Continue.
Creating a Groq API key#
1. Go to https://console.groq.com.
2. Create an account. You will be logged in to the dashboard after creating an account.
3. Click on Create API key. You will immediately be shown an API key.
4. Copy the API key.
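Before wiring the key into the editor, you can optionally check that it works. The following is a minimal sketch in Python that calls Groq's OpenAI-compatible chat completions endpoint directly; the endpoint URL and response shape are assumptions based on Groq's public API, so consult the Groq console documentation if the request fails.

```python
# check_groq_key.py -- a minimal sketch to verify a Groq API key.
# Assumptions: the "requests" package is installed (pip install requests),
# and Groq's OpenAI-compatible endpoint and the model below are available.
import requests

API_KEY = "paste-your-api-key-here"  # do not share or commit this key

response = requests.post(
    "https://api.groq.com/openai/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "openai/gpt-oss-120b",  # the model we also select in Continue later
        "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
    },
    timeout=30,
)

# A working key returns HTTP 200 and a JSON body with a "choices" list.
print(response.status_code)
print(response.json()["choices"][0]["message"]["content"])
```

If you get status 401 instead, the key was most likely copied incorrectly.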
Installation of Continue#
1. Click on the settings (gear) icon.
2. Click on Plugins. The Settings window should pop up.
3. Click on the Marketplace tab.
4. In the search bar, type continue.
5. Install the Continue plugin by accepting the terms.
6. Close the Settings window by clicking on Save.
7. You should now see the Continue icon on the right sidebar. The file continue_tutorial.py will be opened, and you will get a notification that Full Line Code Completion is disabled. You can dismiss both.
Configuration of Continue#
We will now configure the plugin.
(Optional) If you want to disable sending usage data to Continue:
1. On the rightmost bar, click on the settings (gear) icon of Continue. The Continue chat settings will show up.
2. Go to Allow Anonymous Telemetry and opt out of telemetry if you desire.
3. Close the settings, for example by clicking on the settings icon of Continue again.
Now we will configure which LLM we want to use:
1. In the Continue window, click on Or, configure your own models. A configuration window will open up.
2. Right below the Connect button, click on Click here to view more providers. The Add Chat Model window will pop up.
3. Select the provider Groq.
4. In the model drop-down menu, select Autodetect.
5. Paste the API key that you copied earlier into the corresponding field.
6. Click Connect. The tutorial file continue_tutorial.py and the text-based configuration file config.yaml will open up in new tabs. You can close these two files.
7. Additionally, the Models window will be open. Select openai/gpt-oss-120b for Chat and Apply.
8. Click on Models to close the model settings.
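For reference, the clicks above essentially result in a model entry in config.yaml similar to the sketch below. This is only an illustration: the exact field names depend on your Continue version, so treat the file generated by the plugin as the authority.

```yaml
# Sketch of a Groq model entry in Continue's config.yaml (field names may vary by version).
models:
  - name: Groq gpt-oss-120b
    provider: groq
    model: openai/gpt-oss-120b
    apiKey: paste-your-api-key-here
    roles:
      - chat
      - apply
```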
Usage example: explanation #
We are going to try the agent on our previous code, so click back to the tab with the source file you wrote before: the DKK to EUR converter.
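If you no longer have that file at hand, a small converter along the following lines can serve as a stand-in for this exercise. It is only a sketch: the fixed exchange rate, variable names, and prompt text are my assumptions, not necessarily what you wrote earlier.

```python
# dkk_to_eur.py -- an example DKK to EUR converter for this exercise.
# The exchange rate is a fixed, assumed value for illustration only.
DKK_PER_EUR = 7.46

amount_dkk = float(input("Amount in DKK: "))
amount_eur = amount_dkk / DKK_PER_EUR

print(f"{amount_dkk:.2f} DKK is {amount_eur:.2f} EUR")
```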
On the Continue tab, if a subwindow called Create Your Own Assistant is open, you can close it.
We will ask the agent to explain the code for us. Before you do, take one minute to go through the code line by line and try to guess how the program creates its output. It is completely acceptable if you don’t understand most of the lines; you will gradually improve. Continue after your try.
1. Write explain line by line in the prompt field and press Alt+Enter. Without Alt, Continue does not send your code to the agent.
2. Compare your explanation with the agent’s.
LLM-based inline code suggestions can affect your learning negatively#
One of the learning goals of this course is to identify and explain core programming concepts without an LLM, so that you can criticize the output of an LLM later when you become more proficient. To learn programming concepts, you have to face some difficulty and not rely on an autopilot. My recommendations for reaching the learning goals of this course are:
- First write the programs yourself; only as a second step ask the LLM for feedback, improvements, or explanations.
- If the extension you use provides code suggestions while you write code (so-called inline code suggestions), deactivate this feature. Compared to typical code completion by the IDE, LLM-based suggestions are long and detailed. This can be too much help for learning in the beginning. After you become confident, i.e., once you are able to write programs yourself, you can carefully activate this feature again.
  - From what I have heard, GitHub Copilot shows inline code suggestions as ghost text by default. Search for copilot disable inline suggestions to find out how to turn it off.
  - Continue’s inline suggestions work only with particular LLMs and are not activated in our configuration. Continue suggests code changes only after you ask in the chat window, and these changes must be applied with mouse clicks. First try to understand these changes, and then type them in manually until you become more confident.