Tutorials
April 21, 2024

How To Easily Run Llama 3 In Visual Studio Code For FREE

A step-by-step guide to running Llama 3 8B on your local machine.

by Jim Clyde Monge

Yesterday, Meta released Llama 3, its most capable open-source language model to date. Because it is open source, you can download the model weights and run them locally on your own machine.

I know, I know. The thought of running an 8-billion-parameter AI model on your laptop might sound like something only for tech-savvy people. But don’t worry! In this article, I will share a step-by-step guide that makes it easy for anyone to do.

Prerequisites

Before getting into the actual steps, it’s important to note the specifications of the environment that I am currently running on:

  • Laptop: Lenovo ThinkPad X1 Extreme
  • OS: Windows 11 Pro Version 10.0.22631 Build 22631
  • CPU: Intel(R) Core(TM) i7-9850H
  • RAM: 32 GB
  • Disk space: 642 GB

That’s right! You don’t need a high-end GPU to run the model locally. With a decent CPU and enough RAM, you can run the Llama 3 8B model on your own machine without any issues.

Step 1: Download and install Ollama

Head over to the Ollama website and download the latest version of the installer. Ollama is a versatile tool designed to run, create, and share large language models (LLMs) locally on various platforms.

Download and install Ollama
Image by Jim Clyde Monge

After installing Ollama, make sure that it is running in the background. You can check this by looking for the Ollama icon in your system tray or task manager.

Image by Jim Clyde Monge

To confirm that Ollama is working properly from the command line interface (CLI), run the following command to check the version. I am currently running version 0.1.32, so yours may differ.

> ollama -v
ollama version is 0.1.32
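
By default, Ollama also runs a small local server on port 11434, which is what editor integrations like CodeGPT talk to. If you want to double-check that the server is up, these optional commands will confirm it (the output shown is what I would expect at the time of writing, so treat it as a sketch):

> curl http://localhost:11434
Ollama is running
> ollama list

The second command simply lists the models you have downloaded so far, which will be empty on a fresh install.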

Step 2: Install CodeGPT extension in VS Code

Next, open Visual Studio Code and go to the Extensions tab. Search for “CodeGPT” (published by codegpt.co) and install the extension. This extension will allow you to use Llama 3 directly within VS Code.

Install CodeGPT extension in VS Code
Image by Jim Clyde Monge

Once the extension is installed, you should see the CodeGPT icon on the left sidebar of VS Code.

Step 3: Download the model

Open the terminal in VS Code and run the following command to download the Llama 3 model:

ollama pull llama3:8b

This might take a while to finish because the model is more than 4 GB in size. Be patient and let it complete. Once done, you should see a success message like this:

Download llama 3 8b model: ollama pull llama3:8b
Image by Jim Clyde Monge
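
Before wiring the model into VS Code, you can optionally sanity-check it straight from the terminal. Running ollama list should now show llama3:8b, and ollama run opens an interactive chat with the model (type /bye to exit). This is just an optional check, not a required step:

> ollama list
> ollama run llama3:8b
>>> Hello! What model are you?

If it answers, the download worked and you can exit the chat.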

Step 4: Select the Llama 3 model

On the CodeGPT dashboard in the left panel of VS Code, find the Provider dropdown menu and choose Ollama. Then, in the model dropdown, select “llama3:8b”. If the model doesn’t show up in the list, you can also type “llama3:8b” manually.

Select the llama 3 8b model in visual studio code
Image by Jim Clyde Monge

Make sure to select the correct model to ensure that CodeGPT uses Llama 3 for generating responses.

Step 5: Run Llama 3 on VS Code

Okay, now that we have the model downloaded and CodeGPT installed on VS Code, let’s check if everything is working correctly by giving it a test prompt.

Prompt: what is the distance of the moon from Earth?

Response: The average distance from the Earth to the Moon is approximately 384,400 kilometers (238,900 miles). This is called the “lunar distance” or “lunar mean distance.” However, the Moon’s orbit is not a perfect circle and its distance from Earth varies slightly due to the elliptical shape of its orbit. The closest point in this orbit, called “perigee,” is about 356,400 kilometers (221,500 miles), while the farthest point, called “apogee,” is about 405,500 kilometers (252,000 miles).

Image by Jim Clyde Monge
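
Under the hood, CodeGPT is simply sending your prompt to the local Ollama server. If you’re curious, you can reproduce roughly the same request yourself with Ollama’s standard generate endpoint. This is only an illustrative sketch of the API call, best run from a bash-style shell such as Git Bash or WSL (cmd.exe handles the single-quoted JSON differently):

> curl http://localhost:11434/api/generate -d '{"model": "llama3:8b", "prompt": "what is the distance of the moon from Earth?", "stream": false}'

With "stream": false, the server returns the whole answer in a single JSON response instead of streaming it token by token.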

Awesome! It works. Now, let’s use the model to explain some source code. Go ahead and write or open any source code file in VS Code, right-click on it, and select “CodeGPT: Explain CodeGPT” to ask the AI to explain the code.

Image by Jim Clyde Monge

Notice that the code gets passed into the CodeGPT panel as a prompt input. The AI analyzes the code and provides a detailed explanation of it.

Image by Jim Clyde Monge
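
If you ever want the same kind of explanation outside the editor, ollama run also accepts a one-off prompt as an argument. For example (the tiny function here is made up purely for illustration):

> ollama run llama3:8b "Explain what this Python function does: def add(a, b): return a + b"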

This is really cool because you no longer have to copy and paste code blocks into ChatGPT or any other chatbots outside VS Code. Plus, it’s completely free and works locally on your machine, so you don’t need to worry about API costs or internet connectivity.

And there you have it! A step-by-step guide on how to run Llama 3 in Visual Studio Code. I hope you found this guide helpful and easy to follow. Running powerful language models locally on your own machine is not as daunting as it might seem at first.

If you want to learn more tricks for running open-source language models on your local machine, such as using the CLI, LM Studio, or other tools, let me know in the comments below. I’d be happy to share more tips and tricks to help you get the most out of these incredible AI models.
