Opera is currently testing the ability to download LLMs for local use, a first for a major browser

If you use Opera, you'll be happy to know that you now have access to a new feature: LLMs that the browser can run entirely locally on your device.

According to a press release from Opera, for the first time, a major browser will allow users to download LLMs for local use via a built-in feature. This means you can use LLMs in Opera without sending data to a server. The feature is available in Opera One's developer stream as part of the company's AI Feature Drops program, which allows early adopters to test experimental AI features.
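Opera's announcement doesn't describe the runtime behind the feature, but the basic idea of fully local inference is easy to illustrate outside the browser. The short Python sketch below uses the open-source Hugging Face transformers library; the library and the TinyLlama model named in it are assumptions chosen for illustration, not part of Opera's feature. It demonstrates the same point Opera is making: once the weights have been downloaded, text generation runs on your own machine and no prompt data is sent to a server.

# Illustrative sketch of local LLM inference (not Opera's implementation).
# The model below is a small, freely downloadable chat model chosen for the example.
from transformers import pipeline

# The first call downloads the weights; after that, everything runs locally.
generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

result = generator(
    "Explain in one sentence why local inference keeps prompts private.",
    max_new_tokens=80,
)
print(result[0]["generated_text"])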

Available LLMs include Gemma by Google, Mixtral by Mistral AI, Llama by Meta, and Vicuna. In total there are around 50 families and 150 LLMs available.

Also: 5 Reasons Opera Is My Favorite Browser (And You Should Check It Out Too)

Opera notes that each LLM can take up between 2GB and 10GB of storage space, though more specialized models should require less. A local LLM may also be slightly slower than a server-based one, depending on the computing capabilities of your hardware, Opera warns.
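As a rough rule of thumb (a general estimate, not a figure from Opera), an LLM's download size is roughly its parameter count multiplied by the number of bits stored per weight. The snippet below applies that estimate: a 7-billion-parameter model quantized to 4 bits per weight works out to about 3.5GB, which lines up with the 2GB to 10GB range Opera cites.

# Back-of-the-envelope estimate of an LLM's on-disk size.
# size in bytes ~ parameter count * bits per weight / 8 (ignores small overheads).
def approx_size_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

print(approx_size_gb(7, 4))   # ~3.5 GB for a 4-bit 7B model
print(approx_size_gb(7, 16))  # ~14 GB for the same model at 16 bits per weight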

The company recommends trying a few models, including Code Llama, an extension of Llama geared toward generating and discussing code; Mixtral, designed for a wide range of natural language processing tasks such as text generation and question answering; and Phi-2 from Microsoft Research, a language model that demonstrates “excellent reasoning and language comprehension skills.”

To test this feature, update to the latest version of Opera Developer. Once that's done, open the Aria Chat side panel and choose “Select Local Mode” from the drop-down box at the top. From there, click “Go to Settings” and search for the models you want to download. Once your chosen LLM has downloaded, click the menu button at the top left to start a new chat, select the model you just downloaded, and start chatting.

The LLM you choose will replace Opera's native AI until you switch back to Aria.