Opera Brings AI to its Browser - Laptops Can Interact with Chatbots

Opera has become the first web browser manufacturer to offer built-in support for a wide range of artificial intelligence models that run locally.

Opera's One browser currently offers "experimental support" for 150 local large language models (LLMs) from 50 different model families.

These include major local or open source LLMs such as Meta's Llama, Google's Gemma, Mistral AI's Mixtral, and Vicuna.

In doing so, Opera has moved ahead of AI-focused browsers such as Microsoft Edge, whose Bing and Copilot integrations both run in the cloud rather than locally.

The news means that Opera One users will be able to swap Opera's own Aria AI service for third-party LLMs, run those models locally on their own computers so that no data is transferred to a remote server, and access them all from a single browser.

Local LLMs appeal not only to AI enthusiasts who value their privacy, but also to anyone who wants to use AI models without an internet connection, for example while traveling.
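
To make the privacy point concrete, here is a minimal sketch of what prompting a locally hosted model can look like: the request goes to localhost, so nothing is sent to a remote server. It assumes a local model server such as the open-source Ollama runtime listening on its default port, with a Llama model already downloaded; this illustrates the general idea, not Opera's internal implementation.

```python
# Minimal sketch: prompt a locally hosted LLM so no data leaves the machine.
# Assumes an Ollama server on its default port (11434) with a Llama model
# already pulled; Opera One's built-in integration may work differently.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # localhost only, no remote endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("Why do local LLMs help with privacy?"))
```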

"By introducing Local LLMs in this way, Opera can begin to explore ways to build experience and expertise in the rapidly emerging local AI space," said Krystian Kolondra, EVP of browser and gaming at Opera

While this is an exciting update for AI enthusiasts, it is currently an experimental feature and not yet widely available. Initially, it will only be available through the developer stream of Opera's AI Feature Drops Program.

Launched last month, the program gives early adopters a first taste of Opera's latest AI features. Fortunately, this program is not just for developers; anyone can access it.

Testing this feature requires the latest version of Opera Developer and 2-10 GB of free storage for each local LLM, since the models must be downloaded to the machine.
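
Because each model is a multi-gigabyte download, it can help to check free disk space before fetching one. The snippet below is a generic sketch using Python's standard library, not part of Opera's tooling; the 10 GB figure is simply the upper end of the range quoted above.

```python
# Illustrative check that there is room for a local LLM download.
import shutil

REQUIRED_GB = 10  # upper end of the quoted 2-10 GB per model

free_gb = shutil.disk_usage("/").free / 1024**3
if free_gb < REQUIRED_GB:
    print(f"Only {free_gb:.1f} GB free - clear some space before downloading a model.")
else:
    print(f"{free_gb:.1f} GB free - enough headroom for a {REQUIRED_GB} GB model.")
```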

Using local LLMs with Opera One is straightforward. After launching the browser, open the side panel and select Aria.

A drop-down menu will appear at the top of the chat window; open it and click the "choose local AI model" option.

A small box will then appear on the screen with a button that says "go to settings". From there you can browse the 150 LLMs from 50 model families.

Once you have selected a local LLM, download it by clicking the down-arrow button. Next, click the three-line button at the top of the screen and then the "New Chat" button. You can switch between different LLMs at any time via the "choose local AI model" button.
