Silo AI's new release, Viking 7B, bridges the gap for low-resource languages

Viking 7B is a significant milestone on the journey towards a state-of-the-art LLM family for all European languages.

Today, Europe’s largest private AI lab, Silo AI, together with the University of Turku’s research group TurkuNLP and HPLT, is releasing the first multilingual large language model (LLM) for all Nordic languages.

In addition to the Nordic languages, Viking also covers English and programming languages. Evaluations indicate best-in-class performance in all Nordic languages without compromising performance in English. 

I spoke to Silo AI CEO and co-founder Peter Sarlin to learn more.

English is overrepresented in most models, owing to the composition of the internet and digitisation efforts as well as the dominance of English in scientific literature, which raises concerns about linguistic and cultural bias.

Silo AI is dedicated to ensuring that the data used in these models accurately represents European languages, while still covering the English-speaking world.

Viking relies on the same training approach as Poro, focusing on low-resource languages without compromising English. However, it extends coverage to Danish, Finnish, Norwegian, Icelandic, Swedish, and programming languages. The model family comes with an updated architecture and a variety of model sizes.

According to Sarlin: 

“On a general level, training LLMs in low-resource languages requires significant effort and expertise.

We have done a lot of work to acquire high-quality data in the languages in question, curate and process that data, and then incorporate techniques like cross-training low-resource languages with high-resource languages.”

It is also widely known that the so-called ‘curse of multilinguality’ impacts performance: as more languages are added to a fixed-capacity model, per-language performance tends to degrade, which implies that commercially established LLMs might be biased towards the largest markets and languages.
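Cross-training low-resource languages alongside high-resource ones is commonly implemented by reweighting how often each language’s corpus is sampled during pretraining. Below is a minimal sketch of one such technique, temperature-based sampling, familiar from multilingual models like XLM-R; both the token counts and the method itself are illustrative assumptions, not Viking’s actual data recipe.

```python
# Temperature-based sampling: p_i ∝ (n_i / N) ** (1 / T).
# T = 1 reproduces raw corpus proportions; T > 1 flattens the
# distribution, upsampling low-resource languages.

corpus_tokens = {  # illustrative token counts, not Viking's actual mix
    "en": 2_000_000_000,
    "sv": 300_000_000,
    "fi": 200_000_000,
    "da": 150_000_000,
    "no": 120_000_000,
    "is": 15_000_000,
}

def sampling_weights(sizes: dict[str, int], temperature: float) -> dict[str, float]:
    total = sum(sizes.values())
    scaled = {lang: (n / total) ** (1.0 / temperature) for lang, n in sizes.items()}
    z = sum(scaled.values())
    return {lang: w / z for lang, w in scaled.items()}

print(sampling_weights(corpus_tokens, temperature=1.0))  # raw shares; Icelandic ~0.5%
print(sampling_weights(corpus_tokens, temperature=3.0))  # flattened; Icelandic upsampled
```

With a higher temperature, batches contain proportionally more Icelandic and Norwegian text than the raw corpus sizes would suggest, which is one way to avoid the smallest languages being drowned out during training.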

Viking 7B marks an important step in the company’s ongoing efforts to develop performant language models for all official EU languages. 

Sarlin shared:

“With the Viking model family, we reaffirm our commitment to Europe’s digital sovereignty. With this digital infrastructure, we want to provide an opportunity for companies to use capable open source LLMs to create valuable solutions for the broader community and society.” 

Viking 7B is designed to be sensitive to local values and cultures. This sensitivity ensures that technological advancements serve as connectors, rather than dividers, in digital communication. It enhances Europe’s digital infrastructure, thereby accelerating the adoption of LLM-driven products and applications.

Sarlin explained:

“By ensuring that Viking 7B, and our other models natively trained on low-resource languages, are built on data and information accurately representing the diverse languages, citizens, organisations and cultural landscape, we collectively represent the culture and values of the countries in question.

 This approach aligns with European values and allows for sovereignty in downstream applications and value creation.

A well-known feature of multilingual LLMs is that reasoning capability derives from the dominant languages, which implies that the data and training approach used directly impact what values, culture, and societal norms are represented by the model.”

Image: LUMI.

Further emphasising digital sovereignty, Viking is trained on the EuroHPC supercomputer LUMI, utilising up to 4,096 AMD Instinct MI250X GPUs. LUMI is not only Europe’s most powerful supercomputer and the 5th most powerful in the world, but also the 3rd greenest among the top 500 supercomputers.

LUMI’s energy consumption is covered entirely by hydroelectric power, and its waste heat accounts for about 20 percent of the district heating in the surrounding city of Kajaani.

Viking 7B completion and checkpoint performance

Today, training stands at 100 percent for Viking 7B, 85 percent for the 13B model, and 65 percent for the 33B model.

According to Silo AI, common benchmarks show evidence of outperformance with respect to other open models (e.g. Falcon, GPT-SW3, Llama, Mistral, MPT). Results indicate best-in-class performance in low-resource languages vis-à-vis other open models, without compromising performance in English and programming languages.
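For readers who want to try the released checkpoint themselves, a minimal generation sketch with the open-source Hugging Face transformers library might look like the following. The repository ID LumiOpen/Viking-7B is an assumption based on the announcement, so verify the actual location on Hugging Face before use.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository ID; confirm the actual repo on Hugging Face.
model_id = "LumiOpen/Viking-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package to be installed.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

prompt = "Suomen pääkaupunki on"  # Finnish: "The capital of Finland is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

As a base model rather than an instruction-tuned one, Viking 7B is best prompted with text to continue, as above, rather than with chat-style questions.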

Silo AI offers customisable AI building blocks for diverse use cases

According to Sarlin:

“A large number of different types of organisations have shown an interest in, and ultimately use, our models. However, base models are only an essential building block; for them to be really useful across different use cases, further customisation is needed.

There has also been significant demand for our product, a model-serving platform that allows organisations to integrate the models into their products using their own data, customise them to their use cases, and fully own the models in a compliant, effective, and privacy-respecting manner.”
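Silo AI’s serving platform is proprietary, but as a rough sketch of what customising a base model on your own data can look like with open tooling, here is a parameter-efficient fine-tuning setup using the peft library. The model ID and target module names are assumptions that depend on the checkpoint’s architecture; this is not the company’s actual product.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Assumed repository ID; target_modules depend on the model architecture.
base = AutoModelForCausalLM.from_pretrained("LumiOpen/Viking-7B")

lora_config = LoraConfig(
    r=16,                                 # rank of the low-rank adapters
    lora_alpha=32,                        # adapter scaling factor
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

# Wrap the base model; only the small adapter matrices are trainable,
# so domain adaptation fits on far less hardware than full fine-tuning.
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
# From here, train with a standard transformers Trainer on your own data.
```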

Viking 7B shatters the linguistic dominance of English in AI, offering strong performance across all Nordic languages alongside English and programming languages. Trained on Europe's green supercomputer, it promotes digital sovereignty and empowers developers to create inclusive AI applications that bridge cultural divides.
