Farang raises €1.5M to launch AI models that outperform today's leaders

Farang’s technology mimics how the human brain works to create intuitive responses while requiring twenty-five times fewer computational resources and therefore significantly lower training costs.

Stockholm-based Farang, an AI research lab developing next-generation foundational Large Language Models, has raised €1.5 million in seed funding.

Farang has created a novel architecture for Large Language Models, competing with the transformer architecture used by current market leaders ChatGPT, Claude, and Gemini. Instead of predicting text word by word, Farang's architecture comprehends the complete response first, like imagining a picture before painting it, and then translates that concept into words.

While many recent AI models add reasoning steps to simulate “thinking time,” they still depend on word prediction. Farang’s approach carries out this reasoning internally through non-textual mechanisms, producing more coherent answers while sharply reducing computational demands.
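To make the contrast concrete, here is a deliberately toy sketch in Python. Farang has not published details of its architecture, so the functions below are hypothetical stand-ins with random matrices in place of trained weights: autoregressive_decode picks one token at a time, as transformer-based assistants do, while plan_then_decode first forms a single latent "plan" for the whole answer and then maps it to every output position at once.

```python
# Illustrative sketch only: these toy functions contrast the two decoding
# styles described above; random matrices stand in for trained model weights.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["the", "model", "plans", "then", "writes", "<eos>"]
DIM = 32  # size of the hidden state / latent plan in this toy example

def autoregressive_decode(prompt_vec, max_steps=8):
    """Transformer-style generation: choose one token at a time, feeding each
    choice back into the hidden state before predicting the next token."""
    readout = rng.normal(size=(len(VOCAB), DIM))   # stand-in output layer
    embed = rng.normal(size=(len(VOCAB), DIM))     # stand-in token embeddings
    state = prompt_vec.copy()
    tokens = []
    for _ in range(max_steps):
        logits = readout @ state                   # one "forward pass" per token
        token_id = int(np.argmax(logits))
        if VOCAB[token_id] == "<eos>":
            break
        tokens.append(VOCAB[token_id])
        state = np.tanh(state + embed[token_id])   # fold the chosen token back in
    return tokens

def plan_then_decode(prompt_vec, length=4):
    """'Picture before painting': form a single latent plan for the whole
    answer, then translate that plan into every output position at once."""
    plan = np.tanh(rng.normal(size=(DIM, DIM)) @ prompt_vec)  # whole-response concept
    decoders = rng.normal(size=(length, len(VOCAB), DIM))     # one readout per position
    scores = decoders @ plan                                  # shape: (length, vocab)
    return [VOCAB[int(i)] for i in np.argmax(scores, axis=1)]

prompt = rng.normal(size=DIM)
print("token-by-token:", autoregressive_decode(prompt))
print("plan first:    ", plan_then_decode(prompt))
```

The point is only the control flow: the first function runs one forward pass per generated token, while the second produces all positions from one internal representation, which is the kind of difference the efficiency claims above rest on.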

Emil Romanus, Farang’s Founder, comments:

We're not building another application layer on top of existing models. We've developed a completely new foundational architecture that enables us to create specialised AI assistants that outperform current solutions in specific domains like programming and medicine, while using twenty-five times fewer computational resources. Based on current testing, we believe the percentage of resources used will decrease even further in the future.

Farang is targeting the weak spots of today's AI assistants, starting with specialised applications where they often fall short. These include support for specific programming languages and frameworks, niche medical fields, and internal company tools. Its first focus is React programming, where it aims to deliver more optimised code and faster iteration than current large language models.

"The beauty of our architecture is that it enables us to create highly specialized models for niche use cases, such as untapped medical domains, that would be prohibitively expensive with traditional approaches," Romanus explains. "We can analyse unstructured medical data or create programming assistants that truly understand specific frameworks, not just general coding patterns."

Farang’s technology also enables organizations to deploy specialized AI models on-premises with full privacy controls. This is crucial for healthcare, legal, and financial firms handling sensitive data. Instead of sending proprietary information to external AI services, companies can train and run Farang’s models entirely in-house, ensuring data sovereignty.

Romanus added:

Our vision is that companies will have these specialized assistants running on their own infrastructure, integrated with their existing systems. A law firm could have an AI that understands its specific practice areas and case history, or a research hospital could have an assistant trained on their unique patient data—all while keeping that sensitive information completely private.

An ambitious long-term vision

While starting with niche applications to prove the technology, Farang has set its sights on becoming a market leader in artificial intelligence, ultimately aiming to compete with and surpass OpenAI in the general AI space.

Romanus noted that the company is taking a different approach from big tech companies, adding:

By proving our architecture works in specialized domains first, we're building the foundation to eventually challenge the current leaders across all AI applications.

The round was led by Voima Ventures and the Amadeus APEX Technology Fund, with participation from prominent angel investors such as Tero Ojanpera (Co-founder of Silo AI), Nilay Oza, and Niraj Aswani (former founders of Klevu).

Inka Mero, Managing Partner & Founder of Voima Ventures, commented:

We look for exceptional founders and technologies that reset the curve—not just optimize around the edges. Farang showcases how Europe can step up in the global AI race, as its foundational architecture provides a true paradigm shift—specialized, efficient, and enterprise-grade from day one.

Ion Hauer, Principal at APEX Ventures, said the firm is always looking at breakthrough technologies with the potential to reshape industries, and Farang stood out right away:

For years, the industry has been making incremental improvements to the same Transformer foundation. What Emil and his team have developed represents a fundamental architectural leap—the kind of foundational change we believe will separate the next generation of AI leaders from today's incumbents.

The funding will primarily be used to scale up Farang's proof-of-concept models and to invest in the computing power required to train and fine-tune models for specialised areas such as programming and medicine.
