Swiss Researchers Unveil Fully Open, Multilingual LLM Trained on Alps Supercomputer


Swiss researchers from ETH Zürich and EPFL are set to release a fully open, multilingual large language model trained on the Alps supercomputer, promising transparency and broad accessibility.

Swiss Researchers Announce Groundbreaking Open-Source LLM

In a significant development for the AI community, researchers from ETH Zürich and the Swiss Federal Institute of Technology in Lausanne (EPFL) have unveiled plans to release a fully open large language model (LLM) trained on Switzerland's Alps supercomputer. The announcement was made at the International Open-Source LLM Builders Summit in Geneva, marking a pivotal moment in the pursuit of transparent and accessible AI technology [1][2].

Model Specifications and Capabilities

Source: Tech Xplore

The forthcoming LLM will be available in two sizes: 8 billion and 70 billion parameters. Trained on 15 trillion tokens of data, the model is designed to be fluent in over 1,000 languages, with roughly 40% of the training data in languages other than English [1]. This multilingual approach aims to ensure broad global applicability and serve a diverse range of users and applications [2].
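Some rough arithmetic puts those figures in perspective. The calculation below is a back-of-envelope sketch, not an official breakdown; it only uses the announced 40% non-English share and the commonly cited Chinchilla guideline of roughly 20 training tokens per parameter.

```python
# Back-of-envelope sketch of the announced training-data figures (illustrative only).
total_tokens = 15e12          # 15 trillion training tokens
non_english_share = 0.40      # ~40% of the data is non-English
params_large = 70e9           # 70B-parameter variant

non_english_tokens = total_tokens * non_english_share
tokens_per_param = total_tokens / params_large

print(f"Non-English tokens: ~{non_english_tokens / 1e12:.0f} trillion")
print(f"Tokens per parameter (70B model): ~{tokens_per_param:.0f}")
# ~6 trillion non-English tokens; ~214 tokens per parameter,
# well above the ~20 tokens/parameter "Chinchilla-optimal" guideline,
# which is typical for models meant to be widely deployed rather than
# trained purely compute-optimally.
```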

Transparency and Open-Source Commitment

What sets this model apart is its commitment to full transparency. Unlike many commercial models developed behind closed doors, the Swiss researchers intend to release not only the model and weights but also the source code used for training. Additionally, they promise that the training data will be transparent and reproducible [1]. This level of openness is expected to foster innovation and accountability in AI development [2].

Training Infrastructure: Alps Supercomputer

Source: The Register

The model's training was made possible by the Alps supercomputer, currently ranked as the third most powerful in Europe and eighth worldwide. Alps features over 10,000 Nvidia Grace Hopper Superchips, each combining a 72-core Arm-based Grace CPU with a 96GB H100 GPU. This architecture delivers up to 42 exaFLOPS of sparse FP8 performance, making it particularly well suited for AI workloads [1].
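That headline figure is consistent with Nvidia's published per-chip numbers. The sketch below is an illustrative sanity check, assuming roughly 3.96 petaFLOPS of sparse FP8 throughput per H100 GPU and an assumed chip count; neither value is confirmed by the announcement.

```python
# Illustrative sanity check of the ~42 exaFLOPS sparse FP8 figure (assumed inputs).
chips = 10_500                  # "over 10,000" Grace Hopper Superchips (assumed count)
fp8_sparse_per_gpu = 3.958e15   # ~3.96 PFLOPS sparse FP8 peak per H100 GPU

total_fp8_sparse = chips * fp8_sparse_per_gpu
print(f"Aggregate sparse FP8: ~{total_fp8_sparse / 1e18:.1f} exaFLOPS")
# ~41.6 exaFLOPS, in line with the quoted "up to 42 exaFLOPS".
```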

Ethical Considerations and Compliance

The researchers have emphasized their commitment to ethical AI development. They report that respecting websites' opt-outs from web crawling during data acquisition was feasible for most tasks and general-knowledge questions, and that complying with these opt-outs showed no significant performance degradation [1]. The model is being developed with consideration for Swiss data protection laws, copyright law, and the transparency obligations under the EU AI Act [2].
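The project has not published its crawling pipeline, but the compliance described amounts to honoring robots.txt-style opt-outs when collecting training data. The snippet below is a minimal illustration using Python's standard library; the user-agent name is hypothetical and this is not the researchers' actual code.

```python
# Illustration only: honoring robots.txt opt-outs during data collection,
# in the spirit of the compliance the researchers describe.
# This is NOT the project's actual crawler; the user-agent name is made up.
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def allowed_to_fetch(url: str, user_agent: str = "ExampleResearchCrawler") -> bool:
    """Return True only if the site's robots.txt permits fetching this URL."""
    parts = urlparse(url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetches robots.txt; in practice this would be cached per host
    return parser.can_fetch(user_agent, url)

# Pages whose owners have opted out are simply skipped, not scraped.
if allowed_to_fetch("https://example.org/some-article.html"):
    pass  # fetch the page and add it to the training corpus
```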

Release and Accessibility

The LLM is scheduled for release later this summer under the Apache 2.0 license, a highly permissive open-source license. This move is expected to support adoption across various sectors, including science, government, education, and private industry [1][2]. Accompanying documentation will provide details on the model architecture, training methods, and usage guidelines to facilitate transparent reuse and further development [2].
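An Apache 2.0 release would make the weights straightforward to use with standard open-source tooling. The example below is a speculative sketch of how a downstream user might load such a checkpoint, assuming the team publishes it in a Hugging Face-compatible format; the repository name is a placeholder, not an announced identifier.

```python
# Hypothetical example: loading the open Swiss LLM once it is released.
# The model id "swiss-ai/open-llm-8b" is a placeholder, not a confirmed name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "swiss-ai/open-llm-8b"  # placeholder repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

prompt = "Grüezi! Explain why open training data matters."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```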

Implications for AI Research and Development

This initiative represents a significant step toward democratizing AI technology and fostering a collaborative international ecosystem for open foundation models. By making the entire process transparent and accessible, the researchers aim to drive innovation not only in Switzerland but across Europe and through multinational collaborations [2]. This approach could shift the landscape of AI development, currently dominated by closed-source models from major tech companies in the United States and China.
