![](https://pitti-backend-assets.ams3.digitaloceanspaces.com/petals_ba3bd101fd.png?w=3840&q=75)
Abstract
Large language models are among the most significant recent advances in machine learning. Still, using these models can be difficult: offloading and quantization come with trade-offs, and third-party APIs offer limited flexibility. As an alternative, we propose Petals, an open-source decentralized system (showcased this week at the ACL 2023 Demonstrations track) that lets anyone run or even fine-tune large models using the idle resources of volunteers. In this post, you will learn the motivation behind the system, the ideas underlying it, and its advantages over other ways of using large models. Petals was developed as part of the BigScience collaboration by engineers and researchers from Yandex Research, HSE University, University of Washington, Hugging Face, ENS Paris-Saclay, and Yandex School of Data Analysis.