A Guide to Large Language Model Abstractions
Date: 2024-01-30
Description
In this article from Two Sigma, the authors provide a comprehensive overview of the landscape of frameworks for abstracting interactions with, and between, large language models. They suggest two systems of organization for reasoning about the various approaches to, and philosophies of, LLM abstraction:
- The Language Model System Interface Model (LMSI), a new seven-layer abstraction, inspired by the OSI model in computer systems and networking, to stratify the programming and interaction frameworks that have emerged in recent months.
- A categorization of five families of LM abstractions, which they have identified as performing similar classes of functionality.
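To make "abstracting interactions with LLMs" concrete, here is a minimal, hypothetical sketch of one of the lower layers such frameworks provide: a prompt template composed with a pluggable text-in/text-out model. All names here are invented for illustration and do not come from the article; a stub model stands in for a real API so the sketch runs as-is.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class PromptTemplate:
    """A reusable prompt with named placeholders (illustrative only)."""
    template: str

    def render(self, **kwargs: str) -> str:
        return self.template.format(**kwargs)

def make_pipeline(template: PromptTemplate,
                  model: Callable[[str], str]) -> Callable[..., str]:
    """Compose a template with any text-in/text-out model callable."""
    def run(**kwargs: str) -> str:
        return model(template.render(**kwargs))
    return run

# Stub model so the sketch is runnable without an API key.
def echo_model(prompt: str) -> str:
    return f"[model saw: {prompt}]"

summarize = make_pipeline(
    PromptTemplate("Summarize in one sentence: {text}"), echo_model)
print(summarize(text="LLM abstraction frameworks"))
# prints "[model saw: Summarize in one sentence: LLM abstraction frameworks]"
```

The point of the layered view is exactly this separation: the template layer knows nothing about the model behind it, so the same pipeline works with any backend that accepts and returns text.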
Read article here
Recently on:
Artificial Intelligence
Information Processing | Computing
WEB - 2024-12-30
Fine-tune ModernBERT for text classification using synthetic data
David Berenstein explains how to fine-tune a ModernBERT model for text classification on a synthetic dataset generated from argi...
WEB - 2024-12-25
Fine-tune classifier with ModernBERT in 2025
In this blog post Philipp Schmid explains how to fine-tune ModernBERT, a refreshed version of BERT models, with 8192 token cont...
WEB - 2024-12-18
ModernBERT, finally a replacement for BERT
Six years after the release of BERT, answer.ai introduces ModernBERT, bringing modern model optimizations to encoder-only models a...
PITTI - 2024-09-19
A bubble in AI?
Bubble or true technological revolution? While the path forward isn't without obstacles, the value being created by AI extends ...
PITTI - 2024-09-08
Artificial Intelligence: what everyone can agree on
Artificial Intelligence is a divisive subject that sparks numerous debates about both its potential and its limitations. Howeve...