Domain-Specific Small Language Models (MEAP V03) by Guglielmo Iozzia
English | 2025 | ISBN: 9781633436701 | 237 pages | EPUB, PDF | 13.5 MB
Bigger isn’t always better. Train and tune highly focused language models optimized for domain-specific tasks.
When you need a language model to respond accurately and quickly about a specific field of knowledge, the sprawling capacity of an LLM may hurt more than it helps. Domain-Specific Small Language Models teaches you to build generative AI models optimized for specific fields.
In Domain-Specific Small Language Models you’ll discover
Model sizing best practices
Open source libraries, frameworks, utilities and runtimes
Fine-tuning techniques for custom datasets
Hugging Face’s libraries for SLMs
Running SLMs on commodity hardware
Model optimization and quantization
Perfect for cost- or hardware-constrained environments, Small Language Models (SLMs) train on domain-specific data to deliver high-quality results on specific tasks. In Domain-Specific Small Language Models you’ll develop SLMs that can generate everything from Python code to protein structures and antibody sequences—all on commodity hardware.
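As a taste of one of the optimization topics listed above, here is a minimal sketch of symmetric 8-bit weight quantization in plain Python. It is an illustrative toy, not code from the book, and uses no particular library; the function names are invented for this example:

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats onto integers in [-127, 127]."""
    # Scale chosen so the largest-magnitude weight maps to +/-127.
    # The `or 1.0` guards against an all-zero weight list.
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [v * scale for v in q]

weights = [0.52, -1.3, 0.07, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Each restored weight is within one quantization step (scale) of the original.
```

Storing each weight as one byte instead of four is the basic trade that lets quantized SLMs fit and run on commodity hardware, at the cost of small rounding errors like those visible in `restored`.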
about the book
Domain-Specific Small Language Models teaches you how to create language models that deliver the power of LLMs for specific areas of knowledge. You’ll learn to minimize the computational horsepower your models require while maintaining high-quality performance and output. You’ll appreciate the clear explanations of complex technical concepts alongside working code samples you can run and replicate on your laptop. Plus, you’ll learn to develop and deliver RAG systems and AI agents that rely solely on SLMs, without the costs of foundation model access.