
Transformers and Large Language Models: A Hands-On Guide to RAG and Agentic AI First Edition

Limited Time Sale

$35.99 below the price of a new copy!

New: $59.99

Product details

Management Number: 220491335
Release Date: 2026/05/03
List Price: $24.00
Model Number: 220491335
This book is a hands-on guide to the foundations, architectures, and real-world applications of transformers and large language models in modern AI.

The book begins by laying the foundations of generative AI: architectures, tokenization, encoding, and classical modeling techniques. Initial chapters trace the evolution from feed-forward networks and recurrent neural networks to long short-term memory (LSTM), setting the stage for the revolutionary transformer architecture. The core of the book focuses on transformers, introducing the encoder-decoder framework, attention mechanisms, positional encodings, and the internal workings of multi-head attention, normalization, and multi-layer perceptrons. Readers gain insight into advanced techniques such as rotary positional embeddings (RoPE), mixture of experts (MoE), and knowledge distillation, alongside practical training strategies such as self-supervised learning, fine-tuning, and reinforcement learning with human feedback. Popular models from OpenAI, DeepSeek, and other vendors are examined to highlight the evolution of the LLM landscape.

Building on these foundations, the text explores methods for model customization, including parameter-efficient fine-tuning (LoRA, adapters), text generation strategies, prompt engineering, and quantization. Retrieval-Augmented Generation (RAG) is introduced as a critical innovation for grounding LLMs in external knowledge, with detailed evaluation techniques for both retrieval and generation.

Finally, the book ventures into agentic AI, demonstrating protocols such as the Model Context Protocol (MCP) and Agent-to-Agent (A2A) interactions with practical coding examples. In conclusion, this book serves as a practical guide, equipping readers with the technical depth and applied strategies needed to design, fine-tune, and deploy cutting-edge transformers and large language models for real-world applications.

What you will learn:
- Understand the foundations of AI, ML pipelines, tokenization, encoding, and early neural architectures.
- Explore transformers in depth: encoder-decoder design, attention mechanisms, and advanced embedding methods.
- Learn modern LLM advancements such as RoPE, MoE, SLMs, fine-tuning strategies, and evaluation techniques.
- Master practical customization through prompt engineering, PEFT methods, quantization, and text generation.

Who this book is for:
Data scientists, ML engineers, AI researchers, and developers exploring transformers and large language models.

ISBN-13: 979-8868827846
Edition: First Edition
Language: English
Publisher: Apress
Item Weight: 1.11 pounds
Publication Date: July 3, 2026

