Building AI Intensive Python Applications

By: Rachelle Palmer, Ben Perlmutter, Ashwin Gangadhar, Nicholas Larew, Sigfrido Narváez, Thomas Rueckstiess, Henry Weller, Richmond Alake, Shubham Ranjan

Overview of this book

The era of generative AI is upon us, and this book serves as a roadmap to harness its full potential. With its help, you’ll learn the core components of the AI stack: large language models (LLMs), vector databases, and Python frameworks, and see how these technologies work together to create intelligent applications. The chapters will help you discover best practices for data preparation, model selection, and fine-tuning, and teach you advanced techniques such as retrieval-augmented generation (RAG) to overcome common challenges, such as hallucinations and data leakage. You’ll get a solid understanding of vector databases, implement effective vector search strategies, refine models for accuracy, and optimize performance to achieve impactful results. You’ll also identify and address AI failures to ensure your applications deliver reliable and valuable results. By evaluating and improving the output of LLMs, you’ll be able to enhance their performance and relevance. By the end of this book, you’ll be well-equipped to build sophisticated AI applications that deliver real-world value.
Table of Contents (18 chapters)
Part 1: Foundations of AI: LLMs, Embedding Models, Vector Databases, and Application Design
Part 2: Building Your Python Application: Frameworks, Libraries, APIs, and Vector Search
Part 3: Optimizing AI Applications: Scaling, Fine-Tuning, Troubleshooting, Monitoring, and Analytics
Appendix: Further Reading
Index

Best practices

Selecting the most appropriate embedding model and vector size is not merely a technical decision but a strategic one that must align with your project's unique characteristics, its technical and organizational constraints, and its objectives.
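
Vector size in particular carries a direct operational cost: every extra dimension adds storage, memory, and search latency across the entire corpus. As a rough illustration, the following sketch (the corpus size and candidate dimensions are assumptions, not recommendations) estimates the raw float32 index footprint for a few common embedding dimensions:

BYTES_PER_FLOAT32 = 4

def index_size_gb(num_vectors: int, dimensions: int) -> float:
    """Approximate raw storage for a dense float32 vector index."""
    return num_vectors * dimensions * BYTES_PER_FLOAT32 / 1024**3

corpus_size = 1_000_000  # hypothetical: one million embedded documents

for dims in (384, 768, 1536, 3072):
    print(f"{dims:>5} dimensions -> ~{index_size_gb(corpus_size, dims):.2f} GB")

# Prints roughly 1.43 GB, 2.86 GB, 5.72 GB, and 11.44 GB respectively,
# before any index overhead or replication.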

Managing computational efficiency and cost is another cornerstone of using embedding models effectively. Some models are resource-intensive, with higher response times and higher costs, so optimizing the computational aspects without sacrificing output quality is essential. Designing your system to use different embedding models depending on the task at hand yields a more resilient application architecture.
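
One way to realize this is a small routing layer that picks an embedding model per task, for example a higher-quality model where retrieval accuracy matters most and a cheaper, faster model for bulk work such as near-duplicate detection. The sketch below uses the sentence-transformers library and two of its publicly available models purely as an illustration; the task split and model choices are assumptions you would tune to your own workload:

from sentence_transformers import SentenceTransformer

# Hypothetical task-to-model mapping. Vectors from different models live in
# different spaces, so each task keeps its own index or collection.
TASK_MODELS = {
    "search": "all-mpnet-base-v2",     # slower, higher quality, 768 dimensions
    "clustering": "all-MiniLM-L6-v2",  # fast and cheap, 384 dimensions
}

_loaded: dict[str, SentenceTransformer] = {}

def embed(texts: list[str], task: str = "search") -> list[list[float]]:
    """Embed texts with the model configured for the given task."""
    model_name = TASK_MODELS[task]
    if model_name not in _loaded:
        _loaded[model_name] = SentenceTransformer(model_name)
    return _loaded[model_name].encode(texts).tolist()

# Quality-sensitive retrieval uses the larger model; bulk deduplication
# of incoming documents can use the cheaper one.
search_vectors = embed(["How do I reset my password?"], task="search")
dedup_vectors = embed(["Password reset instructions for new users"], task="clustering")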

It’s imperative to regularly evaluate your embedding model to ensure your AI/ML application continues to perform as expected. This involves routinely checking performance metrics and making necessary adjustments. Tweaking your model usage could mean altering vector sizes to avoid overfitting—...
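
A lightweight way to make that evaluation routine is to keep a small labeled set of queries and their known relevant documents, re-embed them on a schedule, and track a retrieval metric such as recall@k against a recorded baseline. The sketch below assumes you already have the query and document vectors as NumPy arrays; the function and variable names are hypothetical:

import numpy as np

def recall_at_k(query_vecs: np.ndarray,
                doc_vecs: np.ndarray,
                relevant_doc_idx: list[int],
                k: int = 5) -> float:
    """Fraction of queries whose relevant document appears in the top k results."""
    # Cosine similarity between every query and every document.
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = q @ d.T
    hits = 0
    for i, relevant in enumerate(relevant_doc_idx):
        top_k = np.argsort(-sims[i])[:k]
        hits += int(relevant in top_k)
    return hits / len(relevant_doc_idx)

# Re-run this check on a schedule (or in CI) and alert if the score drifts
# below the baseline recorded when the model was first deployed.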
