Llama for Python Programmers is designed for programmers who want to leverage the Llama 2 large language model (LLM) and take advantage of the generative artificial intelligence (AI) revolution. In this course, you'll learn how open-source LLMs can run on self-hosted hardware, made possible through techniques such as quantization, using the llama.cpp package. You'll explore how Meta's Llama 2 fits into the larger AI ecosystem and how you can use it to develop Python-based LLM applications. You'll gain hands-on skills with methods such as few-shot prompting and grammars to improve and constrain Llama 2 output, enabling more robust data interchange between Python application code and LLM inference. Lastly, you'll gain insight into the different Llama 2 model variants, how they were trained, and how to interact with these models in Python.
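To illustrate the grammar technique mentioned above: a GBNF grammar can constrain the model so its reply is always machine-parseable JSON. This is a minimal sketch assuming the llama-cpp-python bindings; the actual model call is commented out because it requires a locally downloaded quantized GGUF file (the file name is a placeholder):

```python
# Sketch: constraining Llama 2 output with a GBNF grammar so the reply
# is always valid JSON that Python code can parse reliably.
import json

# GBNF grammar accepting only {"sentiment": "positive"} or
# {"sentiment": "negative"} (grammar text is an illustrative assumption).
SENTIMENT_GBNF = r'''
root ::= "{" ws "\"sentiment\":" ws sentiment ws "}"
sentiment ::= "\"positive\"" | "\"negative\""
ws ::= [ \t\n]*
'''

def parse_reply(raw: str) -> str:
    """Parse the grammar-constrained JSON reply back into Python data."""
    return json.loads(raw)["sentiment"]

# With grammar enforcement, the reply is guaranteed to match the grammar:
# from llama_cpp import Llama, LlamaGrammar
# llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf")  # placeholder path
# grammar = LlamaGrammar.from_string(SENTIMENT_GBNF)
# out = llm("Review: Loved it!\nReply in JSON.", grammar=grammar, max_tokens=32)
# print(parse_reply(out["choices"][0]["text"]))

# The parsing side works on any reply the grammar permits:
print(parse_reply('{"sentiment": "positive"}'))
```

Because the grammar rules out any other token sequence, the Python side can call `json.loads` without defensive error handling.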

What you'll learn
Understand how to use llama.cpp Python APIs to build Llama 2-based large language model (LLM) applications.
Learn to run and interact with the Llama 2 large language model on commodity local hardware.
Learn to utilize zero- and few-shot prompting as well as advanced methods like grammars in llama.cpp to enhance and constrain Llama 2 model output.
Learn about the different Llama 2 model variants: the base model, the chat model, and Code Llama, and how to interact with these models in Python.
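As a taste of the workflow above, the few-shot prompting pattern can be sketched in plain Python. The prompt assembly runs anywhere; the commented-out model call assumes the llama-cpp-python package and a locally downloaded quantized GGUF file (the file name is a placeholder):

```python
# Sketch of few-shot prompting: show the model a handful of labeled
# examples, then ask it to continue the pattern for a new input.

FEW_SHOT_EXAMPLES = [
    ("The movie was wonderful.", "positive"),
    ("I want my money back.", "negative"),
]

def build_few_shot_prompt(examples, query):
    """Assemble a few-shot classification prompt from (input, label) pairs."""
    lines = ["Classify the sentiment of each review as positive or negative.", ""]
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model completes this line
    return "\n".join(lines)

prompt = build_few_shot_prompt(FEW_SHOT_EXAMPLES, "Great acting and a fun plot.")
print(prompt)

# Running the prompt against a local Llama 2 model (placeholder path):
# from llama_cpp import Llama
# llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)
# out = llm(prompt, max_tokens=4, stop=["\n"])
# print(out["choices"][0]["text"].strip())
```

Zero-shot prompting is the same pattern with the examples list left empty; adding even two or three examples typically makes the completion format far more predictable.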
Details to know

3 assignments


Duke University
Learner reviews
- 5 stars: 73.68%
- 4 stars: 26.31%
- 3 stars: 0%
- 2 stars: 0%
- 1 star: 0%
Showing 3 of 19 reviews
Reviewed on Jun 13, 2024
Very good course to practice prompt engineering
Reviewed on Feb 26, 2025
Overall good course. The assignment is finicky due to the model used but overall great intro to llama.





