In Generative AI with Large Language Models (LLMs), created in partnership with AWS, you’ll learn the fundamentals of how generative AI works, and how to deploy it in real-world applications.
What you’ll get
- Gain foundational knowledge, practical skills, and a functional understanding of how generative AI works.
- Dive into the latest research on generative AI to understand how companies are creating value with cutting-edge technology.
- Learn from expert AWS AI practitioners who actively build and deploy AI in business use cases today.
What you’ll do
- Deeply understand generative AI, describing the key steps in a typical LLM-based generative AI lifecycle, from data gathering and model selection to performance evaluation and deployment.
- Describe in detail the transformer architecture that powers LLMs, how they’re trained, and how fine-tuning enables LLMs to be adapted to a variety of specific use cases.
- Use empirical scaling laws to optimize the model’s objective function across dataset size, compute budget, and inference requirements.
- Apply state-of-the-art training, tuning, inference, tools, and deployment methods to maximize the performance of models within the specific constraints of your project.
- Discuss the challenges and opportunities that generative AI creates for businesses after hearing stories from industry researchers and practitioners.
- Receive a Coursera certificate demonstrating your skills upon completion of the course.
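The scaling-law point above can be made concrete with a rough calculation. As a minimal sketch, assuming the Chinchilla-style rules of thumb (training compute C ≈ 6·N·D FLOPs for N parameters and D tokens, with a compute-optimal ratio of roughly 20 tokens per parameter), a compute budget can be split between model size and dataset size like this:

```python
import math

def chinchilla_allocation(flops_budget: float) -> tuple[float, float]:
    """Split a training-compute budget between model size and data size.

    Uses two hedged rules of thumb from the Chinchilla scaling-law work:
      * training compute  C ~= 6 * N * D   (N params, D tokens)
      * compute-optimal ratio  D/N ~= 20 tokens per parameter
    Substituting D = 20*N gives C = 120 * N**2, hence the formulas below.
    This is an illustrative simplification, not the paper's exact fit.
    """
    n_params = math.sqrt(flops_budget / 120)  # compute-optimal parameter count
    n_tokens = 20 * n_params                  # compute-optimal training tokens
    return n_params, n_tokens

# Chinchilla's own budget (~5.9e23 FLOPs) lands near its actual
# configuration of ~70B parameters trained on ~1.4T tokens:
params, tokens = chinchilla_allocation(5.9e23)
print(f"{params:.1e} params, {tokens:.1e} tokens")  # ~7.0e10 params, ~1.4e12 tokens
```

The takeaway the course develops in more depth: for a fixed compute budget, model size and training-data size should grow together rather than spending everything on a larger model.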
Andrew Ng, Founder & CEO of Landing AI, shared a post on LinkedIn saying, “Generative AI with Large Language Models, created with AWS and hosted on Coursera. This course goes deep into the technical foundations of LLMs and how to use them. You can sign up here: https://lnkd.in/gUYSmCYx”
“You’ll work through the full lifecycle of a generative AI project, and learn specific techniques like RLHF; zero-shot, one-shot, and few-shot learning with LLMs; advanced prompting frameworks like ReAct; even fine-tuning LLMs; and gain hands-on practice with all of these techniques,” the post continued.
“Instructors Antje Barth, Chris Fregly, Shelbee Eigenbrode, and Mike Chambers all do incredible work at AWS and have helped many companies build LLM applications. They bring tremendous practical LLM expertise to this course,” the post added.
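To give a flavor of the zero-, one-, and few-shot prompting mentioned in the post: the idea is simply to include worked examples in the prompt before the actual query. A minimal sketch (the template and field names here are an illustrative convention, not one prescribed by the course):

```python
def build_few_shot_prompt(instruction, examples, query):
    """Assemble a few-shot prompt: instruction, worked examples, then the query.

    `examples` is a list of (input, output) pairs. With zero pairs this is a
    zero-shot prompt, with one pair a one-shot prompt.
    """
    parts = [instruction.strip(), ""]
    for text, label in examples:
        parts.append(f"Review: {text}\nSentiment: {label}\n")
    # The prompt ends mid-template so the model completes the missing label.
    parts.append(f"Review: {query}\nSentiment:")
    return "\n".join(parts)

prompt = build_few_shot_prompt(
    "Classify the sentiment of each review as Positive or Negative.",
    [("Loved every minute of it.", "Positive"),
     ("A dull, lifeless film.", "Negative")],
    "The plot kept me hooked until the end.",
)
print(prompt)
```

The resulting string is sent to the LLM as-is; the in-context examples steer the model toward the desired output format without any fine-tuning.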
Who should join?
- For data scientists: Gain deeper insight into the underlying structure and mechanisms of LLMs, and explore avenues for further innovation in the field.
- For machine learning engineers: Learn how to better train, optimize, and fine-tune generative models while learning about different use cases and applications.
- For prompt engineers: Explore advanced prompting techniques and learn how to control your output using generative configuration parameters.
- For research engineers: Explore state-of-the-art generative models and architectures in depth, and build your own advanced techniques in generative AI on top of them.
- For anyone interested: Get a thorough introduction to the fundamentals of generative AI and how to develop with it.
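The generative configuration parameters mentioned for prompt engineers, such as temperature and top-k, control how the next token is sampled from the model's output logits. As a minimal sketch (exact parameter names vary by inference library):

```python
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=None, rng=random):
    """Sample a token id from raw logits using two common generation knobs.

    temperature rescales the logits (lower values make sampling more
    deterministic); top_k restricts sampling to the k highest-scoring tokens.
    """
    scaled = [logit / temperature for logit in logits]
    # Rank token ids by scaled logit, keeping only the top k if requested.
    ids = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)
    if top_k is not None:
        ids = ids[:top_k]
    # Softmax over the surviving tokens (shift by the max for stability).
    m = max(scaled[i] for i in ids)
    weights = [math.exp(scaled[i] - m) for i in ids]
    return rng.choices(ids, weights=weights, k=1)[0]

print(sample_next_token([1.0, 4.0, 2.5], top_k=1))  # greedy: always token 1
```

Setting `top_k=1` recovers greedy decoding, while a higher temperature flattens the distribution and yields more varied, less predictable output.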