Resources to learn Generative AI, Large Language Models, and building ML Applications
At FourthBrain, we are dedicated to keeping up with the latest tools and techniques in Machine Learning and Artificial Intelligence. See below for resources, event recordings, and more that will help you stay at the top of your game.
Community Sessions
Our series of three community sessions focused on LLMs and Generative AI.
Community Session 1: The State of LLMs
LLMs are evolving incredibly quickly. The Big Takeaway from this session: It is VERY DIFFICULT to keep up with the state of LLMs.
Resources from this live session:
- LinkedIn Recap from Head of Product Greg Loughnane
- Google Colab Notebook for Lit-LLaMA and Dolly 15k Demo used in the session
- Slides
- Recording
Community Session 2: Fine-Tuning vs Prompt Engineering
What is the true difference between prompt engineering, fine-tuning, and instruction tuning? We explored this and more in our second community session.
Resources from this live session:
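To make the distinction concrete, here is a toy sketch (not from the session; the function names and the "training" logic are illustrative stand-ins, not a real model): prompt engineering changes the input at inference time, fine-tuning updates the model's weights with additional training, and instruction tuning is fine-tuning on (instruction, response) pairs specifically.

```python
def prompt_engineer(user_question: str) -> str:
    """Prompt engineering: wrap the raw question in a template the
    frozen model responds to well. No weights change."""
    return (
        "You are a concise assistant.\n"
        f"Question: {user_question}\n"
        "Answer:"
    )


def fine_tune(weights: dict, dataset: list, lr: float = 0.1) -> dict:
    """Fine-tuning (stand-in): update weights from labeled examples.
    Here 'training' just bumps a score per completion token, to make
    the weight-update idea concrete."""
    updated = dict(weights)
    for prompt, completion in dataset:
        for token in completion.split():
            updated[token] = updated.get(token, 0.0) + lr
    return updated


# Instruction tuning is fine_tune() where every example is an
# (instruction, response) pair rather than a raw text continuation.
instruction_data = [("Summarize: LLMs evolve fast.", "LLMs change quickly.")]
weights = fine_tune({}, instruction_data)
prompt = prompt_engineer("What is instruction tuning?")
```

The key contrast: `prompt_engineer` leaves `weights` untouched, while `fine_tune` returns new weights, which is why fine-tuning requires training infrastructure and prompt engineering does not.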
Community Session 3: Indexing and Chaining
Our third session focused on LangChain and LLMOps! We discussed indexing, chaining, and vector databases, and reviewed how to combine a prompt chain with data-indexing and tool chains to create a document QA bot.
Resources from this live session:
- LinkedIn Recap from Head of Product Greg Loughnane
- Google Colab Notebook: QuestionMyDoc
- Slides
- Recording
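The indexing-plus-chaining pattern from this session can be sketched in plain Python (a minimal illustration with no LangChain dependency; the bag-of-words "embedding" is a deliberate simplification of a real vector database): index documents, retrieve the closest chunk for a question, then chain the retrieved context into a prompt for an LLM.

```python
from collections import Counter


def embed(text: str) -> Counter:
    """Toy 'embedding': bag-of-words token counts. A real app would
    store learned embeddings in a vector database."""
    return Counter(text.lower().split())


def similarity(a: Counter, b: Counter) -> int:
    """Overlap of shared tokens (stand-in for cosine similarity)."""
    return sum((a & b).values())


def retrieve(question: str, docs: list) -> str:
    """Indexing step: return the document most similar to the question."""
    q = embed(question)
    return max(docs, key=lambda d: similarity(q, embed(d)))


def build_qa_prompt(question: str, docs: list) -> str:
    """Chaining step: retrieved context + question -> prompt for the LLM."""
    context = retrieve(question, docs)
    return f"Context: {context}\nQuestion: {question}\nAnswer:"


docs = [
    "LangChain composes LLM calls into chains.",
    "Vector databases store embeddings for fast similarity search.",
]
prompt = build_qa_prompt("What do vector databases store?", docs)
```

Sending `prompt` to any LLM completes the document QA loop: the model answers from the retrieved context rather than from its weights alone, which is what lets the bot answer questions about your documents.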
Demos and Live Events
We often partner with other organizations to offer demonstrations and guides for building ML applications, including generative AI and LLMs. Check out some of the recordings below.
Practical Data Science on AWS: Generative AI
Follow along with AWS instructors Antje Barth and Chris Fregly as they share tips on practical data science in the AWS cloud. You'll see demonstrations of using Amazon CodeWhisperer to generate Python code, and Amazon SageMaker to generate images using Stable Diffusion.
Building Machine Learning Apps with Hugging Face: LLMs to Diffusion Modeling
During this live workshop, you'll learn how to build powerful ML features without spending time on MLOps. Hugging Face Product Director Jeff Boudier covers how to take data science projects into production without complicated infrastructure, and how to use Hugging Face models, libraries, and tools for your ML applications.
Beyond the Jupyter Notebook: Setting up your MLOps Development Environment
This interactive workshop focused on how to set up your new computer to build and deploy industry-standard machine learning applications so that you’re ready to maximize your impact during your first week on the job.
Our live event focused on macOS. We created subsequent recordings for Windows and Linux.