Great book. Thanks, Maarten!
Love this book. So good. 🤌 Thanks!
Is the code publicly available, like hosted on GitHub?
LLMs require high computational power. Is it feasible to run them on personal devices by following this handbook?
The code will definitely be publicly available when the book is released! We aim to create Google Colab notebooks so that you can run it for free using their T4 GPUs. Having said that, there might be a few exceptions that require more VRAM, like fine-tuning certain LLMs.
All in all, our goal is to make sure you can run the book's code without needing additional services. I can't promise that for every example throughout the book, but it will hold for most.
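To make the VRAM caveat concrete, here is a minimal sketch (not from the book, just an illustration assuming PyTorch is installed, as it is in Colab) that checks the available GPU and does rough memory math. The 7B parameter count and the 4x fine-tuning multiplier are ballpark assumptions, not exact figures:

import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GB")

    # Back-of-the-envelope memory math (illustrative numbers):
    # a 7B-parameter model stores weights in fp16 at 2 bytes/param.
    n_params = 7e9
    inference_gb = n_params * 2 / 1024**3
    # Full fine-tuning also needs gradients plus optimizer state,
    # several times the weight memory -- hence the VRAM caveat above.
    finetune_gb = inference_gb * 4
    print(f"~{inference_gb:.0f} GB to load, ~{finetune_gb:.0f} GB+ to fine-tune")
else:
    print("No GPU found -- in Colab, enable one via Runtime > Change runtime type.")

A T4 has about 16 GB of VRAM, which is why inference on smaller models fits comfortably while full fine-tuning of larger ones may not.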
Do you need a technical reviewer before the book is published?
Thank you for the suggestion! We are actually approaching some exciting deadlines and have already completed the final round of technical review.
Alright, thanks!