7 Comments

Great book! Thanks Maarten!

Love this book. So good. 🤌 Thanks!

Is the code publicly available, e.g. hosted on GitHub?

LLMs require a lot of computational power; is it feasible to run the examples on personal devices by following this handbook?

The code will definitely be publicly available when the book is released! We aim to create Google Colab notebooks so that you can run everything for free using their T4 GPUs. Having said that, there might be a few exceptions that require more VRAM, like fine-tuning certain LLMs.

All in all, our goal is to make sure you can run the code in the book without needing additional services. I can't promise that for every example throughout the book, but it will be the case for most.
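
(For anyone wondering whether their Colab runtime is up to a given example, here is a minimal sketch, plain PyTorch rather than code from the book, that reports the attached GPU and its VRAM.)

```python
import torch

if torch.cuda.is_available():
    # Colab's free tier typically attaches a Tesla T4 with roughly 16 GB of VRAM.
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}")
    print(f"VRAM: {props.total_memory / 1024**3:.1f} GB")
else:
    print("No GPU attached. In Colab: Runtime > Change runtime type > GPU.")
```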

Do you need a technical reviewer before the book is published?

Thank you for the suggestion! We are actually nearing some cool deadlines and have already passed the final round of technical review.

Alright, thanks!