Bailey Wallace
Bailey00
·
AI & ML interests
None yet
Recent Activity
reacted to danielhanchen's post with ❤️ 13 days ago
You don't need to set LLM parameters anymore!
llama.cpp now uses only the context length and compute your local setup needs. Unsloth also auto-applies the correct model settings.
Try in Unsloth Studio - now with precompiled llama.cpp binaries.
GitHub: https://github.com/unslothai/unsloth
reacted to danielhanchen's post with 🔥 13 days ago
A new way to use Unsloth.
Coming soon...
Organizations
None yet