Proposal: Improve llama.cpp snippet #778
Conversation
Thanks for making the docs better! 🤗
title: "Use pre-built binary", | ||
setup: [ | ||
// prettier-ignore | ||
"# Download pre-built binary from:", |
Conceptually LGTM. Just wondering whether this bloats the overall UI for snippets, i.e. whether we present too many options to the end user. (Just food for thought.)
In fact, I switched into "UX/UI designer" mode when I drafted this proposal ;-)
The current UI has a problem: there are multiple snippets but no titles for them, so it's visually hard to distinguish between the 2 snippets:
My first iteration would be to have a title for each snippet, then only "expand" one section at a time (while the other options are "collapsed").
But then I think we can also split between the "setup" and "run" steps, since ideally the user will set up just once but run multiple times.
Feel free to give other suggestions.
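A minimal sketch of the shape this proposal implies for each snippet (field names are taken from the diff excerpts later in this thread; the comments are illustrative):

interface Snippet {
	title: string;   // e.g. "Install from brew" or "Use pre-built binary"
	setup: string;   // one-time setup step (install / download)
	command: string; // the command the user runs each time
}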
Good idea, I quite like the second image. Let's ask @gary149 / @julien-c for thoughts here.
cc @gary149 wdyt
bumping this as I think it'll be good to merge this soon! cc: @gary149 (sorry for creating an extra notification)
okay on the UI, but let's see if it's not busted on the Hub side. (sorry for the late reply)
packages/tasks/src/local-apps.ts
Outdated
@@ -39,36 +45,52 @@ export type LocalApp = {
	 * And if not (mostly llama.cpp), snippet to copy/paste in your terminal
	 * Support the placeholder {{GGUF_FILE}} that will be replaced by the gguf file path or the list of available files.
	 */
-	snippet: (model: ModelData) => string | string[];
+	snippet: (model: ModelData) => Snippet | Snippet[];
Suggested change:
-	snippet: (model: ModelData) => Snippet | Snippet[];
+	snippet: (model: ModelData) => string | string[] | Snippet | Snippet[];
to be backward compatible (and to compile)
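A rough sketch of how the widened union could be consumed, assuming a normalization helper on the Hub side (the helper name, the local Snippet definition and the default "Run" title are illustrative, not actual hf.co code):

interface Snippet {
	title: string;
	setup?: string;
	command: string;
}

// Normalize whatever snippet() returns into a uniform Snippet[],
// keeping the old string / string[] return types working.
function normalizeSnippets(result: string | string[] | Snippet | Snippet[]): Snippet[] {
	const items: (string | Snippet)[] = Array.isArray(result) ? result : [result];
	return items.map((item) =>
		typeof item === "string" ? { title: "Run", command: item } : item
	);
}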
Fixed with 98a2637
But I would recommend moving completely to the Snippet interface, because my idea is to have a dedicated "title" explaining what each snippet does.
I've launched the CI.
-n 128`,
{
	title: "Install from brew",
	setup: "brew install llama.cpp",
btw, unrelated, but wondering if llama.cpp is on winget? cc @mfuntowicz too
I checked here: https://winget.run and didn't find any. Come to think of it, it'd be a pretty cool idea to add that.
Yeah, it would be nice to have, given that llama.cpp already has pre-built binaries via CI. Unfortunately I'm not very familiar with Windows stuff, so I'll create an issue on llama.cpp to see if someone can help.
Created the issue: ggerganov/llama.cpp#8188
let's note to also update/improve this doc page: https://huggingface.co/docs/hub/en/gguf-llamacpp
packages/tasks/src/local-apps.ts
Outdated
@@ -1,6 +1,12 @@
 import type { ModelData } from "./model-data";
 import type { PipelineType } from "./pipelines";

+interface Snippet {
Suggested change:
-interface Snippet {
+export interface LocalAppSnippet {
Suggestion: export it and name it LocalAppSnippet, so that we can use it from the hf.co codebase.
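A hypothetical consumer-side use once the type is exported; the import path, helper name and field names (optional setup, content per the suggestions later in this thread) are assumptions for illustration:

import type { LocalAppSnippet } from "@huggingface/tasks";

// Join the optional setup step and the run content into one copy/paste block.
function renderSnippet(snippet: LocalAppSnippet): string {
	return [snippet.setup, snippet.content].filter(Boolean).join("\n\n");
}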
Fixed in 7983478
Actually, I like the visual improvements mentioned above and we could use them for other snippets as well (i.e. "use in transformers"), not just local apps. We can discuss a Snippet type in a separate PR and move forward with this one for now.
packages/tasks/src/local-apps.ts
Outdated
@@ -1,6 +1,12 @@
 import type { ModelData } from "./model-data";
 import type { PipelineType } from "./pipelines";

+interface Snippet {
+	title: string;
+	setup: string;
Suggested change:
-	setup: string;
+	setup?: string;
Make the setup step optional; some snippets may not need a setup step.
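For example, with an optional setup a snippet could carry just a run command (the title and command below are made up for illustration; {{GGUF_FILE}} is the placeholder documented above):

const runOnly: Snippet = {
	title: "Run with an existing llama.cpp install",
	command: 'llama-cli -m {{GGUF_FILE}} -p "Hello" -n 128',
};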
Fixed in 7983478
packages/tasks/src/local-apps.ts
Outdated
interface Snippet {
	title: string;
	setup: string;
	command: string;
Suggested change:
-	command: string;
+	content: string;
Rename command -> content to be consistent with how we name this kind of thing in the hf.co codebase.
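Putting the three suggestions together (export + rename to LocalAppSnippet, optional setup, command -> content), the interface ends up roughly like this; the doc comments are mine, not from the PR:

export interface LocalAppSnippet {
	/** Title of the snippet, e.g. "Install from brew" */
	title: string;
	/** Optional one-time setup step (install, download, build) */
	setup?: string;
	/** Content (command) to run the model */
	content: string;
}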
Thanks for the suggestions @mishig25. Sorry for the late response, I'll have a look later this week when I have more time!
Fixed in 7983478
Looks great; as mentioned, I'd suggest discussing a Snippet type in a future PR.
IMO let's merge as-is and I can do the required changes in moon-landing.
LGTM!
In this PR, I propose some changes:
- Use the llama-cli binary (for more details, see this PR: "build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc..." ggerganov/llama.cpp#7809 and this homebrew formula)
- Split each snippet into title, setup and command
- Use --conversation mode to start llama.cpp in chat mode (chat template is now supported, ref: "Add chat template support for llama-cli" ggerganov/llama.cpp#8068)

Proposal for the UI:
(Note: Maybe the 3 sections title - setup - command can be more separated visually)
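For reference, a rough sketch of what a chat-mode entry could look like after these changes; the llama-cli flags exist in llama.cpp's CLI, but the repo placeholder, prompt text and exact formatting are illustrative and not the final code of this PR:

const llamaCppBrewSnippet: LocalAppSnippet = {
	title: "Install from brew",
	setup: "brew install llama.cpp",
	content: [
		"llama-cli",
		'--hf-repo "<user>/<model>"', // in the real snippet this would come from ModelData
		"--hf-file {{GGUF_FILE}}",
		'-p "You are a helpful assistant"',
		"--conversation",
	].join(" \\\n  "),
};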