
Commit 6eed949

[Go] weave models.md (#473)

* [Go] weave models.md See #469.

1 parent a00a415 commit 6eed949

File tree

5 files changed: +426 -61 lines


docs-go/generate.sh

Lines changed: 3 additions & 2 deletions

@@ -6,6 +6,7 @@ if [[ ! -f $weave ]]; then
 go -C ../go install ./internal/cmd/weave
 fi

-$weave flows > flows.md
-
+for file in flows models; do
+	$weave $file > $file.md
+done
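The rewritten loop runs weave once per document and names each output file after its source. As a sanity check of the file naming only, here is a stand-in version that substitutes `echo` for the weave binary (hypothetical, not part of the commit):

```shell
# Stand-in for the generate.sh loop: echo replaces the weave binary.
for file in flows models; do
  echo "woven $file" > "$file.md"
done
cat flows.md models.md
```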

docs-go/models

Lines changed: 159 additions & 0 deletions
@@ -0,0 +1,159 @@
# Generating content

Firebase Genkit provides an easy interface for generating content with LLMs.

## Models

Models in Firebase Genkit are libraries and abstractions that provide access to
various Google and non-Google LLMs.

Models are fully instrumented for observability and come with tooling
integrations provided by the Genkit Developer UI -- you can try any model using
the model runner.
When working with models in Genkit, you first need to configure the model you
want to work with. Model configuration is performed by the plugin system. In
this example you are configuring the Vertex AI plugin, which provides Gemini
models.

- {Go}

%include ../go/internal/doc-snippets/models.go import

%include ../go/internal/doc-snippets/models.go init

Note: Different plugins and models use different methods of authentication. For
example, Vertex AI uses the Google Auth Library, so it can pull required
credentials using Application Default Credentials.
To use models provided by the plugin, you need a reference to the specific model
and version:

- {Go}

%include ../go/internal/doc-snippets/models.go model
## Supported models

Genkit provides model support through its plugin system. The following plugins
are officially supported:

| Plugin                    | Models                                                                   |
| ------------------------- | ------------------------------------------------------------------------ |
| [Google Generative AI][1] | Gemini Pro, Gemini Pro Vision                                            |
| [Google Vertex AI][2]     | Gemini Pro, Gemini Pro Vision, Gemini 1.5 Flash, Gemini 1.5 Pro, Imagen2 |
| [Ollama][3]               | Many local models, including Gemma, Llama 2, Mistral, and more           |

[1]: plugins/google-genai.md
[2]: plugins/vertex-ai.md
[3]: plugins/ollama.md

See the docs for each plugin for setup and usage information.

<!-- TODO: There's also a wide variety of community supported models available
you can discover by ... -->
## How to generate content

Genkit provides a simple helper function for generating content with models.

To just call the model:

- {Go}

%include ../go/internal/doc-snippets/models.go call

You can pass options along with the model call. The options that are supported
depend on the model and its API.

- {Go}

%include ../go/internal/doc-snippets/models.go options
### Streaming responses

Genkit supports chunked streaming of model responses:

- {Go}

To use chunked streaming, pass a callback function to `Generate()`:

%include ../go/internal/doc-snippets/models.go streaming
## Multimodal input

If the model supports multimodal input, you can pass image prompts:

- {Go}

%include ../go/internal/doc-snippets/models.go multimodal

<!-- TODO: gs:// wasn't working for me. HTTP? -->

The exact format of the image prompt (`https` URL, `gs` URL, `data` URI) is
model-dependent.
## Function calling (tools)

For models that support it, Genkit provides an interface for function calling
(tools).

- {Go}

%include ../go/internal/doc-snippets/models.go tools

Genkit will automatically call the tools as needed to fulfill the user prompt.
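The automatic-calling step can be pictured as a small dispatch loop: the model emits a tool request, the framework runs the matching Go function, and the result is fed back to the model. The names below (`ToolRequest`, `runTool`, the `currentTime` tool) are hypothetical stand-ins, not the Genkit API:

```go
package main

import "fmt"

// ToolRequest is a hypothetical stand-in for a model's request to call a tool.
type ToolRequest struct {
	Name  string
	Input string
}

// tools maps tool names to the Go functions that implement them.
var tools = map[string]func(string) string{
	"currentTime": func(string) string { return "12:00" },
}

// runTool dispatches a model-issued request to the registered implementation.
func runTool(req ToolRequest) (string, error) {
	fn, ok := tools[req.Name]
	if !ok {
		return "", fmt.Errorf("unknown tool %q", req.Name)
	}
	return fn(req.Input), nil
}

func main() {
	out, err := runTool(ToolRequest{Name: "currentTime"})
	if err != nil {
		panic(err)
	}
	// In the real flow, this output would be sent back to the model.
	fmt.Println("tool output fed back to model:", out)
}
```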
<!-- TODO: returnToolRequests: true` -->

<!--

### Adding retriever context

Documents from a retriever can be passed directly to `generate` to provide
grounding context:

```javascript
const docs = await companyPolicyRetriever({ query: question });

await generate({
  model: geminiPro,
  prompt: `Answer using the available context from company policy: ${question}`,
  context: docs,
});
```

The document context is automatically appended to the content of the prompt
sent to the model.

-->
### Recording message history

Genkit models support maintaining a history of the messages sent to the model
and its responses, which you can use to build interactive experiences, such as
chatbots.

- {Go}

In the first prompt of a session, the "history" is simply the user prompt:

%include ../go/internal/doc-snippets/models.go hist1

When you get a response, add it to the history:

%include ../go/internal/doc-snippets/models.go hist2

You can serialize this history and persist it in a database or session storage.
For subsequent user prompts, add them to the history before calling
`Generate()`:

%include ../go/internal/doc-snippets/models.go hist3

If the model you're using supports the system role, you can use the initial
history to set the system message:

- {Go}

%include ../go/internal/doc-snippets/models.go hist4
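The pattern the steps above describe amounts to appending role-tagged messages to a growing list across turns. A self-contained sketch, where `Message` is a hypothetical stand-in for the SDK's message type:

```go
package main

import "fmt"

// Message is a hypothetical stand-in for the SDK's message type; roles follow
// the user/model/system convention described above.
type Message struct {
	Role string
	Text string
}

func main() {
	// First turn: the history is just the user prompt.
	history := []Message{{Role: "user", Text: "Hi, what can you help me with?"}}

	// When a response arrives, append it to the history.
	history = append(history, Message{Role: "model", Text: "I can answer questions about your documents."})

	// Subsequent user prompts are appended before the next Generate call;
	// the whole slice could be serialized to a database between turns.
	history = append(history, Message{Role: "user", Text: "What is Genkit?"})

	for _, m := range history {
		fmt.Printf("%s: %s\n", m.Role, m.Text)
	}
}
```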
