Merged
2 changes: 1 addition & 1 deletion website/docs/_blogs/2024-06-24-AltModels-Classes/index.mdx
@@ -58,7 +58,7 @@ Here are some tips when working with these client classes:
- **Agent names** - these cloud providers do not use the `name` field on a message, so be sure to use your agent's name in its `system_message` and `description` fields, as well as instructing the LLM to 'act as' it. This is particularly important for "auto" speaker selection in group chats, as we need to guide the LLM to choose the next agent by name, so tweak `select_speaker_message_template`, `select_speaker_prompt_template`, and `select_speaker_auto_multiple_template` with more guidance.
- **Context length** - as your conversation gets longer, models need to support larger context lengths; be mindful of what the model supports, and consider using [Transform Messages](/docs/use-cases/notebooks/notebooks/agentchat_transform_messages) to manage context size.
- **Provider parameters** - providers have parameters you can set such as temperature, maximum tokens, top-k, top-p, and safety. See each client class in AutoGen's API Reference or [documentation](/docs/user-guide/models/google-gemini) for details.
- **Prompts** - prompt engineering is critical in guiding smaller LLMs to do what you need. [ConversableAgent](https://docs.ag2.ai/docs/reference/agentchat/conversable_agent), [GroupChat](https://docs.ag2.ai/docs/reference/agentchat/groupchat), [UserProxyAgent](https://docs.ag2.ai/latest/docs/api-reference/autogen/UserProxyAgent), and [AssistantAgent](https://docs.ag2.ai/latest/docs/api-reference/autogen/AssistantAgent) all have customizable prompt attributes that you can tailor. Here are some prompting tips from [Anthropic](https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview)([+Library](https://docs.anthropic.com/en/prompt-library/library)), [Mistral AI](https://docs.mistral.ai/guides/prompting_capabilities/), [Together.AI](https://docs.together.ai/docs/examples), and [Meta](https://www.llama.com/docs/how-to-guides/prompting/).
- **Prompts** - prompt engineering is critical in guiding smaller LLMs to do what you need. [ConversableAgent](https://docs.ag2.ai/docs/reference/agentchat/conversable_agent), [GroupChat](https://docs.ag2.ai/docs/reference/agentchat/groupchat), [UserProxyAgent](https://docs.ag2.ai/latest/docs/api-reference/autogen/UserProxyAgent), and [AssistantAgent](https://docs.ag2.ai/latest/docs/api-reference/autogen/AssistantAgent) all have customizable prompt attributes that you can tailor. Here are some prompting tips from [Anthropic](https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview)([+Library](https://docs.anthropic.com/en/prompt-library/library)), [Mistral AI](https://docs.mistral.ai/guides/prompting_capabilities/), [Together.AI](https://docs.together.ai/docs/chat-overview), and [Meta](https://www.llama.com/docs/how-to-guides/prompting/).
- **Help!** - reach out on the AutoGen [Discord](https://discord.gg/pAbnFJrkgZ) or [log an issue](https://github.com/ag2ai/ag2/issues) if you need help with or can help improve these client classes.
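The agent-name and provider-parameter tips above can be sketched as follows. This is a minimal illustration in plain Python, assuming the `config_list` style of AG2's `llm_config`; the model name, parameter values, and the "Researcher" persona are illustrative assumptions, not from the post:

```python
# Provider parameters (temperature, max_tokens, ...) go in the config_list
# entry alongside the model and the provider's api_type.
researcher_llm_config = {
    "config_list": [
        {
            "api_type": "anthropic",               # selects the provider client class
            "model": "claude-3-5-sonnet-20240620",  # example model, swap for yours
            "temperature": 0.5,                     # provider parameter
            "max_tokens": 1024,                     # provider parameter
        }
    ],
}

# These providers ignore the message `name` field, so bake the agent's name
# into its system_message and instruct the LLM to 'act as' that agent.
researcher_system_message = (
    "You are Researcher, an agent who finds and summarises sources. "
    "Always act as Researcher and refer to yourself by that name."
)
```

You would then pass both into your agent's constructor (e.g. `ConversableAgent(name="Researcher", system_message=researcher_system_message, llm_config=researcher_llm_config)`), keeping the name consistent across `name`, `system_message`, and `description` so "auto" speaker selection can match on it.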

Now it's time to try them out.