const CanopyConfig = {
  authToken: "<your-auth-token>",
  model: "gpt-3.5-turbo",
  voice: "alloy",
}
const canopy = new Canopy(CanopyConfig)

The CanopyConfig object configures the Canopy SDK. You can pass only the required parameters and rely on the defaults, or pass more parameters to customize the SDK to your needs.

Parameters

authToken
string
required
You must authenticate using an authToken. You can learn how to get an authToken here.
model
string
default:"gpt-3.5-turbo"
The LLM model you want to use. Learn more here.
voice
string
default:"alloy"
The sound of the speech the model outputs. Based on the OpenAI TTS engine. Find out more here.
initialSystemMessage
string
The first message of the conversation is a system message that outlines the bot's role and any information it may have. If you want to build agents, learn how this should be set up here.
baseAgentUrl
string
The URL of the server that is ready to receive POST requests for agents. Learn more about how to set up your server here.
playbackSpeed
string
default:"1"
The speed of the speech relative to a human standard. Learn more here.
modelMaxTokens
string
default:"300"
The maximum number of tokens your model will generate in one generation. Maximum is 1000.
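
Putting the parameters above together, a fully customized config might look like the following sketch. The initialSystemMessage text and the baseAgentUrl value are placeholders for illustration; only authToken is required, and any omitted field falls back to the default listed above.

```javascript
// A fuller CanopyConfig example using every parameter documented above.
const CanopyConfig = {
  authToken: "<your-auth-token>",   // required
  model: "gpt-3.5-turbo",           // default
  voice: "alloy",                   // default; an OpenAI TTS voice
  // Hypothetical system message — replace with your bot's role and context.
  initialSystemMessage: "You are a friendly scheduling assistant.",
  // Placeholder URL — point this at your own agent server.
  baseAgentUrl: "https://example.com/agent",
  playbackSpeed: "1",               // default; relative to human speed
  modelMaxTokens: "300",            // default; maximum is 1000
}

const canopy = new Canopy(CanopyConfig)
```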