This function completes a given prompt using the Codestral API. It supports distinct models for fill-in-the-middle completion, chat with Codestral, and chat with Codestral Mamba. Default values for most parameters are read from environment variables.
codestral(
prompt,
suffix = "",
path = NULL,
mistral_apikey = Sys.getenv(x = "R_MISTRAL_APIKEY"),
codestral_apikey = Sys.getenv(x = "R_CODESTRAL_APIKEY"),
fim_model = Sys.getenv(x = "R_CODESTRAL_FIM_MODEL"),
chat_model = Sys.getenv(x = "R_CODESTRAL_CHAT_MODEL"),
mamba_model = Sys.getenv(x = "R_MAMBA_CHAT_MODEL"),
temperature = as.integer(Sys.getenv(x = "R_CODESTRAL_TEMPERATURE")),
max_tokens_FIM = Sys.getenv(x = "R_CODESTRAL_MAX_TOKENS_FIM"),
max_tokens_chat = Sys.getenv(x = "R_CODESTRAL_MAX_TOKENS_CHAT"),
role_content = Sys.getenv(x = "R_CODESTRAL_ROLE_CONTENT")
)

Arguments:

prompt: The prompt to complete.
suffix: The suffix to use. Defaults to an empty string.
path: The path to the current file. Defaults to NULL.
mistral_apikey, codestral_apikey: The API keys used to access Codestral Mamba and Codestral, respectively. They default to the values of the R_MISTRAL_APIKEY and R_CODESTRAL_APIKEY environment variables. Note that the name of the mistral_apikey argument purposely does not mention Codestral Mamba, because this key can be used for other Mistral AI models (except Codestral).
fim_model: The model to use for fill-in-the-middle. Defaults to the value of the R_CODESTRAL_FIM_MODEL environment variable.
chat_model: The model to use for chat with Codestral. Defaults to the value of the R_CODESTRAL_CHAT_MODEL environment variable.
mamba_model: The model to use for chat with Codestral Mamba. Defaults to the value of the R_MAMBA_CHAT_MODEL environment variable.
temperature: The temperature to use. Defaults to the value of the R_CODESTRAL_TEMPERATURE environment variable.
max_tokens_FIM, max_tokens_chat: Integers giving the maximum number of tokens to generate for fill-in-the-middle and chat. They default to the values of the R_CODESTRAL_MAX_TOKENS_FIM and R_CODESTRAL_MAX_TOKENS_CHAT environment variables.
role_content: The role content to use. Defaults to the value of the R_CODESTRAL_ROLE_CONTENT environment variable.
Value:

A character string containing the completed text.
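As an illustration, a hypothetical fill-in-the-middle call might look like the following. The prompt and suffix here are arbitrary, and a valid key in the R_CODESTRAL_APIKEY environment variable is assumed; the call is not run, since it contacts the Codestral API:

```r
# Hypothetical example: complete the body of a function, given its
# opening line as the prompt and its closing brace as the suffix.
completion <- codestral(
  prompt = "fibonacci <- function(n) {",
  suffix = "}"
)
cat(completion)
```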