get_max_token_limit

get_max_token_limit(model: str = 'gpt-3.5-turbo-0613') -> int

Parameters:

Name     Description
model    Type: str. Default: 'gpt-3.5-turbo-0613'
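
A minimal usage sketch. The function appears to return the maximum token limit for the named model (inferred from its name and the `int` return type), and the import path `autogen.token_count_utils` is assumed from the surrounding API reference; adjust both to match your installation.

```python
# Assumed import path; change it if this reference documents the function elsewhere.
from autogen.token_count_utils import get_max_token_limit

# With no argument, the default model 'gpt-3.5-turbo-0613' is used.
default_limit = get_max_token_limit()
print(f"gpt-3.5-turbo-0613 limit: {default_limit}")

# Pass a model name explicitly to look up a different model's limit.
gpt4_limit = get_max_token_limit("gpt-4")
print(f"gpt-4 limit: {gpt4_limit}")
```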