@gp-tease

Okay, so it looks like you should avoid setting max_tokens on the new gpt-3.5-turbo-0125 model. You learn some things the hard way. Just remove it from your request; that worked for me. Having it set to max_tokens: 8192 was what made my requests fail.
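
For reference, a minimal sketch of the fix, assuming the official openai Python SDK (v1+) and an OPENAI_API_KEY in the environment. gpt-3.5-turbo-0125 caps completions at 4096 tokens, so asking for 8192 gets rejected; just leaving max_tokens out lets the model use its default limit.

```python
# Minimal sketch: chat completion request against gpt-3.5-turbo-0125
# without max_tokens. Assumes the official openai Python SDK (>= 1.0)
# and an OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo-0125",
    messages=[{"role": "user", "content": "Hello!"}],
    # max_tokens=8192,  # rejected: this model supports at most 4096 completion tokens
)

print(response.choices[0].message.content)
```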

				17/2/2024 @ 23:26