invalid_request_error
Hideaki Hayashi: It seems this happens whenever the prompt is long. My prompt of about 4000 tokens always gets this error.

JC: Checking.

JC (to Hideaki Hayashi): Fixed. gpt-4 has a context size of 8K, so you can now choose a larger max_tokens.

Hideaki Hayashi: It works now! Thank you for the quick fix!

JC (to Hideaki Hayashi): You are welcome!
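The arithmetic behind the exchange: gpt-4's 8K window has to hold the prompt and the completion together, so a roughly 4000-token prompt leaves only about 4000 tokens for max_tokens, and asking for more triggers invalid_request_error. Below is a minimal sketch of that budgeting check, assuming the tiktoken package and an 8192-token window; the helper name safe_max_tokens and the safety margin are illustrative and not JC's actual fix.

```python
import tiktoken

CONTEXT_WINDOW = 8192  # gpt-4's 8K context window, as noted by JC above


def safe_max_tokens(prompt: str, requested: int, model: str = "gpt-4",
                    margin: int = 16) -> int:
    """Clamp max_tokens so prompt + completion fits in the context window.

    The margin is a rough allowance for chat-message formatting overhead;
    exact accounting depends on how the request is assembled.
    """
    enc = tiktoken.encoding_for_model(model)
    prompt_tokens = len(enc.encode(prompt))
    available = CONTEXT_WINDOW - prompt_tokens - margin
    if available <= 0:
        raise ValueError(f"Prompt already uses {prompt_tokens} tokens; "
                         "nothing is left for the completion.")
    return min(requested, available)


# A ~4000-token prompt leaves roughly 4000 tokens for the reply, so a
# request for max_tokens=8000 would be clamped to what actually fits
# instead of being rejected with invalid_request_error.
print(safe_max_tokens("some long prompt ...", requested=8000))
```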