GPT-4-1106-preview model has wrong context size
complete
Viet To
Hi,
When trying a conversation with the new GPT-4-1106-preview model, which is supposed to have a 128k context size, I see this error:
`exceeds the 8000 tokens limit of gpt-4-1106-preview. Please switch to a model with larger context size or reduce your request tokens.`
It looks like the model can only accept a maximum of 8,000 tokens, which is incorrect.
Could you please double-check?
JC
JC
Viet To: It's fixed now, thanks for reporting!
Viet To
JC: Hi, thanks for looking into this.
However, I still experience the same issue after trying again just now. I've done a hard refresh and disabled the browser cache, but to no avail.
Would you mind double-checking again? Thanks
JC
Viet To: Ahh, my bad, I mixed it up with another issue. The context sizes of the new GPT-4 models are fixed now. Sorry for the inconvenience!