But yes, if you ask OpenAI to predict the next set of tokens (which is how chat works), it won't be up to date with the latest information. If you're using it for embeddings, though, this is less of a problem: language itself doesn't evolve as quickly, and embeddings are all about encoding the meaning of text, which is unlikely to change much. Not to say it can't, though. For example, "transformer" before 2017 was probably not referring to the transformer architecture.
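To make the "encoding meaning" point concrete, here's a minimal sketch of how embeddings get compared. The vectors below are made up for illustration; real embeddings (e.g. from OpenAI's embeddings endpoint) have a thousand-plus dimensions, but the comparison works the same way:

```python
import math

def cosine_similarity(a, b):
    # Embeddings encode meaning as a direction in vector space;
    # cosine similarity measures how closely two directions align.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (purely illustrative values).
king = [0.9, 0.1, 0.3]
queen = [0.85, 0.15, 0.35]
banana = [0.1, 0.9, 0.2]

print(cosine_similarity(king, queen))   # near 1: similar meaning
print(cosine_similarity(king, banana))  # much lower: unrelated meaning
```

Because the model only needs to place text near other text with similar meaning, a slightly stale training cutoff matters far less than it does for chat.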