I’m just guessing, but they’re likely training or instructing it to defer to sources it finds by searching the internet. The first thing it probably does when you ask a question is search for recent news articles and other sources, so the context ends up full of “facts” that it will stick to. Other LLMs haven’t really done that by default (though I think they’re doing it more now), so they’d answer purely from their weights, which are basically the entire internet compressed down to 150 GB or whatever.
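For what it’s worth, here’s a minimal sketch of the flow I’m imagining. Every function name and the search API here are made up; it’s just to illustrate “search first, then answer from what you retrieved”:

```python
# Hypothetical sketch of a search-grounded answer flow.
# None of these calls are a real API; they stand in for whatever
# search backend and model endpoint the product actually uses.

def web_search(query: str, max_results: int = 5) -> list[dict]:
    """Placeholder for a real search backend."""
    raise NotImplementedError("stand-in for a real search API")

def call_llm(prompt: str) -> str:
    """Placeholder for the actual model call."""
    raise NotImplementedError("stand-in for a real model API")

def grounded_answer(question: str) -> str:
    # Step 1: search before generating anything.
    results = web_search(question)

    # Step 2: stuff the retrieved snippets into the context window.
    sources = "\n\n".join(
        f"[{i + 1}] {r['title']}\n{r['snippet']}"
        for i, r in enumerate(results)
    )

    # Step 3: instruct the model to defer to those sources rather than
    # answering purely from its weights.
    prompt = (
        "Answer the question using only the sources below. "
        "If they don't cover it, say so.\n\n"
        f"Sources:\n{sources}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```

The only real difference from the older default behavior is that step 1 happens before the model commits to an answer, instead of the model answering straight from its weights.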