2023 will undoubtedly be a year of change in SEO – they all are. For almost a quarter of a century, Google has kept search professionals on their toes as it sought to first attain and then maintain its seemingly unassailable grip on the world of search.
Nevertheless, as we do each year, we’ve put together some predictions from our experts on how they see this change happening, for what purpose and how that could impact brands operating online.
My predictions

The experts have pretty much covered the bases here – 'nought may endure but mutability', as the other Shelley once said, and that's especially true in search. However, Google's recommendations often run ahead of its actual capabilities, so I'll have to throw my lot in with David this year. I think we're still a couple of years off Google being able to properly differentiate between what is and what is not 'helpful content', but that doesn't mean we won't see it take baby steps in that direction.
You think ChatGPT is good? GPT-3, the language model on which it's based, has 175 billion parameters – GPT-4 is rumoured to have around 100 trillion, orders of magnitude more than its predecessor, and it will likely arrive at some point in 2023 (though whether we'll see it introduced to the chat model is more uncertain). However, it's still going to be a plagiarism machine. Soft plagiarism, yes, but plagiarism nevertheless.
While members of the online marketing and SEO community will lose their minds over how quickly they can spam out the content for a billion-page affiliate website, the longer-term gains will be made by brands that focus on building expert entities. Media outlets have routinely suggested that Sundar Pichai, and Google more broadly, are worried by ChatGPT, but in reality they are far more subdued. Jeff Dean, Google's AI lead (and a good follow on Twitter), for example, said the following in response to a question at an all-hands meeting at the end of last year:
“We are absolutely looking to get these things out into real products and into things that are more prominently featuring the language model rather than under the covers, which is where we’ve been using them to date […] but, it’s super important we get this right.
“You can imagine for search-like applications, the factuality issues are really important and for other applications, bias and toxicity and safety issues are also paramount.”
LaMDA, for example, was good enough to ‘convince’ one Googler that it was sentient. I can’t imagine ChatGPT has managed that particular feat yet. While I can’t see any need for Google to rush, the chances are we’ll see some kind of early access allowed for LaMDA in 2023 – but most of the search giant’s efforts will go into ways to establish credentials for E-E-A-T.