Is AI an Expert Content Creator?
The question of whether AI can be considered an expert content creator crops up when considering certain website topics that occupy a space reserved for experts only.
It occurred to me that our beloved search engine Google might have scored an own goal in this respect, since it is pushing the concept of artificial intelligence as being potentially the know-it-all source of information in any field.
Why might this be seen as a problem for Google?
YMYL (Your Money, Your Life)
Most people never give a second thought to how Google rates the content it allows to rank highly in its search pages.
However, for those of us who create content and are aware of the limitations placed on the ranking criteria of certain topics like health or finance, it is a big deal. That's because the YMYL aspect of Google's ranking algorithm actively demotes (or doesn't rank at all) content not written by a verifiable "expert" in the field.
This makes sense. If, for example, a lay person writes an article giving advice on certain medications, it should not be easy to find if it could cause a person harm. Similarly, if someone were to write an article providing inexpert financial advice that might cause a person to lose money, that article should not appear in the search engine's top pages.
It's quite true that you shouldn't get your financial advice from your hairdresser, you should go see a professional financial advisor!
But, there is a big "BUT" in this scenario that wasn't there before the advent of AI created content.
Who is the Expert?
You see, now that we can use AI to write complete articles for our websites (if you didn't know that, you do now), the question should be asked whether AI is really an expert in whatever field it is expected to write about.
For the record, this article is not AI written. It's all me.
I do have access to an AI content writer and I do use it on occasion to create content about topics I'm not so well versed in, to save time on the research aspect.
I have seen first hand how good that content can be. It's well written, logically presented and factual.
Gone are the days when it was common to visit a webpage only to be confronted by a page of text that was little more than a barely readable jumble of words because it had been machine generated by a content spinner. Google seems to have managed to remove most of that kind of junk content from its search engine.
However, AI content is now certainly good enough to be indexed and ranked by the search engine. Not only that, but it is so well written and factual that it can and should be considered "expert."
That's where Google's own goal has been scored.
Its YMYL algorithm should not dismiss expert content written purely by AI, especially when the piece includes a disclosure stating that it is 100% AI generated.
Why is this a big deal?
It's a very big deal, because when AI writes an article providing, for example, medical advice about a certain medical condition, that advice will be as "expert" as that provided by a doctor or, in fact, any medical professional. That's because AI will have taken its research input for the article from current medical journals, and the article will contain verifiable data.
Do you see where this is going now?
The Expert Conundrum
No longer can Google realistically discount YMYL content based on the lack of professional or expert credentials of the website author, if that author is using AI to generate their articles!
The argument that it's not "expert content" is now void when it's produced by AI.
That's because AI produces arguably the most "expert" content that can be produced (as long as the necessary expert information is available for AI to access in the first place). Which places a big question mark on the viability of Google's stance on YMYL topics.
It can no longer argue that a website publishing health content is not "expert" when the author has no medical credentials, as long as the content is being produced by AI. That's because AI, being the author, is the expert.
AI doesn't need credentials. It is what it is.
Just to prove to you how good AI is, I'm providing an example of AI written content below that you can see for yourself:
I asked AI this question: "What is the most effective treatment for rosacea?"
This is its answer:
"The most effective treatment for rosacea is a combination of topical and oral medications. Topical medications such as metronidazole, azelaic acid, and clindamycin can be used to reduce inflammation and redness while oral medications such as tetracyclines and isotretinoin can be used to reduce the number of flare-ups. In addition, lifestyle modifications such as avoiding triggers, wearing sunscreen, and avoiding harsh skin care products can help reduce the severity and frequency of flare-ups."
Note: The above paragraph was written by AI
Check out the information in medical publications if you like, just to see if it is giving truly expert advice. I believe you will find that it does.
I am not a medical expert, but I am perfectly capable of researching any topic that interests me, and had I been researching the above question, I could likely have come up with something similar to what AI has produced. I've always been irritated by the implication (from Google) that because I am not a qualified medical expert, I cannot produce factual content through research.
I accept that while my findings on a subject may be completely accurate and factual, a casual visitor to my website may not know that. I also accept that visitors need to see expert credentials they can trust before taking any advice.
By the same token, if AI is the accredited author of certain advice, it should be accepted as expert and therefore safe to take.
Is this not the (artificial intelligence) path being taken by Google and other big tech companies, after all?
Expert credentials in a field have always been the surest indicator that published advice in that field may be accepted as accurate and factual.
That "expert" safety net has been the norm for a long time. However, the advent of artificial intelligence has created a content provider that will one day take over from the experts in a number of fields where advice is given.
Online medical advice provided by AI is already in the test phase, proving that AI is fast becoming readily acceptable as a trusted medical expert. It will eventually replace doctors for answering the public's questions on their personal medical situations.
In a nutshell, I have argued that where a lay person may not be acceptable as a source of expert advice on a topic that could potentially cause, for example, physical or financial harm, artificial intelligence can and should be acceptable as an expert.
The next stage is for the search engines like Google to accept that AI produced content should also be afforded the same "expert status" as a human expert in any field, including those YMYL topics that currently exclude non-experts from the search engine results pages.
What will Google do with this revelation?
Interestingly, what will content producers make of all this?
My guess is, at least for the moment, it will be met with some resistance because it often takes us humans a while to let new concepts sink in. What does AI think of it?
I asked, "Does Google accept that content created by artificial intelligence is an expert in the given field?"
"At this time, Google does not accept content created by artificial intelligence as an expert in any given field. Google has instead chosen to focus on content created by human experts."
Note: The above paragraph was written by AI
Well, bully for Google. At this time, it seems there is still a barrier to get over. But over it we will get!
Originally posted on: Dec 31, 2022