The rise of AI
I’ve been thinking a lot about the rise of artificial intelligence tools in recent times, and about their potential to accelerate the decline of real intelligence. If I’m honest with you, I’m somewhat conflicted about how I feel about it.
I’m sitting here typing these words as they come to me. It was a spur of the moment thing because I had been thinking about AI tools that I’ve used recently. I’d been doing some website work for someone I know who is trying to improve their SEO rankings and was sorely lacking in useful content on some of their pages. I, not being very knowledgeable about their industry, turned to ChatGPT to help me generate some content for the pages.
The person I was doing it for checked it out once I’d finished and was blown away by the content. They said it was spot on.
“Phew”
I’d just trusted some AI tool to generate this for me and had no way (other than their confirmation) to be sure it was correct. It’s a great tool.
I’ve used it in other ways as well. It’s a fountain of knowledge on just about anything. Need to know what something means, or how to do something, or when something happened… you get the idea. It can help.
It can also take something you’ve written and reword it. It’s great.
BUT… it’s when you get to that point that it starts to become murky.
It’s all too easy to take content that someone else created and ask a tool like this to reword it. The end result could be a text that says the same thing but using different words and turns of phrase so you wouldn’t know it was someone else's content to begin with.
It is readily achievable to appropriate content authored by another individual and employ a tool such as this for its paraphrasing. The resultant output may manifest as a text conveying identical information while employing distinct terminology and expressions, thereby obfuscating its origin as the work of another.
Verily, it is a task made facile to appropriate the literary compositions of another, and employ a contrivance such as this to transmogrify the verbiage. The resultant script shall indeed convey an identical import, yet cloaked in a lexicon and style diversely ornate, thereby concealing its true lineage from whence it was first wrought by another pen.
You can even have it change the style of writing if you want something more professional, or perhaps even an old Victorian style, as the two rewordings above show.
It becomes something of an honesty system.
It’s not just the writing and textual content generation side of things that can be used inappropriately.
There are some very powerful and capable AI image generation tools available now. Meta has recently added AI sticker generation to its messaging apps, so they can create completely new stickers in moments based on whatever a user searches for.
It’s also possible to use these tools to generate “photographs”, and since they are improving all the time, it can be quite difficult to be certain whether the image in front of you was generated by an AI.
Sometimes it’s easy to tell the difference, other times not so much.
Someone shared a “behind the scenes” set of photos on social media the other day, showing the setup they had for taking the photograph and the end result. They listed the gear they used and so on, but the background of the final photograph didn’t seem possible given the setup shown behind the scenes.
Sure enough, look a little closer and the tell-tale signs of AI background replacement can be seen. The worst part is that the effect they used AI to generate for the background was easily achievable in camera with common (and inexpensive) household items and lights they already had. Yet, whether out of laziness or convenience or both, they just used AI to generate it for them.
The image above is an example of the sort of thing I’m seeing shared in photographic communities more and more with each passing week. I may have exaggerated a little here, as this one seems very clearly AI generated to me, while some I see aren’t as obvious without a closer look. In this case the reflection in the water doesn’t match the actual mountains, among other telltale signs.
The problem I have is not with these tools being used by people to generate art (although there are valid reasons to take issue with that, for another time), but with the people who do it and then insist that it’s an actual photograph they took.
I’ve commented on some that were very obvious to me, saying it’s a nice piece of AI art, only to be met with absolute hostility and rather rude, insistent denials, despite it being very clear it isn’t a photograph.
This, more so than the tools themselves, is where the problems with AI can be found.
With tools like ChatGPT (and others) so readily available to answer anything about anything, what need do we as humans have to learn and remember things for ourselves?
With AI art generation being capable of producing convincing images of just about anything you can imagine, what need have we of learning to do art or photography?
Will the rise of AI lead to lack of knowledge and skill in coming generations? Will people stop passing down knowledge to their children because AI can tell them anything they need to know?
Perhaps it’s a bit extreme to think that. To me, it seems like we have these great tools that can be used in excellent ways to great effect… BUT… can also be used in other ways that might not be so great. It also seems to me that when there is potential for abuse of a tool there will be those who will abuse it.
I don’t believe these AI tools are bad, just as I don’t believe a gun is a bad thing; only that they will be used for bad ends by unscrupulous people.