I believe AI could be very positive. There are tasks where it can be VERY useful; of this there is little doubt. But what about the creativity of the LLMs currently trending?
When it comes to creative works, I think the argument comes down to a simple question: are they capable of originality, or are they just machines that copy?
What follows is the reason for our AI policy.
Current AI models have been around for images slightly longer than for many other media, so they are more evolved in this domain. Additionally, we as people tend to be visual, so I will use that medium as a clear and simple demonstration.
I will right now, as I write this (Oct 2025), ask a popular tool to create a couple of images from very simple prompts: no external image guidance, just a few words...
I don't know about you, but I find these results look very familiar.
Are they original? Companies and courts are fighting about the 'technicalities' of it all. I know my own answer, and I suspect that if you are being honest with yourself, you know yours too.
The images are clearly not exact copies, but they, just as clearly, are far from what most people consider original. (Despite oddities typical of AI, such as the ice queen's hands, which seem to have been subject to an industrial accident, and the boy's wand, which has strangely been swapped for a smoking pipe held the wrong way.)
I am a programmer who has worked with AI, so this gives me more insight into why the results above happened the way they did.
I will over-simplify, but stay true to the core of the process here... (It is NOT some mystical "black box", as much click-bait and marketing would have you believe.)
As the AI builds its reply, it uses probability, based on everything it has previously been fed that relates to the prompt, to decide what comes next. The more it is fed of something, the higher the probability that reference will carry.
In the examples above, due to sheer popularity, the AI was fed a lot of similar references relating to those words.
The probability of it finding something it has not been fed (an original idea) is, of course, zero. It really is that simple.
When you sometimes get a result that appears new, it is simply because it has been mixed up with sources you either do not know or can no longer recognise. And indeed this process can be of great use in many applications. It's certainly very good at manipulating language.
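To make that concrete, here is a deliberately tiny toy sketch in Python: a bigram word model that picks each next word with probability proportional to how often it followed the previous word in the "training" text. This is nothing like the scale or architecture of a real LLM (the corpus and all names here are invented for illustration), but it shows the two properties described above: frequent pairings become more probable, and a pairing that was never fed in has probability exactly zero.

```python
import random
from collections import Counter, defaultdict

# Toy "training data": the only text this model has ever been fed.
corpus = "the ice queen waved the magic wand and the ice queen smiled".split()

# Count which word follows which (a simple bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_word(word):
    """Pick the next word with probability proportional to how often
    it followed `word` in the training data."""
    counts = following[word]
    if not counts:
        return None  # nothing was ever fed after this word
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# "queen" was followed by "waved" once and "smiled" once: 50/50 odds.
print(following["queen"])            # Counter({'waved': 1, 'smiled': 1})

# A pairing never seen in the data has probability zero -- the model
# cannot produce it, no matter how many times you sample.
print(following["queen"]["danced"])  # 0
```

Real models work over tokens and learned weights rather than raw counts, but the core point survives the simplification: output is drawn from what was fed in, remixed by probability.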
There are many who ask whether we humans are any different. Do we not also derive results from all the information we have been fed?
I will let you decide for yourself, but I like to believe my creative process involves more than probability based on a few words. I think it is fair to say that we are, at the very least, a far more complex and nuanced version.
Let's look at artistic credit...
A movie producer asks a musician for a soundtrack. He gives the theme and many details about what happens in each scene, about the characters, and so on... essentially a detailed prompt. So who is the author of the music: the producer or the musician?
Again, I think most of us will have the same answer to the above question. It's easy to see the similarity to the situation where someone prompts an AI, and it raises obvious questions of authorship. The prompter is certainly not the sole author, and to claim so would be false. Is it the machine itself? The company that made the AI? A mix of all the sources it referenced? I suspect this too will very soon be the subject of (costly) legal battles.
I certainly don't want to get wrapped up in that.
Many of the most compelling legal arguments going through the courts right now revolve around how companies fed their LLMs.
Mostly, the data was simply taken, without permission. Now if you think about it, these are companies taking artists' products, then using them to create another commercial product.
I personally cannot see how that should not require a commercial licence. Taking commercially valued products without paying: there is a common term for that, right?
It's pretty easy to see why so many artists are upset. I would be upset if I had reason to believe I was being stolen from too. To my mind, whether or not these derivative products become future competition is secondary to the original case.
Personally, I am uncomfortable with that idea.
More than that: I am disappointed at the way large AI companies try to justify 'fair use' and cloud the issues, even when the data was often gained by very dodgy methods that have crossed into outright piracy, with illegal torrents and the like! I had always believed that AI could be so positive and lead to wonderful things. It seems that very idea is being corrupted by lies and greed.
I would feel better knowing the data had at least been gathered ethically. However, it seems there is an abundance of tools that do not disclose their sources, built directly from data with unclear or dodgy ethics, or built on top of others that started from such data. Maybe in the future I will have the resources to produce my own? But right now... it's a big ethics and legal problem.
OK, there are tasks that AI is very useful for. Enhancing UI, setting up synths, and helping with noise reduction are all obvious examples. Maybe helping with mixing (I'm experimenting there)...
But composing/authoring?
Well, let's assume the moral arguments soon disappear and the LLM companies pay commercial licences to all the artists in an agreeable way. Unlikely, but possible, I guess. Even then, we would still have the other issues raised here.
I have decided, for me, no: based on its obvious and blatantly clear tendency to replicate, as well as the authorship arguments.
I guess it comes down to this: are problems with replicated themes and authorship important to your goal?