Topic: Article on AI
Interesting article about what's really happening/not happening behind the scenes in AI implementations.
https://www.techpolicy.press/the-black- … hatgpt.com
Sarah Connor was the prepper.
But she taught him everything.
And he had a Terminator as a pet. LOL
I just started reading the article. Thanks, Dirk. It's interesting. AI itself doesn't scare me, though this article might change some of that. What scares me is who is using the AI, and whether they have a system that is far superior to everyone else's. AI is a tool, and tools can be used to conquer and control. As the article notes, AI engines run on statistics, not ethics. But what if very unethical people construct them? We can only hope for neutral programmers.
Keep on reading!
AIs are tools. And they're only as "ethical" as they're programmed to be.
I wrote a story recently that involved a serial killer making their victims look like suicides.
Trying to research that one was a pain in the rear, with the constant "Are you OK?" messages.
For another one, I had to write an Alternate History and chose to use real historical figures in alternative roles.
So I kept getting told "That's not what happened," and had one program evaluate my characters as they really were, not as I wrote them.
I ended up having to include a line at the beginning telling the darn thing that the story was Alternate History and that I was well aware this wasn't what actually happened.
Even then, it still judged the characters largely on RL, not AH.
That's not Artificial Intelligence. Intelligence would have a model of reality, an understanding of the world, and a theory of mind. This is very sophisticated pattern matching that gets away with not modeling the world, because it only has to compare against what it was trained on.
Confounding the two is, sadly, natural stupidity.
It would be interesting to feed it a story that opens "Once upon a time, when the world was young, there was a Martian named Smith. Valentine Michael Smith was as real as taxes, but he was a race of one."
One of the competitions I'm doing has an active message board system, where things get discussed.
One of the hottest (by which I mean most divisive) topics is about AI and the potential use in writing competitions.
I've noticed that there seem to be two common stances that one particular side of the debate takes:
1: AI/LLMs can never be as creative as a human, so the stories will never be as good.
2: Anyone who uses AI/LLMs should be DQ'd and banned for trying to have an unfair advantage.
(I usually prefer the term LLMs, because it's more accurate. Sometimes I fall into using AI because everyone else does, but accuracy is important.)
If LLMs aren't creative—and let's be honest, they have nothing which really approaches an imagination, so they can't truly be creative—then there should be no threat to people who are creative.
If LLMs are a threat to people who are creative, then either you're admitting that LLMs can be more creative than people, or you're admitting that the advantage they give isn't really about creativity.
Both statements can be true.
Back in the old days, before human chess masters started losing to "thinking machines," I could beat a chess program set to a certain level of forethought or below. Once I set it above that point, I had trouble, until either I learned what I needed to beat it, or I didn't and got frustrated and gave up.
Some people couldn't adapt. Some people could. And some people surpassed.
I don't think that LLMs will ever be more creative than most humans.
But they'll be more efficient.
And therein lies the issue.
Take any of the images I use on here as a stand-in for a cover.
For someone human to create any one of those pictures, it would take time and effort and work.
And they would need to be fairly compensated for that or give it up of their own free will.
And I'd have to choose between asking someone to give up their free time and paying someone for the work, when, at this point in time, my projects are already a sink of my resources. I spend money trying to develop my skills. I can't afford to pay anyone for that sort of effort yet. So, rather than settle for a silly-looking plain red cover, I took roughly five minutes with an LLM-powered graphic design program and made one for myself.
Because if I did it sans the LLM stuff? It would look even worse.
Writing's the same. I've read "AI-created" stories that I know for a fact were AI-created, because I explicitly asked an LLM to create them. They are inconsistent, lose track of things, and have more holes in their logic than you'd imagine.
But there are people who write even worse.
So for something they need real quick, something that doesn't have to be very creative?
They'll use LLMs.
I'm not going to fret or worry about whether a computer can do something better than me.
I already know it can. I can't draw. And I'm mediocre at chess.
It is what it is.