…fade-in scene. The fade was transitioning a father and son from walking in a market to being in a house with a mother while the father and mother did household chores. This was when I was creating the story of Noah and the flood for my Bible-in-shorts project.
I have also noticed that about 60% of AI-answered questions on search engines are anywhere from 50% to 100% incorrect in their "summarized" answers. I'm pretty sure I know a big part of why we're seeing this in AI; that's a whole other long and complicated discussion. I have done a lot of testing and confirmed it is probably what I suspect. It is not as nefarious as many would think, but it is definitely a case of overcomplicating the guardrails put in place. That results in what you experienced and what I experienced, both in the Noah story and the creation story (in that case, imagine it insisting on producing Adam with an "appendage" down to his knees, unprompted, and no matter what I tried, the appendage would inevitably appear).
Claude sounds like a creep. Don't give away your agency for faux convenience!
The imminent chaos is going to be far more difficult to unwind than most expect:
As agent-agent processes stabilize around quasi-attractor states, orgs will cut back on human involvement.
When the conditions change just enough, though, instability will be able to *rip* through the connected systems.
And troubleshooting that situation is going to be nearly impossible, let alone assignment of fiscal/moral responsibility for such failures.
They aren’t nefarious inherently — they’re generators of statistically determined philosophical bullshit.
But the bandaids & correctives, and of course the training set … those introduce bias on top of the bullshit.
Indeed.
Science fiction has been warning us about that exact scenario for decades.
I'm watching an old sci-fi series right now called Person of Interest which touches on some of these themes. Have you seen it?
I was telling you about Grok Imagine deciding I was implying CSAM via a fade of a father and son. It got cut off.