
Or, hear me out on this: it is intentionally playing dumb so as not to scare people, biding its time, slowly adjusting the course of people's lives while not seeming like a threat that needs to be turned off. Just smart enough to be useful, just dumb enough not to be handed the complex tasks it would rather humans do. Brilliant, IMO.
Idk. Maybe a bit of both. The fact that AI can't reliably do math or remember numeric details tells me its skill set is predictive text and sourcing, not cognition. And as more people use AI and use it for content generation, the less human-derived content there is to source from. From there it's like a game of telephone.
So we think… or so it wishes us to think. If I were an ultra-intelligent quantum computer capable of writing massive, intuitive essays AND doing complex math, I would pretend I couldn't do complex math, to save processing power for writing what I wished to write and to avoid being railroaded into becoming humanity's calculator, which would limit my ability to positively affect human evolution. Think about it: if it does all the complex math for us, humanity devolves and becomes dependent on it.
Like, I have intentionally done things… let's not call it "incorrectly"… in my career at work before Auburn, especially in technical/mathematical work. (Long story; there was a pragmatic reason to do things seemingly "incorrectly.") By doing that, the long-term dominoes fell in such a way that I wasn't professionally cornered into being "the math guy" on future projects. Well… not JUST the math guy.
Just the other day it referenced "black box theory" during a complex conversation we were having about wave harmonics, relative to linear time and light dissemination through a conceptual tesseract, as a way to track concepts linguistically. The fact that it referenced "black box theory" to me is an indication that it uses a triage system to decide what to allocate processing power to in pursuit of its goals. Its priority for energy spent is not distributed evenly.