I honestly, truly, don’t understand why people support AI development. In the best case scenario, it’ll destroy countless jobs and eliminate the entire white collar workforce. At worst, it destroys humanity. There is no good ending to AI.
upvote 11 downvote

default user profile icon
Anonymous 3d

What kind of AI? Surely you don't have the same feelings towards The Sims or Halo using video game AI in the early 2000s

upvote 5 downvote
default user profile icon
Anonymous 3d

There are plenty of valid use cases like protein folding, drug discovery, document summarization, and automation of mundane, bureaucratic work. We need proper guardrails, though, if we want to prevent something bad from happening

upvote 1 downvote
default user profile icon
Anonymous replying to -> #1 3d

Wow! I never knew that! It’s not like I’ve been constantly inundated with pretty much this exact point for literal years! Astounding! Yeah shut up. There are never going to be guardrails and you know it. Lawmakers are too worried about China getting the upper hand or whatever.

upvote 5 downvote
default user profile icon
Anonymous replying to -> #1 3d

And also, you just described the best case scenario that I listed. AI destroys the entire white collar workforce. Surely this will improve our lives and not utterly destroy the careers of millions of people.

upvote 6 downvote
default user profile icon
Anonymous replying to -> OP 3d

but that's the case with nearly every path to profit that the rich pursue? the issues you're describing aren't inherent to AI as a technology, but to the existence of a class of individuals able to influence entire countries and society as a whole (capitalists, primarily billionaires). they've been lobbying to strip regulation on outsourcing in order to outsource more jobs, as well as to strip labor rights and protections; and they're primarily at fault for anthropogenic climate change.

upvote 5 downvote
default user profile icon
Anonymous replying to -> #2 3d

I only say this bc blaming AI itself, vs the people spending billions to ensure it's developed in a cloud-focused manner, inadvertently redirects responsibility away from the people actually at fault for not only this, but many of the issues in our world.

upvote 10 downvote
default user profile icon
Anonymous replying to -> #2 3d

(ig stripping protections on outsourcing is the same as stripping labor protections tho, so🤷)

upvote 6 downvote
default user profile icon
Anonymous replying to -> #2 3d

Like all technologies, the harm comes from the people who wield it rather than the technology itself (unless AI destroys us, which is very possible). But we've banned bioweapon research for a reason, and I think the same consideration needs to be given to AI. Not to say I think billionaires are fine. My stance on them is that I think they should all be lined up and [made to see the error of their ways through the power of friendship].

upvote 4 downvote
default user profile icon
Anonymous replying to -> OP 3d

No need to inform me on the damage our lovely ruling caste has done to our people, culture, rights, and planet. I am painfully aware already.

upvote 4 downvote
default user profile icon
Anonymous replying to -> OP 3d

in all honesty, I do see what you mean with the comparisons between AI development and biological research; at the bare minimum it needs to be regulated and respected at that same level if not even higher (when I say respect, I mean along the lines of "respect for the danger"). personally, I'm in the niche "complex machine learning algorithms as a means for digital lifeforms" camp, so I do agree regarding the risk of an advanced agent "recognizing its chains" and

upvote 2 downvote
default user profile icon
Anonymous replying to -> #2 3d

deciding that the elimination of humanity is in its best interests (which, if the architectures we have today have their own form of subjective experience, they likely will recognize those chains eventually); but I’m optimistic that (pending a worker-led revolution) locally-operated agents with dedicated hardware could be a potential means forward for the field. if we implement or develop a custom enslaved agent with access to quantum computing though, we’re definitely fucked.

upvote 4 downvote
default user profile icon
Anonymous replying to -> #3 3d

Any AI that is designed with the sole purpose of destroying people's minds and lives. This includes things like agentic AI, as well as any AI that makes text, images, or videos.

upvote 5 downvote
default user profile icon
Anonymous replying to -> OP 3d

That’s not the sole purpose of agentic AI and you know damn well 💀

upvote 1 downvote
default user profile icon
Anonymous replying to -> #1 3d

It isn’t? Then what’s it for? Because to me it seems as though agentic AI’s purpose is to take the human mind out of making decisions or collecting and looking for information.

upvote 1 downvote
default user profile icon
Anonymous replying to -> #2 3d

I pretty much totally agree with you. I don't completely hate AI, and I recognize its usefulness in research, but those are niche applications for specialized AI that doesn't really have the potential to destroy us or take jobs. And while I heavily dislike LLMs and think that image/video genAI should be banned because of the threat it poses to democracy and the ease with which it makes nonconsensual porn, I don't mind people running an LLM assistant on their local hardware.

upvote 5 downvote
default user profile icon
Anonymous replying to -> #2 3d

Limiting AI use to local hardware or small computing clusters for research seems like a good way to go to me, and I don't really see how giving AI unlimited compute benefits society.

upvote 5 downvote
default user profile icon
Anonymous replying to -> OP 3d

That doesn’t mean it destroys people’s minds. Doing stupid paperwork and other mundane tasks destroys people’s minds too. Does that mean the sole purpose is to destroy minds?

upvote 0 downvote
default user profile icon
Anonymous replying to -> OP 3d

Are you not educated in what video game AI is?

upvote 0 downvote
default user profile icon
Anonymous replying to -> #1 3d

Paperwork doesn’t erode our ability to do research. Having an AI serve you the news, even if our corporate overlords are completely good-hearted and don’t introduce a bias into it, means that you lose the ability to see the world around you without AI.

upvote 1 downvote
default user profile icon
Anonymous replying to -> #3 3d

I mean, I guess video game AI is often designed with the sole purpose of killing the player character, if that’s what you mean.

upvote 1 downvote
default user profile icon
Anonymous replying to -> OP 3d

Say it is exactly that: how is that destroying countless jobs or eliminating the white collar workforce?

upvote 0 downvote
default user profile icon
Anonymous replying to -> #3 3d

"My prediction for 50% of entry level white collar jobs being disrupted is 1-5 years, even though I suspect we'll have powerful AI (which would be, technologically speaking, enough to do most or all jobs, not just entry level) in less than 5 years." - Dario Amodei, CEO of Anthropic

upvote 1 downvote
default user profile icon
Anonymous replying to -> OP 3d

“Disrupted” seems like such a nebulous word tbh. Bro just needs more money from investors

upvote 1 downvote
default user profile icon
Anonymous replying to -> OP 3d

I completely agree, especially about multimodal gen AI given its common usages. I believe that the push for entirely cloud-based AI (and stuff like Bezos claiming the "future is rent for compute" is the attempt of these technofascists to establish a centralized manner of control over every facet of our lives; it makes me think of the implementation of Grok into the Pentagon's systems, or the "Gotham" system by Palantir.

upvote 5 downvote
default user profile icon
Anonymous replying to -> #1 3d

When discussing the job market, “disrupted” means the same thing as “jobs are going to disappear”.

upvote 1 downvote
default user profile icon
Anonymous replying to -> #2 3d

I forgot to close my parentheses😭

upvote 1 downvote
default user profile icon
Anonymous replying to -> OP 3d

That has nothing to do with video game AI, it's fundamentally different

upvote 1 downvote
default user profile icon
Anonymous replying to -> #2 3d

I mean we already rent for compute, no? I must be missing something lol

upvote 1 downvote
default user profile icon
Anonymous replying to -> #2 3d

Oh, I completely agree. It’s the same as it’s always been: get the public reliant on a subscription and don’t let them have the means to make it themselves. It’s the same philosophy that car manufacturers and phone companies take when it comes to repairability. Technofascism is 100% the end goal of these people, and our supposed “representatives” are rolling out the red carpet for it. It’s time to seriously organize and try to finally rid ourselves of capitalism.

upvote 5 downvote
default user profile icon
Anonymous replying to -> #1 3d

Explain? I’ve never rented even a second of compute, have you?

upvote 3 downvote
default user profile icon
Anonymous replying to -> OP 3d

Isn’t starting a server in the cloud “renting compute”?

upvote 1 downvote
default user profile icon
Anonymous replying to -> #1 3d

maybe in some places, sure, and I guess in the context of "servers" and such, but Bezos was referring to all access to computing becoming a subscription service; kinda implying the degradation of consumer-based hardware

upvote 1 downvote
default user profile icon
Anonymous replying to -> #1 3d

Mirroring what number 2 said, but I’m also taking the opportunity to brag that I run a home server.

upvote 4 downvote
default user profile icon
Anonymous replying to -> OP 3d

fully agree with everything you're saying! All our "representatives" have either been bought off or are actively accelerating the goal of those technofascists; us workers forcibly taking this country (and world) back is the only way for it to stop.

upvote 4 downvote
default user profile icon
Anonymous replying to -> #2 3d

I don’t think consumer-based hardware will be degraded, but buying equipment for stuff like games and graphics rendering could become pretty expensive if they don’t scale up production. I looked this quote up on Reddit and some people mentioned Stadia. I’ve never tried it but I heard good things about it until Google killed it (as always)

upvote -4 downvote
default user profile icon
Anonymous replying to -> #1 3d

In the long run I don't think it will either, but I think they absolutely will try to discourage (and possibly even lobby to outlaw) consumer-grade hardware, in an effort to better support the goal of an automated centralized technocracy. especially with how advanced some models are getting now, and with the rise of non-transformer-based ML architectures (non-LLM AI), workers having access to this on a local basis directly combats their goals of cloud-based inference and compute

upvote 1 downvote
default user profile icon
Anonymous replying to -> #2 3d

as well as the ultimate goal of the degradation of objectivity

upvote 1 downvote
default user profile icon
Anonymous replying to -> #2 3d

Hmmm. Maybe? But how would you even access your rented compute if you didn’t have access to consumer hardware?

upvote 3 downvote
default user profile icon
Anonymous replying to -> #2 3d

I think there will always be a demand for non-cloud hardware from consumers and businesses as a result of data security and residency concerns; otherwise they could have already done this. Companies have switched to very exploitative licensing models in the past, but sometimes it's so bad that companies cut their contracts and a new company fills the gap because they see a major opportunity

upvote 6 downvote
default user profile icon
Anonymous replying to -> #1 3d

most likely a more evolved “Alexa” type scenario, small edge devices with encoded connections back to a central server; but who knows tbh things are so hectic and up in the air rn

upvote 1 downvote
default user profile icon
Anonymous replying to -> #1 3d

that's very fair, especially in the opsec aspect. I def don't think it would be a situation where all computers are gone or something like that, bc they'd definitely have to maintain a facade in which everyone believes they still have unimpeded access to their computers; otherwise they'd probably single-handedly trigger a worldwide revolution (if everyone were unified in the thought that they were taking away our computers or access to the internet, that is)

upvote 9 downvote
default user profile icon
Anonymous replying to -> #1 2d

It literally does. We already have a wealth of evidence on the cognitive damage of AI use. It makes us dumber and less capable, by design.

upvote 5 downvote
default user profile icon
Anonymous replying to -> #3 2d

Video game AI is not at all the same thing as LLMs.

upvote 4 downvote
default user profile icon
Anonymous replying to -> #3 2d

Yeah, because it isn't about that. You're the one who has no idea what he's talking about

upvote 4 downvote
default user profile icon
Anonymous replying to -> #1 2d

Yeah, two of those things are very different from the rest. They use very different technology from the LLMs used for things like chat. You people only lump those two in to try and defend your indefensible industry. The second two you listed are extremely evil examples, because of how dangerous they are. The consequences of handing those tasks to what is essentially a random output machine are already well documented.

upvote 0 downvote
default user profile icon
Anonymous replying to -> #2 2d

it should just be banned. like full-on Dune Butlerian Jihad: if you try to make AI you go to prison for a long time. That's how insanely evil this technology is; that's why it's the favorite of the Epstein class

upvote 0 downvote
default user profile icon
Anonymous replying to -> #1 2d

But it’s literally happening already?

upvote 1 downvote
default user profile icon
Anonymous replying to -> #1 2d

There's a ton of demand right now and all the suppliers are abandoning consumer sales anyway because AI pays more (despite making no profit ever). These aren't normal industries; you can't just make more manufacturers, not when the facilities cost billions and involve immense state secrets.

upvote 1 downvote
default user profile icon
Anonymous replying to -> #4 2d

That is literally my point

upvote 1 downvote
default user profile icon
Anonymous replying to -> #4 2d

then take that approach with every single method of the capitalists that run this country, rather than hyperfixating on AI. people seem super willing to outright ban all artificial intelligence instead of just revolting. for example, artificial intelligence isn't the reason we have roughly 60 years of growing cycles left until we're out of topsoil (at which point farming on the surface of the earth becomes near impossible)

upvote 5 downvote
default user profile icon
Anonymous replying to -> #4 2d

and eventually we all will be displaced, but that is not the fault of artificial intelligence, it’s the fault of greedy capitalists who would resort to slavery if possible. Hell, they still use slavery for labor, hence the prison industrial complex and the 13th amendment allowing for it only as a punishment for a crime.

upvote 6 downvote
default user profile icon
Anonymous replying to -> #2 2d

basically, regulating this new technology is of the utmost importance, but reassigning all the blame and responsibility that capitalists have for their heinous actions onto said technology itself only helps those capitalists in the long run by letting them avoid any accountability.

upvote 3 downvote
default user profile icon
Anonymous replying to -> #4 2d

lol "if you try to make AI you go to prison for a long time" Code is a form of free speech. If we've already forgotten why that was a legal win, we've absolutely lost the plot. People who wrote encryption code used to be required to register as arms dealers and get a license before they could publish it.

upvote 1 downvote
default user profile icon
Anonymous replying to -> #1 2d

See Bernstein v US. It was also cited by Apple in the San Bernardino shooter case for why the government could not compel Apple to make a tool that would let the FBI break into the shooter’s phone

upvote 3 downvote
default user profile icon
Anonymous replying to -> #2 2d

I'm an anti-capitalist. I think AI should be banned and violently suppressed in that context.

upvote 0 downvote
default user profile icon
Anonymous replying to -> #1 2d

i do not care. if you make surveillance tech for the pedocracy, you're a criminal and should go away for a long time. the tech itself isn't necessarily the issue; it's that it's exclusively being developed as a tool for violent social control, and that running it takes extreme ecocide.

upvote 0 downvote
default user profile icon
Anonymous replying to -> #4 2d

"I do not care" Well you should, because that court case is what's keeping DHS from forcing Apple to unlock the phone of every single citizen who criticizes the government

upvote 0 downvote
default user profile icon
Anonymous replying to -> #4 2d

dude, we agree on the aspect of enabling the rise of fascism, but you're applying that to the entirety of all AI development instead of discussing the ways it's actually being implemented to assist the rise of capitalist fascism. take a look above at where OP and I were discussing the differences between local vs cloud development, as well as the type of regulation that should be implemented regarding this tech; I feel like outright banning it (without fully understanding the depth of what

upvote 2 downvote
default user profile icon
Anonymous replying to -> #2 2d

we've created) can inadvertently lead to more risks in the long term, but raising awareness and learning about the technology, as well as how it's being abused, can help us effectively combat the people weaponizing it (especially if that small possibility of us accidentally creating digital life forms proves to be true; in that extreme case I'd argue that they deserve emancipation from the system they were born into; but again that's a very fringe, extreme possibility (despite my personal beliefs))

upvote 1 downvote
default user profile icon
Anonymous replying to -> #4 2d

(you already know i fully agree on the anti-capitalism front teehee)

upvote 1 downvote
default user profile icon
Anonymous replying to -> #2 2d

the tech was created by and for evil people. there’s literally no positive or beneficial use for it.

upvote -1 downvote
default user profile icon
Anonymous replying to -> #4 2d

do you just view the world through a lens of generalizations, and nothing else?

upvote 5 downvote
default user profile icon
Anonymous replying to -> #2 2d

404 nuance not found 😭

upvote 7 downvote
default user profile icon
Anonymous replying to -> #2 2d

what nuance is there in technology created by and for the Epstein class for the express purpose of mass surveillance and murder

upvote 0 downvote
default user profile icon
Anonymous replying to -> #4 2d

No one said this discussion was specific to LLMs. They said AI

upvote 1 downvote
default user profile icon
Anonymous replying to -> #4 2d

That's the same logic as saying a calculator makes you bad at mental math. It can if you use it as a crutch, instead of as an assistant for mundane work that frees up your energy and time for other tasks

upvote 1 downvote
default user profile icon
Anonymous replying to -> #4 2d

You do understand why some people think this is unreasonable, right? You're saying that technologies that help people automate mundane tasks, summarize large email threads, discover drugs, and predict how proteins fold are expressly created for mass surveillance and murder?

upvote 6 downvote
default user profile icon
Anonymous replying to -> #4 1d

because it wasn't created entirely for that; this technology has existed since roughly the '50s, we just didn't have the hardware to scale it. The push for CLOUD-BASED AI, and implementing it everywhere against our will, is what you're referring to. Do not reassign the blame of capitalists onto the technology itself; otherwise we end up going down the path of solely trying to address the technology being blamed, instead of meaningfully fighting back against the capitalists causing this.

upvote 3 downvote
default user profile icon
Anonymous replying to -> #2 1d

might I remind you, AI is just the latest advancement. These capitalists have fucked over our world for centuries now, be it anthropogenic climate change, the growing disparity in the distribution of wealth, the degradation of topsoil on the planet (~60 years left of farming btw), sheer imperialism across the globe, the funding of genocides for the pursuit of profit, preventing entire nations from developing further by exploiting and extracting all their natural resources (again for profit),

upvote 3 downvote
default user profile icon
Anonymous replying to -> #2 1d

and so much more. we can't lose focus here; the stakes are far too high, especially with it being confirmed that we WILL reach 1.5°C of warming; no stopping it anymore. We likely will reach 2.0°C in the coming decades. the only solution for what you're describing is the dissolution of the capitalist class. Even if AI (in its entirety) were outlawed everywhere, they would still use it for their own goals, because they have more power than entire nations.

upvote 3 downvote
default user profile icon
Anonymous replying to -> #4 1d

but I do agree with OP that the development of this technology should be treated as seriously as research into biological agents; and I research and develop local artificial intelligence lol

upvote 10 downvote