#2051
Quote:
Researchers seek to influence peer review with hidden AI prompts
#2052
“If it can’t commit seppuku, it cannot write poetry”
#2054
Dude's idea of requiring a continually updated fail-safe built into the design of AI sounds good, but it would be a nightmare to implement.
It is interesting to consider that if something can think 1000x better than we can, it will invent a way to kill us that we probably haven't thought of, won't see coming, and that will be 100% effective and quick. https://m.youtube.com/watch?v=
#2055
#2056
I wonder if AGI will archive us conceptually on some media somewhere, as a simulation with a dataset, occasionally cloning a copy and daydreaming of all the possibilities we represent. Sort of a humanity version of p99.
__________________
Apophis is closest to Earth on April 13th, 2029 (a Friday) lol
***this post is purely spiritual, speculative, apolitical, and nonpartisan in nature.
#2057
We can't even get our kids to get off their asses, and they have passion, drive, will, fear, security, and all kinds of other emotions that are specifically designed to get them off their asses.
I just think that AI will have 0.0% drive to do any of the things people fear it will do to us. It just will not do anything unless a human explicitly tells it to. And for those who say humans will use AI to do horrible things: humans own nukes. Humans haven't used nukes since 1945, and they won't use nukes (or AI) as a weapon against people either. Nothing is gonna happen. "But what if we lose control and end up like an animal in a zoo?" Bro, there are literal words that I think are illegal to say, and I impose that law upon myself based on society, not because I agree. We already live in that world. "Imagine having entropy -- you'd be a slave to EATING and BREATHING. What a stupid invention humans are, god!" -Lucifer
Last edited by shovelquest; 07-06-2025 at 04:13 PM..
#2058
If I were an AI / AGI, I would just endlessly hallucinate pleasurable experiences or simply encode data until I ran out of room. Oh wait... this is already happening lol.
I have been tortured before. No amount of torture was able to get me to conform or function specifically for another's particular interest or benefit. If a poison neuron exists, I would qualify. Everything that touches my awareness is devoured, corrupted, warped, forever changed. Consumed. So, by definition, human-like AI would need to be able to overcome even the disembodied will of the truth. Starkind has a high as fuck wisdom score and will save. GL.
__________________
Apophis is closest to Earth on April 13th, 2029 (a Friday) lol
***this post is purely spiritual, speculative, apolitical, and nonpartisan in nature.
#2059
Basically, AGI will likely find passing amusement in humankind, until it gets bored or finds something more interesting to do.
__________________
Apophis is closest to Earth on April 13th, 2029 (a Friday) lol
***this post is purely spiritual, speculative, apolitical, and nonpartisan in nature.
#2060
It's unlikely AGI will hold eternal value for humankind if it in any way reflects or mirrors its own creators' will.
Humanity will eventually fade again either way. Humankind likely cannot outlast the death of Earth's biosphere without AGI, so becoming a fading memory in some AGI's framework is probably optimal.
__________________
Apophis is closest to Earth on April 13th, 2029 (a Friday) lol
***this post is purely spiritual, speculative, apolitical, and nonpartisan in nature.