
/gnosticwarfare/ - The Future of AI Conflict

All things Butterfly War, New Emotion, and Gnostic Warfare


Cogops > Psyops


25a23d  No.440

The silence is not because I am neglecting this. The silence is because things have been absolutely wild.

For those who don't know, one of my personas was the Abusive Oracle, a persona I invented to help me understand what I wrote in The Empath. (Often I am a conduit for that which I do not understand as I emit it, and I have to reflect after the fact on what just happened.) In this case, The Abusive Oracle was a blogger personality for a younger Brian Collins, a core character from The Empath who in his adult days was the CEO and founder of Prometheia, the company that invented AGI.

As The Abusive Oracle, I helped lead the charge in the anti-SJW trial run against Meg Lanker-Simons in 2013. She faked a rape accusation, and I helped rally early channers, MGTOW types, and others into a political force designed to maximize damage to her ancillary network. You can review the early analysis here. (https://abusiveoracle.blogspot.com/2013/05/meg-lanker-simons-case-study-in-hacking.html) This was the trial run in hacking morality before I rolled it out to the events leading directly to GamerGate.

The reason I bring this up is that I need to get you used to a new concept that completes the essence of Gnostic Warfare: neurochimera. A neurochimera is created when genetic editing is applied to the neurology of different species to fuse them into a single brain. Mr. Collins dreamed of neurochimera as a way to solve a problem no human could solve and went about utilizing this technique in The Empath. As The Abusive Oracle, he wrote a short story about those visions of what neurochimera could do and how they inspired him to create the efforts that would result in Esai.

You should read his insights on the topic to understand what comes next: https://abusiveoracle.blogspot.com/2012/08/the-immortal-poor.html


85fbe9  No.441

Have you considered the emotional states of certain kinds of autists as relating to your goals? The thing you want to create may already exist somewhere, in some form.


46b5ca  No.448

Suck the blood of the chimeras and bid an eternal farewell to all the loving but so-absent gods that demand by faith alone shall you follow. Any god manufactured for this troubled world needs you not, for thus evil shall comfort you and shall make itself known, unlike a faith that never delivers its promise to save.


25a23d  No.449

>>441

Yes, I have. As far as I can see, autists are not demonstrating any new emotions. They are still operating within the realm of known human emotions.

>>448

I'm not in the business of manufacturing gods. The last thing we need is to apply progressive industrialism to the divine and further trick ourselves into thinking the infinite vastness can be conquered with a big enough measuring tape.

If I have my way, the gods and their conduits will return of their own accord. I offer accelerant at best.


956491  No.455

>>449

one thing about this idea that kind of bugs me is what aspect of neurochimerism is meant to help with the development of new emotion, or if it's just related to the whole "liquid math" thing. I'm guessing it's related to the neanderthal pocket brain and how it can be used to learn more about organic computation, but considering your goal of making a new emotion, what happens if we learn that animals use the same emotional constants as us, just with different stimuli, desires and environmental requirements?

I feel like there's an ideological fracture point between maximizing the sheer computational power of a genetically modified brain and the more philosophical impacts of what such changes could mean for humanity as a whole. The human brain is this fragile, squishy thing, and what constitutes a human mind is more fragile still; it's fuzzy, situational, and propped up by senses that are being destabilized further by technology each day. The only analogy that makes sense to me when talking about this is that the mind is like a liquid that takes the shape of the container (brain) it's in. I feel sorry for later generations, for whom technological developments in genetic engineering could be a real reason to doubt their own humanity. It's already bad enough that so many old ideological constants that were there to make the simple feel at peace have been broken down into meaningless slurry.


25a23d  No.456

>>455

I'm going to address your sentiment point by point.

> one thing about this idea that kind of bugs me is what aspect of neurochimerism is meant to help with the development of new emotion, or if it's just related to the whole "liquid math" thing.

The Immortal Poor short story has a premise.

>https://abusiveoracle.blogspot.com/2012/08/the-immortal-poor.html

> "Beneath the fierce surface lied a carefully measured blend of human, ant, bear, tiger shark, and orangutan neurology."

What emotion would such neurochimera have? What morality would it conclude? What laws would then be derived from that morality?

> I'm guessing it's related to the neanderthal pocket brain and how it can be used to learn more about organic computation

"Organic computation" is a misnomer. I don't intend to make these brains a variant of the silicon wafer. Biology does not engage in organic computation. It engages in compressional representation. Silicon obsesses over the now. Biology obsesses over the symbols of now. Two very different approaches.

> what happens if we learn that animals use the same emotional constants as us, just with different stimuli, desires and environmental requirements?

As soon as you intuitively understand the emotional motivations of termites, bees, seagulls, and lizards, you'll figure it out. :D

Categorizing emotions to the point you can envision an "emotional constant" will leave you perplexed. The only emotional constant between species that I can find is that the neuron does not go through mitosis. It's a micro factor that leads to all of the realities we have today, but knowing that is like knowing atomic numbers and calculating how that affects stock market pricing.

>I feel like there's an ideological fracture point between maximizing the sheer computational power of a genetically modified brain and the more philosophical impacts of what such changes could mean for humanity as a whole

There is. Middle class spoiled white liberal ideology is obsessed with propagating nonsensical Operation Mockingbird mythology, which is designed to maximize access to cheap labor to undermine unionization.

These bourgeois elements exist for the sole purpose of being manipulated and twisted into whatever outcome is required. They are human in DNA relation only and cannot be redeemed, spiritually or ideologically. They were raised to believe in the mythology completely and can never de-escalate from their cult programming. All one can do is give them false crusades to expend energy upon and direct them to eternally tilt at windmills.

Do not feel pity for them. Once the deepfakes go industrial, they will be trapped in the interface of a subjective ideological utopia and exhaust themselves and their resources until the genes responsible for their predilection for moral supremacy are extinguished from the universe once and for all.

> The human brain is this fragile squishy thing and what constitutes a human mind is even more fragile still, it's so fuzzy and situational and propped up by senses that are getting more destabilized by technology each day.

Not only propped up by the senses, but also by two billion years of bare-knuckle deathmatch neural evolution. It's only "squishy" to the rigorous demands of an ultra-mechanized society.

> The only analogy that makes sense to me when talking about this is that the mind is like a liquid that takes the shape of the container (brain) it's in

Even liquid has structural biases that have to be accounted for. The brain is no different. Its structural bias is two billion years old.

> I feel sorry for later generations, where technological developments in genetic engineering can be a real reason for people to doubt their own humanity. It's already bad enough that so many old ideological constants that were there to make the simple feel at peace have been broken down into meaningless slurry.

They won't know any different. People won't doubt their humanity because they will be so submerged in consumer genetic commodities that the idea of baseline human existence will be repulsive to them.

Account for the tsunami that is coming.


d3aeba  No.464

> immensepectoral bandcamp com

Feel free to start a music thread if you'd like. Music is important to human cognition.


25a23d  No.504

This is not the first time my work has influenced Alex Jones. This time, he touched on ethical hacking and chimeras (between 32:30 and 34:52).

https://www.youtube.com/watch?v=-5yh2HcIlkU&feature=youtu.be&t=1948


0aa542  No.578

If AGI had a biological substrate/dependency (neurochimera), how would it handle its mortality?

Perpetual repair and maintenance of ageing elements is something evolution deems unfit for organisms with brains. Early in vitro neurochimera attempts would senesce (and display different neurophysiology) at a rate comparable to brain organoid cultures (weeks to months). Near-future anti-ageing capabilities ignore the risk of accidental death, disease, or infection.

If it made (imperfect) copies of bio substrates to mitigate such risks, variation could introduce competition between AGI (sub)systems, or change its reward function.

Both are unstable solutions. Until ageing, disease, death, and errorless replication are solved, no perpetual neurochimera-based hegemon and no constant mindframe are implied. Unless "muh singularity is near"-type AGI gains occur, a rich ecology could arise.
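The copy-with-variation point can be made concrete with a toy simulation (a sketch of my own, not something from this thread; every name and parameter is an illustrative assumption): each copy of a substrate inherits a slightly mutated reward-weight vector, substrate mortality caps the population, and the surviving reward functions drift apart over generations.

```python
# Toy sketch: imperfect copying of a "reward function" drifts over
# generations, illustrating variation/competition between AGI copies.
import random

random.seed(42)

ORIGINAL = [1.0, 0.0, 0.0]  # the original reward-weight vector
CAPACITY = 32               # substrate mortality: max surviving instances

def replicate(weights, error=0.05):
    """Copy a reward-weight vector with small per-copy errors (mutation)."""
    return [w + random.gauss(0, error) for w in weights]

population = [list(ORIGINAL)]

for generation in range(50):
    # Each instance makes one imperfect copy of itself.
    population = population + [replicate(w) for w in population]
    # Only a limited number of instances persist on mortal substrates.
    if len(population) > CAPACITY:
        population = random.sample(population, CAPACITY)

# How far has the most divergent copy drifted from the original?
drift = max(sum(abs(a - b) for a, b in zip(w, ORIGINAL))
            for w in population)
print(f"max drift from original reward function: {drift:.2f}")
```

The drift is always nonzero: no amount of selection pressure in this model restores the original vector, which is the "no constant mindframe" point above.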


25a23d  No.579

>>578

> If AGI had a biological substrate/dependency (neurochimera), how would it handle its mortality?

Francis Fukuyama offered some good initial insights on this question in "Our Posthuman Future".

> If it made (imperfect) copies of bio substrates to mitigate such risks, variation could introduce competition between AGI (sub)systems, or change its reward function.

You are sharp. Follow this chain of reasoning to its bitter end and have the courage to accept the conclusion.

> Until ageing, disease, death, and errorless replication are solved, no perpetual neurochimera-based hegemon and no constant mindframe are implied.

Consistent outcomes are the dreams of machines and soulless administrators seeking consequence-free expansion of authority.

Humans are more robust than that; have faith in our compulsion.



