
/b2/ - /b/ 2.0

Only global rule and no intentional flooding - free /b/


File: 48dcffaf8671863⋯.jpg (91.43 KB, 500x356, 125:89, lain data.jpg)

 No.95570

LAIN thread

where is that "bro cringe" guy?

 No.95588

cause i like lain and i want at least someone to talk to about it


 No.95602

File: 59fd38ccb276914⋯.jpg (21 KB, 339x273, 113:91, broyoupostedcringe0.jpg)

File: 6d108950c420dc2⋯.jpg (125.69 KB, 1044x664, 261:166, broyoupostedcringe1.jpg)

File: 72ee3ebb735c4d3⋯.jpeg (59.28 KB, 640x331, 640:331, broyoupostedcringe2.jpeg)

File: 3b4cc36e0c34e01⋯.jpg (49.37 KB, 680x588, 170:147, broyoupostedcringe3.jpg)

File: b65a914fef63c59⋯.jpg (91.01 KB, 1080x723, 360:241, broyoupostedcringe4.jpg)

>>95570

Lain is peak cringe.


 No.95664

File: 00c93f041cf4987⋯.mp4 (14.71 MB, 320x240, 4:3, ENDROLL1.STR[0].mp4)


 No.95771

File: 629dad0527ffe78⋯.png (117.29 KB, 455x364, 5:4, lain bro you just posted c….png)

>BO removing my posts

>still calls this place free speech


 No.95786

>>95771

Fuck you tripfag

Op lain is gay faggotry for traps.


 No.95803

>>95771

lol what was your second post?

>>95786

i know most people who like lain are discord trannies but i'm not one of them and i want someone who isn't a leftist to talk about lain with


 No.95821

>>95803

Shut the fuck up op. Stop being a trap in denial. Go buy a dress already and sell your faggot trap ass to niggers. If you watch lain that is your future


 No.95823

File: 53362400d20c2fd⋯.jpg (62.37 KB, 1520x1080, 38:27, 1502523642080.jpg)

>>95803

what do you want to talk about lainon?


 No.95829

File: 1fcf0fedab261b1⋯.jpg (38.62 KB, 465x576, 155:192, pepe handshake.jpg)

>>95823

i don't know, i didn't think i would get this far


 No.95832

>>95823

Nothing with you tripfag


 No.95836

>>95829

Might as well talk about how great it'll be to wear panties and call yourself barbara


 No.95839

>>95836

op has anal leakage that will stain his panties


 No.95840

>>95839

Well he will when the niggers are done with him


 No.95850


 No.95854

>>95850

>being this much a trap in denial


 No.95861

>>95803

>Thinking you enjoy lain and don't want to cut your cock off


 No.95866

Op it is physically impossible to enjoy lain and not be a trap


 No.95867

>>95602

>>95771

YOU ARE GOING TO LOOSE SUBSCRIBER!


 No.95874

File: a939b9a24a9bd03⋯.png (1.65 MB, 1920x1080, 16:9, anime scream.png)

>>95861

>>95854

>>95866

I'M NOT A FUCKING TRAP


 No.95880

>>95874

>posts cuckime

>I'm not a trap

Stop living in self-denial


 No.95882

>>95874

Just accept it. You like lain, therefore you harbour a secret desire to be a woman


 No.95941

If you love lain you're a faggot


 No.95962


 No.95969

Just so you know op. I'm the one that kills all your lain threads. I don't even know what it is. I just hate the name lain


 No.96298

File: 0a4f3e634540abf⋯.jpg (442.5 KB, 2048x1204, 512:301, remember this.jpg)

>>95882

Honestly the show really did have a big tranny vibe. It's about Lain being a "special girl" who turns out to not actually be real (or whatever), who's passive and effeminate overall but still really good at computers (unlike a real woman). "She" ends up driving her actual family away and receiving positive engagement solely through fleeting and pointless voices on the internet, akin to that of a twtitter or discord hugbox. No wonder it ended up encouraging so many internet trannies and faggy obsessive types. Like you, OP.


 No.96299

File: 02e3e32dec3b1ae⋯.png (144.99 KB, 539x648, 539:648, 73321052_p5.png)

Here is a rare Lain for your folders. I haven't watched SEL but feel like I should someday.


 No.96336

File: e995f633cc87b6b⋯.jpg (337.68 KB, 1920x1080, 16:9, 1438968605815-1.jpg)

I loved lain long before I became a trap, is that really so common among fans or did lainchan brainwash me?


 No.96358

>>96336

I can only answer that question if you post pics


 No.96363

>>96336

Lainchan groomed you like a fat weeb mexican. You've been made, sucker.


 No.96410

File: 2a88c5458d01687⋯.gif (453.92 KB, 758x866, 379:433, 560495920191e0e6767aace928….gif)

File: 4c04b79050c8333⋯.jpg (47.01 KB, 998x713, 998:713, 3d303091c75533651868d5e812….jpg)

File: 2c05da65e0b3b7e⋯.png (126.58 KB, 267x400, 267:400, 1440085023619.png)

File: a57f64dcf301994⋯.jpg (120.7 KB, 800x600, 4:3, lain.jpg)

>>96358

I'm not posting myself, but here's evidence I'm a huge faggot. It's bizarre being so specifically called out.


 No.96428

>>96410

assuming you're OP it's because you keep making these lain threads, dumbass


 No.96443

File: a2bd5c42ed599fd⋯.jpg (82.05 KB, 956x719, 956:719, 6789d205b15af7789a9ab0385a….jpg)

I want to love Lain too.


 No.96447

>>96298

>yuru yuri

>cs degree

>trap

>three monitors, one of which is portrait (the stand lets them rotate freely, it's fucking great, highly recommended)

>programming books

>art of war

>love felix

>oversized black curtains to block light due to fucked up sleep schedule from being a NEET

>stash of toys under bed

>i regularly sleep chained to my bed by my collar

>most shamefully: mlp, once, long ago

The only inaccurate thing is the filth, gun (no money for one), and weeb figurines (no money).

>>96428

not OP, just saw lain


 No.96464

>>96447

I think it's meant to be an airsoft gun, but nevertheless you should consider buying a real gun so you can kill yourself. Are you really the person who'd post felix nonstop here and back on /b/?


 No.96473

>>96447

Also it seems really stupid to sleep chained to your bed if in the morning you're just going to unchain yourself, kind of defeats the purpose imo


 No.96483

>>96464

Nah, I haven't been on any imageboards for a few years after /pol/ took over my life for a while. Just came back this week.

>>96473

It's comfy and I like to imagine I'm a slave while falling asleep.


 No.96508

File: c143d9b7bdf8454⋯.png (246.51 KB, 522x524, 261:262, i-see.png)

>>96410

The answer to your previous question is that society brainwashed you and Lain helped you reach full awareness


 No.96553

>>96298

missed the sjw shit on the staircase, fuck that, ancap all the way.

>>95570

If you could throw a switch and instantly become a cute girl, and have everyone remember you as always having been that way, would you do it?


 No.96572

File: 71b3f3d4bacacd7⋯.gif (569.74 KB, 735x890, 147:178, htghffffhjtusrjhdtkjhtdydu….gif)

>>96508

doubt


 No.96577

File: 32bd82608fe0233⋯.png (556.97 KB, 1400x1400, 1:1, 2d9817c6b407cca2f55aa6975f….png)


 No.96580

File: e84dfdf6476e41c⋯.jpg (59.98 KB, 876x968, 219:242, very displeased furry.jpg)


 No.96598

File: 62f185fc2b7dd80⋯.gif (1.36 MB, 540x556, 135:139, 62f185fc2b7dd804d2b9bf2ad9….gif)

File: 00b7aa81e87403a⋯.jpg (100.34 KB, 411x600, 137:200, 1401163932025.jpg)

File: f0c6a25c8381c93⋯.gif (45.33 KB, 462x700, 33:50, f0c6a25c8381c931f58269263e….gif)

File: 5ff17bade60160f⋯.png (28.19 KB, 1725x1080, 115:72, 1.png)

File: 913a04084965c8a⋯.png (61.66 KB, 1866x1080, 311:180, 6.png)

>>96577

>>96580

genuinely mystifying


 No.96615

File: 41027fafdd20aae⋯.gif (1.85 MB, 360x204, 30:17, enough.gif)

>>96572

>Crumbposter is a crumbbum

Saw it coming tbh


 No.96646

File: aad9ba51ae06fdc⋯.jpg (818.7 KB, 900x900, 1:1, Iwakura.Lain.full.1840503.jpg)

File: f0ca9cf1c94a7a4⋯.gif (473.46 KB, 500x355, 100:71, 1447161446375.gif)

File: be4115f07db1bfd⋯.gif (72.48 KB, 500x380, 25:19, be4115f07db1bfd01d6fbdd6f8….gif)

File: 32be1f5b357327e⋯.gif (486.75 KB, 450x343, 450:343, 32be1f5b357327ee0961ffffe3….gif)

File: 3fac06f2bd799be⋯.jpg (239.02 KB, 600x800, 3:4, 1437982582344.jpg)

I haven't watched it in ages, just started rewatching it, thanks for the reminder OP :)

Sorry for derailing the thread, I was just taken aback at how accurate >>96298 was. Here's some nice art.


 No.96650

>>96598

At least you're proving my theory. The anime handled the themes of being unmoored from reality by means of technological advancement, with a wholly effeminate, passive and artificial aesthetic and philosophy. It's no wonder it appealed to and helped create people who think they can turn water into wine with a few pills and some makeup.


 No.96651

File: ecb0e6cffc810f8⋯.png (152.3 KB, 853x764, 853:764, 1445480797903.png)

>>96598

these wallpapers aren't bad. If only they were any other anime


 No.96689

File: 05920c28b5b2c17⋯.png (48.51 KB, 1727x1080, 1727:1080, 2.png)

File: f63ed8efc61b1a9⋯.png (26.22 KB, 1726x1080, 863:540, 3.png)

File: 121d1aa0f2b71d8⋯.png (31.93 KB, 2162x1080, 1081:540, 4.png)

File: 27f4369e726745c⋯.jpg (1.74 MB, 2531x3282, 2531:3282, 1406319191596.jpg)

File: 62c288f9fe7c493⋯.jpg (1.31 MB, 2184x3123, 728:1041, 1406319307690.jpg)

>>96650

I don't think I can become a woman via HRT and SRS, and I wouldn't do that anyway since it only works while I'm still young. When I age out of being a trap, I'll put some muscles on and be a guy. My dysphoria doesn't bother me as much as some, so I get the best of both worlds.

I do think I'm a whole brain emulation playing an MMO right now though, but that's Bostrom's fault not Lain's. Once this game is over, I fully intend to simulate a different body to my current character.

>>96651


 No.96739

File: 686ff5f809cf7fd⋯.jpg (509.64 KB, 700x875, 4:5, toronto maybe.jpg)

File: 4ae2f9e578613b6⋯.jpg (185.9 KB, 1500x995, 300:199, norilsk.jpg)

File: 6c904813b47c1e5⋯.gif (6.32 MB, 1920x2035, 384:407, hongkong.gif)

File: 30ff5e75e1f1b54⋯.jpg (37.26 KB, 500x500, 1:1, gambling ad.jpg)

File: 616f3ece3bf8f3f⋯.jpg (173.04 KB, 1536x1024, 3:2, ASYrBLA.jpg)

Here's some stuff you may enjoy.


 No.96746

>>96689

Oh yeah, I happened to spot you from the front page in that thread about gay porn or whatever. I'm going to once again seriously recommend you deeply reconsider your spiritual beliefs before you become a statistic. If not to save your life then at least to ease my own karma. I know little and run mostly on intuition, but I can tell it's a bad place you're heading towards and with just a little work it could be a lot better.


 No.96774

File: 2b2a276300451f0⋯.jpg (1.82 MB, 2385x3000, 159:200, 1406319341092.jpg)

This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a “posthuman” stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. A number of other consequences of this result are also discussed. - https://www.simulation-argument.com/

>>96746

I won't act on it without a great deal more consideration. I assign a pretty high likelihood to this being a simulation, but the strength of the argument if valid and sound isn't the strength of that belief. I have to multiply that by the likelihood I assign to my having judged the soundness of the argument correctly. The final likelihood is very high, but low enough to stay my hand due to the stakes.

So I wait.


 No.96775

So, op's finally decided to become a trap then?


 No.96808

File: 0b7fb796e3ca0cc⋯.pdf (234.44 KB, simulation.pdf)

Here's the argument which establishes the disjunction a) we'll never create more WBEs who believe they're embodied than those who are embodied OR b) we're more likely than not to be a WBE.


 No.96816

File: 31b0ffc45da0284⋯.pdf (2.36 MB, brain-emulation-roadmap-re….pdf)

Here's another of Bostrom's papers, establishing the credibility of WBEs in our future.


 No.96821

>>96816

I clicked this file now I cant stop thinking about killing myself. What should I do?


 No.96823

File: 595dd01c2000957⋯.pdf (725.55 KB, 0704.0646.pdf)

Here's an interesting paper I don't have a strong opinion on, but it is interesting enough to be worth your time.

>>96821

wrong paper faggot, that's just placing bounds on the problem of creating a WBE to establish that they're probably in the tech tree.


 No.96840

File: 73483e792ff6c68⋯.pdf (5.03 MB, the surprising creativity ….pdf)

File: 51a38547fff9d5c⋯.pdf (343.83 KB, an evolved circuit.pdf)

These two are completely unrelated, but very fun to read. They're just some anecdotes from researchers about utility functions being satisfied in unexpected ways, and a red flag that the control problem is serious.


 No.96850

>>96443

it's very easy


 No.96886

File: 4a9ad1820b41d06⋯.pdf (2.13 MB, Tversky_Kahneman_1974.pdf)

This is also completely unrelated, but if you're interested in self improvement it's damn useful to know some of the common ways humans estimate probabilities incorrectly.

>>96808

>inb4 this crude summary is attacked

It's meant to give you a vague idea where he's going with it, not be an accurate representation of his argument. It isn't hard to show that this summary is flawed as fuck.


 No.96931

File: 1a59a5bc4e9037a⋯.png (559 KB, 2422x1500, 1211:750, 10.png)

Better summary from the actual paper:

Many works of science fiction as well as some forecasts by serious technologists and futurologists predict that enormous amounts of computing power will be available in the future. Let us suppose for a moment that these predictions are correct. One thing that later generations might do with their super-powerful computers is run detailed simulations of their forebears or of people like their forebears. Because their computers would be so powerful, they could run a great many such simulations. Suppose that these simulated people are conscious (as they would be if the simulations were sufficiently fine-grained and if a certain quite widely accepted position in the philosophy of mind is correct). Then it could be the case that the vast majority of minds like ours do not belong to the original race but rather to people simulated by the advanced descendants of an original race. It is then possible to argue that, if this were the case, we would be rational to think that we are likely among the simulated minds rather than among the original biological ones. Therefore, if we don’t think that we are currently living in a computer simulation, we are not entitled to believe that we will have descendants who will run lots of such simulations of their forebears. That is the basic idea. The rest of this paper will spell it out more carefully.


 No.96957

>make lain thread

>come back to a homosexual who wears chains pretending to be me

this is the last straw.


 No.96959

File: 578b507fea7561d⋯.pdf (3.1 MB, NickBostrom_Superintellige….pdf)

Changing topic, this book is about artificial superintelligence. I haven't checked that this PDF is an accurate copy of the book, but it's the top result on google and I only have a physical copy otherwise.

Stuart Russell (the AIMA author), Max Tegmark, Bill Gates, and Elon Musk recommend reading it. Musk's recommendation doesn't mean much since it's well outside his area, but I guess it helps demonstrate it isn't total shit.


 No.96970

File: c08ebb247e59e5a⋯.gif (37.07 KB, 188x286, 94:143, 1429983112027.gif)

>>96957

>>95874

>I'M NOT A FUCKING TRAP

>>96336

>I loved lain long before I became a trap

It's pretty clear I'm not you OP.


 No.96983


 No.96987

>>96970

bost bics of you in chains to put the matter to rest


 No.96998

Be sure to hide this thread.


 No.97051

File: 3dab93894593b52⋯.jpg (105.19 KB, 800x600, 4:3, no.jpg)

>>96987

No!

I've been doxxed once already tyvm.


 No.97086

>>96931

It's an interesting thought, but it's worth taking into consideration that this planet has an expiration date. If we're living in a simulation, then what layer of a simulation would we be in? Because there would be simulations within simulations within simulations, etc (at least theoretically this would be the case). And even if the simulations would be run on some kind of infinite supply of energy, the universe will eventually reach a point in which all matter will decay and become pure energy (or something along those lines).

If the big bang was the creation of the simulation and the simulation is more or less identical to base reality, then base reality would probably no longer be able to support ordinary matter at this point.


 No.97158

>>97086

Adding details can only lower the probability. Since my conclusions aren't conditional on the answers to those questions, making answers to them part of my statement can only reduce the probability my statement is correct. See https://www.lesswrong.com/posts/QAK43nNCTQQycAcYe/conjunction-fallacy for a nice explanation of the details.

Excerpt:

Another experiment from Tversky and Kahneman (1983) was conducted at the Second International Congress on Forecasting in July of 1982. The experimental subjects were 115 professional analysts, employed by industry, universities, or research institutes. Two different experimental groups were respectively asked to rate the probability of two different statements, each group seeing only one statement:

>A complete suspension of diplomatic relations between the USA and the Soviet Union, sometime in 1983.

>A Russian invasion of Poland, and a complete suspension of diplomatic relations between the USA and the Soviet Union, sometime in 1983.

Estimates of probability were low for both statements, but significantly lower for the first group than the second (p < .01 by Mann-Whitney). Since each experimental group only saw one statement, there is no possibility that the first group interpreted (1) to mean "suspension but no invasion".

The moral? Adding more detail or extra assumptions can make an event seem more plausible, even though the event necessarily becomes less probable.

I don't expect that they simulated 13,700,000,000 years of nothing much happening, but I won't burden my proposition with that assumption.
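
If you want the rule behind that in code form, here's a tiny Python sketch (the numbers are invented, only the inequality matters):

# Product rule: P(A and B) = P(A) * P(B|A), and since P(B|A) <= 1,
# the conjunction can never be more probable than A alone.
p_suspension = 0.05            # P(A): suspension of diplomatic relations
p_invasion_given_susp = 0.4    # P(B|A): invasion, given the suspension
p_both = p_suspension * p_invasion_given_susp
print(round(p_both, 3), p_both <= p_suspension)   # 0.02 True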


 No.97191

>>97158

PS, if you found that experiment interesting you'll definitely enjoy this paper >>96886 which goes over a number of ways humans consistently think incorrectly. If that paper isn't enough, you'll probably really enjoy Rationality: From AI to Zombies by Eliezer Yudkowsky (he wrote the linked blogpost IIRC, I didn't check).

I need to sleep, it's just gone 5am.


 No.97231

>>97158

>Adding details can only lower the probability

Ok, sure. I'm not saying you're wrong. But it's not something you can prove either way.

>adding more detail or extra assumptions can make an event seem more plausible, even though the event necessarily becomes less probable.

It's plausible. Plausible is plausible. I don't know how you could say it's more or less, given that there is no hard evidence that points in either direction.

It's interesting, but futile.


 No.97270

>>97231

Bostrom makes a compelling argument that we actually do have strong evidence per a Bayesian view, but I really do need to sleep. The argument isn't "it's possible therefore it is", it's a bunch of stuff followed by "therefore either this condition holds or it's almost certain".

I'll respond properly tomorrow.


 No.97308

>>97270

>The argument isn't "it's possible therefore it is", it's a bunch of stuff followed by "therefore either this condition holds or it's almost certain"

Yeah, I can pretty much guess where he goes with it.

Anyway, I'll check back tomorrow for your response.


 No.97640

>>97051

so why do all lainchanners end up trannys?


 No.99410

File: cf2ef3f00167c59⋯.gif (216.4 KB, 838x650, 419:325, cf2ef3f00167c59182ea15385a….gif)

>>97308

>>97231

WBE is short for whole brain emulation. Our brains run on physics, and physics can be simulated, so our brains can be simulated too; a simulated brain would either act the same under simulation (including talking about free will etc.) or the simulation is buggy. The simulation need not go down to the subatomic level, but at least down to the protein level is probably required. This is super abbreviated, but it's not the topic under discussion; WBEs are here assumed possible. See >>96816 if you want a real argument that they're possible.

Bayes' Theorem is P(H|E)/P(¬H|E) = (P(E|H)/P(E|¬H)) * (P(H)/P(¬H)). If you're not familiar with the notation, P(A|B) means the probability of A assuming B. H stands for the hypothesis, and E for the evidence. ¬ is negation, so ¬H means "H is false" and ¬E means "we don't see the evidence E". This gives the odds of H against "anything but H", conditional on some evidence E. The math modeling uncertainty is identical to the math modeling randomness. This theorem tells one how to update their beliefs in light of evidence; it's also a theorem that any deviation from it means one can be Dutch booked (made to accept a bet with negative expected value).
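
If the notation isn't clear, here's the same update as a few lines of Python, just to show the mechanics (the numbers are made up):

# Odds-form Bayes: posterior odds = likelihood ratio * prior odds.
def posterior_odds(prior_odds, p_e_given_h, p_e_given_not_h):
    return (p_e_given_h / p_e_given_not_h) * prior_odds

# e.g. prior odds of 1:4 for H, and evidence that's 3x as likely under H:
odds = posterior_odds(0.25, 0.9, 0.3)
print(round(odds, 2))               # 0.75, i.e. posterior odds of 3:4
print(round(odds / (1 + odds), 2))  # 0.43 as a probability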

Let's consider the hypothesis "this is a simulation of the 21st century" (abbreviated "sim"), and how the evidence "we appear to be living in the 21st century" (abbreviated "looksReal") should alter our belief in that hypothesis. The following is just Bayes' theorem written out but replacing H and E by those abbreviations, and is still a theorem: P(sim|looksReal)/P(¬sim|looksReal) = (P(looksReal|sim)/P(looksReal|¬sim)) * (P(sim)/P(¬sim)). Now we need to estimate P(looksReal|sim), P(looksReal|¬sim), P(sim), and P(¬sim), and we can calculate the odds of sim against ¬sim.

P(looksReal|sim) represents the probability that a simulation of the 21st century would appear to those WBEs living within it to be the 21st century. This is pretty much 1, see >>96816 for the argument it's feasible.

P(looksReal|¬sim) represents the probability that the real 21st century would appear to those meatbrains living within it to be the 21st century. This is also pretty much 1, I hope this doesn't need arguing for :)

Since those two terms are pretty much equal, Bayes' theorem simplifies to P(sim|looksReal)/P(¬sim|looksReal) = P(sim)/P(¬sim). Since the evidence that we are, by all appearances, living in the 21st century would be present in both sim and ¬sim, our observing it can't shift our belief one way or the other, and our belief in whether we're in an ancestor simulation is almost entirely determined by P(sim)/P(¬sim).

P(sim) represents the probability that this is an ancestor simulation, PRIOR to observing that it appears we're living in the 21st century.

P(¬sim) represents the probability that this is the real 21st century, PRIOR to observing that it appears we're living in the 21st century.

So what should our priors be? Let's consider some hypotheticals to gain intuition for this. If we assume that exactly one ancestor simulation of the 21st century will be run (and it contains the same number of WBEs as the 21st century did brains), then P(sim)/P(¬sim)=1, since they're equally likely. If we assume that exactly two such simulations will be run, then there are twice as many WBEs as brains (we're restricting our event space to us being either a WBE or a brain which appears to be in the 21st century for obvious reasons) and so P(sim)/P(¬sim)=2/1 (aka twice as likely). If three run, 3/1 (3x as likely). In general, if n such ancestor simulations of the 21st century are run, then P(sim)/P(¬sim)=n/1 (nx more likely). As the number of simulations becomes large, the probability we're experiencing the single time the 21st century actually happened becomes small.

Therefore: EITHER we don't run more than one ancestor simulation of the 21st century over the entire future of the universe OR we're probably in a simulation.

I expect we'll run many, and so I assign high probability to this being an ancestor simulation; but unless you expect we'll run ZERO at ANY point in the entire future, you are not consistent in assigning higher probability to this being reality than an ancestor simulation.

The probability you should assign to this being an ancestor simulation is P(the above argument is valid and sound) * (1-(1/(n+1))), where n is the expected value of the number of ancestor simulations including the 21st century which will be run over the entire future of the universe.
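
And to make that last formula concrete, a couple of lines of Python (the 0.9 for P(the argument is valid and sound) is just a placeholder, plug in your own number):

# p_sound: your credence that the argument is valid and sound.
# n: expected number of ancestor simulations of the 21st century ever run.
def p_simulation(p_sound, n):
    return p_sound * (1 - 1 / (n + 1))

for n in (0, 1, 2, 10, 1000):
    print(n, round(p_simulation(0.9, n), 3))
# 0 -> 0.0, 1 -> 0.45, 2 -> 0.6, 10 -> 0.818, 1000 -> 0.899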


 No.99425

>>99410

PS: If you want to try debunking this, you'd do better to attack the actual paper rather than my ad hoc rephrasing of it. Nick Bostrom is _far_ smarter than I am and I've probably made a bunch of mistakes he didn't.


 No.99471

>>99410

oof, should have proofread that more carefully, there's some minor errors, but nothing significant or that'll trip you up if you're familiar with probability


 No.99625

>>99425

So last night I read the short paper on Bostrom's three propositions (>>96774). As far as his claim that at least one is correct goes, his argument from probability is pretty solid and intuitive. He does a good job of bringing in numbers about the throughput of the human brain, the theoretical limits of processor speeds, and so on. On the other hand I find his discussion of future technology overly optimistic. Frankly, I lean toward propositions 1 and 2.

I find it far-fetched when he discusses planet conversions to supercomputers in posthuman societies. He doesn't really discuss the feasibility of doing this. Wouldn't it require molecular and sub-atomic manipulation to convert different compounds into circuits? What would the energy costs of doing this be? Assuming it is possible and economical, a civilization that does this would need to be space-faring. What portion of resources are dedicated to simulation instead of, say, exploration and warfare? Since this sort of simulation will only be done either as research or for entertainment, I would think the utility of converting whole planets into simulations is fairly low. So for this reason I put f_p and f_i as being very low. I doubt that these simulations are possible or valuable.


 No.99686

>>99625

WBEs are a) effectively immortal, b) embodied however they please, and c) live under whatever physics they wish due to controlling the simulated environment (e.g. magic could be added to the sim). Being a WBE is far better than being human. If I had the option, I'd be destructively scanned/simulated in a heartbeat because I don't buy the continuity problem. Not only that, but the increase in entropy per WBE is far lower than that of a meatbrain if we get anywhere near the limits, so a utilitarian doesn't have a choice given the universe can support far more WBEs than it can meatbrains before heat death. For this reason I expect a great many WBEs to exist in the future and a commensurately large economic incentive for computers to be built.

WBEs however would grow tired of near-omnipotence (unless we just alter how the brain works to avoid the hedonic treadmill), and so I expect would run such simulations so that at the end they once more can fully enjoy the benefits. For this reason (dodging the hedonic treadmill) I expect many WBEs to dupe themselves into temporarily thinking they're meatbrains many times. I expect I would do this to myself a great many times. Even if the hedonic treadmill is engineered out, I still would do so (though fewer times) for the same reason I play RPGs.

For the same reasons of efficiency, I expect we'll starlift the stars until they stop fusing. It's a wasteful increase in entropy without any real benefit (being pretty to look at is nothing against trillions of lives). The energy costs are huge, but spending energy to stop far greater reserves of resources being pointlessly destroyed is sensible.

My current intent is to live as long as I can, get cryonically preserved, and hope that the preservation is sufficient for future scanning (currently that isn't the case, but cognitive neuroscience is advancing rapidly, and once we've got a good idea of what needs preserving it's just a matter of time). If the preservation isn't sufficient, big deal, it's probably a sim anyway.


 No.99693

>>99686

Just shut the fuck up already trap faggot


 No.99767

>>99693

>>99686 is the extent of the nonstandard stuff I believe, so you won't have to put up with much more :)


 No.99774

>>99767

Fucking god damn fucking traps.


 No.99784

>>99686

Are you responding to someone else? I wasn't talking about WBEs, I was talking about the first paper about the probability of living in a simulation.

As far as your point "a)" goes, individual immortality is irrelevant to the probability involved in Bostrom's discussion. If your civilization is attacked by another post human civilization, the one which chooses to invest more resources into warfare is going to wipe out the one which simulates a large amount of ancient humans. Even though people will survive digitally, that doesn't change the fact that your resources for simulation will be dramatically reduced. I'm sure WBE tech will be pursued, but not to the extent necessary for proposition 3 to hold.


 No.99813

seriously this thread is still going? i just wanted to talk about SEL but it's all tranny fags and philosophy


 No.99819

>>99813

Op. You got what you wanted. I told you lain was for traps. You didn't listen


 No.99921


 No.100039

>>99784

Bostrom restricts himself to considering pure ancestor simulations, but any simulation in which people believe they're in the 21st century, and everyone is fully simulated, suffices. I was trying to argue that f_i is significantly large by providing reasons for creating a large number of such simulations, namely to reset the ability of WBEs to appreciate their state. These differ from the ancestor simulations he considers in that the minds already exist prior to the simulation and are merely suppressing memories for the sake of immersion.

My statements about the desirability of it are intended to argue that there is economic incentive to huge numbers of WBEs temporarily believing they're in the 21st century, so that if f_p is large we can conclude (3).

I didn't address your concerns about f_p being nearly zero, I agree that it's not certain we'll become posthuman and so consider (1) and (3) to be significant possibilities, but due to the reasons in >>99686 I don't give much credence to (2).

wrt f_p:

>Wouldn't it require molecular and sub-atomic manipulation to convert different compounds into circuits?

If we wanted extreme efficiency then yes, but the milky way is ~650ppm silicone already, and silicone is lighter than iron so we can gain energy from making it in principle. Whether our engineering will reach that point is another question, but there are theoretical ideas I am unable to judge the likelihood of (https://www.youtube.com/watch?v=pzuHxL5FD5U). I'm generally unwilling to bet against humans managing something physics permits if we have compelling reasons.

>What would the energy costs of doing this be?

Huge, but probably far less than the energy costs of supporting a similar number of biological brains.

>would need to be space faring

Space travel is a lot easier when you only need to transport computers, data, and some robots.

>What portion of resources are dedicated to simulation

Bostrom argues only a tiny fraction are needed, but I didn't check his source.

>I would think the utility of converting whole planets into simulations is fairly low

>I doubt that these simulations are […] valuable

This is what I was mainly trying to address.

>I doubt that these simulations are possible

We don't need to get anywhere near the upper limits he discusses.


 No.100066

>>100039

s/silicone/silicon

>>99921

So you need to pick one of the following:

>(1) we'll go extinct before we can run large numbers of simulations of the 21st century

>(2) we won't want to run large numbers of simulations of the 21st century when we're able

>(3) we're very likely to be in a simulation right now


 No.100583

File: efe0cb51468f05b⋯.jpg (369.67 KB, 1920x1080, 16:9, 1431462542413.jpg)

Is anyone aware of other works with similar art direction as SEL? It's so god damn pretty.


 No.101273

>>100066

i dont think any of those are right tho

reality is painfully real

call me old fashioned, but i leave it at that


 No.101306

>>101273

>i dont think any of those are right tho

Not having a strong opinion can be rational, but having a strong opinion that all are false is irrational unless Bostrom's argument is invalid or unsound. Irrational means deviating from Bayes' theorem.

>reality is painfully real

>implying your experience of a simulation would feel less real

>call me old fashioned, but i leave it at that

Doesn't matter, you'll find out soon enough.


 No.101333

>>101273

>>101306

PS: The idea you can tell if you're in a simulation or not by how real it feels is attempting to argue his argument is unsound since he assumes otherwise.


 No.101370

>>101306

>Bostrom's argument

Just more delusional, schizophrenic mumbling.


 No.101384

>>101370

try reading the paper faggot


 No.101445

>>101370

You do realize that >>96808 doesn't argue we're in a simulation right? That the author thinks it's more likely we're in base reality than not? Or are you attacking something which lives only in your imagination?


 No.101465

>>101445

No it lives in your imagination.

Imagine thinking that your ideas are real?

“I don’t believe anything, but I have many suspicions.” ~ Robert Anton Wilson


 No.101466

File: 45026cff0dcb3fd⋯.jpg (61.71 KB, 432x432, 1:1, Anonymoose_9a8b54_876839.jpg)

>>101445

Oh you humans and your strange beliefs in reality. Computer simulation what nonsense. Moose learned the secret.to the universe long ago. You humans are barking up the wrong tree. The creator of the universe and all within it was of course a moose.


 No.101488

>>99410

>I expect we'll run many, and so I assign high probability to this being an ancestor simulation

Logically, perhaps Bostrom's propositions are sound. Im not quite sure tbh. However, the feasibility is what I question – and also whether we'll feel the need to preserve the species artificially. Of course it's ingrained in us to preserve the species – just like any other animal – but I really don't think we'd be very eager to put as much resources into this endeavor as it would require. I don't think human beings will be around long enough to reach the level of open-mindedness to understand that, whether it's artificial information or organic, it really doesn't matter (it's all just information regardless). And frankly, I'd like to think we'll eventually reach the level of wisdom to understand the fact that this cycle of suffering really isn't worth preserving, much less going to such great (and strange) lengths to preserve it – but I suppose that's another matter.

Also, there's the possibility that I raised earlier about the simulation being virtually identical to base reality. And if that's the case, the universe probably wouldn't be able to support ordinary matter at this point.

The fact is, the human race will eventually perish, and I think clinging to it in such a desperate way is rather pathetic. But that's me.


 No.101492

>>101384

I skimmed it and found a Soliloquy for mystical thinking.


 No.101501

>>101492

Mysticism doesn't tend to be rooted in rigorous logic and tangible evidence. Don't get me wrong, I don't necessarily think we're living in a simulation, but it's really not that crazy when you think about it.

Not anon btw


 No.101503

>>101488

>I'd like to think we'll eventually reach the level of wisdom

Humans are to stupid for that, 99% are riddled with schizophrenia.


 No.101508

>>101501

Everything humans argue about was invented by another monkey.


 No.101517

File: a0109e0301c742f⋯.png (4.63 KB, 225x225, 1:1, images-1.png)

>>101508

That's why they way of the moose is best. No filthy primates muddling things up.


 No.101527

>>101517

I agree Moose for president.


 No.101543

>>101503

Yeah, but you'd need a lot of people to support spending the money on the resources we'd need to create a sustainable simulation. The religitards wouldn't support it for obvious reasons; the reasonable people wouldn't support it because the nature of life is predicated on suffering and it would be utterly absurd to preserve the species in this fashion; and most of the rest of the population wouldn't care because it doesn't affect them and theirs.

You're right, most people aren't particularly bright or rational. However, whether it's smart people or dumb people, I think the vast majority of people wouldn't care enough about this hypothetical program for it to take off.


 No.101562

>>101488

>Im not quite sure tbh

Very sensible, there's many places it could be wrong and sometimes it takes a while for subtle errors to be noticed.

>the feasibility is what I question

I don't think I can argue that point better than Bostrom does himself in §3.

>I really don't think we'd be very eager to put as much resources into this endeavor as it would require

I argue for the desirability of a similar effort in >>99686 (with >>100039 explaining how it differs from Bostrom's scenario).

>I don't think human beings will be around long enough

This corresponds to credence in (1), that we'll go extinct before we're able to do so. Bostrom himself puts similar credence in all three (listed here >>100066).

>this cycle of suffering really isn't worth preserving

This corresponds to credence in (2), that we won't bother. Bostrom very briefly considers the ethics and concludes that humans existing is generally seen as preferable to us not existing.

Consider that in a situation where we are simulating brains, we could just re-engineer the brain to be incapable of suffering. It might not be a good idea, but it is possible to fundamentally redesign the human condition.

>simulation being virtually identical to base reality

Bostrom thinks the simulation is only as detailed as it needs to be, and explains how that could be achieved without us noticing in page 5, but doesn't assume it is the case.

>the human race will eventually perish

Sure, it looks very much like entropy is going to fuck us all in the end no matter what we do.

>I think clinging to it in such a desperate way is rather pathetic

No more so than hospitals existing is pathetic since everyone in them is going to die anyway.


 No.101591

>>101562

Ok, the first 4 points are basically just "it's possible". Sure… stuff is possible.

>this corresponds to credence in (2), that we won't bother. Bostrom very briefly considers the ethics and concludes that humans existing is generally seen as preferable to us not existing

I don't believe most people would be open-minded enough to consider simulated humans as being equal. You underestimate the irrationality of religious people (who also happen to make up the majority of the population). But I don't think it's just religious people who wouldn't be interested in this. People want to preserve life on earth, not in some simulation.

>Consider that in a situation where we are simulating brains, we could just re-engineer the brain to be incapable of suffering

I don't see how this would be possible, given that happiness, joy, bliss, etc, is predicated on it. Suffering and happiness exist in relation to each other. Suffering and constraints are what make us human – and we want to be human.

>Bostrom thinks the simulation is only as detailed as it needs to be

Right, this is where it gets highly speculative.

>sure, it looks very much like entropy is going to fuck us all in the end no matter what we do

There's always the possibility that the universe is eternal and cyclical, and if it retains the information that makes up ordinary matter from one iteration of the universe to the next, then we'll live our lives (and perhaps different versions of them) for all eternity.

>no more so than hospitals existing is pathetic since everyone in them is going to die anyway

I do find it a bit pathetic when I see terminally ill or elderly people clinging to life in hospitals. Maybe pitiful is a better word for it. But… I suppose I'll probably be doing the same thing someday. Fucking nature.


 No.101594

>>101562

>>101591

Btw I need to run to the grocery. I'll be back in a bit.


 No.101597

>the lain thread getting this bad

dios mio


 No.101702

>>101591

>Sure… stuff is possible.

The actual arguments for his assumptions and for his conclusion are in the paper and there's no point copy/pasting them here; it isn't a jargon- or math-heavy paper, and it's accessible to people without any background. It's pretty short too for how interesting it is. (There is one page with a bit of math, but you can skim it and get the idea.)

He doesn't conclude we're in a simulation, he concludes that one or more of the following hold:

>1) we'll go extinct before we can create simulations (he argues it's possible if we don't go extinct in the paper)

>2) everyone who can, for the entire future of the species, will not do so

>3) we're probably in a simulation (all that's required is 2 in the entire future and that's a ~67% chance, hence if this isn't true then 1 or 2 have to be).

This isn't the old brain-in-a-vat/descartes' demon argument that it's possible, therefore we can't be sure it's reality. It's a new argument. Besides, if you buy that one or more of them has to hold then you buy the simulation argument, just not necessarily the simulation hypothesis (which is that 3 in particular is the case).

>I don't believe most people would be open-mineded enough to consider simulated humans as being equal

I agree, that'll be a hellish new civil rights issue.

>people […] wouldn't be interested

Given the huge populations even a single star can support, only a tiny fraction of people need to be interested. You need to argue that _nobody ever in the entire future_ would be interested (who has the ability to do so) since if it happens once then that's 50/50 odds, twice and it's 2/3 odds, etc.

>happiness, joy, bliss, etc, is predicated on [suffering]

>Suffering and happiness exist in relation to each other

In current humans sure, but perhaps not necessarily. Regardless of this, many humans currently consider life worth living, so it doesn't really matter. It's worth noting that it isn't actually even the case in current humans, there's at least one case of wire-heading having happened IRL, but it's fucking twisted and I'd rather not go that route.

>we want to be human

I don't care about being human, I care about enjoying life, but since this is optional it also doesn't matter.

>Right, this is where it gets highly speculative.

That isn't part of his argument but an aside. See >>97158 for why he doesn't make any unnecessary details like that part of his argument.

>but doesn't assume it is the case.

>There's always the possibility that the universe is eternal and cyclical

You'll probably find this interesting: https://en.wikipedia.org/wiki/Poincar%C3%A9_recurrence_theorem . _If_ it applies to whatever physics happens to be, then the universe is cyclical. I have no opinion on whether it does or not. Here's a numberphile video on the topic: https://www.youtube.com/watch?v=1GCf29FPM4k

It's also worth noting that ancestor simulations aren't the only reason to have emulated brains which think they're living in the 21st century, I'd do it for fun if I was an emulated mind, especially given WBEs are effectively immortal so 80 years in a hyper-immersive MMO doesn't mean as much.


 No.101778

YouTube embed. Click thumbnail to play.

>>101594

>>101702

I've got to sleep myself, it's gone 4am. I'll respond tomorrow. Here's a decent PBS Space Time video on the topic, but skip to 3:10, there's some promotional waffle that muddies the waters at the start.

Here's a 2 page summary written by Bostrom without any math: https://www.simulation-argument.com/computer.pdf


 No.101857

This isn't the full argument, this is a short summary taken from https://www.simulation-argument.com/computer.pdf

Do We Live in a Computer Simulation? - Nick Bostrom

Science has revealed much about the world and our position within it. Generally, the findings have been humbling. The Earth is not the centre of the universe. Our species descended from brutes. We are made of the same stuff as mud. We are moved by neurophysiological signals and subject to a variety of biological, psychological and sociological influences over which we have limited control and little understanding.

One of our remaining sources of pride is technological progress. Like the polyps that over time create coral reefs, the many generations of humans that have come before us have built up a vast technological infrastructure. Our habitat is now largely one of human making. The fact of technological progress is also in a sense humbling. It suggests that the most advanced technology we have today is extremely limited and primitive compared with what our descendants will have.

If we extrapolate these expected technological advances, and think through some of their logical implications, we arrive at another humbling conclusion: the “simulation argument”, which has caused some stir since I published it three years ago.

The formal version of the argument requires some probability theory, but the underlying idea can be grasped without mathematics. It starts with the assumption that future civilisations will have enough computing power and programming skills to be able to create what I call “ancestor simulations”. These would be detailed simulations of the simulators’ predecessors – detailed enough for the simulated minds to be conscious and have the same kinds of experiences we have. Think of an ancestor simulation as a very realistic virtual reality environment, but one where the brains inhabiting the world are themselves part of the simulation.

The simulation argument makes no assumption about how long it will take to develop this capacity. Some futurologists think it will happen within the next 50 years. But even if it takes 10 million years, it makes no difference to the argument.

Let me state what the conclusion of the argument is. The conclusion is that at least one of the following three propositions must be true:

>1) Almost all civilisations at our level of development become extinct before becoming technologically mature.

>2) The fraction of technologically mature civilisations that are interested in creating ancestor simulations is almost zero.

>3) You are almost certainly living in a computer simulation.

How do we reach this conclusion? Suppose first that the first proposition is false. Then a significant fraction of civilisations at our level of development eventually become technologically mature. Suppose, too, that the second proposition is false. Then a significant fraction of these civilisations run ancestor simulations. Therefore, if both one and two are false, there will be simulated minds like ours.

If we work out the numbers, we find that there would be vastly many more simulated minds than non-simulated minds. We assume that technologically mature civilisations would have access to enormous amounts of computing power.

So enormous, in fact, that by devoting even a tiny fraction to ancestor simulations, they would be able to implement billions of simulations, each containing as many people as have ever existed. In other words, almost all minds like yours would be simulated. Therefore, by a very weak principle of indifference, you would have to assume that you are probably one of these simulated minds rather than one of the ones that are not simulated.

Hence, if you think that propositions one and two are both false, you should accept the third. It is not coherent to reject all three.


 No.101862

>>101857 cont.

It should be emphasised that the simulation argument does not show that you are living in a simulation. The conclusion is simply that at least one of the three propositions is true. It does not tell us which one.

In reality, we don’t have much specific information to tell us which of the three propositions might be true. In this situation, it might be reasonable to distribute our credence roughly evenly between them.

Let us consider the options in a little more detail. Proposition one is straightforward. For example, maybe there is some technology that every advanced civilisation eventually develops and which then destroys them. Let us hope this is not the case. Proposition two requires that there is a strong convergence among all advanced civilisations, such that almost none of them are interested in running ancestor simulations. One can imagine various reasons that may lead civilisations to make this choice. Yet for proposition two to be true, virtually all civilisations would have to refrain. If this were true, it would be an interesting constraint on the future evolution of intelligent life.

The third possibility is philosophically the most intriguing. If it is correct, you are almost certainly living in a computer simulation that was created by some advanced civilisation. What Copernicus and Darwin and latter-day scientists have been discovering are the laws and workings of the simulated reality. These laws might or might not be identical to those operating at the more fundamental level of reality where the computer that is running our simulation exists (which, of course, may itself be a simulation). In a way, our place in the world would be even humbler than we thought.

What kind of implications would this have? How should it change the way you live your life?

Your first reaction might be to think that if three is true, then all bets are off and you would go crazy. To reason thus would be an error. Even if we are in a simulation, the best methods of predicting what will happen next are still the familiar ones – extrapolation of past trends, scientific modelling and common sense. To a first approximation, if you thought you were in a simulation, you should get on with your life in much the same way as if you were convinced that you were leading a non-simulated life at the “bottom” level of reality.

If we are in a simulation, could we ever know for certain? If the simulators don’t want us to find out, we probably never will. But if they choose to reveal themselves, they could certainly do so. Another event that would let us conclude with a high degree of confidence that we are in a simulation is if we ever reach a point when we are about to switch on our own ancestor simulations. That would be very strong evidence against the first two propositions, leaving us only with the third.

Nick Bostrom is the director of the Future of Humanity Institute at the University of Oxford.


 No.101936

>>101702

As for his 3 propositions, I'm on board. It seems completely reasonable.

>I agree, that'll be a hellish new civil rights issue

As far as human beings in this reality on this planet in this particular time, most of the world either has a tepid view on abortion or doesn't believe in the right to life – and this is in REAL LIFE (or, at least real to us), not a hypothetical simulation.

>Given the huge populations even a single star can support, only a tiny fraction of people need to be interested.

Regardless of the fraction in relation to how many people will exist in all of history, I don't think enough people will have the desire to come together in a given time period to make this happen. All we can do is base our analysis on what we know about human beings NOW, not some hypothetical future.

>since if it happens once then that's 50/50 odds, twice and it's 2/3 odds, etc

But the odds only approach 100%, they never reach it. It goes on ad infinitum. So, it's not really saying that much.

>in current humans sure, but perhaps not necessarily

Perhaps perhaps. It's barely worth speculating about. I just don't see how it would be possible, much less able to be achieved.

>regardless of this, many humans currently consider life worth living, so it doesn't really matter

Yes, because we're biologically programmed to preserve the species – and I don't think this would ever be extended to a simulation. But, I guess we're kind of going in circles with this.

>It's worth noting that it isn't actually even the case in current humans, there's at least one case of wire-heading having happened IRL

Given the immense amount of constraints that we face in life – as well as a propensity for boredom – I don't see how it would be possible to live in a state of bliss for one's entire life. That actually seems dangerous. I think we need a certain amount of suffering for self-preservation and for the preservation of the species.

>I don't care about being human, I care about enjoying life, but since this is optional it also doesn't matter

My point is, we would have to reprogram life to such an incredible degree that it wouldn't really be human life anymore. So, then we get into the further issue of preserving not HUMAN life, but some nebulous version of it – further complicating the "civil rights" issue.

>You'll probably find this interesting

Yeah, I'm familiar. I'm much more interested in the Cyclic Models than the Simulation Hypothesis tbh.

>>101778

I've actually seen that already. I'll check out the pdf.


 No.103149

File: d16cb80366d268a⋯.pdf (1.6 MB, Compulsive thalamic self-s….pdf)

>>101936

Bostrom himself puts similar credence in (1), (2), and (3). I think (2) is very unlikely due to the benefits of becoming WBE via scanning. I'm not concerned about continuity.

>I don't think enough people will have the desire to come together in a given time period

WBEs being widespread would mean one doesn't need any great effort, just that enough WBEs exist and tweak their simulations to temporarily suppress their memories for the sake of immersion. This is a lot more likely than any nationstate level effort.

As for WBEs being widespread, since they massively extend lifespan (maintaining data is easier than brains), mostly solve the problem of desire surpassing reality (by just altering the sim to match one's wildest desires (waifufags rejoice)), and are by far the easiest route to colonizing other stars, there'll be huge economic incentive for WBEs becoming widespread if computers ever get cheap enough, along with simple reproduction whereby extant WBEs create more to raise them as children. It's a straight upgrade from being a biological human.

>But the odds only apprach 100%[…]So, it's not really saying that much.

It's saying that if neither one or two hold then we're probably in a simulation, the odds approaching 100% mean the odds of this not being a simulation approach 0%. Holding the position with 1/3 likelihood instead of the position with 2/3 likelihood isn't sensible.

If you demand certainty, you could never accept _any_ conclusion since an assignment of probability 0 or 1 requires infinite (or infinitely strong) evidence. Even theorems don't get a probability of exactly 1 since there have been mistakes in published proofs before. While theorems may be certain in some meaningful sense, our confidence in a specific proof can never be perfect.
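
Quick numerical way to see that (just the standard log-odds transform, nothing from the paper):

import math

# Each independent piece of evidence adds a finite amount of log-odds,
# but probability 1 sits at +infinity, so no finite evidence gets there.
def log_odds(p):
    return math.log(p / (1 - p))

for p in (0.9, 0.99, 0.999999):
    print(p, round(log_odds(p), 1))
# 0.9 -> 2.2, 0.99 -> 4.6, 0.999999 -> 13.8; log_odds(1.0) divides by zero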

>constraints [necessitate suffering]

A fixed level of bliss would remove all hedonic incentives to do anything, but a baseline which can be increased by achieving goals would not. Whether people's baseline happiness can be altered longterm is an open question, wikipedia has a summary: https://en.wikipedia.org/wiki/Hedonic_treadmill#Empirical_findings

>That actually seems dangerous.

Absolutely agree. Redesigning the brain is incredibly dangerous. Attached is a case study of a woman who got addicted to using a brain implant for pleasure and did pretty much nothing else for two years. Excerpt:

>A 48-year-old woman with a stimulating electrode implanted in the right thalamic nucleus ventralis posterolateralis developed compulsive self-stimulation associated with erotic sensations and changes in autonomic and neurologic function. […] This case establishes the potential for addiction to deep brain stimulation […]

>At its most frequent, the patient self-stimulated throughout the day, neglecting personal hygiene and family commitments. A chronic ulceration developed at the tip of the finger used to adjust the amplitude dial and she frequently tampered with the device in an effort to increase the stimulation amplitude. At times, she implored her family to limit her access to the stimulator, each time demanding its return after a short hiatus. During the past 2 years, compulsive use has become associated with frequent attacks of anxiety, depersonalization, periods of psychogenic polydipsia, and virtually complete inactivity.

>During the stimulation session, the patient expressed an irresistible urge to momentarily maximize stimulation every 5– 10 min[sic].

>[noting she may not be typical] Further study of the long-term effects of stimulation is needed to clarify the relationship between this extreme response and the usual pattern.

>My point is, we would have to reprogram life to such an incredible degree […] preserving not HUMAN life, but some nebulous version of it

Definitely a big problem with it as well, but smaller changes (such as the mere stimulation of existing structures above) may produce substantial differences.

If it isn't desirable to redesign WBEs significantly, then they're going to have the same hedonic treadmill that we do. This implies there's substantial benefit to fooling themselves into temporarily believing they're not WBEs, so that when the delusion is removed they're able to once again fully appreciate being WBEs, and having access to literally any pleasure they can articulate clearly enough to simulate.


 No.104618

>>103149

>WBEs being widespread would mean one doesn't need any great effort, just that enough WBEs exist and tweak their simulations to temporarily suppress their memories for the sake of immersion

So, you think it's possible to upload your consciousness to something outside of your corporeal material? If that's the case, I think you're dead wrong. It wouldn't be you, it would be a replica. It wouldn't be the same agent.

>If you demand certainty, you could never accept _any_ conclusion since an assignment of probability 0 or 1 requires infinite

This is a stretch. We're talking about something in which the probability is purely abstract and relies heavily on countless obscure hypotheticals. And I think these probabilities you're assigning are rather arbitrary. 2/3 could be broken down into infinitesimal fractions given all the variables involved, and then you'd end up with a mess that maybe ends up being in the vicinity of 2/3.

>but a baseline which can be increased by achieving goals would not

Well, obviously. Human beings do this right now. Most people are fairly adept at dealing with suffering by means of distraction, anchoring and sublimation.

>that case study

Pretty horrific stuff. And that's just scratching the surface of the problems we'd run into in tweaking the human condition.

>this implies there's substantial benefit to fooling themselves into temporarily believing they're not WBEs, so that when the delusion is removed they're able to once again fully appreciate being WBEs, and having access to literally any pleasure they can articulate clearly enough to simulate

What are you suggesting here? That you currently might be fooled into believing you're not a WBE? Seems like wishful thinking to me.

I have some stuff to do, but I'll check back in a couple hours.


 No.104630


 No.104913

File: ca193c47067cffd⋯.jpg (1.11 MB, 1633x2585, 1633:2585, 1401164223422.jpg)

>>104618

>you think it's possible to upload your consciousness

I think that the ability to run WBEs, combined with sufficiently detailed measurement of a brain, implies the ability to create WBEs which act and experience as the original brain (all else being equal). This is because I believe the brain is entirely responsible for the mind.

>It wouldn't be you

There would be two entities, and then they would have different experiences and diverge. In time neither would still be me any more than I _am_ my past self. I'm not actually my past self, just derived from him. I see no reason to prefer either branch unless the dualists are correct (and I've no reason to think they are).

>it would be a replica

It would, and part of what it would replicate is my sense of self (unless you posit that that isn't within the brain). This doesn't mean it is me, just that it would act exactly as if it was.

>[so people wouldn't do so]

With the alternative being death once medicine can no longer maintain a given brain, is everyone else literally willing to bet their lives against that assertion by refusing a prosthetic brain? Only a very small number need to be willing since they'd eventually have children which were immortal WBEs from the start.

Even if many initially are unwilling, once WBEs start walking around saying "of course it's me, how dare you" (which they would do since they behave the same) then many others will give it a shot. It also hugely reduces the suffering of one's loved ones following the brain's death, since a person identical in every way but substrate would still be around to love and care for them. There may be reason for you to care about the distinction, but there's no reason at all for anyone external to you to care.

>these probabilities you're assigning are rather arbitrary

I don't assign 2/3 to the simulation hypothesis. I thought you were saying that since the odds of being in a sim never reach 100%, regardless of the number of simulations, it doesn't mean much to talk about it. In reality all one needs is odds >50% to switch to thinking it is more likely than not; two sims is the minimum which provides that (at 2/3).

>purely abstract and relies heavily on countless obscure hypotheticals

I agree, I mention that in >>99410 and implicitly state that P(Bostrom's assumptions are sound) is significantly less than 1. (quote modified a little)

>The probability you should assign to this being a simulation is P(the above argument is valid and sound) * (1-(1/(n+1))), where n is the expected value of the number of conscious entities believing they live in the 21st century over the entire future of the universe.
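
As a throwaway sketch of that formula (the function name and the example numbers are mine, nothing canonical):

def p_simulated(p_sound, n):
    # Takes the quoted formula at face value; n as defined in the quote above.
    return p_sound * (1.0 - 1.0 / (n + 1.0))

print(p_simulated(1.0, 2))      # 2/3: two simulations and a perfectly sound argument
print(p_simulated(0.1, 10**9))  # ~0.1: a huge n barely matters if the argument itself is shaky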

>Human beings do this right now.

Yes, but it means the baseline could (at least in principle, if not necessarily in the case of our current brains) be higher without crippling us.

>you currently might be fooled into believing you're not a WBE?

The minds within ancestor simulations are WBEs. I find (3) more probable than (1) inclusive-or (2).

I do not strongly believe anything in particular about the motives by which I came to be simulated; the speculation about resetting the hedonic treadmill is just that, and I don't put much credence in it. Instead I'm pointing out reasons to reject (2) by arguing for the desirability of actions which cause minds to falsely believe they're in the 21st century.

>Seems like wishful thinking to me.

I don't particularly believe that I existed prior to this simulation; I assign a little more credence to it being generally considered morally wrong to extinguish a mind without good reason (weakly implying continuing brain emulation with a different environment once one dies in the simulation) but not much at all (<10%, I haven't put much thought into it since I expect I'm far too biased to consider such an emotionally loaded hypothesis even remotely objectively).


 No.104932


 No.104946

File: 1ea4b9c9c93d4b9⋯.png (376.4 KB, 846x753, 282:251, 1ea.png)


 No.104963

File: 1354e1d09dadd0e⋯.jpg (156.08 KB, 640x480, 4:3, Cracky-chan-013.jpg)


 No.105086

>>104913

>I think that the ability to run WBEs, combined with sufficiently detailed measurement of a brain, implies the ability to create WBEs which act and experience as the original brain (all else being equal).

Right, but it wouldn't be your consciousness. Why would it be?

>this is because I believe the brain is entirely responsible for the mind

So do I. Which is why I think it would be impossible to upload consciousness.

>in time neither would still be me any more than I _am_ my past self. I'm not actually my past self, just derived from him

Wow, I'm sorry, but you're very confused here. Of course we're not the same exact person from one moment to the next, but there's a certain continuity that's contained within the same general corporeal material – that's a lot different from using one's brain as what would essentially be a blueprint for AI.

>This doesn't mean it is me, just that it would act exactly as if it was

Ok, I'm glad you clarified this. But if it's not you, then why would you care about creating it? It makes no sense to me.

>With the alternative being death once medicine can no longer maintain a given brain, is everyone else literally willing to bet their lives against that assertion by refusing a prosthetic brain?

Ok, wait a minute. You seem to be waffling a bit here. Would it be their consciousness or a replica? If it's a replica, then it wouldn't exactly be their brain.

>when in reality all one needs is odds >50% to switch to thinking it is more likely than not

I think it's too reductionist. It's not just 2 out of 3 things, it's countless variables that may or may not add up to 2/3.

>Yes, but it means the baseline could (at least in principle, if not necessarily the case of our current brains) be higher without crippling us

The baseline would have to be significantly higher for me to support something like this. And it would also require modifying our need/desire to slaughter tens of billions of animals every year.

>I do not strongly believe anything particular about the motives by which I came to be simulated, the speculation about resetting the hedonic treadmill is just that and I don't put much credence in it, instead I'm pointing out reasons to reject (2) by arguing for the desirability of actions which cause minds to falsely believe they're in the 21st century

I think both 1 and 2 are related. See, I think human beings are going to need a lot more time to become receptive to this idea. As things currently stand, most people would dismiss it as absolute nonsense – and I don't think that's going to change any time soon. And at the rate we're going, I don't think human beings are going to be around long enough to think of simulated reality as being akin to procreation.

That said, I think we've exhausted the speculation about 1 and 2.


 No.105191

I love Lain so much but these threads are so fucking cringe. The trap itt is probably a suicide cultist.

>>>/x/31835

>>>/x/40095

Here's a local podcast about some retarded kid that became an hero after falling for a lain themed suicide cult. https://www.cbc.ca/player/play/1279496259945


 No.105199

>>105191

i love lain a lot too, wanna make a thread anon?


 No.105257

>>105086

>it wouldn't be your consciousness. Why would it be?

Because consciousness, whatever it ends up being, has physical embodiment within the brain which can be measured and emulated.

>Which is why I think it would be impossible to upload consciousness.

Do you think it's impractically difficult, or do you think that there's something special about neurons which silicon can't replicate, or something else?

>continuity [of material]

I don't see why it matters unless you can argue that it impacts the way the brain operates (and so impacts your experience). If it doesn't alter your experience, in what sense does it matter?

>But if it's not you, then why would you care about creating it.

I'm not saying it isn't me, I'm saying that replicating the sense of self is necessary but not sufficient.

>essentially be a blueprint for AI

Sufficiently detailed measurements include your memories, and everything else impacting experience which resides within the brain.

>why would you care about creating it

Because I don't see the branch which shares spacetime continuity as special, since that continuity doesn't affect experience. Perception of the continuity does, but that can be faked and has the same impact when faked, so it isn't the continuity per se which affects it.

>Would it be their consciousness or a replica?

Both.

>in time neither would still be me any more than I _am_ my past self. I'm not actually my past self, just derived from him

>It's not just 2 out of 3 things, it's countless variables that may or may not add up to 2/3.

Consider the hypothetical situation where we know with certainty that 100 billion people will live through the 21st century, and that 200 billion WBEs will be simulated as living through it. Consider also that you know you are one of these 300 billion, but have no indication which set you're in.

Would you assign 2/3 odds, and so conclude you've a ~67% chance of being simulated?
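
For concreteness, with the made-up numbers above the arithmetic is just:

real = 100e9       # people actually living through the 21st century (hypothetical)
simulated = 200e9  # WBEs simulated as living through it (hypothetical)
print(simulated / (real + simulated))  # 0.666..., i.e. the ~67% above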

My point was about abstract and perfect knowledge. It was later amended to account for our ignorance in the matter of how many will live/be simulated when I mention "the expected value of the number of entities which will believe they live in the 21st century."

>slaughter tens of billions of animals

There's no reason to simulate their minds properly, and WBEs which know they're WBEs only need computing power, not food; they can just simulate food directly without simulating the cow.

(to be cont.)


 No.105259

>>105086

>>105257

>most people would dismiss it as absolute nonsense

Most normal people would, sure, but the Blue Brain Project has 10 billion EUR in funding IIRC to simulate human brains. WBEs aren't their goal, but they're not as far in the future as you seem to think. Notably the Blue Brain Project isn't attempting to scan specific brains, just to create a typical human brain. The software they're researching, funding, and testing is pretty much what's needed for simulation though.

>The Blue Brain Project is a Swiss brain research initiative that aims to create a digital reconstruction of rodent and eventually human brains by reverse-engineering mammalian brain circuitry. […] Blue Brain has founded "Simulation Neuroscience".

>The project [uses] a Blue Gene supercomputer running Michael Hines's NEURON software, the simulation involves a biologically realistic model of neurons and an empirically reconstructed model connectome. It is hoped that it will eventually shed light on the nature of consciousness.

>The initial goal of the project, which was completed in December 2006, was the creation of a simulated rat neocortical column, which is considered by some researchers to be the smallest functional unit of the neocortex, which is thought to be responsible for higher functions such as conscious thought. In humans, each column […] contains about 60,000 neurons. Rat neocortical columns are very similar in structure but contain only 10,000 neurons and 108 synapses. Between 1995 and 2005, Markram mapped the types of neurons and their connections in such a column.

>In November 2007, the project reported the end of the first phase, delivering a data-driven process for creating, validating, and researching the neocortical column.

>By 2005, the first cellular model was completed. The first artificial cellular neocortical column of 10,000 cells was built by 2008. By July 2011, a cellular mesocircuit of 100 neocortical columns with a million cells in total was built. A cellular rat brain is planned[needs update] for 2014 with 100 mesocircuits totalling a hundred million cells. A cellular human brain equivalent to 1,000 rat brains with a total of a hundred billion cells is predicted to be possible by 2023.

>It also aimed to simplify the column simulation to allow parallel simulation of large numbers of connected columns with the ultimate goal of simulating a whole neocortex, which in humans consists of about a million cortical columns.

From: https://en.wikipedia.org/wiki/Blue_Brain_Project

The Human Brain Project is also interesting.

>simulated reality as being akin to procreation

To an already simulated brain, it's plainly so.

>That said, I think we've exhausted the speculation about 1 and 2.

Sure. I'm still interested in why you feel spatial continuity matters if it doesn't impact experience.


 No.105267

>>105191

>The trap itt is probably a suicide cultist.

Multiplying by the probability the argument is sound fixes that problem if you encounter anyone who's fallen for that.


 No.105330

>>105259

>the blue brain project has 10 billion EUR in funding IIRC to simulate human brains

Just checked, it's 1 billion EUR plus an unspecified amount; sorry for not checking earlier.


 No.105348

>>105259

>notably the blue brain project isn't attempting to scan specific brains, just create typical human brain

Yeah, I heard about this, actually. As far as I'm concerned, this isn't as close to the simulation scenario we've been talking about as you seem to think. Obviously we're taking steps towards it, otherwise speculating about all this would seem even more absurd than it already does.

>to an already simulated brain, it's plainly so.

That's only if people reach a degree of certainty that causes them to believe, beyond a reasonable doubt, that they're simulated. Ffs, billions of people still believe that a deity will send us to hell if we misbehave.

Now, I suppose you could argue that the simulation hypothesis could replace religion, but I can't imagine that happening. The vast majority of people aren't deep thinkers and indulge in wishful thinking to an extreme degree. They want to believe they are living in base reality and that there's ultimate meaning to life, and that there's going to be some kind of grand payoff in the end (that is, if they're good boys and girls). Just think about how little the human race has changed over the past few thousand years when it concerns our philosophical development. Only a small percentage of people have made significant advancements in reason, and an even smaller percentage have kept up with the technological developments.

Also, like I said earlier, most people have a tepid-to-non-existent belief in the right to life as it is.

>I'm still interested in why you feel spatial continuity matters if it doesn't impact experience

You're muddling things. The type of continuity matters here. The idea that we could transfer our individual consciousness to another system is implying that there's something transcendent about consciousness (that it's not merely contained within our grey matter).

Now, if we could somehow figure out how to reverse the aging process and hack our actual brains matrix-style, I suppose it would be possible to create an artificial reality that would allow our consciousness (as an agent) to be preserved.


 No.105359

>>105199

There are so many right now. I'm always lurking for lain threads that larping trannys haven't ruined. I blame lainchan, that board seems to breed cancer.


 No.105371

>>105359

fuck

wanna start an irc channel?


 No.105427


 No.105471

>>105348

>this isn't […] close to the simulation scenario we've been talking about

It's very far off a full blown ancestor simulation, but it is relevant to WBEs coming into existence long before we go extinct (unless you place that in the next century or two).

>That's only if people […] believe […] they're simulated.

I'm not talking about the members of an ancestor simulation here, but regular WBEs which know they are from the start. I expect these will come prior to ancestor simulations, because they're a strictly simpler technology.

>the simulation hypothesis could replace religion

Hopefully not, people have gotten it into their heads that religion should be respected and not criticized, and that it's a choice people can make rather than a conclusion compelled or precluded by evidence.

As for being simulated after death, drawing that conclusion requires strong assumptions about the motives of the simulators which I don't feel are justified.

Bostrom does lightly touch on an interesting point: if we do run an ancestor simulation, then we should treat its members well in case we ourselves are simulated and our behaviour influences our own treatment (and implicitly, that we should treat those we simulate well at first, and later, if we let them run their own sim, as they treat their simulations). I don't buy it personally, but I'm not very knowledgeable about game theory.

>people won't come to buy it

I think if we get to the point of running a bunch of ancestor simulations then that might change if we're lucky, but until then you're absolutely right, this'll remain fringe regardless of merit. It sounds too much like scifi for most people to even consider it seriously.

>The idea that we could transfer our individual consciousness to another system is implying that there's something transcendent about consciousness (that it's not merely contained within our grey matter).

It isn't a matter of transferring, but duplicating. There would simply be (for an instant) two of the person before the WBE and the brain diverge. I argue we should identify equally with both until the split occurs and then we should identify with whichever we find ourselves being. I think the main difference we have here is that I don't think my past self is me, just very similar.

>hack our actual brains matrix-style

You'll enjoy this:

>Leia, from Dagenham in east London, had no inner ear or hearing nerve, meaning that even standard hearing aids or cochlear implants wouldn't help her.

>[She was] one of the first children in the UK to be given an auditory brainstem implant, requiring complex brain surgery

>Now, after lots of regular speech and language therapy, she can put full sentences together, attempt to sing along to music and hear voices on the phone.

>The cutting-edge surgery involves inserting a device directly into the brain to stimulate the hearing pathways in children born with no cochlea or auditory nerves.

>A microphone and sound processor unit worn on the side of the head then transmits sound to the implant.

>This electrical stimulation can provide auditory sensations, but it cannot promise to restore normal hearing.

https://soylentnews.org/article.pl?sid=19/04/21/177238

There are also implants connecting to the optic nerve to provide vision. It seems that implants which block normal sensory input and provide their own (from a simulation) are a realistic path.


 No.105482

>>105257

I saw your comment below before this one, so I already addressed some of the points. But I'll at least touch on them a bit more.

>because consciousness, whatever it ends up being, has physical embodiment within the brain which can be measured and emulated

Right, WITHIN the brain. That's important.

>do you think it's impractically difficult, or do you think that there's something special about neurons which silicon can't replicate, or something else?

Special? It depends on what you mean by special. A copy of something is not exactly the same as the original. Does that clear things up?

>If it doesn't alter your experience, in what sense does it matter?

See: above

>I'm not saying it isn't me, I'm saying that replicating the sense of self is necessary but not sufficient

Not sufficient?

>sufficiently detailed measurements include your memories, and everything else impacting experience which resides within the brain

Right. A blueprint which would allow our brains to be a template for artificial intelligence.

>perception of the continuity does, but that can be faked and has the same impact when faked, so it isn't the continuity per se which effects it.

Ok, I'll illustrate the problem in terms of a Cyclic Model of the universe: if the universe is eternal and cyclical, then every possibility has occurred Ad Infinitum. If the information in each universe is retained from one universe to the next, then I believe we have experienced our exact lives Ad Infinitum – we would be the same exact agent, made up of the same information. However, if the information that makes up ordinary matter is LOST from one universe to the next, this exact universe would still happen, but it would be a copy… a replica. It would not be our consciousness, it would be a facsimile.

>in time neither would still be me any more than I _am_ my past self. I'm not actually my past self, just derived from him

You are an assemblage that includes your corporeal material and your memories, and the continuity that exists between the two. Remove your corporeal material and you remove your experience as an agent.

>would you assign 2/3 odds, and so conclude you've a ~67% chance of being simulated?

Within those parameters, sure.

>There's no reason to simulate their minds properly, and WBEs which know they're WBEs only need computing power not food, they can just simulate food directly without simulating the cow

Frankly, this whole idea is a recipe for serious delusional thinking and behavior. Are you suggesting you don't think animals suffer and that I am just a product of your simulation? If so, this is getting a bit too woo woo for me.


 No.105515

>>105191

>Here's a local podcast about some retarded kid that became an hero after falling for a lain themed suicide cult

ebin


 No.105521

>>105515

i was disappointed they didn't talk about the actual cult, i wanted to hear a boomer say "lain"


 No.105523

File: 62f51cabb973be6⋯.jpg (177.36 KB, 1200x1200, 1:1, tarp.jpg)

>>105359

srsly tho

why are all lainchanners tarps?

whats the correlation?

programming socks?

im confus


 No.105559

>>105523

Any board this rulecucked will attract fags who would get bullied elsewhere.

https://www.lainchan.org/rules


 No.105568

>>105471

>I'm not talking about the members of an ancestor simulation here, but regular WBEs which know they are from the start

Ok, since I'm not very familiar with this subject, I was a little confused about the difference between WBEs and ancestor simulations. I get it now.

IF we can develop tech that allows us to upload consciousness, then I think most people would be on board.

>Bostrom does lightly touch on an interesting point that if we do run an ancestor simulation, then we should treat the members well incase we ourselves our simulated and our behaviour influences our treatment

In which case, if we're living in a simulation, our creators were fucking assholes.

>there would simply be (for an instant) two of the person before the WBE and the brain diverge

Hokum. Sorry.

>I think the main difference we have here is that I don't think my past self is me, just very similar

It's not exactly a difference. I don't think the past me is me, because the past me doesn't exist anymore. There is only the present. This isn't because the past me isn't essentially the same me, it's because of the nature of time and entropy.

>there are also implants connecting to the optic nerve to provide vision. It seems that implants which block normal sensory input and provide their own (from a simulation) are a realistic path

I'd be ok with a WBE in which I wouldn't be responsible for a universe of creatures who would suffer as a result of my clinging to life.

Be back in an hour or 2.


 No.105571

>>105482

>WITHIN the brain

It is not the molecules which matter, but the conscious experience they give rise to. Anything giving rise to the same experiences is good enough for me to identify with it.

Consider if I instantaneously swapped out all the proteins in your brain for exact duplicates in the same positions. Would you consider the resulting entity to be you?

>A copy of something is not exactly the same as the the original

Provided the experience is identical I just don't care.

>Not sufficient?

I'm just hedging against the possibility that my sense of self could be duplicated within an entity with substantially different behavior, in which case I wouldn't identify with it. I don't particularly think it's possible, I was just being pedantic.

>[cyclic universe with/without conservation of information]

I think you've put your finger on our difference, I don't draw a distinction between those two agents.

>You are […] your corporeal material […] and the continuity that exists between the two

(I've elided memories since they're part of my corporeal material)

I don't include that continuity as part of my identity. I see no reason to care about it, since it's literally incapable of affecting my experience.

>Are you suggesting you don't think animals suffer and that I am just a product of your simulation?

Not even slightly, I assign extremely low probabilities to both those statements.

Consider if the Human Brain Project succeeds and develops the capability to simulate a human brain. Consider that they do so, providing sensory input from a simulation/interpreting nerve impulses to control a character. Consider that this WBE is informed of its nature. That WBE, knowing it's effectively inside a video game, can just conjure up a steak without ever bothering with any animals.

In an ancestor simulation that isn't the case: the brains of animals would need to be simulated (and thus suffer) in order to act sensibly. The same is true of humans in a simulation; they would also need to be fully simulated and thus capable of suffering.

WBEs need not be in ancestor simulations, and in those cases spawning virtual food out of thin air is by far preferable to slaughtering simulated animals (which would be a sadistic waste of computing power).


 No.105592

>>105482

>>105571

PS

>exact duplicates in the same positions

I think exact duplicates five feet to the left is a better question


 No.105602

>>105559

thats a good point tho


 No.105608

i hate this thread so fucking much. i just wanted to talk about lain


 No.105649

>>105608

nope

muh simulations

4eva


 No.105671

>>105568

Consider the case where a person is anesthetized, scanned, and then two WBEs are created. Prior to going under, the person is given the following choice: either the body can be given $10, or both simulations can be given $10.

Which option would you choose prior to being put under, completely setting aside your morality and desire to improve the world for a second, and being as selfish as possible to maximize the chance you get $10?

I would choose the simulations, because 2 of the 3 entities will then receive $10.

I expect that should I repeat this experiment many times, I will perceive myself winning 2/3rds of the time and losing 1/3rd of the time.
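
If it helps, here's a toy Monte Carlo of that bet (my own sketch; the uniform "which one do I wake up as" assumption is exactly the thing under dispute):

import random

def trial(pay_simulations=True):
    # Three successor entities: the body plus two WBEs. For this toy model
    # I assume "I" find myself as one of the three uniformly at random.
    i_am = random.choice(["body", "wbe_1", "wbe_2"])
    paid = {"wbe_1", "wbe_2"} if pay_simulations else {"body"}
    return i_am in paid

n = 100_000
print(sum(trial(True) for _ in range(n)) / n)   # ~2/3 if the simulations are paid
print(sum(trial(False) for _ in range(n)) / n)  # ~1/3 if the body is paid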

>Hokum. Sorry.

I'm not claiming that the WBE and the brain would share experience or anything supernatural like that, just that I consider agents with identical behavior to be the same agent for our current intents and purposes.

They are two distinct things, differing for example by their physical composition, but the differences between them aren't relevant.

In the same sense I would say a burrito, and an exact duplicate except for one missing molecule, are the same burrito. They can be distinguished, for example by position on the table, but there's no good reason to do so unless the things about them which differ affect you in some way you care about (IRL I would care which was closer).


 No.105685

>>105671

>>105568

PS: This isn't to imply I think there's 1 'real' me being randomly assigned, it's rather that I think all three branches derive from me as much as I derive from my prior self. There would be three, rapidly diverging, entities each time, none of which (including the brain) are the person who went under because they've since diverged.


 No.105774

>>105571

>It is not the molecules which matter, but the conscious experience they give rise to

I'm just saying, the molecules matter if one cares about continuing one's experience as an agent.

>Would you consider the resulting entity to be you?

I don't think that would be possible, given that you wouldn't be able to instantly replace them in the exact same position and the exact same atomic/subatomic composition; you would inevitably have to break the continuity. The closest thing to a "replacement" scenario in which my consciousness would be preserved is some kind of teleportation that uses non-locality (or something along those lines).

>provided the experience is identical I just don't care

That makes no sense to me, but whatever.

>I'm just hedging against the possibility that my sense of self could be duplicated within an entity with substantially different behavior

I see. Yeah, you're wise to not get your hopes up.

>I think you've put your finger on our difference, I don't draw a distinction between those two agents

I think you're letting your emotions cloud your judgement. I don't feel any desire, much less responsibility, to create more agents (whether it's clones of me or otherwise) that are just going to suffer through life until they wither and die. I think it's wiser to let go than to hang on – then again, perhaps I'm just being too pessimistic.

>I see no reason to care about it, since it's literally incapable of effecting my experience

The continuity isn't a part of your identity; it's what allows you to maintain your consciousness. It's what allows you to go through life as if you're a person who is essentially the same agent from one moment to the next.

>WBEs need not be in ancestor simulations, and in those cases spawning virtual food our of thin air is by far preferable to slaughtering simulated animals (which would be a sadistic waste of computing power).

I see. Well, I suppose I can understand your desire to explore these possibilities, but I can't share your enthusiasm.

>Which option would you choose prior to being put under, completely setting aside your morality and desire to improve the world for a second, and being as selfish as possible to maximize the chance you get $10?

I'd go with the body, because I don't care about a simulated me.

>just that I consider agents with identical behavior to be the same agent for our current intents and purposes

We've reached an impasse on this point.

>In the same sense I would say a burrito, and an exact duplicate except for one missing molecule, are the same burrito. They can be distinguished, for example by position on the table, but there's no good reason to do so unless the things about them which differ effect you in some way you care about (IRL I would care which was closer).

Yeah, I understand where you're coming from here, but a person is not a burrito.


 No.105903

>>105774

>I don't think that would be possible

A simulation could actually do this kind of thing if you're (reasonably) concerned about considering impossible thought experiments.

Would you consider yourself to have been killed and a replica created, or would you consider it yourself? This is pretty much the Star Trek transporter problem, and it's particularly gnarly since one's stream of current thoughts isn't interrupted.

>continuity […] allows you to maintain your consciousness

If I could be drugged such that all neurons in my brain ceased firing, and medical devices ensured my heart still beat etc., then I would no longer be conscious. When I was woken, I would have broken continuity. Would I have died and a different agent come about?

>[continuity] allows you to go through life as if you're […] essentially the same agent from one moment to the next

The future containing an agent almost identical and derived in predictable ways from me, in such a context as I can crudely predict and thereby plan for, is what permits me to act as an agent. If I was to only exist every other second, with temporal continuity being totally broken (a simulation could actually pull this off, though we couldn't do it IRL), I would still be able to act as an agent within the simulated world.

>We've reached an impasse on this point.

Yeah, I can't see any way either of us could shift the other, because we agree on _what reality would be_, but not on _what we should value_, and one can't get an ought from an is.


 No.105944

>>105903

>>105774

I need to sleep, thanks for the interesting conversation :)

I'll be back tomorrow if you think the questions in >>105903 are interesting.


 No.106013

>>105903

PS: w.r.t. the simulation being paused for every other second while the environment simulation progresses, the data which is my 'brain' as a WBE _does_ have temporal continuity. Not a great example, ah well.


 No.106115

>>105903

>a simulation could actually do this kind of thing if you're (reasonably) concerned about considering impossible thought experiments

That's like saying "God could do it if he wanted to". It doesn't hold much water to me.

>Would you consider yourself to have been killed and a replica created, or would you consider it yourself?

It just depends on whether I "wake up" feeling like the same person I was before I was killed. If not, well… I suppose I won't exactly get an answer either way.

>this is pretty much the star trek transporter problem, and is particularly gnarly since one's stream of current thoughts isn't interrupted

There's a cheesy sci-fi film called The 6th Day with Arnold Schwarzenegger about this. It's actually not bad, and I think they hit the nail on the head.

>Would I have died and a different agent come about?

No, because you would pick up where you left off, and it wouldn't include changing your "gear".

>The future containing an agent almost identical and derived in predictable ways from me, in such a context as I can crudely predict and thereby plan for, is what permits me to act as an agent

The practical nature of this doesn't extend to a replica, though. That's my main problem with your position on preserving conscious experience through simulations outside of your corporeal material.

>yeah, I can't see any way either of us could shift the other, because we agree on _what reality would be_, but not on _what we should value_, and one can't get an ought from an is.

Exactly

>>105944

Ok, I'll check back tomorrow for your response. I'm enjoying the discussion as well.

>>106013

But a pause is different from a replacement. Not sure if I'm understanding you correctly, though.


 No.108334

>>106115

>That's like saying "God could do it if he wanted to". It doesn't hold much water to me.

We do have reason to believe that a simulation of a brain (WBE) is physically possible, and that it could be used to do just that (duplicate you identically five feet to the left in the simulated environment). Granted, a real WBE would probably simulate the brain separately from the environment for safety reasons, but it's quite possible to combine them in principle.

>It just depends on whether I "wake up" feeling like the same person I was before I was killed.

That feeling is the result of physical processes within the brain, acting upon the physical state of the brain. The state can be measured, and the processes are just physics which can be simulated. This will give rise to the same behavior of the system, including the things you say, which are controlled by your internal experience. Therefore a sufficient simulation would say things like "I feel like the same person" for the same reasons the meatbrain would, unless the physics simulation is inaccurate, the measurement of the brain is inaccurate, or there's something causally influencing your behavior from outside the brain (I'm ignoring minor stuff like the spinal cord since that's easy to add if one can do a brain).

>The 6th day with Arnold Schwarzenegger

I'll watch it.

>you would pick up where you left off

In terms of the self-reported conscious experience, so would a sufficiently accurate emulation of a sufficiently accurately measured brain.

>it wouldn't include changing your "gear"

I don't see why anything but the conscious experience matters.

Consider the following: A WBE is created from scratch (not from a scan), running on supercomputer A. It runs for a few subjective hours, during which it starts to read a book. The simulation is then suspended, saving its current state exactly, and the data comprising the WBE and simulation is transferred via the internet to supercomputer B. Supercomputer B then resumes the simulation. The simulation continues exactly as if nothing happened; the experience of the WBE reading the book is uninterrupted.

Has the WBE been killed and a replica created? If not, how does this break in continuity and substrate differ from the brain/silicon break which occurs between scanning/simulation?
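
Incidentally, the suspend/transfer/resume step is just ordinary checkpointing of a deterministic program: serialize the whole state, move the bytes, carry on. A minimal sketch (plain Python pickling of a stand-in object, obviously not an actual WBE):

import pickle

class TinySim:
    # Stand-in for a deterministic simulation whose state is fully captured.
    def __init__(self):
        self.page = 0
    def step(self):
        self.page += 1  # "keep reading the book"
        return self.page

sim = TinySim()
for _ in range(3):
    sim.step()                      # runs on "supercomputer A"

blob = pickle.dumps(sim)            # suspend: snapshot the exact state
resumed = pickle.loads(blob)        # "supercomputer B" reconstructs it from the bytes

print(sim.step(), resumed.step())   # 4 4: both continue exactly where the run left off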


 No.109754

>>108334

>We do have reason to believe that a simulation of a brain (WBE) is physically possible, and that it could be used to do just that (duplicate you identically five feet to the left in the simulated environment)

Yeah, that scenario isn't hard to imagine, I suppose. I actually missed >>105592. There's a very big difference between swapping exact duplicates in the same exact position and doing it "5 feet to the left".

>that feeling is the result of physical processes within the brain, acting upon the physical state of the brain

Right, we're on the same page when it comes to the material nature of consciousness, but the main issue here is whether your individual consciousness could be transferred to something outside of your body. The simulation might say "I feel like the same person", but it wouldn't be the same consciousness that it's based on. This is another point I think we've exhausted.

>In terms of the self-reported conscious experience, so would a sufficiently accurate emulation of a sufficiently accurately measured brain

Not exactly, because it wouldn't be the same material.

>I don't see why anything but the conscious experience matters

Because the conscious experience wouldn't be the same. 2 different agents… 2 different consciousnesses.

>Has the WBE been killed and a replica created?

Perhaps not, but organic information is not the same as digital. If we exist in a computer, then the possibilities become infinite. But I don't believe the universe is a computer; it's only analogous.

I'll check back in a couple hours. Pretty busy today.


 No.110427

>>109754

>it wouldn't be the same consciousness that it's based off on

You aren't the same consciousness that you will be in 1ms. You differ in your position in spacetime for example.

However, it's useful to define an equivalence class which includes your future selves in order to plan, and say they're the same for these intents and purposes. When you are considering which entities in the future should be part of what you consider yourself, you currently include your brain.

I argue there's no compelling reason to exclude a simulation which initially will be in a state your brain will occupy in the future. Neither the future brain nor the future simulation are literally the same consciousness as you, and I see no meaningful property of the former which the latter lacks. They're both different entities to you.

Claim: In terms of the self-reported conscious experience, a WBE would perceive itself picking up where it left off (i.e., the state of the brain at the time of scanning).

>Not exactly, because it wouldn't be the same material

>Because the conscious experience wouldn't be the same

This implies that the material impacts the self-reported conscious experience in ways which couldn't be simulated. My claim here is specifically about how the WBE itself perceives its conscious experience, not a claim about the conscious experience itself.

>2 different consciousnesses

Being a different consciousness doesn't imply a different conscious experience. I could run the same simulation twice, and those would be two different consciousnesses, but have the same conscious experience.
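
It's the same sense in which two runs of a deterministic program with the same seed are two distinct processes with identical traces (a loose analogy, nothing more):

import random

def run(seed):
    rng = random.Random(seed)                    # deterministic given the seed
    return [rng.randint(0, 9) for _ in range(5)]

a = run(42)
b = run(42)
print(a == b)  # True: the same "experience" (an identical trace)
print(a is b)  # False: two distinct instances producing it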


 No.110462

>>109754

>>110427

>My claim here is specifically about how the WBE itself perceives its conscious experience, not a claim about the conscious experience itself.

Oops, meant to say:

>My claim here is specifically about how the WBE itself perceives its conscious experience, not a claim about the _consciousness_ itself.

It is a claim about the conscious experience itself.


 No.110495

>>109754

PS: If I was scanned and simulated, then the brain _after_ the scan would not consider the WBE to be him, nor would the WBE consider the brain to be him. Both versions of me would consider each other different people.

It is only my brain _prior_ to the scan which identifies with both.


 No.111541

>>110427

Hey, sorry it took me so long to get back. Been working on a project all day.

>You aren't the same consciousness that you will be in 1ms. You differ in your position in spacetime for example.

That's true, but you run into the continuity problem if you try changing material in attempts to transfer the EXACT consciousness.

>When you are considering which entities in the future should be part of what you consider yourself, you currently include your brain

Yes, MY brain (and all of its fluctuations), not a replica.

>neither the future brain nor the future simulation are literally the same consciousness as you

There's a difference between my future brain – as in, the exact grey matter – and a simulated brain that picks up where my brain leaves off. You seem to think this is an insignificant distinction, but it's the very thing that would prevent us from uploading our experience as singular agents.

It's possible that I just lack the vernacular to express my position precisely.

>this implies that the material impacts the self-reported conscious experience in ways which couldn't be simulated

Yes and no. I believe the same exact experience can be replicated; however, I don't believe one could continue "living" unless the corporeal material is used as the vessel. See: the Cyclic Model analogy I made earlier.

>Being a different consciousnesses doesn't imply a different conscious experience. I could run the same simulation twice, and those would be two different consciousnesses, but have the same conscious experience

I think this is the crux of the problem. It's the same experience, but through a different agent. And it doesn't make sense to me why you'd want to preserve the experience if you don't get anything out of it after you die/expire/whatever you want to call it.

>>110462

I think I understand where you're coming from.

>>110495

>Both versions of me would consider each other different people

Wait, have we been talking past each other this entire time? Let me get this straight: do you think it's possible to preserve your singular experience as an agent through a simulation?


 No.111599

>>111541

I think I should clarify something…

>as in, the exact grey matter

*the Grey matter that has evolved in a continuous fashion in spacetime.


 No.112595

>you run into the continuity problem if you try changing material in attempts to transfer the EXACT consciousness

There's no reason to care about physical continuity here; humans only care because we naturally split the world into spacetime-continuous objects. Physics itself has no concept of an object*, there's just fields (or whatever it ends up being). I argue we should set aside intuition and care about what actually matters to us personally: continuity of conscious experience.

If I knew I was going to be teleported somewhere tomorrow, I wouldn't blow all my resources on a last hurrah and say my goodbyes as though I were going to die, despite the break in spacetime continuity, because I only care about conscious experience. Indeed I would plan _exactly_ as if the entity resulting from the teleportation would be me. Would you plan as if you were about to die, or as if you were about to be in a different place?

>There's a difference between my future brain […] and a simulated brain that picks up where my brain leaves off.

There are several differences; conscious experience, we can agree, is not necessarily among them. What is the difference which you care about?

>I don't believe one could continue "living" unless the corporeal material is used as the vessel.

Consider if I measured a single neuron in a brain, without removing it, so well that I could simulate it, and then constructed a mechanical replica. If this replica was then swapped out with the original neuron, in such a way that doesn't disrupt anything which matters to experience (e.g. a firing event), then that brain now contains one artificial neuron while not having disrupted the brain's consciousness at all. Consider that I do this until no organic neurons remain.

This isn't the ship of Theseus problem, because they modified the ship in ways which mattered to people, just in a small way each time. The problem of the ship of Theseus is people rounding near-zero impact to zero impact, being unspecific in what they mean by "the ship of Theseus", and assuming that objects as humans conceive of them are fundamental to physics. I claim this has _exactly_ zero impact on what we should care about. The brain itself would notice _nothing_ occurring at any point (otherwise we fucked up a replacement somewhere).

>[two identical simulations have] the same experience, but through a different agent.

Agreed.

>why you'd want to preserve the experience if you don't get anything out of it after you die

Because I consider the consciousness to _be_ me. Consciousness is a process, not a configuration of matter; configurations merely give rise to the processes we call consciousnesses. This isn't claiming any special status for processes in physics, any more than saying that building a wall is a process, not a configuration of matter. If time stopped, there would be no building of walls, and no consciousnesses.

Render my brain unconscious (properly unconscious, not just sleeping), scan it, destroy my unconscious brain. If I have been killed, it was when I was rendered unconscious, not when the brain was destroyed. I do not fear general anesthetic, and so I do not fear being rendered unconscious in this way. I _do_ fear the future not containing an agent whose conscious experience picks up where I left off. I would prefer the neuron replacement thought-experiment above, but I wouldn't pay much more money for it.

>do you think it's possible to preserve your singular experience as an agent through a simulation?

In principle: Absolutely, via the neuron replacement thought experiment above which maintains a singular consciousness throughout.

In practice, and less certainly: Yes, via drug/scan/destroy as above.

*in the same sense we normally use the word, in which I am an object, and a piece of cheese is an object.


 No.112598


 No.112714

>>111541

>>112595

PS: I think that while properly unconscious (not just sleeping) there are zero consciousnesses in the world which are me.

drug/wake with no scanning or anything else (e.g. as probably happens for surgery) causes that number to drop to zero, and then rise back to one when the brain awakens and picks up where I left off.

drug/scan/destroy similarly causes that number to drop to zero, and then rise back to one when the WBE starts to run and picks up where I left off.

drug/scan+wake causes that number to drop to zero, and then rise to one when the brain awakens and continues where I left off, and then rise to two when the WBE starts to run and picks up where I left off. In this case the brain _after_ waking and the WBE consider each other different people, but the brain _prior_ to the split occurring considers itself equally both, while not considering them the same person. If the prior brain is A, the posterior brain is B, and the WBE is C: A considers itself to be both B and C, but does not consider B to be C nor C to be B. This means that the relation of consciousness I identify with is not an equality relation.
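
Spelling out that last claim with a little bookkeeping (just my own sketch, with A, B, C as above):

# One minimal encoding of the directed "identifies with" claims above:
# A identifies with both B and C; B and C do not identify with each other.
rel = {("A", "B"), ("A", "C")}

def symmetric(r):
    return all((y, x) in r for (x, y) in r)

def transitive(r):
    return all((x, z) in r for (x, y) in r for (y2, z) in r if y == y2)

print(symmetric(rel))   # False in this encoding
print(transitive(rel))  # True here only vacuously; add ("B", "A") and it fails, since ("B", "C") is absent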

The progressive neuron replacement thought experiment keeps that number at exactly one throughout and is clearly preferable for that reason.


 No.112731

>>112714

s/equality relation/equivalence relation/


 No.114139

Reality itself is a simulation. Running off itself to create new and interesting stories and ways of being.


 No.114153

I want to get married and start a family with lain.


 No.114175

Brain replacement is just an excuse to not have to think about the fact that you would die. It doesn't matter whether the computer is picking up your thoughts as you die, before, or after. The operative question you have to ask yourself is: would I literally kill myself if I believed a computer could think the same as me?


 No.114239

>>114175

That's what we've just been discussing, I'm not going to type out all the same arguments again.


 No.114358

>>114239

and your arguments boil down to "if you don't know it's happening you're not dead". Which is BS because the half of you that is biological would experience failure as it is destroyed.


 No.114387

>>114358

If there's nothing having conscious experiences, then there's no consciousness.

See also the neuron-by-neuron replacement thought experiment in >>112595


 No.114670

>>114387

There is no difference between doing this and writing a program to emulate your neurons one at a time as they're destroyed (provided each one isn't firing at that moment). Your actual biological brain would die either way. The replacement brain would exist just the same either way.


 No.114671

>>96651

Fauxx makes some good stuff frin

https://fauux.neocities.org


 No.114673

>>114671

Is this the Faux?


 No.114694

>>114670

>There is no difference between doing this and writing a program to emulate your neurons one at a time as they're destroyed.

You'd need to feed the results of that simulation into the partially destroyed brain to ensure it didn't notice (as it wouldn't in the original case).

I agree it isn't fundamentally different to killing the brain, and then scanning it.


 No.114708

>>114694

>I agree it isn't fundamentally different to killing the brain, and then scanning it.

So we're back to the same question. Would you, if you believed a computer was thinking your thoughts, actively shoot yourself? You don't get to pretend the death of your physical brain isn't happening. If I showed you a supercomputer and you actually believed it was emulating your brain, would you commit suicide? That is the question.


 No.114835


 No.114971

>>114708

I say in >>112595 that I care about the WBE continuing my conscious experience where I left off, and in >>110427 that I only identify with WBEs which initially will be in a state my brain will occupy in the present or future.

I give more detail in >>112714 and explain that I wouldn't do so in that circumstance.


 No.118479

File: 5d7926ee5152e66⋯.gif (709.15 KB, 500x537, 500:537, 5d7926ee5152e6607fbd408a14….gif)

File: 81cdd6893da2860⋯.jpg (230.84 KB, 1600x1200, 4:3, 81cdd6893da2860e9a117a53f6….jpg)

File: 0abcf72abeb0a00⋯.jpg (2.91 MB, 2549x3441, 2549:3441, 1401164079245.jpg)

File: 8a961c758fbf51d⋯.jpg (1.97 MB, 2525x3428, 2525:3428, 1401164370016.jpg)

File: 4e591d4c2ff0909⋯.jpg (2.38 MB, 2358x3156, 393:526, 1401168203018.jpg)

more art



