>>67
Entropy. Desirable states are an extremely small subset of all possible states, so a major change will land in an undesirable state unless it is deliberately aimed. Personal power increases the size of the changes one can make, with no built-in preference for desirable outcomes, because physics is amoral. Greater power therefore means a greater ability to create undesirable states, moving past the local mediocrity of murder and genocide toward planetary or universal annihilation. This is not ideology; this is physics. Unless the universe is benevolent, an increase in power increases the relative ease of destruction.
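The combinatorial point can be sketched with a toy model (every number here is an illustrative assumption, not a claim about real physics): treat a state as a 20-bit string, declare a tiny set of those strings "desirable", and model a sweeping, value-blind change as a jump to a uniformly random state. The hit rate is then just the desirable fraction of the state space.

```python
import random

random.seed(0)

N_BITS = 20                 # toy "state" = 20-bit string, so 2**20 possible states
DESIRABLE = {0, 1}          # assumed tiny desirable subset: 2 states out of ~1e6

def major_change():
    """Model a large, morally indifferent change as a jump to a uniform random state."""
    return random.getrandbits(N_BITS)

trials = 100_000
hits = sum(major_change() in DESIRABLE for _ in range(trials))

print(f"desirable outcomes: {hits}/{trials}")
print(f"desirable fraction of state space: {len(DESIRABLE) / 2**N_BITS:.1e}")
```

With these toy numbers the desirable fraction is about 2e-6, so even a hundred thousand unaimed major changes almost never land well; growing the power (N_BITS) only shrinks that fraction further.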
>>66
A utopia could not be run by fickle, hyper-selfish humans (people who seek not just personal gain but others' ruin). Either forcibly change human psychology so that people stop preferring $50 for themselves over $100 for everyone, or hand rule to an artificial intelligence. The AI would keep the universe it makes available to (post-)humans benevolent, so great power could be held safely.
>>98
1. They listen too much to base emotions and initial judgments, which are usually colored by their ideology.
2. I'm autistic enough to note they're powered by glucose, that's for sure.
3. No. Morality is not located in the limbic system. People willingly overpower their feelings for a reasoned greater good, and are often right to do so (e.g. overcoming addiction, doing your homework, etc.). It is separate.
>>120
Look up Kahneman's System 1/System 2 classification. Morality (perhaps "ethics" is the better word) lives in System 2: it is slow and deliberate. But you can use it to motivate action.