The Great Race

Background information:
Search and track the following term (dictionary definition linked to give a solid start):

Search and track: The Culture War

Very important:
https://www.youtube.com/watch?v=2g67LwjzV0I
Summary: There is only one standard of morality: “Are you a good communist?” No other law or moral standard exists. In other words, if you are a good communist, murder is basically a $50,000 speeding ticket.

Things you should be tracking:
real estate bubble
retail apocalypse
debt bubble
inverted yield curve (see the sketch below)
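
On that last item, here is a minimal sketch of the usual definition, just as a quick reference: the curve is called "inverted" when short-dated Treasury yields rise above long-dated ones, most commonly measured as the 10-year yield minus the 2-year yield going negative. The yield numbers in the example are made-up placeholders, not real quotes.

```python
def yield_curve_spread(ten_year_yield: float, two_year_yield: float) -> float:
    """Return the 10Y minus 2Y spread in percentage points."""
    return ten_year_yield - two_year_yield


def is_inverted(ten_year_yield: float, two_year_yield: float) -> bool:
    """The classic warning sign: the 10Y minus 2Y spread has gone negative."""
    return yield_curve_spread(ten_year_yield, two_year_yield) < 0


if __name__ == "__main__":
    # Hypothetical example: a 2.4% 10-year yield against a 2.6% 2-year yield.
    print(yield_curve_spread(2.4, 2.6))  # -0.2 -> the curve is inverted
    print(is_inverted(2.4, 2.6))         # True
```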

You should also study and/or refresh yourself on what an economic collapse is and what it looks like.

Many years ago, my subconscious gave me a glimpse of the eve of the singularity. Who can say whether this is even close to being right, but the gist of it is that the time will be overwhelmed by a profound sense of anxiety. Regardless of the outcome, people will feel more and more powerless as the phase shift takes place. I do not advocate a “machine phase life” type of phase change; rather, I am referring to a situation where it is not even possible to retain the existing culture, and this, rightly, freaks people out.

There is a slightly-better-than-mediocre game called Frostpunk ( https://store.steampowered.com/app/323190/Frostpunk/ ). At the beginning of the game, you care about a lot of things: gathering resources, building things so they run efficiently, scouting, harvesting, etc., etc. At the end of the game, when the temperature goes to hell and everyone is screaming at you about this problem or that problem, you are like “SHUT IT ALL DOWN! Don’t like how I’m doing business? Come back tomorrow morning…” As the temperature falls, less and less matters; at the end, the single thing you care about is the overload gauge on your city’s central heating plant and whether you can make it last the next eighteen hours when you know you will absolutely have to run it flat out for the last twelve…

It would be wonderful if our civilization were not in the process of self-destructing as spectacularly as it is now. Well, is the boiler going to hold until dawn, or do we have to throttle back a bit now to get some elbow room to run it all out later? Don’t know what I’m talking about? That’s the first problem. We need to figure out the variables that ACTUALLY MATTER: the variables that pose existential risks likely to manifest, and the variables that will determine whether each of us can reach good outcomes post-singularity.

Obviously, the situation we have is a complete CF, while what exactly we’ll need from AGI is still open for debate in a lot of ways, and there’s no telling when it’s coming, either. So yeah, that’s the problem I see. It’s not just AGI safety, or getting to AGI; it’s also keeping enough civilization going long enough to get the chance to actually use it.

I think that’s very important. But who is going to do that, really? There are a few organizations that seriously deal with x-risk amelioration. Working with them might make sense.

Still, I feel that something more fundamental is missing here. We seem to assume that humanity and humans want to survive. But is that really true? And if so, why? Why do you want to live? What is there in this world that makes it worth struggling with immense personal, local, and global challenges, now and in the future?

We are different persons. We want different things. What one loves, others may abhor, and vice versa. And that’s just the tip of it. Even as individuals we are not monolithic beings. Sometimes we want one thing, and the next moment we want effectively the opposite. Our desires and moods can shift, sometimes frighteningly fast. Levels of deprivation and saturation determine what we are motivated to seek. We usually seek something that we don’t have. And once we get it, we look for something else to strive for. That’s how it is.

One can try to escape from this maelstrom of desire. I can tell myself that I most deeply seek wisdom and strength. But then I push myself too hard and feel my body and mind getting brittle from the exertion, rather than getting stronger. We need to moderate ourselves, even in the pursuit of our highest goals. Being human is not easy.

Of course one can argue that this doesn’t matter if there is no world left for us to live in. True, but the alternative, a surviving world with no sufficiently worthwhile life in it, is similarly bad. We want things that make us feel good. Yet those are the things we all too easily get addicted to. And then we lose our freedom, or even degenerate. Staying strong in a world like this is hard. The gains we enjoy by getting stronger just make it easier for us to enjoy the next level of vices that sabotage our strength. On the other hand, pure asceticism doesn’t work either, because if there is no subjective reward, where’s the motivation to continue on that rocky path?

Psychology is a deep and nasty mess. And technology will make it much worse, before it can make it any better.

And through all of this my current depression is rearing its ugly head. My health has reached a plateau that’s enough for existing comfortably, but not enough for making significant progress. It’s frustrating to see and understand all those limitations and biases, yet be unable to use that knowledge to solve them.

Fixing the world and fixing yourself are two challenges that I increasingly see as equivalent. If you can fix one, you have already fixed the other.

Improving true human health would fix a lot of problems, but I don’t see that coming any time soon. Our bodies and minds are so complicated that we can hardly get any deeper understanding of how they really work in all their complexity. Understanding and piercing through complexity is the area in which we probably need AI the most. The irony of the situation is that AI is one of the most complex problems there are. So, we are forced to bootstrap our understanding and technology. The currently most effective method for that is called science. We need more of that! In all areas.

More science, less culture war.

There is definitely some meat to that question. A distressing number of people seem to be actively fighting against institutions and moral values that, in many cases, are absolutely vital to the continuation of our civilization. It is difficult to comprehend.

Practically, however, we need to realize that life requires certain things to be asserted independently of any externality, and then do what we can to solve the problems those assertions cause.

The problem is that we don’t have the means to do serious science in a mountaintop bunker. Granted, I’ve sometimes fantasized about building a mountaintop academy: how I would screen applicants, as well as some of the cool technical problems of building in an extreme location and maintaining a livable environment without too much expense. In reality, we need to do science among filthy, stinking, irrational humanoids… This actually requires that these humanoids have a fundamental level of respect for the scientific process and allow you to do your science the way it needs to be done, without burning you at the stake for reaching politically inconvenient conclusions. So yes, the culture war is a Problem that needs to be dealt with one way or another. =\

Let’s try a thought experiment. Assume that we as simple human beings only have an illusion of control over the future, but no real control. Assume further that the development of civilizations depends critically on initial conditions. As far along as we already are, we are then probably way past the point at which we can have any influence on the eventual outcome. In other words: we just follow a trajectory of cultural development that was predetermined quite a while ago, possibly dozens of millennia ago.

With these assumptions there is nothing we can do. We are doomed to whatever destiny has been paved for us by our past. So that’s not very interesting as a mental scenario, except for the remote possibility of actually estimating what kind of trajectory we are on. But even if we managed to compute our future, we would be unable to change it.

So, let’s weaken our assumptions. Assume that the development around the technological Singularity depends critically only on the parameters shortly before that Singularity. Then we still might be able to influence its eventual outcome. Yet we haven’t even gotten as far as agreeing on what kind of Singularity would be best. The divergence in our values precludes a harmonic Singularity from happening. So we are stuck with a dysharmonic Singularity, meaning that billions of humans will be totally pissed off (or much worse) by the events that unfold during it. So far, so unsurprising.

Now we are faced with the prospect of a dysharmonic Singularity. So what? Some people would want to prevent a Singularity altogether if they can’t control it. Let’s call these people the Anti-Singularitarians, and the rest of the world, which isn’t categorically opposed to an uncontrollable Singularity, the Singularitarians. It’s not clear which faction would win.

Scenario 0: Anti-Singularitarians win

Some crazed alliance of Anti-Singularitarians wins. The Singularity is postponed indefinitely. It could be Islamists taking over. Or Chinese communist control freaks. Or postmodernists of any particular flavour that can’t tolerate a superintelligence interfering with their plans. Or it could just be a rise of excessive conservatism or humanism that holds that present-day human existence is the best we can ever have, and that every deviation from it is just decay.

In such a scenario we aren’t absolved of any of our other problems. All local and global problems will hit us with their full force, unameliorated by any superior intelligence that could otherwise guide or even save us. Inequality will still be rising. Sea levels will still be rising. Global temperatures will still be rising. And technology and science will be locked in strong chains preventing them from creating any kind of Singularity. The same chains will also prevent them from providing us with any kind of solution to those serious challenges. In other words, we will have to solve these extreme problems with roughly the tech level we currently have. The only version of this I could see working out not too catastrophically is one that adopts thorium-based nuclear fission, or with some serious luck nuclear fusion, as the primary power source. Total surveillance is unavoidable. Any deviating minds will be silenced. Dirty geoengineering solutions are applied. Global ecosystems deteriorate, but to a level which is barely survivable for a couple of billion humans. Extreme wars and gigadeath will happen, but humanity will muddle on.

Until the system collapses. And then any kind of future will be open again.

Scenario 1: Singularitarians win

In this scenario humanity is crazy enough to seek its chances with an essentially uncontrollable Singularity whose results may be worse than any kind of hell that humans can come up with. On the other hand, the results may be better than any kind of heaven that humans can dream up. The result of this scenario is much, much, much less predictable than that of scenario 0. It’s all across the board. The spread of possible outcomes is extreme and only hope is left.

Conclusion

Considering what is at stake, it’s not surprising to see nearly absolute insanity of every kind unfolding in this world at this very moment, as the Singularity approaches.

If we are lucky higher forces will protect us. If not, we are left to chaos.

But at least in chaos our greatest hopes can still be fulfilled. Don’t ask about our chances, though…