Mind Merging, Definitions of Identity, and their Societal Consequences

Continuing the discussion from Transhumanist thinking:

The following Aeon article seems to be relevant in this respect:

## Hive consciousness
#### New research puts us on the cusp of brain-to-brain communication. Could the next step spell the end of individual minds?

So, recent research seems to point towards the difficulty of maintaining your individual identity once you hook up with other minds via a broad neural connection. That sounds scary indeed, because as the article ends:

Immersed in that pool – reduced from standalone soul down to neural subroutine – there might not be enough of you left to even want to get out again.

But what if it really feels totally awesome to be a higher consciousness that consists of multiple previous individuals? Isn’t that like catching a glimpse of actually becoming one with the cosmos? Shouldn’t those who are interested in actual self-transcendence look forward to that happening?

Yet, isn’t this just a glorified way to commit suicide and give birth to a whole new person? You might not be able to predict how that new person will turn out, or whether you would even like being a part (subroutine) of it. But then, we are also not very good at predicting how our own children will turn out, or the AIs we create.

## Identity

A big issue for society when it comes to mind merging is how to deal with the questions surrounding the identity of those new group/hive minds. Does the hive inherit everything its component “subminds” possessed? What if the fusion is temporary and minds can regain their individuality? Would we treat that event as something like a divorce? Is the old identity of the joining and then quitting mind continued, or is the mind that enters the hive a different person from the one who leaves it?

Should the voting power of a hive mind be 1, or the sum of its constituent “subminds”, or something in between (something like the logarithm of the number of its neurons divided by the logarithm of the number of neurons of a standard person)?
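
For illustration, here is a minimal sketch of how that in-between rule would behave, assuming a standard human brain of roughly 86 billion neurons (that constant and the function name are just my assumptions):

```python
import math

STANDARD_NEURONS = 86e9  # rough neuron count of a typical human brain

def hive_voting_power(hive_neurons: float) -> float:
    """The "in between" rule from above: log of the hive's neuron
    count divided by the log of a standard person's neuron count."""
    return math.log(hive_neurons) / math.log(STANDARD_NEURONS)

# A hive made of ten standard brains gets ~1.09 votes under this rule,
# far closer to 1 than to the 10 votes a per-submind count would give.
print(hive_voting_power(10 * STANDARD_NEURONS))
```

Notably, this rule grows so slowly that even a million merged minds would only get about 1.5 votes – which may or may not be what one wants.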

Should there be some kind of time limit after which a joining mind becomes an integral and inseparable part of a hive mind?

Would it be safe to join a hive mind for just a fixed short time period like an hour or a minute, to find out whether one really likes the experience? After the time is up, you are automatically separated from the hive and thrown back into your own individuality.
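
If such a trial period existed, the separation mechanism would presumably have to be armed before joining and sit outside the hive’s control, so that “enough of you is left to get out again” by construction. A toy sketch of that protocol, where `hive.connect` and `hive.disconnect` are purely hypothetical interfaces:

```python
import threading
from contextlib import contextmanager

@contextmanager
def trial_merge(hive, mind, duration_s: float):
    """Hypothetical trial-join protocol: link `mind` into `hive`, with a
    hard timeout that forces separation even if the merged mind no
    longer wants to let go."""
    # Arm the timeout *before* connecting; it runs in a separate thread,
    # outside the merged mind's influence.
    timer = threading.Timer(duration_s, hive.disconnect, args=[mind])
    timer.start()
    hive.connect(mind)
    try:
        yield hive
    finally:
        timer.cancel()
        hive.disconnect(mind)  # assumed to be a no-op if already separated

# Usage: a one-minute taste of the hive.
# with trial_merge(hive, me, duration_s=60.0):
#     ...experience the merged state...
```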

## Aliens among us?

Would “normal” humans treat those hive minds as something completely alien? As weird technological artefacts like artificial general intelligences or uplifted non-human animals? How would society deal with those “newcomers” – especially when considering that normal humans would probably be quite scared of them? What rights would it be willing to grant them? Wouldn’t it try to restrict their rights to maintain the privileges of ordinary humans? Probably yes. Do we need to anticipate decades of new civil rights struggles for these new entities? I think so. Will all of that slow down progress? Most likely.

The adoption of these new technologies and entities will be rather slow due to the societal issues that come with them. Or won’t it?

Wouldn’t married couples want to merge, no matter what? Wouldn’t humans want to become one with their AI assistants?

The future we might end up in would be rather strange. But I don’t see it ending in huge catastrophes. What seems mind-bending and incomprehensible to us now might soon become the new normal for our future selves and our progeny.


I suspect it wouldn’t feel all that different from how we feel as individuals. There’d just be more activity going on than before and, of course, more total capacity. There’d probably be a very rapid convergence of beliefs in the individuals in the group. I suspect it’d be best not to add too many ex-individuals at once, though. I also suspect that one of the individuals would need to be relatively skilled in self-introspection and to have a deep understanding of how the mind works. Otherwise you might end up with a mad and/or depressed or otherwise psychotic multi-person for quite some time.

If this can be called suicide, then you’re committing suicide every time you communicate with someone else. It’s not something new, really. Just a much more effective way of communication than we’re used to.

At least at first, I suspect it would be prudent to require each individual to spend a couple of hours per week disconnected from the group. That way they can join the merged mind while still retaining the capacity to function on their own. I think it can be compared to being cut off from the internet after getting used to it. As long as you don’t allow yourself to become too dependent on the internet for basic living, you can make do without, even if you don’t like the experience.

I’m a bit uncertain whether you wanted to speak of identity as a social construct or as a mental construct. I suspect that at first, there’d be no legal framework for sharing ownership of things in this sort of an arrangement. Even when there is, it’s probably prudent to wait until all individuals agree to share their property before arranging anything of the sort.

Identity as a mental construct, however, is a tool. If it’s useful, it stays; if not, it goes. Simple as that. I suspect each member of a merged mind would keep some kind of an identity for interacting with non-mergers or other merged minds. There’d perhaps also end up being a shared identity that was used for conversations between merged minds that are not mutually connected. We already have examples of this in the form of companies, which do form a kind of identity for themselves.

I suspect it might take far longer than that before you even start noticing the other members of the merged mind after joining for the first time in your life, and far, far longer before there’s any chance of losing your individuality. Even if you do lose it, it’s by choice. You’ll just realize you don’t need it anymore and proceed to toss it.

I suspect both of these will happen. I also suspect that once the first married couples start to be fully merged, they might begin looking into cross-connecting with other married couples, forming an even bigger “marriage”. In this case, both sides will already have an idea of what will happen in the process, so it’s not an unknown anymore.

That’s optimistic. Humans can hold conflicting beliefs within one single brain just fine. Being conflicted or having compartmentalized beliefs in a multi-brain mind should be even easier.

Right, if people cannot even figure out and fix the contradictions in their own thinking, then they will have a hard time making a coherent whole out of a conglomerate of minds.

That reminds me of the following quote:

“No man ever steps in the same river twice, for it’s not the same river and he’s not the same man.”
– Heraclitus

If we take this seriously, we die every time we talk with someone – even if we talk with ourselves. The mind after the act of communication is not quite the same as the mind before it. Anyway, communication is unavoidable. Without any external or internal communication one would be pretty dead. Life is a process of perpetual dying and rebirth.

On the other hand, we don’t lose control over our own individuality when we talk with other people, unless they manage to take possession of our will somehow… but even then, the effect is usually only temporary – Stockholm syndrome being one notable exception.

Why should the ability to function on one’s own be seen as necessary or even valuable after one has merged with a group mind? It sounds about as meaningful as being able to function using only one hemisphere of your brain when you actually have two functioning brain hemispheres.

So, you disagree with the message of the article that strongly connected brains give rise to a singular mind / person / identity? What makes you think that?

It’s certainly possible to hold conflicting beliefs. I just think that being able to communicate more effectively with other people will accelerate the rate at which those conflicts get resolved.

Individuality is something you control? … I’m not quite sure what to think of that idea yet. I’ve been thinking of individuality as a particular mix of beliefs. Each brain will form those on its own. Communication just makes many things more efficient in that respect. Assuming, of course, that the communication is not false.

Anyway, I think there’s something fundamentally wrong with the notion that being more connected to other people would make you lose your individuality by force.

Well, if you assume perfectly reliable communication links, perhaps it’s unnecessary. We don’t currently have those, though. It’d also serve to allow for introspection without interference from the merged mind so you can be sure that you want to continue being linked to it. Perhaps even a necessary breather that helps with adapting to the situation.

Some people have had the link between the hemispheres (the corpus callosum) surgically cut. As far as I understand, they continue to function without major problems. But anyway, as I stated above, it’s a matter of the reliability of the communication link. In the case of a brain, that link is rather reliable.

No disagreement here. I just don’t think it’d necessarily happen fast or that it’d be inevitable. The people involved in such a merged mind would most likely end up understanding each other to a much deeper degree than people who’re not part of the merged mind. The deeper the understanding, and therefore the trust, the less need there is for separate identities. So, I’d consider it strange for that not to happen, but I wouldn’t be surprised by it either, because it most likely is possible to keep from having a deep understanding of the others, despite the link.


I think those are the most interesting statements in your reply, which I otherwise pretty much agree with. Individuality as “something you control” is a perspective that I came up with via a thought experiment (inspired by reading the novel Diaspora by Greg Egan), in which sentient beings can perceive every thought and feeling other sentient beings experience, too, as if they were their own. How do you define a boundary between “you” and “not you” in such a situation? By testing out your area of control over the actions of the sentient beings that are present. What you are able to control is “you”. What you can’t control effectively is “not you”.

Of course, you can try to “control” the actions of any sentient being indirectly via sending messages, for example. But that’s only influence, not real control. You only have control over your own actions (if at all).

Defining individuality as a set of beliefs or preferences is also a valid approach, but since beliefs and preferences can change a lot over time, this gives rise to a very fluid sense of individuality. That’s not any kind of judgement on that idea of individuality. In fact, I see different definitions of individuality as tools (for whose ends? – it depends).

Nevertheless, there is a definition of identity that I find very poetic: you are your own complete history.

Regarding losing your (belief-)identity in a group mind, I think it may happen quickly that you as a group mind act on beliefs that you as an individual do not adhere to. It would be interesting to think about the psychological consequences of that happening. I assume it must feel like some weird kind of group pressure. Less “fortified” minds might find themselves quickly following the lead of the general group-mind consensus, even if it clashes with their previously held beliefs and values.

This sounds to me like it’d require extremely dense linking – perhaps neuron-to-neuron linking between brains – to actually be possible. That’d essentially make them the same brain. Well, perhaps you could get the effect with slightly less intensive linking, but a brain as a single physical unit starts to feel completely unnecessary if such linking is possible.

Ok, so this includes not just beliefs but also memories. Did I miss something?

Ah, yes, kind of like employees in a company might well act in ways that have nothing to do with their personal beliefs.

Group pressure of some kind, most likely. Whether it’s weird or new in some way, I wouldn’t be so sure. This does happen when people act as a group, even today.

Yeah, a lot, I think, depending on the extent of a “snapshot” of your identity:

  • Your personal preferences
  • Your desires, wants, and likes
  • Your knowledge about the world
  • Your skills
  • Your habits
  • Your plans, hopes, and dreams

All of these are mental concepts. When we include the physical realm, we would also need to consider:

  • Your genome
  • Your microbiome
  • Your connectome
  • The status of all parts of your body

And that’s only one snapshot in time. If you are your whole history, you are at least the complete history of all of these things, regardless of whether you can currently remember them or not. And we might also need to add your history of interactions with people, the environment, and the technium.

That’s what I mean with “you are your own history”.

It certainly depends on how you link people’s brains together. If you link them so that inputs from other brains are classified as “foreign”, or weakened substantially, then it should be easy to retain relatively distinct identities. Also, people should be able to choose which parts of the cognition of others they want to perceive. You’d have to consciously “tune in” to others to perceive what they perceive – and only if they allow that kind of access to their cognition. There should be a sophisticated system that can be used to define different kinds of access rights to one’s brain at a very granular level.
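
Everything about such an interface is of course speculative, but as a sketch, those granular access rights could look something like a per-peer access-control list – all channel names here are invented for illustration:

```python
from dataclasses import dataclass, field
from enum import Flag, auto

class Channel(Flag):
    """Hypothetical slices of cognition a mind could expose."""
    NONE = 0
    PERCEPTION = auto()  # what the peer currently sees and hears
    EMOTION = auto()     # affective state
    MEMORY = auto()      # long-term memories
    INTENTION = auto()   # plans and motor intentions

@dataclass
class BrainACL:
    """Per-peer access rights for one mind in the link. Inputs on
    channels without a grant would be dropped, or attenuated and
    tagged as 'foreign', as suggested above."""
    grants: dict[str, Channel] = field(default_factory=dict)

    def allow(self, peer: str, channels: Channel) -> None:
        self.grants[peer] = self.grants.get(peer, Channel.NONE) | channels

    def revoke(self, peer: str, channels: Channel) -> None:
        self.grants[peer] = self.grants.get(peer, Channel.NONE) & ~channels

    def permits(self, peer: str, channel: Channel) -> bool:
        return channel in self.grants.get(peer, Channel.NONE)

# Let a partner tune in to emotions and perception, but not memories:
acl = BrainACL()
acl.allow("partner", Channel.EMOTION | Channel.PERCEPTION)
assert acl.permits("partner", Channel.EMOTION)
assert not acl.permits("partner", Channel.MEMORY)
```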

That we are discussing group pressure in the context of group minds is an unsettling prospect. Can we assume that people join group minds voluntarily? What does “voluntary” even mean exactly?

Coming back to this thread after a while. I kind of burned myself out thinking about these after that last message.

Be careful: if you go far enough, you’ll end up including in this definition everything that has ever happened in the whole universe… Well, that being said, it probably already is included. I intended to discuss this further, but once I realized this, it just kind of felt pointless to try to go further down that line of thinking. Identity is rather messy to define, it appears.

To underline this point, here’s a link to an article: http://edition.cnn.com/2013/05/07/tech/brain-memory-implants-humans/index.html

The things described in this article are rather mild: an implant that merely fixes long-term memory for some patients. That doesn’t sound too important, until you realize that this device can do everything that’s needed to give any number of humans a single set of long-term memories – at the very least for anyone who’s implanted with the system from a very young age. I wonder how long it’d take for a baby to reach an adult level of intelligence with access to several lifetimes of memories.

Not quite a shared mind, but shared memories go a long way towards blurring the contours of identity. I suspect this alone would be enough to completely decimate our current ideas of identity. People would likely learn to tell apart which memories are from which body. However, I suspect most would not think that to be an especially important detail.

In computer terms, that’d be like a shared hard drive but individual RAM and processor caches. It’s also not difficult to imagine what it’d be like to remember someone else’s memories like that. Just take one of your own memories and imagine you were someone else.
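
To push the analogy a little further, here is a toy model of that architecture – one shared long-term store, private working memory per body, with all names invented:

```python
from dataclasses import dataclass, field

@dataclass
class SharedStore:
    """The 'shared hard drive': one long-term memory pool for everyone."""
    memories: list[str] = field(default_factory=list)

    def record(self, body: str, event: str) -> None:
        # Memories stay tagged with the body they came from, so people
        # could still tell them apart - even if they rarely care to.
        self.memories.append(f"[{body}] {event}")

@dataclass
class Person:
    """The individual 'RAM and caches': private working memory,
    backed by the shared long-term store."""
    name: str
    store: SharedStore
    working_memory: list[str] = field(default_factory=list)

    def experience(self, event: str) -> None:
        self.working_memory.append(event)    # private and short-lived
        self.store.record(self.name, event)  # consolidated for everyone

    def recall(self, keyword: str) -> list[str]:
        # Recall searches everyone's past, not just your own.
        return [m for m in self.store.memories if keyword in m]

store = SharedStore()
alice, bob = Person("alice", store), Person("bob", store)
alice.experience("learned to ride a bike")
print(bob.recall("bike"))  # ['[alice] learned to ride a bike']
```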

You know, I’m starting to wonder if “voluntary” is a term that’s even meaningful for this discussion. There have been times when even my own mind has been internally conflicted. That often ended up in a kind of forced choice of one of the options. I can’t quite say those were voluntary choices but it doesn’t quite feel correct to call them involuntary either.

So, in an event where people’s minds have already been joined, it’s very difficult to fathom what exactly “voluntary” means anymore. The separation between entities that the word seems to imply simply isn’t there.
