On managing personal identity

My life right now is pretty pathetic. About the only way I could be doing worse is if I were a hard-core substance abuser. Mostly I'm playing Elden Ring, even though I have very mixed feelings about the game. Even while wasting most of my life on that, my brain still comes up with important issues to raise and problems to solve about how our society moves forward. The thing is that so few people seem to be thinking seriously about how transhumanism will actually work that it's frightening. The US Transhumanist Party just keeps reiterating how good longevity is… The problem is that longevity is not actually a real motivator for most people, so the message falls flat.

You know what? I'd love nothing more than to be slapped in the face with a thick ream of papers about all the subjects I've raised throughout my 25-year involvement with transhumanism, and more that I haven't thought of… At least I'd feel a little less lonely, knowing that there is another mind out there thinking about these things too. =\

Today I'm going to ramble about how practicing transhumanism will create challenges for legal and societal structures that were built on previously sound assumptions about the human condition. To that end, I will gloss over the technology actually employed: I will assume the user has selected a technology from several available options and explore the ramifications without going any deeper. I do this to keep the discussion relevant, as much as possible, to broad classes of technologies. At the same time, I insist that all technologies under discussion are real, working technologies that perform as advertised, without any form of make-believe. Explicitly excluded from discussion are contemporary notions of transgenderism rammed down everyone's throats by the political left. Those are just vanity efforts that are really quite disgusting once all the layers of fantasy are removed.

Of special interest is understanding and protecting personal rights in a broad set of circumstances, while also acknowledging the need for a solid understanding of civil and criminal liability. Complicating these issues are differences in personal philosophy and in the specific technical setup each person is using. For the sake of argument, let's say you go on a crime spree and shoot three people on the street. The first was a human, so you get one count of first-degree murder. The second was an instance of an upload; the victim's backup copy was fairly fresh and decided not to press charges. The third was part of a cybernetic network that made a partial recovery from the attack but still suffered significant data loss and loss of functionality, and is really pissed. The situation can just as easily be reversed, with you as the victim. The state should operate in the mode of protecting victims from perpetrators. Given the nature of the world we live in, the only available mode of such protection is to punish perpetrators after the incident. Within that model, how would the state determine and carry out sentences across such a wide diversity of kinds of people?

Setting aside the most dramatic examples, even ordinary situations present important and challenging, though not sexy, problems to address. To date, the government can apply an essentialist philosophy to manage the citizenry: a person is a name, an SSN, and a collection of properties, contracts, debts, and assets. Consider the list of traits on your driver's license. (Eek, mine is expired. Urk! Anyway, it probably doesn't matter at this date…)
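The essentialist model described above can be sketched as a simple record. This is only an illustration of the idea that the state reduces a person to an identifier plus a bag of holdings; all field names here are hypothetical, not any real government schema.

```python
from dataclasses import dataclass, field

# A minimal sketch of the essentialist model: a person is a name,
# an SSN, and collections of properties, contracts, debts, and
# assets. Everything here is an illustrative assumption.
@dataclass
class LegalPerson:
    name: str
    ssn: str
    properties: list = field(default_factory=list)
    contracts: list = field(default_factory=list)
    debts: list = field(default_factory=list)
    assets: list = field(default_factory=list)

alice = LegalPerson(name="Alice Example", ssn="000-00-0000")
alice.assets.append("one aging driver's license")
```

The interesting failures start exactly where this record assumes one body per identifier: the schema has nowhere to put a second body, a backup copy, or a partial self.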

Consider the case where you can have several bodies. A sub-consideration is whether you have a means to operate them concurrently. An expected use case is one where it is infeasible to transform your current body in the way you want, so instead you construct your new body separately and use technology indistinguishable from magic to transfer as much of yourself as possible into it. The characteristics of each body are beyond the scope of this discussion, but ideally you would want the new body not to have to wait 16 years to get a driver's license (if relevant), and to smoothly inherit all assets from your current body as it ages out or you decide to retire it.

Anyway, that’s all I have space and time to discuss here, though I may amend this post later. It should be decently good fodder for discussion.


Yes, but maybe that’s a variable adjusted by the simulators to motivate people like us to become more active :wink:

I think you might be interested in my current novel project called "Countersurge". It's mostly about AI and AI safety, but it also touches other interesting topics, for example how collective intelligence or reputation incomes change the way society operates.

I think in such cases we should distinguish between the damage to hardware and the amount of data lost. Hardware can be compensated for by its equivalent monetary value, as long as the hardware can actually be replaced. That's why I think the purely "material" damage is the smaller problem. The greater challenge is how to deal with the lost information or data. If it's truly lost, there's no certain way to judge the value of what was destroyed by the offense. It might be possible to reconstruct the data approximately by guessing its possible content and checking it for consistency with clues distributed in the environment (for example, recorded public behavior). The value of that information could be judged by the effort it would take to generate a comparable amount and quality of data. You could then multiply that effort by a certain factor, let's say 10, to gauge the size of the compensation for the offense.
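The heuristic above reduces to a one-line formula: value the lost data by the effort needed to regenerate comparable data, then multiply by a punitive factor. A minimal sketch, where the hours, rate, and the factor of 10 are all illustrative assumptions from the post, not any established legal standard:

```python
# Sketch of the compensation heuristic: compensation equals the
# estimated effort to reconstruct comparable data, times a
# punitive factor (the post suggests 10).
def data_loss_compensation(reconstruction_hours: float,
                           hourly_rate: float,
                           punitive_factor: float = 10.0) -> float:
    """Estimated compensation for irrecoverably lost data."""
    effort_value = reconstruction_hours * hourly_rate
    return effort_value * punitive_factor

# e.g. 200 hours of reconstruction work at 50 per hour, factor 10:
print(data_loss_compensation(200, 50))  # -> 100000.0
```

The factor does the deterrence work here; the effort estimate only anchors the base value.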

Deterrence is only one option. Other options are to prevent people from being able to harm others, either by restricting their access to instruments with which harm can be caused, or by directly inhibiting their actions shortly before they are about to harm someone. Alternatively, people could invest more in personal security: wear bulletproof armor and helmets, carry effective weapons, fortify their homes, and make backups of their data (and mind). I'm not sure why you want to restrict the discussion to deterrence alone.

What problems do you see with such a setup? Anyway, I think the situation depends a lot on your ability (or lack thereof) to control multiple bodies at the same time. But even that consideration doesn't change too much, I think.