Discussion about this post

Victor Breum

When reading the remarks about sci-fi stories that turn bad once they move beyond relatable human characters, it struck me how games actually seem better suited for these scenarios than stories, as the player is their own "relatable" agent interacting with a foreign system. Obviating that need for characters. It then struck me how that's what Universal Paperclips does, and well, that was a nice mental loop back to the article.

Thanks for the transcriptions/summaries, useful and entertaining, especially the sic burn. The idea of partiality across species will stick with me, I think.

persona non-sequitur

The core of Robin's argument is intriguing.

The ways in which cultures have evolved over time are a topic I have found very interesting to ponder for a while. The contingency of our current moral intuitions, and the genuine diversity of possible ways of being, is something that is easy not to explore. There are two particular people I think of when considering this topic. One is of course Nietzsche, but another is the historian Tom Holland, who makes the case that much of the cultural water we swim in (in the west) is a consequence of Christianity — in particular, the very notion of universal human dignity, or even the distinction between the secular and the non-secular. It is interesting to consider the argument that Christianity's universalism may have given it some sort of evolutionary advantage, allowing it such influence in the west.

So to me, the idea that the worldviews and morals of people in the west before Christianity were in some sense fundamentally different from our own seems rather plausible. That said, I also think it reasonable that a lot has stayed the same. They still had families that they loved, enjoyed friends, felt envy and shame and pride and lust and heartbreak and fear, et cetera.

Robin is getting at a fundamental difficulty of universalism when taken to its limit. At what point do we stop extending the reach of moral value? If every possible configuration of being is worthy, do we end up with a kind of nihilism where the idea becomes meaningless?

I like the idea of extending the moral value of beings beyond human. To animals and plants and things. Are they still morally less than human? Why? Because we are conscious? Do we only extend moral worth to AIs that are also conscious?

My hunch is that morality is an imperfect generalisation of the kinds of relationships people form with people they know. People make it up (and just because we make it up doesn't mean it's unjustified; it is just that people bear the responsibility for the justification, not some absolute truth). So we shouldn't expect this stuff to generalise too well, and I don't try to think too hard about the precise universal conditions for moral valuehood.

And to some extent it can be healthy to have a limit on how general one's moral universe is. It is healthy to have some notion of what one values.
