
Hi everyone! I've been reading rational fiction for a while, and it was an important part of how I found the EA community. I'm currently working on a podcast about how rational fiction and EA interact, and I've come across several grants and writeups about the effects and processes that rational fiction entails (see here and here). It was also great to see the discussions about connecting EA and art through the EAGxVirtual Slack and Unconference these past weekends.

I'm wondering what experiences people on this forum have had with rational fiction (creating, discussing, or reading it), and whether people would be willing to share their stories in an audio format. In particular, what do people think about the following:

1. The learning/information-gathering impact of reading rational fiction as compared with more traditional formats of conveying information (blogs, essays, sequences).

2. Leaning on the emotional impact of stories, artwork, and other expressive forms to promote doing the most good (as opposed to reasoning about what does the most good and leaving instinctive emotions out).

3. The effects of rational fiction on interpersonal interactions. A lot of fiction is character-driven, so I'm curious how your interactions with others, whether EA or not, have been affected by rational fic.

Feel free to answer any or all of these, or just give general thoughts. Again, I would love to compile some audio for this project, so please reach out if you would be willing to share your story in that way.

3 Answers

I shared some thoughts on this topic on a similar thread posted last year. An excerpt: 

"The key is that you need to show people using an EA mindset (thinking about consequences and counterfactuals, remembering that everyone is valuable), even if they aren't working on EA causes. Show people characters who do incredible things and invite them to contemplate the virtues of those characters, and you don't need to hammer too hard on the philosophy."

...so I suppose I'd say that (1) is important, but mostly when blended with (2). Rational fiction isn't uniquely instructive; instead, it takes lessons a reader could learn in many different ways and drives them deeper into the reader's identity than other media might be able to. There's an element of "I didn't know people could be like this" and an element of "this is the kind of person I want to be." 

I'd guess the second element is more important, since most people have heard about actual moral heroes outside of fiction, but they may not have a sense of how such people think about/experience the world.

The biggest effect of rational fiction for me was feeling the “warm” glow of the ingroup in the fiction I consumed. I could empathize with the characters. I think this kind of effect is inherently good, as feeling like you’re a minority with no culture is bad and encourages homogenization.

I was already a “rationalist” before reading rational fiction, and the fictional works have always struck me as much weaker than the content on our forums. On using them to “convert” other people to rationality … well, one of my friends really took to HPMOR, and they did have a high point of vowing “to always be a scientist,” but it had no observable lasting effect, and they didn’t even finish reading HPMOR. HPMOR is also probably the most educational of all the current stories.

Hi, I'm curious what you mean by "weaker": does it teach fewer concepts, teach them less well, or maybe the concepts don't stick as well in readers' minds? Could the more widespread appeal of fiction account for this (say, a lower probability of retention offset by a higher number of readers)?

Inda
I definitely think the broader appeal of fiction makes it worthwhile as an outreach effort (though it needs to be explicitly educational; Mother of Learning, for all its good writing, doesn’t teach how to think better). The concepts touched on in the fictional works (that I remember) were all at a very low inferential distance from the common culture, so they were confined to beginner concepts without an in-depth treatment. For example, the Frozen fanfic by Wales touches on AI safety and effective altruism, and is fun and beautiful, but I did not learn anything from it. As you say, fiction teaches fewer concepts and teaches them less well. I do think it might teach more memorably, though.

Fiction, at first glance, seems like a great way to reach people who prefer art over numbers.

However, on second glance, fiction thrives on the individual, on the specific, on one person's story.

We want to convey that doing 100x more good is a great outcome. But how does focusing on one person's perspective help us see that the 100 other lives we're not seeing are each just as valuable?

Is fiction even suitable for communicating EA ideas at all?

I worried that the answer to this question might be no.

However, with a bit of creativity, I think these challenges can be overcome.

Here's my approach for generating creative EA fiction ideas:

  • Identify the features of humanity that lead us to not do the most good. Example 1: our desire to do good is all about signalling, and optimising for signalling isn't the same as optimising for good. Example 2: our empathy is not scope-sensitive, i.e. the "one death is a tragedy, a million deaths is a statistic" effect.
  • Imagine a world which is "tweaked" so that those features are no longer true in some way.

Examples of ideas:

  • Imagine a person whose empathy was scope-sensitive. What would her life be like if, every time she read about the death toll in WW2, she felt 100,000,000x more empathy than at the thought of one person dying?
  • Imagine if, every morning when you woke up, your appearance was tweaked to make you more attractive if you had done more good, and less attractive if you had done less good? And if this followed a utilitarian calculus?

These ideas are just meant to be illustrative -- I hope that others can come up with much better ideas.

One problem from a fiction-writing perspective is that such tweaks could lead to a genuine utopia. And straightforward utopias don't make for good stories.

Unfortunately, authors tend to resolve this by making the utilitarian/good-maximising behaviour a subterfuge for evil, which is sad.

I think there are still other, better ways of generating a good story. These include:

  • Imagine a world where just one person has this tweak and everyone else is normal. This helps us to question whether we (who don't have scope-sensitive empathy, for example) are the weird ones.
  • Focus on the transition. If the world suddenly changed, and everyone's desire to signal was now perfectly aligned with doing the most good, what would it mean for the mild-mannered, middle-class tobacco marketeer who is suddenly signalling to the world how much harm they have done?
  • Be inspired by other genres. What would a zombie novel that conveyed EA ideas look like?

What if you had a world where karma is discovered to be real, but the amount of good karma you get is explicitly longtermist, consequentialist, and focused on expected utility? It'd be a great way of looking at effectiveness, and you'd also be able to explore really interesting neglectedness effects as people pile into effective areas.
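To make the neglectedness effect concrete, here's a toy sketch. Everything in it is an arbitrary assumption of mine, not anything specified in this comment: the sqrt returns curve, the even karma split, and all the numbers are just there to show the shape of the dynamic.

```python
# Toy model of the "neglectedness" dynamic in the karma world above:
# total impact in a cause area grows with diminishing returns as
# people pile in, so per-person karma falls. The sqrt curve and all
# numbers are invented assumptions, purely illustrative.

import math


def per_person_karma(area_scale: float, n_people: int) -> float:
    # Assumed diminishing returns: total realised impact ~ scale * sqrt(n).
    total_realised = area_scale * math.sqrt(n_people)
    # Assume karma splits evenly among the people working on the area.
    return total_realised / n_people


if __name__ == "__main__":
    for n in (1, 10, 100, 1000):
        print(f"{n:>4} people -> {per_person_karma(100.0, n):.1f} karma each")
```

Under these assumptions, each additional person grows the total realised impact but shrinks everyone's individual karma, so characters would face real pressure to find neglected areas.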

Sanjay
I toyed with this idea too. I imagined a world where people could remember their past lives, and maybe there was also some way of making this public (some way of linking the Facebook profiles of your current life with your previous lives?). This was partly interesting because of the implications it had for people's attitudes to animal welfare. (Hindu vegetarianism appears to have been unusually driven by a desire to promote animal welfare, as opposed to some other religious dietary restrictions, which originated from human health needs.)

However, I think I preferred the world mentioned earlier in the post, where the same consequentialist utilitarian framework causes your appearance to update. It means that the feedback loops are faster, and I think people care more about being good-looking than about having a nice time in their next life (even if they had good reason to believe that the next life was real). The appearance-oriented idea is also a great mechanism for highlighting the fact that in the real world, virtue and appearance are different (despite the fact that films and other art sometimes seem, horrifically, to confuse the two).
alex lawsen
I'd also love to see a fictional world with an explicitly karmic-utilitarian moral system. That is, the consequences of actions for particular agents matter in proportion to the amount of utility previously generated by those agents.
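A minimal toy sketch of that rule: the agents, the utility numbers, and the linear 1 + karma/100 weighting are all invented for illustration, since the comment doesn't pin any of them down.

```python
# Toy sketch of a karmic-utilitarian rule: karma accrues from the
# expected utility of an agent's actions, and the consequences the
# world deals back to an agent scale with that accumulated karma.
# The linear weighting and all numbers are arbitrary assumptions.

from dataclasses import dataclass


@dataclass
class Agent:
    name: str
    karma: float = 0.0  # running total of expected utility generated


def act(agent: Agent, expected_utility: float) -> None:
    # Karma tracks expected utility at the time of acting.
    agent.karma += expected_utility


def karmic_consequence(agent: Agent, raw_outcome: float) -> float:
    # Consequences scale in proportion to previously generated utility.
    return raw_outcome * (1.0 + agent.karma / 100.0)


if __name__ == "__main__":
    saint = Agent("saint")
    bystander = Agent("bystander")

    act(saint, expected_utility=200.0)   # e.g. a highly effective donation
    act(bystander, expected_utility=1.0)

    # The same windfall lands very differently for the two agents.
    for a in (saint, bystander):
        print(f"{a.name}: {karmic_consequence(a, raw_outcome=10.0):.1f}")
```

One immediate design question, which the reply below raises, is whose expectation counts: the doer's own estimate, or some objective measure the karma system computes.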
Inda
Expected utility as the doer believes it? Otherwise the system is too complex for the karma to actually work well. It’s also probably deterministic ...
vernonbarth
Yes, that's how the world is! “Like gravity, karma is so basic we often don’t even notice it.” – Sakyong Mipham