I'm so glad I found this post. This is the first time I've come across another person who feels strongly that increasing empathy could be a powerful lever for suffering prevention. I read the post, and I plan to go through all the links you included for further research.
You inspired me to post an essay I wrote a few months ago with a similar theme.
Your mention of the Veil of Ignorance gave me a bit of an idea that I brainstormed with ChatGPT, and that you might find interesting:
User: Is there anything here: I think the Veil of Ignorance could be a powerful motivator for empathy. (I'm not sure if this is exactly it): if you imagine you have no way of knowing which body you wake up in tomorrow, or next year, you'll be more inclined to promote policies that help the worst off, to reduce the risk in case you end up in one of those bodies.
Now, if this could somehow be distilled into a meditation, a game, or something like that, it might train the mind to consider those in the worst situations and make people more likely to help them.
It's possible that empathy is not actually increased, and it's only selfishness at the thought of risk.
Then again, one can imagine we are not separate beings at all, but one consciousness being experienced through an infinite number of sentience-moments. Then there is no selfishness, only helping a part of yourself you hadn't been thinking much about.
But even just training your brain, during the game, to consider how bad some people have it might keep you thinking that way afterward.
ChatGPT: Your idea touches on a fascinating intersection of philosophy, psychology, and technology. The Veil of Ignorance, proposed by philosopher John Rawls, indeed encourages empathy and a broader sense of fairness by asking individuals to design society without knowing their own place within it. Translating this concept into a meditation, game, or interactive experience could be a compelling way to promote empathy and suffering-focused values.
Meditation:
A guided meditation could have users imagine waking up in different life situations, then ask them to consider the kind of world they'd want to create from each vantage point.
Game:
A role-playing game could be designed where the roles you play are randomly assigned, and choices made during gameplay affect the overall well-being of the game world. Players could then see the impact of their decisions from different social and economic perspectives.
Interactive Experience:
Virtual Reality or Augmented Reality could offer an even more immersive experience of 'walking a mile in someone else's shoes,' thereby promoting empathy and compassion.
Ethical Considerations:
- Selfishness vs Empathy: While the game might initially tap into self-interest, over time, repeated exposure could cultivate genuine empathy.
- Universal Consciousness Perspective: Integrating the idea that we're all interconnected could add a philosophical layer to the experience, deepening the impact.
- Ongoing Influence: The goal would be that the mental models formed during the game or meditation could carry over into real-world decision-making and behavior.
Spending a few minutes a day brainstorming and outlining how to create such a meditation or game could be an excellent way to contribute to your long-term goals of fostering empathy and reducing suffering.
User: Maybe I could try writing a book from the perspective of someone who is thrust into this situation. He occasionally has opportunities to make decisions that affect the population, maybe as a politician, and he wakes up every day (month? year?) in a new body, experiencing everything from the extreme joy of love to depraved torture to boredom.
ChatGPT: That's an intriguing idea for a novel. Writing a book from such a perspective could offer readers a vicarious journey through diverse human experiences, encouraging them to think more deeply about empathy and the reduction of suffering. The protagonist's transition through different bodies and circumstances could serve as a vivid illustration of the Veil of Ignorance concept, particularly if he or she holds some sort of influence over society, like a politician.
You could spend a few minutes each day outlining chapters, developing characters, or even just jotting down ideas and scenes that you'd like to include. This small daily commitment could gradually build into a substantial work that aligns with your broader goal of increasing suffering-focused values in the population. Even if you only write for a few minutes a day, each session would help solidify your ideas and keep your project moving forward.
I'd suggest that you consider computer science and business (with a focus on project management if at all possible).
I like these suggestions. They fit with what the other commenter suggested.
Regardless of the field/area you choose, I recommend trying it out a little before committing to a full degree program. You could take one or two courses at your local university or community college, try a few MOOCs in the subject, or do some Sophia.org courses to dip your toe in.
Good idea. I was hoping to start at uni in the fall, which doesn't give me a lot of time to take trial classes before then, but I'll look around. Edit: Oh, I had assumed it was too late to schedule summer classes at the local college, but I guess it isn't.
Thanks for the advice!
That's true, changing people's character is an uphill battle, but it seems that the potential reward is massive enough that all possible interventions should be explored.
Fundamentally, we know that people exist with varying levels of empathy (or at least people who act empathetically to varying degrees). At the very least, we should figure out what accounts for the difference, using methods from psychology or neuroscience. That would give us much firmer footing for understanding whether empathy can be increased, and how.
For now, I still don't have the mental bandwidth to pursue this in any formal capacity; I'm not yet back to 100% mental energy after a long period of depression. I'm currently working on becoming an electrician, but I hope I can find a way to steer my career in this direction in the future.