david_reinstein

Founder and Co-Director @ The Unjournal
3827 karma · Joined · Working (15+ years) · Monson, MA, USA

Bio

See davidreinstein.org

I'm the Founder and Co-director of The Unjournal. We organize and fund public, journal-independent feedback, rating, and evaluation of hosted papers and dynamically presented research projects. We focus on work that is highly relevant to global priorities (especially in economics, social science, and impact evaluation). We encourage better research by making it easier for researchers to get feedback and credible ratings on their work.


Previously I was a Senior Economist at Rethink Priorities, and before that an Economics lecturer/professor for 15 years.

I'm working to impact EA fundraising and marketing; see https://bit.ly/eamtt

And projects bridging EA, academia, and open science; see bit.ly/eaprojects

My previous and ongoing research focuses on determinants and motivators of charitable giving (propensity, amounts, and 'to which cause?'), and drivers of/barriers to effective giving, as well as the impact of pro-social behavior and social preferences on market contexts.

Podcasts: "Found in the Struce" https://anchor.fm/david-reinstein

and the EA Forum podcast: https://anchor.fm/ea-forum-podcast (co-founder, regular reader)

Twitter: @givingtools

Posts: 55 · Comments: 804 · Topic contributions: 9

Project Idea: 'Cost to save a life' interactive calculator promotion


What about making and promoting a 'how much does it cost to save a life' quiz and calculator?

This could be adjustable/customizable (in my country, around the world, of an infant/child/adult, counting 'value-added life years', etc.), and we could try to make it go viral (or at least bacterial), as the 'how rich am I' calculator did.
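The adjustable logic described above could be sketched roughly as follows. This is a minimal illustration only: the intervention names, cost figures, and the default life-years parameter are all hypothetical placeholders, not actual GiveWell (or any charity evaluator's) estimates, which would need to be sourced and kept up to date.

```python
# Hypothetical cost-per-life-saved figures, in USD -- placeholders for
# illustration, NOT real charity-evaluator estimates.
COST_PER_LIFE_SAVED_USD = {
    "malaria_nets": 5_000,
    "vitamin_a": 4_000,
    "cash_transfers": 9_000,
}


def lives_saved(donation_usd: float, intervention: str) -> float:
    """Estimate lives saved by a donation to a given intervention."""
    return donation_usd / COST_PER_LIFE_SAVED_USD[intervention]


def life_years_gained(donation_usd: float, intervention: str,
                      life_years_per_life: float = 35.0) -> float:
    """Optionally express the result as (value-added) life years.

    `life_years_per_life` is the user-adjustable knob for the
    infant/child/adult customization mentioned above (35.0 here is an
    arbitrary default, not a sourced figure).
    """
    return lives_saved(donation_usd, intervention) * life_years_per_life


if __name__ == "__main__":
    # e.g. "Your $1,000 donation saves roughly 0.20 lives"
    print(f"{lives_saved(1_000, 'malaria_nets'):.2f} lives")
```

The real UX work, of course, is in sourcing credible numbers and presenting the result compellingly; the arithmetic itself is trivial, which is part of why the idea seems cheap to build.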


The case 

  1. People might really be interested in this… it’s super-compelling (a bit click-baity, maybe, but the payoff is not click bait)!
  2. It may make some news headlines too (it’s an “easy story” for media people, asks a question people can engage with, etc. … ‘How much does it cost to save a life? Find out after the break!’)
  3. If people think it’s much cheaper than it actually is, as some studies suggest, it would probably be good to correct this misconception, to help us build a reality-based, impact-focused, evidence-based community and society of donors.
  4. Similarly, it could get people thinking about ‘how to really measure impact’, and thus to consider EA-aligned evaluations more seriously.

GiveWell has a page with a lot of technical details, but it’s not compelling or interactive in the way I suggest above, and I doubt they market it heavily.

GWWC probably doesn't have the design/engineering time for this (not to mention refining it for accuracy and communication). But if someone else (UX design, research support, IT) could do the legwork, I think they might be very happy to host it.

It could also mesh well with academic-linked research, so I may have some ‘Meta academic support ads’ funds that could work with this.
 

Tags/backlinks (~testing out this new feature) 
@GiveWell  @Giving What We Can
Projects I'd like to see 

EA Projects I'd Like to See 
 Idea: Curated database of quick-win tangible, attributable projects 

Thanks, this is helpful. Is it worth adding links to this?

I'm trying to understand... what does "exempt" mean in the phrase "exempt, salaried employee"?

Do you mean that your salary is part of the expenses of a tax-exempt nonprofit, so people who donate to PauseAI (partly to pay your salary) can deduct this from their taxes if they itemize? I'm also trying to understand the connection between this and the idea of claiming pro-bono hours. Thanks!

Do you actually take the salary and donate it, or do you just claim a lower salary and call some hours 'pro-bono'? Obviously the latter is more tax-efficient.

What was your thinking on Humane League UK vs Humane League?

I just wanted to make sure The Unjournal was eligible, @Toby Tremlett🔹. We made this post and tagged it, but you state, "only projects that post or answer + message me are eligible for next week's Donation Election". I hadn't seen that earlier, so I'm messaging you now. (Maybe other orgs also overlooked that?)

The Unjournal (unjournal.org) commissioned the evaluation of one of the biosecurity-relevant papers you mention (Barberio et al., 2023). See our evaluation package here, with links to each evaluation within.

The evaluators generally agree about the importance and promise of this work, but also express substantial doubts about its credibility and usefulness. (They also make specific suggestions for improvements and extensions, as well as requests for clarification.) The evaluation manager echoes this, noting that the “limitations of the paper as it stands make it far less valuable than it could be.”

Note that The Unjournal has commissioned evaluations of Naidu et al. (2023). See the summary and ratings here, and the linked evaluation reports. The second report offered substantial critiques and questions about the methods, interpretations, and context.

NB: The Unjournal commissioned two expert evaluations of this work: see the full reports and ratings here.

From the first evaluation abstract (anonymous):

Pro: Raises important points and brings them to wider attention in simple language. Useful for considering individual RCTs.

Con: Not clear enough about intended use cases and framing. ... Guidelines need more clarity and precision before they can be genuinely used. I think best to reframe this as a research note, rather than a ready-to-use ‘guideline’. Unclear whether this is applicable to considering multiple studies and doing meta-analysis.

From the second evaluation abstract (by Max Maier):

The proposal makes an important practical contribution to the question of how to evaluate effect size estimates in RCTs. I also think overall the evaluation steps are plausible and well justified and will lead to a big improvement in comparison to using an unadjusted effect size. However, I am unsure whether they will lead to an improvement over simpler adjustment rules (e.g., dividing the effect size by 2) and see serious potential problems when applying this process in practice, especially related to the treatment of uncertainty.

I'd love to know if this work is being used or followed up on.

A good summary. Note that marginal per paper cost does not include our overhead, communications, building our network and tools, etc.
