
Amber Dawn

3574 karma

Bio

I'm a freelance writer and editor for the EA community. I can help you edit drafts and write up your unwritten ideas. If you'd like to work with me, book a short calendly meeting or email me at ambace@gmail.com. Website with more info: https://amber-dawn-ace.com/

Comments
179

Topic contributions
25

This is such a genius idea, thank you!

Application forms for EA jobs often give an estimate for how long you should expect them to take; ime, these estimates are often *wildly* too low (and others I know have said this too). This is bad because it makes the estimates unhelpful for planning, and because it probably makes people feel bad about themselves, or worry that they're unusually slow, when they take longer than the estimate.

Imo, if something involves any sort of writing from scratch, you should expect applicants to take at least an hour, and possibly more. (For context, I've seen application forms which say 'this application should take 10 minutes' and more commonly ones estimating 20 minutes or 30 minutes).

It doesn’t take long to type 300 words if you already know what you’re going to say and don’t particularly care about polish (I probably wrote this post in less than an hour). But job application questions - even ‘basic’ ones like ‘why do you want this job?’ and ‘why would you be a good fit?’ - take more time. You may feel intuitively that you’d be a good fit for the job, but take a while to articulate why. You have to think about how your skills might help with the job, perhaps cross-referencing with the job description. And you have to express everything in appropriately formal and clear language.

Job applications are also very high-stakes, and many people find them difficult or ‘ugh-y’, which means applicants are likely to take longer to do them than they “should”, due to being stuck or procrastinating. 

Maybe hirers give these low time estimates because they don’t want applicants to spend too long on the first-stage form (for most of them, it won’t pay off, after all!). This respect for people’s time is laudable. But if someone really wants the job, they *will* feel motivated to put effort into the application form.

There’s a kind of coordination problem here too. Let's imagine there's an application for a job that I really want, and on the form it says 'this application should take you approximately 30 minutes'. If I knew that all the other applicants were going to set a timer for half an hour, write what came to mind, then send off the form without polishing it too much, I also might do that. But as far as I know, they are spending hours polishing their answers. I don’t want to incorrectly seem worse than other candidates and lose out on the job just because I took the time estimates more literally than other people!

‘Aren’t you just unusually slow and neurotic?’
-No; I’d guess that I write faster than average, and I’m really not a perfectionist about job applications.

Suggestion: if you’re hiring, include a link at the end of the application form where people can anonymously report how long it actually took them.

Just echoing the others that I like job listings - I've often applied for things because I saw them advertised on the Forum (or sent them to others who were looking).

This is a generous offering that will hopefully help a lot of people, so I feel uncomfortable posting a critical comment, but:

I don't like how this is framed as 'boost your mental health to grow your impact', + the repeated references to productivity. I worry that this perpetuates a belief (or alief or attitude) that EAs' individual wellbeing only matters inasmuch as it contributes to their impact and productivity. I disagree with this: we have a right to be happy regardless of our impact.

On the one hand, it's true that EAs care a lot about impact. Some care so much that they would only consider mental health interventions worth doing if they were proven to make people more impactful. Also, improving people's mental health does tend to make them more productive and therefore more impactful at their goals. So I understand why this framing is tempting. 

The problem is, a lot of EAs' mental health problems are specifically related to worries about impact, productivity, and their intrinsic value apart from those things. (I know some people who are writing a post discussing this - I'll link it here when it's posted.) For these people, I suspect that engaging in mental health work with the specific aim of becoming more impactful might be counterproductive or superficial.

It's kind of a paradox: people with solid self-esteem and wellbeing are in fact more impactful, but also an over-intense focus on impact can make you crazy. So ironically, to be more impactful, you have to let go of your hyper-focus on impact...? 

As a more personal story: over the past few years I've been through a mental health crisis, followed by a recovery or re-building of sorts that has, I think, left me mentally better off than before the crisis. I have more equanimity, higher self-esteem, more perspective on my emotions, and a better ability to process things. Has all this mental health work made me more productive? It's honestly not clear to me that it has. I think what it has done is helped me steer and be truer to myself - which is one interpretation of 'impact', but not the narrow one that I mostly worried about pre-crisis. And even if I turn out to be more productive thanks to the work on my mental health, the process has been so non-linear and full of twists and turns that I wouldn't have stuck to it if I'd been overly focussed on impact. But it's absolutely worth doing. The one person's subjective wellbeing you have most control over is your own - don't neglect that responsibility.
 

Thanks for this! This has occurred to me too - I've not heard labour power discussed as a lever in AI governance (though maybe I've just missed that discussion), and it seems like something people should at least consider, as strikes and labour organizing have effectively changed company norms/actions in the past. 

(So my aim was less to propose a norm, and more to challenge an implicit preconception I've heard of (elsewhere in EA too!): that a person who highly values honesty will necessarily end up hurting others' feelings. I don't really agree with "proposing norms" as an activity - I'm just reacting a certain way to certain people, and they can react to my reaction by changing their behaviour, or not.)

You seem to be worried that advocating for a norm that's already strong tends to lead to unfair punishments for transgressors. I don't really think there's a basis for this. Are there many instances in EA where you think people have been punished excessively and disproportionately for minor transgressions? Is this a pattern? Fwiw, I don't want to "punish" people who are radically honest in hurtful ways - I just want them to understand that they can be honest and also kind/empathetic.

In general, I think that the way norms stay strong is by people advocating for them, even if people already mostly agree. It teaches newcomers the norm and reminds older community members. It can be worth stating the obvious. But my original point doesn't seem to be that obvious, given that the original letter-writer was having problems with people "breaking" this supposed "norm".

 

Hmm, that's interesting. I guess I had seen both of those discourses as having similar messages - something like 'it doesn't matter how "effective" you are, common sense virtue is important!' or 'we are doing a bad job at protecting our community from bad actors in it, we should do better at this'. (Obv SBF's main bad impact wasn't on EA community members, but one of the early red flags was that a bunch of people left Alameda because he was bad to work with. And his actions and gendered harassment/abuse both harm the community through harming its reputation). 

I do think it's reasonable to worry that these things trade off, fwiw. I'm just not convinced that they do in this domain - like, integrity-maxxing certainly involves honesty, but I don't see why it involves the sort of radical-honesty, "blurting out" thing described in the post. 

 

You don't have to be an asshole just because you value honesty 

In Kirsten's recent EA Lifestyles advice column (NB, paywalled), an anonymous EA woman reported being bothered about men in the community whose "radical honesty" leads them to make inappropriate or hurtful comments:
 
> For example: radical honesty/saying true things (great sometimes, not fun when men decide to be super honest about their sexual attraction or the exact amount they’re willing to account for women’s comfort until the costs just “aren’t justified.” This kind of openness is usually pointless: I can’t act on it, I didn’t want to know, and now I’m feeling hurt/wary).

An implication is that these guys may have viewed the discomfort of their women interlocutors as a (perhaps regrettable) cost of upholding the important value of honesty. I've encountered similar attitudes elsewhere in EA - ie, people being kinda disagreeable/assholeish/mean under the excuse of 'just being honest'.

I want to say: I don't think a high commitment to honesty inevitably entails being disagreeable, acting unempathetically, or ruffling feathers. Why? Because I don't think it's dishonest not to say everything that springs to mind. If that were the case, I'd be continually narrating my internal monologue to my loved ones, and it would be very annoying for them, I'd imagine.

If you're attracted to someone, and they ask "are you attracted to me?", and you say "no" - ok, that's dishonest. I don't think anyone should blame people for honestly answering a direct question. But if you're chatting with someone and you think "hmm, I'm really into them", and then you say that - I don't think honesty compels that choice, any more than it compels you to say "hold up, I just was thinking about whether I'd have soup or a burger for dinner".

I don't know much about the Radical Honesty movement, but from this article, it seems like they really prize just blurting out whatever you think. I do understand the urge to do this: I really value self-expression. For example, I'd struggle to be in a situation where I felt like I couldn't express my thoughts online and had to self-censor a lot. But I want to make the case that self-expression (how much of what comes to mind can you express vs being required to suppress) and honesty are somewhat orthogonal, and being maximally honest (ie, avoiding saying false things) doesn't require being maximally self-expressive. 


 

I would have died as a baby - I needed a complex heart operation.
