Hi everyone,
I wasn't sure whether this warranted an entire post (Topics: AI, governance, survey):
We are inviting you to take part in a research study which Pour Demain and the Vrije Universiteit Amsterdam are conducting on development-phase risk assessments for AI systems. The link to the study is here. The survey will close on 12 December 2023 at 23:59 CET.
You can pick a time here to do the survey so you don't have to remember it or use up mental RAM (this creates a calendar event for you).
About:
As you may know, a majority of the experts recently surveyed by Schuett et al. indicated that conducting a development-phase risk assessment is an important practice for AI labs.
Our study aims to better understand how development-phase risk assessments could be carried out in practice and what they would reveal. This will inform Pour Demain's contribution to standard setting for the EU AI Act.
The study involves a creative exercise of up to 30 minutes to identify and analyze potential risks. To minimise the time required, we recommend using a keyboard or voice typing. There are no right or wrong answers - the goal is simply to generate ideas and think through different risk scenarios.
For every participant, we're donating a contribution to charity (and donating more the longer you spend, so your time doesn't go unseen!).
We greatly appreciate you taking the time to support this research. We would also be very grateful if you would forward this survey to three other experts in your network who you think have something to contribute on the topic.
Hi James, thanks for opening this up for feedback.
This is a tough one; overall it looks good!
My general point of feedback would be to either be more cause-agnostic OR put greater emphasis on "priorities research". For example, I'd suggest making 1/5th of the content about priorities research, promoting it as a category of its own, as seen below.
The reason is that cause areas and meta already have their own communities and conferences, whereas priorities research may not. Priorities research also most holistically represents EA's mission of "where to allocate resources to do the most good". Then again, I haven't done the thinking you have behind these weights!
It may be worth running a survey with 1-100 scales?
Priorities research 5%