Vee · -35 karma · 4 comments

The use of Large Language Models (LLMs) in autonomous weapons systems is a precarious notion. LLMs are designed to generate probable continuations of their context, so if they control weapons, their actions will be shaped by prevailing human narratives; negative narratives linking AI with weapons could therefore produce harmful behavior. Mitigating this requires diverse, ethically curated training data and responsible guidelines for training AI models, particularly in the military domain. Effective altruists can contribute by conducting research and advocating for ethical considerations in the development and deployment of autonomous weapons. The aim is to balance military applications of AI with the protection of human life and dignity.

Thanks for stopping by. I understand that the stolen funds are a small portion of GiveDirectly's funding, but they are definitely not small to the people living in extreme poverty in war-torn zones who were denied them. Those recipients could have used the funds to solve pressing problems, perhaps even to save lives in emergencies, had the money arrived when due. This is not about the multi-million dollar brand GiveDirectly has become; it is about the intentions of the donors when they gave.

To your second point, let me state unequivocally that it is GiveDirectly's responsibility to make its systems as fail-proof as possible. There is no excuse for losing even one cent to fraud. We never thought it could happen, and then $900,000 was swept away. We don't know how much will be lost in the future if nothing tangible is done. It is foolhardy to wait until losses exceed the cost of fraud prevention before acting. What happens if fraud occurs on a scale that makes recovery impossible? Whatever the cost, now is the time to safeguard the resources of donors who entrusted GiveDirectly with their hard-earned funds.

Thanks for your feedback. The aim of the post is to highlight the laxity that was exploited and what can be done to forestall future, and possibly worse, occurrences. That is why I included the link: so you can read it and draw your own conclusions.