I loved this post but the argument falls apart for me because:

If the AI holds a monopoly on overwhelming physical force, its notion of "trade" could look less like "humans have a lot of freely available sources of pleasure (as they do today), and trading with me is yet another way to obtain pleasure/resources/etc." and more like "provide human slaves the minimum amount of food and entertainment so they do my work and don't kill themselves, and provide them no other options for survival."

We do trade with our microbiome.  We feed it.  It helps us digest.

Our microbiome communicates with us. It creates chemical signals that affect digestion and possibly our feeling of hunger. The extent of the microbiome's influence on the brain is not well known, but the pathways for that influence are: the gut microbiome has been shown to produce various chemicals and signaling molecules that influence the function of the digestive and immune systems, including short-chain fatty acids that may cross the blood-brain barrier.

Perhaps the analogy here is better than the one with ants.

Or maybe both analogies are correct? Then the question is how can we be like gut bacteria for the AI and not ants?

Or maybe analogies just add more confusion and we should go back to first principles xd

I mean, there's an extremely narrow range of final goals for which flesh-and-blood humans are physically optimal infrastructure. Human arms can carry materials, human brains can solve problems, etc.; but if something is keeping us around just for that purpose, and not out of any concern for our welfare, then we'll inevitably be phased out.

(And in reality, I don't think AI will ever be at a capability level where it's strong enough to take control, but not strong enough to benefit in expectation from phasing humans out.)

I think the right takeaway is very clearly "don't build AGI that has no concern for human welfare", not "try to be like gut bacteria (or talking ants) to a misaligned AGI".

>extremely narrow range of final goals for which flesh-and-blood humans are physically optimal

Not so quick there. Currently, AI can't do anything without depending on humans. I have yet to hear an explanation of how an AI rids itself of this dependence.

Agree. We don't trade with ants but we do trade with monkeys, both in experiments https://papers.ssrn.com/sol3/papers.cfm?abstract_id=675503 and when tourists have things stolen https://www.smithsonianmag.com/smart-news/monkeys-bali-swipe-tourists-belongings-and-barter-them-snacks-180963485/. It seems to me that communication is all that is really required. Arguably all domestication is a trade that's become established over evolutionary timeframes. (Domesticated) honey bees are therefore both trading with us and with flowers when they pollinate and produce honey.

Yes, that is the "arguably": does your definition of trade require agency, and at what level? We have a mutualistic relationship with the honeybee hives that produce honey and pollinate well, which is why their numbers are rising even as other bee populations generally decline. Similarly, we have traded with the genomes of domestic animals, increasing their numbers, even if the individuals that carry those genes have worse lives because of this trade. There are several stages and timescales to these interactions. The bees trade labor for nectar with the flowers, but the flowers can only establish the deal over evolutionary timescales and rely on the bees to exercise agency within a given lifetime. Similarly, we trade our labor and syrup for the bees' honey, but their only alternative is to swarm off or attack, and either way they would probably lose the hive. In my view, an exploitative exchange is still a trade.
