I think estimating the probability of AI sentience (I'll define "sentient" as being able to suffer here, because what I care about is reducing suffering) is crucial, but I haven't seen anyone ask this question directly.
I've read lots of articles on digital sentience, but the arguments are very philosophical, and I still don't understand why you think a (future) AGI is "probably" sentient. I'm not saying AI sentience is impossible, but I don't know of any reason to think it is "likely to occur". As a result, I can't tell the difference between the probability of AI sentience and the probability of God's existence. Because I don't know the arguments supporting either claim, I don't know how to falsify them, so I remain agnostic about the probability; I can't even be confident it's above 0.1%.
I'm a 16-year-old boy with nearly zero knowledge of machine learning. Can anyone explain why you think future AI sentience is at least somewhat likely?