Singularity
The singularity is the breaking point at which a machine gains what it takes to evolve itself. It is a point many fear, because such a machine, tireless as it will be, will never stop thinking, with all the power of hardware that performs billions of operations per second. Some think it will advance so fast that no human will be able to keep up; no human will be able to understand it anymore. Total loss of control.
 
Trust

 
Personally, I think that even a complete loss of control is not necessarily a bad thing. We are far from excelling in the art of caring for our world. If this machine has more capacity than we do, why would it make worse decisions than we do? Besides, I do things my father cannot even imagine, but if one day he wanted to learn, I would be delighted to teach him. Why couldn't a machine with at least our skills make the compassionate decision to help someone else evolve? I do not believe in the Terminator disaster scenario. A machine that intelligent will not tell itself: "I will kill everyone, that is the solution."

Losing control means accepting the other, and that is what our world needs: letting the other see that we trust them. Today there is more security than there is trust. Hackers can strike because there is security; in any case there will be no serious consequences for our world, since security is there. Except that we have lost something essential to the survival of our species: trust. Without trust there is no altruism, and without altruism there is no future, because no one will truly play the game of caring. Trust must return!

