“Information, defined intuitively and informally, might be something like 'uncertainty's antidote.' This turns out also to be the formal definition: the amount of information comes from the amount by which something reduces uncertainty... The higher the [information] entropy, the more information there is. It turns out to be a value capable of measuring a startling array of things: from the flip of a coin to a telephone call, to a Joyce novel, to a first date, to last words, to a Turing test... Entropy suggests that we gain the most insight on a question when we take it to the friend, colleague, or mentor of whose reaction and response we're least certain. And it suggests, perhaps, reversing the equation, that if we want to gain the most insight into a person, we should ask the question whose answer we're least certain of... Pleasantries are low entropy, biased so far that they stop being an earnest inquiry and become ritual. Ritual has its virtues, of course, and I don't quibble with them in the slightest. But if we really want to start fathoming someone, we need to get them speaking in sentences we can't finish.”
Brian Christian
From: The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive
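
The "low entropy" of pleasantries can be made concrete with Shannon's formula, H(X) = -Σ p(x) log2 p(x), which measures average surprise in bits. Below is a minimal Python sketch of that idea (the helper shannon_entropy is illustrative, not from the book), contrasting a fair coin, maximally uncertain, with a heavily biased one:

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally uncertain: each flip carries a full bit of information.
    print(shannon_entropy([0.5, 0.5]))    # 1.0

    # A heavily biased coin, like a ritual pleasantry with a near-certain answer,
    # carries far less information per outcome.
    print(shannon_entropy([0.99, 0.01]))  # about 0.08 bits

The biased coin is the pleasantry: because one answer is nearly predetermined, hearing it reduces almost no uncertainty.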