Searching for better questions
There are five and a half billion searches on Google every day. Five and a half billion questions asked every single day. That’s a lot of people looking for answers. Google’s mission, “to organise the world’s information and make it universally accessible and useful”, is about providing answers to all those questions. Ask Google any question you can think of, and it will provide an answer. The answer might not make sense, it might not be what you were looking for, but it is an answer, nonetheless.
Asking questions is an inherently human trait. Every human language has questions. Children ask questions almost as soon as they learn to speak. We ask questions to start conversations, to learn things, to find out how people are. When we want to know the truth about something, we ask. Lawyers and teachers ask questions. So do therapists, researchers, job interviewers, and philosophers.
Socrates is perhaps the most famous questioner. Socratic questioning is a method of teaching by asking thoughtful questions. They are questions asked from a place of genuine curiosity, encouraging the student to think deeply about their responses, to examine complex ideas and challenge assumptions. Asking questions is a skill.
Computers can provide answers, but they don’t ask questions. They don’t desire new knowledge, and they don’t care whether the answers are truthful. The problem with any algorithmic system based on collating mass behaviour, like people putting search terms into Google, is that it can be gamed. The more people type racist, sexist, Islamophobic, and discriminatory terms into Google, the more Google surfaces those terms as autocomplete options for all to see and click on. There’s no shortage of research to demonstrate this.
Jonathan Albright, an assistant professor of communications at Elon University, mapped right-wing and fake news websites to show how they use search engine optimisation techniques to gain positions in the search results Google displays.
Robert Epstein, a research psychologist at the American Institute for Behavioral Research and Technology, has shown that the contents of a page of search results can influence people’s views and opinions, and even affect voting patterns.
The technology companies have a responsibility to ensure the algorithms they deploy can rebalance the search terms they surface. The argument that the algorithms simply reflect human behaviour can’t stand up if the algorithms can be intentionally gamed. The world’s information is not universally accessible and useful if only one point of view is displayed in the search results. But do any of us think technology companies take this responsibility seriously?
If not, then we simply can’t leave it up to the computers to provide better answers. We ask questions with the intention of uncovering the truth, of understanding the world around us more clearly. To do that online, we have to take responsibility for thinking more critically, being more Socratic in our questioning, developing our skills.
For our technology to get better at answering questions, we have to get better at asking them.