As a society, we seem to have mixed feelings about whether it’s better to add or subtract things, advising both that “less is more” and that “bigger is better.” These contradictory views play out across multibillion-dollar industries, as people salivate over the latest features of their hardware and software before bemoaning that the added complexity makes the products difficult to use.
A team of researchers from the University of Virginia decided to look at the behavior underlying this tension, finding in a new paper that most people defaulted to assuming that the best way of handling a problem is to add new features. While it was easy to overcome this tendency with some simple nudges, the researchers suggest that this thought process may underlie some of the growing complexity of the modern world.
Let’s add stuff
The researchers say they got interested in the topic because they noticed that beyond the admonition that less is more, many fields had specific advice about improvement through subtraction. Editors caution writers about using excess language, social scientists talk about the need to remove barriers, and so on. In contrast, there are few reminders to add stuff to fix problems.
Perhaps, the researchers reasoned, people have no problems remembering to add things even without any prompting. So they collected a bit of data on people’s tendencies in this regard. They found that additive solutions were far more common than subtractive ones. For example, when an incoming university president solicited ideas for improvements, only 11 percent involved getting rid of something. In an experiment that involved making patterns out of colored squares, only 20 percent of the participants removed squares in order to achieve a pattern, even though both options were equally viable.
And so on it went. When asked to improve a travel itinerary, only 28 percent of the participants did so by eliminating destinations. Essay improvements led to an increase in word counts in all but 17 percent of the cases. People just didn’t tend to take things away in a huge range of contexts.
The obvious next question is “why?” It could be because people never even think of removing something, or it’s possible that we consider the idea and then reject it for various reasons. Another possibility is that we’ve internalized the “more is better” attitude, and that skews the solutions we view as viable candidates. So the researchers designed a series of experiments to evaluate these different explanations.
Why didn’t I think of that?
One of the experiments involved giving the participants a pattern of colored and white squares and asking them to change the colors in order to make the pattern symmetric. In every case, symmetry was far, far easier to achieve by taking away a few colored squares, but only half the participants recognized this solution. When given a few opportunities to practice, however, the rate of subtractive solutions went up to 63 percent.
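The cost asymmetry behind that grid task is easy to see with a little arithmetic: if a pattern must be symmetric across both axes, a single stray colored square can be fixed by one removal, while fixing it additively means coloring all three of its mirror images. The sketch below illustrates this with a hypothetical grid and pattern (not the paper’s actual stimuli):

```python
# Illustrative sketch (hypothetical grid, not the study's stimuli): a pattern
# that must be symmetric left-right and top-bottom. One stray colored cell can
# be fixed either by adding its three mirror images or by removing the cell.
N = 10  # hypothetical 10x10 grid

def mirrors(cell, n=N):
    """All four symmetry images of a cell (including the cell itself)."""
    r, c = cell
    return {(r, c), (r, n - 1 - c), (n - 1 - r, c), (n - 1 - r, n - 1 - c)}

def edit_costs(colored, n=N):
    """Edits needed to reach full symmetry by only adding vs. only removing."""
    # Additive fix: color every missing mirror image of an existing cell.
    add = sum(1 for cell in {m for c in colored for m in mirrors(c, n)}
              if cell not in colored)
    # Subtractive fix: remove every cell whose mirror set isn't fully colored.
    remove = sum(1 for c in colored if not mirrors(c, n) <= colored)
    return add, remove

# A mostly symmetric pattern plus one stray square at (7, 7):
pattern = mirrors((1, 1)) | mirrors((2, 4)) | {(7, 7)}
print(edit_costs(pattern))  # → (3, 1): three additions vs. one removal
```

On this toy pattern, the lone unmatched square costs three additions but only a single removal, which is why subtraction is the far cheaper route to symmetry.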
This seems to indicate that subtractive solutions aren’t most people’s default, but that with practice people can discover them. To probe this further, the researchers ran the same experiment but gave the participants additional tasks to distract them. This added cognitive load seemed to decrease the likelihood that participants would come up with subtractive solutions, suggesting that it takes some mental effort for people to overcome a natural tendency to overlook subtractive options.
A couple of additional experiments looked at the original topic that got the authors interested in the subject: the little nudges we use to get people to consider that less might be more. Here, the researchers used a control set of instructions that simply laid out the task at hand and a second set in which the instructions specifically mentioned the option of deleting something. It turns out that these nudges work. In a typical experiment, the number of participants who suggested subtractive solutions went up by 20 percentage points relative to the control instructions.
The researchers also did an experiment involving making a problem worse rather than improving it. There was no significant difference in the use of subtractive solutions between improving something and making it worse, suggesting that people don’t focus on additive solutions simply because they view subtractive ones as worse.
Overall, the researchers come to the conclusion that people just don’t often consider subtractive solutions. When they do end up thinking about them, they often find they’re good options. And small prompts seem to get people to rethink their tendency to just add more stuff when making changes.
All of that is potentially useful knowledge. But it’s important to recognize that there are practical limits on when subtractive solutions make sense. If you’re making improvement suggestions to a university president, to borrow one of the paper’s examples, suggesting the elimination of some of your colleagues’ departments might not go over well. Plus, in many cases, elements exist for reasons that may not be clear without a deep understanding of the system. They may even exist for aesthetic reasons.
Finally, we have to recognize that there’s often value in adding something.
Still, the study points to a simple and effective way of getting a greater diversity of solutions: if it’s appropriate, simply remind people that removing features is an option.