
Commercial software developers put a lot of effort into creating applications with powerful, easy-to-use features—yet every new release of a major application or operating system brings at least a few features that leave user forums filled with rage. My current bugbear is Microsoft's aggressive implementation of Copilot in Word, which brings back memories of the infamous Clippy.
I wasn't able to find much research exploring what kinds of features annoy users and how such features end up in software, but one recent study of frustration amongst computer users paints a depressing picture. Morten Hertzum and Kasper Hornbæk (2023) had 234 users log each episode of frustration they encountered in one hour of ordinary computer use. The researchers reported an average of 0.79 frustrating episodes per hour, with 11-20% of the hour lost to frustration. While this is a significant improvement over the 44-50% of time lost in studies undertaken fifteen years earlier, Hertzum and Hornbæk still describe the time lost as "disturbing", and speculate that this level of frustration may go some way towards explaining why computers haven't increased productivity as much as their proponents expected.
Hertzum and Hornbæk's "frustrations" include intermittent failures like excessive response times, crashes, and loss of network connectivity, but also software features that interfered with use of an application, such as pop-ups and misplaced buttons. The latter kinds of frustration are what I want to address in this article.
Distraction
When I was teaching introductory web technologies in the early 2010s, textbooks advised budding web designers to avoid distractions, such as animations and the notorious <blink> tag. Looking at the web today, however, it often seems no one listened.
Dhana Lakshmi Ponugubati and Vineesha Vallem prepared a systematic literature review of requirements engineering for distraction-free software for their Master's thesis at Blekinge Institute of Technology (2020). They distinguish two categories of distraction: those created by features within an application, and those created by notifications from other applications. The former kind of distraction may be a deliberate feature designed to keep users engaged with the application (such as on social media), even though the distraction may interfere with users' ability to concentrate on other tasks (such as paying attention in class). The latter kind of distraction may arise from honest intentions to keep users informed, but can overwhelm users with an endless series of events clamouring for attention.
Ponugubati and Vallem go on to describe a number of anti-distraction strategies extracted from the articles in their review. Individuals might combat distraction by being more aware of the time they "waste" on digital devices, avoiding multitasking, turning off notifications, and using relaxation techniques to combat "fear of missing out" and anxiety. Software developers might combat distraction by measuring the level of distraction experienced by users, implementing "zenware" that focuses strictly on the task at hand, avoiding addictive features, and adopting design philosophies that promote simple distraction-free interfaces.
In another review of "digital self-control tools", Alberto Monge Roffarello and Luigi De Russis (2023) identify four broad strategies for controlling distraction: blocking features, tracking the time spent on activities, setting user-defined goals, and implementing punishments or rewards. They consider, however, that such strategies may be simplistic, and that more nuanced tools might support positive behaviour (such as contributing to discussions) rather than simply avoiding irritations.
Monge Roffarello and De Russis also complain that existing research in digital self-control is very focused on what individual users can or should do, without considering the larger environment that users are a part of. Perhaps, in an ideal world, user interfaces would be re-designed to promote meaningful engagement from users and avoid addictive behaviours. They acknowledge, however, that "this would necessarily require a change in perspective by contemporary tech companies, starting from providing developers with more transparent APIs to finding alternative business models", but it isn't obvious what would motivate technology companies to do this.
Dark Patterns
Over the past five or ten years, a large body of research has grown up around so-called dark patterns (or deceptive user interfaces), meaning user interfaces designed to nudge users into behaviour that protects the interests of the service provider, but that may not be in the interests of users. Dark patterns explain the existence of a number of features that users dislike but are in widespread use anyway.
In a recent guide for psychologists, Patrick Fagan (2024) divides dark patterns into six broad categories:
- framing, in which information is presented in a way that exploits users' psychological biases;
- obstruction, which makes users' preferred actions more difficult to carry out than the actions preferred by the service provider;
- ruses that trick users into performing an action that may not be in their interests;
- compulsion of users into actions that they may not wish to perform;
- entanglement, which keeps users engaged with a service for longer than they might have chosen otherwise; and
- seduction, which encourages users to be emotionally rather than rationally engaged.
Well-known examples include drip pricing (framing), making unsubscribing much more difficult than subscribing (obstruction), and infinite scrolling (entanglement). Liming Nie and colleagues (2024) catalogued 64 distinct patterns identified by researchers; I won't describe them in detail here, but the sheer number gives some idea of how many ways software developers have found to annoy their users.
A Right to Modify?
Konrad Kollnig and colleagues (2023) have published a series of articles discussing how dark patterns might be addressed through a "right to repair" for mobile devices. In their most recent and most comprehensive study, they asked 100 European mobile device users what changes they'd like to make to their devices, developed a prototype "app repair tool" that permitted the users to make changes, and interviewed eight academic experts on the legal and ethical implications of such a tool.
By far the most common requests for change referred to social media applications such as Facebook, WhatsApp and Instagram. Kollnig and colleagues divide the requests into three broad categories:
- making business models more friendly, such as by restricting advertisements and in-app purchases (60% of participants);
- making user interfaces more attractive or accessible, such as by changing the colours or removing unused interface items (92% of participants); and
- re-balancing privacy and security (55% of participants).
Kollnig and colleagues note that some of the changes participants asked for would amount to "fraud", such as requesting access to paid-for features without paying (it might be argued that blocking advertisements on ad-supported sites is also "fraud", but Kollnig and colleagues say that German courts have determined ad-blockers to be legal, at least in that country). Some users' requests also conflicted with those of others: some wished to take screenshots of other people's posts, for example, while others wanted exactly that behaviour prohibited.
Their expert panel identified both opportunities and challenges in app repair. On the positive side, app repair might increase user participation in software design, speed up feature development through crowdsourcing, and support research into the social effects of software. On the negative side, miscreants may use "repairs" as a means of inserting malicious code into devices, original software developers might engage in an arms race with repairers, and various kinds of repair may conflict with intellectual property law.
A Little Tweak
In researching this topic, I came across the concept of userscripts: snippets of JavaScript code injected into a web page to extend or modify its behaviour. Though userscripts have existed since the early 2000s, few people seem to be aware of them, possibly because using them requires relatively sophisticated knowledge of web technology. The tools for modifying the behaviour of mobile applications that Kollnig and colleagues describe sound even more forbidding, and Kollnig and colleagues acknowledge that their own tool may be legally questionable if used outside a research context.
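To give a flavour of what a userscript looks like, here is a minimal sketch in the style run by manager extensions such as Tampermonkey or Greasemonkey. The site, selector, and class names are all hypothetical; a real script would target a specific page's markup:

```javascript
// ==UserScript==
// @name         Hide promotional boxes (illustrative example)
// @match        https://example.com/*
// @grant        none
// ==/UserScript==

// Pure helper: decide whether an element's class string looks promotional.
// The keyword list here is invented for illustration.
function isPromoClass(classString) {
  return /\b(promo|advert|banner)\b/i.test(classString);
}

// Only touch the page when a DOM is actually present (a userscript
// manager always provides one; this guard just keeps the helper testable).
if (typeof document !== 'undefined') {
  for (const el of document.querySelectorAll('div')) {
    if (isPromoClass(el.className)) {
      // Hide with CSS rather than removing the node, so the change
      // is easy to reverse and unlikely to break the page's own scripts.
      el.style.display = 'none';
    }
  }
}
```

The `@match` directive restricts the script to pages on the named site, and hiding elements rather than deleting them keeps the modification gentle—very much in the spirit of complementing the original software rather than breaking it.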
A few web applications I use frequently annoyed me enough that I decided to try userscript development myself; the resulting scripts are published on GitHub. If you find them useful, you can buy me a Ko-Fi.
I intend to add to the collection over time, and will consider work for hire on a case-by-case basis. Because my intention is to complement the original software, not replace it, I don't implement anything (including ad-blockers) designed to undermine the original developer's business model, compete with the original developer's product, or circumvent the rights of other users of the same software. Drop me an e-mail to discuss terms.
References
Patrick Fagan (2024). Clicks and Tricks: The Dark Art of Online Persuasion, Current Opinion in Psychology 58, August 2024, Article 101844.
Morten Hertzum and Kasper Hornbæk (2023). Frustration: Still a Common User Experience, ACM Transactions on Computer-Human Interaction 30(3), Article 42.
Konrad Kollnig, Siddhartha Datta, Thomas Serban Von Davier, Max Van Kleek, Reuben Binns, Ulrik Lyngs and Nigel Shadbolt (2023). 'We are Adults and Deserve Control of Our Phones': Examining the Risks and Opportunities of a Right to Repair for Mobile Apps, Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, pages 22-34.
Alberto Monge Roffarello and Luigi De Russis (2023). Achieving Digital Wellbeing Through Digital Self-Control Tools: A Systematic Review and Meta-analysis, ACM Transactions on Computer-Human Interaction 30(4), Article 53.
Liming Nie, Yangyang Zhao, Chenglin Li, Xuqiong Luo and Yang Liu (2024). Shadows in the Interface: A Comprehensive Study on Dark Patterns, Proceedings of the ACM on Software Engineering 1, issue FSE, Article 10, pages 204-225.
Dhana Lakshmi Ponugubati and Vineesha Vallem (2020). Requirements Engineering for Distraction-Free Software: Systematic Literature Review and Survey, Master's thesis, Blekinge Institute of Technology.