Not only does People for Research sponsor some of the UK’s biggest UX and user research events, like the recent User Research London 2019 (keep an eye out for the event recap on the blog), but we also help organise a meetup with Natural Interaction – UCD Bristol. Last month, we invited four UX, design and data experts to take part in a panel and share their views on privacy and addiction by design.

The result was an insightful discussion around the modern concept of privacy, as well as ethical technology and design and the (still) trendy GDPR. Below are the key takeaways shared by each expert during the panel; you can also read the full recap of the meetup on the UCD Bristol blog.

Rita Cervetto
Service and Visual Designer at Ovo Energy

Rita started by admitting she was shocked at the evolution of data collection and its importance, to the point where it has turned into a huge industry. The panel agreed that part of the issue stems from the fact that data laws are still drafted by people who don’t understand technology.

Cookie law disclaimers and long terms and conditions are two great examples of how the law doesn’t translate into “products” that are easy to understand or use. Alongside these giant pop-ups and notices, we have numerous privacy settings hidden away on apps. Rita’s main concern is that some people may not even be aware of what they’re giving away. She added:

“As a designer, you should be careful about which behaviours you’re encouraging.”

We also reflected on the fact that this willingness or need to share data varies by age and socio-economic factors, and on how privacy is perceived by different groups. In the end, the panellists agreed that future generations will probably see things differently to us. Whilst we’re pretty new to all this, our children are likely to grow up as ‘no privacy natives’.

David Sheff Barker 
User Experience expert at Immersive Labs

Our panel moderator, Adam Babajee-Pycroft (from Natural Interaction), started by asking the question “what does privacy by design mean to you?”. For Sheff, it means compliance: “I feel that privacy has always been a low priority for businesses, but now, with GDPR and a heightened awareness, it’s becoming more and more important for UXers and designers to build it into their processes”. He compared this challenge to creating accessible products: both usability and ethics have to be present early in the design process.

Talking about designing for addiction, Sheff pointed out the range of gamification patterns implemented by many platforms, from the ones that feel harmless (for example, learning streaks in Duolingo) to the darker examples, like loot boxes and random rewards designed specifically to keep a user trapped.

To change this, we must help decision makers empathise with user needs: for Sheff, it all comes down to transparency. Fellow panellist Lon Barfield agreed, saying that “if you ask for permission and explain what you do with each piece of data you collect” you are likely to end up with a better relationship between the user and the brand.

Looking ahead, Sheff shared his belief that future generations will simply be so used to sharing all their personal details, that it might not seem like a big deal to them.

Lon Barfield
Computer scientist and design thinker at SimpleWeb

“UX always used to come at the end of the design process, but now it simply cannot work that way,” Lon started by saying, adding that the difficulty with addiction and dark pattern design is that, often, games and app designers are just giving people what they want (but maybe not what they need): “it might not be good for them, but it’s what they want.”

Since most rewards are triggered by simplistic behaviour, Lon thinks that the “need” for addictive design is often a result of the performance metrics used and measured by a business. However, it’s not all bad: the panellists also discussed some good examples of big businesses starting to take a more ethical approach to over-engagement. Instagram, for example, has recently taken a small step in this direction by allowing its users to monitor how long they spend on the app.

Talking about trust and the users’ relationships with brands, Lon suggested that the future is likely to hold a more equitable and trustworthy relationship between givers and holders of data. The implementation of GDPR was the first step towards this shift: Lon reminded us that if you follow the legislation properly and collect less data, its quality will almost certainly go up. By building trust, companies will be able to collect richer, usable data and maintain a healthy relationship with their customers.

“You have to treat privacy as a trust relationship and not a data collection tool for your sales team.”

Adam asked the panel whether peer-to-peer technologies like blockchain could help us achieve a privacy utopia. As an expert in this area, Lon suggested it may help, because decentralisation stops data from sitting in one single place ready to be hacked.

Maria Santos 
Digital Marketing & Data Protection Manager at PFR

For Maria, privacy by design is about communication and keeping your users informed of what you’re doing with their data. Talking to users at the start of their journey with your business is the best approach.

Working for a company which recruits people for research, Maria often speaks to audiences with specific needs and believes it’s imperative that the consent forms used across the industry are:

+ Easy to understand

+ Provided upfront

+ Accessible to all

“An ethical approach is the only way to go. Being transparent, upfront and clear about why you’re collecting data is the best way to gain trust and understanding from your users.”

During the panel, Maria quoted former Google design ethicist Tristan Harris: “we’re upgrading the machines, but we’re downgrading humans”. He talks publicly about how urgent action is needed to address the dark side of technology and is one of the founders of the Center for Humane Technology, fighting to realign technology with humanity.

Maria also worried about the possibility of data privacy, in the future, becoming a privilege only available to those who can afford it. Right now, for example, you can buy an Amazon Fire tablet more cheaply with adverts on it than without. Could this put vulnerable people – who might end up giving away their data in exchange for rewards or discounts – at risk?

If you would like to find out more about applying the principles of GDPR, data protection and privacy to user recruitment and research, check our blogs:

+ GDPR compliance in user recruitment and user research

+ Protecting user data before, during & after research

+ How can you be user-centred with your GDPR policy?



Stacey Hirst, Digital Marketing Manager

If you would like to find out more about our in-house participant recruitment service for user testing or market research, get in touch on 0117 921 0008.

At People for Research, we recruit participants for UX and usability testing and market research. We work with award winning UX agencies across the UK and partner up with a number of end clients who are leading the way with in-house user experience and insight.