Compliance, early consultation, privacy, the SDLC, and integration issues.
Whilst it is possible to take a usability-driven approach to secure system design and development, it comes with several problems. Building through a security-tinted lens produces more secure UX design, but at a cost in time (and money). It allows GDPR compliance to be ensured in the design phase, and so becomes a vehicle to ‘sell’ value to both the user and the business. However, since designers don’t typically know what data they actually hold (personal data, IP addresses, and so on), security is difficult to integrate into their work. This can be mitigated by bringing in someone with the relevant knowledge to assist, but that has a knock-on effect on time and cost. Research into whether secure design principles lead to cheaper software would be helpful, but it would be a challenge to execute: development involves many moving parts, and the answer can differ considerably depending on the scenario and context.
Security and privacy are often seen as impeding usability. For example, AI is less effective when trained on anonymised or otherwise secured data. However, with the GDPR enshrining in law that personal data should be anonymised where possible, an AI that can’t work with anonymised data would be difficult to commercialise or use. It would likely be cheaper to design it in a privacy-focused way from the start than to redesign it at a later stage.
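To make the trade-off concrete, here is a minimal sketch of pseudonymising records before they reach an analytics or AI pipeline, using Python's standard-library `hmac` module. The field names (`email`, `ip`) and the key handling are illustrative assumptions, not a prescription; a real system would fetch the key from a managed secret store.

```python
import hmac
import hashlib

# Hypothetical secret; in practice this would come from a key-management service.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a keyed hash, so records remain
    linkable for analysis without being directly identifying."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

# Illustrative record: only the identifying fields get transformed.
record = {"email": "alice@example.com", "ip": "203.0.113.7", "clicks": 14}
safe_record = {
    k: pseudonymise(v) if k in ("email", "ip") else v
    for k, v in record.items()
}
print(safe_record)
```

Because the hash is deterministic under one key, a model can still correlate events by the same user, which is one reason designing privacy in from the start can be cheaper than retrofitting it.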
Taking this further, large-scale surveys of what the UX industry thinks would help form the basis of future research. Rather than relying on the opinions of those within the security industry, and the assumptions they carry, hearing from UX practitioners would point us towards the techniques and methods most likely to be adopted, and tell us whether a usability-driven approach is more attractive than a typical security approach. At that point, action research can be undertaken to find out what effect a usability-driven approach has on secure system design.
If you have any questions or thoughts about this post, feel free to let me know at email@example.com, and thanks for reading.