Just Because it Functions, Doesn’t Mean It Is Good… Lessons to Be Learnt from Design and Development
Whitepaper
As part of our series ‘Just Because it Functions, Doesn’t Mean It Is Good…’, we have looked at all areas of a business, because this phrase can be applied in a range of contexts. Whether the subject is a solution redesign or the day-to-day people, processes, and technology, we are looking at optimisation within an organisation: understanding where basic functionality ends, and where true efficiency and productivity begin.
This is never truer than when it comes to technology and system design. Many businesses are getting by with systems that work – that is, they have the capability to perform the basic function required – but that are not necessarily “good”, efficient, or well-designed.
Most businesses do not have an issue with systems that ‘just work’: the system achieves its objective, and the results seem good. However, it is critical to review systems regularly to identify opportunities for improvement, and to make sure they operate as efficiently as they can.
For example, when it comes to AI/machine learning, a great deal of work is needed to achieve a functional system that draws meaningful insights from an organisation’s data sets. Note that here we say ‘functioning’, but that does not mean ‘good’! Amazon, a leading player in the AI/machine learning industry, learnt this the hard way in 2014, when it developed a machine learning system to review job applicants, with the aim of automating the search for top talent. The team used historical recruitment data from the previous 10 years to train the system, and soon found that the AI was selecting predominantly male candidates. Based on the data it was provided, and without further input from the team, Amazon’s system had taught itself that male candidates were preferable: it penalised the word “Women’s” (e.g., “Women’s chess club”) and downgraded graduates from two all-women’s colleges. By reproducing the bias in the historical precedent, the system was effectively discriminating against women, and it exposed a bigger problem with male dominance across the tech industry. In the end, although some adjustments were made to the system, the project was abandoned in 2017 after executives lost faith in it. The episode, however, goes a long way towards demonstrating the importance of understanding the data and processes behind machine learning systems, to ensure that the outputs make sense and the systems are of ‘good’ quality.
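To make this failure mode concrete, here is a minimal, hypothetical sketch – not Amazon’s actual system – of how a naive scorer trained on skewed historical outcomes can learn to penalise a word such as “women’s”. All of the data, names, and scoring logic below are invented for illustration.

```python
from collections import Counter

# Hypothetical historical records: (resume keywords, hired?) pairs.
# Because past hires skewed male, the word "women's" appears only
# in rejected records -- the bias lives in the data itself.
history = [
    ("captain chess club", True),
    ("lead rugby team", True),
    ("captain debate society", True),
    ("captain women's chess club", False),
    ("member women's coding society", False),
]

def train(records):
    """Weight each word by (appearances in hired) - (appearances in rejected)."""
    weights = Counter()
    for text, hired in records:
        for word in text.split():
            weights[word] += 1 if hired else -1
    return weights

def score(weights, text):
    """Sum the learned word weights for a new applicant's keywords."""
    return sum(weights[w] for w in text.split())

weights = train(history)

# Two otherwise-identical applicants:
a = score(weights, "captain chess club")          # scores 1
b = score(weights, "captain women's chess club")  # scores -1
```

Because “women’s” only ever appears in historically rejected records, the trained weights assign it a negative value, and two otherwise-identical applicants are ranked differently. No rule was ever written to discriminate; the bias comes entirely from the training data.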
And while not all businesses are leveraging AI in their technology stack, it is safe to assume that every organisation has at least one solution that is critical to its business, whether for emails, payroll, or holding client data. Looking to the development world, some developers have a tendency to re-use outdated code. Whilst this may mean the system ‘works’ and saves development time, it can have consequences further down the road. Older or outdated code often carries inefficiencies that increase memory use, which can impact running speeds and system efficiency in the long run. On top of this, older code is more likely to contain known exploits that have not been mitigated in the wider system, exposing organisations using the technology to the risk of cyber-attacks and data breaches. Again, whilst the system may ‘work’ with re-used code, to ensure it is as ‘good’ as possible, developers should continuously review their code, and clean and streamline it wherever they can.
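As a hypothetical illustration of the security point, the short sketch below contrasts a re-used, string-concatenated database query – which ‘works’ for ordinary input but is open to SQL injection – with a reviewed, parameterised version. The table and function names are invented for the example.

```python
import sqlite3

# A throwaway in-memory database standing in for real client data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

def find_user_legacy(name):
    # 'Working' re-used code: builds the query by string concatenation.
    # It functions for normal input, but a crafted name such as
    # "' OR '1'='1" rewrites the query and leaks every row.
    return conn.execute(
        "SELECT name FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_reviewed(name):
    # Streamlined replacement: a parameterised query treats the input
    # as plain data, closing the injection exploit.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()
```

Both functions return the same result for an ordinary lookup, so the legacy version appears to ‘work’; only hostile input reveals the difference – exactly the kind of flaw that a regular code review should catch.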
Putting this practice in a wider business context, it can be tempting to ‘re-use’ operational processes, overlaying old ways of doing things onto new systems. While this saves the time spent learning new approaches, it can be detrimental to team efficiency and mean organisations are not making the most of the updated system functionality. Whether we are talking about coding or operations, in both contexts it is more helpful to treat the old as a framework – use the ‘old’ as the basic premise or outline of the ‘new’, and ensure that the new code or process is reviewed and tweaked on a regular basis.
In the field of UI/UX design, some systems use so-called “dark patterns” to retain customers and gather data. These are design elements that deliberately obscure, mislead, and deceive users into making unintended and sometimes harmful choices – for example, tricking users with confusing language into consenting to additional cookies on a website, and so giving up their data. In other contexts, this takes the form of convoluted and confusing processes for cancelling subscriptions, leading users to get frustrated and give up. Some websites use language that makes it unclear which button to click, and some companies go as far as “confirmshaming” users for wanting to cancel, where a site guilts you into opting into something – e.g., a newsletter pop-up window where the only options are “Ok” and “No, I hate reading about interesting things”. Although widely considered unethical, many sites continue to adopt these techniques because the data suggests that they work. But whilst an organisation may get results in the short term, it may also find that these tricks erode its customers’ trust in the longer run.
It is easy to be complacent when something seems to work ‘just fine’, and very difficult to identify clearly the areas where improvements might be made. This is often a barrier to innovation and transformation projects. Many organisations find themselves faced with ‘analysis paralysis’, continuously stuck in the scoping phase because stakeholders cannot agree on how to move forward, which means the systems never get improved. One way of approaching this is to break the scope into smaller “bitesize” chunks; this way, businesses can achieve quick wins and negate the analysis-paralysis effect.
In essence, when it comes to getting the most out of your processes, people, and systems, there is much to be learnt from the world of development and design. From making sure the baseline data inputs really give you the whole picture, to constantly reviewing and updating working practices, the adage stands strong: Just because it functions, doesn’t mean it is good…
Organisations should aim for continuous improvement, identifying opportunities for efficiency gains even when systems and processes function. Good-quality data should be a consistent goal, enabling businesses to make data-driven decisions that add value to their business and customers.