The UK government, as part of its ‘Digital Economy’ initiative, has released with great fanfare the Data Science Ethical Framework. Its ministerial champion has characterised it as ‘harnessing the progressive power of data science while protecting the public’.
It does neither. What it does illuminate is how far governments (along with others) will go in trying to influence and dictate behaviour in areas where they have no literacy at all: no real understanding of the underlying capabilities (data, analytics and algorithms), or of the harm those capabilities can cause if left unchecked.
Ever since the notion of big data came onto the scene, many have extolled its virtues in changing the world as we know and understand it. They have hyped – with a zeal not previously seen – the notions of data science, data scientists, algorithms and machine learning.
Virtually all of them have advocated its wide-scale use to analyse and predict citizens’ behaviour in pursuit of deeper insights, with no controls on just how intrusive that activity could become in its dealings with the public at large.
Any attempt to limit how and where big data and analytics should be applied was met with fury by these same advocates, who characterised it as ‘stifling economic growth and wealth creation’.
Not surprisingly, most advocates have been highly influential in getting governments to go along with their thinking and to take a ‘hands off’ approach.
This has not worked out well for consumers, who now see their daily lives dissected, analysed and ultimately manipulated by the algorithms and machine learning behind the deep behavioural insights available to almost every organisation that invests in data and analytics capabilities.
The backlash that has now arisen from this lack of control is significant enough that many governments have created ethics councils and other bodies that have gone on to generate reports and recommendations on the issue of ‘ethics in the age of the algorithm’.
Additionally, these same governments (US, UK, EU, etc) are also major advocates of digital and have undertaken major digital strategy and transformation efforts within their countries. These efforts have served to further exacerbate the ethics problem that we are now experiencing.
A common thread running through all of this is the seeming cluelessness that government leaders, ministers and civil servants exhibit each time they make a pronouncement on privacy, ethics and governance in relation to big data, analytics, algorithms and digital.
Clearly, they do not understand the underpinnings of the issues, why the topic has become so paramount in the public’s mind, or why the public is demanding that it be resolved to their satisfaction.
Data (big or small), analytics (creepy or helpful) and algorithms (evil or good) are major influences on how the digital world around us evolves. Beyond the well-rehearsed platitudes, leaders and policymakers need a fundamental mastery of these domains, because they are ultimately accountable for making citizens’ lives better and protecting them from threats.
Without strong and competent leadership and controls (governance), these same citizens will be victimised by data, analytics, algorithms and digital rather than benefiting from them.
The requirement for competent leadership is not a political platform for campaigning on, but a focal point for government action in order to uphold basic human rights, no matter what pace of change the country is experiencing.
An ethics framework that relies on self-governance, best efforts and serendipity to ensure that consumer privacy is protected and that citizens are not victimised by their own data is a recipe for disaster.
Government leaders must commit themselves to leading at all levels and across all domains. They must be literate and competent in the areas that they promote as catalysts for change and not leave citizens to the vagaries of data science.