Covid–19 and the Panopticon
While Nick Robinson was chuffing empty news on the Today program this morning, like a tank engine chuffing smoke rings in a siding, Dr No was grappling with the elephant in the newsroom: the NHS contact tracing app. Present, but no longer spoken of, since the Secretary of State for Health used Hancock’s Half Hour to deep six the NHSX version somewhere off the Isle of Wight just over ten days ago, pending reincarnation as Dido’s Cherry. For a while, Dr No felt like one of the blind men trying to conceptualise an elephant. He had a sense of parts of it, but not the whole picture. It had a trunk, for hoovering up data, and a vast body to process and store the data, if not for ever, then for 20 years. It had tusks, the better to prod into action, and big ears for listening. And then Bingo! It suddenly came into focus, not as a pachyderm, but a panopticon!
Originally conceived as a design for efficient prisons and other similar institutions, and now extended to mass surveillance systems in general, a panopticon is at its core a centralised observatory from which small numbers of unseen officials monitor large numbers of inmates. It is set up in such a way that, while the officials can monitor the inmates, the inmates cannot see how or when they are being monitored. The consequent sense that they might be being watched is then sufficient to encourage the inmates to normalise their behaviour, to toe the line. A modern variant is the corporate call centre where an unseen supervisor monitors calls for anti-corporate sentiment. Just knowing the supervisor might be listening in is sufficient to ensure the call centre staff stay in line.
This inherently efficient way of controlling behaviour is at the heart of a panopticon, and clearly has its place in a contact tracing and controlling app. If the app, and so its supervisors, know where you are, then that exerts a strong chilling effect on any urge or impulse to break your state imposed self-isolation. But it is by no means the whole picture. In order to function, a panopticon needs to collect data. It does this in the case of the app by using a Franklinian pact: the app user agrees to trade a little liberty, by giving up personal data, to gain a little safety, a pact many readily enter into, given the remarkably high level — around 41% for individuals, rising to 62% for family and friends — of significant covid–19 fear still present in the population.
It is worth, in passing, putting that remarkably high level of expressed fear in context. In the week ending 12th June there were 1,114 deaths registered in England and Wales (population 59 million) where covid–19 was mentioned on the death certificate, giving an actual weekly risk of dying around that time with (not from) covid–19 of around 0.0019%.
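For readers who want to check the arithmetic, the back-of-envelope calculation runs as follows (the figures are those quoted above; the population number is the approximate rounded figure used in the text, not a precise ONS estimate):

```python
# Weekly risk of dying with (not from) covid-19, week ending 12th June:
# registered deaths mentioning covid-19 on the certificate, divided by
# the England and Wales population.
deaths_in_week = 1_114
population = 59_000_000  # approximate, as in the text

weekly_risk_pct = deaths_in_week / population * 100
print(f"{weekly_risk_pct:.4f}%")  # ≈ 0.0019%
```

A risk of roughly two in a hundred thousand per week, set against the 41% of individuals reporting significant fear, is the contrast the paragraph above is drawing.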
Turning back to the cherry in Dido’s room, we should note that one of the hallmarks of the British implementation of a contact tracing app is that it relies on centralised personally identifiable data collection. This appears to remain the case, despite widespread MSM reports that the government has ‘U-turned’, and the central database has been ‘ditched’, insofar as the central database hasn’t been explicitly removed; instead, the plan is to combine the NHS app with the Apple/Google app. This central database is of course one of the key features that makes it a digital panopticon. If we then add in the original intention to retain this data for 20 years, along with plans to collect and analyse ever more data, we see that this isn’t an app proportionate to the temporal needs of contact tracing; instead it has all the hallmarks of a mighty big data mining project.
In this, it has more in common with internet behemoths like Google and FaceBook. They too rely on a pact, this time more Faustian than Franklinian: users gain a ‘free’ service by handing over personal data, some of it consented, much of it unconsented. The behemoths then process and market this data as ‘human futures‘, or behavioural predictions, and can even, as we now know, turn this knowledge to the nefarious practice of not just predicting behaviour, but directly influencing it. As one data scientist chillingly put it, ‘We are learning how to write the music, and then we let the music make them dance.’
The British version of the contact tracing app already has the architecture of a panopticon, with documented ambitions to extend the reach and scale of its data collection. It relies, as do the internet behemoths, on a Franklinian/Faustian pact to gain permission to harvest data. Although it is, on the face of it, state owned and controlled — though even this is murky, given that NHSX contracted out development of the earlier app to private entities — it also has the architecture of the internet behemoths: the ability to hoover up personally identifiable data to a centralised data collection and processing facility.
Dr No has no doubt that, given sufficiently brutal controls, with absolute restrictions on mission creep, some of what the British contact tracing app could achieve will have legitimate utility. But he is just as aware — lessons learnt from the internet behemoths — of how ‘data scientists’ can get carried away, and extend their reach in intolerable ways, as he is aware that dear dear Dido was chief pongo at TalkTalk when it suffered a calamitous data breach, reminding us that, unlike a physical panopticon, a digital panopticon can be hacked. Much writing on the wall: not so much reasons to be cheerful as reasons to be careful.