The smartphone in our pocket counts the steps we take each day, and the smartwatch on our wrist measures our pulse, blood pressure and sleep patterns. We might even record our menstrual cycle, note what medications we are taking, including dose and time, and keep a journal of our eating habits. These puzzle pieces fit together to form an overall picture that provides insights into our lifestyle and habits and sheds light on any health risk factors. Is this the promise of digitalisation, or does it pose a risk to our society? The mountain of digital data offers a plethora of possibilities, depending on who has access to it. Social psychologist Jakub Samochowiec from the Gottlieb Duttweiler Institute (GDI) examined these questions together with other experts. To frame the discussion, the authors came up with four scenarios «that will never occur as such», explains Samochowiec. Extreme situations were selected deliberately to allow for the broadest range of possibilities.
The four GDI scenarios
- Big Government
The state collects data to optimise public health. Healthy behaviour is rewarded; unhealthy behaviour is punished. Those with unhealthy habits are practically forced to adopt healthy ones.
- Big Business
Healthcare is provided through the free market. The more data a person shares to document their good health, the lower their insurance premium.
- Big Self
Data are not collected centrally; instead, individuals use their own data to receive feedback on their behaviour, enabling them to adopt healthier habits.
- Big Community
Everyone voluntarily shares their data as a service to society. This model does not discriminate because it treats diversity, which includes people who do not lead healthy lifestyles, as a factor that enriches the total pool of data.
Mr Samochowiec, when reading the outline of the four scenarios, one can already hear cries of outrage …
Because they are described in extreme form. And there is something totalitarian about any extreme. We attempt to explore these extremes in order to offer as much room for discussion as possible.
Are the discussions even productive if the scenarios are frightening?
We asked ourselves the same question. It’s a risk we’re willing to take. Because the extreme scenarios described also show something else, namely that the course of technical development is not deterministic. Rather, the question is how we, as a society, are to manage and use this data.
Let’s look at Big Government, which assigns data sovereignty to the state. A scenario that certainly has very obvious parallels with the current situation ...
… but which has always been subject to debate. What’s better? A lean government or a powerful one? Here the pendulum is always swinging back and forth a bit. Particularly right now during the pandemic, we have seen that a strong government certainly has its benefits when the situation calls for quick reaction.
But the longer the pandemic goes on, the more scrutiny a strong government will face.
That’s a very Swiss point of view. Several countries with fewer deaths have strong, controlling governments. But I also think that Big Government quickly comes up against its limits. The state cannot control our lives down to the last detail ‒ even if it wanted to. There’s another difficulty as well, namely that governments can quickly squander people’s trust.
Can you give an example?
At the beginning of the pandemic, we heard that masks weren’t very effective. The real reason was simply that there was a shortage. When mask mandates are then instituted later, credibility is of course going to suffer.
The pandemic has shown that a strong state has certain advantages.
Jakub Samochowiec Trend Researcher at the GDI
What has the pandemic taught us in terms of digitalisation?
I think what’s happening with people working from home is exciting, for example. It has shown that less control is needed than many of us thought. People are working from home quite effectively, without the need for extensive technical instruments of control. The pandemic perhaps called certain assumptions about people into question.
It indicates that people take personal responsibility and act with solidarity. But is that true? Aren’t smartwatches, for example, eroding solidarity?
I have to admit, the pandemic does seem to cast doubt on the issue of solidarity. At the same time, calls for a vaccine mandate have indeed been voiced loudly. Information campaigns that appeal to people’s personal responsibility eventually come up against their limits. Take the smoking ban, for example. Most people are glad that smoking in restaurants was simply banned and the matter was no longer left to an appeal to personal responsibility.
So Big Self isn’t really an option either?
In the emergency that is a pandemic, it’s probably too late to appeal to individual responsibility if it doesn’t exist in the first place. A society does not reach a state that enables it to act in a certain way overnight; it’s a long-term process.
What barriers need to be overcome in this regard?
When we examine the datafication of society, we quickly start to have misgivings. For example, that we might lose our autonomy or relinquish control to tech giants (Big Business) or the state (Big Government). But it is possible to readily use data in a gainful way without necessarily surrendering control.
What criteria need to be met for us to manage health data in a sensible way?
It takes knowledge ‒ about the technology as well as the possibilities for using the technology. This is the only way to build trust. Not in the technology itself, but trust in how our health data is handled. And then it will take trusting other people: as a society we must also be able to trust that people will act reasonably ‒ without needing to impose rules (Big Government) or economic incentives (Big Business) ‒ and act with solidarity.
And how do we motivate a society to share data?
By encouraging its members to share their data with others voluntarily. This sharing can be seen as a new form of solidarity.
What does that mean for each individual?
That we engage with the question of how our health data is used. And also that we better understand who we are sharing it with when we track our jogging sessions with our smartwatches. But also that we see ourselves as part of a system in which we are both participants and beneficiaries. This takes understanding that the quantity, variety and quality of the data determine how it can be used.
Doesn’t the volume of data in particular harbour a big risk?
The mass of data is a great opportunity; the big risk lies in the concentration of power. A key message of the study is that control of and discrimination against people based on data does not have to be an unavoidable future outcome. In Switzerland, for example, health insurance premiums do not depend on the individual’s behaviour or state of health. This principle still works despite or perhaps because of the data that is available.
Don’t we already change our behaviour because we are collecting data?
That’s definitely true. For better or for worse. Too much control is counter-productive, because then we simply follow rules and no longer think for ourselves. We know that taking away people’s choices costs money and destroys trust. There’s also the risk of perverse incentives.
Which ones, for instance?
I can think of one personal example. I ride my bike from Zurich to the GDI in Rüschlikon, so I started tracking the ride with an app. It always took me a bit more than half an hour. To beat the 30-minute mark and improve my numbers, I started riding more recklessly. Once I recognised this, I switched off the app to eliminate that perverse incentive.
Social psychologist Jakub Samochowiec works as a Senior Researcher at the Gottlieb Duttweiler Institute (GDI) and studies societal, economic and technological changes. He is co-author of the GDI study Are smartwatches eroding solidarity? Scenarios for a data-driven healthcare system, which was commissioned by the Sanitas Health Insurance Foundation.
Illustrations: Nils Kasiske