It is said that the best way to explain anything, so that it is not only understood but also retained, is to tell a story. So I am going to tell a story about what the GDPR calls ‘profiling’ and ‘automated decision making’.
The connection between the two is difficult for many to comprehend, for good reasons. It seems technical after all, a consequence of technology: data warehousing, analytics, AI etc. The question for many is: why are they connected? So this is a story about me and a box, and about automated decisions made before technology became a part of it. I say me first, because I came before the box did, although you may argue this point after reading my story 😉
In fact it is not just a single box but hundreds of boxes, each one unique and different, and through some act of fate, or magic, the single instance of me happens to have resided in each of these boxes at some time in my life. Often I reside in multiple boxes at the same time, a physical impossibility I know, but it is true.
My first memory of the boxes was from before I was aware of them, or their function. I was about 10 years old, and together with my sister and brother was introduced to a new friend/acquaintance of our family.
My mother, speaking proudly: “This is Karen, she’s the intelligent one of the family.” Then she moved on to introduce my sister and brother as the tomboy and the trouble-maker. So I was profiled even before computers arrived on the scene in normal family or business life. It was even before store/loyalty cards, data warehousing and big data analytics. Looking back 10 years later, I internalised that I had been put into a box.
So what, you may think? Well, the consequences are subtle and profound. For those introduced to our family, the automated decision made is that Karen is intelligent, which implies the inverse for the other two kids. Karen is the favoured one, according to my two siblings. Clearly it didn’t make me popular; an automated decision was made on how I should be treated by those who were aware of this box. What is more, my siblings were inadvertently placed into another box, which influenced how they were treated by others, and even how they perceived me.
I started to become aware of these boxes when I got pregnant at the age of 17 and was moved into a new box. The old box I didn’t mind so much, but this new one was not nice at all. I had done something really bad; my life, my dreams, it was over. I was in the ‘teenage pregnancy’ box, and there were a whole load more boxes waiting to grab me when I took the marry-the-father route: the teenage mother box… and let us not forget the box my son ended up in, as a statistic for social services. Decisions were made over which I had no control; no more smiling faces, but faceless bureaucrats, George Orwell 1984 style. Nothing digitally automated, just faceless human beings filling in checkboxes.
By the time I was 21 I had landed in a new box of my own making, for the next 13 years of my life. I was a single mum with a young kid, living in an area occupied mainly by single mums like me, dysfunctional families, prostitutes and drug pushers. I didn’t choose to be there, even though this was the box I was in. I lived in a flat with no heating, no wall insulation, single-pane windows and concrete floors. For periods of time I didn’t even have hot water, due to an automated decision. At this stage, to have a choice in anything in life was a luxury. You survive. Which brings us to the most important word when it comes to human rights: choice. The GDPR right to human intervention in automated decisions made about us gives us the choice to challenge decisions that affect our rights and freedoms as data subjects.
Fast forward to today. I realise now that what I called a ‘box’ when I was young is actually a ‘profile’ in the GDPR. I had been profiled, and automated decisions were made about my life over which I had no control, and this was before profiling and automated decisions were digitally automated as we know them today.
This all happened in the UK. Now to place this into today’s context in Sweden.
By 2019 I’d been an entrepreneur for 6 years, and my second startup was suffering. I became personally liable for tax debts which I was unable to pay. I requested a ‘repayment plan’ from the Swedish tax authority, which I knew was technically possible, but was refused. The reason was that I’d had 3 parking fines over 3 years that I had been overdue in paying, so I had crossed the threshold for being offered a repayment plan. I explained to them that I’d paid all my taxes for over 15 years, but it made no difference. I had been profiled on 3 parking fines, and an automated decision had been made which was going to devastate me and my family. This was bizarre considering I had significant capital in assets (our house), which I was unable to take a loan against because by then I was profiled as not creditworthy. Soon afterwards I received a fat letter in the post: a repossession order on our family home.
What was interesting were the similarities between when I was young and today, which are as scary as the differences. The main difference was that I knew what was happening this time. I knew I’d been profiled using technology, and that an automated decision had been made using technology. What was similar was that I felt just as vulnerable as I had in the 1980s-90s. The problem was that even though I now had rights, and was aware of them, I wasn’t in a position to use them.
Article 22 of the GDPR is a really important data subject right that we can overlook because we get bogged down in the technical/legal stuff. It seems complicated. But in reality it is not complicated, nor even technical; it is very personal. I hope my story has succeeded in bringing this message across.