Let’s face it: we’ve all been bombarded with news about the new Data Protection Act and how and why we should comply. We have been inundated with good advice, and warned of the perils should we fall foul of the new regime. Indeed, some people think that we have over-regulated and that data protection is complete “overkill”. But have you ever wondered what would happen if we did not have it?
China’s “Social Credit System” is a good example of how an individual’s personal data can be used and shared in a most intrusive manner, which frankly would neither be tolerated nor deemed to be legitimate processing as that term is understood by the Data Protection Act.
In the UK we are all used to credit checks and, indeed, we may have a credit score provided by credit reference agencies. We, as data subjects, consent to this information being used, and it facilitates the granting of credit by lenders, including mortgage providers.
In China, however, the social credit system expands this to all aspects of life and can be used to judge citizens’ behaviour and trustworthiness. So, for example, being caught jaywalking, playing music too loudly or failing to pay bills can affect whether an individual is able to book a flight or train ticket. First announced in 2014, the system is one the Chinese government hopes to have fully in place by 2020. Whilst China does not yet have a national social credit system, many local governments operate their own, and there are several unofficial private versions. What may well happen is that the information held by the various private companies will be passed to the central government; with no consistency amongst those companies, it is not known precisely how that information may be used by the government, so the rating of scores, and the types of data used, may well lack consistency.
Eventually, however, the government will have a “unified social credit score” linked to an individual’s permanent record. A person could be “blacklisted” if, for example, they refuse to pay a fine. A person’s social credit score may improve if they are a parent, or deteriorate if they have traffic violations.
Being blacklisted can have serious consequences and could prevent a person from taking out a loan or purchasing property. Whilst there will probably be a system in place for a person’s score to be rehabilitated, the process may well be long and arduous, and with China not having the “rule of law” an individual’s civil rights are not secured.
Advocates of the system say that if a person has a good score then, rather than being “off the radar”, they may be able to access credit which is currently unavailable to them.
London’s “Evening Standard” recently reported that one of China’s provinces is operating a new app which alerts people if they are within 500 metres of someone in debt. The app’s name apparently translates as a “map of debtors”, and it can be accessed through the country’s instant-messaging platform.
So before we criticise our data protection regime as “over-egging the pudding”, perhaps we should be grateful for the “rule of law” and the host of rights bestowed on individuals in this country.
Perhaps, on a more positive note, we should draw some inspiration from the use of artificial intelligence currently being developed by the fintech practice at Deloitte. Its global head, Kent MacKenzie, says that its use “could change the fortunes of the homeless” by building a digital background for people without an extensive conventional financial history, for whom financial services might otherwise be outwith their reach. Fintech is all about financial inclusion, and with individuals finding it difficult to open a bank account without disclosing years of payment, address and employment data, this initiative is badly needed. Under the fintech model, an individual can be profiled using alternative data rather than the traditional material which is currently relied upon, and that profile used to establish whether the individual should be granted credit. Because MacKenzie’s view is that financial services are a basic human right, profiling individuals in such a fashion certainly accentuates the positive. So whilst data subjects will still enjoy the protection of the Data Protection Act, it is the use of their data which is so positive.
For further information please contact Stephen Cowan
on 0141 572 4251.