Future Watch 3 – Privacy and Social Media
Tech legend and one half of KnowNow, Chris Cooper, concludes the discussion with Gemma Christie on data security and privacy. They talk openly about how Facebook and Google use your data, along with a great sandwich story. We also take a look at what the future holds for gaining more control over your own personal digital footprint.
G: You wrote your article ‘5 rules of thumb and security of YOUR data’ in 2015 so it’s about a year old now. Do you think there have been any substantial improvements within the last year?
C: No. I think it’s got worse, to tell the truth.
G: In what way?
C: We’re seeing more and more stuff being connected without any good agreement on what constitutes good security design. There are a number of standards out there, but there isn’t agreement on the handshake and the trusted exchange of information between devices. And there isn’t a mechanism for what I call the ‘lizard principle’, or the ‘lizard tail principle’, where you can shut something down, confine and contain the part that carries a risk or an exposure, and sacrifice that particular component.
So many of the solutions we see coming through are what I call ‘single-threaded decision-makers’: one sensor or one trigger leads to one action, and all it takes is for that one thing to be compromised to set a whole chain of events in motion.
A good holistic design has multiple decision-making points reinforcing a trend, and if you respond to that trend proactively you achieve some real difference. Where we seem to be instead is with systems that say: I want to know something about the state of this area, and once it reaches a certain point I’m going to go and do something else.
An example of this could be: river flow is going up, let’s open a sluice gate. The sluice gate only has one sensor, and if that sensor is compromised the gate doesn’t open. People get flooded. That just seems really poor design: a lack of thought about how you make things happen and, in my opinion, a lack of desire to invest appropriately in a fit-for-purpose solution that would stand the test of time. We tend to buy on price for the short term rather than invest in a project for the medium to long term.
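The sluice-gate contrast can be sketched in a few lines. This is a toy illustration of the design point only (none of it comes from a real flood-control system): a single-threaded controller acts on one sensor, while a trend-reinforced controller acts only when a majority of independent sensors agree, so one compromised sensor cannot trigger or block the action on its own.

```python
def single_threaded_decision(reading: float, threshold: float = 5.0) -> bool:
    """One sensor, one trigger, one action: a single point of failure."""
    return reading > threshold


def trend_reinforced_decision(readings: list[float], threshold: float = 5.0) -> bool:
    """Open the gate only if a majority of independent sensors agree."""
    votes = sum(1 for r in readings if r > threshold)
    return votes > len(readings) // 2


# A stuck or compromised sensor reports 0.0 while the river is rising.
compromised = [0.0, 6.2, 6.5]

print(single_threaded_decision(compromised[0]))  # False: the lone sensor says "do nothing"
print(trend_reinforced_decision(compromised))    # True: the other two outvote it
```

The second function is the ‘multiple decision-making points reinforcing a trend’ idea in miniature: no single reading decides the outcome.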
So no, I don’t think our ethos, how we approach projects and how we try to deliver and connect things up, is following good systems practice. We’re still seeing a number of poorly designed, poorly implemented solutions, especially in the internet space.
The 5 reasons to worry about security
G: In the article you wrote last year, you said there are five good reasons we need to worry about security?
C: I think the top one is: if it needs to be really, really secure, don’t connect it. We’re not ready yet for completely secure systems. The UK MOD has its own internet: its own physically deployed network that has never shared anything. It’s completely separate. There’s another internet run by the military. If you need to connect things up, you may have to look at having your own proprietary network and installations, with separation across the whole bundle (comms, switches, power plant and so on) to ensure nothing is shared. Then you may be fine, but that’s obviously incredibly expensive.
The second one, privacy-by-design principles, is becoming more familiar to developers. We’ve got a change in EU law coming up which will enshrine this, so apps will have to have privacy by design: you can be forgotten, and you can withhold consent so that none of your data is used by a particular provider or service. That is moving along nicely. We like that one.
The lizard tail principle: I think I’m still unique in thinking that way, but this is risk mitigation. Have a policy of sacrifice and containment for services that have been hacked or compromised.
On point 4 (machine-to-machine preference), we’re still seeing too many humans trying to do things themselves. Trust the machine; let machines trust machines. What’s interesting is that Google are now looking at this from a machine-learning perspective on the device: how your machine is being used. Your machine being a mobile, that means where it goes and how it moves, so they can see that as long as you’re moving the phone in the same time periods as you do every other Monday to Friday, the chances are it’s you. If you then move to a different country and do something different, that calls into question whether it’s you.
You might see different levels of security applied just because there is an implicit view that it must be you, because the device is behaving as expected. It will be interesting to see how this goes, but it is moving more towards machines trusting machines. It has obviously got a lot of people a bit worried because of the amount of behavioural data Google then holds and can use, and whether that is a good thing, because you’re exposing yourself on things like number 2 (the secure and privacy-by-design principle): it’s no longer privacy by design.
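The behaviour-based trust idea can be made concrete with a deliberately simplified sketch. This is not Google’s actual system; the routine table, scores and thresholds below are all illustrative assumptions. The point is only the shape of the logic: a device seen in its usual place at its usual time earns a high trust score and low friction, while an unfamiliar pattern drops the score and triggers a stronger check.

```python
# Hypothetical routine data: (day, time) -> usual location of the phone.
WEEKDAY_PATTERN = {("Mon", "09:00"): "office", ("Sat", "09:00"): "home"}


def trust_score(day: str, time: str, location: str) -> float:
    """Score how well the current observation matches the owner's routine."""
    usual = WEEKDAY_PATTERN.get((day, time))
    if usual == location:
        return 0.9   # matches the routine: probably the owner
    if usual is None:
        return 0.5   # no routine data for this slot: neutral
    return 0.1       # contradicts the routine: challenge the user


def auth_requirement(score: float) -> str:
    """Map the trust score to the level of authentication demanded."""
    return "pin_only" if score >= 0.8 else "full_password"


print(auth_requirement(trust_score("Mon", "09:00", "office")))  # pin_only
print(auth_requirement(trust_score("Mon", "09:00", "abroad")))  # full_password
```

The dilemma Chris describes is visible even here: `WEEKDAY_PATTERN` is exactly the kind of behavioural record that sits uneasily with privacy by design.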
It’s always remembering what happens, so it’s a really interesting dilemma: there are conflicts in what I’m saying, and we need to find a way forward there. I think there is more of a desire for openness and trust coming. Trust in your interactions, to drive your experience: these are now value-adds that people are prepared to pay for. Around 50% of people who buy apps and make in-app purchases would pay for some additional trust in their data, having an idea of where their data is going and how it’s being used. Which is good.
G: Okay, how do you know this?
C: I actually did a study with the University of Portsmouth that highlighted this through a small empirical data set. You can also compare that against data from the Mobile Ecosystem Forum, which also suggests there is a growing market of people who would buy services that protect their data.
Which is good, because KnowNow has an innovation in this space launching very, very soon, called Consentua.
Consentua – Privacy innovation
G: Consentua? Can you tell us a little bit more about that?
C: Yeah, Consentua was born out of our experience at Canary Wharf. What we created was a very powerful app that used a lot of personal data about where you were, what you were doing and what you wanted. We felt that wasn’t appropriate if you didn’t want that information stored or collected at particular times or in particular circumstances, so it gave you the opportunity to turn it off, or to turn the volume down on the data being collected. You still get some service, just not the great service.
This has been picked up by people in the security industries as being a really good idea. We’ve since been testing a couple of designs as a minimum viable product, and we’ve been doing a market assessment through a University of Portsmouth student project, which validated whether this was a good business opportunity. It is being driven by, one, people’s desire for better control of their data; two, the change to the GDPR, the General Data Protection Regulation in the EU. Finally, I believe that because we’re moving to an API-based, modular coding world, developers are very happy to borrow bits from other places.
For example, for logging on and off you might go and use Twitter, Facebook or LinkedIn. Those guys have no view of what the app is doing; all they’re providing is saying, “Yes, that’s that person, you can trust them because they’re on my network.” It’s a very easy way of not having to go and build Rome again, and this idea of modular code build is also moving into the mainstream.
The conclusion is that we should have the ambition to attract, over the next 12 to 18 months, at least 1% of what is, in the UK alone, a £90 billion market made up of in-app purchases. If you take the whole of the EU, you’re looking at around the £300 billion mark. If you look globally, you’re looking at £550 billion.
And even if you take 1% of £550 billion, that is still rather a lot of money. The service would work through micropayments paid by those apps: it’s a business-to-business service we provide to other app providers to manage consent from users. It has a really positive business case, and we’re hoping it’s the next big idea.
G: Sounds great. Good luck.
C: Thank you very much.
Logins and multiple identities
G: What would you say about logging in via a social network, e.g. Facebook, on another website, e.g. Ocado, rather than creating a whole new member login on that website?
C: I think this is fine, and it will evolve into something much better around identity. Where I think this will move to is that you will have your own verifiable digital identity, much as you have your National Insurance card now. How we validate it, how we create it, how we change it (because people do change: they get married, they change sex), we need to work out how we do these things. How do we stop your identity being used by someone else? How do we delete your identity in death or, if it gets compromised, delete it and create a new one for you? Obviously not if you’re dead, because that’s done.
Norway does something interesting: it uses its banking system to provide validation of a person. There is a physical validation of a person in a branch, someone saying, “Oh yes, you’re a live, sentient human being”; it then checks other realms of documents, and that creates your digital identity, which is then used to sign all your digital activity and give you access to digital services. One that this country is looking at is a thing called Verify, but that is not fit for purpose for a lot of things you’d want to do on the web, especially validating transactions; there are some weaknesses. It’s moving in the right direction, but we have this paranoia about the government owning our digital identity, and yet there are so many places where you have an identity already known to the government.
What we also need to think about is how we can best use our identity across all of these databases, because there is going to come a time when we actually want a rich personal data exchange at different points in our lives, for example in healthcare, or when your tax bill is coming up; you’ve seen the changes in people’s desire to understand your tax situation. I predict that your tax affairs will be online and open. If you look at Scandinavia, that’s where they’ve gone, and you can see that this country will follow, just to remove the idea that there are people taking out more than they should or not paying enough. Society has spoken: it wants this stuff online. We’ve got to bear in mind that what we consider sacrosanct, something you wouldn’t put online, is actually going to be opened up anyhow.
G: Is there a conflict between wanting to be secure and the requirement for open data? As one of the ingredients for a smart city, can you see those working together harmoniously?
C: Yeah. Where I think we’ll end up is that you’ll be managing a bunch of exchanges at a very deep personal level, about your location and your context, to get something from someone or some service. What they then do with that data, how they store it, how they repurpose it, again may not be open to them, because it may just be for that real-time opportunity: you swap a bunch of credentials and it’s a fire-and-forget model. That has its own benefits and impacts. On the benefit side, as a citizen it’s great: you know you’ll never be hassled again, because they have no visibility of you. On the downside, you have no ongoing sense of “they understand what I want because I’ve been here before.” Every day will be like Groundhog Day. “Hello.” “But I was only here five minutes ago.”
You might get people who want to test it and go, “No, this thing is completely stupid, it only knows you in the now”, especially when with some services you want that ongoing relationship. There’s nothing nicer than going to the same sandwich shop. When I was at university I went to the same sandwich shop, because it was across the road, every single day, and I had the same thing every single day. I’m like that, I have my groove.
After a while, there would just be a sandwich made for me. I’d go in, we’d have a silent conversation, a nod, and I’d leave with the sandwich. It was great. I loved that service. When I was younger I used to go to the pub a fair amount. They’d see you come through the door, you’d nod and a pint would be poured. The pub would be busy, and some time later the bartender would come and pick up your money. You think, “that’s great personal service.” That’s what you want from the internet.
How much information do I have to exchange to achieve that? Am I happy for them to store it, share it and do other things with it? That’s the bit we need to get our heads around. That’s why having consent over your data is really important, because then you have a better understanding of what is happening to it.
What we need to do as engineers is make it really easy. Let’s get rid of these 20 pages of terms and conditions and give you something very clear, in plain language, not legal speak: “If I do this with your data, you’re going to get that, and it’s up to you whether you like it or not.” You’re then making an informed decision, and people will make decisions for selfish reasons, because they’re doing that already with Facebook.
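One way to picture the “plain language, not legal speak” idea is to treat each data use as a short clause pairing what is collected with what the user gets for it, which the user accepts or declines clause by clause. This is a hypothetical sketch: the class and field names below are illustrative and do not describe Consentua’s or any real product’s API.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentClause:
    data_used: str        # "If I do this with your data..."
    benefit: str          # "...you're going to get that"
    granted: bool = False  # "...and it's up to you"


@dataclass
class ConsentRecord:
    clauses: list[ConsentClause] = field(default_factory=list)

    def prompt_text(self) -> str:
        """Render the whole agreement as short plain-language sentences."""
        return "\n".join(
            f"We use your {c.data_used} so you get {c.benefit}. Your choice."
            for c in self.clauses
        )

    def allowed(self, data_used: str) -> bool:
        """The app may only use data covered by a granted clause."""
        return any(c.granted and c.data_used == data_used for c in self.clauses)


record = ConsentRecord([
    ConsentClause("location", "nearby offers"),
    ConsentClause("purchase history", "personalised recommendations"),
])
record.clauses[0].granted = True  # the user accepts the first clause only

print(record.allowed("location"))          # True
print(record.allowed("purchase history"))  # False
```

Because consent is held per clause, declining one use degrades the service gracefully rather than blocking it, which is the “turn the volume down” behaviour described for the Canary Wharf app above.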
“My viewpoint is that Facebook is open with how they will use your data within their ecosystem… The internal conversation you might have about Facebook goes along these lines… Facebook takes all of my information and I let them because I want to tell the world what I’m doing, I want to share that picture”.
G: On Facebook, it’s very difficult to manage your security settings. I don’t think many people know how to do it properly.
C: Yes, and Facebook have that tendency because they’re looking for that interaction … and also they don’t necessarily know what they want to do, so they keep changing the terms and conditions.
I think that’s one of the weaknesses … When was the last time you read Facebook’s terms and conditions?
G: I think I’d be lying if I said I ever have read them properly.
C: It’s scary what they will do and what they can do with your data.
G: I don’t think I want to think about it to be honest.
C: All those photos that you’ve uploaded. They’re now owned by Facebook.
G: Yes, it’s scary isn’t it?
C: They have a deep awareness of what you like and how you like it. We already have a two-tier internet. Here’s an experiment you can do. Go and use a different search engine from the one you normally use, on a different computer. Go and use DuckDuckGo and then compare it to Google: type in the same search request and see what you get back.
G: I’ll do it today.
C: And you’ll be really surprised, because one of them has no awareness of you. When you use DuckDuckGo especially, the way it works is different from Google: Google is giving you what it thinks you want to see, based on its view of you as an entity.
G: Google has so many different parts to it that you log into.
C: I know, what a wonderful organisation. It’s scary.
Do you have concerns about your own privacy? Have you ever looked at the terms and conditions or privacy settings on a website and not agreed with them? Let us know using the form below!