Episode  MSP69  Fixing It: Working for the Machine
Is your technology working for you, or are you working for it? Why machines want us to prove that we are human.
Produced by Jeff Sandhu for BFM89.
These shows are dictated to and transcribed by machines, and hurriedly edited by a human. Apologies for the typos and grammar flaws.
One of the reasons that this show exists is to demonstrate some of the many ways that technology is dramatically improving our lives. But what do you do when the technology that is supposed to be making our lives easier seems to be serving other masters?
This sounds like another of those Game of Thrones inspired episodes. Are we back to your master and servant theme?
· Indirectly, I guess.
· This is another one of those episodes I’ve had sitting on the back burner for a while.
· And then the brouhaha about Facebook’s supposed pivot towards privacy broke cover last week and I thought it was time to tackle this one.
Before we head into Facebook land again, do you want to give us a little background for today’s show?
· As you mentioned in the intro, I think technology is fantastic.
· For the majority of humans – in the developed world at least – this is still the best time to be alive.
· And part of the reason for that is technology.
· Whether it’s technology delivering information to your hand, safer and more reliable modes of transport, cheap and plentiful food, or incredible medical advances.
· Everywhere you look, technology is chipping away at the brutality of the past.
You could argue that some of that technology is pretty brutal too.
· Absolutely. We have an enormous capacity for violence and destruction.
· And one of the first things we do with a lot of technological developments is to weaponise them and use them for so-called defence or security purposes.
· Whether it’s video cameras or advances in sonics.
· It’s not what we’re talking about today, but if you are interested in that part of our development, there’s a great article in New Scientist called How Humans Evolved to Be Both Shockingly Violent and Super Co-operative.
· Not the snappiest title, but the piece, written by Richard Wrangham, looks at the evolutionary and societal case for violence and cooperation.
When do you think we lost control of the technology?
· I’m not sure there’s even an answer to that question.
· You could say that we lost control of technology hundreds of years ago.
· Donald Trump loves to tell people how well walls have worked for millennia.
· And yes, sometimes they kept marauders and bandits out.
· But history is full of walled cities that fell to invaders.
· Some were more technologically advanced than the invaders, who simply encircled them and starved them out.
· But often it was technology that breached the walls.
· Siege engines. Catapults. Crossbows.
I thought we weren’t talking about wars and violence?
· Yeah, but if I start talking about looms and steel mills people are going to switch off.
· The things most people seem to remember from their history classes are the battles and the wars.
· And greater technology has often – not always – played a decisive role in who wins wars.
· The US Civil War is a good example. The industrialised Union states had a literal war machine that the more agricultural, slave-based economy of the Confederate states couldn’t compete with.
So we’ve never been in control of technology?
· You can own your own spade. It doesn’t make sense for everyone to have their own steel mill.
· So, it’s a really difficult question to answer.
· But I think, throughout the 20th century at least, there was a feeling that the technology served us.
· Electricity. Telephones. Cars. Fridges. Air conditioning. TVs.
· All the paraphernalia of last century’s consumer culture.
· But somehow, our ability to grasp that technology seems to be declining.
Is that from the perspective of understanding how things work or from the actual ownership of a product?
· A little bit of both I think.
· And they do seem to be synonymous.
· The more complicated the technology in our lives becomes, the less ownership we take of it.
· Or the more willing we are to allow the manufacturer or seller to retain some of those ownership rights.
In what sense?
· When I bought a new car a few years ago, it didn’t come with a CD player.
You still play CDs?
· This isn’t that show. But yes, I still buy and play CDs.
· The staff at the service centre told me that they couldn’t install one for me.
· Even though I’d found a CD player that the manufacturer sells in other countries for precisely those reasons.
· I said I’d import the player. They said they couldn’t install it.
· And they said that if I installed it myself, or had a CD player installed by a third party, it would invalidate the warranty on the entire car.
So what did you do?
· Bought a car with a CD player. But, like I said, this isn’t that show.
· It was only afterwards that I thought how monumentally weird and arrogant that was.
To buy a car because it had a CD player?
· To be told that I can’t do something with my own car.
· That if I do something as inconsequential as changing the media centre, then the manufacturer won’t cover me if the gearbox falls out or the ECU fails.
· We take a lot of these things for granted, and it isn’t accidental.
· There are numerous examples where we’re pretty much groomed to accept these very weird restrictions around technology we use every day.
I guess Apple’s bricking of jailbroken iPhones is one of the most obvious examples…
· Yes. And let’s be clear. This isn’t just an Apple issue.
· Most of the big tech companies – product and services included – are guilty of this kind of high-handedness.
· But the Apple example is a useful one because it was so high profile.
Do you want to remind the people who might have forgotten what the story was about? Or who never owned an iPhone and couldn’t be bothered to follow it?
· I’m trying to pull it out of the dusty Rolodex of the mind where my memories are stored.
· Essentially, some people wanted to be able to play around with their iPhones in ways that Apple didn’t approve of.
· Which basically means doing all the tweaking that’s standard on Android phones.
· Or to side-load apps that didn’t meet Apple’s T&Cs or were from developers who didn’t want to submit themselves to Apple’s set of ethics and conditions.
· That could range from anonymising apps that stopped the phone sending data back to servers, to pirated copies of retail apps from the App Store, which people could then add for free or at a much-reduced cost.
What was Apple’s response?
· To issue updates that detected whether the phone had been jailbroken and, if so, bricked it – slang for rendering it inoperable, i.e. turning it into a sleek, glass-fronted brick.
· Which, depending on the ethos of the day, the company might choose to unlock for you – at a cost – or simply leave dead.
How did people respond?
· People who had bought these really expensive phones they could no longer use were rightly outraged, but the wider public?
· It was pretty much: who cares?
· And I think that, and similar occurrences at other tech companies, sent a strong signal that we were prepared to let them get away with pretty much anything.
· It showed that most people were quite happy to go along with whatever set of conditions came along with the device.
· And, crucially, that the people who weren’t didn’t have a sufficiently large voice or enough influence to make people understand the fundamental weirdness of this approach.
That you don’t own the thing you buy?
· It’s one thing to buy something with financing.
· You want a house, you need a mortgage.
· And until you finish paying, the finance company has first dibs on any sale of that property.
· But you buy a phone from Apple or Samsung or whoever, and that company dictates how you can use the device?
· That’s strange. You wouldn’t let your mortgage company tell you what colour the walls were, what furniture you could use, or who you were allowed to let into the property.
· But that’s routinely the arrangement we accept with electronic devices and tech services these days.
When we come back, how we’re being groomed by Big Tech.
Today we’re looking at some of the ways that our relationship with technology and the companies that make and control that technology are breaking down.
Matt, before the break you used that loaded term, grooming. What do you mean when you say that?
· Again, I’m not getting into that conspiracy theory territory.
· I don’t think this is a grand conspiracy, it’s more a case of bandwagoneering.
· And of course, there are probably a lot of agencies and consultants sitting in the background touting these ideas as best practice.
· But take those EULAs we all blithely click on.
· When you look at how they’re structured, they want you to click yes without reading.
· Partly because most of us will fall into a narcoleptic coma halfway through point one, but also because it’s the most efficient and machine-like way of using the service.
But we’re not supposed to be machine-like…
· That was something that we touched on in last week’s episode – Data Babies – available on all good podcast platforms if you haven’t listened to it yet.
· The idea that we are deliberately remaking ourselves in the machine’s image.
· There’s a great article by John Naughton in the Guardian, from November 2018, titled Computers Have Learned to Make Us Jump Through Hoops.
· He briefly goes through some of these grooming exercises.
· And it’s one of those ‘Aha’ type moments.
· Those Recaptcha tests that ask us to prove we aren’t robots.
· And then they take us to a bunch of pictures where we have to identify road signs and traffic lights.
· What we’re actually doing is helping to train AI – like Google’s Waymo self-driving car systems – to recognise and identify the things they will see as they drive around.
· We’re doing that work for them. Free of charge. And then, when I finally prove I am a real boy, that thing I just signed up for asks me for all kinds of personal information and preferences that it uses to try and tell me stuff.
It’s just a Turing test…
· But it isn’t.
· As Naughton points out, these tests are an inversion of the Turing test.
· In the Turing test, a machine tries to pass as human.
· These tests make us prove to a machine that we’re human.
· And that’s what I mean about this relationship being all wrong.
· It’s like when one of my speakers announces, battery low, please charge now. Is it talking about me or itself?
· We have companies telling us how to use their products, not allowing us to change a thing, and then making us complete a series of challenges to prove that we’re human enough to be their customers.
· It’s more than absurd.
Which I imagine is what brings us to Facebook…
· The poster child for everything that’s wrong with Silicon Valley.
· Which, I imagine, has taken a lot of pressure off technology’s last punching bag, Uber.
· Let’s face it, we hardly even notice when Twitter’s Jack Dorsey posts pretty pictures of temples and ignores what many commentators have classified as ethnic cleansing and potentially genocide in the same country.
· Because Facebook will probably have done something tone deaf and foot shooting during that same news cycle.
And what’s the latest example?
· Facebook. Last week Mark Zuckerberg announced that the company was pivoting towards privacy.
· I think one of the examples he used was that Facebook was founded on the idea of people living globally – overlooking the simple fact that many people want to live locally.
What does that mean?
· Who knows? If anyone does, please feel free to tweet us your explanation.
· One of the major steps Facebook has announced is unifying the messaging platforms of WhatsApp, Instagram and Facebook.
How does that protect our privacy?
· In theory it means that the same encryption can be used to secure all the platforms.
· Which you could do without unifying the platforms. I can understand that Facebook feels it’s paying out a lot of money to replicate systems.
· If that’s the case, tell us. At least then we can make an informed decision to carry on using the services or migrate elsewhere.
· Don’t show us a grapefruit and tell us it’s an orange.
· We’ll know as soon as we take a bite.
Can Facebook make privacy a priority?
· I don’t think it can.
· We’ve discussed it here so often.
· Facebook’s business model relies on exploiting the privacy of its users and selling that information to its real customers, advertisers.
· A lot of the comment I’ve seen online suggests that this could be a move to integrate various parts of the company more tightly in case it’s ordered to break up or divest certain components in coming years.
Is that likely?
· There’s a growing public and political consensus that some of the companies are already too big and too powerful.
· That’s something we’ll have a look at in more detail next week.
· Honestly, I’m on the fence. Does Google dominate search? Yes.
· But how do you break that up? You can get the company to split or sell-off its other divisions, like Maps, but how do you reduce their dominance in search?
· And if you do, do you just allow another monopoly to rise in its place?
· More on that next week. But as I said there is growing consensus that something should be done.
It seems to be a focus of the Democratic presidential nomination campaign in the US.
· I think Elizabeth Warren issued comments about breaking up the giants.
· I think the EU is also looking at the possibility of antitrust cases against some companies, similar to the US case that nearly saw Microsoft broken up in 2000.
· So, a lot of people are seeing the Facebook announcement as a pre-emptive move.
· Turning Facebook, WhatsApp and Instagram into essentially a single product would make it much harder for lawmakers to order them broken up.
· And in doing so, it locks users ever more tightly into a Facebook owned ecosystem.
We’ve talked about interoperability before, especially when it comes to IoT and home control systems. Is this another area where technology is failing us?
· To an extent. Because we’re often talking about commercial standards rather than open source ones.
· Obviously, companies want to sell you as many of their own products as possible.
· They won’t necessarily want their washing machine to communicate as seamlessly with a rival’s dryer as they will their own.
· But we’re sold that as an advantage rather than a drawback.
· And even when the systems are more open, they still rely on a commercial third party like Google Home or Amazon’s Alexa.
· So the standards are set by a company that can decide to freeze you out, or to compete with or replicate your product directly.
· Once again, favouring the entrenched operators rather than innovators and emerging businesses.
We’ve mostly been talking about tech’s usual suspects today. But this issue is much broader than that, isn’t it?
· One of the reasons I do that is because these shows are quite short.
· It’s easier to use examples that people recognise.
· But many of the subjects we tackle on MSP are valid for a much wider range of companies.
· Especially when we talk about all the ways technology can be used to create divides in society.
· To favour those who have wealth and power and to continue to increase the gap between them and the people who don’t.
· So, last week a breakthrough depression-fighting drug was announced to be close to clearing its FDA approvals in the USA.
· I’m not going to name it, you can Google it and find out more.
· But it’s derived from a commonly used anaesthetic drug called ketamine.
How does it work?
· No one’s really sure. But it seems to work really well on people who are resistant to the serotonin class of anti-depressants.
· And rather than taking it every day forever, you take a short course of the drug under medical supervision and for many people their mood starts to lift shortly after.
· The treatment can then be repeated as required if the illness returns.
Isn’t that a good thing?
· It’s a fantastic thing. Finally people are opening up and we can actually have conversations about mental health.
· But this breakthrough is also a very expensive thing. A course can cost thousands of USD.
· Which you’ll pay every time you need a top up.
· Which is fine if you have great medical insurance, or if your country’s healthcare system covers it.
That’s generally how healthcare works…
· There are a couple of dangers.
· Ketamine is a widely used drug available in generic form, so it’s cheap.
· But because of that, few companies are willing to fund the trials needed to have it approved as an antidepressant.
· But tweak the formula and patent it, and you have your miracle wonder-drug that commands premium prices.
And the second danger?
· Ketamine has been used as an illicit party drug for decades.
· It’s easily available on the street.
· That’s where you see people being failed by the technology and development.
· Where desperate and vulnerable people whose depression is drug resistant, turn to illegal sources for treatment.
· And use drugs of uncertain provenance and dosage outside of medical supervision.
· That’s a horrible risk to take when someone else is able to obtain similar treatment, perfectly legally, by virtue of being wealthier.
This week we hosted BFM Rocks, a conference for retailers and other small business owners. You appeared on a panel about small businesses using data to boost their businesses. One topic we covered was how small businesses can use the same practices as the big tech companies without alienating their own customers.
· Due to scheduling conflicts, and in the spirit of full disclosure, we’re recording this show before ROCKS.
· So I’m talking about what I’m planning to say rather than what I actually said.
· And it is difficult.
· The edge that customer data provides can make a significant difference to small businesses and retailers.
· And I’ll be honest, when clients ask me about this, it’s the area where I’m most conflicted.
· So I usually outline the fact that some people are starting to break away from the data systems created by companies like Facebook and Amazon.
· They want the convenience of retailers that know something about them and their tastes, without the intrusion of continuous attempts to harvest their data.
A lot of it comes down to your own business ethics?
· Especially when it comes to systems that allow you to face-track customers and monitor their behaviour in-store.
· I’d advise being upfront with customers. Let them know that you’re using these systems.
· Give them an opt out and make your data retention policies clear.
· One of the reasons we get so frustrated with the big tech companies is because they aren’t upfront about their intentions.
· They don’t answer straight questions, and worse, they still refuse to come clean when we can see straight through them.
· So I’d say be as honest as you can.
· Don’t pass off actions you’re taking to improve your business as things you’re doing for your customers.
· Treat them with respect. Give them great prices and great service, and a reason to come back and shop again.