Entries Tagged "retail"
Your computerized things are talking about you behind your back, and for the most part you can’t stop them—or even learn what they’re saying.
This isn’t new, but it’s getting worse.
Surveillance is the business model of the Internet, and the more these companies know about the intimate details of your life, the more they can profit from it. Already there are dozens of companies that secretly spy on you as you browse the Internet, connecting your behavior on different sites and using that information to target advertisements. You know it when you search for something like a Hawaiian vacation, and ads for similar vacations follow you around the Internet for weeks. Companies like Google and Facebook make an enormous profit connecting the things you write about and are interested in with companies trying to sell you things.
Cross-device tracking is the latest obsession for Internet marketers. You probably use multiple Internet devices: your computer, your smartphone, your tablet, maybe your Internet-enabled television—and, increasingly, “Internet of Things” devices like smart thermostats and appliances. All of these devices are spying on you, but the different spies are largely unaware of each other. Start-up companies like SilverPush, 4Info, Drawbridge, Flurry, and Cross Screen Consultants, as well as the big players like Google, Facebook, and Yahoo, are all experimenting with different technologies to “fix” this problem.
Retailers want this information very much. They want to know whether their television advertising causes people to search for their products on the Internet. They want to correlate people’s web searching on their smartphones with their buying behavior on their computers. They want to track people’s locations using the surveillance capabilities of their smartphones, and use that information to send geographically targeted ads to their computers. They want the surveillance data from smart appliances correlated with everything else.
This is where the Internet of Things makes the problem worse. As computers get embedded into more of the objects we live with and use, and permeate more aspects of our lives, more companies want to use them to spy on us without our knowledge or consent.
Technically, of course, we did consent. The license agreement we didn’t read but legally agreed to when we unthinkingly clicked “I agree” on a screen, or opened a package we purchased, gives all of those companies the legal right to conduct all of this surveillance. And the way US privacy law is currently written, they own all of that data and don’t need to allow us to see it.
We accept all of this Internet surveillance because we don’t really think about it. If there were a dozen people from Internet marketing companies with pens and clipboards peering over our shoulders as we sent our Gmails and browsed the Internet, most of us would object immediately. If the companies that made our smartphone apps actually followed us around all day, or if the companies that collected our license plate data could be seen as we drove, we would demand they stop. And if our televisions, computers, and mobile devices talked about us and coordinated their behavior in a way we could hear, we would be creeped out.
The Federal Trade Commission is looking at cross-device tracking technologies, with an eye to regulating them. But if recent history is a guide, any regulations will be minor and largely ineffective at addressing the larger problem.
We need to do better. We need to have a conversation about the privacy implications of cross-device tracking, but—more importantly—we need to think about the ethics of our surveillance economy. Do we want companies knowing the intimate details of our lives, and being able to store that data forever? Do we truly believe that we have no rights to see the data that’s collected about us, to correct data that’s wrong, or to have data deleted that’s personal or embarrassing? At a minimum, we need limits on the behavioral data that can legally be collected about us and how long it can be stored, a right to download data collected about us, and a ban on third-party ad tracking. The last one is vital: it’s the companies that spy on us from website to website, or from device to device, that are doing the most damage to our privacy.
The Internet surveillance economy is less than 20 years old, and emerged because there was no regulation limiting any of this behavior. It’s now a powerful industry, and it’s expanding past computers and smartphones into every aspect of our lives. It’s long past time we set limits on what these computers, and the companies that control them, can say about us and do to us behind our backs.
This essay previously appeared on Vice Motherboard.
The way forced authorisation fraud works is that the retailer sets up the terminal for a transaction by inserting the customer’s card and entering the amount, then hands the terminal over to the customer so they can type in the PIN. But the criminal has used a stolen or counterfeit card, and due to the high value of the transaction the terminal performs a “referral”—asking the retailer to call the bank to perform additional checks such as the customer answering a security question. If the security checks pass, the bank will give the retailer an authorisation code to enter into the terminal.
The problem is that when the terminal asks for these security checks, it’s still in the hands of the criminal, and it’s the criminal who follows the steps the retailer should have followed. Since there’s no phone conversation with the bank, the criminal doesn’t know the correct authorisation code. But what surprises retailers is that the criminal can type in anything at this stage and the transaction will go through. The criminal might also be able to bypass other security features; for example, they could override the PIN check by following the steps the retailer would take if the customer had forgotten the PIN.
By the time the terminal is passed back to the retailer, it looks like the transaction was completed successfully. The receipt will differ only very subtly from that of a normal transaction, if at all. The criminal walks off with the goods and it’s only at the end of the day that the authorisation code is checked by the bank. By that time, the criminal is long gone. Because some of the security checks the bank asked for weren’t completed, the retailer doesn’t get the money.
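The flaw described above is that the terminal has no way to verify the authorisation code at the point of sale; the mismatch only surfaces at settlement. Here is a minimal sketch of that logic in Python (hypothetical, simplified terminal behavior, not any vendor’s real firmware):

```python
# Hypothetical simplified terminal logic for a "referral" transaction.
# The terminal cannot check the authorisation code online, so it
# accepts whatever is typed in; the bank only compares codes at
# end-of-day settlement, long after the criminal has left.

def terminal_referral(entered_code):
    """Complete a referral with whatever code is typed into the terminal."""
    # No online check is possible here -- any code is accepted.
    return {"status": "approved", "auth_code": entered_code}

def end_of_day_settlement(receipt, bank_issued_code):
    """Only now does the bank compare codes; None means no call was made."""
    return bank_issued_code is not None and receipt["auth_code"] == bank_issued_code

# The criminal types a made-up code; the receipt looks normal.
receipt = terminal_referral("123456")
assert receipt["status"] == "approved"           # looks fine to the retailer
assert not end_of_day_settlement(receipt, None)  # but the retailer isn't paid
```

The key design failure is that approval and verification happen at different times, with the human steps in between performed by the attacker.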
Here’s a physical attack against a credit card verification system. Basically, the attack disrupts the communications between the retail terminal and the system that identifies revoked credit cards. Since retailers generally default to accepting cards when the system doesn’t work, the attack is generally successful.
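The attack works because the system fails open. A toy sketch of that pattern (hypothetical code, not a real payment stack) makes the problem explicit:

```python
# Fail-open revocation checking: when the revocation service can't be
# reached, the terminal treats the card as valid. Jamming the link
# therefore converts every revoked card into an accepted one.

def check_revoked(card_number, revocation_service):
    try:
        return revocation_service(card_number)
    except ConnectionError:
        return False  # fail open: unreachable service == "not revoked"

def authorize(card_number, revocation_service):
    return not check_revoked(card_number, revocation_service)

def jammed_service(card_number):
    """Simulates the attacker disrupting communications."""
    raise ConnectionError("link disrupted")

# A revoked card sails through while the link is down.
assert authorize("4111111111111111", jammed_service)
```

A fail-closed design (rejecting cards when the check is unavailable) would defeat the attack, but retailers accept the fraud risk rather than lose sales during outages.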
All of the anti-counterfeiting features of the new Canadian $100 bill are resulting in people not bothering to verify them.
The fanfare about the security features on the bills may be part of the problem, said RCMP Sgt. Duncan Pound.
“Because the polymer series’ notes are so secure … there’s almost an overconfidence among retailers and the public in terms of when you sort of see the strip, the polymer looking materials, everybody says ‘oh, this one’s going to be good because you know it’s impossible to counterfeit,'” he said.
“So people don’t actually check it.”
Well, new to us:
You see, an EMV payment card authenticates itself with a MAC of transaction data, for which the freshly generated component is the unpredictable number (UN). If you can predict it, you can record everything you need from momentary access to a chip card to play it back and impersonate the card at a future date and location. You can as good as clone the chip. It’s called a “pre-play” attack. Just like most vulnerabilities we find these days, some in industry already knew about it but covered it up; we have indications the crooks know about this too, and we believe it explains a good portion of the unsolved phantom withdrawal cases reported to us for which we had until recently no explanation.
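The mechanism can be illustrated with a toy sketch (greatly simplified; real EMV cryptograms cover amount, date, counters, and more, and use different primitives). If the terminal’s “unpredictable” number is predictable, brief access to a card lets an attacker harvest the MAC for tomorrow’s UN today:

```python
# Toy pre-play illustration. The HMAC here stands in for the card's
# transaction cryptogram; the key lives inside the chip and never leaves.
import hashlib
import hmac

CARD_KEY = b"secret-card-key"  # held inside the chip

def card_mac(un, amount):
    """What the chip computes over the transaction data."""
    return hmac.new(CARD_KEY, un + amount, hashlib.sha256).digest()

# Phase 1: momentary access to the victim's card. The attacker predicts
# the UN a weak terminal will generate later and asks the chip for the
# corresponding MAC now.
predicted_un = b"\x00\x00\x00\x2a"
harvested = card_mac(predicted_un, b"\x01\x00")

# Phase 2: at the real terminal later. If the UN comes out as predicted,
# the replayed MAC verifies and the transaction looks card-present.
terminal_un = b"\x00\x00\x00\x2a"  # weak RNG repeats the value
assert hmac.compare_digest(harvested, card_mac(terminal_un, b"\x01\x00"))
```

Nothing here breaks the cryptography; the failure is a terminal random-number generator weak enough that the “unpredictable” number can be guessed in advance.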
When you pay a restaurant bill at your table using a point-of-sale machine, are you sure it’s legit? In the past three months, Toronto and Peel police have discovered many that aren’t.
In what is the latest financial fraud, crooks are using distraction techniques to replace merchants’ machines with their own, police say. At the end of the day, they create another distraction to pull the switch again.
Using information inputted by customers, including PIN data, the criminals are reproducing credit cards at an alarming rate.
Presumably these hacked point-of-sale terminals look and function normally, and additionally save a copy of the credit card information.
Note that this attack works despite any customer-focused security, like chip-and-pin systems.
Many roadside farm stands in the U.S. are unstaffed. They work on the honor system: take what you want, and pay what you owe.
And today at his farm stand, Cochran says, just as at the donut shop years ago, most customers leave more money than they owe.
That doesn’t surprise social psychologist Michael Cunningham of the University of Louisville who has used “trust games” to investigate what spurs good and bad behavior for the last 25 years. For many people, Cunningham says, trust seems to be at least as strong a motivator as guilt. He thinks he knows why.
“When you sell me something I want and trust me to pay you even when you’re not looking, you’ve made my life good in two ways,” Cunningham tells The Salt. “I get something delicious, and I also get a good feeling about myself. Both of those things make me feel good about the world, that I’m in a good place. And I also see you as a contributor to that good, as somebody I want to reward. It’s a win-win.”
I like systems that leverage personal moral codes for security. But I’ll bet that the pay boxes are bolted to the tables. It’s one thing for someone to take produce without paying. It’s quite another for him to take the entire day’s receipts.
A particularly clever form of retail theft—especially when salesclerks are working fast and don’t know the products—is to switch bar codes. This particular thief stole Lego sets. If you know Lego, you know there’s a vast price difference between the small sets and the large ones. He was caught by in-store surveillance.
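The scam works because the register prices whatever code it scans, with nothing tying the code to the physical item. A minimal sketch (hypothetical product codes and prices, chosen only for illustration):

```python
# Hypothetical checkout lookup: the price comes entirely from the
# scanned bar code, so relabeling a large set with a small set's code
# changes its price -- unless the clerk notices the mismatch.
PRICE_BY_CODE = {
    "LEGO-SMALL-001": 9.99,
    "LEGO-LARGE-001": 249.99,
}

def ring_up(scanned_code):
    return PRICE_BY_CODE[scanned_code]

# A large set relabeled with the small set's bar code rings up cheap.
assert ring_up("LEGO-SMALL-001") == 9.99
```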
EDITED TO ADD (6/12): I posted about a similar story in 2008.