Excellent article on fraudulent seller tactics on Amazon.
The most prominent black hat companies for US Amazon sellers offer ways to manipulate Amazon’s ranking system to promote products, protect accounts from disciplinary actions, and crush competitors. Sometimes, these black hat companies bribe corporate Amazon employees to leak information from the company’s wiki pages and business reports, which they then resell to marketplace sellers for steep prices. One black hat company charges as much as $10,000 a month to help Amazon sellers appear at the top of product search results. Other tactics to promote sellers’ products include removing negative reviews from product pages and exploiting technical loopholes on Amazon’s site to lift products’ overall sales rankings.
AmzPandora’s services ranged from small tasks to more ambitious strategies to rank a product higher using Amazon’s algorithm. While it was online, it offered to ping internal contacts at Amazon for $500 to get information about why a seller’s account had been suspended, as well as advice on how to appeal the suspension. For $300, the company promised to remove an unspecified number of negative reviews on a listing within three to seven days, which would help increase the overall star rating for a product. For $1.50, the company offered a service to fool the algorithm into believing a product had been added to a shopper’s cart or wish list by writing a super URL. And for $1,200, an Amazon seller could purchase a “frequently bought together” spot on another marketplace product’s page that would appear for two weeks, which AmzPandora promised would lead to a 10% increase in sales.
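The "super URL" trick works by embedding search parameters into a direct product link, so that the click registers with the ranking algorithm as a shopper who searched for a keyword and then clicked through. A minimal sketch of the idea (the parameter names and values here are illustrative guesses, not Amazon's actual ones):

```python
from urllib.parse import urlencode

def build_super_url(asin: str, keyword: str) -> str:
    """Build a product URL with embedded search parameters, so the click
    looks keyword-driven (parameter names are hypothetical, for
    illustration only)."""
    params = {"keywords": keyword, "ref": "sr_1_1"}
    return f"https://www.amazon.com/dp/{asin}?{urlencode(params)}"

print(build_super_url("B00EXAMPLE", "survival watch"))
```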
This was a good article on this from last year. (My blog post.)
Amazon has a real problem here, primarily because trust in the system is paramount to Amazon's success. As much as it needs to crack down on fraudulent sellers, Amazon really doesn't want articles like these to be written.
Slashdot thread. Boing Boing post.
Posted on May 9, 2019 at 5:58 AM
Fascinating article about the many ways Amazon Marketplace sellers sabotage each other and defraud customers. The opening example: framing a seller for false advertising by buying fake five-star reviews for their products.
Defacement: Sellers armed with the accounts of Amazon distributors (sometimes legitimately, sometimes through the black market) can make all manner of changes to a rival’s listings, from changing images to altering text to reclassifying a product into an irrelevant category, like “sex toys.”
Phony fires: Sellers will buy their rival’s product, light it on fire, and post a picture to the reviews, claiming it exploded. Amazon is quick to suspend sellers for safety claims.
Over the following days, Harris came to realize that someone had been targeting him for almost a year, preparing an intricate trap. While he had trademarked his watch and registered his brand, Dead End Survival, with Amazon, Harris hadn’t trademarked the name of his Amazon seller account, SharpSurvival. So the interloper did just that, submitting to the patent office as evidence that he owned the goods a photo taken from Harris’ Amazon listings, including one of Harris’ own hands lighting a fire using the clasp of his survival watch. The hijacker then took that trademark to Amazon and registered it, giving him the power to kick Harris off his own listings and commandeer his name.
There are more subtle methods of sabotage as well. Sellers will sometimes buy Google ads for their competitors for unrelated products—say, a dog food ad linking to a shampoo listing—so that Amazon’s algorithm sees the rate of clicks converting to sales drop and automatically demotes their product.
What’s also interesting is how Amazon is basically its own government—with its own rules that its suppliers have no choice but to follow. And, of course, increasingly there is no option but to sell your stuff on Amazon.
Posted on December 20, 2018 at 6:21 AM
Researchers have demonstrated the ability to send inaudible commands to voice assistants like Alexa, Siri, and Google Assistant.
Over the last two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online—simply with music playing over the radio.
A group of students from University of California, Berkeley, and Georgetown University showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.
This month, some of those Berkeley researchers published a research paper that went further, saying they could embed commands directly into recordings of music or spoken text. So while a human listener hears someone talking or an orchestra playing, Amazon’s Echo speaker might hear an instruction to add something to your shopping list.
Posted on May 15, 2018 at 6:13 AM
People harassing women by delivering anonymous packages purchased from Amazon.
On the one hand, there is nothing new here. This could have happened decades ago, pre-Internet. But the Internet makes this easier, and the article points out that using prepaid gift cards makes this anonymous. I am curious whether these differences add up to a difference in kind, and what can be done about it.
Posted on February 22, 2018 at 6:04 AM
Interesting essay about Amazon’s smart lock:
When you add Amazon Key to your door, something more sneaky also happens: Amazon takes over.
You can leave your keys at home and unlock your door with the Amazon Key app—but it’s really built for Amazon deliveries. To share online access with family and friends, I had to give them a special code to SMS (yes, text) to unlock the door. (Amazon offers other smart locks that have physical keypads.)
The Key-compatible locks are made by Yale and Kwikset, yet don’t work with those brands’ own apps. They also can’t connect with a home-security system or smart-home gadgets that work with Apple and Google software.
And, of course, the lock can’t be accessed by businesses other than Amazon. No Walmart, no UPS, no local dog-walking company.
Keeping tight control over Key might help Amazon guarantee security or a better experience. “Our focus with smart home is on making things simpler for customers—things like providing easy control of connected devices with your voice using Alexa, simplifying tasks like reordering household goods and receiving packages,” the Amazon spokeswoman said.
But Amazon is barely hiding its goal: It wants to be the operating system for your home. Amazon says Key will eventually work with dog walkers, maids and other service workers who bill through its marketplace. An Amazon home security service and grocery delivery from Whole Foods can’t be far off.
This is happening all over. Everyone wants to control your life: Google, Apple, Amazon…everyone. It’s what I’ve been calling the feudal Internet. I fear it’s going to get a lot worse.
Posted on December 22, 2017 at 6:25 AM
Amazon has a cloud for US classified data.
The physical and computer requirements for handling classified information are considerable, both in terms of technology and procedure. I am surprised that a company with no experience dealing with classified data was able to do it.
Posted on November 21, 2017 at 6:16 AM
Amazon Key is an IoT door lock that can enable one-time access codes for delivery people. To further secure that system, Amazon sells Cloud Cam, a camera that watches the door to ensure that delivery people don’t abuse their one-time access privilege.
Cloud Cam has been hacked:
But now security researchers have demonstrated that with a simple program run from any computer in Wi-Fi range, that camera can be not only disabled but frozen. A viewer watching its live or recorded stream sees only a closed door, even as their actual door is opened and someone slips inside. That attack would potentially enable rogue delivery people to stealthily steal from Amazon customers, or otherwise invade their inner sanctum.
And while the threat of a camera-hacking courier seems an unlikely way for your house to be burgled, the researchers argue it potentially strips away a key safeguard in Amazon’s security system.
Amazon is patching the system.
Posted on November 20, 2017 at 6:19 AM
For once, the real story isn’t as bad as it seems. A researcher has figured out how to install malware onto an Echo that causes it to stream audio back to a remote controller, but:
The technique requires gaining physical access to the target Echo, and it works only on devices sold before 2017. But there’s no software fix for older units, Barnes warns, and the attack can be performed without leaving any sign of hardware intrusion.
The way to implement this attack is by intercepting the Echo before it arrives at the target location. But if you can do that, there are a lot of other things you can do. So while this is a vulnerability that needs to be fixed—and seems to have inadvertently been fixed—it’s not a cause for alarm.
Posted on August 10, 2017 at 1:54 PM
Amazon has been issued a patent on security measures that prevent people from comparison shopping while in the store. It’s not a particularly sophisticated patent—it basically detects when you’re using the in-store Wi-Fi to visit a competitor’s site and then blocks access—but it is an indication of how retail has changed in recent years.
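As described, the patented measure reduces to checking each requested hostname against a blocklist of competitor domains and denying the request on a match. A minimal sketch of that filtering logic (the blocklist and function name are made up for illustration):

```python
COMPETITOR_DOMAINS = {"amazon.com", "ebay.com"}  # hypothetical blocklist

def filter_request(hostname: str) -> str:
    """Return 'BLOCK' if the requested host (or any parent domain)
    is on the competitor blocklist, else 'ALLOW'."""
    host = hostname.lower().rstrip(".")
    parts = host.split(".")
    for i in range(len(parts) - 1):
        if ".".join(parts[i:]) in COMPETITOR_DOMAINS:
            return "BLOCK"
    return "ALLOW"

print(filter_request("www.amazon.com"))  # BLOCK
print(filter_request("example.org"))     # ALLOW
```

As the next paragraph notes, the obvious defeat is to step off the store's Wi-Fi entirely and use a cellular connection, which this kind of network-level filtering cannot see.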
What’s interesting is that Amazon is on the other side of this arms race. As an online retailer, it wants people to walk into stores and then comparison shop on its site. Yes, I know it’s buying Whole Foods, but it’s still predominantly an online retailer. Maybe it patented this to prevent stores from implementing the technology.
It’s probably not nearly that strategic. It’s hard to build a business strategy around a security measure that can be defeated with cellular access.
Posted on June 23, 2017 at 6:26 AM
Researchers build a covert channel between two virtual machines using a shared cache.
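A common construction for this (e.g., prime+probe) has the sender modulate contention on a cache set that the receiver repeatedly probes: evicting the receiver's lines transmits a 1; leaving them in place transmits a 0. Here is a toy simulation of that encoding, with the shared cache set modeled as a small LRU list rather than real hardware:

```python
class ToyCacheSet:
    """Toy model of one cache set shared by two VMs (not real hardware)."""
    def __init__(self, ways: int = 4):
        self.ways = ways
        self.lines = []

    def access(self, tag: str) -> bool:
        """Access a line; return True on a hit, False on a miss (LRU fill)."""
        if tag in self.lines:
            self.lines.remove(tag)
            self.lines.append(tag)
            return True
        if len(self.lines) >= self.ways:
            self.lines.pop(0)  # evict least-recently-used line
        self.lines.append(tag)
        return False

def send_bit(cache: ToyCacheSet, bit: int) -> None:
    # Sender: to transmit a 1, evict the receiver's lines by filling the set.
    if bit:
        for i in range(cache.ways):
            cache.access(f"sender-{i}")

def receive_bit(cache: ToyCacheSet) -> int:
    # Receiver: probe its own lines; misses mean the sender evicted them.
    misses = sum(0 if cache.access(f"recv-{i}") else 1
                 for i in range(cache.ways))
    return 1 if misses > 0 else 0

cache = ToyCacheSet()
received = []
for bit in [1, 0, 1, 1, 0]:
    receive_bit(cache)       # prime: receiver fills the set with its lines
    send_bit(cache, bit)     # sender either evicts (1) or stays idle (0)
    received.append(receive_bit(cache))  # probe: misses reveal the bit
print(received)  # [1, 0, 1, 1, 0]
```

In a real attack the receiver would measure access latency instead of counting simulated misses, and the two VMs would have to agree on timing and on which physical cache sets they share.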
Posted on April 18, 2017 at 5:58 AM