Entries Tagged "games"


Doxing as an Attack

Those of you unfamiliar with hacker culture might need an explanation of “doxing.”

The word refers to the practice of publishing personal information about people without their consent. Usually it’s things like an address and phone number, but it can also be credit card details, medical information, private e-mails — pretty much anything an assailant can get his hands on.

Doxing is not new; the term dates back to 2001 and the hacker group Anonymous. But it can be incredibly offensive. In 2014, several women were doxed by male gamers trying to intimidate them into keeping silent about sexism in computer games.

Companies can be doxed, too. In 2011, Anonymous doxed the technology firm HBGary Federal. In the past few weeks we’ve witnessed the ongoing doxing of Sony.

Everyone from political activists to hackers to government leaders has now learned how effective this attack is. Everyone from common individuals to corporate executives to government leaders now fears this will happen to them. And I believe this will change how we think about computing and the Internet.

This essay previously appeared on BetaBoston, which asked me to write about a trend for 2015.

EDITED TO ADD (1/3): Slashdot thread.

Posted on January 2, 2015 at 7:21 AM

Attacking Online Poker Players

This story is about how at least two professional online poker players had their hotel rooms broken into and their computers infected with malware.

I agree with the conclusion:

So, what’s the moral of the story? If you have a laptop that is used to move large amounts of money, take good care of it. Lock the keyboard when you step away. Put it in a safe when you’re not around it, and encrypt the disk to prevent off-line access. Don’t surf the web with it (use another laptop/device for that, they’re relatively cheap). This advice is true whether you’re a poker pro using a laptop for gaming or a business controller in a large company using the computer for wiring a large amount of funds.

Posted on December 16, 2013 at 6:09 AM

NSA Spying on Online Gaming Worlds

The NSA is spying on chats in World of Warcraft and other games. There’s lots of information — and a good source document. While it’s fun to joke about the NSA and elves and dwarves from World of Warcraft, this kind of surveillance makes perfect sense. If, as Dan Geer has pointed out, your assigned mission is to ensure that something never happens, the only way you can be sure that something never happens is to know everything that does happen. Which puts you in the impossible position of having to eavesdrop on every possible communications channel, including online gaming worlds.

One bit (on page 2) jumped out at me:

The NMDC engaged SNORT, an open source packet-sniffing software, which runs on all FORNSAT survey packet data, to filter out WoW packets. GCHQ provided several WoW protocol parsing scripts to process the traffic and produce Warcraft metadata from all NMDC FORNSAT survey.

NMDC is the New Mission Development Center, and FORNSAT stands for Foreign Satellite Collection. MHS, which also appears in the source document, stands for — I think — Menwith Hill Station, a satellite eavesdropping location in the UK.

Since the Snowden documents first started being released, I have been saying that while the US has a bigger intelligence budget than the rest of the world’s countries combined, agencies like the NSA are not made of magic. They’re constrained by the laws of mathematics, physics, and economics — just like everyone else. Here’s an example. The NSA is using Snort — an open source product that anyone can download and use — because that’s a more cost-effective tool than anything they can develop in-house.
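The leaked documents don’t reproduce the actual GCHQ rules, but as an illustrative sketch of what filtering WoW packets with Snort might look like: World of Warcraft clients historically connected to realm servers over TCP port 3724, so a minimal rule flagging that traffic could be (the `sid` value and message text here are hypothetical, not from the source document):

```
alert tcp any any <> any 3724 (msg:"Possible World of Warcraft traffic"; sid:1000001; rev:1;)
```

A real deployment would parse the game protocol itself rather than rely on a port number, which is presumably what GCHQ’s protocol-parsing scripts did.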

Posted on December 10, 2013 at 9:08 AM

F2P Monetization Tricks

This is a really interesting article about something I never even thought about before: how games (“F2P” means “free to play”) trick players into paying for stuff.

For example:

This is my favorite coercive monetization technique, because it is just so powerful. The technique involves giving the player some really huge reward, that makes them really happy, and then threatening to take it away if they do not spend. Research has shown that humans like getting rewards, but they hate losing what they already have much more than they value the same item as a reward. To be effective with this technique, you have to tell the player they have earned something, and then later tell them that they did not. The longer you allow the player to have the reward before you take it away, the more powerful is the effect.

This technique is used masterfully in Puzzle and Dragons. In that game the play primarily centers around completing “dungeons.” To the consumer, a dungeon appears to be a skill challenge, and initially it is. Of course once the customer has had enough time to get comfortable with the idea that this is a skill game the difficulty goes way up and it becomes a money game. What is particularly effective here is that the player has to go through several waves of battles in a dungeon, with rewards given after each wave. The last wave is a “boss battle” where the difficulty becomes massive and if the player is in the recommended dungeon for them then they typically fail here. They are then told that all of the rewards from the previous waves are going to be lost, in addition to the stamina used to enter the dungeon (this can be 4 or more real hours of time worth of stamina).

At this point the user must choose to either spend about $1 or lose their rewards, lose their stamina (which they could get back for another $1), and lose their progress. To the brain this is not just a loss of time. If I spend an hour writing a paper and then something happens and my writing gets erased, this is much more painful to me than the loss of an hour. The same type of achievement loss is in effect here. Note that in this model the player could be defeated multiple times in the boss battle and in getting to the boss battle, thus spending several dollars per dungeon.

This technique alone is effective enough to make consumers of any developmental level spend. Just to be safe, PaD uses the same technique at the end of each dungeon again in the form of an inventory cap. The player is given a number of “eggs” as rewards, the contents of which have to be held in inventory. If your small inventory space is exceeded, again those eggs are taken from you unless you spend to increase your inventory space. Brilliant!

It really is a piece about security. These games use all sorts of mental tricks to coerce money from people who would not have spent it otherwise. Tricks include misdirection, sunk costs, withholding information, cognitive dissonance, and prospect theory.

I am reminded of the cognitive tricks scammers use. And, of course, much of the psychology of security.

Posted on July 12, 2013 at 6:37 AM

Google Glass Enables New Forms of Cheating

It’s mentioned here:

Mr. Doerr said he had been wearing the glasses and uses them especially for taking pictures and looking up words while playing Scattergories with his family, though it is questionable whether that follows the game’s rules.

Questionable? Questionable? It’s just like using a computer’s dictionary while playing Scrabble, or a computer odds program while playing poker, or a computer chess program while playing an in-person game. There’s no question at all — it’s cheating.

We’re seeing the birth of a new epithet, “glasshole.”

Posted on April 15, 2013 at 4:29 AM

Facial Recognition of Avatars

I suppose this sort of thing might be useful someday.

In Second Life, avatars are easily identified by their username, meaning police can just ask San Francisco-based Linden Labs, which runs the virtual world, to look up a particular user. But what happens when virtual worlds start running on peer-to-peer networks, leaving no central authority to appeal to? Then there would be no way of linking an avatar username to a human user.

Yampolskiy and colleagues have developed facial recognition techniques specifically tailored to avatars, since current algorithms only work on humans. “Not all avatars are human looking, and even with those that are humanoid there is a huge diversity of colour,” Yampolskiy says, so his software uses those colours to improve avatar recognition.
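The article doesn’t describe Yampolskiy’s algorithm in detail, but since avatar colors are the distinguishing feature, a minimal sketch of color-based matching might start with comparing normalized color histograms. Everything below — the bin count, the synthetic “avatar” images, the intersection metric — is my own illustrative assumption, not the researchers’ method:

```python
import numpy as np

def color_histogram(image, bins=8):
    """Normalized per-channel color histogram of an RGB image array."""
    hist = []
    for channel in range(3):
        counts, _ = np.histogram(image[..., channel], bins=bins, range=(0, 256))
        hist.append(counts)
    hist = np.concatenate(hist).astype(float)
    return hist / hist.sum()

def histogram_similarity(h1, h2):
    """Histogram intersection: 1.0 for identical distributions, lower otherwise."""
    return np.minimum(h1, h2).sum()

# Two synthetic "avatars": one solid green, one solid blue.
green = np.zeros((32, 32, 3), dtype=np.uint8)
green[..., 1] = 200
blue = np.zeros((32, 32, 3), dtype=np.uint8)
blue[..., 2] = 200

same = histogram_similarity(color_histogram(green), color_histogram(green))
diff = histogram_similarity(color_histogram(green), color_histogram(blue))
print(same, diff)  # identical images score 1.0; differently colored ones score lower
```

A production system would combine color features with shape or texture descriptors, since many distinct avatars share a palette.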

Posted on May 4, 2012 at 6:31 AM
