Entries Tagged "web"

Page 5 of 14

An Honest Privacy Policy

Funny:

The data we collect is strictly anonymous, unless you’ve been kind enough to give us your name, email address, or other identifying information. And even if you have been that kind, we promise we won’t sell that information to anyone else, unless of course our impossibly obtuse privacy policy says otherwise and/or we change our minds tomorrow.

There’s a lot more.

Posted on December 27, 2010 at 1:04 PM

Evan Kohlmann

Interesting profile of Evan Kohlmann:

Evan Kohlmann spends his days lurking in the darkest corners of the Internet, where jihadists recruit sympathizers from across the globe. He has testified in over two dozen terrorism trials—and sees danger everywhere he looks. Is he prescient or naïve?

Posted on December 14, 2010 at 5:35 AM

FTC Privacy Report

The U.S. Federal Trade Commission released its privacy report: “Protecting Consumer Privacy in an Era of Rapid Change.”

From the press release:

One method of simplified choice the FTC staff recommends is a “Do Not Track” mechanism governing the collection of information about consumers’ Internet activity to deliver targeted advertisements and for other purposes. Consumers and industry both support increased transparency and choice for this largely invisible practice. The Commission recommends a simple, easy to use choice mechanism for consumers to opt out of the collection of information about their Internet behavior for targeted ads. The most practical method would probably involve the placement of a persistent setting, similar to a cookie, on the consumer’s browser signaling the consumer’s choices about being tracked and receiving targeted ads.
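The browser-side signal the Commission describes later took concrete shape as the `DNT` HTTP request header. As a minimal sketch (the function name and the "contextual"/"targeted" policy labels are invented for illustration, and honoring the header is entirely up to each site), server-side handling might branch on it like this:

```python
# Minimal sketch: branch ad behavior on the DNT ("Do Not Track") request
# header. The policy names are invented; real sites decide for themselves
# what, if anything, the signal changes.
def ad_policy(headers):
    """Return 'contextual' (no tracking) if the browser sent DNT: 1,
    otherwise fall back to behaviorally targeted ads."""
    if headers.get("DNT") == "1":
        return "contextual"
    return "targeted"

print(ad_policy({"DNT": "1"}))  # contextual
print(ad_policy({}))            # targeted
```

Note that the header only expresses a preference; nothing in the protocol forces a tracker to obey it.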

News story.

Posted on December 6, 2010 at 1:52 PM

Risk Reduction Strategies on Social Networking Sites

By two teenagers:

Mikalah uses Facebook but when she goes to log out, she deactivates her Facebook account. She knows that this doesn’t delete the account; that’s the point. She knows that when she logs back in, she’ll be able to reactivate the account and have all of her friend connections back. But when she’s not logged in, no one can post messages on her wall or send her messages privately or browse her content. But when she’s logged in, they can do all of that. And she can delete anything that she doesn’t like. Michael Ducker calls this practice “super-logoff” when he noticed a group of gay male adults doing the exact same thing.

And:

Shamika doesn’t deactivate her Facebook profile but she does delete every wall message, status update, and Like shortly after it’s posted. She’ll post a status update and leave it there until she’s ready to post the next one or until she’s done with it. Then she’ll delete it from her profile. When she’s done reading a friend’s comment on her page, she’ll delete it. She’ll leave a Like up for a few days for her friends to see and then delete it.

I’ve heard this practice called wall scrubbing.

In any reasonably competitive market economy, sites would offer these as options to better serve their customers. But in the give-it-away user-as-product economy we so often have on the Internet, the social networking sites have a different agenda.

Posted on December 1, 2010 at 1:27 PM

New Attack Against ASP.NET

It’s serious:

The problem lies in the way that ASP.NET, Microsoft’s popular Web framework, implements the AES encryption algorithm to protect the integrity of the cookies these applications generate to store information during user sessions. A common mistake is to assume that encryption protects the cookies from tampering so that if any data in the cookie is modified, the cookie will not decrypt correctly. However, there are a lot of ways to make mistakes in crypto implementations, and when crypto breaks, it usually breaks badly.

“We knew ASP.NET was vulnerable to our attack several months ago, but we didn’t know how serious it is until a couple of weeks ago. It turns out that the vulnerability in ASP.NET is the most critical amongst other frameworks. In short, it totally destroys ASP.NET security,” said Thai Duong, who along with Juliano Rizzo, developed the attack against ASP.NET.
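The mistake quoted above, assuming that encryption alone keeps a cookie from being tampered with, can be illustrated with a toy, stdlib-only sketch. To be clear about the assumptions: the "cipher" below is a hash-based keystream standing in for AES, the cookie layout is invented, and this is not the padding-oracle attack itself, just the underlying integrity gap it exploits:

```python
# Toy demonstration (NOT real crypto) that encryption without
# authentication is malleable: an attacker who knows the plaintext
# layout can flip bits in the ciphertext without knowing the key.
import hashlib, hmac, os

def keystream(key, nonce, n):
    """Hash-based keystream, a stand-in for a real stream/CTR cipher."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key, plaintext):
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key, blob):
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

key = os.urandom(32)
cookie = encrypt(key, b"role=user;uid=1042")

# Flip "user" to "root" directly in the ciphertext, no key needed:
tampered = bytearray(cookie)
for i, (old, new) in enumerate(zip(b"user", b"root")):
    tampered[16 + 5 + i] ^= old ^ new      # "role=" occupies the first 5 bytes
print(decrypt(key, bytes(tampered)))       # b'role=root;uid=1042'

# The fix: authenticate the ciphertext (encrypt-then-MAC). The forged
# cookie now fails verification.
tag = hmac.new(key, cookie, hashlib.sha256).digest()
forged_tag = hmac.new(key, bytes(tampered), hashlib.sha256).digest()
assert not hmac.compare_digest(tag, forged_tag)
```

The actual ASP.NET attack is a padding oracle against CBC mode, which is more subtle, but the root cause is the same: decrypting attacker-controlled ciphertext without first verifying its integrity.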

Here’s a demo of the attack, and the Microsoft Security Advisory. More articles. The theory behind this attack is here.

EDITED TO ADD (9/27): Three blog posts from Scott Guthrie.

EDITED TO ADD (9/28): There’s a patch.

EDITED TO ADD (10/13): Two more articles.

Posted on September 27, 2010 at 6:51 AM

DHS Still Worried About Terrorists Using Internet Surveillance

Profound analysis from the Department of Homeland Security:

Detailed video obtained through live Web-based camera feeds combined with street-level and direct overhead imagery views from Internet imagery sites allow terrorists to conduct remote surveillance of multiple potential targets without exposing themselves to detection.

Cameras, too.

Remember, anyone who searches for anything on the Internet may be a terrorist. Report him immediately.

Posted on September 16, 2010 at 6:34 AM

Social Steganography

From danah boyd:

Carmen is engaging in social steganography. She’s hiding information in plain sight, creating a message that can be read in one way by those who aren’t in the know and read differently by those who are. She’s communicating to different audiences simultaneously, relying on specific cultural awareness to provide the right interpretive lens. While she’s focused primarily on separating her mother from her friends, her message is also meaningless to broader audiences who have no idea that she had just broken up with her boyfriend.

Posted on August 25, 2010 at 6:20 AM

A Revised Taxonomy of Social Networking Data

Lately I’ve been reading about user security and privacy—control, really—on social networking sites. The issues are hard and the solutions harder, but I’m seeing a lot of confusion in even forming the questions. Social networking sites deal with several different types of user data, and it’s essential to separate them.

Below is my taxonomy of social networking data, which I first presented at the Internet Governance Forum meeting last November, and again—revised—at an OECD workshop on the role of Internet intermediaries in June.

  • Service data is the data you give to a social networking site in order to use it. Such data might include your legal name, your age, and your credit-card number.
  • Disclosed data is what you post on your own pages: blog entries, photographs, messages, comments, and so on.
  • Entrusted data is what you post on other people’s pages. It’s basically the same stuff as disclosed data, but the difference is that you don’t have control over the data once you post it—another user does.
  • Incidental data is what other people post about you: a paragraph about you that someone else writes, a picture of you that someone else takes and posts. Again, it’s basically the same stuff as disclosed data, but the difference is that you don’t have control over it, and you didn’t create it in the first place.
  • Behavioral data is data the site collects about your habits by recording what you do and who you do it with. It might include games you play, topics you write about, news articles you access (and what that says about your political leanings), and so on.
  • Derived data is data about you that is derived from all the other data. For example, if 80 percent of your friends self-identify as gay, you’re likely gay yourself.
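The taxonomy, and the derived-data example above, can be sketched in code. The enum, the 80-percent threshold, and the field names are my own for illustration; the point is only that derived data is computed from the other categories rather than supplied by anyone:

```python
# Sketch of the six data types, plus a toy "derived data" inference in
# the spirit of the friends example. Names and threshold are invented.
from enum import Enum, auto

class SocialData(Enum):
    SERVICE = auto()     # given to the site in order to use it
    DISCLOSED = auto()   # posted on your own pages
    ENTRUSTED = auto()   # posted on other people's pages
    INCIDENTAL = auto()  # posted about you by others
    BEHAVIORAL = auto()  # collected by the site from your activity
    DERIVED = auto()     # inferred from all of the above

def derive_attribute(friends, attribute, threshold=0.8):
    """Infer an attribute from the fraction of friends who disclose it."""
    if not friends:
        return None
    share = sum(1 for f in friends if f.get(attribute)) / len(friends)
    return share >= threshold

friends = [{"gay": True}] * 8 + [{"gay": False}] * 2
print(derive_attribute(friends, "gay"))  # True: 80% of friends disclose it
```

Note that you never handed the site the derived attribute; it falls out of other people's disclosed data, which is what makes rights over this category so hard to pin down.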

There are other ways to look at user data. Some of it you give to the social networking site in confidence, expecting the site to safeguard the data. Some of it you publish openly and others use it to find you. And some of it you share only within an enumerated circle of other users. At the receiving end, social networking sites can monetize all of it: generally by selling targeted advertising.

Different social networking sites give users different rights for each data type. Some are always private, some can be made private, and some are always public. Some can be edited or deleted—I know one site that allows entrusted data to be edited or deleted within a 24-hour period—and some cannot. Some can be viewed and some cannot.

It’s also clear that users should have different rights with respect to each data type. We should be allowed to export, change, and delete disclosed data, even if the social networking sites don’t want us to. It’s less clear what rights we have for entrusted data—and far less clear for incidental data. If you post pictures from a party with me in them, can I demand you remove those pictures—or at least blur out my face? (Go look up the conviction of three Google executives in an Italian court over a YouTube video.) And what about behavioral data? It’s frequently a critical part of a social networking site’s business model. We often don’t mind if a site uses it to target advertisements, but are less sanguine when it sells data to third parties.

As we continue our conversations about what sorts of fundamental rights people have with respect to their data, and more countries contemplate regulation on social networking sites and user data, it will be important to keep this taxonomy in mind. The sorts of things that would be suitable for one type of data might be completely unworkable and inappropriate for another.

This essay previously appeared in IEEE Security & Privacy.

Edited to add: this post has been translated into Portuguese.

Posted on August 10, 2010 at 6:51 AM

DNSSEC Root Key Split Among Seven People

The DNSSEC root key has been divided among seven people:

Part of ICANN’s security scheme is the Domain Name System Security, a security protocol that ensures Web sites are registered and “signed” (this is the security measure built into the Web that ensures when you go to a URL you arrive at a real site and not an identical pirate site). Most major servers are a part of DNSSEC, as it’s known, and during a major international attack, the system might sever connections between important servers to contain the damage.

A minimum of five of the seven keyholders—one each from Britain, the U.S., Burkina Faso, Trinidad and Tobago, Canada, China, and the Czech Republic—would have to converge at a U.S. base with their keys to restart the system and connect everything once again.

That’s a secret sharing scheme they’re using, most likely Shamir’s Secret Sharing.
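Here is a minimal sketch of how a 5-of-7 Shamir split works. The prime, the secret encoding, and the share format are illustrative only; ICANN's actual parameters are not public in this form:

```python
# Sketch of Shamir's Secret Sharing, 5-of-7: embed the secret as the
# constant term of a random degree-4 polynomial over GF(p) and hand out
# seven points. Any five points determine the polynomial; four do not.
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a toy secret

def split(secret, n=7, k=5):
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789)
print(reconstruct(shares[:5]))   # any 5 of the 7 shares recover the secret
print(reconstruct(shares[2:]))   # a different 5 work just as well
```

The threshold property is the whole point: fewer than five keyholders learn nothing about the key, and no single country can be a point of failure or of coercion.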
We know the names of some of them.

Paul Kane—who lives in the Bradford-on-Avon area—has been chosen to look after one of seven keys, which will ‘restart the world wide web’ in the event of a catastrophic event.

Dan Kaminsky is another.

I don’t know how they picked those countries.

Posted on July 28, 2010 at 11:12 AM

Economic Considerations of Website Password Policies

Two interesting research papers on website password policies.

“Where Do Security Policies Come From?”:

Abstract: We examine the password policies of 75 different websites. Our goal is to understand the enormous diversity of requirements: some will accept simple six-character passwords, while others impose rules of great complexity on their users. We compare different features of the sites to find which characteristics are correlated with stronger policies. Our results are surprising: greater security demands do not appear to be a factor. The size of the site, the number of users, the value of the assets protected and the frequency of attacks show no correlation with strength. In fact we find the reverse: some of the largest, most attacked sites with greatest assets allow relatively weak passwords. Instead, we find that those sites that accept advertising, purchase sponsored links and where the user has a choice show strong inverse correlation with strength.

We conclude that the sites with the most restrictive password policies do not have greater security concerns, they are simply better insulated from the consequences of poor usability. Online retailers and sites that sell advertising must compete vigorously for users and traffic. In contrast to government and university sites, poor usability is a luxury they cannot afford. This in turn suggests that much of the extra strength demanded by the more restrictive policies is superfluous: it causes considerable inconvenience for negligible security improvement.
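The notion of policy "strength" the paper compares can be sketched as the size, in bits, of the smallest password space a policy still admits, i.e., the weakest passwords it will accept. The policy encoding below is my own invention for illustration, not the paper's exact metric:

```python
# Sketch: score a password policy by the log2 size of the minimum-length
# password space using only the required character classes. Class sizes
# assume a US keyboard; the encoding is illustrative.
import math

CLASS_SIZES = {"lower": 26, "upper": 26, "digit": 10, "symbol": 33}

def min_strength_bits(min_len, required_classes=("lower",)):
    """Bits of the weakest password space the policy allows."""
    alphabet = sum(CLASS_SIZES[c] for c in required_classes)
    return min_len * math.log2(alphabet)

# A site accepting simple six-character passwords vs. a stricter one:
print(round(min_strength_bits(6, ("lower",)), 1))                   # 28.2
print(round(min_strength_bits(8, ("lower", "upper", "digit")), 1))  # 47.6
```

The paper's finding is that where a site sits on this scale tracks how insulated it is from usability complaints, not how valuable its assets are.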

“The Password Thicket: Technical and Market Failures in Human Authentication on the Web”:

Abstract: We report the results of the first large-scale empirical analysis of password implementations deployed on the Internet. Our study included 150 websites which offer free user accounts for a variety of purposes, including the most popular destinations on the web and a random sample of e-commerce, news, and communication websites. Although all sites evaluated relied on user-chosen textual passwords for authentication, we found many subtle but important technical variations in implementation with important security implications. Many poor practices were commonplace, such as a lack of encryption to protect transmitted passwords, storage of cleartext passwords in server databases, and little protection of passwords from brute force attacks. While a spectrum of implementation quality exists with a general correlation between implementation choices within more-secure and less-secure websites, we find a surprising number of inconsistent choices within individual sites, suggesting that the lack of standards is harming security. We observe numerous ways in which the technical failures of lower-security sites can compromise higher-security sites due to the well-established tendency of users to re-use passwords. Our data confirms that the worst security practices are indeed found at sites with few security incentives, such as newspaper websites, while sites storing more sensitive information such as payment details or user communication implement more password security. From an economic viewpoint, password insecurity is a negative externality that the market has been unable to correct, undermining the viability of password-based authentication. We also speculate that some sites deploying passwords do so primarily for psychological reasons, both as a justification for collecting marketing data and as a way to build trusted relationships with customers. This theory suggests that efforts to replace passwords with more secure protocols or federated identity systems may fail because they don’t recreate the entrenched ritual of password authentication.
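The storage practice the paper flags, cleartext password databases versus salted, slow hashing, can be sketched with the standard library. PBKDF2 here stands in for whatever a given site actually deploys, and the iteration count is illustrative:

```python
# Sketch: store only a random salt and a slow salted hash, never the
# password itself; verify by recomputing. Parameters are illustrative.
import hashlib, hmac, os

def store(password):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest  # persist these two values only

def verify(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = store("correct horse")
print(verify("correct horse", salt, digest))  # True
print(verify("wrong guess", salt, digest))    # False
```

A unique salt per user blocks precomputed-table attacks, and the deliberately slow hash limits brute force against a stolen database, which matters all the more given the password re-use the paper documents.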

EDITED TO ADD (8/7): Four blog posts by the authors of the second paper.

Posted on July 20, 2010 at 1:52 PM

