Entries Tagged "privacy"

Page 91 of 144

Filming the Police

In at least three U.S. states, it is illegal to film an on-duty police officer:

The legal justification for arresting the “shooter” rests on existing wiretapping or eavesdropping laws, with statutes against obstructing law enforcement sometimes cited. Illinois, Massachusetts, and Maryland are among the 12 states in which all parties must consent for a recording to be legal unless, as with TV news crews, it is obvious to all that recording is underway. Since the police do not consent, the camera-wielder can be arrested. Most all-party-consent states also include an exception for recording in public places where “no expectation of privacy exists” (Illinois does not), but in practice this exception is not being recognized.

Massachusetts attorney June Jensen represented Simon Glik who was arrested for such a recording. She explained, “[T]he statute has been misconstrued by Boston police. You could go to the Boston Common and snap pictures and record if you want.” Legal scholar and professor Jonathan Turley agrees, “The police are basing this claim on a ridiculous reading of the two-party consent surveillance law—requiring all parties to consent to being taped. I have written in the area of surveillance law and can say that this is utter nonsense.”

The courts, however, disagree. A few weeks ago, an Illinois judge rejected a motion to dismiss an eavesdropping charge against Christopher Drew, who recorded his own arrest for selling one-dollar artwork on the streets of Chicago. Although the misdemeanor charges of not having a peddler’s license and peddling in a prohibited area were dropped, Drew is being prosecuted for illegal recording, a Class 1 felony punishable by 4 to 15 years in prison.

This is a horrible idea, and will make us all less secure. I wrote in 2008:

You cannot evaluate the value of privacy and disclosure unless you account for the relative power levels of the discloser and the disclosee.

If I disclose information to you, your power with respect to me increases. One way to address this power imbalance is for you to similarly disclose information to me. We both have less privacy, but the balance of power is maintained. But this mechanism fails utterly if you and I have different power levels to begin with.

An example will make this clearer. You’re stopped by a police officer, who demands to see identification. Divulging your identity will give the officer enormous power over you: He or she can search police databases using the information on your ID; he or she can create a police record attached to your name; he or she can put you on this or that secret terrorist watch list. Asking to see the officer’s ID in return gives you no comparable power over him or her. The power imbalance is too great, and mutual disclosure does not make it OK.

You can think of your existing power as the exponent in an equation that determines the value, to you, of more information. The more power you have, the more additional power you derive from the new data.

Another example: When your doctor says “take off your clothes,” it makes no sense for you to say, “You first, doc.” The two of you are not engaging in an interaction of equals.

This is the principle that should guide decision-makers when they consider installing surveillance cameras or launching data-mining programs. It’s not enough to open the efforts to public scrutiny. All aspects of government work best when the relative power between the governors and the governed remains as small as possible—when liberty is high and control is low. Forced openness in government reduces the relative power differential between the two, and is generally good. Forced openness in laypeople increases the relative power, and is generally bad.

EDITED TO ADD (7/13): Another article. One jurisdiction in Pennsylvania has explicitly ruled the opposite: that it’s legal to record police officers no matter what.

Posted on June 16, 2010 at 1:36 PM

Voluntary Security Inspections

What could possibly be the point of this?

Cars heading to Austin-Bergstrom International Airport will see random, voluntary inspections Monday.

The searches are part of an increase in security at the airport.

It’s a joint operation between the U.S. Department of Homeland Security, Austin Police, and airport security.

The enhancements are not a response to specific threats, and the security level has not changed.

Officials say the searches are voluntary and drivers can opt out if they want.

Training? Reassuring a jittery public? Looking busy? This can’t possibly be done for security reasons.

Posted on June 1, 2010 at 1:00 PM

Alerting Users that Applications are Using Cameras, Microphones, Etc.

Interesting research: “What You See is What They Get: Protecting users from unwanted use of microphones, cameras, and other sensors,” by Jon Howell and Stuart Schechter.

Abstract: Sensors such as cameras and microphones collect privacy-sensitive data streams without the user’s explicit action. Conventional sensor access policies either hassle users to grant applications access to sensors or grant with no approval at all. Once access is granted, an application may collect sensor data even after the application’s interface suggests that the sensor is no longer being accessed.

We introduce the sensor-access widget, a graphical user interface element that resides within an application’s display. The widget provides an animated representation of the personal data being collected by its corresponding sensor, calling attention to the application’s attempt to collect the data. The widget indicates whether the sensor data is currently allowed to flow to the application. The widget also acts as a control point through which the user can configure the sensor and grant or deny the application access. By building perpetual disclosure of sensor data collection into the platform, sensor-access widgets enable new access-control policies that relax the tension between the user’s privacy needs and applications’ ease of access.

Apple seems to be taking some steps in this direction with the location sensor disclosure in iPhone 4.0 OS.

Posted on May 24, 2010 at 7:32 AM

Applications Disclosing Required Authority

This is an interesting piece of research evaluating different user interface designs by which applications disclose to users what sort of authority they need to install themselves. Given all the recent concerns about third-party access to user data on social networking sites (particularly Facebook), this is particularly timely research.

We have provided evidence of a growing trend among application platforms to disclose, via application installation consent dialogs, the resources and actions that applications will be authorized to perform if installed. To improve the design of these disclosures, we have taken an important first step of testing key design elements. We hope these findings will assist future researchers in creating experiences that leave users feeling better informed and more confident in their installation decisions.

Within the admittedly constrained context of our laboratory study, disclosure design had surprisingly little effect on participants’ ability to absorb and search information. However, the great majority of participants preferred designs that used images or icons to represent resources, and disliked designs that used paragraphs, the central design element of Facebook’s disclosures, and outlines, the central design element of Android’s disclosures.

Posted on May 21, 2010 at 1:17 PM

Detecting Browser History

Interesting research.

Main results:

[…]

  • We analyzed the results from over a quarter of a million people who ran our tests in the last few months, and found that we can detect browsing histories for over 76% of them. All major browsers allow their users’ history to be detected, but it seems that users of the more modern browsers such as Safari and Chrome are more affected; we detected visited sites for 82% of Safari users and 94% of Chrome users.

    […]

  • While our tests were quite limited, for our test of the 5,000 most popular websites, we detected an average of 63 visited locations (13 sites and 50 subpages on those sites); the medians were 8 and 17 respectively.
  • Almost 10% of our visitors had over 30 visited sites and 120 subpages detected—heavy Internet users who don’t protect themselves are more affected than others.

    […]

  • The ability to detect visitors’ browsing history requires just a few lines of code. Armed with a list of websites to check for, a malicious webmaster can scan over 25 thousand links per second (1.5 million links per minute) in almost every recent browser.
  • Most websites and pages you view in your browser can be detected as long as they are kept in your history. Almost every address that was in your browser’s address bar can be detected (this includes most pages, including those retrieved using https and some forms with potentially private information such as your zipcode or search query). Pages won’t be detected when they expire from your history (usually after a month or two), or if you manually clear it.
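The underlying trick (not the researchers’ exact code) is the long-standing CSS `:visited` leak: render links to candidate URLs, then read back each link’s computed style, since browsers color visited links differently. The real attack is a few lines of browser-side JavaScript; this Python mock-up, with a stand-in for `getComputedStyle`, just sketches the logic:

```python
# Hypothetical sketch of CSS :visited history sniffing. In a browser, the
# attacker renders <a> tags for candidate URLs and reads each link's computed
# color via getComputedStyle(); the mock below stands in for that call.

VISITED_COLOR = "rgb(85, 26, 139)"    # a typical default :visited purple
UNVISITED_COLOR = "rgb(0, 0, 238)"    # a typical unvisited-link blue

def detect_visited(candidate_urls, computed_color):
    """Return the candidate URLs whose rendered link carries the :visited style.

    `computed_color(url)` stands in for getComputedStyle(link).color."""
    return [url for url in candidate_urls if computed_color(url) == VISITED_COLOR]

# Mock browser: pretend the victim has visited example.com but not example.org.
history = {"https://example.com/"}
color_of = lambda url: VISITED_COLOR if url in history else UNVISITED_COLOR
print(detect_visited(["https://example.com/", "https://example.org/"], color_of))
# → ['https://example.com/']
```

Since the check is a pure style lookup per link, scanning a long candidate list is cheap, which is why the researchers could test tens of thousands of links per second.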

For now, the only way to fix the issue is to constantly clear browsing history or use private browsing modes. The first browser to prevent this trick in a default installation (Firefox 4.0) is supposed to come out in October.

Here’s a link to the paper.

Posted on May 20, 2010 at 1:28 PM

Nobody Encrypts their Phone Calls

From the Forbes blog:

In an annual report published Friday by the U.S. judicial system on the number of wiretaps it granted over the past year …, the courts revealed that there were 2,376 wiretaps by law enforcement agencies in 2009, up 26% from 1,891 the year before, and up 76% from 1999. (Those numbers, it should be noted, don’t include international wiretaps or those aimed at intelligence purposes rather than law enforcement.)

But in the midst of that wiretapping bonanza, a more surprising figure is the number of cases in which law enforcement encountered encryption as a barrier: one.

According to the courts, only one wiretapping case in the entire country encountered encryption last year, and in that single case, whatever privacy tools were used don’t seem to have posed much of a hurdle to eavesdroppers. “In 2009, encryption was encountered during one state wiretap, but did not prevent officials from obtaining the plain text of the communications,” reads the report.
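The growth figures in the excerpt are internally consistent; here is a quick check (the 1999 count is not given in the excerpt, so the last line derives the value implied by the 76% figure):

```python
# Quick check of the growth figures quoted from the wiretap report:
# 2,376 wiretaps in 2009, described as up 26% from 1,891 in 2008 and
# up 76% from 1999.

def pct_increase(new, old):
    """Percentage increase from old to new, rounded to the nearest percent."""
    return round(100 * (new - old) / old)

print(pct_increase(2376, 1891))  # 26 — matches the year-over-year claim
print(round(2376 / 1.76))        # 1350 — the 1999 count implied by "up 76%"
```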

Posted on May 6, 2010 at 7:06 AM

Young People, Privacy, and the Internet

There’s a lot out there on this topic. I’ve already linked to danah boyd’s excellent SXSW talk (and her work in general), my essay on privacy and control, and my talk—”Security, Privacy, and the Generation Gap”—which I’ve given four times in the past two months.

Last week, two new papers were published on the topic.

“Youth, Privacy, and Reputation” is a literature review published by Harvard’s Berkman Center. It’s long, but an excellent summary of what’s out there on the topic:

Conclusions: The prevailing discourse around youth and privacy assumes that young people don’t care about their privacy because they post so much personal information online. The implication is that posting personal information online puts them at risk from marketers, pedophiles, future employers, and so on. Thus, policy and technical solutions are proposed that presume that young people would not put personal information online if they understood the consequences. However, our review of the literature suggests that young people care deeply about privacy, particularly with regard to parents and teachers viewing personal information. Young people are heavily monitored at home, at school, and in public by a variety of surveillance technologies. Children and teenagers want private spaces for socialization, exploration, and experimentation, away from adult eyes. Posting personal information online is a way for youth to express themselves, connect with peers, increase popularity, and bond with friends and members of peer groups. Subsequently, young people want to be able to restrict information provided online in a nuanced and granular way.

Much popular writing (and some research) discusses young people, online technologies, and privacy in ways that do not reflect the realities of most children and teenagers’ lives. However, this provides rich opportunities for future research in this area. For instance, there are no studies of the impact of surveillance on young people—at school, at home, or in public. Although we have cited several qualitative and ethnographic studies of young people’s privacy practices and attitudes, more work in this area is needed to fully understand similarities and differences in this age group, particularly within age cohorts, across socioeconomic classes, between genders, and so forth. Finally, given that the frequently-cited comparative surveys of young people and adult privacy practices and attitudes are quite old, new research would be invaluable. We look forward to new directions in research in this area.

“How Different Are Young Adults from Older Adults When it Comes to Information Privacy Attitudes & Policy?,” from the University of California, Berkeley, describes the results of a broad survey on privacy attitudes.

Conclusion: In policy circles, it has become almost a cliché to claim that young people do not care about privacy. Certainly there are many troubling anecdotes surrounding young individuals’ use of the internet, and of social networking sites in particular. Nevertheless, we found that in large proportions young adults do care about privacy. The data show that they and older adults are more alike on many privacy topics than they are different. We suggest, then, that young-adult Americans have an aspiration for increased privacy even while they participate in an online reality that is optimized to increase their revelation of personal data.

Public policy agendas should therefore not start with the proposition that young adults do not care about privacy and thus do not need regulations and other safeguards. Rather, policy discussions should acknowledge that the current business environment along with other factors sometimes encourages young adults to release personal data in order to enjoy social inclusion even while in their most rational moments they may espouse more conservative norms. Education may be useful. Although many young adults are exposed to educational programs about the internet, the focus of these programs is on personal safety from online predators and cyberbullying with little emphasis on information security and privacy. Young adults certainly are different from older adults when it comes to knowledge of privacy law. They are more likely to believe that the law protects them both online and off. This lack of knowledge in a tempting environment, rather than a cavalier lack of concern regarding privacy, may be an important reason large numbers of them engage with the digital world in a seemingly unconcerned manner.

But education alone is probably not enough for young adults to reach aspirational levels of privacy. They likely need multiple forms of help from various quarters of society, including perhaps the regulatory arena, to cope with the complex online currents that aim to contradict their best privacy instincts.

They’re both worth reading for anyone interested in this topic.

Posted on April 20, 2010 at 1:50 PM

Life Recorder

In 2006, writing about future threats on privacy, I described a life recorder:

A “life recorder” you can wear on your lapel that constantly records is still a few generations off: 200 gigabytes/year for audio and 700 gigabytes/year for video. It’ll be sold as a security device, so that no one can attack you without being recorded.
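For scale, those yearly storage figures translate into fairly modest continuous bitrates. The conversion below is mine, not from the essay:

```python
# Back-of-the-envelope check of the life-recorder figures quoted above
# (200 GB/year for audio, 700 GB/year for video), expressed as the
# continuous bitrate each implies.

SECONDS_PER_YEAR = 365 * 24 * 3600

def implied_kbps(gb_per_year):
    """Continuous bitrate (kbit/s) implied by a yearly storage figure."""
    return gb_per_year * 1e9 * 8 / SECONDS_PER_YEAR / 1000

print(round(implied_kbps(200)))  # ~51 kbit/s: roughly speech-codec quality
print(round(implied_kbps(700)))  # ~178 kbit/s: heavily compressed video
```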

I can’t find a quote right now, but in talks I would say that this kind of technology would first be used by groups of people with diminished rights: children, soldiers, prisoners, and the non-lucid elderly.

It’s been proposed:

With GPS capabilities built into phones that can be made ever smaller, and the ability for these phones to transmit both sound and video, isn’t it time to think about a wearable device that could be used to call for help and accurately report what was happening?

[…]

The device could contain cameras and microphones that activate if the device was triggered to create evidence that could locate an attacker and cause them to flee, an alarm sound that could help locate the victim and also help scare off an attacker, and a set of sensors that could detect everything from sudden deceleration to an irregular heartbeat or compromised breathing.

Just one sentence on the security and privacy issues:

Indeed, privacy concerns need to be addressed so that stalkers and predators couldn’t compromise the device.

Indeed.

Posted on April 19, 2010 at 6:30 AM

