Entries Tagged "debates"


Buying Campaign Contributions as a Hack

The first Republican primary debate has a popularity threshold to determine who gets to appear: 40,000 individual contributors. Now there are a lot of conventional ways a candidate can get that many contributors. Doug Burgum came up with a novel idea: buy them:

A long-shot contender at the bottom of recent polls, Mr. Burgum is offering $20 gift cards to the first 50,000 people who donate at least $1 to his campaign. And one lucky donor, as his campaign advertised on Facebook, will have the chance to win a Yeti Tundra 45 cooler that typically costs more than $300—just for donating at least $1.

It’s actually a pretty good idea. He could have spent the money on direct mail, personalized social media ads, or television ads. Instead, he buys gift cards at maybe two-thirds of face value (sellers price in the advertising value, the additional revenue that comes from recipients spending more than the card is worth, and breakage when cards are never redeemed at all) and gives them to his new donors. Plus, many contributors probably give him more than $1, and he got a lot of publicity out of this.
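A quick back-of-the-envelope calculation shows why. The numbers below are a sketch: the 50,000-donor and $20 figures come from the campaign’s offer, and the two-thirds purchase price is my rough estimate from above.

```python
# Rough cost of buying 50,000 contributors with $20 gift cards.
# The face value and donor count are from the campaign's offer;
# the two-thirds purchase price is an estimate.
donors_needed = 50_000
card_face_value = 20.00        # dollars per gift card
purchase_discount = 2 / 3      # cards bought at roughly two-thirds of face value

total_cost = donors_needed * card_face_value * purchase_discount
net_cost = total_cost - donors_needed * 1.00   # donors give back at least $1 each

print(f"Total outlay:         ${total_cost:,.0f}")               # ~$666,667
print(f"Net cost:             ${net_cost:,.0f}")                 # ~$616,667
print(f"Cost per contributor: ${net_cost / donors_needed:.2f}")  # ~$12.33
```

At roughly $12 per guaranteed contributor, that is likely cheaper than conventional donor-acquisition channels.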

Probably the cheapest way to get the contributors he needs. A clever hack.

EDITED TO ADD (7/16): These might be “straw donors” and illegal:

The campaign’s donations-for-cash strategy could raise potential legal concerns, said Paul Ryan, a campaign finance lawyer. Voters who make donations in exchange for gift cards, he said, might be considered straw donors because part or all of their donations are being reimbursed by the campaign.

“Federal law says ‘no person shall make a contribution in the name of another person,’” Mr. Ryan said. “Here, the candidate is making a contribution to himself in the name of all these individual donors.”

Richard L. Hasen, a law professor at the University of California, Los Angeles, who specializes in election law, said that typically, campaigns ask the Federal Election Commission when engaging in new forms of donations.

The Burgum campaign’s maneuver, he said, “certainly seems novel” and “raises concerns about whether it violates the prohibition on straw donations.”

Something for the courts to figure out, if this matter ever gets that far.

Posted on July 14, 2023 at 7:09 AM

Glenn Greenwald Debates Keith Alexander

Interesting debate, surprisingly civil.

Alexander seemed to have been okay with Snowden revealing surveillance based on Section 215:

“If he had taken the one court document and said, ‘This is what I’m going to do’… I think this would be a whole different discussion,” Alexander said. “I do think he had the opportunity [to be] what many could consider an American hero.”

And he also spoke in favor of allowing adversarial proceedings in the FISA Court.

On the other hand, I am getting tired of this back-door/front-door nonsense. Alexander said that he’s not in favor of back doors in security systems, but wants some kind of “front door.” FBI Director Comey plays this word game too:

There is a misconception that building a lawful intercept solution into a system requires a so-called “back door,” one that foreign adversaries and hackers may try to exploit.

But that isn’t true. We aren’t seeking a back-door approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law. We are completely comfortable with court orders and legal process—front doors that provide the evidence and information we need to investigate crime and prevent terrorist attacks.

They both see a difference here. A back door is a secret method of access, one that anyone can discover and use. A front door is a public method of access, one that—somehow—no one else can discover and use. But in reality, there’s no difference. Technologically, they’re the same: a method of third-party data access that works despite the intentions of the data owner.
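To see why the two are technologically identical, consider what any exceptional-access scheme has to look like in practice. Here is a minimal sketch, in Python with the pyca/cryptography package, of hybrid encryption with an escrowed second key; the key names are hypothetical, and this illustrates the concept, not anyone’s actual system.

```python
# Sketch: any "exceptional access" mechanism, front door or back,
# is just a second decryption path. Illustrative only.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Hybrid encryption: a fresh symmetric key encrypts the message...
message_key = Fernet.generate_key()
ciphertext = Fernet(message_key).encrypt(b"the data owner's secrets")

# ...and that key is wrapped once for the recipient and once for the escrow
# holder. The two wrapped keys are structurally identical.
wrapped_for_recipient = recipient_key.public_key().encrypt(message_key, OAEP)
wrapped_for_escrow = escrow_key.public_key().encrypt(message_key, OAEP)

# Whoever holds the escrow private key reads the plaintext, regardless of
# what the data owner wants and regardless of what the door is called.
recovered_key = escrow_key.decrypt(wrapped_for_escrow, OAEP)
assert Fernet(recovered_key).decrypt(ciphertext) == b"the data owner's secrets"
```

Call the second key a front door or a back door; the ciphertext cannot tell the difference, and neither can an attacker who steals the escrow key.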

Early in the debate, I got the feeling that Alexander was subtly shilling for his company. (Not that there’s anything wrong with that—I sometimes do the same thing. But realizing it helped me understand some of his comments better.) Later, the discussion turned into a recycling of common talking points from both sides.

Posted on September 7, 2015 at 9:14 AM

Should Companies Do Most of Their Computing in the Cloud? (Part 3)

Cloud computing is the future of computing. Specialization and outsourcing make society more efficient and scalable, and computing isn’t any different.

But why aren’t we there yet? Why don’t we, in Simon Crosby’s words, “get on with it”? I have discussed some reasons: loss of control, new and unquantifiable security risks, and—above all—a lack of trust. It is not enough to simply discount them, as the number of companies not embracing the cloud shows. It is more useful to consider what we need to do to bridge the trust gap.

A variety of mechanisms can create trust. When I outsourced my food preparation to a restaurant last night, it never occurred to me to worry about food safety. That blind trust is largely created by government regulation. It ensures that our food is safe to eat, just as it ensures our paint will not kill us and our planes are safe to fly. It is all well and good for Mr. Crosby to write that cloud companies “will invest heavily to ensure that they can satisfy complex…regulations,” but this presupposes that we have comprehensive regulations. Right now, it is largely a free-for-all out there, and it can be impossible to see how security in the cloud works. When robust consumer-safety regulations underpin outsourcing, people can trust the systems.

This is true for any kind of outsourcing. Attorneys, tax preparers and doctors are licensed and highly regulated, by both governments and professional organizations. We trust our doctors to cut open our bodies because we know they are not just making it up. We need a similar professionalism in cloud computing.

Reputation is another big part of trust. We rely on both word-of-mouth and professional reviews to decide on a particular car or restaurant. But none of that works without considerable transparency. Security is an example. Mr. Crosby writes: “Cloud providers design security into their systems and dedicate enormous resources to protect their customers.” Maybe some do; many certainly do not. Without more transparency, as a cloud customer you cannot tell the difference. Try asking either Amazon Web Services or Salesforce.com to see the details of their security arrangements, or even to indemnify you for data breaches on their networks. It is even worse for free consumer cloud services like Gmail and iCloud.

We need to trust cloud computing’s performance, reliability and security. We need open standards, rules about being able to remove our data from cloud services, and the assurance that we can switch cloud services if we want to.

We also need to trust who has access to our data, and under what circumstances. One commenter wrote: “After Snowden, the idea of doing your computing in the cloud is preposterous.” He isn’t making a technical argument: a typical corporate data center isn’t any better defended than a cloud-computing one. He is making a legal argument. Under American law—and similar laws in other countries—the government can force your cloud provider to give up your data without your knowledge and consent. If your data is in your own data center, you at least get to see a copy of the court order.

Corporate surveillance matters, too. Many cloud companies mine and sell your data or use it to manipulate you into buying things. Blocking broad surveillance by both governments and corporations is critical to trusting the cloud, as is eliminating secret laws and orders regarding data access.

In the future, we will do all our computing in the cloud: both commodity computing and computing that requires personalized expertise. But this future will only come to pass when we manage to create trust in the cloud.

This essay previously appeared on the Economist website, as part of a debate on cloud computing. It’s the third of three essays. Here are Parts 1 and 2. Visit the site for the other side of the debate and other commentary.

Posted on June 10, 2015 at 3:27 PM

Should Companies Do Most of Their Computing in the Cloud? (Part 2)

Let me start by describing two approaches to the cloud.

Most of the students I meet at Harvard University live their lives in the cloud. Their e-mail, documents, contacts, calendars, photos and everything else are stored on servers belonging to large internet companies in America and elsewhere. They use cloud services for everything. They converse and share on Facebook and Instagram and Twitter. They seamlessly switch among their laptops, tablets and phones. It wouldn’t be a stretch to say that they don’t really care where their computers end and the internet begins, and they are used to having immediate access to all of their data on the closest screen available.

In contrast, I personally use the cloud as little as possible. My e-mail is on my own computer—I am one of the last Eudora users—and not at a web service like Gmail or Hotmail. I don’t store my contacts or calendar in the cloud. I don’t use cloud backup. I don’t have personal accounts on social networking sites like Facebook or Twitter. (This makes me a freak, but highly productive.) And I don’t use many software and hardware products that I would otherwise really like, because they force you to keep your data in the cloud: Trello, Evernote, Fitbit.

Why don’t I embrace the cloud in the same way my younger colleagues do? There are three reasons, and they parallel the trade-offs that corporations facing the same decisions will have to weigh.

The first is control. I want to be in control of my data, and I don’t want to give it up. I have the ability to keep control by running my own services my way. Most of those students lack the technical expertise, and have no choice. They also want services that are only available on the cloud, and have no choice. I have deliberately made my life harder, simply to keep that control. Similarly, companies are going to decide whether or not they want to—or even can—keep control of their data.

The second is security. I talked about this at length in my opening statement. Suffice it to say that I am extremely paranoid about cloud security, and think I can do better. Lots of those students don’t care very much. Again, companies are going to have to make the same decision about who is going to do a better job, and depending on their own internal resources, they might make a different decision.

The third is the big one: trust. I simply don’t trust large corporations with my data. I know that, at least in America, they can sell my data at will and disclose it to whomever they want. It can be made public inadvertently by their lax security. My government can get access to it without a warrant. Again, lots of those students don’t care. And again, companies are going to have to make the same decisions.

Like any outsourcing relationship, cloud services are based on trust. If anything, that is what you should take away from this exchange. Try to do business only with trustworthy providers, and put contracts in place to ensure their trustworthiness. Push for government regulations that establish a baseline of trustworthiness for cases where you don’t have that negotiation power. Fight laws that give governments secret access to your data in the cloud. Cloud computing is the future of computing; we need to ensure that it is secure and reliable.

Despite my personal choices, my belief is that, in most cases, the benefits of cloud computing outweigh the risks. My company, Resilient Systems, uses cloud services both to run the business and to host our own products that we sell to other companies. For us it makes the most sense. But we spend a lot of effort ensuring that we use only trustworthy cloud providers, and that we are a trustworthy cloud provider to our own customers.

This essay previously appeared on the Economist website, as part of a debate on cloud computing. It’s the second of three essays. Here are Parts 1 and 3. Visit the site for the other side of the debate and other commentary.

Posted on June 10, 2015 at 11:27 AM

Should Companies Do Most of Their Computing in the Cloud? (Part 1)

Yes. No. Yes. Maybe. Yes. Okay, it’s complicated.

The economics of cloud computing are compelling. For companies, the lower operating costs, the lack of capital expenditure, the ability to quickly scale and the ability to outsource maintenance are just some of the benefits. Computing is infrastructure, like cleaning, payroll, tax preparation and legal services. All of these are outsourced. And computing is becoming a utility, like power and water. Everyone does their power generation and water distribution “in the cloud.” Why should IT be any different?

Two reasons. The first is that IT is complicated: it is more like payroll services than like power generation. What this means is that you have to choose your cloud providers wisely, and make sure you have good contracts in place with them. You want to own your data, and be able to download that data at any time. You want assurances that your data will not disappear if the cloud provider goes out of business or discontinues your service. You want reliability and availability assurances, tech support assurances, whatever you need.

The downside is that you will have limited customization options. Cloud computing is cheaper because of economies of scale, and—like any outsourced task—you tend to get what you get. A restaurant with a limited menu is cheaper than a personal chef who can cook anything you want. Fewer options at a much cheaper price: it’s a feature, not a bug.

The second reason that cloud computing is different is security. This is not an idle concern. IT security is difficult under the best of circumstances, and security risks are one of the major reasons it has taken so long for companies to embrace the cloud. And here it really gets complicated.

On the pro-cloud side, cloud providers have the potential to be far more secure than the corporations whose data they are holding. It is the same economies of scale. For most companies, the cloud provider is likely to have better security than they do—by a lot. All but the largest companies benefit from the concentration of security expertise at the cloud provider.

On the anti-cloud side, the cloud provider might not meet your legal needs. You might have regulatory requirements that the cloud provider cannot meet. Your data might be stored in a country with laws you do not like—or cannot legally use. Many foreign companies are thinking twice about putting their data inside America, because of laws allowing the government to get at that data in secret. Other countries around the world have even more draconian government-access rules.

Also on the anti-cloud side, a large cloud provider is a juicier target. Whether or not this matters depends on your threat profile. Criminals already steal far more credit card numbers than they can monetize; they are more likely to go after the smaller, less-defended networks. But a national intelligence agency will prefer the one-stop shop a cloud provider affords. That is why the NSA broke into Google’s data centers.

Finally, the loss of control is a security risk. Moving your data into the cloud means that someone else is controlling that data. This is fine if they do a good job, but terrible if they do not. And for free cloud services, that loss of control can be critical. The cloud provider can delete your data on a whim, if it believes you have violated some term of service that you never even knew existed. And you have no recourse.

As a business, you need to weigh the benefits against the risks. And that will depend on things like the type of cloud service you’re considering, the type of data that’s involved, how critical the service is, how easily you could do it in house, the size of your company and the regulatory environment, and so on.

This essay previously appeared on the Economist website, as part of a debate on cloud computing. It’s the first of three essays. Here are Parts 2 and 3. Visit the site for the other side of the debate and other commentary.

Posted on June 10, 2015 at 6:43 AM

Prosecuting Snowden

Edward Snowden broke the law by releasing classified information. This isn’t under debate; it’s something everyone with a security clearance knows. It’s written in plain English on the documents you have to sign when you get a security clearance, and it’s part of the culture. The law is there for a good reason, and secrecy has an important role in military defense.

But before the Justice Department prosecutes Snowden, there are some other investigations that ought to happen.

We need to determine whether these National Security Agency programs are themselves legal. The administration has successfully barred anyone from bringing a lawsuit challenging these laws, on the grounds of national secrecy. Now that we know those arguments are without merit, it’s time for those court challenges.

It’s clear that some of the NSA programs exposed by Snowden violate the Constitution and others violate existing laws. Others disagree. The courts need to decide.

We need to determine whether classifying these programs is legal. Keeping things secret from the people is a very dangerous practice in a democracy, and the government is permitted to do so only under very specific circumstances. Reading the documents leaked so far, I don’t see anything that needs to be kept secret. The argument that exposing these documents helps the terrorists doesn’t even pass the laugh test; there’s nothing here that changes anything any potential terrorist would do or not do. But in any case, now that the documents are public, the courts need to rule on the legality of their secrecy.

And we need to determine how we treat whistle-blowers in this country. We have whistle-blower protection laws that apply in some cases, particularly when exposing fraud and other illegal behavior. NSA officials have repeatedly lied to Congress about the existence and details of these programs; if exposing official lawbreaking is what those laws protect, Snowden’s disclosures may qualify.

Only after all of these legal issues have been resolved should any prosecution of Snowden move forward. Because only then will we know the full extent of what he did, and how much of it is justified.

I believe that history will hail Snowden as a hero—his whistle-blowing exposed a surveillance state and a secrecy machine run amok. I’m less optimistic about how the present day will treat him, and hope that the debate right now is less about the man and more about the government he exposed.

This essay was originally published on the New York Times Room for Debate blog, as part of a series of essays on the topic.

EDITED TO ADD (6/13): There’s a big discussion of this on Reddit.

Posted on June 12, 2013 at 6:16 AM

Whitelisting vs. Blacklisting

The whitelist/blacklist debate is far older than computers, and it’s instructive to recall what works where. Physical security works generally on a whitelist model: if you have a key, you can open the door; if you know the combination, you can open the lock. We do it this way not because it’s easier—although it is generally much easier to make a list of people who should be allowed through your office door than a list of people who shouldn’t—but because it’s a security system that can be implemented automatically, without people.

To find blacklists in the real world, you have to start looking at environments where almost everyone is allowed. Casinos are a good example: everyone can come in and gamble except those few specifically listed in the casino’s black book or the more general Griffin book. Some retail stores have the same model—a Google search on “banned from Wal-Mart” results in 1.5 million hits, including Megan Fox—although you have to wonder about enforcement. Does Wal-Mart have the same sort of security manpower as casinos?

National borders certainly have that kind of manpower, and Marcus is correct to point to passport control as a system with both a whitelist and a blacklist. There are people who are allowed in with minimal fuss, people who are summarily arrested with as minimal a fuss as possible, and people in the middle who receive some amount of fussing. Airport security works the same way: the no-fly list is a blacklist, and people with redress numbers are on the whitelist.

Computer networks share characteristics with your office and Wal-Mart: sometimes you only want a few people to have access, and sometimes you want almost everybody to have access. And you see whitelists and blacklists at work in computer networks. Access control is whitelisting: if you know the password, or have the token or biometric, you get access. Antivirus is blacklisting: everything coming into your computer from the Internet is assumed to be safe unless it appears on a list of bad stuff. On computers, unlike the real world, it takes no extra manpower to implement a blacklist—the software can do it largely for free.
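The difference between the two models comes down to the default decision about anything unknown, which a few lines of code make plain. This is a toy sketch; the names and lists are invented.

```python
# Whitelist vs. blacklist: what matters is the fate of the unknown.
ALLOWED_USERS = {"alice", "bob"}             # whitelist: default deny
KNOWN_MALWARE = {"worm.exe", "trojan.bin"}   # blacklist: default allow

def door_permits(user: str) -> bool:
    # Whitelist model (office doors, passwords): unknowns are refused.
    return user in ALLOWED_USERS

def antivirus_permits(filename: str) -> bool:
    # Blacklist model (antivirus): unknowns are admitted.
    return filename not in KNOWN_MALWARE

print(door_permits("mallory"))       # False: never seen, so denied
print(antivirus_permits("new.exe"))  # True: never seen, so allowed
```

A whitelist fails closed and a blacklist fails open, which is why the choice between them tracks how many of the possible entities you intend to admit.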

Traditionally, execution control has been based on a blacklist. Computers are so complicated and applications so varied that it just doesn’t make sense to limit users to a specific set of applications. The exception is constrained environments, such as computers in hotel lobbies and airline club lounges. On those, you’re often limited to an Internet browser and a few common business applications.

Lately, we’re seeing more whitelisting on closed computing platforms. The iPhone works on a whitelist: if you want a program to run on the phone, you need to get it approved by Apple and put in the iPhone store. Your Wii game machine works the same way. This is done primarily because the manufacturers want to control the economic environment, but it’s being sold partly as a security measure. But in this case, more security equals less liberty; do you really want your computing options limited by Apple, Microsoft, Google, Facebook, or whoever controls the particular system you’re using?

Turns out that many people do. Apple’s control over its apps hasn’t seemed to hurt iPhone sales, and Facebook’s control over its apps hasn’t seemed to affect Facebook’s user numbers. And honestly, quite a few of us would have had an easier time over the Christmas holidays if we could have implemented a whitelist on the computers of our less-technical relatives.

For these two reasons, I think the whitelist model will continue to make inroads into our general purpose computers. And those of us who want control over our own environments will fight back—perhaps with a whitelist we maintain personally, but more probably with a blacklist.

This essay previously appeared in Information Security as the first half of a point-counterpoint with Marcus Ranum. You can read Marcus’s half there as well.

Posted on January 28, 2011 at 5:02 AM

Software Monoculture

In 2003, a group of security experts—myself included—published a paper saying that 1) software monocultures are dangerous and 2) Microsoft, being the largest creator of monocultures out there, is the most dangerous. Marcus Ranum responded with an essay that basically said we were full of it. Now, seven years later, Marcus and I thought it would be interesting to revisit the debate.

The basic problem with a monoculture is that it’s all vulnerable to the same attack. The Irish Potato Famine of 1845–9 is perhaps the most famous monoculture-related disaster. The Irish planted only one variety of potato, and the genetically identical potatoes succumbed to a rot caused by Phytophthora infestans. Compare that with the diversity of potatoes traditionally grown in South America, each one adapted to the particular soil and climate of its home, and you can see the security value in heterogeneity.

Similar risks exist in networked computer systems. If everyone is using the same operating system or the same applications software or the same networking protocol, and a security vulnerability is discovered in that OS or software or protocol, a single exploit can affect everyone. This is the problem of large-scale Internet worms: many have affected millions of computers on the Internet.

If our networking environment weren’t homogeneous, a single worm couldn’t do so much damage. We’d be more like South America’s potato crop than Ireland’s. Conclusion: monoculture is bad; embrace diversity or die along with everyone else.

This analysis makes sense as far as it goes, but suffers from three basic flaws. The first is the assumption that our IT monoculture is as simple as the potato’s. When the particularly virulent Storm worm hit, it affected only one million to 10 million of its billion-plus possible victims. Why? Because some computers were running updated antivirus software, or were within locked-down networks, or whatever. Two computers might be running the same OS or applications software, but they’ll be inside different networks with different firewalls and IDSs and router policies, they’ll have different antivirus programs and different patch levels and different configurations, and they’ll be in different parts of the Internet connected to different servers running different services. As Marcus pointed out back in 2003, they’ll be a little bit different themselves. That’s one of the reasons large-scale Internet worms don’t infect everyone—as well as the network’s ability to quickly develop and deploy patches, new antivirus signatures, new IPS signatures, and so on.
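A toy simulation illustrates the point; every probability below is invented for illustration, not Storm’s actual numbers.

```python
# Toy model: even in an operating-system "monoculture," independent
# defensive layers shrink a worm's reach. All probabilities are invented.
import random

random.seed(1)
HOSTS = 1_000_000
P_TARGET_OS = 0.90   # fraction running the targeted OS
P_UNPATCHED = 0.30   # fraction missing the relevant patch
P_AV_MISSES = 0.40   # fraction whose antivirus fails to catch the worm
P_EXPOSED   = 0.25   # fraction not shielded by a firewall or IDS

infected = sum(
    1 for _ in range(HOSTS)
    if random.random() < P_TARGET_OS
    and random.random() < P_UNPATCHED
    and random.random() < P_AV_MISSES
    and random.random() < P_EXPOSED
)
# Expected rate: 0.90 * 0.30 * 0.40 * 0.25 = 2.7% of hosts infected,
# even though 90% of them run the "monoculture" operating system.
print(f"{infected / HOSTS:.1%} infected")
```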

The second flaw in the monoculture analysis is that it downplays the cost of diversity. Sure, it would be great if a corporate IT department ran half Windows and half Linux, or half Apache and half Microsoft IIS, but doing so would require more expertise and cost more money. It wouldn’t cost twice the expertise and money—there is some overlap—but there are significant economies of scale that result from everyone using the same software and configuration. A single operating system locked down by experts is far more secure than two operating systems configured by sysadmins who aren’t so expert. Sometimes, as Mark Twain said: “Put all your eggs in one basket, and then guard that basket!”

The third flaw is that you can only get a limited amount of diversity by using two operating systems, or routers from three vendors. South American potato diversity comes from hundreds of different varieties. Genetic diversity comes from millions of different genomes. In monoculture terms, two is little better than one. Even worse, since a network’s security is primarily the minimum of the security of its components, a diverse network is less secure because it is vulnerable to attacks against any of its heterogeneous components.
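The arithmetic behind that claim is worth making explicit. With hypothetical exploit probabilities:

```python
# Hypothetical probabilities that a working exploit exists for each platform.
p_a, p_b = 0.10, 0.10

# If compromising ANY component compromises the network, diversity adds
# attack surface: the two risks union rather than cancel.
single_platform = p_a                        # 0.10
diverse_network = 1 - (1 - p_a) * (1 - p_b)  # 0.19

# Diversity only helps against losing EVERYTHING to one attack at once.
all_fall_monoculture = p_a          # 0.10: one exploit takes every machine
all_fall_diverse = p_a * p_b        # 0.01: both platforms must fall together

print(diverse_network, all_fall_diverse)
```

Diversity nearly doubles the chance that something on the network falls, while sharply reducing the chance that everything falls at once. Which trade matters depends on whether you fear any breach or total loss.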

Some monoculture is necessary in computer networks. As long as we have to talk to each other, we’re all going to have to use TCP/IP, HTML, PDF, and all sorts of other standards and protocols that guarantee interoperability. Yes, there will be different implementations of the same protocol—and this is a good thing—but that won’t protect you completely. You can’t be too different from everyone else on the Internet, because if you were, you couldn’t be on the Internet.

Species basically have two options for propagating their genes: the lobster strategy and the avian strategy. Lobsters lay 5,000 to 40,000 eggs at a time, and essentially ignore them. Only a minuscule percentage of the hatchlings live to be four weeks old, but that’s sufficient to ensure gene propagation; from every 50,000 eggs, an average of two lobsters is expected to survive to legal size. Conversely, birds produce only a few eggs at a time, then spend a lot of effort ensuring that most of the hatchlings survive. In ecology, this is known as r/K selection theory. In either case, each of those offspring varies slightly genetically, so if a new threat arises, some of them will be more likely to survive. But even so, extinctions happen regularly on our planet; neither strategy is foolproof.

Our IT infrastructure is a lot more like a bird than a lobster. Yes, monoculture is dangerous and diversity is important. But investing time and effort in ensuring our current infrastructure’s survival is even more important.

This essay was originally published in Information Security, and is the first half of a point/counterpoint with Marcus Ranum. You can read his response there as well.

EDITED TO ADD (12/13): Commentary.

Posted on December 1, 2010 at 5:55 AM

