Entries Tagged "forensics"
Vice has an article about how data brokers sell access to the Internet backbone. This is netflow data. It’s useful for cybersecurity forensics, but can also be used for things like tracing VPN activity.
At a high level, netflow data creates a picture of traffic flow and volume across a network. It can show which server communicated with another, information that may ordinarily only be available to the server owner or the ISP carrying the traffic. Crucially, this data can be used for, among other things, tracking traffic through virtual private networks, which are used to mask where someone is connecting to a server from, and by extension, their approximate physical location.
In the hands of some governments, that could be dangerous.
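At its simplest, netflow is per-connection metadata (source, destination, volume) rather than packet contents. A minimal sketch of the kind of who-talked-to-whom aggregation involved, using made-up records rather than any real collector format:

```python
from collections import defaultdict

# Hypothetical flow records: (src_ip, dst_ip, bytes_transferred).
# Real netflow exports (NetFlow v5/v9, IPFIX) carry more fields:
# ports, protocol, timestamps, packet counts.
flows = [
    ("10.0.0.5", "203.0.113.7", 1200),
    ("10.0.0.5", "203.0.113.7", 800),
    ("10.0.0.9", "198.51.100.2", 450),
]

# Aggregate into a traffic matrix: which host talked to which,
# and how much data moved between them.
traffic = defaultdict(int)
for src, dst, nbytes in flows:
    traffic[(src, dst)] += nbytes

for (src, dst), total in sorted(traffic.items()):
    print(f"{src} -> {dst}: {total} bytes")
```

Correlating flow volumes and timing on both sides of a VPN endpoint is what makes this metadata enough to de-anonymize VPN users.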
Over at Lawfare, Susan Landau has an excellent essay on the risks posed by software used to collect evidence (a Breathalyzer is probably the most obvious example).
Bugs and vulnerabilities can lead to inaccurate evidence, but the proprietary nature of software makes it hard for defendants to examine it.
The software engineers proposed a three-part test. First, the court should have access to the “Known Error Log,” which should be part of any professionally developed software project. Next, the court should consider whether the evidence being presented could be materially affected by a software error. Ladkin and his co-authors noted that a chain of emails back and forth is unlikely to have such an error, but the time that a software tool logs when an application was used could easily be incorrect. Finally, the reliability experts recommended seeing whether the code adheres to an industry standard used in a non-computerized version of the task (e.g., bookkeepers always record every transaction, and thus so should bookkeeping software).
Inanimate objects have long served as evidence in courts of law: the door handle with a fingerprint, the glove found at a murder scene, the Breathalyzer result that shows a blood alcohol level three times the legal limit. But the last of those examples is substantively different from the other two. Data from a Breathalyzer is not the physical entity itself, but rather a software calculation of the level of alcohol in the breath of a potentially drunk driver. As long as the breath sample has been preserved, one can always go back and retest it on a different device.
What happens if the software makes an error and there is no sample to check, or if the software itself produces the evidence? At the time we wrote our article on the use of software as evidence, there was no overriding requirement that law enforcement provide a defendant with the code so that they might examine it themselves.
Given the high rate of bugs in complex software systems, my colleagues and I concluded that when computer programs produce the evidence, courts cannot assume that the evidentiary software is reliable. Instead the prosecution must make the code available for an “adversarial audit” by the defendant’s experts. And to avoid problems in which the government doesn’t have the code, government procurement contracts must include delivery of source code—code that is more-or-less readable by people—for every version of the code or device.
Microsoft analyzed details of the SolarWinds attack:
Microsoft and FireEye only detected the Sunburst or Solorigate malware in December, but Crowdstrike reported this month that another related piece of malware, Sunspot, was deployed in September 2019, at the time hackers breached SolarWinds’ internal network. Other related malware includes Teardrop (aka Raindrop).
Details are in the Microsoft blog:
We have published our in-depth analysis of the Solorigate backdoor malware (also referred to as SUNBURST by FireEye), the compromised DLL that was deployed on networks as part of SolarWinds products, which allowed attackers to gain backdoor access to affected devices. We have also detailed the hands-on-keyboard techniques that attackers employed on compromised endpoints using a powerful second-stage payload, one of several custom Cobalt Strike loaders, including the loader dubbed TEARDROP by FireEye and a variant named Raindrop by Symantec.
One missing link in the complex Solorigate attack chain is the handover from the Solorigate DLL backdoor to the Cobalt Strike loader. Our investigations show that the attackers went out of their way to ensure that these two components are separated as much as possible to evade detection. This blog provides details about this handover based on a limited number of cases where this process occurred. To uncover these cases, we used the powerful, cross-domain optics of Microsoft 365 Defender to gain visibility across the entire attack chain in one complete and consolidated view.
Many of the attacks gained initial footholds by password spraying to compromise individual email accounts at targeted organizations. Once the attackers had that initial foothold, they used a variety of complex privilege escalation and authentication attacks to exploit flaws in Microsoft’s cloud services. Another of the Advanced Persistent Threat (APT)’s targets, security firm CrowdStrike, said the attacker tried unsuccessfully to read its email by leveraging a compromised account of a Microsoft reseller the firm had worked with.
On attribution: Earlier this month, the US government stated the attack is “likely Russian in origin.” This echoes what then-Secretary of State Mike Pompeo said in December, and the Washington Post‘s reporting (both from December). (The New York Times has repeated this attribution—a good article that also discusses the magnitude of the attack.) More evidence comes from code forensics, which links it to Turla, another Russian threat actor.
And lastly, a long ProPublica story on an unused piece of government-developed tech that might have caught the supply-chain attack much earlier:
The in-toto system requires software vendors to map out their process for assembling computer code that will be sent to customers, and it records what’s done at each step along the way. It then verifies electronically that no hacker has inserted something in between steps. Immediately before installation, a pre-installed tool automatically runs a final check to make sure that what the customer received matches the final product the software vendor generated for delivery, confirming that it wasn’t tampered with in transit.
I don’t want to hype this defense too much without knowing a lot more, but I like the approach of verifying the software build process.
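The core idea, stripped of in-toto’s actual metadata format (this sketch is not the in-toto API), is recording a cryptographic hash of each build step’s output so the next step, and ultimately the customer, can verify nothing was inserted in between:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 hex digest of a build artifact."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical two-step supply chain. Each step records the hash of
# its output; the next step checks its input against that record.
source = b"print('hello')"
recorded = {"checkout": digest(source)}

# Build step: verify the input matches what checkout produced.
assert digest(source) == recorded["checkout"], "source tampered with"
artifact = source + b"\n# compiled"
recorded["build"] = digest(artifact)

# Customer-side final check immediately before installation.
delivered = artifact  # what actually arrived over the wire
if digest(delivered) == recorded["build"]:
    print("artifact verified")
else:
    print("tampering detected")
```

The real system also signs each step’s metadata with per-step keys, so an attacker who compromises one stage of the pipeline cannot forge the records for the others.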
Bellingcat has investigated the near-fatal poisoning of Alexey Navalny by the Russian FSB back in August. The details show some impressive traffic analysis. Navalny also got a confession out of one of the poisoners, using some masterful social engineering.
Lots of interesting opsec details in all of this.
EDITED TO ADD (1/13): Bellingcat on their methodology.
Gizmodo is reporting that schools in the US are buying equipment to unlock cell phones from companies like Cellebrite:
Gizmodo has reviewed similar accounting documents from eight school districts, seven of which are in Texas, showing that administrators paid as much as $11,582 for the controversial surveillance technology. Known as mobile device forensic tools (MDFTs), this type of tech is able to siphon text messages, photos, and application data from students’ devices. Together, the districts encompass hundreds of schools, potentially exposing hundreds of thousands of students to invasive cell phone searches.
The eighth district was in Los Angeles.
The US Department of Justice unraveled a dark web child-porn website, leading to the arrest of 337 people in at least 18 countries. This was all accomplished not through any backdoors in communications systems, but by analyzing the bitcoin transactions and following the money:
Welcome to Video made money by charging fees in bitcoin, and gave each user a unique bitcoin wallet address when they created an account. Son operated the site as a Tor hidden service, a dark web site with a special address that helps mask the identity of the site’s host and its location. But Son and others made mistakes that allowed law enforcement to track them. For example, according to the indictment, very basic assessments of the Welcome to Video website revealed two unconcealed IP addresses managed by a South Korean internet service provider and assigned to an account that provided service to Son’s home address. When agents searched Son’s residence, they found the server running Welcome to Video.
To “follow the money,” as officials put it in Wednesday’s press conference, law enforcement agents sent fairly small amounts of bitcoin—roughly equivalent at the time to $125 to $290—to the bitcoin wallets Welcome to Video listed for payments. Since the bitcoin blockchain leaves all transactions visible and verifiable, they could observe the currency in these wallets being transferred to another wallet. Law enforcement learned from a bitcoin exchange that the second wallet was registered to Son with his personal phone number and one of his personal email addresses.
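The tracing step described above amounts to walking the public transaction graph outward from a wallet you control. A toy sketch of that walk (illustrative only; real chain analysis parses the actual blockchain and applies address-clustering heuristics):

```python
# Toy "follow the money": find every wallet reachable from a marked
# payment. Transfers are (src_wallet, dst_wallet, amount_btc) tuples;
# the names are hypothetical.
transfers = [
    ("agent_wallet", "site_wallet_1", 0.01),     # small marked payment
    ("site_wallet_1", "exchange_wallet", 0.01),  # site cashes out
    ("other_wallet", "unrelated", 5.0),          # noise
]

def downstream(start, txs):
    """Return every wallet reachable from `start` via transfers."""
    reached, frontier = set(), [start]
    while frontier:
        wallet = frontier.pop()
        for src, dst, _amt in txs:
            if src == wallet and dst not in reached:
                reached.add(dst)
                frontier.append(dst)
    return reached

print(downstream("agent_wallet", transfers))
```

Once the trail reaches a wallet held at an exchange, a legal request to the exchange ties the on-chain address to a real-world identity, which is exactly what happened here.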
Remember this the next time some law enforcement official tells us that they’re powerless to investigate crime without breaking cryptography for everyone.
The FBI announced that it dismantled a large Internet advertising fraud network, and arrested eight people:
A 13-count indictment was unsealed today in federal court in Brooklyn charging Aleksandr Zhukov, Boris Timokhin, Mikhail Andreev, Denis Avdeev, Dmitry Novikov, Sergey Ovsyannikov, Aleksandr Isaev and Yevgeniy Timchenko with criminal violations for their involvement in perpetrating widespread digital advertising fraud. The charges include wire fraud, computer intrusion, aggravated identity theft and money laundering. Ovsyannikov was arrested last month in Malaysia; Zhukov was arrested earlier this month in Bulgaria; and Timchenko was arrested earlier this month in Estonia, all pursuant to provisional arrest warrants issued at the request of the United States. They await extradition. The remaining defendants are at large.
It looks like an impressive piece of police work.
Details of the forensics that led to the arrests.
According to a new CSIS report, “going dark” is not the most pressing problem facing law enforcement in the age of digital data:
Over the past year, we conducted a series of interviews with federal, state, and local law enforcement officials, attorneys, service providers, and civil society groups. We also commissioned a survey of law enforcement officers from across the country to better understand the full range of difficulties they are facing in accessing and using digital evidence in their cases. Survey results indicate that accessing data from service providers—much of which is not encrypted—is the biggest problem that law enforcement currently faces in leveraging digital evidence.
This is a problem that has not received adequate attention or resources to date. An array of federal and state training centers, crime labs, and other efforts have arisen to help fill the gaps, but they are able to fill only a fraction of the need. And there is no central entity responsible for monitoring these efforts, taking stock of the demand, and providing the assistance needed. The key federal entity with an explicit mission to assist state and local law enforcement with their digital evidence needs—the National Domestic Communications Assistance Center (NDCAC)—has a budget of $11.4 million, spread among several different programs designed to distribute knowledge about service providers’ policies and products, develop and share technical tools, and train law enforcement on new services and technologies, among other initiatives.
From a news article:
In addition to bemoaning the lack of guidance and help from tech companies—a quarter of survey respondents said their top issue was convincing companies to hand over suspects’ data—law enforcement officials also reported receiving barely any digital evidence training. Local police said they’d received only 10 hours of training in the past 12 months; state police received 13 and federal officials received 16. A plurality of respondents said they only received annual training. Only 16 percent said their organizations scheduled training sessions at least twice per year.
Here’s the report.