An Analysis of the EU’s Cyber Resilience Act
A good—long, complex—analysis of the EU’s new Cyber Resilience Act.
Daniel Popescu • September 27, 2024 12:12 AM
@Bruce – very interesting, thanks. Quite intelligible for technical folks like us, but I wonder whether the CRA says anything about the supply chains behind the products it covers.
Winter • September 27, 2024 3:12 AM
From the analysis:
Consequently, the use of a product by a nation-state, a terrorist group, or freedom fighters during armed conflict or for self-defense against foreign or domestic threats anywhere in the world could be “non-technical” sufficient grounds for removing that product from the European Union market if the European Commission or a national market surveillance authority can be persuaded to instigate a fundamental rights assessment.
Furthermore, as the CRA includes integrated components of products in its scope, manufacturers of specialized processors, sensors, or algorithms could find that all products utilizing these components might be removed from the European Union due to their use in other jurisdictions.
I am not sure whether I understand this correctly, but my reading is that if a product is used by “evil” people to harm other people or put them in danger, it can be removed from the market.
That does sound reasonable. If a product is used to kill people, I would advocate imposing policies to stop that use, up to and including removing it from the market.
This is also not new. The EU already bans the sale of products that are used for torture in foreign countries.
The EU simply does not recognize a legal right to sell products that are mainly[1] used to harm people, anywhere in the world. Just as EU countries have strong regulations for sales and export of arms.
However, the conflation of cybersecurity and fundamental rights introduces a worrying mechanism, even if used infrequently, that would allow the adoption of corrective and restrictive measures up to the removal of products from the European Union, on the basis of the degree of their compliance with fundamental rights law.
Fundamental rights are fundamental. I do not really see why denying people their fundamental rights differs from harming them in other ways. There are differences in degree between murder or torture on the one hand and censorship or privacy violations on the other, but when does a difference in degree become a difference in principle? Courts of law have worked with these principles since their inception.
I would say that products used to deny people their fundamental rights, e.g. to invade women's privacy and bodily autonomy by tracking their periods without consent, should rightfully be banned.
[1] “mainly” used here in the sense that the amount of harm far outweighs the benefits.
Clive Robinson • September 27, 2024 5:16 AM
The aim of the EU policy is to,
“Minimize Harm to Users”
As is increasingly the case, the US state of California has legislation with similar basic aims but different, and usually more limited, implementations.
In this case, Governor Gavin Newsom has just signed CA AB2426,
https://digitaldemocracy.calmatters.org/bills/ca_202320240ab2426
Whilst very limited in what it does, quite a few think it will have rather more of an impact. In fact, some think it may be sufficient to force a radical change in the primary method of “selling software”.
But it should also stop the likes of Microsoft and other major Silicon Valley companies limiting the life of their software by tying it to hardware changes, and vice versa. That tying forces into being the faux-upgrade market that, in effect, cuts the expected life of a consumer computer to just a year or three, rather than the ten to twenty years of other consumer electronics.
Clive Robinson • September 29, 2024 3:18 PM
@ Bruce,
Speaking of US legislation that might be compared to EU legislation in terms of privacy, protection, harm limitation, and similar protections for users of information systems, you might find this of interest,
“In this piece, I will highlight the price of ignoring the GDPR. Then, I will present several conceptual flaws of the GDPR that have been acknowledged by one of the lead architects of the law. Next, I will propose certain characteristics and design requirements that countries like the United States should consider when developing a privacy protection law. Lastly, I provide a few reasons why everyone should care about this project.”
https://arstechnica.com/tech-policy/2024/09/opinion-how-to-design-a-us-data-privacy-law/
Who? • September 30, 2024 5:33 AM
@ Clive Robinson
“… introduces the ability for the European Commission to restrict the sale of technology products (including both hardware and standalone software products) from the European market if the use of that technology enabled the breach of fundamental rights.”
I see it the other way: restricting the sale of technology products from the European market if the use of that technology enables, let us say, the breach of fundamental rights by “put here the name of your favourite horseman of the infocalypse”.
Or, to put it more simply, …if the use of that technology enables the breach of a government’s “fundamental right” to listen to the communications of its citizens.
I certainly do not trust the EU’s rulings in these matters.
traced by ip • October 1, 2024 5:17 PM
Germany’s Constitutional Court strikes down some surveillance powers
https://news.yahoo.com/news/germanys-constitutional-court-strikes-down-100322119.html
New limits on the storage of personal data by police are also needed, the court ruled. For example, previous accusations alone against a person are not enough to establish a sufficiently likely relationship to future criminal offences.

The Constitutional Court previously struck down portions of the law on police surveillance powers in 2016. Lawmakers revised the statutes in 2017 in response to the ruling, but the court’s decision on Tuesday again found constitutional problems with the latest version of the law.
Clive Robinson • September 26, 2024 8:38 PM
As the article notes, the Act

“… introduces the ability for the European Commission to restrict the sale of technology products (including both hardware and standalone software products) from the European market if the use of that technology enabled the breach of fundamental rights.”

Which should raise an immediate thought of,

“Hang on a moment!”
Because all technology is agnostic to use, and it is the use by a directing mind that a supposedly impartial observer sees as “good or bad”.
For instance, the paddle of a boat or canoe is designed for efficient transfer of muscle movement into pushing water (generally) backward, and thus the boat or canoe forward, by the old “equal and opposite reaction” law of motion.

However, the paddle can also be used as a very effective weapon against someone.
Would you ban the sale of boat paddles just because somebody might use it as a weapon?
In the UK a change in law has made the sale or ownership of “zombie knives” illegal. The problem is that the definition of such knives also covers many hand tools, including saws, and likewise kitchen implements. No doubt one major result will be pointless prosecutions and considerable harm done to individuals for no reason.
Thus ordinary people will be criminalised for no good reason, whilst the “alleged targets” of the legislative change will just find any one of many thousands of other everyday items to carry legally and use unlawfully as weapons.
This sort of legal nonsense serves only one provable purpose, which is to bring the legislative process into disrepute and make fools of those bringing in such legislation.
But also you have a secondary issue,
“All systems are an assemblage of parts and sub systems.”
Generally the further down the tree the more general or non specific a part or sub system is.
Thus where do you draw the line as to when a technology can or can not breach fundamental rights?
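The scope problem can be made concrete. Here is a minimal sketch (all product and component names are invented for illustration) of why a ruling against one low-level component, as the analysis suggests the CRA allows, propagates up the assembly tree to every finished product that integrates it:

```python
# Hypothetical sketch: which finished products are exposed if one
# integrated component is removed from the market?
# All product/component names below are invented for illustration.

from collections import defaultdict

# Map each assembly to the components it directly integrates.
bill_of_materials = {
    "smart_doorbell":    ["camera_module", "wifi_soc"],
    "drone":             ["camera_module", "flight_controller"],
    "camera_module":     ["image_sensor", "isp_firmware"],
    "flight_controller": ["imu_chip"],
}

def products_containing(component: str) -> set[str]:
    """Walk the dependency graph upward: every assembly that
    directly or transitively integrates `component`."""
    # Invert the tree: component -> assemblies that use it.
    used_by = defaultdict(set)
    for assembly, parts in bill_of_materials.items():
        for part in parts:
            used_by[part].add(assembly)
    affected, frontier = set(), {component}
    while frontier:
        nxt = set()
        for item in frontier:
            for parent in used_by[item]:
                if parent not in affected:
                    affected.add(parent)
                    nxt.add(parent)
        frontier = nxt
    return affected

# A ruling against the low-level image sensor reaches every
# product built on top of it, however far up the tree.
print(sorted(products_containing("image_sensor")))
# -> ['camera_module', 'drone', 'smart_doorbell']
```

The further down the tree a part sits, the more products a single restriction touches, which is exactly the line-drawing problem above.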
But there is another aspect to consider, which is going to cause problems. As noted, a technology is agnostic; it is the use that “is seen” as good or bad.
Consider enforced backdoors in information-security systems, the simplest being the requirements of the US CALEA (Communications Assistance for Law Enforcement Act) of 1994. As we know, there is no way to make any such capability “NOBUS” (“nobody but us”); if it is there, it will be used for both good and bad, and thus will represent a “breach of fundamental rights”.
And as was seen with the Greek wiretapping case around the 2004 Athens Olympics, if you build systems to have a capability, even if you do not have the capability enabled, that does not stop “persons unknown” getting it installed covertly at a time after the system has been installed and commissioned.
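The NOBUS point can be illustrated with a toy sketch (entirely hypothetical code, not any real system’s design): once a mandated bypass exists in the verification path, it accepts whoever presents the right value, with no way to check that the presenter is “us”:

```python
# Toy sketch of a MAC check with a mandated bypass key.
# Entirely hypothetical; it illustrates why a built-in intercept
# capability cannot be limited to "nobody but us" (NOBUS).

import hmac
import hashlib

USER_KEY = b"user-chosen-secret"
INTERCEPT_KEY = b"mandated-master-secret"   # the backdoor

def tag(key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA256 authentication tag."""
    return hmac.new(key, message, hashlib.sha256).digest()

def accepts(message: bytes, mac: bytes) -> bool:
    # The check cannot tell a court-ordered intercept from a
    # criminal who stole or rediscovered INTERCEPT_KEY: both
    # present exactly the same bytes.
    return (hmac.compare_digest(mac, tag(USER_KEY, message)) or
            hmac.compare_digest(mac, tag(INTERCEPT_KEY, message)))

msg = b"private message"
# The legitimate user authenticates...
assert accepts(msg, tag(USER_KEY, msg))
# ...but so does anyone, anywhere, who holds the mandated key.
assert accepts(msg, tag(INTERCEPT_KEY, msg))
```

The system has no notion of “authorised” use of the second key; possession is authorisation, which is why any such capability will eventually serve both the good and the bad.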