Friday Squid Blogging: Squidfall Safety

Watchmen supporting material.

As usual, you can also use this squid post to talk about the security stories in the news that I haven't covered.

Read my blog posting guidelines here.

Posted on December 6, 2019 at 4:20 PM • 134 Comments

Comments

D-503 • December 6, 2019 5:08 PM

There are a lot of threats to the open web these days, but the privatization of the .org top level domain hasn't gotten the attention it deserves:
https://www.eff.org/deeplinks/2019/12/we-need-save-org-arbitrary-censorship-halting-private-equity-buy-out
https://www.theregister.co.uk/2019/12/03/internet_society_org/
https://www.theregister.co.uk/2019/11/29/isoc_ceo_dot_org_sale/
What's not to like about it? Not just the theft of hundreds of millions of dollars from charities, NGOs, scientific bodies, educational institutions, human rights groups, etc., but also economic rent in perpetuity. And a body blow to civil society. (Given the threat of their websites being abruptly unlinked or redirected without due process by whatever firm ends up owning the .org TLD, NGOs will think twice before saying anything that might embarrass the authorities.)
The Electronic Frontier Foundation is hosting a petition.

Wesley Parish • December 6, 2019 5:27 PM

Just when you thought it was safe to go back into the ...

How to fool infosec wonks into pinning a cyber attack on China, Russia, Iran, whomever
https://www.theregister.co.uk/2019/12/05/fooling_attribution_breadcrumbs/

Black Hat Europe Faking digital evidence during a cyber attack – planting a false flag – is simple if you know how, as noted infosec veteran Jake Williams told London's Black Hat Europe conference. [...]
The simplest of all the fake breadcrumbs is the origin of the attacker's traffic. Referring to now-defunct threat intel firm Norse Corp's rather dubious "DDoS attack map" from 2015 which showed the points from whence cyber attackers were launching their attacks ("100 per cent was done by IP," sniffed Williams), the infosec consultant said it was trivially easy to rent infrastructure in countries known for harbouring purveyors of online badness.[...]
PowerShell, long known as a favourite of malicious folk, can also be a useful tool in laying a trail of false breadcrumbs. Williams said you can move PowerShell transcripts from one machine to another – say, an attacker's box to target server. Being a text log of all PowerShell commands and outputs during a session, these transcripts can be useful information for investigators... and those looking to deceive them.[...]
Simple really. Cui bono? To whom is it an advantage? Who is using it to take advantage of you?

Then I'll get on my knees and pray
We don't get fooled again

https://www.youtube.com/watch?v=zYMD_W_r3Fg

Meet the new boss
Same as the old boss

SpaceLifeForm • December 6, 2019 5:32 PM

Here's another, 'well, no kidding'

(I was going to use the sherlock line, but I'll be nice)

https[:]//www.zdnet.com/article/fbi-recommends-that-you-keep-your-iot-devices-on-a-separate-network/

"The FBI says owners of IoT (Internet of Things) devices should isolate this equipment on a separate WiFi network, different from the one they're using for their primary devices, such as laptops, desktops, or smartphones."

[Good luck getting non-techies to understand multiple routers in their home]


SpaceLifeForm • December 6, 2019 5:45 PM

@ en ess eh tee eh oh

It is better on paper.

And, you know that.

Well, if you have a mutt.

Good luck with cats.

Table A. Turn • December 6, 2019 5:48 PM

NSA Phone Surveillance Program Faces an End as Parties Come Together

Eighteen years after George W. Bush [1] and the National Security Agency began secretly and warrantlessly collecting the phone records of every American, the House of Representatives is taking a major step to kill the vestige of one of the most controversial domestic surveillance programs in American history. 

“We would not be in this position today if Edward Snowden had not revealed the bulk collection program,” said Liza Goitein of the Brennan Center for Justice.
https://www.thedailybeast.com/house-democrats-deal-death-blow-to-domestic-nsa-phone-spying

The Obama administration was forced to admit to spying on the personal phones of 35 world leaders[2].
Senior Republicans have been on the receiving end of opposition politicians using intelligence agencies for political purposes. The latest impeachment report from Congress reveals representatives spied on journalists and their political opposition within Congress[3][4].
Who remembers the out-of-control CIA spying fiasco against speaker Nancy Pelosi and her staff?

In summary, if you give intelligence agencies too much unbridled power, they will always turn inward and be weaponized against their political opposition.
The drunk-on-'total control' Chinese Communist Party is spying in countries they do business with throughout the world. You can't keep up with the stories of data-mining abuses throughout Africa. The Chinese intelligence officer defecting to Australia should be asked to brief western parliaments, where ignorance of this is widespread.

Likewise, Congress needs to gain some empathy and protect the privacy of ALL Americans from Silicon Valley. Just cut-and-paste the GDPR together with the new CA privacy law, guys. Just don't expect any leadership from the White House[5].

[1] student George W. Bush acting under a Dick Cheney. What a dark legacy to leave. It took only 20 years to recover. Such disgusting, abusive and paranoid leadership!

[2] only when it became personal did Germany’s Angela Merkel get upset

[3] the report's timing was perfect as it induced a 180-degree turn from their traditional stand of blindly supporting law and order

[4] don't be surprised if Senate Judiciary Chairman Lindsey Graham (R-S.C.) obtains call records of Schiff and other media sources
https://www.theblaze.com/newsletter_writeup/schiff-obtained-journalist-john-solomons-phone-records-and-nobody-in-the-media-seems-to-care

[5] what the hell is privacy anyways?

SpaceLifeForm • December 6, 2019 6:03 PM

@ en ess eh tee eh oh

Again, you are not saying anything I do not know. Again, you need to drop some pertinent clues. Otherwise, I really don't care at this point. If you are really who you claim to be, you need to be looking *inside*.

REPEATING: LOOK INSIDE.

THE PROBLEM IS INSIDE.

If you are as you claim, then you may know I pointed out this problem years ago.

THE PROBLEM IS INSIDE.

Maybe you are too compartmented, but can see some 'stuff'. Maybe you should go to IG.

Maybe you should become a Whistleblower.

I can understand why you would not, BTW.


SpaceLifeForm • December 6, 2019 6:57 PM

Remember, attribution is hard.

https[:]//www.theregister.co.uk/2019/12/05/fooling_attribution_breadcrumbs/

abzi • December 6, 2019 7:05 PM

Something of an idea that *may possibly* be helpful is to use a screen recorder to record only a few various hand-picked sections of mundane GUI processes, then save them as animations and play them back when you need to block some automatic hackerthingies.

More on this later

SpaceLifeForm • December 6, 2019 8:56 PM

@ en ess eh tee eh oh

One more point.

I have talked to FBI, and have been in Federal Court.

If you think they do not know who I am, then you are missing some clues.


Mr. Peed Off • December 6, 2019 10:06 PM

@ Wesley Parish
@SpaceLifeForm

Attribution is relatively unimportant.

First job is to keep the unwanted out.

Second job is to repair any damage caused.

Fault....Who connected to the malware and spy net?

Clive Robinson • December 6, 2019 11:42 PM

@ Wesley Parish, SpaceLifeForm,

Remember, attribution is hard.

Just how long is it we've been warning about this? As well as documenting ways it could be done...

If my memory serves me correctly, about the same time as the then US executive were talking about "going kinetic" and having an "Internet off switch"...

As I've remarked before, it takes around eight years for things you first hear on this blog to get "mainstream attention", be it academic or media...

Sed Contra • December 7, 2019 12:34 AM

Re: Watchmen, squid

So the real question is not “who will watch the watchmen”, but rather “who will cook scampi for the watchmen” .

Clive Robinson • December 7, 2019 5:09 AM

@ SpaceLifeForm,

You appear to have a "Nasty-cup-of-teaho" behaving in a below-the-bridge way.

This is not an unknown problem here; one or two others, including our host, have suffered from possibly the same low-living life form trying to bounce its rocks.

Wael • December 7, 2019 1:05 PM

@echolalia,

I feel the link above is helpful for reference for averting mistranslation errors.

I bough down to ewe! Most useful link this weak witch will help Ewesers, my deer. One can chews so many of these pictures to perfect translations!

Ewe are so unlike some people that're "colder than a which"s did", and get people more confused than a "fort in a fan factory"!

vas pup • December 7, 2019 2:08 PM

@Table A. Turn • December 6, 2019 5:48 PM
Usually, the swamp starts acting on legislation as soon as they [legislators] become victims of the actions which previously negatively affected millions of their constituents.

SpaceLifeForm • December 7, 2019 2:25 PM

@ Clive

"Nasty-cup-of-teaho"

'en ess eh tee eh oh' may actually be doing a comm.

I'm working the other bridge end.

We can split the profits ;-)

Anders • December 7, 2019 4:27 PM

@Clive

It's interesting to learn the slang.
What's the "Nasty-cup-of-teaho" ? :)

ps. everything is OK now?

Anders • December 7, 2019 4:28 PM

Some real FUN.

securelist.com/hacking-microcontroller-firmware-through-a-usb/89919/

SpaceLifeForm • December 7, 2019 5:15 PM

@ Clive

"Encryption is not enough"

Yep, yep, yep.

Must circumvent traffic analysis.

As an old philosopher noted, when you come to the fork in the road, take it.

Wael • December 7, 2019 5:45 PM

@Anders,

If aliens visit the Earth, they do it via Bell 103 ;)

Of course! At about 1/3 the speed of light, it's the only thing that makes sense! The other interpretation: Bell 103 = Bell 206/2 is trivial.

Show that Bell 103 goes at 1/3 the speed of light, and get a +300 :) [1]

[1] Hint: 300 BPS :)

Clive Robinson • December 7, 2019 6:35 PM

@ SpaceLifeForm,

We can split the profits ;-)

Ahh for whom the bridge tolls ;-)

With regards,

As an old philosopher noted, when you come to the fork in the road, take it.

True but he did not say which one to take, left or right, or which way to go at the second or more forks...

If you flip and flop you kind of maintain a course; at the other extreme of always making the same decision you just go in circles. In between, well, you get that drunkard's walk, which en masse moves generally in the same collective direction, especially in, say, a nice fresh cup of tea.

Clive Robinson • December 7, 2019 7:31 PM

@ Bruce and the usual suspects,

This on potential 5G threats might be of interest,

https://securelist.com/5g-predictions-2020/95386/

It's important to note that the base of what users will be told is "5G" will actually be "LTE" across "4G", which will give users' phones network interoperability across existing networks.

5G itself is not yet standardized, and different countries are allocating non-interoperable frequency blocks. Not just non-interoperable from country to country, but non-interoperable between service providers in the same co-located geo-regions.

Worse, 5G millimetric bands require small cell sizes across very wide frequency ranges to get even fractionally close to the promised user bandwidths. Which means each service provider will have to put in vastly more cell masts, giving a ridiculous number of base stations, almost the same number as there are lamp posts in urban areas. As for high-density areas such as major business districts, the cell coverage areas could be down to the same size or less than WiFi networks and be different on every floor of a building.

I for one will not be an "early adopter" of 5G, because its "always on" nature and small cell site coverage mean that the level of passive surveillance capability built in is several orders of magnitude greater than our current "smart devices" on 3G and 4G. In fact I may miss 5G out entirely.

Due to "embedded infrastructure" issues such as "smart traffic lights", "smart meters" and a lot of other utility "smart infrastructure", 2G is still going to be around for some time, as will 4G. So as I don't do "bandwidth heavy" activities, 4G will be sufficient for my needs, and 4G LTE will be sufficient for most users today.

There is also another issue: whilst sticking 5G millimetric sites on lamp posts won't be that difficult, building up the necessary "back haul" to deliver the available bandwidth will be much harder to put in place and in effect will be a slow process.

Frankly • December 7, 2019 8:17 PM

Door security system (new) "don't worry if someone watches you enter the code. Add any extra digits before or after your code and Wyze Lock will still unlock."

Doesn't this greatly compromise the security against a brute force attack?

lurker • December 7, 2019 9:34 PM

Another week, another SW update ...
This time the update was applied by the vendor, SAP, without the instant knowledge of the client (the lawyers will argue the TOS about prior consent). We don't know if an "unreserved apology" is sufficient to assuage exposed citizens in a nation that doesn't have a second amendment.
https://seclists.org/dataloss/2019/q4/163

Wael • December 7, 2019 9:57 PM

@Clive Robinson,

I for one will not be an "early adopter of 5G…

You may have the choice now.

Wael • December 7, 2019 10:05 PM

100 times the speed of light.
Been a long night!

Final hint: BPS doesn't stand for Bits Per Second (Bauds:) it stands for: B___ of P___ per Second. The 'B' is a multiplier and the 'P' is a unit of distance.

Clive Robinson • December 8, 2019 12:28 AM

@ Frankly,

Doesn't this greatly compromise the security against a brute force attack?

Not half... It's decades since I worked it out after seeing a film with a scene of somebody getting back through a security door in panic just pressing key after key till they got the green light.

If I remember correctly it came to 173x3 key entries instead of 999x3 key entries.

After all, it's simple to see that 123456789 matches 123, 234, 345, 456, 567, 678, 789. So seven guesses for the same number of key presses as three guesses if this "security feature" were not in place...
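To put rough numbers on that weakening (a sketch, assuming a 3-digit code and a lock that checks every sliding 3-digit window, per the example above): a de Bruijn sequence over the digits 0-9 packs every possible 3-digit code into a single stream of key presses.

```python
def de_bruijn(k: int, n: int) -> str:
    """Shortest cyclic string containing every length-n word over k digits."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return "".join(map(str, seq))

s = de_bruijn(10, 3)                     # 1000 digits long
stream = s + s[:2]                       # unroll the cycle for a linear keypad
codes = {stream[i:i + 3] for i in range(len(stream) - 2)}

print(len(stream), len(codes))           # 1002 presses cover all 1000 codes
```

Against a lock that resets after each full attempt, sweeping all 1000 codes costs 3000 presses; the sliding window is what collapses that to roughly a third.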

Clive Robinson • December 8, 2019 12:49 AM

@ Wael,

You may have the choice now.

True, a bit like my health...

@ ALL,

Thanks for the "get well" wishes.

As I said it was really quite excruciatingly painful, but this time round not serious.

Mad as it might sound, the temporary cure is to "wear two pairs of wooly/thermal socks in bed at night, instead of bare feet", keep taking exercise, and take some mild CNS pain killers if needed... until I get to see the cardiac specialist for a change in prescription.

Wael • December 8, 2019 1:45 AM

@Clive Robinson,

"wear two pairs of wooly/thermal socks in bed at night, instead of bare feet"

What's up with that? I heard a while back that it's not healthy to sleep with the feet uncovered. Didn't make sense to me.

Clive Robinson • December 8, 2019 2:32 AM

@ Wael,

Bits Per Second (Bauds:)

How loudly should I shout "Noooo" ;-)

For those who might be puzzled there is a difference between "bits per second" and "bauds".

Firstly, "baud" is only used with respect to the "transmission channel", that is "after the modulator", and refers to "the transmission rate" in the number of symbols, not "bits".

Secondly, "bits" is only used in the digital circuits "before the modulator" and refers to "the information rate", not the transmission rate in "symbols".

As a simple example, let's assume a VDU terminal with serial output one bit at a time at a rate of 9600 bit/s. It goes into a modem for transmission into a telephone channel that has a maximum bandwidth of 3kHz.

Obviously, due to the Nyquist frequency, the channel will not support 9600 data symbols a second, just under 1500 symbols at best. Thus the trick is to make each symbol hold more than one bit of information. One way to do this is by amplitude modulation.

One bit per symbol needs two levels (2^1) at 9600 baud, two bits four levels (2^2) at 4800 baud, three bits eight levels (2^3) at 3200 baud. Eventually, with 8 bits and two hundred and fifty six levels (2^8), you get down to 1200 baud, which will go comfortably down the 3kHz transmission channel.

In practice Amplitude Shift Keying (ASK) modulation is not the best way to go. Phase modulation, like bi-phase in a synchronous system, will give the best performance. If the channel is non-synchronous, Frequency Shift Keying (FSK) is often used. But you frequently end up using a mixture of both Amplitude and Phase Shift Keying (APSK), as this gives the minimum number of phase angles and amplitude levels.

The point is that for each constellation point in APSK the basic symbol rate remains the same. Thus an 8-bits-to-one-symbol mapping gives a sixteen-level by sixteen-phase constellation of 256 points, and the 9600 bit/sec "information rate" becomes a 1200 baud symbol rate that will go down the 3kHz transmission channel, albeit at a much reduced signal-to-noise ratio on a per-bit basis, which increases the Bit Error Rate (BER).

You can go up to an arbitrarily high number of amplitude levels and phase shifts, provided you remain within the Shannon channel limits. However, as you do, your transmitted signal looks more and more like "random noise" and its usable range at any given power diminishes. One side effect of this is, if there are other intelligences in the universe they will almost certainly be using such modulation techniques, and, much as we are doing with GSM and other cellular networks, reducing output power and thus cell range in order to get much higher information rate density for any given geographic area or space volume. Thus SETI has only a very small time window in which to detect them (less than 150 years based on human communications technology development).
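The level-to-baud arithmetic above can be reproduced in a couple of lines (a sketch using the 9600 bit/s figure from the example; real modems add FEC and framing overhead on top of this):

```python
import math

def symbol_rate(bit_rate: float, levels: int) -> float:
    """Baud (symbols/sec) needed to carry bit_rate through a
    constellation with the given number of levels/points."""
    bits_per_symbol = math.log2(levels)
    return bit_rate / bits_per_symbol

# Clive's ladder: 2 levels -> 9600 baud, 4 -> 4800, 8 -> 3200, 256 -> 1200
for levels in (2, 4, 8, 256):
    print(levels, symbol_rate(9600, levels))
```

The 256-point case is the one that fits the 3kHz telephone channel: 9600 bit/s at 8 bits per symbol is only 1200 symbols per second on the line.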

Clive Robinson • December 8, 2019 3:11 AM

@ Wael,

I heard a while back that it's not healthy to sleep with the feet uncovered. Didn't make sense to me.

It's all to do with chemistry and phase change with temperature.

Basically, when you sleep your blood pressure, and thus flow to the extremities like your hands and feet, drops quite a bit over night. Which, with the modern habit of using duvets, causes problems.

Back in times past people were shorter and slept in beds long enough that the sheets and blankets would remain tucked in at the bottom, thus trapping warmth around the feet. However, people have grown quite a bit over the past century but beds have not really got any longer. Further, when sleeping out, people used to put the blanket down diagonally and sleep on their back. Thus their head would be at one corner and the opposite corner flap would be folded up over the feet, then the two side flaps folded up and across the body, effectively covering the shoulders as well. Thus the only bit sticking out was the face or part of it, keeping most body heat in.

So whilst most people these days keep their hands under cover in the warmth, their increased size and the use of duvets have caused feet to stick out into the air and cool down towards room temperature, where certain chemicals change their phase state and don't work the way they normally would, thus causing medical issues.

Wael • December 8, 2019 3:25 AM

@Clive Robinson,

How loudly should I shout "Noooo" ;-)

Splitting hairs on me! Assuming two symbols, ma man.

Anders • December 8, 2019 5:37 AM

@Clive @Wael

Baud may be a complicated thing at first glance, but this is how it was
organized in the Teletype era, and this gives a better overview of it.

300 baud = 30 characters/sec * 10 bits/char (1 start, 8 data, 1 stop)
110 baud = 10 characters/sec * 11 bits/char (1 start, 8 data, 2 stop)
75 baud = 10 characters/sec * 7.5 bits/char (1 start, 5 data, 1.5 stop)
50 baud = 6.6 characters/sec * 7.5 bits/char (1 start, 5 data, 1.5 stop)
45.45 baud = 6 characters/sec * 7.5 bits/char (1 start, 5 data, 1.5 stop)

So although the 75 and 110 baud channels have equal character speeds,
the transmission unit length in bits is different due to the different
number of data and stop bits.

Anders • December 8, 2019 9:25 AM

But now I need an old-school BASIC language hacker's help.

How can I add a new line to the program from *INSIDE* the running program?
So that I RUN a program, and after that LIST shows a new line in the
program? Basically, self-modifying code.

Deleting program lines from *INSIDE* the running program is easy,
a piece of cake. But I need to add a new line. Or a whole new program.

Dialect is MS BASIC (=Altair, IBM PC etc).

Wael • December 8, 2019 10:40 AM

@Anders,

How can i add a new line to the program from *INSIDE* the running program?

Many moons ago, on a Commodore 128 (or 64, VIC-20/16), games were entered using BASIC and 'opcodes' in hexadecimal, in conjunction with PEEK and POKE commands. This Stack Exchange article may give some ideas. They mention the command 'Alter', which I don't recall ever using.

Anders • December 8, 2019 10:58 AM

@Wael

Thanks, no luck yet, because there are still some limits.
It's IBM PC Cassette BASIC, so no possibility to LOAD or SAVE
or MERGE.
I want to download a new program over the COM port (trivial)
and then overwrite the current downloader.

Sigh, no luck yet. I want to avoid altering the memory
in tokenized format, etc.

Wael • December 8, 2019 10:58 AM

@Anders,

Deleting program lines from *INSIDE* the running program is easy, piece of cake.

Share a slice of the cake and tell us how!

Anders • December 8, 2019 11:12 AM

new
Ok 
10 print "Test"
20 print "Test2"
30 delete 20-30
list
10 PRINT "Test"
20 PRINT "Test2"
30 DELETE 20-30
Ok
run
Test
Test2
Ok 
list
10 PRINT "Test"
Ok

Wael • December 8, 2019 11:19 AM

@Anders,

I want to download a new program over the COM port (trivial) and then overwrite the current downloader.

I'm not sure I understand. You want to modify the downloader after you download a program? What's the relationship between the downloader and the new program, and why modify it 'after', not 'before'?

Then again, why self-modifying, and not 'post processing'? Like with a Perl script or something? Obviously there're some run-time components that you need to change, but it seems you need to post-process something, unless the downloaded program downloads something else.[1]

By the way, as you know, self-modifying code is more easily done with low-level programming languages, like 'C' or Assembler.

[1] Or... you're a free-loader :)

Who? • December 8, 2019 11:21 AM

@ Wael, Anders

I think using PEEK and POKE is the easier way:

https://www.atarimagazines.com/compute/issue20/197_1_Self-Modifying_Programs_In_BASIC.php

I would have suggested using a Spectrum or MSX, writing the equivalent in Zilog Z80A assembler, and calling the code from the BASIC program using "RANDOMIZE USR" instead. In few words, PEEK and POKE are your friends.

You may have some luck writing the to-be-added code in a REM line and trying to convert the REM token itself into a white space with a single POKE (if you are working on a Spectrum) or three POKEs on other platforms.

Anders • December 8, 2019 11:33 AM

@Wael

Reason is easy: IBM PC Cassette BASIC (without actually having a cassette interface).

So there are no means to save the program anywhere whatsoever, nor to load it from somewhere by normal means (disk etc).

And I'm lazy; I don't want to spend half a day entering a new program manually.
So I want to write a SMALL downloader, load the new program over the COM port and then replace the loader with the downloaded program.

But no easy solution yet...

Who? • December 8, 2019 11:38 AM

@ Anders

Perhaps the easiest way is coding the downloaded program into DATAs at the end of the downloader itself, reading it from the DATA statements (using a FOR loop and a READ instruction) and overwriting the downloader (POKE) from the beginning.
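That DATA-and-overwrite scheme can be modeled in miniature (an illustration only: a flat Python bytearray stands in for BASIC program memory, not the real IBM PC tokenized layout, and all the names here are invented for the sketch):

```python
# Toy model: one buffer holds the downloader followed by its payload,
# and a PEEK/POKE-style loop overwrites the downloader in place.
memory = bytearray(b"DOWNLOADER CODE|")     # stand-in for the resident downloader
payload = b'10 PRINT "HI"'                  # stand-in for the DATA statements
memory += payload

payload_start = memory.index(b"|") + 1      # where the DATA section begins
for i in range(len(payload)):               # FOR I = 0 TO LEN-1
    memory[i] = memory[payload_start + i]   #   POKE I, PEEK(payload_start + I)

print(memory[:len(payload)].decode())       # the "program" area now holds the payload
```

The copy is safe here because the destination region never overlaps the source; on the real machine the equivalent copy happens over the tokenized program image, which is why line links and GOTO/GOSUB targets then need fixing up.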

Wael • December 8, 2019 11:44 AM

@Who?, @Anders,

coding the downloaded program into DATAs...

I was thinking along these lines. High level: peek from memory and poke into the new program.

Anders • December 8, 2019 11:52 AM

@Who?, Wael,

That means I must convert all the programs into tokenized format,
as that is the form of the bytes in memory.

Easier would be to write all the programs again in pure machine
code, POKE them into memory at the appropriate segment:offset and
run.

But I'm still searching for some trivially genius way :)

Wael • December 8, 2019 12:02 PM

@Anders,

But i'm still searching some trivial genius way :)

Adding a line in the middle of your program means you'll need to relocate the remainder of your original code by that much offset. Somehow I get the feeling you already know the answer or at least you partially know it.

Now tell me the full story ;)

Who? • December 8, 2019 12:19 PM

@ Wael

Not to mention Anders will need to fix the GOTO and GOSUB jumps once the new code is inserted. I do not see how inserting code will make it easier to hide the downloader.

Ok, a last approach. You can run something like:

 10 REM program starts here
 20 LET D=1: REM D=1 (executes downloader), D=0 (does not)
 30 IF (D=0) THEN GOTO 100
 40 REM downloader starts here
 50 [...]
100 REM now the code itself
110 POKE D_address, 0: REM D_address is address of variable D
120 [...]

Sorry, I am starting to feel I do not really understand the problem you are trying to solve.

Anders • December 8, 2019 12:42 PM

@Who?

"Sorry, I am start feeling I do not really understand the problem you are trying to solve"

The problem is very easy: imagine an (early) computer with no communication
means other than a serial port (RS-232). No disk or any other recording media.
This computer has a BASIC interpreter. Your task is to get a
10000-line BASIC program onto it without typing it in.

The same problem there once was with early computers: you have a computer, and you have to get a program into its memory. This was quite a task, so they didn't turn off the early computers for months.

MikeA • December 8, 2019 12:55 PM

@Clive -- Thank you for taking on the baud == bit-per-second myth.

_Maybe_ with your august personage weighing in, that misconception will lessen. Or maybe, once all of us who have ever dealt with the messy analog nature of "digital" communication die out, the world will just keep ticking along with whatever became popular just before the nerdpocalypse (albeit with various revisions to the "speeds up to" claims of wireless carriers, with no change in delivered speed).

In any case, on a note somewhat related to this blog, I have from time to time messed around with sending Bell 103 signals over a "modern" wireless link, with _very_ limited success. It appears that between failure to honor the echo-suppression disable, codecs that value highly compressed almost intelligible speech over fidelity, and chains of dubious conversions, it is difficult to get remotely usable error rates, even dropping to lower bit-rates. Adding enough forward error correction to be usable drops throughput even further, to the point where RFC 1926 (or even RFC 1149) looks more useful.

As you may guess, the point is to be able to use a computer old enough to _probably_ lack backdoors, over a COTS mobile, to help with the "security endpoints should be outside communications endpoints" problem. Having to do any sort of processing on the mobile phone itself is of course a non-starter. And, yes, still doesn't handle the Traffic Analysis issue, unless maybe I gather a bushel of prepaid mobiles via cash purchases by homeless folks in a variety of cities.

Because I'm currently avoiding some real work, and to forestall some objections: yes, I am aware that the Weitbrecht modem is possibly better supported and does not need full duplex. OTOH, mobiles that support it at all generally provide the ability to use the phone-native screen and keyboard as endpoints, which implies that a processor much faster than a vintage computer potentially has full access to the ciphertext (which, due to FEC, may provide quite a few cribs), and at the low data rate it has a _lot_ of time to work on it.

Wael • December 8, 2019 1:44 PM

@Andres,

Regarding Bell 103

I feel like stirring some trouble:

Q: When is 1/2 more trouble than 1?

A: Bell 103 vs. Bell 206.
Now let's start a discussion about Bell 103 != 1/2 Bell 206 ;)

I have a feeling someone is sweating bullets trying to dispute this 'myth'. Want to talk analog? I'm your huckleberry;)

SpaceLifeForm • December 8, 2019 1:49 PM

@ Anders

I'm guessing your BASIC Interpreter does not support APPEND

If it does, I would write a stub, with huge line numbers, that after loading, does a GOTO 10 (or another line number).

May need to RENUM.

APPEND is a misnomer.

It should be called OVERLAY.

But, it will depend on the interpreter semantics.

SpaceLifeForm • December 8, 2019 2:09 PM

@ Clive

"True but he did not say which one to take, left or right, or which way to go at the second or more forks..."

Which is the point. Routing traffic.

Eventually, hopefully the car (encrypted payload), will find a petrol station.

And another. And another.

Think NNTP.

Another car (same payload), is in another twisty maze of passages, all different.


Wael • December 8, 2019 2:13 PM

@Anders

Excuse the inadvertent letter transposition. The eyes are cloudy, the skull is heavy, and the mood is goofy.

@Clive Robinson,

The harsh dame is working extra hard! Can you invite her over and give me a break?

Anders • December 8, 2019 2:40 PM

@Wael

No problem, i hardly even noticed it ;)

But we need to bring along also Bell 101, because
from this EVERYTHING started, including social media ;)

en.wikipedia.org/wiki/Community_Memory

Clive Robinson • December 8, 2019 3:37 PM

@ Wael,

Splitting hairs on me! Assuming two symbols, ma man.

Not really. Whilst RS-232 serial signaling, where the information rate[1] and channel baud rate are both bi-level, is still around, this is very much less the case these days. I regularly have to work with QAM-64 and QAM-256, and knowing their strengths and weaknesses is important, as is the difference between the information rate into the modulator and the baud rate on the line. As well as, of course, knowing the information companding rate out of the compander that feeds the modulator.

But it also stretches further: baud rate is not the real channel transfer rate. If you consider the likes of "Forward Error Correction", where the actual data is sent the equivalent of three times to improve the "effective" BER, then it quickly becomes clear just how messy things are and how the "bit rate" and "baud rate" may have no fixed relationship.
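The "sent three times" example can be sanity-checked with a toy simulation (a sketch: a rate-1/3 repetition code with majority-vote decoding over a noisy channel; the 10% raw error rate is an assumed figure for illustration):

```python
import random

random.seed(1)
p_raw = 0.10             # raw channel bit error rate (assumed for illustration)
n_bits = 100_000

def flip(bit):
    """Send one bit through the noisy channel."""
    return bit ^ (random.random() < p_raw)

errors = 0
for _ in range(n_bits):
    copies = [flip(0), flip(0), flip(0)]   # send each data bit three times
    if sum(copies) >= 2:                   # majority vote decode
        errors += 1

print(errors / n_bits)   # ~0.028, versus 0.10 raw
```

The decoded error rate drops to roughly 3p^2 while the line carries three symbols per data bit, so the information rate and the line rate decouple exactly as described.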

@ Anders, Wael,

So although 75 and 110 baud channel speeds are equal, transmission unit length in characters is different due to different amount of data and stop bits.

Err, no, you are confusing two separate usages of "symbol". The ITU changed in 1928 from the "character/word" rate, based on the old telegraph "five characters is a word" charging rate, to the baud rate. Which is why we still talk about original telex machines being 50 or 75 baud, meaning 5 chars/sec or 7.5 chars/sec, independent of start bits, stop bits or the later parity bits. This was due to the need to "transcode", not just between the different base line speeds used in different countries/continents, but also between character sets such as International Telegraph Alphabet No. 1 (ITA1, Baudot) and the later International Telegraph Alphabet No. 2 (ITA2), which was the most common teleprinter code in use until the 1960s gave birth to ASCII. Transcoding was important as it was the kludge that enabled telex to cross national borders when governments had a very proprietary belief in their way of doing things.

It was this sort of problem that originally gave rise in 1924 to two technical committees to standardize technical and operational questions of international long-distance telephony and telegraphy. A year later they formed the nexus of the International Telecommunications Union (ITU), which later found a home under the United Nations. But things changed, and the two technical committees were merged in 1956 to become the International Telegraph and Telephone Consultative Committee (CCITT[2]), which was later renamed ITU-T.

[1] Depending on who you study under, "information rate" is preferable to "gross bit rate" because many systems are not actually that related to the definition of a "bit". For instance a complex analogue modulation system, whilst not involving any "bits", can be quantified in a similar way. Hartley did this by using the log2 of the number of distinguishable states, thus all modulation systems could be compared on an equivalent footing.

[2] In Europe however the CCITT was seen not just as "knuckle draggers" but also "diggers in of heels", and as being in the hands of certain large international companies and entities (Five Eyes SigInt for one). Thus it was not much in favour in continental Europe, which resulted in CEPT and ETSI, which amongst many other things gave us GSM in the "37 weeks of '87". GSM later switched focus from business comms to consumer comms based on a single document, "Phones on the Move", from the UK Dept of Trade and Industry (DTI). Now at over 6 billion users worldwide, GSM is the most successful cellular/mobile phone system, and it has ruffled more than a few feathers, which we see have arisen yet again, this time with 5G and faux arguments about China.

AndersDecember 8, 2019 4:01 PM

@Clive

Don't confuse here those WPM's.
Those are evil.

Read this.

books.google.ee/books?id=Ww4SBQAAQBAJ&pg=PA107&lpg=PA107

Another MouseDecember 8, 2019 4:06 PM

@clive

Why is all the world setting 5g equal to small cells?

5G is just a little facelift of LTE. You can run it over any frequency band that's available for 4G, plus a few more.

In my country one operator is rolling out 5G on previous 3G frequencies; it's also as slow as 3G was, I heard :-P

If you are their customer you get the "benefit" of a worse connection with a 4G mobile not bought from this operator. And as they won't let me do VoLTE, I'm now falling back to the downgraded 3G network...

So even if you boycott 5g it catches you...

Clive RobinsonDecember 8, 2019 4:17 PM

@ MarkA,

As you may guess, the point is to be able to use a computer old enough to _probably_ lack backdoors, over a COTS mobile

It probably cannot be made to work. The reason is the same one I pointed out for why the "JackPair" system could probably never be made to work the way they wanted: the use of hidden, effectively in-band control data, and the CELP codecs, which originated from work by the NSA. The CELP codecs do not actually send the audio signal but a poor reconstructed facsimile of one, which relies on the fact that the human brain is very, very insensitive to pitch, phase, and short "holes" in the reconstructed signal. Unfortunately, most modems are very sensitive to frequency, phase and "dropout". That is, a mobile phone CELP codec works more like a quite lossy vocoder than a true lossless codec.

The solution you are looking for is to use the native "AT command set" on either the serial or USB interface into the RF front-end/modem chip; mostly it's an extension of the original Hayes AT command set.

Have a search for backward compatibility with,

    ETSI GTS GSM 07.07 "AT command"

It came out in July '96 and should form a subset of any certified GSM handset. If you look at some of the cheap Raspberry Pi or Arduino "GSM shields", such as,

http://www.tinyosshop.com/arduino-gsm-shield

You should find info on using the Hayes-compliant AT commands, such that the GSM module behaves just like an old-fashioned analogue POTS modem.
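As a rough sketch of what driving such a module looks like at the byte level, the following frames Hayes/GSM 07.07-style command lines (command text terminated by CR) and checks for the final OK result code. The helper names are invented and no real serial port is opened here; a real shield needs a serial link (e.g. pyserial) and its own documentation:

```python
# Minimal Hayes/GSM 07.07-style AT command framing (sketch only; the
# helper names are made up and no real serial hardware is involved).

def frame_at(command: str) -> bytes:
    """An AT command line is the command text terminated by a carriage return."""
    return (command + "\r").encode("ascii")

def parse_response(raw: bytes):
    """Split a modem response into lines (dropping blank lines) and report
    whether it ended with the final result code OK."""
    lines = [ln for ln in raw.decode("ascii").split("\r\n") if ln]
    return lines, (lines[-1] == "OK" if lines else False)

# Dialling a voice call, Hayes style (the number is made up):
assert frame_at("ATD+15551234;") == b"ATD+15551234;\r"

# A typical identification exchange (the manufacturer string is made up):
lines, ok = parse_response(b"\r\nAT+CGMI\r\n\r\nSIMCOM\r\n\r\nOK\r\n")
assert ok and "SIMCOM" in lines
```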

AndersDecember 8, 2019 4:38 PM

@Clive

Baud is nothing more and nothing less than the number of signalling units per second.
Each signalling unit may carry one bit, more than one bit,
or less than one bit. But if they are equal then bit rate = baud.

If we have 75 baud (= 75 signalling units per second) and our protocol
consists of 1 start bit, 5 data bits and 1.5 stop bits, then we transfer
75/7.5 = 10 characters per second. See my table above.

SpaceLifeFormDecember 8, 2019 4:52 PM

@ Anders

'Same problem that once there was with early computers - you have a computer, you had to get a program into the memory. This was quite a task so they didn't turn off the early computers for months'

Maybe. I have booted computers with Hollerith cards, or switches.

Of course, I am old.

Clive RobinsonDecember 8, 2019 4:52 PM

@ Anders,

How can i add a new line to the program from *INSIDE* the running program? So that i RUN a program and after that the LIST shows there a new line in the program? Basically a self-modifying code.

With old BASIC interpreters with little memory, the real problem was finding a way to make sufficient space to store the program you are loading. In some you could use REM statements, in others DIM statements. But you still had the "line number" issues. You still see this with many of the "BASIC written in C" programs you can download: either they have their own malloc(3) replacement[1] to make garbage collection easier, or they do a one-off malloc(3) at the beginning, whilst the use of realloc(3) is usually avoided.

In the late 70s/early 80s, when the price of EPROM chips dropped but DRAM remained high, many BASIC interpreters started to use one-byte "tokens" instead of three or more bytes to store commands. This kind of went to extremes on the Sinclair ZX80/81, where you did not type in commands at all; you simply pressed a key with the command printed adjacent to or on the key. Everything except the "display" stayed in tokenised form.
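The trick can be sketched in a few lines of Python: replace each keyword with a one-byte token as the line is entered, so the stored program shrinks (the token values here are invented, not those of any real interpreter):

```python
# Sketch of the keyword tokenisation used by small BASIC interpreters.
# Token byte values are invented for illustration.

TOKENS = {"PRINT": 0x80, "GOTO": 0x81, "REM": 0x82, "DIM": 0x83}

def tokenise(line: str) -> bytes:
    """Replace each keyword with its one-byte token; keep the rest as ASCII."""
    out = bytearray()
    for word in line.split():
        if word in TOKENS:
            out.append(TOKENS[word])      # one byte instead of len(word)
        else:
            out.extend(word.encode("ascii"))
        out.append(ord(" "))
    return bytes(out[:-1])                # drop the trailing space

stored = tokenise('PRINT "HELLO"')
assert stored == bytes([0x80]) + b' "HELLO"'
assert len(stored) < len('PRINT "HELLO"')  # 9 bytes stored vs 13 typed
```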

By the time PC/MS BASIC came along, memory management due to garbage collection had become difficult and you had to use the inbuilt support of the BASIC interpreter. Also watch out for "compile on the fly" type interpreters; in essence they convert the text into optimized opcodes to get significant speed advantages.

So "Step 1" is to identify exactly how your BASIC interpreter/compiler deals with its internal memory allocation and garbage collection. When you've done this, generally the rest follows on.

[1] Such as the "Boehm-Demers-Weiser Conservative Garbage Collector", which can be used as a garbage-collecting replacement for C malloc or C++ new. It allows you to allocate memory basically as you normally would, without explicitly deallocating memory that is no longer useful. There is also "GC", a garbage-collecting malloc(3) replacement. Have a read of,

https://maplant.com/gc.html

AndersDecember 8, 2019 5:19 PM

@SpaceLifeForm

"I've never met her either."

But you still have your chance :)

www.toopics.com/bri_healthy_mons/followers?lang=ja

Clive RobinsonDecember 8, 2019 5:32 PM

@ Another Mouse,

Why is all the world setting 5g equal to small cells?

To get the 5G millimetric bands to deliver very high user bandwidth per geographic area (though how they will do the physical "back haul" to support it is anybody's guess currently).

5G as advertised is a mish-mash; it has both 4G and LTE as the fallback (which is mostly the default operation currently), but this limits user bandwidth in its larger geographic area and, for other reasons, has significantly greater latency than the "proposed" 5G millimetric bands.

The problem with this 5G millimetric-band add-on to the existing 3G, 4G and LTE is that in some cases it does not like buildings, wet trees, fences, and even humans standing in the wrong place; thus it's a candidate for complex MIMO, which has other issues.

But these aspects, which are the real part of 5G, are not yet standardized, so some countries might use low-end microwaves taken from existing amateur and satellite allocations, and others bands up in the 25GHz or higher range... The big problem with this is that there will probably be no 5G roaming; roaming will use 3G/4G and, if available, LTE. Likewise handset compatibility will at best be problematic, if not impossible, between not just countries but service providers inside a country (this appears to be a ploy to raise handset prices and increase control over individuals).

So from my perspective moving to 5G offers me nothing, and will probably not do so for the rest of my life.

But there is more: it also appears that the US is doing everything it can politically to be completely disruptive to 5G, as they want to kill it off... Primarily to force in US-patented technology from which revenue can be earned or, more importantly, control maintained, with a side order of US surveillance/back doors etc. The US comms corps have tried this sort of nonsense in the past, and it ended up with their proprietary systems getting replaced by GSM, through various other hoops and loops, until later those ridiculous patent cases in the US. Mostly these were at best frivolous and at worst restrictive trade practices tantamount to establishing cartels, which brought the US judicial process into disrepute. So much so that one US judge made it clear publicly that he was going to stamp down on them.

Clive RobinsonDecember 8, 2019 5:39 PM

@ Anders,

But i'm still searching some trivial genius way :)

Oh that's easy, just write your own "byte code threaded interpreter", like a stripped-down Forth or Java, in BASIC, and shove the "dictionary" in DIM or DATA arrays.

Almost too trivial to mention ;-)
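For the curious, a toy byte-code interpreter of that shape fits in a dozen lines. Here it is in Python rather than BASIC, with the "dictionary" as an array of routines indexed by token, which is exactly what would live in DIM/DATA arrays (all names and byte codes are invented):

```python
# Toy byte-code "threaded" interpreter: each byte indexes a routine in a
# dictionary array, much as a DIM/DATA table would in BASIC. Illustrative only.

stack = []

def op_push(code, pc):
    """The byte after the token is an immediate literal."""
    stack.append(code[pc])
    return pc + 1

def op_add(code, pc):
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)
    return pc

def op_mul(code, pc):
    b, a = stack.pop(), stack.pop()
    stack.append(a * b)
    return pc

DICTIONARY = [op_push, op_add, op_mul]   # tokens 0, 1, 2

def run(code):
    """Fetch a token, dispatch through the dictionary, repeat."""
    pc = 0
    while pc < len(code):
        token = code[pc]
        pc = DICTIONARY[token](code, pc + 1)
    return stack.pop()

# (2 + 3) * 4 as byte code: push 2, push 3, add, push 4, mul
assert run(bytes([0, 2, 0, 3, 1, 0, 4, 2])) == 20
```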

SpaceLifeFormDecember 8, 2019 5:44 PM

@ Anders

The problem is simple. They may have been giving me a clue. It may have been a comm.

We just don't know yet.


The Nutshell is this:

Until any TLA proves their worth, none can be trusted.


AndersDecember 8, 2019 5:50 PM

@Clive

Yes, that's one option.

Another one i'm thinking is to build a keyboard
emulator that "plays back" keypresses at high speed.
Raspberry PI or Arduino, BASIC programs reside on
flash SD card. Similar like this.

null-byte.wonderhowto.com/how-to/load-use-keystroke-injection-payloads-usb-rubber-ducky-0176829/

SpaceLifeFormDecember 8, 2019 5:58 PM

@ Clive

"Oh that's easy, just write your own "byte code threaded interpreter" like a stripped-down Forth or Java in BASIC and shove the "dictionary" in DIM or DATA arrays."

LOL. Where is WebAssembly?

You are really slacking dude. ;-)

Clive RobinsonDecember 8, 2019 6:20 PM

@ Wael,

If there are only two symbols in the system (typically 0 and 1), then baud and bits per second (bps) are equivalent.

You really need to kick that harsh mistress into touch...

A "symbol" is in effect a container or object of "state"; whilst there can be many states available, the symbol can only ever be in one state at a time.

So if your states are simply "0 and 1" your symbol is the equivalent of a single bit.

If your meaningful states are -1, 0, +1 then your symbol is the equivalent of a trit. Likewise if they are 0, 1, 2.

If the set of states is {0,1,2,3} then, whilst you still have a single symbol, it holds the equivalent information of two bits. Thus the information rate into your modulator would be 2 bits/s but your symbol rate out of the modulator would be 1 baud.

In the case of QAM-64 the state-set size is 64, which is the equivalent of 6 bits. So your information rate into the modulator would be 6 bits/s and your symbol rate out 1 baud.

In theory the size of the set of states can be increased indefinitely, as long as you have the power to maintain a sufficient signal-to-noise ratio, and 4096 is in use currently. So: 12 bits/s information rate in, symbol rate out 1 baud. If you cranked the information rate up at the input to 12,000 bits/s then the symbol rate out would be 1000 baud.
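The relationship reduces to one line of arithmetic: bits per symbol = log2(number of states), and symbol (baud) rate = bit rate / bits per symbol. A quick check of the figures above:

```python
# Bits per symbol and baud rate from the constellation size.
from math import log2

def bits_per_symbol(states: int) -> float:
    """Information carried by one symbol drawn from `states` equiprobable states."""
    return log2(states)

def baud_rate(bit_rate: float, states: int) -> float:
    """Symbol rate needed to carry bit_rate through a constellation of `states`."""
    return bit_rate / bits_per_symbol(states)

assert bits_per_symbol(4) == 2         # {0,1,2,3} -> 2 bits
assert bits_per_symbol(64) == 6        # QAM-64
assert bits_per_symbol(4096) == 12     # QAM-4096
assert baud_rate(12000, 4096) == 1000  # 12,000 bit/s in -> 1000 baud out
```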

P.S. As for Mouser, it wants JavaScript on to display the page you mention. As I've said before, I don't do cookies or JavaScript, so I've not seen it. Their loss, not mine.

Sherman JayDecember 8, 2019 6:26 PM

@SpaceLifeForm from December 8, 2019 2:09 PM

that car caught in the twisty maze of passages has two passengers: Woods and Crowther LOL

@Another Mouse • December 8, 2019 4:06 PM
referencing:
@clive

the computer clinics I hold are in a building that has a 'cable' ISP delivering wifi. They offer two access points one labelled with the name of the building, the other is labelled with the name of the building appended with '5G'.

However, there is a lot of (intentional!?) confusion by the term '5G'.

Some are using it to mean the Fifth Generation of cell protocols. But that 'cable' ISP actually means that access is at 5 GHz (as opposed to the other, 2.4 GHz, standard wifi connection).

Some tech articles are talking about Fifth Generation 5G technology eventually extending up near the 'ionizing' radiation frequencies. Someone's bird is going to get fried! And those upper frequencies will have extremely little ability to penetrate building walls at any distance, so I've seen artists' renderings showing ugly 5G towers spaced about every 200 feet in a residential neighborhood.

Also, NOT as an afterthought: Clive, I wish you always the best outcome for your health challenges. I remember Jack Palance playing an old cowboy in 'City Slickers', smoking an unfiltered cigarette and telling Billy Crystal, "getting old isn't for sissies!"

Clive RobinsonDecember 8, 2019 6:47 PM

@ SpaceLifeForm, Anders,

Until any TLA proves their worth, none can be trusted.

Just one caveat, the opposite applies,

    You are not a murderer until you kill someone, thus potentially everyone is a murderer.

In other words "trust" is highly ephemeral, and past behaviour, good or bad, is no real indicator of future behaviour, good or bad...

The CIA motto is "In God We Trust" actually meaning "Every other bugger we check continuously".

But as a friend once put it,

    Humans generally over-trust, thus get hurt by friends. But those who don't trust don't have any friends.

You just have to throw the dice or mitigate.

SpaceLifeFormDecember 8, 2019 7:58 PM

@ Clive

I kid about WebAssembly.

Years ago, I wrote an interpreter using Perl.

Clive RobinsonDecember 8, 2019 8:35 PM

@ Anders,

How many baud have new ITU V.44 modem standard

You may want to think about what you are asking; your link comes up as being about vodka...

That said,

V.44 is an ITU-T standard for modem data compression, not for the modem itself (that is V.92). In theory V.44 provides for up to a 6:1 compression ratio.

V.92 is a digital, not analogue, line modem standard that is an augmentation of V.90. It uses digital PCM in both directions on a "two pair" "4-wire" electrical interface.

V.90 is an ITU-T standard for 56 kbit/s combined digital and analogue modem communications on digital/analogue PSTN lines. It uses 56 kbit/s PCM digital download on one pair and 33.6 kbit/s analogue upload on another pair using V.34bis.

Since the 1960s digital PSTN lines have used PCM: 8000 Hz sampling of 8-bit audio, at a fixed signalling rate defined by the network, not the modem, often on a 4-wire interface. This gives a maximum single-channel rate of 64,000 bits/s. However, for various reasons the actual data bandwidth available is limited to only seven bits, due to in-band signalling by "bit robbing", thus the maximum user data rate is capped at 56 kbit/s.

Importantly, the 4-wire interface used may carry more than one 64 kbit/s circuit plus an additional control channel. Because these are multiplexed onto a single wire pair for each direction, talking about "baud rate" is misleading, hence the use of kbit/s.

V.34bis is the result of upgrading the earlier V.34 standard and ad hoc interim industry standards. Although the analogue line is capable of higher rates, it is capped in the standard to user rates of 33.6 kbit/s.

If I've not answered your question we will need to talk about the actual interface used. This might be an ISDN Basic Rate Interface (BRI) at 144 kbit/s across an S-interface carrying what is called 2B+D: two 64 kbit/s "bearer channels" and a low-data-rate 16 kbit/s "control channel". There is also PRI as an alternative to BRI, which has either 32x64 kbit/s "E1" capacity or 24x64 kbit/s "T1" capacity, in either case with the control channel using one or two of the 64 kbit/s channels.
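For anyone checking the figures, the arithmetic behind them is short: 8-bit PCM at 8000 samples/s gives 64 kbit/s per channel, bit robbing leaves 7 usable bits (56 kbit/s), and BRI's 2B+D sums to 144 kbit/s:

```python
# PCM channel arithmetic for digital PSTN lines (figures from the text above).

SAMPLE_RATE = 8000       # samples per second
BITS_PER_SAMPLE = 8

channel = SAMPLE_RATE * BITS_PER_SAMPLE        # full 64 kbit/s channel
usable = SAMPLE_RATE * (BITS_PER_SAMPLE - 1)   # one bit lost to bit robbing
bri = 2 * channel + 16_000                     # 2B + D (16 kbit/s) ISDN BRI

assert channel == 64_000   # 64 kbit/s
assert usable == 56_000    # the 56k modem cap
assert bri == 144_000      # 144 kbit/s Basic Rate Interface
```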

TatütataDecember 8, 2019 9:05 PM

2G is still going to be around for some time

Clive, I'm sorry to break this to you, but GSM has already been completely retired in quite a few countries, including Germany, Australia and Canada. 3G/UMTS is on its way out too.

For the difference between bps and baud, consider an OFDM symbol such as is used in DVB-T or DAB, with a duration in the hundreds of µs but with bit rates in Mbit/s.

WaelDecember 8, 2019 9:50 PM

@ Tatütata, et al.,

For the difference between bps and baud, consider an OFDM symbol ...

1) How many bits per symbol?
2) If you have two symbols representing 0, 1 then the Bit Rate = The symbol rate = baud

Do you disagree with the second point?

Clive RobinsonDecember 9, 2019 1:23 AM

@ Tatütata,

Clive, I'm sorry to break this to you, but GSM has already been completely retired in quite a few countries including Germany, Australia, Canada. 3G/UMTS is on its way out too.

Firstly, I was quite specific about 2G technology, and said why it applied in certain places (infrastructure usage) and that service providers wanted shot of it.

Secondly there is the issue of nomenclature: saying "GSM has been retired" rather depends on what you mean by "GSM", which neither you nor the article you link to are very clear on.

GSM is the organisation originally called "Groupe Spécial Mobile", which now calls itself the "GSM Association" (GSMA). But few use GSMA, because GSM is recognized in context, as well as in the standard generational groupings preceded by numbers (2GSM, 3GSM etc).

https://www.gsma.com/aboutus/history

Also involved in 3G but not 4G was the 3rd Generation Partnership Project (3GPP) of standards bodies,

https://www.3gpp.org/about-3gpp

Who, having not really succeeded with 3G-GSM, have decided they are going to be the specification organisation for certain parts of 5G...

You will see from their pages that the standards for the various generations are also called GSM, with a sub-technology label.

It's all a bit of a mess, which is why people also tend to talk of technology generations 1G, 2G, 3G, 4G and 5G without mentioning specific core technologies such as GSM or CDMA. Just for real strangeness, you have some political advisors muttering quietly about their 6G plans, whatever the heck those might be (possibly to sow seeds of doubt about 5G and its core GSM standards).

So 1G, or First Generation, was the original analogue cell systems and some early ideas for digital networks. In essence pre-GSM, and all "museum pieces" (which, as I worked on some, says what a fossil I've become ;-)

2G, or Second Generation, technologies: GSM-GPRS, GSM-EDGE, cdmaOne, D-AMPS, etc. Of these only GPRS and EDGE are effectively still around in any numbers, with some CDMA in places. One set of people who wanted to see 2G GSM stay are Law Enforcement Agencies (LEAs), for various reasons. Others are those with a lot of infrastructure investment in it. However, ETSI and other standards organisations want the UN ITU to make rulings to free up spectrum for other services. The ITU has just had its World Conference, but I've not yet looked up what they decided. I know France and Russia had petitioned to take away amateur radio bands rather than touch other commercial spectrum.

3G, or Third Generation: GSM-GPRS, GSM-EDGE, GSM-UMTS, with UMTS-HSPA bringing in broadband data in preference to voice services. cdmaOne was also fighting a rearguard action. Importantly there is GSM-LTE, which is still developing and will become part not just of 4G, as it has, but of 5G, as "core GSM technologies".

So saying "GSM has already been completely retired" makes absolutely no sense whatsoever.

If you are arguing about 2G GSM and 3G GSM radio frequencies, then yes, the standards committees want to get their hands on them, primarily because they are effectively the only globally standard radio bands. Also, due to other changes in the telecomms world, they want to get rid of "circuit switched" functionality and have only "packet switched" functionality with 5G, as it's seen to be both more efficient and of lower latency.

But as I said, there are people fighting to keep 2G GSM, whilst in other places 3G-GSM-LTE is the main system, as they did not go to 4G-GSM. Whether they jump to 4G-GSM-LTE-Advanced as a necessary part of 5G-GSM we will have to wait and see, as it's a mainly financially constrained decision.

But as the page you link to notes,

    In some European countries it could be the case where 2G may even outlive 3G, UMTS. Norwegian MNO Telenor announced 3G switch-off in 2020 – five years before 2G.

    Vodafone has announced it is to phase-out 3G networks across Europe in 2020 and 2021 while Deutsche Telekom plans to continue 3G, UMTS until the end of 2020. However, there are no plans by either Vodafone, Deutsche Telekom or Telefonica for 2G switch off.

This shows that 2G GSM will be around haunting us until legislators step in and force change, and in general that requires the UN ITU to get its act in gear, with national legislation coming into play a couple of years later.

The article, which is in reality a "technical marketing piece" to push goods and services, also notes the same about Africa.

The thing to note is that despite 3GPP lording it up, 3G was never very popular, and few if any want to keep it alive unless they have little other choice. As for non-core-GSM technology, that is dropping by the wayside, numbers-wise, as well.

TatütataDecember 9, 2019 3:51 AM

1) How many bits per symbol?

I was quite conversant 25 years ago in the original Eureka 147 DAB standard (but never had a receiver!).

ETSI EN 300 401 defined several transmission modes, including mode II that survived in DAB+.

In that mode, symbols are spaced 312 µs apart, of which 250 µs must be acquired before demodulation can be performed with an FFT. These durations are much longer than the multipath spread in the expected environment, and are also determined by transmitter placement and synchronisation scenarios in Single Frequency Networks.

The symbol has 384 subcarriers impressed with QPSK modulation. So one symbol carries 2*384=768 raw bits.

The overall raw rate of DAB(+) is ~1.5Mb/s, before error decoding.

2) If you have two symbols representing 0, 1 then the Bit Rate = The symbol rate = baud

Yeah, sure.

BTW, I didn't get the helicopter reference at once; I was jogging my memory on Bell 202 (1200 bps and 1200 baud, FSK, half-duplex) and Bell 212 (1200 bps, 600 baud, PSK, full-duplex).

The US models had AT&T's dreadful and clunky (even for the 1970s) Data Access Arrangements. The European CCITT equivalents were hardly better.

CuriousDecember 9, 2019 4:47 AM

I think I read about something called ultra-wideband wireless signals the other day. I wonder if the reference to Irish tech below might be the same type of thing, re Brian Krebs's recent article on Apple's short-range ultra-wideband tech (which apparently has a privacy-violation issue involved with it).

https://www.irishtimes.com/news/science/2019-parsons-medal-awarded-to-michael-mclaughlin-1.4107021 (not really a tech-centric article)

https://krebsonsecurity.com/2019/12/apple-explains-mysterious-iphone-11-location-requests/

I wonder whether these are the same thing, and what kind of infrastructure might be involved in such a positioning system. I also wonder what the difference would be between ultra-wideband signals and anything RFID (I assume it is some kind of standard, and not some unwieldy, large and generic term for something).

I also sort of wonder if the phrase 'automobile security' might, contrary to my intuition, also include mobile phones for some people, in an instance of perhaps ironic distancing. :) I wouldn't think so; that would be weird, I think.

WaelDecember 9, 2019 4:48 AM

@Tatütata,

BTW, I didn't get the helicopter reference at once,

It's the one that most resembles the hieroglyphic symbol at Abydos. In my opinion.

CuriousDecember 9, 2019 8:10 AM

"NTSB Investigation Into Deadly Uber Self-Driving Car Crash Reveals Lax Attitude Toward Safety"

https://spectrum.ieee.org/cars-that-think/transportation/self-driving/ntsb-investigation-into-deadly-uber-selfdriving-car-crash-reveals-lax-attitude-toward-safety

For the next five seconds, the system alternated between classifying Herzberg as a vehicle, a bike and an unknown object. Each inaccurate classification had dangerous consequences. When the car thought Herzberg a vehicle or bicycle, it assumed she would be travelling in the same direction as the Uber vehicle but in the neighboring lane. When it classified her as an unknown object, it assumed she was static.

I haven't yet read all of this, but I thought it could be an interesting read.

CuriousDecember 9, 2019 8:20 AM

For sake of clarity, the linked article re. that self driving car pedestrian death above was dated 7. nov, so a month old article. I unfortunately overlooked the date for the article when following the link to the article on twitter earlier. :| (Those darn pinned tweets.)

WaelDecember 9, 2019 11:21 AM

@Curious,

They say:

However, the car’s self-driving system did not have the capability to classify an object as a pedestrian unless they were near a crosswalk.

Rather strange design. An object is an object, whether it's a human being or an inanimate object. One can conclude that if the object is a big rock (not near a crosswalk -- like smack in the middle of the road,) then it's acceptable to drive through it.

Clive RobinsonDecember 9, 2019 4:22 PM

@ Wael, Tatütata,

2) then the Bit Rate = The symbol rate = baud

The problem is not whether what you are saying is true or not; it's that the way you are saying it leads to all sorts of problems.

Saying,

    If you have two symbols representing 0, 1

To most people this would imply that the "two symbols" are sufficiently different as to be potentially unrelated.

Whereas the symbol is basically the same (duration- or frequency-wise); it is just in one of a finite number of states.

If you think of a carrier with time or phase modulation then,

Vinst = Vmax cos(Wf.t + Ø)

The instantaneous amplitude Vinst is at its maximum level (Vmax) when Ø is at zero, 2pi and its even multiples, and at its minimum (-Vmax) at pi and its odd multiples. It is zero at pi/2, 3pi/2 and the odd multiples of pi/2.

Obviously Vinst changes with time. However, with two synchronized carriers, Wf is in effect constant, as is the phase difference Ø, for the duration of the symbol time.

Thus in OFDM there are multiple Wf's that are frequency-orthogonal to each other (delta f = 1 / symbol duration). That is, for a 0.1 s duration, delta f = 10 Hz and all carriers are at a multiple n of 10 Hz. The reason for this is that in a "perfect resonator" tuned to the frequency defined by n the amplitude rises linearly, whilst in all other resonators the amplitude rises then falls to zero after the symbol duration time. Which gives maximum differentiation at the filter outputs.

The point to remember is that no matter what n delta f becomes as Wf, the phase offset for state Ø[2pi/p] in p states remains the same. Further, in APSK, Vmax can become another state, thus it is Vmax[a] in Vmax/amax increments. As the amplitude and phase are orthogonal to each other, the total number of states is Pmax x Amax, which gives a square constellation. Decoding APSK needs two synchronised carriers, I (in phase) and Q (in phase quadrature); the amplitude is recovered as Vamp = sqrt(VinstI^2 + VinstQ^2), and the phase is recovered by measuring the I and Q phase difference via a PLL or Costas loop, or via Ø = arctan(Q/I).
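The recovery step can be checked numerically; a small sketch, with the constellation point chosen arbitrarily:

```python
# Recover the amplitude and phase of one symbol from its I/Q components.
from math import cos, sin, atan2, sqrt, pi, isclose

vmax, phase = 3.0, pi / 3        # an arbitrary constellation point

# The demodulator's two synchronised carriers yield the in-phase and
# quadrature projections of the received symbol:
i = vmax * cos(phase)
q = vmax * sin(phase)

v_amp = sqrt(i**2 + q**2)        # amplitude: sqrt(I^2 + Q^2)
v_phase = atan2(q, i)            # phase: four-quadrant arctan of Q/I

assert isclose(v_amp, vmax)
assert isclose(v_phase, phase)
```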

Thus you should not be asking,

    1) How many bits per symbol?

Because that presupposes that the number of states used to convey the "information rate", and the "information rate" quantisation, are "binary powers", which they need not be. For instance there might be six amplitude levels and six phases, giving a total of 36 states; likewise 5 and 5 giving 25 total states.

And I can assure you that there have been signalling systems using 5x5 and 6x6 type signalling for a very long time.

MikeADecember 9, 2019 4:27 PM

@Clive: "Native AT command set", ah, yes. First used it for the first use I made of mobile data. Well, "fixed transportable" data, using a mini-brick sized analog phone with a DB9 connection on the side. Worked well for gathering modest amounts of log data from a scattered set of equipment. As for its availability on modern phones via serial or USB, that would depend on your carrier allowing such access. Mine does not, and is the only one whose coverage map has "not that much dead area in it" (Yes I live in the U.S.). Oddly (or not) they _do_ allow tethering (aka WiFi hot spot), but charge as if this was magically an entirely different device, so charging for two data plans. So, yes, I am considering a GPRS widget.

On 2G and 3G. Around here, 4G is not known for reliability, so if the carriers shut down 3G those dead areas will get even larger. And of course 2G will continue since Stingrays need it (worse security), no?

@SpaceLifeForm: getting programs into computers was not (given the right computer) always a big problem, EDSAC (1949) used a "ROM" made of old telephone switches to copy the (40 word?) "Initial Orders" into RW memory, and those were sufficient to bootstrap from paper tape. It was quite a sophisticated system compared to the sort of stuff being reinvented with no knowledge of history by the microprocessor folks. I am always amused (and annoyed) to read about how horrid it must have been to program computers in the 1950s, based on how horrid uP development software was in the 1970s, assuming "_of_course_ the older stuff must have been worse"

SpaceLifeFormDecember 9, 2019 6:05 PM

@ Anders

Another thought on your BASIC problem.

Go back to my original thought about your loader being at very large line numbers.

If you can read via rs232 one line at a time, can you not manipulate the line numbers on the fly?

Can you hack the interpreter to support INSERT? Not likely easy because it is in ROM.

So...

Maybe you can POKE a FORTH interpreter into RAM. FORTH is small, RAM wise.

Would be a completely crazy hack.

Seriously crazy. But it may work.

Using a BASIC interpreter to POKE in a FORTH interpreter to emulate a BASIC interpreter.

Not going to be a trivial exercise.

But, if you can pull it off, then your loader should be complete, and load any large program.

http://www.nicholson.com/rhn/files/Tiny_BASIC_in_Forth.txt

Transfer of execution is left as an exercise for the reader.

SpaceLifeFormDecember 9, 2019 6:55 PM

@ Anders

Blame Clive for this crazy thought.

The transfer of execution will be the big problem.

Somehow, you will need to manipulate registers or other to jump from the original BASIC interpreter to the FORTH interpreter.

Not going to be simple.

But, there are likely bugs that can be exploited.

This is old code. There is no libc that you have to wash your hands from after return.


WeatherDecember 9, 2019 7:46 PM

@Anders and all
I was looking to find the INT 0x14 example, e.g. for copying stuff to RAM; I'm guessing POKE loads it into program space. A FOR loop that reads the COM port into a BASIC array, then load it by GOTO array, as that is a JMP instruction; the array points to a struct with the data to load and EAX holds what type of call, and you could put the COM port in it as well. It's been a very long time since I last used BASIC or Pascal.
Convert the array to opcodes and JMP; but the first stage is to copy it to real RAM, the second part a JMP 7C00 or 0800:address of where you copied it to; you might have to AND CR0 with 11111110 then long JMP into program space to get into real mode.

Clive RobinsonDecember 9, 2019 8:07 PM

@ MikeA, SpaceLifeForm,

used a "ROM" made of old telephone switches to copy ... "Initial Orders" into RW memory, and those were sufficient to bootstrap from paper tape.

Takes me back a ways...

The original Z80 had a real problem with "loading code" especially from switches.

The cause of it was the NMOS CPU had the equivalent of DRAM inside. Which ment that not only did the Z80 chip have a maximum clock speed of approximately 4MHz due to the fact it was internally a 4bit CPU faking it as an 8bit CPU. It also had a minimum clock speed at 500kHz though some people found 200-250kHz worked.

Hence the popularity, with hobby constructors, of other CPU types that did not have a minimum clock speed. To see why, have a look at what I had to go through to get the NMOS Z80 up and running,

The first big problem is that even at a 250kHz clock speed it was a very real hurdle to get over such things as switch open contacts having more than 20pF of capacitance. The wiring from front panels had a fair degree of inductance and, if "loomed" --which was a standard construction technique of the time-- a fair degree of cross capacitance, thus crosstalk on rising and falling edges.

The solution I used was a "Diode Array ROM" on two pieces of Vero --strip-- board, mounted back to back with the copper strips at 90 degrees to each other, plus four 74138 3-to-8 decoders, a transistor as an inverter, and a 74244 8-bit tristate buffer, to get 5 bits of address, or 32 bytes of ROM, on each board. Another board, which had the manual entry switches, also had a 74138 that was used to drive up to eight ROM boards, giving another 3 bits of address. Thus a total of 2kbits (256 bytes) of ROM. Which was "not too shabby", as the only RAM at less than "weekly wages" prices at the time was 256 bits...
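The address split described above -- top 3 bits select one of up to eight 32-byte boards, low 5 bits select a byte on that board -- can be sketched in Python (a model of the wiring, not code that ever ran on it):

```python
# Model of the diode-array ROM address decode: 3 high bits pick the board
# (the 74138 on the switch board), 5 low bits pick the byte on that board
# (the decoders on the ROM board itself) -- 8 x 32 = 256 bytes in all.

def decode(addr):
    board = (addr >> 5) & 0b111     # which of the eight ROM boards
    offset = addr & 0b11111         # which of the 32 bytes on that board
    return board, offset

assert decode(0) == (0, 0)
assert decode(0xFF) == (7, 31)      # last byte of the last board
print(decode(0xA5))                 # (5, 5)
```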

Thus "boot ROM Loader" debugging was done with a soldering iron and wire cutters... Once it worked RAM could be loaded from the switches.

The funny/sad thing was that shortly after I got it working, the first moderately priced EPROM chips became available. So those hand-built Diode Array ROM boards were redundant and quickly got replaced by a single board with space for up to 64kbits (8 kbytes) of EPROM. Oh, and the CMOS version of the Z80, which had no minimum clock speed, became available, making "single stepping" from front panel switches easy.

So yeah, even back then I was getting outpaced by technology...

But the big thing was my 32kByte ROM, 64kByte DRAM single board computer (SBC), which I put my own version of CP/M on that I had reverse engineered from a commercial system. It had two serial ports, one parallel port, and a Philips "answer-phone" cassette recorder faking both a disk drive and a tape drive. With a little cheating I kind of got a spare Apple ][ floppy drive up and running with it, but I never really got the best out of it, because somebody gave me not just a real full-blown floppy drive but an S100-bus dual floppy controller board, which I had up and running in just a couple of days by modifying the CP/M code I had reverse engineered.

Oh, another annoyance of the Z80 was no "CLRA/CLRX" instruction. Many used LD A,0, but that was not as good as using XOR (when you XOR a value with itself, no matter what the value, you end up with zero -- so XOR A with A...).
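The trick rests on the identity x XOR x == 0 for any x; a quick Python sketch of just that identity (on the Z80, XOR A is also a one-byte instruction versus two bytes for LD A,0):

```python
# x XOR x == 0 for any value x, which is why "XOR A" clears the accumulator.

def xor_clear(x):
    return x ^ x            # always zero, whatever x was

for value in (0, 1, 0x5A, 0xFF, 12345):
    assert xor_clear(value) == 0
print("XOR A always clears the accumulator")
```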

The funny thing is, I was looking through some old code I had in a box file back in the summer and found the Z80 Fig-Forth code I had modified to make my own stand-alone controller board, and my eyes just glazed over. I had forgotten all the little Z80 tricks I used and could no longer understand what I had written a tad under four decades ago...

WaelDecember 9, 2019 8:08 PM

@Clive Robinson, @Anders, @Tatütata, et al,

First of all, the context of this discussion was serial communications with the Bell 103 Modem. It allowed digital data to be transmitted over regular unconditioned telephone lines at a speed of 300 bits per second.

Now, the term "baud rate" has sometimes incorrectly been used to mean bit rate, since the two rates are the same in old modems, as in the simplest digital communication links using only one bit per symbol: binary "0" is represented by one symbol, and binary "1" by another.

Hence I used "Bits per Second (Baud)", and I honestly thought about not mentioning "Baud" at all, but one cannot unsay what has already been said... The discussion should have ended here with agreement. But... not on this blog -- we can't let something like that pass by ;)
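The relationship being argued about is bit rate = symbol (baud) rate × log2(states per symbol); a quick Python sketch:

```python
import math

# Bit rate in bits/s given a symbol (baud) rate and symbol alphabet size.
def bit_rate(symbol_rate_baud, states_per_symbol):
    return symbol_rate_baud * math.log2(states_per_symbol)

# Bell 103: 300 baud, 2 states per symbol (binary FSK), so baud rate and
# bit rate coincide at 300.
print(bit_rate(300, 2))    # 300.0
# A 4-state scheme at the same 300 baud would carry 2 bits per symbol:
print(bit_rate(300, 4))    # 600.0
```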

If you think of a carrier with time or phase modulation then,

Here you're leading to examples where Symbol rates are different than Bit rates -- already taking what I said out of context. Let's continue, though:

If you think of a carrier with time or phase modulation then, Vinst = Vmax cos(Wf+Ø)

I think you mean:

Vinst = Vmax cos(ωt + θ) ; ω = 2π𝑓 [1]

Howrver with two synchronized carriers...

We're not talking about that, and there's no disagreement there.

Thus you should not be asking, "1) How many bits per symbol?"

Come on now!

Because that presupposes that the number of states to convert the "information rate" and the "information rate" quantatisation are "binary powers" which they need not be.

So binary digits cannot represent binary powers?

I'll let you have the final word, chief. And I admit that I was sloppy and you got me on a technicality. Watch your back now! It won't be long before I pay you with 1.5% interest :)

[1] I am used to θ rather than ϕ in these expressions. Perhaps a US / UK thing

Clive RobinsonDecember 9, 2019 8:53 PM

@ SpaceLifeForm, Anders,

The transfer of execution will be the big problem.

Depends as much on the CPU as it does the version of BASIC.

Most BASICs from around '85 onwards allowed for "inline assembler" in one way or another (some really quite bizarre). If the CPU has "software interrupts", as the 8088/6 did, then sorting out the CPU register stack is not much of a problem, especially if the BASIC is written in C, as quite a few are.

However, Forth is quirky in many ways, one being that you can make it fully stack based with everything depending on a single base location. If you want to know how to do this, have a look at some of the "multi-tasking Forths" where Forth effectively becomes its own OS. In effect everything is done with respect to that single base location by "offset addresses". This allows the Forth to be fully relocatable and remain self-consistent: you can move it up or down memory, even by a few bytes, and the Forth will neither know nor care.
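A toy Python sketch of the base-plus-offset idea (names and sizes here are hypothetical; a real Forth does this with offset addressing relative to a user pointer):

```python
# Toy illustration of offset-only addressing: every memory reference is an
# offset from a single base, so relocating an instance means copying its
# image and changing only the base pointer.

memory = [0] * 256          # hypothetical flat memory

def store(base, offset, value):
    memory[base + offset] = value

def load(base, offset):
    return memory[base + offset]

base = 16
store(base, 3, 42)                                    # write via an offset
new_base = 100                                        # relocate the instance
memory[new_base:new_base + 32] = memory[base:base + 32]
assert load(new_base, 3) == 42                        # same offset, new base
print("relocated instance remains self-consistent")
```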

The downside is that whilst you can multi-task it's all in a single processes memory space so there is no real security or even protection from one Forth instance to another Forth instance.

You can make the Forth instances incredibly efficient by using two dictionaries: the base dictionary common to all instances, and the runtime dictionary that contains an individual instance's "words".

But Forth can get really, really small; there is some argument as to just how big the base dictionary has to be. In reality it only needs to hold the words that require assembler code, which is somewhere between 20 and 30 base words to still get reasonable performance.

Another Forth trick is to make the stacks not "push-down, pop-off buffers", as most are taught stacks should be in their CS "data types" classes, but "circular buffers" of only 8-16 16-bit int values. I won't even attempt to explain how you use this to best advantage, but it is something Charles Moore does apparently intuitively. And his "GreenArray" 144-Forth-core IC uses circular buffers for its stacks, and they are only 8 values in size, if my memory serves me correctly.
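A minimal Python sketch of such a circular-buffer stack (the behaviour and sizes are illustrative only, not the actual GreenArrays design):

```python
# Fixed-size stack on a ring: pushes wrap around and silently overwrite the
# oldest entries instead of overflowing.

class CircularStack:
    def __init__(self, size=8):
        self.buf = [0] * size
        self.top = 0                      # index of the most recent push

    def push(self, value):
        self.top = (self.top + 1) % len(self.buf)
        self.buf[self.top] = value

    def pop(self):
        value = self.buf[self.top]
        self.top = (self.top - 1) % len(self.buf)
        return value

s = CircularStack(8)
for i in range(10):                       # two more pushes than the ring holds
    s.push(i)
print(s.pop(), s.pop())                   # 9 8 -- the oldest entries were lost
```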

Any way it's 3AM in the UK and @Wael's "Harsh mistress" has also been having her wicked way with me :-S So I'm turning in and hopefully she will get the hint and let me get some sleep...

WaelDecember 9, 2019 9:01 PM

@Clive Robinson,

"Harsh mistress" has also been having her wicked way with me...

Please keep her a bit longer. She's hit me with so many rights, I'm begging for a left.

JG4December 10, 2019 8:36 AM

Wishes everyone who celebrates a winter holiday, or two, or three, or more, a happy, safe and sane time. Everything that has gone before in electronics will be recapitulated in photonics.

@Clive
http://www.noagendashow.com/

The first effort to regulate AI was a spectacular failure
https://www.fastcompany.com/90436012/the-first-effort-to-regulate-ai-was-a-spectacular-failure

Fascinating. Putting the yellow dots on your documents.

‘We Committed Copyright Infringement and Want to Be Sued by Disney’
https://fortune.com/2019/12/06/we-committed-copyright-infringement-and-want-to-be-sued-by-disney/

Sherman JayDecember 10, 2019 2:26 PM

@JG4
RE printer insecurity --
All HP color printers (and most other brands) print a coded pattern of tiny yellow dots on every page printed. The original intent was to prevent U.S. currency counterfeiting by color printer. However, the codes provide a lot of identifying info and can be decoded to track the printer, and maybe even its owner. (There are a number of articles on this on the internet. Please use DuckDuckGo, not g00gle, to find them.)

A close friend of mine has a technique where he prints randomized yellow dots on paper to obfuscate and then runs it through again with the 'content' he wants to print.

SpaceLifeFormDecember 10, 2019 5:51 PM

@Sherman Jay

Tell your friend that she may still not be safe.

It is basically a security thru obscurity idea.

We know how that doesn't work.

The dots are not random, there are parity dots.

Even if you printed an entirely yellow page and put it back through the printer, I believe that with high-quality forensic equipment the codes that identify the timestamp and printer serial number will still be found (the serial likely leaked long ago).
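A toy Python sketch of why parity helps the decoder (this uses illustrative even parity, not the actual tracking-dot code, which is undocumented):

```python
# Toy even-parity scheme: each 8-dot row carries 7 payload bits plus a
# parity bit, so a decoder can reject rows corrupted by overlaid cover dots
# while keeping the many uncorrupted repeats of the pattern.

def encode_row(bits7):
    parity = sum(bits7) % 2
    return bits7 + [parity]

def row_is_valid(row8):
    return sum(row8) % 2 == 0

row = encode_row([1, 0, 1, 1, 0, 0, 1])
assert row_is_valid(row)

# Overprinting a random extra dot flips one bit -- parity flags the row as
# bad, and the decoder falls back on the valid repeats elsewhere on the page.
tampered = row[:]
tampered[2] ^= 1
print(row_is_valid(tampered))   # False
```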

It is not invisible ink.

SpaceLifeFormDecember 10, 2019 6:34 PM

@ Clive

"my eyes just glazed over. I had forgotten all the little Z80 tricks I used and could no longer understand what I had written a tad under four decades ago..."

You still recall more than I have forgotten.

Documentation. Comment your code.

Yeah, right. Years later...

(looking at code I wrote, *WITH* comments)

"These comments make no sense, WTF was I thinking?"

The code was fine.

It's just that the comments were not clear.

This is a big problem.

I've written 20 to 30 lines of comments just to explain a few lines of code.

But, most of the time, there is no time.

And, you are on a roll, coding away your idea.

No time to comment, I'm on a roll, and I'm going to get this working!

Clive RobinsonDecember 10, 2019 7:48 PM

@ SpaceLifeForm,

I've written 20 to 30 lines of comments just to explain a few lines of code.

A --supposed-- sin I've always committed, and been told off for by "development managers", but oddly never by "maintenance managers".

Another --supposed-- sin is, when writing in assembly, using a "common subset" of instructions. That is, after my fourth or fifth CPU family I in effect abstracted the instructions they had in common and only used those for the first-cut working code, only later using architecture-specific instructions on the second or third cut, if resources were in short supply.

I actually have a code library written in a "common subset" that all it needs to become working code is a text editor with a "find all and replace" to make a CPU specific assembler name. Yes it saves a lot of time, but it wastes resources such as memory space as a trade off, which most times is unimportant.
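A minimal Python sketch of that retargeting step (the mnemonic tables here are hypothetical examples for illustration, not the actual library):

```python
import re

# Hypothetical mapping from generic "common subset" mnemonics to each
# target CPU's spelling; a real library would carry many more entries.
TARGETS = {
    "z80":  {"CLR": "XOR A", "MOVE": "LD"},
    "6809": {"CLR": "CLRA",  "MOVE": "LDA"},
}

def retarget(source, cpu):
    """Rewrite generic mnemonics into CPU-specific ones (find-and-replace)."""
    for generic, specific in TARGETS[cpu].items():
        source = re.sub(r'\b%s\b' % generic, specific, source)
    return source

generic_code = "CLR\nMOVE counter\n"
print(retarget(generic_code, "z80"))    # XOR A / LD counter
print(retarget(generic_code, "6809"))   # CLRA / LDA counter
```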

SpaceLifeFormDecember 10, 2019 8:04 PM

@ FILO

[This attack with other vectors can be very useful. Do not think for a minute that it is not remotely exploitable. Think Bluekeep, for example]

hxxps://www.zdnet.com/article/new-plundervolt-attack-impacts-intel-cpus/


Besides not crashing systems, there's another detail that makes Plundervolt a dangerous attack. It's fast -- or at least faster than most other attacks on Intel CPUs, like Spectre, Meltdown, Zombieload, RIDL, and others.

"Typically we get bitflips in multiplications or AES very quickly. For example, extracting an AES key takes a few minutes, including the computation required to get the key from the faulty ciphertext," Oswald said.
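A toy Python model of the kind of fault involved (purely illustrative; real differential fault analysis of AES is far more involved than this single-bit sketch):

```python
# Toy model of an undervolting fault: a single bit flips in a multiply
# result. Fault analysis compares correct and faulty outputs; with AES,
# enough such pairs let an attacker solve for key bytes offline.

def faulty_mul(a, b, flip_bit):
    return (a * b) ^ (1 << flip_bit)   # inject a one-bit fault

correct = 0x1234 * 0x5678
faulty = faulty_mul(0x1234, 0x5678, 7)
diff = correct ^ faulty
print(hex(diff))    # 0x80 -- the XOR difference pinpoints the flipped bit
```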

SpaceLifeFormDecember 10, 2019 9:50 PM

@ Clive

I actually have a code library written in a "common subset" that all it needs to become working code is a text editor with a "find all and replace" to make a CPU specific assembler name


Please, please, don't mention the E word.

If you must, go ahead, but I want to know why.

Clive RobinsonDecember 11, 2019 6:18 AM

@ Thoth, SpaceLifeForm, ALL,

After one bunch of attacks on Intel SGX and Secure Enclaves now yet another new wave

"Tis the season of good will" and "Santa's little helpers" have been working hard all year ;-)

Whilst these are not "remote attacks" that malware writers are going to be very much interested in, it does raise the question of "who will be" interested (along with the lesser question of "Why?").

To which the fairly obvious but somewhat trite answer is, "It depends on what is put in the secure enclave".

Broadly it's "information protection" which falls into two other broad areas, "your information" and "others information". So can be seen as,

1) Product (IP) protection.
2) Communications protection.

A little history, back to the "Fritz Chip" idea pushed by "the Senator for Disney Corp", tells you that one big push behind such hardware security is the protection of movie and music rights holders' ability to 5cr3w you and the artists over, without having to use an "online licencing system".

Whilst the market has moved towards "online delivery" for movies and music, there are still other forms of "Offline licencing" for games and high price CAD/CAM and other business related products. Thus one group of people who might profit by these hardware attacks are those breaking licencing systems.

But we are all supposedly moving into an "online world" where we are "always connected" by our smart devices... Whilst that is broadly true for many, they generally do not realise what is required for privacy. That is, they tend to think that "secure comms" on public networks is all that is required. So some of these people have seen these secure enclaves as a way to manage various aspects of "secure comms", such as communications encryption keys and even the crypto algorithms -- in effect treating them like "built-in" "external media encryptors", which is a mistake.

Thus to the likes of certain government agencies being able to get at selected targets "key stores" would be highly desirable, and thus these attacks when added to another attack such as a RAT would be advantageous.

However there are others of us who realise that for privacy the current design of hardware can never be secure for the three information uses of "Processing, storage and communication". Thus the mitigation of full offline operation via "energy gapped" systems is the way to go with these hardware attacks.

But is it sufficient? Sadly no.

Even if you have energy gapped systems, you need to remember that there are "physical and insider" threats to privacy that still need to be mitigated, otherwise these attacks could be used along with other attacks to cross the privacy gap and turn an offline system into an online one. Thus safes and cages will still feature in the lives of the more cautious, and maids, evil or not, will be watched.

Hence the old saying about "eternal vigilance" is still rather more than valid and applies to all those who want privacy. Which after all is said and done is considered by many a foundation stone of freedom and liberty from tyrants, be they the elites, their puppets and their respective guard labour.

Clive RobinsonDecember 11, 2019 6:34 AM

@ SpaceLifeForm,

Please, please, don't mention the E word.

If you are thinking about "The Word with the LISP in it" no, I'm not even talking about a "programmers editor", just any text editor from around the mid to late 1980's will do, even Microsoft's "Notepad" or the editor they put in their early versions of BASIC.

Mind you "the E word" could cover "ed" I guess, but then if you were going to use that, then "sed" would be easier in the long run.

SpaceLifeFormDecember 11, 2019 2:51 PM

@ Clive

Your parse is spot on.

A script of find commands using the -exec option with sed is what I would use, if those tools are available.


SpaceLifeFormDecember 11, 2019 5:17 PM

@ Clive, ALL

Just want to expound on this a bit, for readers that may have heard other terminology:

' However there are others of us who realise that for privacy the current design of hardware can never be secure for the three information uses of "Processing, storage and communication". '

Processing - data in RAM, CACHE, REGISTERS
Storage - data at REST
Communication - data in FLIGHT

Note: data at REST, may include secure enclave, and possibly other hidden flash on mobo or in controller boards. It does not have to be traditional 'drive' storage.

Data in FLIGHT is not limited to Ethernet, fibre, wifi, bluetooth, RS232, X.25, or cell, as Clive can expound upon about EMF leakage.

FILODecember 11, 2019 5:56 PM

@SpaceLifeForm,

History is what's written to disk, we cat and we tee everything else.

Addwords?

EOF

ThothDecember 12, 2019 12:17 AM

@Clive Robinson

Intel's fix for Plundervolt - no voltage manipulation allowed in Secure Enclave executions.

The best fix is definitely not fixing, but disabling parts of the problem-inducing subsystems. Wonderful how problem solving in IT security circles is done these days, since it's less effort to create an Intel digitally signed microcode patch file that disables the relevant subsystems and just pipe it down to everyone.

Links:
- https://www.darkreading.com/vulnerability-management.asp
- https://www.intel.com/content/www/us/en/security-center/advisory/intel-sa-00289.html

HellBoiDecember 12, 2019 2:07 PM

Wes, thanks for the acknowledgement of the ID spoofing reality that currently plagues several billions of lives.

I still believe that a few malicious groups might have intended for this very bad situation to proliferate. I (re)discovered the group(s) by pondering which group(s) consistently and historically attack the several differing victim demographic groups.

Another way of thinking about this: I considered which group(s) best fit the likely profile of an attacker obsessed with consistently attacking the same certain groups over several very large consecutive, contiguous durations of time.

The only group(s) which match the formerly hypothetical profile of current ID spoofers astonishingly are those who fit several other profiles of perpetrators, both known and unknown, for the past several decades.

NAZI's and NAZIish/NAZIlike organizations and groups fit several profiles:

0) already existed and were NOT destroyed (PROJECT PAPERCLIP, 1946)
1) against USA
2) against Russia
3) against United Kingdom (UK)
4) against United Nations, cooperative treaties, peace pacts, mutualism and commensalism
5) against Civil Rights Era, immigrants, minorities, diversity, biodiversity, and dissent
6) in favor of fascism and extremes of experimental hierarchy and control and domination
7) in favor of severe propaganda and psychological warfare
8) in favor of extremist transhumanism and relentless anatomical violations
9) in favor of ecological destruction despite total lack of sustainability (salting the earth, slash and burn techniques)
10) partially cultlike and yet also antireligious
11) historically involved in extremes of neurotropic and psychotropic experimentation, including the most invasive and disturbing types of cybernetics
12) heavily involved with statistics, surveillance, logging, forced eugenics, DNA experimentation, atomic/nuclear/quantum weapons research, and always seeking to be superior in all fields of knowledge and technology etc.

Clive RobinsonDecember 13, 2019 12:45 PM

@ Thoth, ALL,

... but disabling parts of problem inducing subsystems. Wonderful how problem solving in IT Security circles are done these days ...

There are several ways to look at this, especially considering how long Intel engineers have had to look at the problem.

The first is, as you say, just "dis[able] and move on", due to laziness etc.

But if that were realy the case Intel could have done this months ago... So the question is,

    What have Intel been doing in that time?

One argument might be that, like last time, they were trying to keep things quiet until after the major consumer market sales of Black Friday and Xmas week. Another might be that, like last time, a senior exec has been "share dumping" or in other ways getting out ahead of the fiscal curve.

But there might be other reasons, one of which is that they cannot fix the problem any other way.

But there is also the fact that if it's a fault general to many subsystems, which a voltage regulator fault certainly would be, then this particular reported fault is just the very tip of the iceberg.

If that is the case, then that means this gift is going to keep giving, lots more goodies...

If this is the case, and the facts as we know them don't contradict the hypothesis, then there are a few more questions that come to mind,

    Firstly, this was a two-part attack, so what other "second parts" are there yet to be made public? And how many likewise form other two-or-more-part attacks?

But, as the voltage regulator is connected to so many other subsystems, the obvious question is,

    Secondly, are there any other vulnerabilities that affect multiple subsystems? And if so, how many, and what combinatorial effects will they have?

Remember, they don't have to be "obvious or designed" effects; they can be both "subtle and inadvertent".

It's known, for instance, that Intel made the ISA of their x86 architecture so complex that they ended up making a single instruction independently Turing-complete in its own right, over and above its intended functions,

    mov is Turing-complete (PDF)

Originally at,

https://www.cl.cam.ac.uk/~sd601/papers/mov.pdf

But now gone from there it can be found at,

http://stedolan.net/research/mov.pdf

Any program written this way would be effectively meaningless to the vast majority of programmers.

Oh, and neither the "mov" instruction nor the x86 architecture is unique in this respect. All sorts of unobvious state machines can be built by abusing the ISA of modern CPUs. Oh, and it gets more fun with Intel's MMU,

https://board.flatassembler.net/topic.php?t=15135

The point is they can get around what are otherwise thought to be "hardware architecture limitations", such as the Harvard architecture. With separate data and instruction core memory, it was --till "Return Oriented Programming" became known-- assumed by many to be secure against certain malware attacks.

Funnily enough, I warned back in May about these issues,

https://www.schneier.com/blog/archives/2019/05/thangrycat_a_se.html

name.withheld.for.obvious.reasonsDecember 14, 2019 3:41 AM

In the context of querying those interested in a public interest surrounding technological issues, a few questions.

Does anyone know of any actions the United States has taken in the recent past to pursue an individual, group, or any other styled “hostile non-state intelligence service”? Curiously, while the hair on the back of my neck stood up (no, the Tesla coil and the Marx generator in the lab are powered down), it occurred to me that executive authorities (which I understand to be completely unconstitutional and in effect give government primacy over the citizenry), by way of a directive, may provide insight into the current geopolitical environment. The technology exploited is referenced in Presidential Policy Directive PPD-20.

Firstly, it is the evisceration of the relationship between the citizen and her government as authorities enumerated require the submission of the citizen—ex-parte. Wherein the state’s power is derived from decree, not consent, and; that the state is superior respecting no citizen-sovereign.

What authority you might ask? Before answering that, how about an additional gift to and from the executive; unquestionably powered by secret order, the devolution of executive authority reserved exclusively to the executive in two circumstances only; bequeath to cabinet and sub-cabinet level directorate appointees the most awesome, consequential, and dangerous power—the power to wage war. This directly contradicts the constitutional authority reserved for congress.

Simply said; the president has assigned to her subordinates the power of warfare without congressional involvement (I don’t know if this is also true for the drafting of this directive. I am guessing, at best, it went by OLC—a farcical element within the executive attempting to short circuit the court).

Under PPD-20, a cursory legal analysis and a thorough reading of the policy suggest that this policy is a hybrid—statute-level authority derived from a memorandum-level effort. The Executive cannot in any manner prescribe authorities wherein all are subject, and leaves narrow not one.

You can go ahead and burn your copy of the Declaration of Independence now, it has been rendered impotent.

The invitation here is if anyone can suggest or identify the completely subversive clauses and describe the retreat from constitutional legitimacy this document represents…it is in there.
