Friday, October 11, 2024

Annoyances in Europe: The consequences of a totally bodged EU cookies law.

Websites of influential organisations, including the BBC, now punish you for exercising your rights.

What you as a user want is this:

1. Tracking disabled by default (it's opt-in, not opt-out).
2. A settings page spelling out the data-usage implications of each choice, with externally managed spot checks so operators are kept honest.
3. The freedom to leave it disabled: the operator should not be allowed to change the content you can access because of that choice.

Would this damage the analytics space? Yep. Probably. Who cares? We need ethics, not profits, to guide our rights. We should all be pleased that ethical legislation makes it harder to profit from unethical activity.

(Shit. Another blog post down the Facebook toilet. I'll post this to my tech blog, where the government steal my ideas 😊.)

Sunday, February 04, 2024

What I learned from COVID with two years of hindsight.

If there's one thing that will stay with me after lockdown, it's..... eczema. I have never had atopic dermatitis before and now I do.


It came from overuse of antibacterial handwash during peak lockdown.

Turns out, if you didn't already know, that overwashing is not a great idea because it washes away the acid mantle which.... hey presto...... actually protects you from bacteria. So, too much sanitising actually makes you More prone. To viruses? I don't know. This isn't my area of expertise. What I have read is you can fuck yourself up if you wash too much.

So, during lockdown I won myself an itchy rash on the backs of my hands around the knuckles where abrasive contact while washing was most intense.

It had subsided but now, maybe thanks to the fact I still take long daily baths and refuse to leave the house unless I have, the skin rash has spread. I have it on the backs of my hands and feet and arms.

I have looked it up and apparently there is no known effective treatment so I have had it now for three years.

I also don't really believe the government advice. Am I a conspiracy theorist? Well, not really, but it is true that nobody ever seemed to really work out how it was transmitted. Everybody knew it was being transmitted but, being a respiratory sickness, it Might have been through inhalation, not touch.

The guidelines then were based on a basic understanding of the germ theory of disease which dates back a hundred years.

Because the disease was very new and the medium of propagation wasn't known, the truth is that government medical advisers didn't really know what advice to give, so they came up with that because... at least it's something. Not only is it something, it's probably the only thing which could be done. It's why we had to wear masks too: they didn't know how you got it.

There were theories you could get it through your eyes, because if the suspicion that the virus was airborne was right then it could fall into your eyes like dust and make its way into your body that way. There was no advice on that because it sounded like speculation, even though it had about as much empirical backing as anything else, and what can you ask people to do? Wrap a scarf around their head? Wear swimming goggles?

So, touch and mouth were the ones which got the attention, and I still haven't seen any really good research on how it was caught, because how could you do it? God only knows, maybe it was transmitted by insect bites. You don't know exactly how you caught it, and it's too small to see in the air, so what can you look for?

What you could do is prepare different rooms and isolate one theory in each. Have a COVID aerosol in one, COVID-covered surfaces in another, and in a third have the aerosol but make everybody wear swimming goggles, then see what percentage of people contract it in each room. That won't be reliable, because you probably can't account for differences in susceptibility, and you need volunteers who want to expose themselves to a respiratory disease. Hey, rats and beagles, we could use your help. For ethically questionable reasons humans care less about members of other species dying. Why don't they care less about members of their own species dying? It's for their benefit and there's 8 billion of them. Humans, if you killed even a few billion people it would probably do more to help your own species, and almost everything else, than killing a few dozen rats.

So at least some of the government advice was not well grounded but was a best guess by science advisers who didn't know what to do or what to expect, because the whole thing which made it scary is that nobody knew what to expect; they hadn't seen it before.

The lesson here is in how the media capitalised on the debate by politicising it, taking polemically opposite sides of the argument, where one set of opinions became the left-wing view and another became the right-wing view.

It's definitely a lesson in ad hominem, since anything said by right-wing commentators was deemed wrong by the left because it came from the right-wing media, and vice versa, with both sides calling the other stupid.

Both sides had their points:

The left were right in that it is wiser to take a view from somebody qualified to talk about the subject rather than some random bozo.

The right were right in saying the advice was shady.

The synthesis of these apparently opposing sides is that, in this case, "the experts" were not specifically experts in COVID transmission, because nobody was, so what you got was speculative, generic advice based on a creaky, century-old germ theory of disease which could offer nothing more specific. So, wash your hands, you might be touching deadly germs which might wipe out the human population.

Monday, January 01, 2024

A step closer to generalised artificial intelligence.

Gemini AI is claimed to perform better in some tasks than human experts.


Years ago I read something which inspired me. The conclusion I came to then was that although computers are known to far exceed humans in some tasks in speed, accuracy and reliability, such as number crunching, there are others deemed somehow "magical" like empathy, creativity and humour. However, there is no reason at all to think computers cannot eventually exceed humans at Everything.

The denial that they ever will begs the question: why? When you don't have a reason to accept a claim is true, it is at least as important to acknowledge when you also do not have a reason to believe it is false.

I now firmly believe no sort of magic exists. My take on all phenomena is essentially Newtonian to the extent I believe even things like Probability Density Functions and other strange quantum behaviour will have a simple causal explanation even if we can't discover it.

Conway's Game of Life shows us that very complex behaviour can grow from very simple production rules. The Game of Life has been proven to be Turing complete, which means that, using the exact same rules it runs on, you could build any artificial intelligence which can be computed any other way. It would be hugely inefficient to try, but with sufficiently fast processing it could be done.
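
If you have never played with it, the entire rule set fits in a few lines of Python (the function and variable names are mine; the glider is the standard five-cell pattern):

from collections import Counter

# Conway's rules in full: a live cell with 2 or 3 live neighbours survives,
# a dead cell with exactly 3 live neighbours is born, everything else dies.
def step(live):
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The standard glider. After four generations the same five-cell shape
# reappears shifted one cell along the diagonal.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
later = glider
for _ in range(4):
    later = step(later)
print(later == {(x + 1, y + 1) for (x, y) in glider})   # True

Nothing in those rules mentions movement, which is exactly the point about gliders I come back to below.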

This implies to me it may not be unreasonable to suppose the universe is some sort of vast automaton underpinned by very simple rules at the ultramicroscopic scale.

I sometimes now even think of motion as quantised and discrete, in the same way the motion of gliders in Conway's Game of Life shows that motion can be perceived in things which are not really moving. When an object appears to move from one place to the next it is really a chain reaction: the phenomenon in one location dissolves while a very similar phenomenon is generated in an adjacent space. So, I think we may eventually prove motion is an illusion.

Sunday, December 10, 2023

The infinite regress paradox of time. A proof-by-contradiction that time cannot exist.

1. For time to exist it must require a mechanism through which its function is defined (even if humans are unable to discover, determine or define it).
2. But for such a mechanism to exist, the mechanism itself would need to operate within the paradigm of time, producing an infinite regress with no plausible origin.
3. Therefore the human perception of time must be an illusion.

Corollary: Any duration of time, be it a second or the period of the radiation emitted by the transition between the two hyperfine levels of the ground state of a caesium-133 atom, can only be made meaningful when stated as a frequency ratio against some other known recurring phenomenon. Therefore any duration of time can only ever be stated as a multiple of some other duration of time, never against an absolute frame of reference. Consequently any concept of time is a self-referential supposition rooted in circular reasoning.
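
To put a number on that corollary (the 9,192,631,770 figure is the SI definition of the second; the helper function is mine, purely for illustration):

# The SI second is defined as 9,192,631,770 periods of the radiation from
# the caesium-133 ground-state hyperfine transition: a count of one
# recurring phenomenon, never an absolute quantity.
CS133_PERIODS_PER_SECOND = 9_192_631_770

def as_caesium_periods(seconds):
    # Any duration we can state is only ever a ratio against that cycle.
    return seconds * CS133_PERIODS_PER_SECOND

print(as_caesium_periods(1))      # 9192631770
print(as_caesium_periods(0.001))  # a millisecond is still just a smaller ratio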

Tuesday, December 05, 2023

Let's stop maths: We started in the wrong place.

 I know that nobody else is as interested in mathematical foundations as I am, but hey, I rap about things you aren't interested in all the time. :)


This time it's the successor function.

https://en.wikipedia.org/wiki/Successor_function

It's like succ(n). Sure, I don't think there's anything much wrong with adding 1 to something, but it's just that "maths starts here" thing which makes you go "WHY?". These things always seem to be some arbitrary cop-out, like "We can't actually explain why we did this but, well, we felt we had to do something, and this is that thing". It isn't packaged with a perfectly reasonable explanation for why that thing in particular should exist and should not be discarded or replaced with something else.
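
For reference, this is roughly the construction I'm complaining about, sketched in Python (the names are mine; Python obviously already knows how to add, so it only shows the shape of the idea):

# A Peano-style sketch: succ as the primitive, and addition defined purely
# in terms of succ by recursing on the second argument.
def succ(n):
    return n + 1

def add(a, b):
    # add(a, 0) = a ;  add(a, succ(b)) = succ(add(a, b))
    return a if b == 0 else succ(add(a, b - 1))

print(add(5, 17))   # 22, built out of nothing but "one more than"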

You might start arguing against it by saying adding one is just a special case of adding, probably no more magical than adding 5 or 17, but I think even adding is just a special case of something else.

That way it moves up, rather than down, levels of abstraction.

I prefer a top-down approach to forming a philosophy: begin with the highest level of abstraction as the most fundamental thing, and let everything else fill in at reduced levels of abstraction and greater specificity.

My own starting point is a relator function. This is what I was taught at 11 in the elite-kids maths class of my middle school.

You begin with a transformer shell with no content:

->R->

The R is a label on a box. Inside the box is a mechanism which takes some form of input, does something with it, then produces output having done that thing.

That is a high level of abstraction: The set of things which do things to things.

Then in that box you can place whatever you want. Not only a successor function but addition, multiplication, a random number generator, anything you want. So, at first, the whole set of functions itself becomes the most general case.

What you then have is a tool which you can use to make a kit bag of things you need for whatever you are trying to do. The kit bag of things you can make can include an axiomatic basis for mathematics and all the bits you need to do it.
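
Here's a toy sketch of that in Python (the names relator and box are mine, not standard terminology):

import random

# A "relator": a labelled box around any mechanism which takes input, does
# something with it, and produces output.  ->R->
def relator(label, mechanism):
    def box(*inputs):
        return mechanism(*inputs)
    box.label = label
    return box

# Fill the box with whatever you want: a successor function, addition,
# a random number generator...
succ = relator("succ", lambda n: n + 1)
add = relator("add", lambda a, b: a + b)
rng = relator("rng", lambda: random.randint(0, 99))

print(succ.label, succ(4))   # succ 5
print(add.label, add(2, 3))  # add 5
print(rng.label, rng())      # rng followed by some number between 0 and 99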

I think it's time people stepped out and admitted there isn't only one possible axiomatic basis for maths and that any such basis is, at some level, quite arbitrary. So, rather than merely have a mathematical basis, I think the starting point should be the acceptance that there is an arbitrariness to mathematics as a thing. Maths is not a fundamentally existing thing but a tool, one of an infinite variety of tools and arbitrary things we could have made out of the more fundamental and highly abstract idea of things which do things to other things.

That "things which do things to things" folds in the crucial concept of cause and effect. Having defined some kind of substance and its properties you then define the things with properties which act on those things.

In that way I think existential matters, such as what are the basic laws of the universe, can escape from being placed inside mathematics. It separates the two. Real world object behaviours exist. Mathematics is something we invented and now maths is the tool we use to quantify and describe observations about real world things and how real things act on other things. However, if you want to do that then why force-fit those real things to mathematics? It means saying "Hey, we got all this real stuff and we have this tool we made so let's take this conceptual tool and use it to describe what manifests in the real world".

How about we don't? How about we first accept maths and real-world physical interactions are two separate realms? We may devise a fundamentally far more appropriate tool for describing reality than mathematics. The tool we devise may be completely foreign to mathematics and incompatible with it. We can create a tool especially for dealing with real-world physics without going via the pre-existing tool of mathematics. Not only do I think we can do this, I think we should. Let's set maths aside, keep what we can already say about physics using it, and from that body of theory devise a far better and more appropriate tool for coping with those real-world phenomena than mathematics.

A relational operator can do this because real-world physics is still "things that do things to things" (or perhaps more fundamentally something which acts on itself), but I don't think mathematics is its basis. Mathematics is just a tool we already had which we now use. So I say that starting with "R" we can build maths, or physics, without having to have both. We don't do enough to keep those worlds apart, and we don't do enough to point out that both of those worlds fit under the one umbrella of things and the things which change them.

Monday, November 13, 2023

High-end PC, November 2023.

Want a top-of-the-range PC? I found one for you!

The NVIDIA DGX GH200 Deep Learning Console.

Look along the corridor and you will notice pairs of gold panels on both sides of the image, arranged in banks of four pairs. Each single panel of those eight is an NVIDIA GH200 GPU.

Those GPUs cost upwards of $289,000 each.

The DGX GH200 console connects 256 of them.

In this image you can only see about 80 so the full length of the corridor is three times the length of the gangway you see here.

It's a bit memory constrained, if you ask me. Each GPU has a little over half a terabyte of RAM, which means the whole thing has roughly 144 terabytes of RAM in total.

That means if you have a 32GB computer, this machine has about 4,600 times as much RAM as yours*

*This figure is not exaggeration for effect, it is an approximation. The actual value is 4,608. So, about 4,600 times as much RAM as a 32GB computer.
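
Here's the back-of-envelope arithmetic, for anyone who wants to check it (the roughly 576 GB per-chip figure is NVIDIA's published spec; everything here is an approximation):

# Back-of-envelope arithmetic for the figures above.
chips = 256
per_chip_gb = 576                 # a little over half a terabyte each
total_gb = chips * per_chip_gb    # 147,456 GB, i.e. roughly 144 TB

my_pc_gb = 32
print(total_gb, total_gb // my_pc_gb)   # 147456 4608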

It is very hard to imagine how much it would cost and it is hard to find information as the sales department do not want to say anything which would stop companies like Microsoft and Google beating a path to their door. My estimate is it would cost something like £50 million.

Wednesday, November 01, 2023

Why the 68000 is not 32bit.

For James Ross and Simon Harris. Your argument is remarkable because you have, pretty much, described why your subsequent argument is not valid. You begin by giving a prescient explanation of exactly why what the user sees /does not mean anything/, and the reason is the one you gave: what it can be made to /look like/ to the user does not matter. What matters is what the CPU does at an operational level. After that, the higher layers can be made to /look like/ anything you want.

What you can do on, say, a 6502, is have code like this:

CLC              ; clear carry before a multi-byte add
LDA augend       ; low byte of the first operand
ADC addend       ; add the low byte of the second operand
STA sumtot       ; store the low byte of the result
LDA augend+1     ; high byte of the first operand
ADC addend+1     ; add the high byte plus the carry from the low bytes
STA sumtot+1     ; store the high byte of the result
RTS

augend .word $8328
addend .word $abfd
sumtot .word $0000

Oh look at those 16bit words!

Look at the fact it summed a 16bit value!

Tell yourself the truth: They are NOT 16bit words are they?

The fact it is /presented to the programmer as if a 16bit value/ is not relevant, is it?

You could just as easily have .dword or .qword directives, couldn't you?

You could have assembly lines like:

.dword $5629abd5, $4c8a9f5c

Is the 6502 a 32bit CPU because you can have assembly instructions which will present RAM contents as if they are 32bit?

NO!

It makes no difference.

Why?

You know why.

It makes no difference because the 6502 is still 8bit no matter how many bits of notional representation you have.... which can be extended arbitrarily. It is STILL 8bit because what it /looks like/ to the user or programmer is IRRELEVANT. It's irrelevant because what it looks like to the programmer is NOT what happens operationally on the CPU. The CPU remains 8bit even if you have 128bits of assembly language representation.

The flaw in your argument is your desire to pass off pairs of 16bit values, expressed notionally as one 32bit value, as somehow changing the facts of the underlying architecture. It no more does that than a .word directive on a Commodore 64 assembler does.

"however wide their data bus have opcodes that can work on 8,16 or 32 bit values"

No. It can't. That's the point.

1st: Opcodes don't DO anything. An opcode isn't a functional thing. It isn't a piece of hardware. It's an abstract notional entity.

The only relevance an opcode has to the bit rating of the CPU is its size in bits, which on the 68000 is 16, because the 68000 is a 16bit CPU. The 68000 has 16bit opcodes. So you're back to where you started again: it is like this because it is, at most, 16bit.


Also..... the whole point of your sentence is a desire to push this whole argument out of the realm in which it is situated, the actual corporeal world of hardware which can be touched and weighed, and into the realm of pure notion, which isn't real, it's just in your head. Then you have to try to argue that the reality is irrelevant compared to whatever is in your head.

You have to accept that argument has no credibility because clearly how it looks to you /in your imagination/ is NOT what is important.

By your own initial gambit you have already accepted it can be made to /look like/ anything you want.

You can easily theorise about an imaginary chip with a few machine code instructions which is an 8bit CPU with up to 16 instructions indicated by the top four bits of each 8bit instruction word. You can imagine the lower four bits are the number of 8bit machine words to pull through the ALU. A bit like a vector processor. It would mean you could..... using your argument...... declare it to be a 128 bit CPU because you will have microcoded instructions which start processes which act on up to 16 successive 8bit values IN SEQUENCE.

Surely you can see clearly why that is a false premise. Starting a production-line process which can theoretically operate on an arbitrarily long series of 8bit numbers tells you exactly one thing: that these operations are occurring on an 8bit micro.
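
To make that imaginary chip concrete, here's a throwaway decode sketch in Python (everything in it is invented for the thought experiment; I've read the low nibble as "count minus one" so it can reach 16):

# The imaginary chip: top 4 bits select one of 16 opcodes, low 4 bits give
# one less than the number of successive 8-bit words the microcode pulls
# through the 8-bit ALU, so one instruction can touch up to 16 bytes,
# 128 bits' worth, but only ever 8 bits at a time.
def decode(instruction_byte):
    opcode = instruction_byte >> 4
    word_count = (instruction_byte & 0x0F) + 1
    return opcode, word_count

print(decode(0x3F))   # (3, 16): "128-bit" on paper, 8-bit in the ALU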

If I do something like this:

LDX bytes        ; starting index: how many byte pairs to process
loop:
CLC
LDA $1000,X      ; low byte of pair X
ADC $1100,X
STA $1200,X      ; low byte of the result
LDA $1300,X      ; high byte of pair X
ADC $1400,X      ; add with the carry from the low bytes
STA $1500,X      ; high byte of the result
DEX
CPX #$FF         ; loop until X wraps past zero
BNE loop
RTS

I have thus set up a piece of assembly code which operates on an arbitrary number of bytes up to 256.

Does this imply the 6510 is a 2,048 bit CPU?

No. Obviously it does not, and the mere suggestion it does is clearly silly. It is NOT a 2,048 bit CPU because it only operates on 8bits at a time and nothing you can do programmatically changes that fact. No argument about having .word assembly directives changes the fact either. You know for a fact those "arguments" are not relevant. There is nothing any programmer or assembler can do to change the fact the 6510 is 8bit: not any number of successive 8bit operations, and not any amount of representing larger-than-8bit values in an assembler.

OK. I want to take that one more step before relating it back to the 68000.

Imagine I did this instead:

CLC
LDA augend       ; low byte add, as before
ADC addend
STA sum
LDX wideflag     ; zero = stop after 8 bits, non-zero = carry on
BEQ end
LDA augend+1     ; high byte add, using the carry from the low bytes
ADC addend+1
STA sum+1
end:
RTS

augend .word $1234
addend .word $5678
sum .word $0000
wideflag .byte $01

So.... what does that do?

Well, it does this:

That program will do one of two things. It will Either add a single pair of 8bit values OR, if a flag is set, it will continue and add the second pair of 8bit values with carry, thus creating a byte pair which can be presented to the user as a single 16bit value.

Now..... did that make the 6510 16bit?

Obviously NO! So why did I do that? It just happens to be Exactly how the MICROCODED 68000 actually works.

If I write a 68000 instruction in assembly:

add.w d0,$1234

It is assembled as:

00001000 D1
00001001 78
00001002 12
00001003 34

Before moving on from that: Anything else you notice about it?

The memory addressing is Not 32bit. It is not even 16bit. It is 8bit. The Motorola 68000 was released in 1979. In many ways, for example instruction timing and memory addressing, it behaves more like an 8bit CPU than a 16bit CPU. Anyway.... I digress.

On another tangent, look at the branding too: 68000. It's great isn't it? What if they /called it/ the 6900? It isn't irrelevant to point out just how easily people's brains are hijacked by cool branding. The 68000 definitely does have cool branding, great marketing and a great public image. If it was /called/ the 6900 then, for no reason except having four digits in its name, it wouldn't /look/ quite so 16bit anymore, would it? This is psychological. Four-digit names are more strongly associated with 8bit CPUs and five-digit names with 16bit and above. It's a statistical correlation which influences notional thinking. You can do a lot with branding, as Motorola's then marketing manager did. 68000 sounds far cooler, bigger, cleaner and more sophisticated than 8086 but it /means/ nothing. 68000, being a cool, big, clean number, doesn't mean anything, but it bends people's minds anyway.

Back to the 68000 as it is......

If I write:

add.w #$7654, d0

It is assembled as:

00001000 0640
00001002 7654

If I write:

add.l #$76543210, d0

It is assembled as:

00001000 0680
00001002 7654
00001004 3210

What can you say about the high nibble of the low byte of the machine code for the instruction?

You can say that bits 2 and 3 of that nibble encode the operand size, i.e. how much data has to be loaded after the instruction.

In fact, to be really pedantic about it, that two-bit field is log_2 of how many bytes of operand have to be loaded.

2^0 = 1
2^1 = 2
2^2 = 4
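
You can check that reading against the three encodings quoted on this page with a few lines of Python (just an illustrative sketch, nothing official):

# The three encodings quoted here: 0x0600 (add.b), 0x0640 (add.w),
# 0x0680 (add.l).  The two size bits sit at bits 6-7 of the instruction
# word (bits 2-3 of that nibble); the operand is 2**size bytes, padded out
# to a full 16-bit word in the instruction stream.
for word, mnemonic in ((0x0600, "add.b"), (0x0640, "add.w"), (0x0680, "add.l")):
    size_field = (word >> 6) & 0b11
    print(mnemonic, "size field:", size_field, "operand bytes:", 2 ** size_field)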

This is consistent with:

add.b #$76, d0

Which assembles to (can you guess?):

00000100 0600
00000102 0076

So.... what's consistent about all three?

1. The instruction code is always 16 bit.
2. The operands are always measured in 16bit clusters. Even when you choose an 8bit value.
3. Each of those 16bit parts must be 16bit aligned: They cannot begin on odd numbered memory addresses. They must begin on 16bit aligned RAM locations. Not 8, not 32, but 16.

Why? Because the 68000 is a 16bit microprocessor.

So, what happens then when you use:

add.l #$76543210, d0

Is that 32bit somehow?

No.

It works like the assembly code I wrote earlier.

What happens is this:

1. It first takes a 16bit instruction code from RAM.
2. It then decodes that instruction.
3. The instruction code incorporates a size field which instructs the microprocessor to take, in succession, each of the two 16bit values in RAM following the instruction, feed them into the ALU one after the other, and add each, one at a time, to the corresponding 16bit half of the on-chip storage reserved for the d0 register.

So... AT NO POINT DOES ANYTHING 32BIT OCCUR.

Let me say that one more time:

AT NO POINT DOES ANYTHING 32BIT OCCUR. Not even if you use .l instructions.

So, add.l doesn't get you anything. It is just a prompt to the CPU to perform two 16bit operations one after the other.
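
If it helps, here's the same trick written out in Python rather than microcode (a sketch of the idea, not a claim about how the silicon is wired):

# A 32-bit add done as two 16-bit adds: low words first, then the high
# words plus the carry from the low half, the same trick as the 6510
# code above, just with 16-bit chunks instead of 8-bit ones.
def add32_via_16(a, b):
    lo = (a & 0xFFFF) + (b & 0xFFFF)
    hi = (a >> 16) + (b >> 16) + (lo >> 16)   # lo >> 16 is the carry
    return ((hi & 0xFFFF) << 16) | (lo & 0xFFFF)

print(hex(add32_via_16(0x76543210, 0x00001234)))   # 0x76544444
print(hex(add32_via_16(0x0001FFFF, 0x00000001)))   # 0x20000, the carry ripples up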

It does this because a 68000 is a 16bit CPU. What a disassembly can make it look like clearly doesn't change anything, just as a .word directive or a word-size memory dump on a 6510 does not make the 6510 an 8/16bit CPU. The programmer-level representation is inadmissible because you can make it anything you like and it doesn't change the underlying reality. The ground truth is still the same. 68000: 16bit instruction length, 16bit ALU, 16bit data bus, 16bit-aligned instruction stream.