Jack and The Beanstalk
Well, my intention was not really to recount a fairy story; I could have entitled this topic “How your Computer Works” or, more correctly, “How I Think your Computer Works”.
With those two potential titles at the back of my mind, I am hoping the title chosen has prompted some to read to this point if only out of natural curiosity.
My first problem in trying to explain (even to myself) what is undoubtedly a highly complex topic, in what I hope will be a rational, easy-to-understand way, was where to start and in what order to proceed. Explanation of such complexity does not lend itself to taking a nice sequential path.
So, I did what many engineers do when faced with a problem: I looked around to see what nature did, and did things the same way. Tackling things in the same order as the educational processes we have all been through seemed likely to prove the easiest and quickest path to full understanding.
We are all individuals, with mindsets that are unique to each of us. No one can know exactly what is in another person’s mind unless they tell us. Even then we don’t know whether they’re telling us the truth or a lie. Just as beauty is in the eye of the beholder, so truth or lie is in the mind of the recipient. (A fact which many political commentators conveniently forget – how many times recently have I heard the question “does telling the truth matter?”)
In nature, communication is always between two. Even when only one person is involved, as in reading and writing, it is between that person as writer and that same person as reader. To achieve understanding of what we have written we always, in the first instance, write and then “read and modify” to make sure we’re conveying the message we intend. Once we introduce a third party as reader, correct understanding is achieved by question and answer.
Truth or lie translates into electronic or programming terms as true or false.
In the real world, inter-personal communication is not so clear cut. Between total truth and total lie there exists a grey area where statements contain something which may be interpreted either way. Perhaps the full truth is extremely complex and would take days or months to explain fully. We don’t have time for this, and it may not be vital at the current level of understanding anyway, so we may try to convey a rough “picture of things” by analogy, exaggeration or both.
I would ask you, the reader, to mull the foregoing over in your mind. I’ve done so and come to the conclusion that truth/falsehood represents a digitisation of our thinking process and enables us to mentally translate “things” into logical true/false statements ideally suited to electronic manipulation.
Of course, in this process we have lost all the information in between – information that can only be processed in the human (animal) mind.
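To make that digitisation concrete, here is a minimal sketch in Python. The `digitise` function, its 0.5 threshold and the example statements are purely illustrative inventions, not anything built into a computer:

```python
# A minimal, purely illustrative sketch: a shade-of-grey judgement
# (a confidence between 0.0 and 1.0) is forced through a threshold
# into a bare True/False, and everything in between is discarded.

def digitise(confidence, threshold=0.5):
    """Reduce a grey-area judgement to a single true/false bit."""
    return confidence >= threshold

statements = {
    "The sky is blue today": 0.90,   # almost certainly true
    "It will rain tomorrow": 0.55,   # the grey area
    "Pigs can fly":          0.01,   # almost certainly false
}

for text, confidence in statements.items():
    print(f"{text:<25} -> {digitise(confidence)}")
# Only True/False survives; the 0.90, 0.55 and 0.01 (the information
# "in between") have been lost, exactly as the paragraph above says.
```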
Our natural way of thinking makes it very easy, perhaps too easy, to wander off-topic to things that lead nowhere, or to things that would be relevant but take us on a path so long that we never actually get anywhere. I’ll try to resist that. But I feel I must, at this stage, state what I’m hoping ultimately to achieve.
It is not that you, the reader, will be able to say: “my computer works like …….”.
It is that you will be able to say to yourself: “I know how my computer works and am confident in my own mind that the picture I have is correct”.
For this to come about it will require readers to mull over the facts, inter-relationships, concepts and exaggerations, whilst rationalising it all in their own mind. Some factors will be illustrated in different ways, some in more than one way. Ultimately however, everything should fit together to give a clear picture, unique to each reader, rather like some clear image emerging out of a fog. This, after all, is just the normal learning process we take for granted.
Of course, I will not know any of this unless I get questions or feedback in some way. Nor will I know if such knowledge is of any value. Nor will I know if I have made mistakes in my reasoning unless it is challenged.
Let’s get back to Jack.
When my parents took me to that pantomime many years ago now, I had no idea that it was teaching me anything. I had no idea if what it was teaching would ever be useful let alone in a programming context of 2020.
But it was. In fact, it was teaching quite a lot. Jack did not take his beans to market as his mother instructed; whether he was just lazy or curious – the reason does not matter – he scattered them on what turned out to be a fertile patch of land. A beanstalk grew, etc., etc.
This story brings together the inter-relationships of real-world entities: animal, vegetable and mineral. It illustrates Jack getting out of the real world (climbing the beanstalk) into a world that can only be imagined.
It illustrates how, even in this imaginary world, proportionality must be maintained: a Giant (animal), a Castle (mineral) and a Beanstalk (vegetable).
Notice there are TWO categories: Real and Imaginary.
Consider this in conjunction with the TWO categories mentioned earlier: True or False.
Now the TWO categories of Reader and Writer.
Now we’re ready to take a first peek into Mathematics. We can consider mathematics as addressing TWO separate categories of calculation: Real and Imaginary.
The first of these (Real) is what we’re all familiar with: addition, multiplication and so on. Such calculations are ideally matched directly to the workings of a digital computer.
The second (Imaginary), some of us may not be so familiar with. It embraces such things as infinitesimal calculus and matrix manipulation. We need not go into this in any more detail at this stage; suffice to say that calculations in this category do not directly match the workings of a digital computer. The human mind is not just the only place where they will be understood; it is also the only place where they can be undertaken.
It is simply impossible to bridge the gap between real and imaginary other than mentally.
But, because mathematical calculations are essentially abstract, we can effectively bridge the gap, allowing us to calculate the imaginary factors in the same way as the real. And, as we know, a bridge runs between TWO points. In this case the start point occurs when the evaluation of a calculation gets close to zero, but not quite, and the end point gets close to infinity, but not quite.
Now, of course, the further apart the start point and end point are, the longer the bridge will be, and the longer the bridge the more time it will take to cross. The bridging process has introduced Time. (When we move on to understanding the electronics, we shall see we cannot simply walk faster as we may do in real-life.)
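As a purely illustrative sketch of this bridging (the function and the figures below are invented; the only reason for choosing 1/(1+x²) is that the exact answer is known to be π/2), here is how a continuous calculation can be approximated by summing many small real steps, and how the longer bridge costs more time:

```python
import time

# A minimal, illustrative sketch: approximate a continuous ("imaginary")
# calculation by summing many small, real steps. The step size plays the
# "close to zero, but not quite" end of the bridge; the upper limit plays
# the "close to infinity, but not quite" end.

def integrate(f, start, end, step):
    """Crude rectangle-rule approximation of the area under f."""
    total = 0.0
    x = start
    while x < end:
        total += f(x) * step
        x += step
    return total

f = lambda x: 1.0 / (1.0 + x * x)   # the exact area from 0 to infinity is pi/2

for end, step in [(10.0, 0.01), (1000.0, 0.001)]:
    t0 = time.perf_counter()
    area = integrate(f, 0.0, end, step)
    elapsed = time.perf_counter() - t0
    print(f"end={end:8.1f}  step={step:<6}  area={area:.5f}  took {elapsed:.3f}s")
# The longer bridge (smaller step, larger end point) gets closer to the
# true value of pi/2 (about 1.5708), but it takes longer to cross.
```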
A brief look at Time
It is appropriate at this point to take a brief look at time. We all know that nothing can be done in zero time, nor can we learn anything in zero time. By using Jack and the Beanstalk I have in effect compressed learning time; by not explaining the imaginary category of mathematics in any detail, I have, at least for the time being, reduced explanation time to that required to read the above paragraph. OK, I know it told you nothing!
Telling the Time
Of course, we can all tell the time. Easy – just look at our watch.
But is that all it is telling us? On the face of it, yes, but it also gives us some insight into both the real world and electronics that we shall need later. We can read the hour reasonably closely as the hand moves past a numerical graduation on the watch face. But now look at the second hand – trying to read its position to within a second is not so easy. In fact, the marked graduations represent a digitised time difference, whereas the hands themselves are moving continuously.
As we shall see (or reason) later, it is just not possible to identify the exact instant at which the second hand first finds itself over any fixed graduation on the watch face – it is always just before or just after. We can always make the separation smaller and the hands thinner, leading to the concept of thresholds. In real life we’re always dealing with thresholds (the bride carried over the threshold – but when exactly?). It is the same with electronics: we can never measure anything exactly; time is a factor, so any measurement is always before or after some threshold.
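Here is a small, purely illustrative sketch of that “just before or just after” idea; the sampling interval and the graduation are invented numbers chosen only to show the threshold at work:

```python
# A minimal, illustrative sketch: the second hand sweeps continuously at
# 6 degrees per second, but we can only look at discrete instants, so we
# only ever catch it just before or just after a chosen graduation.

SAMPLE_INTERVAL = 0.13   # how often we "look", in seconds (arbitrary)
GRADUATION = 30.0        # the mark we are watching, in degrees

def hand_angle(t_seconds):
    """Angle of a continuously sweeping second hand at time t."""
    return (6.0 * t_seconds) % 360.0

t = 0.0
while hand_angle(t) < GRADUATION:
    t += SAMPLE_INTERVAL

before = t - SAMPLE_INTERVAL
print(f"t={before:.2f}s: hand at {hand_angle(before):.2f} degrees (just before the mark)")
print(f"t={t:.2f}s: hand at {hand_angle(t):.2f} degrees (just after the mark)")
# Shrinking SAMPLE_INTERVAL narrows the window but never closes it:
# every reading falls on one side or the other of the threshold.
```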
The thing to note at this stage is that the difference between the “working” of things in real life and the “working” of things in electronics is simply one of speed. (There are other differences that we’ll come to later.)
Bridging the Gap
To explain how we can bridge this timing gap, it is easier if I ask you to forget about your watch for a while and instead think about a time-piece that preceded it – a Grandfather Clock. It ticks and tocks.
It ticks and tocks from one side of the bridge to the other, the time it takes to do this being regulated by a “swinging” pendulum. The rate at which it swings is controlled by TWO factors – the length of the pendulum and the pull of gravity on the weight it carries. We can, of course, vary the length, but it is natural law that determines the rate of swing.
If we were to attach a pen to the end of the pendulum so that it was able to write on a moving sheet of paper behind, it would draw a sinusoidal wave shape.
The foregoing corresponds exactly to what is going on within the electronic wizardry of a digital computer – the natural laws of physics which control the pendulum’s swing rate apply in just the same way as the natural laws of physics which control the frequency of the computer’s inbuilt oscillator. (I will expand on this later.)
There are, of course, differences between electronic and real-world operations, but for the purposes of understanding what is going on, the difference in this instance is simply one of timing, which we can conveniently ignore: the pendulum takes seconds to swing, the oscillator microseconds or less.
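For anyone who wants rough numbers on that difference in speed, here is a small sketch; the one-metre pendulum and the 3 GHz clock are round, illustrative figures rather than measurements of any particular machine:

```python
import math

# Small-swing period of a simple pendulum: T = 2 * pi * sqrt(L / g).
L = 1.0      # pendulum length in metres (illustrative)
g = 9.81     # gravitational acceleration in m/s^2
pendulum_period = 2 * math.pi * math.sqrt(L / g)

# An assumed 3 GHz clock oscillator inside the computer (illustrative).
clock_hz = 3e9
clock_period = 1.0 / clock_hz

print(f"one swing of the pendulum:  {pendulum_period:.3f} seconds")
print(f"one 'swing' of the clock:   {clock_period * 1e9:.3f} nanoseconds")
print(f"clock ticks per pendulum swing: {pendulum_period / clock_period:,.0f}")
# Both beats are set by natural law; for our conceptual picture the only
# difference that matters is the speed: seconds versus nanoseconds.
```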
Processing Information
Hello Terry,
Your "Jack and The Beanstalk" is highly philosophical and that is not my strongest point :unsure:
But about your remark "It is that you will be able to say to yourself: 'I know how my computer works and am confident in my own mind that the picture I have is correct'": I can say that I was indeed confident enough to say that in the early years. Now I can only say the opposite. Like the situation I posted in https://www.xsharp.eu/forum/public-vo-v ... op-working – I see things like that more and more often.
Dick
Your "Jack and The Beanstalk" is highly philosophical and that is not my strongest point :unsure:
But about your remark "It is that you will be able to say to yourself: “I know how my computer works and am confident in my own mind that the picture I have is correct”. I can say that I was indeed confident to say that in the beginning years. Now I can only say the opposite. Like the situation I posted in https://www.xsharp.eu/forum/public-vo-v ... op-working, things I see more often.
Dick
Hello Dick
Thank you for your comment.
We probably all accept the idea of electrons whizzing round a central nucleus of protons, neutrons and so on, yet we have never actually seen such a thing.
Nevertheless there is abundant evidence around us to point to it being a fundamental truth we can all accept and build on.
Same with my "Air Traffic Picture" reasoning. If we can simply accept the "picture" as being correct (whether we follow the underlying philosophy or not) we can use our source code to generate a diagram we can all understand.
That is where I'm leading.
Our diagram follows a standard "diagramming pattern". In fact it would take the form of thousands of Route Diagrams, all having a consistent pattern.
Too many diagrams to check manually, but the logic generating the diagram could itself be computer-checked and errors flagged.
Even if it were to take a few hours of "computer-checking", how much better that would be than publishing with undiscovered bugs.
If we had such Software Tooling / Diagramming Software, the code you posted (Data Dialog/bBrowser), switched from compiled output to generating a Diagram Output, would then have flagged up where the problem was.
Terry
Hi Terry,
This post is a nice read. You are a really good writer, and the peek you give into your thoughts is interesting, but...
If you really want to understand how a computer works, I think this page would be better: https://eater.net/8bit
About time: a Swedish scientist said in a radio program that science knows very little about how time works. Basically, time is the thing that we measure with a watch. The rest is still a mystery.
/Mathias
Hi Mathias
Thank you for your comments.
Building an 8-bit computer from scratch does indeed address the logic gates, and the wiring thereof.
Understanding this is difficult enough; extending it to a modern 64-bit machine and at the same time trying to conceptually tie it in to the operation of an actual application would simply be impossible for me, and I suspect for many others.
This topic – Jack and the Beanstalk – was meant to take a step back from the complexities of electronic switching and take a purely conceptual view of what is happening behind the scenes.
The two topics, I suggest, complement one another rather than being alternative views of the same thing.
With hindsight I should perhaps have entitled this differently: "Conceptualisation of How your Computer does things"?
It is true we don't know much about how time works. We never have, and my guess is we never will. But we can accept the ideas behind periods of time - close to zero, close to infinity - in calculus. We can accept the idea of terminal velocity. Our minds can accept the mathematical concept of continuity. All of this ties up, not in ways we can see, but in ways which can be evidenced by what we know about our environment, e.g. radio transmission.
That is why I tried to tie my view of computer operation to things that can be evidenced in our environment - e.g. pendulum swings, oscillators, the national grid.
Digitisation always results in loss of information.
Some time ago I posted "Dot Net from the ground up" (page 4 Chit Chat). It expressed some of the ideas behind this J&B post.
Regards
Terry
Hi Dick / Mathias
I have been reflecting on the comments you both made, and realise now that this post did not get across as intended.
That was to a large extent my fault. The title was wrong, or insufficiently focused. What I really meant was “How to envisage what any application is doing when it is running”.
The fact is that the underlying reality – the circuit-switching, the time taken to effect that switching, and then doing something whilst those switches are set – is far too complex for anyone to follow.
As Dick said, in days gone by, he was able to envisage how his programs were running. It was easy. I suggest that was because we could all “see things” in two dimensions and relate that in our minds to doing (calculating) things with pen and paper.
Things have moved on, however; fortunately there is still an easy way to envisage "a running program". Perhaps it is not so immediately obvious, and it is hidden behind the scenes, but we can "see" the program as operating in 3 dimensions.
I suggest this will make things easier since it is exactly how we have to operate effectively in real life.
I realise that to understand the J&B reasoning some knowledge of, unsurprisingly, electronics is required. But it is not vital; I could simply ask you to think in 3-D terms.
I am sure you will find many advantages in thinking that way – but not all are for today.
Terry