
How does timing work in a computer?

Von Neumann Architecture

(Figure: von Neumann architecture diagram)

We have talked about the ALU at length and we have touched on memory, but next we will start to focus on the Control unit.

We have discussed that the operations we need to carry out are mostly handled by the ALU.

Control Unit

The control unit converts binary instructions to signals and timing to direct the other components.

What signals?

We will look at the ALU again, since the control unit serves it, to figure out what the control unit needs to provide.
Recall that the ALU has input signals that determine which calculation it will execute.

(Figure: 8-bit ALU)

Why Timing signals?

Again, the ALU itself shows us why we need this: we saw that the different calculations the ALU performs take different amounts of time to propagate through the circuit.

Even adding different pairs of numbers can take different amounts of time, because they require different numbers of carries.

So the control unit waits an amount of time that is longer than the slowest calculation before sending the next instruction. It also has to wait that long before reading the output of the ALU and sending that result to memory as needed.

What is a clock?

In a computer, the clock refers to a clock signal; historically this was called a logic beat. It is represented by a sinusoidal (sine wave) or square (on, off, on, off) signal that oscillates at a constant frequency.

This has changed a lot over time.

The first programmable mechanical computer, the Z1, operated at 1 Hz, or one cycle per second; its most direct successor moved up to 5-10 Hz. Later there were computers at 100 kHz (100,000 Hz), but where one instruction took 20 cycles, giving an effective rate of 5 kHz.
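The effective-rate arithmetic can be checked directly in the shell, using bash arithmetic expansion:

```shell
# 100,000 cycles per second, divided by 20 cycles per instruction,
# gives the effective instructions per second
echo $((100000 / 20))   # prints 5000, i.e. 5 kHz
```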

Execution Times

We will go to our clones of the course website repo:

cd fall2025/
ls
_data			faq			notes
_lab			files.sh		README.md
_practice		genindex.md		references.bib
_prepare		img			requirements.txt
_review			index.md		resources
_static			LICENSE			syllabus
_worksheets		local.sh		systools-fav.ico
activities		myst.yml		systools.png

How many glossary terms are used per class session?

grep term notes/*

This command read each individual file and found all instances of term.
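If we want an actual count per file, grep's -c flag prints the number of matching lines in each file. A sketch with hypothetical stand-in files; inside the course repo you would run grep -c term notes/* instead:

```shell
# Hypothetical stand-in files for notes/* (names and contents invented)
printf 'term: ALU\nterm: clock signal\n' > class1.md
printf 'no glossary entries here\n' > class2.md

# -c prints filename:count of matching lines for each file
grep -c term class1.md class2.md
# class1.md:2
# class2.md:0
```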

We can get the computer to time it for us with time

time grep term notes/*
real	0m0.015s
user	0m0.009s
sys	0m0.004s

and at the bottom we see the timing results.

Three types of time

The real time includes the user time, the system time, and any scheduling or waiting time that occurs.
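One way to see the difference is to time a command that mostly waits. sleep does almost no computation, so user and sys stay near zero while real is about one second:

```shell
# real will be just over 1 second; user and sys stay near zero,
# because sleep spends its time waiting, not computing
time sleep 1
```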

We can do this a bunch of times to compare how the times vary:

time grep term notes/*
real	0m0.022s
user	0m0.009s
sys	0m0.007s
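Rather than rerunning by hand, a loop can collect several timings at once. A sketch using a throwaway sample file (hypothetical name and contents); inside the course repo you would swap in grep term notes/*:

```shell
# Throwaway input so this runs anywhere (hypothetical contents)
printf 'term one\nterm two\nother line\n' > sample.txt

# time reports on stderr, so redirect the group's stderr to a file
rm -f timings.txt
for i in {1..5}; do
  { time grep term sample.txt > /dev/null ; } 2>> timings.txt
done

grep -c real timings.txt   # 5: one "real" line per run
```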

Notice:

Solution to Exercise 2

the user time

Solution to Exercise 3

Consider closing other programs that are running on your computer; the problem is not related to your code.

Solution to Exercise 4

Check for operating system related actions, for example writing and reading files.

As a quick proof of concept compare the following two actions:

No file writing
With file writing
time for i in {1..1000}; do echo 'hello'; done | wc -l
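The with-file-writing counterpart is not shown above; a plausible version (hypothetical file name hello.txt) appends to a file on each iteration, which should push the sys time up relative to the pipe-only loop, since each write goes through the operating system:

```shell
# Same 1000 iterations, but each echo is written to a file via the OS
rm -f hello.txt
time for i in {1..1000}; do echo 'hello' >> hello.txt; done
wc -l hello.txt   # hello.txt should end up with 1000 lines
```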

See the timing example in the notes of the last class too!

In the Exercise 2 solution

Prepare for Next Class

Badges

Review
Practice
  1. Review the notes from today

  2. Update your KWL chart.

  3. If you were to use something from this course in an internship, what story could you tell in an interview?

  4. Use time to compare using a bash loop to do the same operation on every file in a folder vs using the wildcard operator and sending the list of files to the command. In loop_v_list.md include your code excerpts, the results, and a hypothesis about why the faster one is faster.

Experience Report Evidence

Questions After Today’s Class

Is it possible to turn the clock off and how badly would that damage the computer?

The CPU literally cannot function without it.

Is the clock supposed to “tick” at a constant interval? If so, how does it do it?

Yes!

Currently, the typical implementation (possibly all, but I am not fully confident of that) uses a component called a crystal oscillator. This is a physical device that naturally produces a fixed-frequency sine wave. You can read more in Clock rate#Engineering.

Why can’t the computer skip ahead when the add is fast?

Theoretically, it could be designed to do that, but there would need to be extra logic encoded in the device to catch that case. This is nontrivial, and since the ticks are so fast (GHz = billions per second), we choose instead to save space on the board for other things (like more ALUs), to save money in producing it, or to leave more unused space to dissipate heat better.