Computer Engineering and the End of Moore’s Law: Crash Course Engineering #35


Whether you’re watching this video on your laptop,
smartphone, or smart watch – although, why would you
do that!? – they’re all different types of computers. The widespread use of computers in the last
century has radically changed the economy,
society and even our personal lives! And, like any useful machine, engineers are
always looking for new ways to build and improve them. If you need evidence of how good a job engineers
have done at making computers smaller, faster & more
efficient, try using an old cell phone from the 90s. But the relationship goes both ways. While engineers are making more effective
computers, computers are making more effective
engineers.

[Theme Music]

Computers are a little tricky to define, but
generally, you know one when you see one. Technically speaking, they’re machines that perform, or
‘compute,’ a series of mathematical calculations, like
addition or subtraction, usually with electronic circuitry. The exact nature of those calculations depends
on the electrical inputs to the computer, and they
happen much faster than humans are capable of. Computers also have machinery that stores
the states associated with their electrical inputs
and outputs, called memory. But they’re so much more than glorified
calculators! Because computers can execute different kinds
of computer programs using the same physical
hardware, they’re incredibly versatile tools. But to be useful, computers need computer
engineers. Like in other fields of engineering, computer engineers
are concerned with improving the various parts of a
computer and developing new ways to use them. Usually, that involves dealing with two main
categories: hardware and software. Hardware consists of the physical parts of
a computer. The exact components can be different
depending on what the computer is for, but virtually
all computers have two core parts: memory, and a central processing unit, or
CPU, which executes computer programs. The CPU contains the electronic circuitry
that actually performs calculations. It can also coordinate the different processes
happening in a computer simultaneously, and allocates
computing resources to different tasks. Memory, meanwhile, can serve a few different
purposes. Computer memory provides the physical space
where computer outputs can be permanently stored, like that picture you took of your
cat trying to fit into a tiny box. It also provides a temporary working space
for a CPU to store relevant bits of information
while it carries out a task. The signals carrying that information, even
if they were originally recorded as analogue,
are passed between computers in digital form. With digital signals, the voltages in the
circuits occupy binary states – some form of
‘on’ or ‘off’ – that represent 1s and 0s. Binary is the underlying representation that
computers use to operate. As a human, though, you’re not going to sit there and
manually send an enormous string of voltage signals to
a CPU yourself, unless you have a lot of time to spare. That’s why computers tend to have what
are called peripherals – things that make it
easier for people to actually use them. That might include a setup like a keyboard
and mouse for sending signals to a computer. To see what your computer outputs, like
this video, you’ll probably have a screen and
a speaker somewhere on the device. In some cases, like on a touch screen, the
input and output peripherals can even be the
same thing. Peripherals take human-style outputs, like keystrokes on a keyboard, and convert them into the appropriate binary signals for computers to interpret, and vice versa.
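To give you a rough idea of that conversion, here's a tiny Python sketch. The ASCII mapping is real, but real keyboards send scan codes over a protocol like USB HID, so treat this as a simplification:

```python
# A toy model of what a keyboard peripheral does conceptually:
# turn a keystroke into a numeric code, then into the pattern of
# on/off bits that could be sent as voltage signals.
# (Real keyboards use scan codes and a protocol like USB HID;
# ASCII is a simplification for illustration.)
keystroke = "A"
code = ord(keystroke)        # 'A' maps to 65 in ASCII
bits = format(code, "08b")   # 65 as eight binary digits: '01000001'
print(f"Key {keystroke!r} -> code {code} -> bits {bits}")
```

Other hardware associated with computers includes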
things like printers, sensors, and network cables. These are the sorts of things a computer engineer
might bring their electrical engineering expertise
to design and improve. The other side of computer engineering involves
software. Unlike the hardware of your computer, which you need to
physically replace to change a computer’s capabilities, software can be added to or changed to produce
different results with the same hardware. So software is essentially the programs your computer runs.
For example, you could write a piece of software that stores the phone numbers and opening times of every pizza place in the area in your computer's memory and retrieves them as needed. If you had a camera connected to the computer, you could even program the software to recognize when the pizza delivery person comes to your door, and turn down your music so you can hear the doorbell.
In short, software is how you tell a computer what task to perform. Writing software to accomplish a task on the hardware you have is what's broadly known as computer programming.
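As a minimal sketch of that pizza example in Python (all the names, numbers, and hours below are invented for illustration):

```python
# Store pizza places in memory and retrieve them as needed.
# Every entry here is made up for illustration.
pizza_places = {
    "Benny's Pizza": {"phone": "555-0134", "hours": "11am-10pm"},
    "Slice City": {"phone": "555-0172", "hours": "12pm-11pm"},
}

def lookup(name):
    """Retrieve a pizza place's details, if we stored them."""
    info = pizza_places.get(name)
    if info is None:
        return f"No entry for {name}."
    return f"{name}: {info['phone']}, open {info['hours']}"

print(lookup("Slice City"))  # Slice City: 555-0172, open 12pm-11pm
```

Those are the two main elements of what computer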
engineers work with. On the hardware front, they find ways to physically
improve the capacity of the machinery that carries out computations, exchanges signals,
stores them in memory, and connects everything
together. On the software front, computer engineering
has a lot in common with programming. But in addition to programming specific tasks,
computer engineers might, say, find the best way to
carry out a task on a given piece of hardware. Or they could find more efficient forms of
software that make computer programs run faster. Besides improving the general designs
of computers, computer engineers can also apply those skills to
developing specific devices for aerospace, transport, municipal engineering, medicine, and
telecommunications. So there are a lot of options! But you can get a sense of the sorts of things
computer engineers work on by looking at some
of the challenges facing the field today. For example, you might have noticed that when
it comes to size, most commercial computers
have been getting smaller over the years. Things like laptops, smartphones, and gaming
consoles are able to fit much more computing
power into smaller hardware. That's happened because more and
more computer circuit components, like transistors,
were developed to fit into less and less physical space. In fact, since the 1970s, the number of
transistors able to fit on a computer chip has
doubled roughly every two years! That’s what’s known as Moore’s law,
named after American engineer Gordon Moore. Moore's law describes how engineers have managed to create more sophisticated computers in smaller physical spaces.
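To get a feel for what that doubling implies, here's a quick back-of-the-envelope calculation in Python, assuming the commonly cited starting point of Intel's 4004 chip from 1971, with roughly 2,300 transistors:

```python
# Rough illustration of Moore's law: transistor counts doubling
# roughly every two years. Starting point assumed: the Intel 4004
# (1971), with about 2,300 transistors. Real chips don't track
# this curve exactly.
transistors, year = 2300, 1971
while year < 2019:
    transistors *= 2
    year += 2
print(f"By {year}: roughly {transistors:,} transistors per chip")
# Prints a figure in the tens of billions, which is the right
# ballpark for the largest chips of that era.
```

But the law may not last much longer, because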
we’re approaching the limit of what we can
do with electrons. Some think Moore’s law has already ended. Electrical components are meant to direct
the flow of current in a particular way. For example, transistors use a smaller current
to stop and start the flow of a larger current. But that job gets tricky as you shrink the
components down. A thin channel can often be hard for the electrons
in the current to pass through. And if you’re packing all that circuitry right next
to each other, you also have to keep the current
from hopping from one circuit to another. Not to mention, you have to be able to make
your transistors out of something. To keep shrinking them down to fit more of
them onto a computer chip, you need to use less
and less material for a single transistor. Eventually, you’ll have to build your transistor
from just a few individual molecules, or maybe
even just a few atoms. But you can’t really build with anything
smaller than that! To get around the limits of tiny electrical components, engineers are looking into alternatives to the standard way we've been constructing transistors, such as by using nanotechnology. Some nanoengineering designs aim to create
transistors that operate on a current of just
a single electron. There are already chip manufacturers on their
way to developing transistors just five nanometers
long – so a few dozen atoms wide. But having a large number of transistors,
while generally great for computing purposes,
creates other issues. One major consideration is the energy computers
need. Like in most sophisticated electrical devices, the internal circuitry consumes a lot of power, and providing all that power is becoming more
of an issue. Computers are being designed with greater processing
power in their CPUs and bigger amounts of memory
storage, which all generates more energy demand. Right now, about 3% of the energy produced
on Earth is used for computing. So making computers more energy efficient
would not only reduce the amount of carbon
dioxide released from burning fossil fuels, but it could save large companies billions
of dollars. Engineers have a few tricks up their sleeves
to try and tackle this. A lot of the actual energy consumption comes from
producing the binary signals computers use, the 1s and
the 0s represented by voltages being turned on and off. In the memory, the smallest unit of that signal,
called a bit, is stored by changing the state of
an electrical component, such as turning a transistor on or off, or by
charging up a capacitor. Switching a bit from a 0 to a 1, or vice versa, takes some amount of energy.
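There's even a theoretical floor on that cost. Landauer's principle says erasing one bit must dissipate at least k_B·T·ln 2 of energy, and a quick calculation shows how far today's hardware sits above that floor:

```python
import math

# Landauer's principle: erasing one bit of information dissipates
# at least k_B * T * ln(2) of energy.
k_B = 1.380649e-23              # Boltzmann constant, in J/K
T = 300                         # roughly room temperature, in K
e_min = k_B * T * math.log(2)
print(f"Theoretical minimum per bit: {e_min:.2e} J")  # ~2.87e-21 J
# Practical circuits spend many orders of magnitude more energy
# than this per bit operation, so there's plenty of room to improve.
```

So engineers are looking into methods of computing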
that can somehow keep the “1” bits intact as they’re
passed through the circuit, so they don’t have to be rewritten during
processing, saving energy. On the software side, computer engineers are also
developing algorithms, special sets of rules used in
computer programs, that work more efficiently. For example, they've developed ways of sorting and searching for information that require fewer calculations to be performed by the computer, which can also save lots of energy.
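Binary search is a classic example of that idea: in a sorted list of a million items, it needs about 20 comparisons, where a straight scan might need up to a million. Here's a minimal sketch:

```python
def binary_search(sorted_items, target):
    """Find target in a sorted list with about log2(n) comparisons,
    versus the up-to-n comparisons of checking every item."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

print(binary_search(list(range(1_000_000)), 424_242))  # prints 424242
```

Even better, using less electrical energy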
means less heat building up within the computer, which in turn could allow computers to operate
faster. So that’s what engineers are doing for computers. But computers are also doing a lot for engineers. For example, computers are essential for the
control systems we’ve talked about, automating the measurement and adjustment
of industrial devices like heat exchangers to make
sure everything operates smoothly. But computers can also help engineers design
and create components for use in other fields
of engineering. That’s accomplished by Computer Aided Design
and Computer Aided Manufacturing, or as they’re
more commonly called, CAD and CAM. CAD is the process of using special software
to design two or three dimensional objects
on a computer. With CAM, you take those CAD designs and manufacture
them. Both CAD and CAM allow for well designed,
precise, and replicable components. For example, printed circuit boards, or PCBs,
are found in lots of common household electronics,
like remote controls. Designing them can be tricky, and you don’t want to have
to print several prototypes using an expensive material
like copper to test each one as you improve the design. CAD software provides tools to model your design
on a computer before physically manufacturing it. You can then check various design elements
in the model and simulate what might happen
in your circuit before it even exists. That saves the material, energy, and time needed for testing physical components.
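Even outside dedicated CAD packages, the basic idea of checking a design numerically before building it is simple. As a toy example, here's a Python check of a voltage divider, a circuit you might otherwise breadboard just to test (the component values are made up):

```python
# Toy pre-build check of a voltage divider:
# V_out = V_in * R2 / (R1 + R2). The values below are invented.
v_in = 5.0               # supply voltage, in volts
r1, r2 = 10_000, 4_700   # resistances, in ohms

v_out = v_in * r2 / (r1 + r2)
print(f"Predicted output: {v_out:.2f} V")  # about 1.60 V
# If the prediction is wrong for your design, you fix the model,
# not a physical prototype, and waste no copper.
```

In the same way, it's easier to see if a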
complicated system of gears and pulleys is
going to work as intended on a computer, rather than having to assemble them every time. Plus, CAD designs are useful for detailing the
exact specifications of a component and sharing
them with other engineers in a convenient way. Of course, once you’re happy with your design,
you’ll want to create the object in real life. CAM is simply the process of taking the designs you
created using CAD and interfacing with manufacturing
machinery, like circuit board printers or laser cutters, to tell the machine how to actually produce
the components you’ve designed. Both CAD and CAM are used everywhere in industry,
from designing and manufacturing cars to making
custom golf putters. NASA engineers are also testing ways to
use CAD and CAM to help astronauts on the
International Space Station. They can use CAD to design tools here on Earth,
then send them up to the station to be printed
on the 3D printer up there. So even engineers who aren’t strictly computer
engineers should be familiar with computers. Programming is also used in a wide range of
engineering disciplines, and the most complex and sophisticated
machines are often operated, or at least
designed, using computers. So, however you choose to apply your
engineering skills, computers are a tool you
probably can’t do without. And with the work being put into computer
engineering, the computers of the future will
be even better. Although they might still bug you about software
updates. In this episode we looked at computers and
computer engineering. We looked at the differences between hardware
and software; how engineers are working on making computers smaller and more energy efficient; and how computer-aided processes such as CAD
and CAM make it easier for engineers to design and
manufacture parts needed in machines and products. Crash Course Engineering is produced in association
with PBS Digital Studios, which also produces ReInventors, a show that introduces you to the scientists and tinkerers on the cutting edge of green technology. Subscribe at the link in the description. Crash Course is a Complexly production and this
episode was filmed in the Doctor Cheryl C. Kinney
Studio with the help of these wonderful people. And our amazing graphics team is Thought Cafe.

100 thoughts on “Computer Engineering and the End of Moore’s Law: Crash Course Engineering #35”

  1. 5:11 I think there's an error with the Moore's law graph. The scale of the vertical axis is already logarithmic, but the graph still looks like an exponential on a linear scale would. "Doubling every x years" should look like a straight line.

  2. Aw, no mention of CrashCourse Computer Science?
    They e.g. covered a lot of the electrical-engineering-y hardware basics on the lowest levels, so have a look if your interest was sparked by this episode.

  3. Software engineer / applications developer / programmer / coder. There are many words for what I do, but please, don't call me an IT guy! 😛

  4. Nitpicking, but there is a difference between memory and storage. I wouldn't store a picture of my cat fitting into a tiny box on volatile memory. I would store it on a non-volatile HDD or SSD though 😉

  5. I watched a video partially about reducing computer energy use on a power-sucking dual Xeon workstation from 2005. Am I a bad person?

    (Joking, although my basement system I watched this on is a dual Xeon system from 2005 and it indeed does suck power… ).

  6. boss: I plan to build some software. My friend says it's a good business.

    tech director: My friend says building software could take too long, produce unexpected results, risk cost overruns, suffer from unstable dependencies, need costly long-term support, and still face too much competition.

    boss: So, how about we talk about building some software?

  7. As computer chips become more complex I'm sure their price will sky rocket. Oftentimes when I diagnose a control board as the broken part my customers often choose to buy a new appliance instead of having me replace it. Sometimes the control boards cost almost as much as the appliance itself!

  8. Should've expected this series to trickle back down to computers somehow because I thought this was CC Computer Science again.

  9. Fun fact. The IBM 1130 was simulated on an IBM 360 to determine how the single card bootstrap loader would work. The 12 rows of the standard punch card were expanded to the 16 bits of the 1130's words and the instructions were designed to make it work. OS, compilers, assemblers, peripherals, everything was simulated first.

  10. I want my next CPU to be a quattuortrigintillion core quantum processor powered by interdimensional dark energy. But it'll probably just be a 16 core Intel :'(

  11. Hey, did you know:
    99% of Indian YouTubers who add 'Tech' to their channel name don't know anything about tech.

    The 1% of Indian YouTubers who don't add 'Tech' to their channel name know a lot about tech.

  12. Who wrote this? This is a terrible explanation. There's a lot of detail in this video that is tangential, obscure, and inconsequential to someone who doesn't already understand these concepts; for those who do understand, it's too basic. Who was this written for? And she's also a terrible speaker for this. It sounds like she's babbling on and on, just one sentence after another. Crash Course usually has good speakers. She is not one of them.

  13. Does anyone else feel that this series is severely lacking in terms of generally useful and relevant information? The lack of any specific explanation of embedded systems in this episode is simply heartbreaking. In case it helps, I've always found Computer Engineering is best explained from the ground up, by which I mean from a physics point of view up to high-level programming abstractions. For future episodes, please pick an angle and stick with it throughout instead of jumping around so much, because these episodes are not nearly long enough for such a broad scope.

  14. There is a great difference between software and hardware engineers that's not even touched in this. It's telling which divisions you guys have chosen to focus on.

  15. Another thing to consider with energy efficiency is what is running on the computer, as a well-written C++ program will arguably be far faster and more efficient than a well-written Java or Node program, because the C++ program traverses fewer layers of abstraction and is often optimised for specific hardware.

  16. An issue you approach but don't quite mention explicitly is dark silicon. The chips are getting smaller, but the concentrated heat prevents the whole chip from running at once. If we did run power through the whole chip it would break, so we need to leave some of the chip "dark". This leaves less of a benefit from jamming more transistors on the chip. One way around this is, as was said, to reduce the heat. Another idea has been to design hardware "accelerators" that can do one job very well, and just specialize different parts of the chip for different roles. This way we maintain some benefits from shrinking the chip while not requiring the whole chip to be running.

  17. 5:10 "Calculations per sec per $1000"
    Say what now? Where are these human brains being sold? And why is the price not changing over the years?

  18. Goes to lecture, does an entire 3 hours on Logic Memory and Hardware, comes home, puts on Crash Course: Today we're learning about Moore's Law… UGH!

  19. I was so hooked on this video! I wanna be an electrical engineer with a concentration in computer engineering and I love it

  20. Crash Course, a suggestion or maybe a request: do you plan to create a video about agricultural engineering? And am I correct that it encompasses all four pillars of engineering, namely chemical, civil, electrical, and mechanical?

  21. And 90 percent of that 3 percent of the overall energy budget is used by people running an AMD system with an R9 295 and an AMD FX-9590.

  22. The quality of Crash Course Engineering series has really been falling with the numerous poor choices in words and misunderstandings of key concepts. At 6:12, it is very misleading to say that nanotechnology is an alternative to standard transistor methods. Instead, nanotechnology is a natural progression of electronic engineering which fits into the Moore's Law narrative where numerous advances in engineering solutions make up what is known as nanotechnology.

    TLDR: 6:12, Nanotechnology is not an alternative, it is the current standard.

  23. The 3% of energy produced in the world being used by computers was interesting. It would be even more interesting to know what percent is being used for doing cryptocurrency calculations.

  24. First sentence: "Whether you watch this video on your laptop, smartphone or smart watch …"
    Me: Is nobody using desktop computers any more?

  25. Moore's law never ends. All Processor manufacturers have to do is start making negative nanometer chips. Problem solved!
