
I'm an embedded developer, and I've recently started teaching a coworker about microcontrollers. I think it's most helpful to first understand the basics of digital electronics. Transistors can be wired together to create logic gates, and logic gates can be wired together to create flip-flops. A flip-flop can be thought of as a single bit of RAM. Logic gates and flip-flops can be used to create all sorts of useful circuits like counters, shift registers, adders, multipliers, multiplexers, etc.
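
To make the "gates become memory" step concrete, here's a toy C simulation of a cross-coupled NAND SR latch (purely illustrative, nothing hardware-specific; the active-low set/reset naming is just the usual textbook convention):

    #include <stdio.h>

    /* A NAND gate as a function of two bits. */
    static int nand(int a, int b) { return !(a && b); }

    /* Cross-coupled NAND SR latch: evaluate the two gates repeatedly
       until the outputs settle. s_n and r_n are active-low set/reset. */
    static void sr_latch(int s_n, int r_n, int *q, int *q_n)
    {
        for (int i = 0; i < 4; i++) {   /* a few passes are enough to settle */
            *q   = nand(s_n, *q_n);
            *q_n = nand(r_n, *q);
        }
    }

    int main(void)
    {
        int q = 0, q_n = 1;

        sr_latch(0, 1, &q, &q_n);       /* pulse set   -> Q becomes 1 */
        printf("after set:   Q=%d\n", q);

        sr_latch(1, 1, &q, &q_n);       /* inputs idle -> Q holds 1   */
        printf("holding:     Q=%d\n", q);

        sr_latch(1, 0, &q, &q_n);       /* pulse reset -> Q becomes 0 */
        printf("after reset: Q=%d\n", q);
        return 0;
    }

The feedback between the two gates is what lets the circuit remember a bit after the set/reset pulse goes away; gate the inputs with a clock and you're most of the way to a flip-flop.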

With this foundation, one can understand a microcontroller as a big digital circuit. A CPU executes instructions using digital circuits: the instruction's opcode selects the appropriate circuit to operate on the instruction's operands. Microcontrollers also have peripherals like timers, serial communication interfaces (SPI, I2C, UART), and analog-to-digital converters. All of these are digital circuits that are configured by writing bytes to registers. Those registers can be thought of as the inputs and outputs of those circuits, just as if they were built with discrete flip-flops and logic gates. In fact, many microcontroller datasheets provide diagrams of the logic circuitry for these peripherals.
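
To make the "opcode selects a circuit" idea concrete, here's a toy C sketch of a four-function ALU (the opcode values are made up; real hardware does the selection with a decoder and multiplexers rather than a switch statement):

    #include <stdint.h>
    #include <stdio.h>

    /* Made-up opcodes for a toy four-function ALU. */
    enum { OP_ADD = 0, OP_SUB = 1, OP_AND = 2, OP_OR = 3 };

    /* The opcode picks which "circuit" operates on the operands,
       the way a decoder plus multiplexer would in real hardware. */
    static uint8_t alu(uint8_t op, uint8_t a, uint8_t b)
    {
        switch (op) {
        case OP_ADD: return a + b;
        case OP_SUB: return a - b;
        case OP_AND: return a & b;
        case OP_OR:  return a | b;
        default:     return 0;
        }
    }

    int main(void)
    {
        printf("%u\n", (unsigned)alu(OP_ADD, 3, 4));   /* 7 */
        printf("%u\n", (unsigned)alu(OP_AND, 6, 3));   /* 2 */
        return 0;
    }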

Once you truly understand how the hardware operates, writing the code to configure and operate that hardware is pretty straightforward.
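
For example, starting a timer usually boils down to a handful of register writes like this (the base address, register layout, and bit names below are invented for illustration; the real ones come from the part's datasheet):

    #include <stdint.h>

    /* Hypothetical memory-mapped timer peripheral. */
    #define TIMER0_BASE   0x40001000u
    #define TIMER0_CTRL   (*(volatile uint32_t *)(TIMER0_BASE + 0x00))
    #define TIMER0_PERIOD (*(volatile uint32_t *)(TIMER0_BASE + 0x04))

    /* Made-up bit positions in the control register. */
    #define TIMER_CTRL_ENABLE (1u << 0)
    #define TIMER_CTRL_IRQ_EN (1u << 1)

    static void timer0_start(uint32_t period_ticks)
    {
        TIMER0_PERIOD = period_ticks;        /* set the rollover value    */
        TIMER0_CTRL   = TIMER_CTRL_ENABLE    /* start counting...         */
                      | TIMER_CTRL_IRQ_EN;   /* ...and request interrupts */
    }

Each write is just flipping the inputs of one of those peripheral circuits; the datasheet's register map tells you which bit drives which input.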



For an excellent course that covers that full journey (and more): https://www.nand2tetris.org/.


That site links to the 2005 edition of the book The Elements of Computing Systems, but an updated edition was published in 2021. (I found the original edition excellent years ago. I don't know how the new edition differs, only that it exists.)

https://mitpress.mit.edu/books/elements-computing-systems-se...


> an updated edition was published in 2021

Good stuff. I wonder if the author has recorded new lectures. Probably not, since the website still links to 1ed...


https://nandgame.com/ is also great

edit: oh, and there are new levels. There goes my afternoon...


Approximately how long does a curriculum like that take?


I got a pretty solid foundation in embedded from ~1 year's worth of uni EE courses. Learn a bit of Arduino, learn a bit of VHDL, fiddle with some registers, and you're 90% of the way there. The last 10% comes from finding a job and just spending a lot of time learning all the other little pieces you need.


This is a very cool and playful introduction to the basics of the topic:

https://nandgame.com/


You would cover all of that material in an intro to digital logic course (no prereqs). I found it a very enjoyable topic to learn, and I don't think it would be too difficult to learn on your own. I would try to find a course that includes a lab component; testing things out on a breadboard is very fun. Sole caveat: debugging circuits can be a challenge when you're learning and don't have anyone to help.


You say "no prereqs" but I can assure you, it takes more than a single intro course to go from "I don't know what programming is, or the difference between voltage and current" to understanding a microcontroller at a logic gate level.


Re-read the list of topics that person provides. They are all genuinely covered in a single no-prereqs course. I took an "Intro to Digital Logic" course a couple of years ago, hence my confidence about this.


I learned most of this in a first pass through Crash Course's Computer Science playlist. Episodes 2 – 10 discuss the electronics [1].

I normally dislike videos and prefer books, but these were well done. They did an excellent job of pointing out the specific abstractions of each component as they are introduced.

[1] https://www.youtube.com/watch?v=LN0ucKNX0hc&list=PL8dPuuaLjX...



