
157 points tdhttt | 1 comment
pclmulqdq ◴[] No.45125831[source]
EE encompasses a lot of "engineering that takes hard math" at a professional and research level (similar to "hard CS," just different fields of math), so it is very hard to do as an undergrad, when your background in complex analysis and E&M is weak.

Early classes on circuits in EE will usually take shortcuts using known circuit structures and simplified models. The abstraction underneath the field of analog circuits is extremely leaky, so you often learn to ignore it unless you absolutely need to pay attention.

Hobbyist and undergrad projects thus usually consist of cargo culting combinations of simple circuit building blocks connected to a microcontroller of some kind. A lot of research (not in EE) needs this kind of work, but it's not necessarily glamorous. This is the same as pulling software libraries off the shelf to do software work ("showing my advisor docker"), but the software work gets more credit in modern academia because the skills are rarer and the building blocks are newer.

Plenty of cutting-edge science needs hobbyist-level EE, it's just not work in EE. Actual CS research is largely the same as EE research: very, very heavy on math and very difficult to do without studying a lot. If you compare hard EE research to basic software engineering, it makes sense that you think there's a "wall," but you're ignoring the easy EE and the hard CS.

replies(7): >>45126229 #>>45126357 #>>45126514 #>>45127402 #>>45127675 #>>45128168 #>>45128950 #
dfawcus ◴[] No.45127402[source]
Yeah - there was a massive filtering of the students between first-year entry and second year at my Uni, largely down to people unable to handle the (not terribly) complex maths at that stage.

I knew a number of folks in the first year who were very good at practical electronics, having come in from a technician side, but simply gave up due to the heavy maths load.

It got more complex when doing Control Theory, what with Laplace and Z transforms, freq domain analysis, and the apocryphal Poles and Zeros.
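
To make the poles concrete (a hypothetical Python sketch, not anything from the course material): the poles of a second-order transfer function are just the roots of its denominator, and stability comes down to their real parts being negative.

```python
import cmath

# Poles of a second-order system H(s) = w0^2 / (s^2 + 2*zeta*w0*s + w0^2).
# The poles are the roots of the denominator; the system is stable
# iff every pole has a negative real part.
def poles(zeta, w0):
    """Roots of s^2 + 2*zeta*w0*s + w0^2 = 0 via the quadratic formula."""
    disc = cmath.sqrt((2 * zeta * w0) ** 2 - 4 * w0 ** 2)
    return ((-2 * zeta * w0 + disc) / 2, (-2 * zeta * w0 - disc) / 2)

# Underdamped case (zeta < 1): a complex-conjugate pair with real part -zeta*w0.
p1, p2 = poles(0.5, 10.0)
assert p1.real < 0 and p2.real < 0   # both in the left half-plane: stable
```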

Further culling ensued at that point.

replies(2): >>45128577 #>>45131315 #
Eggpants ◴[] No.45128577[source]
I went into EE wanting to learn how to design CPU’s and thought the analog side would be boring.

However, control theory turned out to be my favorite class. Learning how negative feedback loops are everywhere was an eye opener.
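
That "feedback loops are everywhere" idea fits in a few lines (a toy discrete-time sketch with made-up gains): repeatedly correcting by a fraction of the error drives the output to the setpoint.

```python
# Toy negative feedback loop: a proportional controller nudges the
# output by a fraction of the error each step, so the error shrinks
# geometrically (by a factor of 1 - gain per step, for 0 < gain < 1).
def settle(setpoint, output=0.0, gain=0.3, steps=50):
    for _ in range(steps):
        error = setpoint - output   # measure how far off we are
        output += gain * error      # correct against the error
    return output

assert abs(settle(5.0) - 5.0) < 1e-6   # converged to the setpoint
```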

Also learning Laplace transforms was one of my first “holy shit this is freaking clever and cool” moments. Just like how parity bits in data streams can be used to detect AND correct errors.
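
That parity trick is worth seeing in miniature. A hypothetical Hamming(7,4) sketch: three parity bits cover overlapping groups of the four data bits, and the pattern of failed parity checks spells out the position of a single flipped bit.

```python
# Hamming(7,4): parity bits at positions 1, 2, 4 each cover the positions
# whose binary index includes that bit, so the failed checks (the syndrome)
# read out the 1-based position of a single-bit error directly.
def encode(d):                      # d: list of 4 data bits
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def correct(c):                     # c: 7-bit codeword, possibly corrupted
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # recheck parity group 1 (positions 1,3,5,7)
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # group 2 (positions 2,3,6,7)
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # group 3 (positions 4,5,6,7)
    pos = s1 + 2 * s2 + 4 * s3      # syndrome = position of the error, 0 if none
    if pos:
        c[pos - 1] ^= 1             # flip the bad bit back
    return c

word = encode([1, 0, 1, 1])
word[4] ^= 1                        # corrupt one bit in transit
assert correct(word) == encode([1, 0, 1, 1])   # detected AND corrected
```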

replies(4): >>45128876 #>>45129325 #>>45129968 #>>45133267 #
buildbot ◴[] No.45128876[source]
Same on the Laplace transforms. I was kinda mad we had ever learned any other way; it was a lot easier than whatever we were doing before, mathematically!

I wonder, how much control theory is there in a CPU?

replies(2): >>45131071 #>>45133238 #
dreamcompiler ◴[] No.45133238[source]
There's Boolean algebra but no control theory is needed for logic design.

One minor caveat is that most CPUs nowadays contain phase-locked loop (PLL) clock multipliers. Those fall into the domain of control theory but strictly speaking they're not part of the logic.
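
For a feel of why PLLs land in control-theory territory, here's a hypothetical discrete-time model (made-up gains, nothing like a real clock multiplier's implementation): a phase detector feeds a proportional-integral loop filter, which steers a local oscillator until it tracks the reference frequency.

```python
# Toy second-order digital PLL: the integral path learns the reference
# frequency; the proportional path damps the phase transient.
def pll(freq_ref, kp=0.3, ki=0.05, steps=300):
    ref_phase = vco_phase = integ = vco_freq = 0.0
    for _ in range(steps):
        ref_phase += freq_ref            # reference phase advances each step
        err = ref_phase - vco_phase      # phase detector output
        integ += ki * err                # integral path (tracks frequency)
        vco_freq = integ + kp * err      # PI loop filter sets the VCO frequency
        vco_phase += vco_freq            # VCO accumulates its own phase
    return vco_freq

assert abs(pll(0.1) - 0.1) < 1e-6   # locked: VCO frequency matches reference
```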

replies(1): >>45137975 #
uxp100 ◴[] No.45137975[source]
And in my inexpert experience, they are IP developed by the fab. So if you are doing CPU design work, you may need to understand PLLs well enough to read a datasheet, but you will not need to design a PLL.

I maybe had the most trouble just figuring out which instantiated PLL in the chip belonged to which PLL design, and where someone stuck the documentation in the giant repo. Especially since a hardware designer may think, oh we don’t need to update the docs, “nothing changed,” but the PLLs did change names because of a process change and their parameters may have changed slightly, even if they’re essentially the same. And chasing down small changes and documenting them ends up being a lot of the job, just as it is in software.