Hacker News

Can a Computer Science Student Be Taught to Design Hardware?

43 points by stn8188 | today at 5:29 PM | 57 comments

Comments

EdNutting today at 7:02 PM

* The fact that there are comments misunderstanding the article, talking about PCB Design rather than (Silicon) Chip Design, speaks to the problem facing the chip industry: a total lack of wider awareness, and many misunderstandings.

* Chip design pays better than software in many cases and many places (US and UK included, but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors).

* Software engineers make great digital logic verification engineers, and can gradually be trained to do design as well. There are significant and valuable skill and knowledge crossovers (see the testbench sketch after this list).

* Software engineers lack the background needed for analogue design / verification, and there’s little to no knowledge crossover.

* We have a shortage of engineers in the chip industry, particularly in chip design and verification, but also in architecture, modelling/simulation, and low-level software. Unfortunately, the decline in hardware courses in academia is very long-standing, and AI Software is just the latest fuel on the fire. AI Hardware has inspired some new people to join the industry, but nothing like the tidal wave of new software engineers.

* The lack of open source hardware tools, workflows, and high-quality examples, relative to the gross abundance of open source software, doesn’t help the situation, but I think it is more a symptom than a cause.
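
To make the verification crossover concrete, here is a minimal self-checking testbench, a toy sketch of my own (the trivial inlined adder stands in for a real device under test). Structurally it is ordinary software: a loop, a reference model, and assertions.

    library ieee;
    use ieee.std_logic_1164.all;
    use ieee.numeric_std.all;

    -- Self-checking testbench: drive stimulus, compare against a
    -- reference model, report mismatches.
    entity adder_tb is
    end entity adder_tb;

    architecture sim of adder_tb is
      signal a, b : unsigned(7 downto 0) := (others => '0');
      signal sum  : unsigned(8 downto 0);
    begin
      sum <= resize(a, 9) + b;  -- stand-in for the device under test

      stimulus : process is
      begin
        for i in 0 to 255 loop
          a <= to_unsigned(i, 8);
          b <= to_unsigned((i * 7) mod 256, 8);
          wait for 10 ns;
          assert to_integer(sum) = i + ((i * 7) mod 256)
            report "mismatch at i = " & integer'image(i)
            severity error;
        end loop;
        wait;  -- end of simulation
      end process stimulus;
    end architecture sim;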

mikewarot today at 7:02 PM

I think there are two separate areas of concern here: hardware, and computation. I strongly believe that a Computer Science program that only includes variants of the Von Neumann model of computation is severely lacking. While it's interesting to think about Turing machines, Church numerals, etc., the practical use of FPGAs and other non-CPU-based logic should definitely be part of a modern CS education.
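
To make "non-CPU-based logic" concrete, here is a minimal VHDL sketch (a toy of my own): the computation is a structure that updates on every clock edge, in parallel with everything else, rather than a sequence of instructions fetched by a processor.

    library ieee;
    use ieee.std_logic_1164.all;
    use ieee.numeric_std.all;

    -- An 8-bit counter: not a program stepped through by a CPU, but a
    -- circuit that updates on every clock edge, concurrently with
    -- everything else on the chip.
    entity counter is
      port (
        clk   : in  std_logic;
        rst   : in  std_logic;
        count : out std_logic_vector(7 downto 0)
      );
    end entity counter;

    architecture rtl of counter is
      signal q : unsigned(7 downto 0) := (others => '0');
    begin
      process (clk) is
      begin
        if rising_edge(clk) then
          if rst = '1' then
            q <= (others => '0');
          else
            q <= q + 1;
          end if;
        end if;
      end process;

      count <= std_logic_vector(q);
    end architecture rtl;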

The vagaries of analog electronics, RF, noise, and the rest are another matter. While it's possible that a CS graduate might have a hint of how much they don't know, it's unreasonable to expect them to cover that territory as well.

Simple example: did you know that it's possible for two otherwise identical resistors to differ by more than 20 dB in their noise generation?[1] I've been messing with electronics and ham radio for 50+ years, and it was news to me. I'm not sure even an EE graduate would be aware of that.

[1] https://www.youtube.com/watch?v=omn_Lh0MLA4&t=445s
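
For context, a bit of arithmetic of my own (not from the video): the thermal-noise floor of any resistor is set by resistance and temperature alone,

    \overline{v_n^2} = 4 k_B T R \,\Delta f

so two resistors of equal R at the same temperature have identical thermal noise, and a difference of more than 20 dB (over 10x in voltage terms) must come from excess current noise under DC bias, which depends on construction, e.g. carbon composition vs. metal film.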

realo today at 6:46 PM

Not all hardware is digital.

RF design, radars, etc. are more art than science in many respects.

I would expect a Physics-trained student to be more adaptable to that type of EE work than a CS student...

bsoles today at 8:37 PM

>> "Either we hire good CS people who have the basic understanding of EE, and we train them to become good engineers, or we hire good engineers who are good in CS, and we try to upskill them on the CS side."

The former (CS -> EE) is much less likely to happen at a large scale than the latter (EE -> CS). It is much easier to teach EEs to become (albeit often bad) software engineers than to teach CS students to become good engineers.

Also, the former (CS -> EE) will not happen in academia because of (1) turf wars, and (2) CS faculty having no understanding of, or interest in, electronics/hardware/engineering.

I once proposed to teach an IoT class in the CS department of a major university in the US; the proposal basically fell on deaf ears.

Glyptodontoday at 6:32 PM

I have trouble believing there's a talent shortage in the chip industry. Lots of ECE grads I know never really found jobs and moved on to other things (including SWE). Others took major detours to eventually get jobs at places like Intel.

assimpleaspossitoday at 7:37 PM

I'm a hardware designer. An EE. But over the last umpteen years I've gradually switched over to software because that's where I was needed. What I've found is that I became a very good software programmer, but I still lack all the fundamentals of software engineering. There are things I won't or can't use because it would require too much study for me to get good at them or even understand them.

I would bet that a CS guy would have similar problems switching to hardware engineering.

AshamedCaptain today at 6:49 PM

Why would they? Pay is just much lower, despite the fact that there's way more responsibility. I personally know more people who switched from hardware to software than vice versa.

NoiseBert69 today at 5:59 PM

As a computer engineer, I usually copy reference schematics and board layouts from the datasheets the vendors offer. 95% of my hardware problems can be solved with them.

Learning KiCad took me a few evenings with YT videos (greetings to Phil!).

Soldering needs much more practice. Soldering QFN with a stencil, paste, and an oven (or a pre-heater alone) can only be learned by failing many times.

Having a huge stock of good components (sorted nicely with PartsDB!) lowers the barrier for starting projects dramatically.

But as always: the better your gear gets, the more fun it becomes.

peteforde today at 7:11 PM

I had written a whole big thing that could be summarized as "yes, of course" but then I read the article and realized that it is very specifically about designing silicon, not devices.

I understand that it makes sense for a blog called Semiconductor Engineering to be focused on semiconductor engineering, but I was caught off guard because I have been working on the reasonable assumption that "hardware designer" could be someone who... designs hardware, as in devices containing PCBs.

In the same way that not all software developers want to build libraries and/or compilers, surely not all hardware designers want to get hired at [big chip company] to design chips.

anonymousiam today at 7:41 PM

I lived in both worlds (hardware/software) throughout my career. In school, I learned (in order): Analog electronics (including RF), Digital electronics, Microprocessors, Software, Systems. I've always thought that it's strange how few software people know hardware, and vice versa. In the software domain, when I began referencing hardware elements while explaining something, the software audience would usually just glaze over and act like they were incapable of understanding. Same goes for the hardware people when I would reference software elements.

I learned Ada sometime around 1991. Counting assembly for various platforms, I had already learned about a dozen other languages by then, and would later learn many more.

Sometime around 2000 I learned VHDL. In all of the material (two textbooks and numerous handouts) there was no mention of the obvious similarities to Ada. I wish somebody had just produced a textbook describing the additional features and nomenclature that VHDL added to Ada; that would have made learning it even easier. The obvious reason nobody had done that is that I was among a very small minority of hardware people who already knew Ada, and it just wouldn't have been useful to most people.
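
To give a flavour of that heritage, here is a toy fragment of my own (not from those textbooks); apart from the std_logic types it could nearly pass for Ada:

    -- An ordinary VHDL function; the declaration style, := assignment,
    -- for-loop, and 'range attribute are all carried over from Ada.
    -- VHDL mainly adds signals, ports, concurrency, and timing on top.
    function parity (v : std_logic_vector) return std_logic is
      variable p : std_logic := '0';
    begin
      for i in v'range loop
        p := p xor v(i);
      end loop;
      return p;
    end function parity;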

In all of my work, but especially in systems integration work, I've found that my knowledge of multiple domains has really helped me outperform my peers. Having an understanding of what the computer is doing at the machine level, as well as what the software is doing (or trying to do) can make the integration work easy.

More on-topic: I think it would be a great improvement to add some basic hardware elements to CS software courses, and to add some basic CS elements to EE courses. It would benefit everyone.

agg23 today at 7:01 PM

I wasn't taught directly (and still don't know what I'm doing), but I've had a lot of fun learning about retro hardware design as a software engineer. I've made a few of my own reverse-engineered designs, trying to synthesize how the real designers would have built the chip at the time, and ported others for the Analogue Pocket and MiSTer project.

Here's an example of my implementation of the original Tamagotchi: https://news.ycombinator.com/item?id=45737872 (https://github.com/agg23/fpga-tamagotchi)

joezydeco today at 6:41 PM

UIUC CS grad from the late '80s. CS students had to take a track of electrical engineering courses: Physics E&M, intro EE, digital circuits, microprocessor/ALU design, microprocessor interfacing... It paid off immensely in my embedded development career.

I'm guessing this isn't part of most curricula anymore?

stn8188 today at 5:43 PM

The subheading to this article seems a little extreme: "To fill the talent gap, CS majors could be taught to design hardware, and the EE curriculum could be adapted or even shortened."

The article is more in the area of chip design and verification than PCB hardware, so I kinda understand where it's coming from.

contubernio today at 5:54 PM

Is this not what electrical engineers are for?

bee_rider today at 6:26 PM

Is the idea here that the code-generation apocalypse will leave us with a huge surplus of software folks? Enabling software people to go over to hardware seems to be putting the cart before the horse, otherwise.

Hardware people go to software because it is lower-stress and can pay better (well, at least you have a higher chance of getting rich, start-ups and all that).

Tade0 today at 8:13 PM

My degree is in computer science but I studied at the faculty of electrical engineering.

My courses didn't get into the details of semiconductor design (particularly manufacturing), but we had one on the physical principles behind this whole thing: bandgaps and all.

We also had to design analog circuits using the Ebers-Moll transistor model, so pretty basic, but still not exactly linear.
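
For the curious, the Ebers-Moll model writes the NPN terminal currents as two coupled exponentials (standard textbook form; V_T is the thermal voltage):

    I_C = \alpha_F I_{ES}\,(e^{V_{BE}/V_T} - 1) - I_{CS}\,(e^{V_{BC}/V_T} - 1)
    I_E = I_{ES}\,(e^{V_{BE}/V_T} - 1) - \alpha_R I_{CS}\,(e^{V_{BC}/V_T} - 1)

hence "not exactly linear": any hand analysis ends up linearizing those exponentials around a chosen operating point.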

Overall these are very different fields but at the end of the day they both have models and systems, so you could make a student of one of them learn the other and vice versa.

It just has to be worth the effort.

dilawar today at 5:52 PM

EE folks should design languages because they understand hardware better?!

And CS folks should design hardware because they understand concurrency better?!

SomaticPirate today at 6:46 PM

Hilarious to see Cadence and Synopsys in this article. They are arguably the cause. The complete lack of open source tooling and their aggressive tool pricing are the exact reason this ecosystem continues to be an absolute dumpster fire.

I used Vivado (from Xilinx) a bit during my undergrad in computer engineering and was constantly surprised at what a complete disaster the toolchain was. Crashes that would erase all your work. Strange errors.

I briefly worked at a few hardware companies and was always taken aback by the poor state of the tooling, which was highly correlated with the license terms dictated by the EDA tools. Software dev seemed much more interesting and portable. Working in hardware meant you would almost always be choosing between Intel, Arm, AMD, and maybe Nvidia if you were a rockstar.

Software, by comparison, offered plentiful opportunities and a skill set that could be used at an insurance firm or any of the Fortune 100. I've always loved hardware, but the opaque datasheets and IP rules kill my interest every time.

Also, I would argue software devs make better hardware engineers. Look at Oxide Computer: they have fixed bugs in AMD's hardware datasheets because of their insane attention to detail. Software has eaten the world, and EEs should not be writing the software that brings up UEFI. We would have much more powerful hardware systems if we were able to shine a light on the inner workings of most hardware.

anthk today at 7:30 PM

In Europe, in order to get a CS degree and be an actual "Engineer", you must be able to do so, at least at a basic level.

Joel_Mckay today at 8:14 PM

Hardware is artificially underpaid work, good positions are sparse in the US, and most engineers generally end up in niche coding environments.

Most people who land a successful long career also refuse to solve some clown firm's ephemeral problems at a loss. The trend of externalizing costs onto prospective employees starts to fail in difficult fields requiring actual domain talent with $3.7m-per-seat equipment. Regulatory capture also fails in advanced areas, as large firms regress into state-sponsored thievery instead.

Advice to students that is funny and accurate =3

"Mike Monteiro: F*ck You, Pay Me"

https://www.youtube.com/watch?v=jVkLVRt6c1U

IshKebab today at 6:01 PM

Obviously. Hardware designers absolutely love to think that hardware design is totally different to software design and only they have the skills, but in reality it's barely different. Stuff runs in parallel. You occasionally have to know about genuinely hardware things like timing and metastability. But the Venn diagram of hardware/software design skills is pretty much two identical circles.
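
To be fair, the genuinely hardware-only bits are real, just small. A minimal sketch of my own (VHDL, names invented) of the standard answer to metastability, the two-flip-flop synchronizer:

    library ieee;
    use ieee.std_logic_1164.all;

    -- Two-flop synchronizer: when a signal crosses from another clock
    -- domain, the first flip-flop may briefly go metastable; chaining a
    -- second one gives it a full clock period to settle before use.
    entity synchronizer is
      port (
        clk      : in  std_logic;
        async_in : in  std_logic;  -- from an unrelated clock domain
        sync_out : out std_logic
      );
    end entity synchronizer;

    architecture rtl of synchronizer is
      signal ff1, ff2 : std_logic := '0';
    begin
      process (clk) is
      begin
        if rising_edge(clk) then
          ff1 <= async_in;  -- may sample mid-transition
          ff2 <= ff1;       -- settled value used downstream
        end if;
      end process;
      sync_out <= ff2;
    end architecture rtl;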

The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just that hardware design is a niche field that most people a) don't need to do, b) can't access, because almost all the tools are proprietary, and c) can't afford, outside of tiny FPGAs.

If Intel or AMD ever release a CPU range that comes with an eFPGA as standard, fully documented and with free tooling, then you'll suddenly see a lot more talent appear as if by magic.
