In EECS every undergrad does learn a bit of control/systems theory in their sophomore and junior requirements. I loved it, but it was mathematically challenging, and unfortunately the stronger a student's prior background, the more they got out of the courses. I wish there were a better way, because the result was that students eventually got weeded out by an (inhumane, alienating) competitive system, rather than each student actually learning something well at their own level.
Was it mandatory though? I took a Control Theory class for my Electrical and Computer Engineering undergrad degree in the US. But it was a senior/graduate-level elective and not required to graduate.
Minor aside, the class was the best of all my electives. I picked it based on advice from a friend who said "choose your electives based on the professor, not the material." One of those bits of advice I wish I'd absorbed (I'm sure I'd been told) earlier in my school career.
Same with my EE undergrad. Then I got to grad school and decided to take a course in Linear Systems, which is when I realized my ODEs course taught me nothing.
I really enjoyed this post having attended the same school and taken some of the same classes. I'm actually currently in my final months of a mechanical engineering B.S. at Madison, and I think the problem goes beyond a "hazing" of freshmen.
For me, the equivalent of ECE 352 was Introduction to Dynamic Systems. For anyone here without a physics background, it's basically an introduction to "more real" mathematical modeling of physical systems, beyond what you would learn in your first few semesters of college physics. Think "applied differential equations."
This class naturally involved some fairly high-level math compared to what we had been using before, and I think the professors honestly didn't know (or had forgotten) how to transition into it. Suddenly the class was using things that we might have done one or two homework assignments on in differential calculus a year ago, but there was never any "here are the overarching concepts used in this class and how they fit into the two years of calculus you have already taken." The class was just dumped off the Laplace transform deep end and left to memorize steps to solve problems.
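For anyone who only ever met the Laplace transform as memorized steps, the overarching idea fits in a few lines: it turns differentiation into multiplication by s, so a linear ODE becomes algebra you can solve and then invert. A small illustrative sketch (the particular ODE and numbers are mine, not from the course; a crude forward-Euler simulation serves as a sanity check on the closed form):

```python
# Sketch of what the Laplace transform buys you. The ODE
#   y'' + 3y' + 2y = 1,   y(0) = y'(0) = 0
# transforms to algebra: (s^2 + 3s + 2) Y(s) = 1/s, so
#   Y(s) = 1 / (s (s+1) (s+2)),
# and partial fractions invert this to
#   y(t) = 1/2 - e^{-t} + (1/2) e^{-2t}.
import numpy as np

def y_laplace(t):
    # closed form obtained by inverting Y(s) term by term
    return 0.5 - np.exp(-t) + 0.5 * np.exp(-2 * t)

# brute-force forward-Euler integration of the same ODE, written as a
# first-order state-space system (y, v) with v = y'
dt, T = 1e-4, 5.0
y, v = 0.0, 0.0                 # initial conditions y(0) = y'(0) = 0
for _ in range(int(T / dt)):
    y, v = y + dt * v, v + dt * (1 - 3 * v - 2 * y)

print(abs(y - y_laplace(T)))   # small: the two routes agree
```

That "ODE becomes polynomial algebra in s" picture is the overarching concept the lectures apparently never stated outright.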
I think the problem was that the professor had honestly forgotten what it was like to not have an intuitive grasp of what a Laplace transform did or how to linearize a dynamic system. I've found myself falling into almost the exact same trap when teaching people object-oriented programming, for example. To an experienced programmer, an object is the most natural thing in the world: you can pass them around, perform operations on them, and they just "work" for you. It's so simple for you that it's hard to remember that it may be a completely foreign concept to someone brand new.
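To make that concrete, here's the kind of minimal example I mean (the `Account` class and `pay_salary` function are made up for illustration): the thing experts take for granted is that an object bundles state with behavior and carries both wherever it's passed.

```python
# Minimal sketch of the "object" idea that experts take for granted:
# an object bundles state (fields) with behavior (methods), and you can
# hand the whole bundle to other code.
class Account:
    def __init__(self, owner, balance=0):
        self.owner = owner        # state lives on the instance
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount    # methods act on this instance's state

def pay_salary(account, amount):
    # This function doesn't know how Account works internally; it just
    # asks the object to do something. The part beginners often find
    # foreign is that the object "carries" its own data with it.
    account.deposit(amount)

a = Account("alice")
pay_salary(a, 100)
print(a.balance)  # 100
```

To someone experienced this is almost too obvious to say out loud, which is exactly the trap: the "obvious" sentence above is the lecture a newcomer actually needs.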
The thing is, it's comparatively easy to just teach someone the steps to solve a specific problem when the alternative is teaching them to think in such a way that they could solve it on their own. It's the same with debugging, it's the same with math, and with programming. I spent the rest of that semester watching the professor spend every single lecture doing one or two difficult examples, and nothing else. Teaching someone so that they can gain a deep understanding of a given subject is really hard, and I think it's the rare school and professor that can do it effectively.
I ended up doing well in that class, but it was really by rote memorization of every single mathematical "scenario" that I thought likely to be on the exams. It wasn't until a year or so later that dynamic systems finally clicked into place, and it was because a professor in an unrelated class happened to spend 45 minutes on a good intuitive explanation of what they meant.
I think teaching real systematic debugging is similar in that it's something that requires a real mental investment on the part of the teacher. That's not to say it's impossible, because it certainly isn't. It just requires someone to make the investment and explain the why as well as the how.
I still get a lot out of the systems approach to engineering from my university classes. Still, this is something that could be taught in weeks instead of years.
My whole BSc degree was in systems engineering and automatic control. You make me nostalgic about all those classes that at the time I hated. What a weird feeling.
I did "systems engineering", which was OR and control theory. While it was somewhat soft engineering for the first years, the final year was intense optimization once a proper mathematical and computational background was established.
Maybe I shouldn't be so hard on the teaching at my uni then. There were quite a few decent attempts at intuitive explanations of how the systems worked, and plenty of demos of different control schemes, how they worked, and how they might fail (as well as tons of maths).
Was it CS or some real engineering? In my country the systems engineering curriculum is mostly general engineering topics (calculus, algebra, chemistry, physics, management, law) for the first two years. The good thing is that you can choose between different careers up until that point without much hassle.
As someone who went to a mid-tier state university for computer engineering in undergrad, I kinda disagree. The classes were not going to push into great depth on the topics. But there was plenty of opportunity to push myself beyond the classes, like doing independent study or research with professors.
At my university in the EE program you had to complete all freshman and sophomore level classes before you were allowed to take any junior or senior level classes.
You also had to meet with an adviser towards the end of each semester before you were released to schedule classes for the next. Apparently engineers tried to circumvent the rules too many times, probably because we're always focused on optimizing solutions.
Undergrad has been a joke for decades. Try learning differential equations as a 19 year old from a TA that can't speak intelligible English. At least nowadays you can watch an actually informative video on your headphones while you nod along in class.
My first year was programming and vector calculus. Second year was operating systems internals and differential equations. And this wasn't even a top-tier school.
Another thing: assume that the people designing the curriculum might possibly know a little more than you do about what subjects are useful later on. I.e., don't say stuff like this:
"But why do I need to learn about matrices? I'm a meteorology major! We don't use matrices at all, it's all just partial derivatives!"
(This is an actual complaint I received when TAing ODEs, which were required for the meteorology major. I didn't even bother explaining how you discretize a linear PDE into a system of ODEs, or why that matters for weather prediction; I just referred her to her department head.)
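For anyone curious about the explanation I skipped, here's the gist, sketched on a toy problem (the 1D heat equation rather than anything meteorological): discretize a linear PDE in space and you get a system of ODEs, dx/dt = Ax, where A is exactly the kind of matrix the student thought she'd never need.

```python
# Method-of-lines sketch: the 1D heat equation u_t = u_xx on (0, 1)
# with zero boundary values, discretized on n interior grid points with
# centered differences, becomes the matrix ODE  du/dt = A u.
import numpy as np

n = 50
dx = 1.0 / (n + 1)
# tridiagonal second-difference matrix: (u[i-1] - 2u[i] + u[i+1]) / dx^2
A = (np.diag(-2.0 * np.ones(n))
     + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / dx**2

x = np.linspace(dx, 1 - dx, n)      # interior grid points
u = np.sin(np.pi * x)               # initial temperature profile
dt = 0.4 * dx**2                    # stable explicit time step
for _ in range(200):
    u = u + dt * (A @ u)            # forward Euler on du/dt = A u

# The exact solution is exp(-pi^2 t) sin(pi x); the matrix ODE tracks it.
t = 200 * dt
print(np.max(np.abs(u - np.exp(-np.pi**2 * t) * np.sin(np.pi * x))))
```

Real weather models are vastly more elaborate, but the structural point is the same: the "all partial derivatives" subject turns into linear algebra and ODEs the moment you put it on a grid.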
Haha, this is how I feel too. I had the mistaken assumption in middle school that "if this is all they're teaching me, there must not be that much to know". Then when I got to college I realized that everything taught at the undergraduate level and below constitutes a negligible fraction of everything that has been discovered.
Instead of stretching mundane material that could have been taught in a week or two (calculus, etc.) over a period of years, I would have much preferred taking all of my engineering courses then. That would have left plenty of time in undergrad to learn topics as diverse as representation theory, semidefinite programming, quantum information theory, lambda calculus, etc. Instead, I have to learn these things on my own as a grad student in an ad-hoc fashion. Occasionally, courses on these topics become available, but they pop up at random semesters and you can't take all of them.
I don't know if it's true of all schools, but I had a very similar experience at another Ivy League school. Took the first semester of linear algebra, got a very respectable grade, but really didn't learn much because the class wasn't structured to do that. It was there to make STEM majors prove they were good enough. I wasn't a STEM major, so didn't need to keep going with that kind of hazing.
"[Education is] all telling people solutions to problems they don't have yet" — I experienced this exact thing very acutely in college.
In the first year of my EE degree, I had to jump back a quarter in math to relearn some concepts I just hadn't grasped in high school. This complicated the rigorous schedule laid out for the first two years, as it meant I was no longer taking the prerequisite math classes the quarter before the required EE classes — instead I was taking them concurrently.
Despite the troubles I had with the college of engineering's registration process (I wasn't following their rules), taking the math classes concurrently with the EE courses that actually applied that knowledge (for example, taking Electricity and Magnetism at the same time as 3D calculus) was a completely different educational experience! Maybe I was lucky in that the classes were paced similarly, but learning theory in math class and then going the next day to a practical application of that knowledge was an amazing experience.
In my senior year, I worked with the professor I was doing research under to give feedback to the engineering and math departments about my experience, but sadly I don't think anything ever came of it. It's really too bad too — it's a pretty small change to make in the scheduling and I think it would help the people who are more practically minded vs. theoretically minded (builders vs. thinkers). My education was much more tailored to the latter, despite me being squarely the former.