• 3 Credit Hours
patient-heron-5351
Lecturers were great with plenty of examples! Lots of intense maths, but the effort required to complete the homeworks/exams is lower than other courses like DC, Compilers, AOS, and DL.
ISYE 6525 - HDDA - is the first class I've taken at OMSCS that actually feels like what I expected when I enrolled in a master's-level program. Lectures are short with no nonsense, and TAs will absolutely not hold your hand to guide you through the assignments. Coming into this class without knowing how to do matrix calculus and without being comfortable in numpy was tough. The first HW asked me to solve a linear regression without the professor explaining what a linear regression was. I was eventually able to come around and grasp the necessary material, but it's always frustrating when you realize you're paying money to self-learn. And even still, most of the assigned readings went straight over my head.
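For readers wondering what that first HW expects you to already know: an ordinary least-squares fit in numpy is only a few lines. This is a minimal illustrative sketch with made-up data, not the actual assignment:

```python
import numpy as np

# Made-up data: 50 observations, an intercept plus one feature.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
beta_true = np.array([2.0, -1.5])
y = X @ beta_true + 0.01 * rng.normal(size=50)

# Closed-form OLS via the normal equations: (X^T X) beta = X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

In practice `np.linalg.lstsq(X, y, rcond=None)` is the numerically safer call, but the normal equations are what lecture derivations typically assume.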
There is no single textbook, just snippets of various (non-required) books and papers to read for each module. You'll have one module every 2 weeks, with about 1 hour of lectures, 60 pages of text, and 1 HW. HW can be completed in Matlab, R, Python, or any other language you choose, though the example code generously provided by the professor comes only in those three languages. The 2 exams are essentially slightly tougher open-book HW problems with 1 week less to complete them. Grading is extremely generous, actually insultingly so - I was frequently given full marks despite my code producing the wrong output.
I found the lectures and the professor's style of teaching disappointing, and most other reviewers seem to agree. 1 hour's worth of lectures per 2 weeks is not enough for more than surface-level knowledge. It always bothers me when I hear a professor say "here's this formula. It can be proven to be accurate, but I'm not going to show you how." You'll get a ton of that in this course. Don't expect to see proofs or derivations in the lectures themselves. This problem is compounded by the expectation that you will be using libraries to get your code to work (rather than implementing the algorithms yourself), so it'll be tough to grasp the details of what you're doing. Frustratingly, I sometimes found myself copy-pasting the example code without understanding how it works. Yet this was enough for an A.
In summary: although the material was very interesting, I can't confidently say that I've learned it.
I finished with a 100%, but realistically it felt more like I deserved a 60%. This course is extremely challenging—by far the hardest class I've taken in the entire OMSA/OMSCS program. It’s the first time in OMSCS I felt like I was genuinely doing graduate-level math and facing graduate-level demands. Each module required roughly a hundred pages of dense textbook reading (ESL—not ISL) and/or multiple research papers.
Lectures themselves weren't bad, but they mostly focused on mathematical derivations of various algorithms. Unlike CDA or ML, where you can rely on lectures recorded at other schools like Cornell, MIT, or Stanford, or on YouTube (StatQuest, Ritvik Math), the algorithms in this course are pretty obscure (things like tensor decomposition, B-splines, SVT), so you're often stuck with just the provided materials and maybe some obscure online resource from a lesser-known university. ChatGPT was probably one of the most useful resources for clarifying these concepts.
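To give a sense of scale: SVT (singular value thresholding), one of the "obscure" algorithms mentioned above, is only a few lines of numpy once you know the idea of soft-thresholding the singular values. A minimal illustrative sketch, not the course's provided code:

```python
import numpy as np

def svt(A, tau):
    """Soft-threshold the singular values of A by tau (singular value thresholding)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

A = np.outer([1.0, 2.0], [3.0, 4.0])  # a rank-1 test matrix
```

With `tau=0` the matrix is reconstructed exactly; with `tau` above the largest singular value the result is the zero matrix. In matrix completion, this shrinkage is the step applied at each iteration.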
Assignments are pretty tough, although grading is extremely lenient (probably as lenient as CDA, but without bonus points). Skeleton code is provided separately, which helps tremendously; there were one or two questions where I just couldn't understand the material but was able to reuse the example code and still get full marks. Without it, homework difficulty would easily hit 10 out of 5. Overall, assignments vary quite a bit in difficulty, but that probably depends on your background as well. For example, HW2 is pretty easy if you've taken DL/CV. Exams are significantly harder and more time-consuming than homework assignments, and unfortunately feedback on your work is basically nonexistent.
The instructor is completely MIA. However, one thing that stood out positively was that all the TAs were actual PhD students - something rare in OMSCS. Unlike some courses where you can tell the students know more than the TAs, here the TAs clearly knew their stuff. Office hours were mostly just quick Q&A sessions, nothing instructional or proactive like in DL. On the upside, TA responses on Ed were fantastic. The smartest students in the program seemed drawn to this class, so the forums had unusually high-quality discussions. We had one student, Joey, who basically rewrote and updated large portions of the example code, which was extremely helpful.
If you found CDA remotely challenging, avoid this class. For students who thought math in introductory courses (like SIM) was rigorous—definitely skip this one. To realistically prep for this class, Gilbert Strang’s linear algebra lectures on MIT OCW or Steve Brunton’s SVD playlist on YouTube are good starting points.
TLDR The class is genuinely very tough, the instructor is absent, and the grading is absurdly forgiving. Still, this was definitely one of my favorite courses in OMSA/OMSCS because it felt like authentic graduate-level learning. Highly recommended, but only if your math skills are already very solid.
This course covers a wide range of topics related to machine learning, optimization, data preparation, dimensionality reduction, and image manipulation and analysis. I was able to earn an A, but compared to other courses rated this high, the overall experience, especially in organization and course management, was somewhat disappointing.
In terms of difficulty and time requirements, the course is on par with ISYE 6740/CDA, and it has a similar structure of lectures and assignments.
Pros:
• The material and assignments are interesting, engaging, and rewarding once completed.
• Grading is fairly lenient, and you still receive a good amount of credit even if your solutions are incorrect or only partially completed.
Cons:
• The professor was completely absent from the course beyond the lecture videos.
• Office hours were not particularly helpful and limited to answering direct questions.
• TA responses on the forums were often brief and sometimes came across as curt or even rude. Questions took a long time to get responses, and some were never answered. I also received conflicting answers to problems from different TAs.
• Lecture materials contained minor errors and appeared to have not been updated in some time. The provided code files were not well maintained. Corrections, when made, were not clearly communicated to the class.
• While the lectures were concise, some lacked depth and would have benefited from more thorough explanations.
• Grading took forever; most of the assignments, including the first exam, were not graded before the drop date.
Advice:
• Learn and get comfortable with R. While the lectures use MATLAB, sample code is also provided in Python and R. Although I’m very comfortable with Python, I found about half of the assignments easier and cleaner to implement in R.
• Form a study group. Given the limited support from office hours and forums, you’ll need to rely on outside resources like YouTube or collaborate with classmates to understand the material.
• Participate in the forum to answer other students’ questions.
• Start homework early! The assignments may seem daunting at first and you might not even know where to begin but stick with it. Sometimes it took me a day or two before things clicked.
Despite my frustrations, I still recommend this course. The content is solid, and the assignments are genuinely rewarding learning experiences. I just hope the course gets a bit of a revamp in how it's managed—and it wouldn’t hurt for the professor to make a cameo.
This is my 7th class in OMSA (having taken iAM, iCDA, DAB, SIM, CDA, and Digital Marketing previously), and most of my experience with Python/R comes from this program. This was by far the hardest class I've taken, and I'm definitely glad it's over with.
Most of the content is really interesting, especially the image manipulation assignments, but if, like me, you haven't taken DO, then the optimization modules might end up killing you. You get two weeks per homework, and the roughly month I spent on the optimization homeworks was the most intense it got. Even with two weeks to work on everything, though, I don't think I ever exceeded 20 hours of work in a single week, and some assignments I could finish within the first week, which really brings down the average time spent per week.
The sample code comes in Matlab, Python, and R, and the slides/videos all use Matlab. In some cases R felt more convenient, but otherwise I mostly used Python, which seemed to be what most people used. Knowing Matlab might have helped, since the class was taught with it, but it's definitely doable with any of the three languages.
Each TA held several office hours each week, and the sessions with attendance got recorded. They were mostly Q&A sessions about topics in the slides or walkthroughs of exam problems. There were times when information from one TA contradicted another's, and it took a few days for that to get cleared up. Initially the TAs would help people who showed them their code, but after the first 1-2 HWs they announced they would no longer help with code, only with understanding the topics, after which office hour attendance dropped sharply. Since sessions were only recorded when people attended, there weren't many recordings in the latter half of the class, but the TAs were active answering questions on Ed. Also, a previous review from 2022 mentioned that a TA was pretty generous with hints; if you take the class now, that might not be the case, and you should expect to search for resources and help on your own.
There were occasional typos in the slides and homeworks, and I remember three incidents of assignments being released late (one was supposed to be released on a Monday that was a US federal holiday; the other two were exams). Only exam 2 received an extension, since there were also issues with the files it used.
Despite all the issues, the TAs were EXTREMELY lenient with their grading which I do appreciate. I think halfway through the class I was resigned to just accepting a C but somehow I barely ended up with an A. As long as you try your best to answer every question to the best of your ability, you shouldn't lose too many points.
In comparison to the other stats options, I don't think there were other classes I would rather have taken, since REG has bad reviews and I prefer assignment-based classes, so I would probably take it again if I had to pick again. The best way to take this would probably be either after DO (which I don't plan on taking) or in the summer, when the optimization modules get dropped. This class gets referred to as Machine Learning 2 a lot, with CDA being called Machine Learning 1, and I'm not sure that's the best way of seeing it: although the class was interesting, it's more niche, so I'm not sure I'd use the content from this class as much as one might from CDA.
This was my 8th course in OMSA (having previously taken all core courses except mgt 8803, plus SIM, DO, and CDA). GPA before the course was >3.5. I seem to have sneaked by with an A here, but it was not an easy achievement.
I chose to take this course in summer, as they drop the optimization modules to squeeze the course content into the shorter semester. I had previously taken DO and didn't really see the point of re-sitting that content. To their credit, the dropped optimization modules are available for study, and do admittedly add value on top of the content covered in DO.
I REALLY liked this course from a content perspective. The lectures, readings, and example code were all excellent resources. I truly regret that I didn't have time in my personal life to spend mastering this material, as it is mostly things that I had never come across before.
I have indicated 15 hours as the time I SHOULD have spent on the content to comfortably do well, but realistically I averaged about 9 hours/week, in the form of nothing during the week and then late, high-stress nights and weekends trying to simultaneously learn the material and get everything done at the last minute. Not proud of my approach here, but it was a rough few months between life/work/personal health. I say all this to add emphasis to the point that if you actually average 15 hours per week of study consistently throughout the course, the lecture materials, homework, and exams are all set up such that you can get an A without killing yourself. No gotchas, no unfair or trick questions.
This is a great course that I am thankful I didn't let the higher OMSCentral difficulty score discourage me from taking.
Lots of great content… I really enjoyed the way the professor walks through each topic: first the equations are displayed, then each term in the equation is explained (sometimes with pseudocode), then one or two applications are shown with plot visualizations, and finally the Matlab example code. I can also see myself applying the techniques taught in class to my daily work in the near future (signal processing, analysis, etc.). If I have one complaint, it's that I would have liked a bit more understanding of the formulas in applications; in the homework they are usually a given. I would like to know how these formulae are derived from real engineering cases.
Really interesting topics! The lectures covered the mathematical formulations and also provided programming examples of applications. Homeworks were fair, with a lot of TA help and class participation. Lecture slides are very well constructed.
I wouldn't say this course is really "math heavy," because all of the homeworks and exams are programming in Python or R. But the course does dive very quickly into matrix operations.
I'm 8 courses into the C track for OMSA, but I was still kind of worried about this class over the summer. However, everything felt very manageable. I think if I had taken it as course #3 or #4 I might have felt overwhelmed at times, but a lot of the material in this course felt familiar from bits and pieces learned in all of my previous courses.
Great course that covers some more niche but also very interesting and insightful areas of machine learning and statistical analysis. Basic linear algebra (SVD, inverses of matrices) and calculus (derivatives of scalars and vectors) is required. I used python for the assignments but R and matlab seemed slightly preferred as the solutions are often in matlab. The topics covered range from functional data analysis to applications of regularization such as robust PCA and matrix completion. Every module was more interesting than the last.
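The "basic linear algebra (SVD)" prerequisite mentioned here is roughly at this level: computing a best rank-k approximation (the Eckart-Young result) from the SVD. A minimal sketch with made-up data:

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of A in the least-squares sense, via the SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

A = np.outer(np.arange(1.0, 4.0), np.arange(1.0, 5.0))  # 3x4 matrix of rank 1
A1 = best_rank_k(A, 1)  # a rank-1 input is recovered exactly
```

Robust PCA and matrix completion, covered later in the course, build directly on this kind of low-rank reconstruction.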
Hold on to your hats, folks, because this course is a BEAST! It's the most mind-bendingly mathematical course at OMSA, bar none. Get ready to level up faster than you can say "linear algebra" and "machine learning." Professor Paynabar is a force to be reckoned with, delivering lectures of the highest caliber that cover everything from ordinary least squares to smoothing splines. But don't expect any hand-holding in this class - the TAs are happy to help, but you'll have to put your thinking cap on and work hard for every solution. And let's not forget the homework assignments - they'll leave you shaking in your boots no matter how much you prepare. But fear not, if you stick with it, you'll come out on top and learn more than you ever thought possible
Class is an enjoyable experience that transformed my fear of math into a newfound appreciation for it. Below are the topics we learned in each module.
Module 1: High-dimensional data introduction, functional data (space vs. time), regression - least squares estimates, polynomial regression, splines, order-M splines, B-splines, smoothing splines, natural cubic splines, k-nearest neighbors, kernel smoother regression, RBF kernel, functional PCA
Module 2: Image analysis, transformation, convolution, convolution with a mask, segmentation, k-means clustering, edge detection using derivatives, Sobel operator, Kirsch operator, Prewitt mask, and Gaussian mask
Module 3: Tensor data analysis, basic mathematical operations, outer product, inner product, Kronecker product, Khatri-Rao product, Hadamard product, defining tensor ranks, rank-one tensors, Candecomp/Parafac (CP) decomposition, Tucker decomposition
Module 4: Regularization, ridge, lasso, non-negative, adaptive lasso, group lasso
Module 5: Compressive sensing, using regularization for CS, optimization algorithms for CS
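As a flavor of the Module 1 material: kernel smoother regression with an RBF (Gaussian) kernel is just a locally weighted average (the Nadaraya-Watson estimator). A minimal illustrative sketch, not the course's code:

```python
import numpy as np

def kernel_smoother(x_train, y_train, x_query, bandwidth):
    """Nadaraya-Watson regression with a Gaussian (RBF) kernel."""
    d = x_query[:, None] - x_train[None, :]   # pairwise differences
    w = np.exp(-0.5 * (d / bandwidth) ** 2)   # RBF kernel weights
    return (w @ y_train) / w.sum(axis=1)      # weighted average per query point

x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.full(5, 5.0)                           # constant response
y_hat = kernel_smoother(x, y, np.array([0.3, 1.7]), bandwidth=0.5)
```

Smoothing constant data returns the constant, a quick sanity check; the bandwidth controls how local the averaging is.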