• 3 Credit Hours
vital-puma-0519
While the subject is cool, the course itself is a bit disappointing. The assignments give almost too much starter code, so it ends up being "mess with the dimensions of neural networks until the code runs and Gradescope is happy". Quite a missed opportunity to force a deeper understanding of the material.
honest-giraffe-3217
I'm writing this review based on the OMSA version.
This is a very doable course; I took it while working full-time. However, if you really want to dive deep into natural language processing, particularly LLMs, it's best to go the extra mile and dive deeper on your own, because this class doesn't really teach much of it. It only glosses over it.
clever-fox-1970
Modules 1–6 were excellent — some of the best content I’ve encountered in the OMSCS program. However, the MetaAI lectures were a major disappointment. They were poorly organized and difficult to follow, making it frustrating to prepare for the semi-open-book final that heavily relies on them. Be prepared to spend hours trying to make sense of the material. Good luck.
mellow-panda-5884
Difficulty has been dramatically upped for this course.
You will need to learn how to make ChatGPT from scratch.
This actually makes ANLP one of the most exciting courses OMSA has to offer.
Try it :)
clever-gecko-0029
Very easy and elementary NLP course for a graduate program.
solid-crane-5114
This was by far the worst course I took on the programme. Don't expect to learn much from the classes (they are little more than a teacher reading the slides) - most of what I learned came from putting the class transcripts into ChatGPT to get explanations. The assignments are disappointing as well… they were useful in a certain way, because I had to learn to work locally with .py files (in an "object-oriented" style), but I found them pretty easy (since much of the code is provided). Overall, I learnt some things (and the course content is pretty up to date), but it wasn't worth my time or money.
This is my first review; previous reviews of this course give a mixed impression.
The course is self-contained: we go from tokenizing, to embeddings, to neural networks, to transformers. Each of those concepts has a homework associated with it, similar to coding ML algorithms from scratch in CDA. Lectures are simple, with weekly seven-minute quizzes, and the heavy learning comes from the homeworks. It was actually a really good change of pace from CDA last semester - better work-life balance. It helps you understand the GenAI concepts, and the last homework has a RAG question, so you can jump on that tool or prepare for the new AI hype that is coming.
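For readers unfamiliar with that progression, here is a minimal, illustrative sketch of the first two steps (tokenize, then embed) in plain numpy. The toy corpus, names, and dimensions are invented for illustration and are not taken from the actual homework.

```python
import numpy as np

# Toy corpus and whitespace tokenizer (illustrative only; the course
# homeworks build up similar pieces from scratch in this order).
corpus = ["the cat sat", "the dog sat"]

# 1. Tokenize: split on whitespace and build a token -> id vocabulary.
vocab = {}
for sentence in corpus:
    for tok in sentence.split():
        vocab.setdefault(tok, len(vocab))

# 2. Embed: one random vector per vocabulary id (untrained embeddings).
rng = np.random.default_rng(0)
emb_dim = 4
embeddings = rng.normal(size=(len(vocab), emb_dim))

def embed(sentence):
    """Map a sentence to a (num_tokens, emb_dim) matrix of vectors."""
    ids = [vocab[tok] for tok in sentence.split()]
    return embeddings[ids]

x = embed("the cat sat")
print(x.shape)  # (3, 4)
```

In the real assignments these embeddings would then be trained as the input layer of a neural network rather than left random.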
This was my 8th class in OMSCS.
I highly recommend this course to anyone in OMSCS or OMSA. It will change your understanding of the NLP industry.
Dr. Riedl is in my top 3 favorite OMSCS professors. He takes incredibly complex topics and breaks them into very learnable chunks. He does an incredible job summarizing the evolution of 50 years of NLP research from its origins to the modern day transformer architectures like GPT and BERT.
Many reviews give this a low rating on difficulty. This is mostly because the assignments cover complex concepts using coding frameworks that are already built for you. You are only required to complete a few simple blocks of code that allow the Jupyter notebook to run. If you actually spend time walking through the provided code, it is a way to quickly learn PyTorch without too much pain.
What other reviews don't explain - maybe this is new in the summer 2024 class - is that the final project is actually quite hard. You spend 80% of the semester expecting easy 4-5 hr assignments, but the final project is a 20-40 hr curveball. You are expected to design a KVMNet (key-value memory network) from scratch. KVMNets are early, primitive precursors to complex general-purpose transformers like GPT. You train the KVMNet to answer questions about politicians based on a large Wikipedia dataset.
I didn't realize it until the end, but the KVMNet neatly combines the principles you learn over the semester and provides a simplified setting for understanding the pivotal concept that makes GPT work: attention mechanisms.
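The attention read at the heart of a key-value memory network is worth seeing concretely. Below is a minimal, illustrative numpy sketch of scaled dot-product attention over a key-value memory; the function names and shapes are my own, not taken from the course assignment.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention(query, keys, values):
    """Scaled dot-product attention over a key-value memory.

    query:  (d,)    one question representation
    keys:   (n, d)  one key per memory slot
    values: (n, d)  one value per memory slot
    """
    scores = keys @ query / np.sqrt(keys.shape[-1])  # (n,) similarity scores
    weights = softmax(scores)                        # (n,) sums to 1
    return weights @ values                          # (d,) weighted read

rng = np.random.default_rng(0)
d, n = 8, 5
out = attention(rng.normal(size=d),
                rng.normal(size=(n, d)),
                rng.normal(size=(n, d)))
print(out.shape)  # (8,)
```

A sanity check on the design: if every key is identical, the weights are uniform and the read is just the average of the values; training shapes the keys so the query attends to the relevant memory slots instead.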
My only criticism is an echo of many other reviews in this thread - the lectures provided by Facebook are pretty hard to follow. You spend the first 3/4 of the class expecting best-in-class lectures from Dr. Riedl, only to become frustrated at the end of the semester by the Facebook lecturers reading off slides in broken English. There are some exceptions: I liked the two Facebook lecturers who came from GaTech - they seemed a bit better at developing educational content.
This course is good to have on your resume.
It is the sort of course that you can get through with a very easy A, but if you want to learn anything you need to invest a little more time.
The class work is basically setting up neural network structures from pictures in the jupyter notebooks. Any background reading on why you’re doing it is outside the required scope. The lectures are barebones, usually 15 minutes per week.
I would say this course is a lightly guided walkthrough of some of the very basics of NLP. It's along the lines of the Coursera ML course by Andrew Ng: you've learned enough to fill in the blanks for some ML topics, but not enough to have a good idea of how to go about choosing, training, and improving a model on an arbitrary dataset.
The main criticism:
Context: I took this class in the summer of 2023, the second semester this class was offered. It still has some kinks to iron out, such as issues with Gradescope, inconsistent Python code indentation, and version-compatibility issues in some homeworks. It is very likely that later semesters will not encounter these, as the instructors will have fixed them by then.
I had minor NLP knowledge before taking this class. This is my last class in the program, so I am already very comfortable with coding, various machine learning techniques, working with arrays (linear algebra) using numpy and pytorch, and troubleshooting any potential issues with development environments.
In terms of difficulty and the amount of time invested in the homeworks, it was easy for me. Each of the 4 homeworks took less than 4 days total to complete. That time includes some hours spent training models on my laptop CPU and cancelling/restarting the job to walk away due to IRL interruptions (since I was stubborn and did not want to use Google Colab).
The homework difficulty progression is like this: 1 -> 2 -> 4 -> 3. Minimum time to completion: 4 hrs. Maximum time to completion: 4 days.
The lectures in this class are very terse; however, they do provide the necessary citations/resources for students to dig deeper if they are so inclined. No doubt some reviews will say that this class is thin on materials, but I would disagree: since there was excess time left after completing the homeworks, I was able to spend it exploring the subject topics further using the provided resources.
The reviews for this class will likely be polarized: on the one hand, there will be lots of reviews about how thin the materials are or how easy it is; on the other hand, there will likely be lots of reviews about difficulties with Gradescope, version-incompatibility issues, pytorch, lack of troubleshooting support, etc.
Do keep in mind that this is a graduate-level course, and there is a baseline expectation of being able to troubleshoot and solve problems on your own with minimal hand-holding. Additionally, how much one gets out of the class depends on one's level of comfort with the aforementioned subjects. If all the time is spent troubleshooting the shape and rotation of a matrix multiplication, or on basic Python constructs such as lists of lists or lists of dicts, then there will be fewer time and brain resources devoted to exploring the actual NLP topics.
Therefore, in my opinion, to get the most out of this class you should already be comfortable with basic linear algebra, basic ML techniques, numpy (and therefore its big brother pytorch), scikit-learn, and know how to set up and troubleshoot your Python development environment.
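As a concrete example of the "shape and rotation" troubleshooting mentioned above, here is the kind of numpy shape mismatch (and its fix) the reviewer has in mind. The specific shapes and variable names are invented for illustration.

```python
import numpy as np

# A (batch, features) activation against a weight matrix stored the
# "wrong way round" - the classic shape mismatch in NN homeworks.
X = np.ones((32, 10))   # batch of 32 examples, 10 features each
W = np.ones((5, 10))    # 5 output units, one weight row per unit

# X @ W would raise: shapes (32, 10) and (5, 10) are not aligned
# (inner dimensions 10 and 5 differ).
# Transposing W "rotates" it into the right orientation:
out = X @ W.T           # (32, 10) @ (10, 5) -> (32, 5)
print(out.shape)        # (32, 5)
```

Being able to diagnose this from the traceback in seconds, rather than by trial and error, is exactly the baseline comfort the reviewer recommends bringing into the class.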