Logistics

It's time for your FINAL computing exam wooo! View this not as an intimidating obstacle to overcome, but as a chance to reflect on how much you've learned and where your gaps in knowledge may be.

Here's how this is going to go:

  • The "exam" will be entirely conceptual / mathematical to test your understanding of the material.

  • The exam will take place in our typical classroom and will last only 90 minutes; see the final exam schedule for the time and day:

    LMU Final Exam Schedule

  • The exam is non-cumulative and will only quiz you on material covered after the midterm; see the exam topics below.

  • The exam is CLOSED note and CLOSED computer. You may NOT collaborate with peers or any other person for the duration of the exam.

  • You are allowed to bring ONE 8.5" x 11" (double-sided) cheat sheet into the exam, filled with any information you'd like for consultation.


Grading


  • The exam may feel long, but that's OK! Take your time, take a deep breath or two, and don't worry if you don't finish everything -- it's likely that your classmates won't finish either, and that's by design.

  • The exam does not have a forced curve (i.e., a rule that only some set number of people can receive A's, B's, etc.), but it WILL have a difficulty adjustment (an upward bonus) if it turns out to have been too hard.

  • Remember the class' exam policy: your worse exam counts for half as much as the other. If your midterm grade wasn't where you wanted it to be, here's your chance to catch up!

  • The last question on the exam will always be a bonus asking you to illustrate a pun on this section's material; come prepared with an idea -- those are easy points.



Topics

This exam will cover only the topics from this half of the class; for several of the by-hand mechanics, small warm-up sketches follow this list. These include:

  • Bayesian Network Approximate Inference: sampling and its benefits, prior sampling, rejection sampling, Gibbs sampling, and being able to perform each by hand (similar to the classwork).

  • Hidden Markov Models: structure and semantics, parameters (transitions, emissions, initial belief), belief state, online updates for time and observation, forward algorithm, Viterbi algorithm, filtering, particle filtering.

  • Learning: supervised, unsupervised, and reinforcement learning distinctions, classification, features, class variables, learning from examples / data, training sets, validation sets, test / held-out sets, overfitting, weight-thrashing, mediocre generalization, overtraining, parameters vs. hyperparameters, optimization as hill-climbing.

  • Naive Bayes Classifiers: structure and semantics, parameters (class and feature CPTs) and estimating from data (Maximum Likelihood + Laplacian smoothing), NBC classification algorithm, log-likelihood and its motivation.

  • Linear Perceptrons: feature extractors, weight vectors, activation functions, binary vs. multi-class classification, perceptron-update rule (learning weights), decision boundaries, perceptron pitfalls (overtraining, mediocre generalization, thrashing).

  • Logistic Regression: sigmoid + softmax functions, improvements over perceptrons, decision rule, maximizing log-likelihood of weights, logistic-regression learning, stochastic gradient descent (be able to perform by hand like on the CW), utility of hyperparameters like the learning rate.

  • Neural Networks: structure and semantics, feed-forward networks (and being able to trace activations by hand like on the in-class example), [GIST understandings on all of:] interpretations and construction of network layers, backpropagation.
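
(Warm-up sketches: all of the numbers below are made up for illustration -- none are from class -- and nothing programmatic will be on the exam; these are purely for building intuition about the by-hand mechanics.)

First, a minimal sketch of prior and rejection sampling on a hypothetical two-node network (Rain -> WetGrass), with invented CPTs:

    import random

    # Hypothetical two-node network: Rain -> WetGrass (CPTs invented).
    P_RAIN = 0.3                              # P(+rain)
    P_WET_GIVEN = {True: 0.9, False: 0.2}     # P(+wet | rain)

    def prior_sample():
        """Sample (rain, wet) in topological order from the CPTs."""
        rain = random.random() < P_RAIN
        wet = random.random() < P_WET_GIVEN[rain]
        return rain, wet

    def rejection_sample(n, evidence_wet=True):
        """Estimate P(+rain | wet = evidence_wet) by tossing samples
        that disagree with the evidence."""
        kept = [rain for rain, wet in (prior_sample() for _ in range(n))
                if wet == evidence_wet]
        return sum(kept) / len(kept) if kept else None

    # Exact answer: (0.3 * 0.9) / (0.3 * 0.9 + 0.7 * 0.2) ~= 0.66
    print(rejection_sample(100_000))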
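
Next, one round of the forward algorithm (time elapse, then observation update) on a hypothetical two-state weather HMM:

    # Hypothetical 2-state HMM; transition / emission values invented.
    STATES = ["rain", "sun"]
    T = {"rain": {"rain": 0.7, "sun": 0.3},        # P(X_t | X_{t-1})
         "sun":  {"rain": 0.3, "sun": 0.7}}
    E = {"rain": {"umbrella": 0.9, "none": 0.1},   # P(e_t | X_t)
         "sun":  {"umbrella": 0.2, "none": 0.8}}

    def forward_step(belief, evidence):
        """Time elapse: B'(x) = sum_x' P(x | x') B(x'); then weight by
        the emission probability and renormalize."""
        predicted = {x: sum(T[xp][x] * belief[xp] for xp in STATES)
                     for x in STATES}
        unnorm = {x: E[x][evidence] * predicted[x] for x in STATES}
        z = sum(unnorm.values())
        return {x: p / z for x, p in unnorm.items()}

    belief = {"rain": 0.5, "sun": 0.5}             # initial belief B(X_0)
    for e in ["umbrella", "umbrella", "none"]:
        belief = forward_step(belief, e)
    print(belief)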
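
For Naive Bayes, estimating a feature CPT with Maximum Likelihood + Laplacian smoothing on an invented toy dataset:

    from collections import Counter

    # Toy (features, label) training pairs; entirely made up.
    data = [({"free": 1, "meeting": 0}, "spam"),
            ({"free": 1, "meeting": 0}, "spam"),
            ({"free": 0, "meeting": 1}, "ham"),
            ({"free": 0, "meeting": 1}, "ham"),
            ({"free": 1, "meeting": 1}, "ham")]

    k = 1                                  # Laplacian smoothing strength
    label_counts = Counter(label for _, label in data)

    def p_feature(f, value, y):
        """Smoothed estimate of P(F = value | Y = y) for a binary
        feature: (count + k) / (N_y + k * 2)."""
        count = sum(1 for feats, label in data
                    if label == y and feats[f] == value)
        return (count + k) / (label_counts[y] + k * 2)

    print(p_feature("free", 1, "spam"))    # (2 + 1) / (2 + 2) = 0.75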
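
The binary perceptron-update rule on made-up examples (with the bias folded in as a final feature that is always 1):

    def perceptron_update(w, x, y_true):
        """If sign(w . x) disagrees with the label (+1 / -1), nudge the
        weights toward the example: w <- w + y_true * x."""
        activation = sum(wi * xi for wi, xi in zip(w, x))
        y_pred = 1 if activation >= 0 else -1
        if y_pred != y_true:
            w = [wi + y_true * xi for wi, xi in zip(w, x)]
        return w

    w = [0.0, 0.0, 0.0]
    for x, y in [([1, 2, 1], 1), ([2, 1, 1], -1)]:
        w = perceptron_update(w, x, y)
    print(w)    # second example is misclassified, so w = [-2, -1, -1]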
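
One stochastic-gradient step up the log-likelihood for logistic regression (labels in {0, 1}); the weights, example, and learning rate are all invented:

    import math

    def sigmoid(z):
        return 1 / (1 + math.exp(-z))

    def sgd_step(w, x, y, lr=0.1):
        """The gradient of the log-likelihood for one example is
        (y - sigmoid(w . x)) * x, so step: w <- w + lr * gradient."""
        error = y - sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        return [wi + lr * error * xi for wi, xi in zip(w, x)]

    # From zero weights, sigmoid(0) = 0.5, so the step is 0.1 * 0.5 * x.
    print(sgd_step([0.0, 0.0], x=[2.0, 1.0], y=1))    # [0.1, 0.05]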
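
Finally, tracing activations through a hypothetical 2-2-1 feed-forward network with sigmoid units (weights invented; no backpropagation here, per the exclusions below):

    import math

    def sigmoid(z):
        return 1 / (1 + math.exp(-z))

    W1 = [[0.5, -0.5],          # hidden layer: one weight row per unit
          [1.0,  1.0]]
    W2 = [1.0, -1.0]            # output layer weights over the hidden units

    def feed_forward(x):
        """Layer-by-layer trace: h = sigmoid(W1 x), out = sigmoid(W2 . h)."""
        hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
        return sigmoid(sum(w * h for w, h in zip(W2, hidden)))

    print(feed_forward([1.0, 2.0]))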


What will NOT be on the exam:

  • Neural Network nitty-gritty and the underlying linear algebra: no backpropagation derivation, none of the Calculus III tools required for it, and no performing the algorithm by hand.

  • Anything programmatic (the homework already tested for that).



Question Types

The examination format may include:

  • Definitions and short answer questions

  • Multiple choice

  • Problems similar to the past classworks

  • Analyzing Bayesian Networks / HMMs / NBCs / Perceptrons / Logistic Regression Models / Neural Networks / training datasets and computing some probabilities


Be prepared to answer some questions similar to those on the assignments and in-class exercises.

Furthermore, although I won't ask you anything about mechanics we haven't covered in class, you might be expected to apply the mechanics we've learned in a way that we didn't see in class. If you thoroughly understand the material, there should be no surprises -- but there will still be challenges.



Preparation

Here is my suggestion for preparation order:

  1. Re-read my course notes, re-doing the exercises if you aren't clear on any of them. Note that there are also suggestions for activities we did NOT see in class that will make for good practice.

  2. Study any available classwork and homework solutions.

  3. Form a study group, come up with an example problem, solve it individually, and then compare solutions.

  4. Consult the syllabus' recommended extra-practice sites for problems on topics that you're still unfamiliar with; there are plenty for all of our models.

  5. Still not confident on a topic? Feel free to ping me on Slack and ask anything, including requests for practice questions or specifications of a particular problem type.


