An introduction to machine learning with applications. The course covers the fundamentals of classification and regression, as well as the methodology behind designing and implementing various models. Students will be introduced to a variety of topics including parameter estimation, model training/fitting, goodness of fit, generalization, regularization, inference, and objective & loss functions. Along the way, students will learn practical tools for building machine learning & statistical models, applications of machine learning & statistical models, and related mathematics.
The goal of this course is to provide you with foundational knowledge of machine learning & computational modeling and their applications. To achieve this goal, you will study the underlying mathematics, statistics, and algorithms used to create machine learning models. You will learn to appropriately apply these methods to solve complex, real-world problems.
The course will include regular homework and/or programming assignments. Unless otherwise specified, assignments are due 5 minutes before midnight on the due date. No credit will be given for late assignments (without an excused absence), so turn in as much as you can by the deadline.
Reading assignments should be completed before the lecture covering the material. Not all reading material will be covered in lecture, but you will be responsible for it on homework and exams. Quizzes on the assigned reading may be given at any time.
See the GFU CS/IS/Cyber policies for a discussion of collaboration and academic integrity. Most students would be surprised at how easy it is to detect collaboration in programming—please do not test us! Remember: you always have willing and legal collaborators in the faculty.
Almost all of life is filled with collaboration (i.e., people working together). Yet in our academic system, we artificially limit collaboration. These limits are designed to force you to learn fundamental principles and build specific skills. It is very artificial but intentional, for your own benefit. The only way for you to learn is by doing the work.
To be clear, do not:
Besides EYS, I am always available to discuss the Christian faith if you have any questions or doubts. Send me an email, come by my office hours, or talk to me after class. Christ is the reason I am at GFU, and I always have time to talk about faith.
The final course grade will be based on:
Week 1: Introduction & Statistics
Reading: Chapter 1: pages 1–46
1/20: MLK, Jr. Holiday (no classes)
Week 2: Sampling & Distributions
Reading: Chapter 2: pages 47–86
Week 3: Classification & Regression
Reading: First half of chapter 3: pages 87–113; chapter 4: pages 141–154
Week 4: Linear Model Details
Reading: Chapter 4: pages 155–173
2/14: Mid-semester break (no classes)
Week 5: Model Fitting
Reading: Chapter 5: pages 195–200 & 221–230
Week 6: SGD
Reading: Chapter 5: pages 195–200 & 221–230
Week 7: Decision Boundaries
Reading: Chapter 5: pages 208–215
Week 8: Hypothesis Spaces
3/12: Midterm exam
Week 9: K Nearest Neighbors & Non-linearity
Reading: Chapter 6: pages 237–248
Week 10: Decision Trees
Reading: Chapter 6: pages 249–258
Week 11: Spring Break
Week 12: Neural Networks & Regularization
Reading: Chapter 6 & Supplementary Materials
Week 13: More Neural Networks: Deep Learning
Reading: Chapter 6 & Supplementary Materials
Week 14: Unsupervised Learning & K-Means
Reading: Chapter 7: pages 294–304
Week 15: Ethics, Limits, & Special Topics
Reading: Supplementary Materials
This page was last modified on 2025-03-31 at 17:37:02.
George Fox University · 414 N Meridian St · Newberg, Oregon 97132 · 503-538-8383
Copyright © 2018–2025 George Fox University. All rights reserved.