Thoughtful Machine Learning with Python

Published: May 14, 2018



The book in...
One sentence:
Another disappointing machine learning book that tries to cover too much too fast.

Five sentences:
This book starts off extolling the virtues of test-driven design and the SOLID principles, which, while respectable, seem somewhat forced and out of place in a machine learning book. Next, much of the code is written from scratch rather than relying on libraries, which does offer a look under the hood. The code itself, while complete, lacks the thorough explanation one would hope for in a book aimed at beginners. The equations underlying the algorithms are presented but, like the code, lack explanation or proof. All in all, the book tries to cover too much in too few pages.


Thoughts

While there is a lot of code provided, the accompanying descriptions are lacking. The test-driven aspect, in my opinion, gets in the way when you are trying to learn machine learning.

The code is built, for the most part, without the help of sklearn, which is used only in a few instances. I assumed it would be informative to 'look under the hood' at the functionality usually handled by libraries, but I was wrong (at least in regard to this book). With more detailed explanations and line-by-line walk-throughs, I think this would be a much better book.

As it stands now, it tries to cover too much in too few pages.

Additionally, the mathematics covered is sparse, with no proofs and little in the way of explanation.


Table of Contents


· 01: Probably Approximately Correct Software

· 02: A Quick Introduction to Machine Learning

· 03: K-Nearest Neighbors

· 04: Naive Bayesian Classification

page 47:
P(A_1, A_2, ..., A_n) = P(A_1) * P(A_2|A_1) * P(A_3|A_1, A_2) * ... * P(A_n|A_1, A_2, ..., A_{n-1})
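
For context, this is the chain rule of probability; naive Bayes then treats the features as conditionally independent given the class, so each conditional collapses to a single-feature term. Below is a minimal sketch of that shortcut in Python; it is my own toy example with made-up spam/ham numbers, not the book's code:

    # Naive Bayes shortcut: instead of the full chain rule, assume the
    # features (words) are conditionally independent given the class, so
    # P(C | words) is proportional to P(C) * P(w_1|C) * ... * P(w_n|C).

    priors = {"spam": 0.4, "ham": 0.6}            # made-up class priors
    likelihoods = {                               # made-up P(word | class)
        "spam": {"free": 0.30, "meeting": 0.02},
        "ham":  {"free": 0.05, "meeting": 0.20},
    }

    def classify(words):
        scores = {}
        for label, prior in priors.items():
            score = prior
            for w in words:
                # crude floor for unseen words, standing in for real smoothing
                score *= likelihoods[label].get(w, 1e-6)
            scores[label] = score
        total = sum(scores.values())
        return {label: s / total for label, s in scores.items()}

    print(classify(["free"]))     # spam should dominate (~0.80)
    print(classify(["meeting"]))  # ham should dominate (~0.94)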

· 05: Decision Trees and Random Forests

· 06: Hidden Markov Models

· 07: Support Vector Machines

· 08: Neural Networks

· 09: Clustering

page 165:
1. Richness
2. Scale invariance (see the sketch after this list)
3. Consistency
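
These three properties are Kleinberg's axioms for clustering functions; no single algorithm can satisfy all three at once. As a rough illustration of scale invariance only, here is a minimal sketch; it is my own example using sklearn's KMeans and make_blobs, not the book's code. Multiplying every feature by a positive constant scales all pairwise distances uniformly, so the resulting partition should not change:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import adjusted_rand_score

    # Toy data with three loose groups.
    X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
    X_scaled = 1000.0 * X   # same points, every distance scaled by 1000

    kwargs = dict(n_clusters=3, n_init=10, random_state=0)
    labels = KMeans(**kwargs).fit_predict(X)
    labels_scaled = KMeans(**kwargs).fit_predict(X_scaled)

    # Identical partitions (up to label permutation) give an ARI of 1.0.
    print(adjusted_rand_score(labels, labels_scaled))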

· 10: Improving Models and Data Extraction

· 11: Putting It Together: Conclusion