The following is a list of free books on Machine Learning.

## A Brief Introduction To Neural Networks

**David Kriesel**

*A Brief Introduction To Neural Networks* provides a comprehensive overview of the subject of neural networks and is divided into four parts: Part I: From Biology to Formalization (Motivation, Philosophy, History and Realization of Neural Models); Part II: Supervised Learning Network Paradigms; Part III: Unsupervised Learning Network Paradigms; and Part IV: Excursi, Appendices and Registers.

## A Course In Machine Learning (PDF)

**Hal Daumé III**

*A Course In Machine Learning* is designed to give a gentle, pedagogically organized introduction to the field, with a view of machine learning that focuses on ideas and models rather than on math.

According to the book:

> The audience of this book is anyone who knows differential calculus and discrete math, and can program reasonably well. (A little bit of linear algebra and probability will not hurt.) An undergraduate in their fourth or fifth semester should be fully capable of understanding this material. However, it should also be suitable for first year graduate students, perhaps at a slightly faster pace.

## A First Encounter With Machine Learning (PDF)

**Max Welling**

*A First Encounter With Machine Learning* is a precursor to more technical and advanced textbooks. It was written to fill the need for a simple, intuitive textbook to introduce those just starting in the field to the concepts of machine learning.

## An Introduction To Statistical Learning

**Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani**

This book provides an introduction to statistical learning methods. It is aimed at upper-level undergraduate students, master's students and Ph.D. students in the non-mathematical sciences. The book also contains a number of R labs with detailed explanations of how to implement the various methods in real-life settings, and should be a valuable resource for a practicing data scientist.

## Bayesian Reasoning and Machine Learning

**David Barber**

The book is designed to appeal to students with only a modest mathematical background in undergraduate calculus and linear algebra. No formal computer science or statistical background is required to follow the book, although a basic familiarity with probability, calculus and linear algebra would be useful. The book should appeal to students from a variety of backgrounds, including Computer Science, Engineering, applied Statistics, Physics, and Bioinformatics, who wish to gain an entry into probabilistic approaches to Machine Learning.

## Deep Learning

**Ian Goodfellow, Yoshua Bengio and Aaron Courville**

*Deep Learning* is a textbook intended to help students and practitioners enter the field of machine learning in general and deep learning in particular.

It is divided into three parts: Part I: Applied Math and Machine Learning Basics; Part II: Modern Practical Deep Networks; and Part III: Deep Learning Research.

## Gaussian Processes for Machine Learning

**Carl Edward Rasmussen and Christopher K. I. Williams**

*Gaussian Processes for Machine Learning* deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed.
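The covariance-function machinery described above can be sketched in a few lines. The following toy example (NumPy; the function names are mine, not from the book) computes the posterior mean of a zero-mean GP regressor under a squared-exponential kernel:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential (RBF) covariance between two 1-D input arrays."""
    sq_dists = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * sq_dists / length_scale ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean of a zero-mean GP: K(X*, X) (K(X, X) + sigma^2 I)^-1 y."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# With near-zero noise the posterior mean interpolates the training targets.
x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(x_train)
mean_at_train = gp_posterior_mean(x_train, y_train, x_train)
```

Swapping `rbf_kernel` for another positive-definite covariance function is the model-selection knob the book discusses at length.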

## Information Theory, Inference, and Learning Algorithms

**David J.C. MacKay**

*Information Theory, Inference, and Learning Algorithms* is targeted at self-learners, undergraduate and graduate students.

This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks.
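For a flavor of the information-theory side: Shannon entropy measures the average number of bits needed to encode a symbol from a given distribution, which is the yardstick against which compression schemes like arithmetic coding are judged. A minimal sketch (the function name is mine):

```python
import math

def entropy(probs):
    """Shannon entropy H(p) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries a full bit per toss; a biased coin carries less,
# which is exactly the redundancy a good compressor can squeeze out.
fair = entropy([0.5, 0.5])
biased = entropy([0.9, 0.1])
```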

The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes — the twenty-first century standards for satellite communications, disk drives, and data broadcast.

## Introduction to Machine Learning

**Amnon Shashua**

*Introduction to Machine Learning* is 109 pages of class notes from a machine learning course taught at the Hebrew University of Jerusalem. The notes cover nine topics: Bayesian Decision Theory; Maximum Likelihood/Maximum Entropy Duality; EM Algorithm: ML over Mixture of Distributions; Support Vector Machines and Kernel Functions; Spectral Analysis I: PCA, LDA, CCA; Spectral Analysis II: Clustering; The Formal (PAC) Learning Model; The VC Dimension; and The Double-Sampling Theorem.

## Learning Deep Architectures for AI (PDF)

**Yoshua Bengio**

This paper discusses the motivations and principles regarding learning algorithms for deep architectures, in particular those exploiting as building blocks unsupervised learning of single-layer models such as Restricted Boltzmann Machines, used to construct deeper models such as Deep Belief Networks.

## Machine Learning

**Abdelhamid Mellouk and Abdennacer Chebira**

*Machine Learning* spans 20 chapters, but is more of a compilation of standalone papers from individual authors. Topics covered include Neural Machine Learning Approaches; Q-Learning and Complexity Estimation Based Information Processing System; Hamiltonian Neural Networks Based Networks for Learning; 3D Shape Classification and Retrieval Using Heterogenous Features and Supervised Learning; and Machine Learning for Sequential Behavior Modeling and Prediction.

## Machine Learning, Neural and Statistical Classification

**D. Michie, D.J. Spiegelhalter, C.C. Taylor**

*Machine Learning, Neural and Statistical Classification* is based on the EC (ESPRIT) project StatLog, which compared and evaluated a range of classification techniques, with an assessment of their merits, disadvantages and range of application. This integrated volume provides a concise introduction to each method, and reviews comparative trials in large-scale commercial and industrial problems. It makes accessible to a wide range of workers the complex issue of classification as approached through machine learning, statistics and neural networks, encouraging a cross-fertilization between these disciplines.

## Neural Networks and Deep Learning

**Michael Nielsen**

*Neural Networks and Deep Learning* is a free online book. The book will teach you about:

- Neural networks, a beautiful biologically-inspired programming paradigm which enables a computer to learn from observational data
- Deep learning, a powerful set of techniques for learning in neural networks

The purpose of this book is to help you master the core concepts of neural networks, including modern techniques for deep learning. After working through the book you will have written code that uses neural networks and deep learning to solve complex pattern recognition problems. And you will have a foundation to use neural networks and deep learning to attack problems of your own devising.
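The "learn from observational data" idea can be shown at its smallest scale: a single sigmoid neuron trained by gradient descent. This is a toy sketch of the concept, not code from the book; here it learns the logical AND function:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Training data for logical AND: ((input1, input2), target)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = 0.0, 0.0, 0.0
lr = 0.5

for _ in range(5000):
    for (x1, x2), target in data:
        out = sigmoid(w1 * x1 + w2 * x2 + b)
        err = out - target  # cross-entropy gradient w.r.t. the pre-activation
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b -= lr * err

# After training, thresholding the neuron's output reproduces AND.
predictions = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
```

The book builds from exactly this kind of unit up to multi-layer networks trained by backpropagation.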

## Probabilistic Models in the Study of Language (Draft)

**Roger Levy**

The intended audience is graduate students in linguistics, psychology, cognitive science, and computer science who are interested in using probabilistic models to study language.

## Reinforcement Learning: An Introduction

**Richard S. Sutton and Andrew G. Barto**

The book consists of three parts. Part I is introductory and problem oriented. We focus on the simplest aspects of reinforcement learning and on its main distinguishing features. One full chapter is devoted to introducing the reinforcement learning problem whose solution we explore in the rest of the book. Part II presents what we see as the three most important elementary solution methods: dynamic programming, simple Monte Carlo methods, and temporal-difference learning. The first of these is a planning method and assumes explicit knowledge of all aspects of a problem, whereas the other two are learning methods. Part III is concerned with generalizing these methods and blending them. Eligibility traces allow unification of Monte Carlo and temporal-difference methods, and function approximation methods such as artificial neural networks extend all the methods so that they can be applied to much larger problems. We bring planning and learning methods together again and relate them to heuristic search. Finally, we summarize our view of the state of reinforcement learning research and briefly present case studies, including some of the most impressive applications of reinforcement learning to date.
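The temporal-difference idea the book centers on fits in a short script. Below is a toy sketch (the chain environment and all names are hypothetical, not from the book) of tabular Q-learning, a TD method, on a five-state chain where reward is given only at the rightmost state:

```python
import random

N_STATES = 5
ACTIONS = [0, 1]  # 0 = left, 1 = right

def step(state, action):
    """Deterministic chain dynamics; the episode ends at the rightmost state."""
    nxt = max(0, state - 1) if action == 0 else min(N_STATES - 1, state + 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

def q_learning(episodes=300, alpha=0.5, gamma=0.9, eps=0.3, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[s][act])
            s2, r, done = step(s, a)
            # TD update toward the bootstrapped target r + gamma * max_a' Q(s', a')
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = q_learning()
greedy_policy = [max(ACTIONS, key=lambda act: q[s][act]) for s in range(N_STATES)]
```

Unlike the dynamic-programming methods of Part II, this learns from sampled transitions alone, with no model of the environment's dynamics.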

## The Elements of Statistical Learning: Data Mining, Inference, and Prediction

**Trevor Hastie, Robert Tibshirani, and Jerome Friedman**

It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting, the first comprehensive treatment of this topic in any book.
