From 0 to 1: Machine Learning, NLP and Python – Cut to the Chase

$49

A down-to-earth, shy but confident take on machine learning techniques that you can put to work today

Product Description

This course is a down-to-earth, shy but confident take on machine learning techniques that you can put to work today.

Let’s parse that.

The course is down-to-earth: it makes everything as simple as possible, but not simpler.

The course is shy but confident: it is authoritative, drawn from decades of practical experience, but it shies away from needlessly complicating things.

You can put ML to work today: if Machine Learning is a car, this course will have you driving it today. It won't tell you what the carburetor is.

The course is very visual: most of the techniques are explained with the help of animations so you can understand them better.

This course is practical as well: there are hundreds of lines of commented source code that you can use directly to implement natural language processing and machine learning tasks such as text summarization and text classification in Python.
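
To give a flavor of what those code-alongs build toward, here is a minimal frequency-based auto-summarizer sketch. It is an illustration only, not the course's actual source code, and it assumes NLTK with the 'punkt' and 'stopwords' data already downloaded.

```python
# A minimal frequency-based summarizer sketch (assumes NLTK with the 'punkt'
# and 'stopwords' data downloaded via nltk.download()). Illustrative only.
from collections import defaultdict
from heapq import nlargest

import nltk
from nltk.corpus import stopwords


def summarize(text, num_sentences=3):
    """Return the num_sentences sentences with the highest word-frequency score."""
    sentences = nltk.sent_tokenize(text)
    stop_words = set(stopwords.words('english'))

    # Score each word by how often it appears, ignoring stopwords and punctuation.
    freq = defaultdict(int)
    for word in nltk.word_tokenize(text.lower()):
        if word.isalpha() and word not in stop_words:
            freq[word] += 1

    # Rank sentences by the total frequency of the words they contain.
    ranking = defaultdict(int)
    for i, sentence in enumerate(sentences):
        for word in nltk.word_tokenize(sentence.lower()):
            ranking[i] += freq.get(word, 0)

    # Keep the top-scoring sentences, in their original order.
    best = nlargest(num_sentences, ranking, key=ranking.get)
    return ' '.join(sentences[i] for i in sorted(best))
```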

The course is also quirky. The examples are irreverent. Lots of little touches: repetition, zooming out so we remember the big picture, active learning with plenty of quizzes. There’s also a peppy soundtrack, and art – all shown by studies to improve cognition and recall.

What’s Covered:

Machine Learning:

Supervised/Unsupervised learning, Classification, Clustering, Association Detection, Anomaly Detection, Dimensionality Reduction, Regression.

Naive Bayes, K-Nearest Neighbours, Support Vector Machines, Artificial Neural Networks, K-Means, Hierarchical Clustering, Principal Components Analysis, Linear Regression, Logistic Regression, Random Variables, Bayes' Theorem, Bias-Variance Tradeoff
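
As a taste of how one of these techniques looks in code, here is a minimal Naive Bayes spam-detection sketch. scikit-learn is assumed here purely for brevity; it is not necessarily the toolkit the course code uses.

```python
# A minimal Naive Bayes spam-detection sketch. scikit-learn is assumed here
# for brevity; it is not necessarily the toolkit used in the course code.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train_texts = ["win cash now", "limited offer click here",
               "meeting at noon", "lunch tomorrow?"]
train_labels = ["spam", "spam", "ham", "ham"]

# Turn each message into a bag-of-words count vector.
vectorizer = CountVectorizer()
X_train = vectorizer.fit_transform(train_texts)

# Fit the classifier and predict the label of a new message.
model = MultinomialNB()
model.fit(X_train, train_labels)
print(model.predict(vectorizer.transform(["click here to win cash"])))  # ['spam']
```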

Natural Language Processing with Python:

Corpora, stopwords, sentence and word parsing, auto-summarization, sentiment analysis (as a special case of classification), TF-IDF, Document Distance, Text Classification with Naive Bayes and K-Nearest Neighbours, and Clustering with K-Means
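
For instance, document distance with TF-IDF can be sketched in a few lines. The snippet below leans on scikit-learn purely as a shortcut; treat the library choice as an assumption, not as what the course code actually uses.

```python
# A rough TF-IDF document-distance sketch. scikit-learn is used purely as a
# shortcut here; the library choice is an assumption, not the course's code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "The stock market rallied on strong quarterly earnings.",
    "Shares climbed after the quarterly earnings report.",
    "The home team won the championship game last night.",
]

# Each document becomes a TF-IDF weighted vector; similar documents end up
# close together under cosine similarity.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
print(cosine_similarity(tfidf[0], tfidf[1:]))  # doc 0 is far closer to doc 1 than to doc 2
```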

Sentiment Analysis:

Why it's useful, Approaches to solving – Rule-Based, ML-Based, Training, Feature Extraction, Sentiment Lexicons, Regular Expressions, Twitter API, Sentiment Analysis of Tweets with Python
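
To make the rule-based flavor concrete, here is a toy lexicon-plus-regex sentiment scorer. The tiny lexicon and the tweet-cleaning patterns are illustrative assumptions, not the lexicons (WordNet, SentiWordNet) or preprocessing used in the course.

```python
# A toy rule-based sentiment scorer. The tiny lexicon and the cleaning regexes
# are illustrative assumptions, not the course's actual lexicons or pipeline.
import re

LEXICON = {"love": 1, "great": 1, "good": 1,
           "hate": -1, "terrible": -1, "bad": -1}


def clean_tweet(tweet):
    tweet = re.sub(r"http\S+", "", tweet)   # strip URLs
    tweet = re.sub(r"[@#]\w+", "", tweet)   # strip @mentions and #hashtags
    return tweet.lower()


def sentiment_score(tweet):
    words = re.findall(r"[a-z']+", clean_tweet(tweet))
    return sum(LEXICON.get(word, 0) for word in words)


print(sentiment_score("I love this phone! http://example.com"))     # 1
print(sentiment_score("@support this update is terrible and bad"))  # -2
```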

A Note on Python: The code-alongs in this class all use Python 2.7. Source code (with copious amounts of comments) is attached as a resource with all the code-alongs. The source code has been provided for both Python 2 and Python 3 wherever possible.
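
For readers curious what "both Python 2 and Python 3 wherever possible" tends to involve, a common compatibility pattern looks roughly like this (a generic sketch, not a claim about how the attached source handles it):

```python
# A common Python 2/3 compatibility pattern (a generic sketch; the attached
# course source may handle this differently).
from __future__ import print_function, division

try:
    from urllib.request import urlopen   # Python 3
except ImportError:
    from urllib2 import urlopen          # Python 2

print("3 / 2 =", 3 / 2)  # true division under both versions thanks to __future__
```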

What am I going to get from this course?

Identify situations that call for the use of Machine Learning
Understand which type of Machine Learning problem you are solving and choose the appropriate solution
Use Machine Learning and Natural Language Processing to solve problems like text classification and text summarization in Python

What is the target audience?

Yep! Analytics professionals, modelers, big data professionals who haven’t had exposure to machine learning
Yep! Engineers who want to understand or learn machine learning and apply it to problems they are solving
Yep! Product managers who want to have intelligent conversations with data scientists and engineers about machine learning
Yep! Tech executives and investors who are interested in big data, machine learning or natural language processing
Yep! MBA graduates or business professionals who are looking to move to a heavily quantitative role

Curriculum

  • M1 - What this course is about
  • M2 - Machine Learning: Why should you jump on the bandwagon?
  • M2 - Plunging In - Machine Learning Approaches to Spam Detection
  • M2 - Spam Detection with Machine Learning Continued
  • M2 - Get the Lay of the Land : Types of Machine Learning Problems
  • M3 - Random Variables
  • M3 - Bayes Theorem
  • M3 - Naive Bayes Classifier
  • M3 - Naive Bayes Classifier : An example
  • M4 - K-Nearest Neighbors
  • M4 - K-Nearest Neighbors : A few wrinkles
  • M5 - Support Vector Machines Introduced
  • M5 - Support Vector Machines : Maximum Margin Hyperplane and Kernel Trick
  • M6 - Clustering : Introduction
  • M6 - Clustering : K-Means and DBSCAN
  • M7 - Association Rules Learning
  • M8 - Dimensionality Reduction
  • M8 - Principal Component Analysis
  • M9 - Artificial Neural Networks: Perceptrons Introduced
  • M10 - Regression Introduced : Linear and Logistic Regression
  • M10 - Bias Variance Trade-off
  • M11 - Installing Python - Anaconda and Pip
  • M11 - Natural Language Processing with NLTK
  • M11 - Natural Language Processing with NLTK - See it in action
  • M11 - Web Scraping with BeautifulSoup
  • M11 - A Serious NLP Application : Text Auto Summarization using Python
  • M11 - Python Drill : Autosummarize News Articles I
  • M11 - Python Drill : Autosummarize News Articles II
  • M11 - Python Drill : Autosummarize News Articles III
  • M11 - Put it to work : News Article Classification using K-Nearest Neighbors
  • M11 - Put it to work : News Article Classification using Naive Bayes Classifier
  • M11 - Python Drill : Scraping News Websites
  • M11 - Python Drill : Feature Extraction with NLTK
  • M11 - Python Drill : Classification with KNN
  • M11 - Python Drill : Classification with Naive Bayes
  • M11 - Document Distance using TF-IDF
  • M11 - Put it to work : News Article Clustering with K-Means and TF-IDF
  • M11 - Python Drill : Clustering with K-Means
  • M12 - A Sneak Peek at what's coming up
  • M12 - Sentiment Analysis - What's all the fuss about?
  • M12 - ML Solutions for Sentiment Analysis - the devil is in the details
  • M12 - Sentiment Lexicons (with an introduction to WordNet and SentiWordNet)
  • M12 - Regular Expressions
  • M12 - Regular Expressions in Python
  • M12 - Put it to work : Twitter Sentiment Analysis
  • M12 - Twitter Sentiment Analysis - Work the API
  • M12 - Twitter Sentiment Analysis - Regular Expressions for Preprocessing
  • M12 - Twitter Sentiment Analysis - Naive Bayes, SVM and Sentiwordnet
  • M13 - Planting the seed - What are Decision Trees?
  • M13 - Growing the Tree - Decision Tree Learning
  • M13 - Branching out - Information Gain
  • M13 - Decision Tree Algorithms
  • M13 - Titanic : Decision Trees predict Survival (Kaggle) - I
  • M13 - Titanic : Decision Trees predict Survival (Kaggle) - II
  • M13 - Titanic : Decision Trees predict Survival (Kaggle) - III
  • M14 - Overfitting - the bane of Machine Learning
  • M14 - Overfitting Continued
  • M14 - Cross Validation
  • M14 - Simplicity is a virtue – Regularization
  • M14 - The Wisdom of Crowds - Ensemble Learning
  • M14 - Ensemble Learning continued - Bagging, Boosting and Stacking
  • M15 - Random Forests - Much more than trees
  • M15 - Back on the Titanic - Cross Validation and Random Forests
  • M16 - What do Amazon and Netflix have in common?
  • M16 - Recommendation Engines - A look inside
  • M16 - What are you made of? - Content-Based Filtering
  • M16 - With a little help from friends - Collaborative Filtering
  • M16 - A Neighbourhood Model for Collaborative Filtering
  • M16 - Top Picks for You! - Recommendations with Neighbourhood Models
  • M16 - Discover the Underlying Truth - Latent Factor Collaborative Filtering
  • M16 - Latent Factor Collaborative Filtering contd.
  • M16 - Gray Sheep and Shillings - Challenges with Collaborative Filtering
  • M16 - The Apriori Algorithm for Association Rules
  • M17 - Back to Basics : Numpy in Python
  • M17 - Back to Basics : Numpy and Scipy in Python
  • M17 - Movielens and Pandas
  • M17 - Code Along - What's my favorite movie? - Data Analysis with Pandas
  • M17 - Code Along - Movie Recommendation with Nearest Neighbour CF
  • M17 - Code Along - Top Movie Picks (Nearest Neighbour CF)
  • M17 - Code Along - Movie Recommendations with Matrix Factorization
  • M17 - Code Along - Association Rules with the Apriori Algorithm
  • M18 - Computer Vision - An Introduction
  • M18 - Perceptron Revisited
  • M18 - Deep Learning Networks Introduced
  • M18 - Code Along - Handwritten Digit Recognition - I
  • M18 - Code Along - Handwritten Digit Recognition - II
  • M18 - Code Along - Handwritten Digit Recognition - III

Course Requirements

No prerequisites: knowledge of some undergraduate-level mathematics would help but is not mandatory.

Working knowledge of Python would be helpful if you want to run the source code that is provided.
