Artificial Intelligence

Chatbots aren't as difficult to make as You Think

Chatbots are the in thing right now. Every website wants to implement one, every data scientist is expected to know about them, and any conversation about AI inevitably turns to chatbots. But they can look intimidating to someone who is new to the field, and we struggle with a lot of questions before we even begin working on them: Are they hard to create? What technologies should I know before attempting to build one?
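
To make the point concrete, here is a minimal sketch of a rule-based chatbot, about the simplest kind you can build; the patterns and canned responses are invented purely for illustration and are not from the post.

import re

# A few hypothetical pattern -> response pairs, purely for illustration.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help you today?"),
    (re.compile(r"\b(price|cost)\b", re.I), "Our plans start at $10 per month."),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Goodbye! Have a great day."),
]

def reply(message):
    """Return the first matching canned response, or a fallback."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "Sorry, I didn't get that. Could you rephrase?"

if __name__ == "__main__":
    while True:
        text = input("you> ")
        if text.lower() in {"quit", "exit"}:
            break
        print("bot>", reply(text))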

NLP Learning Series: Part 4 - Transfer Learning Intuition for Text Classification

This is the fourth post of the NLP text classification series. To give you a recap, I started with an NLP text classification competition on Kaggle called the Quora Question Insincerity challenge, and I thought of sharing the knowledge via a series of blog posts on text classification. The first post talked about the different preprocessing techniques that work with deep learning models and ways to increase embedding coverage.
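
To give a rough idea of the intuition (pretrained word vectors carry knowledge learned from a much larger corpus, and only the layers on top are trained on the competition data), here is a minimal Keras sketch. The GloVe file name, vocabulary size, and layer sizes are illustrative assumptions, not the exact setup from the post.

import numpy as np
from tensorflow.keras.initializers import Constant
from tensorflow.keras.layers import Input, Embedding, LSTM, GlobalMaxPooling1D, Dense
from tensorflow.keras.models import Model

EMBED_DIM, MAX_WORDS, MAX_LEN = 300, 50000, 70  # illustrative sizes

def build_embedding_matrix(word_index, glove_path="glove.840B.300d.txt"):
    """Copy pretrained GloVe vectors into a matrix indexed like the tokenizer vocab."""
    embeddings = {}
    with open(glove_path, encoding="utf8") as f:
        for line in f:
            parts = line.rstrip().rsplit(" ", EMBED_DIM)
            embeddings[parts[0]] = np.asarray(parts[1:], dtype="float32")
    matrix = np.zeros((MAX_WORDS, EMBED_DIM))
    for word, i in word_index.items():
        if i < MAX_WORDS and word in embeddings:
            matrix[i] = embeddings[word]
    return matrix

def build_model(embedding_matrix):
    inp = Input(shape=(MAX_LEN,))
    # The pretrained weights are the transferred knowledge; freezing them means
    # only the LSTM and the dense layer are learned from the competition data.
    x = Embedding(MAX_WORDS, EMBED_DIM,
                  embeddings_initializer=Constant(embedding_matrix),
                  trainable=False)(inp)
    x = LSTM(64, return_sequences=True)(x)
    x = GlobalMaxPooling1D()(x)
    out = Dense(1, activation="sigmoid")(x)
    model = Model(inp, out)
    model.compile(loss="binary_crossentropy", optimizer="adam")
    return model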

NLP Learning Series: Part 3 - Attention, CNN and what not for Text Classification

This is the third post of the NLP text classification series. To give you a recap, I started with an NLP text classification competition on Kaggle called the Quora Question Insincerity challenge, and I thought of sharing the knowledge via a series of blog posts on text classification. The first post talked about the different preprocessing techniques that work with deep learning models and ways to increase embedding coverage. In the second post, I talked through some basic conventional models like TF-IDF, Count Vectorizer, and Hashing.
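
To give a flavour of the models discussed, here is a minimal sketch of the 1D-CNN-for-text idea, where convolutions of different widths act as n-gram detectors; the attention variants are left out here, and the filter widths and sizes are illustrative choices rather than the post's exact architecture.

from tensorflow.keras.layers import (Input, Embedding, Conv1D,
                                     GlobalMaxPooling1D, Concatenate, Dense)
from tensorflow.keras.models import Model

MAX_WORDS, EMBED_DIM, MAX_LEN = 50000, 300, 70  # illustrative sizes

def build_cnn():
    inp = Input(shape=(MAX_LEN,))
    emb = Embedding(MAX_WORDS, EMBED_DIM)(inp)  # could also load pretrained vectors here
    # Convolutions of different widths act like 2-, 3- and 4-gram detectors;
    # global max pooling keeps the strongest activation of each filter.
    pooled = []
    for width in (2, 3, 4):
        conv = Conv1D(filters=64, kernel_size=width, activation="relu")(emb)
        pooled.append(GlobalMaxPooling1D()(conv))
    x = Concatenate()(pooled)
    out = Dense(1, activation="sigmoid")(x)
    model = Model(inp, out)
    model.compile(loss="binary_crossentropy", optimizer="adam")
    return model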

What my first Silver Medal taught me about Text Classification and Kaggle in general?

Kaggle is an excellent place for learning, and I learned a lot from the recently concluded competition on Quora Insincere Questions Classification, in which I got a rank of 182 out of 4037. In this post, I will try to provide a summary of the things I tried, and I will also summarize the ideas that I missed but were part of other winning solutions. As a side note: if you want to know more about NLP, I would recommend the awesome course on Natural Language Processing in the Advanced Machine Learning specialization.

NLP Learning Series: Part 2 - Conventional Methods for Text Classification

This is the second post of the NLP text classification series. To give you a recap, I recently started with an NLP text classification competition on Kaggle called the Quora Question Insincerity challenge, and I thought of sharing the knowledge via a series of blog posts on text classification. The first post talked about the various preprocessing techniques that work with deep learning models and ways to increase embedding coverage. In this post, I will try to take you through some basic conventional models like TF-IDF, Count Vectorizer, and Hashing.
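
As a rough sketch of what such a baseline looks like, here is TF-IDF plus logistic regression in scikit-learn; swapping in CountVectorizer or HashingVectorizer is a one-line change. The file and column names assume the Quora competition data, and the hyperparameters are illustrative, not the post's exact settings.

import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Assumes the competition's train.csv with question_text and target columns.
train = pd.read_csv("train.csv")
X_train, X_val, y_train, y_val = train_test_split(
    train["question_text"], train["target"], test_size=0.1, random_state=42)

vectorizer = TfidfVectorizer(ngram_range=(1, 2), min_df=3, max_features=100000)
X_train_tfidf = vectorizer.fit_transform(X_train)   # fit only on the training split
X_val_tfidf = vectorizer.transform(X_val)

clf = LogisticRegression(C=1.0, solver="liblinear")
clf.fit(X_train_tfidf, y_train)
print("Validation F1:", f1_score(y_val, clf.predict(X_val_tfidf)))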

NLP Learning Series: Part 1 - Text Preprocessing Methods for Deep Learning

Recently, I started with an NLP competition on Kaggle called the Quora Question Insincerity challenge. It is an NLP challenge on text classification, and since the problem became much clearer to me after working through the competition and going through the invaluable kernels put up by the Kaggle experts, I thought of sharing the knowledge. Since we have a large amount of material to cover, I am splitting this post into a series of posts.
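
As a small taste of what the preprocessing post covers, here is a sketch of two recurring ideas: spacing out punctuation so that more tokens match a pretrained vocabulary, and measuring embedding coverage. The punctuation list and helper functions are simplified assumptions for illustration, not the post's exact code.

import re
from collections import Counter

PUNCT = "/-'?!.,#$%()*+:;<=>@[]^_`{|}~"  # a simplified punctuation list

def clean_text(text):
    """Space out punctuation so tokens like don't? split into pieces the embeddings know."""
    for p in PUNCT:
        text = text.replace(p, f" {p} ")
    return re.sub(r"\s+", " ", text).strip()

def vocab_coverage(texts, embedding_vocab):
    """Share of distinct words (and of all tokens) that have a pretrained vector."""
    counts = Counter(word for text in texts for word in text.split())
    known = {w: c for w, c in counts.items() if w in embedding_vocab}
    word_coverage = len(known) / len(counts)
    token_coverage = sum(known.values()) / sum(counts.values())
    return word_coverage, token_coverage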