Get Started With Natural Language Processing in iOS 11

In my latest article for @tutsplus, I talk about #CoreML and how to get started with #NLP for iOS. 

Machine learning has undoubtedly been one of the hottest topics over the past year, with companies of all kinds trying to make their products more intelligent to improve user experiences and differentiate their offerings. 

Now Apple has entered the race to provide developer-facing machine learning. Core ML makes it easy for developers to add deep machine learning to their apps.

Take a look at your iOS device and you will see machine learning incorporated into almost every system app, the most obvious being Siri. For example, when you send text messages, Apple uses Natural Language Processing (NLP) to predict your next word or intelligently suggest a correction as you type. Expect machine learning and NLP to become ever more present and ingrained in our use of technology, from search to customer service.

Objectives of This Tutorial

This tutorial will introduce you to a subset of machine learning: Natural Language Processing (NLP). We'll cover what NLP is and why it's worth implementing, before looking at the tasks, or tagging schemes, that make up NLP. These include the following, illustrated in the sketch after the list:

  • language identification
  • tokenization
  • part of speech identification
  • named entity recognition
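
On iOS 11, all four of these tasks are exposed through Foundation's NSLinguisticTagger class. As a small taste of what's ahead, here is a minimal sketch; the sample sentence is made up for illustration:

    import Foundation

    let text = "Tim Cook introduced Core ML at WWDC in San Jose."
    let tagger = NSLinguisticTagger(
        tagSchemes: [.language, .tokenType, .lexicalClass, .nameType],
        options: 0)
    tagger.string = text

    let range = NSRange(location: 0, length: text.utf16.count)
    let options: NSLinguisticTagger.Options = [.omitPunctuation, .omitWhitespace]

    // 1. Language identification
    print(tagger.dominantLanguage ?? "unknown") // "en"

    // 2. Tokenization
    tagger.enumerateTags(in: range, unit: .word, scheme: .tokenType,
                         options: options) { _, tokenRange, _ in
        print((text as NSString).substring(with: tokenRange))
    }

    // 3. Part-of-speech identification
    tagger.enumerateTags(in: range, unit: .word, scheme: .lexicalClass,
                         options: options) { tag, tokenRange, _ in
        if let tag = tag {
            print("\((text as NSString).substring(with: tokenRange)): \(tag.rawValue)")
        }
    }

    // 4. Named entity recognition (.joinNames merges "Tim Cook" into one token)
    let entityTags: [NSLinguisticTag] = [.personalName, .placeName, .organizationName]
    tagger.enumerateTags(in: range, unit: .word, scheme: .nameType,
                         options: [.omitPunctuation, .omitWhitespace, .joinNames]) { tag, tokenRange, _ in
        if let tag = tag, entityTags.contains(tag) {
            print("\((text as NSString).substring(with: tokenRange)): \(tag.rawValue)")
        }
    }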

After going through the theory of NLP, we will put our knowledge into practice by creating a simple Twitter client which analyzes tweets. Go ahead and clone the tutorial's GitHub repo and take a look.

Assumed Knowledge

This tutorial assumes you are an experienced iOS developer. Although we will be working with machine learning, you don't need any background in the subject. Additionally, while other parts of Core ML require some knowledge of Python, we won't need any Python for the NLP work in this tutorial.

Introduction to Machine Learning and NLP

The goal of machine learning is for a computer to perform tasks without being explicitly programmed to do so: to interpret and act on data autonomously. A high-profile contemporary use case is autonomous driving, giving cars the ability to visually interpret their environment and drive unaided.

Beyond visual recognition, machine learning also powers speech recognition, intelligent web search, and more. With Google, Microsoft, Facebook, and IBM at the forefront of popularizing machine learning and making it available to ordinary developers, Apple has also decided to move in that direction and make it easier for machine learning to be incorporated into third-party applications.

Core ML is the newest addition to Apple's family of SDKs, introduced as part of iOS 11 to allow developers to implement a wide variety of machine learning models and deep learning layer types.
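
To get a sense of how lightweight the API is, here is a minimal sketch of loading a compiled model from the app bundle; the model name "Sentiment" is hypothetical, used purely for illustration:

    import CoreML

    // A minimal sketch: load a compiled Core ML model from the app bundle.
    // "Sentiment" is a hypothetical model name, used for illustration only.
    if let url = Bundle.main.url(forResource: "Sentiment", withExtension: "mlmodelc"),
       let model = try? MLModel(contentsOf: url) {
        // Every model declares its own typed inputs and outputs.
        print(model.modelDescription.inputDescriptionsByName)
        print(model.modelDescription.outputDescriptionsByName)
    }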


Natural Language Processing (NLP) logically sits on top of the Core ML framework, alongside two other powerful libraries, Vision and GameplayKit. Vision gives developers the ability to implement computer vision machine learning to accomplish things such as detecting faces, landmarks, or other objects, while GameplayKit provides game developers with tools for authoring games and specific gameplay features.
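
For example, a basic face detection request in Vision takes only a few lines. Here is a minimal sketch, assuming you already have a CGImage to analyze:

    import Vision

    // A minimal sketch of Vision face detection. `image` is assumed to be
    // a CGImage you have already obtained (e.g. from a UIImage's cgImage).
    func countFaces(in image: CGImage) {
        let request = VNDetectFaceRectanglesRequest { request, _ in
            let faces = request.results as? [VNFaceObservation] ?? []
            print("Detected \(faces.count) face(s)")
        }
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])
    }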

In this tutorial, we will focus on Natural Language Processing. 

To read the rest of my article, visit Envato Tuts+.


Doron Katz
NLP, CoreML, ML