Software Architect / Microsoft MVP (AI) and Technical Author

C#, Cognitive Services, Facebook API, Instagram Graph API, Machine Learning, Sentiment Analysis, Startups, Twitter

My talk about the benefits of AI with Microsoft and Grey Matter

Recently I was invited to speak to start-ups alongside Microsoft and Grey Matter at Level39 in London.  We introduced Microsoft Cognitive Services; I shared my experiences of using the Cognitive Services APIs in projects and covered some of the benefits deploying them brought me.

Text Analytics

My focus was on analytics and natural language processing (NLP).  I spoke about my experience of building text analytics and custom sentiment analysis APIs.

I explained how I was able to build software that could identify whether users on social media were expressing positive, negative or indifferent emotion in the messages they were posting.

If I had access to the Text Analytics API back then I could have saved quite a bit of development time!
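To give a feel for why the managed service saves so much time, here is a minimal Python sketch of consuming the kind of JSON the Text Analytics sentiment endpoint returns. The response shape is simplified and the field names are assumed from the v3 API; the mapping of the service's "neutral" label to the "indifferent" bucket above is an illustrative choice, not part of the API.

```python
import json

# Hypothetical sample of a Text Analytics v3-style sentiment response
# (shape simplified for illustration).
sample_response = json.dumps({
    "documents": [
        {"id": "1", "sentiment": "positive",
         "confidenceScores": {"positive": 0.94, "neutral": 0.05, "negative": 0.01}},
        {"id": "2", "sentiment": "negative",
         "confidenceScores": {"positive": 0.02, "neutral": 0.08, "negative": 0.90}},
    ]
})

def classify(response_json):
    """Map each document to positive / negative / indifferent."""
    results = {}
    for doc in json.loads(response_json)["documents"]:
        label = doc["sentiment"]
        # Treat the service's "neutral" label as the "indifferent" bucket.
        results[doc["id"]] = "indifferent" if label == "neutral" else label
    return results

print(classify(sample_response))  # {'1': 'positive', '2': 'negative'}
```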

Language Understanding with LUIS and Twitter

I introduced LUIS, Microsoft's natural language processing service built on machine learning, which helps you easily build and train language models that can infer the underlying intent behind human sentences.

I showed how LUIS can easily be consumed as a service by your existing software applications – again saving substantial development effort!
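To illustrate what consuming LUIS looks like from application code, here is a short Python sketch of extracting the top intent from a LUIS-style prediction response. The JSON structure is modelled loosely on the v3 prediction API and simplified; the intent name and scores are made up for the example.

```python
import json

# Hypothetical LUIS-style prediction response for a tweet.
luis_response = json.dumps({
    "query": "anyone have an iPhone for sale",
    "prediction": {
        "topIntent": "PurchaseIntent",
        "intents": {"PurchaseIntent": {"score": 0.87}, "None": {"score": 0.05}},
    }
})

def top_intent(response_json, threshold=0.5):
    """Return the top intent if its score clears the threshold, else None."""
    prediction = json.loads(response_json)["prediction"]
    intent = prediction["topIntent"]
    score = prediction["intents"][intent]["score"]
    return intent if score >= threshold else None

print(top_intent(luis_response))  # PurchaseIntent
```

A confidence threshold like this is a common guard so that low-scoring predictions fall through to a default handler rather than triggering business logic.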

Part of Speech Tagging

A key piece of functionality I was delivering for an initiative with Twitter back in 2017 involved surfacing sales leads (or audiences) in social media conversations with accompanying analytics.  These audiences could then be served creative and contextual messages in near real-time.

I spoke about some of the challenges often associated with processing social media conversations and shared how Part of Speech (POS) Tagging techniques helped me structure data in a format that got around some of these.

This approach also helped the machine identify patterns and “understand” what was being said (to a point).

For example, in one of the slides (above screenshot), the Tweet “Anyone have an iPhone for sale” was parsed out into a series of tokens (words) and the respective POS Tags were assigned:

  • Anyone – NN (noun)
  • have – VBP (verb, non-3rd person singular present)
  • an – DT (determiner)
  • iPhone – NN (noun)
  • for – IN (preposition)
  • sale – NN (noun)

With the data tagged in this format, you can then apply business rules to certain combinations of POS Tags.
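As a sketch of that idea, the following Python snippet hardcodes the tagged tweet from the slide (a real pipeline would use a POS tagger library) and applies one illustrative business rule over the tag sequence. The specific rule – determiner + noun + preposition + noun, as in "an iPhone for sale" – is just an example of the kind of combination you might flag:

```python
# The tagged tweet from the slide (tagger output hardcoded for illustration).
tagged = [
    ("Anyone", "NN"), ("have", "VBP"), ("an", "DT"),
    ("iPhone", "NN"), ("for", "IN"), ("sale", "NN"),
]

def match_rule(tagged_tokens, pattern):
    """Return the tokens at the first position where the POS tag
    sequence matches `pattern`, or None if there is no match."""
    tags = [tag for _, tag in tagged_tokens]
    n = len(pattern)
    for i in range(len(tags) - n + 1):
        if tags[i:i + n] == list(pattern):
            return [tok for tok, _ in tagged_tokens[i:i + n]]
    return None

# Illustrative rule: DT + NN + IN + NN often signals a buying/selling
# conversation worth surfacing as a potential lead.
lead = match_rule(tagged, ("DT", "NN", "IN", "NN"))
print(lead)  # ['an', 'iPhone', 'for', 'sale']
```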

I talked about how I was able to use LUIS to accelerate development and add an additional layer of intelligence over the API I was building.  All of this formed part of a minimum viable product (MVP) that was submitted to Twitter as part of their #Promote initiative.

Azure, C# Demos and Postman

Not everyone had heard of the Cognitive Services APIs, so to help with understanding I ran through how to provision AI services in Azure, grab the respective endpoints and keys, and then quickly consume Cognitive Services in Postman for testing and evaluation purposes. It’s always much easier to get a feel for something when you can see it in action!
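For reference, here is a small Python sketch of the request you would recreate in Postman. The resource name and key are placeholders, and the endpoint path assumes the v3.0 Text Analytics sentiment API – substitute the endpoint and key from your own Azure resource:

```python
import json

# Placeholder endpoint and key - copy the real values from the Azure portal.
endpoint = "https://<your-resource>.cognitiveservices.azure.com/text/analytics/v3.0/sentiment"
headers = {
    "Ocp-Apim-Subscription-Key": "<your-key>",
    "Content-Type": "application/json",
}
body = json.dumps({
    "documents": [
        {"id": "1", "language": "en", "text": "Anyone have an iPhone for sale"}
    ]
})
# In Postman: POST to `endpoint` with these headers and `body` as the raw JSON payload.
```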

For the techies in the audience, I prepared some demos in C# / Visual Studio in the form of some simple console applications and showed how to integrate the Cognitive Services APIs with existing software applications and parse the information they can provide.

Image Recognition and National Geographic

There was some interest in image recognition and surfacing insights from pictures, so we shared some ideas about how the Computer Vision APIs can be used to do that, and how it’s possible to stack various Cognitive Services APIs to add multiple layers of intelligence and provide additional insights.

For example, I shared some ideas around an interface I’d built that extracted image metadata from Instagram and Twitter for researchers at the University of Michigan as part of National Geographic’s Photo Ark Project.

This interface provided researchers with data they needed to help:

  • identify popular and engaging content
  • identify the most popular types of content
  • identify potentially viral signals in social data

In theory, by using the Vision APIs, additional information could be inferred from images such as backdrop colour, types of animal and so on.  This was used as part of a wider initiative to help promote animal welfare and increase awareness of environmental issues.
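As a sketch of what that inference might look like in code, here is a Python example that pulls high-confidence tags and the backdrop colour out of a Computer Vision "analyze"-style response. The field names are assumed from the v3 API and the sample data is made up; the 0.8 confidence cut-off is an arbitrary illustrative choice.

```python
import json

# Hypothetical Computer Vision analyze response (shape simplified).
vision_response = json.dumps({
    "tags": [
        {"name": "bird", "confidence": 0.98},
        {"name": "beak", "confidence": 0.91},
        {"name": "branch", "confidence": 0.40},
    ],
    "color": {"dominantColorBackground": "Green"},
})

def summarise(response_json, min_confidence=0.8):
    """Keep confident subject tags and the dominant backdrop colour."""
    data = json.loads(response_json)
    tags = [t["name"] for t in data["tags"] if t["confidence"] >= min_confidence]
    return {"subjects": tags, "backdrop": data["color"]["dominantColorBackground"]}

print(summarise(vision_response))  # {'subjects': ['bird', 'beak'], 'backdrop': 'Green'}
```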


It was great to speak to the attendees about the things they were doing with analytics, chatbots and social data.  The view across London from the 39th floor was impressive!
