Some of the data extracted included:
- Tweet copy
- Tweet location
- Most mentioned user
- Most popular hashtag
- Most popular location
Under the hood, the tool was also performing:
- Named Entity Recognition – to detect the existence of people, places, organisations in each tweet
- Key Phrase Extraction – to return the key talking points in each tweet as a list of strings
- Sentiment Analysis – to identify the underlying tone of each tweet
I swapped out some of my custom code and leveraged the Text Analytics API to perform the above three tasks. This removed some of the pain when processing text and meant that I had less code to debug and maintain.
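As a rough sketch of how those three tasks map onto the Text Analytics API, here is what a batch call to the v3.0 REST endpoint could look like. The endpoint and key are placeholders, and `build_payload` is a hypothetical helper – not the actual code behind the tool:

```python
import json
import urllib.request

# Placeholders - substitute your own Cognitive Services resource values.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-key>"

def build_payload(tweets):
    """Shape a batch of tweet texts into the 'documents' format
    the Text Analytics REST API expects."""
    return {
        "documents": [
            {"id": str(i), "language": "en", "text": text}
            for i, text in enumerate(tweets, start=1)
        ]
    }

def analyze(task, tweets):
    """Run one of the three tasks against the v3.0 endpoint:
    'entities/recognition/general', 'keyPhrases' or 'sentiment'."""
    req = urllib.request.Request(
        f"{ENDPOINT}/text/analytics/v3.0/{task}",
        data=json.dumps(build_payload(tweets)).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The appeal is that one payload shape serves all three tasks – only the path segment changes.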
The data is visualised in an ASP.NET web application, and after eyeballing it you can see that, over the course of a few days:
- 122,128 Tweets were processed
- #AI was the most popular hashtag, mentioned 41,309 times
- @MikeQuindazzi was the most mentioned user, with 5,892 mentions
- United States was the most popular location, mentioned 2,279 times
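Those headline numbers boil down to simple aggregations over the tweet copy. A minimal sketch of how the hashtag and mention counts could be derived (the regexes are illustrative, not the exact ones the tool used):

```python
import re
from collections import Counter

HASHTAG = re.compile(r"#\w+")
MENTION = re.compile(r"@\w+")

def top_hashtags(tweets, n=10):
    """Count hashtags (case-insensitive) across all tweet copy."""
    counts = Counter(
        tag.lower() for text in tweets for tag in HASHTAG.findall(text)
    )
    return counts.most_common(n)

def top_mentions(tweets, n=10):
    """Count @mentions (case-insensitive) across all tweet copy."""
    counts = Counter(
        user.lower() for text in tweets for user in MENTION.findall(text)
    )
    return counts.most_common(n)
```

Lower-casing before counting matters here, otherwise #AI and #ai would be tallied as two different hashtags.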
Here you can see the overall sentiment for the event – positive vibes all round!
A deeper look at the insights generated by the custom Part of Speech (POS) Tagger and the Azure Cognitive Services Text Analytics API surfaced the following:
(For some reason, the Top 10 User by Followers query blew up, so I need to check out what went on with that!)
Finally, the screenshot here shows the users with the most mentions:
Migration to Azure SQL
In the earlier post I had been running this locally on my laptop, using my free Azure credit allowance. With the volume of tweets being processed, I started to max out the free allowance and database performance suffered.
I ended up migrating my local SQL database to an Azure SQL instance using the Data Migration Assistant, which went relatively pain-free.
The only issue I had was with a few stored procedures that explicitly named a database instance, but those were fixed relatively quickly.
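The stored-procedure issue was cross-database name qualification: three-part names don't resolve the same way once the database moves. A hedged sketch of how explicitly qualified names could be stripped from the scripted procedures before migration (the database name here is a stand-in, not the real one):

```python
import re

# Stand-in for the old database name hard-coded in the procedures.
OLD_DB = "TweetAnalytics"

def strip_db_qualifier(sql: str) -> str:
    """Rewrite three-part names like [TweetAnalytics].dbo.Tweets
    (or TweetAnalytics.dbo.Tweets) down to two-part dbo.Tweets names,
    since Azure SQL resolves objects within the current database."""
    pattern = re.compile(r"\[?" + re.escape(OLD_DB) + r"\]?\.(?=\[?\w)")
    return pattern.sub("", sql)
```

Running the scripted procedure definitions through something like this before replaying them on the Azure SQL side avoids hand-editing each one.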
After the database was successfully migrated, I set my Azure SQL Database to the Basic service tier. This seemed fine initially, but the sheer volume of data quickly slowed the database down.
I found a decent price point with acceptable performance by moving up to the Standard tier with 50 DTUs – just under £70 a month.
Now that the database has been migrated to Azure SQL, I no longer have the headache of maintaining the SQL box on my older cloud server.
This exercise was useful from a few different perspectives, namely:
- identifying performance bottlenecks
- choosing optimum database settings
- estimating the total monthly cost of running Azure SQL
- estimating the total monthly cost of running Azure Cognitive Services per 100,000 records
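To give a feel for the last two estimates, the database spend can be folded into a per-record figure with some back-of-the-envelope arithmetic. The Text Analytics per-transaction price below is a placeholder assumption (check the current Azure pricing page), and treating the tweet volume as a monthly figure is a simplification for illustration:

```python
# Figures from the post: Standard tier, 50 DTUs, ~£70/month.
DB_MONTHLY_GBP = 70.0
TWEETS_PER_MONTH = 122_128  # simplification: the few-days total, as a month

# Placeholder assumption: price per 1,000 Text Analytics transactions.
TEXT_ANALYTICS_GBP_PER_1K = 1.0
TASKS_PER_TWEET = 3  # NER, key phrases, sentiment = 3 transactions per tweet

def monthly_cost_per_100k(tweets=TWEETS_PER_MONTH):
    """Rough £ cost of processing 100,000 records in a month:
    a share of the fixed DB bill plus per-transaction API charges."""
    db_share = DB_MONTHLY_GBP / tweets * 100_000
    api_cost = TEXT_ANALYTICS_GBP_PER_1K * TASKS_PER_TWEET * (100_000 / 1_000)
    return db_share + api_cost
```

The shape of the result is the useful part: the database is a fixed cost that gets cheaper per record as volume grows, while the Cognitive Services charge scales linearly with the number of tweets.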
If you’re considering migrating your legacy .NET application to Azure but aren’t sure where to start, some of what’s been discussed here might help. If you have any questions, leave a comment below or contact me on Twitter.