Introduction
This is part seven in a mini-series of posts where I’m documenting build tasks as part of the Twitter Innovation Challenge 2017 #Promote.
Last weekend I ran through one of the Microsoft Cognitive Services APIs, LUIS, and configured it to infer commercial intent from human text. Whilst this worked reasonably well, it only gives you about 10,000 “hits” a month.
It got me thinking, why not use a Bayesian Classifier I’d previously written in C# and load that with sufficient training data? Food for thought.
With limited time, this is a short post and I’ve built:
- basic Tweet view model
- method which generates a hashed version of a Twitter username
View Model
Not that much to report here:
```csharp
public class TweetViewModel
{
    public string id { get; set; }
    public string tweet { get; set; }
    public string profileImageUrl { get; set; }
    public string username { get; set; }
    public string bioDesc { get; set; }
    public string isGreenLead { get; set; }
    public string isAmberLead { get; set; }
}
```
Twitter Audiences
“Someone who has already expressed interest in your business is more likely to engage with your marketing messages. But how can you find, and reach this audience on Twitter?”
The answer to this is Tailored Audiences and you can read more about these here.
From a C# perspective, the first thing you need to do is hash each username (hashing, not encryption, is what Twitter requires here). The method below produces the SHA-256 hash which Twitter expects at their end:
```csharp
using System.Security.Cryptography;
using System.Text;

public string sha256_hash(string value)
{
    var sb = new StringBuilder();

    using (SHA256 hash = SHA256.Create())
    {
        byte[] result = hash.ComputeHash(Encoding.UTF8.GetBytes(value));

        // Build the lowercase hex string Twitter expects
        foreach (byte b in result)
            sb.Append(b.ToString("x2"));
    }

    return sb.ToString();
}
```
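As a quick sanity check, here is a sketch of calling the method. Note that Twitter’s tailored-audience guidelines expect handles to be normalised before hashing; the `@JSmith` handle and the normalisation shown are illustrative, not from the post:

```csharp
// Hypothetical usage: strip the leading @, trim and lower-case the
// handle before hashing, as Twitter's audience file specs describe.
string handle = "@JSmith ".Trim().TrimStart('@').ToLowerInvariant();

string hashed = sha256_hash(handle);  // 64-character lowercase hex string
```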
Why?
This solution will listen for commercial signals on Twitter, then automatically harvest the user ids, hash them and add them to the Tailored Audience list.
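Pulling the pieces together, the intended flow looks something like the sketch below. The `classifier` and `audienceList` names are made up for illustration; only `TweetViewModel` and `sha256_hash` come from the post:

```csharp
// Sketch: classify each incoming tweet, and if it shows commercial
// intent, hash the author's handle and queue it for audience upload.
foreach (TweetViewModel t in incomingTweets)
{
    if (classifier.HasCommercialIntent(t.tweet))  // hypothetical Bayesian classifier
    {
        string hashed = sha256_hash(t.username.ToLowerInvariant());
        audienceList.Add(hashed);  // later uploaded via the TON API
    }
}
```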
This list can then be uploaded to the Twitter TON API, and the users can then be served creatives (ads) directly in their timelines.
Finally
The TON API deserves its own post and I may cover that in the future, but for now, all of the main building blocks are in place. The next steps are to integrate them and work on the data model.