Social networking tools have become some of the most popular Internet technologies today. Of the many available, Twitter leads the pack in popularity and micro-blogging. While browsing the web you may run across sites that take advantage of Twitter’s portability by integrating a Twitter feed directly into their pages.
There are many different ways to retrieve Twitter user information, one of which is the Twitter API. The Twitter API provides a variety of functions that can be used to retrieve a user’s account data. On one hand, using these API functions reduces in-house programming; on the other hand, making repeated calls to an external API from a high-traffic site can slow page load times and generate errors, for example when rate limits are hit.
In our development group, when we need to implement a Twitter feed without affecting the performance of the site, we prefer to cache the information in a database. We do this by writing a script that runs as a scheduled service against the database. The script calls one of the common Twitter API functions to retrieve the user’s timeline. In this post I will be focusing specifically on the .NET Framework.
// Retrieve the user's timeline through the API wrapper,
// requesting the response as XML (our chosen OutputFormatType)
ABT_Twitter t2 = new ABT_Twitter();
string timelineAsXml = t2.GetUserTimeline("username", "userpass", OutputFormatType.Xml);
Now that we have the data we need from Twitter, we can begin picking it apart to store in our database. In this case we chose XML as the OutputFormatType, so first we need to create an XmlDocument and load the local string variable into it.
// Load the XML returned by the Twitter API into an XmlDocument
XmlDocument xmlDoc = new XmlDocument();
xmlDoc.LoadXml(timelineAsXml);
The next step is to create a simple DataTable that will hold each of the values we want to pull from the Twitter API, and then add a column for each value we choose to cache.
// DataTable that will hold the values we cache for each tweet
DataTable tweetDataTable = new DataTable();
tweetDataTable.Columns.Add("value1");
tweetDataTable.Columns.Add("value2");
The next thing to do is loop through each XmlNode in the XmlDocument we instantiated earlier, selecting the nodes that match a certain XPath expression. Inside this loop we create a new DataRow and set the values of the columns we added to the tweetDataTable earlier from the nodes we want to retrieve. Once all of the column values in the DataRow have been set, we add the new row to the tweetDataTable we created.
// Select every node matching the XPath expression and copy the
// desired child values into a new row of the DataTable
foreach (XmlNode node in xmlDoc.SelectNodes("xpath"))
{
    DataRow tweetDataRow = tweetDataTable.NewRow();
    tweetDataRow["value1"] = node["desiredNode1"].InnerText;
    tweetDataRow["value2"] = node["desiredNode2"].InnerText;
    tweetDataTable.Rows.Add(tweetDataRow);
}
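To make those placeholders concrete: if the wrapper returns the standard (v1) user_timeline XML, each tweet is a status node under the statuses root, with children such as text and created_at. Under that assumption, the loop might look like the sketch below; the node names here are assumptions about the response shape, not part of our wrapper.

// Sketch assuming the v1 user_timeline XML shape:
// <statuses><status><text>...</text><created_at>...</created_at></status>...</statuses>
foreach (XmlNode node in xmlDoc.SelectNodes("statuses/status"))
{
    DataRow tweetDataRow = tweetDataTable.NewRow();
    tweetDataRow["value1"] = node["text"].InnerText;       // tweet body
    tweetDataRow["value2"] = node["created_at"].InnerText; // timestamp
    tweetDataTable.Rows.Add(tweetDataRow);
}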
Now we have retrieved the information we need from the Twitter API and placed it into a DataTable. Next, we wrap the database work in an if statement to make sure the DataTable actually contains data, which keeps our script from performing unnecessary actions. We also need to truncate the table we store our Twitter information in, to eliminate the possibility of redundant data. After the table has been truncated, the script loops through each row in the tweetDataTable we populated earlier and inserts the data associated with each “tweet.”
if (tweetDataTable.Rows.Count > 0)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();

        // Empty the cache table first so we never store redundant data
        SqlCommand truncateCommand = new SqlCommand("truncateStoredProcedureName", connection);
        truncateCommand.CommandType = CommandType.StoredProcedure;
        truncateCommand.ExecuteNonQuery();

        // Insert one row per cached tweet
        for (int i = 0; i < tweetDataTable.Rows.Count; i++)
        {
            SqlCommand insertCommand = new SqlCommand("insertStoredProcedureName", connection);
            insertCommand.CommandType = CommandType.StoredProcedure;
            insertCommand.Parameters.Add(new SqlParameter("@parameter1", tweetDataTable.Rows[i]["value1"]));
            insertCommand.Parameters.Add(new SqlParameter("@parameter2", tweetDataTable.Rows[i]["value2"]));
            insertCommand.ExecuteNonQuery();
        }
    }
}
And voilà! All of the information has been retrieved from the Twitter API and cached in a local database. The idea behind this method is to run the script only on a schedule, so the Twitter API is not called every time the page loads. With the data stored locally, retrieving it takes nothing more than a simple database call.
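For completeness, here is a minimal sketch of that simple database call. It assumes a hypothetical stored procedure, selectStoredProcedureName, that selects the cached rows; the page can then bind directly to the returned DataTable.

using System.Data;
using System.Data.SqlClient;

// Minimal sketch: read the cached tweets back out of the local table.
// "selectStoredProcedureName" is a hypothetical stored procedure that
// simply returns the cached rows.
public static DataTable GetCachedTweets(string connectionString)
{
    DataTable cachedTweets = new DataTable();
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand selectCommand = new SqlCommand("selectStoredProcedureName", connection))
    {
        selectCommand.CommandType = CommandType.StoredProcedure;
        SqlDataAdapter adapter = new SqlDataAdapter(selectCommand);
        adapter.Fill(cachedTweets); // the adapter opens and closes the connection itself
    }
    return cachedTweets;
}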