Twitter OAuth with PHP and cURL multi

I am maintaining a website that aggregates tweets from a large number of Twitter accounts. For this purpose we have been given an account with a raised API rate limit – normally you can call the Twitter API 150 times per hour, whereas we have a limit of 20,000 hourly requests, and we use most of them to keep up a near real-time picture of the activity of the user base.

Recently Twitter disabled regular basic HTTP authentication, supporting only OAuth – which generally is a good thing. In this case we simply read the RSS version of the user_timeline, and OAuth seems like overkill for reading a simple public timeline. Anyway – we are using PHP with cURL multi, and OAuth introduces a bit of extra code. Sure, there are plenty of OAuth libraries – and OAuth libraries for Twitter – out there, but the specific combination of PHP, cURL multi and GET'ing user_timelines required a combination of Googling and a bit of guesswork.

Preparations

– First, register your application with Twitter – you'll need a couple of keys and secrets from your app page and from the specific token page.
– If you need to make more than 150 requests per hour, apply for whitelisting here. It'll take a while for Twitter to process your request.
– Make sure you have the cURL extension for PHP installed

Specifically preparing the GET connections for cURL presented a challenge – this piece of code did the trick for us:

<?php
// $url_array is an array of Twitter RSS-feed URLs that we need to read
$mh = curl_multi_init();

// Keys and secrets from http://dev.twitter.com/apps/
$consumer_key    = '<get your own from http://dev.twitter.com/apps/_your_app_id_/>';
$consumer_secret = '<get your own from http://dev.twitter.com/apps/_your_app_id_/>';
$oauth_token     = '<get your own from http://dev.twitter.com/apps/_your_app_id_/my_token>';
$token_secret    = '<get your own from http://dev.twitter.com/apps/_your_app_id_/my_token>';

foreach ($url_array as $i => $item) {
	// Build up an array of OAuth parameters
	$params = array();
	$params['oauth_consumer_key']     = $consumer_key;
	$params['oauth_token']            = $oauth_token;
	$params['oauth_signature_method'] = 'HMAC-SHA1';
	$params['oauth_timestamp']        = time();
	// The nonce must be unique per request - a hash of the timestamp alone
	// would repeat across all the requests prepared in the same second
	$params['oauth_nonce']            = md5(uniqid(mt_rand(), true));
	$params['oauth_version']          = '1.0';

	// Sort the array alphabetically by key, as the OAuth spec requires
	ksort($params);

	// Build the parameter string for the signature base string
	// (rawurlencode gives the RFC 3986 encoding OAuth expects)
	$pairs = array();
	foreach ($params as $k => $v) {
		$pairs[] = rawurlencode($k) . '=' . rawurlencode($v);
	}
	$concatenatedParams = implode('&', $pairs);

	// Sign the base string with the consumer secret and token secret
	$signatureBaseString = 'GET&' . rawurlencode($item['url']) . '&' . rawurlencode($concatenatedParams);
	$params['oauth_signature'] = base64_encode(
		hash_hmac('sha1', $signatureBaseString, rawurlencode($consumer_secret) . '&' . rawurlencode($token_secret), true)
	);

	// Put all OAuth parameters - including the signature - into a single
	// Authorization header; each value must be quoted and percent-encoded
	$headerParts = array();
	foreach ($params as $k => $v) {
		$headerParts[] = rawurlencode($k) . '="' . rawurlencode($v) . '"';
	}
	$curlheader = array('Authorization: OAuth ' . implode(', ', $headerParts));

	// Initiate a new cURL connection and assign URL + HTTP header to it
	$conn[$i] = curl_init($item['url']);
	curl_setopt($conn[$i], CURLOPT_HTTPHEADER, $curlheader);
	curl_setopt($conn[$i], CURLOPT_HTTPGET, 1);
	curl_setopt($conn[$i], CURLOPT_RETURNTRANSFER, 1);

	// Add the connection to the cURL multi handle
	curl_multi_add_handle($mh, $conn[$i]);
}

// Now execute curl_multi_exec on the $mh handle
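The last step – running the multi handle and collecting the responses – could look something like this. It is a minimal sketch, wrapped in a hypothetical helper function so it stands on its own; it assumes the handles in $conn were prepared and added to $mh as above, with CURLOPT_RETURNTRANSFER set so the bodies can be read with curl_multi_getcontent():

```php
<?php
// Sketch: drive the multi handle until all transfers finish, then
// collect each response body, keyed like the original handle array.
function curl_multi_fetch($mh, array $conn) {
	$running = null;
	do {
		curl_multi_exec($mh, $running);
		// Wait for activity on any handle instead of busy-looping;
		// curl_multi_select can return -1, in which case back off briefly
		if ($running > 0 && curl_multi_select($mh) === -1) {
			usleep(100000);
		}
	} while ($running > 0);

	// Collect the bodies (requires CURLOPT_RETURNTRANSFER on each handle)
	$responses = array();
	foreach ($conn as $i => $ch) {
		$responses[$i] = curl_multi_getcontent($ch);
		curl_multi_remove_handle($mh, $ch);
		curl_close($ch);
	}
	curl_multi_close($mh);
	return $responses;
}
```

The returned array is keyed like $url_array, so each feed can be matched back to the account it came from.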
This can no doubt be improved and secured in many ways, but since this specific example runs as a cron job under the PHP CLI – not on a server that accepts inbound connections – the keys are not secured any further in this implementation.

Ubuntu 10.04 – up with only one quirk so far

This morning I let my Ubuntu 9.10 upgrade itself to 10.04. The upgrade itself ran fine with only a few interruptions where I had to confirm or deny changes to configuration files for PHP etc.

As I landed at the office and hooked up the laptop (Thinkpad T61P) however, my external monitor was flickering a lot. Tried booting into Windows and everything was fine. Google’d and found this post on Ubuntu Forums.

The post suggests making changes to menu.lst in /boot/grub. This is a somewhat short-term solution, because the problem will probably reappear at the next kernel upgrade (which could be just around the corner). A proper fix may be on its way from Ubuntu, but I chose a different approach, which may also seem hackish – it should be a bit more robust, though.

Disclaimer: If you are not sure what you are doing, be very careful. Misspellings or deviations from the description below may lead to an unbootable system. This worked for me, but it may not work for you. Make sure to back up any files you change.

OK? Let’s go..

First start a terminal (ALT-F2, enter “gnome-terminal”):

cd /etc/grub.d

Then:

sudo nano 10_linux

which will open an editor. Around line 96, inside the linux_entry function, is a line that looks like this:

linux    ${rel_dirname}/${basename} root=${linux_root_device_thisversion} ro ${args}

and it should be changed to:

linux    ${rel_dirname}/${basename} root=${linux_root_device_thisversion} ro radeon.modeset=0 ${args}

Having done this, you should ask GRUB to rebuild its configuration files with:

sudo update-grub

Reboot and go on without the flicker..
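For reference, the same substitution can be scripted with sed instead of editing 10_linux by hand. This is a sketch demonstrated on a sample line, not on the real file – if you script it for real (e.g. with sed -i.bak on /etc/grub.d/10_linux), run it on a copy first and check the result before update-grub:

```shell
# The line as it appears in /etc/grub.d/10_linux (single quotes keep
# the ${...} placeholders literal - they are GRUB script variables)
line='linux    ${rel_dirname}/${basename} root=${linux_root_device_thisversion} ro ${args}'

# Insert radeon.modeset=0 just before ${args}
echo "$line" | sed 's/ro ${args}/ro radeon.modeset=0 ${args}/'
```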

Will ideas of the Massachusetts law spread?

On reddit, I found this article regarding a new law in Massachusetts that is intended to increase the security of personally identifiable information.

If you have personally identifiable information (PII) about a Massachusetts resident, such as a first and last name, then you have to encrypt that data on the wire and as it’s persisted. Sending PII over HTTP instead of HTTPS? That’s a big no no. Storing the name of a customer in SQL Server without the data being encrypted?  No way, Jose. You’ll get a fine of $5,000 per breach or lost record. If you have a database that contains 1,000 names of Massachusetts residents and lose it without the data being encrypted that’s $5,000,000. Yikes.

The idea is both good (yay – better security for anyone registering on a website) and bad (expensive). As a consultant in a country not covered by the law I couldn't care less, but is this in general a good idea? Securing data and sending sensitive information across secure connections is always a good idea, and similar rules could hit the EU soon. Documenting how you secure personally identifiable information – who can argue with that being a good idea? As an IT consultant you won't hear me cry myself to sleep.

But the proportions of the legislation seem unclear. If you log on to my website (securely, of course) and I then feed you pages through clear-text HTTP – can I be fined if your name appears on a page, as in "Reader XYZ is logged in"?

I guess so.

It strikes me that there is no level of sensitivity defined – anything considered personal must be secured. As a legislator, it seems very easy to do this when you don’t have to pick up the bill.

If this kind of legislation should hit Europe, I hope someone would elaborate a bit on the dos and don'ts:

  • Are all kinds of data included? Can I be fined if you, as a Massachusetts citizen, post a comment to this and include your name? (the database is secured, but nothing is sent through HTTPS)
  • Would it make sense to allow users to waive their rights, and thereby allow them to work with online applications that are not 100% secure?