“People often say they could run Britain better than the political parties. A web-based revolution may give them the chance”
Nice article in the Sunday Times today mentioning lots of our sites and others.
Last week at the SustainIT eWell-Being Awards, we picked up an award for FixMyStreet. The judges said it was “[a]n excellent example of an independent website which empowers the general public in their dealings with their local council. It is a relatively simple application, yet highly effective and replicable.” One example the accompanying Independent supplement mentioned was “a community in Great Yarmouth which joined forces through FixMyStreet to clear their local unused railway track. The site made possible a dialogue between community members and the council’s community development worker, who organised a “clear up” day where locals could get involved with rectifying the situation, with tools, insurance and even a barbeque provided.” It’s great to see that sort of thing happening on the site, and also great to be recognised in this way.
In a spirit of celebration (though more to celebrate the endorsements the campaign has received), TheyWorkForYou now covers the Scottish Parliament – see the TheyWorkForYou news for more information.
“Residents turn to web in lane fight” describes a set of problems reported on FixMyStreet.
A NARROW mountain lane has been damaged and turned into an “international playground” for 4x4s and satnav-guided lorries, angry villagers claimed yesterday.
The interesting thing is the claim that making the reports public really helped pressurise the authority into fixing them:
one [resident] claimed the local authority had now been “embarrassed” into action by the complaints on fixmystreet.com
I’ve just come back from a hectic week away – visiting friends and working while there. Teleworking works well: I spent Monday, Tuesday and Wednesday working in a Welsh valley, staying at my friend Ben’s house.
Ben has been working for the Media Standards Trust, on their fab new site Journalisted. It’s a kind of TheyWorkForYou-style site, but one which gathers info and stats about journalists (as opposed to MPs). I’ve added links to it from TheyWorkForYou, so if you browse to the page of an MP who is also a journalist, you can easily look up what they’ve written.
Hopefully this will put some much-needed scrutiny on journalists – rewarding the good ones, who are both allowed the time to research stories properly and energetic enough to do so. Now, what will be the next Great British Institution to have some sunlight shone on it…
Ben and I whizzed to London on the train to catch Steve Coast’s talk last night. He was, as usual, excellent. We also had some great questions – it was interesting to hear a detailed account of the financial side of Open Street Map. Afterwards we had Italian food and beer. mySociety’s next disruptive tech talk is at the end of November. It’s by Jason Kitcat of the Open Rights Group, talking about electronic voting.
I’m glad to be back home after a week away.
Hello, all. Just wanted to let you know about another podcast interview I did about PledgeBank…this time on the 501c3Cast. This podcast is geared toward folks in the nonprofit/philanthropic world (the tax code classification for many nonprofits in the U.S. is “501(c)3”) to spread the word about new tools, ideas, and conversations in the sector. You can find the podcast here (though, be forewarned, it’s a bit long…).
Last night was the annual New Statesman New Media Awards, held in Westminster Abbey’s College Gardens. mySociety were finalists in two categories, Modernising Government and Contribution to Civic Society, with Number 10 petitions and FixMyStreet each nominated in both. Also, two other projects we host, PlanningAlerts and The Government Says, were finalists in the Information & Openness category.
It was a lovely evening, seeing some people I haven’t seen for some time and meeting new people too. We ended up winning in both our categories – the Number 10 petitions site in Modernising Government, and FixMyStreet in Contribution to Civic Society, which is obviously fantastic for everyone involved. The judges were impressed at the open source nature of the petitions site, and the “deceptive simplicity” of FixMyStreet. This is now the third year in a row we’ve won the Civic Society award – TheyWorkForYou won in 2005, and WriteToThem in 2006, so we’re obviously doing something right. 🙂
It’s a shame that Chris could not be with us, but his mother did attend to see the projects he worked on recognised.
Thanks and congratulations to all the other winners and finalists.
The “um’s” and “uh’s” on the interview are embarrassing, to say the least, but PledgeBank just got its first podcast coverage. Check out my NetSquared interview here. I’m hoping this gets us a bit of exposure in the U.S.
And I’ll do better next time with all the incidental sounds… 🙂
Hello, all…this is Heather, PledgeBank evangelist in the U.S. Francis just hooked me up with the ability to post to the developers’ blog, so I thought I’d give it a whirl…
We’re starting to get a little momentum around PB…I sent out emails this week to folks working at state networks of volunteer centers across the country and have gotten quite a few responses. This isn’t huge press or anything, but it’s good to know that folks who work at volunteer centers full-time think there’s enough value to the site on first glance that they’ll give me a chance to chat with them. I’ll keep you updated on how those phone calls and meetings go, and might be asking for some more specific tech details behind the site as questions arise.
It might also be of interest to folks that I’m headed this afternoon to Georgia for a retreat with about 25 of the most prominent bloggers in the country and a handful of progressive funders…all talking about the state of and future of the blogosphere. You can see more about the meeting at http://neworganizing.com. I helped do a lot of the logistics for the retreat, and am looking forward to meeting some of these mysterious folks this weekend!
Neighbourhood Fix-It makes it as easy as possible for citizens across the UK to report local problems like fly tipping, broken lights and graffiti, whilst opening the problems up to browsing and public discussion of solutions.
The problem tackled
Councils across the UK do an excellent job of fixing local problems when they’re reported by citizens. However, the model for handling the information is a system of doctor-patient style confidentiality. A citizen who makes a report normally knows about a problem, and so does the council, but there is no general public way of finding out what has been reported or fixed.
Given that the problems being reported are public by their very nature, this seems a strange situation.
Neighbourhood Fix-It opens up and democratises the process of discovering and reporting problems, so people can see what other reports have been filed locally using the site, and can leave extra feedback and comments on the problems if they see fit.
The site has been in quiet beta test for a few weeks prior to launch, and several hundred problems have already been reported across the UK. Fixes by councils so far include:
- Paving slabs fixed
- Redundant estate agent signs removed
- Potholes filled
- Graffiti removed
Funding and Partnership
The project was funded with £10,000 of support from the Department for Constitutional Affairs Innovations Fund, and is a partnership with the Young Foundation’s Transforming Neighbourhoods Programme, a consortium of 15 local authorities, government departments and community organisations working together on practical ways to give more powers to neighbourhoods.
Tom’s quote from the press release: “Neighbourhood Fix-It aims to change the act of reporting faults – turning it from a private one-to-one process into a public experience where residents can see if anyone else in the neighbourhood has already spotted and reported a problem, and to see how their council is acting on it. We hope the website will make the process of reporting faults more efficient, possibly reducing the number of individual reports that councils receive because people will be able to see that their neighbours have already made the call.”
So, PledgeBank got Slashdotted a couple of weeks ago when Mike Liveright’s $100 laptop pledge was linked from a post about the laptop. We didn’t cope very well.
Unfortunately, PledgeBank is a pretty slow site. Generating the individual pledge page (done by mysociety/pb/web/ref-index.php) can take anything up to 150ms. That’s astonishingly slow, given the speed of a modern computer. What takes the time?
It’s quite hard to benchmark pages on a running web server, but one approach that I’ve found useful in the past is to use an analogue of phase-sensitive detection. Conveniently enough, all the different components of the site — the webserver, the database and the PHP process — run as different users, so you can easily count up the CPU time being used by the different components during an interval. To benchmark a page, then, request it a few times and compute the amount of CPU time used during those requests. Then sleep for the same amount of time, and compute the amount of CPU time used by the various processes while you were sleeping. The difference between the values is an estimate of the amount of CPU time taken servicing your requests; by repeating this, a more accurate estimate can be obtained. Here are the results after a few hundred requests to http://www.pledgebank.com/100laptop, expressed as CPU time per request in ms:
Subsystem     User    System
apache        ~0      ~0
PostgreSQL    55±9    6±4
PHP           83±8    4±4
(The code to do the measurements — Linux-specific, I’m afraid — is in mysociety/bin/psdbench.)
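If you don’t want to dig out psdbench, the same trick can be sketched in a few lines of Python. This is illustrative only: the user names, URL and request count are assumptions, and like the original it is Linux-specific (it sums per-user CPU time from /proc).

```python
import os
import pwd
import time
import urllib.request

def cpu_seconds(user):
    """Total CPU time (user + system, in seconds) of all live processes
    owned by `user`, summed from /proc. Linux-specific."""
    uid = pwd.getpwnam(user).pw_uid
    ticks_per_sec = os.sysconf('SC_CLK_TCK')
    total_ticks = 0
    for pid in os.listdir('/proc'):
        if not pid.isdigit():
            continue
        try:
            if os.stat('/proc/' + pid).st_uid != uid:
                continue
            with open('/proc/' + pid + '/stat') as f:
                # Split after the ')' that closes the comm field, so process
                # names containing spaces don't shift the columns.
                fields = f.read().rsplit(')', 1)[1].split()
            total_ticks += int(fields[11]) + int(fields[12])  # utime, stime
        except OSError:
            continue  # process went away while we were looking
    return total_ticks / ticks_per_sec

def bench(url, users, n=100):
    """Estimate CPU milliseconds per request attributable to each user."""
    before = {u: cpu_seconds(u) for u in users}
    start = time.time()
    for _ in range(n):
        urllib.request.urlopen(url).read()
    elapsed = time.time() - start
    active = {u: cpu_seconds(u) - before[u] for u in users}

    # Now idle for the same wall-clock interval to measure background load,
    # and subtract it out (the "phase-sensitive detection" part).
    before = {u: cpu_seconds(u) for u in users}
    time.sleep(elapsed)
    idle = {u: cpu_seconds(u) - before[u] for u in users}

    return {u: (active[u] - idle[u]) * 1000.0 / n for u in users}
```

Something like bench('http://www.pledgebank.com/100laptop', ['apache', 'postgres', 'pbphp']) would then produce figures like the table above; repeating the measurement and averaging tightens the error bars.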
So that’s pretty shocking. Obviously if you spend 150ms of CPU time on generating a page, then the maximum rate at which you can serve users is ~1,000 / 150 ≈ 7 requests/second/CPU, which is pretty miserable given that Slashdot can relatively easily drive 50 requests/second. But the really astonishing thing about these numbers is the ~83ms spent in the PHP interpreter. What’s it doing?
The answer, it turns out, is… parsing PHP code! Benchmarking a page which consists only of this:
<?
/* ... */
require_once '../conf/general';
require_once '../../phplib/db.php';
require_once '../../phplib/conditional.php';
require_once '../phplib/pb.php';
require_once '../phplib/fns.php';
require_once '../phplib/pledge.php';
require_once '../phplib/comments.php';
require_once '../../phplib/utility.php';
exit;
?>
reveals that simply parsing the libraries we include in the page takes about 35ms per page view! PHP, of course, doesn’t parse the code once and then run the bytecode in a virtual machine for each page request, because that would be too much like a real programming language (and would also cut into Zend’s market for its “accelerator” product, which is just an implementation of this obvious idea for PHP).
So this is bad news. The neatest approach to fixing this kind of performance problem is to stick a web cache like squid in front of the main web site; since the pledge page changes only when a user signs the pledge, or a new comment is posted, events which typically don’t occur anywhere near as frequently as the page is viewed, most hits ought to be servable from the cache, which can be done very quickly indeed. But it’s no good to allow the pledge page to just sit in cache for some fixed period of time (because that would be confusing to users who’ve just signed the pledge or written a comment, an effect familiar to readers of the countless “Movable Type” web logs which are adorned with warnings like, “Your comment may take a few seconds to appear — please don’t submit twice”). So to do this properly we have to modify the pledge page to handle a conditional GET (with an If-Modified-Since: or If-None-Match: header) and quickly return a “304 Not Modified” response to the cache if the page hasn’t changed. Unfortunately if PHP is going to take 35ms to process such a request (ignoring any time in the database), that still means only 20 to 30 requests/second, which is better but still not great.
(For comparison, a mockup of a perl program to process conditional GETs for the pledge page can serve each one in about 3ms, which isn’t much more than the database queries it uses take on their own. Basically that’s because the perl interpreter only has to parse the code once, and then it runs in a loop accepting and processing requests on its own.)
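For illustration, here’s a minimal sketch of that idea in Python rather than perl (the actual mockup isn’t shown in this post): a long-lived process parses its code once, then sits in a loop answering conditional GETs, returning a cheap 304 whenever the client’s If-None-Match validator still matches. The ETag value and page body are invented placeholders.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-ins: on the real site the validator would be derived from
# pledge_last_change_time() and the body from the page generator.
PAGE_ETAG = '"pledge-last-changed-1161851234"'
PAGE_BODY = b"<html>...full pledge page...</html>"

class PledgeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get('If-None-Match') == PAGE_ETAG:
            # Cheap path: the page hasn't changed, so tell the cache to
            # keep using its stored copy.
            self.send_response(304)
            self.end_headers()
            return
        # Expensive path: generate the full page, with a validator the
        # cache can send back next time.
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.send_header('ETag', PAGE_ETAG)
        self.end_headers()
        self.wfile.write(PAGE_BODY)

    def log_message(self, *args):
        pass  # keep the demo quiet

# To serve: HTTPServer(('', 8000), PledgeHandler).serve_forever()
```

The point is structural: all the parsing happens once at startup, so each 304 costs only the header comparison and whatever database lookup feeds the validator.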
However, since we unfortunately don’t have time to rewrite the performance-critical bits of PledgeBank in a real language, the best we can do is to try to cut the number of lines of library code that the site has to parse on each page view. That’s reduced the optimal case for the pledge page — where the pledge has not changed — to this:
<?
/* ... */
require_once '../conf/general';
require_once '../../phplib/conditional.php';
require_once '../../phplib/db.php';
/* Short-circuit the conditional GET as soon as possible -- parsing the rest of
 * the includes is costly. */
if (array_key_exists('ref', $_GET)
        && ($id = db_getOne('select id from pledges where ref = ?', $_GET['ref']))
        && cond_maybe_respond(intval(db_getOne('select extract(epoch from pledge_last_change_time(?))', $id))))
    exit();
/* ... */
?>
— that, and a rewrite of our database library so that it didn’t use the gigantic and buggy PEAR one, has got us up to somewhere between 60 and 100 reqs/sec, which while not great is enough that we should be able to cope with another similar Slashdotting.
For other pages where interactivity isn’t so important, life is much easier: we can just emit a “Cache-Control: max-age=…” header, which tells squid that it can re-use that copy of the page for however long we specify. That means squid can serve that page at about 350 reqs/sec; unfortunately the front page isn’t all that important (most users come to PledgeBank for a specific pledge).
There’s a subtlety to using squid in this kind of (“accelerator”) application which I hadn’t really thought about before. Which page you get for a particular URL on PledgeBank (as on lots of other sites) varies based on the content of various headers sent by the user, such as cookies, preferred languages, etc.; for instance, if you have a login cookie, you’ll see a “log out” link which isn’t there if you’re an anonymous user. HTTP is set up to handle this kind of situation through the Vary: header, which the server sends to tell clients and proxies on which headers in the request the content of the response depends. So, if you have login cookies, you should say, “Vary: Cookie”, and if you do content-negotiation for different languages, “Vary: Accept-Language” or whatever.
PledgeBank has another problem. If the user doesn’t have a cookie saying which country they want to see pledges for, the site tries to guess, based on their IP address. Obviously that makes almost all PledgeBank pages potentially uncachable — the Vary: mechanism can’t express this dependency. That’s not a lot of help when your site gets featured on Slashdot!
The (desperately ugly) solution? Patch squid to invent a header in each client request, X-GeoIP-Country:, which says which country the client’s IP address maps to, and then name that in the Vary: header of the outgoing pledge pages. It’s horrid, but it seems to work.
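On the application side, the arrangement might look something like this sketch (a hypothetical helper, not PledgeBank’s actual code): trust the invented X-GeoIP-Country header that the patched squid injects, prefer an explicit cookie choice if there is one, and name the header alongside Cookie in Vary: so squid keeps a separate cached copy per country.

```python
def pledge_page_headers(request_headers, cookie_country=None):
    """Pick which country's view of a page to serve, and emit response
    headers that let squid cache one copy per (cookie, country) pair."""
    # Prefer an explicit cookie choice; otherwise trust the geo header
    # that the patched squid derived from the client's IP address.
    country = cookie_country or request_headers.get('X-GeoIP-Country', 'GB')
    headers = [
        ('Content-Type', 'text/html; charset=utf-8'),
        # Naming X-GeoIP-Country here is the whole trick: squid now treats
        # requests from different countries as different cache entries.
        ('Vary', 'Cookie, X-GeoIP-Country'),
    ]
    return country, headers
```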