Faces and Feeds

I think I might have to develop an app for reading Facebook the way I think it should work.

There was an article doing the rounds the other week about how “our minds can be hijacked,” which was all about how terrible social networking is for us. I skimmed part of it, but got annoyed when it seemed to be about rich Silicon Valley entrepreneurs deciding to go “off-grid.” That’s all very well for them, but most of us have to make a living.

More pertinently, since the main target for the attack was Facebook, it annoyed me because I use Facebook to keep in touch with people that I might otherwise not. For that, it can be very good.

And yet… it struck a chord with me, to some degree. I realised that Facebook has increasingly become more of a time sink than a pleasure. Not that I spend vast amounts of time on it each day, but when I do open it up, I often end up spending longer than I’d have wanted to. And not reading updates from friends and family, but following links to articles and quizzes and nonsense, most of which I wish I hadn’t bothered with.

By comparison, a similar length of time spent in my feed reader lets me read blog pieces by people I actively want to hear from, and which I’m generally glad I’ve read.

But they mostly aren’t friends and family.

And then there’s the fact that the Facebook algorithm is tuned to show me what it thinks I should see, not what I want to see. What I want to see is all the updates from my friends, in reverse-chronological order. And that’s all. But there’s no guarantee that it will show me everything everyone posts, and the order is close to random at times.

One way to work round this is to visit people’s individual Facebook pages. You could see all the posts by all your friends by going to each of their profiles in turn. But that would mean you’d have to keep track of all that: remember who you visited and when, and somehow manage the list of people.

Keeping track of things is what computers are good at. The software should be doing that for us.

So I’m thinking that what I want is an app that will do that for me: that will keep a list of my Facebook friends, and show me all their posts (which of course is what Facebook used to do).

As far as I know, no such app exists. This seems strange and unlikely, but I don’t think Facebook make a public API available for third-party clients, so such an app would have to work by scraping the web pages, which is neither good practice nor much fun.

Of course, what this means is effectively turning Facebook back into a set of RSS feeds — or now, especially as I have some experience with them, a set of JSON Feed feeds. Which would then be usable in all sorts of other places.
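To sketch what that output might look like, here’s a minimal generator that turns a friend’s posts into a JSON Feed document. The field names follow the JSON Feed spec; the post structure (`id`, `url`, `text`) is an invented placeholder for whatever the scraper — the hard, unshown part — would actually produce.

```python
import json

def posts_to_json_feed(friend_name, posts):
    """Turn a list of scraped posts into a JSON Feed (version 1) document.

    `posts` is assumed to be a list of dicts with 'id', 'url' and 'text'
    keys -- a hypothetical shape for the scraper's output.
    """
    feed = {
        "version": "https://jsonfeed.org/version/1",
        "title": "Facebook posts by %s" % friend_name,
        "items": [
            {
                "id": post["id"],
                "url": post["url"],
                "content_text": post["text"],
            }
            for post in posts
        ],
    }
    return json.dumps(feed, indent=2)

print(posts_to_json_feed("Alice", [
    {"id": "1", "url": "https://example.com/1", "text": "Hello!"},
]))
```

One feed file per friend, and any feed reader that understands JSON Feed could then subscribe to them.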

Web scraping may be bad and painful; still, I think I want to write this thing. Watch this space.

The Sound of Audio Formats

Amusing that in the same week that I post a criticism of software patents, the final patents on the MP3 format expired. Some people are characterising this as the “death” of MP3, which is just nuts.

In fact, far from being dead, it can finally come to life, as Marco Arment makes clear.

Software patents: they’re what needs to die.

In other software-and-the-law news, here’s a story about a case of alleged GPL violation coming to court. The judge so far seems to have made a good decision, ruling that the existence of the GPL, together with the defendant company’s use of the software, means there was a contract in place.

Misbehaviour Again

I’m sure you all pay great attention to the goings on at this here blog. You’ll almost certainly have noticed things going very strange yesterday, with the same post being repeated three or four times, in various forms and ways.

No? Well, in case you missed it: what we had was (probably) a glitch caused by a WordPress plugin. Or maybe not. Maybe it was something else entirely. Really, we’ll have to see what happens when this one posts.

But I’ve turned off some of the sharing features for now. So you might not even see this if you’re used to being notified via Facebook or Twitter.

Actually, since that’s where most of the interaction comes from, it would be interesting to know who, if anyone, is not reading it that way. Is anyone subscribed to the feed? That’s how I still do most of my blog reading.

Things That Should be Easy

It ought to be easy to install a software package on Linux. I mean, it usually is. All modern distros ship with package managers, right? So all you should have to do is type (Debian-based example):

sudo apt-get install PACKAGE-NAME

and away you go. Right?

Well, usually. But today, not for me.

I have a NAS box from Western Digital, which is really a little Linux server with a biggish disk drive. Some time ago I replaced the shipped distro with a newer one, but it was so long ago, and it’s been so quiet and reliable that I can’t remember what version, exactly.

So first, there seems to be no way to interrogate it to see what distro it is. I mean, there must be, and this page lists several ways, but none of them work on this box. I mean, uname shows me the kernel version and all that, but not the distro.
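For what it’s worth, on recent distros the standard place to look is the /etc/os-release file, which is just KEY=value lines; a few lines of Python can pull out the name and version. This is a sketch, and older or stripped-down systems — quite possibly including this NAS — simply don’t ship the file, which may be exactly the problem:

```python
def parse_os_release(text):
    """Parse the KEY=value lines of an os-release file into a dict.

    Values may be quoted; blank lines and comments are skipped.
    """
    info = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        info[key] = value.strip().strip('"').strip("'")
    return info

# On a real system you would read the file itself:
#     with open("/etc/os-release") as f:
#         info = parse_os_release(f.read())
sample = 'NAME="Debian GNU/Linux"\nVERSION_ID="9"\nID=debian\n'
print(parse_os_release(sample)["NAME"])  # Debian GNU/Linux
```

(`lsb_release -a` and `cat /etc/issue` are the other usual suspects, but as noted, none of them worked here.)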

Anyway, all that doesn’t really matter. I was only doing it to install Node, and I was only wanting to install Node so that I could run AirSonos. We got a Sonos Play:1 for the kitchen recently, and it’s great, but the one weakness is that it doesn’t support playing from an arbitrary source on your phone, such as, say, your podcast app of choice (Overcast, obvs).

AirSonos is supposed to effectively turn the Sonos into an AirPlay speaker, so you can easily send audio to it from iOS devices. And you want it to be running on a server, so it’s available all the time.

But it turns out that Node does not want to install on my NAS. Either by apt-get, as above, or by downloading the binary and unpacking it. (That installs it, obviously, but it won’t run.)

I’m going to try running SonoAir on my MacBook. That’s a wrapper round AirSonos, and obviously it’ll only work (assuming it does at all) when my MacBook is awake. But life’s too short.

Some Thoughts On Software Development

Before the job interview that I mentioned the other day, the company asked me to answer some questions in writing. I didn’t get the job, but I was pleased with my written answers (and they presumably helped me to get the interview, at least). So I thought I’d reuse them as a blog post. None of this should be surprising for anyone who knows anything about the software development field, but it’s interesting to reflect on how things have changed across my career.

What are some of the fundamental changes in your approach to software development you have adopted in the last few years?

There are two main changes that are fundamental and independent of languages and deployment environments: agile techniques and test-driven development (TDD).

Agile

Moving from waterfall to agile development was probably the most significant change to development practices in the industry. We always knew that breaking work down into smaller units led to better estimating, more modular code, and just better software. The genius of agile was to extend that understanding to the period of time spent on a block of work. A two-week sprint, with its work being specifically estimated, planned and developed, is just infinitely more manageable than a project phase lasting months.

Add to that:

  • self-organising teams which include someone from the customer or end user — or at least someone whose role is to represent the user;
  • accepting that change will happen, and embracing it;
  • and the discipline of saying that some features won’t be developed;

and we have a recipe for success.

TDD

Good developers always understood that testing was essential, and did it. But they used to follow a written test plan, or just have an idea of what needed to be tested and work to that. Testing was manual, hard to repeat, and error-prone.

TDD brought automation. So instead of writing a document listing the required tests, we can write code. That inherently makes the tests rerunnable, so regressions get caught before they become a problem.

But almost more important than that is the idea of writing the tests first. In an ideal world you write a comprehensive set of tests, write functional code until all the tests pass, and you’re done. It may not always work out exactly like that — in particular, adding tests to a mature codebase can be problematic — but writing tests first encourages us to write code that is easy to test, which tends to lead to better-designed, more modular code.
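As a toy illustration of that test-first rhythm — the function and cases here are invented for the example, and in Java the equivalent would be a JUnit test class — the test is written first, then just enough code to make it pass:

```python
import unittest

# Step 1: write the test first, describing the behaviour we want.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_joins_with_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_surrounding_whitespace(self):
        self.assertEqual(slugify("  Trim me  "), "trim-me")

# Step 2: write just enough code to make the tests pass.
def slugify(title):
    return "-".join(title.lower().split())

# Running `python -m unittest this_module` now shows both tests passing,
# and any later change to slugify() that breaks them is caught at once.
```

The tests also read as a small specification of `slugify`, which is the documentation benefit mentioned below.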

An added bonus is that the tests can help to document the code, by showing our expectations. And of course they make refactoring easy and safe, as long as they are in place before you start.

If you were to start your last project over again, what would you do differently?

The project I’m thinking of involved rewriting the product’s GUI into a modern, responsive, browser-independent form, using HTML 5 and Twitter Bootstrap.

The existing version was an old frames-based web app that only worked fully in Internet Explorer, and had to be tweaked when each new version of that browser came out. We had long wanted to modernise it, but there were always other demands on development time.

Eventually I got a chance to try a proof of concept for the change. The application uses JSPs and Struts action classes, and the brief was to continue using these as much as possible. I decided to start with one of the main display pages, the one that users spend most of their time in. The idea was to give a quick demonstration of what was possible; and it did, to a point. But what I hadn’t realised was that frames are not part of HTML 5. There are ways to keep using them, but it’s not easy, and not good practice.

So while the new look and feel of a single page was clear, it was far from clear how the various pages would interact, how they would be brought together to form the whole UI, without frames.

If I were to start the project again now, my first step would be to work out how to link the pages together into a single interface, in the absence of frames. Most likely I would use one or other of the forms of JSP includes.

However, if there was the budget to do a more complete rewrite — by which I mean one that did not necessarily seek to use the existing JSPs — I would probably make much greater use of JavaScript and Ajax, and use the action classes just to provide data to the Ajax calls.

What is your approach to testing, and how would you test your application?

I would use a mixture of automated unit testing using JUnit, automated GUI testing, and actual user testing, if at all possible.

This fits well with what I was saying above. There are, broadly, three levels of testing: unit, integration, and system. Though writing automated unit tests is a development activity, rather than a testing one. Certainly we wouldn’t expect dedicated QA testers to work at the unit-test level.

So let’s assume that we have satisfactory unit-test coverage and we are interested in testing the application as a whole. Automation is obviously key here as well, both because it allows us to repeat the tests regularly, ideally for every check-in (and see below), and because it removes the need for testers to manually step through a written script, which is boring and error-prone.

I have used Selenium for automated GUI testing, with some success. It takes a significant amount of development work, because it’s doing a significant thing, but the effort should pay off.

However, even after all that, there is still no alternative to having someone sit down and actually use the application. Automated testing might pick up outright errors in how the user interaction works. But it won’t catch fine details like misaligned elements, typos in onscreen text, or just generally how it feels to use the application.

What are the benefits of Continuous Integration?

Continuous Integration takes us beyond the traditional daily build. It does more than just building, and does it more frequently than just daily.

At the simplest level it ensures that, for every commit, an incremental build of the complete product is made, and all the unit tests are run. In the most advanced case, as well as building and testing, the product can be deployed to test servers and integration tests such as the automated GUI tests mentioned above can be run. Realistically those tend to take longer, so it’s unlikely that you would do them for every commit, but they can certainly be run multiple times daily.

So we get the following benefits:

  • frequent builds catch problems in code integration;
  • unit tests are run frequently, catching any regressions;
  • integration tests are run regularly, catching other problems;
  • general confidence in the product is increased;
  • developers are happy to commit changes frequently.

Software patents: dead in Europe

In other good news, over on BoingBoing, Cory is telling us that Euro software patents are dead:

The European Parliament voted 648 to 14 to reject the Computer Implemented Inventions Directive.

The bill was reportedly rejected because, politicians said, it pleased no-one in its current form.

Responding to the rejection the European Commission said it would not draw up or submit any more versions of the original proposal.

This is excellent news, though as Cory goes on to say,

Software patents have been staked through the heart before, but they keep rising from the grave. There’s too much monopoly rent waiting to be extracted by anti-competitive companies for them to simply give up and go home. The price of liberty is eternal vigilance.

A year or so ago the number one or two hit on Google for “software patents” was an article by an old friend of mine, John Gray, who is a Patent Attorney, in favour of them.  With well-reasoned arguments, as I recall.  Sadly the article appears to have gone now, though links to it remain.  Such is one of the weaknesses of the web, unfortunately, when you can’t trust (some) publishers to keep their URLs pointing at something.

Update: asajeffrey found a mailing list post that, if not John’s article that I was thinking of, certainly discusses the same ideas.  Thanks, Alan.  Note that I am not the “Martin” referred to in that post.