Tuesday, March 23, 2010


Why are computers so hard? I ended up fiddling with iTunes again this morning. This time it was with a bunch of music files that for some reason had an invisibility flag set on them. Were it not for the fact that I was working in a terminal with primitive terminal tools, I would NEVER have known those files existed, aside from the fact that a huge chunk of my disk was gone. The problem with them being hidden, though, is that iTunes wouldn't pick them up, so large chunks of my music library appeared to be missing (e.g. almost all of Radiohead). I ended up running an arcane command, "chflags", on my whole music directory to unhide those files. Seriously? On a Mac? "Better, faster, easier?" There's no way on earth most computer users would've figured that one out. Granted, most computer users wouldn't have had my problem because I was copying files around onto/from a network drive, but I didn't do anything terribly advanced and I can imagine users getting tripped up by this.
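For the record, the fix looked something along these lines (my reconstruction, not a transcript of what I actually typed; chflags and ls -lO are BSD/macOS tools, and the scratch directory is just for illustration so nothing here touches a real music folder):

```shell
# Work against a scratch directory instead of the real music library.
demo=$(mktemp -d)
touch "$demo/song.mp3"

# chflags only exists on BSD/macOS, so guard it. "ls -lO" adds a
# per-file flags column, where an invisible file shows "hidden".
if command -v chflags >/dev/null 2>&1; then
    chflags hidden "$demo/song.mp3"   # simulate the mystery flag
    ls -lO "$demo"
    chflags -R nohidden "$demo"       # the actual fix, applied recursively
fi

ls "$demo"   # song.mp3 was never gone, just flagged
rm -rf "$demo"
```

The point being: the file was there the whole time; only a single metadata flag stood between "library intact" and "where did Radiohead go?"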

The problem, I think, is that in computers/software we work harder to hide complexity than we do to remove complexity. We thrive on abstraction and encapsulation and the result frankly is magical when you think about it. The number of lines of code that cooperate for me to type this alone is staggering (code in the operating system managing memory, and disk, and network layers... code in my web browser... code in the device drivers that collect my keystrokes... code in the display driver that powers my monitor... you get the picture). Generally we're entirely insulated from all of that hidden complexity, but when the abstraction fails the result is painful. What would it look like if we spent more time throwing away code and removing layers than we did writing new code?

What if smart people sat down and wrote an operating system and focused like a laser on keeping things simple? What if they built it with a child in mind, or a grandparent, or another novice computer user? What if they sought to control complexity rather than abstract it away?

Take, for example, "hidden" files. Why on earth do we need hidden files? More likely than not, real users have NEVER used that feature and have only been hurt by it. It exists as an attempt to encapsulate system files. Microsoft and Apple said, "Hmmm, all of these files in /System or in C:\windows look pretty scary, and users would be pretty messed up if they deleted them. What can we do?" And the abstraction of hidden files was born. So rather than reducing the need for those files (you can't eliminate them entirely, obviously), or putting them in one location which is never shown, or archiving them together in one bundle and hiding that... we got file-level hiding: beautiful but useless encapsulation that hid complexity without reducing it, and it sits there waiting to bite.

Thursday, March 4, 2010

Laptop Memory

I upgraded my MacBook Pro to 4GB of RAM (a whopping $100!) and, while this should absolutely be no surprise to me, my machine feels much snappier. It feels like it has to be a psychological effect, particularly since I don't typically have a lot of memory hogs open at once. However, it certainly seems to be the case when launching apps, switching between apps, jumping around Spaces, etc. It's hard to tell by reading the old RAM's label, but my guess is that the new RAM I put in is faster. Either way, it's nice to spend $100 on a "new" laptop ;)

Tuesday, March 2, 2010

Automated "Refactor-needed" Detection?

I've been working on and maintaining a site for tracking Cub Scout progress called Den Manager (not terribly exciting if you don't have an account). I first wrote the site in Rails almost four years ago. About two years ago I did a pretty major refactor to reorganize the achievements so that it'd be easier to add additional awards. Finally, a month ago I migrated the site to the latest version of Rails and made a few "minor changes." This latest round of changes didn't go so smoothly, and yes, I'm to blame. I didn't buy into the Rails religion completely in 2006, so I've got basically no test coverage (I'm working to rectify that now, but it's hard to backfill tests on years of work, some of which has been through multiple refactors). But I don't want to talk about my personal failings in this post ;)

As I was fixing these latest problems I had an interesting thought: the need for a refactor can be spotted when a small change causes many breakages in different places. In my case I changed one attribute on a class from a string to a symbol and it broke in dozens of different places (many of which I didn't spot for a few weeks because of missing test coverage). With better coverage, I wondered, could you build a system that monkeyed with the code, intentionally introducing random bugs, and then counted the number of breakages? You'd run a system like this in a CI-type environment, and it would report which module each bug was introduced into and how many test failures it triggered. Sort by the number of failures and you have potentially interesting candidates for refactoring.
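This is close to what the testing literature calls mutation testing, turned on its head to score modules instead of tests. A minimal sketch of the counting loop, using a toy shell function in place of real application code (the function, the "mutation", and the two tests are all invented for illustration):

```shell
#!/bin/sh
# A toy "module": one function, plus a mutated copy with one deliberate bug.
original='add() { echo $(( $1 + $2 )); }'
mutated='add() { echo $(( $1 - $2 )); }'   # one crude mutation: + becomes -

# Run the (toy) test suite against a given version and count failures.
count_failures() {
    eval "$1"
    failures=0
    [ "$(add 2 3)" -eq 5 ] || failures=$((failures + 1))
    [ "$(add 0 7)" -eq 7 ] || failures=$((failures + 1))
    echo "$failures"
}

count_failures "$original"   # prints 0 -- the real code passes
count_failures "$mutated"    # prints 2 -- both tests depend on add()
```

A real version would loop over application files, apply one mutation at a time (a sed one-liner swapping == for != would do for a start), run the full suite, restore the file, and report failure counts per module.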

iPad Keynote

I just got around to watching Steve's keynote for the iPad and, while I confess that I'm a card-carrying Apple zealot (two laptops, a mini, two iPhones, a bunch of iPods, and a Cinema Display), it was a bit silly how often Steve said "in the palm of your hand." The other thing I found interesting was that within the first minute of the presentation, while browsing the web, he demoed how Apple products don't play nicely with Flash. Hard to say if it was an intentional (though subtle) dig at Adobe or just a not terribly well planned demo. If it was a dig, it's not clear that customers will see it that way. Average users (the kind of users who have been Apple's key to success historically... "it's so easy") don't give a rip whether it's built in Flash or HTML5 or Obj-C, so honestly, the fact that the NYT site uses Flash and doesn't work on the iPad just makes it look like the device doesn't work.