Simply Statistics: A statistics blog by Rafa Irizarry, Roger Peng, and Jeff Leek

Do you own or rent?

When it comes to computing, history has gone back and forth between what I would call the “owner model” and the “renter model”. The question is, which approach is best, and how do you determine that?

Back in the day, when people like John von Neumann were busy inventing the computer to work out H-bomb calculations, there was more or less a renter model in place. Computers were obviously quite expensive, so not everyone could have one. If you wanted to do your calculation, you’d walk down to the computer room, hand over the punch cards with your program written out, and they’d run it for you. Sometime later you’d get a printout with the results of your program.

A little later, with time-sharing machines, you could log in to a central server from a dumb terminal and run your calculations that way. I guess that saved you the walk to the computer room (and all the punch cards). I still remember some of those green-screen dumb terminals from my grad school days (yes, UCLA still had these monstrosities in 1999).

With the personal computers of the 80s, you could own your own machine, so there was no need to depend on a central computer (and a connection to it) to do the work for you. As computing components got cheaper, these personal computers got more and more powerful and rivaled the servers of yore. It was difficult for me to imagine ever needing things like mainframes again, except for some esoteric applications. Especially with the development of Linux, you could have all the power of a Unix mainframe on your desk or lap (or now in your palm).

But here we are, with Jeff buying a Chromebook. Have we just taken a step back in time? Are cloud computing and the renter model the way to go? I have to say I was a big fan of “cloud computing” back in the day, but once Linux came around, I really didn’t see a need for the thin client/fat server model.

But it seems we are going back that way, and the reason appears to be mobile devices. Mobile devices are now just small computers, so many people own at least two computers (a “real” computer and a phone). With multiple computers, it’s a pain to have to synchronize both the data and the applications on them. If the devices are made by different manufacturers, you can’t even run the same operating system and applications on them. Also, no one cares about the operating system anymore, so why should it have to be managed? The cloud helps solve some of these problems, as does owning devices from the same company (as I do, Apple fanboy that I am).

I think the all-renter model of the Chromebook is attractive, but I don’t think it’s ready for prime time just yet. Two reasons come to mind: (1) Microsoft Office and (2) slow network connections. If you want to make Jeff very unhappy, you can either (1) send him a Word document that needs to be edited with Track Changes, or (2) invite him to an international conference on some remote island. The need for a strong network connection is a problem because I’ve yet to encounter a hotel with a connection fast enough for me to work remotely on our computing cluster. For that reason, I’m sticking with my current laptop.