t3knomanser's Fustian Deposits


Against the Cloud


Screw you, Cloud


I hate the term cloud computing. I'm generally opposed to the idea of it. I know that this goes against the current fad, but I really, really have a philosophical problem with it. Before I go into a long-winded history lesson to make my point, I want to address one thing first: the utter stupidity of the name "cloud computing". When IT people make diagrams, there are often areas or elements that are out of scope. For example, when talking about how data gets from your computer to LiveJournal and back, it's superfluous to explain all the routers and network hops in between. For most applications, I don't need that kind of detail, so I just draw a cloud there and label it "the Internet". A cloud means the details are unknown or unimportant to the problem at hand.

"Cloud computing" takes its name from that. "Where does the application live? The cloud!" A market-friendly way of saying, "Don't know, don't care," which conceals the fact that we do know and should care.

In the beginning...


Let's rewind the clock. A long, long time ago, computers were expensive. Processing power was expensive. Storage was expensive. Because of this, computing resources were jealously guarded. Slowly but surely, computers made the transition from batch processing to interactive sessions, and at that point something interesting happened. We finally reached the point where we could have programs that interacted with the user. In a time-sharing system, we could have a bunch of programs running and a bunch of users talking to them, and it all happened fast enough to seem instantaneous to the end users.

This was the birth of the dumb terminal era.

Dumb terminals were just that- keyboard, monitor, and a connection to the central computer located in a closet someplace else. Dumb terminals didn't have any computing power of their own- they were just a gateway into the real computer. Later on, some terminals became "smart"- they had limited processing capacity, but by then the era of the terminal was mostly over. Desktop computers replaced them, and could run "terminal emulators", which were dumb terminals implemented in software.

When I was in college in the late 90s, the campus IT group was working on banishing the dumb terminal from campus. During my time there, the VAX mainframe finally died, mostly, and with it went the dumb terminals. I enjoyed using them for the antique factor. They were such a bizarre and quaint way to interact with a computer. As a CS major, I was learning my way around the command prompt, so I could actually do a decent amount of my coursework this way.

Living on the edge


The reality is that people generally hated this arrangement. It was cumbersome, for starters. The network created another point of failure. As desktops got cheaper and cheaper, the character-driven command-line interface became less and less attractive.

And there were other advantages to the desktop. For starters, users could "own" their desktop in a way they couldn't on the mainframe. They could configure it and install software for their personal use. They didn't have to budget against quotas set by the system administrator. It gave the end users a lot more power, but had the downside of creating a lot more problems- how do you get these desktops talking to each other (nigh impossible in the DOS/Win16 days)? Sharing files and data? Printers? With the added power, users could bork their systems much harder, creating the need for specialized desktop support.

Power moved from the center to the edge, and this was viewed as a good thing. I think it's still a good thing- the innovation and productivity of the more decentralized approach to computing created an explosion of possible OSes. Linux could never have been born in a mainframe environment. There's a wealth of applications that arose thanks to the fact that the power was at the edges.

And processing power got dirt cheap. For a time, there were dreams of putting processors in everything from phones to refrigerators. The former turned out to be a good idea; the latter, not so much.

While this was going on, other things were changing. Storage was getting cheaper, too, and networking was getting cheaper and easier. As the desktop took over, from the 80s into the 90s, the Internet started growing. There was a feedback loop between the Internet and the desktop- the disconnected desktops needed to be strung together, and the more computers you strung together, the more valuable the network became.

Once you've got a bunch of computers strung together into a big network, people start finding uses for these connections. And people have certain instincts- sex being the primary one, and socialization (arguably a subset of sex, if you want to get reductionist about it). Among all the various network-driven services, we had some serious social services- from email through IM to web-based social networks, we started to see more and more serious stuff happening online instead of off.

Social networks were the birth of "cloud computing". The idea here is that connections have value, and that it's often difficult to connect disparate things. So let's centralize the connections between people into a single service. Hence MySpace, LiveJournal, Facebook, Twitter, etc.

This centralization had value to the users- it made it easy to connect with their friends (sex and socialization). It's good for large companies, too- those connections do have value, and data-mining users' interests and activities is a great way to watch trends and market products.

Hey, centralization. While this was going on, processing power and storage and bandwidth kept getting cheaper and cheaper. For a relatively small fee, you could buy a whole farm of powerful computers, string them together, and have a mountain of computing power at your disposal. And when you've got a mountain of computing power in one place, you've got the new mainframe.

The New Mainframes


I find it odd that we've come full circle. When people talk about "cloud computing", they're really talking about a new generation of dumb terminals and mainframes. The names have been changed to disguise it, but it's the same idea all over again.

We have big, powerful computers (relatively speaking) sitting in a closet someplace. We have not-so-beefy computers running terminal emulators- excuse me, web browsers- accessing these hosted services. Power is moving away from the edge, back to the center.

There are some advantages here. In the mainframe days, if I wanted to share a file with another user, it was easy- I just fiddled with permissions and anyone on the mainframe could see/edit it. In the desktop era, I had a harder time. Were we networked? Was the network working? Would Windows throw a fit? Maybe I'd need to sneaker-net it over (carry the file on a floppy), which was easy since files were smaller and less interesting back then- or, if I was really lucky, use my CD burner for a big file. And then let's hope the recipient could actually read the file.

The Internet didn't really make sharing files any easier. Oh, email works- up to a point. Attachment sizes are limited by most email systems, there are a bunch of security risks, etc. And when files get big, they get hard to share- a whole slew of unusual technologies have been invented to solve this problem.

And there's another weakness to living on the edge- your files are on your computer. If your computer is gone, out of service or inaccessible, where are your files?

The new mainframes solve this problem in the most paternalistic way possible. They'll hold your files for you. Isn't that nice of them? Storage is cheap, after all, so they can store a lot of files for a lot of people at a trivial cost. And then they can do clever things with them, like have a program read your files to match ads to what you're working on (but it's okay, nobody's snooping on you or anything- it's just an automated process). They can track traffic and model user behavior, sell corporate versions of their services, and let the average Joe Blow piggyback off of that. And since the terminal in this formula is the web browser, it's much easier to develop the software once for every platform than it would be with a thick client.

It's not inherently bad, but we need to stop and say, "Wait a second, why are we moving power back to the centralized model we abandoned a decade ago?" I adore GMail, and find that the web-based interface it offers is superior to any desktop email client I've worked with. I'm a fan of Evernote, simply because it uses the centralized model to sync data across a bunch of devices. But- Evernote and GMail both publish this data using standardized protocols that allow me to take it with me, if I want to. I'm not married to them.

We need to evaluate the tradeoffs, and we need to take an approach that gives us as many of the benefits as possible with as few of the disadvantages as possible.

My Dream Machine


When processors got cheap, people came up with all sorts of wacky things to do with them. Like mashing up appliances with computers to get ePliances. Nobody could really explain why my fridge needed to be a computer; it was just the next logical step, or something.

But there's a flip side to this. Think about your average appliance. What's the setup process? In most cases, you plug it in. Maybe you set the clock. Even more complicated devices, like gaming consoles, generally work from the instant you hit the power button, with minimal setup.

These are all appliances, at least in terms of their ease of configuration.

Is there a way we can mash this idea together with the advantages of "cloud computing"? Can we ditch the cloud and just keep everything on the edge? I think we could.

I have a vision of a device. The super-smart anti-terminal. It's small, maybe the size of a very large wall wart. Inside, it's got a lightweight processor, a decent sized laptop HDD or SSD, wireless Ethernet, maybe Bluetooth/wireless USB. It plugs directly into an outlet. It has a USB port or two on the outside.

Out of the box, it's got an embedded Linux server running. You plug it in, connect to its "configuration network", and walk through a wizard like you would when setting up your wireless router. It joins your wireless or wired network, it registers a domain name (or a free dynamic DNS subdomain) and points it at your network, and, as much as possible, it automatically sets itself up with port forwarding on your router.
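
To make that last step concrete, here's a minimal sketch of what the self-configuration could look like, assuming the router supports UPnP. It uses the miniupnpc Python bindings for the port forwarding; the dynamic-DNS update URL and hostnames are hypothetical stand-ins for whatever provider the device might ship with.

```python
# Sketch: automatic port forwarding (UPnP) plus a dynamic-DNS update.
# Assumes the "miniupnpc" Python bindings; the dyndns URL is a placeholder.
import miniupnpc
import urllib.request

def open_port(port: int, description: str = "anti-terminal") -> str:
    """Ask the home router (via UPnP) to forward a TCP port to this box.

    Returns the router's external IP so we can publish it via dynamic DNS.
    """
    upnp = miniupnpc.UPnP()
    upnp.discoverdelay = 200                  # ms to wait for responses
    upnp.discover()                           # find UPnP devices on the LAN
    upnp.selectigd()                          # pick the Internet gateway
    upnp.addportmapping(port, "TCP", upnp.lanaddr, port, description, "")
    return upnp.externalipaddress()

def register_hostname(hostname: str, ip: str) -> None:
    # Most dynamic-DNS providers accept a single HTTP GET like this;
    # "dyndns.example.com" is a placeholder, not a real service.
    url = f"https://dyndns.example.com/update?hostname={hostname}&myip={ip}"
    urllib.request.urlopen(url)

if __name__ == "__main__":
    external_ip = open_port(443)              # expose the device's web UI
    register_hostname("myhouse.example.net", external_ip)
```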

The entire point of all of this? All those cloud services- Google Docs, Twitter, etc.? They run here. Out of the box, it's got WordPress, Laconica, and a web-based office suite, and it acts as a remote file server. It can stream media, it can track BitTorrents. Using some of the open social networking standards, it becomes your social networking identity. It's accessible via the web and via phone client software, with options to mirror an encrypted disk image to an offsite backup.
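
That offsite mirror is the piece that keeps the convenience without handing your data to a third party: the remote host only ever stores ciphertext. As a rough sketch of the idea (assuming the third-party cryptography package; the paths are hypothetical):

```python
# Sketch: snapshot a data directory and encrypt it before it leaves the box.
# Assumes the third-party "cryptography" package; paths are placeholders.
import io
import tarfile
from pathlib import Path
from cryptography.fernet import Fernet

def encrypted_snapshot(data_dir: str, key: bytes) -> bytes:
    """Tar-gzip the data directory in memory, then encrypt the archive.

    Whoever holds the backup never sees anything but ciphertext, so the
    offsite host doesn't need to be trusted.
    """
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        tar.add(data_dir, arcname=Path(data_dir).name)
    return Fernet(key).encrypt(buf.getvalue())

if __name__ == "__main__":
    key = Fernet.generate_key()    # in practice, kept on the device itself
    blob = encrypted_snapshot("/srv/files", key)
    Path("/mnt/offsite/snapshot.tar.gz.enc").write_bytes(blob)
```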

Now we get the benefits of thin-client access, anywhere access, etc.- but without reverting to the mainframe age with slightly different branding.

All the pieces are there to make this technology work. What needs to happen is for someone to sit down and work out the mechanics so that it really is plug-and-play. It needs to be as easy to configure as a Wii. Not an easy task, especially considering the complexity of the problem being solved.

In Conclusion


I think "cloud computing" is a horrible term. It's centralized computing. And I think it's a bad thing. I think putting things on the edge, where the users live, is a much better idea than putting it on Google or Microsoft's servers.

And have you noticed that? Not to go all tinfoil hatty here, but have you noticed that it's giant companies that want to hold your data for you? They want everyone using thin clients on tiny little netbooks, and that gives them the power. They have the data, they have the processors and the storage. I'm not claiming that there's any conspiracy to weaken the public- but conspiracy or no, that would be the result.
  • Thank all the gods that I'm not the only one. I'm fighting the cloud battle at work and fortunately winning (so far).
  • (no subject) - jeramey
    • Oh, of course. Certain tasks should be centralized. Search engines wouldn't work very well as a distributed application.

      I'm not sure how I feel about smart-gridding appliances. But in any case, it's not really the same sort of idea- this isn't user focused stuff, but infrastructure. The rules for infrastructure are always different than user focused applications.

      Also, on a technical note, even if we didn't allow hinting, if appliances were a bit smarter about predictive consumption, it would allow better balancing on the power grid. For example, before the A/C kicks on its compressor, it shoots a message onto the grid announcing, "Hey, I'm about to do this." It would also allow the user to tie appliances together to cut down their consumption as well- running the A/C compressor and the refrigerator compressor at the same time is not only loading the grid, but probably inefficient- the waste heat from the fridge should be allowed to dissipate some before any attempts are made to cool the area.
    • Actually, distributed processing works well with monumental tasks like protein folding tests, signal processing and the like. I'm sure it'd work equally well with weather prediction -- all customers who run the client donate processing power as their way of paying for it...
  • Not unexpectedly, I agree with you entirely here -- and I'm vocal to everyone around me about how dumb it is, but they always call me "anti-progress". Ugh, I really don't want computing to go this way -- to lose control over everything, with all software subscription-based, etc.
    • The frustrating thing is that software as a service is what's anti-progress. It's progress only for the business pushing this garbage as a way of maximizing revenue.
  • Funny you should mention netbooks. They have enough processing power to manage everything you mentioned. I'm terribly mixed about the rest. Social network sites are useful, and they're hard to do without at least a little coordination. Gmail gives me features I want, and I don't have to manage any software. And yet, while using these tools, I occasionally think back to the British enclosure movement (17th century?) and the modern water buyouts. What's happening now does feel like a bunch of corporations swooping in, buying up the commons, and slowly closing it off. I think that's the part that worries me: the closing of the commons.