
The Bandwidth of Sneakernet to the Cloud

Just what is the bandwidth of a van full of hard drives traveling 300 miles at a speed of 65 mph?


After a short Twitter discussion based on this post, which suggested Ye Olde Sneakernet is the best way to transfer large data sets from the enterprise to the cloud (a suggestion that is, unfortunately, not as uncommon from cloud providers as you might think), I was dared to compute the actual bandwidth of said sneakernet. Probably because I said I had the urge to do just that, but is that really important? I didn’t think so.

I have a hard time passing up dares like that, but you knew that, didn’t you? So let’s get out our papers and dust off the old math textbooks, shall we?


HERE COMES THE MATH…


Our van is a 2008 Dodge Caravan with an interior carrying capacity of 143.8 cubic feet. There are 1,728 cubic inches in one cubic foot, which means roughly 248,486 cubic inches are available for our hard drives.

Our hard drives are Maxtor BlackArmor Portable hard drives, 160 GB each. Dimensions are 5.17" H x 3.32" W x 0.67" L, for a total of roughly 11.5 cubic inches. Let’s say 12 cubic inches per drive, just to make the calculations a little easier.

That means we can fit 248,486 / 12 ≈ 20,707 hard drives in the back of our van. That’s a total of 20,707 × 160 GB = 3,313,120 GB.
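
For anyone who wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python. Nothing new is assumed; the capacity and dimension figures are just the ones quoted above:

    # How many portable drives fit in the van, and how much data do they hold?
    CUBIC_INCHES_PER_CUBIC_FOOT = 1728

    van_volume_in3 = 143.8 * CUBIC_INCHES_PER_CUBIC_FOOT  # ~248,486 cubic inches of cargo space
    drive_volume_in3 = 12     # 5.17" x 3.32" x 0.67" is ~11.5 cubic inches, rounded up to 12
    drive_capacity_gb = 160   # one Maxtor BlackArmor Portable drive

    drive_count = int(van_volume_in3 // drive_volume_in3)  # ~20,707 drives
    total_gb = drive_count * drive_capacity_gb              # ~3,313,120 GB

    print(f"{drive_count:,} drives carrying {total_gb:,} GB")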

Okay, that was the hard part. The rest should be fairly straightforward.

The van is traveling at 65 mph. We assume no congestion (traffic jams) for the purposes of this illustration.

It will take the van about 4.6 hours to reach its destination, which means roughly 16,600 seconds to arrive at “the cloud”. Given the amount of data we’re transferring, that works out to an effective transfer rate of roughly 200 GB per second, or about 1.6 Tbps.
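
And a minimal sketch of the throughput math, carrying over the drive count from the step above:

    # Effective "bandwidth" of the van: payload divided by driving time.
    total_gb = 20_707 * 160            # ~3,313,120 GB, from the drive count above
    distance_miles = 300
    speed_mph = 65

    trip_seconds = distance_miles / speed_mph * 3600   # ~16,615 s, about 4.6 hours

    gb_per_second = total_gb / trip_seconds             # ~200 GB/s
    terabits_per_second = gb_per_second * 8 / 1000      # ~1.6 Tbps

    print(f"{gb_per_second:.0f} GB/s, or about {terabits_per_second:.1f} Tbps")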

That sounds pretty good, until you consider the latency incurred from a pit stop, and I’m not sure there’s a way (yet) for application delivery to address that.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.