Friday, March 16, 2012

#WindowsError timing couldn't have been worse. Apple products #ForTheWin!

EXCLUSIVE: Steve Wozniak in Line for iPad 3 at Apple Store - YouTube

How great is this?!

Woz is THE man. He doesn't have to wait in line, but he's such an integral part of g33k culture and truly LOVES technology and electronics, so he does this for the experience. He is the most awesome, passionate nerd ever :)

Posted via email from Tony Burkhart

Super-Secret Google Builds Servers in the Dark | Wired Enterprise

This just makes me like Google even more. There's almost a cloak-and-dagger feel to it, which lends itself to an intriguing story.

Super-Secret Google Builds Servers in the Dark

Inside the massive data centers run by Equinix, the lights are on in some cages, but off in others. Photo: Peter McCollough/

Just how far will Google go to hide its custom-built data-center hardware from the rest of the world?

In one Silicon Valley data center, the company is apparently so paranoid about competitors catching a glimpse of its gear, it’s been known to keep its server cages in complete darkness, outfitting its technical staff like miners and sending them spelunking into the cages with lights on their heads.

“Many [companies] try to keep things covered up. There’s a lot of valuable intellectual property in here,” says Chris Sharp, general manager of content and cloud at Equinix, as he walks through the company’s data center. “But we were always amazed by Google and the helmets.”

Google is one of many big-name web outfits that lease data-center space from Equinix — a company whose massive computing facilities serve as hubs for the world’s biggest internet providers. All the big web names set up shop in these data centers, so that they too can plug into the hub. The irony is that they must also share space with their biggest rivals, and this may cause some unease with companies that see their hardware as a competitive advantage best hidden from others.

About two years ago, Chris Sharp says, Google unscrewed all the light bulbs inside the hardware cages it occupied at that Equinix data center. “They had us turn off all overhead lights too, and their guys put on those helmets with lights you see miners wear,” he tells Wired. “Presumably, they were bringing up custom-built gear they didn’t want anyone else to see.”

Google declined to comment on Sharp’s little anecdote. But the tale is not surprising. Google designs its own servers and its own networking gear, and though it still leases space in third-party data centers such as the Equinix facility, it’s now designing and building its own data centers as well. These designs are meant to improve the performance of the company’s web services but also save power and money. More so than any other outfit, Google views its data-center work as an important advantage over competitors.

That said, Google has actually loosened up in recent years. In 2009, the company opened a window into the first custom-built data center it had built five years before, and it has discussed parts of its newer facilities. But many of its operations remain a mystery.

Some believe this should change. Facebook now designs its own data centers and servers, and as a direct response to Google’s approach, the social-networking outfit has “open sourced” its designs, hoping to encourage collaboration on designs across the industry. This, Facebook says, will allow the rest of the world to save power in much the same way Google has done and ultimately, well, save the planet.

Several companies have already embraced this effort, including Netflix, the Texas-based cloud provider Rackspace and Japanese tech giant NTT Data. But others still prefer to keep their secret hardware secret.

Hosting With The Enemy

Amazon, for instance, takes a Google-like approach. The company says very little about the facilities it runs or the hardware in those facilities. Apparently, the company is working with server sellers such as ZT Technologies to customize its servers, and it has followed Google’s lead in constructing its data centers with modular shipping containers. But it’s unclear just how far the company has gone towards designing and building its own hardware.

This week, the internet is rife with speculation about just how many machines back the company’s Elastic Compute Cloud service.

At Google, employees sign strict non-disclosure agreements that bar them from discussing what goes on inside the company’s data centers — and apparently, this agreement is open-ended. That alone puts a lid on Google’s deepest secrets. We’ve seen the NDA in action — many times. But for Google, and others, there’s an added problem when they set up shop in a “co-location” facility like the data centers run by Equinix.

The nature of the beast is that you’re sharing space with competitors. Equinix trumpets its data centers as places where the giants of the web can improve performance by plugging their gear straight into the world’s biggest internet carriers — and into each other. The company began life offering a service — the Internet Core Exchange — that connected all the major internet service providers, and now it lets other outfits plug into this carrier hub.

According to Sharp, more than 70 carriers use the company's main data center in San Jose, California. "We were a place for network operators to efficiently hand off traffic, and that's the legacy that created Equinix," Sharp says. "Not only are networks leveraging that to talk to each other, but [websites] are too."

Security is high in the company’s facilities. Hand geometry readers — biometric scanners that read the shape of the entire hand, not just the fingerprints — guard access to the data center floor. There’s a security camera looking at you every time you turn around. And each company can contain its gear in its own cage, protected by still more hand readers. But if you’re on the floor, you can peer into the cages. For cooling purposes, they’re not walled off.

While some companies proudly display their logo on the side of their machines, the Googles of the world do their best to hide themselves. To keep competitors from eyeing their gear, Sharp says, many companies keep the lights off inside their cages when no one’s working in them. But others go even further.

Amazon's beautiful Sterling, Virginia, data center. Photo: Eric Hunsaker/Flickr

It’s one of Amazon’s best-kept secrets. How many computers does it take to keep its Elastic Compute Cloud platform afloat?

And now, a researcher with Accenture thinks he has the answer: 445,000. That’s the number that Huan Liu came up with when he did a bit of internet sleuthing. “It’s a fairly big site; it’s pretty impressive,” he says of the entire EC2 operation.

EC2 is Amazon’s pay-as-you-go computing service. It’s become a popular way to spin up computing power for a corporate skunkworks project or a startup, but it’s also the back-end for serious online sites, including Netflix and Dropbox.

Liu’s analysis found that Amazon’s main cluster of data centers, located in northern Virginia, is truly massive: he guesses that Virginia is home to about 322,000 servers. But he also found that Amazon has a relatively small footprint in other parts of the world. For example, he guesses that there are only 1,600 EC2 servers in Sao Paulo, Brazil. It’s “hard to compete with Amazon on scale in the US, but in other regions, the entry barrier is lower. For example, Sao Paulo has only 25 racks of servers,” Liu wrote in a blog post discussing his findings.

Liu, a research manager with Accenture Technology Labs, took advantage of the way that Amazon organizes its EC2 domains to come up with his estimate, which strikes us here at Wired as a bit of a lowball guess.

Because Amazon relies heavily on virtual computing — that is, it can host several software-based “virtual” servers on each physical computer — figuring out the number of machines in Amazon’s data centers is a very tough task.

But Liu used a few tricks to link all of Amazon’s Domain Name System and IP addresses to actual server racks used by the Internet giant. Then, by guessing that each server rack has 64 machines in it, he came up with his total numbers.
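Liu's back-of-the-envelope arithmetic can be sketched in a few lines of Python. The rack figures below are illustrative: the Sao Paulo count of 25 racks is quoted in the article, the Virginia count is back-computed here from his ~322,000-server estimate, and the 64-servers-per-rack density is his own stated guess.

```python
# Sketch of Liu's estimation arithmetic (illustrative, not his actual code).
# Rack counts are assumptions derived from figures cited in the article;
# the 64-servers-per-rack density is Liu's stated "educated guess."

SERVERS_PER_RACK = 64  # could just as easily be 128 -- nobody knows for sure

racks_by_region = {
    "us-east (Northern Virginia)": 5031,  # back-computed: ~322,000 / 64
    "sa-east (Sao Paulo)": 25,            # figure quoted in the article
}

def estimate_servers(racks, per_rack=SERVERS_PER_RACK):
    """Multiply an observed rack count by an assumed rack density."""
    return racks * per_rack

for region, racks in racks_by_region.items():
    print(f"{region}: ~{estimate_servers(racks):,} servers")
```

Note how sensitive the total is to the density assumption: doubling it to 128 servers per rack would double every regional estimate, which is why Liu describes the final number as an educated guess rather than a measurement.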

He tells Wired that he’s “pretty confident” about the number of racks that Amazon uses. As to whether the company crams 64 or 128 servers in each rack? Well that, nobody knows for sure. “It’s an educated guess,” he admits.

The estimate also leaves out the servers that are powering Amazon’s Virtual Private Cloud, a hosting service for servers that are kept off the Internet, and which couldn’t be measured using Liu’s techniques.

Nobody knows for sure, but because it buys so many servers, Amazon has probably joined Google, Facebook, and others in coming up with custom, energy-efficient server designs that differ from what you’d see in most corporate data centers.

Amazon spokeswoman Kay Kinton declined to say anything about Liu’s work, saying the company doesn’t comment on “rumors and speculation.”
