Sent with Reeder
Thursday, June 28, 2012
Google has announced plans to start offering a compute-on-demand service that rivals Amazon’s Elastic Compute Cloud (EC2) service. Google has offered what it calls higher-level cloud services in the past, such as Google Storage, BigQuery and Google App Engine, but now the company believes it needs to sell a more prosaic Infrastructure as a Service (IaaS) offering whose primary target is attracting more developers to Google’s cloud platform. The news of this new service was first reported by my colleague Derrick Harris in May and was confirmed by me, with additional details, this past week.
“The Google Compute Engine, we believe, has been the missing piece,” said Urs Hölzle, Google’s senior vice president of technical infrastructure, during a broad conversation this week. He said that building an infrastructure-as-a-service offering isn’t a trivial task, as the demands on such a service are quite intensive. Google has been working on this new service for some time now, he added.
The focus of the Google Compute Engine is on performance, scale and value. To show its performance and scale, Google is planning to show off a genomics app that runs on 600,000 cores. Another app will use 10,000 virtual machines. And if that isn’t enough, the company says it will offer 50 percent more compute resources compared to other shared cloud infrastructures. Translation: It’s a shot across the bow of Amazon Web Services’ EC2 offering. (See image below for pricing.)
Developers can run any stack and any software on the service. The company is partnering with third-party services such as RightScale to add more tools and services to its platform. Google will initially offer the service in limited preview and will sell it through its sales force to customers looking for 100 or more cores. Eventually the service will be accessible with a credit card and a browser, like most cloud services. From Google’s blog post:
The capabilities of Google Compute Engine include:
- Compute. Launch Linux VMs on-demand. 1, 2, 4 and 8 virtual core VMs are available with 3.75GB RAM per virtual core.
- Storage. Store data on local disk, on our new persistent block device, or on our Internet-scale object store, Google Cloud Storage.
- Network. Connect your VMs together using our high-performance network technology to form powerful compute clusters and manage connectivity to the internet with configurable firewalls.
- Tooling. Configure and control your VMs via a scriptable command line tool or web UI. Or you can create your own dynamic management system using our API.
At launch, we have worked with a number of partners — such as RightScale, Puppet Labs, Opscode, Numerate, Cliqr and MapR — to integrate their products with Google Compute Engine. These partners offer management services that make it easy for you to move your applications to the cloud and between different cloud environments.
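The RAM figures in the launch lineup scale linearly with core count. A quick illustrative sketch of that arithmetic (this is not Google's API, just the numbers from the capabilities list above):

```python
# Launch-day Compute Engine machine types: 1, 2, 4 and 8 virtual cores,
# with 3.75 GB of RAM per virtual core, per Google's announcement.
RAM_PER_CORE_GB = 3.75
LAUNCH_CORE_COUNTS = [1, 2, 4, 8]

def ram_for(cores: int) -> float:
    """Total RAM for a machine type, in GB."""
    return cores * RAM_PER_CORE_GB

for cores in LAUNCH_CORE_COUNTS:
    print(f"{cores}-core VM: {ram_for(cores):g} GB RAM")
```

So the largest launch instance, at 8 virtual cores, comes with 30 GB of RAM.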
A company spokeswoman said that anyone can “sign up today, but we will be accepting customers who are focusing on larger workloads. In some cases we would accept smaller workloads as well.” During the early phase, Google will offer the compute service only to U.S.-based developers, but it will eventually roll out the platform to customers globally. Hölzle said that the company was using Google’s current infrastructure stack to offer the on-demand compute service.
Better late than never?
When asked if he believed that Google was a tad late to the party, Hölzle pointed out that while there have been many existing players offering cloud infrastructure services, there is ample opportunity for Google because the shift to the cloud is a long-term one. “This really isn’t about stealing market share from other players,” he said.
“I think we are early because the whole industry itself is in [its] infancy,” he said. “If you look at it, in the grand scheme of things, nearly 99 percent of the companies are not in the cloud.” Hölzle, however, said it was the right time for Google to enter the market. “More and more apps are being built for the web and mobile and the original storage and services are all moving to the cloud,” he said. The emergence of Chrome OS, Android and the iPhone has led to the point where cloud clients are becoming “stateless.”
“It is very early in the market and, frankly, five years from now you will have a whole different kind of cloud and services.” He declined to outline what the cloud will look like in five years.
Hölzle was reticent about predicting the level of adoption as well, but he was not shy about pointing out that Google has been in the infrastructure business for years, and that this is one of the key advantages for the company. “The market will show,” he said, and invited me to ask him the same question “two years from now.”
The great (cloud) game
Despite Google’s dismissal, one can’t help but notice Amazon’s looming shadow on Google. Amazon Web Services, thanks to being an aggressive and early believer in the cloud as we know it, has carved itself a nice niche and is rumored to be bringing in over a billion dollars in revenue. But it is not just revenue that has a whole industry jealous of Amazon’s cloud business.
Success in cloud services has made Amazon attractive to startups and independent app developers, who are embracing Amazon’s stack of cloud services. These code-tinkerers are the kingmakers in this new world, especially now that Amazon has forked Android and has been pretty public about its grand mobile ambitions.
The battle for developers and locking them into cloud and mobile platforms is literally the trillion-dollar question of the 21st century. Microsoft Azure, Apple’s iCloud, Amazon Web Services and now Google Compute Engine are essentially trying to get their hooks into the developers. Frankly, I am surprised that Facebook hasn’t announced its own efforts to do the same.
Amazon, for now, is the king of the hill. At our Structure 2012 conference, when I asked Amazon CTO Werner Vogels about the next five years, he talked about a new layer of services emerging and Amazon being the trendsetter. It is a distinct advantage for the Seattle-based company that has angered its partners but has been focused on making sure it keeps pushing the envelope. He understands — and so does Google — that there is an opportunity to take away the dollars spent on IT dinosaurs such as Hewlett-Packard.
These giants of the past should be waking up with a migraine, for the entry of Google makes life tougher for them. I wonder what this news does to smaller cloud players such as Rackspace that have been inching their way toward Amazon’s heels.
Nevertheless, Amazon knows it has no time to rest on its laurels. For instance, it is not going to let Google press the price advantage for long. “If you look back, we’ve lowered pricing 20 times, so the best thing to look forward to is we’ll continue to do that,” Vogels said in our onstage conversation. “That’s at least our goal.”
And whichever way you look at it, Google’s entry into the business is a good thing for developers and startups.
Related research and analysis from GigaOM Pro:
Subscriber content. Sign up for a free trial.
- Deploying big data: 2012 strategies for IT departments
- 2012: The Hadoop infrastructure market booms
- Infrastructure Q1: Cloud and big data woo enterprises
Tuesday, June 26, 2012
Apple’s taking a billion dollars and heading to Reno, but it’s going to avoid the slots: it plans to invest the money in a data center and a separate shipping and receiving office.
The proposed data center will actually be located east of Sparks, Nev., a little outside Reno. The business office is intended to be located in downtown Reno.
The Reno Gazette-Journal reported Tuesday from the Washoe County Board of Commissioners meeting where Apple presented its case for the Sparks data center. The paper quotes Mike Folks, an Apple spokesman, saying that the data center should be up and running before the end of 2012, and that Apple is looking for a “30-year relationship” with the area.
Apple will go before the Reno City Council to make its case for the business center on Wednesday.
According to the Reno Gazette-Journal, ”the data center east of town (called Project Jonathan) is estimated to generate up to 41 jobs as well as 200 long-term contractors. The project would generate about 580 direct construction jobs.”
As part of the agreement, Apple will be getting both sales and property tax breaks from the city and county. Apple, of course, is not a newcomer to Nevada or Reno — or its favorable tax policies. The New York Times recently focused on Apple’s small Reno office, run as a subsidiary called Braeburn Capital. The Times highlighted the small office as an example of how places like Nevada, which does not collect corporate taxes, can act as a tax haven for businesses like Apple.
There aren’t a lot of details about why Apple chose the unincorporated area of northern Nevada for the data center, which would be its fourth announced in the U.S., after Maiden, N.C., Prineville, Ore. and Newark, Calif. Apple has pledged to power its existing data centers with significant amounts of local, renewable energy sources, so the location could be tied to nearby solar farms.
Related research and analysis from GigaOM Pro:
Subscriber content. Sign up for a free trial.
- Controversy, courtrooms and the cloud in Q1
- New challenges for the IT organization
- CES 2012: a recap and analysis
$5B CAMO SNAFU - WWW.THEDAILY.COM
NATICK, Mass. — The Army is changing clothes.
Over the next year, America’s largest fighting force is swapping its camouflage pattern. The move is a quiet admission that the last uniform — a pixelated design that debuted in 2004 at a cost of $5 billion — was a colossal mistake.
Soldiers have roundly criticized the gray-green uniform for standing out almost everywhere it’s been worn. Industry insiders have called the financial mess surrounding the pattern a “fiasco.”
As Army researchers work furiously on a newer, better camouflage, it’s natural to ask what went wrong and how they’ll avoid the same missteps this time around. In a candid interview with The Daily, several of those researchers said Army brass interfered in the selection process during the last round, letting looks and politics get in the way of science.
“It got into political hands before the soldiers ever got the uniforms,” said Cheryl Stewardson, a textile technologist at the Army research center in Natick, Mass., where most of the armed forces camouflage patterns are made.
The researchers say that science is carrying the day this time, as they run four patterns through a rigorous battery of tests. The goal is to give soldiers different patterns suitable for different environments, plus a single neutral pattern — matching the whole family — to be used on more expensive body armor and other gear. The selection will involve hundreds of computer trials as well as on-the-ground testing at half a dozen locations around the world.
But until the new pattern is put in the field — a move that’s still a year or more away — soldiers in Afghanistan have been given a temporary fix: a greenish, blended replacement called MultiCam. The changeover came only after several non-commissioned officers complained to the late Pennsylvania Rep. John Murtha, and he took up the cause in 2009. Outside of Afghanistan, the rest of the Army is still stuck with the gray Universal Camouflage Pattern, or UCP. And some soldiers truly hate it.
“Essentially, the Army designed a universal uniform that universally failed in every environment,” said an Army specialist who served two tours in Iraq, wearing UCP in Baghdad and the deserts outside Basra. “The only time I have ever seen it work well was in a gravel pit.”
The specialist asked that his name be withheld because he wasn’t authorized to speak to the press.
“As a cavalry scout, it is my job to stay hidden. Wearing a uniform that stands out this badly makes it hard to do our job effectively,” he said. “If we can see our own guys across a distance because of it, then so can our enemy.”
The fact that the government spent $5 billion on a camouflage design that actually made its soldiers more visible — and then took eight years to correct the problem — has also left people in the camouflage industry incensed. The total cost comes from the Army itself and includes the price of developing the pattern and producing it for the entire service branch.
“You’ve got to look back and say what a huge waste of money that was,” said Lawrence Holsworth, marketing director of a camouflage company called Hyde Definition and the editor of Strike-Hold!, a website that tracks military gear. “UCP was such a fiasco.”
The Army’s camouflage researchers say the story of the universal pattern’s origins begins when they helped develop a similarly pixelated camouflage now worn by the Marine Corps. That pattern, known as MARPAT, first appeared in 2002 after being selected from among dozens of candidates and receiving plenty of input from Marines on the ground at the sniper school in Quantico, Va. The Marines even found one of the baseline colors themselves, an earth tone now called Coyote Brown.
“They went to Home Depot, looked at paint swatches, and said, ‘We want that color,’ ” said Anabelle Dugas, a textile technologist at Natick who helped develop the pattern. That particular hue, she added, was part of a paint series then sold by Ralph Lauren.
Around the same time, the Army was on the hunt for a new camouflage pattern that could solve a glaring logistical problem on the ground in Iraq. Without enough desert-specific gear to go around, soldiers were going to war in three-color desert fatigues but strapping dark green vests and gear harnesses over their chests. At rifle distances, the problem posed by the dark gear over light clothing was as obvious as it was distressing.
Kristine Isherwood, a mechanical engineer on Natick’s camouflage team, said simply, “It shows where to shoot.”
The Army researchers rushed to put new camouflages to the test — several in-house designs and a precursor of MultiCam developed by an outside company. The plan was to spend two years testing patterns and color schemes from different angles and distances and in different environments. The Army published results of the trials in 2004, declaring a tan, brushstroke pattern called Desert Brush the winner — but that design never saw the light of day.
The problem, the researchers said, was that an oddly named branch of the Army in charge of equipping soldiers with gear — Program Executive Office Soldier — had suddenly ordered Natick’s camouflage team to pick a pattern long before the trials were finished.
“They jumped the gun,” said James Fairneny, an electrical engineer on Natick’s camouflage team.
Researchers said they received a puzzling order: Take the winning colors and create a pixelated pattern. They were ordered to “basically put it in the Marine Corps pattern,” Fairneny said.
For a decision that could ultimately affect more than a million soldiers in the Army, reserves and National Guard, the sudden shift from Program Executive Office Soldier was a head-scratcher. The consensus among the researchers was that the Army brass had watched the Marine Corps don their new uniforms and caught a case of pixelated-camouflage envy.
“It was trendy,” Stewardson said. “If it’s good enough for the Marines, why shouldn’t the Army have that same cool new look?”
The brigadier general ultimately responsible for the decision, James Moran, who retired from the Army after leaving Program Executive Office Soldier, has not responded to messages seeking comment.
It’s worth noting that, flawed as it was, the universal pattern did solve the problem of mismatched gear, said Eric Graves, editor of the military gear publication Soldier Systems Daily, adding that the pattern also gave soldiers a new-looking uniform that clearly identified the Army brand.
“Brand identity trumped camouflage utility,” Graves said. “That’s what this really comes down to: ‘We can’t allow the Marine Corps to look more cool than the Army.’ ”
Friday, June 1, 2012
Venezuela bans private gun ownership
1 June 2012, last updated at 00:54 ET
Venezuela has brought a new gun law into effect which bans the commercial sale of firearms and ammunition.
Until now, anyone with a gun permit could buy arms from a private company.
Under the new law, only the army, police and certain groups like security companies will be able to buy arms from the state-owned weapons manufacturer and importer.
The ban is the latest attempt by the government to improve security and cut crime ahead of elections in October.
Venezuela saw more than 18,000 murders last year and the capital, Caracas, is thought to be one of the most dangerous cities in Latin America.
'Must do more'
The government has been running a gun amnesty in the run-up to the introduction of the new law to try to encourage people to give up their illegal arms without fear of consequences.
Besides the health of President Chavez, security is the main concern for voters ahead of presidential elections in October.
While voters don't seem to hold Mr Chavez responsible for the insecurity, the situation has worsened throughout his 13 years in office.
The government's most recent statistics put the murder rate at around 48 per 100,000, although some non-governmental organisations estimate it's much higher - 60 per 100,000 in 2011, one of the highest rates in the world.
Critics say the new gun laws and other recently announced measures, like a victims' compensation fund, are just the latest in a long line of failed attempts to bolster security.
One member of the public in Caracas told the BBC: "They're killing people every day. This law is important but they need to do more, they're not doing enough now."
Hugo Chavez's government says the ultimate aim is to disarm all civilians, but his opponents say the police and government may not have the capacity or the will to enforce the new law.
Criminal violence is set to be a major issue in presidential elections later in the year.
Campaign group The Venezuela Violence Observatory said last year that violence has risen steadily since Mr Chavez took office in 1999.
Several Latin American countries have murder rates far higher than the global average of 6.9 murders per 100,000 people.
According to a recent United Nations report, South America, Central America and the Caribbean have the highest rates of murder by firearms in the world.
It found that over 70% of all homicides in South America are a result of guns — in Western Europe, the figure was closer to 25%.
Obama Ordered Use of Stuxnet, Acceleration of Cyber Attacks Against Iran
Author David Sanger Says President Obama Ordered Wave of Cyberattacks Against Iran
According to a soon-to-be-released book by The New York Times' chief Washington correspondent, David Sanger, President Obama secretly ordered - and decided to accelerate - cyber attacks against systems that powered Iran’s prime nuclear enrichment facility, namely its Natanz plant. The famous attack, as we all know, was Stuxnet.
And according to a New York Times article authored by Sanger and adapted from his book Confront and Conceal: Obama's Secret Wars and Surprising Use of American Power, set to be released on Tuesday, Stuxnet was born under the Bush administration in 2006, and originally code named “Olympic Games”.
“Hawks in the Bush administration like Vice President Dick Cheney urged Mr. Bush to consider a military strike against the Iranian nuclear facilities before they could produce fuel suitable for a weapon,” Sanger notes. “Several times, the administration reviewed military options and concluded that they would only further inflame a region already at war, and would have uncertain results.”
In order to successfully execute their attack, U.S. officials felt as though they couldn’t do it alone, and called on Israel to help, mainly for technical expertise from a special unit of the Israeli armed forces, Unit 8200, which according to Sanger, had extensive intelligence on operations at the Natanz plant and would play a critical role in the cyber attack’s success.
Once the powerful Stuxnet worm was developed, the cyber weapon needed to be tested. Accordingly, the United States built replicas of the primary target, Iran’s P-1 centrifuges, described as “an aging, unreliable design that Iran purchased from Abdul Qadeer Khan, the Pakistani nuclear chief who had begun selling fuel-making technology on the black market.”
In July 2010, Stuxnet was discovered due to a programming error that allowed it to propagate around the Internet and fall into the hands of security researchers, who spent months analyzing it. It’s no surprise, as it has been widely speculated and assumed that the powers behind Stuxnet are the United States and Israel, but nevertheless, the developers did not want news of their cyber weapon to leak.
“At a tense meeting in the White House Situation Room within days of the worm’s ‘escape,’ Mr. Obama, Vice President Joseph R. Biden Jr. and the director of the Central Intelligence Agency at the time, Leon E. Panetta, considered whether America’s most ambitious attempt to slow the progress of Iran’s nuclear efforts had been fatally compromised,” Sanger explained.
President Obama reportedly questioned whether the attack should be shut down, but after being told it was unclear what details the Iranians knew about the worm, its code, and where it could have come from, Obama decided to continue the attack.
“The last of that series of attacks, a few weeks after Stuxnet was detected around the world, temporarily took out nearly 1,000 of the 5,000 centrifuges Iran had spinning at the time to purify uranium,” Sanger adds.
While the United States government has acknowledged that it is developing cyber weapons, it hasn’t officially admitted to putting them into action in an offensive manner.
“Mr. Obama, according to participants in the many Situation Room meetings on Olympic Games, was acutely aware that with every attack he was pushing the United States into new territory, much as his predecessors had with the first use of atomic weapons in the 1940s, of intercontinental missiles in the 1950s and of drones in the past decade,” Sanger concludes.
The story comes at an interesting time, as just this week news of Flame, another complex cyber weapon, emerged, again found targeting systems in Iran and the Middle East, though much wider in scope than Stuxnet and designed more to steal data than to affect physical systems.