Thursday, September 6, 2012

Colocation Firm Telx Names Senior Vice President for Service Provider Markets

Telx has named Tony Rossabi senior vice president for service provider markets.

Colocation provider Telx announced on Thursday it has named Tony Rossabi senior vice president for service provider markets, a role in which he will focus on developing solutions for the expanding service provider market.

The move comes a few months after Telx appointed industry veteran Joe Weinman its senior vice president of cloud services and strategy.

Rossabi has more than a decade of international business and product development experience within IT and telecommunications organizations.

Telx recognizes network scalability, low latency and application performance as critical components for growth in the Service Provider sector.

Telx has tasked Rossabi with forming a senior team of professionals that will drive all aspects of the Telx go-to-market strategy and key initiatives in support of this segment.

Rossabi joins Telx from Tata Communications where he served as senior vice president of global carrier solutions. In this role he was responsible for leading global carrier solutions and business development initiatives.

In addition, Rossabi held positions with WilTel (Level 3), BellSouth Long Distance, and Telnext Communications.

“Telx is consistently deepening our roster of talented individuals to increase our ability to provide customers with excellent service and strategic direction,” said Eric Shepcaro, CEO for Telx. “Tony’s leadership experience building winning teams and driving business development on a global scale will further distinguish Telx as an industry leader. It is crucial for Telx to continue to deepen our expertise in the Service Provider segment and adding an industry veteran such as Tony to our team certainly is a strong push in this direction.”

Telx provides interconnection and data center services in strategic, high-demand North American markets.

The company owns and operates 17 C3 Cloud Connection Centers that serve more than 1,000 customers, including leading telecommunications carriers, ISPs, cloud providers, content providers and enterprises.

Telx is a privately held company headquartered in New York City with five facilities in the New York Metro area, two facilities in Chicago, two facilities in Dallas, four facilities in California (Los Angeles, San Francisco, and two in Santa Clara), and facilities in Atlanta, Miami, Phoenix and Charlotte, N.C.

Talk back: Have you ever formed a partnership through a platform such as the Telx Marketplace? Do you think such a partnership would benefit your hosting business? Let us know your thoughts in a comment.


Source : http://www.thewhir.com/web-hosting-news/colocation-firm-telx-names-senior-vice-president-for-service-provider-markets

Internal Politics May Have Kept Microsoft From Dealing Amazon A Serious Blow

Microsoft's latest Windows operating system for enterprise servers launched yesterday and should sell well.

Microsoft is calling it a cloud-friendly operating system.

But will its success cause Microsoft to win lots of new customers for its own Amazon-competitor, Azure?

Probably not, one source close to Microsoft told BI.

He says the new OS does not integrate with Windows Azure "in any concrete way."

He says internal politics kept the Windows and Azure groups from working closely together.

"So no, I don't think it'll help with Azure adoption in any way."

We asked Microsoft about new features integrated with Azure and were sent a list:

  • A feature in Windows Azure lets IT folks move apps between their own data center servers and Azure more easily.
  • WS 2012 and Azure can now securely communicate through a virtual private network.
  • And Windows Server 2012 and Windows Azure can be managed through a single management portal.

How is Azure doing as a product? Hard to say.

Microsoft has been shy about reporting exactly how many users it has for its cloud. In May, Azure marketing exec Bob Kelly said that Microsoft had customers in the "high tens of thousands" and was "adding hundreds a day."

Microsoft may have missed a good opportunity to beef up Azure's adoption.

That's because Windows Server 2012 has been getting good reviews. It offers a lot of improvements over its predecessor, and enterprises will probably adopt it sooner than they adopt Windows 8.

We asked Microsoft if politics between the Windows Server team and the Azure team kept them from working more closely together. This is Microsoft's response:

"Windows Server 2012 and Windows Azure are designed and developed for close integration to help customers build, manage and deliver applications across both their own datacenters and the Windows Azure public cloud, or in many cases both.  Windows Server and Windows Azure are the foundation of the Cloud OS, which offers customers common application development, virtualization, management and identity technologies that can be applied to both."

Are you a Microsoft insider with insight to share? We want to hear it! We are discreet. jbort@businessinsider.com or @Julie188 on Twitter.


Source : http://www.businessinsider.com/source-politics-could-be-hurting-microsofts-cloud-2012-9

Move Over, Marc Benioff: Aneel Bhusri Is King Of The Cloud

Workday, on the cusp of a highly anticipated IPO, will be giving a big boost to another hot up-and-coming cloud startup: Okta.

The two cloud companies announced on Wednesday that they are deepening their partnership.

Okta's service—itself delivered through the cloud—lets enterprise companies manage the security of all of their cloud services.

Workday offers cloud-based human resources apps.

So the Okta-Workday deal lets Workday offer its customers enhanced security, while Okta gets access to Workday's large customer base.

In the middle of it all is Aneel Bhusri.

Bhusri is an investor in Okta and sits on its board. He's also cofounder of Workday.

Okta was founded by Todd McKinnon, who learned the ropes from Marc Benioff as an early employee at Salesforce.com. While Benioff gave birth to the software-as-a-service phenom, Bhusri has quietly built an empire of some of the most successful next-gen SaaS companies around.

In addition to Workday and Okta, look at these others that Bhusri invested in:

ServiceNow: In a season of great enterprise IPOs, ServiceNow's June stock-market debut stood out, earning the company a whopping $3.5 billion valuation. It was the first company to go public after Facebook.

Cloudera: Arguably the biggest of the Hadoop startups, founded by ex-Yahoo bigwigs. Cloudera was the one that snared Hadoop creator Doug Cutting away from Yahoo—even though Yahoo has its own Hadoop spin-out, Hortonworks.

Zuora: A hot cloud service that does invoicing and billing. It was founded by another early Salesforce.com alum, Tien Tzuo, with Benioff's blessing (and some of his money).

Pure Storage: Makes enterprise storage systems entirely out of flash memory. It's led by ex-Yahoo exec Scott Dietzen, a cofounder of Zimbra (bought by Yahoo for $350 million in cash in 2007). Dietzen is also a board member at Cloudera.

Tidemark: A younger cloud startup that came out of stealth mode only about a year ago, but with some power behind it. It was founded by ex-SAP exec Christian Gheorghe and last month landed Phil Wilmington as its COO. Wilmington had been co-president at PeopleSoft until it was acquired by Oracle. Gheorghe and Wilmington had run a previous startup together, OutlookSoft, which SAP acquired in 2007.

Wilmington and Bhusri are former coworkers, too. In what has become tech industry legend, Bhusri started his tech career in the early 1990s at PeopleSoft, working his way up to vice chairman. He and PeopleSoft founder Dave Duffield unsuccessfully tried to save PeopleSoft from Oracle CEO Larry Ellison's hostile takeover.

But now, Bhusri could be having the last laugh. Thanks to his successful cloud startups, Oracle is threatened, as is SAP, another software giant PeopleSoft competed with.

Both are spending billions of dollars on cloud startups to catch up with the likes of Workday.

Some of that money may land in Bhusri's pockets. So he's set to win either way.

It's good to be king of the cloud.

Don't miss: The Next 25 Big Enterprise Startups 


Source : http://www.businessinsider.com/aneel-bhusri-workday-cloud-computing-2012-9

Cloud Spectator Study Finds Internal Network Stability Crucial to Overall Cloud Performance

Cloud Spectator shows Zunicore cloud is much more stable than Amazon EC2

PEER 1 Hosting’s cloud division Zunicore outperforms Amazon EC2 in terms of the stability of its internal cloud network, according to a report released last week by cloud consulting firm Cloud Spectator.

While CPU performance is roughly comparable between Zunicore and Amazon EC2, Cloud Spectator sees a drastic difference in their respective internal network performance. Cloud Spectator says the actual speed of the internal virtual private network is crucial to a customer’s cloud IaaS purchase decision.

It is worth mentioning that Cloud Spectator was not paid to write this report, even though sponsorship is a common practice in research that compares two competing services. As prospective customers look for cloud options for their business, they may consult third-party, independent studies to help inform their purchasing decisions. With this in mind, it is a good idea to keep informed about the different performance studies available to end customers, as well as the methodology behind them.

“When multi-server applications must maintain a stream of communication, the bottleneck that may occur in a network inside the VPN is a major consideration, and that performance must be reliable for complex and mission-critical applications running on the cloud,” Cloud Spectator writes in its report. “An application that must constantly read and write data from one or many servers depends on the speed of the internal network.”

The CloudSpecs Performance Test System is a software suite of open-source, industry-standard server performance tests that can be run four times a day, 365 days a year. For this report, Cloud Spectator averaged performance over the 30 days between July 30, 2012 and August 29, 2012.
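
The report does not publish its exact tooling, but the aggregation step it describes is simple to picture. Here is a minimal sketch (in Python, with invented sample numbers rather than Cloud Spectator’s data) that reduces repeated throughput measurements to an average plus a variability score, the kind of metric that would flag a fluctuating internal network:

```python
from statistics import mean, stdev

def summarize_throughput(samples_mbps):
    """Collapse repeated benchmark runs (e.g. 4 per day over 30 days)
    into a mean and a coefficient of variation (0 = perfectly steady)."""
    avg = mean(samples_mbps)
    cv = stdev(samples_mbps) / avg  # relative variability across runs
    return {"mean_mbps": round(avg, 1), "coeff_of_variation": round(cv, 3)}

# Invented numbers, for illustration only.
steady_provider = [940, 951, 938, 945, 949, 942]
fluctuating_provider = [910, 430, 880, 250, 905, 610]

print(summarize_throughput(steady_provider))       # low variation: stable network
print(summarize_throughput(fluctuating_provider))  # high variation: red flag
```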

“Apart from the fact that Zunicore outperforms Amazon in every one of our CloudSpecs performance measurements gauging the performance of the internal networks, the idea that Amazon’s network performance fluctuates so drastically over time should fire up a red flag. Other notable performers with steady network performance that we have measured include Rackspace and Hosting.com,” Cloud Spectator writes.

Reports such as this one also make a case for customers choosing a different cloud provider over Amazon. In a recent feature on the WHIR, SoftLayer explained how it wins cloud customers from Amazon.

The data Cloud Spectator collects through its testing process is used to measure the value and performance of different cloud providers. The full test results and methodology are explained in detail on its blog.

In July, the WHIR hosted a webinar with Cloud Spectator analysts and OnApp CMO Kosten Metreweli where OpenStack Essex was compared to OnApp Cloud. The archived webinar is available for viewing at the WHIR.

Talk back: How does performance of your cloud stack up? Do you think customers will be more aware of internal network performance with reports like these? Let us know in a comment.


Source : http://www.thewhir.com/web-hosting-news/cloud-spectator-study-finds-internal-network-stability-crucial-to-overall-cloud-performance

Cloud traps: How to avoid them

Moving to the cloud is still a big challenge for many companies. For all its potential benefits - cost optimizations, greater reliability, improved scalability, easier access to computing resources - there are equally valid concerns that keep companies from making the move: security issues still plague several cloud solutions, and vendors’ track record of achieving promised SLAs has been less than stellar, among others. Deciding to make the move is only half the battle, though.

Even if they manage to properly address all the concerns related to the cloud, or to find use cases where these concerns aren’t as important, companies may still be hard pressed to fully realize the benefits of cloud computing. This is a direct result of two cognitive traps: the trap of ownership and the trap of past successes.

The ownership trap

People are hardwired to value the things they own, and to have a hard time letting go of things once they have developed a sense of ownership (even if it isn’t real). In the cloud, this translates into users developing a sense of ownership over their cloud servers, turning resources that should be rented on a pay-as-you-go basis into “dedicated virtual servers” for which people pay hundreds of dollars every month. In the end, this sense of ownership creates a reluctance to shut down existing servers, eliminating the cost optimizations a cloud model could deliver.

This is what I call the ownership trap. Instead of using the utility computing model that the cloud enables, companies end up having several underused virtual servers on-line, replicating the inefficiencies of a conventional data center at a much higher cost. And several vendors encourage this, offering “special” monthly prices that guarantee the full availability of the requested resource. If the monthly price is 50% of what you would pay to keep that server online the whole month on a pay-as-you-go model, but you only actually use that server for 10% of the month, you are still overpaying by a large margin.
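
To put numbers on that claim, here is the arithmetic as a quick sketch (the hourly rate is invented for illustration and is not any vendor’s actual pricing):

```python
HOURLY_RATE = 0.50          # assumed pay-as-you-go price per server hour
HOURS_IN_MONTH = 730

full_month_on_demand = HOURLY_RATE * HOURS_IN_MONTH        # ~$365: always on
reserved_monthly_price = 0.5 * full_month_on_demand        # the "special" 50% deal
actual_usage_cost = HOURLY_RATE * (0.10 * HOURS_IN_MONTH)  # used 10% of the month

print(f"Reserved deal:      ${reserved_monthly_price:.2f}")   # $182.50
print(f"Pay-as-you-go use:  ${actual_usage_cost:.2f}")        # $36.50
print(f"Overpayment factor: {reserved_monthly_price / actual_usage_cost:.0f}x")  # 5x
```

At 10% utilization, the “half price” deal still costs five times what the same usage would cost on a pure pay-as-you-go basis.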

The only way to beat this trap is by learning to let go. It’s not that companies shouldn’t own any servers, but rather that they can skip purchasing servers to handle peak demand, and rely instead on temporary servers that can be quickly brought online and then later shut down. The same goes for test servers: for most companies, there is no real need to keep a test server online 24×7. It can be spun up, used, and then have an image saved and shut down. This is what leads to cost optimizations.
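
As a rough sketch of that spin-up, snapshot, shut-down cycle, here is how it might look against a cloud API. The example uses AWS’s boto3 SDK for concreteness; the image ID, instance type and region are placeholders, and any provider that can launch, image and terminate servers follows the same pattern:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Spin up a temporary test server (image ID and type are placeholders).
run = ec2.run_instances(ImageId="ami-0123456789abcdef0",
                        InstanceType="t3.medium", MinCount=1, MaxCount=1)
instance_id = run["Instances"][0]["InstanceId"]
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])

# ... run the test suite against the instance here ...

# Save an image so the exact environment can be recreated on demand...
image_id = ec2.create_image(InstanceId=instance_id, Name="test-env-snapshot")["ImageId"]
ec2.get_waiter("image_available").wait(ImageIds=[image_id])

# ...then shut the server down so it stops costing money.
ec2.terminate_instances(InstanceIds=[instance_id])
```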

The trap of past successes

One of the top methods people use to solve problems is solving by similarity: they look to the past for problems similar to the ones they are facing, and apply the same solutions used before. This usually works, but in the case of the cloud, it can lead to big trouble. As developers got used to the client-server computing model, system architectures based on centralization (of data, of access control, and of other elements) became widespread, mainly because they worked.

A cloud environment, however, is closer to a distributed system than a centralized one, and solutions based on centralization therefore don’t perform very well. A centralized data store, for instance, can create a single point of failure for applications, reducing reliability. At the same time, it can create a performance bottleneck that restricts horizontal scalability. As scalability and reliability are two of the promises of cloud-based systems, employing centralized solutions can quickly destroy these benefits.
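
For illustration, here is a minimal (toy) sketch of the decentralized alternative: partitioning data across several independent stores so that no single node is the bottleneck or the single point of failure. Production systems layer replication and rebalancing on top of this idea:

```python
import hashlib

class ShardedStore:
    """Toy key-value store that spreads keys across independent nodes."""

    def __init__(self, nodes):
        self.nodes = nodes  # per-node dicts standing in for separate servers

    def _node_for(self, key):
        # Hash the key so each one lands deterministically on one node.
        digest = hashlib.md5(key.encode()).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

    def put(self, key, value):
        self._node_for(key)[key] = value

    def get(self, key):
        return self._node_for(key).get(key)

store = ShardedStore([{} for _ in range(4)])  # four "servers"
store.put("user:42", {"name": "Ada"})
print(store.get("user:42"))  # reads hit only the shard that owns the key
```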

To escape this trap, developers in particular need to recognize that cloud systems need to be designed and developed differently from conventional ones. Once we understand that the shift in computing paradigms leads to the need for new architectures and development techniques, it becomes plain that what worked before may well not work again.

Finding the right mentality

Escaping these traps is no easy feat. I’ve been working with cloud-based systems for over two years now, and every now and then I still catch myself falling into them. It requires constant attention and a sort of “reeducation” of IT professionals from all areas. At the same time, it is essential to achieving the full potential of cloud solutions.

I don’t mean that the other concerns about the cloud aren’t important. They are and, in most cases, they will actually determine if a move to the cloud is feasible or not. The right mindset, however, is necessary to keep companies from perceiving their cloud efforts as major failures.


Source : http://www.techrepublic.com/blog/datacenter/cloud-traps-how-to-avoid-them/5775

Wednesday, September 5, 2012

Cloud Host ElasticHosts Offers SSDs at All Server Instance Sizes

ElasticHosts says its new SSD options are available in its London Portsmouth zone and will roll out across its data centers in the UK, US and Canada

Cloud hosting provider ElasticHosts announced on Wednesday that it has improved its user interface, as well as introduced Solid State Drives at all server instance sizes.

Customers with high I/O applications and heavy databases will be able to provision SSDs that offer a higher level of performance. SSDs allow data to be accessed up to 250 times faster than standard hard disk drives, according to ElasticHosts.

Previously, ElasticHosts only offered cloud servers backed by traditional hard disk drives. Its new SSD options are available in its London Portsmouth zone and will roll out across its data centers in the UK, US and Canada.

Database storage and transaction processing are two examples of I/O intensive workloads that require speed and performance. More cloud providers and hosting companies are introducing SSDs to provide higher I/O rates to customers. It has become an increasingly popular technology for cloud providers to offer performance as a value-add for customers.

ElasticHosts customers can now choose to include SSDs as part of their cloud hosting package and get improved performance on a flexible, pay-as-you-go basis. SSD options are available to customers regardless of the amount of resources they use.

“We are pleased to offer the option of cloud hosting based on SSD drives to all of our customers,” Richard Davies, CEO of ElasticHosts said in a statement. “While HDDs are fine in most cases, in certain instances customers need greater performance. The SSD options will be particularly attractive to those customers looking to move high performance databases to a cloud based infrastructure-as-a-service model. As the cloud market matures, offering SSD is the next step for us. Some cloud providers have offered SSD with only their largest, most expensive server instance sizes, but we believe that all of our customers should be offered the benefits of improved computing power and greater resiliency.”

As for the updated user interface, ElasticHosts says it has simplified management of large accounts and improved response times. Advanced Ajax functionality means that customers can see changes made to their virtual servers in real time.

At the beginning of July, ElasticHosts launched its new white-label cloud program.

Talk back: Have you introduced SSDs in your cloud server offering? Why or why not? Let us know in a comment.


Source : http://www.thewhir.com/web-hosting-news/cloud-host-elastichosts-offers-ssds-at-all-server-instance-sizes

Like it or not, it's time to get a cloud strategy

Regardless of whether you are for or against adopting the cloud as part of your organization’s computing infrastructure and/or as a replacement for that tired, multitenant software architecture of yours, as an IT professional or executive, you need a cloud strategy!

Do you think I sound like a snake oil salesman? Already coming up with the same old excuses as to why the cloud isn’t a viable alternative to your on-premise data center? Well, you need to start getting over that bias, because you can’t argue with a multi-billion dollar industry that is progressively capturing a bigger share of the technology provider market. (Need evidence? Just take a look at the latest income statement from Salesforce.com, NYSE: CRM.) Additionally, if your competitors haven’t already considered an adoption strategy, they’re most likely in the process of migrating toward at least a hybrid approach to cloud computing. It’s time to do your due diligence and draw up a new SWOT analysis matrix, because your organization will inevitably pay more when it’s forced into the cloud years down the road, or lose business to competitors able to capitalize on your indifference.

As alluded to before, there is still quite a bit of distrust and cynicism when it comes to the cloud, especially from IT professionals who’ve grown complacent and nested themselves between power-hungry server racks. I don’t mean to sound cynical myself, but the truth of the matter is that on-premise environments actually hinder an organization’s ability to scale its technology in step with operational business value, or with product or service development. Furthermore, the more an organization concerns itself with the details of running IT services in house instead of on demand, the more stretched an already bloated IT budget will become. That is not to say that formulating a cloud adoption/migration strategy can’t become problematic and expensive (especially for organizations with legacy systems), but those with the power to make these decisions need to start asking themselves how much of their IT budget is dedicated to application development, and how much is spent on supporting the hardware and software needed to run those applications.

The same platitudes regarding the dependability, cost and security of cloud-based services just won’t cut it these days. If you’re an IT professional who reports to an executive and you’re throwing up a barrier between cloud adoption and your firewall, or you’re an executive who usually delegates these types of decisions to “the professionals”, stop! The cloud offers a profitable path, in monetary terms as well as in the unexpected benefits an organization can achieve, by adopting what is actually a more reliable and, plausibly, just as secure approach to computing.

In the coming weeks, I plan to make a concerted case for the cloud, and to show how the benefits can vastly outweigh any presumed risks. I’ll do this by discussing topics like how on-demand IaaS (Infrastructure as a Service) software and platforms can bring about change in your IT and non-IT staff alike. Hopefully, in the end, you’ll get the sense that by formulating a well-intentioned cloud strategy, your organization will become more productive, cost-effective and capable of allocating IT resources to scale with organizational goals.


Source : http://www.techrepublic.com/blog/datacenter/like-it-or-not-its-time-to-get-a-cloud-strategy/5770

Tuesday, September 4, 2012

Cloud File Sharing Software ownCloud Launches Beta of Updated Community Version

ownCloud has made several improvements to its community version of its cloud file sharing software

Cloud file sharing software ownCloud has made several updates to the community edition of its open source file sync and share project.

The latest version of ownCloud, released in beta on Tuesday, features the ability to assign a sub-administrator to a group to handle account management, faster desktop syncing, enhanced permissions, and support for shared calendars so end users can view shared calendars across multiple devices.

“This beta does a great deal to scale, enhance the user experience and increase control of ownCloud,” Frank Karlitschek, founder of ownCloud said in a statement. “Personally, I also really like the shared calendar functionality because it demonstrates clearly that we can live in a world of device and location independence securely and privately.”

In March, ownCloud launched a partner program that gives partners, including web hosts who distribute the platform, access to customer referrals, special pricing, marketing assistance and technical support.

ownCloud currently has more than 600,000 users worldwide. It is unclear how many users it has through its partners.

Its users can host their own data sync and share services on their own hardware and storage, use public hosting and storage offerings, or both. With the latest version, external cloud storage from providers like Dropbox or Google can be integrated into an ownCloud user account and become part of the ownCloud user folders.

Shared address books can now be synced to other devices with the latest version of ownCloud, and contacts can be dragged and dropped between address books.

Talk back: What kind of cloud storage do you offer customers? Do you see cloud storage as a lucrative value add? Let us know in a comment.


Source : http://www.thewhir.com/web-hosting-news/cloud-file-sharing-software-owncloud-launches-beta-of-updated-community-version

SAP's Bill McDermott Just Announced A Huge New Company Goal

SAP has been on a tear of late, re-energized by the popularity of its hot in-memory database, Hana, and its aggressive plan to (finally) become a major player in cloud computing.

Now CEO Bill McDermott has revealed a huge new goal for the company: it wants to have 1 billion users of its technology by 2015. McDermott told attendees as much at a Churchill Club event in Silicon Valley last week, reports Eric Savitz at Forbes.

To do that, SAP would essentially need to more than double its current number of users, an SAP spokesperson confirms.

The 1 billion number "includes all the people who would touch an SAP system," we were told.

This means all the employees at companies using SAP enterprise software and its new database. It also includes all the employees at companies that use its recently acquired cloud services, including SuccessFactors and Ariba, and other tech, like the homes served by utilities whose smart meters feed data into SAP systems.

It also includes mobile device users whose mobile carrier uses SMS messaging systems from Sybase, an SAP subsidiary.

To get there, SAP is investing like mad. It is spending billions acquiring cloud companies like SuccessFactors ($3.4 billion, closed in February) and Ariba ($4.3 billion, a deal announced in May). It has also set aside half a billion dollars to fund adoption of its Hana database, including a $155 million fund for startups writing big data apps using Hana.

At least some of these investments are starting to pan out. Hana is on track to book $500 million a year in revenues, McDermott also said. That's the most successful SAP product of all time, its creator told Business Insider.

It could also do well with the cloud, after previous attempts like homegrown offering ByDesign floundered. But now SAP has got SuccessFactors founder Lars Dalgaard helping it compete in cloud-based services.


Source : http://www.businessinsider.com/sap-bill-mcdermott-one-billion-users-by-2015-2012-9

Noise Filter: Windows Server 2012 Hits General Release, Pushes Cloud OS Notion

Windows Server 2012 and Cloud OS is a significant leap forward in Microsoft's cloud computing strategy

Every now and then, an exciting or controversial issue triggers a flood of online discourse. For our Noise Filter feature, the WHIR pans the raging rivers of opinion for shining nuggets of useful commentary.

With Tuesday’s highly anticipated launch of Windows Server 2012, Microsoft has been working hard to promote its underlying message that the new product and its accompanying Cloud OS (based on Windows Server and Windows Azure) represent a significant leap forward in the company’s cloud computing strategy.

The general availability of Windows Server 2012 comes months after its successful beta period and a preview given in a session at the Microsoft Partner Conference 2012.

Windows Server 2012 offers customers a single platform to manage and deliver applications and services across private, hosted and public clouds.

Ultimately, the new cloud operating system will help improve the speed, scale and power on which data centers and applications can build.

Microsoft’s Windows Server 2012 delivers more than 200 cloud services, along with new advancements in virtualization, storage, networking and automation.

In a post on Microsoft’s official TechNet blog, Satya Nadella, president of Microsoft’s Server and Tools business, discussed the four major objectives that Microsoft developers addressed in building the Cloud OS.

In building the Cloud OS, we are focused on four key things. First is the transformation of the data center. We want to bring together all of the resources provided by a traditional data center – storage, networking and computing – into one platform that scales elastically with an organization’s needs. Second is offering the APIs and runtimes to enable developers to create modern applications – for mobile, social and big data. A third important aspect of the Cloud OS is ensuring personalized services and experiences, so that any user on any device can access all of their data and applications. Lastly, data of any size or type, stored anywhere and processed in any style, must be a first-class citizen of the Cloud OS.

ZDNet’s Mary Jo Foley talked about how Microsoft’s overall cloud vision is undergoing a significant shift with the launch of Windows Server 2012.

But the overarching message of launch day is that Windows Server 2012 isn’t about all the new bells and whistles in Windows Server 2012. Instead, it’s about Windows Server 2012 being a key component of Microsoft’s ‘Cloud OS’ vision and strategy — one where the very idea of what constitutes an “operating system” is morphing. Microsoft execs have been wrestling with the best way to explain the company’s approach to the cloud for the past few years. “Software plus services” gave way to “We’re all in,” which later gave way to Microsoft’s public/private/hybrid messaging.

In a report via Information Week, Randy George outlined the three key changes to Windows Server 2012 over its predecessors.

Dynamic Access Control is a really cool feature of Windows Server 2012, but it’s not exactly plug and play to deploy. To be fair, any DLP package from any other vendor can be equally or even more difficult to deploy and manage… On the whole, DirectAccess is vastly improved in Server 2012. The drawback is, in order to realize many of those improvements, you need to deploy Windows 8 along with it… In Server 2012, the Server Core and full UI installation options are no longer an all or nothing proposition. That’s good news for security conscious admins, because it makes the process of hardening a Windows server playing a critical server role much easier.

In his blog post, The New York Times’ Quentin Hardy stressed how Windows Server 2012 “may be the most significant effort to date in Microsoft’s plan to change with the times.”

The company has to transition to a world where Windows and Office matter less, and sell more services via the cloud. The new servers are priced between $425 and $4,800, depending on capabilities. The hope, however, is that Microsoft will be able to sell much more online, with services like data marts for performing analytics, or extra computing capacity for heavy one-time workloads.

And finally, Bloomberg’s Dina Bass talked about how Windows Server 2012 will ultimately allow Microsoft to compete with cloud giants like VMware.

Microsoft Corp. (MSFT) released a new version of Windows for running corporate networks, trying to take share from VMware Inc. (VMW) and gain ground as companies seek to cut the cost of managing business programs. Windows Server 2012 improves features for virtualization, which allows companies to cut costs by using fewer server computers in their data centers.

Talk back: Are you planning on using Windows Server 2012? Do you think it will slingshot Microsoft’s presence in the cloud market and allow the company to compete against the VMwares and HPs of the marketplace? Let us know in the comments.


Source : http://www.thewhir.com/web-hosting-news/noise-filter-windows-server-2012-hits-general-release-pushes-cloud-os-notion

Microsoft's New Enterprise Operating System Is Going To Be A Big Hit

While all the world remains focused on the forthcoming Windows 8 desktop OS, Microsoft's other Windows is already out—and it's destined to be a big hit.

Windows Server 2012 is officially available today and according to the analysts, enterprises are really going to like it.

"If I was still working at Microsoft, I would be very proud to put this product out. I think they've done a super job," analyst Michael Cherry told Business Insider.

Cherry, now the operating system guru for Directions On Microsoft, worked on parts of Microsoft's server operating system in the '90s.

Despite Microsoft's rhetoric, Windows Server 2012 isn't really a game changer, but an OS filled with tons of "incremental changes," says Cherry. It will slowly step Microsoft customers toward the cloud.

And that's what they want. That's what they need.

It makes it much easier to move virtual Windows servers around, from running on a server in a private data center to residing in one managed by a cloud provider—say, Rackspace or Microsoft's Azure cloud. In the past, this was a major pain. Moving a virtual server meant updating a long list of networking configurations.

New up-and-coming technologies known as software-defined networking help solve this problem. Windows Server 2012 implements Microsoft's own version of SDN, says Cherry, and it works well.
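
As a conceptual sketch only (not Microsoft’s actual implementation), the essence of network virtualization can be reduced to a lookup table: the virtual address that applications see stays fixed, and migrating a server means updating one mapping instead of a long list of networking configurations:

```python
# Conceptual illustration: a VM's virtual address is a lookup key,
# not a physical location. Names and addresses here are invented.
overlay = {
    "vm-web-01": {"virtual_ip": "10.0.0.5", "physical_host": "rack7-host3"},
}

def migrate(vm, new_host):
    # The virtual IP that applications use never changes;
    # only the mapping to a physical location is rewritten.
    overlay[vm]["physical_host"] = new_host

migrate("vm-web-01", "cloud-provider-host-19")
print(overlay["vm-web-01"])  # same virtual_ip, new physical_host
```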

Plus, Microsoft is "dogfooding" Windows Server 2012, or testing it internally.

Microsoft's search engine, Bing, is running Windows Server 2012, says Satya Nadella, who's president of Microsoft's Server and Tools business unit. This proves to enterprises that the new OS can handle a really big cloud service.

Enterprises tend to be slow to roll out new operating systems, so Windows Server 2012 might not become an immediate smash hit. But it is destined to do very well.

And that's hugely important for Microsoft.

Remember, Microsoft's Server and Tools business is now bigger than its desktop Windows business. In the fiscal year Microsoft just finished, S&T hit $18.7 billion in revenue, compared to $18.4 billion for Windows. Windows Server is a key product in Server and Tools.


Source : http://www.businessinsider.com/microsoft-windows-server-cloud-operating-system-2012-9

Is there still a place for the virtual tape library?

Today, a great number of storage mediums exist whose purpose is to retain information in some manner. One of the oldest, tape, is a linear storage system that still has great application thanks to today’s revisions of this long-standing technology.

Implementing tape storage within virtual storage infrastructure can be a great asset to a backup and recovery solution for many industries.

The tape storage system sequentially stores data on drives that utilise magnetic tape as a storage medium. The core design of tape has existed for many years but has been refined to operate more efficiently and reliably while sustaining the immense amount of information needed in today’s storage solutions.

Most of today’s tape drives resemble the form factor of a cassette tape rather than the big projector-like reels that slowly rotated in the computing labs of early science fiction movies. Tape drives function at high enough speeds to keep up with the data streams needed by properly configured backup interfaces.

A modern tape’s design is such that it should rarely fall victim to “start/stop operation,” a common problem in older designs that caused data loss from the damage caused to the tape during this process.

A virtual tape library takes the essence of archival tape storage and implements it in a virtual sense.

Though the name implies that the storage is tape based, it doesn’t necessarily have to be. Virtual volumes can be created to use a modern hard disk, for example, as the actual storage medium for a virtual tape volume.

Storage is allocated based on data usage: whether the information needs to be readily available in a hard drive cache, on the hard disk itself or on actual tape storage media.

In order for read and write times to sync together, especially when various storage platforms are included within such a virtual environment, the system has to be appropriately configured to avoid operational glitches.

A virtual tape server platform is needed to administrate how data is buffered and allocated into storage.

In a general sense, this means that archival data will be buffered and written to the slowest medium, data that is accessed more regularly will be written to faster mediums, and data that is in constant use will mostly remain in the hard drive cache while being simultaneously written to permanent storage at certain intervals.
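
A toy sketch of such an allocation policy might look like the following. The access-frequency thresholds are invented for illustration; a real virtual tape server would derive them from measured usage patterns:

```python
def choose_tier(accesses_per_day, age_days):
    """Toy policy mirroring the hierarchy described above: hot data in the
    disk cache, warm data on disk, cold archival data on tape."""
    if accesses_per_day >= 100:                  # constant use
        return "hard drive cache (flushed to permanent storage at intervals)"
    if accesses_per_day >= 1 or age_days < 30:   # regular access
        return "hard disk"
    return "tape (archival)"                     # rarely touched

print(choose_tier(accesses_per_day=500, age_days=2))   # -> cache
print(choose_tier(accesses_per_day=3, age_days=90))    # -> hard disk
print(choose_tier(accesses_per_day=0, age_days=365))   # -> tape
```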

Ideally, the best virtual tape storage system should include a variety of storage mediums but predominantly utilise tape storage for archival purposes.

Tape is the lowest-cost of all storage mediums, which makes it ideal for backing up non-critical data for record retention purposes. In a virtual environment, storage mediums should be configured with the data retrieval hierarchy in mind.

Utilising solid state storage media as part of a virtual tape library, in addition to standard platter-based drives, allows data to be assigned to the most appropriate medium based on its classification.


Source : cloudcomputing-news[dot]net

Monday, September 3, 2012

From Virtual Shopping To Virtual People: 6 Tech Breakthroughs Coming Soon

Cisco's Dave Evans has a fantastic job. His role is to hobnob with all the brightest minds in technology and predict the future.

He then builds prototypes of this newfangled stuff in action for Cisco's customers.

BI recently met up with Evans and toured an office-of-the-future lab at Cisco's San Jose headquarters, where he keeps a selection of these prototypes on display.

None of this stuff was built out of idle fancy—which differentiates Evans's operation from other R&D wonderlands. Every one of these prototypes was developed for a specific Cisco customer wanting to solve a specific business problem.


Source : http://www.businessinsider.com/tech-breakthroughs-cisco-dave-evans-2012-8

BSA: EU cloud uptake lower than global average

Highest number of cloud users in Romania and Greece, according to research

Research from software advocacy group BSA (Business Software Alliance) has revealed a surprising lack of cloud usage among European Union computer users.

The figures, from a survey of 4000 people, showed that across the EU, whilst 86% of respondents used cloud services for personal use, less than a third (29%) used them for business purposes.

Greece and Romania had the most cloud users with 39%, higher than the global average of 24%, with Poland (25%), the UK (21%) and Austria (20%) making up the top five.

The most popular responses to the question “What type of online cloud computing services have you used?” were:

  • Email service (79%), compared to a global figure of 78%
  • Online word processing was used by 36% of EU consumers (45% globally)
  • Photo storage and online games came joint third (35%) among European computer users

The global figures derive from research conducted earlier this year by Ipsos Mori covering nearly 15000 PC users in 33 countries, on which the BSA based its research template.

The research also revealed a wide discrepancy between countries on how familiar they were with the technology.

Whereas in countries such as Greece and Romania, 24% and 20% of respondents respectively claimed a ‘high familiarity’ with the cloud – compared with the 39% who use it in both countries – countries such as the UK and Austria had more people familiar with the cloud than actually using it.

According to Robert Holleyman, BSA president: “Unfortunately, most computer users in the EU have little understanding of cloud computing and have not yet moved to capitalise on the opportunities cloud computing offers”.

Analysis: Familiarity breeds contempt...but what about unfamiliarity?

Analysis of the survey’s methodology shows potential discussion points.

Even though the research surveyed 4000 regular PC users, it only covered nine countries – Austria, Belgium, France, Germany, Greece, Hungary, Poland, Romania and the UK.

It’s fair to say some big EU countries weren’t involved in this survey – however, the questions asked did correlate with the Ipsos Mori global market research, allowing better comparison of results.

But could a further argument be made that the figures would be higher with better understanding?

The survey mentioned various responses from participants, including answers to the question “How familiar are you with cloud computing?”

From the responses, 43% had “never heard of it”; 22% had “only heard the name”; whereas subsequently dwindling numbers knew “a little” (18%); or were “somewhat” (11%) and “very familiar” (6%) with it.

While this reveals that 35% – or approximately 1400 respondents – did have knowledge of cloud computing, it’s a fair assertion that most European computer users are still unfamiliar with the concept.

This is a prevalent trend, especially when the American-centric research from Wakefield Research for Citrix published last week is considered.

According to their study of more than 1000 American adults, responses to a slightly vaguer question – “what’s the first word or phrase that comes to mind when you hear the term ‘the cloud’?” – had very creative results.

These included such insights as “toilet paper”, “drugs” and “aeroplanes”.

While this methodology could also be called into question, it’s surely a matter of semantics – the differentiation between ‘a cloud’ and ‘the cloud’ is a significant one, after all.

Perhaps more importantly, the report found that 32% of respondents saw the cloud as “a thing of the future”. Kim DeCarlis of Citrix spoke at the time of a “wide gap between the perceptions and realities of cloud computing”.

Do these two pieces of research indicate a need for greater transparency with the definition of the cloud? Let us know in the comments...


Source : cloudcomputing-news[dot]net