The Next Virtualization Waves Are Forming
We are still in the beginning stages of realizing what virtualization can do. Where do we go from here?

Pete Manca's Blog

The virtualization "waves" are just forming. While server virtualization is at full crest, many more waves are taking shape behind it, and quite frankly, they are more significant.

Server virtualization was about saving money. Consolidating multiple applications onto a single server saves capital and operational expenses. Reducing the number of servers running in the data center is a good thing, and it cuts carbon emissions as well. But is that it? If so, that's more like a ripple than a wave. Don't get me wrong: reducing power, cooling, and server count while consolidating apps is worthwhile, but it's not the whole story. Not by a long shot.

I don't believe that this is it. In fact, I think we are still in the beginning stages of realizing what virtualization can do. It is really an enabling technology, one that makes it possible to solve the problems of today's data center in new ways.

As with all great technology movements, a core set of technologies must be established first. Server virtualization is one for sure, but what are the others? I/O Virtualization might be the next important cornerstone technology. Without solving this problem, servers continue to be static and inflexible. We might be able to utilize servers more by virtue of the hypervisor, but we can’t exploit them to their fullest extent without the flexibility to change their I/O bindings dynamically. Other key virtualization technologies include file virtualization, data virtualization, and application virtualization. These are keys to making access to applications, data, and resources agile and ubiquitous.
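
As a small, concrete illustration of that last point on the x86 side, here is a minimal sketch of changing a server's I/O binding at runtime, assuming a libvirt/QEMU host; the domain name and bridge are hypothetical, and enterprise-class I/O virtualization goes well beyond a single virtual NIC.

import libvirt

# Virtual NIC definition to bind to the guest (bridge name is hypothetical).
NIC_XML = """
<interface type='bridge'>
  <source bridge='br0'/>
  <model type='virtio'/>
</interface>
"""

conn = libvirt.open("qemu:///system")      # connect to the local hypervisor
dom = conn.lookupByName("app-server-01")   # hypothetical running guest

# Hot-attach the NIC to the live guest and persist it in the domain config,
# so the network binding changes without a reboot or any re-cabling.
flags = libvirt.VIR_DOMAIN_AFFECT_LIVE | libvirt.VIR_DOMAIN_AFFECT_CONFIG
dom.attachDeviceFlags(NIC_XML, flags)

conn.close()

The same pattern applies to virtual HBAs and other device types; the point is that the I/O binding becomes a software operation rather than a physical one.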

Once the server, I/O, data, and applications are virtualized, the resulting possibilities and opportunities really are endless. These cornerstones open the market for management, security, converged fabrics, and a whole host of technologies that can free up the data center and open new markets.

Expect 2008 to be another banner year for virtualization. The next wave is here.

About Pete Manca
Pete Manca is CTO and EVP of Engineering at Egenera. He brings over 20 years of experience in enterprise computing to Egenera. His expertise spans a wide range of critical enterprise data center technologies, including virtualization, operating systems, large-scale architectures, and open standards. In particular, his leadership and experience in virtualization technologies have led to the continued progression of Egenera's advanced PAN (Processing Area Network) architecture. Manca leads product planning by working directly with customers to understand their most difficult challenges and by guiding Egenera's architecture, hardware, and software engineering teams to translate those requirements into solutions. Prior to Egenera, he served as Vice President of Engineering at Hitachi Computer Products America, with responsibility for operating systems and enterprise middleware products.


Reader Feedback

Pete Manca wrote:

Paul,

Thanks for your comments. You are, of course, correct that virtualization started with the mainframe. I am lucky enough - or old enough? :) - to have worked on mainframes and have blogged on the very same thing in the past (http://blogs.egenera.com/pete_manca/).

My point in this blog was that server virtualization is only part of the story in this new, emerging x86 virtualization market, and that other complementary technologies, like I/O virtualization, are just as important, if not more so, when trying to create a dynamic data center. In reality, the mainframe has been doing server virtualization and I/O virtualization for years; it is just not that relevant to today's discussion of emerging data center trends. That is not meant as a knock on the mainframe, a platform I have immense respect for, but it is not really in the mix when discussing new-wave virtualization technologies.

Pete

Paul Giangarra wrote:

To expand on feedback I posted on 9 January to another similar article:

IBM commercially introduced virtualization in 1967; it's not the "next" big thing, it's been around for over 40 years. It was only recently (and even "recently" is a relative statement over 40 years) embraced and exploited by the x86 world; both AMD and Intel are finally stepping up and putting in the hardware features to support it better, features that have existed on other platforms for years.

What's more interesting today is to realize that there are three logical virtualization layers:

1. HW virtualization (goes back to 1967) includes not just the processor and memory, but also storage, network, I/O, and more.

2. Middleware virtualization (goes back to the early '80s at least, unless you want to count IBM's two premier transaction managers, CICS and IMS, which are both 40 or more years old and provided virtualized transactional services long before the '80s)

3. Service (SOA) virtualization, more recently formalized for enterprise-specific services.

The net is that virtualization is finally "reaching" platforms like x86, but it is still nowhere near as sophisticated (and capable) as it has been for, in some cases, decades on platforms like IBM's System z and even IBM's System p. What I find interesting is watching these more mature systems and learning what the distributed world has to "work up to" someday.

