Microsoft Releases Hyper-V Server 2008

October 17, 2008

Well, well, well… look at this. Microsoft is unfurling more and more layers of its next-gen computing and software strategy – especially with regard to virtualization.

(Ok, a required disclosure: We are currently under NDA with Microsoft and have some confidential knowledge around certain roadmap and product plans, but NOTHING in this blog post is based on any inside knowledge derived from, or in any way based on, those confidential discussions).


The reason I wanted to blog on this, in relation to the IT in Transition theme, is that, as I have written in several blogs, the entire landscape of the endpoint is changing. A lot of people see this, so this view is in no way unique or revolutionary to us.

A couple of posts ago I blogged on the coming Endpoint Wars of 2009. In order to make that post digestible, I intentionally left a detailed and deep discussion about the impact of virtualization and hypervisors out of that post.

Let me add a bit of my color (and opinion) here:

Quoting from David Marshall’s article:

So what’s new and different? Didn’t they already release Hyper-V? This platform is slightly different from the version found in Microsoft’s Windows Server 2008 operating system. According to Microsoft, it provides a simplified, reliable, and optimized virtualization solution for customers to consolidate Windows or Linux workloads on a single physical server or to run client operating systems and applications in server based virtual machines running in the datacenter. And it allows customers to leverage their existing tools, processes and skills. But perhaps best of all, Microsoft is making this product a no-cost Web download — yup, it’s free!

Yup, it’s free.

Also from the article:

The provisioning and management tools are based on Microsoft System Center, providing centralized, enterprise-class management of both physical and virtual resources.

And the management mechanisms and tools are "above platform," with Microsoft System Center being adapted as the management framework, as we'd expect.

So the hypervisor (HV) wars are in full force now as well. Obviously this is just the leading edge of one of the fronts of the Endpoint Wars.

Seems like the three major combatants are VMware, Citrix and now Microsoft. If highly capable hypervisors are going to be a "loss leader" in any go-forward virtualization platform strategy, then where will the value and revenue shift to as the traditional demarcations are realigned?

Our guess is that more of the instrumentation will be subsumed into the platforms (as we have stated for quite some time), including into the HV. This will obviously push more of the method "above platform," including image management and enforcement. And where does traditional infosec (AV, IDS, etc.) move in this new world?

Think services.

And these services will go well beyond software streaming, likely including image management, high-assurance software delivery, and full software stack delivery methods.

And platform intrinsic security and compliance “instrumentation”, supported by above platform validation and attestation methods, will likely become commonplace.

Food for thought.



Rolled a “7”…

February 18, 2008

Last week SignaCert was named one of the “7 Virtual Management Companies to Watch” by Network World. Wow, that came a bit out of left field…

Yes, we have been doing a lot of work in this space (configuration and image management for virtualization) but we were surprised, and a bit humbled by the recognition.

We are also coming off of a big week in New York City, where we spent some quality time with the tier one and two financial firms, exploring where they are in their thinking about systems management and information security in the enterprise. It was very interesting, really… several independently validated trends emerged out of these great customer meetings.

1) Virtualization remains a bit of a curiosity today in most large enterprises. It was a bit uncanny to hear from several of them that they were “10% or less (much less) deployed in their current enterprise IT environments” and that they “expected to be” 40-60% deployed in the 2-4 year time frame.

2) Deployment of MS-Vista on the user endpoints (desktops, workstations, laptops) is going to be “delayed” by at least a year—maybe two. This is somewhat a function of the current financial environment in the Financial Services sector—but likely more a function of not having a compelling reason to switch, coupled with a resounding “we’re not ready.”

The other thing that came through was a common thread of "how do we take this time out and get our stuff together as we prepare for the next wave?"

What was clear is that standardized configuration and image management is a *precursor* to what our customers need to achieve NOW. That is – do more with less by:

1) Delivering more productive IT Business Process cycles with their existing infrastructure WHILE lowering costs and improving compliance.

2) Doing that while optimizing CapEx and lowering OpEx.

Sounds like IT measurement and automation to me. No choice. No more excuses.

Another very interesting normalized data point is that the traditional vendors (read: already in-house and validated as suppliers) need to do more to address these needs, AND the only – underscore ONLY – new vendors that will make the cut as suppliers in these times are the ones that can address the More with Less demand. Period.

So, connecting all of these dots…

Our prospects and customers must acquire tools and know-how to make sure that what they build and deploy in their IT environments STAYS deployed as intended throughout the IT business process lifecycle.

And if we can't understand and maintain our S/W builds NOW, in our mostly monolithic world (one computer, one operating system and application stack), then how are we possibly going to do this in the Virtualized IT World that we all know is coming?
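For what it's worth, the "stay deployed as intended" idea can be sketched in a few lines (this is our illustration, not any particular product; the directory-walk approach and the SHA-256 choice are assumptions): record a baseline manifest of file hashes at deployment time, then re-measure later and report anything added, removed, or modified.

```python
import hashlib
import os

def measure(root):
    """Walk a directory tree and record a SHA-256 hash for every file."""
    manifest = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            manifest[os.path.relpath(path, root)] = digest
    return manifest

def drift(baseline, current):
    """Compare two manifests: files added, removed, or modified since baseline."""
    added = set(current) - set(baseline)
    removed = set(baseline) - set(current)
    modified = {p for p in set(baseline) & set(current)
                if baseline[p] != current[p]}
    return added, removed, modified
```

Real products layer vendor-supplied reference measurements, reporting, and enforcement on top of this basic measure-and-compare loop, but the core discipline is exactly this simple.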

Thank you, Network World, for "getting it"… Software measurement and IT controls built on high-resolution software stack management are not a luxury today; they are increasingly crucial to managing current and future enterprise IT.

IT in Transition turns the page…..


Microsoft and Viridian

February 7, 2008

Over the last few weeks Microsoft (MSFT) announced more details of their long-awaited virtualization strategy (drum roll please), their expanded partnership with Citrix/Xen (with an emphasis on servers), and, simultaneously, the acquisition of Calista.

When Citrix originally announced the acquisition of XenSource a few months ago, we thought it was apparent that MSFT had to be "in the know," as Citrix and MSFT have been in a love/hate relationship within the enterprise markets for nearly 18 years.

The prior relationship was mostly about terminal services, "backracking," and streaming of applications to the endpoint. In many ways these uses are a precursor to virtualization – an enterprise "Petri dish" to see what and how customers find value in alternate enterprise usage of platforms and software delivery. Now the next shoe is dropping.

With the success of VMware―both in terms of early enterprise acceptance and deployment AND the IPO (giving VMW a huge warchest)―Microsoft has been forced to move. Some would see it as "late," but the virtualization market is really very nascent. The bulk of VMW revenue is made up of deal sizes of $100k or less (likely ASP'ing at <=$70k right now), so the bulk of VMW's $1B+ in revenue is still for "pilot" and development usage. So we are at a very early stage.

But the shift is happening quickly and the full transition is inevitable in short order (Less than 5 years for leading sectors to cross over to more than 50% virtualized infrastructure.)

While it is clear that the Virtual Machine Monitor (VMM) and hypervisor (HV) are ultimately commodity delivery mechanisms for the software stack inside the Virtual Machine (VM), control of the VMM and HV is important to the big guys until the other layers of the value-add opportunity develop and evolve.

The longer-term question for the "little guys" (everyone with less than a $100B market cap) is "where are the defensible 3rd party value-add areas?" as the paradigm shift fully reveals itself. What "goes away" as the one-to-one (hardware to OS) monolithic platform yields to the one-to-many virtual platform?

What happens to traditional IT security in this brave new world? Where can we hang our respective 3rd party hats as the elephants trample the old ground in search of fertile new areas?

It is in these questions that the "positive" security and systems management model really begins to stand out. Knowing that VM instantiations are ASSEMBLED FROM trusted code, by validating them against a platform- and vendor-agnostic, high-quality "white list" resource, becomes critical.

Also knowing WHAT CODE IS LOADED WHERE AND FOR HOW LONG becomes an enabling capability, regardless of which VMM and HV is used to create the VM software stacks.

Also, compliance and software licensing become even more important, but can be easily handled with trusted code and stack measure/validate methods.  Being able to “attest” the stack to an external “white list” reference built from a rich supply of high-quality software reference measurements becomes a highly-defensible and long-term way of adding value to the new virtual compute paradigm.
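As a hedged illustration of the measure/attest idea (the whitelist contents here are invented; a real reference would be populated from vendor-released software measurements supplied by a trusted third party): each measured component is checked for membership in a known-good "white list," and anything unknown is flagged for review rather than trusted by default.

```python
import hashlib

# Hypothetical vendor-agnostic whitelist: known-good SHA-256 measurements of
# released software components. In practice this would come from a large,
# high-quality reference library, not be hardcoded.
WHITELIST = {
    hashlib.sha256(b"vendor-released kernel image v1.2").hexdigest(),
    hashlib.sha256(b"vendor-released libc build 2.7").hexdigest(),
}

def attest(measurements):
    """Positive security model: partition measured components into
    known-good (present on the whitelist) and unknown (flagged, not trusted)."""
    trusted = {name for name, h in measurements.items() if h in WHITELIST}
    unknown = set(measurements) - trusted
    return trusted, unknown
```

Note the inversion relative to blacklist-style AV: nothing is trusted because it fails to match a known-bad signature; components are trusted only because they positively match a known-good measurement.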

Interestingly for those that get the jump on this, this represents a huge, content-based, recurring revenue model that the first-party players will have a difficult time displacing (because they don’t have ready access to software measurements from other vendors and due to the “trusted third party” implications).

Will we really trust Microsoft to validate Microsoft?

So let’s just view this as another card in an unfolding game of mammoth proportions and implications.

Stay tuned. This is going to be a lot of fun to watch, and to participate in.

The Elephants are Stirring…

November 28, 2007

I have blogged in the past on the increasing momentum of virtualization as a major IT in Transition indicator. Over the past couple of weeks there have been some additional indicators.

First, Oracle laid out more details of their virtualization strategy.

And Sun CEO Jonathan Schwartz talked about Sun's work in the VM space.

For details, I would check out the video set that Sun EVP John Fowler had produced. While of course it's "Sun skewed," it is a very good overview of the technical details of Sun's strategy.

So as not to get lost in the shuffle, Dell announced their partnership with Sun at the same event, as well as having their own individual set of announcements. The Sun/Dell announcement is really directed at Dell’s adoption of Solaris as yet another supported operating system.

Likely a good move for both companies, although time and execution will reveal the true value. For Sun, extending the "ubiquity" of Solaris is an important tenet. And for Dell, I would guess this falls under the "Simplify IT" banner, as Sun has done some really good work to create an enterprise OS, especially in the area of virtualization options up, down and across the stack. The coverage read like a "Love Fest…"

I will finish today with this thought…

How in the heck is all of this virtualization stuff going to shake out?

We KNOW it is a key technology for the future of IT. We KNOW every platform supplier and large ISV is wrestling with the strategic questions of how and when they can make money with it (or at least not kill their current cash cow). And we KNOW customers want and need the benefits.

Just to give you a sense of the emerging confusion (which has only increased over the last few weeks).

So I have to admit to being “virtually” confused. This IT transition will be one of the most interesting of all. A real page turner. Stay tuned for more.


Could this be a Techno-Tsunami?

October 1, 2007

Where does the time go? Here we are at October 1, 2007 and just look at where we are so far… The economy is all over the map (one article says “go long” and the next one says “short the market”). The credit ripples are still building, and we are going to test our economic resilience yet again. I continue to believe that we are mid-stride in one of the most disruptive periods of change that we have seen for many years.

Yes, the dot-com bust was “disruptive” in a negative sense – we saw the pendulum swing full out and bang the edges. We just plain got ahead of ourselves (which is a repeating pattern for those of us with enough gray hair).

But this cycle seems different. The new technologies and products really seem to be more fundamental and useful. The disruptive nature of change seems much more deeply rooted. And when viewed from the right “altitude” the innovation seems more holistic.

But with all of that, high-technology remains a fascinating study in innovation and change. Look at some of the big stories:

* Virtualization continues to roll on

* Apple continues its slow but steady progress

* Intel recovers its footing

* Google continues its relentless march

* Web “2.0” seems to have real teeth

* There is more cheap bandwidth everywhere

* AT&T is back (now at&t ;-))

Examining some of the above in depth . . .

Virtualization. Not only more discrete compute machines per physical platform, but even "bigger" virtualization – the complete abstraction of the platform to the user. Where does my word processor actually live anymore? Maybe on this box… maybe streamed to me on demand as a service? And where do I keep my file storage and backup these days? May as well use that Amazon storage backend… it is cheaper and more versatile than buying it.

And wow, talk about consumer wizzies. Check out the new Apple product line across the board. Great graphics, cool form factors, transparent cross-compatibility, and basically Operating System agnostic for all practical purposes.

We have Dual Core (x2) Microprocessors quickly moving to Octal Core (x8) with multiple sockets per motherboard and tons of memory – continuing to drive compute density up and lowering cost.

Zoom up and look at the really big picture for a second. Technology is really beginning to achieve the promise we’ve all had for it. To make our lives easier, more productive – and face it – more fun.

But I continue to watch with interest as the transitory waves, and the inescapable realities roll through our industry. These are not just the lapping waves on the beach. The water is retreating from the shore at an increasing pace… It is likely to return soon with force.

Are we witnessing a technology tsunami before our eyes?

What does Microsoft look like as a company 10 years from now? Where is Google? Does Motorola even exist?

Nature has an intractable and cruel reality: evolve and adapt or become extinct. Watch carefully for signs of adaptation in your favorite companies and sectors. If you don't see the spark of innovation, get out.

Now on to Q4…..


Wow, what a week for Virtualization!

August 22, 2007

First we have the highly anticipated VMware offering… off the charts.  VMware is now sporting a market cap of a cool $20B (symbol: VMW).  This is about $4B higher than our friends at Symantec (SYMC).  And then, immediately following the IPO, XenSource gets swallowed up by Citrix for a whopping $500M – likely a bit more than 150 times last-twelve-months (LTM) trailing revenue.  Nice work if you can get it.

So, back to the theme of "IT in Transition."  The news about VM is interesting.  But "interesting" does not warrant an exclamation mark or a period; it's merely a comma.  We have a long way to go.

By all accounts virtualization is still in its infancy.  Market estimates are showing perhaps $2B of trailing revenue directed to vendors that supply various virtualization technologies.  And VMware is about half of that.  The more interesting fact is that most, if not all of this revenue is for pilot and proof of concept work.  We haven’t seen the real demand play out yet.   

How we manage IT in an increasingly virtual IT world is still being hotly debated.  As with other emerging technologies that have great promise (web services come to mind), we often leave the “details” to later in the adoption cycle.   

One of the big concerns has been “how do I make sure my system is constructed with code that I trust?”  and “how does security work in this  on-demand world?” Sound familiar?  

I would offer that the same IT controls and best practices being established around monolithic computing (standard reference image management, etc.) are even more important in the virtual machine (VM) world.  Effectively addressing these needs, and helping our customers assure that VMs are constructed and deployed in a high-integrity state, will enhance the speed of enterprise adoption of virtualization.

But what about the challenge of compliance in the virtual domain?  This can be readily and transparently solved with the same methods, but we’ll leave that discussion for another day. 

Anyway – Wow.  We have a front row seat on some serious IT transition opportunities and challenges.  Should be an interesting couple of years.

An Industry in Transition

March 21, 2007

What is going on in the IT industry?  Many people are saying “we are no longer a growth industry!”  I see it differently.  We are an industry in transition.  Zoom back and think about it.  We are a young industry in comparison to other industrial sectors.  They have experienced several messy transitions.  Why not us too?    So what is happening? 

I think it is prudent to look to the past to get a sense of our future.  In my opinion, we are moving through an interim commoditization stage.  Globalization is driving economic compression of the compute stack.  Hardware is at the bottom of the stack, so it carries the full weight of everything above it.  Could this explain the sideways trend in many chip stocks?  I think so.  Further, we as an industry have focused on speed, features, and tech whizzes, often at the expense of security and manageability.

Our customers are asking for something pretty simple if you really think about it: give me the hardware, software and management methods to deliver my business process in a dependable, secure and cost-effective way.  The fulfillment of this straightforward and reasonable request has triggered many fundamental changes.  And many of the assumptions that we have built our compute models and our threat vector models on are simply no longer accurate.

A major one is the assumption of a "perimeter" for our IT.  Locked safely in a glass room, and touched by just a few, our compute devices generally work remarkably well.  But that does not represent current reality.  Assuming that our major threat risk is from the outside, through an increasingly diffuse perimeter, is just silly.  It is clear that we are rapidly moving away from traditional monolithic computing methods, where computing devices run a single OS with multiple business applications, each one easily capable of taking down the entire system.  In its place we will see:

  • Virtualization: Delivering on the promise of improved hardware utilization and security, virtualization will be the norm, as each computing device easily supports multiple heterogeneous OS environments and specialized business functions.  With benefits such as improved stability, performance and process isolation, virtualization looks to become the dominant enterprise computing model.
  • Thin(ner) clients provisioned “on demand”:  These devices will be useful to reduce persistent data end-point exposure in enterprise environments.  Also these devices can pave the way for the pay as you go Software as a Service (SaaS) market.
  • Platform Absorption:  In all cases the platform will begin to subsume many (if not most) of the technologies and mechanisms that are currently imposed on the platform "after market," usually at customer expense.  Like other industries, security and safety in IT should be built into the cost of the product and/or services.  Early examples of platform-intrinsic methods include the vPro offering from Intel, and the slated "secure methods" offerings from Sun relating to new, announced features in Solaris 10.

So the "think about" issues here are: Assuming these realities are true and the changes ARE imminent, how do existing vendors remap to the new paradigms?  Where are the new business models when the vertical industry demarcations are remapped?  Where can new companies find sustainable, defensible business opportunities with reasonable margins?  What do the new business models look like for emerging companies?  What happens to security and compliance as the effective lifetime of the traditional compute stack moves from months to hours?  I believe that these fundamental changes are way overdue and necessary.  With change comes opportunity.  Thoughts?