Microsoft Releases Hyper-V Server 2008

October 17, 2008

Well, well, well… look at this. Microsoft is unfurling more and more layers of its next-gen computing and software strategy – especially with regard to virtualization.

(Ok, a required disclosure: we are currently under NDA with Microsoft and have some confidential knowledge of certain roadmap and product plans, but NOTHING in this blog post is derived from, or in any way based on, those confidential discussions.)

See:

http://weblog.infoworld.com/virtualization/archives/2008/10/microsoft_relea_6.html?source=NLC-VIRTUALIZATION&cgd=2008-10-09

The reason I wanted to blog on this, in relation to the IT in Transition theme, is that, as I have written in several posts, the entire landscape of the endpoint is changing. A lot of people see this, so this view is in no way unique or revolutionary to us.

A couple of posts ago I blogged on the coming Endpoint Wars of 2009. In order to make that post digestible, I intentionally left a detailed and deep discussion about the impact of virtualization and hypervisors out of that post.

Let me add a bit of my color (and opinion) here:

Quoting from David Marshall’s article:

So what’s new and different? Didn’t they already release Hyper-V? This platform is slightly different from the version found in Microsoft’s Windows Server 2008 operating system. According to Microsoft, it provides a simplified, reliable, and optimized virtualization solution for customers to consolidate Windows or Linux workloads on a single physical server or to run client operating systems and applications in server based virtual machines running in the datacenter. And it allows customers to leverage their existing tools, processes and skills. But perhaps best of all, Microsoft is making this product a no-cost Web download — yup, it’s free!

Yup, it’s free.

Also from the article:

The provisioning and management tools are based on Microsoft System Center, providing centralized, enterprise-class management of both physical and virtual resources.

And the management mechanisms and tools are “above platform”, with Microsoft System Center being adapted as the management framework, as we’d expect.

So the Hypervisor (HV) wars are in full force now as well. Obviously this is just the leading edge of one of the fronts of the Endpoint Wars.

Seems like the three major combatants are VMware, Citrix and now Microsoft. If highly capable hypervisors are going to be a “loss leader” in any go-forward virtualization platform strategy, then where will the value and revenue shift to as the traditional demarcations are realigned?

Our guess is that more of the instrumentation will be subsumed into the platforms (as we have stated for quite some time), including into the HV. This will obviously force more of the management method “above platform”, including image management and enforcement. And where does traditional infosec (AV, IDS, etc.) move in this new world?

Think services.

And these services will go well beyond software streaming, and likely include image management and high-assurance software and full software stack delivery methods.

And platform intrinsic security and compliance “instrumentation”, supported by above platform validation and attestation methods, will likely become commonplace.

Food for thought.

Wyatt.

Norton 2009 and Whitelists

October 3, 2008

Over the last couple of months our friends at Symantec have readied and released Norton 2009. We hope that the engineering team is taking a long rest after the significant effort to bring this new offering to the market!

Norton 2009 is touting the use of “whitelists” as a part of the offering. See:

http://keznews.com/4878_Norton_2009_tackles_whitelisting

From the article:

Symantec has adopted whitelisting techniques in an effort to dramatically improve the performance of its upcoming Norton 2009 security suite, according to the company’s vice president of consumer engineering, Rowan Trollope.

Trollope admitted that poor performance was the main reason Norton Internet Security customers abandoned previous versions of the product. In the next version, he explained, a “whitelisting approach” significantly reduced the amount of time scanning files that are known to be safe.

It is very interesting to see another use case and value definition for whitelisting. Not that we agree, mind you. It is just interesting.

All of the AV vendors are seeking ways to deal with “Blacklist bloat and overhead”. The use of “smart whitelisting” in an AV product is an interesting way to address this need. Essentially what the Symantec team realized is that there is no need to constantly scan and rescan good code. Bravo.
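To make that performance idea concrete, here is a minimal sketch (in Python, and emphatically not Symantec’s actual implementation) of how a scanner might skip files whose hashes already appear in a known-good set, so that only unknown files are handed to the expensive scanning path. The KNOWN_GOOD_SHA256 set and the directory walk are illustrative assumptions.

```python
# Sketch of the "don't rescan known-good code" idea: files whose SHA-256 hash
# is in a known-good whitelist are skipped; only unknown files get scanned.
import hashlib
from pathlib import Path

# Hypothetical known-good hash set; in practice this would come from a signed,
# vendor-supplied or locally built catalog.
KNOWN_GOOD_SHA256 = {
    # "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def files_needing_scan(root: Path):
    """Yield only the files that are not already known good."""
    for path in root.rglob("*"):
        if path.is_file() and sha256_of(path) not in KNOWN_GOOD_SHA256:
            yield path

if __name__ == "__main__":
    for suspect in files_needing_scan(Path(".")):
        print(f"would scan: {suspect}")
```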

But what they missed is that this is not the real and valuable use case for code whitelisting in the longer term.

As discussed before in this blog, whitelisting is not a direct replacement for blacklisting. Blacklisting, in its pure form, is about discovery and hopefully the blocking/immunization of “malicious code” – code intentionally built and distributed to damage compute platforms and increasingly used to steal private information.

Whitelisting, in its pure form (admittedly by our definition), is about making sure that the “good and desired code set” remains in a good and desired integrity and configuration state over the entire usage life cycle of the software stack (physical or virtual).
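As an illustration of that definition (a hedged sketch, not any particular vendor’s implementation), the fragment below captures a baseline of the “good and desired code set” as file hashes and then reports drift from it over time. The install root, file names and function names are hypothetical.

```python
# Sketch of "pure whitelisting": record the desired code set once as a
# baseline, then verify that the stack still matches it and report drift.
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(root: Path) -> dict[str, str]:
    """Capture the 'good and desired' state: relative path -> content hash."""
    return {
        str(p.relative_to(root)): sha256_of(p)
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def check_integrity(root: Path, baseline: dict[str, str]) -> dict[str, list[str]]:
    """Report drift from the baseline: modified, missing, or unexpected files."""
    current = build_baseline(root)
    return {
        "modified": [p for p in baseline if p in current and current[p] != baseline[p]],
        "missing": [p for p in baseline if p not in current],
        "unexpected": [p for p in current if p not in baseline],
    }

if __name__ == "__main__":
    root = Path("/opt/approved-stack")  # hypothetical install root
    baseline = build_baseline(root)
    Path("baseline.json").write_text(json.dumps(baseline, indent=2))
    print(check_integrity(root, baseline))
```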

In order to make this work in practice, one needs to redefine the architecture of computer security and systems management from end to end. Simply tacking a whitelist onto an existing blacklist solution does not yield the real benefits that can be achieved with true whitelisting.

The test of true whitelisting is really driven by more fundamental benefits like improved computer stability, security and compliance – leading to higher availability (increased MTTF and reduced MTTR). Importantly, these benefits should be delivered at LOWER operational cost than we have now. That means lowering the people costs associated with delivering computer availability and capacity, increasing our visibility into ALL of the risks to computer stability (including malicious code), and instrumenting and automating IT best practices.

Positive IT Controls are the answer. We are making steady progress in “flipping the model” from blacklist to whitelist – but don’t be fooled: Norton 2009 is not really a “whitelist solution”. Real whitelist solutions will become common as a true, vendor-agnostic, high-provenance whitelist ecosystem evolves, and as the endpoint wars begin to settle down in 2009.

Keep your eyes on these pages for more on this.

Wyatt.