Lest we forget the Federal IT Market Transition Indicators

August 29, 2007

I, Wyatt Starnes, hold an official position at the National Institute of Standards and Technology (NIST) as a member of the Visiting Committee on Advanced Technology (VCAT); further, I serve as chairman of the IT subcommittee and work on strategic issues regarding both NIST internally and the constituencies that NIST supports worldwide.

Any comments in this blog related to IT in either a commercial or Government setting are mine and mine alone, and do not represent or imply any endorsement or opinion by NIST.

Ok, I am required to do that…

I tend to take a decidedly commercial market stance when I speak on various industry transitions. Shame on me. As many of you know, I have spent a fair bit of my career of late working on Federal IT market issues, needs and developments.

Recently there have been several major developments in Government as they relate to IT. On the agency side (an annual IT market of around $70B), the Office of Management and Budget (OMB) has been very active in appending requirements to the long-standing Federal Information Security Management Act, or FISMA. In brief, this is the primary act that compels Federal Agencies to use best practices and IT controls to deploy, manage and report on agency-based IT usage. See: http://en.wikipedia.org/wiki/FISMA

OMB, in cooperation with several agencies including NIST, DoD, and DHS, recently completed the technical framework and put in place TWO supplemental requirements that will dramatically impact all Federal Agencies early next year. These are Memorandum M-07-11 and M-07-18.

M-07-11 covers the: “Implementation of Commonly Accepted Security Configurations for Windows Operating Systems,” and states: “agencies with these operating systems [Windows XP and VISTA] and/or plans to upgrade to these operating systems must adopt these standard security configurations by February 1, 2008.” See: http://www.whitehouse.gov/omb/memoranda/fy2007/m07-11.pdf

M-07-18 puts teeth in the above memo by providing the recommended language for Agencies to use in solicitations, to ensure that new acquisitions include these common security configurations and that information technology providers certify their products operate effectively using these configurations. See: http://www.whitehouse.gov/omb/memoranda/fy2007/m07-18.pdf

In short, these represent some very good work, in my opinion, by the various agencies involved in their creation, and I applaud OMB for moving them into effect quickly. I must comment that these memoranda DO have some Microsoft benefit, and Microsoft has been very active in their creation, vetting and implementation behind the scenes, but I digress.

Fundamentally, these regulations are moving Federal IT to a new level with the Security Content Automation Protocol (SCAP), and they drive new baselines of pre-tested Federal Desktop Core Configurations (FDCC). As indicated by their very names, we are creating better standard configuration methods and conformance verification (IT controls), along with pre-defined and vetted core configurations (standard reference images). This is a notable and important IT transition, and a continued move toward a proactive, standardized and positive systems management and security model.
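To make the idea of conformance verification concrete, here is a minimal sketch of what an SCAP-style configuration check boils down to: compare observed machine settings against a vetted baseline and report deviations. The setting names and values below are purely illustrative examples, not actual FDCC baseline entries.

```python
# Hypothetical sketch of a baseline conformance check.
# BASELINE entries are illustrative, NOT real FDCC settings.

BASELINE = {
    "PasswordComplexity": "Enabled",
    "MinimumPasswordLength": "12",
    "AutoPlay": "Disabled",
}

def check_conformance(observed: dict) -> list:
    """Return (setting, expected, actual) tuples for every deviation."""
    deviations = []
    for setting, expected in BASELINE.items():
        actual = observed.get(setting, "<missing>")
        if actual != expected:
            deviations.append((setting, expected, actual))
    return deviations

# Example: one setting deviates, one is absent entirely.
observed = {
    "PasswordComplexity": "Enabled",
    "MinimumPasswordLength": "8",  # deviates from baseline
}
for setting, expected, actual in check_conformance(observed):
    print(f"NONCOMPLIANT: {setting}: expected {expected}, found {actual}")
```

Real SCAP tooling, of course, expresses baselines and checks in standardized XML formats so that any conformant scanner can verify any agency's configuration, but the maker-of-baseline / checker-of-machines pattern is the same.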

More later on some public-facing actions in the DoD space.


Wow, what a week for Virtualization!

August 22, 2007

First we have the highly anticipated VMware offering… Off the charts. VMware is now sporting a market cap of a cool $20B (symbol: VMW). This is about $4B higher than our friends at Symantec (SYMC). And then, immediately following the IPO, XenSource gets swallowed up by Citrix for a whopping $500M, likely a bit more than 150 times last-twelve-months (LTM) trailing revenue. Nice work if you can get it.

So, back to the theme of “IT in Transition.” The news about VMware is interesting. But “interesting” does not warrant an exclamation mark or a period; it’s merely a comma. We have a long way to go.

By all accounts virtualization is still in its infancy.  Market estimates are showing perhaps $2B of trailing revenue directed to vendors that supply various virtualization technologies.  And VMware is about half of that.  The more interesting fact is that most, if not all of this revenue is for pilot and proof of concept work.  We haven’t seen the real demand play out yet.   

How we manage IT in an increasingly virtual IT world is still being hotly debated.  As with other emerging technologies that have great promise (web services come to mind), we often leave the “details” to later in the adoption cycle.   

One of the big concerns has been “how do I make sure my system is constructed with code that I trust?” and “how does security work in this on-demand world?” Sound familiar?

I would offer that the same IT controls and best practices being established around monolithic computing (standard reference image management, etc.) are even more important in the virtual machine (VM) world. Effectively addressing these needs, and helping our customers assure that VMs are constructed and deployed in a high-integrity state, will enhance the speed of enterprise adoption for virtualization.
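The mechanics of "constructed and deployed in a high-integrity state" can be sketched very simply: before a VM image is deployed, hash it and compare against the digest of the vetted standard reference image. This is a minimal illustration under assumed names and paths, not any vendor's actual verification workflow.

```python
# Hypothetical sketch: verify a VM image against a known-good
# ("standard reference") digest before deployment.
# File paths and digests here are placeholders.

import hashlib

def image_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of an image file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_image(path: str, trusted_digest: str) -> bool:
    """Deploy only if the image matches its vetted reference digest."""
    return image_digest(path) == trusted_digest
```

The point is that the trusted digest comes from an out-of-band source (the vetted reference image library), so a modified or drifted image fails the check no matter what the deployment tooling itself reports.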

But what about the challenge of compliance in the virtual domain?  This can be readily and transparently solved with the same methods, but we’ll leave that discussion for another day. 

Anyway – Wow.  We have a front row seat on some serious IT transition opportunities and challenges.  Should be an interesting couple of years.

Where is independent quality control post-production?

August 2, 2007

Here’s something to think about…. In every production process there is an independent quality control check.  For example, a manufacturing line always pulls sample parts during the process and conducts an independent visual inspection.  Software engineers write code, and another group QA’s it.  A writer writes; an editor checks.  In any industry out-of-band quality assurance and feedback loops are fundamental to ensure consistent, repeatable performance and fewer errors.  

In IT what is being “manufactured” is software availability.  But where is the independent check after software is deployed into production?  Configuration and deployment tools are fine, but they only check against themselves.  And we know all too well that applications still fail.

I think the basic methodology of a maker/checker model belongs in IT post-production. Imagine how much more confidence IT professionals would have in their enterprise if they knew, and could prove, that their software is deployed as intended.
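A maker/checker loop for deployed software can be sketched in a few lines: at deploy time the "maker" records a manifest of file digests, and later an independent "checker" process rescans the deployment and reports anything that changed or vanished. This is an illustrative sketch; the paths and layout are assumptions, not a description of any particular product.

```python
# Hypothetical maker/checker sketch for post-production verification.
# Maker: record digests at deploy time. Checker: independently rescan.

import hashlib
import os

def make_manifest(root: str) -> dict:
    """Maker: map each file under root to its SHA-256 digest."""
    manifest = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            manifest[os.path.relpath(path, root)] = digest
    return manifest

def check_manifest(root: str, manifest: dict) -> list:
    """Checker: rescan root and list files that changed or disappeared."""
    current = make_manifest(root)
    return [rel for rel, digest in manifest.items()
            if current.get(rel) != digest]
```

The key design point is independence: the checker derives its answer from the bits actually on disk, not from the deployment tool's own records, which is exactly the out-of-band feedback loop that every other production process already has.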