Market Roundup, November 18, 2005
CA Has Visions of Enterprise Management
At CA World this week, CA made several announcements.
Perhaps the most interesting to us was the announcement of Enterprise IT
Management (EITM), their vision for unifying and simplifying IT management
across the enterprise. In support of the vision, CA also announced twenty-six
EITM-enabled products within their portfolio, ranging from management to
security, that are driven by business policies and automated workflows, which CA
believes will ensure continuous alignment and optimization of IT
infrastructure. The products are integrated with the CA Integration platform,
which has a workflow engine, management database (MDB), shared policies, and a
consistent user interface. CA uses a Service Oriented Architecture (SOA) to
deliver management and security services across products. CA has taken an
architectural approach to management, believing this is the best way to tie
together management of the entire IT environment, which they define to include
end users, infrastructure, data, applications, IT services, and business
processes.
Management of IT infrastructure is an evolving concept that
is crucial if IT is expected to interact with business processes effectively.
IT managers need to be able to treat their IT infrastructure as one entity made
up of many interacting parts, which implies that management has to be able to
understand the interdependence of the various parts within the whole. One of
the problems with the way IT has developed has been the emergence of silos
within IT. Traditionally these silos were treated independently of one another,
so there were application silos as well as infrastructure silos, and even
management silos focused on a particular application or piece of infrastructure.
Even between products from the same vendor there have been frequent issues
around product integration. The theory is that integration lowers the cost and
difficulty of managing effectively. Users who have CA products will hopefully
find that management has gotten easier, and may find more reason to deploy
additional CA products within their infrastructure. This is important, as
software always has stronger loyalty
than hardware on the principle that software is the product the customer
touches more frequently. It is important for CA to make its move sooner rather
than later as its competitors are also attempting to gain the same client
mindshare through their management capabilities.
One area where we think CA has certainly gotten it right is in making sure that their storage management software is integrated from the beginning and that everything is done from a higher, architectural level. Looking at data, applications, services, and processes is the right way to approach management, and the role of individual devices or classes of devices falls into its proper context based upon a data set or application. Additionally, we’ve seen an awful lot of activity within the storage management space recently, but it is refreshing to find a vendor who believes that storage management is one piece of the larger management structure rather than something that should be developed independently for now and maybe one day will fit in with the rest of IT management. To get the fragmented bits of IT to function as a whole, one needs to begin with the end in mind.
Banking on the Virtualized Client Infrastructure
IBM has announced the IBM Systems Solutions for Branch
Banking targeted at financial services and banking customers seeking to
centralize operations while reducing overhead costs. The new solution offers
customers a pre-configured, pre-tested, and scalable platform built on
Intel Xeon-based IBM BladeCenter and xSeries platforms on which to consolidate
branch infrastructure. It includes software, networking, and security features
so that administrators can monitor operations and centrally deploy software to
branch locations. Systems Solutions for Branch Banking offers a variety of
solution options including VMware virtual infrastructure on a blade, Altiris
Deployment Solution provisioning systems, and Datacom Systems video
surveillance system blade. In addition, IBM also announced the next solution in
its Virtualized Hosted Client Infrastructure (VCI) portfolio targeted at bank
branches and credit institutions. The solution uses VMware to host multiple
users' desktop environments on a single blade within IBM BladeCenter
and leverages ClearCube client access technologies to increase efficiency in
resource utilization and aid quicker deployment of new users. IBM Systems
Solutions for Branch Banking and IBM Virtualized Hosted Client Infrastructure
will be delivered by IBM Global Services and select IBM Business Partners.
This is the second solution IBM has announced for the VCI,
which has only been out for a few weeks. While some may at first glance see
this offering as little more than a bundle of software on a BladeCenter, the
reality is much more. The essence of VCI that is intriguing to us is the
virtualized through-and-through message of the solution and its focus on
leverage and flexibility. Bringing many applications to a centralized server
location has inherent advantages in image control and other software updating
concerns, but this combined with the flexibility of resources offered by VMware
in conjunction with the scale-out nature of BladeCenter creates a compelling
dynamically responsive platform on the backend without requiring specific form
factors on the access side. Additionally, the value delivered by Citrix and
ClearCube technologies bolsters the flexibility, efficiency, and manageability
of the solution. This consolidation approach does not mandate replacement of
any access infrastructure, works with either thick or thin clients, and works
locally or remotely. Thus, a consistent user experience is much more likely and
is granted across a variety of access methods all with reduced operational
headaches.
Although this solution has many advantages, what
excites us the most about this approach is how different it is from other
backend centralization solutions for PCs, such as the HP CCI. IBM VCI seeks to
deliver the maximum utilization of resources to the largest audience of users.
Each user's access is virtualized, as are the storage and the CPU on the blade
on which applications execute. Taken within the BladeCenter context, this offers
a collection of resources made available to users in a dynamically allocated,
yet highly efficient undertaking. This is in sharp contrast to the HP CCI
approach, which still maintains a one-to-one correlation between the user and
the CPU, local hard drive, and blade resources. Such a solution may allow for
fewer PCs to be deployed, but it does not seek to maximize the utility of
resources, and hence maintains a high overhead of underutilized capacity as
demands and deployments scale. At the risk of sounding like a broken record, for
most solutions the approach taken by VCI offers greater consolidation,
efficiency, and hence long-term savings, than that offered by CCI.
Virtualization through and through: it’s where things are going and what we believe will prove to be the most disruptive and compelling IT achievement of the early 21st century.
This week IBM purchased Collation, a company that does
detailed mapping of automatically captured information about IT resources. The
product is integrated into IBM’s Tivoli systems management software,
specifically within Tivoli Change and Configuration Management Database
(CCMDB), and will help customers to understand the effect of changes to an IT
environment. Collation helps users understand the interrelationships between devices
and systems, so that users can model change scenarios to see what the results —
intended and unintended — might be. It provides a view of run-time dependencies
across application, system, storage, and network tiers, and supports
virtualized environments. The CCMDB is a service management platform that provides
a single view across multiple sources of IT information. IBM believes that the
acquisition strengthens its service management software portfolio and expands
management automation and simplification capabilities. IBM also believes that
with the Collation product, IT professionals can see how technology supports
business processes, such as order entry, supply chain, and enterprise resource
management.
IBM argues that 80% of business service-related failures are
due to IT changes that had unpredicted impacts. Currently IT managers must
manually map the interdependencies and order of relations of their IT
applications, which takes significant time and resources. Most management
currently looks at each system independently of the others. This provides a
less-than-holistic view of how IT functions and means that identifying problems
takes longer as sometimes the apparent problem is really a symptom of an
underlying problem elsewhere in the system. With the addition of Collation,
Tivoli is able to map the relationships and update them immediately as the
infrastructure is altered. That capability should give managers better accuracy
and quicker turnaround on problem-solving. It should also help them to make
better decisions about how, when, and where to make changes within their
infrastructure.
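The change-impact idea behind Collation can be sketched in miniature: once the dependencies between components are mapped, the "blast radius" of a proposed change is simply the set of components that transitively depend on the thing being changed. The following is an illustrative sketch only, with hypothetical component names; it is not Collation's actual data model or API:

```python
from collections import defaultdict, deque

# Hypothetical dependency map: each entry reads "X depends on Y".
# Component names are invented for illustration.
depends_on = {
    "order-entry-app": ["app-server"],
    "app-server": ["database"],
    "reporting-app": ["database"],
    "database": ["san-volume"],
}

# Invert the map: for each component, which components depend on it?
dependents = defaultdict(list)
for component, deps in depends_on.items():
    for dep in deps:
        dependents[dep].append(component)

def impact_of_change(component):
    """Return every component transitively affected by changing `component`."""
    affected, queue = set(), deque([component])
    while queue:
        current = queue.popleft()
        for dependent in dependents[current]:
            if dependent not in affected:
                affected.add(dependent)
                queue.append(dependent)
    return affected

# Changing the SAN volume ripples up through the database to both applications.
print(sorted(impact_of_change("san-volume")))
# ['app-server', 'database', 'order-entry-app', 'reporting-app']
```

Even this toy version shows why automated discovery matters: the interesting answers come from the inverted, transitive view of the dependency graph, which is exactly the view IT managers cannot easily maintain by hand.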
IBM, like most R&D-based IT vendors, has always been about building mousetraps. But lately IBM has been building mousetraps that are not only better in some manner, but actually smarter. Almost everything that IBM does now is about helping businesses reach their next level of On Demand, which has come to signify access to the information they need to make smart, rapid business decisions. Collation clearly fits that strategy. IBM had a previous agreement with Collation and had already integrated the product with Tivoli, but IBM claims customers strongly urged it to purchase the company. As the product becomes embedded into Tivoli, it will change the nature of how the CCMDB functions, and each iteration should make the product smarter yet, giving the IT department the ability to respond quickly to the impact of proposed changes and to propose with confidence changes likely to have a positive impact. Tivoli customers that don't already use Collation have a new tool in the toolbox that they should work on adopting as soon as possible.
Sony BMG Music announced this week that it will be pulling
its copy-protected CDs and will offer consumers the option of trading them in for
versions that do not contain the XCP anti-piracy software code. That code,
according to researchers, utilized a rootkit as a
central element, by which it installed software on the user’s computer without
user knowledge. Security researchers argued that Sony’s protection code created
vulnerabilities in a user’s system that could be taken advantage of by
malicious hackers. Sony has said it will stop producing any more CDs with this
particular feature included. Sony had released a patch for the problem as an
initial response, but many consumers found it created even more problems than
the original code.
It has been quite clear for some time now that the recording
industry has no real idea of how to respond to the fundamental changes that are
occurring as a result of the Internet and its related personal music devices.
Consumers have greater flexibility than ever before to customize their personal
soundtracks in an easy and painless fashion. The music industry still clings to
older distribution models and has won significant legal victories against those
who would impose a new world music order, as it were. Yes, the RIAA has been
able to shut down music-sharing sites, but they have yet to address the
fact that consumers' demands and expectations about how music is
delivered and accessed have changed forever. No amount of litigation or
legislation will change that key fact. How the music industry responds to this
sea change will in large part determine its future viability.
Let us stipulate that the XCP anti-piracy technology was and is not the way to respond. Sony violated a cardinal rule in bits distribution: don't surreptitiously download potentially (or actually) harmful code onto a user's machine. Not only does this violate the user's basic assumption of trust that they will get what they pay for and nothing more (no viruses, no worms, no adware, no malware), it is also foolhardy in the extreme given the realities of the Internet. Sony executives may have assumed, rightly in all probability, that 95% of the users who had XCP installed on their computers would never notice, because they are largely computer novices. But to assume that the other 5%, more knowledgeable and technically sophisticated, would somehow ignore such a gross violation of the implied trust relationship is just another indication that the recording industry is still rather clueless about the true state of affairs out in the Ether. Sony's apparent judgment that this 5% would neither discover nor care about such an intrusion, nor publicize it via the Internet, is a gross miscalculation of how the music industry is perceived by a significant portion of music aficionados, and of their expectations concerning purchased bits. Hopefully somewhere in Sony's management a lonely soul warned executives that XCP would be a really bad idea. And hopefully that person wasn't summarily fired. And hopefully that person may be allowed to participate in the discussions for Sony's next efforts to manage its content in the Internet age. We suppose that's the best that could be hoped for.
This week at the World Summit of the Information Society
(WSIS) meeting in Tunisia, the U.S. got to retain control of the Internet
Corporation for Assigned Names and Numbers (ICANN), an internationally
organized, non-profit corporation that has responsibility for IP address space
allocation, protocol identifier assignment, generic (gTLD) and country code
(ccTLD) Top-Level Domain name system management, and root
server system management functions. Once upon a time, these services were
provided by the U.S. government, but not anymore. According to ICANN, they are
dedicated to preserving the operational stability of the Internet, promoting
competition, achieving broad representation of global Internet communities, and
developing policy appropriate to its mission through bottom-up, consensus-based
processes. Before this meeting, the EU in particular had wanted the U.S. to
cede sole control and create some sort of public-private partnership for better
management of the Internet; however, the status quo has been upheld. As an
olive branch of sorts, the U.S. did agree to the formation of the Internet
Governance Forum (IGF), which is designed for multi-stakeholder policy
dialogue, but this time under the auspices of the UN Secretary General.
The media likes to tout headlines about the US “controlling”
the Internet, as though the Internet were one entity that can be controlled. In
some sense, they are correct, but only insofar as control means managing the
root addressing and root servers of the Internet. Many countries feel they want
more control over what they do with their ccTLDs, which is much more a story of politics than of root
control. After all, how can there be a system if everyone is doing their own
thing? Anarchy only works up to a point. ICANN is participating at WSIS, and
takes pains to point out that although it is a U.S.-based NGO, it is open to
other parties. And let’s face it, the reason the U.S. government gave control
to ICANN was that they realized that they as a government weren’t the best lot
to manage this mess. Government of any stripe is not usually a leader in
technology and practical day-to-day management. By definition everything in a
government is political, whereas management of the technical aspects of the
Internet should ideally be non-politicized. That of course isn’t likely in a
world populated by human beings, but the last group most of us want controlling
our Internet is a pack of governments. Trust us, some of us have been living in
the EU. And ICANN has no authority over issues like rules for financial
transactions, content control, spam, or data protection. Those things are
absolutely the sort of thing governments should decide together; but again, the
Americans don’t own “control” of any of that. Not that anyone is paying any
attention.
Now on matters technical, there actually is an interesting point that could be made, although it doesn't seem that many are expressly stating it as such. ICANN coordinates the management of the technical elements of the DNS to ensure universal resolvability. That means that no matter where you are, when you type in your favorite Web address, it sends you to the right place. (As a reminder, DNS is the system that allows us to use www.something.ending, letters and words, rather than a raw IP address such as 123.1.12.23, a string of numbers.) The issue of course is that everything is done in a Latin alphabet now, which is fine if you use English or any of the many languages based on the Latin alphabet. However, if you use another alphabet, or worse yet have a language without an alphabet, well then, things can get entertaining. Longer term, as other language populations grow their Internet presence, there may come a time when either everyone uses Latin naming at the server level and finds a workaround, or we'll have to come up with a system that recognizes a larger group of characters. Now that's worth discussing, and a good reason to have international participation. In the meantime, if people want to fight about suffixes, then let them do it. We think that $100 laptops for the world's poorest children is an idea that merits much more of governments' time and energy than control of domain naming conventions. Next thing you know they'll want to regionalize the names in the periodic table of the elements.
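One version of the "workaround" already exists: Internationalized Domain Names in Applications (IDNA, RFC 3490, 2003) leaves the DNS itself ASCII-only and instead has client software Punycode-encode each non-Latin label behind an "xn--" prefix, so root and name servers never see raw Unicode. A quick sketch using the idna codec built into Python (the domain here is a standard illustrative example, not a real registration):

```python
# IDNA keeps the wire format ASCII-only: each non-Latin label is
# Punycode-encoded and prefixed with "xn--"; pure-ASCII labels pass through.
unicode_name = "bücher.example"

# Encode for the wire: what a resolver actually sends to DNS servers.
ascii_name = unicode_name.encode("idna")
print(ascii_name)  # b'xn--bcher-kva.example'

# Decode for display: what a browser can show back to the user.
print(ascii_name.decode("idna"))  # bücher.example
```

Whether this label-by-label encoding trick scales gracefully to scripts without alphabets is exactly the kind of question that benefits from international participation.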