Archive for the ‘IBM’ Tag
Over the past few years, few technologies have been hyped as much as Cloud Computing. According to the pundits and early adopters, CC is transforming the face of corporate IT while delivering compelling business value. Simply put, CC is a suite of enterprise-level technologies that enables organizations to draw their computing power and data from a separate, centrally managed pool of compute resources, including servers and software licenses. CC offers a compelling basket of benefits for firms of all sizes in all industries. Companies can significantly reduce IT operating costs and increase server utilization. Additionally, CC can enable a more agile and scalable computing infrastructure that better aligns IT with business requirements, including reducing new product time to market. Importantly, CC allows firms to focus on their core mission of delivering goods and serving customers while outsourcing a big chunk of their IT (read: fixed costs and headaches) to experts.
Currently, there are many business functions delivered through a cloud, from CRM (salesforce.com) to messaging and collaboration (Google Apps) and high performance computing (Amazon Web Services). Not surprisingly, all the IT heavyweights including IBM, HP, and Microsoft have committed billions of dollars to marketing a plethora of products and services. No wonder Gartner, an IT research consultancy, named CC the second most important technology focus area for 2010.
Yet CC has received a couple of black eyes recently, with security breaches at Amazon and Sony impacting millions of users. And there remain important challenges to fully exploiting CC’s potential: not all first-generation initiatives have met expectations.
Given its young age, it is not surprising that CC carries a variety of definitions and connotations. For the sake of clarity, I use the definition from the US Department of Commerce’s National Institute of Standards and Technology (NIST), which identifies five characteristics of cloud computing:
- On-demand, self-service computing – allows business units to secure the resources they need without going through internal IT for servers and licenses;
- Broad network access – enables applications to be deployed in the ways the business operates, such as mobile and multi-device;
- Rapid resource elasticity – provides for quick resource scalability or downsizing depending on computing needs;
- Compute resource pooling – enables computing resources to be pooled to serve multiple consumers;
- Measured service – allows IT usage to be measured like a utility and charged back to users according to demand.
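The “measured service” characteristic above can be made concrete with a toy chargeback calculation. This is only an illustrative sketch: the rates, metric names and business-unit names below are invented for the example, not drawn from any actual provider’s pricing.

```python
# Toy illustration of "measured service": metering per-unit usage and
# charging it back to consuming business units, utility-style.
# All rates and metric names here are hypothetical.

RATES = {
    "cpu_hours": 0.05,         # $ per CPU-hour consumed
    "gb_storage_month": 0.02,  # $ per GB stored per month
    "gb_egress": 0.09,         # $ per GB of outbound traffic
}

def chargeback(usage):
    """usage maps a business unit to {metric: quantity}; returns the
    dollar amount to charge back to each unit, rounded to cents."""
    return {
        unit: round(sum(RATES[m] * qty for m, qty in metrics.items()), 2)
        for unit, metrics in usage.items()
    }

bill = chargeback({"marketing": {"cpu_hours": 1200, "gb_egress": 50}})
print(bill)  # {'marketing': 64.5}
```

The point is less the arithmetic than the model: once usage is metered per unit, IT stops being an undifferentiated fixed cost and can be billed back according to demand, exactly like an electrical utility.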
How do managers determine whether this technology is right for their business? Our firm has developed a quick and dirty checklist to test a company’s cloud readiness:
- Are your revenue-driving business applications hampered by inadequate computing power?
- Would significantly quicker resource availability enable you to reduce time to value with new products and key operational initiatives?
- Are your operating units and managers always fighting for more IT resources?
- Is your business environment characterized by unexpected surges in demand?
- Is IT redundancy an important risk mitigation strategy?
- Do new or short duration business projects have difficulty “making the cut” for IT priority?
- Are server, software license and data center costs rapidly out-pacing profit growth?
- Are you frustrated with the flexibility and responsiveness of your enterprise IT infrastructure?
If you answered yes to even four of the above questions, your business is being seriously impacted by IT constraints and higher-than-necessary operating, hardware and software costs. A compelling business case for CC exists, and a pilot program should be investigated as soon as possible.
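As a rough illustration, the checklist logic can be expressed as a simple scorer. Only the eight questions (abbreviated here) and the four-yes threshold come from the article; the names and structure are my own sketch.

```python
# Sketch of the cloud-readiness checklist as a yes/no scorer.
# The eight questions and the "four or more yes answers" threshold are
# from the article; function and variable names are illustrative.

QUESTIONS = [
    "Revenue-driving applications hampered by inadequate computing power?",
    "Would quicker resource availability reduce time to value?",
    "Operating units always fighting for more IT resources?",
    "Unexpected surges in demand?",
    "Is IT redundancy an important risk mitigation strategy?",
    "Do short-duration projects struggle to make the cut for IT priority?",
    "Server, license and data center costs out-pacing profit growth?",
    "Frustrated with enterprise IT flexibility and responsiveness?",
]

def cloud_readiness(answers):
    """answers: list of 8 booleans, one per question above.
    Returns (yes_count, pilot_recommended)."""
    yes_count = sum(answers)
    return yes_count, yes_count >= 4
```

For example, `cloud_readiness([True, True, False, True, True, False, False, False])` returns `(4, True)`: four yes answers already clear the bar for investigating a pilot.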
For more information on our services and work, please visit the Quanta Consulting Inc. web site.
Few technologies have received as much hype in the past couple of years as Cloud Computing. Virtually every major IT provider, including Amazon, Google, HP, Intel and IBM, is now aggressively promoting its new CC services. Despite the excitement, business adoption has been slow for mission-critical production applications among traditional large IT buyers like financial services, healthcare and manufacturing.
Simply defined, CC is a range of enterprise-level technologies that enable organizations to draw their computing power and data from a centrally managed internal or external pool of compute resources, including servers and software licenses. Acting like an electrical utility, a Cloud can supply users (companies, operating units and individuals) with computing resources as needed, when needed. In an ideal situation, CC enables organizations to reduce or defer the purchase cost of expensive hardware and software assets, accelerate application performance at peak load periods and drive up overall IT utilization, which for most firms languishes at around 25% of potential capacity. Importantly, CC also enables companies to move to a more flexible, scalable and efficient pay-per-usage IT model, also known as Software-as-a-Service (SaaS).
CC and its predecessor, grid computing, have been around for over 20 years. If the Cloud is going to move beyond niche applications into the mainstream of business computing, it will need to overcome some important adoption challenges:
Competing standards
Although common standards are slowly emerging, a plethora of competing ones still inhibits quick, low-risk adoption of CC. For example, standards compete in the critical areas of IT infrastructure components, security, identity and system interfaces. CIOs need to ensure their CC adoption plans and technologies align with standards as they are set, even if those standards do not represent the best technology at the moment. One simple step would be to follow the Open Data Center Alliance, an independent consortium of leading global IT managers that seeks to provide a unified vision for long-term data center requirements.
Organizational barriers
CC adoption continues to be stymied by (often hidden) organizational barriers, such as who controls IT resources and how IT is linked to business priorities. Furthermore, ongoing concerns around computing resource availability, external cloud viability and data privacy often make CC a difficult sell to business unit owners. Because of its revolutionary nature, organizations must treat CC as they would any other transformational project. This requires using change management methodologies, right-sizing the organizational structure to reflect new mandates and roles, and using pilot projects to build internal support and generate key learnings. Gary Tyreman, CEO of Univa, a leading Cloud Computing provider, says: “While Cloud looks like an easy way out, one needs to begin by connecting the project to a strategic imperative, orderly define a starting point, identify low hanging fruit and create the white space for the team to make this happen.”
Market confusion
Given its short history, it’s no surprise that there is considerable market uncertainty and bewilderment over what CC is, how solutions are best deployed and who can really deliver on its promise. In fact, almost every IT provider of consequence now promotes a CC and SaaS capability. This market clutter has created an adoption barrier for many firms. Despite the clutter, there are more than enough success stories for firms to study. “There is now a compelling business case for the Cloud and enough proven case studies across many industries to speed implementation and reduce business risk,” says Tyreman.
Lack of IT transparency
Many CIOs lack sufficient visibility into their IT infrastructure and operating units to understand which business applications and cost centers represent the best opportunities to deploy CC. One of the most important first steps in moving to the Cloud is to understand what IT assets a firm has, how they are used and where the costs (hardware, software and operating) lie.
Given its transformational value and record to date, CC is on the cusp of crossing the adoption chasm in 2011. Although they need to do their homework, CIOs should look deeper into how CC can reduce costs and improve business performance.
For more information on our products and services, please visit the Quanta Consulting Inc. web site.
When it comes to large companies changing and adapting to new conditions, conventional wisdom, especially around Disruptive Innovation theory, is not optimistic:
- Due to a number of factors including size, reward systems and culture, it is extremely difficult for large companies to reinvent themselves.
- For change to be successful, companies need to bring in outsiders untainted by the existing culture.
- Since change has a low probability of success, firms should concentrate on their key competencies and customers and be cautious about moves beyond their core.
A recent study out of Stanford’s Graduate School of Business on IBM’s successful transformation from hardware to solutions provider challenges the above thinking and suggests that large companies are well-suited to reinventing themselves.
Back in 1999, IBM was a large company facing significant financial and market pressure and an uncertain future. Internal analysis revealed that the company had failed to capture value from 29 separate technologies and businesses that were internally incubated. Examples included the first commercial router (Cisco later dominated that market), new technologies to accelerate web performance (Akamai captured this market) and desktop PCs (Dell and Compaq took over this business). The Stanford authors summed up IBM’s issues this way: “The maniacal focus on short-term results, careful attention to major customers and markets, and an emphasis on improving profitability all contributed to the firm’s ability to exploit mature markets — and made it difficult to explore into new spaces. The alignment that made the company a ‘disciplined machine’ when competing in mature businesses was directly opposite to that needed to be successful in emerging markets and technologies.” Not surprisingly, this assessment may sound familiar to other executives.
IBM understood that their traditional business model was unsuited to capitalizing on emerging technologies and market opportunities. So, what did they do to change? They used their global size, client and channel intimacy, and substantial resources and talent to their advantage, ignoring the cliché that large companies can’t be agile.
To focus sufficient management attention and resources, IBM situated its new technologies and products in a new, measurable line of business called the Emerging Business Organization (EBO). In the past, high-potential technologies and products were lost in the shuffle when they were sprinkled across traditional business units. In addition, IBM put internal, experienced managers at the top of the new business units, a sharp departure from past practice. These managers had some key advantages that were crucial to success: they knew their way around the company, they had internal credibility and they were proven leaders. The logic of the earlier strategy was that younger leaders would be less imbued with the “IBM way” and more likely to try new approaches. More often than not, those leaders failed. As well, IBM cycled top managers through the EBO, helping cross-pollinate new products, technologies and processes throughout the entire company. This built internal awareness of EBO products, drove cross-selling and triggered new innovation. Finally, IBM used its strong cash flows to put its money where its mouth was, properly investing in the EBO and its initiatives from R&D through to product commercialization.
These innovations have generated impressive results. Between 2000 and 2005, EBO projects added $15.2B to IBM’s top line. When compared to IBM’s M&A strategy at the time, EBOs added 19% to IBM’s top-line while M&A delivered 9%.
As the IBM case shows, scale and pedigree can be significant weapons in enabling change. Indeed, other companies such as P&G, J&J and Corning have successfully followed a similar approach to reinventing their business models or shifting product focus. When it comes to change, size does matter.
For more information on our services and work please visit the Quanta Consulting Inc. web site.
The holy grail of CRM, the ability to leverage and monetize internal data, is now within reach of most medium to large enterprises. Low-cost computing power, new software tools and sophisticated math skills have converged to enable high-level data analytics, a powerful capability that can drive incremental revenue, improve workflow efficiency and enhance customer satisfaction. Basically, advanced analytics uses special algorithms to comb through large databases of transactions looking for important causal relationships between variables that can be leveraged to improve the efficiency and effectiveness of a program or process. For example, Internet Services and Retail companies are mining millions of their transactions to uncover critical (and hitherto unseen) insights about consumer and supplier behavior. In other cases, firms in the Consulting and Software industries are using observations from their own mountains of data (as well as from the clients they serve) to launch new practices focused on data management and consulting.
The business of helping firms make sense out of proliferating data is growing quickly. This industry, which includes leading IT players such as IBM, SAP, Microsoft and Oracle, has estimated revenues in excess of $100B. The market is growing at almost 10% a year, roughly twice as fast as the software market as a whole.
IBM is a pioneer in the use of mathematical models to analyze huge data sets. IBM’s analytics business began as an internal project undertaken by in-house mathematicians, who wanted to learn how to maximize revenue per client by analyzing years of sales data. The insights discovered in their work prompted them to retool their sales teams by account size and industry and to tweak their service offering. The result was $1B in new revenues and better sales coverage. Not surprisingly, IBM concluded that others could benefit from these capabilities and built an entirely new business analytics and optimization group within IBM Global Business Services to support them. To date, this group has already trained 4,000 consultants.
And they are busy. IBM mathematicians are using high-quantile modeling in the workforce analytics practice to help clients make decisions about human resources issues, such as how best to deploy their salespeople. In other cases, mathematicians are using stochastic optimization algorithms in the human resources and marketing practice areas to help clients find new customers and determine the right mix of experienced and junior programmers to staff large software projects.
Walmart generates reams of data through its Retail Link inventory management system. The company is using sophisticated analytics to crunch this data in a myriad of ways, turning information into a powerful profit accelerator. In one impressive example, Walmart’s analysis showed that it should offload inventory management to its suppliers, not taking ownership of products until the point of sale. This new strategy allowed the firm to decrease inventory risk, conserve cash flow and reduce its costs.
Like many telecom providers, Cablecom has grappled with churn. Using advanced data analytics, Cablecom discovered that although customer defections peaked in the 13th month, the decision to leave was typically made around the 9th month (as indicated by signals like the number of calls to customer support). To reduce defections, Cablecom offered at-risk customers special deals seven months into their subscription. The results were impressive: customer defections fell from 20% of subscribers a year to under 5%, enabling the firm to save significant marketing acquisition costs while boosting customer satisfaction.
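A hedged sketch of the kind of analysis described: flag subscribers in months 7 to 9 of their tenure whose support-call volume is unusually high, so retention offers go out before the typical month-9 decision point rather than after the month-13 defection peak. The tenure window follows the article; the z-score rule, threshold and field names are my own illustrative assumptions, not Cablecom’s actual model.

```python
# Illustrative at-risk flagging (not Cablecom's actual model): subscribers
# in months 7-9 of tenure whose support-call volume is well above the
# customer-base average get targeted with a retention offer.

from statistics import mean, pstdev

def flag_at_risk(subscribers, z_threshold=1.5):
    """subscribers: dicts with 'id', 'tenure_months' and 'support_calls'
    (calls in the last month). Returns the ids to target with offers."""
    calls = [s["support_calls"] for s in subscribers]
    mu = mean(calls)
    sigma = pstdev(calls) or 1.0  # avoid division by zero when all equal
    return [
        s["id"]
        for s in subscribers
        if 7 <= s["tenure_months"] <= 9
        and (s["support_calls"] - mu) / sigma >= z_threshold
    ]
```

A real implementation would use richer signals and a proper survival or churn model, but the shape is the same: an early behavioral proxy stands in for the later, costlier event.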
Regardless of your data management objectives and strategy, there is gold in those terabytes of data.
For more information on our services and work, please visit the Quanta Consulting Inc. web site.