Welcome to my Blog - Happy Reading!
"Biz-Integrate" discusses the powers of business process integration and improvement, e-Commerce, and enterprise modernization and collaboration, for solving immediate business challenges and long term strategic goals.
Monday, April 18, 2011
Business Process Integration for Dummies
Late last year I worked with a global software development and integration company to write a Business Process Integration For Dummies eBook (a.k.a. "BPI for Dummies"). This is now available for download. For those of you who have not received a direct marketing email from this company, please go ahead and download the Business Process Integration for Dummies eBook at http://www.bpifordummies.com/. This is a great introduction to business process integration, and it also references other useful links for more advanced integration research and literature.
Tuesday, February 1, 2011
The Art of Change Management
If you want your projects to come in on time, on target, and on budget, some sort of change management methodology is required. Change management plays a critical role in defining the successes and failures of today's business-centric and IT-driven projects. It's an integral part of standard project management and Business Process Management (BPM) methodologies. In a recent CIO article, a survey of over 500 companies revealed that less than two-thirds of projects came in on budget, less than 60% achieved their ROI targets, and only 40% of the companies surveyed employed any sort of Project Management Office (PMO) or change management methodology. In addition, it is typical to see upward of 33% of the product development cycle time wasted on unnecessary work, or stalled waiting for decisions or information regarding change. Adherence to a standard change management methodology would address these problems. So what is change management, and how does it relate to the world of IT?
What Is Change Management?
Change management refers to the art of managing large organizational changes in business and maximizing the collective efforts of all the people involved in those changes. It spans all areas of an organization and is of particular importance in organizational development, IT management, strategic management, and process management. Since change is typically disruptive to an organization, change management seeks to both minimize the impact and increase the efficiency of the change, enabling the business entity to maintain its focus on continued growth. Effective change management requires business acumen, people skills, resource management, expectation setting, problem analysis, and the management of corporate politics. Its roots derive from the business reengineering practices of the 1950s and 1960s, when multinational organizations such as U.S. and Japanese auto manufacturers were looking for innovative means to restructure their businesses and streamline their business processes in pursuit of greater global market share and maximum ROI. From these initiatives grew methodologies and practices within the business reengineering and BPM world, geared toward managing process change from both psychological and technological standpoints. From this marriage of human resources, BPM, and technology grew the art of change management incorporated into today's leading project management methodologies and practices.
Kurt Lewin, the founder of modern social psychology, developed one of the earliest change models in 1951. His model described change as a three-step process. Step one, "unfreezing," referred to washing away the present organizational mindset and its resistance to change. Step two was "change," and it was in this step that the organizational, business, and technological process changes would occur.
The final step was "refreezing," in which the organization achieved its original comfort level with the newly implemented process or process change. Delving under the covers of change management, we can summarize its key concepts and objectives as follows:
- Planning, testing, and implementing all aspects of the transition from one organizational structure or business process to another
- Defining the organizational behavior that will best support new work practices and overcome resistance to change
- Approving changes and documenting and communicating the impact of those changes to the organization
- Implementing, tracking, and monitoring changes in a visible, controlled manner.
Change Management with IT
With such high dependency placed upon IT and IS systems, the business world can ill afford a poor, or poorly adhered to, IT change management methodology. Yet close to 80% of system failures can be attributed to unplanned changes, and 20% of planned changes result in system outages due to factors such as a lack of visibility into dependencies. These system failures--be they related to hardware, software, middleware, or communications--are highly visible within the organization, causing certain business processes (e.g., call centers) to fail completely or become severely impeded. However, the visibility of these IT-related system failures rarely stops there. Typically, failures extend beyond the boundaries of the organization to affect its customers and trading partners, who have become reliant upon exchanging business documents electronically using e-commerce Web systems or who have intrinsically tied one or more of their own business processes to those of the source organization. So what constitutes an IT change management strategy or methodology? Change management as it relates to the world of IT is the ability to manage change, and requests for change, to the IT services and infrastructure of an organization. These changes rarely stand alone; rather, they are tied to a larger business process change with an associated, observable ROI objective. An IT change management strategy should include the following:
- IT service level agreements (SLAs) encapsulating all areas with any level of dependence upon IT infrastructure and IS systems (internal and external to the organization)
- IT organizational structure and procedures
- A process responsible for controlling and managing requests to effect changes to the IT infrastructure and IS services to promote business benefit
- A control mechanism to manage the implementation of changes that are subsequently approved
- Procedures to address minimum disruption to the IT infrastructure and IS services during the implementation of changes
- The process of planning, coordinating, and implementing changes to the information processing production, distribution, and system facilities
- A clear tie between the IT change management process and the originating business change process, leading to the eradication (or reduction) of duplication of work effort across business entities.
Delving a little deeper into this IT strategy, the change management methodology must provide processes and incorporate toolsets to address the following issues:
Configuration Management:
An IT department will typically have a configuration management database, or asset register, where each piece of hardware and all elements of the corporate IT network infrastructure are logged and the status and configuration of the organization's IT and communication infrastructure is fully known. Tight integration between the change management process and configuration management means that the state of any network hardware undergoing change is automatically updated as the change request progresses toward completion.
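As a minimal sketch of that tight integration, the following toy in-memory CMDB (all class and field names are hypothetical, not from any particular product) updates an asset's status automatically as a linked change request moves through its lifecycle:

```python
from dataclasses import dataclass

# Hypothetical in-memory configuration management database (CMDB):
# each asset records its current status, and a change request that
# touches an asset updates that status as the request progresses.

@dataclass
class Asset:
    name: str
    status: str = "operational"      # operational | under-change

@dataclass
class ChangeRequest:
    cr_id: str
    affected: list                   # names of assets touched
    state: str = "open"              # open -> in-progress -> complete

class CMDB:
    def __init__(self):
        self.assets = {}

    def register(self, asset):
        self.assets[asset.name] = asset

    def apply_change_state(self, cr):
        # Tight integration: asset status follows the change request state.
        new_status = {"open": "operational",
                      "in-progress": "under-change",
                      "complete": "operational"}[cr.state]
        for name in cr.affected:
            self.assets[name].status = new_status

cmdb = CMDB()
cmdb.register(Asset("web-server-01"))
cr = ChangeRequest("CR-1001", affected=["web-server-01"], state="in-progress")
cmdb.apply_change_state(cr)
print(cmdb.assets["web-server-01"].status)  # under-change
```

A real CMDB would of course persist this and track far more configuration detail; the point is simply that asset state is derived from, not maintained separately from, the change request.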
Incident and Problem Management:
Known errors, reported in the IT support or call center incident logging systems, are usually linked with change requests. As the change request is successfully implemented, these incidents necessitate closure, and both IT management and the originators of the incidents require notification (typically through the engineering of a workflow process).
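That incident-to-change linkage can be sketched as a small workflow step (the record layout and the `notify` callback here are illustrative assumptions, not any vendor's schema): when a change request completes, its linked open incidents are closed and both the originators and IT management are notified.

```python
# Hypothetical sketch: on successful implementation of a change request,
# close the incidents linked to it and notify originators plus IT management.

def close_linked_incidents(change_request, incidents, notify):
    closed = []
    for inc in incidents:
        if inc["linked_cr"] == change_request["id"] and inc["status"] == "open":
            inc["status"] = "closed"
            notify(inc["originator"],
                   f"Incident {inc['id']} closed by {change_request['id']}")
            closed.append(inc["id"])
    notify("it-management",
           f"{change_request['id']} implemented; {len(closed)} incident(s) closed")
    return closed

messages = []
cr = {"id": "CR-2042", "status": "complete"}
incidents = [
    {"id": "INC-1", "linked_cr": "CR-2042", "status": "open", "originator": "alice"},
    {"id": "INC-2", "linked_cr": "CR-0007", "status": "open", "originator": "bob"},
]
closed = close_linked_incidents(cr, incidents,
                                lambda to, msg: messages.append((to, msg)))
print(closed)  # ['INC-1']
```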
Security:
Each step, or milestone, in the change management process requires defined security controls built to ensure compliance with the methodology, completeness, synchronization with dependent tasks, and readiness for scheduling of promotion to the next phase, or environment, within the change management process.
Environmental Management:
As an organization's IT infrastructure becomes more complex and intricate to better support the underlying business, a request for change rarely results in an isolated change to an individual IS system; typically, it reaches much further and may span a multitude of areas (e.g., application development, middleware, systems configuration, hardware, electronic data interchange). For example, an enhancement to an e-commerce system may result in changes to the Web application residing on one or more application servers, target databases residing on one or more database servers, verification of or changes to application server and Web server systems and software configurations, compatibility with industry supported browsers, middleware compatibility (e.g., messaging, integration, EDI, Web services), integration with external business entities, and so on. Depending on the size of an organization and its IT department and the complexity of its systems and infrastructure, multiple configuration environments will exist to mirror all or a part of the true production IT environments. A high-end SMB or enterprise-level organization may have the following environments:
- Multiple standalone development environments for their application developers, system integrators, and systems operations staff
- A development test environment in which the standalone change is tested and verified in a standalone test environment
- A system test environment in which the change is tested within the system context of the change (e.g., an application system change is tested as an integral part of its entire application but not outside of that boundary)
- An integration test environment in which project leaders/managers can orchestrate and simulate tests across all aspects of the change or effect of the change (e.g., application software, hardware, middleware, configuration, and networking)
- A quality assurance environment in which quality assurance testers perform end-to-end testing of the change requests in an environment that closely mimics the organization's production environment, using standard, pre-approved test scripts. This may include coordinating their tests with external trading partners and business users.
- A user test environment in which key users from the organization's business entities affecting this change can test in a production-like environment using standard, pre-approved test scripts. All aspects of the change will be tested here (IT and business).
- A staging environment in which key customers are allowed to review changes prior to those changes being loaded into the production environment
- A production environment.
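A change management toolset typically enforces that sequence mechanically. As a minimal sketch (the environment names below paraphrase the list above; any real pipeline would add approval gates between steps), a change may only be promoted to the next environment in order:

```python
# Hypothetical sketch: a fixed promotion order mirroring the environments
# above; a change may only move one step forward at a time.

PIPELINE = ["development", "dev-test", "system-test", "integration-test",
            "qa", "user-test", "staging", "production"]

def promote(current_env):
    """Return the next environment in the pipeline, or raise if done."""
    idx = PIPELINE.index(current_env)
    if idx == len(PIPELINE) - 1:
        raise ValueError("change is already in production")
    return PIPELINE[idx + 1]

print(promote("qa"))  # user-test
```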
Impact Analysis:
Before a request for change can be approved, the nature, or impact, of the proposed change must be quantified. From an IT perspective, this means an analysis of what hardware, software, applications, middleware, systems configurations, and even network infrastructure will be affected by the change. Then, from this analysis, a list of tasks, subtasks, and dependencies must be identified. Next, this list must be related to the equivalent list of tasks and subtasks identified within other business entities affected by the requested change. Conflict management also plays a role in comparing the nature of this requested change to that of other change requests and approved changes in progress. Depending on priorities and the amount of conflict or overlap identified with other requested changes and changes in progress, a potential timeline can be determined or the requested change denied.
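The task/dependency and conflict steps above are essentially graph problems. As a minimal sketch (the task names are invented for illustration), dependencies can be resolved into a valid work order with a topological sort, and conflicts flagged as the overlap between two change requests' affected components:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical sketch of impact analysis: order tasks by dependency,
# and flag conflicts where two change requests touch the same component.

def task_order(dependencies):
    # dependencies maps each task to the set of tasks it depends on.
    return list(TopologicalSorter(dependencies).static_order())

def conflicts(cr_a_items, cr_b_items):
    # Components affected by both change requests are potential conflicts.
    return sorted(set(cr_a_items) & set(cr_b_items))

deps = {"deploy-app": {"update-db-schema", "configure-middleware"},
        "configure-middleware": {"update-db-schema"}}
print(task_order(deps))
# ['update-db-schema', 'configure-middleware', 'deploy-app']
print(conflicts({"app-server-1", "db-server-2"},
                {"db-server-2", "web-server-3"}))
# ['db-server-2']
```

Priorities and scheduling would then decide whether the conflicting change proceeds, waits, or is denied, as described above.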
Version Control:
Version control is the core feature of most IT change management software toolsets. It enables those objects, identified through impact analysis or software design, to be managed through the IT change management process. These object groups can reside on multiple hardware platforms or even across different network or communications infrastructures. Version control incorporates the linking together, or grouping, of like objects affecting the requested change and the promotion of these object groups through the chosen environments (whilst adhering to the laws of security). It also enables "roll back" when issues are identified with one or more parts of the change. The identified object groups' (software, middleware, or configuration management) new or modified objects are backed out, and the underlying infrastructure is set back to its previous version (or, in theory, any version).
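The grouping-plus-rollback behavior can be sketched in a few lines (a deliberately simplified stand-in for a real version control toolset; object names and "versions" here are arbitrary strings): promoting a changeset snapshots the prior versions of every object in the group, so a failed change can be backed out as a unit.

```python
# Hypothetical sketch: objects touched by a change are grouped into a
# changeset; promotion records prior versions so the whole group can be
# rolled back together if the change fails.

class ObjectStore:
    def __init__(self, objects):
        self.objects = dict(objects)   # object name -> current version
        self.history = []              # stack of per-changeset snapshots

    def promote(self, changeset):
        # Record prior versions, then apply the grouped change as a unit.
        snapshot = [(name, self.objects.get(name)) for name in changeset]
        self.history.append(snapshot)
        self.objects.update(changeset)

    def roll_back(self):
        # Restore every object in the most recent changeset.
        for name, prior in self.history.pop():
            if prior is None:
                del self.objects[name]
            else:
                self.objects[name] = prior

store = ObjectStore({"order-service": "v1", "db-schema": "s5"})
store.promote({"order-service": "v2", "db-schema": "s6"})
store.roll_back()
print(store.objects)  # {'order-service': 'v1', 'db-schema': 's5'}
```

Keeping the full history stack is what makes the "in theory, any version" restore mentioned above possible.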
Documentation encapsulates a multitude of sins and plays an important role in enabling the visibility to IT management and business management of change status and change history:
- Reports on the status of the change
- Environment promotion schedules
- Implementation schedules
- Test scripts
- Reporting or roll-up of tasks and subtasks to the overall project
- Workflow for approval and security
- A history of requests to assist in continuous quality improvement.
Don't underestimate the importance of change management and the need for a complete change management methodology. Whilst most organizations have implemented one or more pieces of the puzzle (e.g., version control and incident systems), very few have embraced the whole picture. Available statistics on project success and failure rates speak to this very point. The implementation of, and adherence to, a change management methodology is a major phase in an organization's migration toward quality business processes and toward IT systems that genuinely fulfill their original intent: supplying a reliable, supportive infrastructure to those business processes.
Saturday, November 13, 2010
Success - LANSA Colorado
A big thank you goes out to everyone who attended this week's LANSA Colorado Get-Together. It was a very successful evening spent in great company at Slattery's Irish pub in the Denver Tech Center.
We had a great turnout, with about 30 people representing such companies as Dairy Information Systems, Dean Foods, Vistar, CoBank, Bank of America, MBM, Majestic Ventures, Stratum Global, LANSA, and more. Some of us came from as far as Boulder (over an hour away) to attend, which is much appreciated. The feedback on the event has been very positive indeed, and I look forward to our next event, tentatively planned for the middle of the first quarter of 2011.
For those of you who could not make it, we all look forward to seeing you next time, early in the New Year - this is an "open house" and all are welcome. Remember, this is a pure networking / social event aimed at enabling companies to interact and leverage each other's knowledge and experiences in using proven, leading-edge (and bleeding-edge) LANSA technology to empower their businesses to greater efficiency, productivity, and profitability.
Thank you, everyone.
Wednesday, August 11, 2010
The Information Explosion - Why We Hoard Data
The amount of information in the world is growing at an exponential rate, for a number of reasons. Advances in technology have led to a new generation of digital devices with increased capabilities, which are digitizing information that was previously unavailable. There are also now significantly more people who interact with information. Between 1990 and 2005, more than 1 billion people worldwide entered the middle class - as societies become richer they become more literate, which fuels information growth. In recent years, a number of governments and global enterprises have embarked upon large-scale information gathering initiatives - for example, Google's "Books Library" project (to digitize and make searchable every book in the world in all languages), the American Defense Department's "Total Information Awareness" project (to gather and store personal information on its residents), and the European Union's "Data Retention Directive" (to collate personal information and all communications activities on its residents).
All this data is reshaping our world, economically as well as socially, and there are visible signs that it is already starting to transform commerce, science, government, and everyday life. It has the potential to be for the greater good—as long as governments, consumers, and businesses make educated choices about when to restrict the flow of data, and when to encourage it.
In business, organizations today hoard data for two primary reasons: compliance and auditing, and visibility and planning. Compliance and auditing ensure proper adherence to government and industry regulations regarding the integrity, accessibility, confidentiality, and retention of important data. Depending on whether you are a public, private, or government entity, and upon your industry vertical, this can include Sarbanes-Oxley, HIPAA, FISMA, GLBA, Basel committee initiatives, and even e-Discovery. Compliance and auditing cross over into data management (an industry unto itself) and go beyond the issues of adequate backups, archiving, or disaster preparedness. Regulations often prescribe severe financial and criminal penalties for organizations that fail to meet established standards, forcing many organizations to re-evaluate the way their data is handled and secured. Consequently, data compliance and auditing are a centerpiece of modern data management practices. Visibility and planning means providing business leaders with timely, relevant, and quality information that empowers them with a better understanding of their commercial context, so that they may make intelligent decisions regarding the current and future state of their organization (e.g., predict and respond to opportunities and threats, optimize operations to capitalize on new sources of revenue, and proactively manage risk while ensuring efficiency).
Common technology related techniques include business intelligence and business analytics. Business intelligence (querying, reporting, and OLAP) provides insights and tools that address “what and where” - “what happened”, “where exactly is the problem”, and “what actions are needed”. Business analytics (statistical and quantitative analysis, predictive modeling and fact-based management) focuses on trends, predictions and optimizations – “why is this happening”, “what if these trends continue”, “what will happen next” and “what is the best that can happen”. Business intelligence (BI) and business analytics are continuing to evolve. The newest generation of tools and techniques focus on the spread of predictive analytics, real-time performance monitoring and stream processing technologies, the use of “in-memory” products for faster analysis, embracing open source such as the “R” programming language, and the evolution of software-as-a-service for faster deployment.
A few industry verticals have taken the lead in their ability to collate and exploit data. For example:
In 2004, Florida was hit with a series of hurricanes in short succession. After "Charley" (the first of these hurricanes) had passed, Wal-Mart management analyzed Floridian consumer spending patterns immediately prior to the hurricane warning. Although bottled water, batteries, and canned goods sold well, the far-and-away number one selling items were found to be beer and strawberry Pop-Tarts. Wal-Mart used this information to ensure their Florida stores were sufficiently stocked with these items during hurricane season, leading to increased sales and revenue. This year, Wal-Mart and Yahoo are teaming up on a campaign called "365 Days of Mom". The objective is to interrogate Yahoo's databases (analyzing searches and clicks against demographics) to provide greater insight for Wal-Mart into the purchasing mindset of moms - when do they start helping their daughters shop for prom dresses, when do they start shopping for Valentine's Day gifts, and so on. With the right information, Wal-Mart can operate more efficiently by tapping the right markets at the right time. Also, by analyzing "basket data", supermarkets are now tailoring promotions to particular customers' preferences.
Amazon and Netflix use a statistical technique called collaborative filtering to make recommendations to users based on what other users like. 65% of their recommendations have resulted in sales, which has produced millions of dollars of additional revenue.
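A toy version of collaborative filtering can be sketched by co-occurrence counting (a deliberate simplification of the statistical techniques Amazon and Netflix actually use; the item names and baskets are invented): recommend the items that most often appear alongside a user's items in other users' baskets.

```python
from collections import defaultdict

# Hypothetical sketch of collaborative filtering by co-occurrence:
# score each candidate item by how many other baskets contain it
# together with something the user already has.

def recommend(user_items, all_baskets, top_n=2):
    scores = defaultdict(int)
    for basket in all_baskets:
        if user_items & basket:                  # basket overlaps the user's items
            for item in basket - user_items:     # candidates the user lacks
                scores[item] += 1
    # Highest score first; break ties alphabetically for determinism.
    ranked = sorted(scores.items(), key=lambda kv: (-kv[1], kv[0]))
    return [item for item, _ in ranked[:top_n]]

baskets = [{"book-a", "book-b"}, {"book-a", "book-c"},
           {"book-b", "book-c"}, {"book-a", "book-b", "book-d"}]
print(recommend({"book-a"}, baskets))  # ['book-b', 'book-c']
```

Production recommenders weight by similarity measures and rating data rather than raw counts, but the core idea of "users who liked X also liked Y" is the same.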
EBay monitors listing activity, bidding behavior, pricing trends, search terms and the length of time users look at a page. Lots of searches but few sales for an expensive item may signal unmet demand, so eBay will find a partner to offer sellers insurance to increase listings.
Farecast, a part of Bing, interrogates over 225 billion flight and price records so it can advise customers whether to buy an airline ticket now or wait for the price to come down. The same idea is being extended to hotel rooms and cars. By providing this superior, “value-add” service, Farecast separates itself from the competition and attracts more potential customers to its site, leading to higher revenue generation capabilities.
By monitoring all purchases made, credit-card companies can now identify fraudulent transactions with a high degree of accuracy, using rules derived by crunching through billions of transactions. For example, stolen credit cards are more likely to be used to buy hard liquor than wine.
Insurance firms are more attuned to spotting suspicious claims - fraudulent claims are more likely to be made on a Monday than a Tuesday, since policyholders who stage accidents tend to assemble friends as false witnesses over the weekend.
Mobile-phone operators analyze subscriber calling patterns to determine whether most of their frequent contacts are on a rival network. If that rival network is offering an attractive promotion then this significantly increases the likelihood of that subscriber defecting - he or she is then red flagged to be offered an incentive to stay (and his or her contacts on a rival network offered incentives to defect).
In health care the trend is towards “evidence-based medicine”, where not only doctors but computers also get involved in diagnosis and treatment. Aggregated data is mined to spot unwanted drug interactions, identify the most effective treatments and predict the onset of disease before symptoms emerge. The on-going digitization of records will make it easier to spot and monitor health trends and evaluate the effectiveness of different treatments.
Whereas traditional “brick and mortar” businesses generally collect information about customers from their purchases or surveys, those doing business over the internet are able to collect the complete “click exhaust” – not only purchases, but what was searched, browsed, clicked, promotions and advertisements that generated interest, and so on. Companies that grasp these new opportunities, or provide the tools for others to do so, are already reaping the rewards of their endeavors. Knowledge is power, and the information management industry already generates $100 billion annually and has a healthy 10% growth rate.
For all the successes to date related to the data explosion, there have also been many failures. For example, during the recent financial crisis it became clear that banks and rating agencies had been relying on models which, although they required a vast amount of information to be fed in, failed to reflect financial risk in the real world. Another example is the well documented flaws in the systems used to identify potential terrorists.
In January 2000 the tidal wave of data pouring into the National Security Agency (NSA) brought the system to its knees. The agency was “brain-dead” for over three days. As the NSA’s director stated at the time “We were dark. Our ability to process information was gone.”
Also of concern is energy consumption - processing enormous volumes of information takes significant power. In 2006, the NSA came close to exceeding its power supply, which would have blown out its electrical infrastructure. Microsoft, Google, and similar companies, have built some of their largest data centers next to hydroelectric plants to ensure access to enough energy at a reasonable price.
Information and technology are neither good nor bad - it depends on how they are used. The world today contains a torrent of digital information which empowers us to accomplish things now that previously could not be done - prevent disease, combat crime, identify business trends, and so on. Managed well, the data explosion can be used to unlock new sources of economic value and provide fresh insights into science, and perhaps even hold governments to account.
Thursday, June 10, 2010
The Rise of the Tech Savvy User (The Case for BPI – Part Two)
Oscar Wilde once said, "It is a very sad thing that nowadays there is so little useless information." The year was 1894. If only he were alive today to witness a civilization now firmly addicted to data. Studies from UCSD (University of California San Diego) state that, in 2008, American households were bombarded with 3.6 zettabytes of information - that's 3.6 trillion gigabytes, or the equivalent of 34 gigabytes per person per day. The biggest culprits were TV and video, followed by phone, music/iPod, social media, and the internet, with written content amounting to less than 0.1%. However, there are marked changes in recent data consumption trends. The amount of reading people do has tripled since 1980 (due to internet searching) and, importantly, information consumption has migrated from passive to interactive.
Social media is now an integral part of our personal (and business) lives. 55.6 million American adults (about 32% of the US population) visit social media networks at least monthly - compared to 18% in 2008 and 15% in 2007. Americans already average over 5 hours per month on Facebook and 50 million tweets per day. Gartner predicts 1 billion Facebook users by year end, with significant increases in traffic and unique visitors - European traffic alone grew last year by over 300% (999% in Spain, and 2,721% in Italy). YouTube last year served 75 billion video streams to 375 million unique visitors (at a rumored cost to Google, its owner, of 1.65 million dollars per day - a drop in the ocean when you consider it reported 23.6 billion dollars in annual revenue for 2009). Over the last two years, Twitter has averaged a global annual growth rate of 1,380%, and as of the end of 2009 there were 75 million user accounts. Social news sites (such as Digg, DiggBar, Reddit, and StumbleUpon) fared well through the end of 2009, with Digg averaging 36 million unique visitors per month (though these numbers dropped through the first half of 2010).
It is no wonder that 2009 was dubbed the year of social media. Contrary to public opinion though, social media is not being fueled solely by the young - the fastest-growing social media demographic is the 35-to-49 age bracket. Notably, at the close of 2008, social networking overtook email in terms of global reach.
A significant portion of the world is now almost permanently connected. There are nearly 5 billion active mobile phone subscriptions compared to a global population of around 7 billion people. 27% of the world has internet connectivity (77% in North America), a 400% increase since 2000.
Organizations today capitalize on the technical savvy of their target consumer and have extended their eCommerce strategies beyond SEO and SEM to include SMO (social media optimization) and socialytics. In return, this tech-savvy consumer is now more demanding, expecting relevant and personalized information to be pushed to them by default via text, email, RSS feeds, and member community and forum subscriptions. The high level of personalized and timely content we receive through rich contemporary user experiences in our daily lives raises the bar on what we expect when we transition to our business lives, and these two worlds are rapidly colliding. For example:
1. If amazon.com can give me regular emails, texts, and web updates on the status of my orders and shipments and their anticipated arrival dates, and send me suggested products for purchase that match my profile, and
2. If my airline can proactively phone me with flight and gate announcements and changes, enable me to check in on-line, have my boarding pass sent to my mobile phone and allow me to use this for scanning at ticketing and security gates, and email me suggested future travel itineraries based upon my history, then
Why can’t my trading partners automatically collaborate with me and share relevant information in a timely, electronic, and streamlined fashion:
1. Automatically send/receive regularly updated purchasing forecasts that, in turn, feed electronically into updated manufacturing schedules and raw material inventory plans, and facilitate the automatic placement and updating of orders with suppliers
2. Share warehouse and distribution center inventory forecasts to reduce both excess inventory and backorders resulting from insufficient stock
3. Communicate order changes and cancellations, anticipated shipments and deliveries of raw materials and finished products, and so on.
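The kind of automated trading-partner collaboration described above can be sketched in a few lines of code. This is a minimal illustration only, assuming a hypothetical JSON message format (`purchase_forecast`) and made-up partner identifiers rather than any real EDI or industry standard:

```python
import json

def build_forecast_message(partner_id, period, forecast_units):
    """Build a hypothetical purchasing-forecast message that a buyer
    might push automatically to a supplier (illustrative, not a standard)."""
    return json.dumps({
        "type": "purchase_forecast",
        "partner": partner_id,
        "period": period,
        "forecast_units": forecast_units,
    })

def plan_replenishment(message, on_hand, safety_stock):
    """Supplier side: net the incoming forecast against inventory already
    on hand to produce an automatic replenishment order quantity."""
    forecast = json.loads(message)
    needed = forecast["forecast_units"] + safety_stock - on_hand
    return max(needed, 0)  # never place a negative order

msg = build_forecast_message("retailer-001", "2010-Q3", forecast_units=12000)
print(plan_replenishment(msg, on_hand=4500, safety_stock=1000))  # 8500
```

In practice such exchanges run over EDI, AS2, or web services, but the principle is the same: a forecast received electronically feeds straight into the next planning step with no manual re-keying.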
This pampered tech-savvy user is also a powerful one with a collective voice. Case in point: in 2009 Pepsi/Tropicana changed their orange juice carton packaging. Consumers were so incensed with the new carton design that they voiced their displeasure over social networks. Within days this reached a tipping point, with national newspapers picking up on the social media traffic and running editorials, and Tropicana’s competitors mass-mailing disgruntled Tropicana customers with coupons for their own products. Pepsi ultimately recalled all the new Tropicana orange juice cartons nationally and replaced them with the original carton design – this took three months, at an estimated cost of 35 million dollars and a permanent loss of 20% of their customer base. A lesson in marketing, and in underestimating the power of the modern consumer.
This drive to fulfill the instantaneous needs and expectations of the modern tech-savvy consumer and business user is further pushing organizations to work more efficiently, to collaborate across their demand and supply chains more effectively, and to modernize the technology toolsets they provide in order to enhance the capabilities, productivity, and quality of their workforce.
The economist Herbert Simon once wrote, “What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention.” However, we rarely deal with raw data, but rather with processed data that has been aggregated and massaged via such technologies as business intelligence and analytics, recursive machine learning, augmented reality (AR), and augmented virtuality. In my next posting I will examine the successes of data mining and the data deluge, along with the pitfalls (there are many).
Sunday, May 23, 2010
The Case for BPI (Part One – The Information Explosion)
Rollin Ford, Wal-Mart’s CIO, earlier this year stated, “Every day I wake up and ask… how can I flow data better, manage data better, analyze data better?” Not surprising when you consider that Wal-Mart processes over 1 million customer business transactions every hour, and manages databases over 170 times the size of the entire Library of Congress (the largest library in the world). However, Wal-Mart is not an isolated phenomenon. We are in an era that has been referred to as the “Industrial Revolution of Data” – The Economist calls it the “Data Deluge” and describes data as “the new raw material of business, on a par with capital and labor”.
In 2005, mankind generated 150 billion gigabytes (or 150 exabytes) of information; here in 2010 we are expected to generate a whopping 1,200 billion gigabytes (or 1,200 exabytes). Digital data is increasing at a compound growth rate of around 60% per year, and this growth rate is expected to accelerate, not slow. Google now handles 35,000 queries each second, and processes more data in half a day than the US Postal Service is expected to manage and deliver all year (about 5 petabytes’ worth, or 5 million gigabytes).
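The compound growth behind these figures is easy to sanity-check: growing from 150 exabytes in 2005 to 1,200 exabytes in 2010 implies a compound annual growth rate of roughly 52%, broadly consistent with the cited 60% figure. A quick sketch (the projection in the last line is my own extrapolation, not a figure from the studies):

```python
def implied_cagr(start, end, years):
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

def project(start, rate, years):
    """Project a value forward at a given compound annual growth rate."""
    return start * (1 + rate) ** years

# 150 exabytes (2005) -> 1,200 exabytes (2010): an 8x increase in 5 years
rate = implied_cagr(150, 1200, 5)
print(round(rate * 100, 1))  # 51.6 (% per year)

# At a steady 60% per year, five more years would give roughly 12,600 exabytes
print(round(project(1200, 0.60, 5)))  # 12583
```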
Corporate America is expected to archive 27 billion gigabytes (or 27 exabytes) of data this year alone. However, in 2007 the amount of data being generated began to exceed global storage capacity, so we are now learning to prioritize what we store and what we disregard. For example, experiments at the Large Hadron Collider at CERN (Europe’s particle-physics laboratory) generate 40 terabytes every second – orders of magnitude more than can be stored or analyzed, so scientists collect what they can and let the rest dissipate into the ether.
A significant percentage of the information being generated is also being shared. Cisco predicts that by 2013 the amount of traffic flowing over the internet annually will reach 667 billion gigabytes (or 667 exabytes) – and it is already increasing faster than global networks can keep pace with.
The sizes and growth rates of the data being generated, archived, managed, and exchanged are now so vast as to be almost too hard to comprehend, and they have already begun to transform all aspects of business, government, and our daily lives (in a future blog posting I will provide examples, and examine how this data deluge can be both potentially great and not without its pitfalls).
This information growth places an ever-increasing burden on our systems, resources, and processes, and heightens the urgency of mitigating errors and re-work. The importance of how we manage this information explosion, and of our ability to extract the “nuggets of gold” hidden under these “mountains of data”, cannot be overestimated. Providing decision makers and decision influencers with timely, reliable, aggregated, collaborative information assembled across all domains, so that they can make important, timely, and strategic business decisions, is what ultimately separates leading organizations from the pack.
The power, potential, and significance of information management have not gone unnoticed in the technology industry. Information management already generates $100 billion a year in revenue and is growing at a 10% annual rate.
Organizations worldwide conduct business today at a far greater pace than ever before, and are expected to respond and adapt to change almost instantaneously. With so much more information at their fingertips, the sheer quantity of data drives the necessity for greater reliability and quality. There is more pressure than ever to be agile, to operate more efficiently, and to collaborate with our demand and supply chain trading partners and customers – driving the need for business process improvement and business process integration.
Figures from industry groups and analysts confirm that those of us who buy into the BPI program will reap the rewards and benefits. For example, in 2007 and 2008, the average total return for companies in AMR Research’s “Supply Chain Top 25” was between 5% and 11% higher than that of the companies comprising the Dow Jones Industrial Average and the Standard and Poor’s 500 index. Other studies by leading global business and strategy firms (such as Bain & Company) demonstrate that companies employing sophisticated BPI programs and supply chain methods enjoy 12 times greater profit than companies with no, or unsophisticated, BPI programs and supply chain methods.
Only 74 companies of the original S&P 500 were still on the list 40 years later, which equates to a mortality rate of about 10 companies per year. The average life span of an S&P company has now decreased from 50 years to 25 years, and only one third of today’s major corporations are projected to survive as significant businesses over the next quarter of a century.
Which companies will innovate and modernize to stand the test of time? And which will stagnate with a “maintain the status quo” mindset and inevitably traverse the well-worn road to the global garbage heap? Only time will tell. So, if you and your business are not contemplating Business Process Improvement and Business Process Integration, the chances are your trading partners are, or your competition is – or both.