The information overload
Research suggests that, collectively, employees of the average Times 1000 company spend more than 600,000 person-hours each year simply trying to access information. Another study, by Scribe Technologies, translates this into financial terms, finding that UK companies alone lose around £4.7 billion a year because their employees cannot obtain the information they require in a timely and efficient way.
A further issue is whether this information is even available in a suitable form in the first place. In many organizations, a significant percentage of staff spend their entire time re-keying data for use in spreadsheets or other applications that support the business. By any standards, this constitutes an inappropriate use of skilled and costly resources.
Against this background, it’s hardly surprising that organizations will welcome any measure that promises a dramatic fall in the costs of obtaining the correct information when it is needed. Recent advances in IT have made it far easier for employees to make maximum use of the wealth of information that already exists within their organizations.
Data rich but information poor
For some years, IT departments have concentrated on providing efficient and fully functional transaction processing systems. As a result, many have built large databases that could prove immensely useful to the organizations concerned if only they could access the data effectively. Thanks to advances in software and hardware, the concept of business intelligence for the masses is now a reality.
Business Intelligence (BI) is a term coined by industry analysts the Gartner Group. It is used to describe the analysis of information from databases in order to improve the decision-making process.
Just a few years ago, most organizations had relatively few PCs. Their specification was laughably basic by today’s standards, with 386/486 processors, 250Mb hard drives and 4Mb or 8Mb of memory being the norm. Now an entry-level PC is likely to boast a 266MHz Pentium II processor, a minimum of 32Mb of memory and a 4Gb hard drive. Couple this with the latest Graphical User Interfaces (GUIs), networking capabilities, server technology, e-mail and the internet, and all the hardware ingredients required for a successful BI implementation are within reach of every desktop.
However, the hardware on your desk isn’t the only factor. Before purchasing any BI software, it’s important to understand the various types on offer. For example, does your organization require report writing, On-Line Analytical Processing (OLAP) analysis, data mining, or an integrated suite that combines all of these and more?
Reporting on the business
Traditionally, report writing has been the most popular method of producing information for an organization. With today’s sophisticated reporting tools, the report writer has become far more than just a printing mechanism.
The ability to deliver a report directly to a computer screen is the most obvious improvement in modern-day report writers. These tools now also support e-mail, web reporting for internet and intranet deployment and – probably the most useful benefit – the ability to link reports together intelligently, enabling exception reporting.
Summarized reports can be produced, with the user able simply to click on an item of interest in order to execute a more detailed report on that subject. Technology such as this eliminates the need to produce thousands of pages of paper, of which only very few may be of real importance to the business. A valuable benefit is therefore the reduced computing load on the transaction processing system.
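The summary, drill-down and exception pattern described above can be sketched in a few lines of Python. This is an illustrative sketch only: the report names, regions, figures and the target threshold are all invented, not taken from any particular BI product.

```python
# Illustrative sketch: a summary report whose lines can be "clicked" to run
# a more detailed linked report, plus an exception filter so that only the
# few items of real importance are shown. All figures are invented.
from collections import defaultdict

# Detail-level data: individual sales transactions per region.
transactions = [
    {"region": "North", "customer": "Acme", "amount": 1200},
    {"region": "North", "customer": "Bolt", "amount": 300},
    {"region": "South", "customer": "Core", "amount": 5000},
    {"region": "South", "customer": "Dyno", "amount": 4500},
]

def summary_report(rows):
    """Summarize sales by region (the report the user sees first)."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

def detail_report(rows, region):
    """The linked report executed when the user clicks a summary line."""
    return [r for r in rows if r["region"] == region]

def exceptions(summary, target=4000):
    """Exception reporting: flag only the regions that fall below target."""
    return {region: total for region, total in summary.items() if total < target}

summary = summary_report(transactions)        # {'North': 1500, 'South': 9500}
flagged = exceptions(summary)                 # {'North': 1500}
drill = detail_report(transactions, "North")  # the two North transactions
```

The point of the sketch is the linkage: the exception filter decides which summary lines merit attention, and only those are drilled into, rather than printing every detail row.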
On-Line Analytical Processing (OLAP)
The real world is not two-dimensional, which is why OLAP systems have become increasingly popular as a way of providing users with a multidimensional approach to their data analysis. Most people are familiar with the two-dimensional format of spreadsheets – or possibly even three dimensions, when worksheets are used. In reality, though, people almost certainly need to analyze data in more than three dimensions, to provide a ‘real world’ view that enables data to be explored and analyzed from an enormous number of perspectives. This is where OLAP tools come to the fore.
Most OLAP tools provide a very powerful GUI that enables people to ‘slice and dice’ their data to perform complex analysis on specific areas of interest. OLAP data is normally extracted from a transaction processing system or from centralized repositories of data known as data warehouses, and can therefore exist in several formats according to the volume of data involved. The data is stored in a multidimensional database (MDDB), otherwise known as an OLAP cube.
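To make ‘slice and dice’ concrete, here is a minimal sketch in Python. The product, region and quarter dimensions and all the values are invented for illustration; a real MDDB pre-aggregates and indexes the data, whereas here a plain dictionary keyed by coordinates stands in for the cube.

```python
# A tiny three-dimensional "cube": each cell is keyed by its coordinates
# on the (product, region, quarter) dimensions. All values are invented.
cube = {
    ("Widgets", "UK", "Q1"): 100,
    ("Widgets", "UK", "Q2"): 120,
    ("Widgets", "France", "Q1"): 80,
    ("Gadgets", "UK", "Q1"): 200,
    ("Gadgets", "France", "Q2"): 90,
}

def slice_cube(cube, axis, value):
    """Slice: fix one dimension (e.g. region == 'UK'), keeping the rest."""
    return {k: v for k, v in cube.items() if k[axis] == value}

def dice_cube(cube, products, regions):
    """Dice: keep only a sub-cube of chosen members on two dimensions."""
    return {k: v for k, v in cube.items()
            if k[0] in products and k[1] in regions}

uk_slice = slice_cube(cube, axis=1, value="UK")       # three UK cells
sub = dice_cube(cube, {"Widgets"}, {"UK", "France"})  # the Widgets sub-cube
uk_total = sum(uk_slice.values())                     # 420
```

Each slice or dice is just another view over the same cells, which is why OLAP users can move between perspectives so freely.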
Desktop OLAP (DOLAP) applications give people with small data volumes the ability to store data locally on their PC; this is often called Multidimensional OLAP, or MOLAP. More scalable MOLAP solutions allow data to be stored on an NT or Unix server, with the user’s PC acting as a client, or access medium.
Computer server technology enables OLAP models to be widely shared among many users across a network. The processing power of the server performs the ‘number crunching’ of the data, enabling users to have much lower-specification desktop PCs – these now become the viewers of the data, rather than the processors. Such an approach helps to fully utilize the capability and capacity of a networked computer infrastructure.
In very large OLAP installations, such as those that administer loyalty and reward schemes in retail organizations, companies may choose to deploy a data warehouse. A more scalable Relational OLAP (ROLAP) tool may then be implemented above the data warehouse to extract key information that can make a real impact on the business.
Balancing the business
One of the latest initiatives in BI, balanced scorecards allow an organization to define specific business measures, which can be compared and consolidated to gain a high-level overview of overall business performance. Well-designed balanced scorecard systems will incorporate a high-quality GUI and provide a ‘drill-down’ capability to focus on trends and to analyze problem areas within the business.
Mining your data
When handling large volumes of data, trends and patterns can exist that are not immediately apparent from manual analysis. Data mining applications not only produce information that the user may have requested specifically, but can also be trained to identify obscure patterns and relationships by mimicking human reasoning.
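One of the simplest pattern-finding techniques, counting which items occur together in transaction data (the basis of ‘market basket’ analysis), can be sketched as follows. The basket contents are invented, and real data mining tools use far more sophisticated algorithms, but the sketch shows how a pattern invisible in raw rows surfaces once co-occurrences are counted.

```python
# Invented example baskets; a real tool would scan millions of transactions.
from collections import Counter
from itertools import combinations

baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "tea"},
    {"bread", "butter", "tea"},
]

def frequent_pairs(baskets, min_support=2):
    """Count co-occurring item pairs; keep those above a support threshold."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

pairs = frequent_pairs(baskets)
# ('bread', 'butter') occurs together in three of the four baskets,
# a relationship easy to miss when scanning raw transaction rows at scale.
```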
In certain circumstances, it may be advisable to deploy BI tools in a layer above a data warehouse. A data warehouse is a repository of data that is created from the ‘day-to-day’ transaction processing systems, and which is kept separate from them in order to ensure the integrity of the operational data. Typically, a data warehouse contains information consolidated from a variety of data sources, such as head office systems from different countries and external industry analysis.
Data warehouses are often held on a different hardware platform and in a different database that is better suited to the requirements of warehousing. One of the trends in data warehousing is the growing use of a Windows NT server as the host platform. Most database vendors provide NT versions of their databases – for example, Oracle, IBM with Universal Database (UDB, previously known as DB2) and Microsoft with SQL Server. Windows NT also provides a seamless platform for graphical BI analysis tools, as most are Windows-based.
In most cases, data within the warehouse will have been redesigned, ‘cleansed’ (for example, duplicates having been eradicated) and aggregated to make it easier to use BI tools. A data warehouse also helps to reduce the computing load on the main transaction processing system.
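A hedged sketch of what ‘cleansing’ and aggregation can involve in practice: the customer records below are invented, and real warehouse load tools handle far messier cases, but the principle is that duplicates are eradicated by normalizing a key field before comparison, and the cleaned rows are then aggregated for the BI layer.

```python
# Invented source records: the same customer keyed three different ways,
# as often happens when data comes from several operational systems.
records = [
    {"customer": "ACME Ltd ", "sales": 100},
    {"customer": "acme ltd", "sales": 150},
    {"customer": "Bolt plc", "sales": 200},
    {"customer": "Acme Ltd", "sales": 50},
]

def cleanse(records):
    """Normalize the key field so 'ACME Ltd ' and 'acme ltd' match."""
    return [{**r, "customer": r["customer"].strip().lower()} for r in records]

def aggregate(records):
    """Aggregate the cleaned rows, as a warehouse load step might."""
    totals = {}
    for r in records:
        totals[r["customer"]] = totals.get(r["customer"], 0) + r["sales"]
    return totals

totals = aggregate(cleanse(records))
# {'acme ltd': 300, 'bolt plc': 200}
```

Without the cleansing step, the three spellings of the same customer would aggregate into three separate rows, quietly understating that customer’s true sales.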
Deploying BI applications
Once you have identified which types of BI application your organization requires, you can plan the deployment of the solution. Here are some key factors to consider. Which users will have direct access to the source database? Which users will be permitted only to view the data? Are you going to load BI tools on to each user’s desktop PC? Are you going to provide access to the data via the internet/intranet using a web browser? It’s important to understand the implications and ongoing costs of maintaining each of these options before choosing a suite of tools.
At a high level, small organizations may not have the infrastructure or technical knowledge to set up an intranet, and desktop tools will be more appropriate. Large organizations, however, may select the intranet option, as it is more cost-effective to have a single, central software repository from which users update their PCs, rather than keeping hundreds of individual PCs updated with the latest releases of multiple software applications.
The internet and intranets
Until recently, the enterprise-wide deployment of BI solutions was a problem for many organizations. Not only has the cost of ownership been very high, but there are also significant administrative implications as BI users strive to keep their PCs upgraded to the latest releases of hardware and software. In addition, data extracted from source systems has to be transferred to multiple PCs so that users can analyze it.
However, everyone with a Windows 95/98/NT desktop PC now has access to the Microsoft Internet Explorer web browser, which provides a means of accessing web sites; Netscape Navigator is a similar browser solution. Therefore, as BI software becomes increasingly web-enabled, people actually need very little software on their PC. They can operate via a web browser after downloading additional components such as Java applets or ActiveX controls, and access data and software through a web server.
Some managers are concerned at the ease with which information can now be accessed over the internet. However, in reality, this is rarely an issue. Most solutions of this type are deployed over an intranet using an internal network of PCs, while internet installations are usually implemented on a secure web server.
Metadata: data about data
When deploying BI solutions, one of the biggest tasks is the development of metadata above the source applications. All BI tools need to understand the underlying database structure of their source system. The definition of this database structure is known as metadata, although most BI tools have proprietary terminology for it, such as ‘knowledge bases’, ‘catalogs’ and ‘universes’. The need to provide metadata has resulted in the creation of more ‘shelfware’ than in any other area of BI implementations.
When choosing a set of BI tools, it’s extremely useful if the metadata already exists between the tools and your source applications. If it doesn’t, you will probably have to develop it yourself – normally a lengthy process that requires detailed knowledge of the source application database.
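A metadata layer is, in essence, a mapping from business terms to physical database structure. The sketch below shows the idea; the table and column names are invented, not drawn from any real product’s ‘catalog’ or ‘universe’.

```python
# Invented metadata: business names mapped to physical tables and columns,
# the kind of definition a BI tool's 'catalog' or 'universe' would hold.
metadata = {
    "Customer Name": {"table": "CUST_MAST", "column": "CM_NAME"},
    "Order Value":   {"table": "ORD_HDR",   "column": "OH_TOT_VAL"},
}

def physical_columns(business_fields, metadata):
    """Resolve user-friendly field names to physical table.column references,
    as a BI tool does when turning a business request into a query."""
    return [f"{metadata[f]['table']}.{metadata[f]['column']}"
            for f in business_fields]

cols = physical_columns(["Customer Name", "Order Value"], metadata)
# ['CUST_MAST.CM_NAME', 'ORD_HDR.OH_TOT_VAL']
```

Building these mappings by hand for every table in a source application is precisely the lengthy, knowledge-intensive work the article warns about, which is why pre-built metadata is so valuable.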
Gaining an edge
There has never been a better time to reap the benefits of Business Intelligence solutions. As projects related to year 2000 (Y2K) compliance near completion, you have an excellent opportunity to deploy solutions that make cost-effective use of the extensive data held in your Y2K-compliant applications.
Computer hardware now has the processing power to handle the data required, and a PC/server/network infrastructure is in place in most organizations. Coupled with the right choice of BI application, all of the key components are in place to help you stay ahead of your competitors by turning raw data into priceless, business-critical information. With far more efficient use of employees’ time into the bargain, significant financial savings are an added bonus.