Author Topic: Normalizing Statistics in the Data Mart  (Read 3616 times)

Vic

  • Guest
Normalizing Statistics in the Data Mart
« on: January 01, 1970, 12:00:00 AM »
Ok,

I do not know who did this, but if someone DID do it, please help.

I am reviewing the Reporting 6.5 Reference manual and I ran across a really interesting section on page 2, titled "Normalizing Statistics in the Data Mart", which deals with getting statistics ready for Brio.

It says:
For tools, such as Brio, to deal much more effectively with this data, it must be normalized; that is, presented in a structure the program recognizes. While you can't change the Data Mart table structure, you can build views over those tables to present the data in a normalized format.

What I am curious about is why this is necessary.
I thought that when we create a new template in CCA we would automatically get all we need...
For example, for skill-based reporting, can't we just create a template based on attached data in DMA and have Genesys do the rest? Why would there be a need for normalization?

Joe

  • Guest
Normalizing Statistics in the Data Mart
« Reply #1 on: January 01, 1970, 12:00:00 AM »
  • Best Answer
The process of removing redundant data from the tables of a relational database is called normalization. Normalization is the best and easiest way to arrive at an effective logical organization of the tables of a relational database. When you convert data into normalized form, you:

  • Reduce a database structure to its simplest form
  • Remove redundant columns from tables
  • Identify all data that is dependent on other data

A normalized database ensures referential integrity.

What does this mean? If you architect your back end properly, one of the benefits is that the front end, in this case Brio, will retrieve its data from the Data Mart faster.
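
To make the "build views over those tables" idea concrete, here is a minimal sketch using SQLite from Python. The table, view, and column names (agg_queue_hour, v_queue_stats, n_offered, and so on) are made up for illustration only and are not the actual Data Mart schema; a real view would be built over the Genesys aggregation tables documented in the Reporting 6.5 Reference manual.

Code:

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical "wide" aggregation table: one column per statistic,
# similar in spirit to how a Data Mart aggregation table stores results.
cur.execute("""
    CREATE TABLE agg_queue_hour (
        queue_name   TEXT,
        period_start TEXT,
        n_offered    INTEGER,
        n_answered   INTEGER,
        n_abandoned  INTEGER
    )
""")
cur.execute(
    "INSERT INTO agg_queue_hour VALUES ('Sales', '2003-05-01 09:00', 120, 110, 10)"
)

# Normalizing view: each statistic becomes its own (stat_name, stat_value)
# row, so a reporting tool can treat statistics generically instead of
# having to know every column name in advance.
cur.execute("""
    CREATE VIEW v_queue_stats AS
        SELECT queue_name, period_start, 'offered'   AS stat_name, n_offered   AS stat_value FROM agg_queue_hour
        UNION ALL
        SELECT queue_name, period_start, 'answered'  AS stat_name, n_answered  AS stat_value FROM agg_queue_hour
        UNION ALL
        SELECT queue_name, period_start, 'abandoned' AS stat_name, n_abandoned AS stat_value FROM agg_queue_hour
""")

for row in cur.execute("SELECT * FROM v_queue_stats ORDER BY stat_name"):
    print(row)

conn.close()

With the statistics pivoted into (stat_name, stat_value) rows, a front end such as Brio can filter or group on stat_name generically instead of needing a separate query for every column.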