Wednesday, December 16, 2009

Installation of Cognos 8.4 with Oracle Database and Apache HTTP Server


Step 1

Install the Cognos components. This will take a while, and I recommend restarting the computer after every three components or so, in order to clear the cache (or whatever it is that makes the installer run slower).
1.1 Create a folder Cognos in C drive or D drive and copy all the zip files from the installation CD to that folder.
1.2 Unzip the archives in the Cognos folder (for example, right-click Cognos 8 BI Server 8.3.7z --> ZipGenius --> Extract here, to Cognos 8 BI Server 8.3).
1.3 Open the Win32 folder created under each extracted folder and run the issetup.exe file. Do this for Cognos 8 BI Server 8.3, Cognos 8 BI Modeling 8.3 and Cognos 8 Business Intelligence Samples 8.3.
Step 2

Download OracleXEUniv.exe from the Oracle site and install it. Create a new user “cognos” with any password (I chose "cognos" as well) and grant it all privileges. (NOTE: the character set of the database should be Unicode.)
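
For reference, here is a minimal SQL*Plus sketch of that step, assuming you connect as SYSTEM; the user name, password and the broad grant simply mirror the instructions above, and the first query only confirms the Unicode character-set requirement:

-- Confirm the database character set is a Unicode one (the universal XE build uses AL32UTF8)
SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';

-- Create the content store user and grant it broad privileges (fine for a sandbox, far too wide for production)
CREATE USER cognos IDENTIFIED BY cognos;
GRANT ALL PRIVILEGES TO cognos;
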
Step 3

Locate ojdbc14.jar in oraclexe\app\oracle\product\10.2.0\server\jdbc\lib and copy it to cognos\c8\webapps\p2pd\WEB-INF\lib
If classes12.jar exists in this folder, delete it or rename it.
Step 4

1.    Download Apache 2.2 from the Apache site and install it.
2.    Locate the httpd.conf file under the installation folder's conf directory and add the following text at the bottom:

ScriptAlias /cognos8/cgi-bin "d:/program files/cognos/c8/cgi-bin"

<Directory "d:/program files/cognos/c8/cgi-bin">
Options FollowSymLinks
AllowOverride FileInfo
Order Allow,Deny
Allow from All
</Directory>

Alias /cognos8 "d:/program files/cognos/c8/webcontent"

<Directory "d:/program files/cognos/c8/webcontent">
Options FollowSymLinks
AllowOverride FileInfo
Order Allow,Deny
Allow from All
</Directory>

Replace d: with c: if necessary.
Restart Apache service.
Step 5

1.    Start Cognos Configuration.
2.    Delete whatever is under Content Manager and create a new resource called “Content Store”, of type Oracle.
3.    Edit the user and password by clicking the pencil icon that appears when you select the field. User “cognos”, password “cognos”.
4.    Service name is “XE”.
5.    Right click on Content Store and Test.
6.    If all is well, click the green play button to start the service.
7.    You can close Cognos Configuration now, as the service will continue to run in the background.
Step 6

Test that Cognos Connection is working by calling the link http://127.0.0.1/cognos8 or http://localhost/cognos8.
I strongly advise right-clicking the Apache icon in the system tray and selecting “Open Services”. From this window, set all the Cognos, Oracle and Apache services to start manually instead of automatically; otherwise they will all attempt to start at Windows startup, and that takes a long time.
To start the services manually you have to (in this order):
1.    Click “Start Database” from Oracle start menu
2.    Start Cognos Configuration and click the play button
3.    Start the Apache service from the tray icon

Step 7

1.    You should now import the samples.
2.    Create five users in Oracle XE called gosales, gosalesdw, gosaleshr, gosalesmr and gosalesrt. The password should be the same as the user ID. Make sure to grant them all the privileges (a SQL sketch follows the import commands below).
3.    Go to cognos\c8\webcontent\samples\datasources\oracle and extract the .gz files.
4.    Under this folder run the following commands:
  imp gosales/gosales@xe file=GOSALES.dmp full=y log=log.txt
  imp gosaleshr/gosaleshr@xe file=GOSALESHR.dmp full=y log=log.txt
  imp gosalesdw/gosalesdw@xe file=GOSALESDW.dmp full=y log=log.txt
  imp gosalesmr/gosalesmr@xe file=GOSALESMR.dmp full=y log=log.txt
  imp gosalesrt/gosalesrt@xe file=GOSALESRT.dmp full=y log=log.txt
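
A minimal SQL*Plus sketch for creating the five sample users mentioned in step 2, run as SYSTEM; the grants are deliberately broad to mirror the "all privileges" instruction above and are only appropriate for a local sandbox:

-- create the sample schema owners before running the imports
CREATE USER gosales   IDENTIFIED BY gosales;
CREATE USER gosalesdw IDENTIFIED BY gosalesdw;
CREATE USER gosaleshr IDENTIFIED BY gosaleshr;
CREATE USER gosalesmr IDENTIFIED BY gosalesmr;
CREATE USER gosalesrt IDENTIFIED BY gosalesrt;
-- one grant statement can cover all five accounts
GRANT CONNECT, RESOURCE, UNLIMITED TABLESPACE TO gosales, gosalesdw, gosaleshr, gosalesmr, gosalesrt;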

Step 8

1.    In Cognos Connection select Cognos Administration and then Configuration.
2.    Click the “New Data Source” button.
3.    Type “gosales” for name and click Next.
4.    Select “Oracle” for type and click Next.
5.    Type “xe” for the connection string (lowercase is required). Select the User ID and Password checkboxes. Enter “gosales” for the user ID and the same for the password.
6.    Test the connection. In case of failure restart the Oracle database.
7.    If all goes well click Finish.
8.    Repeat the steps for gosalesdw, and the rest.

Step 9

1.    Open Framework Manager.
2.    Open the project C:\Program Files\cognos\c8\webcontent\samples\Models\great_outdoors_sales\great_outdoors_sales.cpf
3.    If you get an error during opening try replacing localhost:80 with 127.0.0.1:80 in Cognos Configuration gateway URI.
4.    Expand Data Sources and for each data source check that the “Content Manager Datasource” and “Schema” have the same value (gosales, gosalesdw,…).
5.    Also modify the Interface (under Type) to “OR” instead of “OL”.
6.    Test each Data Source. In case of failure restart the Oracle database. Save your changes to the project file.
7.    Expand Packages and publish each package (right click -> publish packages..).
8.    Ignore the errors.
9.    Save and exit.
10.    Do the same for the project C:\Program Files\cognos\c8\webcontent\samples\Models\great_outdoors_warehouse\great_outdoors_warehouse.cpf

Step 10

1.    Copy Cognos_Samples.zip from c8\webcontent\samples\content to c8\deployment.
2.    Open Cognos Connection and select Cognos Administration -> Configuration -> Content Administration.
3.    Click New Import icon.
4.    Cognos_Samples should appear in the list. Select Next, then Next again.
5.    On the ‘Select the Public Folders Content’ page, select the checkbox to the left of the Cognos_Samples package.
6.    Select Next, Next again, then Import Now.
   

Dynamically hiding and showing report columns in Cognos Report Studio by parameter selection

In order to hide or show report columns conditionally, you have to create a string variable, assign values to it, and then set the value of the variable dynamically according to a parameter chosen by the user.

Steps

1. Create a parameter with choices for the user to choose from.
2. Create a string variable and add values to it matching the parameter choices.
3. Write the variable expression, for example:
if (ParamValue('P_year') = '2009')
then
'2009'
else
'2008'
4. Go to the properties of the report column you want to show or hide.
5. Click the Render Variable property value.
6. Choose the string variable as the render variable.
7. Select the values for which the column should be rendered; when the variable value matches one of the selected values, the column will be shown.
8. Do the same for all columns that should be shown or hidden according to the parameter selected by the user.

Tuesday, December 15, 2009

Conditional expressions in Cognos Report Studio

 Syntax

if (?monthcode? = '1')
then
('January')
else if (?monthcode? = '2')
then
('February')
else if (?monthcode? = '3')
then
('March')
else if (?monthcode? = '4')
then
('April')
...
else if (?monthcode? = '12')
then
('December')
else
('')


Case when syntax
 
case ?monthcode?
When 1 then 'Jan'
When 2 then 'Feb'
...
When 12 then 'Dec'
end

Upgrade Cognos 8.3 to 8.4

I recommend taking a database backup of your content store database. Create a new schema, restore the backup into it, and use that schema for Cognos 8.4. When you start the 8.4 configuration, it will convert the content store to 8.4.
This avoids the need to do the export all / import all process. If you do the db backup / rename process, all your users, security settings and data sources will still exist in the new content store.
If you use export all, be sure to select all the options: data sources, users, passwords, security settings, and so on. Unless you export everything, you will end up with an installation that won't let people into things.
In addition, Export All does not copy individual users' My Folders.
Also, you cannot do a partial restore of an Export All.

Tuesday, November 24, 2009

Configure the Framework Manager client to communicate with a Cognos server installed on another machine

Here I explain how to install and configure Framework Manager on a client computer. The steps to perform this operation are given below.

1. Install Framework Manager on the client machine.
2. Open Cognos Configuration.
3. Click Environment in the Explorer pane.
4. Change the Gateway URI to http://ip:80/cognos8/cgi-bin/cognos.cgi (ip = IP address of the Cognos server).
5. Change the Dispatcher URI for external applications to http://ip:9300/p2pd/servlet/dispatch.
6. Save the settings.
7. Close Cognos Configuration, open Framework Manager and enjoy the work.

Friday, November 20, 2009

CFG-ERR-0101 Unable to register Cognos 8 service. Execution of the external process returns an error code value of '2'.

Error

CFG-ERR-0101 Unable to register Cognos 8 service. Execution of the external process returns an error code value of '2'.

Description
While trying to install and configure Cognos 8.4 on an XP machine, I got the above error. The installation itself was successful, but the error appeared when starting Cognos Configuration.

Solution
The error is raised because McAfee antivirus prevents the Cognos service from being registered. I fixed the problem by stopping McAfee, configuring Cognos successfully and then restarting McAfee. Later I added Cognos to the McAfee exception list. You can fix it simply by adding Cognos to the exception list.

Thursday, November 12, 2009

Creating a stored procedure query subject in Cognos 8 Framework Manager

In order to create a stored procedure query subject in Framework Manager, you have to create a stored procedure that returns a ref cursor for the result set. Here is a code sample.

Step 1
Create a package with one ref cursor type:

create or replace package pkg_ref_cursor
as
type ref_cursor is ref cursor;
end;
/

Step 2
Create a procedure to do the actual work and return the result set:

create or replace procedure tfp.getData
(
  p_ref_cursor in out pkg_ref_cursor.ref_cursor,
  p_text       in     varchar2
)
as
begin
  -- some work, for example logging the call
  insert into tfp.tempTable select sysdate, p_text from dual;
  -- open the result set that Framework Manager will consume
  open p_ref_cursor for select * from tfp.tempTable;
end;
/

Step 3
Now you can import this procedure directly into Framework Manager and use it for creating reports,
much like a table or view.
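
Before importing, you can sanity-check the procedure from SQL*Plus. This is just a rough sketch: the bind-variable syntax assumes SQL*Plus, and 'hello' is a throwaway test value.

-- declare a host ref cursor, call the procedure, then print the rows it returns
variable rc refcursor
exec tfp.getData(:rc, 'hello')
print rc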

Start and stop the Cognos service using a bat file

startcognos.bat

@ECHO OFF
echo "Starting Cognos 8"
net start "Cognos 8"

stopcognos.bat

@ECHO OFF
ECHO "Stopping Cognos 8"
net stop "Cognos 8"

Friday, November 6, 2009

Cognos Certification Dumps

Latest Cognos8 Business Intelligence Dumps for all of the tests listed below


   1.  Test BI0 -  210 - IBM Cognos 8 BI Professional
   2.  Test COG-105 - IBM Cognos 8 BI Technical Specialist
   3.  Test COG-112 - IBM Cognos 8 BI Author
   4.  Test COG-122 - IBM Cognos 8 BI Administrator
   5.  Test COG-125 - IBM Cognos 8 BI Data Warehouse Developer
   6.  Test COG-132 - IBM Cognos 8 BI Metadata Model Developer
   7.  Test COG-135 - IBM Cognos 8 BI OLAP Developer
   8.  Test COG-145 - IBM Cognos 8 BI Multidimensional Author
   9.  Test COG-180 - IBM Cognos 8 BI Professional
   10.Test COG-300 - IBM Cognos TM1 Analyst
   11.Test COG-310 - IBM Cognos TM1 Developer
   12.Test COG-400 - IBM Cognos 8 Planning Application Consultant
   13.Test COG-480 - IBM Cognos 8 Planning Professional


Please contact me for cognos dumps at   kvrajith@gmail.com
 

Tuesday, October 13, 2009

Supply Chain Management (SCM)

Supply chain management (SCM) is the management of a network of interconnected businesses involved in the ultimate provision of product and service packages required by end customers (Harland, 1996). Supply chain management spans all movement and storage of raw materials, work-in-process inventory and finished goods from point of origin to point of consumption.

Another definition is provided by the APICS Dictionary, which defines SCM as the "design, planning, execution, control, and monitoring of supply chain activities with the objective of creating net value, building a competitive infrastructure, leveraging worldwide logistics, synchronizing supply with demand, and measuring performance globally".
   

Enterprise Resource Planning (ERP)

Enterprise Resource Planning (ERP) is a term usually used in conjunction with ERP software or an ERP system which is intended to manage all the information and functions of a business or company from shared data stores.

An ERP system typically has modular hardware and software units and "services" that communicate on a local area network. The modular design allows a business to add or reconfigure modules (perhaps from different vendors) while preserving data integrity in one shared database that may be centralized or distributed

Some organizations, typically those with sufficient in-house IT skills to integrate multiple software products, choose to implement only portions of an ERP system and develop an external interface to other ERP or stand-alone systems for their other application needs. For example, one may choose to use a human resource management system from one vendor and perform the integration between the systems themselves.

This is common among retailers, where even a mid-sized retailer will have a discrete point-of-sale (POS) product and a financials application, plus a series of specialized applications to handle business requirements such as warehouse management, staff rostering, merchandising and logistics.

Ideally, ERP delivers a single database that contains all data for the various software modules that typically address areas such as

Manufacturing
    Engineering, bills of material, scheduling, capacity, workflow management, quality control, cost management, manufacturing process, manufacturing projects, manufacturing flow
Supply chain management
    Order to cash, inventory, order entry, purchasing, product configurator, supply chain planning, supplier scheduling, inspection of goods, claim processing, commission calculation
Financials
    General ledger, cash management, accounts payable, accounts receivable, fixed assets
Project management
    Costing, billing, time and expense, performance units, activity management
Human resources
    Human resources, payroll, training, time and attendance, rostering, benefits
Customer relationship management
    Sales and marketing, commissions, service, customer contact and call center support

Data services
    various "self-service" interfaces for customers, suppliers, and/or employees
Access control
    management of user privileges for various processes

Customer Relationship Management (CRM)

Customer relationship management (CRM) refers to the methods that companies use to interact with customers. The methods include employee training and special-purpose CRM software. There is an emphasis on handling incoming customer phone calls and email, although the information collected by CRM software may also be used for promotion and for surveys such as those polling customer satisfaction.

Initiatives often fail because implementation was limited to software installation, without providing the context, support and understanding for employees to learn.[1] Tools for customer relationship management should be implemented "only after a well-devised strategy and operational plan are put in place".

Other problems occur when organizations fail to think of sales as the output of a process that itself needs to be studied and taken into account when planning automation.

SAP Products

 Customer Relationship Management (CRM)
 Enterprise Resource Planning (ERP)
 Product Lifecycle Management (PLM)
 Supplier Relationship Management (SRM)
 SAP Business Objects Suite (BOBJ)
 SAP Advanced Planner and Optimizer (APO)
 SAP Apparel and Footwear Solution (AFS)
 SAP Business Information Warehouse (BW)
 SAP Business Intelligence (BI)
 SAP Catalog Content Management (CCM)
 SAP Enterprise Buyer Professional (EBP)
 SAP Enterprise Learning
 SAP Portal (EP)
 SAP Exchange Infrastructure (XI)
 Governance, Risk and Compliance (GRC)
 Enterprise Central Component (ECC)
 SAP Human Resource Management Systems (HRMS)
 SAP Internet Transaction Server (ITS)
 SAP Incentive and Commission Management (ICM)
 SAP Knowledge Warehouse (KW)
 SAP Master Data Management (MDM)
 SAP Service and Asset Management
 SAP Solutions for mobile business
 SAP Solution Composer
 SAP Strategic Enterprise Management (SEM)
 SAP Test Data Migration Server (TDMS)
 SAP Training and Event Management (TEM)
 SAP NetWeaver Application Server (Web AS)
 SAP xApps
 SAP Supply Chain Performance Management (SCPM)

Business Objects

BusinessObjects is the first  business intelligence (BI) platform that delivers a complete set of market-leading BI capabilities: best-in-class performance management, reporting, query and analysis, and data integration. BusinessObjects XI introduces significant innovations that deliver BI in new ways to a much broader set of users as well as completing the integration of the Crystal and BusinessObjects product lines.
BusinessObjects XI Release 2 builds on the world’s leading business intelligence platform, BusinessObjects XI, to deliver new ways to access the information you need to do your job, allowing you to be able to say “I can answer my questions. I can trust and share my insight. And I can do everything I need on one BI standard.”
BusinessObjects XI Release 2 builds on the proven and trusted BusinessObjects XI platform. It provides substantial functional improvements and innovations across the BusinessObjects XI platform and includes full platform-level support for Desktop Intelligence™ (formerly BusinessObjects full client) to allow a smooth transition path to BusinessObjects XI for all existing customers who have invested in that technology.


Sunday, October 4, 2009

Data mining

Data mining is the process of extracting patterns from data. As more data are gathered, with the amount of data doubling every three years, data mining is becoming an increasingly important tool to transform these data into information. It is commonly used in a wide range of profiling practices, such as marketing, surveillance, fraud detection and scientific discovery.
While data mining can be used to uncover patterns in data samples, it is important to be aware that the use of non-representative samples of data may produce results that are not indicative of the domain. Similarly, data mining will not find patterns that may be present in the domain, if those patterns are not present in the sample being "mined". There is a tendency for insufficiently knowledgeable "consumers" of the results to attribute "magical abilities" to data mining, treating the technique as a sort of all-seeing crystal ball. Like any other tool, it only functions in conjunction with the appropriate raw material: in this case, indicative and representative data that the user must first collect. Further, the discovery of a particular pattern in a particular set of data does not necessarily mean that pattern is representative of the whole population from which that data was drawn. Hence, an important part of the process is the verification and validation of patterns on other samples of data.
The term data mining has also been used in a related but negative sense, to mean the deliberate searching for apparent but not necessarily representative patterns in large numbers of data. To avoid confusion with the other sense, the terms data dredging and data snooping are often used. Note, however, that dredging and snooping can be (and sometimes are) used as exploratory tools when developing and clarifying hypotheses.

ETL (Extract, transform, load)

ETL is the process of:
  • Extracting data from outside sources
  • Transforming it to fit operational needs (which can include quality levels)
  • Loading it into the end target (database or data warehouse)
Extract
The first part of an ETL process involves extracting the data from the source systems. Most data warehousing projects consolidate data from different source systems. Each separate system may also use a different data organization or format. Common data source formats are relational databases and flat files, but may include non-relational database structures such as Information Management System (IMS) or other data structures such as Virtual Storage Access Method (VSAM) or Indexed Sequential Access Method (ISAM), or even fetching from outside sources such as web spidering or screen-scraping. Extraction converts the data into a format suitable for transformation processing. An intrinsic part of the extraction involves parsing the extracted data to check whether it meets an expected pattern or structure. If not, the data may be rejected entirely or in part.
Transform
The transform stage applies a series of rules or functions to the extracted data from the source to derive the data for loading into the end target. Some data sources will require very little or even no manipulation of data. In other cases, one or more of the following transformation types may be required to meet the business and technical needs of the target database:
  • Selecting only certain columns to load (or selecting null columns not to load)
  • Translating coded values (e.g., if the source system stores 1 for male and 2 for female, but the warehouse stores M for male and F for female), this calls for automated data cleansing; no manual cleansing occurs during ETL
  • Encoding free-form values (e.g., mapping "Male" to "1" and "Mr" to M)
  • Deriving a new calculated value (e.g., sale_amount = qty * unit_price)
  • Filtering
  • Sorting
  • Joining data from multiple sources (e.g., lookup, merge)
  • Aggregation (for example, rollup - summarizing multiple rows of data - total sales for each store, and for each region, etc.)
  • Generating surrogate-key values
  • Transposing or pivoting (turning multiple columns into multiple rows or vice versa)
  • Splitting a column into multiple columns (e.g., putting a comma-separated list specified as a string in one column as individual values in different columns)
  • Disaggregation of repeating columns into a separate detail table (e.g., moving a series of addresses in one record into single addresses in a set of records in a linked address table)
  • Applying any form of simple or complex data validation. If validation fails, it may result in a full, partial or no rejection of the data, and thus none, some or all the data is handed over to the next step, depending on the rule design and exception handling. Many of the above transformations may result in exceptions, for example, when a code translation parses an unknown code in the extracted data.
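
As a rough illustration of a couple of the transformation types above (translating coded values and deriving a calculated value), here is a small SQL sketch; the staging and warehouse table and column names are invented for the example:

-- hypothetical staging table: stg_sales(gender_code, qty, unit_price)
-- hypothetical target table:  dw_sales(gender, qty, unit_price, sale_amount)
insert into dw_sales (gender, qty, unit_price, sale_amount)
select case gender_code
         when 1 then 'M'      -- translate coded values
         when 2 then 'F'
         else 'U'             -- handle unknown codes explicitly
       end,
       qty,
       unit_price,
       qty * unit_price       -- derive a new calculated value
from stg_sales;
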
Load
The load phase loads the data into the end target, usually the data warehouse (DW). Depending on the requirements of the organization, this process varies widely. Some data warehouses may overwrite existing information with cumulative, updated data every week, while other DW (or even other parts of the same DW) may add new data in a historized form, for example, hourly. The timing and scope to replace or append are strategic design choices dependent on the time available and the business needs. More complex systems can maintain a history and audit trail of all changes to the data loaded in the DW.
As the load phase interacts with a database, the constraints defined in the database schema — as well as in triggers activated upon data load — apply (for example, uniqueness, referential integrity, mandatory fields), which also contribute to the overall data quality performance of the ETL process.

Data Mining tools

SPSS Clementine 8.5
IBM DB2 Intelligent Miner
Insightful Miner 3.0
KXEN Analytic Framework 3.0
Oracle Data Mining
Quadstone System V. 5
SAS Enterprise Miner 5.1

Data warehouse architecture

Architecture, in the context of an organization's data warehousing efforts, is a conceptualization of how the data warehouse is built. There is no right or wrong architecture; rather, multiple architectures exist to support various environments and situations. The worthiness of the architecture can be judged by how the conceptualization aids in the building, maintenance and usage of the data warehouse. One possible simple conceptualization of a data warehouse architecture consists of the following interconnected layers:
Operational database layer
    The source data for the data warehouse - An organization's Enterprise Resource Planning systems fall into this layer.
Data access layer
    The interface between the operational and informational access layer - Tools to extract, transform, load data into the warehouse fall into this layer.
Metadata layer
    The data directory - This is usually more detailed than an operational system data directory. There are dictionaries for the entire warehouse and sometimes dictionaries for the data that can be accessed by a particular reporting and analysis tool.
Informational access layer
    The data accessed for reporting and analyzing and the tools for reporting and analyzing data - Business intelligence tools fall into this layer. And the Inmon-Kimball differences about design methodology, discussed later in this article, have to do with this layer.

Data warehouse

A data warehouse is a repository of an organization's electronically stored data. Data warehouses are designed to facilitate reporting and analysis. A data warehouse houses a standardized, consistent, clean and integrated form of data sourced from the various operational systems in use in the organization, structured in a way to specifically address reporting and analytic requirements.

This definition of the data warehouse focuses on data storage. However, the means to retrieve and analyze data, to extract, transform and load data, and to manage the data dictionary are also considered essential components of a data warehousing system. Many references to data warehousing use this broader context. Thus, an expanded definition for data warehousing includes business intelligence tools, tools to extract, transform, and load data into the repository, and tools to manage and retrieve metadata.

Subject-oriented

    The data in the data warehouse is organized so that all the data elements relating to the same real-world event or object are linked together.
Non-volatile
    Data in the data warehouse is never over-written or deleted - once committed, the data is static, read-only, and retained for future reporting.
Integrated
The data warehouse contains data from most or all of an organization's operational systems, and this data is made consistent.

The top-down design methodology generates highly consistent dimensional views of data across data marts, since all data marts are loaded from the centralized repository. Top-down design has also proven to be robust against business changes. Generating new dimensional data marts against the data stored in the data warehouse is a relatively simple task. The main disadvantage of the top-down methodology is that it represents a very large project with a very broad scope. The up-front cost of implementing a data warehouse using the top-down methodology is significant, and the duration of time from the start of the project to the point that end users experience initial benefits can be substantial. In addition, the top-down methodology can be inflexible and unresponsive to changing departmental needs during the implementation phases.

Benefits of data warehousing
Some of the benefits that a data warehouse provides are as follows:

    * A data warehouse provides a common data model for all data of interest regardless of the data's source. This makes it easier to report and analyze information than it would be if multiple data models were used to retrieve information such as sales invoices, order receipts, general ledger charges, etc.
    * Prior to loading data into the data warehouse, inconsistencies are identified and resolved. This greatly simplifies reporting and analysis.
    * Information in the data warehouse is under the control of data warehouse users so that, even if the source system data is purged over time, the information in the warehouse can be stored safely for extended periods of time.
    * Because they are separate from operational systems, data warehouses provide retrieval of data without slowing down operational systems.
    * Data warehouses can work in conjunction with and, hence, enhance the value of operational business applications, notably customer relationship management (CRM) systems.
    * Data warehouses facilitate decision support system applications such as trend reports (e.g., the items with the most sales in a particular area within the last two years), exception reports, and reports that show actual performance versus goals.

Disadvantages of data warehouses

There are also disadvantages to using a data warehouse. Some of them are:
    * Data warehouses are not the optimal environment for unstructured data.
    * Because data must be extracted, transformed and loaded into the warehouse, there is an element of latency in data warehouse data.
    * Over their life, data warehouses can have high costs. The data warehouse is usually not static. Maintenance costs are high.
    * Data warehouses can get outdated relatively quickly. There is a cost of delivering suboptimal information to the organization.
    * There is often a fine line between data warehouses and operational systems. Duplicate, expensive functionality may be developed. Or, functionality may be developed in the data warehouse that, in retrospect, should have been developed in the operational systems and vice versa.

Wednesday, September 30, 2009

CREATING CUBES USING COGNOS POWERPLAY

Step 1: Start -> Programs -> Cognos 8 -> Framework Manager
Step 2: Create a new project -> press OK
Step 3: Select the design language for the project
Step 4: Select Metadata Source -> Next
Step 5: Select a data source from the list or create a new data source -> Next
Step 6: Select the objects according to the requirement -> Next
Step 7: Generate the relationships -> Import
Step 8: Finish the process -> Finish
Step 9: Publish the package
Step 10: Open Transformer and go to File -> New -> Model name (user defined) -> Next
Step 11: New data source (DWH name), source type (select Cognos package) -> Next
Step 12: Package name (browse to the package) -> Next
Step 13: Check the auto design check box and click Finish
Step 14: Right-click on the dimension map -> choose Insert Dimension -> enter the dimension name
Step 15: Right-click on Customer1 -> select Insert Level -> click Source -> click More -> select Customer_id -> click OK
Step 16: Right-click on PowerCube -> select Insert PowerCube -> define the cube name -> OK
Step 17: Right-click on the cube and select Create Selected PowerCube
Step 18: Process the cube
Step 19: A new model will be generated with the dimension map, data source, measures, power cube and signons
Step 20: From the generated cube you can create whatever reports you need

Friday, September 25, 2009

Business Intelligence (BI) tools list

1. Oracle Enterprise BI Server
2. Business Objects Enterprise
3. SAP NetWeaver BI
4. SAS Enterprise BI Server
5. IBM Cognos TM/1 & Executive Viewer
6. BizzScore Suite
7. WebFocus
8. Excel, Performance Point, Analysis Server
9. QlikView
10. Microstrategy
11. Hyperion System
12. Actuate
13. IBM Cognos Series 8

source
www.businessintelligencetoolbox.com

Cognos DecisionStream architecture

Cognos DecisionStream is a powerful ETL tool comprising the DecisionStream Designer graphical design environment and an ETL engine acting as a server.
DecisionStream Designer is a graphical tool for implementing the ETL process. It lets users map data sources into data streams, define transformations of the data, organize facts and dimensions into a star or snowflake schema, and so on.

DecisionStream Designer comprises the following components:

Dimension library

A dimension library browser with many dimensional templates. DecisionStream provides templates for managing surrogate keys and slowly changing dimensions (SCDs).

Dimension and Fact builds

Dimension and fact builds are defined by ETL developers; this is the module used for managing the ETL process.

Visual design environment

The ETL process designer, which lets developers design a flow. The visual designer provides four main views to manage the ETL process:
- Marketing perspective
- Data Stream
- Transformation Model
- Fact Delivery

Informatica Interview Questions

Over 500 interview questions related to Informatica and its data warehousing solutions. Reading those questions and answers should not be considered only as preparation for an interview but also as a good source of information and a tutorial that helps you learn.

Informatica Developer Network

The Informatica Developer Network (IDN) is a program designed to help software developers expand the business value of Informatica's data integration platform and data quality products through the development of innovative and compelling third-party

Data Warehouse and Informatica ETL Tutorial

The DWHlabs portal is designed exclusively for data warehouse developers. Most of the content relates to data warehousing and ETL products, with the main emphasis on Informatica. The site provides a good Informatica ETL tutorial for beginners which covers such topics

Cognos PowerPlay Tutorial

Cognos PowerPlay Transformer technical tutorial with examples, sample solutions and a guide to develop data warehouse models in Cognos. The tutorial is aligned to a typical business scenario to facilitate the learning process. It is addressed to data

Cognos Analysis Studio quick tour

The Analysis Studio Quick Tour teaches the basic skills needed to analyze business results. The tutorial shows how to insert data into an empty crosstab; nest, replace and sort data in a cube; and explore data by changing

http://cogpubbkp.cabq.gov/cognos8/documentation/en/tours/pshome.html

Saturday, September 19, 2009

List of ETL tools

ETL tools are widely used for extracting, cleaning, transforming and loading data from different systems, often into a data warehouse. The following ETL tools were thoroughly examined on 80 criteria considered important for attaining high productivity Business Intelligence systems that actually add value to your organization. In random order.

1. Oracle Warehouse Builder (OWB) 11gR1 - Oracle
2. Data Integrator & Services XI 3.0 - Business Objects, SAP
3. IBM Information Server (DataStage) 8.1 - IBM
4. SAS Data Integration Studio 4.2 - SAS Institute
5. PowerCenter 8.5.1 - Informatica
6. Elixir Repertoire 7.2.2 - Elixir
7. Data Migrator 7.6 - Information Builders
8. SQL Server Integration Services 10 - Microsoft
9. Talend Open Studio 3.1 - Talend
10. DataFlow Manager 6.5 - Pitney Bowes Business Insight
11. Data Integrator 8.12 - Pervasive
12. Transformation Server 5.4 - IBM DataMirror
13. Transformation Manager 5.2.2 - ETL Solutions Ltd.
14. Data Manager/Decision Stream 8.2 - IBM Cognos
15. Clover ETL 2.5.2 - Javlin
16. ETL4ALL 4.2 - IKAN
17. DB2 Warehouse Edition 9.1 - IBM
18. Pentaho Data Integration 3.0 - Pentaho
19. Adeptia Integration Server 4.9 - Adeptia

source
http://www.etltool.com/etltoolslist.htm

Friday, September 18, 2009

Enabling Cognos security


Cognos 8 components run with two levels of logon: anonymous and authenticated. By default, anonymous access is enabled. You can use both types of logon with your installation. If you choose to use only authenticated logon, you can disable anonymous access. For authenticated logon, you must configure Cognos 8 components with an appropriate namespace for the type of authentication provider in your environment. You can configure multiple namespaces for authentication and then choose at run time which namespace you want to use. Cognos 8 components support the following types of servers as authentication sources:
Active Directory Server
Cognos Series 7
Custom Authentication Provider
LDAP
eTrust SiteMinder
NTLM
SAP.


      After you configure an authentication provider for Cognos 8 components, you can enable single signon between your authentication provider environment and Cognos 8 components. This means that a user logs on once and can then switch to another application without being asked to log on again.

Disable Anonymous Access

By default, Cognos 8 components do not require user authentication; users can log on anonymously. If you want to use authenticated logon only, you can use Cognos Configuration to disable anonymous access.
Steps
1. On each Content Manager computer, start Cognos Configuration.
2. In the Explorer window, under Security, Authentication, click Cognos. The Cognos resource represents the Cognos namespace. The Cognos namespace stores information about Cognos groups, such as the Anonymous User, contacts, and distribution lists.
3. In the Properties window, click the box next to the Allow anonymous access property and then click False.
4. From the File menu, click Save. Now, users are required to provide logon credentials when they access Cognos resources.

Restrict User Access to the Cognos Namespace

Access can be restricted to users belonging to any group or role defined in the Cognos built-in namespace. By default, all users belong to several built-in groups or roles. To restrict access, you must:
- enable the property to restrict access
- remove the Everyone group from the Cognos built-in roles and groups
- ensure that authorized users belong to at least one Cognos role or group
Steps
1. On each Content Manager computer, start Cognos Configuration.
2. In the Explorer window, under Security, click Authentication.
3. In the Properties window, change the value of Restrict access to members of the built-in namespace to True.
4. From the File menu, click Save.
You must now use the portal to remove the Everyone group from the Cognos built-in roles and groups, and then ensure that authorized users belong to at least one Cognos built-in role or group. For information about adding or removing members of a Cognos group or role, see the Cognos documentation.
Configuring Cognos 8 Components to Use Active Directory Server
If you install Content Manager on a Windows computer, you can configure Active Directory as your authentication source using an Active Directory namespace. If you install Content Manager on a UNIX computer, you must instead use an LDAP namespace to configure Active Directory as your authentication source. If you install Content Manager on Windows and UNIX computers, you must use an LDAP namespace to configure Active Directory on all Content Manager computers. When you use an LDAP namespace to authenticate against Active Directory Server, you are limited to LDAP features only. By default, Active Directory Server uses port 389.
To use an Active Directory Server namespace and to set up single signon, do the following:
- Configure Cognos 8 components to use an Active Directory Server namespace
- Enable secure communication to the Active Directory Server, if required
- Enable single signon between Active Directory Server and Cognos 8 components

Configure an Active Directory Namespace





You can use Active Directory Server as your authentication provider. You also have the option of making custom user properties from the Active Directory Server available to Cognos 8 components. For Cognos 8 to work properly with Active Directory Server, you must ensure that the Authenticated users group has Read privileges for the Active Directory folder where users are stored.




Steps
1. On every computer where you installed Content Manager, open Cognos Configuration.
2. In the Explorer window, under Security, right-click Authentication, and then click New resource, Namespace.
3. In the Name box, type a name for your authentication namespace.
4. In the Type list, click the appropriate namespace and then click OK. The new authentication provider resource appears in the Explorer window, under the Authentication component.
5. In the Properties window, for the Namespace ID property, specify a unique identifier for the namespace.
6. Specify the values for all other required properties to ensure that Cognos 8 components can locate and use your existing authentication provider.
7. Specify the values for the Host and port property. To support Active Directory Server failover, you can specify the domain name instead of a specific domain controller. For example, use mydomain.com:389 instead of dc1.mydomain.com:389.
8. If you want to be able to search for details when authentication fails, specify the user ID and password for the Binding credentials property. Use the credentials of an Active Directory Server user who has search and read privileges for that server.
9. From the File menu, click Save.
10. Test the connection to a new namespace. In the Explorer window, under Authentication, right-click the new authentication resource and click Test. Cognos 8 loads, initializes, and configures the provider libraries for the namespace.

Make Custom User Properties for Active Directory Available to Cognos 8 Components

You can use arbitrary user attributes from your Active Directory Server in Cognos 8 components. To configure this, you must add these attributes as custom properties for the Active Directory namespace. The custom properties are available as session parameters through Framework Manager. For more information about session parameters, see the Framework Manager User Guide. The custom properties can also be used inside command blocks that are used to configure Oracle sessions and connections. The command blocks can be used with Oracle light-weight connections and virtual private databases.
Steps
1. On every computer where you installed Content Manager, open Cognos Configuration.
2. In the Explorer window, under Security, Authentication, click the Active Directory namespace.
3. In the Properties window, click in the Value column for Custom properties and click the edit button.
4. In the Value - Custom properties window, click Add.
5. Click the Name column and enter the name you want Cognos 8 components to use for the session parameter.
6. Click the Value column and enter the name of the account parameter in your Active Directory Server.
7. Repeat steps 4 to 6 for each custom parameter.
8. Click OK.
9. From the File menu, click Save.

Enabling Secure Communication to the Active Directory Server

If you are using an SSL connection to the Active Directory Server, you must copy the certificate from the Active Directory Server to the Content Manager computer.
Steps
1. On every Content Manager computer, use your Web browser to connect to the Active Directory Server and copy the CA root certificate to a location on the Content Manager computer.
2. Add the CA root certificate to the certificate store of the account that you are using for the current Cognos session:
   If you are running the Cognos session under a user account, use the same Web browser as in step 1 to import the CA root certificate to the certificate store for your user account. For information, see the documentation for your Web browser.
   If you are running the Cognos session under the local computer account, use Microsoft Management Console (MMC) to import the CA root certificate to the certificate store for the local computer.
3. In Cognos Configuration, restart the service:
   In the Explorer window, click Cognos 8 service, Cognos 8.
   From the Actions menu, click Restart.


Include or Exclude Domains Using Advanced Properties

When you configure an authentication namespace for Cognos 8, users from only one domain can log in. By using the Advanced properties for Active Directory Server, users from related (parent-child) domains and unrelated domain trees within the same forest can also log in.
Authentication in One Domain Tree
If you set a parameter named chaseReferrals to true, users in the original authenticated domain and all child domains of the domain tree can log in to Cognos 8. Users above the original authenticated domain or in a different domain tree cannot log in.
Authentication in All Domain Trees in the Forest
If you set a parameter named MultiDomainTrees to true, users in all domain trees in the forest can log in to Cognos 8.
Steps
1. On every computer where you installed Content Manager, open Cognos Configuration.
2. In the Explorer window, under Security, Authentication, click the Active Directory namespace.
3. In the Properties window, specify the Host and port property:
   For users in one domain, specify the host and port of a domain controller for the single domain.
   For users in one domain tree, specify the host and port of the top-level controller for the domain tree.
   For users in all domain trees in the forest, specify the host and port of any domain controller in the forest.
4. Click in the Value column for Advanced properties and click the edit button.
5. In the Value - Advanced properties window, click Add.
6. Specify two new properties, chaseReferrals and MultiDomainTrees, with the following values:

   Authentication for               chaseReferrals   MultiDomainTrees
   One domain                       False            False
   One domain tree                  True             False
   All domain trees in the forest   True             True
7. Click OK.
8. From the File menu, click Save.




Cognos BI

The main objective of Business Intelligence is to bring the right information at the right time. Usually, the data is pulled from several sources and transformed into accurate and consistent information which is stored in the Data Warehouse.

Cognos Business Intelligence solutions help understand, monitor and manage business performance which includes business reporting & analysis, profitability measurement, budgeting, forecasting optimization and cost management.

Cognos is a fast and efficient way to deliver multidimensional business intelligence data. Building a data warehouse and a complete, well-performing information management system in an organization is a challenge for both business and technical experts. The most important thing to keep in mind is that Business Intelligence is not an IT-driven area; it needs to come out as a need from the business and requires tight and constant cooperation. That is why, in order to deliver valuable information to the business and perform efficiently, it is necessary to understand end users' needs and cooperate closely with the business before implementing Cognos applications.