TMCS is one of the leading institutes for Informatica PowerCenter Training in Marathahalli. TMCS has highly professional real-time trainers, good infrastructure, and a well-organized placement cell to help students with placements. TMCS has helped many students get quality Informatica PowerCenter training and placement in Marathahalli. If you are looking for professional, real-time Informatica PowerCenter Training in Marathahalli, please contact TMCS.
Informatica PowerCenter Course Content
An Overview of Informatica PowerCenter
- Overview of PowerCenter covering the architecture, terminology, tool GUIs, mappings, transformations, sessions, workflows, the Workflow Monitor, and client installation
- Create flat file and relational Sources using the Source Analyzer
- Create flat file and relational Targets using the Target Developer
- Create mappings using the Mapping Designer
- Create workflows using the Workflow Designer
- Monitor the workflow using the Workflow Monitor
- Preview Target Data using the PowerCenter Designer
- Understand PowerCenter Log files
- Use PowerCenter log files to:
  - View and look up error messages
  - Correct mapping and workflow errors
PowerCenter Transformations, Tasks and Reusability
- List the PowerCenter Designer transformations and Workflow tasks
- Clarify Active vs. Passive Transformations
- Use the Expression transformation and Expression Editor in a mapping that applies a file list to load multiple flat files into a stage table, using Expression and Filter transformations to format and screen out erroneous customer data
- Use reusable Designer transformations to apply the same formatting logic to Employee records
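As an illustration of the screening step above, a Filter transformation condition in the PowerCenter expression language might look like the following (the port names and date format are hypothetical):

```
-- Pass only rows that have a customer id and a parsable order date;
-- erroneous rows are screened out of the pipeline.
NOT ISNULL(CUST_ID) AND IS_DATE(ORDER_DATE, 'MM/DD/YYYY')
```

Formatting logic (e.g. `LTRIM(RTRIM(UPPER(CUST_NAME)))`) would sit in the upstream Expression transformation.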
Features & Techniques
- Outline use of Arrange All and Arrange All Iconic
- Show ‘Autolink’ feature
- Show ‘Select Link Path’ feature
- Define Propagation of Port Properties
Joins and Link Conditions
- Define Joins
- Clarify Heterogeneous vs. Homogeneous Joins
- Use a Joiner Transformation to join relational and flat file sources
- Use the Source Qualifier to join two relational sources
- Use link conditions to execute a second session only when the first finishes successfully.
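A link condition is entered on the workflow link between two tasks. As a sketch (the session name is made up), the "run only on success" case above is expressed with the task's predefined `Status` variable:

```
-- On the link from s_load_stage to the next session:
$s_load_stage.Status = SUCCEEDED
```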
Using the Debugger
- Outline Debugger Interface
- Show creation of a break point
- Show ‘Evaluate the Expression’ functionality
- Re-Execute mapping with different values
Sequence Generators, Lookups and Caching
- Define the Sequence Generator
- Define the Lookup Transformation
- Understand the different types of Lookups
- Clarify Lookup Caching
- Use a Flat File Lookup to add data to a relational target.
- Create a Multiple Row Return Lookup, use Aggregators and Expressions to count the number of orders a Customer has as well as total the sales for that customer. Add those values to a target.
- Build a Dates Lookup Cache for use in determining start and end dates of target records
- Utilize Event Wait, Event Timer and Email Tasks to wait for flat files and email users upon successful/failed load.
- Use a decision task to set criteria by which the workflow will decide which execution branch to follow
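A Decision task evaluates a condition to TRUE or FALSE, and downstream links branch on the result. A minimal sketch, with hypothetical session and task names:

```
-- Decision task condition: both upstream sessions must succeed
$s_load_dim.Status = SUCCEEDED AND $s_load_fact.Status = SUCCEEDED

-- Downstream link conditions then test the predefined variable, e.g.:
-- $dec_check_loads.Condition = TRUE   (success branch)
-- $dec_check_loads.Condition = FALSE  (failure/notification branch)
```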
Update Strategies, Routers and Overrides
- Build a mapping that uses Update Strategies and Routers to determine insert/update logic for a target. Overrides will be used for incremental (daily) loading of the target.
- Unconnected Lookups
- Mapping Parameter/Variables and Mapplets/Worklets
- Define Sorter Transformation
- Detail Aggregator Transformation and Aggregate Functions
- Explain Unconnected Lookups and how they are called.
- Describe Mapping Parameters/Variables and initialization priority
- Outline Mapplets and Worklets
- Use these Transformations to create a mapping which loads records from warehouse Dimension tables to a Fact table.
- Parameter File Instruction.
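A parameter file assigns values to mapping parameters/variables (`$$`) and session parameters (`$`) per workflow or session. A minimal sketch, with hypothetical folder, workflow, session, and connection names:

```
[Global]
$$Country=USA

[MyFolder.WF:wf_daily_load.ST:s_m_load_customers]
$DBConnection_Source=Oracle_Src
$$LastRunDate=01/01/2024 00:00:00
```

The `[folder.WF:workflow.ST:session]` heading scopes the assignments that follow it to that session.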
Mapping Design Workshop
- Business requirement details will be provided so that the student can design and build the mapping necessary to load a Promotions Aggregate table. The workshop provides Velocity Best Practices documents that can be used to determine the correct logic for the mapping.
Workflow Design Workshop
- Business requirement details will be provided so that the student can design and build their own workflow necessary to load all staging tables in a single workflow. The workshop will provide Velocity Best Practices documents that can be used to determine the correct logic for the workflow.
- Ascertain the use of the IsExprVar property in a mapping.
- Determine the structure of a parameter file.
- Establish the use of parameter files in mappings and sessions
- Describe the flexibility of using parameter files to build mapping expression logic.
- Describe the use of a date/time mapping variable, in a parameter file for incremental loading
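One common incremental-loading pattern (sketched here with hypothetical port and variable names) filters the source on a date/time mapping variable supplied by the parameter file, then advances the variable as rows flow through:

```
-- Source Qualifier source filter; $$LastRunDate is resolved from the parameter file:
UPDATE_TS > TO_DATE('$$LastRunDate', 'MM/DD/YYYY HH24:MI:SS')

-- In a downstream Expression transformation, keep the high-water mark current:
SETMAXVARIABLE($$LastRunDate, UPDATE_TS)
```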
- Describe and implement advanced functions
- Describe User-Defined functions
- Create a public, User-Defined Function to create a standard name formatting function and implement the UDF in the mapping.
- Use the AES_Encrypt and Encode functions to encrypt and encode customer data before writing it to flat file.
- Debug the mapping using an existing session and observe the results
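In the expression language, the encryption/encoding step and the user-defined function call above might look like this (the ports, UDF name, and key literal are illustrative only; AES_ENCRYPT expects a key of the appropriate length for the chosen key size):

```
-- Encrypt the value, then Base64-encode it so it is safe to write to a flat file:
ENC_BASE64(AES_ENCRYPT(CUST_SSN, 'abcd1234efgh5678'))

-- A public user-defined function is invoked with the :UDF prefix:
:UDF.FORMATNAME(FIRST_NAME, LAST_NAME)
```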
- Describe the use of a Normalizer transformation to normalize data
- Describe the use of an Aggregator to denormalize data
- Normalize data into a relational table
- Denormalize data into a Fact table.
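A common way to denormalize with an Aggregator is to pivot rows into columns using conditional aggregate expressions. A sketch, assuming quarterly sales rows grouped by a hypothetical `CUST_ID` port:

```
-- Output port Q1_SALES in an Aggregator grouped by CUST_ID:
MAX(IIF(QUARTER = 1, SALES, 0))
-- Output port Q2_SALES:
MAX(IIF(QUARTER = 2, SALES, 0))
```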
- Define Dynamic Lookup
- Describe the Dynamic Lookup Cache
- Use a Dynamic Lookup to load data into a dimension table.
- Use a Dynamic Lookup in tandem with an Update Strategy transformation to keep historic data in a dimension table
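A dynamic lookup cache exposes a `NewLookupRow` port (0 = no change, 1 = row inserted into the cache, 2 = row updated in the cache). An Update Strategy expression can be driven directly from it; one possible sketch:

```
-- Insert rows that are new to the cache, update changed rows,
-- and reject rows that are unchanged:
IIF(NewLookupRow = 1, DD_INSERT,
    IIF(NewLookupRow = 2, DD_UPDATE, DD_REJECT))
```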
Stored Procedure and SQL Transformations Theory
- Call a SQL stored procedure from a PowerCenter mapping
- Create and configure a SQL transformation in script mode.
- Create and configure a SQL transformation in query mode.
- Use a SQL transformation to create tables on an “as needed” basis.
- Enter a properly formatted query into a SQL transformation.
- Locate database errors in the result output of a SQL transformation.
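In query mode, the SQL transformation binds input-port values into the statement with `?port?` and substitutes text (such as a table name) with `~port~`. A sketch with hypothetical port and table names:

```
-- Query-mode statement; CUST_ID is bound, TABLE_NAME is substituted as text:
SELECT order_id, order_total FROM ~TABLE_NAME~ WHERE cust_id = ?CUST_ID?
-- Per-row database errors are returned on the SQLError output port.
```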
Troubleshooting Methodology and Error Handling
- Design error handling strategies appropriate for the intended purpose of a workflow
- Identify data errors and load them to an error table.
- Describe Update Strategies
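Update Strategy expressions resolve to the constants DD_INSERT, DD_UPDATE, DD_DELETE, or DD_REJECT; rejected rows can then be routed to an error table. A sketch (ports such as `EXISTS_FLAG`, e.g. set by a lookup, are hypothetical):

```
-- Reject rows with no customer id; otherwise update existing rows and insert new ones:
IIF(ISNULL(CUST_ID), DD_REJECT,
    IIF(EXISTS_FLAG = 1, DD_UPDATE, DD_INSERT))
```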
Transaction Control Transformation
- Describe the use of the transaction control transformation for data-driven transaction control
- Control when data is committed to disk or the target database
- Use a transformation variable to create a flag that determines when to commit data to the RDBMS based upon data values
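The Transaction Control transformation's condition resolves to constants such as TC_COMMIT_BEFORE, TC_COMMIT_AFTER, TC_ROLLBACK_BEFORE, or TC_CONTINUE_TRANSACTION. A sketch, assuming an upstream Expression transformation sets a hypothetical `NEW_DEPT_FLAG` by comparing each row to a transformation variable holding the prior department:

```
-- Commit the open transaction whenever the department changes:
IIF(NEW_DEPT_FLAG = 1, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION)
```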
Performance Tuning: Mapping Design
- Apply best practices in your mappings to optimize performance
- Locate session properties that can unnecessarily lower performance.
- Inspect and edit mappings for optimal performance design.
- Inspect and edit transformations for optimal performance design
Performance Tuning: Pipeline Partitioning Theory
- Apply partition points to efficiently utilize your CPU
- Partition your data to efficiently utilize your CPU
- Distribute your partitioned data to preserve functionality while optimizing your CPU
- Optimize your memory usage according to your partitioning strategy