- Develop multi-dimensional data warehouse projects, reading and loading high-volume Type 2 dimensions by implementing Slowly Changing Dimensions (SCD) with the required technologies.
- Discuss and gather ETL requirements for the data warehouse from business analysts; develop and support deliverables through the end of the project lifecycle, and build and support new ETL enhancements.
- Provide technical input and guidance to the team, contributing to the solution of complex business problems.
- Streamline and automate ETL processes to ensure completion within Service Level Agreement (SLA) timelines.
- Handle day-to-day production defects in ETL jobs and support them until business requirements are met.
- Extensively use IBM InfoSphere Information Server DataStage Designer and DataStage Director to develop jobs and to review log files for execution errors when solving business problems.
- Integrate data from disparate sources such as flat files (fixed-width, delimited), COBOL files, Microsoft Access database files (.accdb), Extensible Markup Language (XML) files, and relational databases (RDBMS: DB2, Teradata, SQL Server, Oracle).
- Build DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Capture, Change Apply, Column Generator, Difference, Row Generator, Sequential File, and CFF, along with sequence activities such as Sequencer, Email Notification, Command, and Terminator.
- Transform data using the available conversion functions and create highly complex custom conversion functions.
- Use DataStage Director and its run-time engine to monitor jobs, test and debug their components, and track the resulting executable versions on an ad hoc or scheduled basis.
- Document ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepare test data for testing, error handling, and analysis.
- Understand job behavior and enhance performance by tuning complex, long-running jobs.
- Prepare DDL statements for table creation, table modification, and index changes.
- Prepare DML statements for maintenance tables; review, test, and execute them.
- Develop batch/shell scripts for variable substitution, source-file preparation, scheduling, executing BTEQ scripts, error logging, exception handling, and address profiling.
- Build UNIX shell scripts to track DataStage job logs and distribute them to developer and analyst groups.
- Build real-time data-analysis scripts in Python using Jupyter Notebooks, exploring and integrating DataFrames and the Spark framework from the big-data ecosystem.
- Use Sqoop as the main data-migration tool, building scripts to migrate data from an RDBMS (DB2) to Hadoop (Hive).
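The SCD Type 2 loading described above can be sketched in plain Python. This is a minimal illustrative example, not the posting's actual implementation (which uses DataStage); the table layout, column names (`cust_id`, `city`, `eff_date`, `end_date`, `current_flag`), and high-date convention are all hypothetical.

```python
from datetime import date

# Hypothetical "open-ended" end date for the current version of each row.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension, incoming, today):
    """Apply Type 2 slowly-changing-dimension logic: when a tracked
    attribute changes, expire the current row and insert a new version,
    preserving full history. All field names here are illustrative."""
    current = {r["cust_id"]: r for r in dimension if r["current_flag"]}
    for rec in incoming:
        row = current.get(rec["cust_id"])
        if row is None:
            # Brand-new key: insert as the current version.
            dimension.append({"cust_id": rec["cust_id"], "city": rec["city"],
                              "eff_date": today, "end_date": HIGH_DATE,
                              "current_flag": True})
        elif row["city"] != rec["city"]:
            # Changed attribute: close out the old row, add a new version.
            row["end_date"] = today
            row["current_flag"] = False
            dimension.append({"cust_id": rec["cust_id"], "city": rec["city"],
                              "eff_date": today, "end_date": HIGH_DATE,
                              "current_flag": True})
    return dimension

# One existing current row; one change and one new key arrive.
dim = [{"cust_id": 1, "city": "Ames", "eff_date": date(2020, 1, 1),
        "end_date": HIGH_DATE, "current_flag": True}]
dim = apply_scd2(dim, [{"cust_id": 1, "city": "Des Moines"},
                       {"cust_id": 2, "city": "Ankeny"}], date(2021, 6, 1))
```

In a DataStage implementation the same compare-and-expire pattern is typically built with Change Capture/Change Apply or Transformer stages rather than hand-written code.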
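The log-tracking shell scripts mentioned above might look like the following sketch. It assumes the DataStage job log has already been exported to a text file (in practice via the `dsjob` command); the file names, log contents, and the commented-out mail step are hypothetical placeholders.

```shell
#!/bin/sh
# Sketch: scan an exported DataStage job log for failures and build a
# summary for distribution. Sample log stands in for real dsjob output.
cat > job_run.log <<'EOF'
INFO: job started
WARNING: row rejected in Transformer_5
FATAL: lookup failed on DIM_CUSTOMER
INFO: job finished
EOF

# Keep only warning/fatal lines, with line numbers for quick triage.
grep -n -E 'FATAL|WARNING' job_run.log > error_summary.txt

if [ -s error_summary.txt ]; then
    echo "Errors found:"
    cat error_summary.txt
    # In production this summary would be mailed to the developer and
    # analyst groups, e.g.: mailx -s "DataStage errors" dev-list < error_summary.txt
else
    echo "No errors found"
fi
```

A scheduler (cron or the enterprise scheduler) would typically run such a script after each job, so failures surface within SLA windows.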
Apply by Mail: Send Resume to firstname.lastname@example.org or HR, 1075 Jordan Creek Pkwy, Suite 295, West Des Moines, IA 50266