Dice Candidate Resume Information:

Looking for Corp-to-Corp: kumar_sri70@yahoo.com

Kumar

SUMMARY:

Over 5 years of IT experience in software analysis, data modeling, design, development, implementation and application development using Ab Initio, Business Objects, Oracle, Teradata, DB2, Java, ASP, MS SQL Server, Windows NT and UNIX environments, with emphasis on Business Intelligence and Data Warehousing.

Over 2 years of experience in developing ETL (Extraction, Transformation and Loading) strategies using the Ab Initio tool in complex, high-volume Data Warehousing projects.

Strong development skills, including the ability to work through the entire software development life cycle (SDLC), from requirements gathering through implementation, development, production support and documentation of the complete project.

Worked extensively in the GDE (Graphical Development Environment), configuring, connecting and testing components to produce executable flow graphs in a UNIX environment. Proficient with various Ab Initio data cleansing, parallelism, transformation and multifile system techniques.

Thorough knowledge of DML (Data Manipulation Language) and UNIX shell scripting (Korn and Bourne). Developed various UNIX shell wrappers to run Ab Initio and database jobs. Practical experience working in multiple environments: production, development and testing.

Worked with different source systems such as DB2, Oracle 9i/8i, MS SQL Server 2000/7.0, Teradata and Informix. Expert knowledge of dimensional data modeling, star and snowflake schemas, creation of fact and dimension tables, OLAP and OLTP, and a thorough understanding of data warehousing concepts.

Performed Debugging, Troubleshooting, Monitoring and Performance Tuning.

Extensively worked with various databases such as SQL Server 2000, Oracle, DB2 UDB and Teradata.

Extensively used Business Objects and Crystal Reports for reporting.

Extensively used Teradata utilities (MultiLoad, FastLoad and TPump).

Strong skills in writing SQL, PL/SQL and stored procedures in Oracle 9i/8i and MS SQL Server, and in optimizing SQL to improve performance.

Exceptional analytical and problem-solving skills, with the flexibility to learn new technologies that contribute to the company's success.

Excellent communication and interpersonal skills, positive attitude and perseverance to undertake challenging jobs

TECHNICAL SKILLS:

ETL Tools: Ab Initio GDE 1.13/1.11/1.10, Co>Operating System 2.13/2.11/2.10, Informatica 6.1

Reporting Tools: Business Objects 5.1/5.0/4.1, Crystal Reports 10.0/9.0/8.5/8.0

Data Modeling Tools: Erwin 4.0/3.5

Programming Languages: C, C++, SQL, PL/SQL, Shell Scripting (Korn Shell, C Shell), Perl, HTML

Operating Systems: UNIX, Linux, Windows NT/2000/95/98/ME, Solaris 7.0

Databases: Oracle 9.x/8.x/7.x, SQL Server 2000/7.0/6.5, Teradata V2R5/V2R4, DB2 UDB 7.2, MS Access

Utilities & Others: FastLoad, MultiLoad, Maestro, Control-M

PROFESSIONAL EXPERIENCE:

Merck, New Jersey Apr 05 - Present

Ab Initio Developer

Description: Responsible for performing escrow analysis and loading the (P and I) and (P and L) data marts. The warehouse also supported analysis for the marketing team by providing data about balloon payments and bankruptcy filings. All analysis was done under ECOA. The source (ODS) was a DB2 system, and the target data marts were designed in Teradata.

Responsibilities:

Involved in the full life cycle of the Ab Initio project, from requirements to deployment and support.

Instrumental in installing and configuring the Ab Initio Co>Operating System and GDE.

Extracted data from sources such as DB2 and Oracle using Ab Initio components.

Implemented partitioning techniques using MFS with Partition by Key, Partition by Expression and Round-Robin components on data unloaded from multiple tables, before sending the data through data quality checks.

Improved graph performance using phasing and by eliminating repeated partitions and sorts.

Implemented data parallelism in graphs, using Ab Initio partition components to divide data into segments and operate on each segment simultaneously.

Used Departition components such as Gather and Merge to combine data from multiple flows.

Used sandbox parameters to check graphs in and out of the repository.

Used phases and checkpoints to prevent deadlock and safeguard against failures.

Involved in developing UNIX Korn Shell wrappers to initialize variables, run graphs and perform error handling.

Involved in setting up the Control-M scheduler and automated the monthly, weekly and daily updates to stage and dimension tables using the UNIX KSH scripts.

Loaded data into the Teradata database using load utilities (FastLoad, MultiLoad and TPump).

Implemented performance tuning in Teradata database using Teradata utilities.

Involved in Unit testing and System testing.
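The Korn shell wrapper pattern described above (initialize variables, run the job, handle errors) can be sketched roughly as follows. Job names and the log path are purely illustrative, and plain /bin/sh syntax is used for portability; a real wrapper would invoke the deployed graph scripts instead of the dummy commands shown.

```shell
#!/bin/sh
# Minimal sketch of a job wrapper: run_job logs the start of a job,
# runs it, captures the exit status, and logs success or failure.
# Job names and the log path are illustrative, not from the resume.

LOGFILE="${LOGFILE:-/tmp/etl_wrapper.log}"

log() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >> "$LOGFILE"
}

run_job() {
    job_name="$1"; shift
    log "starting $job_name"
    "$@" >> "$LOGFILE" 2>&1     # run the job, capturing its output
    rc=$?
    if [ "$rc" -ne 0 ]; then
        log "$job_name FAILED (rc=$rc)"
    else
        log "$job_name completed"
    fi
    return "$rc"
}

# Demo with harmless commands standing in for deployed graph scripts:
run_job "dummy_ok" true && echo "dummy_ok succeeded"
run_job "dummy_bad" false || echo "dummy_bad failed as expected"
```

The same run_job function can wrap database loads as easily as graph runs, which is what makes this pattern convenient for scheduler-driven (e.g. Control-M) jobs.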

Environment: Ab Initio (GDE 1.11, Co>Operating System 2.11), Business Objects 5.1, Oracle 8i, DB2, Teradata V2R4, SQL, PL/SQL, Windows NT, UNIX (Sun Solaris), Shell Scripting.

Amex, Phoenix, AZ Jul 04 - Apr 05

Ab Initio Developer

Responsibilities:

Developed Ab Initio Graphs based on business requirements

Developed Ab Initio XFRs implementing various data validations and business rules.

Improved the performance of Ab Initio graphs by using performance techniques such as Lookups instead of Joins

Extracted data from staging tables, performed transformations and loaded data into warehouse tables using the Ab Initio GDE.

Implemented Lookups, Lookup Local, In-Memory Joins and Rollups to speed up the Graphs

Worked with departition components such as Gather and Interleave to combine and repartition data from multifiles as needed.

Implemented data parallelism in graphs, using Ab Initio partition components to divide data into segments and operate on each segment simultaneously.

Configured the Ab Initio environment to interact with databases using Input Table, Output Table and Update Table components and configuration files.

Experienced with Ab Initio components such as Round Robin, Join, Rollup, Partition by Key, Gather, Merge, Interleave, Dedup Sorted, Scan, Validate and FTP.

Executed the graphs using the Co>Operating System; also carried out validation, unit testing, regression testing and integration testing.

Documented complete graphs and their components.

Wrote complex SQL queries using joins, subqueries and correlated subqueries.

Wrote and implemented UNIX shell scripts for migrating the database from the production to the development system.

Deployed the graph as executable Korn shell scripts in the application system.

Extensively used VI editor.

Environment: Ab Initio (Co>Operating System 2.13, GDE 1.13.4), Oracle 9i, DB2, MS SQL Server 2000, Business Objects, UNIX Shell Scripts.

Bank of Oklahoma, Tulsa, OK Jan 04 - Jul 04

Ab Initio Developer

Responsibilities:

Involved in the preparation of mapping and design documents.

Involved in the preparation of documentation for ETL using Ab Initio standards, procedures and naming conventions.

Developed and supported the extraction, transformation and load (ETL) process for a data warehouse fed from OLTP systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.

Involved in designing fact, dimension and aggregate tables.

Created Database Configuration files (.dbc). Generated Table Configuration files (.cfg).

Documented the graphs and components.

Involved in the development of reports using Business Objects

Executed the graphs using the Co>Operating System.

Environment: Ab Initio (GDE 1.10.6, Co>Operating System 2.10), DB2, Oracle, Business Objects, UNIX, SQL Navigator.

Sierra Optima Ltd, New Jersey Jan 03 - Dec 03

Informatica Developer

Description: Created a data warehouse integrating source data from departments such as Finance, Sales and Marketing to provide a complete analytical solution.

Responsibilities:

Involved in Building Dimension tables and Fact tables.

Extensively used Informatica Power Center for Extracting, Transforming, and Loading into different databases.

Analyzed the data model and identified heterogeneous data sources.

Designed, developed and tested the different mappings according to the requirements.

Wrote PL/SQL stored procedures and triggers for implementing business rules and transformations.

Involved in implementing Informatica tool to set up the repository using Repository Manager.

Used Source Analyzer and Warehouse Designer to import the source and target database schemas.

Worked extensively with different transformation types: Source Qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored Procedure, Sequence Generator and Joiner.

Used the mapping designer to map the sources to the target.

Wrote triggers and stored procedures using PL/SQL for incremental updates.

Developed Informatica mappings and mapplets, and tuned them for optimum performance.

Extensively used ETL to load data from flat files into the staging area (Oracle 8i) using Informatica PowerCenter 6.0.

Completed the tasks in a given timeframe for every release.

Environment: Informatica PowerCenter 6.0, flat files, SQL Server 2000, Oracle 8i, UNIX.

Airtel Cellular, India Sep 01 - Nov 02

ETL Developer

Responsibilities:

Involved in the design, development and implementation of the Enterprise Data Warehousing (EDW) process and data mart.

Used Informatica PowerCenter 6.2 for migrating data from various OLTP servers/databases to the data mart

The data migration included identifying the various databases across which the data was scattered, understanding the complex business rules to be implemented, and planning the data transformation methodology.

Used relational sources and flat files to populate the data mart

Translated the business processes into Informatica mappings for building the data mart

Created and monitored complex Informatica mappings to load the data mart, involving extensive use of transformations such as Aggregator, Filter, Router, Expression, Joiner and Sequence Generator.

Used the Lookup transformation to access data from tables that were not sources for the mapping, and used Unconnected Lookups to improve performance.

Configured the mappings to handle updates and preserve existing records using the Update Strategy transformation.

Defined a Target Load Order Plan and constraint-based loading to load data correctly into different target tables.

Used Debugger to test the mappings and fix the bugs

Environment: Windows 2000, Informatica PowerCenter 6.2, Oracle 9i, SQL*Plus, TOAD.

Sun Technologies, India Nov 00 - Sep 01

Oracle Developer

Description:

Transaction data was extracted from various branches across the country, and reports were developed for the project, the most comprehensive intranet-powered information management system available to efficiently manage and integrate all financial and administrative data services. An FTP component was developed internally for this project to send the reports to the Sonata server. The system provides secure, intranet-based ATM transaction management.

Responsibilities:

Worked with Oracle databases to manage data for development, training and production environments.

Created database objects such as tables, views, synonyms, indexes, sequences and database links, as well as custom packages tailored to business requirements.

Developed and maintained scripts for monitoring, troubleshooting and administration of the databases.

Developed and tuned SQL, PL/SQL triggers and stored procedures.

Created scripts to insert static records and executed them using SQL*Plus.

Developed SQL*Loader scripts and conversion scripts for converting the data to the new custom schema.

Provided database support and worked closely with the development team as they submitted daily changes, recorded as releases to the production database.

Tuned the database, SQL statements and schemas for optimal performance.

Involved in the design and development of Forms and Reports using Developer/2000.
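The SQL*Loader conversion scripts mentioned above generally pair a control file with an sqlldr invocation. A rough sketch follows; the staging table, column names and file paths are hypothetical, and the sqlldr command is echoed rather than executed, with credentials deliberately omitted.

```shell
#!/bin/sh
# Illustrative SQL*Loader conversion load: generate a control file,
# then show the sqlldr invocation. Table/column names and paths are
# hypothetical, not taken from the project described above.

CTL=/tmp/load_customers.ctl

# Control file: append comma-separated records into a staging table.
cat > "$CTL" <<'EOF'
LOAD DATA
INFILE '/tmp/customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ','
(cust_id, cust_name, branch_code)
EOF

# Dry run: echo the command instead of running it. userid, control
# and log are standard sqlldr command-line parameters.
echo "sqlldr userid=\$ORA_USER control=$CTL log=/tmp/load_customers.log"
```

Keeping the generated control file under version control alongside the driving script makes such conversion loads repeatable across schema revisions.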

Environment: Oracle 9i, SQL, PL/SQL, SQL*Plus, Developer/2000 (Forms 6i, Reports 6i), SQL*Loader, UNIX, Windows NT.
