Resume - Ramesh Nayaka J P

RAMESH NAYAKA J P
C/O Sujatha, #163/1B, 7th Main, MV Garden
Sadananda Nagar, NGEF Layout, Bangalore - 560 038, India
Mob: +91 96323 93167; Email: rmshnyk@gmail.com
Experience of 9 years; seeking assignments as Data Scientist, Hadoop Developer/Administrator/Architect,
Business Analyst, or Database Architect (RDBMS or NoSQL) in the IT/Analytics sector
‘Improving profitability by reducing cost and building operational efficiency’
Key skills in the Hadoop ecosystem, MongoDB (NoSQL), IPython, Oracle SQL, PL/SQL, T-SQL, Teradata, Unix, SAS 9.1,
and Business Intelligence tools
SYNOPSIS
• 2.5 years in Hadoop administration and development, including 1.5+ years of experience with MongoDB (NoSQL)
• Good knowledge of Cloudera, Hortonworks, Windows Azure, and AWS (EMR, Redshift, S3, RDS, SWF)
• Implementation of MapReduce in Python
• Experience with Linux, UNIX shell scripting, Java, and ETL solutions
• Worked in teams of diverse skill sets and geographies
• Solely responsible for setting up Hadoop multinode clusters across different environments
• Implemented security and compliance best practices according to policies
• Overall knowledge of RDBMS platforms, data modeling, analytical tools, Big Data technology trends, Big Data vendors, and products
• Worked on proofs of concept; designed Hadoop deployment architectures with features such as high availability, scalability, process isolation, load balancing, and workload scheduling
• Shared Hadoop/NoSQL best practices with developers
• Worked on Pig, Hive, and Sqoop for data storage and ETL processes
• Supported NoSQL environments; published and implemented best practices and created a robust maintenance plan automating routine tasks
• Fair knowledge of the MapReduce development framework
• Configured HDFS to store up to 5 TB of data in a multinode (5-node) cluster environment
• Hands-on experience in SQL, PL/SQL, and SQL*Loader
• Extensive experience in RDBMS as well as NoSQL databases such as MongoDB
• Hands-on experience in creating stored procedures, functions, packages, and triggers in Oracle 9i
• Well versed in SQL*Loader for loading .csv files into Oracle tables
• Worked at the client location in Norwich (UK) for a month
• Worked at the client location in Paris (France) for two months on a Big Data project
• Knowledge of statistics and machine learning
• Worked as a Data Scientist on a Big Data project using the French tool Dataiku and IPython for data analysis
• Worked with SAS Enterprise Guide, Teradata SQL, SAS Add-In for Microsoft Excel, and SQL Server 2000/2005/2008
• Consistently exceeded organizational expectations with exceptional planning, analytical, and team-leading skills and the ability to work in cross-cultural and multi-ethnic environments
Technical Skill Set
RDBMS: SQL, PL/SQL, T-SQL, SQL Server 2000/2005/2008, Oracle 9i, Teradata
Tools/Packages: SQL*Loader, DTS Packages, SSIS
Big Data Tools: Hadoop 1.2.1, Hive, Pig, Sqoop, Oozie, ZooKeeper, Amazon cloud, HBase
Big Data Analytics Tool: Dataiku
NoSQL Database: MongoDB
Data Analysis: IPython, NoSQL, SQL
Operating Systems: MS Windows 2000/XP, Linux, UNIX
Certifications/Trainings
• Oracle 9i PL/SQL Developer Certified Associate (OCA)
• Oracle 9i Database Administrator Certified Professional (OCP)
• Certificate of participation, Red Hat Linux Essentials (RH033)
• Certificate of participation, Red Hat Linux System Administration (RH133)
• Certificate of participation from 10gen (MongoDB), M101P: MongoDB for Developers
• Certificate of participation from 10gen (MongoDB), M102: MongoDB for DBAs
Employment Snapshot
• Sr. Specialist (Process Lead), Business Intelligence and Analytics (BIA), AXA Business Services, Bangalore, India (Jul 2012 to date)
• Analyst, WNS Global Services Pvt. Ltd, Bangalore, India (Nov 2010 to Jun 2012)
• Software Engineer, Aroha Technologies, Bangalore, India (Jan 2008 to Oct 2010)
• Support Executive, Idenizen Smartware Pvt. Ltd., Bangalore, India (Nov 2005 to Mar 2007)
Hadoop Innovation Lab (December 2012 to date)
Administration:
• Installation and configuration of Hadoop 1.2.1, Pig, Hive, HBase, ZooKeeper, Kafka, Spark, Sqoop, and MongoDB
• Hadoop cluster setup in single-node, pseudo-distributed, and fully distributed modes
• Troubleshooting
• Commissioning and decommissioning of nodes in a cluster
• Monitoring Hadoop cluster connectivity and security
• Managing and reviewing Hadoop log files
• File system management and monitoring
• HDFS support and maintenance
• Table partitioning in Hive
• Managing own virtual private servers
• Managing HDFS built on top of FAT32, NTFS, and ext3 file systems
• Installing and configuring MongoDB Compass
• Installing and configuring Python virtual environments
Development & Data Modeling:
• Created a multinode cluster with the following specifications:
  - 1 node running Windows XP with SQL Server 2008 installed (for SQL Server data)
  - 1 node running Red Hat Linux with Oracle 10g installed (for Oracle data)
  - 1 node running Windows 7 with MS Office installed (for Excel files)
  - 1 node running Ubuntu with MongoDB installed
• Configured HDFS on all the nodes to store 5 TB of data
• Installed and configured HBase, Pig, and Hive to process the data
• Installed and configured Oozie workflows to import/export data across all these environments
• Wrote Hive scripts for data transformation
• Created a dashboard in Microsoft Excel that refreshes data from Hive using Hive connectors
• Database design from the OLAP system to HBase
• Transformed and loaded OLAP data into HBase (a column-oriented database)
• Developed and published reports in Tableau (visualization tool)
• Developed automated scripts for Sqoop data extraction (see the sketch below)
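A minimal sketch of the kind of automated Sqoop extraction script mentioned above. The connection string, credentials file, table name, and HDFS path are hypothetical; the script simply wraps the standard sqoop command-line tool so a scheduler (such as the Oozie workflows listed here, or cron) can run it:

import subprocess
from datetime import date

# Hypothetical source database and target HDFS directory.
JDBC_URL = "jdbc:sqlserver://winxp-node:1433;databaseName=Sales"
TARGET_DIR = f"/data/raw/sales/{date.today():%Y-%m-%d}"

cmd = [
    "sqoop", "import",
    "--connect", JDBC_URL,
    "--username", "etl_user",
    "--password-file", "/user/etl/.sqoop_pwd",  # keep the password out of the command line
    "--table", "Orders",
    "--target-dir", TARGET_DIR,
    "--num-mappers", "4",
]

# Fail loudly so the calling workflow can detect the error and retry.
subprocess.run(cmd, check=True)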
Projects Executed
AXA US – FMC
Client: AXA Equitable – New York (USA)
Duration: Oct 2015 to date
Team Size: 3
Role: Developer
Platform: MS Windows 7
Tools & Access: Tableau 9.2, Passport, Essbase
Brief: The FMC team in AXA US wanted to convert their existing 32 reports, currently in Excel and another reporting tool, to Tableau. Our team worked on this project along with another vendor, whose role was to interview stakeholders to understand the requirements and prepare the report templates.
Responsibilities:
• Data cleansing based on the requirements
• Understanding the data sources and preparing a data model to build a cube
• Producing prototypes of the reports
• Designed and developed the architecture for the whole project
• Developed reports in the Tableau 9.2 visualization tool
Elite Producer Group(EPG) NoSQL Database Migration
Client: AXA Equitable – New York (USA)
Duration: Jan 2015 to Sep 2015
Team Size: 2
Role: Administrator and Developer
Platform: MS Windows XP
Environment: MS-Access 2007, MongoDB, Node.js
Brief: The Elite Producer Group is a group of agents whose annual net worth is 250 million dollars, and a
database was in place to track their performance and maintain the information. There was a request from the NY
office to enhance the EPG database, since it held a lot of redundant data. Our work therefore involved cleansing
and restructuring it from relational tables to NoSQL, and creating new queries and reports.
Responsibilities:
• Data cleansing based on the requirements
• Converted the relational database schema design to a NoSQL schema (see the sketch after this list)
• Worked with JSON data
• Created JavaScript scripts for CRUD operations
• Designed and implemented replication and sharding in multinode cluster environments
• Indexing and monitoring of MongoDB
• Sharding setup, monitoring, and shard key selection
• Installed and configured MongoDB databases and related software
• Interacted with client projects in cross-functional teams
• Supported and troubleshot issues, working closely with end users
• Understood the requirements and created tables, queries, forms, and reports
• Data integration from different sources
• Wrote Excel VBA code to connect to the data source and retrieve the data automatically
• Interacted with the US client on a need basis to provide status on the project
• Prepared the requirement document and maintained the status of requests
• Security, backups, and restoring of MongoDB backups
• Trained peers on MongoDB adoption
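A minimal sketch of the kind of relational-to-document migration and indexing described above, using the pymongo driver. The table rows, collection name, and field names are hypothetical illustrations, not the project's actual schema:

from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")
db = client["epg"]

# Hypothetical rows pulled from relational "agents" and "policies" tables.
agent_rows = [{"agent_id": 1, "name": "J. Doe", "region": "NY"}]
policy_rows = [
    {"agent_id": 1, "policy_no": "P-100", "premium": 1200.0},
    {"agent_id": 1, "policy_no": "P-101", "premium": 800.0},
]

for agent in agent_rows:
    doc = {
        "_id": agent["agent_id"],
        "name": agent["name"],
        "region": agent["region"],
        # Embed the one-to-many relationship instead of keeping a join table.
        "policies": [p for p in policy_rows if p["agent_id"] == agent["agent_id"]],
    }
    db.agents.replace_one({"_id": doc["_id"]}, doc, upsert=True)

# Index to support the lookups the reports run most often.
db.agents.create_index([("region", ASCENDING)])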
AXA-US (403b) (Big Data Project)
Client: AXA US – New York
Duration: Mar 2014 to Dec 2014
Team Size: 7
OnSite: 4 Sept to 4 Oct 2014 (1 month), 11 Nov to 10 Dec 2014 (1 month)
Role: Data Scientist
Platform: Nano Server
Tools/Languages: Dataiku, IPython, SQL, statistics (random forest, machine learning)
Brief: AXA US (403b) is a project related to big data analytics. The objective of the project is to build a target list of customers who are likely to increase their contributions. The Data Innovation Lab had set up a nano server to store terabytes of data.
Responsibilities:
• Data integration from different sources
• Exploring internal and external data sources
• Validating the data sources
• Finding important datasets and variables to build the statistical models
• Developing data transformation steps using Dataiku
• Finding features from the internal and external sources
• Finding and developing derived variables
• Developing Python scripts to include in workflows
• Building final datasets that include all the important variables
• Building and training models using Dataiku (see the sketch after this list)
• Updating the data analysis document with all the necessary information about the data exploration
• Preparing the requirement document and maintaining the status of requests
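A minimal sketch of the kind of model-training step such a workflow might include. The project lists Dataiku and IPython; scikit-learn, the file name, and the column names here are assumptions for illustration only:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical final dataset: derived variables joined from internal and external sources.
df = pd.read_csv("customers_with_derived_variables.csv")
X = df.drop(columns=["customer_id", "increased_contribution"])
y = df["increased_contribution"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Rank held-out customers by their likelihood of increasing contributions.
scores = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, scores))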
Service Datawarehouse
Client: AXA Travel Insurance – Redhill (UK)
Duration: Feb 2013 to May 2013
Team Size: 3
Role: Database Developer
Platform: Windows 2003 Server
Environment: Microsoft Business Intelligence Studio [SSIS, SSAS, SSRS]
Brief: AXA Assistance receives call-detail files from different entities as .xls and .csv files. ATI wanted a data warehouse integrated with the underwriting database, from which it could generate a Tier 1 report by collating all this information from the service and underwriting databases.
Responsibilities:
• Shared technical documents with the client
• Governance calls scheduled for ATI
• Understood the requirements and created tables and T-SQL scripts
• Data integration from different sources
• Used SSIS to create packages for daily, weekly, and monthly runs
• Prepared the requirement document and maintained the status of requests
Commercial Marketing Analytics
Client: Confidential (Insurance)
Duration: Nov 2010 to Jul 2012
Team Size: 7
Role: Database Analyst
Platform: MS Windows XP, Unix
Environment: SAS, Unix, Teradata, Excel VBA, MS Map point
Brief: This is a marketing insights project that requires generating analytical reports using SAS EG and
Teradata. Aviva migrated its data warehouse to Teradata, sensing its seamless capabilities to produce
impeccable business intelligence solutions. Ours was a new team of seven members, and we were sent to
Norwich, UK for three weeks of process training at the client's site. We learned how Aviva does its marketing
research and predicts and develops rigorous models to acquire new customers. The project gave us an opportunity
to gain deep insight into the insurance domain. The client gave us rigorous training and knowledge transfers on how
to run existing reports in SAS and create new ones when a requirement arises. We generated reports for the
Capability team, which is designated to run campaigns for Aviva's various new offers and products. These campaigns
were usually targeted at a group or class of the population, and hence our reports help the Capability team
understand the ways a particular group responds or might respond to a particular campaign.
Module Description: Apart from generating the portal report, the client gave us the responsibility of building a
data catalogue for some of the views present in the Teradata database. By data catalogue, the client meant a handy
document of the views' descriptions. For instance, a database might contain views which are updated weekly and
hold customer and policy information. These weekly views are generously used by modellers to create analytical
and predictive models for campaigns and insights. Hence, this data catalogue of the weekly views helps the
modellers see the various columns and their attributes in a consolidated way. Developing the data dictionary
involved a deep understanding of Teradata system tables and SQL (a sketch follows).
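A minimal sketch of how such a catalogue could be pulled from Teradata's data dictionary. The teradatasql Python driver, connection details, and database name are assumptions for illustration only; the project itself used Teradata SQL Assistant and SAS:

import teradatasql  # assumed driver, not named in the project

# Hypothetical connection details.
con = teradatasql.connect(host="tdhost", user="analyst", password="***")
cur = con.cursor()
# DBC.ColumnsV is Teradata's data dictionary view of column metadata;
# filter it to the (hypothetical) database that holds the weekly views.
cur.execute(
    """
    SELECT TableName, ColumnName, ColumnType, ColumnLength, Nullable
    FROM DBC.ColumnsV
    WHERE DatabaseName = 'WEEKLY_VIEWS'
    ORDER BY TableName, ColumnId
    """
)
# Each row becomes one line of the consolidated data catalogue.
for table_name, column_name, column_type, column_length, nullable in cur.fetchall():
    print(table_name.strip(), column_name.strip(), column_type, column_length, nullable)
cur.close()
con.close()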
Responsibilities:
• Maintained documentation for all the tasks processed
• Collated all the requirements from the client for the data catalogue
• Wrote SQL queries to analyze the existing data and monitor the behavior of the views
• Wrote complex SQL queries to generate a consolidated data catalogue for monthly and weekly views
• Incorporated descriptions and identified missing populations for each of the columns of all the views
• Wrote procedures and created views to automate certain modules
• Wrote Excel VBA code to connect to the data source and retrieve the data automatically
• Performed quality checks on the data available in the catalogue
• Involved in creating and modifying SQL script files
• Involved in creating maps using the MapPoint software; wrote VBA code to connect to MapPoint to create maps and to show customer penetration by territory across all business areas
• Interacted with the UK clients on a weekly/monthly basis to provide status on projects
• Conducted training for peers on SQL, PL/SQL, database concepts, and business logic
• Worked extensively on ad-hoc requests from the client
• Involved in executing/modifying monthly model scores written in Unix
• Wrote scoring code that assists in scoring customers
• Took the initiative to run weekly and monthly BAU reports
• Created/maintained the campaign reporting dashboard, planning packs, and quarterly pack reports on a weekly/fortnightly/monthly basis
• Created and maintained the legal care report to show customer retention over a 13-week cycle
• Took the initiative to create reports on the SAS portal using Web Report Studio and published the reports on the client portal
• Automated reports such as the campaign dashboard, the maps & penetration report, and the data dictionary, which in turn reduced the time taken to generate the reports manually
• Prepared checklists to attain accuracy of the generated reports and maintain their data quality
• Conducted demonstrations with clients on the above-mentioned report automation
RDCP
Client: HMC (WellPoint) - Health Care
Duration: Jun 2010 to Sep 2010
Team Size: 4 (Mu-Sigma)
Role: Database Analyst
Platform: MS Windows 2003 Server
Environment: SQL Server 2005/2008, Teradata, Informatica 8.2, Oracle 10g, SAS
Brief: RDCP provides the details for the analysis and data correction process associated with client member
demographics, eligibility, and claims (medical and pharmaceutical). It has been developed in support of the
report delivery requirements to provide Client Outcomes with accurate eligibility and claims data to facilitate
the production of their Annual DM (Disease Management) client reports.
The scope of the RDCP is aimed at making WellPoint’s EDL Membership and Claims data, which has
been successfully loaded into the Information Hub, available for Client Outcomes’ reporting needs. Alloy is used
as the workflow management tool, identifying requirements and tracking their progress. When a report is
required, a ticket is entered through Alloy, triggering the report development and delivery cycle.
Responsibilities:
• Involved in creating/modifying procedures and packages
• Involved in writing scripts in the Teradata environment
• Analyzed user requirements
• Created test data and executed unit test cases
• Involved in importing data from flat files into databases
• Interacted with the US clients on a daily basis to provide status
• Received internal training on the metadata repository using Informatica 8.2
• Extracted customer base data from Teradata databases using Teradata SQL Assistant, SAS/Base, and the data warehouse, and prepared it into a standard format using SAS functions
Organization Recognition
• Awarded Best Employee of the Month for April 2014
• Awarded Best Sr. Specialist in Q3 2014
• Best Innovation Award for providing big ideas and implementing them
• Special Achievement Award for contribution towards the success of the projects
Education
• M.C.A (Master of Computer Applications), P.E.S Institute of Technology, Visveswaraiah Technological University, Karnataka, India (2005)
• B.Sc (Computer Science), SJM College, Kuvempu University, Karnataka, India (2002)