Hadoop Sample Resume-5

xxxx xxxx
xxxxx
San Diego, CA 92127
Email: [email protected]
Ph: xxx-xxx-xxxx
SUMMARY
• Hands-on technical architect with 14 years of experience building data platforms.
• Architected a data platform supporting more than 25 million daily active users and more than 60 million monthly active users.
• Built big data warehouse systems supporting petabyte-scale data.
• Strong experience with relational and NoSQL databases.
• Strong experience building data pipelines / ETL.
• Strong experience with distributed systems and big data.
• Strong experience developing and supporting enterprise-level SaaS applications.
• Excellent analytical, problem-solving and communication skills.
TECHNICAL SKILLS
Databases: MS SQL Server, Redshift, Hive and Couchbase
Big Data: Cloudera Hadoop, Hive, Pig, Impala, Sqoop, Oozie and MapReduce
Cloud: Amazon AWS, S3, EC2, Redshift and Qubole
Programming: SQL, C++, Java and Python
EDUCATION
MS in Computer Science, West Virginia University, Morgantown, WV (May 2001)
BS in Computer Science, Nagarjuna University, India (May 1999)
WORK EXPERIENCE
Manager / Data Architect
Sterkly, Carlsbad, CA
(Mar 2011 to Present)
Responsible for managing the backend data platform and data infrastructure, and for coordinating with teams including IT, engineering, business analysts and product management.
Technologies used: Cloudera Hadoop, Hive, Pig, Impala, Sqoop, Oozie, SQL Server, Couchbase, AWS and Redshift.
Some of the key projects / accomplishments:
• Designed and implemented a scalable big data collection system to ingest data from different sources. Any application can use the system by registering a message type and then sending data. At peak it collects about 12 million data points per minute and more than 4 billion messages/records per day; we process about 1 TB of data per day through this system and store it in HDFS.
• Built end-to-end scalable data pipelines / ETL to ingest data from different sources and to process and transfer it to various destinations using Pig, Hive, Sqoop and Oozie.
• Designed and implemented a big data warehouse with nearly 1 PB of data using HDFS and Hive.
• Designed and implemented backend systems for a custom ad exchange platform that serves more than 2 billion impressions per day using Couchbase.
• Designed and re-architected a scalable data platform based on a distributed shared-nothing architecture. The system currently runs on 4 nodes, supports 25 million daily active users and more than 100k transactions per second with 99.99% uptime, and can scale up to 256 nodes.
• Designed and implemented the infrastructure for real-time analytics.
• Built a strong data infrastructure team, recruiting highly talented members and motivating them to be productive.
• Successfully transitioned the team to building the platform with NoSQL / big data technologies in addition to traditional relational databases.
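As a simplified illustration of the message-registration ingestion pattern described above (the class, method and field names here are hypothetical, invented for this sketch; the production system batched records into date-partitioned HDFS paths on Cloudera Hadoop, not an in-memory dict):

```python
# Hypothetical sketch of the ingestion pattern: applications register a
# message type once, then send records, which land in date-partitioned
# "paths" (HDFS with Hive-style dt= partitions in the real system).
from collections import defaultdict
from datetime import datetime, timezone

class MessageRegistry:
    def __init__(self):
        self.schemas = {}                     # message type -> expected fields
        self.partitions = defaultdict(list)   # partition path -> records

    def register(self, msg_type, fields):
        self.schemas[msg_type] = set(fields)

    def ingest(self, msg_type, record):
        if msg_type not in self.schemas:
            raise KeyError(f"unregistered message type: {msg_type}")
        if set(record) != self.schemas[msg_type]:
            raise ValueError("record does not match registered schema")
        # Date-partitioned layout, mirroring a Hive dt= partition in HDFS.
        dt = datetime.now(timezone.utc).strftime("%Y-%m-%d")
        self.partitions[f"/data/{msg_type}/dt={dt}"].append(record)

registry = MessageRegistry()
registry.register("page_view", ["user_id", "url"])
registry.ingest("page_view", {"user_id": 42, "url": "/home"})
```

Registration up front is what lets new applications start sending data without pipeline changes: the schema check happens at ingest time, and downstream Hive/Pig jobs only see partitioned, validated data.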
Sr Database Administrator
Blackbaud Inc., San Diego, CA
(Nov 2004 to Mar 2011)
Leading provider of SaaS online CRM, CMS and payment processing software to nonprofit organizations. (Formerly Kintera Inc., acquired by Blackbaud.)
Responsibilities included:
• Designed and developed complex multi-tenant database applications for the SaaS platform.
• Production monitoring, performance optimization and scalability for high-volume OLTP database applications.
• Database administration and maintenance, including production deployment and production support.
• Replication setup and maintenance; reporting services development.
• Worked with UI developers on T-SQL code reviews and helped them with database and SQL problems.
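A minimal sketch of the multi-tenant pattern that SaaS database applications like these rely on: every table carries a tenant_id column and every query is scoped to a single tenant. SQLite stands in for SQL Server here, and the table and column names are illustrative, not the actual schema.

```python
# Multi-tenant sketch: one shared table, every query filtered by tenant_id.
# SQLite is a stand-in for SQL Server; schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donations (tenant_id INTEGER, donor TEXT, amount REAL)")
rows = [(1, "alice", 50.0), (1, "bob", 25.0), (2, "carol", 100.0)]
conn.executemany("INSERT INTO donations VALUES (?, ?, ?)", rows)

def tenant_total(conn, tenant_id):
    # Parameterized query keeps tenants isolated and avoids SQL injection.
    cur = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM donations WHERE tenant_id = ?",
        (tenant_id,))
    return cur.fetchone()[0]

print(tenant_total(conn, 1))  # 75.0
```

The design choice is the usual SaaS trade-off: shared tables scale to many tenants cheaply, at the cost of making the tenant_id filter mandatory in every query path.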
Database Administrator
Megabyte Systems Inc., Irvine, CA
(2001 to Nov 2004)
Leading provider of property tax software for California counties.
Responsibilities included:
• Installing, configuring and administering our own as well as our clients’ production and report servers.
• Managing database security, logins, roles and object permissions.
• Implementing database backup and recovery procedures.
• Evaluating and monitoring databases to resolve performance issues, and fine-tuning queries and stored procedures.
• Creating and managing DTS packages, triggers and stored procedures.
• Implementing and administering database replication between SQL Server databases.
• Providing SQL Server consultation to our development team.
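As a tiny stand-in for the backup and recovery procedures listed above, the sketch below runs a backup-and-restore drill using Python's sqlite3 backup API (SQL Server's BACKUP/RESTORE plays this role in practice; the table and data here are hypothetical).

```python
# Backup-and-verify drill: copy the database, lose data on the primary,
# confirm the copy still answers queries. sqlite3 stands in for SQL Server.
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE parcels (apn TEXT, tax REAL)")
src.execute("INSERT INTO parcels VALUES ('123-456', 1500.0)")
src.commit()

backup = sqlite3.connect(":memory:")
src.backup(backup)  # full online copy, the moral equivalent of BACKUP DATABASE

# Simulate loss on the primary, then verify the backup still has the row.
src.execute("DELETE FROM parcels")
restored = backup.execute(
    "SELECT tax FROM parcels WHERE apn = '123-456'").fetchone()
print(restored[0])  # 1500.0
```

The point of the drill is the verification step: a backup procedure is only trusted once a restore from it has been queried successfully.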
Research Assistant (1999 to 2001)
West Virginia University, Morgantown, WV
Conducted research on graph algorithms, focusing on the analysis and implementation of various algorithms using complex data structures in C++.