xxxxx xxxxx

Email:

Phone: xxx-xxx-xxxx

Current Location: Boston, MA

Summary

·  7+ years across the full software development life cycle, including 3+ years as a Hadoop developer/administrator/data scientist (Sqoop, Pig, Hive, MapReduce, Oozie, Hue, Flume, HBase), covering analysis and development.

·  Expertise in Hadoop ecosystem components (HDFS, MapReduce, HBase, Pig, Sqoop, and Hive) for scalable, distributed, high-performance computing.

·  Experience creating a Hadoop cluster, tearing it down, using the Fair Scheduler, and configuring HDFS High Availability.

·  Cloudera Certified Developer for Apache Hadoop.

·  Experience writing MapReduce programs on Apache Hadoop to process big data.

·  Experience installing and running Hadoop daemons: NameNode, JournalNode, ZooKeeper Failover Controller, ZooKeeper Server, DataNode, NodeManager, ResourceManager, and JobHistoryServer.

·  Experience installing, configuring, supporting, and monitoring Hadoop clusters on Apache and Cloudera distributions.

·  Experience in using Pig, Hive, Sqoop, HBase and Cloudera Manager.

·  Experience working with a Kerberos realm (domain ASA.ORG) alongside Cloudera Manager.

·  Experience importing and exporting data with Sqoop between HDFS and relational database systems.

·  Experience analyzing data using HiveQL (via Beeline), Pig Latin, and custom MapReduce programs in Java.

·  Experience with Java virtual machine (JVM) and multi-threaded processing.

·  Worked on NoSQL databases including HBase, Cassandra and MongoDB.

·  Participated in setting up ZooKeeper and configuring the system for high availability.

·  Experience designing, developing, and implementing connectivity components that allow efficient data exchange between the core database engine and the Hadoop ecosystem.

·  Good understanding of XML technologies (XML, XSL, XSD), including web services and SOAP.

·  Techno-functional responsibilities include interfacing with users, identifying functional and technical gaps, producing estimates, designing custom solutions, developing, producing documentation, and providing production support.

·  Excellent interpersonal and communication skills; creative, research-minded, technically competent, and results-oriented, with problem-solving and leadership skills.

·  Experience with Agile and Waterfall methodologies.

Technical Skills

Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, MongoDB, Cassandra, Oozie, ZooKeeper

Java: Core Java, JDBC

IDEs: Eclipse, NetBeans, Visual Studio, RStudio

Big data Analytics: R

Programming Languages: R, C, C++, Java, C#, ASP.NET

Databases: Oracle 11g/10g/9i, MySQL, DB2, MS-SQL Server

Web Servers: WebLogic, WebSphere, Apache Tomcat, IIS

Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL

Network Protocols: Kerberos, TCP/IP, UDP, HTTP, DNS, DHCP

Testing: Selenium, JUnit, NUnit, WinRunner, LoadRunner, QTP

Professional Experience:

American Student Assistance, Boston, MA Aug 2011 – Present

Big Data/Hadoop Developer

American Student Assistance is a private nonprofit organization with a public mission to empower students and alumni to successfully manage and repay their college loan debt. ASA analyzes, designs, develops, and implements student benefit applications for its various clients. Among these applications, SALT helps students access benefits and loan repayment programs, giving students and alumni a personalized user experience and helping them understand their loans and repayment options. The goal of the application is a single-point solution that supports all lines of business in a common, unified participant experience.

Project: FPA Oct 2013 – Present

FPA is a File Processing Architecture system in which raw data is received in different formats (text, logs, CSV, etc.) from sources such as universities, colleges, surveys, SALT, the loan section, and the Education Department. The data is loaded into HDFS, and MapReduce jobs run over the raw student data to analyze it and calculate student loan default ratings. The results are stored in HDFS and then loaded into the FPA database for further analysis and reporting. A minimal sketch of such a job follows.
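
The sketch below is illustrative only and assumes a hypothetical comma-delimited loan record layout; the class names (DelinquencyCount, LoanMapper, SumReducer) and the "DELINQUENT" status value are placeholders, not the actual project code.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DelinquencyCount {

    // Assumes CSV rows of the form: studentId,loanId,...,status
    public static class LoanMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text studentId = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            // Emit (studentId, 1) for every delinquent loan record.
            if (fields.length > 2 && "DELINQUENT".equals(fields[fields.length - 1].trim())) {
                studentId.set(fields[0]);
                context.write(studentId, ONE);
            }
        }
    }

    // Sums delinquent-loan occurrences per student.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "delinquency-count");
        job.setJarByClass(DelinquencyCount.class);
        job.setMapperClass(LoanMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // raw loan files in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // aggregated results
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}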

Responsibilities:

·  Loaded and transformed large sets of structured, semi-structured, and unstructured data.

·  Extracted data from Oracle and SQL Server through Sqoop, placed it in HDFS, and processed it. Involved in defining job flows and managing and reviewing log files.

·  Installed, configured, and used Hadoop ecosystem components including MapReduce, HDFS, Hive, Pig, Sqoop, HBase, Flume, and Spark.

·  Imported and exported data between RDBMS and HDFS using Sqoop.

·  Wrote Hive queries for data analysis to meet business requirements (a minimal sketch follows this list).

·  Moved 10 TB of student loan volumes into HDFS and analyzed them with R graphics to identify students with delinquent loans over the last 10 years.

·  Gained solid experience with NoSQL databases.

·  Helped migrate from MRv1 to MRv2/YARN (from a Secondary NameNode to a Standby NameNode).

·  Created Hive tables, loaded them with data, and wrote Hive queries that run internally as MapReduce jobs.

·  Installed and configured Pig for ETL jobs; wrote Pig scripts with regular expressions for data cleaning.

·  Team player with good interpersonal relations, strong leadership and problem solving skills.

·  Analyzed large data sets to determine the optimal way to aggregate and report on them.

·  Wrote custom MapReduce jobs to extract data per requirements.

·  Participated in building scalable, distributed data solutions using Hadoop.

·  Helped the system administrators with cluster maintenance: adding and removing cluster nodes, cluster monitoring and troubleshooting, and managing and reviewing data backups and Hadoop log files.

·  Handled importing data from various sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Teradata into HDFS using Sqoop.

·  Analyzed data with Hive queries and Pig scripts to understand user behavior.

·  Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
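
A minimal sketch of the kind of Hive analysis mentioned above, run here over Hive JDBC; the HiveServer2 URL, credentials, and the student_loans table with its columns are assumptions for illustration, not the actual schema.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class LoanRatingQuery {
    public static void main(String[] args) throws Exception {
        // Standard HiveServer2 JDBC driver; endpoint and credentials are hypothetical.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hive-server:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {
            // Hypothetical query: delinquency counts by school over the last 10 years.
            ResultSet rs = stmt.executeQuery(
                "SELECT school, COUNT(*) AS delinquent_loans " +
                "FROM student_loans " +
                "WHERE status = 'DELINQUENT' AND loan_year >= 2003 " +
                "GROUP BY school ORDER BY delinquent_loans DESC");
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
        }
    }
}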

Environment: Hadoop, MapReduce, Java, HDFS, HBase, Hive, SQL, Pig, Sqoop, Oozie, ZooKeeper, R

Saltmoney.org, Boston, MA Aug 2011 – Oct 2013

Big Data/Hadoop Developer

SALT is a free, nonprofit-backed educational program that helps every student who wants a college degree to get it in a financially responsible way. SALT's neutral advice, practical information, and interactive lessons help students gain money knowledge for college and beyond, keeping them on the path to success.

Responsibilities:

·  Imported and exported data between RDBMS and HDFS using Sqoop.

·  Analyzed student data and feeds from the Saltmoney and community websites.

·  Created reusable Hive queries to support day-to-day as well as ad hoc analysis.

·  Created a strategy for building a data warehouse on the Hadoop cluster.

·  Analyzed large data sets to determine the optimal way to aggregate and report on them.

·  Provided quick responses to ad hoc internal and external client requests for data and created ad hoc reports.

·  Loaded and transformed large sets of structured, semi-structured, and unstructured data using Hadoop/Big Data concepts.

·  Responsible for creating Hive tables, loading data, and writing Hive queries.

·  Handled importing data from various sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.

·  Extracted data from SQL Server into HDFS using Sqoop (a minimal sketch follows this list).
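
A minimal sketch of the Sqoop import described above, driven from Java via Sqoop's runTool entry point (Sqoop 1.x on the classpath is assumed); the connection string, password file, and students table are hypothetical.

import org.apache.sqoop.Sqoop;

public class StudentImport {
    public static void main(String[] args) {
        // Equivalent to running "sqoop import ..." from the command line.
        String[] sqoopArgs = new String[] {
            "import",
            "--connect", "jdbc:sqlserver://sql-host:1433;databaseName=salt", // hypothetical source
            "--username", "etl_user",
            "--password-file", "/user/etl/.sqlserver.password",
            "--table", "students",                 // hypothetical source table
            "--target-dir", "/data/salt/students", // HDFS landing directory
            "--num-mappers", "4"
        };
        int exitCode = Sqoop.runTool(sqoopArgs);
        System.exit(exitCode);
    }
}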

Environment: Hadoop, MapReduce, Java, HDFS, HBase, Hive, SQL, Pig, Sqoop, Oozie, ZooKeeper, R, C#

Liberty Mutual, Dover, NH Jun 2009 – Jul 2011

Data Analyst

The Elements project will introduce a new multiplicative rating program to the Homeowner line of business. Elements will use well-known loss predictors (e.g. credit and prior losses) and new predictors (e.g. smoker and granular territories) in conjunction with a by-peril structure in underwriting and rating to determine more competitive and profitable rates. Elements will also provide clarity to the rating plan by differentiating cost factors from marketing factors. This new pricing plan will increase future growth and profitability by appropriately pricing each policy sold based on level of risk.

Responsibilities:

·  Performed time-series analysis to identify sales and net revenue trends by location (a minimal sketch follows this list).

·  Maintained, troubleshot, and reset the SQL database while keeping backups.

·  Worked with clients to integrate third-party data into the existing database.

·  Identified the granularity level of the data required to be available for analysis.

·  Interacted with business representatives and end users to analyze requirements and define business and functional specifications.

·  Drafted prototypes detailing KPIs (key performance indicators) and key metrics.

·  Coordinated with the management to discuss business needs and designed database solutions accordingly.

·  Involved in gathering specifications and requirements from development personnel prior to testing.

·  Analyzed complex requirements; strong influence-management, reporting, and analytical/problem-solving skills with attention to detail.

·  Highly motivated team player with excellent interpersonal and customer relations skills, proven communication, organizational, analytical, and presentation skills, and leadership qualities.

·  Outstanding data analysis skills, including data mapping from source to target database schemas, data cleansing and processing, writing data extract and conversion scripts, and researching complex data problems.
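
A minimal sketch of the kind of time-series aggregation mentioned above, using plain JDBC against Oracle; the connection details and the sales table with location, sale_date, and net_revenue columns are assumptions for illustration.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RevenueTrend {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle connection; credentials would normally come from configuration.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@db-host:1521:ORCL", "report_user", "secret");
             Statement stmt = conn.createStatement();
             // Monthly net revenue per location, ordered for trend inspection.
             ResultSet rs = stmt.executeQuery(
                 "SELECT location, TO_CHAR(sale_date, 'YYYY-MM') AS sale_month, " +
                 "       SUM(net_revenue) AS revenue " +
                 "FROM sales " +
                 "GROUP BY location, TO_CHAR(sale_date, 'YYYY-MM') " +
                 "ORDER BY location, sale_month")) {
            while (rs.next()) {
                System.out.printf("%s %s %.2f%n",
                    rs.getString(1), rs.getString(2), rs.getDouble(3));
            }
        }
    }
}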

Environment: HTML, Java, JavaScript, Oracle 9i, SQL, MS Office, URDB

Walmart, Bentonville, AR Aug 2008 – May 2009

Developer

·  Developed solutions for diverse programming scenarios in C# 3.0, employing Object Oriented Programming (OOP) concepts such as encapsulation, inheritance, polymorphism, and abstraction.

·  Worked extensively on ASP.NET, developing forms and user controls.

·  Actively Participated in all phases of the Software Development Life Cycle (SDLC) from implementation to deployment.

·  Involved in creating and designing various web services.

·  Involved in analyzing and designing UI, middle-ware and the data layer of the application.

·  Involved in designing various Service contracts, Operation contracts and Data contracts for the application.

·  Designed and implemented WCF-based services using service-oriented architecture.

·  Created and maintained database objects (stored procedures, tables, views) and SQL joins in SQL Server 2008.

·  Worked with Oracle and SQL Server databases as the backend, connecting from the ASP.NET application.

·  Involved in designing various User Controls in MVC.

·  Involved in Writing the Business Logic for various modules.

·  Used AccuRev for source control and file management.

·  Converted various business requirements to technical specification documents.

·  Operated within an Agile environment: daily scrum meetings, pair programming, presentations, and reviews.

·  Involved in analyzing and identifying various pointers and touch points for performance improvement within the suite.

Environment: .NET Framework 2.0/3.0/3.5, Visual Studio, C#.NET, ASP.NET, ADO.NET, WCF, SQL Server 2008, JavaScript, Microsoft IIS, XML, HTML, Web Services, WSDL, UML, LINQ, JIRA, Confluence, Zephyr, PostgreSQL.

Education:

·  Pre-Doctorate in Computer Science from Madurai Kamaraj University, India, 2005

·  MCA from Madurai Kamaraj University, India, 2002

Certifications

·  Cloudera Certified Developer for Apache Hadoop 2015