Senior technologist with strong business acumen and deep technical experience in the Big Data space. Results-oriented, decisive leader who combines an entrepreneurial spirit with corporate-refined execution across Big Data strategy, consulting, implementations, CoE setup, architecture, pre-sales and revenue optimization. Adept at building strategic partnerships across technical and global business teams for Fortune 500 customers.
Areas of expertise include:
~ Hadoop Administration & Development
~ NoSQL and Real-Time Analytics
~ Big Data/Hadoop Solution Architecture, Design & Development
~ Social Media, Community & Viral Marketing
~ Business Development, Global Scale
~ Big Data strategy
Specialties: Business Development, Business Strategy Consulting, Planning & Operations in the Big Data space, Big Data Architecture, Big Data/Hadoop Implementations, Hadoop Administration, NoSQL/HBase/Cassandra, EDW, RFP/RFI/Defense Proposals, Big Data Strategy & Budgeting
Sr. Systems Architect @ I help organizations succeed with Hadoop. From September 2014 to Present (1 year 2 months) Greater St. Louis Area
Big Data Consultant @ EMPI, E&M, OpenEMPI, Hadoop, Healthcare, ACO. From February 2014 to August 2014 (7 months) Greater St. Louis Area
Big Data Consultant, CoE @ Worked with the Mahindra Satyam and Tech Mahindra pre-sales and delivery teams on Big Data services and strategy engagements with existing and new customers.
Prepared RFP responses, defense pitches and Big Data sales strategies; engaged with customers to provide solution design, consulting and recommendations.
Provided consulting to customers on identifying Big Data use cases, then guided them through implementation of those use cases.
Led Big Data strategic planning, technology roadmaps and talent acquisition, and mentored the team on cutting-edge technologies such as Hadoop, Hive, HBase, Cassandra, AWS, cloud computing, Tableau, Spark, R, social graph analysis, big data analytics and data visualization.
Led a 30+ member Big Data CoE team delivering Big Data solutions/PoCs for multiple customers and verticals.
Handled infrastructure setup, capacity planning and administration for the CoE and customers: 124-node Cloudera CDH4, 18-node CDH3u1, 38-node Hortonworks HDP 1.3, 6-node MapR M5, 26-node MapR M7 and 67-node Apache Hadoop clusters.
PoCs and implementations: ETL on Hadoop and freight analytics for a major Australian airline, social network analytics for a Telco major in APAC, BI/DW integration for a networking major in the US, predictive asset maintenance for an O&G customer, device capacity management for a US Telco major, social media analytics for a US-based M&E major, fraud analytics for a major BFSI customer, mainframe operations analytics for a US-based auto major, and location-based services for a US-based Telco major. From October 2010 to August 2014 (3 years 11 months) Greater St. Louis Area
Big Data Consultant @ 1. Worked with Lumeris technology and business groups on Hadoop migration strategy
2. Validated and assessed Lumeris' currently planned phases for Hadoop/Big Data migration
3. Recommended a suitable technology stack for Hadoop migration given Lumeris' current architecture
4. Provided consulting services on Lumeris' Hadoop migration strategy, roadmap and technology fitment
5. Validated and advised on Hadoop infrastructure and data center planning in light of projected data growth
6. Executed and advised on the optimal solution implementation. From November 2013 to December 2013 (2 months) Greater St. Louis Area
Cloudera Hadoop Admin/Consultant @ Set up a CDH4.4 Hadoop cluster with security on AWS.
Advised on the tech stack for the required use case. From August 2013 to September 2013 (2 months) Freelancer
Oracle DBA @ 1. Performed DBA activities such as access issues, security, space management, tuning, and alternative solutions like distributed databases, database objects, backup and recovery, and migration.
2. Developed DB objects such as materialized views, views and tables, and associated objects like indexes and triggers.
3. Wrote UNIX shell scripts to compress, back up and maintain remote databases.
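The kind of compress-and-backup script mentioned above might look like this minimal sketch. All paths, names and the placeholder dump step are illustrative assumptions, not taken from the profile (a real Oracle environment would use `exp`/`expdp` for the export):

```shell
#!/bin/sh
# Minimal sketch: dump, compress and store a database backup.
# Paths and the dump step are illustrative assumptions.

DB_NAME="orcl"
BACKUP_DIR="${TMPDIR:-/tmp}/${DB_NAME}_backups"
STAMP=$(date +%Y%m%d_%H%M%S)
DUMP_FILE="${BACKUP_DIR}/${DB_NAME}_${STAMP}.dmp"

mkdir -p "${BACKUP_DIR}"

# Placeholder for the actual export (exp/expdp in a real setup).
echo "-- placeholder dump --" > "${DUMP_FILE}"

# Compress the dump before archiving or remote transfer.
gzip -f "${DUMP_FILE}"

echo "backup written: ${DUMP_FILE}.gz"
```

In practice such a script would typically run from cron and add a retention step (deleting dumps older than N days) plus a transfer to a remote host.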
4. Performed routine activities such as export/import, creation of indexes and embedded hints to improve performance, ERD updates and deployments.
5. Contributed to the development of an ETL utility using UNIX shell scripts, PL/SQL and SQL*Loader.
6. Worked on Oracle CDC and contributed to a database replication and audit management utility built with dynamic triggers and procedures. From October 2010 to January 2011 (4 months) Pune Area, India
Software Engineer @ From July 2010 to October 2010 (4 months) Chandigarh Area, India
Summer Internship @ From January 2010 to June 2010 (6 months) Bangalore
Bachelor of Technology (B.Tech.), Computer Science @ Punjab Technical University, From 2006 to 2010
Pardeep Kumar is skilled in: Hadoop, Apache Storm, Spark, Solr, Apache Kafka, ETL, Hive, Big Data, HBase, PL/SQL, Apache Pig, Cassandra, Core Java, MapReduce, MySQL, Business Intelligence, Unix Shell Scripting, Solution Architecture, Linux, Data Warehousing, Sqoop, NoSQL, Java Enterprise Edition, Consulting, Amazon Web Services..., Shell Scripting, Data Modeling, Amazon EC2, Agile, Riak, MongoDB, Oracle DBA, Healthcare, Record Linkage, Teradata, Pre-sales, Agile Methodologies, Cloud Computing, Integration, Apache Hadoop
Websites:
http://www.hadooptutor.com