-- Managed and led a team to successfully migrate an on-premises data warehouse to the cloud
-- Architecting and leading all data warehouse and business intelligence initiatives in the cloud
-- Seasoned data warehouse professional with over 10 years of experience in architecture, design, modelling and the end-to-end development life cycle of DW projects
-- Extensive experience with big data, open-source and cloud solutions
-- Expert at managing, coordinating and mentoring large teams, both onsite and offshore
-- Well versed in building integration and analytical frameworks with clickstream data from Omniture and Heap, and cloud-based subscription systems like Zuora
-- Implemented near-real-time solutions by sourcing data from AMQ
-- Certified Hortonworks Hadoop developer with thorough knowledge of Sqoop, Pig, Hive and Azkaban
-- Expert in traditional ETL technologies such as Ab Initio and Informatica
-- Expert in the open-source ETL tool Talend
-- Expertise in UNIX Shell Scripting, Python, Perl, SQL, PL/SQL
-- Thorough knowledge of databases such as Oracle, DB2 and Teradata
-- Thorough knowledge of HP Vertica, Redshift and Snowflake Computing
-- Expertise in cloud computing and data management using Amazon S3
-- Expertise in handling JSON, BSON, CLOB, BLOB and XML data
-- Expertise in performance tuning, system analysis, Agile and Waterfall methodologies, and Erwin
-- Knowledge of data visualization using Tableau
Data Management @ -- Manage, architect and design solutions for:
Data warehouse and business intelligence in the cloud
Migration of all on-prem data solutions to the cloud
Complex data usage for quality mining and solutions
Real-time or near-real-time data needs
-- Architect, design and implement data solutions using open source and cloud technologies
-- Lead initiatives for infrastructure upgrades to the ETL platform
-- Evaluate new technology and solutions
-- Design solutions for online event data streams using a message queue architecture
-- Planning and effort estimation
-- Provide guidance, solutions and mentor other members of the team
-- Team management and co-ordination
Environment: Amazon S3, Snowflake Computing, Redshift, Talend Open Source ETL, Unix Shell Scripts, Python, Java, Tableau, AMQ
From May 2015 to Present (8 months)
Data Solutions Architect @ -- Architect and design solutions for:
Big data
Complex data usage for quality mining and solutions
Real-time or near-real-time data needs
-- Architect, design and implement big data solutions on Hadoop
-- Lead initiatives for infrastructure upgrades to the ETL platform
-- Evaluate new technology and solutions
-- Design solutions for online event data streams using a message queue architecture
-- Planning and effort estimation
-- Provide guidance, solutions and mentor other members of the team
-- Team management and co-ordination
Environment: Hadoop, Pig, Hive, Azkaban, Unix Shell Scripts, Python, Ab Initio, Informatica, Teradata, Oracle, SQL, PL/SQL, LINUX (GNU), Perl, Erwin
From February 2013 to May 2015 (2 years 4 months)
Sr Lead Engineer - Data Integration @ - Architect and design solutions
- Create data models using Erwin
- Planning and effort estimation
- Leading the team, coordinating and distributing work across team members.
- Gathering and understanding the business requirements.
- Creating the detailed technical design based on the requirements
- Developing processes based on the detailed design.
- Code development using Ab Initio, Informatica PowerCenter, PL/SQL, UNIX and Perl.
- Migrating existing Informatica mappings to Ab Initio graphs.
- Developing UNIX Shell scripts and Perl scripts used as wrappers for automating various ETL processes.
- Developing complex SQL, PL/SQL code.
- Code review.
- Tuning and optimizing performance.
- Taking active part in the production deployment of the code.
- Supporting production run and investigating customer queries and taking part in troubleshooting.
- Writing various utility UNIX shell scripts for monitoring file mount usage, data cleansing, housekeeping and data archiving.
- Writing SQL queries to fetch data from tables and generate reports for users.
- Responsible for implementing common functions and utilities to be used in the project.
Environment: Ab Initio (GDE 3.0.3.1, Co>Operating System 3.0.4.3), Ab Initio EME, Informatica PowerCenter 8.6.1, ORACLE 11g, SQL, PL/SQL, LINUX (GNU), UNIX Shell Scripting, Perl Scripting, TOAD, Perforce, JIRA, Erwin
From August 2011 to February 2013 (1 year 7 months)
Application Owner (Consultant) @ -- Owning and supporting the application.
-- Gathering and understanding the business requirements.
-- Creating the detailed technical design based on the requirements
-- Developing processes based on the detailed design.
-- Code development using Ab Initio, Informatica PowerCenter and Informatica B2B Data Transformation.
-- Migrating existing Ab Initio graphs to Informatica Mappings.
-- Developing UNIX Shell scripts used as wrappers for automating various ETL processes.
-- Developing complex SQL, PL/SQL code.
-- Reviewing the code and suggesting appropriate changes for the same.
-- Tuning the code to optimise performance.
-- Unit testing the code in the DEV environment.
-- Taking active part in the production deployment of the code.
-- Supporting production run and investigating customer queries and taking part in troubleshooting.
-- Writing various utility UNIX shell scripts for monitoring file mount usage, data cleansing, housekeeping and data archiving.
-- Writing SQL queries to fetch data from tables and generate reports for users.
-- Responsible for implementing common functions and utilities to be used in the project.
Environment: Ab Initio (GDE 1.15.7.1, Co>Operating System 2.15.4.2), Ab Initio EME, Informatica PowerCenter 7.1.4, Informatica PowerCenter 8.6.1, Informatica B2B Data Transformation 8.6.2, ORACLE 10g, SQL, PL/SQL, UNIX (Sun Solaris), UNIX Shell Scripting, TOAD, Clear Case, Clear Quest, BMC Remedy
From July 2010 to August 2011 (1 year 2 months)
SME / ETL Lead (Consultant) @ I worked as a Subject Matter Expert for the VISA Global Member Billing System.
Responsibilities included:
- Subject Matter Expert for the Visa Global Member Billing System
- Analysing the requirements and use cases
- Building the technical design
- Building custom code and implementing best practices
- Designing complex code using the Ab Initio product suite (GDE, Conduct>It, BRE), UNIX Shell Scripts and SQL
- Technical review and performance tuning
- Guiding and leading the developers and production support
- Co-ordinating work and implementations
Environment: Ab Initio Product Suite (GDE, Conduct>It, BRE, EME), AIX UNIX, DB2, MicroStrategy, Java and J2EE, BMC Remedy, Clear Case, Clear Quest, MVS
From May 2008 to July 2010 (2 years 3 months)
Lead ETL Developer (Consultant) @ I didn't get much time to work here as I had to relocate to the West Coast due to family issues.
I was assigned the role of ETL lead, with the following responsibilities.
- Team Lead for the ETL group
- Analysing the requirements and use cases
- Building the Detailed Design
- Building custom code and implementing best practices
- Designing complex code using Ab Initio GDE, UNIX Shell Scripts and SQL
- Code review and performance tuning
- Co-ordinating work and implementations
Environment: Ab Initio GDE and EME, AIX UNIX, DB2, MicroStrategy, Java and J2EE, BMC Remedy, MVS
From January 2008 to April 2008 (4 months)
ETL Developer / Admin (Consultant) @ I had a dual role as both ETL Developer and ETL Administrator.
Responsibilities included:
- Analysing the requirements and use cases
- Building the detailed design
- Building custom code and implementing best practices
- Designing complex code using Ab Initio GDE, UNIX Shell Scripts and SQL
- Code review and performance tuning
- Managing, creating and maintaining projects in the EME.
- Promoting and deploying code to production.
- Building custom UNIX Shell Scripts for environment maintenance.
- Testing, presenting and rolling out new releases from Ab Initio.
- Upgrading the Co>op and EME
- Allocating and maintaining data storage for ETL projects.
- Checking the health of the EME and Co>Op.
- Addressing security issues and implementing best practices.
- Guiding and training junior team members.
Environment: Ab Initio GDE, Ab Initio EME, AIX UNIX, DB2, ORACLE, Teradata, BMC Remedy, MVS
From December 2005 to December 2007 (2 years 1 month)
Assistant Systems Engineer @ While working for the TCSL client Kaiser Permanente at the offshore delivery center, I also took on the additional responsibility of Ab Initio trainer for the Ab Initio Center of Excellence (COE) at the Tata Consultancy Services location in Kolkata, India.
The training consisted of 4 days of presentations and 2 days of hands-on exercises. As a trainer I had to do the following.
- A detailed presentation on the ETL tool Ab Initio as a product, covering its functionality and advantages over other ETL tools available in the market
- Performance tuning.
- Working of the Co>Op and the EME
- Clarification of all queries or doubts from the trainees
- Assigning the trainees a hands-on exercise and ensuring its completion
- Conducting an exit evaluation
From January 2006 to June 2006 (6 months)
ETL (Ab Initio) Developer (Consultant) @ I worked as an ETL (Ab Initio) Developer for the TCSL client British Telecom offshore. The responsibilities included the following.
- Understanding the High Level design (HLD)
- Creating the component design or Low Level Design (LLD) based on the requirements and HLD
- Developing the process based on HLD/LLD
- Creating the unit test case specification
- Reviewing the code
- Building code using Ab Initio GDE, UNIX Scripts, PL/SQL and SQL
- Taking part in the code implementations, turnover and bug fixes.
Environment: Ab Initio GDE, Ab Initio EME, HP UNIX, ORACLE
From November 2004 to December 2005 (1 year 2 months)
BE, Electronics Engineering @ Nagpur University From 2000 to 2004
High School, Science @ St Joseph's College, Kolkata, India From 1998 to 2000
All Subjects, Junior School @ St Joseph's College, Kolkata, India From 1987 to 1998
Kunal Ghosh is skilled in: ETL, Data Warehousing, Unix Shell Scripting, Informatica, Ab Initio, SDLC, Oracle, Teradata, SQL, Data Integration, PL/SQL, Unix, DB2, Data Modeling, Shell Scripting
Websites:
http://www.shutterfly.com/