Informatica Data Quality Specialist @ American Family Insurance
Sr. Application Developer @ EMC
Bachelor of Technology, Electrical and Electronics Engineering @
Over 6 years of experience in data management, data integration, data quality, data governance, data architecture, project management, and business intelligence implementations. 5 years of experience with many Informatica products, including PowerCenter, Data Quality, Data Explorer, Test Data Management, AddressDoctor, Data Validation, Metadata, Business Glossary, and Master Data Management. Has worked in many project leadership roles and possesses strong interpersonal skills.
Data Quality Specialist @
From July 2015 to Present (4 months), Madison, Wisconsin Area

Senior Developer @
Data Quality Definition and Processes
Works with data stewards across the enterprise to identify critical data elements and define data quality criteria including business rules, definitions, and tolerance levels.
Works with Data Governance to set standards for how data is used and consumed. Enacts and enforces uniform data entry standards.
Establishes the criteria for how often data quality is checked for accuracy.
Determines data quality priorities for the organization.
Establishes data quality dimensions, such as data completeness, conformance, consistency, validity, and timeliness.
Develops data improvement processes to maintain and/or improve the value of the data.
Centralizes data quality processes into one data quality program. Incorporates data cleansing, standardization, and matching processes handled by external vendors.
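The data quality dimensions above (completeness, conformance, validity, timeliness) can be illustrated with a minimal Python sketch. The record layout, field names, and rule thresholds here are hypothetical examples, not taken from any actual ruleset:

```python
from datetime import date, timedelta
import re

# Hypothetical customer records; fields and values are illustrative only.
records = [
    {"id": 1, "zip": "53703", "state": "WI", "updated": date.today()},
    {"id": 2, "zip": "",      "state": "WI", "updated": date.today() - timedelta(days=400)},
    {"id": 3, "zip": "ABCDE", "state": "wi", "updated": date.today()},
]

def completeness(rows, field):
    """Share of rows where the field is populated at all."""
    return sum(1 for r in rows if r[field]) / len(rows)

def conformance(rows, field, pattern):
    """Share of rows whose field matches the expected format."""
    return sum(1 for r in rows if re.fullmatch(pattern, r[field] or "")) / len(rows)

def timeliness(rows, field, max_age_days=365):
    """Share of rows updated within the tolerance window."""
    cutoff = date.today() - timedelta(days=max_age_days)
    return sum(1 for r in rows if r[field] >= cutoff) / len(rows)

scores = {
    "zip_completeness": completeness(records, "zip"),
    "zip_conformance": conformance(records, "zip", r"\d{5}"),
    "timeliness": timeliness(records, "updated"),
}
print(scores)
```

In practice these measurements were built with Informatica Data Quality profiles and scorecards rather than hand-written code; the sketch only shows what a dimension score computes.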
Data Quality Assessment
Completes data profiling activities; assessment of existing data for completeness and accuracy relative to the quality specifications for the data.
Partners with Information Services to examine the interactions of business applications, movement processes, and storage for impacts on data quality and to look for ways to prevent poor data from entering the systems.
Determines existing quality deficiencies and the practicality and cost of overcoming them.
Identifies specific data quality problems. Develops recommendations on how to handle data quality problems: exclude, accept, correct, or create a default value. Works with business and IT partners to implement recommendations using both manual and automated solutions.
Investigates and addresses the root cause of data problems.
Defines and implements the method to track and store data quality outputs.
Data Quality Monitoring and Program Metrics
Monitors ongoing measurement of data quality rules and metrics and tracks progress to ensure levels of quality are maintained.
From July 2014 to Present (1 year 4 months), Madison, Wisconsin Area

Application Developer @
• Performed business and data analysis based on the functional specifications provided by the end users and managers.
• Checked and verified the quality of data records in datasets with measures such as completeness, conformity, consistency, integrity, duplication, accuracy, and timeliness.
• Worked with Data Governance to set standards for how data is used and consumed; enacted and enforced uniform data entry standards.
• Worked extensively on profiling, data and domain discovery, scorecarding, dashboards, standardization, matching, consolidation, address validation, the Data Quality Assistant for exception records, Character Labeler, Rule-Based Analyzer, parsing components, Weight-Based Analyzer, and dictionaries (reference tables).
•Developed recommendations on how to handle data quality problems in IDD (Informatica Data Director) to exclude, accept, correct, or create a default value. Worked with business to implement recommendations using both manual and automated solutions.
• Met with data stewards to review data quality metrics. Set data quality improvement targets for the business and IT and provided recommendations to achieve goals.
• Implemented a customer creation process with real-time duplicate checking (identity matching), address validation, and a D&B services process.
• Determined data quality priorities for the organization by implementing data domain scores with dashboard tools.
• Implemented Business Glossary with linkage to actual metadata source objects and scorecards for data lineage.
• Worked with new components in 9.5.1 such as NLP (Natural Language Processor), the Classifier transformation, and Data Processing.
• Designed and led several PowerCenter (ETL) strategies at different customer sites to integrate data from multiple data systems including SAP R/3 ECC/BW, Oracle R12, SQL Server, DB2, VSAM flat files, XML files, and other proprietary systems.
From April 2012 to July 2014 (2 years 4 months), Southboro, MA

System Analyst @
• Understood the customer requirements and technical specification documents.
• Participated in the review of the detailed architecture and design for the project.
• Worked on the Informatica Analyst tool (IDQ) to build scorecards for data issues.
• Involved in designing the database structure for the data staging area and loading the BW data into the Fireplace database.
• Worked on data quality tasks: data cleansing, building robust data, removing unwanted data, and ensuring data correctness.
• Worked with data quality plans and components; used the data quality report viewer to produce reports and dashboards.
• Worked on ETL design, development, administration, source to target mappings, data warehouse transformations, on call support, troubleshooting, testing, and documentation.
• Used Informatica PowerCenter/PowerExchange to extract, transform, and load data from different operational data sources such as Oracle, SQL Server, XML, and flat files into the staging area (Oracle) and then into the Oracle data warehouse.
• Developed various mappings using transformations such as Source Qualifier, Joiner, Filter, Router, and Expression.
• Involved in the design of incremental load.
• Developed standard and reusable transformations and mapplets using transformations such as Expression, Aggregator, Joiner, Source Qualifier, Lookup, and Router.
• Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance. Involved in performance tuning of Informatica at all levels.
• Developed large workflows with worklets, event waits, assignments, conditional flows, and email and command tasks.
• Applied slowly changing dimensions (SCD) in various complex mappings to load data from source to target.
• Involved in Production Support by performing Normal Loads, Bulk Loads, Initial Loads, Incremental Loads, Daily loads and Monthly loads.
• Experienced in writing shell scripts for various ETL needs.
From February 2011 to April 2012 (1 year 3 months), Herndon, VA
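A minimal sketch of the Type 2 slowly changing dimension logic referenced in the role above. The table layout, key names, and change-detection rule are illustrative assumptions; the actual work was done in Informatica mappings with Lookup and Update Strategy transformations, not hand-written code:

```python
from datetime import date

# In-memory stand-in for a dimension table; real implementations compare
# and version rows against a database table.
dim = [
    {"cust_key": 1, "cust_id": "C100", "city": "Madison",
     "eff_date": date(2020, 1, 1), "end_date": None, "current": True},
]

def apply_scd2(dim, source_row, load_date):
    """Type 2 SCD: expire the current row on change, insert a new version."""
    current = next((r for r in dim
                    if r["cust_id"] == source_row["cust_id"] and r["current"]), None)
    if current and current["city"] == source_row["city"]:
        return dim  # no change: nothing to do
    if current:
        current["end_date"] = load_date   # expire the old version
        current["current"] = False
    dim.append({                          # insert the new version
        "cust_key": max((r["cust_key"] for r in dim), default=0) + 1,
        "cust_id": source_row["cust_id"],
        "city": source_row["city"],
        "eff_date": load_date,
        "end_date": None,
        "current": True,
    })
    return dim

# A changed city expires the 2020 row and adds a new current row.
apply_scd2(dim, {"cust_id": "C100", "city": "Milwaukee"}, date(2021, 6, 1))
```

History is preserved: the old row keeps its effective range, and queries for the current view filter on the `current` flag.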