ETL, DWH, BI Consultant
- 2 references
- €70/hour
- 81927 Munich
- Worldwide
- ru | en | de
- 12.12.2024
Brief introduction
Reference excerpts (2)
"It was a great pleasure working with D., very active candidate and highly recommend."
8/2021 – 10/2022
Project: Sport, Data Warehouse and KPI Reporting (full description under Project & work experience below)
Database development
"Zusammenarbeit lief reibungslos"
5/2020 – 10/2022
Project: Retail, Migration of Data Warehouse from Teradata to Snowflake (full description under Project & work experience below)
Database development
Qualifications
Project & work experience
5/2024 – ongoing
Project description
Banking, support for the operations department
Responsibilities:
• Monitoring of ETL load processes for the DWH in Ab Initio Control Center
• Alerting in the event of failures
• Analysis of the root causes of job aborts
• Restarting of jobs in Ab Initio Control Center
• Analysis of existing systems, processes and frameworks
• Development of hotfixes
• Automation of operational alerting tasks in Microsoft Teams for the following cases: job aborts, delayed job starts, missing source data deliveries and sharing of the load status
• Development, testing, deployment and documentation
Technologies: Ab Initio, Oracle DB, Jira, Confluence
ETL developer
1/2023 – 5/2024
Project description
Retail, data warehouse rebuild, migration from DB2 to Snowflake and from Informatica PowerCenter to Matillion
Responsibilities:
• Collection and analysis of customer requirements
• Development of ETL processes with Informatica PowerCenter, Matillion, Snowflake, DB2, Microsoft SQL Server
• Analysis of existing systems, products, processes and frameworks
• Analysis of the PowerCenter ETL processes and migration of the ETL logic to Matillion
• Comparison of data between DB2/Microsoft SQL Server and Snowflake
• Development, testing and deployment of patches and hotfixes
• Monitoring of ETL processes
• PoC for Informatica Cloud
• Development of ETL processes with Informatica Cloud
• PoC for DBT
• Development of ETL processes with DBT
• Documentation
Technologies: Informatica PowerCenter, Informatica Cloud, DBT, Snowflake, Matillion, Python, DB2, Microsoft SQL Server, SAP, Denodo, Confluence, Jira, GitLab CI/CD
Data Warehousing, ETL, IBM DB2, Informatica, Microsoft SQL Server (MS SQL)
11/2022 – 8/2023
Project description
Banking, data warehouse rebuild, migration from SAS DIS to Ab Initio
Responsibilities:
• Collection and analysis of customer requirements
• Development of ETL processes with Ab Initio
• Analysis of existing systems, products, processes and frameworks
• Development, testing and deployment of ETL processes
• Documentation
Technologies: Ab Initio, SAS DIS, DB2, Confluence, Jira
IBM DB2, SAS (Software), Data Warehousing, ETL
8/2021 – 10/2022
Project description
Sport, Data Warehouse and KPI Reporting
Responsibilities:
• Migration of data extraction from Data Virtuality to Adverity
• Collection and analysis of customer requirements
• Analysis of existing systems, products, processes and frameworks
• Development of ETL processes using Data Virtuality, Adverity, Amazon Redshift
• Analysis of performance issues and optimization of long-running queries
• Configuration of Instagram connectors in Adverity
• Loading of data from Instagram and Facebook
• Scheduling and monitoring of loading processes
• Analysis and comparison of data between source systems and existing processes
• Conversion of old ETL logic to new ETL logic (procedures, SQL, views, jobs)
Tools: Data Virtuality, Adverity, Amazon Redshift
Database development
5/2020 – 10/2022
Project description
Retail, Migration of Data Warehouse from Teradata to Snowflake
Responsibilities:
• Collection and analysis of customer requirements
• Development of ETL processes for data replication from Teradata to Snowflake using Ab Initio, Microsoft Azure and Teradata
• Analysis of existing systems, products, processes and frameworks
• Development, testing and deployment of Ab Initio, Snowflake and Teradata patches and hotfixes
• Performance tuning
• Process monitoring and support
• Analysis, comparison and fixing of data differences between Teradata and Snowflake
• Generation of .avsc files and metadata for ETL processes
• Comparison of table DDL and .avsc files
• Documentation
Tools: Snowflake, Ab Initio, Microsoft Azure, Teradata
Database development
12/2017 – 4/2020
Project description
• Developing Ab Initio ETL processes, patches and hot fixes
• Conversion of ETL processes from Teradata to Ab Initio
• Applying Data Quality rules
• Developing processes for deployment of objects from Control Center
• Monitoring of loading processes
• Setting up data lineage
• Teradata performance tuning
• Optimizing Teradata views for business intelligence layer
• Optimizing Teradata queries for Microstrategy reports
• Code review
• Collecting and analyzing customer's requirements
Tools: Teradata 15.10 DB, Ab Initio 3.3.4.2, 3.5.2.2
ETL
6/2017 – 10/2017
Project description
• Mentored team
• Developed procedures and views in Data Virtuality for loading data from sources
• Developed models, looks and dashboards in Looker
• Analyzed source data
• Analyzed business requirements
• Collected and analyzed customer's requirements
Tools: Data Virtuality 1.9.7, Looker 4.0
Business Intelligence and Reporting Tools (BIRT)
6/2016 – 6/2017
Project description
• Explored Teradata Global Control Framework (GCFR), an automated framework for managing ELT processes
• Collected and analyzed customer's requirements
• Analyzed DWH model
• Analyzed source data
• Created Source to Target mappings (S2T)
• Developed ETL processes using Teradata GCFR
• Tested ETL processes
• Developed procedures for GCFR metadata generation
Tools: Teradata 15.10 DB, Teradata Global Control Framework (GCFR)
ETL
2/2015 – 6/2016
Project description
• Analyzed Enterprise Data Warehouse
• Analyzed and developed ETL processes
• Analyzed and fixed ETL incidents from customer
• Searched for ETL error causes and removed them
• Developed patches for ETL processes
• Developed data fixes
• Simulated data errors and tested patches
• Reviewed ETL code
• Applied Multi-Value Compression (MVC), freeing more than 14 TB of space in the Teradata database
• Resolved deadlock issues
Tools: Teradata 14 DB, Informatica PowerCenter 9.1.0
ETL
7/2014 – 2/2015
Project description
• Supervised a team
• Analyzed customer requirements
• Designed ETL processes
• Developed mapping templates using PowerCenter Mapping Architect For Visio
• Generated multiple IPC mappings and IPC workflows
• Analyzed sources
• Mentored ETL developers
Tools: Oracle 11 DB, Informatica PowerCenter 9.5.1, PowerCenter Mapping Architect For Visio, Informatica Data Replication
ETL
3/2014 – 7/2014
Project description
• Led a team of 8 people
• Analyzed Master Data Management system and ETL processes
• Used Informatica MDM: Informatica MDM Data Director, Informatica MDM Hub Console
• Developed IDD components: Subject Area Group, Subject Area, Subject Area Child
• Used MDM Hub Console: Metadata Manager, Schema, Schema Viewer, Queries, Packages, Cleanse Functions, Mappings, Hierarchies, Security Access Manager, Data Steward, Batch Group, Batch Viewer
Tools: Oracle 11 DB, Informatica PowerCenter 9.5.1, Informatica MDM
ETL
8/2013 – 3/2014
Project description
• Analyzed S2T (Source To Target)
• Analyzed sources
• Developed ETL processes
• Analyzed Neoflex Summary Report system
• Completed Neoflex Summary Report system conversion
• Developed Summary Report ETL processes
• Tuned IPC mappings and workflows performance
• Analyzed Data Warehouse
• Analyzed ETL processes
• Extracted IPC objects metadata using SQL scripts for IPC repository DB
• Wrote the document "Data Warehouse technical description"
• Wrote the document "Administrator guide"
• Used Informatica PowerCenter Versioning
Sources: Openway Way4 Cards, CFT-Bank, Summary Report Forms (Excel files)
Tools: Informatica PowerCenter 9.5.1, Oracle 11 DB
ETL
7/2013 – 3/2014
Project description
• Led a team of 2 people
• Gathered ETL design requirements
• Designed ETL processes
• Developed ETL processes
• Created 2500+ IPC mappings, 1300+ IPC workflows with a team of 3 ETL consultants in 3 months (8 sources, 434 tables)
• Analyzed PowerCenter Mapping Architect For Visio
• Designed mapping templates using PowerCenter Mapping Architect For Visio and generated multiple IPC mappings
• Generated multiple IPC workflows using XML and VBS scripts
• Tuned IPC mappings and workflows performance
• Generated multiple file parameters using VBS scripts
• Imported multiple IPC objects using pmrep
• Created dynamic SQL and shared IPC mapping for referential integrity checking
• Extracted database objects metadata using SQL scripts
• Created document for generating IPC mappings using PowerCenter Mapping Architect For Visio
• Mentored bank ETL developers
• Used Informatica PowerCenter Versioning
Tools: Informatica PowerCenter 9.5.1, PowerCenter Mapping Architect For Visio, Oracle 11 DB
ETL
10/2008 – 7/2013
Project description
• Led a team of 5 people
• Analyzed sources
• Created S2T (Source To Target) for source systems
• Developed ETL processes
• Created Loan Portfolio Data Mart
• Maintained new and old ETL processes
• Tuned IPC mappings and workflows performance
Sources: Diasoft 5NT, FIS Profile, Openway Way4 Cards, FICO Capstone Decision Accelerator
Tools: Informatica PowerCenter 8.1.1, Oracle 10 DB
ETL
Certificates
Education
Moscow
About me
I am an active, cheerful, self-motivated and positive-thinking person. I have good communication skills and the ability to learn new technologies and systems quickly. I enjoy unusual and interesting challenges and finding different ways to solve them, and I can work equally well and hard in a team or on my own.
Further skills
Ab Initio 3.3.1, 3.3.4.2
Ab Initio Metadata Hub 3.5.2 (MHUB)
Snowflake
Informatica PowerCenter 8.1.1, 8.6.1, 9.5.1, 9.6.1, 10
Informatica Developer 10
Informatica PowerCenter Mapping Architect for Visio
Informatica MDM 9.5.1
Informatica Big Data Management
Oracle 10g, 11g
Oracle BI EE 10, 11
Teradata 14, 15
Teradata Global Control Framework (GCFR)
Teradata Mapping Manager
IBM Data Stage 8.7, 9.1
IBM Cognos TM1 10.2.2
Apache Hadoop, Cloudera CDH 5.8
MicroStrategy 10
Looker 4.0
Data Virtuality 1.9.7
Ataccama Reference Data Manager 11
Analytix DS Mapping Manager
Personal details
- Russian (native speaker)
- English (fluent)
- German (good)
- European Union