About the role:
You will be responsible for the delivery of Data Products within the organization. The underlying business use cases cover different areas such as Reporting & Dashboards, Data Science & Analytics, and Data Governance. The team is responsible for the design, development, and operation of the products in close collaboration with business stakeholders, regional teams, and external partners. The mission is to provide valuable data assets in a self-service mode and to facilitate data-driven business decisions.
Location and Work Setup:
The position is based in the Bucharest Office (Plaza), near Lujerului metro station, with a flexible hybrid way of working.
Responsibilities:
- Understand the business needs & rules for a specific set of use cases or data products, and translate those into comprehensive data definitions, data models, and data quality requirements
- Understand the business processes and underlying data in order to translate raw data into actionable information and insights
- Analyze data sources and define the most suitable data structure depending on the nature of the use case (reporting, analytics, operational)
- Define the relationships between different data entities and create data models that reflect the organization’s data architecture
- Collaborate with data scientists, BI developers and analysts to ensure data integrity and accessibility
- Work with stakeholders to translate business requirements into data structures and ensure that data models support business processes and analytics needs
- Design, implement, and maintain scalable data pipelines and workflows using Databricks and the Azure data platform
- Ensure data quality, integrity, and compliance with security standards
- Document processes and workflows to support team knowledge sharing and future development
What we are looking for:
- A minimum of 4-5 years of prior experience in a related role
- Understanding of the data lifecycle value chain and its underlying components (ETL/ELT, DWH/Data Lake)
- Familiarity with the Azure data ecosystem (ingestion, storage, analytics & data visualization): Azure ingestion framework, Databricks, Azure Synapse, Delta Lake
- Data modelling skills (from logical to physical models; different modelling techniques: star schema, snowflake schema, etc.)
- Knowledge of functional and technical architecture principles
- Experience with Python, SQL, DAX and Power BI
- Experience with test-driven development