
Data Engineer (Senior)

B2B: 21 000 - 27 000 PLN net
Location: Kielce, Kraków, Wrocław + remote (Poland)
Apply now

We are #VLteam – tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!

About the role

  • Python: Advanced
  • Data modeling: Advanced
  • SQL: Advanced
  • Azure: Advanced
  • Apache Airflow: Advanced
  • Trino/Starburst with Iceberg: Regular
  • Power BI or other BI tool: Regular
  • Power Automate: Regular
  • Power Apps: Regular
  • dbt: Regular
  • Terraform: Basic
  • English: Advanced
Project
Insurance Broker Company
Project Scope
We are building a modern Data and Integration Platform for a fast-scaling Insurance client. Our work consolidates fragmented legacy systems, organizes data from 200+ sources, and creates a standardized, future-proof cloud-native environment. We aim to unlock the full value of the company's data, enable more informed and faster decision-making, and provide the backbone for business growth, integration, and AI readiness. This includes setting up a transparent, role-based, governed data environment and engineering a robust, scalable integration hub to connect internal systems and third-party services.
Tech Stack
SQL, Data modelling, Data Quality, Python, Azure, Apache Iceberg, Trino, Airflow, dbt, DevOps, IaC
Challenges
We focus on delivering a trusted, high-quality, and well-governed data platform to replace a highly fragmented and immature technology landscape. The key challenges include consolidating over 200 legacy systems into a streamlined, standardized technology stack, and designing and implementing a modern cloud-native data platform leveraging Azure, Starburst with Iceberg, Airflow, and the Power Platform suite. We are also building an integration layer and API hub to support third-party data ingestion, such as sanctions checks, foreign exchange, and entity validation. Another primary task is phasing out outdated tooling like KNIME and replacing it with maintainable, scalable workflows. Embedding strong DevOps practices, including Infrastructure as Code (IaC), automated testing, and CI/CD pipelines, is critical to platform delivery. Additionally, we aim to enable tactical business outcomes, such as early data marts and reporting capabilities, while building towards a complete platform. Enhancing the developer experience, ensuring operational excellence, and embedding strong data governance with role-based access control are fundamental. All initiatives are entirely cloud-native, designed with automation, traceability, scalability, and business agility at their core.
Team
We aim to build a small, agile, cross-functional team capable of delivering the complete data and integration project, from initial architecture to production operations. The team will be flexible and multidisciplinary to foster strong ownership, collaboration, and rapid delivery. It will work closely with the client's CTO and business stakeholders to ensure technical excellence, effective knowledge transfer, and alignment with strategic goals.
Project
Data Engineering Project by Virtuslab
Project Scope
We are building a modern Data Platform for a fast-growing and ambitious client. Our work focuses on two key areas: bringing in and organizing a wide range of datasets from different sources, and creating easy-to-use Data Marts to support analysts, business users, and Data Science teams. Our goal is to help the client unlock the full value of their data, support better decision-making based on insights, and build a strong foundation for AI initiatives. We are also helping to set up a transparent and agile data organization that drives innovation, flexibility, and growth.
Tech Stack
SQL, Data modelling, Data Quality, Python/Java/Scala, Azure/GCP/AWS, Lakehouse Architecture, Airflow or other orchestration tool, dbt, DevOps, IaC
Challenges
In the projects we are involved in, we focus on delivering high-quality, reliable, and trusted datasets across modern data platforms to support business operations, analytics, and emerging use cases. We consistently implement robust, automated monitoring solutions to ensure continuous data quality, traceability, and operational excellence. Our work includes integrating teams into the platform SDLC, streamlining data delivery processes, and enhancing the developer experience. Delivering comprehensive data insights across the entire data portfolio, from acquisition pipelines to processing workflows and consumption patterns, is critical to enabling data-driven decision-making. We are also committed to building modern, cloud-native data platforms that enable scalable AI and Machine Learning solutions through clean, accessible, and well-structured data assets. Across all initiatives, we embed strong DevOps and Infrastructure-as-Code (IaC) practices to ensure consistency, automation, and agility in development and operations. All solutions are designed and operated entirely in the cloud, leveraging the scalability, resilience, and innovation of modern cloud services.
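To give a flavour of the automated data-quality checks this kind of work involves, here is a minimal, hypothetical sketch in plain Python. The column names (`policy_id`, `premium`) and helper functions are illustrative only; on a real project these checks would typically live in dbt tests or a dedicated data-quality framework rather than hand-rolled code.

```python
from dataclasses import dataclass

# Hypothetical, minimal data-quality gate: each check scans a batch of
# rows (dicts) and reports how many rows violate the rule.

@dataclass
class CheckResult:
    name: str
    passed: bool
    failed_rows: int

def check_not_null(rows, column):
    """Fail if any row has a NULL (None) value in the given column."""
    bad = sum(1 for r in rows if r.get(column) is None)
    return CheckResult(f"not_null:{column}", bad == 0, bad)

def check_unique(rows, column):
    """Fail if the column contains duplicate non-null values."""
    values = [r[column] for r in rows if r.get(column) is not None]
    bad = len(values) - len(set(values))
    return CheckResult(f"unique:{column}", bad == 0, bad)

def run_checks(rows, checks):
    """Run all checks; a pipeline step would halt or quarantine on failure."""
    results = [check(rows) for check in checks]
    return results, all(r.passed for r in results)

# Illustrative batch with one NULL premium and one duplicate policy_id.
rows = [
    {"policy_id": 1, "premium": 120.0},
    {"policy_id": 2, "premium": None},
    {"policy_id": 2, "premium": 310.0},
]
results, ok = run_checks(rows, [
    lambda rs: check_not_null(rs, "premium"),
    lambda rs: check_unique(rs, "policy_id"),
])
# ok is False: both checks report one failing row each.
```

In an orchestrated pipeline (e.g. an Airflow task), a failing gate like this would stop downstream loads and surface the offending row counts in monitoring.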
Team
We build cross-functional, well-rounded data engineering teams that can deliver complete data engineering projects, from initial design and development to production deployment and ongoing maintenance. We keep our teams small and focused, typically composed of up to 10 people, including engineers, product managers, and analysts, to ensure tight collaboration and a strong focus on delivering real business value. Roles within the teams are cross-functional, allowing us to maintain agility, ownership, and end-to-end responsibility across the project lifecycle.

What we expect in general

  • Strong SQL skills
  • Strong engineering skills
  • Experience with modern data pipelines powered by robust orchestration tools
  • Strong focus on delivering high data quality
  • Polyglot engineer with experience in traditional Data Engineering and knowledge of current trends like the Modern Data Stack
  • Demonstrated ability in the design, build, and implementation of software solutions with an unwavering focus on quality
  • Ability to work in an agile environment, partnering with team members and peers to find solutions to challenging problems with transparency
  • Experience working with CI/CD and a DevSecOps approach
  • Strong modelling skills

Don’t worry if you don’t meet all the requirements. What matters most is your passion and willingness to develop. Moreover, B2B does not have to be the only form of cooperation. Apply and find out!

A few perks of being with us

Building tech community
Flexible hybrid work model
Home office reimbursement
Language lessons
MyBenefit points
Private healthcare
Training Package
Virtusity / in-house training
And a lot more!

Apply now

Data Engineer (Senior)

"*" indicates required fields

Accepted file types: PDF. Max. file size: 5 MB.
Current recruitment process: For the purpose of recruitment, I hereby give consent as per art. 6.1.a of the GDPR to processing of my personal data (other than that listed in art. 22 [1] § 1 Labour Code) by Virtus Lab Sp. z o. o. (as Co-Controller; for a full list of joint controllers, see Privacy Policy) with its headquarters at 23 Zofii Nałkowskiej Street, Rzeszów, 35-211. At the same time I accept the Privacy Policy of the Data Controller. I acknowledge that my personal data will be kept for the duration of the recruitment process and, as regards any potential claims, for the period of 36 months maximum, and that I have the right to access this data or have it rectified or deleted on demand. This consent can be withdrawn at any time, but this withdrawal does not make the previous processing illegal.* (Required)

Future recruitment processes: I hereby give consent as per art. 6.1.a of the GDPR to the processing of my personal data by Virtus Lab Sp. z o. o. (as Co-Controller; for a full list of joint controllers, see Privacy Policy) with its headquarters at 23 Zofii Nałkowskiej Street, Rzeszów, 35-211, in order to use this data in future recruitment processes. I hereby agree to possible storage of my personal data for this purpose in Virtus Lab’s database for a period of 36 months maximum. At the same time I accept the Privacy Policy of the Data Controller. I acknowledge that I have the right to access this data or have it rectified or deleted on demand. This consent can be withdrawn at any point, but this does not make the previous processing illegal.*

Coordinated by
Karolina Buraś
Senior IT Talent Acquisition Specialist
Not sure if this role is right for you?
That doesn't mean you're not a match. Tell us about yourself and let's work on it together.
Contact us