
Data Engineer (Senior)
B2B: 21 000 - 27 000 PLN NET
Location: Poland (Kielce, Kraków, Wrocław) + Remote
Apply now

We are #VLteam – tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!

About the role

You will participate in defining the requirements and architecture for the new platform, implement the solution, and remain involved in its operations and maintenance post-launch. Your work will introduce data governance and management, laying the foundation for accurate and comprehensive reporting that was previously impossible.
You will adhere to and actively promote engineering best practices, data governance standards, and the use of open standards. You will build data ingestion and processing pipelines, and collaborate with stakeholders to define requirements, develop data pipelines, and establish data quality metrics.

  • Python – Advanced
  • Databricks / Snowflake – Advanced
  • Data Engineering – Advanced
  • SQL – Regular
  • AWS / Azure / GCP – Regular
  • Apache Airflow or another orchestration tool – Regular
  • Data modeling – Nice to have
  • LLM productivity tools (e.g. Cursor, Claude Code) – Nice to have
  • dbt – Nice to have
Project: Compass BI
Project scope 

Our client is a company that specialises in insurance, reinsurance, and asset management. It focuses on delivering scalable capital solutions that cater to various areas, including property, casualty, and speciality insurance lines. 

Currently in its startup phase, the company is actively strategising to enhance its operations and expand its service capabilities. A key part of this plan is the adoption of modern technologies, which will allow it to streamline processes and increase the overall efficiency of its offerings through improved data management. By leveraging modern technology, our client aims to position itself as a competitive player in the insurance industry while addressing the evolving needs of its clients.

As part of this transformation, VirtusLab (VL) will accelerate progress on the client's roadmap: building a modern data platform with scalable compute capabilities, enhancing reporting and workflow automation, and embedding cloud-native engineering practices.

Tech stack

Azure, Databricks, SQL, Python, Power BI

Challenges

The project involves building a comprehensive reporting and analytics platform from the ground up. Key challenges include integrating data from multiple complex sources, ensuring high data quality and consistency, and designing scalable data models that support both operational and analytical reporting. It also requires close collaboration with business stakeholders to understand reporting needs and translate them into effective data solutions.

Team

The team is small but highly motivated, taking on a broad scope of responsibilities as the platform is built and expanded.

Project: Data Foundation & AI Enablement
Project scope

We are architecting a modern Data Platform for a fast-scaling client in the Insurance sector. Our work consolidates fragmented legacy systems, organises data from a vast number of sources, and establishes a standardised, governed, and future-proof data foundation.

We aim to unlock the full value of the company’s data, enabling faster, informed decision-making and providing the backbone for business growth and AI readiness.

Tech stack

SQL, Python, Snowflake, dbt, Data modelling, Data quality, Power BI, Azure, Terraform

Challenges

The primary objective is to deliver a robust data foundation and enable AI capabilities for a client that has grown organically. The work focuses on several key areas:

  1. Establishing a production-ready, fully operational Snowflake environment and driving operational excellence.
  2. Translating complex business logic into accurate data models to ensure the platform truly reflects business reality.
  3. Integrating diverse data sources to build reliable data products and comprehensive data dictionaries.
  4. Managing the full Data Engineering and Data Science lifecycle to support production ML and AI experimentation.
  5. Taking ownership from concept to deployment.
  6. Cultivating an engineering mindset by promoting automation, CI/CD, and rigorous standards.

Team

We are building a small (4-6 people), agile, cross-functional team capable of delivering the complete data platform, from initial architecture to production operations.

Roles involved: DevOps, Data Engineer, Snowflake Specialist, MLOps/AI Engineer, Business Analyst (BA). The team will collaborate closely with business stakeholders to ensure effective knowledge transfer and strict alignment with strategic goals.

What we expect in general

  • Hands-on experience with Python
  • Proven experience with data warehouse solutions (e.g., BigQuery, Redshift, Snowflake)
  • Experience with Databricks or data lakehouse platforms
  • Strong background in data modelling, data catalogue concepts, data formats, and data pipelines/ETL design, implementation and maintenance
  • Ability to thrive in an Agile environment, collaborating with team members to solve complex problems with transparency
  • Experience with AWS/GCP/Azure cloud services, including: GCS/S3/ABS, EMR/Dataproc, MWAA/Composer or Microsoft Fabric, ADF/AWS Glue
  • Experience in ecosystems requiring improvements and the drive to implement best practices as a long-term process
  • Experience with Infrastructure as Code practices, particularly Terraform, is an advantage
  • Proactive approach
  • Familiarity with Spark is a plus
  • Familiarity with Streaming tools is a plus

Don’t worry if you don’t meet all the requirements. What matters most is your passion and willingness to develop. Apply and find out!

A few perks of being with us

  • Building tech community
  • Flexible hybrid work model
  • Home office reimbursement
  • Language lessons
  • MyBenefit points
  • Private healthcare
  • Training Package
  • Virtusity / in-house training
And a lot more!

Apply now

Data Engineer (Senior)

"*" indicates required fields

Accepted file types: PDF. Max. file size: 5 MB.
Please submit a CV no longer than two pages.
Current recruitment process: For the purpose of recruitment, I hereby give consent as per art. 6.1.a of the GDPR to processing of my personal data (other than that listed in art. 22 [1] § 1 Labour Code) by Virtus Lab Sp. z o. o. (as Co-Controller for a full list of joint controllers, see Privacy Policy) with its headquarters at 23 Zofii Nałkowskiej Street, Rzeszów, 35-211. At the same time I accept the Privacy Policy of the Data Controller. I acknowledge that my personal data will be kept for the duration of the recruitment process and as regards any potential claims, for the period of 36 months maximum, and that I have the right to access this data or have it rectified or deleted on demand. This consent can be withdrawn at any time, but this withdrawal does not make the previous processing illegal*.

(Required)
Future recruitment processes: I hereby give consent as per art. 6.1.a of the GDPR to the processing of my personal data by Virtus Lab Sp. z o. o. (as Co-Controller for a full list of joint controllers, see Privacy Policy) with its headquarters at 23 Zofii Nałkowskiej Street, Rzeszów, 35-211, in order to use this data in future recruitment processes. I hereby agree to possible storage of my personal data for this purpose in Virtus Lab’s database for a period of 36 months maximum. At the same time I accept the Privacy Policy of the Data Controller. I acknowledge that I have the right to access this data or have it rectified or deleted on demand. This consent can be withdrawn at any point, but this does not make the previous processing illegal*.

Coordinated by
Karolina Buraś
Senior IT Talent Acquisition Specialist
Not sure if this role is right for you?
That doesn't mean you're not a match. Tell us about yourself and let's work on it together.
Contact us