
Data Engineer/Consultant (Senior/Staff)

B2B: 21 000 - 31 080 PLN NET
Location: Remote + Poland (Kielce, Kraków, Wrocław)
Apply now

We are #VLteam – tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!

About the role

The majority of these roles will be at the forefront of client collaboration, building VL's position in the industry and spearheading projects.
You will work closely and directly with specialists on the client side, collaborating with stakeholders to define requirements, develop data pipelines, and establish data quality metrics.

You will participate in defining the requirements and architecture for the new platform, implement the solution, and remain involved in its operations and maintenance post-launch.
You will also introduce data governance and management practices, laying the foundation for accurate and comprehensive reporting that was previously impossible.
You will build data ingestion & processing pipelines, all of the above with a strong focus on the customer's needs.

Flexibility in action and the ability to overcome obstacles are highly valued in this role.

  • Python: Advanced
  • Databricks/Snowflake: Advanced
  • Data Engineering: Advanced
  • SQL: Regular
  • AWS/Azure/GCP: Regular
  • Apache Airflow or other orchestration tool: Regular
  • Data modeling: Nice to have
  • LLM productivity tools like Cursor/Claude Code: Nice to have
  • dbt: Nice to have
Project: Data Foundation & AI Enablement
Project Scope

We are architecting a modern Data Platform for a fast-scaling client in the Insurance sector. Our work consolidates fragmented legacy systems, organises data from a vast number of sources, and establishes a standardised, governed, and future-proof data foundation.

We aim to unlock the full value of the company’s data, enabling faster, informed decision-making and providing the backbone for business growth and AI readiness.

Tech stack

SQL, Python, Snowflake, dbt, Data modelling, Data quality, Power BI, Azure, Terraform

Challenges

The primary objective is to deliver a robust data foundation and enable AI capabilities for a client that has grown organically. The work focuses on several key areas:

  1. Establishing a production-ready, fully operational Snowflake environment and driving operational excellence.
  2. Translating complex business logic into accurate data models to ensure the platform truly reflects business reality.
  3. Integrating diverse data sources to build reliable data products and comprehensive data dictionaries.
  4. Managing the full Data Engineering and Data Science lifecycle to support production ML and AI experimentation.
  5. Taking ownership from concept to deployment.
  6. Cultivating an engineering mindset by promoting automation, CI/CD, and rigorous standards.
Team

We are building a small (4-6 people), agile, cross-functional team capable of delivering the complete data platform, from initial architecture to production operations.

Roles involved: DevOps, Data Engineer, Snowflake Specialist, MLOps/AI Engineer, Business Analyst (BA). The team will collaborate closely with business stakeholders to ensure effective knowledge transfer and strict alignment with strategic goals.

Project: JetBrains

Project Scope

The client is introducing Atlan as a new internal Data Catalogue solution and uses Glean as a company-wide unified search platform for thousands of employees.

To ensure a smooth transition from our existing Knowledge Base and OpenMetadata setup, we need to index Atlan assets into Glean so that metadata for databases, tables, metrics, and reports is easily discoverable through search.

Tech stack

Python, System & Data Integration, Kubernetes, System design, Infrastructure mindset

Skills

We’re looking for a Data Platform Engineer with experience in data platforms and system design at scale. We expect a track record in designing integration architectures for external systems and streamlining data migration/ingestion.

As a Data Platform Engineer, you will design and implement a solution that periodically indexes Atlan metadata assets into Glean, runs on a configurable schedule (hourly/daily), and is production-ready, observable, and maintainable by our DevOps team after handover.

You will also ensure compliance and data governance at the appropriate level, in line with the company's standards.
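The scheduling and batching shape of such an indexing job can be sketched in a few lines. This is a minimal illustration only: `fetch_atlan_assets` and `push_to_glean` are hypothetical stand-ins for real Atlan and Glean API clients (neither vendor's actual API is shown here), and the `Asset` record is a simplified assumption.

```python
from dataclasses import dataclass

# Hypothetical, simplified asset record; real Atlan metadata has many more fields.
@dataclass
class Asset:
    kind: str         # e.g. "database", "table", "metric", "report"
    name: str
    description: str

def fetch_atlan_assets() -> list[Asset]:
    """Stand-in for an Atlan client call; returns the assets to index."""
    return [
        Asset("table", "sales.orders", "All customer orders"),
        Asset("metric", "monthly_revenue", "Revenue aggregated by month"),
    ]

def push_to_glean(batch: list[Asset]) -> int:
    """Stand-in for a Glean indexing call; returns the number of docs indexed."""
    # A real implementation would POST documents to Glean's indexing endpoint.
    return len(batch)

def run_sync_cycle(batch_size: int = 100) -> int:
    """One scheduled run: fetch all assets, push them to Glean in batches."""
    assets = fetch_atlan_assets()
    indexed = 0
    for start in range(0, len(assets), batch_size):
        indexed += push_to_glean(assets[start:start + batch_size])
    return indexed
```

In production, the cycle would be triggered by an orchestrator (for example a Kubernetes CronJob, matching the project's tech stack) rather than an in-process loop, with logging and metrics added for the observability the role calls for.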

What we expect in general

  • A proactive approach and flexibility in action are a must
  • Very good command of English (written and spoken)
  • Hands-on experience with Python
  • Proven experience with data warehouse solutions (e.g., BigQuery, Redshift, Snowflake)
  • Experience with Databricks or data lakehouse platforms
  • Strong background in data modelling, data catalogue concepts, data formats, and data pipelines/ETL design, implementation and maintenance
  • Ability to thrive in an Agile environment, collaborating with team members to solve complex problems with transparency
  • Experience with AWS/GCP/Azure cloud services, including: GCS/S3/ABS, EMR/Dataproc, MWAA/Composer or Microsoft Fabric, ADF/AWS Glue
  • Experience in ecosystems requiring improvements and the drive to implement best practices as a long-term process
  • Experience with Infrastructure as Code practices, particularly Terraform, is an advantage

Don’t worry if you don’t meet all the requirements. What matters most is your passion and willingness to develop. Apply and find out!

A few perks of being with us

Building tech community
Flexible hybrid work model
Home office reimbursement
Language lessons
MyBenefit points
Private healthcare
Training Package
Virtusity / in-house training
And a lot more!

Apply now

Data Engineer/Consultant (Senior/Staff)

"*" indicates required fields

Accepted file types: pdf. Max. file size: 5 MB.
Please submit a CV no longer than two pages.
Current recruitment process: For the purpose of recruitment, I hereby give consent as per art. 6.1.a of the GDPR to processing of my personal data (other than that listed in art. 22 [1] § 1 Labour Code) by Virtus Lab Sp. z o. o. (as Co-Controller; for a full list of joint controllers, see Privacy Policy) with its headquarters at Szlak 49 Street, 31-153 Cracow. At the same time I accept the Privacy Policy of the Data Controller. I acknowledge that my personal data will be kept for the duration of the recruitment process and, as regards any potential claims, for the period of 36 months maximum, and that I have the right to access this data or have it rectified or deleted on demand. This consent can be withdrawn at any time, but this withdrawal does not make the previous processing illegal.

(Required)
Future recruitment processes: I hereby give consent as per art. 6.1.a of the GDPR to the processing of my personal data by Virtus Lab Sp. z o. o. (as Co-Controller; for a full list of joint controllers, see Privacy Policy) with its headquarters at Szlak 49 Street, 31-153 Cracow, in order to use this data in future recruitment processes. I hereby agree to possible storage of my personal data for this purpose in Virtus Lab's database for a period of 36 months maximum. At the same time I accept the Privacy Policy of the Data Controller. I acknowledge that I have the right to access this data or have it rectified or deleted on demand. This consent can be withdrawn at any point, but this does not make the previous processing illegal.

Coordinated by: Anna Goraj-Schuster, Talent Acquisition Lead
Not sure if this role is right for you?
It doesn't mean that you don't match. Tell us about yourself and let us work on it together.
Contact us