Data Platform Engineer (Expert)
B2B: 26 000 - 31 000 PLN NET
LOCATION
Remote
Poland: Kielce, Kraków, Wrocław
We are #VLteam – tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!
About the role
Java/Scala/Python – Expert
Self-organization – Expert
Tech Leading – Expert
Data Engineering – Advanced
AWS / GCP – Regular
Beam/Spark/Presto – Regular
SQL – Basic
Kafka – Basic
Apache Airflow or other orchestration tool – Basic
English – Advanced
Project
Apollo Data Platform
Project Scope
Our client is a NASDAQ-listed company that provides a range of solutions to support Go-To-Market (GTM) strategies. They offer a comprehensive B2B database platform that enables sales and marketing professionals to identify, connect with, and engage qualified prospects effectively. The core mission of our client is to equip every company with a complete, 360-degree view of their ideal customer, enhancing every phase of their GTM strategy and boosting their success in achieving business targets.
As a Data Platform Engineer, you'll lead a small engineering team in an R&D setting and play a crucial role in designing, implementing, and deploying elements of a Data Platform for our customer. We are looking for an experienced software engineer skilled in building large-scale data pipelines and systems, with strong problem-solving abilities. You'll drive innovation, swiftly test and validate new concepts, and integrate them into the platform in close collaboration with the team.
Tech Stack
Java, Beam, Spark, Kafka, Airflow, Presto, AWS and GCP (Dataflow, Dataproc/EMR, Cloud Functions/Lambda, BigQuery, Bigtable), Python
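To give a flavour of day-to-day work on this stack, here is a minimal, hypothetical sketch of a Beam batch pipeline in Java that counts records per key; the class name, file paths, and record layout are illustrative assumptions, not project code.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class RecordCountPipeline {
  public static void main(String[] args) {
    // Runner (Dataflow, Spark, direct, ...) is chosen via command-line options.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        // Read newline-delimited records; the bucket path is a placeholder.
        .apply("ReadRecords", TextIO.read().from("gs://example-bucket/records/*.csv"))
        // Key each record by its first comma-separated field (illustrative layout).
        .apply("KeyBySource", MapElements
            .into(TypeDescriptors.strings())
            .via(line -> line.split(",", 2)[0]))
        // Count records per key.
        .apply("CountPerSource", Count.perElement())
        // Format the counts as CSV lines and write them out.
        .apply("Format", MapElements
            .into(TypeDescriptors.strings())
            .via((KV<String, Long> kv) -> kv.getKey() + "," + kv.getValue()))
        .apply("WriteCounts", TextIO.write().to("gs://example-bucket/output/counts"));

    pipeline.run().waitUntilFinish();
  }
}

The same pipeline can be submitted to Dataflow on GCP or to a Spark/Flink runner on EMR, which is the kind of portability that motivates Beam in this stack.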
Challenges
- Design, build, and operate highly scalable and flexible systems that can manage and process billions of records a day and support complex and diversified data pipelines
- Leverage cloud computing architectures to support development needs
- Track and identify relevant new technologies in the market and push their implementation into our pipelines through research and prototyping
- Develop processes and tools to monitor, analyze, maintain, and improve data operations, performance, and usability
Team
Currently, 7 engineers from VL work on the project, split into 2 subteams. The wider team is primarily based in the US and Israel.
What we expect in general
- Software Engineering experience in data platform/big data software, with a proven track record of delivering highly scalable and efficient solutions
- Substantial experience with Java 8+ (preferred), Scala or Python
- Experience with streaming/data processing technologies such as Beam, Spark, Kafka, Airflow, HBase, Presto
- Experience in building enterprise-grade software in a cloud-native environment (GCP or AWS) using cloud services
- Experience in system architecture and design
- Familiarity with designing CI/CD pipelines with Jenkins, GitHub Actions, or similar tools (nice to have)
- Experience with Kubernetes using GKE/EKS (nice to have)
- Experience with SQL (nice to have)
- Experience with Graph and Vector databases or processing frameworks (nice to have)
- Excellent communication skills and a pragmatic approach to problem-solving
Don’t worry if you don’t meet all the requirements. What matters most is your passion and willingness to develop. Moreover, B2B does not have to be the only form of cooperation. Apply and find out!
A few perks of being with us
- Building tech community
- Flexible hybrid work model
- Home office reimbursement
- Language lessons
- MyBenefit points
- Private healthcare
- Stretching
- Training Package
- Virtusity / in-house training
And a lot more!
Apply now
Not sure if this role is right for you?
That doesn't mean you're not a match. Tell us about yourself and let's work on it together.