Senior Data Engineer

  • IT / Engineering
  • Cracow, Poland

Job description

Our client is an award-winning global player offering leading currency solutions to customers ranging from retail traders to global corporations. Founded in 1996, the company was the first to share exchange rate information on the internet free of charge, and in 2001 it launched a trading platform that helped pioneer online trading around the world, enabling forex and CFD investors to trade the financial markets.


We are looking for an exceptional Senior Data Engineer to help us define and build our next generation data platform on top of Google Cloud, guiding its direction and integration within the company. This platform will support our internal analytics and personalization needs, while enabling future business opportunities.


Key takeaways:

Stack: Python (ideally) or Java/Scala, plus one of Dataflow, BigQuery, Kafka, Airflow, or Spark.

Salary: 20 000 - 29 000 PLN gross (UoP).

Location: 100% remote or hybrid (Kraków).

Recruitment process: two-step online process.

Role description:

  • Present a schema management system design at an Engineering-wide architecture review,
  • Build new data pipelines using Dataflow,
  • Code data fills in Python against Google Cloud Storage and BigQuery,
  • Investigate, deploy, and implement a caching solution enabling real-time joins of big data sets,
  • Develop a data quality validation framework used for both testing and real-time production data,
  • Design tools enabling easy-to-use workflows for internal teams using the data platform.
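To give a flavor of the data quality work described above, here is a minimal sketch of a row-level validation check, of the kind such a framework might run in both tests and production. All names and the schema format are illustrative assumptions, not the client's actual API:

```python
def validate_row(row, schema):
    """Return a list of violations for one record against a simple schema.

    schema maps field name -> (expected type, required flag).
    This is an illustrative sketch, not the client's real framework.
    """
    violations = []
    for field, (expected_type, required) in schema.items():
        value = row.get(field)
        if value is None:
            # Missing fields only count as violations when required.
            if required:
                violations.append(f"{field}: missing required field")
        elif not isinstance(value, expected_type):
            violations.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(value).__name__}"
            )
    return violations

# Example: an FX quote record whose bid price arrived as a string.
schema = {"pair": (str, True), "bid": (float, True), "ask": (float, True)}
bad_quote = {"pair": "EUR/USD", "bid": "1.0842", "ask": 1.0845}
print(validate_row(bad_quote, schema))  # → ["bid: expected float, got str"]
```

In a real pipeline the same check could run inside a Dataflow transform on production traffic or against fixture rows in unit tests, which is what "used for both testing and real-time production data" suggests.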
Requirements

  • 7+ years of experience designing and implementing large scale software,
  • 4+ years of experience working with Big Data technologies such as Dataflow, BigQuery, Kafka, Airflow, and Spark,
  • Experience designing a real-time Big Data platform,
  • Experience deploying and managing Big Data infrastructure,
  • Strong coding ability in an object-oriented language (preferably Scala, Python, or Java),
  • Excellent team player with strong communication skills (verbal and written),
  • Enthusiastic about collaborative problem solving,
  • Bachelor’s degree or higher in Computer Science.

What we offer

  • Competitive compensation package including annual performance bonus opportunity,
  • Competitive benefits package, including health care and gym pass,
  • Spacious and modern office space in the heart of Warsaw’s Old Town,
  • Flexibility and the possibility to work remotely,
  • Kitchen full of coffee, tea, snacks, and fresh fruit,
  • Superior co-working and personal development experience,
  • Relocation package for candidates outside Poland.