Salary:

  • UOP (employment contract): 15 000 – 30 000 PLN gross.
  • B2B: 15 000 – 35 000 PLN net.

Our project:

We are a start-up based in Munich and Barcelona. Founded in spring 2020, one year later we are already a team of 60+ highly motivated ML engineers, data scientists, and developers. Our mission is to build beautiful Artificial Intelligence products. We’re curious, passionate, and relentless in our drive to develop a truly end-to-end product, pushing the boundaries of innovation as far as we can.

Requirements:

  • Hands-on experience in data modeling, ETL development, and data warehousing/lakes.
  • Excellent SQL skills.
  • Solid understanding of database design & administration principles.
  • Solid understanding of query execution plans.
  • Excel in the design, creation, management, and business use of large data sets, combining raw information from different sources.
  • Experience in root cause analysis, remediation, and problem resolution for complex systems.
  • Working knowledge of database systems architecture, including networking, security, scaling, and fault tolerance.
  • Working knowledge of database internals such as locking, wait events, consistency, logging, and recovery.
  • Proficiency in tuning databases for performance, availability, and scalability.
  • Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.

Will be a plus:

  • Coding experience with modern programming languages such as Python, Scala, or Java.
  • Hands-on experience with cloud database technologies, specifically Redshift.
  • Experience architecting data solutions with AWS products including Big Data Technologies (Redshift, RDS, S3, Glue, Athena, EMR, Spark, Hive, etc.).
  • Experience developing and deploying high-performance solutions using Apache Airflow, Lambda functions, Glue, Python, and Spark.
  • Experience providing technical leadership and mentoring other engineers in data engineering best practices.

Your role:

  • Develop high quality data models in SQL to support our applications.
  • Design, build and launch new data models and data pipelines.
  • Implement best practices in data engineering, including data integrity, validation, reliability, and documentation.
  • Optimize database design for performance.
  • Design and implement scalable extract, transform and load (ETL) pipelines.
  • Design and implement automated alarming and dashboards to monitor data integrity.
  • Build relationships with Data Scientists, Product Managers and Software Engineers to understand data needs.

We provide:

  • Modern A-class office in a vibrant city center (Twarda 18). Option to work fully remotely, fully from the office, or in a hybrid arrangement.
  • Flexible working hours.
  • Flat structure.
  • Mentor support.
  • Corporate library.
  • In-house training sessions.
  • International projects.
  • Knowledge sharing.
  • Free coffee & snacks.
  • Family picnics.
  • Charity events.

Benefits:

  • Sport subscription (Multisport Classic).
  • Psychological help compensation.
  • 4 sick days a year.
  • Training budget.
  • Private healthcare (LuxMed Silver).
  • Mental Health Support compensation.
  • Language courses.
  • Employee referral program.