Freelance Data Engineer (f/m/d)

Job Details

Must Haves:

  • Strong skills in data integration and data-gathering methodologies, and
    experience working with a variety of APIs (e.g. REST or SOAP)
  • Experience with Public Cloud (preferably GCP) services and Service-
    Oriented Architecture
  • Proficient in the following languages: Python, SQL, Java/Scala
  • Understanding of ETL/ELT infrastructures and Cloud architectures
  • Experience with event-driven architectures and tools such as Kafka, and
    knowledge of version control systems such as Git
  • Expertise in big data and event streaming architecture, implementation
    and maintenance
  • Expertise in creating data pipelines, working with various messaging
    systems (Kafka, Pub/Sub, etc.), using SQL to create and query tables, and
    creating Spark Streaming jobs
  • Experience with the Cloudera platform (Apache Airflow, Apache Kafka,
    NoSQL, Impala, Spark Streaming using Scala) or a comparable tool stack
  • Experience with different file formats (ORC, Parquet, TXT, Avro, JSON)
  • Agile/Scrum experience
  • Strong problem-solving and troubleshooting skills
  • Excellent communicator at all levels
  • A team player with a strong sense of collaboration
  • Working proficiency in English


  • Location: T-Center, Vienna
  • Remote 80% / Onsite 20%
  • Duration: long-term
  • Start: ASAP

If the role sounds interesting to you, don’t hesitate to get in touch with us. Please also let us know your hourly rate.
We are looking forward to hearing from you!

Sarah Postlmayr
Flux Consulting GmbH