NEUN

Middle Data Engineer

Remote (hubs in Dubai, Yerevan, …) · Full-time · Global
Posted within the last 30 days

Job Description

[AI-summarized by JobStash]

You will build and maintain data pipelines and data-related services that power analytics across products. You will extend SQL-based pipelines, add new data sources to ETL processes, refactor Python scripts into modular production-quality code, configure CI for linting and tests, and update pipeline documentation. You will work closely with senior engineers and analysts, ask clarifying questions, and gradually take ownership of data platform components while learning modern data workflows.
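One of the tasks described above is refactoring ad-hoc Python scripts into modular, production-quality code with logging. A minimal sketch of what that looks like in practice (the `Record` type and `clean_records` function are illustrative assumptions, not part of any actual codebase):

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

@dataclass
class Record:
    user_id: int
    amount: float

def clean_records(rows):
    """Drop malformed rows and normalize amounts to two decimals."""
    cleaned = []
    for raw in rows:
        try:
            rec = Record(user_id=int(raw["user_id"]),
                         amount=round(float(raw["amount"]), 2))
        except (KeyError, ValueError):
            # Log and skip instead of crashing the whole pipeline run.
            log.warning("skipping malformed row: %r", raw)
            continue
        cleaned.append(rec)
    log.info("kept %d of %d rows", len(cleaned), len(rows))
    return cleaned
```

The point of the refactor is that each step becomes a small, testable function with explicit logging, rather than one long script with prints.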

Requirements

  • ā—Confident communication and proactive clarification seeking
  • ā—Responsibility, ownership, and proactive communication of challenges
  • ā—Comfortable with IDEs and version control systems like Git
  • ā—Basic understanding of clean code principles and software delivery workflows
  • ā—Essential Python skills including language fundamentals and data structures
  • ā—Confident with SQL basics
  • ā—Regular and thoughtful use of AI tools
  • ā—Motivation to learn and grow in data engineering
  • ā—Knowledge of data engineering fundamentals including ETL, data modeling, data quality, and storage systems
  • ā—Experience using Apache Airflow
  • ā—Familiarity with containerization tools such as Docker
  • ā—Exposure to cloud platforms such as GCP
  • ā—Basic experience with cloud platforms (GCP, AWS, or Azure) is a plus
  • ā—Understanding of orchestration tools such as Airflow is a plus
  • ā—Basic Docker usage (build, run, logs) is a plus
  • ā—Experience with BI tools (Superset, Metabase, Power BI) is a plus
  • ā—Personal data projects (ETL scripts, dashboards, analytics) are a plus

Responsibilities

  • ā—Build and maintain data pipelines and data-related services
  • ā—Contribute to shared tools and libraries
  • ā—Upgrade data platform components and services
  • ā—Communicate with analysts to understand data needs
  • ā—Extend SQL-based pipelines with new transformations
  • ā—Add new data sources to ETL processes
  • ā—Refactor Python scripts into modular code and add logging
  • ā—Configure CI for linting and tests for data repositories
  • ā—Update pipeline documentation after logic or schema changes

Benefits & Perks

  • ā—Remote work setup with access to hubs in Dubai, Yerevan, London and Belgrade
  • ā—Compensation for medical expenses
  • ā—Provision of necessary equipment
  • ā—20 working days of paid vacation annually
  • ā—11 days off per year
  • ā—14 days of paid sick leave
  • ā—Access to internal conferences
  • ā—Access to English courses
  • ā—Access to corporate events
  • ā—Regular performance reviews

Tech Stack

data pipeline · logging · Python · SQL · ETL · CI · data modeling · testing · GCP · Git
Project: The Open Platform