Data Engineer (based in HCMC)

Amanotes
Contract Duration: 12 months
Skills: AWS (4 years), Python (4 years), Data Modeling (4 years), SQL (4 years), ReactJS (3 years), JavaScript (3 years), Linux (3 years), Azure Data Lake (3 years)

Why You'll Love Working Here

  • Learning and leisure budget
  • Free music classes
  • Monthly team building
  • Work-from-home policy
  • Performance bonus
  • Global focus

Job Description

About Amanotes:

  • A fast-growing startup in the music-tech industry. We seek to delight people with interactive music experiences.
  • Since 2014, we have published 30+ music games and mobile applications, with over 2 billion downloads worldwide and 120+ million monthly active users.
  • View Amanotes on Gambaru Platform

Where we are going

Growth:

  • Scale our music platform to retain the top position in the industry in terms of functionality, user experience, and performance
  • Take a high level of ownership to get the job done and coach junior engineers.
  • Explore and evaluate the practicality of new technologies to improve existing processes and/or products
  • Brainstorm to clarify product ideas and technical solutions
  • Proactively manage tasks and processes using the JIRA suite.

Daily Work:

  • As an Amanotes data engineer, you will leverage hundreds of terabytes of data from 2+ billion users to create real business value through practical data solutions.
  • Design, build, implement, optimize, and own the data/ETL pipelines together with the rest of the Data Engineering team (a minimal pipeline sketch follows this list).
  • Create and maintain pipeline and data documentation.
  • Implement the system pipelines (Data Warehouse, Data Lake).
  • Engineer data platforms to automate/facilitate data-related processes.
  • Collaborate with business stakeholders, the Data Analytics team, and the AI/ML team to build deep learning models and algorithm-focused data products (reusable assets), and deliver them to clients.
  • Work with external tech teams on projects that require the Data Engineering team's involvement.
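
To give a concrete, purely illustrative picture of the pipeline work described above, here is a minimal Airflow DAG sketch in Python. All names (bucket, dataset, table, dbt project path) are assumptions, not Amanotes' actual setup; it only shows a daily load from a GCS data lake into a BigQuery staging table followed by a dbt transformation step.

    # Minimal illustrative sketch; all resource names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="daily_events_etl",            # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Load the day's raw event files from the GCS data lake into a staging table.
        load_raw_events = GCSToBigQueryOperator(
            task_id="load_raw_events",
            bucket="example-raw-events",      # hypothetical bucket
            source_objects=["events/{{ ds }}/*.parquet"],
            destination_project_dataset_table="analytics.stg_events",
            source_format="PARQUET",
            write_disposition="WRITE_TRUNCATE",
        )

        # Rebuild downstream warehouse models (facts/dimensions) with dbt.
        run_dbt_models = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --project-dir /opt/dbt --select staging+",
        )

        load_raw_events >> run_dbt_models

In practice the orchestration layer, naming conventions, and transformation tooling would follow the team's existing standards; the sketch only shows the shape of a load-then-transform daily pipeline.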

What you can expect

  • Silicon Valley-style work culture: move fast, break things, make mistakes, learn quickly, and grow.
  • Work-from-home policy and flexible time management
  • Opportunity to learn and take on additional roles within the engineering team, such as research, analytics, automated testing, business development, product management, or DevOps
  • Open, direct, collaborative, and no-politics working environment
  • Many challenging engineering problems in building, scaling, and securing the data platform
  • Apply the Grow-at-Work package from Gambaru
  • Free ticket to Gambaverse, where you can grow your skills, emotional well-being, and finances.

Your Skills and Experience

What we are looking for

Must have:

  • 3-4+ years of hands-on experience deploying production-quality code.
  • Experience with the following technologies: Python, Google Cloud Platform (at least GCS and BigQuery), Airflow, and dbt.
  • Knowledge of JavaScript/ReactJS is a plus.
  • Experience with Airflow or other pipeline orchestration software.
  • Deep understanding of SQL and analytical Data Warehouses (BigQuery preferred).
  • Deep understanding, skills, and experience in deployment.
  • Data modeling for data warehouses: familiarity with the major methodologies, e.g. Kimball, Inmon, or Data Vault (see the sketch after this list).
  • Experience working in a Linux environment and with Git.
  • Experience with and interest in cloud platforms such as Google Cloud Platform, Kubernetes (K8s), AWS, Azure, or Databricks.
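
As a hedged illustration of the SQL, BigQuery, and Kimball-style modeling skills listed above, the sketch below queries an assumed star schema from Python with the official BigQuery client. The dataset and table names (analytics.fct_purchases, dim_date, dim_game) are hypothetical, invented only to show a fact table joined to its dimensions.

    # Minimal illustrative sketch; table and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    QUERY = """
    SELECT
      d.calendar_date,
      g.game_name,
      SUM(f.revenue_usd) AS revenue_usd
    FROM `analytics.fct_purchases` AS f          -- fact table
    JOIN `analytics.dim_date`      AS d ON f.date_key = d.date_key
    JOIN `analytics.dim_game`      AS g ON f.game_key = g.game_key
    WHERE d.calendar_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY d.calendar_date, g.game_name
    ORDER BY d.calendar_date, revenue_usd DESC
    """

    # Daily revenue per game for the last 7 days, read straight from the warehouse.
    for row in client.query(QUERY).result():
        print(row.calendar_date, row.game_name, row.revenue_usd)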

Nice to have:

  • Experience deploying CI/CD for data processes on GitLab/GitHub/Bitbucket.
  • Experience with data-intensive projects on AWS and GCP (AWS ecosystem, Hadoop ecosystem, K8s, GCS data lake).
  • Experience and knowledge of streaming ETL processes (Kafka, Apache Spark, Pub/Sub, …) is a plus.
  • Experience with BI/data visualization tools.
  • Business mindset.
