Senior Data Engineer

January 31, 2021 | Database/Data Engineering/Data Analytics | 1 position in Phnom Penh

Job responsibilities

This role will be responsible for building and maintaining the software infrastructure that enables computation over large data sets.

  • Create, optimize and maintain optimal data collection, flow and pipeline for data science POCs and industrialized solutions
  • Assemble, cleanse, integrate and transform large, complex data sets into model ready data
  • Identify, design, and implement internal process improvements, including automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Cloudera ‘big data’ technologies
  • Build data cleansing, management and visualization tools that provide actionable insights into data features that might be relevant for POCs and industrialized solutions
  • Create data tools for Data Science and other analytics team members that assist them in building and optimizing POCs and industrialized solutions
  • Work with stakeholders including Data Scientists, Data Architects, commercial and other stakeholders to assist with data-related technical issues and support their data infrastructure needs

Job requirements

  • Bachelor's or master's degree in Computer Science, Information Technology, or a related field, or equivalent work experience covering the knowledge required for this position
  • 3 years of working experience in data engineering
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets is required.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement is required.
  • Experience setting up and managing Cloudera technologies
  • Experience using big data tools such as Hadoop, Sqoop, Flume, WebHDFS, Spark, Impala, Hive, Solr, etc.
  • Experience with Unix shell scripts, Python, PL/SQL

Be the next Smart Hero

Job Application

Files must be less than 2 MB.
Allowed file types: rtf, pdf, doc, docx, odt.
