
Data Engineer

  • 2023-09-18


Singapore, Singapore


Job Details

Roles & Responsibilities

Required and desired skills/qualifications:

  • The desired candidate should have around 5-9 years of experience.
  • Strong technical foundation with in-depth knowledge of Big Data (Hadoop), data reporting, data design, data analysis, data governance, data integration, and data quality.
  • Experience monitoring and tuning tasks on the Cloudera distribution.
  • Deep and extensive knowledge with HDFS, Spark, MapReduce, Hive, HBase, Sqoop, Yarn, Airflow.
  • Thorough knowledge of Hadoop architecture and its components, such as HDFS, Name Node, Data Node, Application Master, Resource Manager, Node Manager, Job Tracker, Task Tracker, and the MapReduce programming paradigm.
  • Good understanding of Hadoop MR1 and MR2 (YARN) architecture.
  • Efficient in working with the Hive data warehouse tool: creating tables, distributing data by implementing partitioning and bucketing strategies, and writing and optimizing HiveQL queries.
  • Good experience working with different Hadoop file formats like Sequence File, ORC, AVRO and Parquet.
  • Experience using modern Big Data tools like Spark SQL to convert schema-less data into more structured files for further analysis, and with Spark Streaming to receive real-time data and store the streams in HDFS.
  • Good knowledge of Hadoop cluster architecture and cluster monitoring.
  • In-depth understanding of Data Structure and Algorithms.
  • Experience setting up standards and processes for Hadoop-based application design and implementation.
  • Experience importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience working on Avaloq data processing, with participation in multiple Avaloq Core Banking Platform implementations across various business/technical streams.
  • Good experience with CI/CD pipelines and working in an Agile environment.
  • Hands-on experience with real-time streaming using Kafka and Spark Streaming into HDFS.
  • Developed analytical components using Spark SQL and Spark Streaming.
  • Involved in converting Hive/SQL queries into Spark transformations with Spark SQL in Scala.
  • Good knowledge of streaming data from multiple sources into HDFS using Kafka.
  • Knowledge of processing and analyzing real-time data streams/flows using Kafka and HBase.
  • Proficient in all phases of software development, including design, configuration, testing, debugging, implementation, release, and support of large-scale bank platform applications.
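The MapReduce programming paradigm required above can be sketched in plain Python as a simplified word count: map emits key-value pairs, a shuffle step groups them by key, and reduce aggregates each group. This is an illustrative sketch of the paradigm only, not Hadoop code; all function names are hypothetical.

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit a (word, 1) pair for every word in every input record.
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate (here, sum) the values collected for each key.
    return {word: sum(counts) for word, counts in grouped.items()}

records = ["big data hadoop", "spark and hadoop"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
```

In a real Hadoop job the map and reduce functions run on different nodes and the framework performs the shuffle over the network; the decomposition of the problem is the same.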

Skills

Airflow
Data Analysis
Big Data
Hadoop
ETL
Data Integration
Data Quality
Data Governance
MapReduce
Data Design
Tuning
SQL
Debugging
Software Development



INTELLECT MINDS PTE. LTD.


© Copyright 2024 Agensi Pekerjaan JEV Management Sdn. Bhd., registered in Malaysia (Company No: 201701016948 (1231113-U), EA License No. JTKSM860)
© Copyright 2024 Job Majestic Sdn. Bhd., registered in Malaysia (Company No: 201701037852 (1252023-X))
