
Data Engineer

  • 2023-09-18
  • Singapore, Singapore


Job Details

Roles & Responsibilities

Required and desired skills/qualifications:

  • The ideal candidate should have around 5-9 years of experience.
  • Strong technical foundation with in-depth knowledge of Big Data (Hadoop), data reporting, data design, data analysis, data governance, data integration, and data quality.
  • Experience in monitoring and tuning tasks on the Cloudera distribution.
  • Deep and extensive knowledge of HDFS, Spark, MapReduce, Hive, HBase, Sqoop, YARN, and Airflow.
  • Thorough knowledge of Hadoop architecture and its components, including HDFS, NameNode, DataNode, ApplicationMaster, ResourceManager, NodeManager, JobTracker, TaskTracker, and the MapReduce programming paradigm.
  • Good understanding of Hadoop MR1 and MR2 (YARN) architecture.
  • Proficient with the Hive data warehouse tool: creating tables, distributing data through partitioning and bucketing strategies, and writing and optimizing HiveQL queries (see the partitioning/bucketing sketch after this list).
  • Good experience working with Hadoop file formats such as SequenceFile, ORC, Avro, and Parquet.
  • Experience using modern big data tools such as Spark SQL to convert schema-less data into more structured files for further analysis, and Spark Streaming to receive real-time data and store it in HDFS (see the streaming sketch after this list).
  • Good knowledge of Hadoop cluster architecture and cluster monitoring.
  • In-depth understanding of data structures and algorithms.
  • Experience setting up standards and processes for Hadoop-based application design and implementation.
  • Experience importing and exporting data between HDFS and relational database systems using Sqoop.
  • Experience with Avaloq data processing, and participation in multiple Avaloq Core Banking Platform implementations across various business and technical streams.
  • Good experience with CI/CD pipelines and working in an Agile environment.
  • Hands-on experience with real-time streaming into HDFS using Kafka and Spark Streaming.
  • Experience developing analytical components using Spark SQL and Spark Streaming.
  • Experience converting Hive/SQL queries into Spark transformations using Spark SQL in Scala.
  • Good knowledge of streaming data from multiple sources into HDFS using Kafka.
  • Knowledge of processing and analyzing real-time data streams using Kafka and HBase.
  • Proficient in all phases of software development, including design, configuration, testing, debugging, implementation, release, and support of large-scale banking platform applications.
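
The partitioning and bucketing point above can be illustrated with a minimal Spark-on-Hive sketch in Scala. The table name (transactions), column names, bucket count, and sample rows are hypothetical; the job assumes a Spark build with Hive support and writes a metastore table partitioned by trade_date and bucketed by account_id.

    import org.apache.spark.sql.SparkSession

    object HiveTableSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-partition-bucket-sketch")
          .enableHiveSupport() // register the table in the Hive metastore
          .getOrCreate()
        import spark.implicits._

        // Hypothetical transactions data standing in for a real feed.
        val txns = Seq(
          (1001L, 250.00, "SGD", "2023-09-18"),
          (1002L,  75.50, "USD", "2023-09-18")
        ).toDF("account_id", "amount", "currency", "trade_date")

        // Partition by trade_date and bucket by account_id so that date
        // filters prune partitions and joins on account_id avoid a shuffle.
        txns.write
          .mode("overwrite")
          .partitionBy("trade_date")
          .bucketBy(32, "account_id")
          .sortBy("account_id")
          .format("parquet")
          .saveAsTable("transactions")

        // HiveQL-style query; the WHERE clause scans only one partition.
        spark.sql(
          "SELECT currency, SUM(amount) AS total " +
          "FROM transactions WHERE trade_date = '2023-09-18' GROUP BY currency"
        ).show()

        spark.stop()
      }
    }

Partitioning keeps date filters to a single directory scan, while bucketing pre-sorts and pre-distributes rows by account_id so joins and aggregations on that key are cheaper.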
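
The Kafka and Spark Streaming points can likewise be sketched with Structured Streaming in Scala. The topic name (trades), broker address, JSON schema, and HDFS paths are placeholders, and the job assumes the spark-sql-kafka connector is on the classpath.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._
    import org.apache.spark.sql.types._

    object KafkaToHdfsSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-to-hdfs-sketch")
          .getOrCreate()

        // Assumed JSON payload on the hypothetical "trades" topic.
        val schema = new StructType()
          .add("account_id", LongType)
          .add("amount", DoubleType)
          .add("currency", StringType)

        // Read the raw Kafka stream; broker address is a placeholder.
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "trades")
          .load()

        // Kafka values arrive as bytes; parse them into structured columns.
        val parsed = raw
          .select(from_json(col("value").cast("string"), schema).as("t"))
          .select("t.*")

        // Write the structured stream to HDFS as Parquet with checkpointing.
        val query = parsed.writeStream
          .format("parquet")
          .option("path", "hdfs:///data/trades")
          .option("checkpointLocation", "hdfs:///checkpoints/trades")
          .outputMode("append")
          .start()

        query.awaitTermination()
      }
    }

The checkpoint directory records the consumed Kafka offsets, so the stream can resume where it left off after a restart instead of re-reading or skipping data.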

Skills

Airflow
Data Analysis
Big Data
Hadoop
ETL
Data Integration
Data Quality
Data Governance
MapReduce
Data Design
Tuning
SQL
Debugging
Software Development

Beware of scams. Do not provide personal information or payments to unknown sources. Verify identities before taking action. Report any suspected scams immediately. Stay alert and stay safe.


INTELLECT MINDS PTE. LTD.


© Copyright 2024 Agensi Pekerjaan JEV Management Sdn. Bhd., registered in Malaysia (Company No: 201701016948 (1231113-U), EA License No. JTKSM860)
© Copyright 2024 Job Majestic Sdn. Bhd., registered in Malaysia (Company No: 201701037852 (1252023-X))
All rights reserved.
