
Career

Automation Engineer

Operations | Contract

Do you want to be part of Thailand's banking transformation? Data is at the core of the new era of financial services, and we welcome the opportunity to drive this change from the core.

SCB DATAx is a new venture of the Siam Commercial Bank (SCB) holdings, a leading financial and digital services group in Thailand and ASEAN.

As part of the transformation of SCB into a group of product and technology companies under the SCBx brand, SCB DATAx is the technology company that centralizes data and provides AI and data science services and products to the group.

With a leading-edge cloud-native data & AI platform, our vision is to support the group in providing everyone in our region with the opportunity to prosper.

We work on forward-thinking challenges in centralizing, analyzing, and sharing information. We collaborate with companies and experts across many different domains, embrace diversity, and do all of this while having a good laugh and finding joy in our work.


Discover job openings on our career page. To apply, email us with the role's title as the subject, attach your CV, and include your contact information. We're eager to learn more about you.



Preferred Qualifications

  • Knowledge of data modeling and data management in production systems.

  • Certification in relevant Azure technologies (e.g., Azure Certified Data Engineer, Azure Certified DevOps Engineer).

  • Experience with containerization and orchestration technologies such as Docker and Kubernetes.

  • Knowledge of streaming data processing frameworks such as Apache Kafka.

Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred.

  • Proven experience (3 years or more) working as an Automation Engineer or Data Engineer in a data-intensive environment, with a focus on Azure and Databricks technologies.

  • Strong programming skills in languages such as Python, Java, or Scala.

  • Hands-on experience with Azure services such as Azure Data Lake Storage, Azure SQL Database, Azure Databricks, and Azure Functions.

  • Proficiency in SQL and experience working with relational and NoSQL databases.

  • Experience with Databricks Unified Analytics Platform, including Spark-based data processing and machine learning capabilities.

  • Experience implementing data quality controls and monitoring solutions in a production environment.

  • Excellent problem-solving and analytical skills, with a keen attention to detail.

  • Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.

Responsibilities

  • Implement workflow orchestration and scheduling mechanisms using tools like Azure Data Factory, Databricks Jobs, or custom scripting.

  • Develop and maintain automated checks and validation rules to ensure data quality, integrity, and compliance with regulatory standards.

  • Implement monitoring solutions to track the health and performance of data pipelines, utilizing Azure Monitor, Databricks monitoring tools, or custom dashboards.

  • Optimize the performance and scalability of the data platform by automating resource provisioning, workload management, and system monitoring using Azure services such as Azure Kubernetes Service (AKS), Azure Monitor, and Azure Resource Manager.

  • Stay up to date on emerging technologies, tools, and best practices in data engineering and automation to drive continuous improvement.

About Team & Role

As an Automation Engineer specializing in data platform architecture, you will play a key role in designing, implementing, and maintaining automated processes within our data platform. You will collaborate closely with cross-functional teams to ensure the reliability, scalability, and performance of our data infrastructure, leveraging Azure and Databricks technologies. You will also be responsible for implementing mechanisms to control data quality and monitor data in real time.
