We are sorry!

This job has been closed. The job description is provided below as a reminder; it is no longer possible to apply.

Location: Singapore
Salary: Open
Sub-industry: Pharmaceuticals
Function: Technology

Job Description

Our Client:

Our client builds, delivers, and maintains a portfolio of data management platforms and mobile products to support its core operations. The team favors simple, adaptable architecture. You will be working in a fast-paced, high-growth environment where you will have the freedom to thrive, take calculated risks, and drive innovation across the company. Every day, the Digital Health Team improves the lives of millions of people through digital experiences and interactions.

This position will partner with the data science and analyst teams to build the systems and capabilities needed to discover insights and deliver them in impactful ways. Your role will be crucial for us. You will partner with cross-functional teams and work backward from business requirements to deliver scalable solutions that enable self-service and accelerate access to insights. This position will support our architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent across ongoing projects. You will play a key role in developing, constructing, testing, and maintaining data on the AWS platform and downstream data engineering to enable AI/ML across the Digital Health Team (DHT).

The Responsibilities:

* Build and/or lead the creation and maintenance of optimal data pipeline architectures
* Stay abreast of industry trends and enable successful data solutions by leveraging best practices
* Partner effectively with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
* Participate in and deliver proofs of concept (POCs) to demonstrate proposed solutions
* Enable team members in the data engineering space through training, culture, and team building
* Assemble large, complex data sets that meet functional / non-functional business requirements
* Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
* Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies
* Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
* Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
* Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
* Use expertise to improve the quality of data models, data flows, and data processing across data systems, reporting, and analytics solutions
* Work closely with Data Scientists on feature engineering and productionizing models to be robust and scalable
* Work closely with data scientists, business, and IT teams to build a platform and framework to enable machine learning and data analytics activities on a large scale
* Continuously innovate and optimize machine learning workflows through R&D on new technologies
* Establish, implement, and maintain machine learning engineering (MLOps) best practices and principles

The Requirements:

* Bachelor's, Master's, or Engineering degree in IT, Computer Science, Software Engineering, or a relevant field
* At least 5 years' experience in a complex, technical environment
* AWS technologies or other cloud experience is a must
* Proficiency with Python or Scala
* Advanced working knowledge of and experience with SQL and Python
* Experience building and optimizing 'big data' pipelines, architectures, and data sets, preferably on an AWS data lake
* AWS Certified Solutions Architect - Professional level is a huge bonus
* Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
* Strong analytic skills related to working with unstructured datasets
* Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management
* A successful history of manipulating, processing, and extracting value from large, disconnected datasets
* Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores
* Implementation experience in machine learning algorithms and applications
* Strong expertise in ML model deployment tooling (including experience with tools for real production deployments, testing, management of package dependency, lineage/audit trails, model versioning), high-performance computing, and parallel data processing
* Passionate about machine learning, new application areas, and new tools
* Strong project management and organizational skills
* Experience leading, supporting, and working with cross-functional teams in a dynamic environment

What's on Offer:

* Competitive salary and benefits package
* Work for a globally recognized company; a truly established global leader in the industry
* Avenues to contribute positively to the organization's growth and be recognized for it

Disclaimer: The Company complies with the Tripartite Guidelines on Fair Employment Practices (TGFEP), including the prevailing guidelines on recruitment. All qualified applicants will be considered for the position regardless of their age, race, religion, nationality, marital status, or family responsibilities. A more detailed discussion of the TGFEP is available on the Tripartite Alliance for Fair and Progressive Employment Practices (TAFEP) website at https://www.tal.sg/tafep