Data Platform Specialist
Role Summary
The Data Platform Specialist ensures that data loaded from a source to the destination is accurate after business transformation. The role also involves verifying data across the various process flows used between source and destination.
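The source-to-destination validation described above is often done by reconciling row counts and checksums between the two sides. A minimal sketch of that idea follows; the function names and sample rows are illustrative, not part of an actual pipeline:

```python
import hashlib

def checksum(rows):
    """Order-independent checksum over a collection of rows.

    Each row is serialized, hashed, and the hashes are XOR-combined,
    so the result does not depend on load order.
    """
    digest = 0
    for row in rows:
        h = hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        digest ^= int(h, 16)
    return digest

def reconcile(source_rows, target_rows):
    """Return (counts_match, checksums_match) for two row sets."""
    return (
        len(source_rows) == len(target_rows),
        checksum(source_rows) == checksum(target_rows),
    )

# Same data loaded in a different order should still reconcile.
source = [(1, "alice"), (2, "bob")]
target = [(2, "bob"), (1, "alice")]
print(reconcile(source, target))  # (True, True)
```

In practice the same comparison is usually pushed down to SQL (`COUNT(*)` plus an aggregate hash on each side) rather than pulling rows into Python; this sketch only shows the reconciliation logic itself.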
Responsibilities
- Interact with Technical Lead and Project Manager to understand requirements and timelines.
- Assist in creating test specifications from design requirements.
- Perform functional and regression testing to support quality.
- Execute tests and provide test results and reports for product leads within defined timelines.
- Assist developers with code debugging and compliance issues.
- Provide recommendations on compliance, debugging, and ensuring code is deployed to production without errors or defects.
Requirements
- Bachelor's degree holder (IT or related technical field preferred) with experience in IT and/or SI project management.
- Experience: Junior level (≤ 5 years). Fresh graduates are encouraged to apply.
- Enterprise Data Warehouse Solution – Snowflake
- Security – Azure Sentinel or Azure Advisor, Azure Data Factory
- Data File System – HDFS, AWS S3, Azure ADLS Gen2/Delta Lake
- Data Ingestion – Streaming: NiFi/Kafka, Azure Event Hub, Azure Functions, Stream Analytics; Batch: Sqoop, Azure Data Factory, Oracle Data Integrator, Databricks
- Metadata – Hive/MySQL, Azure Purview
- Job Scheduler – Proprietary shell scripts, Airflow, Cron, ADF, Snowflake Tasks
- Data Transform Jobs – SQL, Spark, Databricks
- Reporting & visualization – Tableau
- Infrastructure – On-premises Red Hat Virtualization; Cloud: AWS and Azure
- OS – Linux and Windows