Thermo Fisher

Systems Developer (Data Warehouse)

Bangalore, India | Full time

Work Schedule

Other

Environmental Conditions

Office

Job Description

Summarized Purpose:

We are offering an outstanding opportunity to join Thermo Fisher Scientific as a Data Warehouse Developer. In this role, you will focus on designing, building, and optimizing database-centric solutions that power our analytical and operational data platforms. You will play a key role in developing robust, scalable data pipelines and warehouse structures, primarily on AWS Redshift, supporting data-intensive workloads and downstream reporting applications.

Education/Experience:

  • Bachelor’s degree in Computer Science, Information Science, or a related area of study, or comparable experience
  • 3+ years of experience in database development, data engineering, or a related field
  • Equivalent combinations of education, training, and experience will also be considered

Major Job Responsibilities:

  • Design, implement, and refine data warehouse schemas (e.g., star/snowflake models) and data pipelines in AWS Redshift or similar RDBMS platforms.
  • Build and manage SQL-centric ETL/ELT procedures to process, transform, and merge data from diverse sources.
  • Improve database performance through query optimization and efficient data processing.
  • Collaborate with stakeholders to translate data needs into scalable and maintainable data structures.
  • Support data quality, validation, and reconciliation processes to ensure accurate and reliable datasets.
  • Engage with AWS services including Lambda, Step Functions, and S3 for orchestrating and automating data workflows.
  • Participate in design reviews, documentation, and testing activities to ensure adherence to quality and compliance standards.
  • Collaborate with Operations and DevOps teams to deploy and monitor data workflows using CI/CD pipelines where applicable.
  • Troubleshoot production issues, analyze root causes, and propose pragmatic, lasting solutions.
  • Leverage AI-assisted development tools to improve query efficiency and to streamline code refactoring and documentation.

Knowledge, Skills and Abilities:

  • Strong hands-on SQL development skills including complex queries, window functions, joins, and analytical operations.
  • Mastery of data modeling and a solid grasp of data warehousing principles (ETL/ELT, dimensional modeling, slowly changing dimensions).
  • Experience working with large relational databases (e.g., Redshift, PostgreSQL, SQL Server, MySQL, Oracle).
  • Knowledge of AWS cloud services, especially S3, Lambda, Redshift, and Step Functions.
  • Familiarity with Python or Node.js for scripting, automation, or Lambda-based data workflows.
  • Excellent analytical and problem-solving skills with attention to detail.
  • Strong communication and collaboration skills in a team-oriented environment.

Must-have skills:

  • Advanced SQL and RDBMS experience – Ability to develop and optimize queries, stored procedures, and data transformations for large-scale data workloads.
  • Data warehousing and ETL/ELT – Practical experience designing and maintaining data warehouse environments and data pipelines.
  • Practical exposure to AWS services – Hands-on experience with AWS data services like Redshift, S3, Lambda, and Step Functions.
  • Python or Node.js programming – Skilled in using a programming language for automation or data integration purposes.
  • Data modeling and schema creation – Experience crafting normalized and dimensional schemas for analytics and reporting.

Good-to-have skills:

  • Exposure to data lakehouse or big data environments (Databricks, Snowflake, or similar).
  • Knowledge of AI-assisted or modern query optimization tools and practices.

Working Hours:

India: 05:30 PM to 02:30 AM IST

Philippines: 08:00 PM to 05:00 AM PST (Philippine Standard Time)