Junior Data Engineer

Information Technology

Taguig, Metro Manila   |   Full Time

As a Junior Data Engineer at FGI, you will support the creation of the technology and data architecture that moves and transforms the data used to inform critical strategic and real-time decisions. You will collaborate with internal partners across engineering, product, business, and marketing teams to help translate business needs into technical solutions. Your main areas of focus will be helping to centralize data repositories, contributing to self-serve analytics platforms, and supporting the development of automated reporting solutions.

We are an expanding team with numerous growth and learning opportunities. You will work closely with the engineering team and gain valuable experience under the guidance of senior engineers, including the company’s head of engineering (an ex-Google engineer), while helping to shape the team’s culture and technical direction.

Our Stack:

  • Airflow

  • Docker

  • dbt

  • BigQuery

  • Metabase
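
To give a concrete sense of how these tools typically fit together, below is a minimal, illustrative sketch of an Airflow DAG that orchestrates a dbt build against BigQuery. All identifiers here (the DAG name, task IDs, schedule, and the dbt project and profiles paths) are hypothetical examples for illustration only, not details of FGI’s actual pipelines.

  # Minimal, illustrative Airflow DAG: orchestrate a daily dbt build against BigQuery.
  # All names (dag_id, task_ids, dbt project/profiles paths, schedule) are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator

  with DAG(
      dag_id="daily_reporting",          # hypothetical DAG name
      start_date=datetime(2024, 1, 1),
      schedule_interval="@daily",        # run once per day
      catchup=False,
  ) as dag:
      # Run dbt models that materialize reporting tables in BigQuery.
      dbt_run = BashOperator(
          task_id="dbt_run",
          bash_command="dbt run --project-dir /opt/dbt/reporting --profiles-dir /opt/dbt",
      )

      # Validate the freshly built models with dbt tests.
      dbt_test = BashOperator(
          task_id="dbt_test",
          bash_command="dbt test --project-dir /opt/dbt/reporting --profiles-dir /opt/dbt",
      )

      dbt_run >> dbt_test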

Key Responsibilities

As a Junior Data Engineer, you will:

  • Assist in managing and automating the end-to-end process of organizational data collection and reporting.

  • Support the optimization and maintenance of existing data pipelines.

  • Work with your team to identify new data requirements, analysis strategies, and reporting solutions.

  • Contribute to the development and implementation of data models, data architecture, business intelligence, and data governance processes to ensure scalability and maintainability.

  • Help create and maintain technical systems documentation.

Job Qualifications

  • B.S. degree in Computer Science or a related technical field, or equivalent practical experience.

  • Open to fresh graduates or individuals with 1 to 2 years of professional experience in data modeling, building data pipelines, and reporting (internships and projects are considered).

  • Proficiency with SQL; basic knowledge of SQL performance tuning and process optimization.

  • Familiarity with at least one programming language (e.g., Python, Java, C++, C#).

  • Understanding of basic data concepts such as schema evolution, sharding, and latency.

Preferred, but not required:

  • Experience building and deploying ETL pipelines with frameworks such as Airflow, Flume, or Oozie.

  • Familiarity with AWS or GCP cloud data services (e.g., Amazon Redshift, Google BigQuery).

Submit Your Application
