
Teams Plus

Data Engineer

Bogotá, Capital District, Colombia

5 days ago


Role Overview


We're looking for a Data Engineer with experience building scalable data pipelines and modeling workflows to help us manage the ingestion and transformation of data from various source systems into Snowflake. The ideal candidate has a solid foundation in SQL and data modeling, hands-on experience with ELT orchestration tools, and is comfortable working with cloud infrastructure and modern data-stack technologies.


Responsibilities


  • Design, develop, and maintain robust ELT pipelines to ingest data from multiple systems into Snowflake.
  • Model raw data into clean, reliable, and analytics-ready datasets for internal teams and stakeholders.
  • Implement and maintain data quality checks, monitoring, and alerting to ensure pipeline reliability.
  • Collaborate with software engineers, analysts, and business users to understand data needs and provide scalable solutions.
  • Write and optimize complex SQL queries in Snowflake for analytics and reporting use cases.
  • Manage and document schema changes, data lineage, and data contracts across systems.
  • Utilize tools such as dbt, Airflow, or similar to orchestrate and version data workflows.
  • Support infrastructure-as-code practices using tools like Terraform or similar where relevant.
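To give a flavor of the data-quality and alerting work described above, here is a minimal sketch in Python. The function names, field names, and thresholds are illustrative assumptions, not part of the role description:

```python
# Minimal sketch of pipeline data-quality checks, assuming records
# arrive as a list of dicts after extraction. Names and thresholds
# are illustrative, not from the posting.

def check_not_null(records, field):
    """Return rows where a required field is missing or None."""
    return [r for r in records if r.get(field) is None]

def check_row_count(records, minimum):
    """Flag loads that are suspiciously small."""
    return len(records) >= minimum

def run_quality_checks(records):
    """Collect failure messages so a scheduler can alert on them."""
    failures = []
    if not check_row_count(records, minimum=1):
        failures.append("row count below threshold")
    bad = check_not_null(records, "id")
    if bad:
        failures.append(f"{len(bad)} rows missing 'id'")
    return failures
```

In practice, checks like these would typically run as dbt tests or as a task in an Airflow DAG after each load, with failures routed to monitoring/alerting.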


Required Qualifications

  • 3+ years of experience in data engineering or backend engineering with a focus on data workflows.
  • Strong SQL skills, particularly in cloud data warehouses like Snowflake or BigQuery.
  • Hands-on experience designing and maintaining ELT pipelines (dbt, Airflow, custom scripts).
  • Familiarity with modern data modeling best practices (e.g., dimensional modeling, star schemas).
  • Proficiency in a scripting language such as Python for data transformations and automation.
  • Experience working with cloud environments (Azure preferred; AWS or GCP acceptable).
  • Understanding of data governance, privacy, and access control principles.
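As a rough illustration of the Python-for-transformations skill listed above, the sketch below normalizes raw source rows before loading. The field names and date format are assumptions made for the example:

```python
from datetime import datetime

# Illustrative sketch of a small transformation step: trimming strings,
# lowercasing emails, and parsing ISO dates. Field names and the date
# format are assumptions for the example, not from the posting.

def normalize(row):
    """Return a cleaned copy of one raw source row."""
    return {
        "name": row["name"].strip(),
        "email": row["email"].strip().lower(),
        "signup_date": datetime.strptime(row["signup_date"], "%Y-%m-%d").date(),
    }
```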


Nice to Have

  • Experience integrating data from enterprise systems like Salesforce, HubSpot, or NetSuite.
  • Familiarity with real-time data processing or event-based architectures.
  • Exposure to CI/CD tools for data infrastructure (e.g., GitHub Actions, Azure DevOps).
  • Experience with monitoring/logging tools for data pipelines.

