Create cloud-based data processing systems that convert raw data from diverse sources (such as APIs, FTP, and SFTP) into well-structured formats.
Design and deploy efficient data workflows (both batch and real-time) with integrated monitoring and alert systems, ensuring smooth operation for data analysts and business users.
Build and maintain large, complex datasets that align with both business objectives and technical standards.
Support data governance activities by organizing and documenting metadata.
Conduct comprehensive testing (functional and non-functional) on datasets to ensure reliability and accuracy.
Design and model databases, creating essential database components (views, tables, stored procedures, scripts).
Optimize queries for high performance and scalability.
Contribute to troubleshooting, monitoring, and end-user support.
Collaborate with cross-functional teams to gather requirements and deliver solutions that meet business needs.
Develop code that adheres to established architecture standards, with a focus on the scalability and performance of data products.
Stay informed about the latest trends and advancements in data engineering to continuously improve practices.
Profile:
A degree in Engineering or a related field.
Proven expertise in developing data pipelines using cloud platforms like AWS (S3, ECS, Lambda, RDS, SNS/SQS, IAM, CloudWatch).
In-depth knowledge of database programming, particularly SQL, and experience with NoSQL or graph databases.
Strong ability to write advanced SQL queries; knowledge of dbt Core is a plus.
Proficiency in Python and data-focused libraries such as Pandas and Boto3.
Solid understanding of data platform architecture and data governance practices.
Expertise in Snowflake development is an advantage.
Familiarity with CI/CD practices.
Experience with orchestration tools (e.g., Step Functions, Airflow, Prefect).
Proficiency with Git and version control platforms such as GitHub or GitLab.
Knowledge of data visualization tools such as Tableau is beneficial.
Familiarity with Infrastructure as Code tools (e.g., CloudFormation, Terraform).
AWS certifications (e.g., Solutions Architect) are desirable.
Agile mindset: Collaboration, continuous learning, and mentorship of junior team members.
Proven ability to meet deadlines while exhibiting strong analytical and problem-solving skills.
Attention to detail, with a focus on quality and system performance.
Experience working in Agile environments, particularly using tools like Jira.
Strong communication skills.
Fluency in English is essential; additional proficiency in Portuguese and/or French is a bonus.
A team-oriented, proactive approach to challenges and a commitment to continuous improvement.