Design, implement, and maintain scalable data pipelines and ETL processes to extract, transform, and load large datasets from multiple sources.
Build and maintain web scraping tools to gather structured and unstructured data from various websites.
Build and maintain automation tools to streamline operations and reduce manual effort.
Implement and optimize data models and schemas to support data analytics and reporting needs.
Work closely with project managers, data analysts, and other engineers to turn data into actionable insights that support sound business decisions and efficient solutions.
Follow best practices in data engineering, coding standards, and documentation guidelines.
Explore new technologies and contribute ideas to improve data pipelines.
Requirements
Must be a third- or final-year student pursuing a degree in Computer Science, Information Technology, Data Science, or a related field.