Baidu, Inc.

Big Data Intern

Singapore

28 days ago

Summary

Job Responsibilities:

- Assist in the construction and optimization of the company’s big data warehouse system, including batch and streaming data pipeline development.

- Support the team in understanding business systems and project requirements, and help implement data solutions that align with business goals.

- Participate in data integration and ETL development under the guidance of senior engineers.

- Keep up with the latest big data technologies and assist in improving the performance and stability of the big data platform.


Job Requirements:

- Currently pursuing, or recently graduated with, a Bachelor’s degree or above in Computer Science, Data Engineering, Machine Learning, or a related field.

- Strong interest in data and business insights, with basic data analysis skills.

- Familiar with SQL and at least one programming language such as Python, Java, or Scala; experience with Shell scripting is a plus.

- Understanding of basic ETL concepts and data warehouse design principles.

- Exposure to big data tools such as Hadoop, Hive, Spark, Kafka, Flink, or similar frameworks is a bonus.

- Curious, proactive, and eager to grow in the big data field.
