Velodata Global Pvt Ltd

Senior Data Engineer (Python Data Engineer + Python Visualization)

Kochi, Kerala, India

Summary

📢 Hiring for Our Client: Senior Python Data Engineer


📍 Kochi / Trivandrum / Remote (South India preferred)

🕒 Immediate Joiners Preferred


Looking for an experienced Python Data Engineer (7+ yrs) with strong skills in:

🔹 ETL, Pandas, NumPy, SQLAlchemy

🔹 Data Viz: Plotly, Dash, Seaborn, Matplotlib

🔹 Databases: PostgreSQL / MySQL / MS SQL Server

🔹 Bonus: Docker, Airflow, Cloud (AWS/GCP/Azure)


  • Job Description
  • Experience: 7+ years
  • Mandates: Python - Pandas, NumPy, SQLAlchemy, Matplotlib, Plotly, Seaborn, Dash, PostgreSQL / MS SQL Server / MySQL

 

Job Purpose


We are looking for a highly skilled Senior Python Data Engineer with excellent communication skills to join our Data & AI team. The ideal candidate will have strong experience in designing and building scalable data pipelines and in developing insightful data visualizations using Python-based libraries. This role requires a blend of backend data engineering expertise and frontend data storytelling skills.

 

Job Description / Duties & Responsibilities

• Design, develop, and optimize scalable data pipelines and workflows using Python.

• Work with large, complex datasets to extract, transform, and load (ETL) data from various sources.

• Build reusable, modular data engineering components and frameworks.

• Develop custom data visualizations using Python libraries (e.g., Matplotlib, Plotly, Seaborn, Dash).

• Collaborate with data analysts and business stakeholders to understand requirements and deliver data-driven insights.

• Optimize performance of data processing and visualization solutions.

• Maintain documentation for data workflows and visualization dashboards.

 

Job Specification / Skills and Competencies

• 6+ years of hands-on experience in Python programming.

• Must have excellent communication skills.

• Strong experience in data engineering: ETL development, data wrangling, and data quality management.

• Solid understanding of data structures, algorithms, and relational as well as non-relational databases (e.g., PostgreSQL, MongoDB).

• Proficient with the Python data stack: Pandas, NumPy, SQLAlchemy, etc.

• Hands-on experience with data visualization using tools such as Matplotlib, Seaborn, Plotly, Dash, or Streamlit.

• Familiarity with data versioning, containerization (Docker), and cloud platforms (AWS/GCP/Azure) is a plus.

• Experience with workflow orchestration tools (e.g., Airflow, Prefect) is desirable.

• Strong problem-solving and communication skills.

• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

• Experience working in cross-functional agile teams.

• Exposure to machine learning or AI/LLM integration is a plus.




📩 Interested candidates can apply by sending their updated resume to:

👉 [email protected]

Let’s connect if you’re ready to take your data engineering career to the next level!
