CBRE Investment Management

Database Engineer Principal

Richardson, TX, US


What you’ll do

  • Plan, analyze, develop, maintain, and enhance complex data systems.
  • Participate in the design, specification, implementation, and maintenance of data systems.
  • Design, code, test, and document software programs of moderate to high complexity.
  • Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and other ETL tools.
  • Participate in design reviews and technical briefings.
  • Integrate data from various sources, ensuring consistency, accuracy, and reliability.
  • Develop and manage ETL/ELT processes to support data warehousing and analytics.
  • Assist in preparing requirement specifications, analyzing data, designing, and developing data-driven applications.
  • Resolve complex software development issues in production environments.
  • Utilize Python for data manipulation, automation, and integration tasks.
  • Assemble large, complex data sets to meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements.
  • Build infrastructure for optimal data extraction, transformation, and loading.
  • Collaborate with cross-functional teams (Product, Data, Design, Frontend, Backend) to support data-related technical issues.
  • Write complex SQL queries, stored procedures, functions, and cursors in SQL Server and PostgreSQL.
  • Analyze long-running queries and develop performance optimization strategies.
  • Create and manage ETL processes using tools like SSIS or Informatica.
  • Perform DBA activities like system health maintenance, performance tuning, database access management, and deployment.
  • Utilize CI/CD tools for optimal productivity.
  • Collaborate with scrum teams in agile development methodologies.
  • Analyze, review, and alter programs to improve efficiency or adapt to new requirements.
  • Prepare documentation to describe program development, logic, coding, and corrections.
  • Collaborate with Data Scientists and analysts.


What you’ll need

  • Bachelor's degree in Information Systems, Mathematics, Computer Science, or a related field.
  • 8+ years of relevant work experience in data engineering.
  • Expertise in data extraction, transformation, loading (ETL), data analysis, data profiling, and data governance.
  • Proficiency in relational and transactional databases, data warehousing, and data lakehouses for structured and unstructured data.
  • Strong experience in designing and developing enterprise-scale data warehouse systems using cloud-native and cloud-agnostic solutions (Snowflake, Azure Synapse, AWS Redshift, etc.).
  • Proven ability to design and develop reusable and scalable data products with data quality scores and integrity checks.
  • Experience ingesting data in varied formats from disparate systems, including CRM and ERP platforms such as Salesforce, PeopleSoft, and SAP.
  • Deep understanding of data ingestion tools (SnapLogic, ADF, Snowpipe, Matillion, etc.) to build end-to-end complex data workflows.
  • Experience with cloud platforms (AWS, Azure) and technologies, Agile methodologies, and DevOps principles.
  • Proficiency in architecting cloud-native solutions across multiple B2B and B2C data domains.
  • Experience in designing and developing modern APIs for secure data sharing across internal and external applications.
  • Strong expertise in designing and developing comprehensive relational and dimensional data models and ER diagrams (ERwin, SQL Workbench).
  • Experience with data orchestration tools (Apache Airflow, Chronos).
  • Proficiency in designing and developing data transformation models in dbt.
  • Ability to analyze statistical information, identify patterns, and design conceptual and logical data models.
  • Experience with CI/CD pipelines (Jenkins, Azure DevOps).
  • Knowledge of Python for data manipulation and automation.
  • Understanding of data governance frameworks and best practices.
  • Familiarity with source code versioning tools (GitHub).
  • Excellent written and verbal communication skills, strong organizational, problem-solving, and analytical skills.
  • Proficiency in working with RESTful APIs, including HTTP methods, endpoints, and request-response cycles.
  • Ability to conduct data validation and reconciliation to ensure accuracy and consistency.
  • Experience in validating data pipelines to meet business requirements and performance standards.


Why CBRE?

At CBRE, we foster a supportive environment where integrity, service, and excellence shape our approach to every opportunity. We are guided by the needs of the cities we inhabit, the communities we build, and the world we live in.

  • Named Fortune’s Most Admired Real Estate Company for 14 consecutive years.
  • CBRE agreed to acquire J&J Worldwide Services, a leading provider of engineering services, base support operations, and facilities maintenance for the U.S. federal government.
  • Named a World’s Most Ethical Company by Ethisphere for 11th consecutive year.
  • Ranked #3 on Barron’s Most Sustainable Companies list.


#CBREIM
