Job Summary:
We are seeking a strategic and technically proficient Director of Data Architecture and Engineering to lead the design, integration, and optimization of our modern data platform. The ideal candidate will have 8+ years of experience in data engineering, data architecture, and data management, with a proven track record of delivering scalable and high-quality data solutions that support the various data needs of the organization.
Key Responsibilities:
- Lead the design and development of our modern data platform, building innovative and scalable data solutions aligned with enterprise standards and the data strategy.
- Architect and implement scalable data pipelines for integrating data from various sources including databases, APIs, streaming data, and third-party platforms.
- Establish data integration patterns and frameworks, ensuring seamless data flow and interoperability across systems.
- Design and implement data models, data lakes, and data warehouses to support analytics and reporting.
- Collaborate with architects, engineering, product, and business leaders to align platform development with business objectives.
- Drive project execution throughout the entire software development lifecycle, mitigating risks and ensuring timely delivery.
- Ensure the smooth operation of the data platform and data pipelines, and help triage and resolve data issues.
- Implement data standards, data quality frameworks, monitoring systems, and data governance best practices.
- Participate in new technology evaluations, identify alternative or new technologies, and assist in defining new enterprise standards.
- Provide leadership and mentorship to a team of data engineers and architects, fostering a culture of innovation and excellence.
- Stay informed on emerging data technologies, tools, and integration frameworks to enhance data architecture and engineering practices.
Qualifications and Skills:
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 8+ years of experience in software engineering, with at least 5 years in data architecture, data engineering, or data integration roles.
- Experience in data modeling and data warehousing, with a good understanding of various data models (e.g., relational/ODS, dimensional) and related concepts.
- Strong knowledge of data architecture, data integration patterns, ETL frameworks, and data processing tools.
- Proficiency in Python, SQL, modern data platforms (e.g., Databricks, Snowflake), and relational databases (e.g., PostgreSQL, SQL Server, Oracle).
- Proven experience building batch and real-time data pipelines for diverse data sources (structured, unstructured, semi-structured, streaming).
- Extensive experience with data pipeline tools (e.g., Azure Data Factory, Spark, Glue).
- Good understanding of big data and lakehouse table and file formats (e.g., Iceberg, Delta, Parquet, ORC).
- Strong understanding of cloud-based data storage and databases (e.g., AWS S3, Azure Data Lake, RDS, DynamoDB).
- Knowledge of building RESTful APIs using serverless technologies such as AWS Lambda or Azure Functions.
- Knowledge of data streaming technologies (e.g., CDC, Kafka, Azure Event/IoT Hub, Kinesis).
- Experience with code management and CI/CD tools across the SDLC (e.g., GitHub, GitLab, SonarQube, CodeBuild).
- Knowledge of standard IT security practices such as identity and access management, SSO, data protection, encryption, and certificate and key management.
- Knowledge of data governance, data security, and regulatory compliance.
- Excellent problem-solving and analytical skills, with the ability to design scalable data solutions.
- Excellent communication, technical writing and presentation skills.
- Demonstrated leadership experience managing cross-functional data architecture/engineering teams.
- Demonstrated ability to adapt to new technologies and learn quickly.
- Flexibility to work a non-standard schedule, including on-call hours, as needed.
Preferred Qualifications:
- Certification in modern cloud platforms, data engineering, or data architecture.
- Proficiency in data workflow management and orchestration tools (e.g., Airflow, dbt, Dagster).
The base salary for this role can range from $120,000 to $145,000 based on a full-time work schedule. An individual's ultimate compensation will vary depending on job-related skills and experience, geographic location, alignment with market data, and equity among other team members with comparable experience.