Senior Data Engineer
Toronto, Ontario • Direct Hire • March 02, 2026 • 85581
Base Salary Range: $123,833 - $170,184
Job ID: TM5394
Division & Section: Technology Services, Office of the Chief Technology Officer
Job Type & Duration: Full-time, 1 Permanent Vacancy
Shift Information: Monday to Friday, 35 hours per week
Affiliation: Non-Union
Why Join the City of Toronto:
As a Senior Data Engineer at the City of Toronto, you will have the opportunity to work on cutting-edge data solutions that directly impact the lives of Toronto's residents. You'll be part of a team driving the city's digital transformation, working on projects that enhance city services and operations through innovative data utilization. You'll work in a collaborative environment that values your expertise and provides opportunities for professional growth. If you're passionate about leveraging data and AWS technologies to create meaningful change, we encourage you to apply and be part of our mission to build a smarter, more connected Toronto.
Major Responsibilities:
Reporting to the Manager, Data Integration & Access, the Senior Data Engineer will join our Enterprise Data Platform team as a vital partner in the design, development, and implementation of the platform. The ideal candidate will have a strong background in AWS technologies, data engineering, and modern data architecture:
AWS Expertise: Utilize a wide range of AWS services to build and maintain scalable, secure, and efficient data infrastructure. Key services include S3, Redshift, Kinesis, EMR, Glue, DataZone, Lake Formation, and CloudFormation.
Data Pipeline Development: Design, implement, and maintain robust ETL/ELT processes using tools such as AWS Glue, DBT (Data Build Tool), and Apache Spark.
Data Mesh Implementation: Contribute to the implementation of a data mesh architecture, enabling decentralized, domain-oriented data ownership and management.
Infrastructure as Code: Develop and maintain infrastructure as code using Terraform or AWS CloudFormation to automate and streamline the deployment of cloud resources.
Data Processing: Utilize Python and Apache Spark for large-scale data processing, transformation, and analysis.
Data Modeling: Design and implement efficient data models to support analytics, machine learning, and reporting needs.
Streaming Solutions: Develop and maintain real-time data streaming solutions using technologies such as AWS Kinesis, alongside batch pipelines.
Data Governance: Implement and adhere to data governance policies to ensure data quality, privacy, and compliance with regulations.
Platform Enhancement: Work with technologies such as Databricks and Snowflake to enhance the capabilities of the data platform.
Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide tailored solutions.
Documentation and Knowledge Sharing: Create and maintain comprehensive documentation for data processes, pipelines, and models. Share knowledge with team members and contribute to the team's overall growth.
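As a rough, illustrative sketch of the extract-transform-load pattern at the heart of the pipeline work described above (not City code; in practice this would run on AWS Glue, dbt, or Spark, and the record layout here is a hypothetical example):

```python
# Minimal, self-contained ETL sketch in plain Python.
# Extract: raw records as they might arrive from a source system.
raw_records = [
    {"id": "1", "ward": "Etobicoke", "opened": "2026-01-05", "status": "open"},
    {"id": "2", "ward": "Scarborough", "opened": "2026-01-06", "status": "closed"},
    {"id": "3", "ward": "Etobicoke", "opened": "2026-01-07", "status": "open"},
]

def transform(records):
    """Transform: filter to open requests and count them per ward."""
    counts = {}
    for r in records:
        if r["status"] == "open":
            counts[r["ward"]] = counts.get(r["ward"], 0) + 1
    return counts

def load(counts, target):
    """Load: write the aggregate into a target store (a dict stands in
    for a warehouse table here)."""
    target.update(counts)
    return target

warehouse = load(transform(raw_records), {})
print(warehouse)  # {'Etobicoke': 2}
```

The same extract/transform/load separation scales up directly: in a production pipeline the extract step reads from S3 or Kinesis, the transform runs as a Spark or dbt job, and the load targets Redshift or a lakehouse table.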
What you bring to the role:
- Post-secondary education in Computer Science, Data Science, Information Technology or a related discipline (or an equivalent combination of education and experience).
- Extensive experience in data engineering, with expertise in AWS technologies, particularly in data-related services (e.g. S3, Redshift, Kinesis, EMR, Glue, etc.).
- Experience in Python programming for big data processing frameworks such as Apache Spark.
- Experience with Infrastructure as Code/IaC (e.g. Terraform or AWS CloudFormation), and ETL/ELT processes and tools (e.g. AWS Glue and DBT).
- Knowledge of data modeling concepts and techniques.
- Knowledge of other cloud platforms (e.g. Azure, GCP, etc.) for multi-cloud strategies.
- Strong understanding of data governance principles and privacy regulations (e.g., GDPR, CCPA).
- Experience with data mesh architecture concepts and implementation, and with CI/CD practices and tools will be considered an asset.
- AWS certifications (e.g. AWS Certified Data Analytics – Specialty, AWS Certified Big Data Specialty) will be considered an asset.
- Understanding of machine learning workflows and MLOps practices will be considered an asset.
- Exceptional problem-solving, communication, and analytical skills, with the ability to explain complex technical concepts to non-technical stakeholders.
- Ability to work independently and as part of a team, with attention to detail and commitment to delivering high-quality work.
- Adaptability and strong time-management skills, with a willingness to learn new technologies and methodologies.
Equity, Diversity and Inclusion
The City is an equal opportunity employer, dedicated to creating a workplace culture of inclusiveness that reflects the diverse residents that we serve. Learn more about the City’s commitment to employment equity.
Accommodation
The City of Toronto is committed to creating an accessible and inclusive organization. We are committed to providing barrier-free and accessible employment practices in compliance with the Accessibility for Ontarians with Disabilities Act (AODA). Should you require Code-protected accommodation through any stage of the recruitment process, please make your needs known when contacted and we will work with you to meet them. Disability-related accommodation during the application process is available upon request. Learn more about the City’s Hiring Policies and Accommodation Process.