Position: ETL Developer
Interview: In-Person
Client: Florida Department of Transportation
Duration: 12+ Months
Location: Tallahassee, FL
Primary Job Duties/Tasks:
1. Analyze the current data environment, including data sources, pipelines, and legacy structures, to determine the required transformations and the optimal migration strategy into Snowflake.
2. Collaborate with stakeholders and data architects to design and implement scalable, secure, and cost-effective data architecture in Snowflake.
3. Re-engineer legacy reporting logic (e.g., WebFOCUS, Mainframe FOCUS, and T-SQL) by translating it into Snowflake SQL and optimizing its performance.
4. Develop and automate ELT/ETL data pipelines using Snowflake's native features and tools such as Snowpipe, Streams, Tasks, and Informatica, along with external orchestration tools (e.g., dbt, Airflow).
5. Partner with analysts and business users to build efficient, reusable data models and secure views within Snowflake that support downstream reporting (e.g., Power BI, Tableau, or Looker).
6. Optimize query performance and data governance by implementing Snowflake best practices for security, access control, caching, clustering, and cost monitoring.
7. Support training, documentation, and knowledge transfer to internal teams, ensuring smooth adoption and use of Snowflake-based solutions.
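The pipeline pattern in duty 4 can be sketched in Snowflake SQL. This is a minimal illustration only, not part of the posting; every object name (`raw_stage`, `trips_raw`, `trips_clean`, `etl_wh`, the S3 bucket) is a hypothetical placeholder, and a real stage would also need a storage integration or credentials.

```sql
-- Hypothetical continuous-ingestion pipeline: Snowpipe loads landed files,
-- a Stream tracks the new rows, and a scheduled Task applies the transform.
CREATE STAGE raw_stage
  URL = 's3://example-bucket/landing/';   -- assumes an existing S3 storage integration

CREATE TABLE trips_raw (
  payload   VARIANT,
  loaded_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP()
);

CREATE PIPE trips_pipe AUTO_INGEST = TRUE AS
  COPY INTO trips_raw (payload)
  FROM @raw_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Change capture on the raw table: the stream exposes only rows
-- that arrived since the last consuming DML statement.
CREATE STREAM trips_stream ON TABLE trips_raw;

CREATE TASK transform_trips
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('trips_stream')
AS
  INSERT INTO trips_clean
  SELECT payload:trip_id::NUMBER,
         payload:miles::FLOAT,
         loaded_at
  FROM trips_stream;

ALTER TASK transform_trips RESUME;   -- tasks are created suspended
```

The same Stream/Task chain can be layered (raw → staged → mart), or replaced by dbt or Airflow orchestration as the posting's duty 4 also allows.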
Job Specific Knowledge, Skills, and Abilities:
1. Expert-level SQL programming is REQUIRED for this position.
2. Proven experience with Snowflake platform architecture and data warehousing concepts.
3. Expertise in building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares.
4. Strong knowledge of ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
5. Solid understanding of data governance, security roles, masking policies, and RBAC within Snowflake.
6. Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external tables in Snowflake.
7. Familiarity with dimensional modeling (star/snowflake schemas), OLAP concepts, and reporting layers for BI tools.
8. Strong communication and analytical skills for working with cross-functional teams and converting data requirements into technical solutions.
9. Strong understanding of current data governance concepts and best practices.
10. Knowledge of data migration best practices for moving external data sources and legacy systems (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
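The masking-policy and RBAC knowledge asked for in item 5 looks roughly like the following in Snowflake SQL. The policy, role, database, and column names here are hypothetical examples, not from the posting.

```sql
-- Hypothetical column masking: one named role sees full SSNs,
-- every other role sees a redacted value.
CREATE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
    ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

ALTER TABLE employees MODIFY COLUMN ssn
  SET MASKING POLICY mask_ssn;

-- Role-based access control: privileges flow through roles,
-- never directly to users.
CREATE ROLE reporting_reader;
GRANT USAGE  ON DATABASE fdot_dw            TO ROLE reporting_reader;
GRANT USAGE  ON SCHEMA   fdot_dw.marts      TO ROLE reporting_reader;
GRANT SELECT ON ALL VIEWS IN SCHEMA fdot_dw.marts TO ROLE reporting_reader;
```

Secure views (item 3) layer on top of this: the reporting role is granted only the curated views, so masking and row filtering are enforced before data reaches BI tools.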
Preferred KSAs:
11.Experience with data visualization tools (Power BI, Tableau, Looker) and building BI semantic models using Snowflake as a backend.
12.Experience working with financial, ERP, or general ledger data in a reporting or analytics
capacity.
13.Exposure to mainframe systems, legacy flat files, and their integration with cloud-based platforms.
14.Familiarity with Agile/SCRUM frameworks and experience working in iterative development cycles.
15.Experience with Oracle Data Warehouse.
16.Understanding of DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or GitHub Actions).
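The CI/CD practice named in item 16 commonly means running dbt tests on every pull request. The workflow below is a hedged sketch, not part of the posting: the repository layout, profile directory, and secret name are all assumptions.

```yaml
# Hypothetical GitHub Actions workflow: build and test dbt models on each PR.
# Profile location and SNOWFLAKE_PASSWORD secret are placeholders.
name: dbt-ci
on: pull_request
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      - run: dbt build --profiles-dir ./ci   # compiles models and runs their tests against Snowflake
        env:
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```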