**************************** W2 ONLY - NO C2C *****************************
Location: Universal City, CA
Type: 6 months contract on W2
Qualifications:
- Minimum 7 years of related experience in data engineering
- Strong understanding of data warehousing concepts
- Proficient in Python for building UDFs and pre-processing scripts
- Proficient in sourcing data from APIs and cloud storage systems
- Proficient in SQL, with an analytical thought process
- Experience with Airflow orchestration
- Must have experience with at least one major cloud platform; AWS preferred
- Experience with CI/CD tools in a Python tech stack
- Experience with the Snowflake data warehouse is a nice-to-have
- Competent working in secured internal network environments
- Experience working in story and task-tracking tools for agile workflows
- Motivated and self-starting: able to think critically about problems, distinguish user preferences from hard requirements, and effectively use online and onsite resources to find an appropriate solution with little intervention
- Passionate about writing clear, maintainable code that will be used and modified by others, and able to use and modify other developers' work rather than recreate it
- Bachelor's Degree in related field
Responsibilities:
- Build complex data engineering pipelines using Python and Airflow to hydrate data marts in Snowflake and the data lake in the AWS cloud.
- Think quickly when troubleshooting production-scale data pipelines and apply analytical thinking to identify data issues.
- Integrate code with defined CI/CD framework and AWS services required for building secure data pipelines.
- Work in an agile software development team to complete backlog items, working in conjunction with other developers within the organization and other data engineers & architects.
- Test and create automated tests for your code, ensuring every function, service, and object is compatible with your team's work and with the many systems within the client's system portfolio.
- Work alongside other senior data engineers to help formulate best practices and set up toolsets and procedures for the team, leveraging internally available tools and communicating with other internal development and product teams.
- Stay up-to-date on new development and platform technologies, make recommendations for the right tool for the job, and take the lead in setting up those tools and training other developers on them.
- Communicate with business partners across the client organization to understand user needs, and implement those ideas in code.
- Create documentation for developers as well as business users to help them understand your products.
- Maintain cloud-based platforms and environments of supported applications, troubleshooting and patching functional issues and data issues in lower or production environments when necessary.
- Perform other duties as assigned.
NOTES FROM HIRING MANAGER
Will this position be remote or onsite 100%/Hybrid?
Is there any chance for an extension?
- Likely to extend longer and conversion opportunity
Can you tell me what your interview process would look like?
- The prescreening questionnaire will be reviewed along with resumes to be considered for interviews (please have the candidate complete the questionnaire before submitting it along with the resume). First round is with a data engineer and the final round is with the hiring manager.
Day to day duties?
Of the items listed on the job description, can you tell me the top 3 skills for the role?
- ETL, strong SQL, and a combination of Python and cloud experience