Role: Cloud Engineer
REMOTE
Design, develop, and optimize cloud-based solutions, ensuring seamless operations for the Enterprise Data Warehouse team. This is a contract position; the successful candidate will be expected to bring knowledge of and experience with Azure Data Lake Storage Gen2, Azure Data Factory (ADF), Azure Synapse, and Snowflake Data Cloud, and will apply sound design and development fundamentals to these cloud platforms, providing best practices and expertise for seamless data integration.
Essential Functions & Accountabilities
- Take direction from Manager/Architect/Senior Programmer for ADF, Azure Synapse, and Snowflake on daily tasks and activities.
- Develop and maintain data pipelines, data storage solutions, data processing and data integration using ADF and Snowflake.
- Test new or modified programs to ensure that they conform to design. Develop test data for use in program testing.
- Write documentation of system changes in conformance with department standards.
- Write any necessary user documentation (report descriptions, user procedures, operations procedures).
- Provide timely and concise summaries of project status to manager.
- Work closely with business data analysts and data architects to understand requirements and create effective data workflows.
Knowledge, Skills, And Abilities
- Proficiency in designing, building, and testing solutions on Azure.
- Experience developing data pipelines using Azure Data Factory, applying ETL techniques to load data from various sources into a data warehouse or data lake.
- Experience developing Microsoft SQL Server Integration Services (SSIS) is a plus
- Demonstrated experience using the different ADF components (Pipelines, Data Flows, Triggers, Linked Services, etc.).
- Familiarity with project deployment model, use of project and package parameters, and Integration Services Catalog (including Environments) preferred.
- Proficient in using Snowflake Data Cloud, including creating databases, schemas, tables, and views.
- Experience developing and optimizing ETL processes for loading data into Snowflake.
- Knowledge of the software development life cycle and software quality assurance best practices and methodologies.
- Proficient in SQL query writing and editing.
- Ability to develop complex ETL routines to load a dimensional star schema, with a strong focus on data quality, error handling, and logging processes.
- Ability to assess requirements for completeness and accuracy and determine actionable steps for the ETL team.
- Identify necessary changes to current business and data integration processes.
- Communicate risks and ensure understanding.
Experience
- 7 years of IT development experience, with advanced experience working on Azure Data Factory and Azure Synapse; Snowflake experience is a plus.
- Advanced experience developing data pipelines using Azure Data Factory and Snowflake
- Experience using ADF and Snowflake components to move data using different ETL techniques.
- Experience with Azure, Azure DevOps, Azure Data Lake, Azure SQL Server
- Advanced experience with T-SQL, data warehousing design, BI concepts, and best practices
- Experience optimizing data loading, querying, and performance within Synapse workspaces and Snowflake.
- Experience with ETL processes using Snowflake's built-in features.
- Bachelor's Degree from an accredited college or university or equivalent practical and relevant experience.
- Clear verbal and written communication skills.
- Previous experience in the health industry, SSRS, or Power BI is a plus.
Requirement: Bachelor’s degree in Computer Science or related field or equivalent combination of industry-related professional experience and education.