Job description
Classification: Contract-To-Hire
Contract Length: 12 months
Job ID: 15750002
At CereCore, our heart for healthcare is interconnected with our knowledge of technical solutions, creating a vital link that ultimately drives the delivery of high-quality care. CereCore is a wholly owned subsidiary of Hospital Corporation of America (HCA) Healthcare.
CereCore is seeking a Cloud Data Engineer - Consultant to join our team.
Summary:
Data Engineers within HCA’s Information and Analytics organization are responsible for defining and implementing data management practices across the enterprise. This contract position will focus primarily on enterprise data management and the migration of data to the cloud. The role requires working closely with different data teams and calls for self-starters who are proficient problem solvers, capable of bringing clarity to complex situations.
Data Engineers are expected to source and incorporate new data sources into the Enterprise Data Ecosystem. Responsibilities include writing, testing, and reviewing ETL pipelines that support data management practices across the enterprise.
Responsibilities:
- Implement data migration pipelines from Teradata to the cloud.
- Implement enterprise data management practices, standards, and frameworks for data integration.
- Develop, manage, and own the full data lifecycle, from raw data acquisition through transformation to end-user consumption.
- Analyze requirements, design data pipelines, and integrate those solutions into customer environments.
- Apply a solid understanding of fundamental cloud computing concepts.
- Translate business requirements into technical design specifications.
- Collaborate closely with team members to execute development initiatives using Agile practices and principles.
- Maintain a holistic view of information assets by creating and maintaining artifacts that illustrate how information is stored, processed, and accessed.
- Provide guidance on technology choices and design considerations for migrating data to the cloud.
- Build consumable data lakes, analytics applications, and tools.
- Design the cloud environment from a comprehensive perspective, ensuring that it satisfies all of the company’s needs.
- Perform deployment, maintenance, monitoring, and management activities within the established cloud framework.
- Work closely with individuals across the technology organization to promote awareness of the data architecture and ensure that enterprise assets and competencies are leveraged.
Requirements:
- Cloud data engineering experience; GCP preferred.
- Extensive experience with ETL and big data tools such as Spark, Kafka, and Hadoop.
- Teradata ETL experience using BTEQ and SQL scripts.
- Extensive experience with relational database management systems; Teradata, Oracle, or SQL Server preferred.
- Knowledge of ETL tools such as StreamSets, Cloud Dataflow, and Connect ETL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience in the RDBMSs listed above is ideal.
- Scripting experience with Unix/Linux.
- Experience with Git and GitHub version control.
- Experience with relational databases such as Teradata, and with public cloud technologies such as GCP BigQuery, GCP Data Catalog, and Azure Databricks, preferred.
- Experience with GCP BigQuery is preferred.
- Ability to troubleshoot, maintain, reverse engineer and optimize existing ETL pipelines.
- Experience with Cloud Dataflow, Airflow, Cloud Composer, Cloud Data Fusion, Data Catalog, Kafka, Dataproc, GitHub, StreamSets, or managing streaming data is strongly preferred.
- Familiarity with NoSQL databases (HBase, Cassandra, MongoDB), in-memory and columnar stores, and other emerging data technologies.
- Ability to analyze and interpret complex data, and offer solutions to complex clinical problems.
- Ability to work independently on assigned tasks.
- Strong written and verbal communication skills, including the ability to explain complex technical issues in a way that non-technical audiences can understand.
- Excellent problem-solving and critical thinking skills.
- Knowledge of IT governance and operations.
Job Types: Full-time, Contract
Pay: $135,000.00 - $155,000.00 per year
Benefits:
- Flexible schedule
- Health insurance
Schedule:
- 8-hour shift
- Holidays
- Weekend availability
Experience:
- SQL: 6 years (Required)
- Big Data tools: 6 years (Required)
- GCP: 6 years (Required)
- ETL: 7 years (Required)
Work Location: Remote