Position Purpose:
Develops and operationalizes data pipelines to make data available for consumption (reports and advanced analytics), including data ingestion, data transformation, data validation / quality, data pipeline optimization, and orchestration. Engages with the DevSecOps Engineer during continuous integration and continuous deployment.
Education/Experience:
Requires a Bachelor's degree in a quantitative or business field (e.g., statistics, mathematics, engineering, computer science) and 2 – 4 years of related experience.
Or equivalent experience demonstrating the applicable knowledge, duties, scope, and skills reflective of the level of this position.
Technical Skills:
Experience with Big Data; Data Processing
Experience diagnosing system issues, performing data validation, and providing quality assurance testing
Experience with Data Manipulation; Data Mining
Experience working in a production cloud infrastructure
Experience with one or more of the following: C# (Programming Language); Java (Programming Language)
Knowledge of Microsoft SQL Server; SQL (Programming Language)
Soft Skills:
Intermediate - Seeks to acquire knowledge in area of specialty
Intermediate - Ability to identify basic problems and procedural irregularities, collect data, establish facts, and draw valid conclusions
Intermediate - Ability to work independently
Responsibilities:
Designs and implements standardized data management procedures around data staging, data ingestion, data preparation, data provisioning, and data destruction (scripts, programs, automation, or automation-assisted processes)
Designs, develops, implements, tests, documents, and operates large-scale, high-volume, high-performance data structures for business intelligence analytics
Designs, develops, and maintains real-time processing applications and real-time data pipelines
Ensures quality of technical solutions as data moves across Centene’s environments
Provides insight into the changing data environment, data processing, data storage, and utilization requirements for the company and offers suggestions for solutions
Develops, constructs, tests, and maintains architectures using programming language and tools
Identifies ways to improve data reliability, efficiency, and quality; uses data to discover tasks that can be automated