Educational & Experience Requirements:
• Minimum of a Bachelor's degree in Computer Science or a related discipline
• 8+ years of experience in software development, including data architecture, design, development, migration, and support

Required Skills:
• Experience in the analysis, design, development, testing, and maintenance of large data warehouse systems with heterogeneous source systems such as SAP, SQL Server, flat files, and legacy systems.
• Experience with AWS and GCP to design, debug, and develop cloud data pipelines. Tools of interest include:
  ○ Fargate
  ○ SageMaker
  ○ Google BigQuery
  ○ Airflow
  ○ Kafka
  ○ Infrastructure as Code
• Experience writing and optimizing SQL queries against large volumes of data.
• Experience designing and developing ETL and ELT pipelines with both relational and non-relational data.
• Experience in an agile environment with DevOps tools such as GitHub Actions, Jenkins, or CircleCI.
• Python programming applied to data engineering, including experience with pandas, NumPy, and asyncio.
• Unix shell scripting for job automation.
• Logical and physical data modeling, schema design, and denormalization techniques for large platforms such as Teradata, Informatica, and SQL Server, using query tools such as SQL Developer and/or SQL Assistant.
• Excellent oral and written communication and interpersonal skills; strong at documentation, including functional specs, procedures, and status updates.
• Ability to work independently with minimal supervision on both small and large initiatives.

Preferred Experience or Skills:
• Experience developing visualizations in a platform such as Tableau or Spotfire.