DataOps Engineer (Remote)

Top Requirements:
- USC
- Must be okay with a 6-8 week clearance timeline
- 4+ years of ETL experience
- AWS experience (flexible here if they have a lot of ETL experience)
- Postgres, data warehousing, and Redshift experience

Job Description
Apex Systems is seeking a DataOps Engineer in support of our new VIPER program, delivering innovative software solutions for our customer, the US Citizenship and Immigration Services (USCIS). You'll use the latest AWS technologies to perform data mining, validate data sets, and analyze the results.

What You Will Do:
- Use data analytic and statistical modeling techniques for collecting, reviewing, interpreting, evaluating, and integrating data from multiple sources to identify patterns, trends, or relationships in data sets, or develop mathematical models to address various USCIS problems.
- Prioritize initiatives based on business needs and requirements while managing competing resources and priorities.
- Write code to connect to, extract, and import data via REST data services.
- Evaluate and integrate data models while catering to the requests of the Analytics team.
- Work with ETL pipelines and cloud infrastructure platforms such as AWS S3 (the key experience required for this project).
- Monitor and troubleshoot job failures during on-call rotations, demonstrating skilled knowledge of EKS, Splunk, and New Relic.
- Other duties as assigned.

What We Need:
- Bachelor's degree in Computer Science, Data Science, or a related field. In lieu of a degree, a combination of training, certifications, and years of experience will be considered.
- Minimum of five (5) years of experience in data analysis, data engineering, or business intelligence.
- Minimum of five (5) years of practical and functional experience in data extraction, analysis, and/or reporting, typically achieved through work as a business, data, operations, compliance, or risk analyst; internal auditor; or a role that includes significant aspects of the listed essential job functions.
- Strong business acumen and strategic thinking, with an understanding of data implications across the enterprise ecosystem.
- Experience managing data through its lifecycle (especially sourcing, validating, and deploying).
- Demonstrated ability to perform complex queries, assess data quality, and develop data mappings.
- Experience working with ETL pipelines and cloud infrastructure platforms such as AWS S3 (the key experience required for this project).
- Hands-on familiarity with deployment tools such as Jenkins, Docker, and Terraform.
- Experience with multiple data warehousing and analytics development projects.
- Experience communicating with data scientists and understanding the business need for data.
- Experience working in an Agile DevOps environment.

Required Technical Expertise:
AWS S3, PostgreSQL, Amazon Redshift, Databricks, SAS, DMS, dbt, EKS, Splunk, New Relic

Certifications:
AWS certification is a plus. Knowledge of the dbt tool is a plus.

Clearance Level:
Position of Public Trust; DHS EOD (Enter on Duty) suitability.