Location: Center 2 (19050), McLean, Virginia, United States of America
Senior Data Engineer - Principal Associate (Remote-Eligible)

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers, and disruptors who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies such as machine-learning-based data quality solutions.

Team Info:
As a Capital One Data Engineer, you'll have the opportunity to be at the forefront of driving a major transformation within Capital One. You'll drive the creation of the next-generation transformation engine within the Finance Technology Line of Business, allowing multiple other applications to source business-critical data of the highest quality while meeting strict Service Level Agreements.

What You'll Do:
- Collaborate with and across Agile teams and lines of business to design, develop, test, implement, and support technical solutions for data transformation using technologies such as Spark, Scala, and Python
- Build a machine-learning-based data quality solution for anomaly detection
- Work with a team of developers with deep experience in machine learning, data engineering, distributed microservices, and full stack systems
- Utilize programming languages such as Scala and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as Snowflake
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
- Collaborate with digital product managers and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
- Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance
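The anomaly-detection responsibility above can be illustrated with a minimal sketch: a z-score check that flags values deviating sharply from a column's historical distribution. This is plain Python for brevity; a production data-quality pipeline would compute the same statistics over a Spark DataFrame. The function name, sample data, and threshold here are all hypothetical.

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Flag values whose z-score exceeds the threshold.

    Returns a list of (index, value) pairs for anomalous entries.
    A real pipeline would compute mean/stddev over a Spark
    DataFrame column rather than a Python list.
    """
    if len(values) < 2:
        return []
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # constant column: nothing to flag
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]

# Example: a spike of 500 in otherwise steady daily record counts.
# A looser threshold is used because the outlier itself inflates
# the sample standard deviation on a small series.
daily_counts = [100, 102, 98, 101, 99, 500, 100, 97]
print(zscore_anomalies(daily_counts, threshold=2.0))
```

A z-score rule is only a baseline; the ML-based approach the role describes would typically learn expected distributions per column and season rather than fix a single threshold.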
Basic Qualifications:
- Bachelor's Degree
- At least 4 years of experience in application development (Internship experience does not apply)
- At least 1 year of experience in big data technologies
Preferred Qualifications:
- 5+ years of experience in application development including Python, SQL, Scala, or Java
- 2+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
- 3+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, or MySQL)
- 2+ years of data warehousing experience (Redshift or Snowflake)
- 3+ years of experience with UNIX/Linux including basic commands and shell scripting
- 2+ years of experience with Agile engineering practices