Seize your opportunity to make a personal impact supporting the Case Management Modernization (CMM) Program. The CMM program is an initiative to help the Administrative Office of the US Courts (AO) develop a modern, cloud-based case management solution serving all 204+ federal courts across the United States.
GDIT is your place to make meaningful contributions to challenging projects and grow a rewarding career. The Senior Data Engineer/Developer will work as part of the CMM Enterprise Data Warehouse (EDW) team to deploy a secure, cloud-native EDW platform and support statutory and operational reporting, data cataloging, and other analytical objectives.
The Senior Data Engineer/Developer is responsible for designing, building, testing, and maintaining the scalable data engineering components and platform services that power the EDW. This role delivers high-quality, secure, and performant data pipelines, transformations, and integrations aligned with federal standards and the EDW SOW objectives for modernization, analytics, and operational excellence.
RESPONSIBILITIES:
Develop a solution to integrate case management business data from the case management system into the Snowflake EDW on a near real-time basis
Produce and maintain the dimensional data models that form the basis for the data marts and satisfy the requirements of the data products
Develop and maintain downstream data marts and their corresponding ETL/ELT processes (on a near real-time basis or via overnight batch, depending on individual use case requirements) and their associated reports and dashboards
Develop and maintain comprehensive documentation, including source-to-target mappings (STTMs), data models (logical and physical), design documents, testing documents, and other related documents
Develop, deploy, and maintain the data presentation layer and data reports for each of the data products using SAP BusinessObjects, Tableau, and/or other presentation tool(s) selected by the customer
Design and develop data ingestion, ETL/ELT pipelines, and transformation logic for near real-time and batch workloads
Implement robust, reusable data services supporting analytics, reporting, and downstream data marts
Collaborate with architects to implement logical and physical data models in Snowflake
Develop and maintain high-quality, testable code using secure coding standards and best practices
Integrate data pipelines with cloud services, messaging, and storage components
Implement data quality checks, validations, and error-handling mechanisms
Optimize pipeline performance, scalability, and cost efficiency
Support CI/CD-enabled deployments, including automated testing and promotion across environments
Participate in code reviews, design reviews, and sprint ceremonies
Support incident resolution and root cause analysis for data pipeline failures
Produce and maintain technical documentation, runbooks, and workflows
Ensure deliverables meet federal security, governance, and audit requirements
Operate within an Agile federal delivery environment across multiple scrum teams
Collaborate closely with architects, DBAs, business analysts, and QA personnel
Remain accountable for code quality, performance, and delivery timelines
Maintain audit-ready documentation and technical artifacts
REQUIRED EXPERIENCE & QUALIFICATIONS:
Bachelor's degree in Computer Science, Computer Programming, Computer Engineering, or a related computer-based major strongly preferred
8+ years of IT-related experience, with 7+ years of experience specifically in developing IT and cloud infrastructures
Experience in software engineering and architecture design
Experience with and understanding of best practices for system security measures
Experience building and migrating software and IT services to align with strategic business needs and goals
Experience researching advanced technologies to determine how IT can support business needs through software, hardware, or infrastructure
Experience with AWS data and compute services
Experience with Airflow or equivalent orchestration tools
Knowledge of cloud messaging and storage services
Proven track record in software and data engineering roles
Hands-on experience building enterprise-scale data pipelines
Strong proficiency with SQL and data transformation techniques
Experience developing in cloud-based data platforms
Familiarity with Agile/Scrum delivery environments
Experience working in regulated or compliance-driven environments
Experience supporting federal EDW or analytics programs
Hands-on experience with Snowflake
Proficiency with ETL/ELT frameworks
Experience with streaming or near real-time data ingestion
Familiarity with data governance, metadata, and classification standards
Experience mentoring junior developers
CERTIFICATIONS (Preferred):
Snowflake SnowPro Core
AWS Certified Data Analytics - Specialty
Relevant programming or data engineering certifications
COMMUNICATION & ORGANIZATIONAL:
Excellent presentation and communication (oral and written) skills
Consultant mindset with the ability to work with high-level customer stakeholders and build excellent customer relationships
Experience identifying and applying industry tools, solutions, methods, best practices, and emerging technologies
Strong analytical and problem-solving skills, with the ability to formulate and communicate recommendations for improvement
Demonstrated ability to work effectively, independently, and as part of a team
GDIT IS YOUR PLACE:
At GDIT, the mission is our purpose, and our people are at the center of everything we do.
Growth: AI-powered career tool that identifies career steps and learning opportunities
Support: An internal mobility team focused on helping you achieve your career goals
Rewards: Comprehensive benefits and wellness packages, 401(k) with company match, and competitive pay and paid time off
Community: Award-winning culture of innovation and a military-friendly workplace
OWN YOUR OPPORTUNITY: Explore a career in data science and engineering at GDIT and you'll find endless opportunities to grow alongside colleagues who share your determination for solving complex data challenges.