We are seeking an experienced Data Engineer to join the Governance, Risk, and Compliance (GRC) team at Raytheon Technologies. The candidate will work closely with our GRC DevOps team and various IT and Cybersecurity stakeholders to design, implement, and maintain data warehousing solutions. This role focuses on building scalable data pipelines and models, transforming raw data (from structured, semi-structured, and unstructured sources) into curated datasets, and ensuring data is accessible for BI reporting and AI/ML use cases.
- Collaborate with Business and Data Analysts, as well as Front-End and Full-Stack AI Developers, to understand data requirements and deliver scalable solutions that support large-scale automation initiatives incorporating AI/ML
- Design, develop, and optimize ETL/ELT pipelines to process, model, and transform data from raw to curated layers, enabling seamless integration into published layers for BI and advanced analytics
- Implement and manage data warehousing solutions using object storage, Snowflake, Databricks, Matillion, and Informatica
- Develop and maintain APIs to facilitate secure and efficient data integration between various IT, Cyber and GRC systems, applications, and data pipelines
- Ensure the accuracy, reliability, and scalability of data pipelines and data models
- Support the ingestion, integration, and transformation of large datasets to meet IT, Cybersecurity, and GRC operational and reporting needs
- Partner with stakeholders to understand their data and reporting requirements and provide tailored solutions
- Stay informed on the latest advancements in data engineering, warehousing, and integration tools and methodologies
- Proven experience as a Data Engineer with a focus on data warehousing, ETL/ELT development, and pipeline design
- Strong proficiency in SQL and experience with relational and non-relational databases (e.g., MySQL, PostgreSQL, SQL Server, Snowflake, Databricks)
- Experience building APIs, integrating data pipelines with RESTful or GraphQL APIs, and implementing CI/CD for pipelines and SQL transformations (Git workflows, automated testing, release/version control)
- Hands-on experience with ETL/ELT tools and platforms such as Matillion, Informatica, or equivalent
- Proficiency in programming languages such as Python or Java for building and optimizing data pipelines
- Expertise in cloud platforms (AWS, Google Cloud, Azure) and their data services
- Familiarity with BI tools like Power BI and an understanding of how to prepare data for reporting needs
- Strong analytical and problem-solving skills with a focus on delivering high-quality, scalable solutions
- Excellent communication and collaboration skills for cross-functional teamwork
- Experience working on Cybersecurity or GRC-related projects, or in those industries
- Working knowledge of machine learning and AI concepts (model registry/model hub workflows or equivalent)
- Familiarity with data governance, security, and compliance principles
- Understanding of regulatory compliance standards and frameworks
GeoLogics is an Equal Opportunity/Affirmative Action Employer that is committed to hiring a diverse and talented workforce. EOE/Disability/Veteran