Job Description
About Us:
Tech9 is shaking up a 20-year-old industry, and we're not slowing down. Recognized by Inc. 5000 as one of the nation's fastest-growing companies, we are dedicated to building innovative, highly complex web applications. Our team is passionate about delivering quality software that meets the highest standards. We offer a 100% remote working environment with a collaborative and supportive team, allowing you to focus on what you do best. Our current need is for a Data Engineer to join our US team. This role will allow you to work directly with our client, where you will get hands-on experience working with new projects and teams.
Why Join Us?
- Challenging Problems: You will tackle complex and exciting challenges.
- Flexibility: We offer a flexible and autonomous working environment.
- Collaboration: Work with skilled and friendly teammates who are committed to quality.
- Support: We fully support your efforts to build software the right way.
- Tools: We provide the necessary tools for you to excel at your job.
- Remote Work: Enjoy the benefits of a 100% remote work environment.
About this Project – Security & Monitoring Data
This initiative focuses on internal security and code safety across development teams. The project involves handling vulnerability data and security monitoring data from multiple sources, ensuring they are properly ingested, standardized, and available in the data warehouse. While most data sources are already ingested, the Data Engineer will complete the ingestion of remaining sources, validate existing pipelines, and ensure the Databricks warehouse meets security, compliance, and performance standards.
The role also involves integrating streaming data pipelines, working with graph APIs, and enabling Power BI dashboards that provide clear visibility into security and vulnerability metrics. Additionally, the engineer will support automation and deployment efforts through Azure DevOps pipelines and infrastructure-as-code, ensuring a secure, modern, and reliable data ecosystem.
Key Responsibilities
- Ingest and integrate security and vulnerability data sources into Databricks.
- Validate and standardize the data warehouse to meet security and compliance requirements.
- Build and maintain streaming pipelines for real-time security monitoring data.
- Work with graph APIs and REST APIs to expand the organization's security data ecosystem.
- Design and optimize ETL workflows using DBT, Databricks notebooks, and Python.
- Develop and support Power BI dashboards and reports (backend and frontend) to deliver actionable insights.
- Implement and maintain DevOps pipelines in Azure for automation, CI/CD, and secure deployments.
- Collaborate with client stakeholders, outsourced vendors, and internal teams to ensure smooth delivery.
- Participate in Agile sprint cycles, making and delivering on commitments.
- Communicate clearly and effectively in English, both technically and cross-functionally.
Required Skills & Experience
- Cloud & Data Platforms: Strong experience with Microsoft Azure (Data Lake, Synapse, etc.), Databricks, and Azure DevOps.
- ETL & Data Warehousing: Expertise in dimensional modeling, ETL pipelines, and DBT.
- Programming & APIs: Proficiency in Python for data engineering and automation; familiarity with graph APIs and REST APIs.
- Streaming Data: Hands-on experience with streaming ingestion and processing.
- Security Data Engineering: Experience working with vulnerability/security monitoring data ingestion and validation.
- Reporting & Visualization: Advanced experience with Power BI (backend and frontend), including semantic modeling and SQL endpoints.
- Collaboration & Communication: Excellent English communication skills, with proven success in Agile environments.
Preferred Skills
- Terraform: Hands-on experience with infrastructure-as-code for cloud deployments.
- ML Document Processing: Familiarity with ML-based document processing solutions.
Interview Process Overview
The process is designed to be thoughtful, efficient, and focused on both technical ability and team fit.
- 30-minute on-demand HireVue screening, where you'll respond to situational and behavioral questions to help us understand your ownership mindset, adaptability, and approach to collaboration.
- 10-minute virtual Q&A session with our recruiter to clarify the role and answer any questions you may have. This is not an interview, just a conversation to ensure alignment.
- 60-minute live technical interview with one of our Senior Data Engineers.
- 60-minute technical interview with our Director of Engineering or CTO.
- 15-30 minute chat with the hiring manager.
- 30-60-minute session with the client
#LI-Remote
#UnitedStates
To ensure you've received our notifications, please whitelist the domains jazz.co, jazz.com, and applytojob.com