[Remote] Senior Data Engineer
About Interos.ai
Interos.ai provides continuous visibility, analysis, and monitoring of extended supply chains to identify and manage risk factors. Founded in 2005 and headquartered in Arlington, Virginia, USA, the company employs 51-200 people.
Website: http://www.interos.ai
Company H1B Sponsorship
Interos.ai has a track record of offering H1B sponsorships, with 3 in 2025, 1 in 2024, 1 in 2023, 3 in 2022, and 2 in 2021. Please note that this does not guarantee sponsorship for this specific role.
Role Overview
Interos.ai is defining the category of supply chain risk intelligence, building the world's most trusted and transparent supply chains. This is a remote job open to candidates in the USA.
Responsibilities
- Build and share knowledge of the data flow throughout the Resilience platform
- Optimize the storage, processing, and movement of data, such as tuning data models or refactoring data software
- Develop, document, and maintain new data platform functionality
- Contribute to software and data architecture design and reviews
- Provide support to other engineers and internal customers on data and software matters
- Implement and enforce best practices for code quality, testing, and documentation
- Improve or develop frameworks, packages, and/or documentation to support engineering standards
- Conduct code reviews to ensure adherence to coding standards and promote knowledge sharing within the team
Skills Required
- 5 years of experience in Software Development
- 3 years of full-time professional Python experience including production experience with data pipelines
- 3 years of experience with SQL via relational or columnar databases
- 2 years of experience working with Snowflake
- 2 years of experience developing in AWS
- 2 years of experience in data streaming or event-driven systems with Kafka or another stream processing system
- Bachelor's degree in Computer Science or equivalent experience
- Experience with job or pipeline orchestration, using tools like Prefect, Databricks, Airflow, Argo, etc.
- Passionate about writing code that is scalable, maintainable, reusable, and well-tested
- Interested in building reliable, fault-tolerant, production-grade systems
- Comfortable debugging large or complex applications and, if necessary, guiding their refactoring
- Excellent communication skills with the ability to convey technical concepts to both technical and non-technical stakeholders
- Experience developing data software in an agile environment
- Enjoy optimizing complex processes and leaving things better than you found them
- A seasoned engineer who enjoys sharing their experience with the team
Benefits
- Comprehensive health, dental & vision insurance
- 401(k) with employer match
- Flexible Time Off (FTO) plus 10 paid holidays
- Opportunity to help shape an early, fast-growing market
- A flexible, remote-first work environment that empowers you to perform at your best wherever you are
- An incredible culture built on trust, accountability, and collaboration across time zones and teams