Senior Manager, Precision Navigation and Sensing (R4256)
Lead a team of software developers and sensor experts to develop, field, and optimize algorithms and sensors that deliver the accurate, reliable state estimates needed for autonomous operation of VBAT and XBAT aircraft. Develop and implement advanced sensor algorithms for processing data from IMUs, radar, cameras, GPS, and other sensors. Enhance state estimation algorithms by integrating multi-sensor data to improve accuracy and robustness. Select, characterize, and field precision navigation sensors such as cameras, radar, IMUs, and GPS. Design and implement real-time sensor data processing pipelines. Collaborate with cross-functional teams including software engineers, autonomy researchers, and hardware engineers to ensure seamless integration of state estimation algorithms. Conduct experiments and field tests to validate algorithm performance in real-world conditions. Stay current with advances in sensor technology and state estimation, applying new techniques to fielded systems.
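To make the multi-sensor state-estimation work above concrete, here is a minimal illustrative sketch of a one-dimensional Kalman-style measurement fusion. It is not any actual Shield AI code; the sensor values and variances are invented for illustration.

```python
def fuse(est, est_var, meas, meas_var):
    """Fuse a prior estimate with a new measurement (1D Kalman update)."""
    k = est_var / (est_var + meas_var)   # Kalman gain: trust the less-noisy source more
    fused = est + k * (meas - est)       # corrected state estimate
    fused_var = (1.0 - k) * est_var      # fused uncertainty is lower than either input
    return fused, fused_var

# Hypothetical example: GPS reports 10.0 m (variance 4.0);
# a radar range reports 10.6 m (variance 1.0).
pos, var = fuse(10.0, 4.0, 10.6, 1.0)
```

The fused estimate lands closer to the lower-variance radar measurement, and the fused variance is smaller than either sensor's alone, which is the basic payoff of integrating multi-sensor data.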
Senior AI Research Scientist, Vision-Guided Robotics
As a Senior AI Research Scientist for Vision-guided robotics, you will lead the research and development of novel deep learning algorithms that enable robots to perform complex, contact-rich manipulation tasks. Your work involves exploring the intersection of computer vision and robotic control to design systems that allow robots to perceive and interact with objects in dynamic environments. You will create models that integrate visual data to guide physical manipulation, advancing beyond simple grasping to sophisticated handling of diverse items. Collaboration with a multidisciplinary team of engineers and researchers is required to translate cutting-edge concepts into robust capabilities deployable on physical hardware for industrial applications. Responsibilities include researching and developing deep learning architectures for visual perception and sensorimotor control in contact-rich scenarios, designing algorithms for manipulation of complex or deformable objects with high precision, collaborating with software engineers to optimize and deploy research prototypes onto robotic hardware, evaluating model performance in simulation and real-world settings to ensure robustness, identifying opportunities to apply state-of-the-art computer vision and robot learning advancements to practical industrial problems, mentoring junior researchers, and contributing to the technical direction of the manipulation research roadmap.
Senior Staff Engineer, Autonomy (R3955)
Lead teams at the intersection of artificial intelligence, task and motion planning, and controls. Work closely with engineers to architect solutions, set standards for software engineering, drive strategic technical improvements, and mentor other engineers. The role involves integrating Shield AI autonomy products with third-party systems, writing autonomous behaviors for varying aircraft platforms such as VTOL and fixed-wing, and collaborating with subject-matter experts to understand customer needs and implement appropriate autonomy solutions.
Multi‑Target Tracking & Sensor Fusion Engineer (R4172)
Design, research, and implement state-of-the-art multi-target tracking and data association algorithms. Develop production-quality C++ software for deployed military aviation platforms, ensuring deterministic, real-time performance. Build and maintain comprehensive unit, integration, and system-level tests to validate algorithm correctness and robustness. Enhance and calibrate sensor models in advanced simulation and hardware-in-the-loop (HWIL) environments. Collaborate on feature planning, decomposition, and milestone execution within an agile development framework. Contribute to flight-test planning, performance analysis, benchmarking, and regression evaluation. For principal-level applicants, provide technical leadership, design reviews, algorithmic mentorship, and subject-matter expertise across the autonomy organization.
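As a hedged illustration of the data-association problem this role centers on, the sketch below implements greedy nearest-neighbor assignment of detections to tracks with a distance gate. Production trackers use more sophisticated methods (e.g. global nearest neighbor or probabilistic data association); the 2D points and gate value here are invented for illustration.

```python
import math

def associate(tracks, detections, gate=5.0):
    """Greedily pair each track with its nearest gated detection (2D points)."""
    # Enumerate all (track, detection) pairs inside the gate, closest first.
    cands = sorted(
        (math.dist(t, d), ti, di)
        for ti, t in enumerate(tracks)
        for di, d in enumerate(detections)
        if math.dist(t, d) <= gate
    )
    pairs, claimed_tracks, claimed_dets = [], set(), set()
    for _, ti, di in cands:
        if ti not in claimed_tracks and di not in claimed_dets:
            pairs.append((ti, di))          # commit the closest remaining pair
            claimed_tracks.add(ti)
            claimed_dets.add(di)
    return pairs

tracks = [(0.0, 0.0), (10.0, 10.0)]
dets = [(9.5, 10.2), (0.4, -0.3), (30.0, 30.0)]  # third detection is unassociated clutter
matches = associate(tracks, dets)
```

The gate rejects the far-away clutter detection, and the greedy pass resolves the remaining assignments without double-booking a track or a detection.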
Autonomy Engineer, Navigation (R4171)
Contribute to the design, development, integration, test, and deployment of the Hivemind navigation stack. Research and develop advanced state estimation and navigation algorithms for assured Position, Navigation, and Timing (PNT) in contested environments. Design, write, and deploy production-quality C++ software for aviation platforms requiring real-time, deterministic performance. Build and maintain comprehensive unit, integration, and system-level tests to validate navigation software performance under operational constraints. Develop modeling, calibration, and simulation tools for inertial and aided navigation technologies in airborne platforms. Participate in agile-based product planning, feature definition, capacity estimation, and cross-team collaboration. Contribute to ongoing system performance evaluation, regression analysis, and V&V (Verification and Validation) efforts.
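A core building block of the inertial navigation work described above is dead reckoning: integrating accelerations into velocity and position between aiding updates. The sketch below is a deliberately simplified flat-2D version with no rotation, gravity, or sensor-bias modeling; all values are illustrative, not drawn from the Hivemind stack.

```python
def dead_reckon(pos, vel, accels, dt):
    """Integrate a sequence of 2D accelerations into position and velocity."""
    x, y = pos
    vx, vy = vel
    for ax, ay in accels:
        vx += ax * dt           # velocity update from measured acceleration
        vy += ay * dt
        x += vx * dt            # position update from the new velocity
        y += vy * dt
    return (x, y), (vx, vy)

# Start at the origin moving 1 m/s along x; four IMU samples of 0.5 m/s^2 at 10 Hz.
p, v = dead_reckon((0.0, 0.0), (1.0, 0.0), [(0.5, 0.0)] * 4, dt=0.1)
```

Because small IMU errors accumulate without bound under this integration, fielded systems bound the drift with aiding sources (GPS, vision, radar), which is exactly the assured-PNT problem this role addresses.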
Field Applications Engineer (R4163)
Integrate autonomy software onto unmanned aircraft systems, ensuring seamless operation across onboard compute, sensors, and control interfaces. Own the build, configuration, and validation process for flight-ready systems; coordinate hardware/software compatibility and mission readiness. Travel to test sites and support live flight operations, including safety checks, system bring-up, and troubleshooting under time-critical constraints. Diagnose and resolve integration issues across complex autonomy software stacks and embedded systems in lab and field environments. Manage data collection during missions and post-test analysis, working with autonomy engineers to refine behaviors and identify improvements. Work closely with autonomy, GNC, systems, and test teams to ensure mission-critical functionality is delivered on time and validated thoroughly. Build tools and processes to improve integration timelines, flight test reliability, and team efficiency across deployment cycles. Assist with documentation and system-level validation required for certification, airworthiness, and compliance in defense-relevant environments. Members of this team typically travel around 30-40% of the year to different office locations, customer sites, and flight integration events.
Scale.ai - Software Engineer (Robotics & Autonomous Systems)
In this role, you will own and architect large-scale data processing pipelines for robotics and autonomous vehicle datasets. You will build machine learning training and fine-tuning pipelines using Scale's robotics data. The job requires working across backend technologies (Python, Node.js, C++) and frontend stacks (React, TypeScript) to build end-to-end solutions. You will develop tools and systems for robotics data collection, teleoperation, and model evaluation, and interact directly with robotics and autonomous vehicle stakeholders to understand their technical needs and drive product development. Responsibilities also include building real-time systems for robotic control, sensor fusion, and perception pipelines, designing comprehensive monitoring and evaluation frameworks for robotics models and data quality, collaborating with machine learning engineers and researchers to bring robotics research into production, and delivering features at high velocity while maintaining system reliability and performance.
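As a small hedged sketch of one stage in the kind of data pipeline this role owns, the generator below chunks a record stream into fixed-size batches, a common pattern when feeding robotics logs into training or evaluation jobs. It is an invented illustration, not Scale's pipeline code.

```python
def batches(records, size):
    """Yield fixed-size batches from an arbitrary record stream."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == size:
            yield batch        # emit a full batch as soon as it fills
            batch = []
    if batch:
        yield batch            # emit the final partial batch

out = list(batches(range(7), 3))
```

Streaming generators like this keep memory flat regardless of dataset size, which matters when the input is terabytes of sensor logs rather than an in-memory list.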
Mission Planning Engineer - Guidance
Design and implement mission-planning, routing, and path-optimization algorithms. Implement visual navigation and map preloading for GPS-denied or GPS-spoofed conditions. Integrate algorithms into backend and simulation frameworks for real-time use. Build and run simulations to validate algorithm robustness under contested scenarios. Design stress-testing methodologies and performance metrics for field validation. Collaborate with UX and Product teams to ensure outputs are interpretable, usable, and operationally meaningful. Work closely with backend teams to ensure scalability and low-latency integration.
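Routing over a waypoint graph, as described above, is classically solved with Dijkstra's algorithm. The sketch below is a minimal illustration; the waypoint names and edge weights are hypothetical, and in a mission-planning context the weights could encode distance, fuel, or threat exposure rather than pure geometry.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra shortest path over a weighted adjacency dict."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break                        # goal settled; its distance is final
        if d > dist.get(node, float("inf")):
            continue                     # stale queue entry
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Walk predecessors backward from the goal to recover the route.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# Hypothetical waypoint graph with illustrative edge costs.
g = {"A": {"B": 1.0, "C": 4.0}, "B": {"C": 1.0, "D": 5.0}, "C": {"D": 1.0}}
route, cost = shortest_route(g, "A", "D")
```

Swapping the cost function is how the same search core supports stress-testing under contested scenarios: rerun planning with threat-inflated edge weights and measure how routes and costs degrade.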
Mechatronics Scientist
The role involves learning to design, deploy, and operate the backbone of AI infrastructure, working with GPU clusters, high-speed networking, and large-scale Linux systems. The candidate will contribute to the architecture of Australia's first production-scale AI datacenter systems, collaborate with AI researchers and engineers to shape the next generation of model training environments, and develop new operational and architectural approaches from first principles.