Service Technician Associate I - Pittsburgh, PA (Contract)
Develop tools for validation and regression testing of image sensors, image processing pipelines, and hardware and software integration. Perform lab and real-world camera data collection and analysis. Participate in tuning sensor parameters and image processing pipelines to optimize image quality. Troubleshoot camera and image quality issues observed on autonomous vehicles. Design new hardware and the necessary software for sensor range. Work with the perception software team to assess end-to-end camera performance.
Senior Manager, Perception
Lead high-impact Perception teams, managing technical roadmap and milestone goals. Collaborate with AI and software leaders, simulation, systems design, and mission assurance teams to deliver a dynamic objects perception system across various sensor modalities and perception pipelines. Build and lead a group of managers and engineers responsible for roadmap, productivity, execution, and impact. Set vision for and grow a team of software engineers involved in planning, execution, and success of complex technical projects, providing technical leadership. Collaborate across teams to brainstorm and accelerate perception capability development. Provide summaries, progress updates, and recommendations to executive leadership. Establish best practices and statistical rigor around data-driven decision-making. Stay updated on industry and academic trends in AI and perception.
Intern Robotics Software Engineer
Lead the research and development of novel deep learning algorithms that enable robots to perform complex, contact-rich manipulation tasks. Explore the intersection of computer vision and robotic control, designing systems that allow robots to perceive and interact with objects in dynamic environments. Create models that integrate visual data to guide physical manipulation, moving beyond simple grasping to sophisticated handling of diverse items. Collaborate with a multidisciplinary team of engineers and researchers to translate cutting-edge concepts into robust capabilities that can be deployed on physical hardware for industrial applications. Research and develop deep learning architectures for visual perception and sensorimotor control in contact-rich scenarios. Design algorithms that enable robots to manipulate complex or deformable objects with high precision. Collaborate with software engineers to optimize and deploy research prototypes onto physical robotic hardware. Evaluate model performance in both simulation and real-world environments to ensure robustness and reliability. Identify opportunities to apply state-of-the-art advancements in computer vision and robot learning to practical industrial problems. Mentor junior researchers and contribute to the technical direction of the manipulation research roadmap.
Senior Software Engineer, Pilots
As a Senior Software Engineer on the Pilots team, deliver robust, thoroughly tested, and maintainable C++ code for edge and robotics platforms. Design, implement, and own prototype perception systems that may transition into production-grade solutions. Build and refine real-time perception pipelines, including detection, tracking, and sensor fusion. Adapt and integrate ML and CV models for Hayden-specific applications. Drive technical decision-making, balancing prototyping speed with production readiness. Collaborate with the Product team and cross-functional Engineering departments. Contribute to shared infrastructure, tooling, and architectural patterns as pilots mature into foundational products.
Machine Learning Engineer - Perception Mapping
As a software engineer on the perception mapping team at Zoox, you will curate, validate, and label datasets for model training and validation. You will research, implement, and train machine learning models to perform semantic map element detection and closely collaborate with validation teams to formulate and execute model validation pipelines. You will integrate models into the greater onboard autonomy system within compute budgets. Additionally, you will serve as a technical leader on the team, maintaining coding and ML development best practices and contributing to architectural decisions.
Senior SLAM Engineer
Design and implement state-of-the-art SLAM algorithms for real-time localization and mapping using multi-modal sensor inputs such as cameras, IMUs, GPS, and wheel encoders. Develop robust online and offline state estimation methods for complex urban and highway environments. Focus on 3D geometric vision problems including VSLAM, VIO, SfM, and scene reconstruction. Implement robust motion estimation, feature matching, loop closure, and map optimization pipelines. Apply non-linear optimization and filtering techniques like bundle adjustment, graph SLAM, and EKF to maximize system accuracy and robustness. Collaborate with sensor calibration and perception teams to improve system performance and consistency. Evaluate and benchmark system performance using large-scale datasets and real-world driving scenarios. Contribute to system integration, continuous validation, and deployment of SLAM modules on autonomous vehicle platforms. Mentor junior engineers and contribute to technical leadership within the team.
Robotics Engineer
As a Software Engineer in the Robotics and Automation group, you will design and deploy systems that automate materials science research and discovery laboratories, specializing in robotics, automation, and perception software development. Architect and develop software systems that control and orchestrate robotic workcells for autonomous materials experimentation. Design scalable control frameworks for flexible automation involving robots, motion systems, sensors, and lab instruments. Collaborate with hardware, mechatronics, and science teams to translate experimental workflows into reliable automated processes. Build and maintain APIs and services for scheduling, execution, monitoring, and data capture. Develop simulation, testing, and validation tools to accelerate development and ensure system reliability. Integrate 2D and 3D vision systems with robotic manipulation, motion planning, and execution. Optimize system performance, robustness, and throughput under rapid iteration cycles. Contribute to technical direction, architecture decisions, and best practices. Mentor junior engineers and help establish engineering standards. Foster collaboration and open-mindedness to empower the team to deliver world-class technology at unprecedented speed.
Perception Engineer
Design, implement, and deploy 2D and 3D vision systems for robotic manipulation, inspection, state verification, and sensor fusion. Develop vision-guided automation solutions integrating cameras, lighting, optics, and robots in laboratory and industrial environments. Implement perception pipelines for object detection, segmentation, pose estimation, and feature extraction. Own camera calibration and system-level accuracy validation. Develop novel algorithms for state estimation of fluids and particle flows. Integrate vision outputs with robot motion planning, grasping, and task execution. Tune and harden vision systems for robustness against variability in materials, reflections, and environmental conditions. Collaborate with software, mechatronics, and mechanical teams to translate experimental and operational needs into automated solutions. Contribute to technical direction, architecture decisions, and best practices across the robotics, perception, and automation software stack. Bring an attitude of collaboration and open-mindedness that enables fearless, creative problem solving and empowers the team to ship world-class technology at unprecedented speed.
Senior Manager, Precision Navigation and Sensing (R4256)
Lead a team of software developers and sensor experts to develop, field, and optimize algorithms and sensors that provide accurate, reliable state estimates enabling autonomous operation of VBAT and XBAT aircraft. Develop and implement advanced sensor algorithms for processing data from IMUs, radar, cameras, GPS, and other sensors. Enhance state estimation algorithms by integrating multi-sensor data to improve accuracy and robustness. Select, characterize, and field precision navigation sensors such as cameras, radar, IMUs, and GPS. Design and implement real-time sensor data processing pipelines. Collaborate with cross-functional teams, including software engineers, autonomy researchers, and hardware engineers, to ensure seamless integration of state estimation algorithms. Conduct experiments and field tests to validate algorithm performance in real-world conditions. Stay current with advances in sensor technology and state estimation, applying new techniques to these systems.