
OASIS

AI-powered autonomous cargo drone

OVERVIEW

OASIS (Operational Autonomous System Integrated Simulator) was a joint Boeing x NASA research program exploring how large autonomous cargo aircraft could safely operate within today’s National Airspace System (NAS). The goal of the program was to validate a Concept of Operations for autonomous freighters and to identify technology, human factors, and operational gaps long before real-world deployment.

The program combined a high-fidelity vehicle surrogate with live NAS services for traffic, weather, and flight planning, all connected through a distributed simulation environment and operated via a dedicated Ground Station Operator workstation. This setup allowed the team to simulate end-to-end autonomous cargo missions under realistic operational and communication constraints while observing how human operators interacted with highly automated systems in real time.

The project intentionally avoided remote piloting and instead explored a command-based supervision model in which the human operator expressed intent and the autonomous system executed and validated actions.

The Problem

Autonomous cargo operations fundamentally break today’s mental models of aviation because there is no pilot onboard and responsibility is distributed across humans, automation, and ground systems. Operators must interact with air traffic control through mediated communication channels, while regulators require a traceable and defensible allocation of responsibilities between humans and machines.

From an operational and UX standpoint, the team needed to define an FAA/regulator-defensible allocation of responsibility between the Ground Station Operator, the autonomous aircraft, and supporting ground services, while designing an interface that preserved situational awareness under latency and time pressure without turning the operator into a remote pilot. At the same time, the system had to support time-critical interactions during high-workload phases such as approach, taxi, and departure.

My Role

  • Led end-to-end UX design for the Ground Station Operator experience, from early concept through high-fidelity simulation and evaluation with Boeing and NASA.

  • Designed for complex proprietary screen sizes and multi-screen operational layouts, ensuring usability and clarity across high-fidelity simulation workstations.

  • Defined interface and interaction models for command-based autonomy, including synchronization workflows, local-versus-vehicle state management, and time-critical tactical commands.

  • Partnered closely with autonomy engineers, human factors researchers, military drone pilots, and flight operations experts to align design decisions with real-world operational and regulatory constraints.

  • Focused on preserving situational awareness and reducing cognitive load in data-dense, time-sensitive environments under communication latency.

My Impact

Although it has not (yet) entered production, this work helped validate Boeing and NASA’s approach to supervised autonomy for cargo aircraft, identifying operational breaking points under communication latency and influencing how time-critical control was allocated between human operators and onboard autonomy.

My Taxi Route Builder design earned Boeing’s Meritorious Invention Award, granted in lieu of a patent filing because the project was still confidential at the time.

Aryk’s dedication to visual and interaction design consistently results in interfaces that are modern, elegant, and grounded in established conventions. Aryk’s work doesn’t just look good – it clarifies intent, improves usability, and meaningfully raises the quality of the product.
— Dave Feroe, Principal User Researcher, Boeing

Design Constraints

The system was explicitly designed to avoid direct flight controls and remote piloting. Operators sent commands to the drone indicating their intent rather than manipulating flight surfaces directly, while the vehicle validated and executed actions autonomously.
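To make that boundary concrete, here is a minimal sketch of what an intent-based command exchange could look like. The types, field names, and validation rule are illustrative assumptions, not the actual OASIS message schema.

    // Minimal sketch of an intent-based command exchange.
    // All types and field names are illustrative, not the OASIS message schema.

    type CommandIntent =
      | { kind: "HOLD_AT_WAYPOINT"; waypointId: string }
      | { kind: "AMEND_TAXI_ROUTE"; routeSegments: string[] }
      | { kind: "REQUEST_APPROACH"; runway: string };

    interface OperatorCommand {
      commandId: string;
      issuedAt: number;      // epoch ms at the ground station
      intent: CommandIntent; // what the operator wants, not how to fly it
    }

    type VehicleResponse =
      | { commandId: string; status: "ACCEPTED"; executionSummary: string }
      | { commandId: string; status: "REJECTED"; reason: string };

    // The vehicle, not the operator, decides whether the intent is safe to execute.
    function validateOnVehicle(cmd: OperatorCommand): VehicleResponse {
      if (cmd.intent.kind === "AMEND_TAXI_ROUTE" && cmd.intent.routeSegments.length === 0) {
        return { commandId: cmd.commandId, status: "REJECTED", reason: "Empty taxi route" };
      }
      return { commandId: cmd.commandId, status: "ACCEPTED", executionSummary: "Validated; executing onboard" };
    }

    // The operator expresses intent; the autonomy validates it and reports back.
    const response = validateOnVehicle({
      commandId: "cmd-001",
      issuedAt: Date.now(),
      intent: { kind: "AMEND_TAXI_ROUTE", routeSegments: ["A", "A3", "B"] },
    });
    console.log(response.status); // "ACCEPTED"

The key design choice this models is that the operator's message carries only intent; acceptance, sequencing, and execution remain on the vehicle side.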

At the same time, the interface had to operate within rigid, proprietary multi-screen workstation layouts, including non-standard aspect ratios and dedicated displays for situational awareness, vehicle command, and supporting context. These physical constraints shaped both the interaction model and the visual hierarchy of the system, requiring careful attention to how responsibility boundaries, system state, and critical actions were distributed across screens without fragmenting operator awareness.

Rough wireframes of the primary screen display

Details about the screen layout

My design running on the real workstation

RESEARCH

I began by breaking down the tasks performed by pilots in today’s commercial freight operations and redistributing those responsibilities across the Ground Station Operator, the autonomous vehicle, and supporting ground systems. This process produced a traceable function allocation framework that identified technology gaps, supported safety analysis, and directly informed software requirements. 
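As a simplified illustration of what a traceable allocation entry can look like, here is a small sketch; the functions, owners, and rationales are hypothetical examples, not the program's actual allocation.

    // Illustrative sketch only: a toy function allocation table with hypothetical entries.
    type Agent = "GroundStationOperator" | "AutonomousVehicle" | "GroundServices";

    interface FunctionAllocation {
      pilotFunction: string; // task a pilot performs in today's freight operations
      allocatedTo: Agent;    // who owns it in the autonomous concept
      rationale: string;     // traceability for safety analysis and requirements
    }

    const allocations: FunctionAllocation[] = [
      {
        pilotFunction: "Acknowledge ATC taxi clearance",
        allocatedTo: "GroundStationOperator",
        rationale: "Intent decision stays with the human; latency is acceptable pre-departure",
      },
      {
        pilotFunction: "Track centerline during taxi",
        allocatedTo: "AutonomousVehicle",
        rationale: "Continuous control loop; not feasible over a latent link",
      },
      {
        pilotFunction: "File and amend the flight plan",
        allocatedTo: "GroundServices",
        rationale: "Existing NAS services already mediate planning data",
      },
    ];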

Together with the product team, I conducted multi-stage, scenario-based research to evaluate both the interface and the underlying concept under realistic operational constraints. Research participants included commercial aviation pilots, subject matter experts, and a group of military drone pilots, providing a mix of civil aviation, autonomy supervision, and remote operations experience.

The research combined journey mapping, task-based usability testing, and high-workload mission simulations focused on critical flight phases such as approach and departure. Findings were used to iteratively refine synchronization workflows, command patterns, automation transparency, and multi-screen information hierarchy, directly shaping both the interface design and the viability of the operational model.

Me sketching UI concepts

One of many journey maps documenting the complex scenarios

Final Designs

The resulting Ground Station Operator interface integrated live map and traffic visualization, phase-of-flight timelines, task panels aligned to operational phases, messaging and guidance systems, synchronization dialogs, and explicit plan diffing between local and vehicle state. The interaction model supported both strategic planning and tactical intervention while maintaining a clear boundary between operator intent and autonomous execution.
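To illustrate the local-versus-vehicle diffing concept, the sketch below shows one way a ground station could compute what a synchronization would change before the operator commits it. The plan model, field names, and comparison rule are assumptions for illustration, not the project's implementation.

    // Illustrative sketch only: hypothetical plan model and diff, not the OASIS implementation.
    interface Waypoint { id: string; altitudeFt: number }
    interface FlightPlan { waypoints: Waypoint[] }

    interface PlanDiff {
      added: string[];   // waypoints present locally but not yet on the vehicle
      removed: string[]; // waypoints on the vehicle that were deleted locally
      changed: string[]; // waypoints whose parameters differ between the two plans
    }

    function indexById(plan: FlightPlan): Map<string, Waypoint> {
      const byId = new Map<string, Waypoint>();
      for (const w of plan.waypoints) byId.set(w.id, w);
      return byId;
    }

    // Compare the operator's local (unsynced) plan against the plan the vehicle is flying,
    // so the UI can show exactly what a sync would change before the operator commits.
    function diffPlans(local: FlightPlan, vehicle: FlightPlan): PlanDiff {
      const localById = indexById(local);
      const vehicleById = indexById(vehicle);

      const added: string[] = [];
      const removed: string[] = [];
      const changed: string[] = [];

      for (const [id, localWp] of localById) {
        const vehicleWp = vehicleById.get(id);
        if (!vehicleWp) added.push(id);
        else if (vehicleWp.altitudeFt !== localWp.altitudeFt) changed.push(id);
      }
      for (const id of vehicleById.keys()) {
        if (!localById.has(id)) removed.push(id);
      }
      return { added, removed, changed };
    }

Surfacing this diff explicitly in the synchronization dialogs is what kept the boundary between "what I have edited locally" and "what the vehicle is actually flying" visible to the operator.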

Safety and trust were reinforced through visible automation behavior, explicit system state feedback, error-preventing workflows, and confirmation patterns for time-critical changes. The interface was designed to be informative without being obtrusive, so that it could support validation without distorting operator behavior.

A note on the mockups: due to the workstation’s non-standard aspect ratios, the mockups are necessarily wide. Click on them to view them at full size.

A look at the panel interactions

Mission planning and drone tail camera view

Preparing a taxi route

What I’d Do Next

At the time of writing, the OASIS program has been paused due to organizational and strategic factors. While the work did not progress to deployment, the project surfaced meaningful insights into supervised autonomy and ground control UX, and the following outlines what I would explore next if the program were resumed.

I would extend the system with AI-assisted supervision to support multi-aircraft operations, using agentic monitoring to surface emerging risks and propose prioritized interventions rather than relying on manual scanning. I would introduce workload-aware interfaces that adapt information density and interaction cost during high-workload phases (taxi in/out, takeoff, and landing), and add predictive delay compensation to mitigate satellite communication latency.

On the autonomy side, I would explore agentic decision support that simulates the downstream effects of tactical commands before execution, paired with transparent reasoning and confidence signaling. Finally, I would extend training and evaluation through AI-driven scenario generation to continuously test edge-case conditions.