Designing WATonoBus, the Autonomous Shuttle Bus on Campus

Self-Driving Shuttle Is an Ideal Platform for Developing ADAS Technology


Every day, the University of Waterloo in Canada buzzes with the hum of the WATonoBus, a self-driving shuttle bus that navigates the 2.7-kilometer ring road encircling the campus. This autonomous shuttle, available to students, faculty, and visitors, is one of the many innovative projects happening at the Mechatronic Vehicle Systems Lab (MVS), one of the world’s largest academic automotive labs.

The MVS Lab, which has been active for over 15 years, has collaborated with numerous organizations and original equipment manufacturers (OEMs), including industry giant General Motors.

“We usually have between 40 and 50 graduate students, engineers, postdocs, and technicians working,” says Dr. Amir Khajepour, professor of mechanical engineering at the University of Waterloo and the head of the MVS Lab.

Six years ago, the lab embarked on its autonomous driving project with initial funding from the Canadian Foundation for Innovation. Since then, the project, which adopted the name WATonoBus, has engaged dozens of students, evolving into a platform for developing various technologies for autonomous mobility.

The WATonoBus stopping at a crosswalk to let a pedestrian cross in front of the bus. (Image credit: University of Waterloo)

Breaking Down the Problem

The WATonoBus project started with the aim of creating a platform for research and training in autonomous mobility. Considering an academic environment where students join and leave the MVS Lab every year, a platform was needed that would require minimal learning time. This would allow graduate students to spend most of their time on research and development rather than learning the platform itself.

“We need a platform with well-defined modules. This will allow the team to understand and continue to develop the modules in parallel.”

“One of the most important things for me was to make sure that the platform we are putting together does not have a long learning curve because students are continuously coming and going,” explains Khajepour. “We need a platform with well-defined modules. This allows the team to understand and continue to develop the modules in parallel.”

To address this, the team deconstructed the WATonoBus project into several key modules, each designed for simplicity and accessibility. The first step involved outfitting a shuttle with various sensors, including cameras, lidars, radars, and GPS. They then created a perception module that processes input from the shuttle’s sensors, fuses them together, and extracts crucial information, such as the location, speed, and heading of cars, objects, and people, in addition to the road and drivable area.
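As a sketch of the kind of fused output such a perception module might produce, the snippet below defines a hypothetical tracked-object record carrying position, speed, and heading, plus a constant-velocity prediction step that a downstream planner could use. The field names, units, and coordinate conventions are illustrative assumptions, not the lab's actual schema.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One fused perception track. Field names and units are illustrative."""
    obj_id: int
    x: float        # position relative to the shuttle, meters (east)
    y: float        # position relative to the shuttle, meters (north)
    speed: float    # meters per second
    heading: float  # radians, 0 = east, counterclockwise positive

    def predict(self, dt: float) -> "TrackedObject":
        """Constant-velocity prediction a downstream planner might use."""
        return TrackedObject(
            self.obj_id,
            self.x + self.speed * math.cos(self.heading) * dt,
            self.y + self.speed * math.sin(self.heading) * dt,
            self.speed,
            self.heading,
        )

# A pedestrian 10 m east of the shuttle, walking north at 1.5 m/s.
ped = TrackedObject(obj_id=7, x=10.0, y=0.0, speed=1.5, heading=math.pi / 2)
ahead = ped.predict(dt=2.0)  # estimated position two seconds from now
```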

The team developed other modules responsible for decision-making, motion planning, control, and health monitoring. These modules, developed with MATLAB® and Simulink®, interact with other modules and perform the tasks crucial for the automated driving of the shuttle. The outputs of these modules are control commands such as acceleration and speed levels, steering angle, and braking that are sent to the WATonoBus for execution.

The perception results on a winter day. (Video credit: University of Waterloo)

“We put together a general high-level software architecture,” Khajepour says. “The perception parts needed a lot of GPU power, so we used NVIDIA® processors. But the rest of the system was all developed in Simulink and MATLAB.”

The perception module and the Simulink-based modules communicate through the Robot Operating System (ROS). This open-source software framework offers tools, libraries, and drivers for developing applications that interact with the physical world, such as robotics and autonomous driving. ROS operates on a publisher-subscriber system, enabling different nodes in the system to exchange information.

In the case of WATonoBus, the perception module publishes its results and predictions to ROS. The decision and control modules then subscribe to these ROS topics to receive the perception information as it is created. Once these modules process the data, they publish their output back to ROS, which then sends low-level commands to the shuttle bus control actuators.
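The publish/subscribe flow described above can be sketched without a running ROS master by using a minimal in-process stand-in: a perception node publishes a detection, and a subscribed decision node reacts by publishing a control command. The topic names and message fields below are illustrative assumptions, not the project's actual interfaces.

```python
from collections import defaultdict

class MiniBus:
    """Tiny in-process stand-in for the ROS publish/subscribe model."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # Deliver the message to every callback subscribed to this topic.
        for cb in self._subs[topic]:
            cb(msg)

bus = MiniBus()
commands = []

def on_perception(msg):
    # Decision logic: brake for a detected pedestrian (simplified).
    if msg["pedestrian_ahead"]:
        bus.publish("/watonobus/cmd", {"throttle": 0.0, "brake": 1.0})

bus.subscribe("/watonobus/perception", on_perception)
bus.subscribe("/watonobus/cmd", commands.append)

# The perception node publishes a detection; a command flows back through the bus.
bus.publish("/watonobus/perception", {"pedestrian_ahead": True})
```

In the real system, `rospy` or `roscpp` nodes play these roles and the ROS master handles topic discovery; the control flow is the same.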

Training with Simulation

A formidable challenge in developing self-driving cars is sourcing data to train the models. Gathering real-world data is a slow, expensive process fraught with safety and regulatory concerns. To circumvent these issues, engineers typically rely on simulated environments for the bulk of the training of the artificial intelligence systems that power self-driving cars.

“Instead of running everything on the actual platform, which is the shuttle bus, we were using the MATLAB and Simulink environment in order to generate those scenarios.”

Simulations allow the team to train and test the models across various scenarios without deploying the models on actual cars. Simulation reduces the costs and increases the speed of training autonomous driving models. Once the models are ready, they are deployed to vehicles, where they can be further tested and fine-tuned on actual roads.

“Instead of running everything on the actual platform, which is the shuttle bus, we were using the MATLAB and Simulink environment in order to generate those scenarios,” says Khajepour.

The team used Driving Scenario Designer, included in Automated Driving Toolbox™, to create a scenario generator for their simulated environment. This tool allowed the engineers to construct various environments, roads, and actor models, along with all the sensors installed on the shuttle. The simulated environment provides the locations of vehicles and objects that the perception module could encounter in the real world.

As in the main architecture, this information is published to ROS and passed on to the modules responsible for decision-making and motion planning. The outputs of the control module are then returned to ROS and passed back to the virtual environment as commands to control the shuttle.

The team used Automated Driving Toolbox to create a scenario generator for their simulated environment. (Video credit: University of Waterloo)

“Our scenario generator was designed to be able to add any type of distribution for the location, speed, and other factors that we have for all the vehicles, pedestrians, and also all the cases that we have, such as, for example, a T-junction or any other traffic situation,” Khajepour explains. This approach allowed the team to quickly generate numerous scenarios, including challenging scenarios that rarely happen but are crucial to testing the safety of the self-driving shuttle, usually referred to as “edge cases.”
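A scenario generator of this kind can be sketched in a few lines: sample each scenario's parameters from chosen distributions and occasionally force a rare but safety-critical situation. This Python sketch is only a stand-in for the team's MATLAB-based generator; the field names, distributions, ranges, and edge-case list are all invented for illustration.

```python
import random

def sample_scenario(rng, edge_case_rate=0.1):
    """Draw one test scenario. Fields and distributions are illustrative."""
    scenario = {
        "junction": rng.choice(["T-junction", "crosswalk", "roundabout"]),
        "n_pedestrians": rng.randint(0, 5),
        "vehicle_speed_mps": rng.uniform(0.0, 14.0),  # up to ~50 km/h
        "spawn_distance_m": rng.gauss(40.0, 10.0),
    }
    # Occasionally force a rare-but-critical situation (an "edge case").
    if rng.random() < edge_case_rate:
        scenario["edge_case"] = rng.choice(
            ["pedestrian_steps_out", "stalled_vehicle", "sensor_occlusion"]
        )
    return scenario

rng = random.Random(42)  # seeded for reproducible test batches
batch = [sample_scenario(rng) for _ in range(1000)]
edge_cases = [s for s in batch if "edge_case" in s]
```

Because the edge-case rate is set explicitly, rare events can be made as frequent as the test plan requires, which is exactly what is hard to do with real-world driving data.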

One of the problems the team had to solve in the simulation was the sequence of actions that the shuttle must perform in different scenarios, such as pulling over, opening the doors, waiting for the passengers to get on, and more. To address this, they used Stateflow® to design state machines, which helped them model and simulate decision logic for these complex, multistep operations.
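The kind of decision logic modeled in Stateflow can be sketched as a plain transition table: each (state, event) pair maps to a next state, and events that are invalid in the current state are ignored. The states and events below are an illustrative guess at the bus-stop sequence, not the lab's actual chart.

```python
# Transition table for a bus-stop sequence (states and events are illustrative).
TRANSITIONS = {
    ("DRIVING",      "arrive_at_stop"): "PULLING_OVER",
    ("PULLING_OVER", "stopped"):        "DOORS_OPEN",
    ("DOORS_OPEN",   "boarding_done"):  "DOORS_CLOSED",
    ("DOORS_CLOSED", "clear_to_merge"): "DRIVING",
}

class StopSequence:
    def __init__(self):
        self.state = "DRIVING"

    def on_event(self, event):
        # Events that are not valid in the current state are ignored.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

sm = StopSequence()
for ev in ["arrive_at_stop", "stopped", "boarding_done", "clear_to_merge"]:
    sm.on_event(ev)
# After the full event sequence, the machine is back in DRIVING.
```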

Simulink enabled them to simulate the vehicle dynamics and to control the simulated shuttle with model predictive control (MPC) and PID controllers.
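Of the two controller types, PID is the simpler to sketch. The snippet below shows a textbook discrete PID tracking a speed setpoint against a crude single-integrator vehicle model; the gains, time step, and plant are illustrative stand-ins, not the shuttle's tuned values.

```python
class PID:
    """Textbook discrete PID. Gains here are illustrative, not tuned values."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Track a 5 m/s speed setpoint against a crude single-integrator vehicle model.
pid = PID(kp=0.8, ki=0.2, kd=0.05, dt=0.1)
speed = 0.0
for _ in range(500):
    accel = pid.step(setpoint=5.0, measured=speed)
    speed += accel * 0.1  # integrate commanded acceleration over one time step
```

An MPC controller would instead optimize a sequence of future commands against a vehicle model at every step; the PID shown here reacts only to the current error.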

Real-World Testing

The model development for the WATonoBus is an iterative process. Each iteration begins in the simulated environment, where the research team uses Simulink and Driving Scenario Designer to test the system across various situations and edge cases. Once the model proves robust in simulation, it’s ready to be deployed to the car and tested in the real world.

For this transition, the engineers use Embedded Coder® to convert their MATLAB software into an executable package that can run on in-car computers. Once the software is installed on the car, they test it in a special environment where they have full control over the traffic and objects. This enables them to spot any issues that might have been missed in the simulation training.

“Usually, we encounter some difficulties in the first few rounds of real-world testing. We return to the simulated environment, tune the system, and redeploy it to the shuttle. Once we are confident about its performance, we test it on the university Ring Road on our campus.”

The team installed MATLAB software on the in-car computer and conducted real-world testing on the university campus. (Video credit: University of Waterloo)

“Usually, we encounter some difficulties in the first few rounds of real-world testing,” Khajepour says. “We return to the simulated environment, tune the system, and redeploy it to the shuttle. Once we are confident about its performance, we test it on the university Ring Road on our campus.”

The Ring Road, a bustling 2.7-kilometer stretch that connects all university departments, parking lots, and other facilities, provides a comprehensive testing ground. With its mix of pedestrians, cyclists, vehicles, and road conditions, it mirrors the complexity of an urban environment.

“The Ring Road represents the urban environment very nicely,” Khajepour says. “And one of the difficulties in Canada is making sure that you can operate it in widely varying weather conditions.”

Once the model is deployed on the WATonoBus shuttle on the Ring Road, under the watchful eye of a safety driver, the team identifies new situations and scenarios that the WATonoBus struggles to handle. These scenarios are then reproduced in simulation, and the cycle begins anew. To date, the team has used this iterative approach to release two major software versions and to revise its models and hardware.

Open to the Public

A significant milestone was reached when Dr. Khajepour and his team received regulatory approval to operate the WATonoBus on the Ring Road with passengers. The autonomous shuttle is now available to the public daily.

“We are running it for about an hour a day to collect data and see how the pieces are coming together,” Khajepour says.

The WATonoBus stops at five points along its route, picking up and dropping off passengers at key locations around the campus. Since its launch, the shuttle has become popular among students traveling around campus.

“We apply what we learn on WATonoBus to other applications, including autonomous underground mining, autonomous mobility in healthcare facilities, automated distribution yards, and farming.”

Beyond its primary function, WATonoBus has evolved into a platform for innovation, fostering the development of technologies with applications far beyond autonomous driving on public roads.

“We apply what we learn on WATonoBus to other applications, including autonomous underground mining, autonomous mobility in healthcare facilities, automated distribution yards, and farming,” Khajepour shares.

The team has also developed a mobile application for the WATonoBus that displays the bus’s location on the Ring Road and provides the service schedule. This app will serve as a valuable tool for gathering feedback from students and other passengers, helping the team identify areas for improvement for the shuttle and its software.

The WATonoBus project is a practical test bed for many students’ thesis work. One doctoral student working on the project says, “I’m working on uncertainty handling in the decision-making module. Eventually, I hope that my thesis work will be deployed onto the WATonoBus test bed so I can see whether it enhances the performance of the shuttle.”

