Design and Simulation of Autonomous Surface Vessels (ASV) - MATLAB & Simulink

    Design and Simulation of Autonomous Surface Vessels (ASV)

    Overview

    Autonomous maritime systems are inherently interdisciplinary, so one of the primary challenges that engineering teams face is the need to plan, communicate, and integrate the different aspects of their designs. In this talk, we will demonstrate how MATLAB and Simulink can help provide a unified environment for the development and simulation of autonomous surface vessels (ASV). The webinar will be centered around a reference application that simulates an ASV navigating from one dock to another in a canal. We will discuss how MathWorks tools are used to create different modules in this reference application, including scenario simulations, vehicle models and hydrodynamics, autonomy stack, and user interfaces. We will highlight how the physics-based simulation in Simulink can be connected to photorealistic simulators such as Unity® to create a powerful framework for testing and validation of ASVs. We will also show how you can connect your existing work to Simulink and leverage the power of this simulation framework.

    Highlights

    In this session, we will use a reference application of a simulated ASV navigating from one dock to another in a canal to discuss how you can use MATLAB and Simulink to:

    • Create a digital twin that incorporates the ASV’s power and propulsion systems
    • Model and simulate hydrodynamics and wave-vessel interactions to predict the ASV’s performance
    • Develop autonomous algorithms such as perception and COLREGs-based navigation
    • Test models and algorithms in simulated scenarios, either a simple cuboid environment or a photorealistic environment such as Unity®
    • Implement test harnesses to automatically track and verify design requirements throughout development iterations
    • Integrate or connect with your existing work in other development environments

    About the Presenters

    Martin Luo – Application Engineer, MathWorks

    Martin Luo is an application engineer at MathWorks based in Sweden and a specialist in model-based design for robotics and autonomous systems. Before joining MathWorks, Martin worked as a flight control engineer for civil aircraft and developed navigation systems for quadcopters. Martin holds an MS in Robotics from KTH Royal Institute of Technology, and an MS in Guidance, Navigation, and Control from Beijing University of Aeronautics and Astronautics.

    Carlos Osorio – Aerospace and Defense Principal Application Engineer, MathWorks

    Carlos Osorio received a B.S. from the Pontificia Universidad Catolica del Peru and an M.S. from the University of California at Berkeley, both in Mechanical Engineering. He specializes in Automatic Control Systems, Robotics, and Vehicle Dynamics. Before joining The MathWorks in October of 2007, he worked in the automotive industry in the Advanced Chassis Technology Division first at Ford Motor Company and later at Visteon Corporation, where he was involved in the development and implementation of prototype electronic active and semi-active suspensions as well as steer-by-wire and brake-by-wire systems for passenger vehicles. At MathWorks Carlos primarily works with Aerospace and Defense customers and focuses on mechanical and electrical/electronic systems and advanced control applications.

    Russell Graves – Application Engineer, MathWorks

    Russell is an Application Engineer at MathWorks focused on machine learning and systems engineering. Before joining MathWorks, Russell worked with the University of Tennessee and Oak Ridge National Laboratory in intelligent transportation systems research with a focus on multi-agent machine learning and complex systems controls. Russell holds a B.S. and M.S. in Mechanical Engineering from The University of Tennessee.

    Mike Rudolph – Aerospace and Defense Industry Manager, MathWorks

    Michael Rudolph has been the Aerospace and Defense Industry Manager at MathWorks since 2019. In his role, he works with engineering leadership throughout the industry to understand ongoing and emerging technology trends from autonomous systems to multifunction RF systems. Before joining MathWorks, he spent a decade at Raytheon BBN Technologies performing interdisciplinary research on various sensor and RF systems for DoD customers like Defense Advanced Research Projects Agency (DARPA). He holds bachelor’s and master's degrees in engineering from Penn State University, and an MBA from University of Virginia's Darden School of Business.

    You Wu – Robotics Industry Manager, MathWorks

    Dr. You Wu is a robotics evangelist and the Robotics Industry Manager at MathWorks, promoting best practices in robot development processes to industrial clients. Dr. Wu received his Ph.D. degree from MIT with a focus on underwater soft robotics and a Bachelor’s degree from Purdue University. Before joining MathWorks in 2020, he was CTO of Watchtower Robotics, a startup that put inspection robots into municipal water pipe networks.

    Recorded: 30 Jun 2022

Welcome to today's session on design and simulation of autonomous surface vessels. My name is Mike Rudolph, Aerospace and Defense Industry Manager at MathWorks. And I'm joined by my application engineering colleagues, Martin Luo, Carlos Osorio, and Russell Graves, as well as Robotics Industry Manager, You Wu.

    Collectively, we support teams like yours around the world working on autonomous systems. In this talk, we're going to role-play various stakeholders, highlighting the parts of the autonomous system development workflow, to demonstrate how a team like yours might leverage the MathWorks offerings to do the same.

    Martin, Carlos, and Russell are going to provide a window into some solutions that engineers like you might be interested in learning more about. You Wu and I will be asking the team some questions about their work along the way. Today, we're going to be demonstrating an example mission that has an autonomous ship, as you see here, navigating through a crowded canal.

    So regarding our mission, the three main challenges that your team might face include things like collaboration among a highly interdisciplinary team, realistic modeling and simulation that includes dynamics, and testing and verification of the design for rapid evaluation and feedback. We're going to be showing how that interdisciplinary team might collaborate on the design and simulation of an ASV using MATLAB and Simulink as a unified environment.

    And we'll do this by building a multi-domain physical model of the platform, and connecting that platform to environments of varying degrees of fidelity. We will then discuss how you might implement autonomous algorithms using that physical model. And then we'll show how you might leverage all of these to run tests and verify requirements using the complete model under the hood.

    So Martin, if I'm a customer looking to acquire one of these fairly expensive systems, but am looking to reduce risk, how might you leverage the environment and platform models to demonstrate that system to me, and convince me that your design is really going to work?

    Great question. In my opinion, such a digital twin should include two parts: the vessel platform with virtual sensors, and a set of scenarios. A digital twin creates a virtual environment, which enables us to see how an ASV performs before expensive physical tests.

    To give more details, here is a digital twin of an ASV implemented in Simulink. First of all, we have models of the vessel platform, payload, and environment, where the vessel platform includes the hydrodynamics, hydrostatics, actuators, and electrical system. We have the traditional guidance, navigation, and control modules, which can be regarded as the automation system.

    Moving on, we have the situational awareness and planning modules, which can be regarded as the autonomy system. Of course, we need the scenes and scenarios to feed data into the perception sensors and close the loop.

    Before we dive in, Martin, can you show us how you build a scenario? I noticed in earlier videos that you built it in Unity game engine. Is it difficult?

    It wasn't too hard for me. Let me show you a high-level workflow for creating scenes and scenarios in Unity. First, we utilize the 3D satellite imagery from Google Maps. From Google Maps, we took a bunch of screenshots, and used a tool called RenderDoc to convert them from 2D to 3D.

    We then import them into Blender for 3D modeling and rendering, stitch all the 3D models together, and export the scene as an FBX file, which can be used by many simulators.

    In this case, the scene in FBX format will be imported into Unity as an asset. In addition, we import the CAD models of the vessels, and add waves, weather, and virtual sensors, such as LiDAR and camera, into Unity.

    Now we have all the elements for creating scenes and scenarios in Unity. With the scenarios and the vessel platform, we can perform co-simulation between Unity and MATLAB and Simulink through ROS, ROS 2, or DDS, publishing and subscribing data through the middleware.

    We can also control the scene from either Unity or MATLAB and Simulink. For example, we can control the weather and the time of day. We can also control the sun position, and control the waves during the simulation.

    Let me take this opportunity to introduce our ASV platform. There are four cameras mounted on the sensor rig facing the front, back, port, and starboard sides, respectively, and a LiDAR on the top. There are also an inertial measurement unit and a GPS.

    We also have a camera sensor model in Unity. Simulink receives images by using a ROS subscriber block. The images can be visualized during simulation. We have a laser scan sensor model in Unity. In this example, Simulink receives the latest scan by using a ROS subscriber block.

    The laser scan can be visualized during the simulation. In the other direction, Simulink publishes a pose message to Unity, in this case through ROS. On receiving the pose message, Unity updates and visualizes the vessels in the scene. With that, the simulation loop between Unity and MATLAB and Simulink has been closed.

    This looks really nice, but not very realistic. How might you add physics so that you can control this like a real ship?

    Thanks, Mike. Yes, so in order to properly test our autonomous navigation algorithms, or our low-level control strategies, we need to have a dynamic model of the vehicle that we're trying to control. So let's explore our example model. We're going to focus on the vehicle platform section. First, let's take a look at the vessel platform itself.

    So we have mathematically modeled the vehicle using a pretty standard textbook implementation. Actually, we have followed quite closely the implementation from Professor Fossen, whom I want to thank for his work on all of this.

    So if you look way on the right side, the vehicle has been modeled with six degrees of freedom. So it's a rigid body mass that is receiving a variety of forces and moments. So all the external effects on the vehicle, the waves, the currents, the hydrodynamics, all those effects are being calculated by Simulink. And those are being provided to the vehicle motion equations as forces and moments.

    So that is implemented in this section. This is the part that is solving the dynamic equations of motion. But if we look a little closer, before that we have a section that is computing the hydrodynamic effects, for example.

    So we have damping effects and Coriolis effects being calculated. All of this is a lot of MATLAB matrix multiplications and computations being done in Simulink to convert-- to first, calculate the forces, and then to convert these forces to the appropriate frames of reference, or to appropriate orientation.

    And so you have calculation of hydrodynamic coefficients. In this case, we are using a simplified approach. So we're just using constant coefficients for the hydrodynamic forces, and damping produced by the motion of the vehicle on the water.
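As a rough, language-agnostic sketch of the structure described here (the webinar itself builds this in Simulink), one Euler integration step of a 3-DOF surge-sway-yaw model with constant linear damping might look as follows. The mass, inertia, and damping values are invented for illustration, not taken from the reference application.

```python
def step_vessel_3dof(nu, tau, dt, m=500.0, Iz=200.0, d=(50.0, 100.0, 80.0)):
    """One Euler step of M*nu_dot + C(nu)*nu + D*nu = tau for a surface
    vessel in surge, sway, and yaw, with diagonal mass and linear damping."""
    u, v, r = nu                       # surge speed, sway speed, yaw rate
    # Rigid-body Coriolis contributions (CG at the origin): C(nu)*nu
    # couples surge and sway with the yaw rate; its yaw component cancels.
    cu, cv = -m * v * r, m * u * r
    u_dot = (tau[0] - cu - d[0] * u) / m
    v_dot = (tau[1] - cv - d[1] * v) / m
    r_dot = (tau[2] - d[2] * r) / Iz
    return (u + dt * u_dot, v + dt * v_dot, r + dt * r_dot)
```

In a real model these constant coefficients would be replaced by the lookup tables mentioned above when CFD data is available.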

    But here is where you will bring in-- if you actually have a CFD tool that you're using to analyze the shape and the geometry of your vehicle, for example, those coefficients, instead of having just a constant value like we have here, those coefficients can come in, in the form of lookup tables, for example.

    We are calculating the hydrostatic effect. So this is all the buoyancy effects that are affecting the vehicle. We have a section that is computing the waves and cross flow effects. Again, that's being performed with a MATLAB function. And then we are computing the thrust, and the propellers producing thrust-- the impulse for the vehicle to move forward or turn, depending on whatever maneuver we're asking it to do.

    So that is, at a high level, the vehicle dynamics itself. But notice that we have two additional blocks here. So we are including the ability to add a payload to the vehicle. And this is important because the location of this payload is going to affect the location of the center of mass of the vehicle. So there need to be transformations to locate the payload.

    We have also included the environment. In this case, we have a simplified version of this, but we're including water current. We're modeling this as just a constant direction of water current with a particular speed, in this particular case. There are much more complicated models of currents, where you can have random distributions that change over time.

    Gravitational effects, of course, are all considered as well. So that covers the primary physics of what we require for the vehicle. Now, all those computations are producing measurements of all the basic vehicle states. All the basic vehicle states are being fed back into the low-level control algorithm.

    So if I explore the control algorithm, there are many sections here. So we have multiple modes of operation. The two primary low-level control algorithms are the speed controller, which regulates the velocity at which the vehicle is moving in the surge direction, so on the x-axis, let's say. And then we're controlling heading.

    Both of these controllers are very straightforward. So we have just a PID controller for velocity. And the heading controller is also a PI controller. So we have the yaw angle and the desired yaw angle, which is the direction that you want. And we have a feedforward element, taking advantage of the fact that we have a measurement of the yaw rate.
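As an illustration of the structure just described, a PI heading controller with yaw-rate feedforward can be sketched like this; the gains are arbitrary placeholders, not the values used in the webinar's model, and heading-error wrap-around is assumed to be handled by the caller.

```python
class PIHeadingController:
    """PI heading control with yaw-rate feedforward damping (toy gains)."""

    def __init__(self, kp=2.0, ki=0.1, kff=0.5, dt=0.01):
        self.kp, self.ki, self.kff, self.dt = kp, ki, kff, dt
        self.integral = 0.0

    def update(self, psi_des, psi, yaw_rate):
        err = psi_des - psi                 # heading error
        self.integral += err * self.dt      # accumulate for the I term
        # PI action plus feedforward using the measured yaw rate
        return self.kp * err + self.ki * self.integral - self.kff * yaw_rate
```

The feedforward term adds damping directly from the measured yaw rate, so the PI gains can stay modest while the turn remains well behaved.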

    Once we have that control algorithm design, we can bring it into the simulation model that includes all the nonlinear physics and all the effects that I mentioned. And we want to verify that they're performing the right way. So let me quickly show you a video recording of the simulation running with the heading control activated, performing an s-shaped maneuver.

    So what you're seeing is our ship doing an s maneuver. It's going to start over again. So you can see, it's going to turn to the right first. So we're doing just a very simple sinusoidal maneuver.

    Thank you, Carlos. I can tell that here you use a lumped model, or ideal models, to represent the actuators. What if I want to add more fidelity to it, for example, by bringing in the propeller efficiency curve?

    Yes, thanks, You. And this gives me a chance to explore the model a little bit more and make a couple of points. So if you notice, the propeller dynamics are modeled just as a first-order transfer function. But this raises a very important point about modeling dynamics in general: what level of fidelity do you want to include in your model?

    So in our case, we are just assuming our propellers are behaving in a pretty nominal way. So we are not very concerned about the dynamics of the propellers themselves. But one of the big strengths of Simulink and our MathWorks tools is the ability to elaborate on these dynamics and add fidelity as you are working on the model, and as you find that you need to test for a different kind of maneuver, or verify that your propellers are behaving properly.

    So let me switch from the main model. So this is going to give me a chance to introduce you guys, for those of you that are not familiar, to our physical modeling libraries of modeling tools. So this is a product called Simscape. And under Simscape, we have a variety, a family of products that model electrical and electronic systems, mechanical systems, hydraulic or fluid systems, for example.

    And what you're seeing here is a schematic implementation. So one of the cool things about Simscape is that instead of having to write all the mathematics as transfer functions, or differential equations, it models a system as a schematic representation of your system. So what you're seeing in this schematic, for example, is an electrical source.

    So we will have some kind of battery or some kind of electrical generation capability that is going to produce electrical power, that is going to be driving a motor. So we have the representation of a permanent magnet motor drive here in a single block-- very easy to parameterize.

    So you can use data sheet specifications, or we also have pre-parameterized motors for permanent magnet machines or brushless DC motors, for example, which are very typically used for this. The electrical power is being converted into a mechanical system.

    So now I have a shaft with a gearbox, or we have a transmission system there-- propeller inertia. And all of that is connected to a propeller block. This propeller block includes the primary dynamics of a marine propeller. Just a quick exploration-- there's multiple options for parameterization. So you can just do polynomial curves for your torque and thrust coefficient.

    So you might have efficiency curves, like You was mentioning, or you might have torque coefficients or thrust coefficients that you want to represent from a mathematical calculation from your CFD tool, or maybe the manufacturer is giving you some of these efficiency curves. So you can bring those in as tabulated coefficients, for example.
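The lookup-table idea can be sketched as follows: an open-water thrust calculation T = rho * KT(J) * n^2 * D^4, with the thrust coefficient KT interpolated from a tabulated curve. The curve values, diameter, and density below are hypothetical placeholders, not manufacturer data.

```python
from bisect import bisect_left

# Hypothetical open-water curve: advance ratio J vs. thrust coefficient KT.
J_TAB = (0.0, 0.4, 0.8)
KT_TAB = (0.45, 0.30, 0.10)

def thrust(n, Va, D=0.5, rho=1025.0):
    """Propeller thrust T = rho * KT(J) * n^2 * D^4, with KT linearly
    interpolated (and end-clamped) from the tabulated curve."""
    J = Va / (n * D) if n else 0.0        # advance ratio
    if J <= J_TAB[0]:
        KT = KT_TAB[0]
    elif J >= J_TAB[-1]:
        KT = KT_TAB[-1]
    else:
        i = bisect_left(J_TAB, J)
        t = (J - J_TAB[i - 1]) / (J_TAB[i] - J_TAB[i - 1])
        KT = KT_TAB[i - 1] + t * (KT_TAB[i] - KT_TAB[i - 1])
    return rho * KT * n ** 2 * D ** 4
```

Swapping the constant coefficient for a curve like this is exactly the kind of fidelity upgrade the tabulated parameterization enables.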

    The rotational energy is being converted into thrust. And the thrust will be what is applied to the vehicle, essentially. Let me quickly run the simulation so you can see how this behaves. We are requesting-- we have a step change in the torque request to our motor drive. So that's why you are seeing the simulation, the step change that is occurring at about 0.05 seconds, like at 50 milliseconds.

    So we have torque measurements. We have speed measurements. We have thrust measurement, the thrust being produced by the propeller, which is the important component that we want to connect to our vehicle dynamics, essentially-- current measurement, efficiency for the motor, efficiency for the propeller. So we can keep track of all these different characteristics-- same as if you were implementing it with basic fundamental mathematics.

    But the important thing is here we are adding higher fidelity to the propeller dynamics. So it's all the full nonlinear mathematics for the propeller. And we're adding the dynamics of the electric drive also.

    Wow, thank you, Carlos. I'm definitely convinced now that there is plenty of fidelity in this model. Since we're in a busy canal, how might we develop autonomy algorithms to estimate our own position, sense and perceive the environment, as well as plan and decide on a course of action?

    That is a three-part question. Let me first talk about how MATLAB tools help the vessel estimate its own orientation and position. MATLAB and Simulink provide inertial sensor fusion algorithms, for example, the AHRS filter to estimate the orientation, and the INS filter to estimate the pose.

    MATLAB and Simulink also provide virtual sensor models to generate synthesized data for your design and test. Automatic tuning fine-tunes the configuration parameters of the sensor fusion algorithms to reduce the estimation error.
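As a much-simplified stand-in for the inertial fusion filters mentioned here, a one-axis complementary filter blends the integrated gyro rate with the accelerometer's gravity-based pitch estimate. The blend factor below is an assumed value, not a tuned parameter.

```python
import math

def complementary_pitch(pitch, gyro_y, ax, az, dt, alpha=0.98):
    """One filter step: trust the integrated gyro at high frequency and the
    accelerometer's gravity direction at low frequency."""
    pitch_gyro = pitch + gyro_y * dt      # propagate with the rate gyro
    pitch_acc = math.atan2(-ax, az)       # pitch implied by the gravity vector
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
```

A production AHRS or INS filter fuses more sensors and estimates biases as well, which is where automatic tuning of the filter parameters pays off.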

    After being able to estimate the vessel's position and orientation, we can further add a perception module for other ships and a motion planning module, just like how Carlos added the dynamics module.

    Hey, Martin, let me add a caveat here. Many of our customers will have multiple engineers on their team, some of them working on perception, while others work on the motion planning algorithm.

    Typically, the motion planning engineers need to wait on the perception engineers before they get the ground truth data, or the real data, to test the motion planning algorithms. How does Simulink decouple this and enable some independence in the teamwork?

    Great question. Simulink really enables independent development of each module in this framework with the help of specialized toolboxes. When I'm working on a motion planning module, I find it really helpful to use one of these tools, such as the Cuboid simulator. It runs inside MATLAB, takes minutes to master, and helps me quickly test my motion planning algorithms.

    In the Cuboid simulation environment, vessels and other actors are represented as simple box shapes, or polygon meshes. Use this environment to rapidly author scenarios, generate detections, and test control and planning algorithms. Cuboid can co-simulate with both MATLAB and Simulink.

    You can design scenarios interactively using the Scenario Designer app. Here is how to create scenes and scenarios interactively with the Scenario Designer. First, we add the ego vessel from its CAD model. We can change the parameters of the ego vessel from a parameter editor, such as its color.

    We can also change the position and orientation of the meshes in order to make it look correct in the editor.

    We can also add sensors to the ego vessel, such as a LiDAR in this example. We can specify its position from the editor and change its parameters.

    Next, we're going to add three non-ego vessels with the same workflow as for the ego vessel, importing their CAD models. Besides the vessels, we can add the real-world scene from a CAD model. This real-world scene is from Google Maps, the same as the one in Unity.

    Next, we can specify the trajectories for each vessel. As you can see, we just need to drag and drop the waypoints on the scene for each vessel, which is pretty intuitive and easy. In the trajectory table, we can also specify exact waypoints, arrival times, and speeds.

    Lastly, we can run the simulation in the Scenario Designer app to review our design.

    In the previous example, we added a LiDAR sensor. Besides the LiDAR, MATLAB and Simulink offer other virtual sensors as well. The behavior planning specifies how the ASV shall react in certain situations. We use state machines for modeling and simulating the decision logic, task scheduling, fault management, and so on. A state machine is defined by a set of states and transitions.

    In this example, we have Idle mode, Waypoint Following mode, and Collision Avoidance mode. During the simulation, we can see the animation showing how the state machine transitions between states. For example, when the ego vessel detects a risk of collision, it will transition from Waypoint Following mode to Collision Avoidance mode.
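The mode logic just described can be sketched as a tiny transition function; in the actual model this would be a state machine chart, and the condition names here are illustrative.

```python
def next_mode(mission_active, collision_risk):
    """Return the ASV behavior mode given the current conditions."""
    if not mission_active:
        return "Idle"
    if collision_risk:
        return "Collision Avoidance"
    return "Waypoint Following"
```
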

    Regarding collision avoidance, two different techniques have been implemented. One is local replanning using the Vector Field Histogram. The other is policy-based planning following the COLREGS rules, where COLREGS stands for the Convention on the International Regulations for Preventing Collisions at Sea, published by the International Maritime Organization.

    For local replanning using the Vector Field Histogram, first of all, it receives readings from a LiDAR or range sensor, along with a target direction to drive toward. The Vector Field Histogram controller then computes an obstacle-free steering direction to avoid obstacles. This feature is offered by both MATLAB and Simulink.
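A toy version of this idea: from a set of range readings, pick the obstacle-free direction closest to the target direction. The real Vector Field Histogram builds and thresholds a smoothed polar density histogram; this sketch keeps only the core selection step, with an assumed clearance threshold.

```python
def vfh_steer(ranges, angles, target_angle, clearance=5.0):
    """Among directions whose range reading exceeds the clearance threshold,
    return the one closest to the target direction (None if all blocked)."""
    free = [a for rng, a in zip(ranges, angles) if rng > clearance]
    if not free:
        return None                       # no safe steering direction
    return min(free, key=lambda a: abs(a - target_angle))
```
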

    For policy-based planning following the COLREGS rules, here is an example. There are two vessels in a crossing situation that involves a risk of collision. By following COLREGS Rule 15, the collision is avoided.

    What the rule tells us is that the vessel in orange on the bottom, which has the other vessel on its own starboard side, should keep out of the way. In this example, it turns right to avoid the collision. It could also reduce its speed or perform other maneuvers to avoid the collision.
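The give-way test behind this behavior can be sketched as a relative-bearing check. This is a simplification that ignores head-on and overtaking geometry; the 112.5-degree sector bound corresponds to the conventional edge of the starboard sidelight arc (22.5 degrees abaft the beam).

```python
def crossing_give_way(own_heading_deg, bearing_to_other_deg):
    """True if the other vessel lies in our starboard crossing sector
    (0 to 112.5 degrees relative bearing), i.e. we must keep clear."""
    rel = (bearing_to_other_deg - own_heading_deg) % 360.0
    return 0.0 < rel < 112.5
```
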

    Here are more examples. These scenarios and simulations are created and performed in the Cuboid simulation environment, which enables us to intuitively and quickly design and test the collision avoidance algorithm under various scenarios.

    That is a nice tool to develop and test collision avoidance algorithms. How about the other part, where you enable a vessel to determine a potential collision?

    It's a good point. To develop situational awareness for the ASV, we need two elements: the capability to perceive and locate other ships, and the capability to determine the risk level of a possible collision. Let me describe the risk level assessment part first, and then my colleague Russell will showcase his work on perception.

    Let me use the same example as in the planning section. The ego vessel is moving north, and there is another vessel coming from its starboard side. A collision situation is first identified by calculating the time and distance of the closest point of approach.

    A minimum acceptable distance is defined. The vessel is regarded as high risk if the distance at the closest point of approach is less than the minimum distance, and vice versa.
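For two vessels on straight, constant-velocity tracks, the time and distance of the closest point of approach follow directly from the relative position and velocity. A minimal sketch of the check described above (the minimum-distance value is a placeholder):

```python
def cpa(p_own, v_own, p_other, v_other):
    """Return (t_cpa, d_cpa): when the two tracks are closest, and how close.
    t_cpa is clamped at 0 if the vessels are already diverging."""
    rx, ry = p_other[0] - p_own[0], p_other[1] - p_own[1]   # relative position
    vx, vy = v_other[0] - v_own[0], v_other[1] - v_own[1]   # relative velocity
    v2 = vx * vx + vy * vy
    t = 0.0 if v2 == 0.0 else max(0.0, -(rx * vx + ry * vy) / v2)
    dx, dy = rx + vx * t, ry + vy * t
    return t, (dx * dx + dy * dy) ** 0.5

def high_risk(d_cpa, d_min=50.0):
    """Flag the encounter when the CPA distance violates the minimum."""
    return d_cpa < d_min
```
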

    Here is a frame of the simulation results. The size and color of the circle indicate the risk level of collision, and the two numbers are the time and distance of the closest point of approach. The circle in green indicates low risk, because the distance at the closest point of approach is larger than the minimum distance.

    Here is another frame of the simulation results. The circle in red indicates high risk, because the distance at the closest point of approach is smaller than the minimum distance. And here is the complete simulation result in Cuboid. This risk assessment result is used for determining the behavior in the planning module.

    Here are more examples of assessing the risk level for different scenarios in the Cuboid simulation environment. As you can see, Cuboid is well suited for developing and testing the planning and situational awareness algorithms.

    We have introduced two simulation environments for scenarios: one is Unity and the other is Cuboid. Note that we can reuse the scenarios between these two simulation environments for different purposes. For example, we can test the planning and control algorithms in Cuboid and test our perception algorithm in Unity, using the same scenarios.

    Very nice, Martin. I like how you develop the risk assessment algorithms. At the same time, you are using ground truth or perfect measurements in your algorithm development so far. Is that right? What if I want to replace ground truth with a real perception algorithm? What does the process look like?

    Yeah, sure, You. Let's try to replace some of that ground truth information with networks from other existing sources. We might even build our own. But it's much easier if we can import a pre-trained network from something like TensorFlow or Caffe. Fortunately, MATLAB provides a lot of different pathways for getting pre-trained nets into the MATLAB MathWorks ecosystem.

    We can either go through the Open Neural Network Exchange, the ONNX format, or we have direct import options for TensorFlow, for Keras, and for Caffe. You can see here that importTensorFlowNetwork is about as simple as it gets-- it practically spells out what it does for you.

    We can use these sort of really simple functions to easily pull those pre-trained nets in, and use them to detect ships surrounding our automated surface vessel in this specific instance.

    Thanks, Russell. In this YOLOv4 result, all the boats are just labeled as boats. What if we want to build something more customized that is able to detect and differentiate different types of boats?

    They all say boat. Yeah, that's not very useful, I guess. If we want to customize that computer vision algorithm, we have a four step process that we propose as sort of this best practice workflow for AI design. It starts with data preparation, and goes all the way through to deployment.

    We have this sort of whole cradle to grave situation inside MATLAB, inside the MathWorks ecosystem. Today I'm going to focus primarily on data preparation and AI modeling, as Martin and the rest of the team will handle integrating the trained solution back into Simulink for testing, and finally, deployment.

    And as somebody working with neural networks or any machine learning algorithm, you know that data is basically the food for these algorithms, these things, right? And we'll spend about 75% to 80% of our time, or at least of my time, dealing with cleaning the data, pre-processing the data, and labeling the data.

    Fortunately, MATLAB provides a number of different applications, these little apps, that are graphical interfaces to enable you to easily and quickly take your existing data and label it. If you don't have real data, like real images from cameras on these ships, we can lean on 3D visualization engines, like Unity, that Martin showed earlier.

    We also have a direct connection to Unreal Engine, which I'm more familiar with. So Martin was dealing with the Cuboid simulation and the Unity simulation. But I'm going to focus on Unreal Engine. So here I have my Simulink model connected and configured to connect to Unreal. And I can just hit Play on the Simulink side. And sort of similar to what we were seeing with Unity, this is a co-simulation.

    So once we hit-- once we run Simulink, then we're going to move over into Unreal Engine and hit Play here. And that will launch the simulation. So we can see we have our camera image. And then in order to generate some training data rapidly, I'm actually going to use some built-in semantic segmentation techniques.

    So semantic segmentation is like a fancy way of saying that we're going to color different things in the images in different colors. So we have-- instead of having all of this clutter, all of the background noise of the water and the bridge and everything, we just have this red-- solid red-colored yacht, where our yacht is in the image.

    This enables us to quickly and easily use some really common image processing techniques to isolate where those objects of interest are in our images, and build up a pipeline for automatically labeling the data in a much more clean and efficient way.

    For example, we can go immediately into a MATLAB live script, where we can use some of the built-in video reading capabilities to take those two data streams, the raw camera images, and the semantic segmentation images, and we can read them in frame by frame, and use those really stark colors in the semantic segmentation data to easily create bounding boxes and labels, which we can then put over the top of the matching raw RGB camera data to create a nice training set we can use to feed our machine learning algorithms.
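The mask-to-label step can be sketched as follows: given a semantic-segmentation mask, the bounding box of one class is just the extent of its pixels. Here a plain nested list stands in for an image, so the sketch stays library-free; real pipelines would operate on image arrays.

```python
def bbox_from_mask(mask, label):
    """Return (xmin, ymin, xmax, ymax) covering all pixels equal to `label`,
    or None if the class is absent from the mask."""
    coords = [(x, y) for y, row in enumerate(mask)
              for x, v in enumerate(row) if v == label]
    if not coords:
        return None
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return min(xs), min(ys), max(xs), max(ys)
```

Running this per frame over the solid-color segmentation stream gives bounding boxes that can be paired with the matching RGB frames as training labels.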

    Once we have our data prepped, we have two of these apps, the Deep Network Designer and the Experiment Manager, which help you either create deep networks from scratch or customize existing networks that you import, like we did earlier with YOLOv4. We can pull those into the graphical environment, experiment with them, and remove or rearrange layers.

    And then the Experiment Manager is really useful for helping us fine-tune those training and machine learning options to make sure that we get the accuracy we need to meet our requirements. So that's it. We've gone through and prepped our data. We've actually generated some synthetic training data.

    And we've selected and customized our AI. So now, instead of having boat, boat, and boat, we have yacht, sailboat, and water taxi in this image.

    Thank you, Russell. I like it. Since you already mentioned that we can bring different neural networks into this simulation framework you presented here, can we bring in other kinds of code? For example, my team has already built several modules. Do we have to rebuild all of them in order to use your simulation framework?

    You don't have to. Modularization benefits organizations developing a digital twin that consists of many functional pieces. Connectivity enables extending the digital twin to third-party environments and performing co-simulation. Let me take the planning module as an example to illustrate how we achieve it.

    Simulink offers different componentization techniques, such as model references, subsystem references, libraries, and variant subsystems, which enable efficient and robust system development, facilitate collaboration, and improve verification workflows.

    Large-scale systems typically use a combination of these Simulink components. Simulink, as an integration platform for the digital twin, also allows you to import or implement C and C++ code by using the S-Function, C Caller, or C Function block. By using the MATLAB Function block, we can call Python or import a pretrained deep neural network.
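    As a small illustration of the Python interoperability mentioned above, MATLAB can call into a configured Python environment directly via the `py.` interface (and the MATLAB Function block can wrap similar MATLAB code inside Simulink). This sketch assumes a Python interpreter has already been set up with `pyenv`:

    ```matlab
    % Inspect the active Python interpreter configured for MATLAB
    pe = pyenv;

    % Call a Python standard-library function from MATLAB
    result = py.math.sqrt(2);

    % Convert the Python float back to a MATLAB double
    disp(double(result))
    ```

    The same mechanism extends to user modules on the Python path, which is one way existing Python-based components can be connected without a rewrite.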

    Simulink also supports co-simulation between components. For example, we can import FMU models, or integrate third-party functionality into Simulink through ROS, ROS 2, or DDS. All these features and techniques make working in teams easier and more efficient.

    I've been impressed so far, and this is great for development of the system. But how can I use this full simulation for system validation and testing of my customer requirements before I actually put this out to sea?

    We can reuse the scenarios for testing and validating the system. In Simulink, we can even automate the testing. First of all, we can author and link requirements with our design. By performing simulations, we can analyze and validate whether the requirements are correct and complete. We can also verify whether the design meets the requirements.

    The model can generate production code, which is optimized for embedded processors such as MCUs, FPGAs, PLCs, or GPUs. With that said, by using code generation, we can deploy our design to different hardware platforms.

    After generating code, we can reuse the test cases from the model verification phase to test the generated code instead. We can also perform static analysis to find bugs and prove the absence of critical runtime errors. Let's take the control module as an example for more details.

    As there are already scenarios for the design, I would like to automate the tests under these scenarios. First, we create a test harness for the control module. The test harness provides an integrated simulation environment that isolates the module. Then we author a test sequence and a test assessment as the inputs and outputs.

    For the input, we can feed our scenarios in by creating test cases. Besides the Test Sequence block, we can use other formats to define the inputs, such as MATLAB scripts, Excel files, the Signal Editor block, and more.
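    One way to script such a test input programmatically is shown below. This is a hedged sketch, not the webinar's actual test setup: the harness name `control_harness` and the constant 5 m/s speed command are hypothetical, and it uses the documented `Simulink.SimulationInput` object with `setExternalInput`:

    ```matlab
    % Build a simple test input: a constant 5 m/s speed command over 10 s
    t = (0:0.1:10)';
    speedCmd = timeseries(5 * ones(size(t)), t);

    % Configure a simulation of the (hypothetical) test harness model
    simIn = Simulink.SimulationInput('control_harness');
    simIn = setExternalInput(simIn, speedCmd);

    % Run the simulation; 'out' holds logged signals for assessment
    out = sim(simIn);
    ```

    Defining inputs this way keeps each test case self-contained and easy to sweep over multiple scenarios from a script.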

    For the output, after running the test, we get the test results and reports. Besides the test assessment, we can use other formats to define the pass/fail criteria and visualize the results.

    With the test harness, we can specify which simulation mode to run, including model-in-the-loop, software-in-the-loop, processor-in-the-loop, and hardware-in-the-loop testing. After we perform the functional tests with the scenarios, we can analyze the coverage, which measures testing completeness in the models.

    Here we see the coverage highlighting on the model and a summary report from the coverage analysis. The blocks highlighted in red indicate they are not fully covered, meaning there are gaps in the test cases. If we look into an integrator in the speed control subsystem, we find that the saturation limits of this integrator have never been reached.
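    For reference, a model coverage run like the one described can be driven from the command line with the Simulink Coverage `cvtest`/`cvsim` API. The model name `speed_control` is hypothetical here; the settings and report call follow the documented API:

    ```matlab
    % Set up a coverage test on the (hypothetical) model 'speed_control'
    testObj = cvtest('speed_control');
    testObj.settings.decision  = 1;   % collect decision coverage
    testObj.settings.condition = 1;   % collect condition coverage

    % Simulate the model and collect coverage data
    covData = cvsim(testObj);

    % Generate an HTML summary report with per-block highlighting
    cvhtml('coverage_report', covData);
    ```

    The generated report is where gaps like the never-reached integrator saturation limits show up.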

    With this, we can go back and check whether there are any missing test cases, missing requirements, or unintended functionality in our design, or whether it's a design error. The same applies to code coverage analysis: we reuse the scenarios for the functional tests and collect the coverage results for the generated code.

    Here is an example indicating that the C code generated from a Multiport Switch block is not fully covered: its default path is never exercised. Besides functional tests, we can verify compliance with industry standards and guidelines to assess design quality.

    For example, this shows the supported high-integrity software development standards and guidelines, such as IEC 61508. Secure coding standards, such as CERT C, are also supported to address cybersecurity concerns in embedded systems. Detecting runtime errors is also supported, such as integer overflow, division by zero, array out of bounds, and more.

    Let's have a look at this example. A rate limiter for an input signal in the speed control system is potentially violating the IEC 61508 standard, because equality or inequality operations should not be used on floating-point values due to floating-point precision issues. Fixing this improves model robustness and prevents unexpected results.
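    The precision issue behind that rule is easy to demonstrate in a few lines of MATLAB; this is a generic illustration, not code from the webinar's model:

    ```matlab
    % Exact equality on floating-point values is unreliable:
    a = 0.1 + 0.2;
    b = 0.3;
    isequal(a, b)        % false: a and b differ by a rounding error

    % A tolerance-based comparison is robust to that error:
    tol = 1e-9;
    abs(a - b) < tol     % true
    ```

    The same idea applies inside a model: compare signals against a tolerance band rather than testing floating-point signals for exact equality.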

    Just as we verify compliance of the model, we can verify compliance of the C and C++ code. Here is an example: we check the C code and detect violations of MISRA C rules, a set of software development guidelines for C programming.

    One of the violations found is a missing else statement in the C code. Code verification can reduce the number of bugs and improve the quality of the code. Also note that this code verification works not only for generated code, but also for handwritten code.

    Wow, we covered a lot of information in a short amount of time. So thank you to the team for giving us such a comprehensive overview. I hope this gave you a great sense for how an interdisciplinary team like your own might collaborate on the design and simulation of an autonomous system through model-based design using MATLAB and Simulink.

    We plan to share more information, as well as in-depth how-to videos, on our website soon, so please keep an eye out for that. If you'd like a demonstration or a private overview for your team in the meantime, we'd be happy to provide one. Just reach out, and we'll get in touch. Thank you so much.
