Rise of the machines as driverless AI racing cars return to Yas Marina for climactic showdown
A2RL returns for a second race, with the AI-driven cars competing over 20 laps this weekend at Yas Marina.
This weekend marks the return of a motor race consisting entirely of AI-driven single-seaters, based on Super Formula’s SF23, for 20 laps of Yas Marina.
Eighteen months on from the first race of the A2RL competition, the fully autonomous racing series returns this weekend for its second season, with six teams having qualified for the race at the Yas Marina Circuit.
When is the next A2RL race?
- When? 15th of November.
- Where? Yas Marina Circuit, Abu Dhabi, United Arab Emirates.
- How can I watch? The event will be streamed and broadcast the day following the race, from 11 am UK time, on A2RL’s official YouTube channel. It will also be broadcast on the Abu Dhabi Media Network, StarzPlay, and Motorsport TV.
- Who is competing? Six teams have qualified for the Grand Final race, competing for the $2.25 million prize pool. The six qualifying teams are TUM, Unimore, Kinetiz, TII Racing, PoliMOVE and Constructor. Reigning champions TUM have qualified in pole position after a fierce ‘Multi-Car Qualification’ sprint race against rivals Unimore. The five teams that did not qualify for the final will compete in a ‘Silver Race’ to determine their place in the standings. These teams are: RAPSON, Code 19, Fly Eagle, FR4IAV, and TGM Grand Prix.
What is A2RL?
Two years ago, Abu Dhabi-based company ASPIRE launched A2RL, combining artificial intelligence with mechanical engineering know-how to create a fully driverless racing series.
The first race saw 10 teams, pulled together from university entrants and research institutions around the world, fight for the $2.25 million prize fund via a race at Yas Marina, with the modified Super Formula chassis racing each other around the circuit using nothing but AI and programming.
Having spent the last 18 months learning and refining the cars, six teams have qualified for this year’s race, which will take place at the same venue this weekend, on the 15th of November.
The event will be streamed globally, including on Motorsport TV and on the official A2RL YouTube channel, and will see teams from Germany, Italy, and the UAE fight it out over 20 laps.
The cars have no steering wheels or cockpits. Where a driver would normally sit, they are packed with sophisticated sensors and computers feeding an AI ‘stack’ that pilots the car. The teams’ responsibility is to refine that stack’s learning to create the best all-round car they can – essentially, building their ‘driver’ through technology.
It’s fair to say that the first race was unmissable viewing and, unintentionally, hilarious. While the cars showed flashes of speed and awareness, there was an ‘uncanny valley’ sense for anyone watching that something strange was unfolding in front of them, as machines lost control by themselves on straights or made bizarre decisions such as stopping on the track behind an opponent.
But the engineering challenge of creating autonomous racing cars shouldn’t be underestimated, and, having identified some of the reasons for the strangeness of the first race, Abu Dhabi’s Autonomous Racing League is ready to go again.
“From an engineering perspective, it was a huge success,” says Josh Roles, A2RL’s head of racing operations, of that first race at Yas Marina in April 2024.
Formerly of McLaren and boasting an engineering background, Roles initially joined A2RL at the start of 2024 in a consultancy role, which evolved into heading up the day-to-day operations of the championship.
“It was the first time that multiple autonomous cars have ever been on a race track, in a racing environment, at the speeds that they reached, wheel-to-wheel racing. I mean, that in itself was incredible.
“It’s incredibly difficult to get these cars to drive autonomously, let alone race at high speeds with all of these variables, which are heat, humidity, vibrations, all the dynamics that come with operating a racing car, vehicle dynamics, aerodynamics, all of those things.
“So I think, from an engineering perspective, the project was an incredible success.
“From an event perspective, I think given it’s the first time it was done, it was done on a very, let’s say, short timeline. I think there could have been improvements, and those improvements are all things that we are taking into Season Two.
“We learnt a lot from Season One, but, as a whole, I think it was fantastic. We pride ourselves on doing science in the public domain, let’s say, it’s what we’re doing.
“It’s no different from what SpaceX does. They launch rockets and iterate design changes very rapidly, to hold themselves accountable in the public domain when things go wrong.
“I think we’re no different from that. We’re just doing autonomous racing cars on a racing track.”
Put to him that, for many curious viewers, the first race had been unintentionally hilarious, Roles said, “From my point of view, I wish we had sat down and maybe scenario-planned a little bit better and a little bit more. That’s my role this year.
“Thinking forward and visualising every possible scenario that could happen, and planning a reaction for that. Motorsport has been around longer than all of us have been alive.
“But they’ve had years and years and years to work on sporting regulations, to work on procedures, to work on protocols. This was the first time we ever had to do it.
“Of course, you learn things; Formula 1 teams are still learning things these days. So yes, there was an element of humour and, let’s say, critique about what we achieved.
“But, from our side, it was fantastic. It was a great point to learn. We continue to develop from those learnings last year. I think we will always look back on Season One and say, ‘Wow, look how far we’ve come and look how much we’ve learned’.”
More on A2RL and driverless racing:
👉 F1 without drivers: The ‘fearless’ technology promising to revolutionise motorsport
👉 Man vs. Machine: Daniil Kvyat to race against driverless AI car at Suzuka
The technical changes that have been made to A2RL since Season One
For A2RL Season 2, the EAV-24, the AI-modified version of Super Formula’s SF23, has been comprehensively upgraded to create the EAV-25.
This new autonomous racecar features a number of safety, reliability, and powertrain enhancements over the previous model.
Improving autonomous reliability was a central consideration for the engineering team in the development of the EAV-25. This included upgrades to many key components, including the steering system, GNSS and 5G antennas, upgraded electronics, motorsport-spec wiring looms, and more.
In terms of safety, the EAV-25 brings battery upgrades, improvements to the emergency braking system, and a secondary Inertial Measurement Unit (IMU). The powertrain has also received select improvements to the alternator and gearbox, plus a new fuel line to minimise the risk of fuel leaks.
Roles revealed that, unsurprisingly, the first race had not been without its technical headaches and, indeed, a major problem had been found with information being relayed by a key sensor, something which confused the teams on the day.
“From a technical aspect, it was the first time we had run these cars in anger, in a racing environment, properly, and it was the first time the teams had the cars to themselves,” he said.
“So we learned that the system we had built and the architecture we had built were fantastic for what we thought we needed. We exceeded our expectations and actually required an additional level of sensors, an additional level of redundancy within sensors.
“Because we’re building new technologies onto this platform, and we’re pushing them beyond the boundaries of the environments they were designed for.
“Are the vibrations too harsh for this type of sensor? What we didn’t take into consideration enough was redundancy, because, once we have a critical sensor failure, we potentially lose the whole stack, or we have something go wrong with the car. So redundancies were a big thing we learned.
“Optimisation of sensors was another big point that teams came up against.
“But, from an event perspective, the planning, the learnings, the race structure, the scenarios, the flag and Race Control systems that we have that communicate directly with the cars.
“This year is all very, very different and much more revamped.
“Because this is obviously an open platform for teams to interpret the data however they like, the platforms are identical, but teams can take any of the sensors, all of them, or a select few of those sensors, and they can build their AI stack around those sensors.
“Now, last year, for instance, we had an IMU, which is a sensor that basically determines the six degrees of freedom of how the car is moving, forward, backwards, side to side, up, and down.
“So that gives teams data such as velocity, pitch, yaw, roll, and all of those things. And teams were relying on that sensor. And we actually found that the sensor became faulty when the mileage hit a certain number, 500 kilometres. In those environments, the sensor started to fail, so teams then had readings that were not quite correct, and therefore, it had a huge impact on their AI algorithms and software.
“But the problem was that the manufacturer was testing these sensors in our test bed, which is the whole reason for A2RL, so it was kind of a vicious circle: the manufacturer achieved what they wanted to achieve, but that then impacted the teams.
“So what we’ve done this year is we’ve looked at the architecture of the vehicle and all of the systems and subsystems, and we’ve put in solid redundancy, and we’ve tested these redundancies to ensure that, if we have sensor failure, we have backups.
“We’ve also worked with the teams directly to ensure that they’re not relying on one single point of failure on their AI stack. So, touch wood, we won’t have that redundancy issue again this year.”
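Roles’ point about not relying on a single point of failure can be illustrated in a few lines. The sketch below is purely hypothetical – the `ImuReading` type, thresholds, and function name are illustrative, not A2RL’s actual software – but it shows the basic idea of a failover selector: if the primary IMU’s reading is stale or physically implausible, the stack falls back to the secondary unit.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ImuReading:
    timestamp: float   # seconds since session start
    yaw_rate: float    # rad/s
    pitch: float       # rad
    roll: float        # rad


def select_imu(primary: Optional[ImuReading],
               secondary: Optional[ImuReading],
               now: float,
               max_age_s: float = 0.05,
               max_yaw_rate: float = 10.0) -> Optional[ImuReading]:
    """Return the first IMU reading that is fresh and physically plausible.

    A stale timestamp or an impossible yaw rate is treated as a sensor
    fault, and the backup source is tried instead.
    """
    for reading in (primary, secondary):
        if reading is None:
            continue
        if now - reading.timestamp > max_age_s:
            continue  # stale: the sensor has stopped updating
        if abs(reading.yaw_rate) > max_yaw_rate:
            continue  # implausible value: treat the sensor as faulty
        return reading
    return None  # no trustworthy source: hand over to safe-stop logic
```

In a real stack the plausibility checks would be far richer (cross-checking sensors against each other, rate-of-change limits, and so on), but the principle Roles describes is the same: a faulty critical sensor must degrade gracefully rather than take down the whole AI stack.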
In order to boost the reliability of the sensors, which will, in turn, ensure greater reliability of the AI stack, Roles explained that more failsafe sensors have been installed.
“We’ve added more physical sensors to act as physical redundancy,” he said.
“We’ve done a lot of work around the drivetrain of the vehicle, because single-seaters are incredibly prone to vibrations and certain frequencies of vibration, and that is a killer for sensors.
“So we’ve worked very hard to put some vibration damping into the physical hardware of the autonomous stack. We’ve worked incredibly hard and long on the mapping of the engine to ensure we mitigate that vibration frequency range as much as possible.
“So they’re just two areas that I would say have been the most critical. But we’ve upgraded the wiring. We’ve upgraded some small components, such as fuel lines and things, because we were having some failures on prototype parts. There have been a lot of changes, but the biggest have been vibration issues.
“Think of a camera that’s mounted rigidly to a chassis that is vibrating at 150 hertz; you don’t get a clear image. Therefore, the AI cannot process that image because it’s just seeing a load of blurs. So we took that all into consideration when designing the new EAV-25 for this year.”
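Roles’ camera example can be put into rough numbers. The function below is a back-of-the-envelope estimate, not A2RL engineering data: for a sinusoidal vibration of frequency f and amplitude A, the mount’s peak velocity is 2πfA, so the worst-case image smear over a short exposure is roughly that velocity times the exposure time, capped at the full peak-to-peak travel.

```python
import math


def blur_pixels(vib_freq_hz: float, amplitude_mm: float,
                exposure_s: float, pixels_per_mm: float) -> float:
    """Worst-case image smear, in pixels, for a camera vibrating
    sinusoidally while the shutter is open.

    Peak velocity of the sinusoidal motion is 2*pi*f*A; over a short
    exposure the worst-case displacement is roughly that velocity times
    the exposure time, capped at the peak-to-peak travel (2*A).
    """
    peak_velocity = 2 * math.pi * vib_freq_hz * amplitude_mm  # mm/s
    smear_mm = min(peak_velocity * exposure_s, 2 * amplitude_mm)
    return smear_mm * pixels_per_mm
```

With illustrative values – 150 Hz vibration, 0.1 mm amplitude, a 1 ms exposure and 20 pixels per mm on the sensor – the smear is already a couple of pixels, enough to soften the edges that a vision pipeline relies on, which is why damping the mount matters.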
In an age where artificial intelligence permeates the mainstream and everyday life, F1 might not yet have to worry about AI drivers being faster and more daring. The first race was akin to a baby taking its first steps, wobbling, and falling down again.
But, like a baby, those wobbles quickly disappear, and Roles smiled at the analogy.
“I like the fact that you called it a newborn baby last year, because this year, I would call it a stroppy teenager, because we’re in a really good place with the AI now, in that we’ve developed significantly,” he said.
“We’ve learned a lot on the car, the teams have learned a lot, and we’re now in that kind of area where we’re telling the AI to do things, and we’re telling the platform what it needs to do, and it’s now starting to find ways around it, and it’s now starting to find things that it can enhance better than what we are telling it.
“So we’re at that stage now where it’s listening, but also starting to move into its own realms of making decisions.
“There have been a few scenarios where we’ve seen the teams’ feedback from what they’ve learned on track, and it’s things like the teams will pre-plan and predict braking zones for the AI.
“We never expected to be in a place where we were able to allow the AI to, on its own, predict its braking zone. So some of the teams have now given the software the freedom to judge braking zones, based on previous lap learning, which, for me, is incredible, because that’s something that a human does; they know that, the lap before, they braked at the 100-metre board and braked way too early.
“So, on this lap, they’re going to brake at the 50-metre board and see how it goes. That’s exactly what the teams are doing now. Of course, the AI pushes its learnings beyond its capabilities, and we have spins and we have crashes and we have flat spots on tyres, but they just roll back that iteration, and then we go again.”
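The lap-over-lap braking adjustment Roles describes – brake at the 100-metre board, judge it too early, try the 50-metre board next time – can be caricatured in a few lines. The function name, outcome labels, and step sizes below are hypothetical, not the teams’ actual software; they simply sketch the feedback loop.

```python
def update_brake_marker(marker_m: float, lap_result: str,
                        step_m: float = 10.0,
                        min_marker_m: float = 30.0) -> float:
    """Adjust the braking marker (distance-to-corner, in metres)
    based on what happened on the previous lap.

    'locked_up' / 'overshoot' -> brake earlier (move the marker out);
    'too_early'               -> brake later (move the marker in);
    anything else             -> keep the current marker.
    """
    if lap_result in ("locked_up", "overshoot"):
        return marker_m + step_m
    if lap_result == "too_early":
        return max(min_marker_m, marker_m - step_m)
    return marker_m
```

The real systems learn from continuous telemetry rather than a single label per lap, but the shape of the loop is the same: try a braking point, evaluate the outcome, and roll the iteration forward (or, after a spin or a flat-spotted tyre, roll it back).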
More on the A2RL and its refined artificial intelligence model will be published on PlanetF1.com on Saturday, November 15th, ahead of the Season 2 race.
Read Next: First look: Sergio Perez breaks cover in black Ferrari at Imola test

