This is an extended version of a previously published article.
Each year in the U.S. there are six million car crashes, resulting in nearly $160 billion in expenses. These accidents are the leading cause of death for people between the ages of four and 34, and 93% of them are due simply to human error.
“You either need to make a better driver, or take the driver and human error out of the equation altogether,” says Brooke Williams, advanced driver assistance systems (ADAS) business manager at Texas Instruments (TI). “We have the technology that we believe can help reduce the death and crash rates significantly.”
The TDA2x system-on-chip (SoC) is an ADAS processor that is paving the way toward the autonomous car by taking human error out of the calculation. With a full board solution released in October 2013, TI is moving one step closer to its vision of enabling autonomous automobiles.
“We can cover nearly all of the components that are needed in a lot of these automotive systems,” says Williams. TI has been quietly making major investments over the past eight years with the vision of enabling autonomous vehicles. “Everything we are doing today is to support that vision,” he adds.
The three applications required to enable an autonomous car are a front camera, surround view, and sensor fusion.
- The front camera runs entry-level algorithms and allows for high beam assist, lane departure warnings, traffic sign recognition, and, as the algorithms become more advanced, pedestrian detection and forward collision warnings.
- Rooted in park assistance, surround view is a low-speed application with four cameras (front, rear, and two side) that gives the driver a bird’s eye view once the images are fused together.
- A relatively new application, sensor fusion takes preprocessed data from camera and radar sensors, and fuses them together to enable more intelligent decisions. This technology comes into play when cars take control, applying brakes and making steering maneuvers. The application also supplies the redundancy that automotive manufacturers are looking for in autonomous cars.
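The kind of camera-plus-radar fusion described above can be illustrated with a simple sketch. The Python snippet below is a hypothetical, minimal example — not TI's implementation — of combining a camera-based and a radar-based range estimate by inverse-variance weighting, one common building block behind fused braking decisions. All numbers and function names are illustrative.

```python
# Hypothetical sketch: fuse two noisy range estimates (meters) from
# different sensors by weighting each with the inverse of its variance.
# Not TI's algorithm; values are illustrative only.

def fuse_estimates(camera_range, camera_var, radar_range, radar_var):
    """Return the inverse-variance-weighted range and its variance."""
    w_cam = 1.0 / camera_var
    w_rad = 1.0 / radar_var
    fused = (w_cam * camera_range + w_rad * radar_range) / (w_cam + w_rad)
    fused_var = 1.0 / (w_cam + w_rad)
    return fused, fused_var

# Radar typically measures range more precisely; the camera adds a
# noisier but independent estimate, pulling the result slightly.
fused, var = fuse_estimates(camera_range=25.0, camera_var=4.0,
                            radar_range=24.0, radar_var=1.0)
print(fused, var)  # fused estimate lies closer to the radar reading
```

Because the fused variance is always smaller than either input variance, the combined estimate is also where the redundancy benefit shows up: either sensor alone still yields a usable, if less certain, range.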
“These cameras are fairly complex in nature so we’ve integrated a lot of I/O interfaces and signal processing cores on a single SoC to integrate at a system level, so we can lower the cost of the BOM,” explains Williams.
All three of these applications have been targeted with the new TDA2x device family, and all are run on a single heterogeneous architecture. This will allow a single device to run more than one of these applications in the future.
The current architecture includes three main compute engines: a digital signal processor (DSP), a vision accelerator, and ARM Cortex cores.
“If you add up all of the compute engines, we [have] more than 60 GMACs of signal processing horsepower. We do that with a low power output between two and five watts, depending on the application,” Williams explains.
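The efficiency those figures imply is easy to check. A quick sketch, using only the numbers quoted above (60 GMACs at two to five watts):

```python
# Efficiency implied by the quoted figures: 60 GMACs at 2-5 W.
gmacs = 60.0
power_low, power_high = 2.0, 5.0

best_case = gmacs / power_low    # at 2 W: 30 GMACs per watt
worst_case = gmacs / power_high  # at 5 W: 12 GMACs per watt
print(best_case, worst_case)     # prints 30.0 12.0
```

So, depending on the application, the device delivers roughly 12 to 30 GMACs per watt.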
A large internal memory eliminates the need to go off chip, speeding up the applications. And the video inputs and display outputs are capable of handling the configuration of six cameras.
“This is the broadest portfolio of cores, IP, and interfaces that anyone in the industry has offered,” says Williams. With that offering has come another set of challenges, specifically performance, scalability, power consumption, and integration.
“The number of algorithms being run is increasing at an exponential rate, so we need high signal processing performance,” explains Williams. “Because we are targeting a range of vehicles, there needs to be a scalable set of features to address entry-level, mid-level, and high performance.”
With the current trend toward low power consumption, TI also felt pressure to keep the power output down to a reasonable level. Form factor was an issue as well, as the cameras are very small and sit in an environment with little room for airflow.
“The TDA2x family is the broadest portfolio of ADAS devices on the market today, giving scalability from entry level to high performance at a reasonable power budget. These technologies are the key to enabling future integration across multiple ADAS features,” says Williams. “We are bringing the most comprehensive and total embedded processing solution to the market, and taking us one step closer to an autonomous driving solution.”