Automated-Driving Control Unit
GIGABYTE has been planning the PILOT product family, which is designed to provide developers and test drivers with a complete product portfolio that supports the development of functions for automated driving by means of high computation power, high bandwidth, and rich external interfaces.
The Way of the Future
Most car accidents can be traced to driver distraction or misjudgment. For this reason, the development and deployment of vehicle safety systems has grown, especially in recent decades. Thanks to technological advances, the automotive industry has increasingly adopted sensors and microcontrollers to perceive environmental inputs and intervene autonomously in the driving task. Advanced Driver Assistance Systems (ADAS) are emerging as fundamental to improving road safety, and ADAS is the first step towards autonomous vehicles. Autonomous-vehicle technologies offer the possibility of fundamentally changing transportation: cars equipped with them will likely reduce crashes, energy consumption, pollution, and the costs of congestion.
From ADAS to Autonomous-Driving
From ADAS (advanced driver assistance systems) to full Level 5 autonomy, with a special focus on artificial intelligence, machine learning, sensors, and software, the computing unit handles the main control of a self-driving vehicle. It handles tasks such as converting throttle input to torque requests, monitoring safety systems, running control loops, and limiting power. The crucial ingredient is the combination of sensors and actuators, sophisticated algorithms, and powerful processors to execute the software.
Autonomous car
The autonomous car market is forecast to grow in the coming years due to the early adoption of autonomous transport in the region. Prominent players in the region are conducting Level 4 tests, and this early testing is expected to expand the market.
Self-driving truck
Manufacturers have developed and are testing self-driving trucks.
Delivery robot
Delivery robots are being designed to provide a high-tech autonomous door-to-door delivery service to online shoppers all around the world.
Golf cart
Self-driving golf carts have been appearing around college campuses and public gardens. The carts are slower, but they might just win the race to put driverless vehicles on the road.
Challenges Before Fully Self-Driving Vehicles Commercialized
Autonomous vehicles (AVs) could help make future mobility more efficient, safer, cleaner, and more inclusive. Several conditions must be fulfilled to achieve this goal; otherwise, the introduction of AVs into traffic streams may not bring the desired benefits. Attention must also be paid to legal issues, which will determine when society is ready for a fully autonomous driving environment.
Navigation and Guidance
A complete GPS module for real-time communication between the control center and the cloud is one of the major requirements for autonomous driving. The onboard computing unit should therefore support LTE/5G/V2X wireless network connectivity for real-time data streaming, event notification, and communications, so that the cloud center can always receive information and updates from the vehicles connected to it.
The network must be able to transmit huge amounts of data at very high speed, with low latency, in all conditions (weather, traffic state, etc.) and without interferences. Additionally, it must be safe against hackers or terrorist attacks, and it must be able to work to some extent even in failure conditions.
While GPS is essential for autonomous vehicles, it is not sufficient by itself. The GPS signal can be blocked by canyons, tunnels, radio interference, and many other factors, and these outages can last many minutes or longer. To supplement GPS, the autonomous vehicle uses inertial guidance based on a combination of accelerometers and gyroscopes. These sensors provide data on the rotational and linear motion of the platform, which is then used to calculate the motion and position of the vehicle regardless of speed or signal obstruction.
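As a rough illustration of inertial guidance, the sketch below integrates hypothetical gyroscope and accelerometer samples to propagate a vehicle pose while GPS is unavailable. It is a simplified planar dead-reckoning model for illustration only, not production navigation code:

```python
import math

def dead_reckon(pose, imu_samples, dt):
    """Propagate a 2D pose (x, y, heading, speed) from IMU samples.

    Each sample is (yaw_rate [rad/s], forward_accel [m/s^2]).
    Real inertial navigation integrates full 3D rotations and
    corrects for sensor bias and gravity; this sketch does not.
    """
    x, y, heading, speed = pose
    for yaw_rate, accel in imu_samples:
        heading += yaw_rate * dt           # integrate rotation rate
        speed += accel * dt                # integrate acceleration
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return (x, y, heading, speed)

# Vehicle starts at the origin heading east at 10 m/s and drives
# straight for 1 second (10 samples at 100 ms each).
pose = dead_reckon((0.0, 0.0, 0.0, 10.0), [(0.0, 0.0)] * 10, dt=0.1)
print(pose)  # x advances ~10 m east; y and heading stay at 0
```

In a real system the drift of this integration grows without bound, which is exactly why the GPS fix is needed to periodically correct it.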
Self-Driving Vehicles with wireless network and GPS connectivity
Driving and Safety
Sensors are a key component in making a vehicle driverless. Cameras, radar, ultrasonic sensors, and LiDAR enable an autonomous vehicle to visualize its surroundings and detect objects. Cars today are fitted with a growing number of environmental sensors that perform a multitude of tasks. However, each sensor alone has its limitations.
The sensor perception task includes three parts: localization, detection and tracking, all of them achieved through data fusion performed at different levels.
Localization is usually performed by algorithms that fuse data from GPS, IMU, and LiDAR, resulting in a high-resolution map of the vehicle's position. Vision-based deep-learning technologies are achieving accurate results in object detection, as they can autonomously handle huge amounts of data. Deep-learning techniques have also demonstrated their suitability for object tracking compared with classical computer-vision approaches.
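The GPS/IMU fusion step can be illustrated with a one-dimensional Kalman-style update. This is a didactic sketch with invented numbers; real localization filters track the full vehicle state in 3D:

```python
def fuse(estimate, est_var, measurement, meas_var):
    """One Kalman-style update: blend a prediction with a new fix,
    weighting each by its confidence (inverse variance)."""
    gain = est_var / (est_var + meas_var)
    fused = estimate + gain * (measurement - estimate)
    fused_var = (1.0 - gain) * est_var
    return fused, fused_var

# IMU dead-reckoned position (drift grows over time) corrected by
# a GPS fix (noisy short-term, but drift-free). Numbers are made up.
pos, var = 105.0, 25.0      # metres along the road, IMU estimate
gps, gps_var = 100.0, 5.0   # GPS fix
pos, var = fuse(pos, var, gps, gps_var)
print(pos, var)  # ~100.833, ~4.167 -> pulled towards GPS, less uncertain
```

The fused estimate always has lower variance than either input alone, which is the basic reason multi-sensor localization outperforms any single sensor.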
Decision-making is one of the most challenging tasks that AVs must perform, especially in awkward situations. It encompasses prediction, path planning, and obstacle avoidance, all performed on the basis of previous perceptions. Finally, the human-machine interface (HMI) provides driving information to the occupants.
Autonomous Vehicle
Computing Performance
There has been an exponential increase in the usage of autonomous vehicles across the globe, driven by the growing popularity of artificial intelligence techniques in various applications. Traffic flow prediction is important for autonomous vehicles: it informs their itinerary and their adaptive decisions (for example, turn left or right, go straight, change lanes, stop, or accelerate) with respect to surrounding objects. Research on autonomous vehicles has also shifted from traditional statistical models to adaptive machine learning techniques.
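As a toy illustration of traffic flow prediction, the sketch below fits an ordinary least-squares trend to invented vehicle counts and extrapolates one window ahead. It is far simpler than the adaptive machine learning models the text refers to:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Hypothetical vehicle counts per 5-minute window on the road ahead.
windows = [0, 1, 2, 3, 4]
counts = [20, 24, 28, 32, 36]      # traffic steadily building
a, b = fit_line(windows, counts)
forecast = a * 5 + b               # predicted count for the next window
print(forecast)  # 40.0 -> congestion rising, a planner might reroute
```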
Computing Performance Of The Computation Unit
ISO26262-Management of Functional Safety in Automotive Industry
Safety has long been a key topic in the automotive industry, and the list of requirements continues to grow. ISO 26262 is the automotive functional safety standard for the passenger-vehicle industry and governs the development of safety-related electrical and/or electronic (E/E) systems within road vehicles. ISO 26262 imposes stringent requirements that encompass the entire life cycle of a system, from concept through development, production, and decommissioning. It addresses the overall safety management process and covers relations with suppliers and interfaces for distributed development.
Self-Driving Vehicles Enacted Legislation
Government has a role in guiding the testing and deployment of vehicles with automated driving systems, which can have profound effects on society by increasing safety and mobility. Challenges for government include ensuring the systems are safe, determining who is liable for accidents, and building public interest in adopting the technology.
Technological improvements in the infrastructure will also be necessary. First, those aimed at helping AVs to perform the perception tasks: horizontal and vertical road signs must be clear and complete, road layouts should be as smooth as possible, etc. Second, V2I-related technologies must be deployed.
GIGABYTE PILOT Family
An automated-driving control unit is the core controller of autonomous vehicles or ADAS vehicles. The experimental prototype of a driverless bus powered by GIGABYTE PILOT has no steering wheel, gas pedal, or brake pedal, being 100% autonomous.
To also handle computation-intensive tasks, such as the development, validation, and optimization of artificial intelligence (AI) algorithms, the GIGABYTE PILOT family can be equipped with hardware accelerators, such as GPUs and FPGAs.
GIGABYTE PILOT systems offer extensive bus and network support based on the latest standards, such as AUTOSAR and FIBEX. They can process data from various vision sensors synchronously and handle the sensor fusion. The PILOT platform offers a high-performance solution for developing automated-driving functions that supports all relevant sensor interfaces, buses, and networks. Additional software environments, such as Linux-based applications, can also be used. A documented application programming interface (API) is provided for all relevant interfaces.
The PILOT family targets automated driving systems (ADS), advanced driver-assistance systems (ADAS), and other surround-sensor applications. It also supports higher speeds for in-vehicle infotainment displays and other peripherals.
Powerful Processors to Handle AI Computing Requirements
The GIGABYTE PILOT deploys powerful main-processor and GPU solutions that harness AI and deep learning to deliver an autonomous driving solution spanning data collection, model training, and testing in simulation through to the deployment of smart, safe auto-pilot cars.
The GIGABYTE PILOT family architecture can be scaled down to approximately 1 TOPS of performance for delivery robots, or scaled up to 150 TOPS for ADAS, 5G, or server-type applications. These multiple configurations can achieve many times the efficiency of existing solutions.
Its modular design combines CPU, GPU, and other accelerators into a complete, heterogeneous system, and the architecture is also accessible through popular machine learning frameworks, such as TensorFlow.
The Architecture of Sensor Fusion
The GIGABYTE PILOT controller can process vision sensor data from multiple cameras, LiDARs, radars, and ultrasonic sensors, and offers a broad portfolio of high-performance automotive solutions for sensor fusion applications. GIGABYTE PILOT creates a comprehensive environmental model by fusing the various sensors in and around the car to compute the best driving strategy for maximum driver safety and convenience. It makes decisions and triggers actuators, and is designed to meet the highest safety and security standards.
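One common way to combine overlapping measurements from different sensors is inverse-variance weighting, sketched below for the range to a lead vehicle. The camera/radar/LiDAR readings and variances are hypothetical, not real sensor specifications:

```python
def fuse_range(measurements):
    """Inverse-variance weighted fusion of range estimates.

    measurements: list of (distance_m, variance) pairs from
    different sensors. Less-noisy sensors receive more weight.
    Returns the fused distance and its (reduced) variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * d for (d, _), w in zip(measurements, weights)) / total
    return fused, 1.0 / total

# Hypothetical readings for the same lead vehicle:
readings = [
    (25.4, 4.0),    # camera: coarse monocular depth estimate
    (24.9, 0.25),   # radar: accurate range
    (25.0, 0.04),   # LiDAR: very accurate range
]
dist, var = fuse_range(readings)
print(round(dist, 2), round(var, 4))  # dominated by the LiDAR reading
```

The fused variance is smaller than that of the best single sensor, which is the quantitative payoff of fusing "various sensors in and around the car" rather than trusting any one of them.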
Whatever enhanced ADAS function you would like to design, you can find the right automotive-grade platform solution in the GIGABYTE PILOT product family.
Safety-Critical Hardware for ISO26262
GIGABYTE is planning to assist the overall system design in attaining the desired ASIL (according to ISO 26262) for safety systems with high efficiency.
Related Technologies
Artificial intelligence (AI) is a broad branch of computer science. The goal of AI is to create machines that can function intelligently and independently, and that can work and react the same way humans do. To build these abilities, machines and the software and applications that enable them need to derive their intelligence in the same way that humans do: by retaining information and becoming smarter over time.
AI is not a new concept; the idea has been discussed since the 1950s. However, it has only recently become technically feasible to develop and deploy it in the real world, thanks to advances in technology: the ability to collect and store the huge amounts of data required for machine learning, and rapid increases in processing speed and computing capability that make it possible to process the collected data to train a machine or application and make it "smarter".
Advanced Driver Assistance Systems (ADAS) constantly monitor the vehicle surroundings, alert the driver of hazardous road conditions, and take corrective actions, such as slowing or stopping the vehicle. These systems use inputs from multiple sensors, such as cameras and radars. The fusion of these inputs is processed and the information is delivered to the driver and other parts of the system.
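A minimal example of such corrective logic is a time-to-collision (TTC) check on the fused forward range. The thresholds below are illustrative rules of thumb, not taken from any production ADAS specification:

```python
def ttc_warning(gap_m, closing_speed_mps, warn_s=2.5, brake_s=1.0):
    """Rule-of-thumb forward-collision logic based on time-to-collision.

    gap_m: distance to the lead vehicle (from fused sensor data).
    closing_speed_mps: how fast that gap is shrinking.
    Returns 'brake', 'warn', or 'ok'.
    """
    if closing_speed_mps <= 0:        # gap is constant or opening
        return "ok"
    ttc = gap_m / closing_speed_mps   # seconds until contact
    if ttc < brake_s:
        return "brake"                # automatic emergency braking
    if ttc < warn_s:
        return "warn"                 # alert the driver
    return "ok"

print(ttc_warning(gap_m=30.0, closing_speed_mps=5.0))  # ok (TTC 6.0 s)
print(ttc_warning(gap_m=10.0, closing_speed_mps=5.0))  # warn (TTC 2.0 s)
print(ttc_warning(gap_m=4.0, closing_speed_mps=5.0))   # brake (TTC 0.8 s)
```

This mirrors the escalation the text describes: monitor continuously, alert the driver first, and take corrective action only when the situation becomes critical.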
Machine learning (ML) is the scientific study of algorithms and statistical models that computer systems use to effectively perform a specific task without using explicit instructions, relying on models and inference instead. It is seen as a subset of artificial intelligence.
Bring Your Ideas Faster to Fruition