Cache-friendly Design in Robot Path Planning with C++ - Brian Cairl - CppCon 2023
Learn how cache-friendly design principles can significantly improve robot path planning performance by minimizing cache misses and optimizing memory access in a C++ implementation.
- Cache-friendly design is program design guided by the principles of spatial and temporal locality to minimize cache misses.
- Cache misses can have a significant impact on program performance, leading to increased execution time and CPU usage.
- The STL provides some tools for cache-friendly design, such as the std::vector and std::unordered_map containers.
- The std::unordered_map container uses a hash table that gives fast average-case lookups, but its elements live in separately allocated nodes scattered across the heap, so repeatedly probing or traversing it tends to cause cache misses; contiguous containers such as std::vector are generally friendlier to the cache.
- Using a cache-friendly data structure, such as a vector of vectors (or, better still, a single flat vector indexed as a 2D grid, since each inner vector is its own heap allocation), can improve program performance by reducing cache misses; a sketch contrasting a node-based map with a flat vector follows this list.
- Another way to improve cache performance is to reorder data in memory to minimize cache misses, for example by sorting work items by their spatial locality so they are processed in roughly the order they are laid out in memory (see the sorting sketch after this list).
- Caching can also improve program performance: keeping frequently accessed data small and contiguous helps it stay resident in fast memory, i.e. the CPU cache (see the local-window sketch after this list).
- Some cache-friendly design principles include: keeping frequently accessed data in fast memory, minimizing the number of cache lines touched per operation, and, above all, minimizing the total number of cache misses (see the layout sketch after this list).
- Cache-friendly design can be particularly important in applications that require fast access to large amounts of data, such as robotics and computer vision.
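To illustrate the container point above, here is a minimal sketch, assuming a simple occupancy-grid planner that tracks visited cells; `Grid` and `to_index` are illustrative names, not taken from the talk. The flat std::vector keeps every flag in one contiguous allocation, so neighboring cells usually share a cache line, while the node-based std::unordered_map scatters entries across the heap.

```cpp
#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

struct Grid {
  std::size_t width;
  std::size_t height;
};

// Row-major flattening of a 2D cell coordinate into a single index.
inline std::size_t to_index(const Grid& g, std::size_t x, std::size_t y) {
  return y * g.width + x;
}

// Pointer-chasing layout: every entry is a separately allocated node.
using VisitedMap = std::unordered_map<std::size_t, bool>;

// Cache-friendly layout: one contiguous block of flags, indexed by cell.
using VisitedFlat = std::vector<std::uint8_t>;

int main() {
  const Grid grid{256, 256};

  VisitedFlat flat(grid.width * grid.height, 0);
  flat[to_index(grid, 10, 20)] = 1;    // mark a cell visited: one contiguous write

  VisitedMap map;
  map[to_index(grid, 10, 20)] = true;  // same operation: hash + node allocation

  return (flat[to_index(grid, 10, 20)] != 0 &&
          map.count(to_index(grid, 10, 20)) == 1) ? 0 : 1;
}
```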
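For the reordering point, a minimal sketch assuming a row-major costmap; `Cell`, `accumulate_cost`, and the sample batch are illustrative. Sorting the batch by flattened index before the read loop turns scattered costmap accesses into a nearly sequential sweep.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Cell {
  std::uint32_t x;
  std::uint32_t y;
};

float accumulate_cost(std::vector<Cell>& cells,
                      const std::vector<float>& costmap,
                      std::size_t width) {
  // Reorder the batch so costmap accesses proceed in increasing address order.
  std::sort(cells.begin(), cells.end(), [width](const Cell& a, const Cell& b) {
    return static_cast<std::size_t>(a.y) * width + a.x <
           static_cast<std::size_t>(b.y) * width + b.x;
  });

  float total = 0.0f;
  for (const Cell& c : cells) {
    total += costmap[static_cast<std::size_t>(c.y) * width + c.x];
  }
  return total;
}

int main() {
  std::vector<float> costmap(64 * 64, 1.0f);
  std::vector<Cell> batch{{3, 5}, {60, 1}, {4, 5}, {3, 6}};
  return accumulate_cost(batch, costmap, 64) == 4.0f ? 0 : 1;
}
```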
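For the local-window point on caching hot data, a minimal sketch assuming the planner repeatedly queries a small region of a large row-major costmap; `LocalWindow` and the window bounds are illustrative. Copying the region of interest into a compact buffer keeps the repeated lookups within a handful of cache lines.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

class LocalWindow {
 public:
  LocalWindow(const std::vector<float>& costmap, std::size_t map_width,
              std::size_t x0, std::size_t y0, std::size_t w, std::size_t h)
      : width_(w), data_(w * h) {
    // Copy the hot sub-region row by row; the caller guarantees it is in bounds.
    for (std::size_t row = 0; row < h; ++row) {
      const float* src = &costmap[(y0 + row) * map_width + x0];
      std::copy(src, src + w, &data_[row * w]);
    }
  }

  // Repeated lookups now touch a small contiguous buffer instead of the full map.
  float at(std::size_t local_x, std::size_t local_y) const {
    return data_[local_y * width_ + local_x];
  }

 private:
  std::size_t width_;
  std::vector<float> data_;
};

int main() {
  std::vector<float> costmap(512 * 512, 0.5f);
  const LocalWindow window(costmap, 512, 100, 100, 8, 8);
  return window.at(3, 3) == 0.5f ? 0 : 1;
}
```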
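For the layout point on minimizing the cache lines touched, a minimal sketch with made-up node fields contrasting array-of-structs and structure-of-arrays layouts: when a loop only reads g-costs, the structure-of-arrays form packs those values densely, so each fetched cache line carries only data the loop actually uses.

```cpp
#include <vector>

// Array-of-structs: summing 'g_cost' still pulls every node's other fields into cache.
struct NodeAoS {
  float g_cost;
  float h_cost;
  int parent;
  bool closed;
};

// Struct-of-arrays: the hot field lives in its own contiguous array.
struct NodesSoA {
  std::vector<float> g_cost;
  std::vector<float> h_cost;
  std::vector<int> parent;
  std::vector<char> closed;
};

float sum_g(const std::vector<NodeAoS>& nodes) {
  float total = 0.0f;
  for (const NodeAoS& n : nodes) total += n.g_cost;  // ~16 bytes loaded per node
  return total;
}

float sum_g(const NodesSoA& nodes) {
  float total = 0.0f;
  for (float g : nodes.g_cost) total += g;  // 4 bytes loaded per node
  return total;
}

int main() {
  std::vector<NodeAoS> aos{{1.0f, 0.0f, -1, false}, {2.0f, 0.0f, -1, false}};
  NodesSoA soa{{1.0f, 2.0f}, {0.0f, 0.0f}, {-1, -1}, {0, 0}};
  return (sum_g(aos) == 3.0f && sum_g(soa) == 3.0f) ? 0 : 1;
}
```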