Talks - Zac Hatfield-Dodds: Async: scaling structured concurrency with static and dynamic analysis

Discover how to scale structured concurrency with static and dynamic analysis. The talk introduces futures, transformers, and mediactors, and shows how they help manage shared resources, non-deterministic execution, and pathological situations.

Key takeaways
  • Structured concurrency is proposed as a foundation for highly concurrent and correct programs: by building on non-blocking I/O operations, it enables concurrency while making concurrent programming more predictable and robust.
  • Futures are a way to represent the result of an asynchronous operation. They can be used to compose complex operations, simplify the handling of asynchronous operations, and provide a clearer understanding of program behavior.
  • Pathological situations, such as accidentally dropped messages or deadlocks, can occur in concurrent programs. Transformers and mediactors can be used to manage these situations and make concurrent programs more robust.
  • Core challenges of concurrent programming include managing shared resources, handling non-deterministic execution, and dealing with pathological situations.
  • Tools and libraries can alleviate these challenges, but programmers still need to understand the underlying principles of structured concurrency.
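
The "futures" takeaway above can be sketched in standard-library `asyncio` (an illustrative example, not code from the talk): each coroutine is a handle to a pending result, and several of them can be composed into a single awaitable so the caller reasons about one result rather than many callbacks.

```python
import asyncio

async def fetch(value: int) -> int:
    # Hypothetical stand-in for a non-blocking I/O operation.
    await asyncio.sleep(0)
    return value * 2

async def main() -> list[int]:
    # gather composes several pending operations into one awaitable,
    # preserving the order in which they were passed.
    return await asyncio.gather(fetch(1), fetch(2), fetch(3))

print(asyncio.run(main()))  # → [2, 4, 6]
```

Because the composition happens at one call site, the program's concurrent behavior is easier to read than an equivalent callback chain.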
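
One way to keep a pathological situation such as a deadlock from hanging the whole program is to bound every wait with a deadline; the sketch below (an assumed illustration using `asyncio.wait_for`, not the talk's code) turns an operation that would block forever into a handled error.

```python
import asyncio

async def wait_forever() -> None:
    # This event is never set: a deadlock in miniature.
    await asyncio.Event().wait()

async def main() -> str:
    try:
        # wait_for cancels the child when the deadline passes, so the
        # failure is observed instead of silently blocking the program.
        await asyncio.wait_for(wait_forever(), timeout=0.01)
        return "completed"
    except asyncio.TimeoutError:
        return "timed out"

print(asyncio.run(main()))  # → timed out
```

The key property is that the timeout scopes the child operation: when the deadline expires, the child is cancelled rather than leaked, which is the structured-concurrency guarantee in miniature.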