Various timing-based mutual exclusion algorithms have been proposed that guarantee mutual exclusion if certain timing assumptions hold. In this paper, we examine how these algorithms behave when the times for the basic operations are governed by probability distributions. In particular, we are concerned with how often such algorithms succeed in allowing a processor to enter its critical section, and how this success rate depends on the random variables involved. We explore this question in the case where operation times are governed by exponential and gamma distributions, using both theoretical analysis and simulations.
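The flavor of the simulations described above can be sketched with a small Monte Carlo experiment. The protocol modeled here (a two-process, Fischer-style write/delay/re-read round) and all parameter values are illustrative assumptions for exposition, not details taken from the paper:

```python
import random

def success_rate(delta, draw_time, trials=100_000, seed=0):
    """Monte Carlo estimate of how often a process re-reads its own id
    in a two-process, Fischer-style timing round, when the completion
    time of each write is drawn at random from `draw_time`.

    NOTE: this round structure is a hypothetical stand-in for the
    timing-based algorithms analyzed in the paper."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        # times at which each process's write lands in shared memory
        t0, t1 = draw_time(rng), draw_time(rng)
        # process 0 re-reads at time t0 + delta; it still sees its own
        # id unless process 1's write landed in the window (t0, t0+delta]
        if not (t0 < t1 <= t0 + delta):
            ok += 1
    return ok / trials

# exponentially distributed operation times (rate 1.0, illustrative)
exp_rate = success_rate(1.0, lambda r: r.expovariate(1.0))
# gamma-distributed operation times (shape 2, scale 0.5, illustrative)
gamma_rate = success_rate(1.0, lambda r: r.gammavariate(2.0, 0.5))
```

Varying the delay `delta` and the distribution parameters then traces out how the success rate depends on the random variables involved, which is the kind of dependence the paper studies analytically and by simulation.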

Originally appeared in the Proceedings of the 18th Annual ACM Symposium on Principles of Distributed Computing (PODC), 1999.