7.19. Avoid Algorithmic Complexity Attacks

If it's important that your system keep working (and not be vulnerable to denial-of-service attacks), then it needs to be designed so it's not vulnerable to "Algorithmic Complexity Attacks". These attacks exploit the difference between an algorithm's typical-case and worst-case behavior; the attacker intentionally sends data crafted so the algorithm consistently exhibits its worst-case behavior. For example, hash tables usually have O(1) performance for all operations, but in many implementations an attacker can construct input so that a large number of 'hash collisions' occur. Other common algorithms that can be attacked include sorting routines (e.g., quicksort's worst case is O(n^2) instead of O(n log n)) and regular expression implementations. Many higher-level capabilities are built on these basic algorithms (e.g., many searches use hash tables). More information about this attack is available in Crosby and Wallach's paper on the subject [Crosby 2003].
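As a hypothetical illustration (not drawn from any particular library), consider a hash table whose hash function simply sums the bytes of the key modulo the table size. Any permutation of the same bytes lands in the same bucket, so an attacker can cheaply generate thousands of distinct keys that all collide, turning each insertion or lookup into a linear scan of one long chain:

    /* Hypothetical illustration: a weak byte-sum hash lets an attacker
     * force every key into one bucket, degrading O(1) lookups to O(n). */
    #include <stdio.h>

    #define NBUCKETS 1024

    static unsigned weak_hash(const char *key) {
        unsigned sum = 0;                  /* order-insensitive byte sum */
        while (*key)
            sum += (unsigned char)*key++;
        return sum % NBUCKETS;
    }

    int main(void) {
        /* All permutations of the same bytes collide under weak_hash, so
         * an attacker can direct arbitrarily many keys to one bucket. */
        const char *keys[] = { "abc", "acb", "bac", "bca", "cab", "cba" };
        for (size_t i = 0; i < sizeof keys / sizeof keys[0]; i++)
            printf("%s -> bucket %u\n", keys[i], weak_hash(keys[i]));
        return 0;
    }

Real-world hash functions are usually better than a byte sum, but the same effect applies to any hash function the attacker can evaluate offline: collisions can be precomputed and then submitted in bulk.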

There are several solutions. One approach is to choose algorithms with the best "worst-case" behavior, not just the best "average-case" behavior; many "real-time" algorithms are specially designed to have the best "worst-case" behavior, so search for those versions. Crosby and Wallach propose a "universal hashing library" hash function that avoids this problem. Although I've seen no proof that they'd work, trivially keyed hashes (where the data to be hashed is first combined with a random key chosen locally, e.g., at program start) should be effective, and cryptographically keyed hashes should be very effective - just make sure the attacker can't determine the key. Judy trees may be an effective alternative to b-trees.
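Here is a minimal sketch of the trivially keyed hash idea, under the assumption that the program reads a secret key from /dev/urandom at startup and mixes it into every hash computation; the mixing step shown (an FNV-style multiply/xor) is only illustrative, and a cryptographic keyed hash such as HMAC or SipHash would give stronger guarantees:

    /* Sketch of a trivially keyed hash: a secret per-process key chosen at
     * startup is folded into the hash, so an attacker who cannot learn the
     * key cannot precompute colliding inputs. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    static uint64_t hash_key;              /* secret per-process key */

    static void init_hash_key(void) {
        FILE *f = fopen("/dev/urandom", "rb");
        if (!f || fread(&hash_key, sizeof hash_key, 1, f) != 1) {
            fprintf(stderr, "cannot read /dev/urandom\n");
            exit(EXIT_FAILURE);
        }
        fclose(f);
    }

    static uint64_t keyed_hash(const char *data) {
        uint64_t h = hash_key;             /* start from the secret key */
        while (*data) {
            h ^= (unsigned char)*data++;
            h *= 0x100000001b3ULL;         /* FNV-style mixing step */
        }
        return h;
    }

    int main(void) {
        init_hash_key();
        printf("hash of \"example\": %016llx\n",
               (unsigned long long)keyed_hash("example"));
        return 0;
    }

Note that the hash values change from run to run, so they must not be stored persistently or shared across processes that don't share the key; the point is only that an attacker cannot predict which inputs will collide.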