No, I'm not joking.
I've made the claim - no, I don't know if I was the first, and I don't care - that superstition is the first sign of an advanced intelligence. In the end, all knowledge is a form of superstition: if X happens, Y will happen. Empirical evidence is itself superstition - it's the bias toward the idea, which I think I first encountered on Hanson's blog Overcoming Bias, and no, I don't know which post, that what has happened before is more likely to happen again.
There's no physical link between mere repetition and any guarantee that the repetition will continue. Empirical evidence is a superstition, and a very specific kind of intellectual bias.
Thus, I'm disinclined to take anything from that. By definition, the more intelligent are more biased. The trick is to separate useful bias from nonuseful bias. Empiricism is a useful bias. Or at least it hasn't failed us yet. Well, it's self-consistent, anyway.
All heuristics can be wrong. Useful heuristics are simply those heuristics which usually aren't.
And when you design a game AI that is terrified of some arbitrary act - say, opening doors while carrying a shotgun in 30% lighting with a 15% blue tinge - because it got shot by the player in those circumstances more than once, you'll know your AI is on the path to being a good AI. Which isn't the same as a useful AI, for the purposes of a first person shooter.
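The kind of superstitious agent described above can be sketched in a few lines. This is a hypothetical toy, not any real game engine's AI: it has no causal model at all, just a counter of how often each arbitrary bundle of context features co-occurred with getting shot, and it fears any bundle that ended badly more than once. The class and feature names are invented for illustration.

```python
from collections import Counter


class SuperstitiousAI:
    """Toy agent that learns to fear arbitrary contexts from repetition alone."""

    def __init__(self, fear_threshold=2):
        # Maps a frozen bundle of context features to how often it ended badly.
        self.bad_outcomes = Counter()
        self.fear_threshold = fear_threshold

    def observe(self, context, got_shot):
        # context is any iterable of arbitrary feature strings, e.g.
        # {"opening_door", "carrying_shotgun", "lighting_30%", "blue_tinge_15%"}.
        # No feature is privileged over any other - that's the point.
        if got_shot:
            self.bad_outcomes[frozenset(context)] += 1

    def is_terrified_of(self, context):
        # Pure empiricism-as-superstition: repetition alone drives the fear.
        return self.bad_outcomes[frozenset(context)] >= self.fear_threshold


ai = SuperstitiousAI()
ctx = {"opening_door", "carrying_shotgun", "lighting_30%", "blue_tinge_15%"}
ai.observe(ctx, got_shot=True)
ai.observe(ctx, got_shot=True)
print(ai.is_terrified_of(ctx))  # shot twice in this context, so: terrified
print(ai.is_terrified_of({"standing_in_field"}))  # never seen: not terrified
```

The agent can't distinguish the shotgun (plausibly causal) from the blue tinge (plausibly noise), which is exactly the failure mode - and the sign of intelligence - the paragraph above describes.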