This is an interesting story and it doesn't surprise me at all. We can blame the lack of training and/or the inadequacies of the actions of the caretaker but, for me, the root cause is the complexity of the system requirements. If the system had made it simple for the users to understand what had happened so that they could take remedial action, an awful lot of trouble would have been saved.
Now, I'm not saying that the designers should have foreseen this failure and taken appropriate precautions - it's very easy to say that with the benefit of hindsight. But hindsight is a terrible thing, no good to anyone; it only serves to point fingers at unfortunate individuals who have done no more or less than you or I would have done in the same circumstances. What we can do is learn from experience.
There are two lessons for me. The first is that care should be taken to ensure that trace heating is not isolated during cold spells. The second, and more important, is the more general reminder that systems should not expect too much from their human counterparts. You can have the most complex and comprehensive automated system in the world, but add a human element and you have to start anticipating failures.
How can a designer successfully anticipate all the potential problems that are introduced along with the humans? Well, they can't. But they can do their best by talking to the existing users of systems similar to the ones they are designing. Talking to the caretakers and the handymen, to the people who test the fire alarms and the associated systems, to the people who actually run things day to day, might help designers better understand how to make systems that are less prone to human failure than some of the ones we see today.
Stu