A lot of compiler tasks are pattern matching on tree structures.
Both OCaml and Haskell have powerful and concise pattern matching capabilities.
It's harder to add pattern matching to imperative languages, because whatever expression is being evaluated or destructured to match against a pattern must be side-effect free.
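To make that concrete, here is a tiny sketch in OCaml (the `expr` type and its constructor names are made up for illustration): an evaluator that matches directly on the shape of a tree.

```ocaml
(* A toy expression tree; the type and constructor names are illustrative. *)
type expr =
  | Num of int
  | Add of expr * expr
  | Mul of expr * expr

(* Pattern matching takes the tree apart and binds its parts in one step. *)
let rec eval = function
  | Num n -> n
  | Add (a, b) -> eval a + eval b
  | Mul (a, b) -> eval a * eval b
```

Each case of the match handles one node shape; there is no manual tag-checking or casting, which is most of what the equivalent imperative code would consist of.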
A compiler works a lot with trees. The source code is parsed into a syntax tree. That tree might then be transformed into another tree with type annotations to perform type checking. Next you might convert that tree into a tree containing only core language elements (desugaring syntactic sugar into a plainer form). Then you might perform various optimizations, which are basically tree-to-tree transformations. After that you would probably create a tree in some normal form and then iterate over that tree to generate the target (assembly) code.
Functional languages have features like pattern matching and good support for efficient recursion, which make it easy to work with trees; that's why they're generally considered good languages for writing compilers.
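As a sketch of one of those tree-to-tree passes, here is constant folding in OCaml (the `expr` type is a made-up toy language): the pass recursively rebuilds the tree, collapsing any subtree whose operands are already constants.

```ocaml
(* A toy expression tree; illustrative only. *)
type expr = Num of int | Add of expr * expr | Mul of expr * expr

(* One optimization pass: constant folding, a pure tree-to-tree transformation. *)
let rec fold = function
  | Num n -> Num n
  | Add (a, b) ->
    (match fold a, fold b with
     | Num x, Num y -> Num (x + y)
     | a', b' -> Add (a', b'))
  | Mul (a, b) ->
    (match fold a, fold b with
     | Num x, Num y -> Num (x * y)
     | a', b' -> Mul (a', b'))
```

Each of the pipeline stages above has this same shape: a recursive function from one tree type to another.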
One important factor to consider is that a big milestone in any compiler project is when you can self-host the compiler and "eat your own dog food." For this reason, when you look at languages like OCaml that were designed for language research, they tend to have great features for compiler-type problems.
In my last compiler-esque job we used OCaml for exactly this reason while manipulating C code; it was just the best tool around for the task. If the folks at INRIA had built OCaml with different priorities, it might not have been such a good fit.
That said, functional languages are the best tool for solving any problem, so it logically follows that they are the best tool for solving this particular problem. QED.
/me: crawls back to my Java tasks a little less joyfully...
Seems like everyone missed another important reason. It's quite easy to write an embedded domain-specific language (EDSL) for parsers that looks a lot like (E)BNF in normal code. Parser combinators like Parsec are quite easy to write in functional languages using higher-order functions and function composition. Not only is it easy, it is very elegant.
Basically, you represent the simplest generic parsers as plain functions, and you have special operations (typically higher-order functions) that let you compose these primitive parsers into more complicated, more specific parsers for your grammar.
This is not the only way to build parser frameworks, of course.
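Here is a minimal sketch of the idea in OCaml; all names are illustrative, and real libraries handle errors and backtracking far more carefully. A parser is just a function from input to an optional (result, remaining input) pair, and the combinators are higher-order functions over those.

```ocaml
(* A parser consumes a prefix of the input and yields a result plus the rest. *)
type 'a parser = char list -> ('a * char list) option

(* Primitive parser: consume one char satisfying a predicate. *)
let satisfy p : char parser = function
  | c :: rest when p c -> Some (c, rest)
  | _ -> None

let digit = satisfy (fun c -> c >= '0' && c <= '9')

(* Sequencing combinator: run p, feed its result to f. *)
let ( >>= ) (p : 'a parser) (f : 'a -> 'b parser) : 'b parser =
 fun input ->
  match p input with
  | Some (x, rest) -> f x rest
  | None -> None

(* Choice combinator: try p, fall back to q. Together, >>= and <|>
   read much like the sequencing and alternation of (E)BNF. *)
let ( <|> ) (p : 'a parser) (q : 'a parser) : 'a parser =
 fun input -> match p input with Some _ as r -> r | None -> q input

(* One or more repetitions of p. *)
let rec many1 (p : 'a parser) : 'a list parser =
  p >>= fun x ->
  fun input ->
    match many1 p input with
    | Some (xs, rest) -> Some (x :: xs, rest)
    | None -> Some ([ x ], input)

(* A number literal, built purely by composing the primitives above. *)
let number : int parser =
  many1 digit >>= fun ds ->
  fun input ->
    Some (int_of_string (String.concat "" (List.map (String.make 1) ds)), input)
```

The grammar rule "number ::= digit+" is visible almost verbatim in the definition of `number`; that correspondence is the whole appeal.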
Basically, a compiler is a transformation from one set of code to another — from source to IR, from IR to optimized IR, from IR to assembly, etc. This is precisely the sort of thing functional languages are designed for — a pure function is just a transformation from one thing to another. Imperative functions don't have this quality. Although you can write this kind of code in an imperative language, functional languages are specialized for it.
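For instance, code generation for a toy expression language to a made-up stack-machine IR is literally one pure function (both types here are invented for the sketch):

```ocaml
(* Source representation. *)
type expr = Num of int | Add of expr * expr

(* Target representation: a toy stack-machine IR. *)
type instr = Push of int | IAdd

(* Code generation is a pure function from one representation to another. *)
let rec compile = function
  | Num n -> [ Push n ]
  | Add (a, b) -> compile a @ compile b @ [ IAdd ]
```

Every stage of a compiler has this signature shape, so the whole pipeline is just function composition.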
One possibility is that a compiler tends to have to deal very carefully with a whole host of corner cases. Getting the code right is often made easier by using design patterns that structure the implementation in a way that parallels the rules it implements. Often that ends up being a declarative (pattern matching, "where") rather than imperative (sequencing, "when") design and thus easier to implement in a declarative language (and most of them are functional).
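As a sketch of that "where" style, here is a toy type checker in OCaml (types and rules invented for illustration): each match arm states one typing rule declaratively, mirroring how the rules would appear in a language spec, and the compiler warns if a constructor, i.e. a corner case, is left unhandled.

```ocaml
type ty = TInt | TBool

type expr =
  | Num of int
  | Bool of bool
  | Add of expr * expr
  | If of expr * expr * expr

(* Each arm is one typing rule. A missing arm triggers a
   non-exhaustive-match warning, surfacing forgotten corner cases. *)
let rec type_of = function
  | Num _ -> Some TInt
  | Bool _ -> Some TBool
  | Add (a, b) ->
    (match type_of a, type_of b with
     | Some TInt, Some TInt -> Some TInt
     | _ -> None)
  | If (c, t, e) ->
    (match type_of c, type_of t, type_of e with
     | Some TBool, Some t1, Some t2 when t1 = t2 -> Some t1
     | _ -> None)
```

The implementation reads like the rules it implements, which is exactly the parallel structure described above.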