Stephen Gilmore wrote a nice set of tutorial notes: Programming in Standard ML ’97. There is a chapter on types and type inference. He praises ML’s compile-time type checking, which makes runtime type errors impossible. He cites C as an example of those languages that “neither guarantee to enforce type-correctness when the program is compiled nor when it is running.” It is interesting to note that C programmers have since invented (or recommended) Hungarian notation to minimize such type errors syntactically.
While type inference saves the programmer the tedious task of declaring types (for variables, function parameters, and return values), it can be too smart: a typo that somehow does not cause a syntax error can lead to a weird type being deduced. This may be easy to spot in the top-level interactive mode, but hard to find when a large program is compiled in batch mode.
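For instance, here is a small sketch in Standard ML of how a one-character typo can slip past the type checker and merely change the inferred type (the `member` function and the typo are my own illustration, not from Gilmore’s notes):

```sml
(* Intended: a membership test of type ''a -> ''a list -> bool.
   The typo "x = ys" (instead of "x = y") still compiles fine,
   because unification happily concludes that x must be a list. *)
fun member x []        = false
  | member x (y :: ys) = x = ys orelse member x ys
```

In the interactive top level the oddity is immediately visible, since the REPL prints the inferred type `''a list -> ''a list -> bool`. In batch compilation of a large program, however, the error would only surface at some distant call site such as `member 1 [1, 2, 3]`, far from the actual mistake.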
Robin Milner, who developed ML, worked out an algorithm for type inference in 1978. He simply called it Algorithm W.