That's not how thermodynamics works; it's a really old myth that never made any sense. By that argument, wouldn't you want the least flow possible for the best possible cooling?
You want to get heat out of the engine as fast as practically possible, and you want the deltaT between the coolant and the air passing through the radiator to be as big as possible for the best cooling efficiency. There's always a point where that curve flattens off, though, and beyond it more flow really doesn't add anything of value.
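That flattening curve is easy to see if you model the radiator as a simple heat exchanger with the coolant as the limiting stream (effectiveness-NTU method). This is just a sketch with made-up but plausible numbers (the UA, temperatures, and specific heat are illustrative, not from any real car): as flow goes up, the per-pass temperature drop shrinks, but total heat rejected keeps rising, approaching a ceiling set by the radiator itself.

```python
import math

def radiator_heat_rejection(m_dot, cp=3800.0, UA=900.0, t_in=95.0, t_air=30.0):
    """Heat rejected (W) by a radiator modeled as a heat exchanger where
    the coolant is the minimum-capacity stream (air flow assumed large).
    m_dot: coolant mass flow (kg/s); cp: coolant specific heat (J/kg-K);
    UA: overall conductance (W/K). All values are illustrative only.
    """
    C = m_dot * cp                  # coolant capacity rate (W/K)
    ntu = UA / C                    # number of transfer units
    eff = 1.0 - math.exp(-ntu)      # effectiveness as C_min/C_max -> 0
    return eff * C * (t_in - t_air)

for m_dot in (0.25, 0.5, 1.0, 2.0, 4.0):
    q = radiator_heat_rejection(m_dot)
    drop = q / (m_dot * 3800.0)     # coolant temp drop per pass (K)
    print(f"{m_dot:4.2f} kg/s -> {q/1000:5.1f} kW rejected, {drop:4.1f} K drop per pass")
```

Running it, the slowest flow gives the biggest per-pass temperature drop but the *least* total heat rejected, and doubling flow past a point barely moves the needle: the "coolant needs time in the radiator" intuition confuses per-pass deltaT with total heat transfer.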
Slow flow in the radiator (more time in the radiator) also means slow flow in the engine (more time in the engine), which leads to all sorts of problems: lower heat transfer and uneven cooling within the engine.
That said, the rest of the cooling system needs to be able to take advantage of a higher-flow pump, and I think most of these weird myths come from unexpected things happening. Stuff like too much pressure loss across a radiator or thermostat causing a high-flow pump to cavitate, which can make it perform terribly.
If your cooling system sucks, a high-flow pump may not solve any problems, and it may even make things worse, but it's not because the coolant spends too little time inside the radiator.