Described mathematically, the runtime of a divide and conquer algorithm on a problem of size n, usually denoted T(n), can be expressed by the recurrence relation T(n) = aT(n/b) + f(n), where f(n) is the time to create the subproblems and combine their results, a is the number of subproblems, and b is the factor by which each subproblem's size is reduced. Algorithms in this family, such as merge sort and quick sort, differ in the cost of dividing and combining, in how quickly the problem size shrinks, and in how many subproblems are created. The master theorem allows many recurrence relations of this form to be converted to big-O notation directly. Both merge sort and quick sort come out to O(n log n) in the average case; still, depending on the nature of the problem, one approach can be a few or many times faster than the other.
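The recurrence can be read directly off the code. Here is a minimal merge sort sketch in Python (an illustrative implementation, not taken from any particular library): a = 2 subproblems, b = 2 since each half is half the size, and f(n) = O(n) for splitting and merging, giving T(n) = 2T(n/2) + O(n), which the master theorem resolves to O(n log n).

```python
def merge_sort(items):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(items) <= 1:
        return items
    # Divide: two subproblems (a = 2), each half the size (b = 2).
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Combine: merge the two sorted halves in linear time, f(n) = O(n).
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```

Note that all the real work happens in the combine step; the divide step is trivial. Quick sort inverts this trade: the partitioning (divide) does the work and the combine step is free.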
Merge sort represents the most common human experience:
Easy to say, but how do we actually do it?
There are a few practical problems. First of all, how do I know where to break down the big problem when I know nothing about the problem at all? The practical answer is to ask someone who knows enough to answer. Suggesting a good dividing line takes them as little energy as a pointer, but would take the non-initiated me tons of energy to find a split better than shooting in the dark.
Secondly, how do I know when to stop asking, put my hands down, and do the dirty work? The answer is: when the problem is small enough. A whole system might be hard to understand, but how about a module? Maybe a package? The main class of a program? Start from the first line of the code, engage, and see what happens.
Thirdly, putting things together is not as easy as it sounds. How do I know my way of putting things together is correct? I need to study the architects and learn the rules of putting things together, which is the extra energy set aside for killing the merging entropy.
Simple thoughts, little by little, knowing the rules: it sounds like merge sort.