T(n) = T(n-1) + T(n/2) + n can also be written as T(n) = f(n) + g(n), where

f(n) = f(n-1) + n and g(n) = g(n/2) + n

Solving both: f(n) unrolls to the sum n + (n-1) + ... + 1, giving O(n^{2}), and g(n) unrolls to the geometric sum n + n/2 + n/4 + ..., which is at most 2n, giving O(n).
So the complexity of T(n) is the larger of O(n^{2}) and O(n), and thus the time complexity is O(n^{2}).
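The two auxiliary recurrences can be checked numerically. Below is a minimal sketch in Python; the base cases f(1) = g(1) = 1 and integer halving for n/2 are assumptions, since the post does not state them:

```python
def f(n):
    # f(n) = f(n-1) + n with assumed base case f(1) = 1:
    # unrolls to 1 + 2 + ... + n = n(n+1)/2, i.e. Theta(n^2).
    total = 0
    for k in range(1, n + 1):
        total += k
    return total

def g(n):
    # g(n) = g(n/2) + n with assumed base case g(1) = 1 and integer halving:
    # unrolls to n + n/2 + n/4 + ... <= 2n, i.e. Theta(n).
    total = 0
    while n >= 1:
        total += n
        n //= 2
    return total

print(f(100))   # 5050 = 100*101/2
print(g(1024))  # 2047, below 2*1024
```

Since f dominates g, the combined bound is governed by the chain of f(n-1) calls.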

Just take the term which is asymptotically larger. Here T(n/2) is dominated by T(n-1), so keeping only the dominant term we can conclude T(n) ≈ T(n-1) + n.

Let us apply the recursion-tree method for solving this recurrence problem:

The cost of the root is n, because that is the combined cost of dividing the problem and merging back the results of the conquered subproblems. You can see that the height of the subtree is largest along the T(n-1) branch; this is equivalent to the fact mentioned by @arpitdhuriya that T(n-1) is asymptotically larger.
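The leftmost spine of that tree, the chain of T(n-1) calls, already accounts for the quadratic cost on its own. A small sketch (assuming the cost at depth d along that spine is n - d):

```python
def spine_cost(n):
    # Cost along the T(n-1) spine of the recursion tree:
    # the root costs n, its T(n-1) child costs n-1, ..., down to 1.
    return sum(n - d for d in range(n))

print(spine_cost(100))  # 5050
print(100 * 101 // 2)   # 5050, the closed form n(n+1)/2
```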

Hence the complexity would be O(n^{2})
More precisely, Ω(n^{2})
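The Ω(n^{2}) claim can be made explicit by dropping the T(n/2) term, which only makes the right-hand side smaller, and unrolling the remaining chain:

```latex
T(n) \ge T(n-1) + n
     \ge T(n-2) + (n-1) + n
     \ge \dots
     \ge T(1) + \sum_{k=2}^{n} k
     = \Omega(n^{2}).
```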

Thanks a lot, arpit.

I know this, but if you do it this way you get Ω (a lower bound), not big-O.

I needed the answer, and even though I know the answer, I wanted to understand the approach.
