Given an array of integers nums, sort the array in ascending order and return it.
You must solve the problem without using any built-in functions in O(n log(n)) time complexity and with the smallest space complexity possible.
Example 1:
Input: nums = [5,2,3,1]
Output: [1,2,3,5]
Explanation: After sorting the array, the positions of some numbers are not changed (for example, 2 and 3), while the positions of other numbers are changed (for example, 1 and 5).
Example 2:
Input: nums = [5,1,1,2,0,0]
Output: [0,0,1,1,2,5]
Explanation: Note that the values of nums are not necessarily unique.
Constraints:
1 <= nums.length <= 5 * 10^4
-5 * 10^4 <= nums[i] <= 5 * 10^4
The problem asks us to sort an array of integers in ascending order without using any built-in sorting functions, in O(n log n) time, and with the smallest possible space complexity. This points toward a comparison-based sorting algorithm such as Merge Sort or Quick Sort; the provided solutions demonstrate both approaches.
Quick Sort is a divide-and-conquer algorithm. It selects a 'pivot' element and partitions the array around it so that elements smaller than the pivot come before it and elements greater than the pivot come after it; for example, partitioning [5,2,3,1] around the pivot value 3 leaves 2 and 1 before it and 5 after it. The same process is then applied recursively to the subarrays on either side of the pivot.
Time Complexity: O(n log n) in the average and best cases; O(n^2) in the worst case, which occurs when the pivot selection consistently produces highly unbalanced partitions (for example, always picking the first element of an already-sorted array). The provided implementations mitigate this risk by randomizing the pivot (randint in Python) or choosing the middle element (in the other languages).
Space Complexity: O(log n) on average due to the recursive calls (depth of recursion is proportional to log n). In the worst case, it can be O(n).
Code Breakdown (Python Example):
from random import randint
from typing import List


class Solution:
    def sortArray(self, nums: List[int]) -> List[int]:
        def quick_sort(l, r):
            if l >= r:  # base case: a subarray with zero or one element is already sorted
                return
            x = nums[randint(l, r)]  # random pivot selection
            # Three-way partition invariant: nums[l..i] < x, nums[i+1..k-1] == x, nums[j..r] > x
            i, j, k = l - 1, r + 1, l
            while k < j:
                if nums[k] < x:
                    nums[i + 1], nums[k] = nums[k], nums[i + 1]  # move smaller element into the left region
                    i, k = i + 1, k + 1
                elif nums[k] > x:
                    j -= 1
                    nums[j], nums[k] = nums[k], nums[j]  # move larger element into the right region
                else:
                    k += 1  # element equal to pivot stays in the middle region
            quick_sort(l, i)  # recursively sort the elements smaller than the pivot
            quick_sort(j, r)  # recursively sort the elements larger than the pivot

        quick_sort(0, len(nums) - 1)
        return nums
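A quick usage sketch (the class Solution wrapper above and the sample call below are illustrative additions, not part of the original statement):

nums = [5, 2, 3, 1]
print(Solution().sortArray(nums))  # expected output: [1, 2, 3, 5]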
The other language implementations follow similar logic; they vary in how the pivot is selected and how the partitioning is performed (some versions use the Hoare partition scheme, sketched below).
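For reference, here is a minimal sketch of quick sort with the Hoare partition scheme and a middle-element pivot. The standalone helper name quick_sort_hoare and its exact structure are assumptions for illustration, not the exact code of the other language solutions:

def quick_sort_hoare(nums, l, r):
    if l >= r:
        return
    x = nums[(l + r) // 2]  # middle element as the pivot value
    i, j = l - 1, r + 1
    while True:
        # advance i to an element >= pivot, and j to an element <= pivot
        i += 1
        while nums[i] < x:
            i += 1
        j -= 1
        while nums[j] > x:
            j -= 1
        if i >= j:
            break
        nums[i], nums[j] = nums[j], nums[i]  # put the out-of-place pair on the correct sides
    quick_sort_hoare(nums, l, j)      # left part: elements <= pivot
    quick_sort_hoare(nums, j + 1, r)  # right part: elements >= pivot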
Merge Sort is another divide-and-conquer algorithm. It recursively divides the array into halves until each subarray contains only one element (which is inherently sorted). Then, it repeatedly merges the sorted subarrays to produce new sorted subarrays until a single sorted array is obtained.
Time Complexity: O(n log n) in all cases (best, average, and worst). Merge Sort's time complexity is consistent regardless of the input data.
Space Complexity: O(n) because it requires an auxiliary array of size n for merging. This is a higher space complexity than Quick Sort's average case.
Code Breakdown (Python Example):
from typing import List


class Solution:
    def sortArray(self, nums: List[int]) -> List[int]:
        def merge_sort(l, r):
            if l >= r:  # base case: zero or one element is already sorted
                return
            mid = (l + r) // 2
            merge_sort(l, mid)       # sort the left half
            merge_sort(mid + 1, r)   # sort the right half
            i, j = l, mid + 1
            tmp = []
            while i <= mid and j <= r:  # merge step: repeatedly take the smaller front element
                if nums[i] <= nums[j]:
                    tmp.append(nums[i])
                    i += 1
                else:
                    tmp.append(nums[j])
                    j += 1
            tmp.extend(nums[i:mid + 1])  # append whatever remains of the left half
            tmp.extend(nums[j:r + 1])    # append whatever remains of the right half
            nums[l:r + 1] = tmp          # copy the merged result back into the original array

        merge_sort(0, len(nums) - 1)
        return nums
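A quick usage sketch mirroring Example 2 (the sample call is an illustrative addition):

nums = [5, 1, 1, 2, 0, 0]
print(Solution().sortArray(nums))  # expected output: [0, 0, 1, 1, 2, 5]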
The other language implementations follow the same merge sort steps. The merging process is the key component: it efficiently combines two sorted subarrays into one, as in the standalone sketch below.
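To isolate that merge step, here is a minimal standalone sketch that merges two already-sorted Python lists. The helper name merge_sorted is hypothetical and not part of the provided solutions:

def merge_sorted(left, right):
    # Merge two already-sorted lists into a single sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # at most one of these two extends adds elements
    merged.extend(right[j:])
    return merged

print(merge_sorted([1, 3, 5], [0, 2, 2]))  # [0, 1, 2, 2, 3, 5]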
This solution provides an alternative Java implementation of Merge Sort, very similar in functionality and complexity to Solution 2's Java code. The key difference might be in minor details of the merging process or array handling, but the overall algorithmic approach and complexities remain the same.
In summary: both Quick Sort and Merge Sort meet the problem's requirements. Merge Sort guarantees O(n log n) time in every case but needs O(n) auxiliary space, while Quick Sort averages O(n log n) with only O(log n) expected stack space but can degrade to O(n^2) in the worst case. The choice between them depends on whether predictable worst-case performance or lower space usage is the priority; Merge Sort's guaranteed O(n log n) bound and straightforward implementation make it preferable for applications demanding predictable performance.
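As a final sanity check (outside the problem's no-built-ins constraint, which applies only to the solution itself), either implementation can be compared against Python's sorted() on random inputs drawn from the constraint range. The helper brute_check below is an illustrative sketch, not part of the original solutions:

from random import randint

def brute_check(sort_fn, trials=200):
    # Compare a candidate sorting function against Python's sorted() oracle.
    for _ in range(trials):
        nums = [randint(-50000, 50000) for _ in range(randint(1, 200))]
        assert sort_fn(list(nums)) == sorted(nums)

brute_check(lambda a: Solution().sortArray(a))  # works for either Solution class above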