
Merge *k* sorted linked lists and return it as one sorted list. Analyze and describe its complexity.

## Solution

#### Approach #1 Brute Force [Accepted]
**Intuition & Algorithm**

- Traverse all the linked lists and collect the values of the nodes into an array.
- Sort and iterate over this array to get the proper value of nodes.
- Create a new sorted linked list and extend it with the new nodes.

As for sorting, you can refer here for more about sorting algorithms.

**Complexity Analysis**

- Time complexity: $O(N \log N)$, where $N$ is the total number of nodes.
  - Collecting all the values costs $O(N)$ time.
  - A stable sorting algorithm costs $O(N \log N)$ time.
  - Iterating to create the linked list costs $O(N)$ time.
- Space complexity: $O(N)$.
  - Sorting costs $O(N)$ space (this depends on the algorithm you choose).
  - Creating a new linked list costs $O(N)$ space.
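The three steps above can be sketched as follows; the `ListNode` class and the function name are assumptions for a self-contained example (LeetCode's actual solution wraps this in a `Solution` class):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def merge_k_lists_brute_force(lists):
    # Step 1: traverse every list and collect all node values.
    values = []
    for head in lists:
        while head:
            values.append(head.val)
            head = head.next
    # Step 2: sort the collected values.
    values.sort()
    # Step 3: build a new sorted linked list from the sorted values.
    dummy = tail = ListNode()
    for v in values:
        tail.next = ListNode(v)
        tail = tail.next
    return dummy.next
```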

#### Approach #2 Compare one by one [Accepted]
**Algorithm**

- Compare every node (the head of each linked list) and select the node with the smallest value.
- Extend the final sorted linked list with the selected node.


**Complexity Analysis**

- Time complexity: $O(kN)$, where $k$ is the number of linked lists and $N$ is the total number of nodes.
  - Almost every selection of a node for the final linked list costs $O(k)$ ($k - 1$ comparisons).
  - There are $N$ nodes in the final linked list.
- Space complexity:
  - $O(N)$ for creating a new linked list.
  - $O(1)$ if we apply the in-place method: connect the selected nodes instead of creating new nodes to fill the new linked list.
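A minimal sketch of this approach, using the in-place connection described above (the `ListNode` class and function name are assumptions for illustration):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def merge_k_compare_one_by_one(lists):
    heads = [h for h in lists if h]  # current front node of each non-empty list
    dummy = tail = ListNode()
    while heads:
        # O(k) scan: find the index of the head with the smallest value.
        i = min(range(len(heads)), key=lambda j: heads[j].val)
        node = heads[i]
        tail.next = node       # in-place: connect the existing node
        tail = node
        if node.next:
            heads[i] = node.next  # advance within that list
        else:
            heads.pop(i)          # that list is exhausted
    return dummy.next
```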

#### Approach #3 Optimize Approach 2 by Priority Queue [Accepted]
**Algorithm**

Almost the same as the approach above, but the **comparison process** is optimized with a **priority queue**. You can refer here for more information about it.

**Complexity Analysis**

- Time complexity: $O(N \log k)$, where $k$ is the number of linked lists and $N$ is the total number of nodes.
  - The comparison cost is reduced to $O(\log k)$ for every pop and insertion on the priority queue, while finding the node with the smallest value costs only $O(1)$ time.
  - There are $N$ nodes in the final linked list.
- Space complexity:
  - $O(N)$ for creating a new linked list.
  - $O(1)$ if we apply the in-place method. The priority queue (often implemented with a heap) costs $O(k)$ space, which is far less than $N$ in most situations.
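A sketch of the priority-queue version using Python's `heapq` (the `ListNode` class and function name are assumptions for illustration):

```python
import heapq

class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def merge_k_lists_heap(lists):
    # Seed the heap with (value, list index, node); the index breaks ties
    # so that ListNode objects themselves are never compared.
    heap = [(h.val, i, h) for i, h in enumerate(lists) if h]
    heapq.heapify(heap)
    dummy = tail = ListNode()
    while heap:
        _, i, node = heapq.heappop(heap)  # O(log k) pop of the smallest head
        tail.next = node                  # in-place connection
        tail = node
        if node.next:
            heapq.heappush(heap, (node.next.val, i, node.next))
    return dummy.next
```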

#### Approach #4 Merge lists one by one [Accepted]
**Algorithm**

Convert the merge-$k$-lists problem into $(k-1)$ merge-two-lists subproblems. Here is the merge two lists problem page.

**Complexity Analysis**

- Time complexity: $O(kN)$, where $k$ is the number of linked lists and $N$ is the total number of nodes.
  - We can merge two sorted linked lists in $O(n)$ time, where $n$ is the total number of nodes in the two lists.
  - Summing up the merge process: $\sum_{i=1}^{k-1} \left( \frac{iN}{k} + \frac{N}{k} \right) = O(kN)$.
- Space complexity: $O(1)$.
  - We can merge two sorted linked lists in $O(1)$ space.
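A sketch of this approach: fold the $k$ lists into the result one at a time, for $k - 1$ two-way merges (the `ListNode` class and function names are assumptions for illustration):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def merge_two_lists(a, b):
    # Standard O(n) two-way merge using O(1) extra space.
    dummy = tail = ListNode()
    while a and b:
        if a.val <= b.val:
            tail.next, a = a, a.next
        else:
            tail.next, b = b, b.next
        tail = tail.next
    tail.next = a or b
    return dummy.next

def merge_k_sequential(lists):
    # Fold the k lists into the accumulated result: k - 1 two-way merges.
    merged = None
    for head in lists:
        merged = merge_two_lists(merged, head)
    return merged
```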

#### Approach #5 Merge with Divide And Conquer [Accepted]
**Intuition & Algorithm**

This approach follows the same idea as the one above, but is much improved: we don't need to traverse most nodes repeatedly.

- Pair up the $k$ lists and merge each pair.
- After the first pairing, the $k$ lists are merged into $k/2$ lists with an average length of $2N/k$, then into $k/4$ lists, $k/8$ lists, and so on.
- Repeat this procedure until we get the final sorted linked list.

Thus, we traverse almost $N$ nodes per pairing-and-merging round, and repeat this procedure about $\log_{2} k$ times.

**Complexity Analysis**

- Time complexity: $O(N \log k)$, where $k$ is the number of linked lists and $N$ is the total number of nodes.
  - We can merge two sorted linked lists in $O(n)$ time, where $n$ is the total number of nodes in the two lists.
  - Summing up the merge process: $\sum_{i=1}^{\log_{2} k} N = O(N \log k)$.
- Space complexity: $O(1)$.
  - We can merge two sorted linked lists in $O(1)$ space.
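The pairing procedure can be sketched with an interval that doubles each round (the `ListNode` class and function names are assumptions for illustration):

```python
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def merge_two_lists(a, b):
    # Standard O(n), O(1)-space two-way merge.
    dummy = tail = ListNode()
    while a and b:
        if a.val <= b.val:
            tail.next, a = a, a.next
        else:
            tail.next, b = b, b.next
        tail = tail.next
    tail.next = a or b
    return dummy.next

def merge_k_divide_and_conquer(lists):
    lists = list(lists)
    if not lists:
        return None
    interval = 1
    while interval < len(lists):
        # Merge pairs of lists that are `interval` apart; each round
        # touches about N nodes in total, and there are about log2(k) rounds.
        for i in range(0, len(lists) - interval, interval * 2):
            lists[i] = merge_two_lists(lists[i], lists[i + interval])
        interval *= 2
    return lists[0]
```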

Analysis written by: @Hermann0
