100 MCQ DSA Questions
BY: coding_error1
• b) Stack
• c) Queue
• d) Graph
10. Which is faster: array or linked list (for access by index)?
• a) Array ✔
• b) Linked List
• c) Both
• d) None
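A minimal Java sketch of the difference behind Q10 (the class name AccessDemo is only illustrative): an array-backed list jumps straight to index i, while a LinkedList has to walk i nodes to reach it.

import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class AccessDemo {
    public static void main(String[] args) {
        List<Integer> array = new ArrayList<>();   // array-backed: get(i) is O(1)
        List<Integer> linked = new LinkedList<>(); // node chain: get(i) walks i links, O(n)
        for (int i = 0; i < 1000; i++) { array.add(i); linked.add(i); }
        System.out.println(array.get(500));  // direct jump to the slot
        System.out.println(linked.get(500)); // traverses 500 nodes internally
    }
}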
11. What is the default value of elements in an int array in Java?
• a) -1
• b) 1
• c) 0 ✔
• d) undefined
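A quick Java check of the answer to Q11 (DefaultValueDemo is just an illustrative name): elements of a freshly allocated int array are zero-initialised by the JVM.

public class DefaultValueDemo {
    public static void main(String[] args) {
        int[] a = new int[3];     // numeric arrays are zero-initialised
        System.out.println(a[0]); // prints 0
    }
}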
12. What is the index of the first element in an array?
• a) 0 ✔
• b) 1
• c) -1
• d) Depends on language
13. Which method is used to reverse a string in Java?
• a) reverse()
• b) reverseString()
• c) StringBuilder.reverse() ✔
• d) strrev()
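A short Java sketch of the idiom behind the answer to Q13: wrap the String in a StringBuilder, call reverse(), and convert back.

public class ReverseDemo {
    public static void main(String[] args) {
        String s = "hello";
        String reversed = new StringBuilder(s).reverse().toString();
        System.out.println(reversed); // olleh
    }
}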
14. What data structure would you use for implementing a leaderboard?
• a) ArrayList
• b) Priority Queue ✔
• c) HashMap
• d) Stack
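A minimal leaderboard sketch in Java for Q14, assuming plain integer scores: java.util.PriorityQueue is a min-heap by default, so a reversed comparator keeps the highest score at the head.

import java.util.Comparator;
import java.util.PriorityQueue;

public class LeaderboardDemo {
    public static void main(String[] args) {
        PriorityQueue<Integer> leaderboard = new PriorityQueue<>(Comparator.reverseOrder());
        leaderboard.add(120);
        leaderboard.add(300);
        leaderboard.add(75);
        System.out.println(leaderboard.peek()); // 300: current leader, read in O(1)
    }
}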
15. Which of the following is best for random access?
• a) Array ✔
• b) Linked List
• c) Stack
• d) Queue
• a) O(n)
• b) O(1) ✔
• c) O(log n)
• d) O(n log n)
• a) 1 pointer
• b) 2 pointers ✔
• c) 3 pointers
• d) 0 pointers
• a) DFS
• b) Two pointers (slow/fast) ✔
• c) BFS
• d) Sort
• a) Insertion at head ✔
• b) Deletion at tail
• c) Random access
• d) Sorting
21. What is the time complexity to delete a node at position n in a singly linked list?
• a) O(1)
• b) O(n) ✔
• c) O(n log n)
• d) O(log n)
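A hedged Java sketch of why Q21's deletion is O(n) in a singly linked list (the Node class and deleteAt are illustrative, not from the source): the unlink itself is O(1), but reaching the node requires walking from the head.

public class SinglyLinkedList {
    static class Node { int val; Node next; Node(int v) { val = v; } }
    Node head;

    // Delete the node at 0-based position pos: O(pos) to walk there, O(1) to unlink.
    void deleteAt(int pos) {
        if (head == null) return;
        if (pos == 0) { head = head.next; return; }
        Node prev = head;
        for (int i = 0; i < pos - 1 && prev.next != null; i++) prev = prev.next;
        if (prev.next != null) prev.next = prev.next.next; // bypass the target node
    }
}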
• a) Contiguous
• b) Non-contiguous ✔
• c) Fixed
• d) None
• a) For recursion
• b) For circular list
• c) To simplify operations ✔
• d) To reduce memory
• a) Array
• b) Linked List ✔
• c) Queue
• d) Tree
• a) Fixed size
• b) Better cache
• c) Dynamic size ✔
• d) Faster indexing
• a) Queue
• b) Stack ✔
• c) Array
• d) Linked List
• a) Stack
• b) Queue ✔
• c) Tree
• d) Graph
• a) O(1) ✔
• b) O(n)
• c) O(log n)
• d) O(n log n)
• a) pop()
• b) push()
• c) enqueue() ✔
• d) delete()
• a) dequeue() ✔
• b) push()
• c) pop()
• d) insert()
• a) Queue
• b) Stack ✔
• c) Tree
• d) Graph
• a) Array
• b) Stack ✔
• c) Queue
• d) Tree
• a) Program crashes ✔
• b) Data is lost
• c) Returns 0
• d) Starts again
• a) O(1) ✔
• b) O(n)
• c) O(log n)
• d) O(n log n)
• a) Array ✔
• b) Graph
• c) Tree
• d) HashMap
• a) Inorder
• b) Postorder ✔
• c) Preorder
• d) Level Order
38. In a BST, what is the time complexity of search in the average case?
• a) O(1)
• b) O(n)
• c) O(log n) ✔
• d) O(n log n)
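A small Java sketch of BST search for Q38 (Node and contains are illustrative names): each comparison discards one whole subtree, which is why a reasonably balanced tree averages O(log n).

public class BstSearch {
    static class Node { int key; Node left, right; Node(int k) { key = k; } }

    static boolean contains(Node root, int key) {
        Node cur = root;
        while (cur != null) {
            if (key == cur.key) return true;
            cur = (key < cur.key) ? cur.left : cur.right; // drop half the tree each step
        }
        return false;
    }
}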
• a) Preorder
• b) Inorder ✔
• c) Postorder
• d) Level Order
• a) Stack
• b) Queue ✔
• c) Array
• d) Graph
• a) 0 ✔
• b) 1
• c) -1
• d) 2
• a) Binary Tree
• b) Balanced Binary Search Tree ✔
• c) Heap
• d) Graph
• a) Inorder
• b) BFS ✔
• c) Preorder
• d) Postorder
45. What is the number of null links in a binary tree with n nodes?
• a) n
• b) n+1 ✔
• c) 2n
• d) 2n+1
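Why n+1 for Q45: the n nodes expose 2n child links in total, and exactly n - 1 of them are used as edges (every node except the root is pointed to by one parent), leaving 2n - (n - 1) = n + 1 null links.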
• a) Queue
• b) Stack ✔
• c) Priority Queue
• d) Hash Table
• a) Kruskal
• b) Dijkstra
• c) Topological Sort ✔
• d) Prim
• a) DFS
• b) BFS ✔
• c) Inorder
• d) Preorder
• a) O(V+E) ✔
• b) O(V^2)
• c) O(E^2)
• d) O(log V)
• a) DFS ✔
• b) BFS
• c) Prim
• d) Dijkstra
• a) DFS
• b) Prim’s ✔
• c) BFS
• d) Bellman-Ford
53. Which graph representation is most space-efficient for sparse graphs?
• a) Adjacency Matrix
• b) Adjacency List ✔
• c) Incidence Matrix
• d) Edge List
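A minimal Java adjacency-list sketch for Q53 (the build method and edge format are assumptions for illustration): storage is O(V + E), versus O(V^2) for an adjacency matrix, which is what makes it the better fit for sparse graphs.

import java.util.ArrayList;
import java.util.List;

public class SparseGraph {
    // edges[i] = {u, v} describes an undirected edge between vertices u and v.
    static List<List<Integer>> build(int vertices, int[][] edges) {
        List<List<Integer>> adj = new ArrayList<>();
        for (int v = 0; v < vertices; v++) adj.add(new ArrayList<>());
        for (int[] e : edges) {
            adj.get(e[0]).add(e[1]);
            adj.get(e[1]).add(e[0]);
        }
        return adj; // total list entries = 2E, so space grows with V + E
    }
}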
• a) n
• b) n^2
• c) n(n-1)/2 ✔
• d) n(n+1)/2
55. Which algorithm is used to find shortest paths when edge weights can be negative?
• a) Dijkstra
• b) Bellman-Ford ✔
• c) Prim
• d) Kruskal
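A compact Bellman-Ford sketch in Java for Q55 (the edge format and method name are illustrative; negative-cycle detection is omitted): unlike Dijkstra, it tolerates negative edge weights because it simply relaxes every edge V - 1 times.

import java.util.Arrays;

public class BellmanFord {
    // edges[i] = {from, to, weight}; weights may be negative, but no negative cycle is assumed.
    static int[] shortestPaths(int vertices, int[][] edges, int source) {
        int[] dist = new int[vertices];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[source] = 0;
        for (int pass = 0; pass < vertices - 1; pass++) { // V-1 rounds of relaxation
            for (int[] e : edges) {
                if (dist[e[0]] != Integer.MAX_VALUE && dist[e[0]] + e[2] < dist[e[1]]) {
                    dist[e[1]] = dist[e[0]] + e[2];
                }
            }
        }
        return dist;
    }
}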
• a) O(n)
• b) O(log n) ✔
• c) O(1)
• d) O(n log n)
57. Merge Sort follows which algorithmic paradigm?
• a) Greedy
• b) Divide and Conquer ✔
• c) Dynamic Programming
• d) Backtracking
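A short Java sketch showing the divide-and-conquer shape named in Q57 (mergeSort is an illustrative name): split the array, sort the halves recursively, then merge.

import java.util.Arrays;

public class MergeSortDemo {
    static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;               // conquer: trivially sorted
        int mid = a.length / 2;                    // divide
        int[] left = mergeSort(Arrays.copyOfRange(a, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        int[] out = new int[a.length];             // combine: merge two sorted halves
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }
}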
• a) Merge Sort
• b) Quick Sort
• c) Insertion Sort ✔
• d) Radix Sort
• a) Merge Sort
• b) Radix Sort ✔
• c) Heap Sort
• d) Quick Sort
• a) O(n log n)
• b) O(log n)
• c) O(n^2) ✔
• d) O(n)
• a) Stack
• b) Heap ✔
• c) Queue
• d) Tree
• a) O(n) ✔
• b) O(n log n)
• c) O(n^2)
• d) O(log n)
• a) Quick Sort
• b) Heap Sort
• c) Merge Sort ✔
• d) Selection Sort
64. What is the average-case time complexity of Quick Sort?
• a) O(n^2)
• b) O(n log n) ✔
• c) O(n)
• d) O(log n)
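A minimal Quick Sort sketch in Java for Q64, using Lomuto partitioning (an illustrative choice, not implied by the question): when the pivot splits the range roughly in half, the recursion depth is about log n, giving the O(n log n) average.

public class QuickSortDemo {
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[hi], i = lo;
        for (int j = lo; j < hi; j++) {            // partition around the pivot
            if (a[j] < pivot) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;     // put the pivot in its final slot
        quickSort(a, lo, i - 1);
        quickSort(a, i + 1, hi);
    }
}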
• a) Merge Sort
• b) Quick Sort
• c) Insertion Sort ✔
• d) Selection Sort
• a) Tower of Hanoi
• b) N-Queens ✔
• c) Fibonacci
• d) Binary Search
• a) Queue
• b) Stack ✔
• c) Linked List
• d) Heap
• a) Base case
• b) Recursive call
• c) Iterative loop ✔
• d) Stack usage
• a) Greedy
• b) Brute-force with optimization ✔
• c) Divide and Conquer
• d) Dynamic Programming
• a) O(1)
• b) O(log n)
• c) O(n)
• d) O(2^n) ✔
• a) Tree Traversal ✔
• b) Searching
• c) Sorting
• d) Hashing
• a) Returns null
• b) Infinite recursion ✔
• c) Compilation error
• d) Stack is unused
• a) Heap
• b) Stack ✔
• c) Queue
• d) Array
• a) Iteration
• b) Recursion ✔
• c) Inheritance
• d) Encapsulation
• a) Looping
• b) Storing results of expensive function calls ✔
• c) Searching algorithm
• d) Sorting algorithm
• a) Recursion
• b) Tabulation ✔
• c) Memoization
• d) Brute force
• a) Greedy
• b) DP ✔
• c) BFS
• d) DFS
• a) Always O(1)
• b) Depends on problem ✔
• c) O(log n)
• d) O(1)
• a) N-Queens
• b) Matrix Chain Multiplication ✔
• c) Kruskal
• d) Prim’s
• a) Tabulation
• b) Memoization ✔
• c) Brute Force
• d) DFS
• a) Fractional Knapsack
• b) 0-1 Knapsack ✔
• c) Activity Selection
• d) Huffman Encoding
• a) Sorting
• b) Greedy ✔
• c) Backtracking
• d) DFS
• a) Dijkstra ✔
• b) Bellman-Ford
• c) Binary Search
• d) Merge Sort
• a) Dynamic Programming
• b) Greedy Approach ✔
• c) Divide & Conquer
• d) Brute Force
• a) O(1)
• b) O(n) ✔
• c) O(log n)
• d) O(n^2)
• a) O(log n)
• b) O(n)
• c) O(1) ✔
• d) O(n log n)
• a) O(log n)
• b) O(1) ✔
• c) O(n)
• d) O(n log n)
• a) Best case
• b) Average case
• c) Worst case ✔
• d) No case
• a) O(1)
• b) O(n)
• c) O(n log n)
• d) O(n!) ✔
• a) O(n) ✔
• b) O(log n)
• c) O(1)
• d) O(n^2)
97. Which is slower in growth rate: O(n) or O(n log n)?
• a) O(n)
• b) O(n log n) ✔
• c) O(n^2)
• d) O(2^n)
98. What is the time complexity of inserting into a hash table (average case)?
• a) O(log n)
• b) O(n)
• c) O(1) ✔
• d) O(n^2)
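A tiny Java example of the average O(1) insert from Q98, using the standard HashMap: the key is hashed straight to a bucket, so no scan of existing entries is needed in the typical case.

import java.util.HashMap;
import java.util.Map;

public class HashInsertDemo {
    public static void main(String[] args) {
        Map<String, Integer> table = new HashMap<>();
        table.put("apple", 3);   // hash the key, place the entry in its bucket: O(1) on average
        table.put("banana", 5);
        System.out.println(table.get("apple")); // 3
    }
}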
• a) O(1)
• b) O(log n)
• c) O(n) ✔
• d) O(n^2)
• a) O(1) ✔
• b) O(log n)
• c) O(n)
• d) O(n log n)