Greedy Algorithm in Data Structure [with Basic Concept]
Contents
- 1 Greedy Algorithm in Data Structure
- 2 Basic Concept of Greedy Algorithm
- 3 Characteristics of Greedy Algorithms
- 4 Steps in a Greedy Algorithm
- 5 Examples of Greedy Algorithms
- 6 Coin Change Problem
- 7 Fractional Knapsack Problem
- 8 Activity Selection Problem
- 9 Huffman Coding (Data Compression)
- 10 When to Use Greedy Algorithms?
- 11 Time Complexity of Greedy Algorithms
- 12 Conclusion
Greedy Algorithm in Data Structure
Basic Concept of Greedy Algorithm
A Greedy Algorithm is an approach that makes the best possible choice at each step, with the aim of building an optimal solution to the overall problem. It never looks back or revises previous choices. Instead, it follows a step-by-step process, always choosing the option that seems best at that moment.
Key Idea:
Always take the locally optimal choice, hoping it leads to a globally optimal solution.
Characteristics of Greedy Algorithms
Locally Optimal Choice – Picks the best option at each step.
No Backtracking – Does not reconsider previous decisions.
Fast & Efficient – Typically runs faster than exhaustive search or dynamic programming on the problems where it applies.
Not Always Optimal – Sometimes fails to find the best solution.
Steps in a Greedy Algorithm
Define the problem – Break it into a sequence of decision steps.
Choose the best option – Pick the most beneficial choice available at each step.
Repeat until complete – Continue selecting optimal choices until a solution is reached.
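These three steps can be captured in a short, generic template. The sketch below is illustrative only: the function name greedy_solve and the helper parameters score and is_feasible are hypothetical placeholders for whatever a concrete problem supplies.

```python
def greedy_solve(candidates, score, is_feasible):
    """Generic greedy template: repeatedly take the highest-scoring remaining
    candidate that keeps the partial solution feasible, and never undo a choice."""
    solution = []
    remaining = list(candidates)
    while remaining:
        best = max(remaining, key=score)        # step 2: locally best option
        remaining.remove(best)
        if is_feasible(solution + [best]):      # step 3: keep it only if still valid
            solution.append(best)
    return solution
```

Each example below is essentially this loop specialised with a different notion of "best" and "feasible."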
Examples of Greedy Algorithms
Coin Change Problem
Problem: Find the minimum number of coins to make a given amount.
Example: If the available coins are ₹1, ₹2, ₹5, and ₹10, and the target amount is ₹18, the greedy approach picks:
₹10 → ₹5 → ₹2 → ₹1 (Total: 4 coins)
Limitation: Doesn’t work for all denominations. For example, with coins {₹1, ₹3, ₹4} and target ₹6, greedy picks ₹4 → ₹1 → ₹1 (3 coins), while the optimum is ₹3 + ₹3 (2 coins).
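A minimal Python sketch of this greedy strategy follows; the function name greedy_coin_change and the convention of returning a list of coins (or None when the amount can't be formed) are illustrative choices, not part of any standard library.

```python
def greedy_coin_change(denominations, amount):
    """Greedy coin change: repeatedly take the largest coin that still fits."""
    coins = []
    for coin in sorted(denominations, reverse=True):  # largest denomination first
        while amount >= coin:
            coins.append(coin)
            amount -= coin
    return coins if amount == 0 else None             # None if amount can't be formed

print(greedy_coin_change([1, 2, 5, 10], 18))  # [10, 5, 2, 1] -> 4 coins (optimal)
print(greedy_coin_change([1, 3, 4], 6))       # [4, 1, 1]     -> 3 coins (optimum is [3, 3])
```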
Fractional Knapsack Problem
Problem: Given items with values & weights, maximize the total value within a weight limit.
Solution: Choose items with the highest value-to-weight ratio first.
Works well for fractional selection (items can be broken into parts).
Fails in the 0/1 Knapsack Problem (where items can’t be split).
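A possible Python sketch of the ratio-based greedy strategy, assuming each item is given as a (value, weight) pair; the function name and the example numbers are illustrative.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items by value-to-weight ratio,
    splitting the last item if it doesn't fit entirely."""
    items = sorted(items, key=lambda it: it[0] / it[1], reverse=True)  # best ratio first
    total_value = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)            # whole item, or the part that fits
        total_value += value * (take / weight)
        capacity -= take
    return total_value

# (value, weight) pairs, knapsack capacity 50
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```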
Activity Selection Problem
Problem: Select the maximum number of non-overlapping activities from a given set of activities with start and end times.
Solution:
Sort activities by end time.
Always choose the earliest finishing activity that doesn’t overlap with the last selected one.
Greedy approach gives an optimal solution.
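The two steps above translate almost directly into code. A short sketch, assuming each activity is a (start, end) pair:

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, then keep every activity
    that starts no earlier than the previously selected one ends."""
    selected = []
    last_end = float("-inf")
    for start, end in sorted(activities, key=lambda a: a[1]):  # earliest finish first
        if start >= last_end:                                  # no overlap
            selected.append((start, end))
            last_end = end
    return selected

activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (6, 10), (8, 11)]
print(select_activities(activities))  # [(1, 4), (5, 7), (8, 11)]
```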
Huffman Coding (Data Compression)
Problem: Reduce file size by using shorter codes for frequent characters.
Solution:
Create a binary tree where the most frequent characters have shorter codes.
Uses a priority queue to select the two lowest-frequency nodes at each step and merge them.
Used in ZIP file compression & text encoding.
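Below is a compact Python sketch that builds Huffman codes using the standard library's heapq as the priority queue. The function name huffman_codes and the exact heap-entry layout are illustrative; a real encoder would also emit the encoded bitstream and the tree needed for decoding.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Greedily merge the two lowest-frequency subtrees until one tree remains."""
    freq = Counter(text)
    # Heap entry: [frequency, unique tie-breaker, [symbol, code], [symbol, code], ...]
    heap = [[f, i, [ch, ""]] for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                            # edge case: only one distinct symbol
        return {heap[0][2][0]: "0"}
    counter = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)                  # two smallest frequencies
        hi = heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]               # left branch prepends 0
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]               # right branch prepends 1
        heapq.heappush(heap, [lo[0] + hi[0], counter] + lo[2:] + hi[2:])
        counter += 1
    return {ch: code for ch, code in heap[0][2:]}

print(huffman_codes("aaaabbbccd"))  # 'a' (most frequent) gets the shortest code
```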
When to Use Greedy Algorithms?
When the problem has the “Greedy Choice Property” → Making a local optimal choice leads to a global optimum.
When the problem has “Optimal Substructure” → The optimal solution contains optimal sub-solutions.
When efficiency is important → Greedy algorithms are usually faster than dynamic programming.
Time Complexity of Greedy Algorithms
Usually O(n log n) or O(n), depending on sorting & selection operations.
| Problem | Time Complexity |
|---|---|
| Coin Change | O(n) |
| Knapsack (Fractional) | O(n log n) |
| Activity Selection | O(n log n) |
| Huffman Coding | O(n log n) |
Conclusion
Greedy algorithms are simple, fast, and effective for problems where local choices lead to the best global result. However, they don’t always guarantee the optimal solution, so they must be used carefully.
Best used in problems like: Huffman coding, activity selection, fractional knapsack.
Fails in problems like: 0/1 knapsack, some coin change cases.