Core Algorithms
Core algorithms are fundamental techniques used in computer science and software development to solve a wide variety of problems efficiently. These algorithms form the building blocks of many applications, systems, and technologies. Below is an overview of some key core algorithms categorized by their functionality:
1. Sorting Algorithms
Sorting algorithms are used to reorder elements in a list or array into a particular sequence, typically in increasing or decreasing order.
Bubble Sort: Repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. Simple, but its O(n^2) time complexity makes it inefficient for large datasets.
Merge Sort: A divide-and-conquer algorithm that splits the list into halves, recursively sorts each half, and then merges the sorted halves. Has a time complexity of O(n log n); a sketch appears after this list.
Quick Sort: Another divide-and-conquer algorithm that selects a "pivot" and partitions the array into two sub-arrays around it. Highly efficient in practice, with an average time complexity of O(n log n), although the worst case is O(n^2).
Insertion Sort: Builds the sorted list one element at a time, inserting each element into its proper place. Efficient for small or nearly sorted inputs; O(n^2) in the worst case.
Heap Sort: Uses a binary heap to sort the elements. Time complexity is O(n log n).
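As an illustration, here is a minimal merge sort sketch in Python; the function and variable names are our own, chosen for readability rather than taken from the text above:

```python
def merge_sort(items):
    """Sort a list using merge sort (illustrative sketch)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # sort the left half
    right = merge_sort(items[mid:])   # sort the right half

    # Merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```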
2. Searching Algorithms
These algorithms are used to find an element or determine the presence of an element in a dataset.
Linear Search: Checks each element one by one until the desired element is found. Time complexity is O(n).
Binary Search: An efficient search algorithm that works on sorted arrays by repeatedly halving the search interval. Time complexity is O(log n); a sketch appears below.
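A minimal binary search sketch in Python, assuming the input list is already sorted in ascending order (the names are illustrative):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```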
3. Graph Algorithms
Graph algorithms deal with problems related to graphs, such as finding the shortest path, detecting cycles, or finding connected components.
Breadth-First Search (BFS): Explores all vertices of a graph level by level. Used for finding shortest paths in unweighted graphs; a sketch appears after this list.
Depth-First Search (DFS): Explores as far down a branch of the graph as possible before backtracking. Useful for pathfinding and cycle detection.
Dijkstra's Algorithm: Finds the shortest paths from a source node to all other nodes in a graph with non-negative edge weights. Time complexity is O(E log V) when implemented with a binary-heap priority queue.
Bellman-Ford Algorithm: Slower than Dijkstra's algorithm, but it handles negative edge weights and can detect negative cycles. Time complexity is O(V * E).
Kruskal's Algorithm: A greedy algorithm that finds the minimum spanning tree of a graph by adding edges in increasing order of weight.
Prim's Algorithm: Another greedy algorithm for finding the minimum spanning tree, starting from a source vertex and growing the tree one vertex at a time.
Floyd-Warshall Algorithm: Finds shortest paths between all pairs of vertices in a graph. Useful for dense graphs with time complexity O(V^3).
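To make the BFS idea concrete, here is a small Python sketch that computes shortest distances (counted in edges) from a source vertex in an unweighted graph. The adjacency-list dictionary representation is an assumption made for this sketch, not something prescribed above:

```python
from collections import deque

def bfs_shortest_paths(graph, source):
    """Distance (number of edges) from source to every reachable vertex.

    graph: dict mapping each vertex to an iterable of its neighbours
    (an assumed adjacency-list representation)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:            # first visit gives the shortest distance
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs_shortest_paths(graph, "A"))  # {'A': 0, 'B': 1, 'C': 1, 'D': 2}
```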
4. Dynamic Programming (DP) Algorithms
Dynamic programming algorithms solve complex problems by breaking them into simpler subproblems and solving each one just once, storing the results for future use.
Fibonacci Sequence: A classic example solved using dynamic programming to avoid recalculating the same subproblems.
Knapsack Problem: Determines the optimal subset of items to include in a knapsack to maximize value while respecting a weight limit.
Longest Common Subsequence (LCS): Finds the longest subsequence common to two sequences; a sketch appears after this list.
Matrix Chain Multiplication: Determines the most efficient way to multiply a chain of matrices.
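As a concrete example of dynamic programming, here is a minimal bottom-up LCS sketch in Python; the table name dp and the function name are our own:

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of sequences a and b."""
    # dp[i][j] = LCS length of the prefixes a[:i] and b[:j]
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1          # extend a common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop one character
    return dp[len(a)][len(b)]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4
```

Each subproblem (a pair of prefixes) is solved once and stored in the table, which is exactly the reuse of results described above.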
5. Greedy Algorithms
Greedy algorithms build up a solution piece by piece, always choosing the next piece that offers the most immediate benefit.
Huffman Coding: A compression algorithm that uses a greedy approach to assign variable-length codes to characters based on their frequencies.
Activity Selection Problem: Finds the maximum number of non-overlapping activities that a single person or machine can perform, chosen greedily by earliest finish time; see the sketch after this list.
Fractional Knapsack Problem: Allows fractional parts of items to be included in the knapsack, solved efficiently using a greedy strategy.
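A minimal sketch of the activity selection greedy strategy in Python, assuming each activity is given as a (start, finish) pair:

```python
def max_activities(activities):
    """Greedy activity selection: repeatedly take the compatible activity
    that finishes earliest. activities: list of (start, finish) pairs."""
    selected = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:          # compatible with everything chosen so far
            selected.append((start, finish))
            last_finish = finish
    return selected

print(max_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]))
# [(1, 4), (5, 7), (8, 11)]
```

The greedy choice (earliest finish time) is made once per step and never revisited, which is what distinguishes this family from dynamic programming.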
6. Divide and Conquer Algorithms
Divide and conquer algorithms break down a problem into smaller subproblems, solve each subproblem, and then combine the solutions.
Merge Sort: A classic example of a divide and conquer algorithm.
Quick Sort: Another divide and conquer algorithm, known for its efficiency in sorting large datasets; a short sketch appears after this list.
Binary Search: Efficiently searches sorted data by dividing the problem space in half at each step.
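For illustration, here is a short quick sort sketch in Python. It uses the middle element as the pivot and builds new lists for the partitions, a simplification (not an in-place implementation) that trades memory for clarity:

```python
def quick_sort(items):
    """Quick sort sketch: partition around a pivot, then recurse on the parts."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)

print(quick_sort([3, 6, 1, 8, 2, 9, 4]))  # [1, 2, 3, 4, 6, 8, 9]
```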
7. Backtracking Algorithms
Backtracking algorithms try to build a solution incrementally, abandoning a path as soon as it is determined that this path cannot possibly lead to a valid solution.
N-Queens Problem: Places N queens on an N×N chessboard such that no two queens threaten each other; see the sketch after this list.
Sudoku Solver: Solves the Sudoku puzzle using backtracking to find the correct placement of numbers.
Graph Coloring: Assigns colors to vertices of a graph such that no two adjacent vertices share the same color.
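A minimal backtracking sketch for the N-Queens problem in Python; the helper names (safe, place) are our own:

```python
def solve_n_queens(n):
    """Return one placement as a list of column indices, one per row, or None."""
    cols = [-1] * n                      # cols[row] = column of the queen in that row

    def safe(row, col):
        # Check the partial placement in rows 0..row-1 for conflicts.
        for r in range(row):
            c = cols[r]
            if c == col or abs(c - col) == abs(r - row):   # same column or diagonal
                return False
        return True

    def place(row):
        if row == n:                     # every row has a queen: solution found
            return True
        for col in range(n):
            if safe(row, col):
                cols[row] = col
                if place(row + 1):
                    return True
                cols[row] = -1           # backtrack: undo the choice, try the next column
        return False

    return cols if place(0) else None

print(solve_n_queens(8))  # e.g. [0, 4, 7, 5, 2, 6, 1, 3]
```

The key backtracking step is the undo after a failed recursive call: any partial placement that cannot be extended is abandoned, and the search resumes from the previous decision point.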