[Feature]: <Feature Name> Request to add java code algorithm in Graph section #1277 : fix #1997

Merged 5 commits on Nov 10, 2024.
## docs/graphs/Page-Rank-Algorithm.md (123 additions, 3 deletions)

The **PageRank algorithm** is a **link analysis algorithm** developed by Larry Page.

### Characteristics:
### Characteristics:

- **Graph-Based Algorithm**:
- The PageRank algorithm treats the web as a directed graph where nodes represent web pages and directed edges represent hyperlinks between them.

- **Random Surfer Model**:

- The algorithm is based on the concept of a random surfer who randomly clicks on links. The likelihood of landing on a page is influenced by the number and quality of inbound links to that page.

- **Damping Factor**:

- A damping factor (usually set to around 0.85) is used to model the probability that a user continues clicking on links, with a chance of jumping to a random page. This prevents the score from being skewed by a small number of highly connected pages.

- **Iterative Calculation**:

- The PageRank scores are calculated iteratively, updating the scores based on the ranks of inbound pages until convergence is achieved.
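The iterative update described above can be sketched compactly. The following is a minimal Python illustration (not the implementation from this page): the hypothetical `page_rank` helper applies the standard update rank(p) = (1 - d) + d * Σ rank(q)/outdegree(q) over inbound links, and the three-node graph, damping factor 0.85, and three iterations mirror the Java example later in this document.

```python
def page_rank(inbound, outdegree, iterations=3, d=0.85):
    """Minimal PageRank sketch: `inbound` maps node -> list of nodes linking to it,
    `outdegree` maps node -> number of outgoing links."""
    ranks = {p: 1.0 for p in inbound}  # every node starts with rank 1
    for _ in range(iterations):
        # Build the new ranks from the previous iteration's ranks
        ranks = {
            p: (1 - d) + d * sum(ranks[q] / outdegree[q]
                                 for q in inbound[p] if outdegree[q])
            for p in inbound
        }
    return ranks

# Same graph as the Java example below: 0 -> 1, 0 -> 2, 1 -> 2, 2 -> 0
inbound = {0: [2], 1: [0], 2: [0, 1]}
outdegree = {0: 2, 1: 1, 2: 1}
print(page_rank(inbound, outdegree))  # after 3 iterations: {0: 1.0541875, 1: 0.72853125, 2: 1.21728125}
```

Note that the new ranks are computed from the previous iteration's values (a simultaneous update), which is what makes the result reproducible regardless of node order.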

### Time Complexity:

- **Time Complexity: O(N \* I)**
  The time complexity of the PageRank algorithm is O(N \* I), where N is the number of nodes (pages) and I is the number of iterations until convergence.

### Space Complexity:

if __name__ == "__main__":
main()
```

### Java Implementation:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

class Node {
int name; // Unique identifier for the node
ArrayList<Integer> inbound_nodes = new ArrayList<>(); // Links coming to this node
ArrayList<Integer> outbound_nodes = new ArrayList<>(); // Links going from this node

public Node(int name) {
this.name = name; // Node name
}

public void add_inbound(int node) {
inbound_nodes.add(node);
}

public void add_outbound(int node) {
outbound_nodes.add(node);
}

@Override
public String toString() {
return "Node{name=" + name + ", inbound_nodes=" + inbound_nodes + ", outbound_nodes=" + outbound_nodes + "}";
}
}

public class Main {

// PageRank calculation function
public static void pageRank(ArrayList<Node> nodes, int iterations, double dampingFactor) {
// Initialize ranks: all nodes start with rank 1
Map<Integer, Double> ranks = new HashMap<>();
for (Node node : nodes) {
ranks.put(node.name, 1.0);
}

// Count of outbound links for each node
Map<Integer, Integer> outboundCount = new HashMap<>();
for (Node node : nodes) {
outboundCount.put(node.name, node.outbound_nodes.size());
}

for (int i = 0; i < iterations; i++) {
System.out.println("======= Iteration " + (i + 1) + " =======");
Map<Integer, Double> newRanks = new HashMap<>();

for (Node node : nodes) {
double rankSum = 0.0;

for (int inboundNodeName : node.inbound_nodes) {
if (outboundCount.get(inboundNodeName) != 0) {
rankSum += ranks.get(inboundNodeName) / outboundCount.get(inboundNodeName);
}
}

double newRank = (1 - dampingFactor) + dampingFactor * rankSum;
newRanks.put(node.name, newRank);
}

ranks = newRanks;
System.out.println("Current Ranks: " + ranks);
}
}

public static void main(String[] args) {
ArrayList<ArrayList<Integer>> adjacencyMatrix = new ArrayList<>();
adjacencyMatrix.add(new ArrayList<>(Arrays.asList(0, 1, 1)));
adjacencyMatrix.add(new ArrayList<>(Arrays.asList(0, 0, 1)));
adjacencyMatrix.add(new ArrayList<>(Arrays.asList(1, 0, 0)));

// Initialize nodes
ArrayList<Node> nodes = new ArrayList<>();
for (int i = 0; i < adjacencyMatrix.size(); i++) {
nodes.add(new Node(i));
}

// Populate inbound and outbound links based on the adjacency matrix
for (int rowIndex = 0; rowIndex < adjacencyMatrix.size(); rowIndex++) {
for (int colIndex = 0; colIndex < adjacencyMatrix.get(rowIndex).size(); colIndex++) {
if (adjacencyMatrix.get(rowIndex).get(colIndex) == 1) {
nodes.get(rowIndex).add_outbound(colIndex);
nodes.get(colIndex).add_inbound(rowIndex);
}
}
}

System.out.println("======= Nodes =======");
for (Node node : nodes) {
System.out.println(node);
}

// Calculate and display PageRank values
pageRank(nodes, 3, 0.85);
}
}
```

#### Output:

```text
======= Nodes =======
Node{name=0, inbound_nodes=[2], outbound_nodes=[1, 2]}
Node{name=1, inbound_nodes=[0], outbound_nodes=[2]}
Node{name=2, inbound_nodes=[0, 1], outbound_nodes=[0]}
======= Iteration 1 =======
Current Ranks: {0=1.0, 1=0.575, 2=1.4249999999999998}
======= Iteration 2 =======
Current Ranks: {0=1.3612499999999996, 1=0.575, 2=1.06375}
======= Iteration 3 =======
Current Ranks: {0=1.0541874999999998, 1=0.7285312499999999, 2=1.2172812499999996}
```

### Summary:

The PageRank algorithm is an essential tool for determining the importance of web pages based on their link structure. By modeling the browsing behavior of users and using a damping factor, the algorithm effectively ranks pages while minimizing the impact of less relevant pages. It has numerous applications in search engines and network analysis, making it a foundational algorithm in the field of data science and information retrieval.
## docs/graphs/flood-fill.md (164 additions, 2 deletions)

sidebar_label: Flood Fill Algorithm
description: "In this blog post, we'll explore the Flood Fill Algorithm, a popular technique used in computer graphics for determining connected regions, such as filling areas in images and solving puzzles like the paint bucket tool in graphics editing software."
tags: [dsa, algorithms, graphics, connected components]
---

# Flood Fill Algorithm

## Introduction
The **Flood Fill Algorithm** is a method used in computer graphics to determine and fill connected regions that share the same color or value.

## How Flood Fill Works

Flood fill operates by exploring adjacent pixels (or nodes) that share the same color or value as the starting pixel. The process can be performed using either a **recursive** or **iterative** approach.
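That process can be sketched in a few lines. The following is a minimal queue-based Python illustration (full C++ and Java implementations appear later in this document); the `flood_fill` helper and the small test grid are assumptions for the example only.

```python
from collections import deque

def flood_fill(image, x, y, new_color):
    """Recolor the 4-directionally connected region containing (x, y)."""
    target = image[x][y]
    if target == new_color:
        return image  # nothing to do; also prevents an infinite loop
    rows, cols = len(image), len(image[0])
    queue = deque([(x, y)])
    image[x][y] = new_color  # mark on enqueue so no pixel is queued twice
    while queue:
        cx, cy = queue.popleft()
        for dx, dy in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # up, down, left, right
            nx, ny = cx + dx, cy + dy
            if 0 <= nx < rows and 0 <= ny < cols and image[nx][ny] == target:
                image[nx][ny] = new_color
                queue.append((nx, ny))
    return image

grid = [[1, 1, 0],
        [1, 0, 1],
        [0, 1, 1]]
print(flood_fill(grid, 0, 0, 2))  # [[2, 2, 0], [2, 0, 1], [0, 1, 1]]
```

Marking a pixel as filled at enqueue time (rather than at dequeue time) keeps each pixel in the queue at most once, which matters for memory on large regions.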

### Steps of the Algorithm:

int main() {
return 0;
}
```

### 2. Iterative Flood Fill

The iterative version uses a queue (or stack) to keep track of the pixels that need to be processed. This approach is more memory-efficient for larger areas as it avoids deep recursion.

### Iterative Implementation in C++:

```cpp

void floodFillIterative(int x, int y, int targetColor, int newColor, vector<vector<int>>& image) {
    // ... (queue-based traversal: pop a pixel, recolor it, push matching neighbors)
}
```

### Java Implementation

```java
import java.util.*;

class Pair{
int X;
int Y;

Pair(int X, int Y)
{
this.X = X;
this.Y = Y;
}
}

public class Main {

// dfs
public static void floodFillRecursive(int x, int y, int targetColor, int newColor, int[][] image)
{
// Out of bounds check
if (x < 0 || x >= image.length || y < 0 || y >= image[0].length) return;
// Color doesn't match
if (image[x][y] != targetColor) return;

image[x][y] = newColor; // Fill the color

// Recursively fill the neighboring pixels
// row-1,col (up) | row,col+1 (right) | row+1,col (down) | row,col-1(left)
int[] r = {-1, 0, 1, 0};
int[] c = { 0, 1, 0, -1};

for(int i=0; i<4; i++)
floodFillRecursive(x+r[i], y+c[i], targetColor, newColor, image);
}

//bfs
public static void floodFillIterative(int x, int y, int targetColor, int newColor, int[][] image)
{
if(image[x][y] != targetColor) return;

int rows = image.length;
int cols = image[0].length;

Queue<Pair> q = new LinkedList<>();
q.offer(new Pair(x,y));

while(!q.isEmpty())
{
Pair p = q.poll();
int curX = p.X;
int curY = p.Y;

image[curX][curY] = newColor; // Fill the color

// Explore neighbors
// row-1,col (up) | row,col+1 (right) | row+1,col (down) | row,col-1(left)
int[] r = {-1, 0, 1, 0};
int[] c = { 0, 1, 0, -1};

for(int i=0; i<4; i++)
{
int adjX = curX+r[i], adjY = curY+c[i];
if(adjX >= 0 && adjX < rows && adjY >= 0 && adjY < cols && image[adjX][adjY] == targetColor)
q.offer(new Pair(adjX, adjY));
}
}

}

// Function to print the image
public static void printImage(int[][] image)
{
for(int[] img : image)
{
for(int ele : img)
System.out.print(ele + " ");
System.out.println();
}

}

public static void main(String args[]){

int[][] image = {{1, 1, 1, 1, 0},
{1, 1, 0, 1, 1},
{1, 1, 1, 1, 0},
{0, 0, 0, 0, 1},
{1, 1, 1, 1, 1}
};

int x = 1, y = 1; // Starting point
int newColor = 2; // New color to fill
int targetColor = image[x][y]; // Color to replace

System.out.println("Original Image:");
printImage(image);

// Using Recursive Flood Fill
System.out.println("\n--- Via Depth First Search ---");
floodFillRecursive(x, y, targetColor, newColor, image);
System.out.println("\nImage after Recursive Flood Fill:");
printImage(image);

image = new int[][]{
{1, 1, 1, 1, 0},
{1, 1, 0, 1, 1},
{1, 1, 1, 1, 0},
{0, 0, 0, 0, 1},
{1, 1, 1, 1, 1}
};

// Using Iterative Flood Fill
System.out.println("\n--- Via Breadth First Search ---");
floodFillIterative(x, y, targetColor, newColor, image);
System.out.println("\nImage after Iterative Flood Fill:");
printImage(image);

}
}
```

#### Output

```text
Original Image:
1 1 1 1 0
1 1 0 1 1
1 1 1 1 0
0 0 0 0 1
1 1 1 1 1

--- Via Depth First Search ---

Image after Recursive Flood Fill:
2 2 2 2 0
2 2 0 2 2
2 2 2 2 0
0 0 0 0 1
1 1 1 1 1

--- Via Breadth First Search ---

Image after Iterative Flood Fill:
2 2 2 2 0
2 2 0 2 2
2 2 2 2 0
0 0 0 0 1
1 1 1 1 1
```

### Applications of Flood Fill

- **Image Editing**: Filling contiguous areas with color, as seen in paint applications.
- **Game Development**: Used for detecting and filling areas of influence or terrain.
- **Pathfinding**: Determining reachable areas in grid-based maps.
- **Geographical Mapping**: Filling regions in geographical information systems (GIS).
### Complexity Analysis

### Time Complexity:

- **O(N)**, where N is the number of pixels in the area to be filled. Each pixel is visited once.

### Space Complexity:

- **Recursive: O(H)**, where H is the maximum height of the recursion stack.
- **Iterative: O(N)** in the worst case, if all pixels are connected.

### Conclusion

The Flood Fill Algorithm is a fundamental technique in computer graphics and various other fields. Understanding its implementation and applications allows developers to efficiently solve problems related to connected regions, making it a valuable tool in both graphics programming and algorithm design.