
Graph Neural Networks

Neural networks for graph-structured data.

Dr. Patricia Moore
December 18, 2025

AI for networks and relationships.

What are Graph Neural Networks?

Neural networks that work on graph data.

- **Graphs**: networks of nodes and edges
- **Examples**: social networks, molecules, knowledge graphs

Why GNNs?

Traditional neural networks expect fixed-size inputs, such as images or fixed-length vectors.

**Graphs are different**:
- Variable number of nodes
- Complex relationships
- Irregular structure
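This is the key property a GNN layer exploits: its weights act on per-node features, not on the whole graph, so the same layer handles graphs of any size. A minimal NumPy sketch (the weight matrix `W` here is a made-up illustration, not a trained model):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 2))  # shared weights: 4 input features -> 2 outputs

def gnn_layer(node_features, adj):
    """One aggregation step: sum neighbor features, then apply shared weights."""
    return np.maximum(0, adj @ node_features @ W)  # ReLU

# The same layer handles a 3-node graph...
A3 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X3 = rng.standard_normal((3, 4))
print(gnn_layer(X3, A3).shape)  # (3, 2)

# ...and a 5-node graph, with no change to W
A5 = np.eye(5, k=1) + np.eye(5, k=-1)
X5 = rng.standard_normal((5, 4))
print(gnn_layer(X5, A5).shape)  # (5, 2)
```

A dense network, by contrast, would need a different input layer for each graph size.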

Graph Structure

```python
import networkx as nx

# Create graph
G = nx.Graph()

# Add nodes (people)
G.add_nodes_from(['Alice', 'Bob', 'Charlie', 'David'])

# Add edges (friendships)
G.add_edges_from([
    ('Alice', 'Bob'),
    ('Bob', 'Charlie'),
    ('Charlie', 'David'),
])

# Visualize
nx.draw(G, with_labels=True)
```
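The message-passing code in the next section works on an adjacency matrix rather than a NetworkX object; a graph like the one above converts directly (a small sketch, assuming the standard `nx.to_numpy_array` helper, whose rows and columns follow node insertion order):

```python
import networkx as nx
import numpy as np

G = nx.Graph()
G.add_edges_from([('Alice', 'Bob'), ('Bob', 'Charlie'), ('Charlie', 'David')])

# Adjacency matrix: A[i][j] == 1 if nodes i and j are connected
A = nx.to_numpy_array(G)
print(A.shape)  # (4, 4), symmetric because friendships are undirected
```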

Message Passing

**Core idea**: Each node aggregates information from neighbors

**Steps**:
1. Each node has features
2. Nodes send messages to neighbors
3. Each node aggregates received messages
4. Update node features

```python
import numpy as np

# Simplified message passing (W and b are a layer's learned parameters)
def message_passing(node_features, adjacency_matrix, W, b):
    num_nodes = len(node_features)
    new_features = []
    # For each node
    for i in range(num_nodes):
        # Aggregate neighbor features
        neighbor_sum = np.zeros_like(node_features[0])
        for j in range(num_nodes):
            if adjacency_matrix[i][j] == 1:  # j is a neighbor of i
                neighbor_sum += node_features[j]
        # Update node feature: linear transform + nonlinearity (ReLU)
        new_features.append(np.maximum(0, W @ neighbor_sum + b))
    return np.array(new_features)
```

Graph Convolutional Network (GCN)

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNLayer(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(in_features, out_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, x, adj):
        # x: node features [num_nodes, in_features]
        # adj: adjacency matrix [num_nodes, num_nodes]
        support = torch.mm(x, self.weight)           # transform features
        output = torch.mm(adj, support) + self.bias  # aggregate neighbors
        return output

class GCN(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        self.gc1 = GCNLayer(input_dim, hidden_dim)
        self.gc2 = GCNLayer(hidden_dim, output_dim)

    def forward(self, x, adj):
        x = F.relu(self.gc1(x, adj))
        x = F.dropout(x, training=self.training)
        x = self.gc2(x, adj)
        return F.log_softmax(x, dim=1)

# Use GCN
model = GCN(input_dim=feature_dim, hidden_dim=16, output_dim=num_classes)
output = model(node_features, adjacency_matrix)
```
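In practice the adjacency matrix passed to a GCN is first symmetrically normalized, Â = D^(-1/2)(A + I)D^(-1/2) (self-loops added, then scaled by node degree), so high-degree nodes don't dominate the aggregation. A NumPy sketch of that preprocessing step:

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization: D^(-1/2) (A + I) D^(-1/2)."""
    A_hat = A + np.eye(len(A))      # add self-loops
    d = A_hat.sum(axis=1)           # node degrees (including self-loop)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

A = np.array([[0.0, 1.0], [1.0, 0.0]])  # two connected nodes
print(normalize_adj(A))                 # [[0.5, 0.5], [0.5, 0.5]]
```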

Using PyTorch Geometric

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv
from torch_geometric.data import Data

# Create graph data
edge_index = torch.tensor([
    [0, 1, 1, 2],  # Source nodes
    [1, 0, 2, 1],  # Target nodes
], dtype=torch.long)

x = torch.tensor([
    [-1], [0], [1]  # Node features (one feature per node)
], dtype=torch.float)

data = Data(x=x, edge_index=edge_index)

# Build GNN
class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(1, 16)
        self.conv2 = GCNConv(16, 2)

    def forward(self, data):
        x, edge_index = data.x, data.edge_index
        x = self.conv1(x, edge_index)
        x = F.relu(x)
        x = F.dropout(x, training=self.training)
        x = self.conv2(x, edge_index)
        return F.log_softmax(x, dim=1)

model = Net()
```

Graph Attention Networks (GAT)

Instead of weighting all neighbors equally, GAT learns how important each neighbor is via attention:

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv

class GAT(torch.nn.Module):
    def __init__(self, in_channels, out_channels):
        super().__init__()
        # 8 attention heads, each producing 8 features (concatenated: 64)
        self.conv1 = GATConv(in_channels, 8, heads=8, dropout=0.6)
        self.conv2 = GATConv(8 * 8, out_channels, heads=1, concat=False, dropout=0.6)

    def forward(self, x, edge_index):
        x = F.dropout(x, p=0.6, training=self.training)
        x = F.elu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.6, training=self.training)
        x = self.conv2(x, edge_index)
        return F.log_softmax(x, dim=1)
```
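What an attention head learns can be sketched in plain NumPy: each neighbor gets a score from the two endpoint features, and the scores are softmax-normalized so they form a distribution. The parameters `W` and `a` below are made-up illustrations (random, not trained), and this is a simplification of what GATConv actually does:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def attention_weights(h, neighbors, W, a):
    """Attention coefficients of node 0 over its neighbors (GAT-style)."""
    Wh = h @ W  # transform all node features
    # Score for each neighbor j: LeakyReLU(a^T [Wh_0 || Wh_j])
    scores = np.array([
        leaky_relu(a @ np.concatenate([Wh[0], Wh[j]]))
        for j in neighbors
    ])
    e = np.exp(scores - scores.max())  # softmax over the neighborhood
    return e / e.sum()

rng = np.random.default_rng(0)
h = rng.standard_normal((4, 3))  # 4 nodes, 3 features each
W = rng.standard_normal((3, 2))
a = rng.standard_normal(4)       # attention vector over concatenated pairs
alpha = attention_weights(h, [1, 2, 3], W, a)
print(alpha)  # three positive weights that sum to 1
```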

Node Classification

Predict properties of nodes:

```python
# Example: classify users in a social network

# Forward pass
output = model(data)

# Calculate loss (only on training nodes)
loss = F.nll_loss(output[data.train_mask], data.y[data.train_mask])

# Backprop
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Evaluate
pred = output.argmax(dim=1)
correct = (pred[data.test_mask] == data.y[data.test_mask]).sum()
accuracy = int(correct) / int(data.test_mask.sum())
```

Graph Classification

Classify entire graphs:

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool

class GraphClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(num_features, 64)
        self.conv2 = GCNConv(64, 64)
        self.fc = torch.nn.Linear(64, num_classes)

    def forward(self, data):
        x, edge_index, batch = data.x, data.edge_index, data.batch
        # Node-level operations
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        # Graph-level pooling
        x = global_mean_pool(x, batch)
        # Classification
        x = self.fc(x)
        return F.log_softmax(x, dim=1)

# Use for molecule classification, etc.
```
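The pooling step is the key difference from node classification: `global_mean_pool` averages node features per graph, using a batch vector that maps each node to its graph. A NumPy sketch of the same operation (toy data, not the PyTorch Geometric implementation):

```python
import numpy as np

def global_mean_pool_np(x, batch):
    """Average node features per graph; batch[i] = graph index of node i."""
    num_graphs = batch.max() + 1
    return np.stack([x[batch == g].mean(axis=0) for g in range(num_graphs)])

# 5 nodes across 2 graphs: nodes 0-2 belong to graph 0, nodes 3-4 to graph 1
x = np.array([[1.0], [2.0], [3.0], [10.0], [20.0]])
batch = np.array([0, 0, 0, 1, 1])
print(global_mean_pool_np(x, batch))  # [[2.], [15.]] -- one row per graph
```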

Applications

- **Social networks**: friend recommendations, influence prediction
- **Molecules**: drug discovery, property prediction
- **Knowledge graphs**: question answering, reasoning
- **Traffic**: traffic prediction, route optimization
- **Recommendation**: user-item graphs

Real Example - Molecular Property Prediction

```python
import torch.nn.functional as F
from torch_geometric.datasets import MoleculeNet
from torch_geometric.loader import DataLoader

# Load molecule dataset
dataset = MoleculeNet(root='data', name='ESOL')
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Each molecule is a graph: nodes = atoms, edges = bonds

# Train a GNN to predict solubility -- a graph-level regression,
# so the model needs pooling (like GraphClassifier above, with one output)
for data in loader:
    optimizer.zero_grad()
    output = model(data)                # one prediction per molecule
    loss = F.mse_loss(output, data.y)
    loss.backward()
    optimizer.step()
```

Remember

- GNNs work on graph-structured data
- Message passing is the core operation
- PyTorch Geometric makes implementation easy
- Many applications beyond images and text

#AI #Advanced #GNN