Project: AND Logic Gate using scikit-learn

In this demonstration, we train a Perceptron using scikit-learn to learn the AND logic gate. Unlike the manual implementation, scikit-learn automatically handles weight initialization, iterative training, and convergence.

Step 1: Define the Training Dataset

import numpy as np
from sklearn.linear_model import Perceptron

# Inputs (x1, x2) and targets (AND truth table)
X = np.array([
  [0, 0],
  [0, 1],
  [1, 0],
  [1, 1]
])
y = np.array([0, 0, 0, 1])  # AND gate outputs

Explanation:

  • Each row in X represents a possible combination of inputs.
  • y contains the target outputs corresponding to the AND gate.
  • This format satisfies scikit-learn’s (n_samples, n_features) requirement.
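
A quick shape check confirms the dataset already has the (n_samples, n_features) layout that scikit-learn expects:

```python
import numpy as np

# Same AND-gate dataset as above
X = np.array([
  [0, 0],
  [0, 1],
  [1, 0],
  [1, 1]
])
y = np.array([0, 0, 0, 1])

print(X.shape)  # (4, 2): 4 samples, 2 features
print(y.shape)  # (4,): one target per sample
```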

Step 2: Initialize and Train the Perceptron

# Initialize Perceptron
perceptron = Perceptron(
  max_iter=1000,    # Maximum training iterations
  eta0=0.1,         # Learning rate
  random_state=42)  # For reproducibility

# Train the Perceptron
perceptron.fit(X, y)

Explanation:

  • scikit-learn handles forward pass, error calculation, and weight updates internally.
  • Training stops when the loss stops improving (the default tol criterion) or after max_iter passes over the data.
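
Under the hood, scikit-learn applies the classic perceptron learning rule: predict with a step activation, and on every mistake nudge the weights toward the correct answer. A minimal hand-rolled sketch of that rule (zero-initialized weights and the same 0.1 learning rate; a simplification, not scikit-learn's exact internals):

```python
import numpy as np

# AND-gate training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights, initialized to zero
b = 0.0           # bias
eta = 0.1         # learning rate

for epoch in range(100):
    errors = 0
    for xi, target in zip(X, y):
        y_hat = int(np.dot(w, xi) + b > 0)  # step activation
        update = eta * (target - y_hat)     # 0 when the prediction is right
        w += update * xi                    # move weights toward the target
        b += update
        errors += int(update != 0)
    if errors == 0:  # converged: a full epoch with no mistakes
        break

print("weights:", w, "bias:", b)
```

Because the AND gate is linearly separable, this loop is guaranteed to reach an error-free epoch (perceptron convergence theorem) after only a handful of passes.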

Step 3: Inspect the Trained Model

print("Perceptron coefficients (weights):", perceptron.coef_)
print("Perceptron intercept (bias):", perceptron.intercept_)

Explanation:

  • coef_ holds the learned weights (one per input feature), which determine how strongly each input pushes the output toward 1.
  • intercept_ is the bias that adjusts the decision threshold.
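
These attributes can be checked by hand: the model's raw score for an input is w1*x1 + w2*x2 + b, and predict() outputs 1 exactly where that score is positive. A small sketch reproducing the trained model above:

```python
import numpy as np
from sklearn.linear_model import Perceptron

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

perceptron = Perceptron(max_iter=1000, eta0=0.1, random_state=42)
perceptron.fit(X, y)

# Raw score per sample: w1*x1 + w2*x2 + b
scores = X @ perceptron.coef_[0] + perceptron.intercept_[0]
manual_preds = (scores > 0).astype(int)  # threshold at zero

print("raw scores:", scores)
print("manual predictions:", manual_preds)
```

The manual thresholding agrees with perceptron.predict(X), which confirms that the stored coef_ and intercept_ fully describe the model.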

Step 4: Test the Trained Perceptron

for i in range(len(X)):
  y_pred = perceptron.predict(X[i].reshape(1, -1))[0]
  print(f"Input: {X[i]} → Output: {y_pred}")

Explanation:

  • predict() handles forward pass and thresholding internally.
  • Output is 1 only when both inputs are 1.
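
The loop above predicts one sample at a time; predict() also accepts the whole input matrix at once, and score() reports mean accuracy, which is a convenient one-line check that the gate was learned. A short sketch:

```python
import numpy as np
from sklearn.linear_model import Perceptron

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

perceptron = Perceptron(max_iter=1000, eta0=0.1, random_state=42).fit(X, y)

# Batch prediction: no Python loop needed
y_pred = perceptron.predict(X)
print("Predictions:", y_pred)

# Mean accuracy over the four truth-table rows
print("Accuracy:", perceptron.score(X, y))
```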

Step 5: Visualize the Decision Boundary

import matplotlib.pyplot as plt

# Plot points
plt.scatter(X[:, 0], X[:, 1], c=y, cmap=plt.cm.coolwarm, edgecolors='k')

# Decision boundary: w1*x1 + w2*x2 + b = 0
x_vals = np.array([X[:, 0].min() - 0.5, X[:, 0].max() + 0.5])
y_vals = -(perceptron.coef_[0][0] * x_vals + perceptron.intercept_[0]) / perceptron.coef_[0][1]

plt.plot(x_vals, y_vals, 'k--')
plt.xlabel("Input 1")
plt.ylabel("Input 2")
plt.title("Perceptron Decision Boundary (AND Gate)")
plt.show()

Explanation:

  • The line separates output 0 from 1.
  • All points are correctly classified on the appropriate side.

Understanding the Process

  • Data Setup: Each row is an AND truth table input.
  • Initialization & Training: scikit-learn initializes weights and updates them automatically.
  • Output: The model outputs 1 only for (1,1).
  • Visualization: Confirms the linear decision boundary separates the classes.

Outcome

The scikit-learn Perceptron converges quickly and correctly models the AND gate. This demonstrates that the AND gate is linearly separable and can be solved with a single-layer perceptron.
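
As a contrast, the same single-layer model cannot reach perfect accuracy on a gate that is not linearly separable, such as XOR, since no single straight line can split its outputs. A quick illustrative check (using the same hyperparameters as above):

```python
import numpy as np
from sklearn.linear_model import Perceptron

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

y_and = np.array([0, 0, 0, 1])  # linearly separable
y_xor = np.array([0, 1, 1, 0])  # NOT linearly separable

and_acc = Perceptron(max_iter=1000, eta0=0.1, random_state=42).fit(X, y_and).score(X, y_and)
xor_acc = Perceptron(max_iter=1000, eta0=0.1, random_state=42).fit(X, y_xor).score(X, y_xor)

print("AND accuracy:", and_acc)  # reaches 1.0
print("XOR accuracy:", xor_acc)  # necessarily below 1.0
```

Solving XOR requires adding a hidden layer, i.e., a multi-layer perceptron.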