
EX NO: UTILIZE TRAINING ALGORITHM FOR PATTERN ASSOCIATION IN IMAGES OR SIGNALS

DATE:

AIM:
To write a Python program for pattern association in images or signals.

ALGORITHM:
Step 1: Start the program.

Step 2: Import the necessary libraries.

Step 3: Define the function.

Step 4: Calculate the distance.

Step 5: Sort and select the nearest neighbours.

Step 6: Train the data.

Step 7: Test the prediction.

Step 8: Display the predicted label.
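
Steps 4 and 5 reduce to computing the Euclidean distance from the query to every stored pattern and keeping the k smallest; a minimal NumPy sketch of just those two steps (the sample vectors below are illustrative, chosen to match the fruit data in the program):

import numpy as np

# Illustrative query and stored feature vectors (labels omitted here).
query = np.array([1.0, 1.0])
patterns = np.array([[1.0, 1.0], [1.0, 0.5], [0.5, 1.0], [0.5, 0.5]])

# Step 4: Euclidean distance from the query to each stored pattern.
distances = np.linalg.norm(patterns - query, axis=1)

# Step 5: indices of the k nearest neighbours (smallest distances first).
k = 3
nearest = np.argsort(distances)[:k]
print(distances, nearest)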

PROGRAM:
import numpy as np

from collections import Counter

def predict(fruit, training_data, k=3):

    # Sort stored samples by Euclidean distance to the query and keep the k nearest.
    neighbours = sorted(training_data, key=lambda data: np.linalg.norm(np.array(fruit) - np.array(data[:-1])))[:k]

    # Majority vote over the labels of the nearest neighbours.
    return Counter(nn[-1] for nn in neighbours).most_common(1)[0][0]

training_data = [

    [1, 1, 'apple'],

    [1, 0.5, 'apple'],

    [0.5, 1, 'orange'],

    [0.5, 0.5, 'orange']]

print(predict([1, 1], training_data))

OUTPUT:

apple

RESULT:

Thus the Python program to utilize a training algorithm for pattern association in images or signals was executed successfully.

EX NO: IMPLEMENT A HETEROASSOCIATIVE MEMORY NETWORK FOR RECALLING HISTORICAL DATA

DATE:

AIM:

To write a Python program to implement a heteroassociative memory network for recalling historical data.

ALGORITHM:

Step 1: Initialize input and output patterns.

Step 2: Compute the weight matrix.

Step 3: Define the recall function and test it on an input pattern.

Step 4: Compare the retrieved output with the expected result.
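
Step 2's weight matrix is the sum of the outer products of each input-output pair, which is exactly what the single matrix product X^T Y in the program computes; a small NumPy sketch verifying that equivalence on the same bipolar patterns:

import numpy as np

x = np.array([[1, -1, 1], [-1, 1, -1], [1, 1, -1]])
y = np.array([[1, 1, -1], [-1, -1, 1], [1, -1, 1]])

# Hebbian-style heteroassociative rule: sum of outer products over stored pairs.
w_outer = sum(np.outer(xi, yi) for xi, yi in zip(x, y))

# The same matrix falls out of the single product used in the program below.
assert np.array_equal(w_outer, x.T @ y)
print(w_outer)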

PROGRAM:

import numpy as np

x=np.array([

[1,-1,1],

[-1,1,-1],

[1,1,-1]])

y=np.array([

[1,1,-1],

[-1,-1,1],

[1,-1,1]])

w=x.T@y

def recall(input_pattern):

    # Bipolar threshold of the weighted input gives the recalled pattern.
    output = np.sign(input_pattern @ w)

    return output

test_input=np.array([1,-1,1])

retrieved_output=recall(test_input)

print("test input:",test_input)

print("recalled output:",retrieved_output)

OUTPUT:

Test input : [1 -1 1]

Recalled output : [1 1 -1]

RESULT:

Thus the implementation of a heteroassociative memory network for recalling historical data was executed successfully.


EX NO: USE KOHONEN MAPS TO VISUALIZE HIGH-DIMENSIONAL DATA

DATE:

AIM:

To write a Python program using Kohonen maps to visualize high-dimensional data.

ALGORITHM:
Step 1: Import necessary libraries.

Step 2: Generate synthetic data.

Step 3: Initialize the SOM.

Step 4: Train the SOM.

Step 5: Visualize the SOM output.
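
After training, each sample can also be mapped to its best-matching unit (BMU) on the grid, which is a common way to compare high-dimensional points visually. A minimal sketch assuming the minisom package, whose MiniSom.winner() method returns the BMU coordinates (illustrative random data, not the recorded program):

import numpy as np
from minisom import MiniSom

data = np.random.rand(100, 5)                      # 100 samples, 5 features
som = MiniSom(x=10, y=10, input_len=5, sigma=1.0, learning_rate=0.5)
som.random_weights_init(data)
som.train_random(data, 1000)

# Grid coordinates of the winning neuron for the first few samples.
for sample in data[:3]:
    print(som.winner(sample))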

PROGRAM:
import numpy as np

import matplotlib.pyplot as plt

from minisom import MiniSom

data = np.random.rand(100, 5)

# 10x10 SOM grid for 5-dimensional input vectors.
som = MiniSom(x=10, y=10, input_len=5, sigma=1.0, learning_rate=0.5)

som.random_weights_init(data)

som.train_random(data, 1000)

plt.figure(figsize=(6, 6))

# U-matrix: average distance of each neuron to its neighbours.
plt.pcolor(som.distance_map().T, cmap="coolwarm")

plt.colorbar(label='neuron activation distance')

plt.title("Self-Organizing Map (SOM) Visualization")

plt.show()

OUTPUT:

RESULT:

Thus the Python program using Kohonen maps to visualize high-dimensional data was executed successfully.

EX NO: USING SUPERVISED LEARNING NETWORKS TO PREDICT STOCK MARKET TRENDS

DATE:
AIM:

To write a Python program using supervised learning networks to predict stock market trends.

ALGORITHM:

Step 1: Import required libraries.

Step 2: Load and normalize the data.

Step 3: Create sequence for LSTM model.

Step 4: Define and train the LSTM model.

Step 5: Predict stock prices on the test set.

Step 6: Inverse transform the predictions to the original scale.

Step 7: Plot actual vs. predicted stock prices.
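
Step 3's "sequences" are sliding windows: each block of seq_len consecutive prices becomes one model input and the price immediately after it is the target. A toy numeric sketch of the windowing (illustrative values, much shorter than the real series):

import numpy as np

series = np.arange(10, 20).reshape(-1, 1).astype(float)   # stand-in for scaled prices

seq_len = 3
X = np.array([series[i:i + seq_len] for i in range(len(series) - seq_len)])
y = np.array([series[i + seq_len] for i in range(len(series) - seq_len)])

print(X.shape, y.shape)            # (7, 3, 1) (7, 1)
print(X[0].ravel(), "->", y[0])    # [10. 11. 12.] -> [13.]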

PROGRAM:

import numpy as np

import yfinance as yf

import matplotlib.pyplot as plt

from sklearn.preprocessing import MinMaxScaler

from tensorflow.keras.models import Sequential

from tensorflow.keras.layers import LSTM, Dense

df = yf.download("AAPL", start="2015-01-01", end="2024-01-01")[['Close']].values

scaler = MinMaxScaler((0,1))

scaled_data = scaler.fit_transform(df)

def create_sequences(data, seq_len=50):

    X, y = [], []

    # Each window of seq_len values is paired with the value that follows it.
    for i in range(len(data) - seq_len):

        X.append(data[i:i+seq_len])

        y.append(data[i+seq_len])

    return np.array(X), np.array(y)

train_size = int(len(scaled_data) * 0.8)

X_train, y_train = create_sequences(scaled_data[:train_size])

X_test, y_test = create_sequences(scaled_data[train_size:])

X_train, X_test = X_train.reshape((-1, 50, 1)), X_test.reshape((-1, 50, 1))

model = Sequential([

LSTM(50, return_sequences=True, input_shape=(50, 1)),

LSTM(50),

Dense(25),

Dense(1)

])

model.compile(optimizer='adam', loss='mse')

model.fit(X_train, y_train, epochs=20, batch_size=32, verbose=1)

predictions = scaler.inverse_transform(model.predict(X_test, verbose=0))

plt.figure(figsize=(12, 6))

plt.plot(df[len(df)-len(y_test):], label="Actual Price")

plt.plot(predictions, label="Predicted Price", linestyle="dashed")


plt.legend()

plt.show()

OUTPUT:

RESULT:

Thus the Python program using supervised learning networks to predict stock market trends was written and executed successfully.

EX NO: INTERPRET HANDWRITTEN TEXT OR NUMBERS

DATE:

AIM:

To write a Python program to implement an ANN to recognize and interpret handwritten text or numbers.

ALGORITHM:

Step 1: Import required libraries.

Step 2: Load and normalize the data.

Step 3: Build the neural network model.

Step 4: Compile and train the model.

Step 5: Evaluate the model.

Step 6: Predict the first test image.

Step 7: Display the predicted digit.
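
Steps 6 and 7 amount to adding a batch dimension, calling predict(), and taking the argmax of the ten class probabilities. A stand-alone sketch of just that scoring path, using a random image and an untrained model (illustrative only, not the recorded program):

import numpy as np
import tensorflow as tf

# Tiny untrained stand-in for the trained network in the program below.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(10, activation='softmax'),
])

img = np.random.rand(28, 28).astype("float32")            # stands in for one MNIST image
probs = model.predict(img[np.newaxis, ...], verbose=0)    # shape (1, 10)
print("Predicted digit:", np.argmax(probs[0]))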

PROGRAM:

import tensorflow as tf

from tensorflow.keras import layers, models

import matplotlib.pyplot as plt

import numpy as np

(X_train, y_train), (X_test, y_test) = tf.keras.datasets.mnist.load_data()

X_train, X_test = X_train / 255.0, X_test / 255.0

model = models.Sequential([

layers.Flatten(input_shape=(28, 28)),

layers.Dense(128, activation='relu'),

layers.Dropout(0.2),

layers.Dense(10, activation='softmax')

])

model.compile(optimizer='adam',

loss='sparse_categorical_crossentropy',

metrics=['accuracy'])

model.fit(X_train, y_train, epochs=5)

test_loss, test_acc = model.evaluate(X_test, y_test)

print(f"Test accuracy: {test_acc}")

predictions = model.predict(X_test)

print(f"Prediction for first image: {np.argmax(predictions[0])}")

plt.imshow(X_test[0], cmap=plt.cm.binary)

plt.title(f"Predicted: {np.argmax(predictions[0])}")

plt.show()

OUTPUT:

RESULT:

Thus the Python program to implement an ANN to recognize and interpret handwritten text or numbers was written and executed successfully.

EX NO: CREATE A SIMPLE IMAGE CLASSIFIER USING BASIC MODELS OF ANN
DATE:

AIM:

To write a Python program to create a simple image classifier using basic models of ANN.

ALGORITHM:

Step 1: Import required libraries.

Step 2: Load dataset (using MNIST for simplicity).

Step 3: Normalize the dataset and flatten the images.

Step 4: Define and compile the ANN model.

Step 5: Train and evaluate the model.

Step 6: Print the output.
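
The first Dense layer in Step 4 connects the 784 flattened pixels to 128 units, so it alone holds 784 × 128 + 128 = 100,480 trainable parameters; model.summary() reports these counts. A short sketch, assuming the same layer sizes as the program below:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28*28 = 784 inputs, no parameters
    tf.keras.layers.Dense(128, activation='relu'),    # 784*128 + 128 = 100,480 parameters
    tf.keras.layers.Dense(10, activation='softmax'),  # 128*10 + 10 = 1,290 parameters
])
model.summary()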

PROGRAM:

import tensorflow as tf

from tensorflow.keras.models import Sequential

from tensorflow.keras.layers import Dense, Flatten

from tensorflow.keras.datasets import mnist

import matplotlib.pyplot as plt

(X_train, y_train), (X_test, y_test) = mnist.load_data()

X_train, X_test = X_train / 255.0, X_test / 255.0

model = Sequential([

Flatten(input_shape=(28, 28)),

Dense(128, activation='relu'),

Dense(10, activation='softmax')

])

model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
metrics=['accuracy'])

model.fit(X_train, y_train, epochs=10, batch_size=32, validation_data=(X_test, y_test))

test_loss, test_acc = model.evaluate(X_test, y_test)

print(f"Test Accuracy: {test_acc:.4f}")

predictions = model.predict(X_test[:2])

for i in range(2):

    plt.imshow(X_test[i], cmap='gray')

    plt.title(f"Predicted: {predictions[i].argmax()}, Actual: {y_test[i]}")

    plt.axis('off')

    plt.show()

OUTPUT:

RESULT:

Thus the Python program to create a simple image classifier using basic models of ANN was executed successfully.

EX NO: LANGUAGE TRANSLATION TOOL
DATE:

AIM:

To write a Python program to develop a basic tool for translating one language to another using ANNs.

ALGORITHM:

Step 1: Start the program.

Step 2: Data processing and model architecture (encoder-decoder).

Step 3: Input a sentence in the source language for translation.

Step 4: Convert the numerical predictions back into words.

Step 5: Print the output.
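
A non-interactive version of the same pipeline, with the language pair fixed to English → French; this sketch assumes the Helsinki-NLP/opus-mt-en-fr checkpoint can be downloaded, and the exact translation text it prints is not guaranteed:

from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-fr"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Tokenize, generate in the target language, then decode back to text.
batch = tokenizer(["Good morning, how are you?"], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.decode(generated[0], skip_special_tokens=True))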

PROGRAM:
from transformers import MarianMTModel, MarianTokenizer

def translate_text(text, source_lang, target_lang):

    # MarianMT checkpoints follow the Helsinki-NLP/opus-mt-<src>-<tgt> naming scheme.
    model_name = f'Helsinki-NLP/opus-mt-{source_lang}-{target_lang}'

    model = MarianMTModel.from_pretrained(model_name)

    tokenizer = MarianTokenizer.from_pretrained(model_name)

    # Tokenize the input text for the seq2seq model.
    tokenized_text = tokenizer([text], return_tensors='pt', padding=True)

    translated = model.generate(**tokenized_text)

    translated_text = tokenizer.decode(translated[0], skip_special_tokens=True)

    return translated_text

def main():

    print("Welcome to the Language Translation Tool!")

    source_lang = input("Enter source language code (e.g., 'en' for English, 'fr' for French): ").strip()

    target_lang = input("Enter target language code (e.g., 'es' for Spanish, 'de' for German): ").strip()

    text = input(f"Enter the text you want to translate from {source_lang} to {target_lang}: ").strip()

    translated_text = translate_text(text, source_lang, target_lang)

    print(f"\nOriginal text: {text}")

    print(f"Translated text: {translated_text}")

if __name__ == "__main__":

    main()

OUTPUT:

RESULT:

Thus the Python program to develop a basic tool for translating one language to another using ANNs was executed successfully.

EX NO: APPLY LEARNING VECTOR QUANTIZATION FOR MARKET SEGMENTATION ANALYSIS

DATE:

AIM:

To write a Python program to apply learning vector quantization for market segmentation analysis.

ALGORITHM:

Step 1: Generate a dataset.

Step 2: Train a model to learn the boundaries between clusters.

Step 3: Use the trained model to predict cluster membership.

Step 4: Visualize the results.

Step 5: Display the output.
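
The program below approximates the segmentation with a k-nearest-neighbours classifier; learning vector quantization proper instead maintains one prototype per segment and nudges the winning prototype toward same-class samples and away from others (the LVQ1 rule). A minimal NumPy sketch of that update on illustrative data (not the recorded program):

import numpy as np

# Toy two-feature data with two segments (illustrative values only).
X = np.array([[0.1, 0.9], [0.2, 0.8], [0.8, 0.2], [0.9, 0.1]])
y = np.array([0, 0, 1, 1])

prototypes = np.array([[0.5, 0.6], [0.6, 0.4]])   # one starting prototype per class
lr = 0.3                                          # learning rate

for epoch in range(20):
    for xi, yi in zip(X, y):
        j = np.argmin(np.linalg.norm(prototypes - xi, axis=1))   # winning prototype
        if j == yi:
            prototypes[j] += lr * (xi - prototypes[j])            # attract: correct class
        else:
            prototypes[j] -= lr * (xi - prototypes[j])            # repel: wrong class

print(prototypes)   # prototypes drift toward their segment centres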

PROGRAM:

import numpy as np

import matplotlib.pyplot as plt

from sklearn.preprocessing import MinMaxScaler

from sklearn.model_selection import train_test_split

from sklearn.neighbors import KNeighborsClassifier

data = np.array([ [15, 8], [18, 7], [22, 6], [25, 5], [30, 4], [35, 3],

[40, 2], [45, 1.5], [50, 1], [55, 0.8], [60, 0.5], [65, 0.3]

])

labels = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2, 2, 2, 2])

scaler = MinMaxScaler()

data = scaler.fit_transform(data)

X_train, X_test, y_train, y_test = train_test_split(data, labels, test_size=0.2,
random_state=42)

model = KNeighborsClassifier(n_neighbors=3)

model.fit(X_train, y_train)

predictions = model.predict(data)

colors = {0: 'red', 1: 'green', 2: 'blue'}

plt.figure(figsize=(8, 6))

for i, point in enumerate(data):

    plt.scatter(point[0], point[1], color=colors[predictions[i]], label=f'Cluster {predictions[i]}' if i < 3 else "")

plt.xlabel("Age")

plt.ylabel("Mobile Usage (Hours per Day)")

plt.title("Age vs Mobile Usage Classification")

plt.legend(["Heavy User", "Moderate User", "Low User"])

plt.grid()

plt.show()

OUTPUT:

RESULT:

Thus the Python program to apply learning vector quantization for market segmentation analysis was executed successfully.
