
LAB NO 2

OBJECT: To implement all the neural network transfer functions in MATLAB using
Simulink.

THEORY:
Transfer function: In MATLAB, a transfer function classically represents the relationship between
the input and output of a dynamic system in the frequency domain, and it is commonly used in
control systems analysis and design to model the behavior of linear time-invariant systems. In the
context of neural networks, however, a transfer function (also called an activation function) is
simply the rule that maps a neuron's net input to its output.
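As an aside, the control-systems sense can be illustrated at the command line (a minimal sketch, assuming the Control System Toolbox; the system 1/(s^2 + 2s + 1) is an arbitrary example):

sys = tf(1, [1 2 1]);    % transfer function 1/(s^2 + 2s + 1)
step(sys)                % step response of the LTI system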
Hard limit transfer function: hardlim (Neural Network Toolbox). The hard limit transfer
function forces a neuron to output 1 if its net input reaches the threshold of zero; otherwise it
outputs 0. This allows a neuron to make a decision or classification.
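Alongside the Simulink block, the same behaviour can be checked at the command line (a minimal sketch, assuming the toolbox function hardlim; the input range is illustrative):

n = -5:0.1:5;            % sample net-input values
a = hardlim(n);          % 1 where n >= 0, 0 elsewhere
plot(n, a), grid on, title('hardlim')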

RESULT:

Symmetric hard-limit transfer function: hardlims. The symmetric hard limit transfer function is
the two-sided counterpart of hardlim: it maps any input below zero to -1 and any input at or
above zero to +1, so the two possible output values are symmetric around zero.
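A quick command-line sketch of the same block (hedged on the toolbox function hardlims; the input range is illustrative):

n = -5:0.1:5;            % sample net-input values
a = hardlims(n);         % -1 where n < 0, +1 where n >= 0
plot(n, a), grid on, title('hardlims')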
RESULT:

Linear transfer function: purelin. A pure linear transfer function captures only linear dynamics,
with no non-linear components: it represents a linear relationship between input and output
signals, following the principles of superposition and homogeneity. In the Neural Network
Toolbox, purelin simply passes the neuron's net input through as its output. Such transfer
functions are fundamental for modeling and analyzing linear time-invariant systems in domains
such as control theory, signal processing, and communication systems.
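For comparison with the Simulink model, a minimal command-line sketch (assuming the toolbox function purelin):

n = -5:0.1:5;            % sample net-input values
a = purelin(n);          % output equals input: a = n
plot(n, a), grid on, title('purelin')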
RESULT:

Positive linear transfer function: poslin. This function applies a linear transformation to its input,
clipping negative values to zero while leaving positive values unchanged. It is commonly used in
neural network architectures to introduce a non-linearity that enforces non-negative activation
values.
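A short sketch of the same behaviour (assuming the toolbox function poslin):

n = -5:0.1:5;            % sample net-input values
a = poslin(n);           % max(0, n): negatives clipped to zero
plot(n, a), grid on, title('poslin')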

RESULT:
Saturating linear transfer function: satlin. This function applies a linear transformation to its
input while saturating values that fall outside its fixed bounds: inputs below 0 map to 0, inputs
above 1 map to 1, and inputs between 0 and 1 pass through unchanged. It is commonly used in
neural network architectures to introduce a non-linearity that restricts the range of activation
values.
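A command-line sketch (assuming the toolbox function satlin):

n = -5:0.1:5;            % sample net-input values
a = satlin(n);           % 0 for n < 0, n for 0 <= n <= 1, 1 for n > 1
plot(n, a), grid on, title('satlin')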

RESULT:

Symmetric saturating linear transfer function: satlins. satlins is a transfer function; transfer
functions calculate a layer's output from its net input. satlins passes inputs in the range [-1, 1]
through unchanged and saturates inputs outside that range to -1 or +1, making it the symmetric
counterpart of satlin.
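A command-line sketch (assuming the toolbox function satlins):

n = -5:0.1:5;            % sample net-input values
a = satlins(n);          % -1 for n < -1, n for -1 <= n <= 1, +1 for n > 1
plot(n, a), grid on, title('satlins')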
RESULT:

Log-sigmoid transfer function: logsig. The log-sigmoid function, logsig, is commonly used in
neural network architectures. It maps input values to the range between 0 and 1 through the
smooth, differentiable curve 1/(1 + e^-n).
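A minimal sketch of the mapping (assuming the toolbox function logsig):

n = -5:0.1:5;            % sample net-input values
a = logsig(n);           % 1 ./ (1 + exp(-n)), outputs in (0, 1)
plot(n, a), grid on, title('logsig')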

RESULT:
Tangent sigmoid transfer function: The hyperbolic tangent sigmoid function, often referred to
as tansig, is commonly used in neural network architectures. This function maps input values to
the range between -1 and 1, providing smooth and non-linear transformations that are useful for
capturing complex relationships in data.
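A minimal sketch of the mapping (assuming the toolbox function tansig):

n = -5:0.1:5;            % sample net-input values
a = tansig(n);           % 2 ./ (1 + exp(-2*n)) - 1, outputs in (-1, 1)
plot(n, a), grid on, title('tansig')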

RESULT:

CONCLUSION:

In MATLAB, activation functions like hardlim, hardlims, purelin, poslin,
satlin, satlins, tansig, and logsig are crucial for neural network
architectures, offering diverse non-linear transformations for effective model
learning and representation.
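To see the diversity of these transformations side by side, a hedged comparison sketch (assuming all eight toolbox functions are available; the input range and subplot layout are illustrative):

n = -3:0.05:3;           % common net-input range
fcns = {'hardlim','hardlims','purelin','poslin','satlin','satlins','logsig','tansig'};
for k = 1:numel(fcns)
    subplot(2, 4, k)     % one panel per transfer function
    plot(n, feval(fcns{k}, n)), grid on, title(fcns{k})
end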
