
Off-line Signature Recognition & Verification Using Back Propagation Neural Network

Nilesh Y. Choudhary, GF’S GCOE, Jalgaon, India

Mrs. Rupal Patil, GF’S GCOE, Jalgaon, India

Dr. Umesh Bhadade, GF’S GCOE, Jalgaon, India

Prof. Bhupendra M Chaudhari, Govt. Polytechnics Nadurbar, India

ABSTRACT

The fact that the signature is widely used as a means of personal identification for humans creates the need for an automatic verification system. Verification can be performed either offline or online, based on the application. However, human signatures can be handled as an image and recognized using computer vision and neural network techniques. With modern computers, there is a need to develop fast algorithms for signature recognition. There are various approaches to signature recognition, with a lot of scope for research. In this paper, off-line signature recognition & verification using a back propagation neural network is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on features extracted from the signature, using the invariant central moment and modified Zernike moment for invariant feature extraction, because signatures are hampered by a large amount of variation in size, translation, rotation and shearing parameters. Before extracting the features, preprocessing of the scanned image is necessary to isolate the signature part and to remove any spurious noise present. The system is initially trained using a database of signatures obtained from 56 individuals whose signatures have to be authenticated by the system. For each subject a mean signature is obtained by integrating the above features derived from a set of his/her genuine sample signatures. This signature recognition & verification system is designed using MATLAB. This work has been tested and found suitable for its purpose.

INTRODUCTION

Handwritten signature is one of the most widely accepted personal attributes for identity verification of a person. The written signature is regarded as the primary means of identifying the signer of a written document, based on the implicit assumption that a person’s normal signature changes slowly and is very difficult to erase, alter or forge without detection. The handwritten signature is one of the ways to authorize transactions and authenticate human identity, compared with other electronic identification methods such as fingerprint scanning and retinal vascular scanning, moving from the popular pen-and-paper signature to one where the handwritten signature is captured and verified electronically.

There are two main streams in the signature recognition task. The first approach requires finding information and can recognize the signature as the output of the system; it is seen that in a certain time interval, it is necessary to make the signature. This system models the signing person. The other approach is to take a signature as a static two-dimensional image which does not contain any time-related information [1]. In short, signature recognition can be divided into two groups: online and offline.

In online signature recognition, signatures are acquired during the writing process with a special instrument, such as a pen tablet. In fact, there is always dynamic information available in the case of online signature recognition, such as velocity, acceleration and pen pressure. So far there have been many widely employed methods developed for online signature recognition, for example Artificial Neural Networks (ANN) [2, 3], dynamic time warping (DTW) [4, 5] and hidden Markov models (HMM) [6, 7].

Off-line recognition deals only with signature images acquired by a scanner or a digital camera. In general, offline signature recognition & verification is a challenging problem. Unlike the on-line signature, where dynamic aspects of the signing action are captured directly as the handwriting trajectory, the dynamic information contained in an off-line signature is highly degraded. Handwriting features, such as the handwriting order, writing-speed variation, and skillfulness, need to be recovered from the grey-level pixels.

In the last few decades, many approaches developed in the pattern recognition area have been applied to the offline signature verification problem. Justino [8] proposes an off-line signature verification system using a Hidden Markov Model. Zhang, Fu and Yan [9] proposed a handwritten signature verification system based on Neural ‘Gas’ based Vector

International Journal of IT, Engineering and Applied Sciences Research (IJIEASR), ISSN: 2319-4413, Volume 2, No. 1, January 2013

Quantization. Vélez, Sánchez and Moreno [10] propose a robust off-line signature verification system using compression networks and positional cuttings [11, 12, 13].

The signature recognition & verification system shown in Fig 1 is broadly divided into three subparts: 1) Preprocessing, 2) Feature extraction, 3) Recognition & Verification.

The input signature is captured from a scanner or a high-resolution digital camera, which provides the output image as a BMP colour image. The preprocessing algorithm provides the data required for the final processing. In the feature extraction phase, the invariant central moment and the Zernike moment are used to extract the features for classification. In classification, the back propagation neural network is used to provide high accuracy and low computational complexity in the training and testing phases of the system.

1. SIGNATURE DATABASE

For training and testing of the signature recognition and verification system, 672 signatures are used. The signatures were taken from 56 persons. The templates of the signatures are shown in Fig 2.

For training the system, 56 persons’ signatures are used. Each of these persons signed 8 original signatures and 4 forgery signatures, so the total number of signatures used is 672 (12 x 56). In order to make the system robust, signers were asked to use as much variation in their signature size and shape as possible, and the signatures were collected at different times without the signers seeing the signatures they had signed before.

For testing the system, another 112 genuine signatures and 112 forgery signatures were taken from the same 56 persons in the training set.


Fig 2. Signature Templates

2. PREPROCESSING

The preprocessing algorithm is nothing but a data conditioning algorithm which provides data for the feature extraction process. It establishes the link between real-world data and the recognition & verification system. The preprocessing of the trajectory of the input signature pattern directly facilitates pattern description and affects the quality of the description. Any image-processing application suffers from noise like touching line segments, isolated pixels and smeared images. This noise may cause severe distortions in the digital image and hence result in ambiguous features and a correspondingly poor recognition and verification rate. The preprocessing step is applied in both the training and testing phases. Background elimination, noise reduction, width normalization and skeletonization are the sub-steps.

2.1 Converting the Colour Image to a Grey Scale Image

In today’s technology, almost all image capturing and scanning devices give their output in colour format. A colour image consists of a coordinate matrix and three colour matrices. The coordinate matrix contains the X, Y coordinate values of the image. The colour matrices are labeled as red (R), green (G), and blue (B). The techniques presented in this study are based on grey scale images; therefore, scanned or captured colour images are initially converted to grey scale using the following equation (1):

Grey = 0.299*Red + 0.587*Green + 0.114*Blue    (1)

2.2 Noise Reduction

Noise reduction (also called “smoothing” or “noise filtering”) is one of the most important processes in image processing. Images are often corrupted by positive and negative impulses stemming from decoding errors or noisy channels. An image may also be degraded because of undesirable effects due to illumination and other objects in the environment. The median filter is widely used for smoothing and restoring images corrupted by noise. It is a non-linear process, useful especially in reducing impulsive or salt-and-pepper type noise. In a median filter, a window slides over the image, and for each position of the window, the median intensity of the pixels inside it determines the intensity of the pixel located in the middle of the window. Unlike linear filters such as the mean filter, the median filter has attractive properties for suppressing impulse noise while preserving edges. The median filter is used in this study due to its edge-preserving feature [14, 15, 16, 17].
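The two preprocessing operations above can be sketched as follows. The paper's system was built in MATLAB; this standalone Python version is illustrative only, using nested lists in place of image matrices: grey-scale conversion with the luminance weights of eq. (1), then a 3x3 median filter.

```python
def to_grey(rgb):
    """Convert an RGB image (nested lists of (r, g, b) tuples) to grey
    scale using the luminance weights of eq. (1)."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb]

def median_filter(img):
    """3x3 median filter: each interior pixel is replaced by the median of
    its window, suppressing salt-and-pepper impulses while keeping edges."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # border pixels are left unchanged
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the 9 window values
    return out

if __name__ == "__main__":
    grey = to_grey([[(255, 255, 255)], [(0, 0, 0)]])
    print(grey)  # white maps to ~255, black to 0
    noisy = [[10, 10, 10], [10, 255, 10], [10, 10, 10]]  # one impulse
    print(median_filter(noisy)[1][1])  # impulse suppressed by the median
```

Because the weights in eq. (1) sum to 1, the grey output stays in the same 0–255 range as the inputs, and the median filter removes the single bright impulse without blurring the surrounding values the way a mean filter would.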


2.3 Background Elimination and Border Clearing

Many image processing algorithms require the separation of objects from the image background. Thresholding is the most easily and most commonly applicable method for this purpose. It is widely used in image segmentation [18, 19]. The image is binarised by assigning 0 to the pixels with values smaller than or equal to H and 1 to those with values greater than H.

We used the thresholding technique for separating the signature pixels from the background pixels. Clearly, in this application we are interested in dark objects on a light background; therefore, a threshold value H, called the brightness threshold, is appropriately chosen and applied to the image pixels f(x, y) as in the following equation (2):

If f(x, y) ≥ H then
    f(x, y) = Background
else
    f(x, y) = Object    (2)

The signature image, which is located by separating it from the complex background, is converted into a binary image, with the white background taking the pixel value 1. Vertical and horizontal (histogram) projections are used for border clearing. For both directions, vertical and horizontal, we counted the zeros in every row, and the resulting histogram is plotted sideways.

2.4 Signature Normalization

Signature dimensions may vary due to irregularities in the image scanning and capturing process. Furthermore, the height and width of signatures vary from person to person and, sometimes, even the same person may use different-sized signatures. First, we need to eliminate the size differences and obtain a standard signature size for all signatures. After this normalization process, all signatures will have the same dimensions. In this study, we used a normalized size of 50x50 pixels for all signatures that will be processed further. During the normalization process, the aspect ratio between the width and height of a signature is kept intact.

The normalization process made use of equations (3) & (4), where:

x_n, y_n - pixel coordinates for the normalized signature,
x, y - pixel coordinates for the original signature,
M - one of the dimensions (width or height) of the normalized signature.

[Equations (3) and (4) are not legible in this copy.]

Fig 6. Normalized Image

3. FEATURE EXTRACTION

Feature extraction, as defined by Devijver and Kittler [20], is “extracting the information from the raw data which is most relevant for the classification stage”. This data can minimize the within-class pattern variation and increase the inter-class variations. Therefore, achieving high recognition performance in a signature recognition system is strongly influenced by the selection of efficient feature extraction methods, taking into consideration the domain of the application and the type of classifier used [21]. An efficient feature extraction algorithm should have two characteristics: invariance and reconstruct-ability. Features [21] that are invariant to certain transformations of the signature make it possible to recognize many variations of these signatures. Such transformations include translation, scaling, rotation, stretching, skewing and mirroring.

On the other hand, the ability to reconstruct a signature from its extracted features ensures that complete information about the signature shape is present in these features. In this feature extraction step, a well-known feature set in pattern recognition is used: one part depends on the invariant central moments designed by Hu [22], which are used for scale and translation normalization, and the other is the modified Zernike moment [23], which is used for rotation normalization.

3.1 Invariant Central Moment

The moments of order (u + v) of an image composed of binary pixels B(x, y) are proposed by [24], [25] as shown in eq. (5):

m_uv = Σ_x Σ_y x^u y^v B(x, y)    (5)

The body’s area A and the image’s centre of mass (x̄, ȳ) are found from eq. (6):

A = m_00,    x̄ = m_10 / m_00,    ȳ = m_01 / m_00    (6)
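The moment computations of eqs. (5) and (6), together with the translation- and scale-invariant moments they lead into, can be sketched as follows. The paper works in MATLAB; this pure-Python sketch is illustrative only, with the image as a nested list of binary pixels B(x, y):

```python
def raw_moment(B, u, v):
    """m_uv of eq. (5): the raw moment of order (u + v) of binary image B."""
    return sum(x**u * y**v * B[y][x]
               for y in range(len(B)) for x in range(len(B[0])))

def central_moment(B, u, v):
    """mu_uv: moments taken about the centre of mass of eq. (6),
    which makes them invariant to translation."""
    m00 = raw_moment(B, 0, 0)            # the body's area A
    xc = raw_moment(B, 1, 0) / m00       # centre of mass x-bar
    yc = raw_moment(B, 0, 1) / m00       # centre of mass y-bar
    return sum((x - xc)**u * (y - yc)**v * B[y][x]
               for y in range(len(B)) for x in range(len(B[0])))

def normalised_moment(B, u, v):
    """eta_uv: central moment normalised by mu_00^K with
    K = 1 + (u + v)/2 (valid for u + v >= 2), adding scale invariance."""
    K = 1 + (u + v) / 2
    return central_moment(B, u, v) / central_moment(B, 0, 0)**K

if __name__ == "__main__":
    blob = [[0] * 6 for _ in range(6)]
    for y in (1, 2):
        for x in (1, 2):
            blob[y][x] = 1               # a 2x2 square of ink
    shifted = [[0] * 6 for _ in range(6)]
    for y in (3, 4):
        for x in (3, 4):
            shifted[y][x] = 1            # the same square, translated
    # central moments are unchanged under translation:
    print(central_moment(blob, 2, 0), central_moment(shifted, 2, 0))
```

Running the example shows that the central moment of order (2, 0) is identical for the original and the translated square, which is exactly the invariance property the signature features rely on.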


The central moments are given by eq. (7):

μ_uv = Σ_x Σ_y (x − x̄)^u (y − ȳ)^v B(x, y)    (7)

Finally, the normalized central moments, which are translation and scale invariant, are derived from the central moments as shown in eq. (8):

η_uv = μ_uv / μ_00^K    (8)

where K = 1 + (u + v)/2 for u + v ≥ 2.

3.2 Zernike Moments

Zernike polynomials are a set of complex polynomials which form a complete orthogonal set over the interior of the unit circle [26]. The form of the polynomial is shown by eq. (9):

V_nm(x, y) = R_nm(ρ) e^(imθ)    (9)

where ρ is the length of the vector from the origin to the point (x, y), θ is the angle between this vector and the x axis in the counterclockwise direction, and the radial polynomial R_nm is given by eq. (10):

R_nm(ρ) = Σ_{s=0}^{(n−|m|)/2} (−1)^s [(n − s)! / (s! ((n + |m|)/2 − s)! ((n − |m|)/2 − s)!)] ρ^(n−2s)    (10)

Zernike moments are the projections of the image function onto these orthogonal basis functions. The Zernike moment of order n with repetition m for a digital image is given by eq. (11):

A_nm = ((n + 1)/π) Σ_x Σ_y f(x, y) V*_nm(x, y),    x² + y² ≤ 1    (11)

where * is the complex conjugate operator.

To calculate the Zernike moments for a given image, its pixels are mapped to the unit circle x² + y² ≤ 1. This is done by taking the geometrical centre of the image as the origin and then scaling its bounding rectangle into the unit circle, as shown in Fig 7. Due to the orthogonality of the Zernike basis, the part of the original image inside the unit circle can be approximated using its Zernike moments A_nm up to a given order n_max, using eq. (12):

f̂(x, y) = Σ_{n=0}^{n_max} Σ_m A_nm V_nm(x, y)    (12)

The orthogonality property of Zernike moments, as expressed in eq. (12), allows easy image reconstruction from the Zernike moments by simply adding the information content of each individual order moment. Moreover, Zernike moments have simple rotational transformation properties: the Zernike moments of a rotated image have magnitudes identical to those of the original one; they merely acquire a phase shift upon rotation. Therefore, the magnitudes of the Zernike moments are rotation-invariant features of the underlying image. Translation and scale invariance, on the other hand, are obtained by shifting and scaling the image into the unit circle.

Fig 7. Rotation Normalization

4. BACK PROPAGATION NEURAL NETWORK

There are several algorithms that can be used to create an artificial neural network, but back propagation [27] was chosen because it is probably the easiest to implement while preserving the efficiency of the network. A back propagation Artificial Neural Network (ANN) uses more than one layer (usually 3). Each of these layers must be one of the following:

• Input Layer – This layer holds the input for the network.
• Output Layer – This layer holds the output data, usually an identifier for the input.
• Hidden Layer – This layer comes between the input layer and the output layer. It serves as a propagation point for sending data from the previous layer to the next layer.

A typical back propagation ANN is depicted in Fig 8. The black nodes (on the extreme left) are the initial inputs. Training such a network involves two phases. In the first phase, the inputs are propagated forward to compute the outputs for each output node. Then, each of these outputs is subtracted from its desired output, causing an error (an error for each output node).
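The Zernike computation of eqs. (9)–(11) can be sketched numerically. This is a simplified illustration, not the paper's MATLAB implementation: it maps the whole square image into [−1, 1] x [−1, 1] and discards the corners outside the unit circle (rather than scaling the signature's bounding rectangle), and the paper's modified Zernike moments [23] may differ in normalisation details. It does demonstrate the key property stated above: the magnitude |A_nm| is unchanged by rotation.

```python
import cmath, math

def radial_poly(n, m, rho):
    """R_nm(rho) of eq. (10)."""
    m = abs(m)
    return sum((-1)**s * math.factorial(n - s)
               / (math.factorial(s)
                  * math.factorial((n + m) // 2 - s)
                  * math.factorial((n - m) // 2 - s))
               * rho**(n - 2 * s)
               for s in range((n - m) // 2 + 1))

def zernike_moment(img, n, m):
    """A_nm of eq. (11) for a square binary image whose pixels are
    mapped into the unit circle."""
    N = len(img)
    total = 0j
    for y in range(N):
        for x in range(N):
            # map pixel centres into [-1, 1] x [-1, 1]
            xc = (2 * x + 1) / N - 1
            yc = (2 * y + 1) / N - 1
            rho = math.hypot(xc, yc)
            if rho > 1 or not img[y][x]:
                continue  # outside the unit circle, or no ink
            theta = math.atan2(yc, xc)
            v = radial_poly(n, m, rho) * cmath.exp(1j * m * theta)
            total += v.conjugate()       # f(x, y) = 1 on ink pixels
    return (n + 1) / math.pi * total

if __name__ == "__main__":
    img = [[0] * 8 for _ in range(8)]
    for y in range(2, 5):
        for x in range(1, 7):
            img[y][x] = 1                # an off-centre bar of "ink"
    rot90 = [list(row) for row in zip(*img[::-1])]  # same bar, rotated 90 deg
    a, b = zernike_moment(img, 2, 2), zernike_moment(rot90, 2, 2)
    print(abs(a), abs(b))  # magnitudes agree; only the phases differ
```

Because a 90-degree rotation maps the pixel grid exactly onto itself, the two magnitudes agree to floating-point precision, while the complex moments themselves differ by the phase factor described in the text.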


In the second phase, the errors are passed backward and the weights are adjusted. These two phases are continued until the sum of squares of the output errors reaches an acceptable value. Each neuron is composed of two units. The first unit adds the products of the weight coefficients and the input signals, while the second unit realizes a nonlinear function, called the neuron activation function. Signal e is the adder output signal, and y = f(e) is the output signal of the nonlinear element. Signal y is also the output signal of the neuron. To teach the neural network, we need a data set. The training data set consists of input signals (x1, x2, …) assigned with corresponding targets (desired outputs). The network training is an iterative process. In each iteration the weight coefficients of the nodes are modified using new data from the training data set. Each teaching step starts with forcing both input signals from the training set. After this stage we can determine the output signal values for each neuron in each network layer.

The symbols w_mn represent the weights of the connections between the output of neuron m and the input of neuron n in the next layer.

In the next algorithm step, the output signal of the network is compared with the desired output value (the target), which is found in the training data set. The difference is called the error signal of the output layer neuron. It is impossible to compute the error signals for internal neurons directly, because the output values of these neurons are unknown. For many years an effective method for training multilayer networks was unknown; only in the middle eighties was the back propagation algorithm worked out. The idea is to propagate the error signal (computed in a single teaching step) back to all neurons whose output signals were inputs for the neuron in question.

The weight coefficients used to propagate the errors back are equal to those used during computing the output value; only the direction of data flow is changed (signals are propagated from outputs to inputs, one layer after the other). This technique is used for all network layers. If the propagated errors come from several neurons, they are added.

Fig 8. A 3-layer neural network using back propagation

When the application launches, it waits for the user to determine whether he wishes to train or verify a set of signatures. At the training stage, based on the back propagation neural network algorithm, the user gives 12 different images as input, of which the real input to the network are the individual pixels of the images. When the input is confirmed and accepted, it passes through the back propagation neural network algorithm to generate an output which contains the network data of the trained images. The back propagation artificial neural network simply calculates the gradient of the error of the network with respect to the network’s modifiable weights. In this paper we use a multi-layer neural network designed by O.C. Abikoye [28].

5. TRAINING AND TESTING

The recognition phase consists of two parts, training and testing respectively, which are accomplished by the back propagation neural network.

As explained in Section 1, 672 images in our database, belonging to 56 people, are used for both training and testing. Since 8 (out of 12) input vectors for each person were used for training purposes, there are only 224 (56 x 4) input vectors (data sets) left to be used for the test set. Under normal (correct) operation of the back propagation neural network, only one output is expected to take a value of “1”, indicating the recognition of the signature represented by that particular output. The other output values must remain zero. The output layer used a logic decoder which mapped neuron outputs between 0.5 and 1 to a binary value of 1. If the real value of an output is less than 0.5, it is represented by a “0” value. The back propagation neural network program recognized all of the 56 signatures correctly. This result translates into a 100% recognition rate. We also tested the system with 15 random signatures which are not contained in the original database.
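The two training phases described above can be sketched end to end. This is a toy illustration, not the paper's MATLAB network: the layer sizes, learning rate and the small OR-shaped data set are made-up values standing in for the 50x50-pixel signature inputs. Each step propagates the inputs forward, computes the output error, passes it backward through the same weights, and repeats until the sum of squared errors is acceptable.

```python
import math, random

random.seed(0)
N_IN, N_HID, N_OUT, RATE = 2, 3, 1, 0.5
# weights: index 0 of each row is the bias, the rest connect to the layer below
w1 = [[random.uniform(-1, 1) for _ in range(N_IN + 1)] for _ in range(N_HID)]
w2 = [[random.uniform(-1, 1) for _ in range(N_HID + 1)] for _ in range(N_OUT)]
sig = lambda e: 1.0 / (1.0 + math.exp(-e))   # activation y = f(e)

def forward(x):
    """Phase 1: propagate the input forward through hidden and output layers."""
    h = [sig(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))) for w in w1]
    o = [sig(w[0] + sum(wi * hi for wi, hi in zip(w[1:], h))) for w in w2]
    return h, o

def train_step(x, target):
    """Phase 2: propagate the output error backward and adjust the weights."""
    h, o = forward(x)
    d_out = [(t - ok) * ok * (1 - ok) for t, ok in zip(target, o)]
    # hidden errors use the same weights as the forward pass, direction reversed
    d_hid = [hj * (1 - hj) * sum(d_out[k] * w2[k][j + 1] for k in range(N_OUT))
             for j, hj in enumerate(h)]
    for k in range(N_OUT):
        w2[k][0] += RATE * d_out[k]
        for j in range(N_HID):
            w2[k][j + 1] += RATE * d_out[k] * h[j]
    for j in range(N_HID):
        w1[j][0] += RATE * d_hid[j]
        for i in range(N_IN):
            w1[j][i + 1] += RATE * d_hid[j] * x[i]
    return sum((t - ok) ** 2 for t, ok in zip(target, o))

# toy training set: learn logical OR of the two inputs
data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0.0 + 1])]
data[3] = ([1, 1], [1])
for epoch in range(5000):
    sse = sum(train_step(x, t) for x, t in data)
    if sse < 0.01:        # stop once the sum of squared errors is acceptable
        break

# decode outputs with the 0.5 threshold used by the paper's logic decoder
print([1 if forward(x)[1][0] >= 0.5 else 0 for x, _ in data])
```

The final line applies the same 0.5 logic decoder described in the text, so after training the decoded outputs reproduce the target pattern of the toy data set.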


Only two of these signatures, which are very similar to at least one of the 56 stored images, resulted in “false positives” (output > 0.5), while the remaining ones were recognized correctly as not belonging to the original set (the output value was <= 0.5). Since the recognition step is always followed by the verification step, these kinds of false positives can easily be caught by our verification system. In other words, the verification step serves as a safeguard against “false positives” as well as “false negatives”.

6. RESULT AND CONCLUSION

In this study, we presented an off-line signature recognition and verification system using the back propagation neural network, based on the steps of image processing, invariant central moments, Zernike moments & some global properties, and back propagation neural networks.

Both systems used a three-step process: in the first step, the signature is separated from its image background; the second step performs normalization and digitization of the original signature; and the invariant central moments, Zernike moments and global properties which are used as input features for the back propagation neural network are obtained in the third step.

Fig. 9 Online Result for Proposed Method

As shown in Fig 9, the proposed system is evaluated on two performance criteria: the feature extraction stage and the overall recognition rate, since achieving high recognition performance in a signature recognition system is strongly influenced by the selection of an efficient feature vector. In this paper, we compute the comparison between the previously implemented Hu’s moments [28] and Zernike moments [23]. The original feature vectors produced by the different moment invariant techniques are applied for signature feature extraction from the binary images of the signatures, and the absolute percentage error with the dimension of the feature vector is calculated.

The recognition system gives a 98% success rate by recognizing all signature patterns correctly for all signatures used in training. It gives poor performance for signatures that are not in the training phase. Generally, the failure to recognize/verify a signature was due to poor image quality and high similarity between two signatures. The recognition and verification ability of the system can be increased by using additional features in the input data set. This study aims to reduce to a minimum the cases of forgery in business transactions.

REFERENCES

[1] A. Pacut, A. Czajka, “Recognition of Human Signatures”, pp. 1560-1564, 2001.
[2] Ronny Martens, Luc Claesen, “On-Line Signature Verification by Dynamic Time-Warping”, IEEE Proceedings of ICPR'96, 1996.
[3] Quen-Zong Wu, I-Chang Jou, and Suh-Yin Lee, “On-Line Signature Verification Using LPC Cepstrum and Neural Networks”, IEEE Transactions on Systems, Man, and Cybernetics—Part B: Cybernetics, 27(1):148-153, 1997.
[4] Pavel Mautner, Ondrej Rohlik, Vaclav Matousek, Juergen Kempf, “Signature Verification Using ART-2 Neural Network”, Proceedings of the 9th International Conference on Neural Information Processing (ICONIP'02), 2: 636-639, 2002.
[5] A. Jain, F. Griess, S. Connell, “On-line Signature Verification”, Pattern Recognition, Vol. 35, No. 12, 2002.
[6] W. Nelson, W. Turin, T. Hastie, “Statistical Methods for On-line Signature Verification”, International Journal of Pattern Recognition and Artificial Intelligence, 8, 1994.
[7] R. Kashi, J. Hu, W. L. Nelson, W. Turin, “A Hidden Markov Model Approach to Online Handwritten Signature Verification”, International Journal on Document Analysis and Recognition, Vol. 1, No. 1, 1998.
[8] E. J. R. Justino, F. Bortolozzi and R. Sabourin (2001), “Offline Signature Verification Using HMM for Random, Simple and Skilled Forgeries”, ICDAR 2001, International Conference on Document Analysis and Recognition, vol. 1, pp. 105-110.
[9] B. Zhang, M. Fu and H. Yan (1998), “Handwritten Signature Verification based on


Neural ‘Gas’ Based Vector Quantization”, IEEE International Joint Conference on Neural Networks, pp. 1862-186.
[10] J. F. Vélez, Á. Sánchez, and A. B. Moreno (2003), “Robust Off-Line Signature Verification Using Compression Networks and Positional Cuttings”, Proc. 2003 IEEE Workshop on Neural Networks for Signal Processing, vol. 1, pp. 627-636.
[11] Q. Yingyong, B. R. Hunt, “Signature Verification Using Global and Grid Features”, Pattern Recognition, vol. 22, no. 12, Great Britain (1994), 1621-1629.
[12] Drouhard, J. P., R. Sabourin, and M. Godbout, “A Neural Network Approach to Off-line Signature Verification Using Directional PDF”, Pattern Recognition, vol. 29, no. 3 (1996), 415-424.
[13] G. Rigoll, A. Kosmala, “A Systematic Comparison Between On-Line and Off-Line Methods for Signature Verification with Hidden Markov Models”, 14th International Conference on Pattern Recognition, vol. II, Australia (1998), 1755.
[14] Lim, J. S., “Two-Dimensional Signal and Image Processing”, Prentice-Hall, 1990.
[15] Yang, X., and Toh, P. S., “Adaptive Fuzzy Multilevel Median Filter”, IEEE Transactions on Image Processing, Vol. 4, No. 5, pp. 680-682, May 1995.
[16] Hwang, H., and Haddad, R. A., “Adaptive Median Filters: New Algorithms and Results”, IEEE Transactions on Image Processing, Vol. 4, No. 4, pp. 449-505, April 1995.
[17] Rosenfeld, A., “Digital Picture Processing”, Academic Press Inc., 1982.
[18] Erdem, U. M., “2D Object Recognition In Manufacturing Environment Using Implicit Polynomials and Algebraic Invariants”, Master Thesis, Bogazici University, 1997.
[19] Fu, K. S., Mui, J. K., “A Survey on Image Segmentation”, Pattern Recognition, Vol. 13, pp. 3-16, Pergamon Press, 1981.
[20] Devijver, P. A. and J. Kittler, 1982, “Pattern Recognition: A Statistical Approach”, Prentice-Hall, London, ISBN-10: 0136542360.
[21] Trier, O. D., A. K. Jain and T. Taxt, “Feature Extraction Methods for Character Recognition — A Survey”, Pattern Recognition, 29: 641-662.
[22] Hu, M., “Visual Pattern Recognition by Moment Invariants”, IRE Trans. Inf. Theory, IT-8, pp. 179-187.
[23] Khotanzad, A. and Y. H. Hong, “Invariant Image Recognition by Zernike Moments”, IEEE Trans. Patt. Anal. Mach. Intell., 12: 489-497. DOI: 10.1109/34.55109.
[24] Theodoridis, S. and K. Koutroumbas, 2006, “Pattern Recognition”, 3rd Edn., Academic Press, ISBN-10: 0123695317, pp: 856.
[25] Reiss, T. H., “The Revised Fundamental Theorem of Moment Invariants”, IEEE Trans. Patt. Anal. Mach. Intell., 13: 830-834. DOI: 10.1109/34.85675, 1991.
[26] Khotanzad, A., Y. H. Hong, “Invariant Image Recognition by Zernike Moments”, IEEE Trans. Patt. Anal. Mach. Intell., pp. 489-497, March 1990.
[27] Golda, A., 2005, “Principles of Training Multi-layer Neural Network Using Back Propagation”.
[28] O. C. Abikoye, M. A. Mabayoje, R. Ajibade, “Offline Signature Recognition & Verification Using Neural Network”, International Journal of Computer Applications (0975-8887), Volume 35, No. 2, December 2011.
