Topic: Neural Suggestion (Read 1429 times)

TenaliRaman
Neural Suggestion
« on: Jul 12th, 2004, 9:54am »

I am currently in my final year, and I want to do a project on neural networks for my final-year project. My teacher and I have been discussing what we should do, and nothing fruitful has come out of the discussion so far. The problem is that we have very little background in NNs, and my group is doing the entire study of NNs from the ground up (which basically means we are complete novices in this field). I would like any suggestions whatsoever for a project on NNs. All suggestions are warmly welcomed.

towr
Re: Neural Suggestion
« Reply #1 on: Jul 12th, 2004, 12:15pm »

Well, neural networks scream 'pattern recognition', and you can go numerous ways with that:
- letter/number recognition (b/w bitmap picture to characters)
- face recognition (f.i. faces of cartoon characters)
- classification (unsupervised: Kohonen maps)
I saw an interesting combination of Markov models and Kohonen networks a while ago to classify/recognize time series, which might be fun to do (or try another combination).

TenaliRaman
Re: Neural Suggestion
« Reply #2 on: Jul 12th, 2004, 2:12pm »

Thanks for the suggestions, towr! I personally have no idea where to start my coding on any of those. I just started reading up on NNs a week or so back, and I know that Kohonen networks are self-organising networks, but beyond that I am blank. I read a nice article at IBM which gave me some idea of how I should start thinking about all this. Though how the network "learns" (through coding) is still a mystery to me! The perceptron model discussed in the IBM article doesn't really clarify things for me, because they talk of changing the weight matrix during the learning stage, but exactly how they go about doing it seems unknown!! Anyway, do you have any ideas where I can start afresh on these things? Thanks again! TR

towr
Re: Neural Suggestion
« Reply #3 on: Jul 12th, 2004, 2:36pm »

on Jul 12th, 2004, 2:12pm, TenaliRaman wrote: "I personally have no idea where to start my coding on any of those."

The easiest 'neural network'-like system you can make is a sort of vector network (not sure what the proper name is anymore). What you do is this:
- each item you want to recognize can be represented as a vector, and you can compare vectors for similarity
- generate a number of random vectors (the prototypes)
- compare each item to all the prototypes, and make the prototype that is most like it even more like it (by a weighted average, f.i. 99% prototype + 1% new item)
After learning for a while, the prototype vectors will represent classes of items. It's not really a neural network, but it does work similarly (especially once you get into Kohonen maps you'll see the similarities; support vector machines might also be of interest).

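Not from the thread, but to make that recipe concrete, here is a minimal Python sketch (the function names, the similarity measure, and all parameter values are my own illustration):

```python
import random

def similarity(a, b):
    # negative squared Euclidean distance: larger means more alike
    return -sum((x - y) ** 2 for x, y in zip(a, b))

def learn_prototypes(items, n_prototypes=4, rate=0.01, epochs=50):
    dim = len(items[0])
    # start from random prototype vectors
    prototypes = [[random.random() for _ in range(dim)]
                  for _ in range(n_prototypes)]
    for _ in range(epochs):
        for item in items:
            # find the prototype most like this item...
            best = max(prototypes, key=lambda p: similarity(p, item))
            # ...and make it even more like the item (99% prototype + 1% item)
            for i in range(dim):
                best[i] = (1 - rate) * best[i] + rate * item[i]
    return prototypes
```
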
Quote: "I just started reading up on NNs a week or so back ... exactly how they go about changing the weight matrix during the learning stage seems unknown!"

There are several learning rules; the most often used is error back-propagation (or the delta rule). For this you need an input, and you need to know what the output should be (supervised learning). Once you feed the input to the neural network you'll get some output, but it will differ from the output you want. This 'error' can be used to adjust the weights between the output layer and the layer before it; the goal is to ensure that next time the output will be more like what you want. If you have multiple layers you'll need to propagate the error backwards and lay blame on the nodes of the hidden layers (f.i. if a certain output node was 0 and should have been 1, and was connected to two nodes in the previous layer with weights 0.1 and 0.7 respectively, then you can propagate that error backwards by blaming the first node for 0.1/(0.7+0.1) * 1, and the other for 0.7/(0.1+0.7) * 1). This gives new errors for the previous layer, and you can repeat the same process again. (Maybe I should draw it...)

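A minimal sketch of such a network and its learning step, assuming sigmoid units throughout. Note this is the standard delta rule, where blame flows back through the weights and the sigmoid's derivative, rather than the normalized split in towr's example; the class layout is my own illustration:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class ThreeLayerNet:
    """input -> hidden -> output, trained online with back-propagation."""

    def __init__(self, n_in, n_hidden, n_out):
        rnd = lambda: random.uniform(-0.5, 0.5)
        # one extra row per layer is the bias weight (its input fixed at 1.0)
        self.w1 = [[rnd() for _ in range(n_hidden)] for _ in range(n_in + 1)]
        self.w2 = [[rnd() for _ in range(n_out)] for _ in range(n_hidden + 1)]

    def forward(self, x):
        xb = list(x) + [1.0]
        self.hidden = [sigmoid(sum(xi * self.w1[i][j]
                                   for i, xi in enumerate(xb)))
                       for j in range(len(self.w1[0]))]
        hb = self.hidden + [1.0]
        self.output = [sigmoid(sum(hi * self.w2[i][k]
                                   for i, hi in enumerate(hb)))
                       for k in range(len(self.w2[0]))]
        return self.output

    def train(self, x, target, nu=0.5):
        out = self.forward(x)
        # output deltas: error times the sigmoid's derivative o*(1-o)
        d_out = [(t - o) * o * (1 - o) for t, o in zip(target, out)]
        # hidden deltas: the error propagated backwards through the weights
        d_hid = [h * (1 - h) * sum(self.w2[j][k] * d_out[k]
                                   for k in range(len(d_out)))
                 for j, h in enumerate(self.hidden)]
        hb = self.hidden + [1.0]
        xb = list(x) + [1.0]
        for i, hi in enumerate(hb):
            for k, dk in enumerate(d_out):
                self.w2[i][k] += nu * dk * hi
        for i, xi in enumerate(xb):
            for j, dj in enumerate(d_hid):
                self.w1[i][j] += nu * dj * xi
```

For example, the XOR function discussed later in this thread can be learned with ThreeLayerNet(2, 3, 1) by repeatedly calling train on the four input/output pairs.
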
Quote: "Anyway, do you have any ideas where I can start afresh on these things?"

I'm pretty sure there are lecture notes floating around on the web about neural networks. And of course I can try to explain things, but I'm not sure if I'm all that good at explaining things...

There's another simple learning rule, Hebbian learning, which was inspired by nature. Basically you strengthen the weights between neurons that are (in)active at the same time (and perhaps weaken the bond otherwise). If the activation of a node ranges from -1 to 1, you can just use the rule weight(i,j) = weight(i,j) + ν * node(i) * node(j), where ν is the learning parameter. The weight matrix may be unstable though (some weights could grow to infinity), so it is customary to normalize it; a small sketch follows at the end of this post.

There are a few common pitfalls with neural networks. How well a network learns depends a lot on its architecture. It has been proven that a three-layer network (input-hidden-output) can represent any function as long as the layers have enough nodes, but in practice the network may never actually learn the function. Another problem lies with the input: all inputs should lie in the same range (f.i. [0..1]), because if one input ranges over 100-3000 and another over 0.01-0.05, the latter is probably going to be ignored by the network, or at best it will take a very long time for the network to learn that it is actually important. A third caution: use simple representations. A network can in theory learn binary coding, so you'd need only 3 nodes to represent the inputs/outputs 1..8, but in practice it will work much better if you use 8 inputs (one for each number); of course in some cases this may not be practical.

To be honest, every time I did anything with neural networks I have been disappointed; I expect too much, too fast (the same problem I have with genetic algorithms, and let's not get started on the unholy combination of the two: so tempting, yet so evil...). As my professor said: it's not magic, don't expect miracles.

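That Hebbian update as a minimal sketch (the row normalization is one common choice for keeping weights bounded, not the only one):

```python
def hebbian_step(weights, activations, nu=0.01):
    """One Hebbian update: strengthen the weight between co-active nodes.

    weights[i][j] connects node i and node j; activations lie in [-1, 1].
    """
    n = len(activations)
    for i in range(n):
        for j in range(n):
            weights[i][j] += nu * activations[i] * activations[j]
    # normalize each row so the weights can't grow to infinity
    for i in range(n):
        norm = sum(w * w for w in weights[i]) ** 0.5 or 1.0
        weights[i] = [w / norm for w in weights[i]]
    return weights
```
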
TenaliRaman
Re: Neural Suggestion
« Reply #4 on: Jul 14th, 2004, 12:11pm »

Quote: "The easiest 'neural network'-like system you can make is a sort of vector network ... What you do is this ..."

I will look into that. I have read up in detail on error back-propagation and the Kohonen model from "Neural Networks PC Tools" by Eberhart and Dobbins. Personally I feel error back-propagation is cheating; it looks like nothing more than trial-and-error measurement to me (which may not even be perfect). In this respect at least I felt Kohonen maps do well: at least during training we need not adjust the learning factor manually.

Quote: "I'm pretty sure there are lecture notes floating around on the web about neural networks. And of course I can try to explain things, but I'm not sure if I'm all that good at explaining things..."

Hehe, I am currently wading through all the notes that I could find on neural nets online, though I wouldn't mind learning it from you either; you may be a better teacher than you think.

Quote: "There are a few common pitfalls with neural networks ..."

Yes, I did guess something like that would happen, and now it is confirmed by someone like you who has actually programmed a neural network.

Quote: "To be honest, every time I did anything with neural networks I have been disappointed ... As my professor said: it's not magic, don't expect miracles."

The more I think about it, the more I tend to agree with your professor!! (Of course that hasn't dampened my spirits yet, noting that neural networks are still highly unexplored territory.)

towr
Re: Neural Suggestion
« Reply #5 on: Jul 14th, 2004, 2:15pm »

on Jul 14th, 2004, 12:11pm, TenaliRaman wrote: "Personally I feel error back-propagation is cheating; it looks like nothing more than trial-and-error measurement to me (which may not even be perfect)."

In what sense would it be cheating? It is a proven way (theoretically and practically) of reducing the error at each step, and thus it's exactly what learning ought to be. Now genetic training of a network, that's really trial and error, and for some problems it works well (EBP isn't suited to every problem; most notably recurrent networks may give trouble).

Quote: "In this respect at least I felt Kohonen maps do well: at least during training we need not adjust the learning factor manually."

Whenever you're dealing with learning you'll have to guess at the best learning parameters; that's no different for Kohonen maps than for standard multilayer perceptrons. Even if you let the learning parameter decrease as the number of steps increases, the best rate at which to do this is still guesswork (or trial and error), as is the best value to start at.

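For instance, a commonly used decaying learning parameter looks like the sketch below; the starting value eta0 and the decay constant tau are exactly the guesswork towr mentions (the numbers are my own illustration):

```python
def learning_rate(step, eta0=0.5, tau=1000.0):
    # hyperbolic decay: eta0 at step 0, half of that by step = tau
    return eta0 / (1.0 + step / tau)
```
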
Quote: "Hehe, I am currently wading through all the notes that I could find on neural nets online, though I wouldn't mind learning it from you either; you may be a better teacher than you think."

Well, I'll gladly be of help if I can. I'm no expert though; I only followed two courses that dealt directly with the subject (and a few others that touched on it in passing, like handwriting recognition). Here's a link you might find interesting: http://yann.lecun.com/exdb/lenet/ It's quite a different approach to neural networks than you'd usually expect.

TenaliRaman
Re: Neural Suggestion
« Reply #6 on: Jul 17th, 2004, 7:09am »

Quote: "In what sense would it be cheating? It is a proven way (theoretically and practically) of reducing the error at each step, and thus it's exactly what learning ought to be. Now genetic training ..."

I don't know; maybe it's because, when I first read it, the first thing that came to mind was how closely it resembles the difference method of extrapolation. The only difference being that the network tries to adjust the coefficients so as to get close to the answer, while the difference method just gives us a polynomial which might give us a close value.

Quote: "Even if you let the learning parameter decrease as the number of steps increases, the best rate at which to do this is still guesswork (or trial and error), as is the best value to start at."

Hmm, the method I had thought of for deciding the parameter was something like the Newton-Raphson method, where the starting guess would be 1.

Quote: "Here's a link you might find interesting ..."

Does the front-page applet implement their method? If yes, then it's oh so cool! Of course I haven't read through the literature yet, but it sounds so interesting. Thanks for the link!!

towr
Re: Neural Suggestion
« Reply #7 on: Jul 18th, 2004, 7:58am »

on Jul 17th, 2004, 7:09am, TenaliRaman wrote: "I don't know; maybe it's because, when I first read it, the first thing that came to mind was how closely it resembles the difference method of extrapolation. The only difference being that the network tries to adjust the coefficients so as to get close to the answer, while the difference method just gives us a polynomial which might give us a close value."

Well, in a sense it is similar. A neural network with just one output is 'simply' a mathematical function. Of course it's hardly ever a polynomial: most neurons use a sigmoid function or tanh to achieve nonlinearity (otherwise you might as well use a matrix instead of a neural network), though you can use other functions, and as long as they're differentiable you can use error back-propagation. Learning is basically just 'hill climbing' in the error landscape. Whether you use gradient descent (EBP) or another method (f.i. simulated annealing, or a genetic algorithm) doesn't fundamentally make a difference, just as long as you get to a good local minimum, preferably the absolute minimum.

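For reference, the two activation functions mentioned, with the derivatives that error back-propagation needs (a small sketch; conveniently, each derivative can be written in terms of the function's own output):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)   # derivative expressed in the output s

def tanh_prime(x):
    t = math.tanh(x)
    return 1.0 - t * t     # derivative expressed in the output t
```
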
Quote: "Hmm, the method I had thought of for deciding the parameter was something like the Newton-Raphson method, where the starting guess would be 1."

I'm not sure how, or if, that would work. If it does work, let me know.

Quote: "Does the front-page applet implement their method? If yes, then it's oh so cool!"

Yes it does. The interesting twist is that the neural network scans the image, much like our eyes scan a piece of paper when we read. The big advantage is that it can recognize 'objects' in a picture, rather than only whole pictures (unlike traditional neural nets, where putting the same object elsewhere in the picture makes it unrecognizable). And of course you can make many variations on it.

Another important pitfall just came to mind: overtraining a network. It may just memorize, or at least specialize on, the training set. So it's important to have a good, independent test set with which to check how well the neural network generalizes what it learned from the training set to new cases. A new problem here is that inevitably you'll optimize the network to learn from the training set in such a way that it performs optimally on the test set, so you've actually trained it on the latter as well. This means you need a third set, the validation set (and if the network does much worse there than on the test set, you know you've optimized too much). Of course you often have a very limited supply of data to learn from, so splitting it into three sets can pose new practical problems.

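A sketch of that three-way split (the 60/20/20 proportions and names are my own illustration, not from the thread):

```python
import random

def three_way_split(data, train=0.6, test=0.2, seed=42):
    # shuffle a copy so the three sets are independent samples
    data = list(data)
    random.Random(seed).shuffle(data)
    n_train = int(train * len(data))
    n_test = int(test * len(data))
    return (data[:n_train],                  # to learn from
            data[n_train:n_train + n_test],  # to tune the network on
            data[n_train + n_test:])         # validation: touched only once
```
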
rmsgrey
Re: Neural Suggestion
« Reply #8 on: Jul 18th, 2004, 8:40am »

Of course, another classic pitfall is mistraining, closely related to the overtraining problem. A classic example is the tank-detection software that trained on a set of pictures taken, for the purpose of the exercise, of the same patch of woodland on two occasions: on one occasion a tank was present in all the pictures, and on the other there were no tanks whatsoever. What no one noticed at the time was that one set of pictures had been taken on a sunny day, while the other was taken under an overcast sky, so the software ended up learning the difference between sunny and cloudy weather...

Sir Col
Re: Neural Suggestion
« Reply #9 on: Jul 18th, 2004, 1:45pm »

I like it, rmsgrey!

on Jul 18th, 2004, 7:58am, towr wrote: "Learning is basically just 'hill climbing' in the error landscape. Whether you use gradient descent (EBP) or another method (f.i. simulated annealing, or a genetic algorithm) doesn't fundamentally make a difference, just as long as you get to a good local minimum, preferably the absolute minimum."

I'm afraid that I am quite ignorant in the area of artificial intelligence, but I've always wondered about this problem (as I see it) with AI, and I would like to know a little more. To use a crude analogy: suppose that you are using an AI unit to find the lowest point (relative to sea level) within a given region, and suppose further that you start the unit in a valley. As it roams around it would continue to detect an increase in gradient in all directions, and so after "searching all possible paths" it would conclude that it had located the absolute minimum. The alternative is to allow the unit to complete an exhaustive search of the entire landscape before it arrives at a conclusion; however, in a more complex problem this may prove impossible within a realistic timescale.

So how would a human handle this? I would probably locate a high point and use it to survey the "landscape" and narrow the areas of search. It seems that, in order to complete the task, an AI would need to redesign its optimisation algorithm to do the opposite of what it was programmed to do. Will this ever be possible?

towr
Re: Neural Suggestion
« Reply #10 on: Jul 19th, 2004, 2:17am »

The problem is you can't 'survey' the error landscape, regardless of where you are in it. You can only sample bits of it and maybe make a map. If you're dealing with a real landscape, then sure, you can have the AI use cameras to survey the landscape; but even then the minimum can be hidden from view behind the next mountain range. With complex problems, chances are you'll never find the very best solution, but generally you don't need to: a sufficiently good solution is often good enough, especially when time limits are an issue. Anyway, there are many techniques to avoid getting trapped in local minima, so generally you wouldn't get trapped between grains of sand, only between really big rocks.

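One of the simplest such techniques is restarting the search from several random points and keeping the best result. A toy one-dimensional sketch (my own illustration, not from the thread):

```python
import random

def hill_climb(error, start, step=0.01, iters=1000):
    """Greedy local search: take whichever small move reduces the error."""
    x = start
    for _ in range(iters):
        best = min((x - step, x + step), key=error)
        if error(best) >= error(x):
            break                 # stuck in a (local) minimum
        x = best
    return x

def random_restarts(error, n=20, lo=-10.0, hi=10.0):
    # many random starting points make it likely at least one valley is good
    starts = [random.uniform(lo, hi) for _ in range(n)]
    return min((hill_climb(error, s) for s in starts), key=error)
```
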
TenaliRaman
Re: Neural Suggestion
« Reply #11 on: Jul 20th, 2004, 12:08pm »

After all the hassles, my project guide has settled on a neural network simulator, and he seems to be fixated on it. (I am not sure whether to feel happy about it or sad; the idea of a simulator isn't particularly exciting to me!) Anyway, I will of course pester towr whenever I need help on the proceedings. Thanks for all the help and guidance so far though!!

towr
Re: Neural Suggestion
« Reply #12 on: Jul 20th, 2004, 12:43pm »

I wonder what is meant by a 'neural network simulator'. I don't think many people would consider building a real physical artificial neural network (consisting of physical nodes and links); it's usually software, and inherently a serial simulation of a parallel system.

TenaliRaman
Re: Neural Suggestion
« Reply #13 on: Jul 21st, 2004, 3:12pm »

Yes, of course it's a software simulator, not a hardware one. Not the best project around, but at least I can get my basics right here. *sigh*

towr
Re: Neural Suggestion
« Reply #14 on: Jul 21st, 2004, 3:38pm »

on Jul 21st, 2004, 3:12pm, TenaliRaman wrote: "Yes, of course it's a software simulator, not a hardware one."

My point was more that all ANNs in use are already simulated; you can hardly make one that isn't a simulation. So I wonder what explicitly calling it an NN simulator means; I suppose it must distinguish itself in some way. There are many things it could mean. You could make a nice graphical representation of how neurons fire and how activity passes from one neuron to the next. Or you could simulate how noise on the input, or destruction of neurons, affects performance (see the sketch at the end of this post). Any number of aspects you could work with, really.

Quote: "Not the best project around, but at least I can get my basics right here. *sigh*"

There's no reason why a simulator of anything shouldn't be able to do the equivalent of the real thing. I'd be quite willing to accept the simulation of a human mind as a real intelligence if it passed the Turing test and such things. (And considering the brain is a neural network, you can always try to simulate that as your project.)

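A noise experiment of that kind could be as small as the sketch below (net.predict and the data format are hypothetical placeholders for whatever interface the simulator ends up having):

```python
import random

def accuracy_under_noise(net, data, sigma):
    """Fraction of items still classified correctly after adding
    Gaussian noise (standard deviation sigma) to every input."""
    correct = 0
    for inputs, label in data:
        noisy = [x + random.gauss(0.0, sigma) for x in inputs]
        if net.predict(noisy) == label:  # hypothetical simulator call
            correct += 1
    return correct / len(data)
```
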
TenaliRaman
Re: Neural Suggestion
« Reply #15 on: Jul 23rd, 2004, 7:32am »

Nah, it need not be unique (though I wish it were!). We will be implementing some already-known models and learning techniques, and testing and analysing some results. I guess my guide doesn't want to overburden me, my semester syllabus being a bit bulky this year.

Quote: "(And considering the brain is a neural network, you can always try to simulate that as your project.)"

Hehe! I will try.

TenaliRaman
Re: Neural Suggestion
« Reply #16 on: Aug 22nd, 2004, 7:28am »

HELP!!!

Two weeks ago I finished with whatever I could read up on neural networks. I got the general feel of what I am expected to do, so I thought I'd better start with the implementation. I spent two weeks designing classes (yes, two weeks; I have other work too, you know). In those two weeks I considered various different implementations, with all their pros and cons, and I finally settled on one two days ago and decided to implement it. The model was simple as well as (to some extent) generic, so it worked out fine, and in no time I could develop a simple multi-layer feed-forward network where each neuron has a sigmoid transfer function. I was overjoyed!

I then decided to hack in the back-propagation learning algorithm, and this is where everything went wrong. I think I have implemented it correctly, but it's not learning anything; I mean, the output is absolute nonsense. After two days of banging my head over it and wasting a beautiful weekend on this, I came to the conclusion that I was making some sort of idiotic mistake in the BPN algorithm. I looked into the books: the two books that I currently have with me discuss weight changing for just three layers, i.e. input-hidden-output. (I have not implemented a simple 3-layer network with BPN yet, but if nothing works I might implement that and see what happens.) In any case I thought it must be easily extensible, but it's not working. I searched the net, but most of the lecture notes discuss simple 2-layer or 3-layer BPN networks.

I need help!!! What should I do?

towr
Re: Neural Suggestion
« Reply #17 on: Aug 22nd, 2004, 10:54am »

I suppose the first question to ask is: what sort of function are you trying to learn? And of course, haven't you set the learning speed too high (or too low; both can be catastrophic)?

TenaliRaman
Re: Neural Suggestion
« Reply #18 on: Aug 22nd, 2004, 1:09pm »

I tried to implement the standard diagram given for XOR (the 3-layer one). It should work, since all the literature claims so, but mine does not, so I am definitely making some goddamn mistake in my BPN algorithm; I just don't know where the problem is. I kept the learning factor at 0.1.

towr
Re: Neural Suggestion
« Reply #19 on: Aug 22nd, 2004, 1:46pm »

Do you have a bias at each node as well? I'm not sure if it's strictly necessary, but it can help. (And the diagram of the XOR network I have on my wall has it.) Aside from that, there's little I can say without seeing your implementation of the algorithm.

TenaliRaman
Re: Neural Suggestion
« Reply #20 on: Aug 22nd, 2004, 2:09pm »

No, I haven't given any bias yet, but the results aren't encouraging enough for me to tell whether the bias might help. I wanted to show the implementation, but it's distributed across different classes, so I didn't know how to put it up here. Should I attach the class files?

towr
Re: Neural Suggestion
« Reply #21 on: Aug 22nd, 2004, 2:16pm »

You could zip them together and attach them as a zip file.

towr
Re: Neural Suggestion
« Reply #22 on: Aug 23rd, 2004, 1:53am »

After sleeping on it, I'm positive the bias is essential for the network to work. You can't even get a proper AND or OR without it; just try to construct them by hand. Given inputs x and y, what you want is something like AND = f(x + y - 1.5), such that x + y - 1.5 is positive when x and y are both 1 (so after the sigmoid function the output is higher than 0.5, and thus '1'), and negative otherwise (so after the sigmoid function it's lower than 0.5, and thus '0'). Without the bias of about -1.5 you can't get that. (The exact weights here don't matter very much; x and y are equally important, so they should have the same weight, and essentially increasing the weights just contracts the x-axis of the sigmoid function, making it more and more like a step function.)

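A quick sanity check of that construction in code (a sketch; the unit weights and the -1.5 bias follow towr's example):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def and_gate(x, y):
    # weight 1.0 on both inputs, bias -1.5: only x = y = 1 gives a positive sum
    return 1 if sigmoid(1.0 * x + 1.0 * y - 1.5) > 0.5 else 0

for x in (0, 1):
    for y in (0, 1):
        print(x, y, and_gate(x, y))  # prints 1 only for input (1, 1)
```

Without the bias term, AND would need each weight alone to push the output below 0.5 (so both weights negative) while their sum pushes it above 0.5 (so their sum positive), which is impossible.
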
TenaliRaman
OK, I will try the bias; I need some time to work out where and how to place it now. Probably I will be placing it in the neural layer. Anyway, I have attached the zip file. Sorry though, I have not given any comments. I will attach a small figure I just drew in the next post, which should clarify what I am doing.

TenaliRaman
OK, here is the diagram...
