Monday, September 29, 2008

A working AI: Mission Accomplished??

I'd taken a break from all the AI business for a few weeks. I'm surprised at how much I got done working with a clear head. I've run some initial tests of my concept for the Biologically Based (but not Weight-Based) Neural Network. The results look unbelievably amazing. The program starts out with absolutely no info and then learns everything quickly and efficiently. Here's what I've added to the log (read Concept AI part 1 and 2 first).

******

Miller's Law: A person can only keep 7 plus or minus 2 items in mind at one time.

http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#Introduction%20to%20neural%20networks

Firing rules to figure out whether a neuron fires based upon the input.

http://www.spiegel.de/international/spiegel/0,1518,466789,00.html

A nerve cell opens or closes its ion channels before it fires.

*Going back to some more basic research about the fundamentals of the brain itself.

http://www.enchantedlearning.com/subjects/anatomy/brain/Neuron.shtml
The brain has about 100 billion neurons. Then there are glial cells, which provide support (what kind?). Neurons differ in size.
Neuron: cell body, dendrites (signal receivers), and projections (the axon, which conducts the nerve signal). Axon terminals transmit electro-chemical signals across the gap known as the synapse.
Dendrites bring information to the cell body; the axon takes information away from it. Bundles of axons are called nerves, nerve tracts, or pathways. Dendrites branch from the cell body and receive messages.
One neuron has about 1,000-10,000 synapses, meaning it communicates with that many other neurons.

Glial cells: cells in the nervous system that don't carry nerve impulses. They basically manufacture stuff for the neurons.

Basically:
Dendrites: Input Connections (one or more)
Axon: Output Connection (only one). So it's like a linked list where the last element connects to the first element. One axon can speak to multiple dendrites on the receiver.
Axon Terminals: (more than one)
Synapse: Space between input & output connection

Hebb's Rule: Cells that fire together, wire together.
A synapse's strength depends on the number of Ion-Channels it has.
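Putting those notes into code form, the node structure I'm picturing is something like this (a minimal Java sketch with illustrative names, not my actual source):

import java.util.HashMap;
import java.util.Map;

class Node {
    final String name;                   // the concept this node stands for
    double charge = 0.0;                 // current activation level
    // Axon side: one output fanning out to many receiving dendrites,
    // each synapse with its own strength (the "ion channel count").
    final Map<Node, Double> synapses = new HashMap<>();

    Node(String name) { this.name = name; }

    // Hebb's Rule: cells that fire together, wire together.
    void fireWith(Node receiver) {
        synapses.merge(receiver, 0.1, Double::sum);  // 0.1 is an arbitrary step
    }
}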

9-26-08
6:47

It works! I went through the previously non-working AI program and reworked the whole thing. It's version 0.03 now. I'm amazed that it was able to work so efficiently when I put the charges concept into it. There are still many lil bugs to work out. I've just tested it with some letter comparisons. The current data set is just two sets of three letters. It goes like this:
a b
b c
/end
d e
e f
/end
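For context, reading a data set like that in amounts to roughly the following; the Trainer class and its details are made up for illustration, not the real v0.03 code. Each line links its adjacent tokens, and /end just closes the statement:

import java.util.HashMap;
import java.util.List;
import java.util.Map;

class Trainer {
    final Map<String, Node> nodes = new HashMap<>();

    Node node(String name) {
        return nodes.computeIfAbsent(name, Node::new);  // create on first sight
    }

    void train(List<String> lines) {
        for (String line : lines) {
            if (line.equals("/end")) continue;          // closes a statement
            String[] tokens = line.toLowerCase().split("\\s+");
            for (int i = 0; i + 1 < tokens.length; i++)
                node(tokens[i]).fireWith(node(tokens[i + 1]));  // link neighbors
        }
    }
}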

Then I can go in, reset the charges, and add another connection linking the two node-network segments together into one:

/rnodecharges
c d

Before those statements, a connection to node a would activate nodes b and c, but after the statements, it activates b, c, d, e, and f (with smaller charges as you go further down the line).
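To make that concrete, here's a sketch of how I picture the charge spreading; the decay constant and the names are illustrative guesses, not the real values:

import java.util.Map;

class Network {
    static final double DECAY = 0.55;       // fraction passed per hop (a guess)
    static final double THRESHOLD = 0.001;  // charges below this die out

    void activate(Node start) {
        spread(start, 1.0);
    }

    private void spread(Node n, double incoming) {
        n.charge += incoming;               // charges accumulate on each node
        for (Map.Entry<Node, Double> syn : n.synapses.entrySet()) {
            double passed = incoming * DECAY * syn.getValue();
            // the threshold also stops loops, as long as DECAY * strength < 1
            if (passed > THRESHOLD) spread(syn.getKey(), passed);
        }
    }
}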

The main bug now is that the connection between c and d is looking too strong. IDK. Maybe it's supposed to be like that. Node d is emitting a charge similar to that of node a.

10:59. After examining it a bit more, I've found and fixed some more bugs. I'm using a new data set now. It goes like this:

Greek is spoken in Greece
/end
Greece is in England
/end
The Greeks created the Olympics
/end
Athens is the capital of Greece
/end
Sparta is the enemy of Greece
/end
Where refers to the places like Greece Athens and Sparta
/end

After clearing the charges, I ask it:
where is Greek spoken
/end

And I got the result:
/end
/charges
greek 1.0
is 1.0
spoken 1.0
in 2.4935020205414458
greece 3.4648806640625
england 1.4565926569239502
the 0.9368278391792345
greeks 0.43610654105691715
created 0.6253771713864278
olympics 0.6719345918743638
athens 1.8224265625
capital 1.4442937950131418
of 1.4180175011313327
sparta 1.6934484375000003
enemy 1.3801804402274132
about 1.2319124453341073
spartans 1.3034834714624521
where 1.0
refers 0.55
to 0.72875
places 0.75366328125
like 0.7785765625000001
and 0.85331640625

This CLEARLY gives you Greece as the answer to the question. It's amazing how quickly it can learn. All of this was achieved with a minimal amount of processing and with NO INFORMATION (except the /end and the other /commands) hard-coded in. Such clear results with just a single repetition!
I think I'm onto something here. I'm now trying to incorporate larger and larger datasets to see how it'll handle them. Oh, by the way: the dataset is completely made up. I have no clue what the capital of Greece is. This is just an example.
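In case the flow isn't obvious, the whole query step amounts to something like this (another illustrative method for the Network sketch above, not the real code):

import java.util.Collection;
import java.util.List;

// another method for the Network sketch:
String ask(Collection<Node> all, List<Node> question) {
    for (Node n : all) n.charge = 0.0;          // what /rnodecharges does
    for (Node q : question) activate(q);        // fire every question word
    Node best = null;
    for (Node n : all) {
        if (question.contains(n)) continue;     // skip the question words
        if (best == null || n.charge > best.charge) best = n;
    }
    return best == null ? "?" : best.name;      // "greece" in the run above
}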

9/27/08
11:34 AM
I'm now working with a data set of 40 nodes creating 221 connections (which becomes 185 when sleep is applied with a 0.001 charge cutoff). A basic version of the Sleep method has been created to clean up the weak connections and increase efficiency. I've also put in a Feedback system (+ or - depending on the answer) that strengthens or weakens a node based upon the feedback. It's still looking amazingly good. Will keep updating!
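Roughly, the two new pieces look like this; again just a sketch with made-up names, not the real methods:

import java.util.Collection;

// two more methods for the Network sketch:
void sleep(Collection<Node> all, double cutoff) {
    // prune synapses weaker than the cutoff (0.001 in the run above)
    for (Node n : all)
        n.synapses.values().removeIf(strength -> strength < cutoff);
}

void feedback(Collection<Node> all, Node answer, double delta) {
    // delta > 0 for a good answer, delta < 0 for a bad one
    for (Node n : all) {
        Double s = n.synapses.get(answer);
        if (s != null) n.synapses.put(answer, Math.max(0.0, s + delta));
    }
}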
1:47 PM
Grammar is the real trouble here. Grammar words like of, and, or, an, a, the, etc. are connected to a whole bunch of different nodes, so they create some unnecessary charges.

An idea for improvement:
Greece refers to a place
Where refers to place and Greece, China, England, America, India
Place refers to Greece
So rework the system so:
Where refers to place
Place refers to India, Greece, China, England, America, etc.

This reduces the number of connections. Adding this to the sleep system will allow you to delete the unnecessary (unacceptably weak) connections and the looping connections.
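Sketched out, that rework would be something along these lines (a made-up helper, not actual code):

import java.util.List;

// made-up helper: reroute a word's direct links through a single hub node
void factorThroughHub(Node source, Node hub, List<Node> targets) {
    for (Node t : targets) {
        Double direct = source.synapses.remove(t);   // drop where -> greece etc.
        if (direct != null) hub.synapses.merge(t, direct, Double::sum);  // keep it on the hub
    }
    source.synapses.merge(hub, 1.0, Double::sum);    // where -> place
}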
I'm still amazed at its efficiency though. It's able to comprehend simple inputs like:
My name is Cobalt
Your name is Ron New
What refers to Name
What is My name
?
Cobalt
What is your name
?
Ron New

Ron New is New Ron with the word order flipped, and NewRon is just "neuron" with a w instead of a u.
I've now moved it up a notch to read files. Each sentence must be put on a different line, or else it gets REALLY slow.
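The file-reading step itself is nothing fancy; roughly this, building on the Trainer sketch from before (the file name is a stand-in):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

class Loader {
    public static void main(String[] args) throws IOException {
        // one sentence per line, as noted above ("dataset.txt" is made up)
        List<String> lines = Files.readAllLines(Paths.get("dataset.txt"));
        new Trainer().train(lines);
    }
}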
The current data set is:
184 Nodes with 1678 connections!
The system resources taken up (at the start of the program):
Java.exe PID: 1472 CPU: 00% CPU Time: 00:00:03 Mem Usage: 348 K

This fluctuates from as low as 384 KB to as high as 10,000 KB. System.gc() is called a few times in the program so that the memory is used wisely.

The network is working, but it still needs work. I'm going back to more research to apply new concepts to the network and improve it.

http://www.learnartificialneuralnetworks.com/
Action potentials are the electric signals that neurons use to convey information to the brain. All these signals are identical. Therefore, the brain determines what type of information is being received based on the path that the signal took. The brain analyzes the patterns of signals being sent and from that information it can interpret the type of information being received.

It turns out one aspect of my program, the weakening of signals, is not consistent with the biological system. "There are uninsulated parts of the axon. These areas are called Nodes of Ranvier. At these nodes, the signal traveling down the axon is regenerated. This ensures that the signal traveling down the axon travels fast and remains constant (i.e. very short propagation delay and no weakening of the signal)."

Instead of letting one node send a pretty much equal amount of strength to each of its connected nodes, how about a system where the signal strength is determined by percentage? This would mean that all charges must originate from the initial charges. The output would then be smaller numbers. This might be good or this might be bad. Let's find out!
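For the record, the percentage variant looks roughly like this: each node splits its incoming charge among its synapses in proportion to their strengths, so the shares always sum to the incoming amount (a drop-in replacement for the spread sketch above):

// a drop-in replacement for spread() in the Network sketch:
private void spreadByShare(Node n, double incoming) {
    n.charge += incoming;
    double total = 0.0;
    for (double s : n.synapses.values()) total += s;   // sum of outgoing strengths
    if (total == 0.0) return;                          // dead end
    for (Map.Entry<Node, Double> syn : n.synapses.entrySet()) {
        double share = incoming * (syn.getValue() / total);  // proportional split
        if (share > THRESHOLD) spreadByShare(syn.getKey(), share);
    }
}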

Nope, that doesn't work. I don't know why, but it doesn't work. I think I was just lucky hitting the target on that first try after the break. Some small, ever so minute changes can sometimes throw the system off.

*********************

Mission Accomplished?? Not exactly. Answering one question (will this work?) leads to a thousand new questions. I'm creating a hand-made knowledge base for it to work off of (some basic grammar concepts and general knowledge). The program can handle any information in any language that you teach it! The charges concept has been accomplished (sorta), so now I have to work on some other concepts to make the system independent. It has to take the answer and then figure out how it's going to be applied or presented as an answer. There are hundreds of applications for the charge-based neural network concept. I assure you, as soon as I'm done with a stable version of the program, it'll be released as an Open Source Program :) I'm planning on releasing it as part of my other project, known as OSCEAN (Open Source Content Environment and Abstract Network), which will put the AI to work in a few logical applications. But until then, the actual code will have to be kept under wraps. Just for the thrill of suspense when it's finally released!

But now, please reply with some other types of tests that you want me to run on the system and I'll post the results. Wish me & Ron (NewRon, get it?) luck!

Saturday, September 27, 2008

Levels Of Thinking

1. When your environment is unfavorable, you spend most of your time thinking about the basic needs for survival: food, water, air.

2. When you have food, water, and air, you think about other things that make life easier: electricity, technology, science, mathematics.

Suppose that goal has been achieved as well, and somehow everyday life was made as easy as it possibly could be. Let's say we have robots that cook for us, clean for us, farm for us, etc. We won't have to do anything ourselves.
3. This will lead us to a new level of thinking that'll bring forth a new era in human history. I'm not sure what that will be, though.

But, will we want to be just idle humans?? Will we want to waste our lives consuming without producing? I believe that boredom makes us human. We are not satisfied by what we have. We feel that we are endowed with a higher purpose to do something else. Always something else. So it just may be that we humans will never reach that 3rd level of thinking. But an artificially intelligent program, with no needs, goals, or desires, might be able to...

Sunday, September 21, 2008

Still here!

Yep, still here! I've been caught up with school for the past few weeks, and Hurricane Ike left us without power for 7 days! I also had the whole week off from school, but without power there was nothing else to do except think (like the good ol' days. Actually think for a while!). I'll be posting all about it in my next post. Stay put!