1.05:  Thu Jun 20 08:47:31 EDT 2002
   - jettero removed that patch

1.04p2:  Thu Jun 20 08:41:36 EDT 2002
   - jettero made this lib a bunch more portable

1.04:  Mon Oct 30 13:59:55 EST 2000
   - jettero fixed some pragma trouble in apps/displays/

1.03:  Sun Oct 29 12:53:56 EST 2000
   - jettero found a boo-boo in the I command, and fixed a
     dependency bug.

1.02:  Sun Oct 29 12:52:05 EST 2000
   - jettero made some giant changes to the compilation
     dependencies. Mmm... good

1.01:  Sun Oct 29 11:55:06 EST 2000
   - jettero got the configure stuff decked out
   - jettero got autoconf to go in the top src-dir
   - jettero introduced his newer version of the I command
     and (somewhat more importantly) gave the jneural
     project a version. ;)

Sun May  7 10:30:21 EDT 2000
  - jettero updated
    synaptic_group::query_current_dendrilitic_output().
    Kirill Erofeev (vertex@baga.ac.net.ru) observed that it
     was returning the firing output rather than the
    dendrilitic output for the given arc.  As far as I can
    tell, the function isn't used, since it's usually the
    query_output for the neuron containing the weighted
    arcs.  Still, perfection is glory.  ;)

Sat Apr 15 20:44:18 EDT 2000
  - jettero made update_tmpl (a perl program btw) deal with
    the new style of web page Jet's got goin'.

Thu Jan 13 23:26:22 EST 2000
  - jettero added a normalizer object that's unused and
    undocumented.

Fri Jan  7 10:08:47 EST 2000
  - jettero added a has_a_next_matrix() function to the
    matrix_reader class, and updated the documentation
    accordingly

Sat Jan  1 11:34:56 EST 2000
  - jettero changed the last_error display in grid_w to a
    first-prediction.

Fri Dec 31 16:24:16 EST 1999
  - jettero re-did the add_element function in matrix.cpp.
    It was really really slow

Sun Dec 26 09:54:07 EST 1999
  - jettero added last_error and hit_wall displays to
    grid_w.

Sat Dec 25 16:45:53 EST 1999
  - jettero was not satisfied with the output of a
    sarsa-prop net using sigmoid/bipolar-sigmoid as the
    transfer function in the output layer. Typically the
    predictions should be much higher than 1 (or lower than
    -1/0). He would very much have liked to use the linear
    weighted sums (pure linear) transfer function for the
    output layer. The nets showed divergent behaviours all
    too often with that transfer function, so... Jet created
    Jet's-Bipolar-Sigmoid. It's basically linear (y=5x) for
    values (-900,900), but becomes sigmoidal beyond that.
    This prevents the divergent behaviour that's been
     plaguing him for the past few days.
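
The entry above pins down the linear region exactly; here is a minimal
sketch of such a transfer function, assuming a tanh tail for the
sigmoidal part (the actual jneural constants and saturation curve may
differ):

```cpp
#include <cassert>
#include <cmath>

// Sketch of a "mostly linear, sigmoidal at the extremes" transfer
// function: y = 5x on (-900, 900), saturating smoothly beyond that.
// The slope and knee come from the changelog entry; the tanh tail is
// an assumption about how the jneural version actually saturates.
double jets_bipolar_sigmoid(double x) {
    const double slope = 5.0, knee = 900.0;
    if (x > -knee && x < knee)
        return slope * x;                      // pure linear region
    double sign = (x < 0.0) ? -1.0 : 1.0;
    double excess = std::fabs(x) - knee;       // distance past the knee
    // continuous at +/-knee, asymptotes at +/-(slope * knee + slope)
    return sign * (slope * knee + slope * std::tanh(excess));
}
```

Because the tails flatten out, the weighted sums can wander far past the
knee without producing the huge deltas that make a pure-linear output
layer diverge.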

Fri Dec 24 14:43:23 EST 1999
  - jettero got walker to stop diverging. Alpha was set way
    too high...

Fri Dec 24 13:57:50 EST 1999
  - jettero got walker and grid_w to sort of work with the SUM
    transfer function in the output layer. This is
    advantageous because the prediction with SIGMOID is
    often very far off simply because the output must be
    between 0 and 1. Walker does occasionally diverge when
    the output layer is set to SUM.  I've not yet seen
    grid_w diverge with the output layer set to SUM... the
    only side effect I've seen so far, is _really accurate_
    learning.

Fri Dec 24 12:06:54 EST 1999
  - jettero made the strict_overflow_checking actually do
    some overflow/underflow checking ... all is well. Man,
    that class is slow... use it only for debugging
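
The idea behind the checked type can be sketched as a drop-in for
`typedef double real` (hypothetical names; the real jneural class
surely overloads many more operators):

```cpp
#include <cassert>
#include <cmath>
#include <stdexcept>

// Hypothetical sketch of an overflow-checking replacement for
// `typedef double real`: every construction validates the value, so
// overflow/underflow is caught at the arithmetic step that caused it.
// All that checking is exactly why it's slow and debug-only.
class checked_real {
    double v;
public:
    checked_real(double d = 0.0) : v(d) {
        if (!std::isfinite(d))
            throw std::overflow_error("real: overflow/underflow detected");
    }
    operator double() const { return v; }
    checked_real operator+(checked_real r) const { return checked_real(v + r.v); }
    checked_real operator*(checked_real r) const { return checked_real(v * r.v); }
};
```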

Thu Dec 23 12:15:25 EST 1999
  - jettero installed a new flag
    USE_STRICT_OVERFLOW_CHECKING. Strictly speaking, it
    doesn't actually do any overflow checking... yet. The
    real class is about 30% slower than the real typedef, so
    ya need to be able to turn it off.

Thu Dec 23 12:14:35 EST 1999
  - jettero finished the real class.

Thu Dec 23 01:12:05 EST 1999
  - jettero created a new class real. It is turned off right
    now (it seems to do math wrong), soon it will enable him
    to find the overflows before they happen! Happy Happy
    Joy Joy.

Wed Dec 22 12:27:27 EST 1999
  - FTD debugged grid_w and made it more "sarsa like".  He
    also added a handful of functions to the sarsa.cpp.
    grid_w seems to be much more intelligent now. ;) A lotta
    work went into this step.

Wed Dec 22 02:03:45 EST 1999
  - jettero re-did the bipolar sigmoid.... it was all messed
    up

Tue Dec 21 11:58:34 EST 1999
  - jettero created and documented two new functions for
    both backprop and sarsa:
    set_transfer_function_for_output() and
    set_transfer_function_for_hidden()

Tue Dec 21 11:00:52 EST 1999
  - jettero keeps spotting all kinds of bugs... not sure what
    to think right now.

Tue Dec 21 09:58:15 EST 1999
  - jettero redid irand and rrand again.  This is the last
    time.

Tue Dec 21 09:33:48 EST 1999
  - jettero redid the debugging messages to print to stderr,
    so he could trace them for the grid_w example

Mon Dec 20 18:05:59 EST 1999
  - jettero added assloads of colors to the grid walker
    example.

Mon Dec 20 18:05:04 EST 1999
  - jettero changed the epsilon_policy for the grid walker
    ... woopsie, it was randomly picking left and right
    still (cut and paste mistake).  That's actually north
    and east ... which explains why it kept getting stuck in
    the upper lefthand corner.

Mon Dec 20 16:42:14 EST 1999
  - jettero finally debugged the display for gridworld ...
    Also, he debugged his grid_w program to run correctly
    ... it's still really stupid. *shrug*

Mon Dec 20 15:59:48 EST 1999
  - jettero added color to d_grid_w.cpp (the gridworld
    display object).

Mon Dec 20 13:47:04 EST 1999
  - jettero made irand() and rrand() (in walker.cpp and
    grid_w.cpp) work better.  FTD pointed out that they were
    stupidly wrong.
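
A classic way such helpers end up "stupidly wrong" is modulo or
off-by-one scaling of rand(); a corrected pair might look like this
(the names come from the changelog, the bodies are an assumption):

```cpp
#include <cassert>
#include <cstdlib>

// irand(n): uniform int in [0, n).  Dividing by RAND_MAX + 1.0 keeps
// the result strictly below n and avoids the low-bit bias of rand() % n.
int irand(int n) {
    return (int)((double)std::rand() / ((double)RAND_MAX + 1.0) * n);
}

// rrand(lo, hi): uniform double in [lo, hi).
double rrand(double lo, double hi) {
    return lo + ((double)std::rand() / ((double)RAND_MAX + 1.0)) * (hi - lo);
}
```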

Mon Dec 20 12:02:07 EST 1999
  - jettero added a weight_reinitializer to sarsa. Didn't
    help... the grid walker is still really stupid.

Mon Dec 20 10:20:27 EST 1999
  - jettero learned from FTD that our softmax is _not_
    softmax at all. The name has been changed to protect the
    innocent.
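
For reference, genuine softmax (Boltzmann) action selection turns a set
of action values into probabilities; whatever the misnamed jneural
version did, this is what the name promises:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// True softmax over action values q, with temperature tau.
// Subtracting the max before exponentiating is a standard trick to
// avoid overflow; it does not change the resulting probabilities.
std::vector<double> softmax(const std::vector<double>& q, double tau = 1.0) {
    double m = q[0];
    for (double v : q) if (v > m) m = v;
    std::vector<double> p(q.size());
    double z = 0.0;
    for (size_t i = 0; i < q.size(); ++i) {
        p[i] = std::exp((q[i] - m) / tau);
        z += p[i];
    }
    for (double& pi : p) pi /= z;   // normalize so the p[i] sum to 1
    return p;
}
```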

Sun Dec 19 22:38:23 EST 1999
  - jettero made gridworld go. He's pretty sure he's doing
    something wrong, cuz his grid walkers are really stupid.

Sun Dec 19 12:59:40 EST 1999
  - jettero discovered and removed a superfluous reward
    calculation at the end of the while loop in walker.cpp.

Sun Dec 19 11:59:28 EST 1999
  - jettero added this progress list to the
    web-page-posting-code.

Sun Dec 19 11:38:02 EST 1999
  - jettero got the grid world display objects
    working--featureless but working.

Sun Dec 19 01:33:42 EST 1999
  - jettero started working on the gridworld display
    objects

Sat Dec 18 21:13:36 EST 1999
  - jettero is making room for gridworld.  Another directory
    was created. It's the displays directory, off the apps
    directory ... guess what it contains?

Thu Dec 16 12:53:00 EST 1999
  - jettero experimented with linear activations in the
    output layer. Sarsa and Backprop overflow with this
    setting. Now what could be causing that?

Thu Dec 16 12:29:01 EST 1999
  - jettero installed FTD's so-called softmax epsilon tests
    in the walker

Thu Dec 16 12:10:45 EST 1999
  - jettero made the curses for walker center the display.

Thu Dec 16 03:46:14 EST 1999
  - jettero learned ncurses last night, and coded a kick-ass
    graphics interface for the walker.

Wed Dec 15 10:33:59 EST 1999
  - jettero did a buncha optimization.

Wed Dec 15 10:25:55 EST 1999
  - jettero added FTD's query_last_error() to sarsa.
    Functions by request? This is getting weird.

Wed Dec 15 02:01:38 EST 1999
  - jettero documented the new save/restore features.

Wed Dec 15 00:43:15 EST 1999
  - jettero noticed that two nets created in rapid
    succession have the same weights (since they have the
    same random seed). I made the seeds different from
    object to object.
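
The fix described above can be sketched like so (a hypothetical helper;
the point is that seeding from time(0) alone gives identical seeds to
objects built within the same second, so something per-object has to be
mixed in):

```cpp
#include <cassert>
#include <ctime>

// Seed helper: XOR the clock with a per-call counter scrambled by a
// multiplicative hash constant, so two nets constructed in rapid
// succession still get different seeds.
unsigned make_seed() {
    static unsigned counter = 0;
    return (unsigned)std::time(0) ^ (++counter * 2654435761u);
}
```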

Wed Dec 15 00:42:31 EST 1999
  - jettero implemented net-save features.

Tue Dec 14 12:25:19 EST 1999
  - jettero documented the sarsa code.

Tue Dec 14 10:25:40 EST 1999
  - jettero added an assload of "features" to the walker.cpp
    example

Tue Dec 14 01:51:26 EST 1999
  - jettero had to debug the walker.cpp a bunch. He's
    convinced it actually learns stuff now. Further, if it's
    not trained enough, it doesn't learn very well. If it's
    not trained at all, the results are random. All is as it
    should be. Off to the documentation next.

Mon Dec 13 23:04:59 EST 1999
  - jettero reformatted FTD's stuff to be more jet-style.
    It is the Jneural lib after all... ;)

Mon Dec 13 23:04:38 EST 1999
  - FTD debugged the hell outta sarsa, and the example.  Jet
    tried to figure it out his way, but gave up...  No sense
    in it if there's already a solution worked out.

Mon Dec 13 13:34:47 EST 1999
  - jettero worked on the first sarsa test program (the
    random walk). It's not working so well...

Mon Dec 13 10:59:10 EST 1999
  - jettero "finished" implementing sarsa. Now on to the
    debugging.  Apparently there's an error in the
    pseudo-code.  FTD is helping to nail that down.

Sun Dec 12 17:07:47 EST 1999
  - jettero worked on sarsa for a long time

Sun Dec 12 14:00:38 EST 1999
  - jettero changed all the function/variable names
    s/error/delta/g.  This action was prompted by the fact
    that delta isn't error in either backprop or sarsa.
    Though, in sarsa it really really isn't error.

Sun Dec 12 09:58:56 EST 1999
  - jettero moved transfer.h to the utils directory, and
    updated the docs accordingly.

Sun Dec 12 00:05:18 EST 1999
  - jettero and FTD finished the research on sarsa(lambda).
    The next net is practically coded. Thanx to The Flair
    for his help. Later, the documentation will explain that
    sarsa is _not_ a neural net. The Jneural implementation
    of sarsa will use backprop as its "function
    approximator."

Sat Dec 11 18:00:00 EST 1999
  - FTD translated the pure theory for Sarsa(lambda) into
    the pure theory for Sarsa with Backpropagation.  Then,
    he translated the pure theory for Sarsa with Backprop
    into the pseudo-code for it.  Jet listened...  mostly...
    and began thinking about the Jneural implementation of
    the pseudo-code.
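
The core of that pseudo-code is the Sarsa update, sketched here in its
tabular form (jneural replaces the table lookup with a backprop net as
the function approximator, so the details differ):

```cpp
#include <cassert>

// One Sarsa step: Q(s,a) += alpha * (r + gamma * Q(s',a') - Q(s,a)).
// The bracketed term is the TD "delta" -- which, as a later entry
// notes, really isn't an error in the backprop sense.
double sarsa_update(double q_sa, double r, double q_next,
                    double alpha, double gamma) {
    double delta = r + gamma * q_next - q_sa;
    return q_sa + alpha * delta;
}
```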

Thu Dec  9 10:31:39 EST 1999
  - jettero debugged transfer functions for an hour or so.
    They worked fine... it was the error overflowing.
    *frown*

Tue Dec  7 10:29:17 EST 1999
  - jettero believes the group detection stuff, in
    kohonen.cpp, won't work for rectangular training.

Tue Dec  7 10:17:25 EST 1999
  - jettero had to add kohonen::set_matrix_size() so apps
    can set the size of the output layer's matrix.

Tue Dec  7 09:48:41 EST 1999
  - jettero added a rectangular_train() to kohonen.cpp

Tue Dec  7 09:48:08 EST 1999
  - jettero added query_above_neighbor() and
    query_below_neighbor() to neuron.cpp. Note that a call
    to set_matrix_size() (for the neuron's layer) is
    required to use either.

Tue Dec  7 09:42:50 EST 1999
  - jettero added above and below pointers to
    layer_node.cpp.  They can only be set by using
    set_matrix_size() (otherwise they are all 0's).
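
What set_matrix_size() implies can be sketched with plain indices: a
layer stored as a flat list of n nodes is treated as a grid of the
given width, and each node's above/below neighbor is one row away
(these helper names are hypothetical; -1 stands in for the null
pointers the entry mentions):

```cpp
#include <cassert>

// Neighbor lookup in a flat array viewed as a rectangular grid.
int above_neighbor(int i, int width) {
    return (i - width >= 0) ? i - width : -1;   // -1: top row, no neighbor
}
int below_neighbor(int i, int width, int n) {
    return (i + width < n) ? i + width : -1;    // -1: bottom row, no neighbor
}
```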

Tue Dec  7 09:24:48 EST 1999
  - jettero added a set_matrix_size() to layer.cpp, to
    support kohonen::rectangular_training.

Mon Dec  6 13:21:56 EST 1999
  - jettero fixed some nasty null_pointer errors in
    kohonen.cpp.

Mon Dec  6 11:21:31 EST 1999
  - jettero noticed huge problems with kohonen radii larger
    than 0. Also, the kohonen nets don't seem to group just
    right for the 3 font 7 letter problem in the apps
    directory.

Sun Dec  5 12:03:53 EST 1999
  - jettero made cluster_four use the new matrix reader.

Sun Dec  5 11:17:53 EST 1999
  - jettero "completed" the matrix_reader. It needs a couple
    more features before I'll talk about it much...

Sun Dec  5 08:50:00 EST 1999
  - jettero started working on a vector reader grammar.

Sat Dec  4 18:38:40 EST 1999
  - jettero started work on the vector reader class

Sat Dec  4 11:34:23 EST 1999
  - jettero made the kohonen linear radius thing work! It is
    very exciting (and completely untested).

Sat Dec  4 11:15:39 EST 1999
  - jettero added knowledge of the overlying layer_node to
    neurons (for the aforementioned neighbor knowledge)

Sat Dec  4 11:14:32 EST 1999
  - jettero added a query_left_neighbor() and a
    query_right_neighbor() to neurons.

Fri Dec  3 10:57:17 EST 1999
  - jettero forgot to make the bias units activation
    function linear-weighted-sum ... fixed. 

Thu Dec  2 14:19:12 EST 1999
  - jettero much improved the nmse calculation, though it's
    still an mse...

Thu Dec  2 10:49:36 EST 1999
  - jettero updated the documentation to show the new
    support.

Thu Dec  2 10:49:24 EST 1999
  - jettero added bias units to backprop nets

Thu Dec  2 10:49:18 EST 1999
  - jettero added bias unit support to arch

Wed Dec  1 16:55:40 EST 1999
  - jettero updated the docs to reflect the new transfer
    functions

Wed Dec  1 16:38:42 EST 1999
  - jettero added two new transfer functions that were
    actually coded by qi3ber.

Tue Nov 30 11:52:52 EST 1999
  - jettero finished the documentation. It has been
    spellchecked, but still needs to be checked for grammar.

Mon Nov 29 22:56:05 EST 1999
  - jettero made some major progress on the docs. They're
    almost filled in. I won't say done since they've not
    even been spell checked, much less read for grammar.

Sun Nov 28 16:10:39 EST 1999
  - jettero got a good headstart on the documentation.

Sat Nov 27 15:46:56 EST 1999
  - jettero created a method for initializing the weights in
    a customizable way.

Sat Nov 27 02:21:12 EST 1999
  - jettero got kohonen to converge as well as he's ever
    seen it converge. Neat net, but it's "wrong"
    surprisingly often.

Fri Nov 26 21:52:55 EST 1999
  - jettero got kohonen to almost work ... it doesn't
    converge right, but it's gettin' there.

Fri Nov 26 21:52:20 EST 1999
  - jettero found most of the weird pointer errors.  Man
    those were strange.  I had to do things like weights =
    0; while(weights) { weights = 0; }.  Later on I found
    the real problems ... long story.

Fri Nov 26 20:39:06 EST 1999
  - jettero changed a bunch of pointers around ... now
    there's _weird_ bugs ... hunting

Thu Nov 25 11:36:00 EST 1999
  - jettero added a query_weights() to synaptic_group.cpp,
    so the kohonen net can do its min() and max() of the
    D(J)s ... or whatever.

Thu Nov 25 10:05:00 EST 1999
  - jettero surgically removed the hidden_layer linked list
    from backprop.cpp. It never really belonged in there. It
    has been moved to arch. Now my new kohonen.cpp can use
    it. ;)

Wed Nov 24 13:52:58 EST 1999
  - jettero made the quicker forward propagation work! Now
    I'm as fast as the marty lib!

Tue Nov 23 12:16:40 EST 1999
  - jettero added a bunch of code to prevent the
    over-recalculation of forward propagation. ;) It's not
    quite finished though

Tue Nov 23 10:46:18 EST 1999
  - jettero tested the new layering support with xor.cpp.
    Works great... xor now runs with 2 hidden layers (of
    size 2 and 3).

Tue Nov 23 10:19:57 EST 1999
  - jettero created a backprop layer-linked-list to store
    all the wonderful and varied hidden layer variations.

Tue Nov 23 00:20:33 EST 1999
  - jettero created the sin.cpp example (it learns a small
    portion of the sin(x) curve).

Mon Nov 22 15:46:59 EST 1999
  - jettero created the examples dir, and the xor.cpp
    example was born.  All #includes now point to the funny
    dirs in /.../jneural/include.

Mon Nov 22 13:05:41 EST 1999
  - jettero reorganized the directories again. This is all
    fallin' inta place!

Mon Nov 22 09:03:17 EST 1999
  - jettero got the backprop net to converge on the correct
    answer. The net-error calculation is way wrong, it's
    gotta be. This stuff is working though. Good too, cuz he
    almost deleted the lib in frustration this morning.

Sun Nov 21 18:53:53 EST 1999
  - jettero did some makefile voodoo to make the DFLAGS
    (debug flags) match up.

Sun Nov 21 18:00:20 EST 1999
  - jettero noticed something horribly wrong with the
    transfer_dot(). Fixed.

Sun Nov 21 16:13:40 EST 1999
  - jettero removed all debugging message code. He's gonna
    start it from scratch, there was too much info to be
    able to read it.

Sun Nov 21 15:35:36 EST 1999
  - The nets seem to overflow weirdly.  Clearly something is
    monstrously wrong, but I can't figure out what.  More
    testing and hunting is needed.

Sat Nov 20 14:37:50 EST 1999
  - jettero added weight change functions
  - jettero added a typedef double real;  // idea stolen
    from marty franz

Sat Nov 20 11:05:23 EST 1999
  - jettero believes the weighted sums, and the backprop of
    the error are being calculated correctly.

Sat Nov 20 10:55:45 EST 1999
  - created the 'I' command in the parent dir...  It's
    startlingly similar to the 'I' command built into
    LPC environments (particularly lima).  That will explain
    the 'jettero' at the beginning of most of the next
    progress entries.

Fri Nov 19 18:59:31 EST 1999
  - Huge problems with the weighted sums ... apparently the
    pointers in synaptic_group::axons_fire_at aren't getting
    set right ...  so they appear to be empty

Fri Nov 19 17:49:32 EST 1999
  - Ran into some trouble ... in order to backpropagate, the
    hidden neurons need to know who they're firing at.  The
    dendrites_group object became a synaptic_group ...  now
    the nodes in the list contain arc objects.  These arcs
    know who's firing and at whom, and each neuron knows
    who it's firing at, and who is firing at it.  Acky
    acky code ... :(  but it works!

Fri Nov 19 11:21:53 EST 1999
  - Started coding the backpropagation of the error ...

Fri Nov 19 09:44:25 EST 1999
  - Added error support.  Neurons know about their error.
    Layers can talk about a double array of their neurons'
    errors.

Thu Nov 18 11:19:10 EST 1999
  - added primitive layer connectivity support (e.g.
    hidden_layer->dendrites_touch(input_layer)).
  - added input/output stuffs... still no error support.  I
    think I'm scared of it.  It's clearly next, since I'm
    now in a position where I can test it.

Thu Nov 18 10:42:56 EST 1999
  - improved the include dirs hierarchy immensely
  - coded a basic non-working backprop object/header

Wed Nov 17 08:26:25 EST 1999 
  - improved the directory hierarchy a little.  Now there's
    a place to make nets.  There'll be an apps dir too ...
    soon as I get a net done.  'Course, to do that I'll have
    to figure out a clean generic way to handle weight
    adjustments.

Wed Nov 17 00:35:42 EST 1999
  - added layer objects.  They contain many neurons, so the
    neurons can be manipulated easily.

Tue Nov 16 23:30:54 EST 1999
  - made a set_transfer_functions() to set all the
    transfer's on one line.  
  - made a dendrites_touch() to set the connectedness of a
    neuron in one shot.

Tue Nov 16 17:04:10 EST 1999 Set up the first web page.

Tue Nov 16 12:03:24 EST 1999 I got a basic framework going.
  - Transfer functions and forward propagation are all set.
    There's no such thing as error or adjusting weights.
    For now, all weights are initialized to 1 (for testing).
