Archive-name: ai-faq/neural-nets/part5
Last-modified: 2000-11-28
URL: ftp://ftp.sas.com/pub/neural/FAQ5.html
Maintainer: email@example.com (Warren S. Sarle)
This is part 5 (of 7) of a monthly posting to the Usenet newsgroup comp.ai.neural-nets. See part 1 of this posting for full information about what it covers.
If you are using a small computer (PC, Mac, etc.) you may want to have a look at the Central Neural System Electronic Bulletin Board (see question "Other sources of information"). There are lots of small simulator packages. Some of the CNS materials can also be found at http://www.cs.cmu.edu/afs/cs.cmu.edu/project/ai-repository/ai/areas/neural/cns/0.html
Note for future submissions: Please restrict product descriptions to a maximum of 60 lines of 72 characters, in either plain-text format or, preferably, HTML format. If you include the standard header (name, company, address, etc.), you need not count the header in the 60 line maximum. Please confine your HTML to features that are supported by primitive browsers, especially NCSA Mosaic 2.0; avoid tables, for example--use <pre> instead. Try to make the descriptions objective, and avoid making implicit or explicit assertions about competing products, such as "Our product is the *only* one that does so-and-so." The FAQ maintainer reserves the right to remove excessive marketing hype and to edit submissions to conform to size requirements; if he is in a good mood, he may also correct your spelling and punctuation.
The following simulators are described below:
ftp retina.cs.ucla.edu [126.96.36.199]; Login name: sfinxftp; Password: joshua; directory: pub; files: README, sfinx_v2.0.tar.Z; Email info request: firstname.lastname@example.org
Currently supports backpropagation (vanilla, online, with momentum term and flat spot elimination, batch, time delay), counterpropagation, quickprop, backpercolation 1, generalized radial basis functions (RBF), RProp, ART1, ART2, ARTMAP, Cascade Correlation, Recurrent Cascade Correlation, Dynamic LVQ, Backpropagation through time (for recurrent networks), batch backpropagation through time (for recurrent networks), Quickpropagation through time (for recurrent networks), Hopfield networks, Jordan and Elman networks, autoassociative memory, self-organizing maps, time-delay networks (TDNN), RBF_DDA, simulated annealing, Monte Carlo, Pruned Cascade-Correlation, Optimal Brain Damage, Optimal Brain Surgeon, Skeletonization, and is user-extendable (user-defined activation functions, output functions, site functions, learning procedures). C code generator snns2c.
Works on SunOS, Solaris, IRIX, Ultrix, OSF, AIX, HP/UX, NextStep, Linux, and Windows 95/NT. Distributed kernel can spread one learning run over a workstation cluster.
SNNS web page: http://www-ra.informatik.uni-tuebingen.de/SNNS
Ftp server: ftp://ftp.informatik.uni-tuebingen.de/pub/SNNS
The software is available from two FTP sites: from CMU's simulator collection on pt.cs.cmu.edu [188.8.131.52] in /afs/cs/project/connect/code/unsupported/am6.tar.Z and from UCLA's cognitive science machine ftp.cognet.ucla.edu [184.108.40.206] in /pub/alexis/am6.tar.Z (2 MB).
An ALN consists of linear functions with adaptable weights at the leaves of a tree of maximum and minimum operators. The tree grows automatically during training: a linear piece splits if its error is too high. The function computed by an ALN is piecewise linear and continuous. It can learn to approximate any continuous function to arbitrarily high accuracy.
Parameters allow the user to input knowledge about a function to promote good generalization. In particular, bounds on the weights of the linear functions can be directly enforced. Some parameters are chosen automatically in standard mode, and are under user control in expert mode.
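The tree-of-operators idea described above can be illustrated with a small sketch (this is purely illustrative and is not Dendronic's code; all names here are hypothetical): linear functions sit at the leaves, and internal nodes take the maximum or minimum of their children, which always yields a continuous, piecewise-linear function.

```python
# Hypothetical sketch of ALN evaluation (illustration only, not the
# Dendronic implementation): a tree whose leaves are linear functions
# and whose internal nodes take the max or min of their children.
# The resulting function is piecewise linear and continuous.

def evaluate(node, x):
    """Evaluate an ALN node on input vector x."""
    kind = node[0]
    if kind == "linear":                     # leaf: w0 + w1*x1 + w2*x2 + ...
        weights = node[1]
        return weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
    children = [evaluate(child, x) for child in node[1]]
    if kind == "min":
        return min(children)
    if kind == "max":
        return max(children)
    raise ValueError("unknown node kind: %r" % kind)

# Example: |x| = max(x, -x), built from two linear leaves.
abs_tree = ("max", [("linear", [0.0, 1.0]),     # f(x) =  x
                    ("linear", [0.0, -1.0])])   # f(x) = -x
```

Here evaluate(abs_tree, [3.0]) gives 3.0 and evaluate(abs_tree, [-2.0]) gives 2.0. Training (splitting a linear piece when its error is too high) is omitted.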
The program can be downloaded from http://www.dendronic.com/beta.htm
For further information please contact:
William W. Armstrong PhD, President Dendronic Decisions Limited 3624 - 108 Street, NW Edmonton, Alberta, Canada T6J 1B4 Email: email@example.com URL: http://www.dendronic.com/ Tel. +1 403 421 0800 (Note: The area code 403 changes to 780 after Jan. 25, 1999)
The software can be obtained by anonymous ftp from:
There is a 250 page (printed) manual and an HTML version available on-line at the above address.
The new features in 2.0 include:
The modules are provided in a library. Several "front-ends" or clients are also available. X-Window support by editor/visualization tool Xmume. MUME can be used to include non-neural computing modules (decision trees, ...) in applications. MUME is available for educational institutions by anonymous ftp on mickey.sedal.su.oz.au [220.127.116.11] after signing and sending a licence: /pub/license.ps (67 kb).
Marwan Jabri, SEDAL, Sydney University Electrical Engineering,
NSW 2006 Australia, firstname.lastname@example.org
MAJOR FEATURES OF NevProp3 OPERATION (* indicates feature new in version 3)
[backprop, quickprop, delta-bar-delta, recurrent networks], [simple clustering, k-nearest neighbor, LVQ1, DSM], [Hopfield, Boltzmann, interactive activation network], [feedforward counterpropagation], [ART I], [a simple BAM] and [the linear pattern classifier]. For details see: http://www.dontveter.com/nnsoft/nnsoft.html
An improved professional version of backprop is also available; see Part 6 of the FAQ.
Questions to: Don Tveter, email@example.com
REQUIREMENTS (Unix version): X11 Rel. 3 and above, Motif Rel. 1.0 and above, 12 MB of physical memory (24 MB or more recommended), 20 MB disk space. REQUIREMENTS (PC version): PC-compatible with MS Windows 3.0 and above, 4 MB of physical memory (8 MB or more recommended), 1 MB disk space.
Four neuron models are implemented in BIOSIM: a simple model that only switches ion channels on and off, the original Hodgkin-Huxley model, the SWIM model (a modified HH model), and the Golowasch-Buchholz model. Dendrites consist of a chain of segments without bifurcation. A neural network can be created with the interactive network editor that is part of BIOSIM. Parameters can be changed via context-sensitive menus, and the results of the simulation can be visualized in observation windows for neurons and synapses. Stochastic processes such as noise can be included. In addition, biologically oriented learning and forgetting processes are modeled, e.g. sensitization, habituation, conditioning, Hebbian learning and competitive learning. Three synaptic types are predefined (an excitatory synapse type, an inhibitory synapse type and an electrical synapse). Additional synaptic types can be created interactively as desired.
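For reference, the original Hodgkin-Huxley model mentioned above can be sketched in a few lines of forward-Euler code. This is a generic textbook version with the standard squid-axon constants, not the BIOSIM source code:

```python
import math

# Textbook Hodgkin-Huxley membrane model (standard squid-axon constants),
# integrated with forward Euler.  Generic sketch, not BIOSIM's code.

C_M = 1.0                            # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3    # maximal conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.4  # reversal potentials, mV

def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * math.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_inj, t_max=25.0, dt=0.01):
    """Return the membrane-potential trace (mV) for a constant injected
    current i_inj (uA/cm^2), starting from rest."""
    v = -65.0
    m = alpha_m(v) / (alpha_m(v) + beta_m(v))   # steady-state gating values
    h = alpha_h(v) / (alpha_h(v) + beta_h(v))
    n = alpha_n(v) / (alpha_n(v) + beta_n(v))
    trace = []
    for _ in range(int(t_max / dt)):
        i_na = G_NA * m ** 3 * h * (v - E_NA)   # sodium current
        i_k = G_K * n ** 4 * (v - E_K)          # potassium current
        i_l = G_L * (v - E_L)                   # leak current
        v += dt * (i_inj - i_na - i_k - i_l) / C_M
        m += dt * (alpha_m(v) * (1.0 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1.0 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)
        trace.append(v)
    return trace
```

With a suprathreshold current such as i_inj = 10 the model fires action potentials (the trace crosses 0 mV); with i_inj = 0 it stays near the resting potential of about -65 mV.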
Available for ftp from ftp.uni-kl.de in directory /pub/bio/neurobio: Get /pub/bio/neurobio/biosim.readme (2 kb) and /pub/bio/neurobio/biosim.tar.Z (2.6 MB) for the Unix version or /pub/bio/neurobio/biosimpc.readme (2 kb) and /pub/bio/neurobio/biosimpc.zip (150 kb) for the PC version.
Department of Software Engineering (ZXA/US)
D-67056 Ludwigshafen; Germany
firstname.lastname@example.org phone 0621-60-21372 fax 0621-60-43735
The Brain requires 512K memory and MS-DOS or PC-DOS version 3.20 or later (versions for other OS's and machines are available). A 386 (with maths coprocessor) or higher is recommended for serious use of The Brain. Shareware payment required.
The demo version is restricted in the number of units the network can handle, due to memory constraints on PCs. The registered version allows use of extra memory.
External documentation included: 39Kb, 20 Pages.
Source included: No (Source comes with registration).
Available via anonymous ftp from ftp.tu-clausthal.de as /pub/msdos/science/brain12.zip (78 kb) and from ftp.technion.ac.il as /pub/contrib/dos/brain12.zip (78 kb)
PO Box 712
Noarlunga Center SA 5168
Email: email@example.com (preferred) or firstname.lastname@example.org or email@example.com
NeuDL is available from the anonymous ftp site at The University of Alabama: cs.ua.edu (18.104.22.168) in the file /pub/neudl/NeuDLver021.tar. The tarred file contains the interpreter source code (in C++), a user manual, a paper about NeuDL, and about 25 sample NeuDL programs. A document demonstrating NeuDL's capabilities is also available from the ftp site: /pub/neudl/NeuDL/demo.doc /pub/neudl/demo.doc. For more information contact the author: Joey Rogers (firstname.lastname@example.org).
DemoGNG can be accessed most easily at http://www.neuroinformatik.ruhr-uni-bochum.de/ in the file /ini/VDM/research/gsn/DemoGNG/GNG.html, where it is embedded as a Java applet in a Web page and is downloaded for immediate execution when you visit the page. An accompanying paper entitled "Some competitive learning methods" describes the implemented models in detail and is available in HTML on the same server in the directory ini/VDM/research/gsn/JavaPaper/.
It is also possible to download the complete source code and a Postscript version of the paper via anonymous ftp from ftp.neuroinformatik.ruhr-uni-bochum.de [22.214.171.124] in directory /pub/software/NN/DemoGNG/. The software is in the file DemoGNG-1.00.tar.gz (193 KB) and the paper in the file sclm.ps.gz (89 KB). There is also a README file (9 KB). Please send any comments and questions to email@example.com, which will reach Hartmut Loos, who wrote DemoGNG, as well as Bernd Fritzke, the author of the accompanying paper.
PMNEURO 1.0a is available at: ftp://ftp.uni-stuttgart.de/pub/systems/os2/misc/pmneuro.zip
PMNEURO 1.0a creates neural networks (backpropagation); propagation results can be used as new training input for creating new networks and for subsequent propagation trials.
Name: nn/xnn
Company: Neureka ANS
Address: Klaus Hansens vei 31B, 5037 Solheimsviken, NORWAY
Phone: +47 55 20 15 48
Email: firstname.lastname@example.org
URL: http://www.bgif.no/neureka/
Operating systems: nn: UNIX or MS-DOS; xnn: UNIX/X-windows; UNIX flavours: OSF1, Solaris, AIX, IRIX, Linux (1.2.13)
System requirements: Min. 20 Mb HD + 4 Mb RAM available. If only the nn/netpack part is used (i.e. not the GUI), much less is needed.
Approx. price: Free for 30 days after installation, fully functional. After 30 days: USD 250,- 35% educational discount.

A comprehensive shareware system for developing and simulating artificial neural networks. You can download the software from the URL given above.
nn is a high-level neural network specification language. The current version is best suited for feed-forward nets, but recurrent models can and have been implemented as well. The nn compiler can generate C code or executable programs with a powerful command line interface, but everything may also be controlled via the graphical interface (xnn). It is possible for the user to write C routines that can be called from inside the nn specification, and to use the nn specification as a function that is called from a C program. These features make nn well suited for application development. Please note that no programming is necessary in order to use the network models that come with the system (netpack).
xnn is a graphical front end to networks generated by the nn compiler, and to the compiler itself. The xnn graphical interface is intuitive and easy to use for beginners, yet powerful, with many possibilities for visualizing network data. Data may be visualized during training, testing or 'off-line'.
netpack: A number of networks have already been implemented in nn and can be used directly: MAdaline, ART1, Backpropagation, Counterpropagation, Elman, GRNN, Hopfield, Jordan, LVQ, Perceptron, RBFNN, SOFM (Kohonen). Several others are currently being developed.
The pattern files used by the networks have a simple and flexible format and can easily be generated from other kinds of data. The data files generated by the networks can be saved in ASCII or binary format. Functions for converting and pre-processing data are available.
NNDT Neural Network Development Tool, Evaluation version 1.4
Björn Saxén, 1995
http://www.abo.fi/~bjsaxen/nndt.html
ftp://ftp.abo.fi/pub/vt/bjs/
The NNDT software is a tool for neural network training. The user interface was developed with MS Visual Basic 3.0 professional edition. DLL routines (written in C) are used for most of the mathematics. The program runs on a personal computer under MS Windows, version 3.1.
Björn Saxén
Heat Engineering Laboratory
Åbo Akademi University
Biskopsgatan 8
SF-20500 Åbo, Finland

Remember, this program comes free but with no guarantee!
A user's guide for NNDT is delivered in PostScript format. The document is split into three parts and compressed into a file called MANUAL.ZIP. Because of the many bitmap figures included, the total size of the uncompressed files is large, approximately 1.5 MB.
The training requires a set of input signals and corresponding output signals, stored in a file referred to as pattern file. This is the only file the user must provide. Optionally, parameters defining the pattern file columns, network size and network configuration may be stored in a file referred to as setup file.
NNDT includes a routine for graphical presentation of output signals, node activations, residuals and weights during run. The interface also provides facilities for examination of node activations and weights as well as modification of weights.
A Windows help file is included; help is available at any time during NNDT execution by pressing F1.
Trajan 2.1 Shareware concentrates on ease-of-use and feedback. It includes Graphs, Bar Charts and Data Sheets presenting a range of Statistical feedback in a simple, intuitive form. It also features extensive on-line Help.
The Registered version of the package can support very large networks (up to 128 layers with up to 8,192 units each, subject to memory limitations in the machine), and allows simple Cut and Paste transfer of data to/from other Windows packages, such as spreadsheet programs. The Unregistered version features limited network size and no Clipboard Cut-and-Paste.
There is also a Professional version of Trajan 2.1, which supports a wider range of network models, training algorithms and other features.
See Trajan Software's Home Page at http://www.trajan-software.demon.co.uk for further details, and a free copy of the Shareware version.
Alternatively, email email@example.com for more details.
The major motivation for Nenet was to create a user-friendly SOM algorithm tool with good visualization capabilities and a GUI allowing efficient control of the SOM parameters. The use scenarios stem from the user's point of view, and considerable work has gone into ease of use and versatile visualization methods.
With Nenet, all the basic steps in map control can be performed. In addition, Nenet also includes some more exotic and involved features especially in the area of visualization.
Features in Nenet version 1.0:
Nenet web site is at: http://www.mbnet.fi/~phodju/nenet/nenet.html The web site contains further information on Nenet and also the downloadable Nenet files (3 disks totalling about 3 Megs)
If you have any questions whatsoever, please contact: Nenet-Team@hut.fi or firstname.lastname@example.org
Name: NICO Artificial Neural Network Toolkit
Author: Nikko Strom
Address: Speech, Music and Hearing, KTH, S-100 44, Stockholm, Sweden
Email: email@example.com
URL: http://www.speech.kth.se/NICO/index.html
Platforms: UNIX, ANSI C; source code tested on HPUX, SUN Solaris, Linux
Price: Free

The NICO Toolkit is an artificial neural network toolkit designed and optimized for automatic speech recognition applications. Networks with both recurrent connections and time-delay windows are easily constructed. The network topology is very flexible -- any number of layers is allowed, and layers can be arbitrarily connected. Sparse connectivity between layers can be specified. Tools for extracting input features from the speech signal are included, as well as tools for computing target values from several standard phonetic label-file formats.
SOM Toolbox, a shareware Matlab 5 toolbox for data analysis with self-organizing maps, is available at the URL http://www.cis.hut.fi/projects/somtoolbox/. If you are interested in practical data analysis and/or self-organizing maps and have Matlab 5 on your computer, be sure to check this out!
Highlights of the SOM Toolbox include the following:
Independent component analysis, or ICA, is a neural network or signal processing technique that represents a multidimensional random vector as a linear combination of nongaussian random variables ('independent components') that are as independent as possible. ICA is a nongaussian version of factor analysis and is somewhat similar to principal component analysis. ICA has many applications in data analysis, source separation, and feature extraction.
The FastICA algorithm is a computationally optimized method for performing the estimation of ICA. It uses a fixed-point iteration scheme that has been found in independent experiments to be 10-100 times faster than conventional gradient descent methods for ICA. Another advantage of the FastICA algorithm is that it can be used to estimate the independent components one-by-one, as in projection pursuit, which is very practical in exploratory data analysis.
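The one-by-one estimation mentioned above can be sketched as follows. This is a generic one-unit, kurtosis-based fixed-point iteration on pre-whitened data, not the FastICA package's actual code; the demo data and all names are illustrative.

```python
import math
import random

# One-unit FastICA sketch with the kurtosis nonlinearity g(u) = u^3,
# for data that is already whitened.  Generic illustration, not the
# FastICA package itself.  The fixed-point update is
#     w  <-  E[ x * (w.x)^3 ] - 3 w,   followed by normalizing w.

def fastica_one_unit(xs, n_iter=50, seed=1):
    """Estimate one independent component of 2-D whitened data xs."""
    rng = random.Random(seed)
    w = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
    norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
    w = [c / norm for c in w]
    n = float(len(xs))
    for _ in range(n_iter):
        s0 = s1 = 0.0
        for x in xs:
            u = (w[0] * x[0] + w[1] * x[1]) ** 3
            s0 += x[0] * u
            s1 += x[1] * u
        w_new = [s0 / n - 3.0 * w[0], s1 / n - 3.0 * w[1]]
        norm = (w_new[0] ** 2 + w_new[1] ** 2) ** 0.5
        w = [c / norm for c in w_new]
    return w

# Demo: two independent uniform sources mixed by a rotation (which
# keeps the mixture whitened); one component is then re-estimated.
rng = random.Random(0)
scale = 3.0 ** 0.5                  # uniform on [-sqrt(3), sqrt(3)]: unit variance
sources = [(rng.uniform(-scale, scale), rng.uniform(-scale, scale))
           for _ in range(5000)]
c, s = math.cos(0.7), math.sin(0.7)  # rotation used as the mixing matrix
mixed = [(c * a - s * b, s * a + c * b) for a, b in sources]
w = fastica_one_unit(mixed)
```

Up to sign, w converges to one row of the unmixing matrix, so the projection w.x recovers one of the original sources; deflation (orthogonalizing against already-found components) would yield the remaining ones.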
The FastICA package for MATLAB (versions 5 or 4) is a freeware package with a graphical user interface that implements the fixed-point algorithm for ICA. The package is available on the Web at
Email contact: Aapo Hyvarinen <Aapo.Hyvarinen@hut.fi>
The network engine is an attempt at creating a biological neural network simulator. It consists of a C++ class, called "network". A network object houses a set of objects of another C++ class, called "neuron". The neuron class is a detailed functional simulation of a neuron (i.e. the actual chemical processes that lead to a biological neuron's behavior are not modeled explicitly, but the behavior itself is). The simulation of the neuron is handled entirely by the neuron class. The network class coordinates the functioning of the neurons that make up the neural network, as well as providing addressing services that allow the neurons to interact. It is also responsible for interfacing the neural network it houses with any existing software into which the network is to be integrated.
Since a simulated neural network consisting of a large number of heavily interconnected neurons is extremely difficult to generate manually, NEXUS was developed. To create a network with NEXUS, one need only describe it in general terms: as groups of specifically arranged neurons, together with how the groups interface with each other and with themselves. This information constitutes a network architecture descriptor. NEXUS reads the descriptor and uses the information to generate a network, building all the neurons and connecting them together appropriately. This is analogous to nature's brain construction system. Human brains, in general, are very similar, and the basic design is stored in human DNA. Since it is certainly not possible to record information about each neuron and its connections, DNA must instead contain (in some form) what is essentially a set of guidelines about how the brain is to be laid out. These guidelines are used to build the brain, just as NEXUS uses the guidelines set out in the network architecture descriptor to build the simulated neural network.
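The descriptor idea can be illustrated with a toy sketch. The real NEXUS descriptor format and the engine's C++ classes are different; every name, group, and connection rule below is hypothetical:

```python
# Toy illustration of a NEXUS-style architecture descriptor (the real
# NEXUS format differs; all names here are hypothetical).  Rather than
# listing every neuron and connection, the descriptor gives group sizes
# and group-to-group projection rules, and the builder expands them
# into an explicit network.

class Neuron:
    def __init__(self, group, index):
        self.group = group
        self.index = index
        self.inputs = []          # presynaptic Neuron objects

class Network:
    def __init__(self, descriptor):
        # Build all neurons, group by group.
        self.groups = {name: [Neuron(name, i) for i in range(size)]
                       for name, size in descriptor["groups"].items()}
        # Expand each projection rule into individual connections
        # (full connectivity, for simplicity of the sketch).
        for src, dst in descriptor["projections"]:
            for post in self.groups[dst]:
                for pre in self.groups[src]:
                    post.inputs.append(pre)

descriptor = {
    "groups": {"retina": 16, "v1": 8, "output": 2},
    "projections": [("retina", "v1"), ("v1", "output")],
}
net = Network(descriptor)
```

The three-line descriptor expands into 26 neurons and 144 connections; this is the sense in which a compact set of rules, like DNA, can specify a structure far larger than itself.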
NEXUS and the network engine have deliberately been engineered to be highly efficient and very compact. Even so, large, complex networks require tremendous amounts of memory and processing power.
The network engine:
Email: Lawrence O. Ryan <firstname.lastname@example.org>
The Netlab simulation software is designed to provide the central tools necessary for the simulation of theoretically well founded neural network algorithms for use in teaching, research and applications development. It consists of a library of Matlab functions and scripts based on the approach and techniques described in Neural Networks for Pattern Recognition by Christopher M. Bishop, (Oxford University Press, 1995). The functions come with on-line help, and further explanation is available via HTML files.
The Netlab library includes software implementations of a wide range of data analysis techniques. Netlab works with Matlab version 5.0 and higher. It is not compatible with earlier versions of Matlab.
One can then design a scenario with walls, rocks, lights, fat (fuel) sources (that can be smelled) and many other such things. Robot tanks are then introduced into the scenario and allowed to interact or battle it out. The last one alive wins, or maybe one just watches the motion of the robots for fun. While the scenario is running it can be stopped, edited, and zoomed, and the view can track any robot.
The entire program is mouse- and graphics-based. It uses DOS and VGA and is written in Turbo C++. There will also be the ability to download designs to another computer, and source code will be available for the core neural simulator. This will allow one to design neural systems and download them to real robots. The design tools can handle three-dimensional networks, and so will work with video camera inputs and the like.
NuTank source code is free from
Contact: Richard Keene; Keene Educational Software
Email: email@example.com or firstname.lastname@example.org
Lens runs under Windows as well as a variety of Unix platforms. It includes a graphical interface and an embedded script language (Tcl). The key to the speed of Lens is its use of tight inner-loops that minimize memory references when traversing links. Frequently accessed values are stored in contiguous memory to achieve good cache performance. It is also able to do batch-level parallel training on multiple processors.
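The performance point above (tight inner loops over contiguously stored link data rather than pointer-chasing through per-link objects) can be sketched as follows. This is a generic illustration of the storage layout idea, not Lens source code, and all names are hypothetical:

```python
# Generic sketch of the "contiguous links" idea (not Lens code): the
# incoming links of all units are stored in flat parallel arrays, so a
# forward pass is one tight loop with sequential memory access instead
# of a walk through scattered per-link objects.

weights = []       # weight of link k
sources = []       # index of the sending unit of link k
link_start = []    # first link index for each receiving unit
link_count = []    # number of incoming links for each receiving unit

def add_unit(incoming):
    """Register a unit with the given (source_index, weight) links."""
    link_start.append(len(weights))
    link_count.append(len(incoming))
    for src, w in incoming:
        sources.append(src)
        weights.append(w)

def forward(activations, first_unit):
    """Compute net inputs of units first_unit.. from earlier activations."""
    for u in range(first_unit, len(link_start)):
        start, n = link_start[u], link_count[u]
        net = 0.0
        for k in range(start, start + n):      # sequential, cache-friendly
            net += weights[k] * activations[sources[k]]
        activations.append(net)                # linear units, for brevity
    return activations

# Two input units feeding one output unit.
add_unit([]); add_unit([])                     # inputs: no incoming links
add_unit([(0, 0.5), (1, -1.0)])                # output = 0.5*a0 - 1.0*a1
```

In C this layout pays off directly in cache behavior; the Python version only illustrates the indexing scheme.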
Because it is recognized that no simulator will satisfy sophisticated users out of the box, Lens was designed to facilitate code modification. Users can create and register such things as new network or group types, new weight update algorithms, or new shell commands without altering the main body of code. Therefore, modifications can be easily transferred to new releases.
Lens is available free-of-charge to those conducting research at academic or non-profit institutions. Other users should contact Douglas Rohde for licensing information at email@example.com.
For some of these simulators there are user mailing lists. Get the packages and look into their documentation for further info.
------------------------------------------------------------------------
Next part is part 6 (of 7). Previous part is part 4.