- Intuitive, interactive and highly responsive user interface
- Built-in spreadsheets for convenient editing of data
- Dynamic graphing and data displays continuously illustrate the neural network training progress
- Industry-standard, documented, field-proven algorithms make these products the perfect tool for students and professionals
- Ideal for forecasting, prediction, signal processing, classification, function approximation, decision making, sensor interpretation, and nonlinear regression

Network options fall into six categories, detailed below:

- Preprocessing Functions
- Interconnection Architectures
- Processing Element Summation Functions
- Processing Element Transfer Functions
- Learning Algorithms
- Learning Error Criteria

Preprocessing Functions:
- Min/Max and Mean/Std Input Preprocessing
- Sum Inputs to 1 Normalization
- Sum of Squares to 1 Normalization
- Input Disabling (TP)
- Time Series Inputs (TP)
- Input Added Noise (TP)
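The product's exact preprocessing formulas are not published here; as a generic sketch of the four scaling options above, assuming their usual textbook definitions (function names are illustrative, not the product's API):

```python
def minmax_scale(xs):
    """Min/Max preprocessing: map values linearly into [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def meanstd_scale(xs):
    """Mean/Std preprocessing: shift to zero mean, scale to unit std deviation."""
    n = len(xs)
    mean = sum(xs) / n
    std = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [(x - mean) / std for x in xs]

def sum_to_one(xs):
    """Sum Inputs to 1: divide each input by the total."""
    s = sum(xs)
    return [x / s for x in xs]

def sum_sq_to_one(xs):
    """Sum of Squares to 1: divide each input by the Euclidean norm."""
    s = sum(x * x for x in xs) ** 0.5
    return [x / s for x in xs]
```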

Processing Element Summation Functions:
- Dot Product
- Quadratic Sum
- L1 Distance
- L2 Distance
- Radial Basis Function
- Sigma-Pi
- GRNN Sum (TP)
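Three of these summation functions have well-known definitions; a minimal sketch of those, assuming the standard formulas (the product may parameterize them differently):

```python
import math

def dot_product(w, x):
    """Dot-product summation: sum of w_i * x_i."""
    return sum(wi * xi for wi, xi in zip(w, x))

def l2_distance(w, x):
    """L2 (Euclidean) distance between the weight and input vectors."""
    return math.sqrt(sum((wi - xi) ** 2 for wi, xi in zip(w, x)))

def rbf(w, x, sigma=1.0):
    """Gaussian radial basis response centered on the weight vector."""
    return math.exp(-l2_distance(w, x) ** 2 / (2 * sigma ** 2))
```

A unit using `dot_product` responds to the projection of the input onto its weights; a unit using `l2_distance` or `rbf` responds to proximity to its weight vector.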

Processing Element Transfer Functions:
- Sigmoid
- Bipolar Sigmoid
- Arctan
- Bipolar Arctan
- Sin
- Bipolar Sin
- Threshold Linear
- Bipolar Threshold Linear
- Threshold
- Bipolar Threshold
- Linear
- Gaussian
- Cauchy
- Winner Take All
- Stochastic Threshold (Hopfield Net TP)
- Mean Field Threshold (Hopfield Net TP)
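As a generic illustration of a few of these transfer functions, assuming their conventional definitions (a "bipolar" variant rescales the output range from (0, 1) to (-1, 1)):

```python
import math

def sigmoid(s):
    """Logistic sigmoid; output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-s))

def bipolar_sigmoid(s):
    """Sigmoid rescaled to (-1, 1); algebraically equal to tanh(s / 2)."""
    return 2.0 * sigmoid(s) - 1.0

def threshold(s):
    """Hard threshold: 1 for non-negative net input, else 0."""
    return 1.0 if s >= 0 else 0.0

def gaussian(s):
    """Gaussian bump: peaks at 1 for zero net input."""
    return math.exp(-s * s)
```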

Interconnection Architectures:
- Multilayer Normal Feed Forward
- Multilayer Full Feed Forward
- Total Recurrent
- Cascade
- Prior Recurrent
- Cascade Recurrent
- Jordan Recurrent (TP)
- Elman Recurrent (TP)
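The architecture names are the product's own; assuming "normal feed forward" means each layer feeds only the next, and "full feed forward" means each layer also receives shortcut connections from all earlier layers (a common reading, not confirmed by the source), the structural difference can be sketched with layers modeled as plain functions:

```python
def forward_normal(layers, x):
    """Normal feed-forward: each layer sees only the previous layer's output."""
    for layer in layers:
        x = layer(x)
    return x

def forward_full(layers, x):
    """Full feed-forward (assumed meaning): each layer sees the concatenation
    of the network input and every earlier layer's output."""
    activations = [x]
    for layer in layers:
        flat = [v for act in activations for v in act]
        activations.append(layer(flat))
    return activations[-1]

# Illustrative layer: sums its inputs into a single output.
sum_layer = lambda v: [sum(v)]
```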

Learning Algorithms:
- Back Propagation
- Quick Propagation
- Jacobs' Enhanced Back Propagation
- Recurrent Back Propagation
- Kohonen Winner Take All
- Kohonen Learning Vector Quantization
- Simulated Annealing
- Solis & Wets Simulated Annealing (TP)
- Cascade Correlation Learning
- Simplex Simulated Annealing (reinforcement learning) (TP)
- Powell's Method (reinforcement learning) (TP)
- Conjugate Gradient Training (TP)
- Probabilistic Neural Network (TP)
- General Regression Neural Network (TP)
- Levenberg-Marquardt Training (TP)
- Temporal Differences Algorithm (TP)
- Hopfield (TP)
- Hopfield Mean Field Annealing (TP)
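The trainers above are listed by name only; as a generic illustration of the simplest entry, here is plain gradient-descent back propagation for a single sigmoid unit under squared error (the delta rule). All names and parameters are illustrative, not the product's API:

```python
import math
import random

def predict(w, b, x):
    """Sigmoid unit output for weights w, bias b, input x."""
    return 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

def train_neuron(samples, lr=0.5, epochs=2000, seed=0):
    """Online gradient descent on squared error for one sigmoid unit."""
    rng = random.Random(seed)
    n = len(samples[0][0])
    w = [rng.uniform(-0.5, 0.5) for _ in range(n)]
    b = rng.uniform(-0.5, 0.5)
    for _ in range(epochs):
        for x, t in samples:
            y = predict(w, b, x)
            delta = (y - t) * y * (1.0 - y)  # dE/d(net) for squared error
            w = [wi - lr * delta * xi for wi, xi in zip(w, x)]
            b -= lr * delta
    return w, b

# Usage: learn logical OR, which a single unit can separate.
or_samples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train_neuron(or_samples)
```

Multilayer back propagation applies the same chain-rule delta recursively through hidden layers; the quick-propagation and conjugate-gradient variants listed above differ only in how they turn the gradient into a weight step.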

Learning Error Criteria:
- Mean Squared Error
- Mean Absolute Error
- Mean Fourth Power Error
- Hyperbolic Square Error
- Bipolar Hyperbolic Square Error
- Classification Error (TP)
- User Defined using EvalNet DLL (TP)
- Error Tolerance Training (TP)
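The first three criteria have standard definitions; a minimal sketch, assuming the usual formulas (the product's implementations may add scaling or tolerance bands):

```python
def mean_squared_error(outputs, targets):
    """Average of squared output-target differences."""
    return sum((o - t) ** 2 for o, t in zip(outputs, targets)) / len(outputs)

def mean_absolute_error(outputs, targets):
    """Average of absolute output-target differences; less outlier-sensitive."""
    return sum(abs(o - t) for o, t in zip(outputs, targets)) / len(outputs)

def mean_fourth_power_error(outputs, targets):
    """Average of fourth-power differences; penalizes large errors heavily."""
    return sum((o - t) ** 4 for o, t in zip(outputs, targets)) / len(outputs)
```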

- Adding Noise to inputs during training
- Tolerance training
- Selective deactivation of inputs
- Time series window on inputs
- Sensitivity Analysis of inputs
- Confusion Matrix for classification problems
- Reinforcement Learning with user defined DLL for net evaluation
- Core network functions DLL at no additional charge
- Code generation for run-time application development
- TrainDos: a 32-bit extended DOS application for batch-mode training
- *ThinksPro* Views, allowing inputs, weights, states, and outputs to be viewed as time series, Hinton diagrams, or color diagrams

*Thinks* and *ThinksPro* take the confusion out of choosing neural network options and training methods by providing demonstration training files and examples. Each option is clearly described in the on-line help files, and the documentation includes a run-time library reference for each of the available functions.

Containing some of the most sophisticated neural network algorithms and capabilities available today, these products provide a window into how networks train and work. With *Thinks* or *ThinksPro*, you can be training your neural network in minutes.