Suggestions for Using Neural Networks
Normalize Your Data
This can speed up the process of selecting connection weights. One aspect that may not be obvious is that the normalization should be done in a fashion that is consistent with your future inputs; your data should be stationary. Maintaining that consistency makes it more likely that your future results will agree with the data you used to build the model.
So, if you normalize with respect to the minimums and maximums of your current data, but future data could range beyond them, it is better to choose minimum and maximum values past what you expect to encounter.
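As a minimal sketch of this idea, assuming plain NumPy arrays and bounds that you choose yourself (nothing here comes from a particular network library):

import numpy as np

# Hypothetical readings; assume future values could exceed the observed
# range, so pick fixed bounds wider than what has been seen so far.
observed = np.array([12.0, 47.0, 33.0, 80.0])

LOW, HIGH = 0.0, 100.0   # chosen beyond the expected range, not taken from the data

def normalize(values, low=LOW, high=HIGH):
    # Scale values into [0, 1] using fixed bounds so that future data
    # is normalized exactly the same way as the training data.
    return (values - low) / (high - low)

print(normalize(observed))          # training data
print(normalize(np.array([95.0])))  # a future value still maps inside [0, 1]

Because the bounds are fixed rather than recomputed from each new batch, a value seen only in the future still lands inside the same [0, 1] range as the training data.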
Break Up Your Problem into Parts
If your results are not satisfactory, try breaking the problem into smaller pieces; a subset of the problem may be more solvable on its own.
Try Using More Hidden Layers (or Fewer)
Selecting the right number of nodes and layers is still more art than science, so expect to experiment with a number of possibilities.
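To make that experimentation concrete, here is a sketch of a simple search loop. It uses scikit-learn's MLPRegressor as a stand-in for whatever network class you are actually using, and the data is made up purely for illustration:

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2           # a toy target function

# Candidate architectures: one or two hidden layers of various sizes.
candidates = [(5,), (10,), (20,), (10, 10)]
for hidden in candidates:
    net = MLPRegressor(hidden_layer_sizes=hidden, max_iter=5000, random_state=0)
    net.fit(X, y)
    print(hidden, "R^2 =", round(net.score(X, y), 3))

In practice you would compare the candidates on a held-out validation set rather than on the training data itself.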
Randomize the Order of Input
If you are not using recurrent networks, then each set of inputs is independent of the others. By randomizing the order in which inputs are presented, you may avoid subtle, order-dependent drift in the connection weights that could steer training into a local minimum rather than the global minimum.
net = NeuralNet()
# after setting up the rest...
net.learn(epochs=100, show_epoch_results=True,
          random_testing=True, show_sample_interval=0)
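If your library does not provide a flag like this, the same effect can be had by shuffling the training pairs yourself at the start of each epoch. A minimal sketch with hypothetical training pairs:

import random

# Hypothetical (inputs, target) pairs; in practice these come from your data set.
patterns = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]

for epoch in range(100):
    random.shuffle(patterns)          # new presentation order every epoch
    for inputs, target in patterns:
        pass  # present (inputs, target) to your network's training step here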
Train the Network Several Times
It is easy to get stuck in a local minimum, and sometimes that depends on your starting values. Restarting the training resets the connection weights to new random values, which may produce a better result.
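A sketch of that restart loop, again using scikit-learn's MLPRegressor as a stand-in (different random_state values give different starting weights); keep whichever run scores best:

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2            # the same toy target as above

best_net, best_score = None, -np.inf
for seed in range(5):                          # five restarts from different weights
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=seed)
    net.fit(X, y)
    score = net.score(X, y)
    if score > best_score:
        best_net, best_score = net, score

print("best R^2 over restarts:", round(best_score, 3))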