This project is driven by three lofty goals: speed, coherence of source, and ever-growing functionality. The first has obvious motivation, given how heavily neural networks can tax a processor.
As for the second: I have seen too many open-source projects leave code cleanliness and documentation out of the picture, so I am making a point of keeping this code clean and documented.
The third goal is a bit harder to pin down, and deliberately so. I don't know how well received this project will be, but I want it to grow to fit the needs of weekend hackers, as well as those of larger research projects if demand warrants it.
Below is the current goal matrix. It is pretty self-explanatory.
Next Release (0.2?)
- Finish documenting the rest of the source in Doxygen
- Add a bias-node option to cffNetwork
- Restructure the API to allow for many different network modules (DONE)

Short Term
- Add more example and test drivers
- Add the ability to save/load networks via the factory
- Add other types of squashing functions besides the classic sigmoid
- Add functions to allow mid-training layer expansion in cffNetwork

Long Term
- Allow the option to do network calculations with an arbitrary number of threads and/or processes (in cffNetwork or elsewhere)
- Add a module for different types of recursive networks
- Research and add other types of advanced networks?
- Create drivers in other languages (Scheme, Python, Perl, etc.) that link against the library