Peer-Reviewed

Architecture of the Extended-Input Binary Neural Network and Applications

Received: 4 June 2018     Accepted: 20 June 2018     Published: 6 July 2018
Abstract

The proposed architecture of a binary artificial neural network is inspired by the structure and function of the major parts of the brain. Accordingly, it is divided into an input module that resembles the sensory (stimuli) area and an output module that resembles the motor (responses) area. These two modules are single-layer feed-forward neural networks with fixed weights that transform input patterns into a simple code and then convert this code back into output patterns. All possible input and output patterns are stored in the weights of these two modules. Each output pattern is produced by a single neuron of the output module asserted high; similarly, each input pattern drives a single neuron of the input module to binary 1. Training this network is therefore confined to connecting the one output neuron of the input module at binary 1, which represents a code for the input pattern, to the one input neuron of the output module that produces the desired associated output pattern. Fast and accurate association between input and output pattern pairs is thus achieved. These connections can be implemented by a crossbar switch, which acts much like the thalamus in the brain, generally regarded as a relay center. The role of the crossbar switch is generalized to an electric field in the gap between the input and output modules, and it is postulated that this field may serve as a bridge between the brain and mental states. The input-module encoder is preceded by the extended-input circuit, which ensures that the inverse of the input matrix exists and, at the same time, makes the derivation of this inverse of any order a simple task. This circuit mimics the region of the brain that processes input signals before sending them to the sensory region. Applications of this neural network include logical relations, mathematical operations, use as a memory device, and pattern association. The number of input neurons can be increased (increased dimensionality) by multiplexing the inputs and using latches and multi-input AND gates. It is concluded that by emulating the major structures of the brain, artificial neural networks can be enhanced greatly: they gain speed, their memory capacity grows, and they can perform a wide range of applications.
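The encode, crossbar, decode pipeline described above can be summarized in a short sketch. The Python code below is a minimal illustration, not the paper's circuit: the example pattern sets, the encode/decode helpers, and the identity crossbar wiring are hypothetical stand-ins, and the extended-input circuit used to derive the fixed module weights is omitted.

    import numpy as np

    # Hypothetical stored pattern sets (one row per pattern). In the paper the
    # fixed module weights store all possible patterns; here they are simply
    # looked up by row.
    input_patterns  = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # stimuli
    output_patterns = np.array([[0, 1], [1, 0], [1, 0], [0, 1]])   # responses

    def encode(x):
        # Input module: exactly one neuron goes to binary 1, forming a
        # one-hot code for the presented pattern.
        idx = np.flatnonzero((input_patterns == x).all(axis=1))[0]
        code = np.zeros(len(input_patterns), dtype=int)
        code[idx] = 1
        return code

    # Crossbar switch: "training" amounts to wiring code neuron i of the
    # input module to input neuron j of the output module. A 0/1 matrix
    # with one connection per row captures those switch settings; the
    # identity wiring below is just an example.
    crossbar = np.eye(len(input_patterns), dtype=int)

    def decode(code):
        # Output module: the single asserted neuron selects the stored
        # output pattern associated with it.
        return output_patterns[np.argmax(code)]

    for x in input_patterns:
        print(x, "->", decode(crossbar @ encode(x)))

Because each association is a single switch closure rather than an iterative weight update, retraining a pair means changing one entry of the crossbar matrix, which is the source of the fast, exact association the abstract claims.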

Published in American Journal of Neural Networks and Applications (Volume 4, Issue 1)
DOI 10.11648/j.ajnna.20180401.12
Page(s) 8-14
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2018. Published by Science Publishing Group

Keywords

Architecture, Modular, Pattern Association, Mathematical Operations

Cite This Article
  • APA Style

    Wafik Aziz Wassef. (2018). Architecture of the Extended-Input Binary Neural Network and Applications. American Journal of Neural Networks and Applications, 4(1), 8-14. https://doi.org/10.11648/j.ajnna.20180401.12


  • ACS Style

    Wafik Aziz Wassef. Architecture of the Extended-Input Binary Neural Network and Applications. Am. J. Neural Netw. Appl. 2018, 4(1), 8-14. doi: 10.11648/j.ajnna.20180401.12


  • AMA Style

    Wafik Aziz Wassef. Architecture of the Extended-Input Binary Neural Network and Applications. Am J Neural Netw Appl. 2018;4(1):8-14. doi: 10.11648/j.ajnna.20180401.12


  • BibTeX

    @article{10.11648/j.ajnna.20180401.12,
      author = {Wafik Aziz Wassef},
      title = {Architecture of the Extended-Input Binary Neural Network and Applications},
      journal = {American Journal of Neural Networks and Applications},
      volume = {4},
      number = {1},
      pages = {8-14},
      doi = {10.11648/j.ajnna.20180401.12},
      url = {https://doi.org/10.11648/j.ajnna.20180401.12},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajnna.20180401.12},
      year = {2018}
    }
    


  • RIS

    TY  - JOUR
    T1  - Architecture of the Extended-Input Binary Neural Network and Applications
    AU  - Wafik Aziz Wassef
    Y1  - 2018/07/06
    PY  - 2018
    N1  - https://doi.org/10.11648/j.ajnna.20180401.12
    DO  - 10.11648/j.ajnna.20180401.12
    T2  - American Journal of Neural Networks and Applications
    JF  - American Journal of Neural Networks and Applications
    JO  - American Journal of Neural Networks and Applications
    SP  - 8
    EP  - 14
    PB  - Science Publishing Group
    SN  - 2469-7419
    UR  - https://doi.org/10.11648/j.ajnna.20180401.12
    VL  - 4
    IS  - 1
    ER  - 


Author Information
  • Department of Computer Engineering, Saskatchewan Institute of Applied Science and Technology, Moose Jaw, Canada
