
Neural Concept Network (en)

(A_Y)m
October 28, 2019


(WIP) Neural Concept Network is a directed network for representing, searching, analyzing, learning, and forming concepts, currently under development.


Transcript

  1. Index
    1. Introduction
    2. Overview
    3. Conception
    4. Concept Network function
    5. Neural Network function
    6. Neural Concept Network function
    7. Reference materials
  2. Precaution
    • This document is a work in progress.
    • These slides are development notes and may be deleted in the official version.
    • This document uses some animations. Slides marked with the following icon contain animation; use the PowerPoint version, because the PDF version can't play animations. [animation]
  3. Self-introduction
    • Name – Akihiro Yamamoto
    • Twitter – A_Ym
    • Focus on – AI, Artificial Consciousness (AC), Neural Network, Concept theory, Brain, Quantum Info, OpenCL, C#, MS Azure… And BOOM BOOM SATELLITES!!
  4. What’s Neural Concept Network
    • Neural Concept Network is a directed network for representing, searching, analyzing, learning, and forming concepts.
    • It consists of the following two functions:
      – Concept Network: a function for representing concepts
      – Neural Network: a function for searching, analyzing, learning, and forming concepts
    • It is abbreviated as NCN as needed.
  5. Features of Neural Concept Network
    • NCN can represent concepts in a form understandable by those without knowledge of neural networks or mathematics.
    • It can also represent relative, hierarchical, context-sensitive, self-inclusive, and self-referential concepts, which are complicated to express in conventional concept representations.
    • It has the functionality of a simplified Spiking Neural Network (SNN), used for concept search, (top-down and bottom-up) analysis, learning, and formation.
  6. Conception-1
    • NCN is one part of realizing HLAI (Human Level Artificial Intelligence), with the ability to think like a human being.
    • At present, the gap between the top-down approach and the bottom-up approach to realizing HLAI is too wide, so I thought we needed a middle-out approach.
    • I thought it was necessary to implement the representation and processing of concepts as the function at that middle point.
  7. HLAI Roadmap
    [Figure: a roadmap connecting Human Intelligence / Human Neural Network with Artificial Intelligence / Artificial Neural Network, with layers including Sense, Perception, Recognition, Cognition, Concept, Semantics, Language, Thought, and Planning.]
  8. HLAI Roadmap points
    • The point is that Concept sits below Language and Semantics.
    • At present, the prevailing flow is to move on to higher-order concept processing after language processing has been realized.
    • However, I think language processing cannot be achieved unless concept processing, and then semantic processing, are realized first.
  9. Conception-2
    • When I thought about a directed network structure for concept representation, what I arrived at resembled a neural network.
    • When I built SNN functionality on top of it and ran it, I found the results useful for the analysis of concepts.
  10. Define the concept
    • It is difficult to define the concept of "concept".
    • Therefore, it is defined by the following circular definition: a concept is the representation of a certain concept by its relations to other concepts.
    • The following sections describe the relation representation (RR) of concepts in NCN.
  11. RR1 – basic-relation
    • NCN represents the fact that the concept "A" relates to the concept "B" with a node and a directed edge, as follows.
    • A node is called a concept, and a directed edge is called a relation.
    • The concept at the starting/ending point of a relation is called the source/destination concept.
    [Figure: A → B; A is the (source) concept, B the (destination) concept, and the edge the relation (origin).]
  12. Objects of concept
    • The concept "A" and the concept "B" here are just labels for clarity; in fact, the object may be text, voice, an image, or another neural network.
    • Internally, each concept is identified by a UUID (see the sketch below).
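A minimal sketch of the data model on slides 11–12; the class and field names are illustrative assumptions, not from the deck:

```python
# Hypothetical sketch of the Concept Network data model.
import uuid

class Concept:
    def __init__(self, label=None):
        self.id = uuid.uuid4()  # internal identity, per slide 12
        self.label = label      # "A", "B", text, voice, image, ...

class Relation:
    def __init__(self, source, destination):
        self.id = uuid.uuid4()          # relations are addressable too
        self.source = source            # source concept (or relation)
        self.destination = destination  # destination concept

a, b = Concept("A"), Concept("B")
r = Relation(a, b)  # "A relates to B"
```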
  13. Example of multiple-relations: a human relates to a hand and a foot.
    [Figure: human → hand and human → foot, drawn in two alternative layouts.]
  14. RR3 – sub-relation
    • Further qualifying a relation with other concepts is called a sub-relation.
    • The following is a representation of the concept "A", which has a relation of "C" to "B". If relation1 is the origin, relation2 is a sub-relation.
    [Figure: A → B via relation1 (origin); relation2 (sub-relation) runs from relation1 to C.]
  15. • Conversely, if relation2 is the origin, relation1 is called a super-relation.
    [Figure: relation2 (origin) and relation1 (super-relation).]
  16. Example of sub-relation: human relates to hand, qualified by have. → (A) human has(have) (a) hand. (See the sketch below.)
    [Figure: human → hand, qualified by have.]
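Continuing the sketch above, a sub-relation can be modeled as a relation whose source is itself a relation; this encoding of slide 16 is an assumption:

```python
# Continues the Concept/Relation sketch above.
human, hand, have = Concept("human"), Concept("hand"), Concept("have")
rel1 = Relation(human, hand)  # origin: human -> hand
rel2 = Relation(rel1, have)   # sub-relation: qualifies rel1 with "have"
```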
  17. Sentence type and order of relation (1/2)
    • In this example, the relation is defined in Japanese word order, but in NCN the sequence of relations has no grammatical meaning, so you may define the relation in English word order instead.
    • What meaning is read out of these conceptual structures depends on the interpreting side.
    [Figure: "human hand have" (Japanese sentence order) vs. "human have hand" (English sentence order).]
  18. Sentence type and order of relation (2/2)
    • When the active/passive distinction matters, the relative-representation (described later) is easier with English sentence order.
    • For intransitive verbs, Japanese sentence order is easier to express in relative terms.
  19. RR4 – nested-relation
    • A nested relation, which further qualifies a sub-relation within a sub-relation, is represented as follows.
    [Figure: A → B, qualified by C, which is in turn qualified by D.]
  20. • When relation3 is the origin, the top-level relation (relation1) is called the owner-relation.
    [Figure: A → B, C, D; relation3 (origin), relation2 (super-relation), relation1 (owner-relation).]
  21. Example1 of nested-relation: human relates to hand, qualified by have, which is qualified by two. → (A) human has(have) two hand(s). (See the sketch below.)
    [Figure: human → hand, have, two.]
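In the same sketch, the nested relation simply qualifies the sub-relation one level deeper:

```python
# Continues the sketch above: "(A) human has(have) two hand(s)."
two = Concept("two")
rel3 = Relation(rel2, two)  # nested-relation: qualifies the "have" sub-relation
```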
  22. Example2 of nested-relation: human relates to hand and foot, qualified by have, which is qualified by two. → (A) human has(have) two hand(s) and foot(feet).
    [Figure: human → hand and human → foot, sharing the have/two qualification.]
  23. Limits of the three-term expression
    • The relation representation of concepts is similar to RDF (Resource Description Framework) or a graph database.
      – ex: RDF represents a relation by a triple (subject, predicate, object).
    • When dealing with real information, the three-term expression is not enough.
    • NCN also represents the relation itself, which corresponds to the RDF predicate, and can combine it with sub-relations to create a more realistic representation of a concept.
  24. RR5 – relativity
    • The degree of relationship (relativity) of a sub-relation is represented by 0.0 < relativity < 1.0.
    • The closer to 0.0, the closer the attachment lies to the source concept; the closer to 1.0, the closer it lies to the destination concept of the relation.
    [Figure: sub-relations C, D, E attached to A → B at relativity = 0.25, 0.5, 0.75.]
  25. Example of relativity: (A) human has(have) two hand(s) with five fingers, and foot(feet). (See the sketch below.)
    [Figure: human → hand; have, two; finger, five.]
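One way the relativity parameter might be stored, continuing the sketch; the particular attachment values below are made up, and the deck's figure may attach the qualifiers differently:

```python
class SubRelation(Relation):
    def __init__(self, source, destination, relativity=0.5):
        super().__init__(source, destination)
        assert 0.0 < relativity < 1.0  # per slide 24
        self.relativity = relativity   # attachment point along the source relation

# "(A) human has(have) two hand(s) with five fingers..."
finger, five = Concept("finger"), Concept("five")
have_rel   = SubRelation(rel1, have, relativity=0.25)    # "have", near "human"
two_rel    = SubRelation(have_rel, two, relativity=0.5)  # qualifies "have"
finger_rel = SubRelation(rel1, finger, relativity=0.75)  # "finger", near "hand"
five_rel   = SubRelation(finger_rel, five, relativity=0.5)
```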
  26. • The relativity and the order of sub-relations have no grammatical meaning.
    • As shown below, the order of the things you want to emphasize may differ even for the same fact.
    [Figure: three orderings of the same fact – "Mr. Smith / yesterday / this road / passed through", "yesterday / Mr. Smith / this road / passed through", "yesterday / this road / Mr. Smith / passed through" – glossed in English as "Mr. Smith passed through this road yesterday." / "Yesterday, Mr. Smith passed through this road." / "Yesterday and this road, Mr. Smith passed through.", alongside the corresponding Japanese sentences.]
  27. • Since English and Japanese grammar may not be able to reproduce the order of emphasis of the concepts, it must be supplemented by decorating the text in the case of sentences, and by inflection, gesture, etc. in actual conversation.
    [Example: "Mr. Smith passed through this road yesterday.", with emphasis on "passed through".]
  28. • On the other hand, a sentence form with restrictions, such as poetry, might call to mind multiple complex conceptual structures.
  29. Sliding vector representation of conceptual networks
    • If a concept network is interpreted as sliding (or linear) vectors, could it be handled by computational-graph neural networks? (See the numeric sketch below.)
    [Figure: concepts co1–co4 with vectors a = co2 − co1, b = co3 − a·0.5, c = co4 − b·0.5.]
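A toy numeric reading of the vector formulas above, with made-up 2-D coordinates for co1–co4:

```python
import numpy as np

co1, co2 = np.array([0.0, 0.0]), np.array([4.0, 0.0])
co3, co4 = np.array([2.0, 3.0]), np.array([5.0, 2.0])

a = co2 - co1      # the origin relation as a vector
b = co3 - a * 0.5  # attaches at relativity 0.5 along a
c = co4 - b * 0.5  # attaches at relativity 0.5 along b
print(a, b, c)     # [4. 0.] [0. 3.] [5.  0.5]
```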
  30. Grammatical representation
    • The idea is that grammatical information is explicitly added separately.
    [Figure: "Mr. Smith / passed through / yesterday / this road" annotated with the grammatical tags S, V, O, M.]
  31. RR6 – relative-representation
    • A sub-relation can be represented relatively, as follows.
    • (In the context of A,) B is related to C.
    [Figure: the sub-relation form and its relative form, where A becomes a context containing B → C.]
  32. • When the nested-relation is represented relatively, it becomes as follows.
    • (In the context of A,) B is related to C, qualified by D.
    [Figure: A as a context containing B → C with D.]
  33. • When the sub-relation inside a relative-representation is itself represented relatively, it becomes a hierarchical representation, as follows.
    • (In the context of A–B,) C is related to D.
    [Figure: nested contexts A, B containing C → D.]
  34. relativity-representation in relative representation
    • In relative representation, the relativity information disappears, so its representation needs to be considered.
    [Figure: Idea 1 – color representation; Idea 2 – 3D representation.]
  35. Examples of relative-relation:
    • A human has two legs with five fingers.
    • An insect has six legs with fingers.
    [Figure: the two statements drawn as shared and context-specific relations over human, insect, foot, have, two, six, finger, five.]
  36. Example1 of complex relative-representations: self-reference and self-inclusion.
    • (In me,) He may think I'm a delicate man, but I'm bold.
    [Figure: I (me) as a context containing he, delicate, bold, and a nested I.]
  37. Exception representation WIP
    • Generally, crows are black and swans are white.
    • However, there are exceptions such as albinism and melanism: white crows and black swans exist.
    • It is necessary to represent such exceptions while preventing the catastrophic forgetting (interference) they could cause.
  38. Examples of existing logical representations
    NCN can support existing logical representations, such as:
    • Top-down analysis – Fishbone diagram – Mind Map
    • Bottom-up analysis – KJ method
    • Meaning description – RDF/OWL
    • Structure description – UML – ER diagram – Graph Database
  39. Comparing association representations in UML
    [Figure: the same association, drawn as a UML class diagram (ParentClass to ChildClass, multiplicity 1 to 0..*) and as an NCN representation (Parent, Child, with the same multiplicities).]
  40. Chomsky’s Generative grammar WIP
    • When grammatical information is added to the nested-relation representation of a concept, the grammatical structure can be represented.
  41. Summary of relation representation of concepts
    • Frame representation is incorporated into the network structure itself, and recursive, relative representations of concepts, which were difficult in conventional logical representations, become possible.
  42. Neural Network function
    • An SNN utilizes the temporal change of neuronal potential to express and process information. It is closer to biological neurons than conventional computational-graph neural networks, allowing more flexible information representation and processing.
    • NCN also gains information-processing power from branched structures equivalent to neurites (nerve fibers), such as axons and dendrites.
  43. Pros & Cons of the Neural Network function
    • Pros
      – Dynamic networks can be formed.
      – Signals and processing can be superposed.
    • Cons
      – It is computationally costly.
      – It may be subject to restrictions similar to those of humans.
  44. Spiking Neural Network function
    NCN has parameters for the spiking neural network in addition to the parameters of conventional computational-graph neural networks (see the sketch below).
    • conventional computational-graph NN parameters
      – Weight: synaptic weight; positive or negative real number
      – Potential: positive or negative real number (mV)
      – Threshold: positive real number (mV)
    • spiking NN parameters
      – Attenuation rate: time attenuation rate of the potential; positive real number (mV/msec)
      – Refractory period: positive real number (msec)
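A minimal leaky integrate-and-fire style sketch of how these parameters could interact; the linear decay rule and the reset-to-zero behavior are assumptions, since the deck only lists the parameters:

```python
class Neuron:
    def __init__(self, threshold=1.0, attenuation=0.005, refractory=5.0):
        self.potential = 0.0            # mV, positive or negative
        self.threshold = threshold      # mV, positive
        self.attenuation = attenuation  # mV/msec, linear decay toward 0
        self.refractory = refractory    # msec
        self._refract_until = 0.0

    def step(self, t, dt, input_mv=0.0):
        """Advance one tick of dt msec; return True if the neuron fires."""
        # Decay the potential toward zero at the attenuation rate.
        if self.potential > 0.0:
            self.potential = max(0.0, self.potential - self.attenuation * dt)
        else:
            self.potential = min(0.0, self.potential + self.attenuation * dt)
        if t < self._refract_until:
            return False                # input is ignored while refractory
        self.potential += input_mv      # weighted input from upstream
        if self.potential >= self.threshold:
            self.potential = 0.0        # reset after the spike
            self._refract_until = t + self.refractory
            return True
        return False
```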
  45. • The Neural Network function renames the structures used in the Concept Network function as follows: a concept becomes a neuron, a relation becomes a neurite, and their junction becomes a synapse.
  46. Firing function
    • NCN does not have input or output layers; any neuron can be treated as an input or an output neuron.
    • Directed edges and relations used for I/O are represented separately. [input/output relation; animation]
  47. Combination with other NNs
    • It is also assumed that the input and output can be combined with other types of neural networks, such as DNNs.
  48. [Figure/animation, threshold = 1.0, input 13.0 pps (pps: pulses per second), weights 1.0, 0.9, 0.9 along the chain: weight 1.0 → 13.0 pps (firing rate 1/1); weight 0.9 → 6.5 pps (firing rate 1/2); next stage → 0.0 pps, no further firing due to the attenuation characteristics of the potential.]
    • When the input frequency and signal amount exceed the attenuation of the potential, the firing frequency becomes 1/n of the input frequency.
    • This is because even when a single input is below the threshold, two or three inputs arriving together can accumulate beyond the threshold (see the simulation sketch below).
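Using the Neuron sketch above, the 1/2 rate can be reproduced: a 0.9 input never crosses the 1.0 threshold on its own, but the residue left after decay lets every second pulse cross it (all parameter values are illustrative):

```python
def firing_rate(rate_pps, weight, neuron, seconds=2.0, dt=0.1):
    t, fires, next_pulse = 0.0, 0, 0.0
    interval = 1000.0 / rate_pps           # msec between input pulses
    while t < seconds * 1000.0:
        inp = weight if t >= next_pulse else 0.0
        if inp:
            next_pulse += interval
        if neuron.step(t, dt, inp):
            fires += 1
        t += dt
    return fires / seconds

print(firing_rate(13.0, 1.0, Neuron()))  # ~13.0 pps (firing rate 1/1)
print(firing_rate(13.0, 0.9, Neuron()))  # ~6.5 pps (firing rate 1/2)
```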
  49. • Due to the attenuation characteristics of the potential, a neuron no longer responds once its distance (number of stages) from the input neuron grows large.
    • This is an important feature for preventing infinite firing loops in NCN, whose networks can contain cycles.
    • Similarly, a higher input signal frequency causes a wider range of propagation.
    • This feature is utilized to control the range affected by the input signal.
  50. Harmonic sound
    • This may be related to the mechanism of harmonics, whose frequency components are integer multiples or integer fractions of the fundamental tone.
  51. Frequency Coding
    • There are various theories about the representation of information in the brain; here are two:
      – rate coding theory
      – temporal coding theory
    • NCN uses frequency coding, which combines features of both.
    • Frequency coding detects how strongly a neuron fires in sync with the frequency of an input signal, as the degree of the relation.
    • By using coprime input frequencies with a sufficiently large least common multiple, it is possible to determine how strongly the network reacts to which input frequency, regardless of the propagation path.
  52. Information expression using frequency
    • Information can be expressed with frequency in the following ways:
      – PM: phase modulation
      – FM: frequency modulation
      – AM: amplitude modulation
    • Among these, AM seems to require a population expression over multiple neurons rather than a single neuron, given the firing characteristics of neurons (the all-or-none law).
  53. Image of the firing cycle (13.0 pps)
    [Figure/animation over 1.0 s: 13.00 pps → 13/1, 6.50 pps → 13/2, 4.33 pps → 13/3, 3.25 pps → 13/4, 2.60 pps → 13/5, 2.16 pps → 13/6.]
  54. Image of the firing cycle (11.0 pps)
    [Figure/animation over 1.0 s: 11.0 pps → 11/1, 5.5 pps → 11/2, 3.66 pps → 11/3, 2.75 pps → 11/4, 2.2 pps → 11/5, 1.83 pps → 11/6.]
  55. Image of the firing cycle (7.0 pps)
    [Figure/animation over 1.0 s: 7.0 pps → 7/1, 3.5 pps → 7/2, 2.33 pps → 7/3, 1.75 pps → 7/4, 1.4 pps → 7/5, 1.16 pps → 7/6.]
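The three cascades above can be generated directly; the values here are rounded to two decimals, whereas the figures truncate:

```python
for base in (13.0, 11.0, 7.0):
    print([round(base / n, 2) for n in range(1, 7)])
# [13.0, 6.5, 4.33, 3.25, 2.6, 2.17]
# [11.0, 5.5, 3.67, 2.75, 2.2, 1.83]
# [7.0, 3.5, 2.33, 1.75, 1.4, 1.17]
```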
  56. • The following is an example of the reaction when coprime frequencies, colored in RGB as channels, are input from different neurons.
    [Figure: inputs 13.0 pps (channel A), 11.0 pps (channel B), 7.0 pps (channel C); downstream neurons respond at 1/n rates with mixed channel shares, e.g. 6.5 pps at 50% channel A, 5.5 pps at 50% channel B, 3.5 pps at 50% channel C, 3.25 pps at 25% channel A, 1.75 pps at 25% channel C, and deeper neurons at 12.5–25% mixtures of several channels.]
  57. Summary of frequency coding
    • With coprime frequencies on the input channels, the probability that the frequencies interfere within a unit time is very low.
    • Using this characteristic, multiple channels can be separated or superposed in the input, output, and processing of information.
    • By analyzing the output signal per frequency channel, it is possible to determine which input frequency channel each neuron is reacting to, regardless of the path.
    • Therefore, compared with computational-graph NNs, the network can be prevented from becoming a black box.
  58. Actual behavior when frequencies are superposed
    • In practice it does not go so smoothly, because when nearby frequencies are superposed, a neuron is affected by potential raised by the other frequency.
    • There are ways to eliminate this effect, but the interference itself might be turned into a form of information expression; which is better needs verification.
  59. Frequency Channel Combinations
    • The requirement for combining frequency channels is that they be "coprime, with a small ratio between their differences".
    • Three sequential numbers starting from any odd number are pairwise coprime (checked below).
    • Examples: (1, 2, 3), (3, 4, 5), (5, 6, 7), (7, 8, 9), (9, 10, 11), (11, 12, 13), (13, 14, 15), …, (41, 42, 43), (43, 44, 45)
    • Higher frequencies reduce the ratio of the frequency differences.
    • However, at a time resolution of 0.1 ms, interference occurred from the combination (43, 44, 45) upward.
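A quick check of the coprimality claim: in three consecutive integers starting from an odd number, neighbors are always coprime, and the two odd members differ by 2, so their gcd is 1 as well:

```python
from math import gcd
from itertools import combinations

def pairwise_coprime(nums):
    return all(gcd(a, b) == 1 for a, b in combinations(nums, 2))

print(all(pairwise_coprime((k, k + 1, k + 2)) for k in range(1, 100, 2)))  # True
```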
  60. Relationship with Gödel Numbers
    • Combining coprime integers with frequencies of 1/n or n times to express conceptual structure is similar to the Gödel numbering used in Gödel's incompleteness theorems.
  61. Periodic firing, consciousness, concentration
    Is the control of the frequency channel related to:
    • consciousness, selection, and concentration
    • gamma rhythm, burst firing
    • the capacity of short-term memory being 4±1 chunks*
    • Some say it has nothing to do with the binding problem.
    • The boundary between consciousness and the unconscious may be as simple as that of sound: when a certain pattern of sound-pressure changes repeats faster than a certain period, humans hear it as a single tone; consciousness may be the same.
    * It used to be 7±2 chunks, called the magic number.
  62. Frequency Representation and Parallelism
    • The same processing as the influence-range control and the channel representation by frequency could be achieved by attaching the range and channel information directly to the signal.
    • But then parallel processing becomes difficult, and performance may fall as the scale increases.
    • I think it is better to express everything by "wave superposition and time", as in quantum mechanics, and collapse it into particles only when extracting information.
  63. Back-firing
    • In addition to forward propagation, NCN has a back-firing function that propagates signals backward, against the direction of relations, for the top-down and bottom-up analysis of concepts represented by the Concept Network function.
    • The following slides show the order of processing for forward firing and back-firing.
  64. • (forward) firing [animation]
    1. Input signal
    2. Increase/decrease in potential
    3. Propagate potential
    4. Increase/decrease in potential by weight
  65. • back-firing [animation] (see the sketch below)
    1. Input backward signal
    2. Increase/decrease in potential
    3. Back-propagate potential
    4. Increase/decrease in potential
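A rough sketch of one propagation tick in each direction; the deck does not say whether back-firing applies the synaptic weight (step 4 above omits it), so the unweighted backward step is an assumption:

```python
# Edges are (source, destination, weight) triples; `fired` is the set of
# neurons that spiked this tick; `potentials` maps neuron -> potential.
def propagate(edges, fired, potentials, backward=False):
    for src, dst, weight in edges:
        if backward:                      # back-firing: against the edge
            src, dst, weight = dst, src, 1.0
        if src in fired:
            potentials[dst] = potentials.get(dst, 0.0) + weight

pot = {}
propagate([("A", "B", 0.9)], fired={"A"}, potentials=pot)                 # forward
propagate([("A", "B", 0.9)], fired={"B"}, potentials=pot, backward=True)  # backward
print(pot)  # {'B': 0.9, 'A': 1.0}
```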
  66. Types of ions
    • Virtual ions and ion channels are used to realize forward and back firing.
    • In addition to this, three frequency channels, phase, and input amount are combined to form an input signal.
                    forward         backward
      Excitatory    fo-ex (+1eV)    ba-ex (+1eV)
      Inhibitory    fo-in (-1eV)    ba-in (-1eV)
  67. Types of Ions and Quantum chromodynamics
    • The combination of frequency channels, forward/backward direction, and excitatory/inhibitory properties may be analogous to the color charge and the top/bottom, strange/charm, up/down flavors that characterize quarks.
  68. Reproduction of the back-firing function in biological neurons
    • Backward propagation via electrical synapses
      – In a chemical synapse, signals propagate forward only.
      – In an electrical synapse, however, signals propagate both forward and backward.
      – Electrical synapses are present in inhibitory neurons in the hippocampus and cerebral cortex.
    • Backward propagation on dendrites
      – Potential changes may propagate backward along the dendrites.
    → This seems difficult to reproduce with a simple network.
  69. Closer to biological neurons
    • To reproduce the propagation velocity on axons and dendrites, a parameter called width (thickness) is introduced.
    [Figure/animation: width = 0.5, 1.0, 2.0.]
  70. Axons and dendrites
    • The propagation velocity of an axon (myelinated nerve) is fast.
    • The propagation velocity of dendrites is slow.
    • Action potentials may also occur on dendrites; these are called dendritic spikes.
  71. Propagation velocity of biological neurites
    • A thicker biological neurite propagates action potentials faster than a thinner one.
    • Potential below the threshold propagates while attenuating; in that case, a thicker neurite may propagate the potential more slowly than a thinner one.
  72. Collision between forward- and backward-propagating signals
    • NCN uses backward propagation not only for learning but also for analyzing concepts.
    • When the forward signal from the source neuron (concept) and the backward signal from the destination neuron (concept) are input at the same frequency (or at 1/n or n times it), the collision point on the neurite shifts as the phase of the input signals is changed (see the sketch below).
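A toy calculation of where the signals meet, assuming a constant propagation speed v on a neurite of length L, with the forward signal entering at x = 0 at t = 0 and the backward signal entering at x = L at t = Δt: they meet where v·t = L − v·(t − Δt), i.e. at x = (L + v·Δt) / 2, so adjusting the phase offset Δt slides the collision point:

```python
def collision_point(length, speed, phase_dt):
    """x-coordinate where the forward and backward signals meet."""
    return (length + speed * phase_dt) / 2.0

print(collision_point(10.0, 1.0, 0.0))  # 5.0 -> in-phase signals meet mid-neurite
print(collision_point(10.0, 1.0, 4.0))  # 7.0 -> a forward lead shifts the point
```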
  73. • It is possible to estimate the structure of the neural (concept) network by analyzing the reactions of each neuron.
    [Figure/animation: input forward signal, input backward signal.]
  74. Signal collisions below the threshold
    • When a forward-propagating signal and a back-propagating signal that are each below the threshold collide, and the combined potential exceeds the threshold, an action potential occurs in the middle of the neurite.
    • This state can also be utilized to analyze the network structure.
  75. Relative representation of concepts by phase control
    • Controlling the collision point of the signals through phase control of the forward and backward signals corresponds to changing the position of the relative representation in the concept network.
  76. Structural representation by reaction timing
    • By changing the frequency and phase of the top-down and bottom-up signals and analyzing the firing reactions, it is possible to infer the structure of the network.
    • This means the network structure can be encoded in firing timing and rate.
    • Of course, there is no point in inferring the structure of a predefined network.
    • But if this signal can itself be processed by a neural network, a meta neural network that dynamically represents and processes a virtual neural network could be realized.
  77. Similarity between consciousness, concentration and radar
    • The similarity to the functions of radar may be usable when thinking about consideration and concentration.
    • Radar scanning modes
      – Lock-on modes
        • TWS (Track While Scan): multiple targets can be tracked at the same time.
        • STT (Single Target Track): only a single target is tracked. This may relate to the concentration of consciousness.
    • Phased array radar
      – Directivity can be given to the synthesized wave by shifting the oscillation timing of many radar elements.
      – The brain might transmit directed signals by the same mechanism.
  78. Forming a new concept from a concept loop WIP
    [Figure: a loop of concepts A, B, C, D, E, F, G, H closing into a new concept "I".]
  79. Concept, relation and superstring theory WIP
    • A concept has a size.
    • Concept and relation can be converted into each other.
    • A relation becomes a concept when rounded up.
    • A concept becomes a relation when made smaller and stretched.
  80. Reference materials
    1. I Am a Strange Loop
    2. The Neural Code of Pitch and Harmony