HEARING COLOURS AND SEEING SOUNDS USING PYTHON

We humans are masterpieces. We are born with remarkable capabilities and talents that make each of us different from the other. No individual thinks quite the way others do; each of us has an intriguing way of looking at the world. A few of us have the exceptional ability to visualise sounds and to map colours onto words. This is termed synesthesia. It is fascinating how a synesthetic brain can see sounds and hear colours: it thinks of a word, sees blocks of colour, and those blocks are then translated into letters. Some synesthetes see letters with a halo effect, each letter giving off a glow of a certain colour. Strikingly, synesthetic brains are wired a little differently from those of non-synesthetes.
This talk covers interesting applications of Python programming, harnessing simple mathematical techniques to simulate the synesthetic (modality-bridging) behaviour of the brain. It is appealing to look at the world through colourful shades, and building an AI that bridges these modalities (vision, hearing, etc.) is truly astounding. The talk also offers a viewpoint on how Python can be used to analyse some of the really complex behaviours of the human brain, in addition to solving comparatively trivial real-world problems.

AAKANKSHA CHOUHAN

May 03, 2019

Transcript

  1. A DEEP LEARNING MODEL (Bock et al., 2018)
     1. Handwritten letters database - EMNIST
     2. Synesthesia color-letter pairs - each grayscale letter image is converted to a 3-channel (R, G, B) image using experimental statistics (a minimal colouring sketch follows this slide).
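     The per-letter colouring step above can be illustrated with a short NumPy sketch. Everything here is illustrative: the SYNESTHETIC_COLOURS table and the colourise_letter helper are hypothetical names, and the real pairings would come from the experimental colour statistics referenced by Bock et al. (2018), not from values invented here.

     import numpy as np

     # Hypothetical letter -> (R, G, B) table; the actual per-letter colours
     # would come from the experimental statistics, not these placeholders.
     SYNESTHETIC_COLOURS = {
         "A": (220, 40, 40),    # red-ish
         "B": (40, 80, 220),    # blue-ish
         "C": (240, 200, 40),   # yellow-ish
         # ... one entry per letter
     }

     def colourise_letter(gray_img: np.ndarray, letter: str) -> np.ndarray:
         """Turn a grayscale letter image (H, W) with values 0-255 into an
         (H, W, 3) image tinted with that letter's synesthetic colour."""
         colour = np.array(SYNESTHETIC_COLOURS[letter.upper()], dtype=np.float32)
         intensity = gray_img.astype(np.float32) / 255.0   # 0 = background, 1 = ink
         rgb = intensity[..., None] * colour               # scale the colour by ink intensity
         return rgb.astype(np.uint8)

     # Usage: a fake 28x28 "letter" just to show the shapes involved.
     fake_letter = (np.random.rand(28, 28) * 255).astype(np.uint8)
     coloured = colourise_letter(fake_letter, "A")
     print(coloured.shape)   # (28, 28, 3)

     Scaling the chosen colour by ink intensity keeps the background dark while tinting the strokes, which is one simple way to read "converted to a 3-channel image".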
  2. ❏ A GAN is developed, in which both the discriminator D and the generator G are implemented as multilayer convolutional neural networks.
     ❏ The trained generator network G is a model for grapheme-color synesthesia.
     ❏ Achromatic letter images induce computational synesthesia.
     ❏ G learns to create the coloured versions (a minimal GAN sketch follows this slide).
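     A minimal sketch of the D/G pair described above, assuming PyTorch (the slides do not say which framework the original implementation used); the layer sizes and the 28x28 image shape are assumptions made for illustration only.

     import torch
     import torch.nn as nn

     class Generator(nn.Module):
         """Maps a 1-channel (achromatic) letter image to a 3-channel coloured one."""
         def __init__(self):
             super().__init__()
             self.net = nn.Sequential(
                 nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                 nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                 nn.Conv2d(64, 3, 3, padding=1), nn.Sigmoid(),  # RGB in [0, 1]
             )

         def forward(self, gray):
             return self.net(gray)

     class Discriminator(nn.Module):
         """Scores a (grayscale input, colour output) pair as real or generated."""
         def __init__(self):
             super().__init__()
             self.net = nn.Sequential(
                 nn.Conv2d(1 + 3, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
                 nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
                 nn.Flatten(),
                 nn.Linear(64 * 7 * 7, 1),
             )

         def forward(self, gray, colour):
             return self.net(torch.cat([gray, colour], dim=1))

     G, D = Generator(), Discriminator()
     gray = torch.rand(8, 1, 28, 28)          # a batch of achromatic letters
     fake_colour = G(gray)                    # G "hallucinates" the colours
     score = D(gray, fake_colour)             # D judges the (input, colour) pairing
     print(fake_colour.shape, score.shape)    # torch.Size([8, 3, 28, 28]) torch.Size([8, 1])

     Training would alternate the usual GAN updates: D learns to separate real coloured letters from G's output, and G learns to fool D, which is how G comes to produce the coloured versions.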
  3. 1. The conditional GAN model of grapheme-color synesthesia perception is a deep convolutional neural network (CNN) implementation.
     2. The generator network encodes the input image through six successive hidden layers, each outputting a reduced-dimensional image relative to the preceding layer (see the encoder sketch after this slide).
     3. The representation of the original input image's features becomes increasingly abstracted and noise-filtered after each encoding layer of processing.
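     The six-layer encoding in point 2 can be sketched as a stack of strided convolutions, each halving the spatial resolution so the representation grows more abstract layer by layer. This is an assumed PyTorch illustration: the 64x64 input size and the channel counts are chosen so that six halvings fit, not taken from the paper.

     import torch
     import torch.nn as nn

     def enc_block(c_in, c_out):
         # One encoding layer: a strided convolution halves height and width.
         return nn.Sequential(
             nn.Conv2d(c_in, c_out, kernel_size=4, stride=2, padding=1),
             nn.BatchNorm2d(c_out),
             nn.LeakyReLU(0.2),
         )

     encoder = nn.Sequential(
         enc_block(1, 16),     # 64x64 -> 32x32
         enc_block(16, 32),    # 32x32 -> 16x16
         enc_block(32, 64),    # 16x16 -> 8x8
         enc_block(64, 128),   # 8x8  -> 4x4
         enc_block(128, 256),  # 4x4  -> 2x2
         enc_block(256, 512),  # 2x2  -> 1x1
     )

     x = torch.rand(4, 1, 64, 64)              # a small batch of achromatic letters
     for i, layer in enumerate(encoder, 1):
         x = layer(x)
         print(f"after encoding layer {i}: {tuple(x.shape)}")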
  4. • Automatic induction of the psychological experience (in the trained GAN)
     • Invariance of colors across changes in the form of the inducing symbols
     • Stability of synesthetic associations over time
     • Learnability of synesthesia by non-synesthetes
  5. References
     • https://github.com/Enderb/chromestesia
     • http://sound-of-pixels.csail.mit.edu
     • https://www.researchgate.net/publication/259353806_Numerical_synesthesia_is_more_than_just_a_symbol-induced_phenomenon
     • https://www.academia.edu/18816690/Audio_visual_mapping_with_cross-modal_hidden_Markov_models
     • https://deepmind.com/blog/objects-that-sound/
     • https://www.researchgate.net/publication/323737970_A_Deep_Learning_Model_of_Perception_in_Color-Letter_Synesthesia
     • https://github.com/sfpc-amd/synesthesia-exhibition
     • https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3606019/
     • http://home.wlu.edu/~lambertk/classes/196/ImageAndSound.html
     • https://medium.com/sap-machine-learning-research/cross-modal-hallucination-for-few-shot-fine-grained-recognition-d89099187818