Associative memory

I don't know if this is the simplest form of associative memory ever, but there ain't much to it. Almost nothing, in fact! https://discourse.numenta.org/t/independent-value-storage-by-binarization/2384/9

// 512-element vectors, 15 binarization layers (density), arbitrary hash seed
AssociativeMemory am=new AssociativeMemory(512,15,1234567);
float[][] examples=new float[4][512]; // four patterns to associate in a cycle
float[] x=new float[512];             // recall output buffer

void setup() {
  size(512, 440);
  background(0);
  frameRate(4);
  // four example waveforms: sine, cosine, sawtooth, parabola
  for(int i=0;i<512;i++){
    examples[0][i]=50f*sin(0.1f*i);
    examples[1][i]=50f*cos(0.3f*i);
    examples[2][i]=10f*(i%11)-50;
    examples[3][i]=0.0003f*i*i-50;
  }
}

void draw() {
  clear();
  // recall each pattern from its predecessor and plot the four results
  for(int i=0;i<4;i++){
    am.recallVec(x,examples[(i-1)&3]);
    for(int j=0;j<512;j++){
      set(j,60+i*110+(int)x[j],color(255));
    }
  }
  // train the cyclic pairs: examples[i] -> examples[(i+1)&3]
  for(int i=0;i<4;i++){
    am.trainVec(examples[(i+1)&3],examples[i]);
  }
}

class AssociativeMemory {

  int vecLen;
  int density;
  int hash;
  float[][] weights;
  float[][] bipolar;
  float[] workA;
  float[] workB;
  // vecLen must be a power of 2 (2, 4, 8, 16, 32, ...)
  AssociativeMemory(int vecLen, int density, int hash) {
    this.vecLen=vecLen;
    this.density=density;
    this.hash=hash;
    weights=new float[density][vecLen];
    bipolar=new float[density][vecLen];
    workA=new float[vecLen];
    workB=new float[vecLen];
  }

  // Nudge the weights so that recalling inVec moves toward targetVec.
  void trainVec(float[] targetVec, float[] inVec) {
    float rate=1f/density;
    recallVec(workB, inVec); // also refreshes bipolar[] as a side effect
    for(int i=0;i<vecLen;i++){
      workB[i]=targetVec[i]-workB[i]; // prediction error
    }
    for (int i=0; i<density; i++) {
      for (int j=0; j<vecLen; j++) {
        weights[i][j]+=workB[j]*bipolar[i][j]*rate;
      }
    }
  }

  // Recall: repeatedly sign-flip and transform the input, binarize each stage
  // to +/-1, and sum the weighted binary vectors.
  void recallVec(float[] resultVec, float[] inVec) {
    System.arraycopy(inVec, 0, workA, 0, vecLen);
    java.util.Arrays.fill(resultVec, 0f);
    for (int i=0; i<density; i++) {
      signFlip(workA, hash+i);   // recomputable random sign flips
      wht(workA);                // mixes information across all elements
      signOf(bipolar[i], workA); // binarize to +1 or -1
      for (int j=0; j<vecLen; j++) {
        resultVec[j]+=weights[i][j]*bipolar[i][j];
      }
    }
  }

  // Walsh-Hadamard transform; vec.length must be a power of 2 (2, 4, 8, 16, 32, ...)
  void wht(float[] vec) {
    int i, j, hs=1, n=vec.length;
    float a, b, scale=1f/sqrt(n);
    while (hs<n) {
      i=0;
      while (i<n) {
        j=i+hs;
        while (i<j) {
          a=vec[i];
          b=vec[i+hs];
          vec[i]=a+b;
          vec[i+hs]=a-b;
          i+=1;
        }
        i+=hs;
      }
      hs+=hs;
    }
    for ( i=0; i<n; i++) {
      vec[i]*=scale;
    }
  }

  // recomputable random sign flip of the elements of vec 
  void signFlip(float[] vec, int h) {
    for (int i=0; i<vec.length; i++) {
      h*=0x9E3779B9;
      h+=0x6A09E667;
      // Branch-free (faster) version of: if(h<0) vec[i]=-vec[i];
      vec[i]=Float.intBitsToFloat((h&0x80000000)^Float.floatToRawIntBits(vec[i]));
    }
  }

  // converts each element of vec to +1 or -1 according to its sign.
  void signOf(float[] biVec, float[] vec ) {
    int one=Float.floatToRawIntBits(1f);
    for (int i=0; i<biVec.length; i++) {
      biVec[i]=Float.intBitsToFloat(one|(Float.floatToRawIntBits(vec[i])&0x80000000));
    }
  }
}
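
A quick sanity check of the transform, not part of the sketch itself: with the 1/sqrt(n) scaling, wht() is its own inverse, so applying it twice should give back the original vector. Something like this, dropped into setup():

float[] t={1,2,3,4,5,6,7,8};
AssociativeMemory m=new AssociativeMemory(8,1,1); // dummy instance just to call wht()
m.wht(t);
m.wht(t);
println(t); // prints the original 1.0 ... 8.0 (up to float rounding)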

Comments

  • Hi Sean -- thank you so much for sharing this interesting work!

    Would you be willing to edit your original post and say something about:

    1. What field(s) of research "associative memory" comes from, and (briefly) what it is
    2. What your sketch does -- e.g., it clearly creates an AssociativeMemory class and uses it -- what is it doing?

    This would really help others thinking about how they might learn from your example.

  • Associative memory has to do with machine learning, AI, and probably your brain. There was a lot of interest in the topic in the 1960s; then it lost out to other things. Basically you can teach it pattern pairs (vector-to-vector associations, <vector,vector>). You can also modify it to learn <vector,scalar> pairs, where a scalar is just some floating-point value like 1.1 or 4.5.
    You can think of it as a <vector,vector> hash table if you like. However, it will accept inexact (noisy) inputs and still produce identifiable outputs, as the sketch just below illustrates.
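    A rough sketch of that noise tolerance, reusing the AssociativeMemory class from the post (the noise level here is made up; drop this into setup()):

    AssociativeMemory am2=new AssociativeMemory(512,15,1234567);
    float[] key=new float[512];
    float[] value=new float[512];
    float[] noisyKey=new float[512];
    float[] out=new float[512];
    for(int i=0;i<512;i++){
      key[i]=50f*sin(0.1f*i);    // pattern to present
      value[i]=50f*cos(0.3f*i);  // pattern to associate with it
    }
    am2.trainVec(value,key);     // for a single pair, one update essentially stores it
    for(int i=0;i<512;i++){
      noisyKey[i]=key[i]+random(-5f,5f); // corrupt the key a little
    }
    am2.recallVec(out,noisyKey); // out should still be recognizably close to value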

  • I see in the sketch that in one of the recalled patterns the pixels move about. I'm guessing, but I would say that's due to rounding errors, like 4.99999 being displayed as 4 and 5.00001 as 5; a quick check is below.
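    A quick check of that guess (this only demonstrates the (int) cast used in the sketch's set() call, not the memory itself):

    println((int)4.99999f); // prints 4 -- (int) truncates toward zero
    println((int)5.00001f); // prints 5 -- so a tiny recall error can move a pixel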
