mantel/documents/kohonen.ipynb
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Self Organising Map Challenge\n",
"\n",
"## The Kohonen Network\n",
"\n",
"The Kohonen Self Organising Map (SOM) provides a data visualization technique which helps to understand high dimensional data by reducing the dimensions of data to a map. SOM also represents clustering concept by grouping similar data together.\n",
"\n",
"Unlike other learning technique in neural networks, training a SOM requires no target vector. A SOM learns to classify the training data without any external supervision.\n",
"\n",
"![Network](http://www.pitt.edu/~is2470pb/Spring05/FinalProjects/Group1a/tutorial/kohonen1.gif)\n",
"\n",
"### Structure\n",
"A network has a width and a height that descibes the grid of nodes. For example, the grid may be 4x4, and so there would be 16 nodes.\n",
"\n",
"Each node has a weight for each value in the input vector. A weight is simply a float value that the node multiplies the input value by to determine how influential it is (see below)\n",
"\n",
"Each node has a set of weights that match the size of the input vector. For example, if the input vector has 10 elements, each node would have 10 weights.\n",
"\n",
"### Training \n",
"To train the network\n",
"\n",
"1. Each node's weights are initialized.\n",
"2. We enumerate through the training data for some number of iterations (repeating if necessary). The current value we are training against will be referred to as the `current input vector`\n",
"3. Every node is examined to calculate which one's weights are most like the input vector. The winning node is commonly known as the Best Matching Unit (BMU).\n",
"4. The radius of the neighbourhood of the BMU is now calculated. This is a value that starts large, typically set to the 'radius' of the lattice, but diminishes each time-step. Any nodes found within this radius are deemed to be inside the BMU's neighbourhood.\n",
"5. Each neighbouring node's (the nodes found in step 4) weights are adjusted to make them more like the input vector. The closer a node is to the BMU, the more its weights get altered.\n",
"6. Go to step 2 until we've completed N iterations.\n",
" \n",
"\n",
"### Calculating the Best Matching Unit (BMU)\n",
"\n",
"To determine the best matching unit, one method is to iterate through all the nodes and calculate the Euclidean distance between each node's weight vector and the current input vector. The node with a weight vector closest to the input vector is tagged as the BMU.\n",
"\n",
"The Euclidean distance $\\mathsf{distance}_{i}$ (from the input vector $V$ to the $i$th node's weights $W_i$)is given as (using Pythagoras):\n",
"\n",
"$$ \\mathsf{distance}_{i}=\\sqrt{\\sum_{k=0}^{k=n}(V_k - W_{i_k})^2}$$\n",
"\n",
"where V is the current input vector and $W_i$ is the node's weight vector. $n$ is the size of the input & weight vector.\n",
"\n",
"*Note*: $V$ and $W$ are vectors. $V$ is the input vector, and $W_i$ is the weight vector of the $i$th node. $V_k$ and $W_{i_k}$ represent the $k$'th value within those vectors. \n",
"\n",
"The BMU is the node with the minimal distance for the current input vector\n",
"\n",
"### Calculating the Neighbourhood Radius\n",
"\n",
"The next step is to calculate which of the other nodes are within the BMU's neighbourhood. All these nodes will have their weight vectors altered.\n",
"\n",
"First we calculate what the radius of the neighbourhood should be and then use Pythagoras to determine if each node is within the radial distance or not.\n",
"\n",
"A unique feature of the Kohonen learning algorithm is that the area of the neighbourhood shrinks over time. To do this we use the exponential decay function:\n",
"\n",
"Given a desired number of training iterations $n$:\n",
"$$n_{\\mathsf{max iterations}} = 100$$\n",
"\n",
"Calculate the radius $\\sigma_t$ at iteration number $t$:\n",
"\n",
"$$\\sigma_t = \\sigma_0 \\exp\\left(- \\frac{t}{\\lambda} \\right) \\qquad t = 1,2,3,4... $$\n",
"\n",
"Where $\\sigma_0$ denotes the neighbourhood radius at iteration $t=0$, $t$ is the current iteration. We define $\\sigma_0$ (the initial radius) and $\\lambda$ (the time constant) as below:\n",
"\n",
"$$\\sigma_0 = \\frac{\\max(width,height)}{2} \\qquad \\lambda = \\frac{n_{\\mathsf{max iterations}}}{\\log(\\sigma_0)} $$\n",
"\n",
"Where $width$ & $height$ are the width and height of the grid.\n",
"\n",
"### Calculating the Learning Rate\n",
"\n",
"We define the initial leanring rate $\\alpha_0$ at iteration $t = 0$ as:\n",
"$$\\alpha_0 = 0.1$$\n",
"\n",
"So, we can calculate the learning rate at a given iteration t as:\n",
"\n",
"$$\\alpha_t = \\alpha_0 \\exp \\left(- \\frac{t}{\\lambda} \\right) $$\n",
"\n",
"where $t$ is the iteration number, $\\lambda$ is the time constant (calculated above)\n",
" \n",
"### Calculating the Influence\n",
"\n",
"As well as the learning rate, we need to calculate the influence $\\theta_t$ of the learning/training at a given iteration $t$. \n",
"\n",
"So for each node, we need to caclulate the euclidean distance $d_i$ from the BMU to that node. Similar to when we calculate the distance to find the BMU, we use Pythagoras. The current ($i$th) node's x position is given by $x(W_i)$, and the BMU's x position is, likewise, given by $x(Z)$. Similarly, $y()$ returns the y position of a node.\n",
"\n",
"$$ d_{i}=\\sqrt{(x(W_i) - x(Z))^2 + (y(W_i) - y(Z))^2} $$\n",
"\n",
"Then, the influence decays over time according to:\n",
"\n",
"$$\\theta_t = \\exp \\left( - \\frac{d_{i}^2}{2\\sigma_t^2} \\right) $$\n",
"\n",
"Where $\\sigma_t$ is the neighbourhood radius at iteration $t$ as calculated above. \n",
"\n",
"Note: You will need to come up with an approach to x() and y().\n",
"\n",
"\n",
"### Updating the Weights\n",
"\n",
"To update the weights of a given node, we use:\n",
"\n",
"$$W_{i_{t+1}} = W_{i_t} + \\alpha_t \\theta_t (V_t - W_{i_t})$$\n",
" \n",
"So $W_{i_{t+1}}$ is the new value of the weight for the $i$th node, $V_t$ is the current value of the training data, $W_{i_t}$ is the current weight and $\\alpha_t$ and $\\theta_t$ are the learning rate and influence calculated above.\n",
"\n",
"*Note*: the $W$ and $V$ are vectors "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Example 10x10 network after 100 iterations"
]
},
{
"cell_type": "code",
"execution_count": 147,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x7fce01327978>"
]
},
"execution_count": 147,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAPgAAAD8CAYAAABaQGkdAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAADAFJREFUeJzt3d2LnOUZx/Hfb2b2JbuJxlpLMZFqodhKoSiLqAEP1AOtUk96YEGhpZCD1lcEsaXgPyCiB1IIsT1R9CB6ICJqQT3oSXCNQhtXQdTGVK0pmpjX3Z2Zqwe7hfiSnSfZ++6ze/X7ASG7Tq5cbPabZ3Z29h5HhADk1Gl7AQD1EDiQGIEDiRE4kBiBA4kROJAYgQOJETiQGIEDifVqDJ2a3BSbp88tP3g4KD9TkofD8jNVZ9eO+nXmdurs2+2U/9hKUqdb/hmYrrSrOuV3/dfBeR061veo21UJfPP0ufr1jX8oPrd3+EjxmZLUWThafObE8GDxmZI01TlQZ+74oSpzN06fqDJ3atNC8ZnjlXbtbCg/9zePvd3szy7+JwNYMwgcSIzAgcQIHEiMwIHECBxIrFHgtq+3/Y7td23fX3spAGWMDNx2V9Kjkm6QdImkX9i+pPZiAFavyRX8cknvRsR7EbEg6SlJN9ddC0AJTQLfIunDk97ev/y+L7G93fas7dmj84dL7QdgFZoE/k3Pd/3ak2sjYkdEzETEzPTEptVvBmDVmgS+X9IFJ729VdJHddYBUFKTwF+T9APbF9kel3SLpGfrrgWghJE/TRYRfdu3S3pRUlfSnyJib/XNAKxaox8XjYjnJT1feRcAhfFMNiAxAgcSI3AgMQIHEiNwILEqhy6q25U2bS4+djgceYjkGem4wkmlg275mZJq/ZvsOp8J8lilk3DHyx+66Mn54jMlqVNhrt3sBFiu4EBiBA4kRuBAYgQOJEbgQGIEDiRG4EBiBA4kRuBAYgQOJEbgQGIEDiRG4EBiBA4kRuBAYgQOJEbgQGIEDiRG4EBiBA4kRuBAYlXO0ozemAbnfbf83ImJ4jMlqXP4ay93vmrDE8eKz5SkGI5VmauxOifWarrCibWSYrr8SaUxdaL4TEkaVjgBNjrNPme5ggOJETiQGIEDiRE4kBiBA4kROJDYyMBtX2D7Fdtztvfavut/sRiA1WvyffC+pHsjYo/tTZJet/2XiHir8m4AVmnkFTwiPo6IPcu/PixpTtKW2osBWL3T+hrc9oWSLpW0u8YyAMpqHLjtjZKelnR3RHzxDf9/u+1Z27PHjh0suSOAM9QocNtjWor7iYh45ptuExE7ImImImampjaX3BHAGWryKLolPSZpLiIeqr8SgFKaXMG3SbpN0jW231z+76eV9wJQwMhvk0XEXyVV+llCADXxTDYgMQIHEiNwIDECBxIjcCCxSocu9rT47e8Un9uZniw+U5Jielh8po8cKT5Tksb7n1WZ2+9168ydKv+xlaTFjYvlh05WmCnJvUHxmWEOXQT+7xE4kBiBA4kROJAYgQOJETiQGIEDiRE4kBiBA4kROJAYgQOJETiQGIEDiRE4kBiBA4kROJAYgQOJETiQGIEDiRE4kBiBA4nVOVW101F/eqL83LGp4jMlaaG3sfjMxcnp4jMlKQZ1TpZVd7zO3A11TmsdTpS/No316rwEX6dTfu6w4Uyu4EBiBA4kRuBAYgQOJEbgQGIEDiRG4EBijQO33bX9hu3nai4EoJzTuYLfJWmu1iIAymsUuO2tkm6UtLPuOgBKanoFf1jSfZJO+WrutrfbnrU9e/xInRepB3B6RgZu+yZJn0bE6yvdLiJ2RMRMRMxs2PitYgsCOHNNruDbJP3M9geSnpJ0je3Hq24FoIiRgUfE7yJia0RcKOkWSS9HxK3VNwOwanwfHEjstH4ePCJelfRqlU0AFMcVHEiMwIHECBxIjMCBxAgcSKzKqarDzlBHJ04Unxvd8jMlaeDF4jO7Y1F8piQdr/NXpvmJ8qfgStLiVKWTcMfL/51N1jlUVd0ov+uw2+zazBUcSIzAgcQIHEiMwIHECBxIjMCBxAgcSIzAgcQIHEiMwIHECBxIjMCBxAgcSIzAgcQIHEiMwIHECBxIjMCBxAgcSIzAgcQIHEisyhGdg05fn09+Wn7uwrHiMyVpMPys+EzraPGZkjTRG1SZe/yssSpzT5y9qcrc49Plr01TvcniMyVpQuVPVR2Mv9/odlzBgcQIHEiMwIHECBxIjMCBxAgcSKxR4LY3295l+23bc7avrL0YgNVr+n3wRyS9EBE/tz0uqc5LRgIoamTgts+SdLWkX0pSRCxIWqi7FoASmtxF/76kA5L+bPsN2zttT1feC0ABTQLvSbpM0h8j4lJJRyXd/9Ub2d5ue9b27Pyhg4XXBHAmmgS+X9L+iNi9/PYuLQX/JRGxIyJmImJm4uzNJXcEcIZGBh4Rn0j60PbFy++6VtJbVbcCUETTR9HvkPTE8iPo70n6Vb2VAJTSKPCIeFPSTOVdABTGM9mAxAgcSIzAgcQIHEiMwIHECBxIrMqpqn339fnEgeJzB8P1c6qqBnWerjvRm68yd3GyW2Xu4JwNVeb2N5ffd3GizqmqGzrlfzZrMN4sXa7gQGIEDiRG4EBiBA4kRuBAYgQOJEbgQGIEDiRG4EBiBA4kRuBAYgQOJEbgQGIEDiRG4EBiBA4kRuBAYgQOJEbgQGIEDiRW5dDFoQY63jlSfO4gys+UpP58+QMS49jh4jMlqe/yB/hJUsdVxqq3YbzK3M50+UMXuxN1Dp7sdMpfR6PhXxhXcCAxAgcSI3AgMQIHEiNwIDECBxIjcCCxRoHbvsf2Xtt/t/2k7Tqv0gagqJGB294i6U5JMxHxY0ldSbfUXgzA6jW9i96TtMF2T9KUpI/qrQSglJGBR8Q/JT0oaZ+kjyUdioiXvno729ttz9qeXfyiztM0AZyeJnfRz5F0s6SLJJ0vadr2rV+9XUTsiIiZiJgZO2tT+U0BnLYmd9Gvk/R+RByIiEVJz0i6qu5aAEpoEvg+SVfYnrJtSddKmqu7FoASmnwNvlvSLkl7JP1t+ffsqLwXgAIa/Tx4RDwg6YHKuwAojGeyAYkROJAYgQOJETiQGIEDiVU5VVUx1GDxRPGx/fljxWfWmhsL88VnSlJnYlBlbr/Sv/WLnTonlZ5w+X3HHcVnSlLPw+Izh+JUVeD/HoEDiRE4kBiBA4kROJAYgQOJETiQGIEDiRE4kBiBA4kROJAYgQOJETiQGIEDiRE4kBiBA4kROJAYgQOJETiQGIEDiRE4kJgjyp8kafuApH80uOm3Jf27+AL1rKd919Ou0vrady3s+r2IOG/UjaoE3pTt2YiYaW2B07Se9l1Pu0rra9/1tCt30YHECBxIrO3Ad7T855+u9bTvetpVWl/7rptdW/0aHEBdbV/BAVTUWuC2r7f9ju13bd/f1h6j2L7A9iu252zvtX1X2zs1Ybtr+w3bz7W9y0psb7a9y/bbyx/jK9veaSW271n+PPi77SdtT
7a900paCdx2V9Kjkm6QdImkX9i+pI1dGuhLujcifiTpCkm/XcO7nuwuSXNtL9HAI5JeiIgfSvqJ1vDOtrdIulPSTET8WFJX0i3tbrWytq7gl0t6NyLei4gFSU9JurmlXVYUER9HxJ7lXx/W0ifglna3WpntrZJulLSz7V1WYvssSVdLekySImIhIg62u9VIPUkbbPckTUn6qOV9VtRW4FskfXjS2/u1xqORJNsXSrpU0u52NxnpYUn3SSr/wtRlfV/SAUl/Xv5yYqft6baXOpWI+KekByXtk/SxpEMR8VK7W62srcC/6dXL1/TD+bY3Snpa0t0R8UXb+5yK7ZskfRoRr7e9SwM9SZdJ+mNEXCrpqKS1/HjMOVq6p3mRpPMlTdu+td2tVtZW4PslXXDS21u1hu/q2B7TUtxPRMQzbe8zwjZJP7P9gZa+9LnG9uPtrnRK+yXtj4j/3iPapaXg16rrJL0fEQciYlHSM5KuanmnFbUV+GuSfmD7ItvjWnqg4tmWdlmRbWvpa8S5iHio7X1
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"plt.imshow(image_data)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Example 100x100 network after 1000 iterations"
]
},
{
"cell_type": "code",
"execution_count": 141,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x7fce012f8208>"
]
},
"execution_count": 141,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAP4AAAD8CAYAAABXXhlaAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJztvW3MdttVFnrNdd/vBlsPlqJwNm09LUmDEhOFNAhycmIo5giHWH6gQQxpSLV//MCPRAv+0JN4EkmMwI8Tkp3TY6ohp3gqsQSNhlT84Z/KLhD5KAiCaTdUWg1Q0pzj+9xrTX+scV1zjDHn/bz3/uj9vNt7jmTv511rzTXXmGute42va4xRaq2YNGnSbdHy0AxMmjTp+jR/+JMm3SDNH/6kSTdI84c/adIN0vzhT5p0gzR/+JMm3SDNH/6kSTdIL+uHX0r546WUXyil/FIp5T2vFFOTJk367FJ5qQCeUsoBwL8H8McAvADgxwH86Vrrz71y7E2aNOmzQceXce5XAvilWusvA0Ap5f0A3gHg7A//Nb/7c+vrfu9r7520opzZP9pZ7JgdLSUOLm2uc9837q/5XABkpeb5aj+0O+k+vjMPZ9Y8PvnMWJ5S23H+S/N38w3mquNjft4n8lRfzD04x9uIu3wd7h2NHI89exxA6dZYzuwfjLl3Hfffn/u28hx6FUc3tQCf/tTH8f99+r/c/5Lg5f3w3wDg4277BQB/uOOllHcDeDcA/K43vQZ/7t98PeqWf1FtQady2LdphKz72G2JC98HH2zIvrfyXJt3K215K0+0ibfNprCHsNm5/qHYpbHZOfw4bMb/hiWfgrrZ/AuXaNv6NrXBmz4gHJy2t7TtLsbr8EXhD7OuBw3VD387xvnrIayj3Ww3r43hQto63IelxmP6OKTtEf/arHGt/kfCfy86Z7FtO2fNx4HF+NYYzmHr4I+Y4/yxxe5du+4j29/4X+w9WVbuOdq88bpla+ccNn4ceIzn2HXdLWn7+BzIvz2zwjnQU6n4we/6usGBnl7OD3/0Vem+Q7XW5wA8BwBf/BVfUBds+sGsSxt+sAUfbYrVfhUl/wjcFfi8eZM2/VD5Arq7s/GHbR8FPtwlv9C9yOeDWTmmUBLYB8ffCg7hBwWRhh9qzlPydv+BbBJms/nsReB98qKAP0DUONbmK7xvjkl9lzZ+PPW1SAOctpTm0zbi9R37/ssYzg1D9RzTs7Hn2uZ3ZL8I8cAfON8Few+8FOehqrVyTFo7AFT7xdtHgYcONgmF2sH/Mkv86FQ7xtd/cx9e/iBr+mDx3eZz5wdoH1vD30vo5Tj3XgDwJrf9RgC/9jLmmzRp0pXo5Uj8Hwfw1lLKWwD8KoBvAfCtTzppwyJJ5r+6GyglduL37GSfRX5ANydxsPDrHtU3foXXNhKlUNLrZF1536JJ4aff960STrKw0qqc9GiGp61RDMTjblAteT7O1V9NX/ckRbbB6CaJ07zkZRtI8SZabA5Oa3OtjX+thVrTFnngPS/elEjnLt3a/TZFMbWa+N4U3WpvfvBfUVUWD9y9tXVkG7xI20HPo3xCXFvS1g5UO715E01Qrc7u28HtpiZaTA2TZpp8CJvXKKRhXS7xX/IPv9Z6KqX8BQD/Evvv9P+utf7sS51v0qRJ16OXI/FRa/3nAP75K8TLpEmTrkQv64f/YqkAOKACZVfCt6W/PDWwTeo7HYHRqQU0pwwdKyepmPHcfSfNAnpUTZUyJ0kp0fmzH0uqkzRjO3eh57v3OLbwV1SdRz5ROnuk0sqXFdX6ne/iDw2UO+8ds/uxSieO6+jPcKZEVKcVDfFeIXmr07VlSsR17LS4/7uzqa5uPf/Z7Gh+xnyv0T2jtjtGAGTSoFeRC725Gx1pzlRJUYlFzDDSMIrnrXbN/d07JB591GPR+53WvpnhuvRO0KXEd+ISmpDdSZNukK4q8VH2D+PGAHld48F9J4De6cMQ4OJEzknx+32bX1+F5pb21VfMml/UhfOZtrAQPzBw+vCUFNeHHJJOejQR6f7fJGeSLeF8nZkkcsQ7MAxpvKxxzNiRZrRRwsSwZJQUWazG/QW9FtUWaeE1xeR5/5sW1c1rEl6zBkdalq48lRe0sJ53EuvRR82i5pvhtLklLXmzf9C56x2yJeEOqBXqh8TQr9NMj2t891ZqpLqnTjtZ40+ymou6LPG6i3Ndy0eLZRwvHtCU+JMm3SBdV+LXilLXhmDyKDbZYPZVZJiKX18CVg5OuhLcIEMohpG8JOMXsxAJSGlnpx7QnyNEWmenU7pyjkGIjlIvKg1Bogkgksy6FhQbAGDEG6ejuCKK0MeGkmSXhhHjnT7MlKHLzVSOoVN/Td3EFFYaAZCk3ST/AOeNyD0eTNu6UVxzk5iHLSIwBahZ4nM5hjUzNLpvUwtcTAssp7bkFkOMPqEt+WkWx2126ygsuVK79QftPss/w3eAoCX6pny4kNrl6WIQz5T4kybdIF1X4qNgw4JysK+W96wSBy+PPL9cEbIY7S2O4D/2v6ca7bv9PPsrIIpFFuwWULMo7ku6JHt03eJ3sjl/wyfb5skaRC/9JIGTd/q+pBBJ0+TdH/kbst+E12saxcBuz+dqHbynjf+tRlUlBSPatmeJcNuEOR/Ck+NjdZxRyyH4yvsd7JwS17yZM+RgnnXPE11BGbhDjWx1Y+mtoJbE7QYqSn4UOHBaBkdRCwl+GYsAMPq0nIzvqKGG34E06DoKGg1pSvxJk26Qrizxq76IAHDwX0VKekoA2YC0g5I9CW8a5+ywbmgHreSncSn0jlqm3yChpJ1CL7JhC9qq2ilnvrgNuttNLyEnEHHiPySUnJXwgwsTpiyNhP6SbPM7bESNyUydfyOE2fP9tvlMRDNoEyU2fTeR7UVJNN6Sj3KpJg1gmM6cPfN2DyQxyZvLTBJOQskzR04S5tj3RR42WySfA30J/j0v1Ulk95cQ5/D+tHCHXcfGUrdQIMVrvmvj/8Jg/pT4kybdIF09jn84NM+qB2kxxiyvrwKwJR339lC0CzeJgpS33U5vUvwc2sl7oE0SrJI0+zmU9LI1Qyos2S7hb++ZhhJFWhw/SfHROZ2kr2G/txeRpJ7ST5PU8NEViKfMY/LUAy2HnvN7WAY8oswnlPDZICxO9q9fbY0aloiSU9O6Z5agBSnPp53jgX2KrqRkGr16LuavE5dwKuPqSrIJyUw2j6IG9GMx+ci/c0zvZRTFJD0LAKTUZwA4KPpzoYGPKfEnTbpJmj/8SZNukK4M4Nmddkq2CPDYnZRoI80mOkBWr6KVpAsnAI8vLdMq1aQxqeyVB15snaoZ9dOlC6k5uLDNv2xRRfZqow8dxnlMlU3OSr9khScTiMarjdnZmU2HJYF/AO/4iyAWhTv9A9B9Z6JTZo7rdM7JdA+zE3HxZgHifdcjU1jMns/m5w+HmkNN5Zl6GDEBQMoJosrPkVuzYWQi8vkmKC0dbcUloDXTId1DOvmc81ZlukpJYxj+FGZXtAqzuyG8LPfQlPiTJt0gX
Tmch91hQyikcwYpnEeJw9RXhU16UM5GBwgLINJpSO3ASRqep1p+qW6egBfekbREqbQlidlJL7s6gFZsMVXtCY66c5IxaxLhI540lJyYNAgN6fQSeZBwDdVoIr9VAoaS2SfUphgf16xlcL+TmK6qnD+naQuO/6RA5HuoIpshAzfdtC3eU4Fd/EkG6pEGl4BNm7+nBO7YPVu3k43d30E62rC6l1sFNBmai+9cgU8Ft1Ns/kN6T5X+60FjhzWOuYCmxJ806QbpASC7B5fA4kAOsuFzYgy/lvumr3PQQnI57kXwg0vhTXZ6q8Kq+N49XJs2shB8wjnOn0oJudb4xQ6QWs3T2+eO1SDx21JrPhSu48c6vLPtj2sPbhOdG23MUVENScYtji35On7NtFFzgs0onLdFidiiprxvsV5i2MdknRS2bXDfoDvaGjnWngev7xOryL92cSw5o43vnoxhfleGV+vJ1s6aez7caX9SKffma6EKNgAIrZfX2Z0Sf9KkG6Sr2/jL1ooXxFwUgzzaJ+sUT5M
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"plt.imshow(image_data)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Challenge\n",
"\n",
"Sam has written an implementation of a Self Organising Map. Consider the following criteria when assessing Sam's code:\n",
"\n",
"- Could the code be made more efficient? A literal interpretation of the instructions above is not necessary.\n",
"- Is the code best structured for later use by other developers and in anticipation of productionisation?\n",
"- How would you approach productionising this application?\n",
"- Anything else you think is relevant."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# kohonen.py\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"\n",
"def train(input_data, n_max_iterations, width, height):\n",
" σ0 = max(width, height) / 2\n",
" α0 = 0.1\n",
" weights = np.random.random((width, height, 3))\n",
" λ = n_max_iterations / np.log(σ0)\n",
" for t in range(n_max_iterations):\n",
" σt = σ0 * np.exp(-t/λ)\n",
" αt = α0 * np.exp(-t/λ)\n",
" for vt in input_data:\n",
" bmu = np.argmin(np.sum((weights - vt) ** 2, axis=2))\n",
" bmu_x, bmu_y = np.unravel_index(bmu, (width, height))\n",
" for x in range(width):\n",
" for y in range(height):\n",
" di = np.sqrt(((x - bmu_x) ** 2) + ((y - bmu_y) ** 2))\n",
" θt = np.exp(-(di ** 2) / (2*(σt ** 2)))\n",
" weights[x, y] += αt * θt * (vt - weights[x, y])\n",
" return weights\n",
"\n",
"if __name__ == '__main__':\n",
" # Generate data\n",
" input_data = np.random.random((10,3))\n",
" image_data = train(input_data, 100, 10, 10)\n",
"\n",
" plt.imsave('100.png', image_data)\n",
"\n",
" # Generate data\n",
" input_data = np.random.random((10,3))\n",
" image_data = train(input_data, 1000, 100, 100)\n",
"\n",
" plt.imsave('1000.png', image_data)\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.1"
}
},
"nbformat": 4,
"nbformat_minor": 2
}