2016

AUTUMN

Started a Doctoral Degree

Initiated a Paper Review System

Experimented with OpenGL in Nim

SUMMER

Analysed the Genome of the Longtail

Created a Simple SPDY Proxy Server

Tracked Feet Positions from a Camera

SPRING

Performed Basic Programming in Prolog

Started Gathering Literature for Thesis

Simulated a Student Marking Session

WINTER

Built an Object Detector

Developed Machine Learning Models

Created a Cloud-Based Web Application

2015

AUTUMN

September marked the start of my Master's degree at the University of Bristol, a degree entitled Machine Learning, Data Mining, and High Performance Computing. A bit of a mouthful, but there are reasons for this particular choice. My interest in data science and artificial intelligence had built up over time, like an ocean wave breaking upon the shore: a wave that was there all along, but only became visible and tall when it reached its destination, the point at which you, the beachgoer, get smacked in the chest and taken by surprise. That's roughly what happened to me; one day I decided that this was what I wanted to do. So there I was. Now, let me tell you about what I did.

Performed Image Manipulations

Finding coins in an image using Hough transformations.

One of my most enjoyable courses was taught by Tilo Burghardt, the most energetic lecturer I have ever come across; the enunciation and variation in his voice are phenomenal. Tilo aside, his course, Image Processing and Computer Vision, covers a lot in a small amount of time. It outlines what makes up an image and how one can be encoded, and he describes computer vision as an attempt to bridge the semantic gap between pixels and meaning.

Details aside, my teammate and I created a program that counts coins in an image. It encodes the image as a matrix, then performs successive transformations on that matrix to detect coin locations. Each detected location is a prediction of where the centre of a coin lies within the image, together with the amount of support for a given circle radius. Given the centre points and a predicted radius, we draw the detected circles overlaid on the original image.

The core technique used during this process is the circular Hough transform.
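The circle-voting idea can be sketched in a few lines. This is a minimal pure-NumPy illustration of a circle Hough transform for a single known radius, run on a synthetic edge image; it is not the coursework program itself, and a real project would typically use a library routine such as OpenCV's HoughCircles.

```python
import numpy as np

def hough_circle(edges, radius):
    """Vote for circle centres: each edge pixel votes for every point
    lying `radius` away from it; the accumulator peak is the best centre."""
    h, w = edges.shape
    acc = np.zeros((h, w), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)  # accumulate votes
    return acc

# Synthetic "edge image": one circle of radius 10 centred at (25, 30).
edges = np.zeros((50, 60), dtype=bool)
t = np.linspace(0, 2 * np.pi, 200)
edges[np.round(25 + 10 * np.sin(t)).astype(int),
      np.round(30 + 10 * np.cos(t)).astype(int)] = True

acc = hough_circle(edges, radius=10)
cy, cx = np.unravel_index(acc.argmax(), acc.shape)
print(cy, cx)  # the accumulator peak recovers the centre (to within a pixel)
```

In practice the accumulator gains a third dimension for the unknown radius, which is where the "support for a given circle radius" comes from: each (centre, radius) cell counts how many edge pixels are consistent with that circle.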

Solved Problems with Genetic Algorithms

Learned How to Derive Bayes’ Theorem

Deriving Bayes' theorem using Venn Diagrams.

There is another class which, although unenjoyable, has some interesting content. The module is Uncertainty Modelling for Intelligent Systems, taught by Jonathan Lawry. It covers many modelling techniques, ranging from probability and information theory to Dempster-Shafer theory. It is more mathematical than it needs to be, yet the probability aspects are quite intriguing.

One such interesting probabilistic result is the infamous Bayes' theorem. Since becoming a rationalist, I've been meaning to learn it in an intuitive manner. If you're in a similar position, you usually get pointed towards Eliezer Yudkowsky's post. So I tried reading it. It makes sense as you go through it, but when I tried to think back on it, I became even more confused. His article even spawned an article explaining it! This led to the discovery of a metaphor-based approach. Although this was better, it still failed in the same way. I then came across a Venn-diagram-based approach, and it was at this point that things finally clicked. It seems we all learn things in different ways.

Now here’s the magical part. I was then able to derive Bayes’ theorem, without any prompts. Not only that, but I attempted it a few days later with the same results. Although, to be honest, I still need to learn how to actually apply it in everyday situations. I guess that leaves something to do in the future.
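The Venn-diagram view can even be checked mechanically. Below is a small sketch using Python sets over a toy population with two hypothetical events (A, "has the condition"; B, "tests positive"); the event names and probabilities are made up for illustration. Conditioning is just shrinking the universe to one region of the diagram, and Bayes' theorem falls out of counting.

```python
import random

random.seed(0)
population = range(10_000)
# Hypothetical events for illustration only.
A = {i for i in population if random.random() < 0.10}
B = {i for i in A if random.random() < 0.90} | \
    {i for i in population if i not in A and random.random() < 0.05}

def p(event, given=None):
    """Probability as relative region area: |event ∩ given| / |given|."""
    universe = set(population) if given is None else given
    return len(event & universe) / len(universe)

lhs = p(A, given=B)                # P(A|B), read straight off the diagram
rhs = p(B, given=A) * p(A) / p(B)  # Bayes' theorem
assert abs(lhs - rhs) < 1e-9
```

The assertion holds because both sides count the same overlap: P(A|B) is |A ∩ B| / |B|, and multiplying out the right-hand side cancels the |A| and population terms to give exactly the same ratio.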