I worked at Lab41 from 2015 to 2017. Part of my job was to write articles for our blog, Gab41. There I covered some of the projects I worked on, but my favorite and most popular posts were for the reading group series, where I reviewed key papers in deep learning.
Lab41 has kindly given me permission to host the articles here. You can find them below:
Matching the same object across separate images is tough, but Siamese networks can learn to do it pretty well! Read on for details.
Finding objects in images can be hard if you have only a little data. In this post I examine a few approaches that work with few training examples!
Want to train a network but unsure about dropout vs. stochastic depth? Should you use a ResNet? Stop worrying and use Swapout; it does all that and more!
Word embeddings are great and should be your first stop for doing word-based NLP. But what about sentences? Read on to learn about skip-thought vectors, a sentence embedding algorithm!
Inception, AlexNet, VGG... There are so many network architectures, which one should you be using? The one everyone else is: ResNet! Come find out how it works!
Deep learning is the future, but how can you fit a battery-draining, half-gigabyte network on your phone? You compress it! Come find out how deep compression saves space and power!
Dropout successfully regularizes networks by dropping nodes, but what if we went one step further? Find out how stochastic depth improves your network by dropping whole layers!
What cost function would you use to determine if a picture looks real? How about one learned by another network! Find out more with my summary of Generative Adversarial Networks!
Parsing source code is easy; just let the interpreter do it! But what if you want to recommend code snippets? Then you need word embeddings, like my Python2Vec!
Do you want to play around with recommender systems, but you don't have any data? Don't worry, there are tons of great, open source datasets for recommender systems!