Background – How many cats does it take to identify a Cat? In this article, I cover the 12 types of AI problems, i.e., I address the question: in which scenarios should you use Artificial Intelligence (AI)? We cover this space in the Enterprise AI course. Some background: Recently, I conducted... Read more
ConceptNet 5.5 and conceptnet.io
ConceptNet is a large, multilingual knowledge graph about what words mean. This is background knowledge that’s very important in NLP and machine learning, and it remains relevant in a time when the typical thing to do is to shove a terabyte or so of text through a neural net.... Read more
HACKER’S GUIDE TO NEURAL NETWORKS, #2
Chapter 2: Machine Learning In the last chapter we were concerned with real-valued circuits that computed possibly complex expressions of their inputs (the forward pass), and with computing the gradients of these expressions with respect to the original inputs (the backward pass). In this chapter we will see how useful... Read more
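The forward/backward idea from the excerpt can be sketched with the simplest possible circuit, a single multiply gate; the function names here are illustrative, not taken from the article:

```python
# Minimal sketch of a forward and backward pass for the circuit f(x, y) = x * y.
def forward_multiply(x, y):
    # Forward pass: compute the circuit's output from its inputs.
    return x * y

def backward_multiply(x, y):
    # Backward pass: gradients of x*y with respect to each input.
    # d(x*y)/dx = y and d(x*y)/dy = x.
    return y, x

out = forward_multiply(-2.0, 3.0)       # forward pass gives -6.0
dx, dy = backward_multiply(-2.0, 3.0)   # gradients are (3.0, -2.0)
```

Chaining many such gates and applying the chain rule backward through them is exactly what the chapter builds toward.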
TensorFlow and Queues
There are many ways to implement queue data structures, and TensorFlow has some of its own. FIFO Queue with a list In Python, a list can implement a first-in first-out (FIFO) queue, with slightly awkward syntax: >>> my_list = [] >>> my_list.insert(0, 'a') >>> my_list.insert(0, 'b') >>> my_list.insert(0, 'c')... Read more
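The list-based pattern from the excerpt can be completed as below; the second half shows `collections.deque` from the standard library, which is the usual idiomatic choice for a FIFO (this comparison is my addition, not from the article):

```python
from collections import deque

# List-based FIFO as in the excerpt: insert at the front, pop from the end.
my_list = []
my_list.insert(0, 'a')
my_list.insert(0, 'b')
my_list.insert(0, 'c')
first = my_list.pop()      # 'a' comes out first: first in, first out

# collections.deque supports the same pattern with O(1) operations
# at both ends, unlike list.insert(0, ...), which is O(n).
q = deque()
q.append('a')
q.append('b')
q.append('c')
first_dq = q.popleft()     # also 'a'
```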
Algorithms are Black Boxes, That is Why We Need Explainable AI
Artificial Intelligence offers many advantages for organisations: it makes them more efficient, improves customer service with conversational AI, and reduces a wide variety of risks across industries. Although we are only at the beginning of the AI revolution that is upon us, we can already see... Read more
Deep Learning, IoT Sensor Data…and Bats!
At the very center of Internet of Things excitement is the sensor. Not just one sensor, mind you, but a sensor that would normally just send a data stream to who knows where and that now has access to information from another sensor measuring something completely different. Now... Read more
Attention and Memory in Deep Learning and NLP
A recent trend in Deep Learning is Attention Mechanisms. In an interview, Ilya Sutskever, now the research director of OpenAI, mentioned that Attention Mechanisms are one of the most exciting advancements, and that they are here to stay. That sounds exciting. But what are Attention Mechanisms? Attention Mechanisms in... Read more
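The core of an attention mechanism is small enough to sketch directly: score each key against a query, softmax the scores into weights, and return the weighted sum of the values. This is a minimal dot-product sketch with illustrative names, not code from the article:

```python
import numpy as np

def attention(query, keys, values):
    # One score per key: how well does each key match the query?
    scores = keys @ query
    # Softmax turns scores into weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # The output is a weighted sum of the values.
    return weights @ values

keys = np.array([[1.0, 0.0], [0.0, 1.0]])
values = np.array([[10.0], [20.0]])
query = np.array([5.0, 0.0])   # strongly matches the first key
out = attention(query, keys, values)   # close to the first value, 10.0
```

Because the query aligns with the first key, nearly all of the weight lands on the first value; attention in sequence models applies this same weighting over encoder states at every decoding step.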
Amazon Enters The Open-Source Deep Learning Fray
The Synergy Research Group’s last report of 2015 attributed 31% of the cloud computing market to Amazon’s Amazon Web Services (AWS), nearly four times as much as its nearest competitor, Microsoft. This would come as no surprise to any programmer, Data Engineer, or Data Scientist: AWS is a mainstay... Read more
Productionizing Deep Learning from the Ground Up – Adam Gibson ODSC Boston 2015
Productionizing Deep Learning From the Ground Up from odsc Presenter Bio Adam Gibson is the founder of Skymind.io and is a deep learning specialist based in San Francisco assisting Fortune 500 companies, hedge funds, PR firms and startup accelerators with their machine learning projects. Adam has a strong track... Read more
Recurrent Neural Networks for Text Analysis – Alec Radford ODSC Boston 2015
Recurrent Neural Networks for Text Analysis from odsc Recurrent Neural Networks hold great promise as general sequence learning algorithms. As such, they are a very promising tool for text analysis. However, outside of very specific use cases such as handwriting recognition and, recently, machine translation, they have not seen... Read more
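What makes RNNs general sequence learners is a single recurrence applied at every time step: the new hidden state is a function of the current input and the previous hidden state. A vanilla-RNN sketch in my own notation (weights here are random placeholders, not trained parameters):

```python
import numpy as np

# One step of a vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(4, 3)) * 0.1   # input-to-hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.1   # hidden-to-hidden (recurrent) weights
b = np.zeros(4)

def rnn_step(x_t, h_prev):
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b)

h = np.zeros(4)                 # initial hidden state
for x_t in np.eye(3):           # a toy 3-step sequence of one-hot inputs
    h = rnn_step(x_t, h)        # the state carries context across steps
```

For text, each `x_t` would be a character or word embedding, and the final (or per-step) hidden state feeds a classifier; the recurrence is what lets the model condition on everything it has read so far.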