Using Distillation to Protect Your Neural Networks
Distillation is a hot research area. For distillation, you first train a deep learning model, the teacher network, to solve your task. Then, you train a student network, which can be any model. While the teacher is trained on real data, the student is trained on the teacher’s outputs. It... Read more
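The teacher–student setup described above hinges on training the student against the teacher's softened output distribution rather than hard labels. A minimal sketch of that objective (temperature-scaled softmax plus cross-entropy against the teacher's soft targets — illustrative, not the article's exact implementation):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with a temperature; a higher temperature yields softer targets."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened outputs and the
    student's softened outputs -- the core of the distillation objective."""
    soft_targets = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(soft_targets, student_probs))
```

When the student's logits match the teacher's, this loss reduces to the entropy of the soft targets, which is its minimum for a fixed teacher.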
AI Trained to Generate Novel Molecular Materials with a Generative Graph Grammar Model
Emerging technologies in the scientific community are helping researchers achieve more goals and make discoveries. Revolutionary technologies such as artificial intelligence (AI) and machine learning (ML) have already disrupted various industries, from manufacturing to retail and beyond. ML has expedited the discovery process, especially for grad... Read more
Recurrent Neural Networks for Financial Time Series Prediction
Editor’s note: Nicole Königstein is a speaker for ODSC Europe 2022. Be sure to check out her talk, Dynamic and Context-Dependent Stock Price Prediction Using Attention Modules and News Sentiment, there to learn more about financial time series prediction! The use of neural networks is relatively... Read more
How to Determine the Optimal Learning Rate of Your Neural Network
One of the biggest challenges in building a deep learning model is choosing the right hyper-parameters. If the hyper-parameters aren’t ideal, the network may not be able to produce optimal results or development could be far more challenging. Perhaps the most difficult parameter to determine is... Read more
Google AI Introduces New DeepCTRL Method to Train Models
In early 2022, Google AI began releasing details about an exciting new method for training deep neural networks: DeepCTRL. Google’s AI team found a way to control rule strength and accuracy in deep neural networks, allowing for improvements in some crucial AI applications. DeepCTRL is more... Read more
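The "rule strength" control the teaser mentions can be pictured as a tunable blend between a data-driven loss and a rule-violation penalty. The sketch below is a generic illustration of that idea with a hypothetical coefficient `alpha`; it is not Google's actual DeepCTRL implementation:

```python
def combined_loss(task_loss, rule_violation, alpha):
    """Blend a data-driven loss with a rule-violation penalty.

    `alpha` in [0, 1] acts as the rule-strength control knob:
    alpha = 0 ignores the rule entirely, alpha = 1 trains on the rule alone.
    (Illustrative only; the real DeepCTRL method is more involved.)
    """
    assert 0.0 <= alpha <= 1.0
    return alpha * rule_violation + (1.0 - alpha) * task_loss
```

Exposing `alpha` as an input rather than a fixed hyperparameter is what lets rule strength be adjusted after training, without retraining the network.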
Matrices and Their Connection to Graphs
Editor’s Note: Eric is a speaker for ODSC East 2022. Be sure to check out his talk, “Network Analysis Made Simple,” there! Graphs, also known as networks, are ubiquitous in our world. But did you know that graphs are also related to matrices and linear algebra?... Read more
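The graph–matrix connection is easy to see concretely: a graph's adjacency matrix encodes its edges, and powers of that matrix count walks. A small sketch (plain Python, no libraries assumed):

```python
def matmul(A, B):
    """Multiply two matrices given as lists of lists."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Adjacency matrix of a triangle graph on nodes 0, 1, 2
# (entry A[i][j] = 1 means an edge between i and j).
A = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]

# Entry (i, j) of A squared counts the walks of length 2 from i to j.
A2 = matmul(A, A)
```

In the triangle, every node has two length-2 walks back to itself (out and back along either edge), which appears on the diagonal of `A2`.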
Getting Started with Vector-Based Search
Editor’s note: Laura is a speaker for ODSC East 2022. Be sure to check out her talk, “Vector Database Workshop Using Weaviate,” to learn more about vector-based search! Traditional search engines perform a keyword-based search. Such search engines return results that contain an exact match or... Read more
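Where keyword search looks for exact matches, vector-based search ranks documents by how close their embeddings lie to the query embedding. A minimal sketch of that ranking step using cosine similarity (toy 2-D vectors and a hypothetical `search` helper; real systems like Weaviate use learned high-dimensional embeddings and approximate nearest-neighbor indexes):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def search(query_vec, docs):
    """Rank document vectors by cosine similarity to the query vector."""
    return sorted(docs, key=lambda d: cosine_similarity(query_vec, d["vec"]),
                  reverse=True)

docs = [
    {"id": "a", "vec": [1.0, 0.0]},
    {"id": "b", "vec": [0.7, 0.7]},
    {"id": "c", "vec": [0.0, 1.0]},
]
top = search([1.0, 0.1], docs)[0]["id"]  # "a" points in nearly the same direction
```

Because similarity is computed in embedding space rather than over literal tokens, semantically related documents can rank highly even with no keyword overlap.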
Repo of the Week: Instant Neural Graphics Primitives
A team of researchers from NVIDIA including Thomas Muller, Alex Evans, Christoph Schied, and Alexander Keller demonstrated a new method that should enable the efficient use of artificial neural networks for rendering computer graphics. Rendering is a notoriously slow process, so this is a significant development... Read more
Higher-level PyTorch APIs: A short introduction to PyTorch Lightning 
In recent years, the PyTorch community developed several different libraries and APIs on top of PyTorch. PyTorch Lightning (Lightning for short) is one of them, and it makes training deep neural networks simpler by removing much of the boilerplate code. However, while Lightning’s focus lies in... Read more
Benchmarking a Computer Vision Deep Learning Pipeline with Distributed Computing
Editor’s Note: Jennifer is a speaker for ODSC East 2022. Be sure to check out her talk, “Creating a Benchmark for a Large-Scale Image Captioning Pipeline,” to learn more about computer vision deep learning there! Computer vision has an essential role in solving some of the... Read more