How to Determine the Optimal Learning Rate of Your Neural Network
One of the biggest challenges in building a deep learning model is choosing the right hyper-parameters. If the hyper-parameters aren’t ideal, the network may not be able to produce optimal results or development could be far more challenging. Perhaps the most difficult parameter to determine is... Read more
Exploit Your Hyperparameters: Batch Size and Learning Rate as Regularization
Gradient descent is one of the first concepts many encounter when studying machine or deep learning. This optimization algorithm underlies most of machine learning, including backpropagation in neural networks. When studying gradient descent, we learn that learning rate and batch size matter. Specifically, increasing the learning rate speeds... Read more
Using NLP to identify Adverse Drug Events (ADEs)
An adverse drug event (ADE) is defined as harm experienced by a patient as a result of exposure to a medication. A significant amount of information about drug-related safety issues such as adverse effects is published in medical case reports that usually can only be explored by... Read more
Causal Reasoning in Machine Learning
Thanks to recent advancements in Artificial Intelligence (AI), we are now able to leverage Machine Learning and Deep Learning technologies in both academic and commercial applications. However, relying solely on correlations between different features can lead to wrong conclusions, since correlation does not necessarily... Read more
Deep Neural Networks Could Be Key to Ancient Text Restoration and Attribution, Research Shows
Uncovering the truths of ancient history can be complicated. Researchers must study texts inscribed in stone and clay, a process called epigraphy, but these inscriptions can be illegible after centuries of damage. Recent research suggests that deep neural networks (DNNs) could help with ancient text restoration.... Read more
Running a Data Science Team That Provides Value
This blog post is an excerpt from the O’Reilly report: Leading Data Science Teams. The success of your data science team entirely depends on how well the people on it can fulfill their roles and provide value. Value is a difficult term to define because with... Read more
Reproducible and Shareable Notebooks Across a Data Science Team
At CybelAngel, we are a growing team of data scientists and a machine learning engineer, planning to double in size. Each of us contributes to projects, and we use shareable notebooks before code is industrialized for production. Let's focus on notebooks. We talk specifically about Jupyter... Read more
The Toolkit Approach to Trustworthy AI
Editor’s Note: Kush R. Varshney is a speaker for ODSC East 2022. Be sure to check out his talk, “A Unified View of Trustworthy AI with the 360 Toolkits,” there! As artificial intelligence (AI) systems are increasingly used to support consequential decisions in high-risk applications such... Read more
Intro to NLP: Topic Modeling and Text Categorization
Editor’s note: Sanghamitra Deb is a speaker for ODSC East 2022. Be sure to check out her talk, “Intro to NLP: Text Categorization and Topic Modeling,” there! Natural Language Processing (NLP) is the basis of machine intelligence. NLP is the process of bringing structure to free-form... Read more
Google AI Introduces New DeepCTRL Method to Train Models
In early 2022, Google AI began releasing details about an exciting new method for training deep neural networks: DeepCTRL. Google’s AI team found a way to control rule strength and accuracy in deep neural networks, allowing for improvements in some crucial AI applications. DeepCTRL is more... Read more