TensorFlow Probability: Building Trustworthy Models (TF Dev Summit ’19)

At the TensorFlow Dev Summit ’19, TensorFlow Probability (TFP) was presented as a powerful tool for integrating probabilistic modeling into deep learning applications. TFP lets users quantify the uncertainty in their models, improving robustness across a range of machine learning tasks. In this tutorial, we will delve deeper into TFP and explore how we can use it to learn with confidence.

  1. Introduction to TensorFlow Probability:

TensorFlow Probability is an open-source library built on top of TensorFlow that provides tools for probabilistic modeling in machine learning. TFP allows users to define and manipulate probability distributions, making it easier to incorporate uncertainty into their models. This can be especially useful in tasks such as regression, classification, and reinforcement learning, where understanding the uncertainty in predictions is crucial.

  2. Installing TensorFlow Probability:

Before we can start using TensorFlow Probability, we need to install the library. TFP can be installed using pip by running the following command:

pip install tensorflow-probability

Once TFP is installed, we can start using it in our projects.
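
Note that TFP must be paired with a compatible TensorFlow release, so make sure TensorFlow itself is installed as well (pip install tensorflow). To confirm that the installation worked, you can print the library's version from the command line:

python -c "import tensorflow_probability as tfp; print(tfp.__version__)"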

  3. Working with Probability Distributions:

One of the key features of TensorFlow Probability is its ability to represent and manipulate probability distributions. TFP provides a wide range of distributions, including Gaussian, Bernoulli, Categorical, and more. These distributions can be easily created and manipulated using TFP’s intuitive API.

For example, to create a Gaussian distribution with mean 0 and standard deviation 1, we can use the following code:

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions  # shorthand alias for the distributions module

# A standard normal: mean (loc) 0 and standard deviation (scale) 1.
# Float literals matter here: TFP infers the distribution's dtype from them.
normal_dist = tfd.Normal(loc=0., scale=1.)

We can then sample from this distribution, compute the log probabilities of samples, and perform other operations using TFP’s API.
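
For example, the following snippet draws samples and evaluates their log densities (sampling is random, so the drawn values will differ from run to run):

samples = normal_dist.sample(5)            # draw 5 samples from N(0, 1)
log_probs = normal_dist.log_prob(samples)  # log density at each sample
print(samples, log_probs)
print(normal_dist.mean(), normal_dist.stddev())  # summary statistics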

  4. Bayesian Neural Networks with TensorFlow Probability:

One of the most powerful applications of TensorFlow Probability is in building Bayesian neural networks. Traditional neural networks learn point estimates of their parameters, which can lead to overconfident predictions and limited generalization. Bayesian neural networks, on the other hand, learn a distribution over parameters, allowing us to quantify uncertainty in predictions.

To build a Bayesian neural network with TFP, we can replace the weight and bias tensors in our network with random variables from a distribution. We can then train the network using variational inference or Markov Chain Monte Carlo (MCMC) to learn a distribution over parameters.
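
As a minimal sketch of this idea, the snippet below uses tfp.layers.DenseFlipout, one of TFP's variational layers; the architecture and training setup are illustrative assumptions, not the only way to build such a network:

import tensorflow as tf
import tensorflow_probability as tfp

# Each DenseFlipout layer learns a distribution over its kernel and bias
# rather than point estimates, so every forward pass samples new weights.
model = tf.keras.Sequential([
    tfp.layers.DenseFlipout(64, activation=tf.nn.relu),
    tfp.layers.DenseFlipout(1),
])

# The KL divergence between each layer's weight posterior and its prior is
# collected in model.losses and included in the training objective by Keras;
# in practice it is often rescaled by the number of training examples.
model.compile(optimizer='adam', loss='mse')
# model.fit(x_train, y_train, epochs=10)  # x_train, y_train: hypothetical data

Training such a model with a likelihood-based loss and a standard optimizer amounts to variational inference: minimizing the data loss plus the KL term approximates maximizing the evidence lower bound (ELBO).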

  5. Learning with Confidence:

By incorporating probabilistic modeling with TensorFlow Probability, we can train models that are not only accurate but also report how uncertain their predictions are. This is extremely useful in tasks such as anomaly detection, where knowing how much to trust a prediction matters as much as the prediction itself.

For example, in a classification task, we can use TFP to obtain predictive posterior distributions over class labels instead of point estimates. By sampling from these distributions, we can make more informed decisions and take into account the variability in predictions.
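
As a minimal sketch (assuming a stochastic Bayesian classifier model whose forward pass returns class logits, such as the variational network above, and a hypothetical batch of test inputs x_test), we can approximate the predictive posterior by averaging several stochastic forward passes:

import tensorflow as tf

# Each forward pass samples a different set of weights, so averaging the
# class probabilities over many passes approximates the predictive posterior.
probs = tf.stack([tf.nn.softmax(model(x_test)) for _ in range(30)])
mean_probs = tf.reduce_mean(probs, axis=0)     # predictive class probabilities
std_probs = tf.math.reduce_std(probs, axis=0)  # spread across passes

A large spread across passes signals that the model is genuinely uncertain about an input, which is exactly the signal tasks like anomaly detection can act on.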

In summary, TensorFlow Probability provides a powerful framework for integrating probabilistic modeling into deep learning applications. By learning with confidence, we can build more robust and accurate models that are better equipped to handle uncertainty in real-world scenarios.

I hope this tutorial has provided you with a comprehensive overview of TensorFlow Probability and how it can be used to learn with confidence. Experiment with TFP in your projects and explore the many possibilities it offers for enhancing machine learning applications. Happy coding!

33 Comments
@benjamindeporte3806
1 day ago

Outstanding

@jordia.2970
1 day ago

Amazing man

@youkorii
1 day ago

1. Before training, the network must return the prior distribution of the outputs for any input, am I wrong?

2. If we have (1), how can we show that the training procedure is using Bayes' rule to update the priors, I mean replacing X by X|y?

@raminrasoulinezhad
1 day ago

great

@danielkrajnik3817
1 day ago

2:57 glitch in the tensor

@kimchi_taco
1 day ago

OK, what is the 'unknown known'? a.k.a. you don't know what you know.
* Known unknown: data variance, heteroskedastic noise, aleatoric uncertainty; you know what you don't know.
* Unknown unknown: Bayesian posterior p(w|x), a distribution over the weights, epistemic uncertainty; you don't know what you don't know.

@prof_shixo
1 day ago

In the end, it looks like a Gaussian process with automatic feature extraction layers!

@adammcallister8195
1 day ago

1) Where do you get the "dots" from?
2) Does all data have to be converted to these "dots", or can a CSV file work?
3) How would you create data and convert it into these "dots"? Is there a special program?
4) What is the best way for tensorflow to predict data? I have been watching a ton of videos and this is the first time I have seen this "dot" system; all the videos I have been watching have been using CSV files from the MNIST dataset (either locally or linked to a website).
** I am from a journalistic background and I do appreciate the presentations, but a lot of these tutorials are either outdated or the code does not work on my computer but will work on theirs (the teachers').
It is also not clear how you get the data, how to create your own data, or where to put your data (files) so that tensorflow can access it.

@malharjajoo7393
1 day ago

7:30 – Looks like a Gaussian Process Deep Neural Network (GPDNN)?

@malharjajoo7393
1 day ago

7:17 – interestingly, I think the reason why we end up with a line for the mean and variance is that there is only one layer in the network
(and hence mean = theta0 + theta1*x1 + theta2*x2 + …, so this will be the equation of a line.)

@starlwe
1 day ago

The sample code doesn't seem to work in TF 2.0? Is there an update for TF 2.0?

@kibbutztradelink2287
1 day ago

How do I update all references to tfp.distributions instead of tf.distributions?

@malharjajoo7393
1 day ago

Does TFP offer functionality similar to Uber's Pyro (or vice versa)?

@pole_datadev8389
1 day ago

The more I dig into tensorflow, the more I realize how incredible this tool is. Great job, I am sure investing time in learning tensorflow is no waste of time.

@autripat
1 day ago

I'm impressed with the presenter's delivery. Well done! And funny!

@myfelicidade
1 day ago

As an applied mathematician, if I had to thank Google for a single thing, it would be this. Really going one step further in scientific inference.

@abhishekshah11
1 day ago

Okay I feel stupid.

@fwbadine
1 day ago

Is there a newer version of the book? Josh said: "check out this book which we rewrote using TF probability"… Where can this newer version be found? Thanks!

@The_Unexplainer
1 day ago

Mixed models

@jamesmckeown4743
1 day ago

And nine days later… AttributeError: module 'tensorflow_probability.python.layers' has no attribute 'VariationGaussianProcess'
