Posts tagged probability

Gen — programming & modelling language

programming, GEN, AI, probability, modeling, graphics, statistics, ML, 2019

Probabilistic modeling and inference are core tools in diverse fields including statistics, machine learning, computer vision, cognitive science, robotics, natural language processing, and artificial intelligence. To meet the functional requirements of applications, practitioners use a broad range of modeling techniques and approximate inference algorithms. However, implementing inference algorithms is often difficult and error-prone. Gen simplifies the use of probabilistic modeling and inference by providing modeling languages in which users express models, and high-level programming constructs that automate aspects of inference. Like some probabilistic programming research languages, Gen includes universal modeling languages that can represent any model, including models with stochastic structure, discrete and continuous random variables, and simulators. However, Gen is distinguished by the flexibility it affords users in customizing their inference algorithm. It is possible to use built-in algorithms that require only a couple of lines of code, as well as to develop custom algorithms that are better able to meet scalability and efficiency requirements. Gen’s flexible modeling and inference programming capabilities unify symbolic, neural, probabilistic, and simulation-based approaches to modeling and inference, including causal modeling, symbolic programming, deep learning, hierarchical Bayesian modeling, graphics and physics engines, and planning and reinforcement learning.
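
Gen itself is embedded in Julia and its actual API differs, but the model/inference split the paragraph describes can be sketched in a language-neutral way. Below is a minimal, hypothetical Python illustration (every name is mine, not Gen’s): a generative model that makes a stochastic structure choice, paired with a generic importance-resampling inference routine.

```python
import math
import random

# Hedged, language-neutral sketch of the pattern described above; not Gen's API.

def model():
    """Prior: choose the structure (outlier or not), then latent parameters."""
    is_outlier = random.random() < 0.1            # stochastic structure choice
    if is_outlier:
        mu, sigma = 0.0, 10.0                     # wide outlier component
    else:
        mu, sigma = random.gauss(0.0, 1.0), 1.0   # inlier component, latent mean
    return is_outlier, mu, sigma

def log_likelihood(y, mu, sigma):
    """Gaussian log-density of one observation under the sampled parameters."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (y - mu) ** 2 / (2 * sigma ** 2)

def importance_resample(y, num_particles=1_000):
    """Generic inference: weight prior samples by the data, resample one."""
    particles = [model() for _ in range(num_particles)]
    weights = [math.exp(log_likelihood(y, mu, sigma)) for _, mu, sigma in particles]
    return random.choices(particles, weights=weights, k=1)[0]

print(importance_resample(y=2.3))
```

The point of the separation is that `importance_resample` knows nothing about the model’s internals, so the same inference code applies to any generative function, including ones whose structure is itself random.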

via https://probcomp.github.io/Gen/

Bayesian Methods for Hackers

bayesian, book, statistics, programming, probability

Bayesian Methods for Hackers is designed as an introduction to Bayesian inference from a computational, understanding-first and mathematics-second point of view. Of course, as an introductory book, we can only leave it at that: an introductory book. The mathematically trained may cure the curiosity this text generates with other texts designed with mathematical analysis in mind. For the enthusiast with less mathematical background, or one who is not interested in the mathematics but simply the practice of Bayesian methods, this text should be sufficient and entertaining.

via https://camdavidsonpilon.github.io/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/

Where You Cannot Generalize from Knowledge of Parts

Medium, Taleb, probability, averages, information, complexity, non-linearity, generalisation, volatility

Consider the following as a rule. Whenever you have nonlinearity, the average doesn’t matter anymore. Hence:
The more nonlinearity in the response, the less informational the average.
For instance, your benefit from drinking water would be linear if ten glasses of water were ten times as good as one single glass. If that is not the case, then necessarily the average water consumption matters less than something else that we will call “unevenness”, or volatility, or inequality in consumption. Say your average daily consumption needs to be one liter a day and I gave you ten liters one day and none for the remaining nine days, for an average of one liter a day. Odds are you won’t survive. You want your quantity of water to be as evenly distributed as possible. Within the day, you do not need to consume the same amount of water every minute, but at the scale of the day, you want maximal evenness.
From an informational standpoint, someone who tells you “We will supply you with one liter of water per day on average” is not conveying much information at all; there needs to be a second dimension, the variations around such an average. You are quite certain that you will die of thirst if his average comes from a cluster of a hundred liters every hundred days.
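
The water example is easy to make numerical. A minimal sketch, assuming a hypothetical concave benefit function (capped at the roughly one liter a day the body can use): two schedules with identical average intake deliver very different average benefit.

```python
# Hedged illustration of the rule above: with a nonlinear (here concave)
# dose-response, the average intake no longer determines the average benefit.
# The benefit function is a made-up choice for illustration only.

def benefit(liters):
    """Concave dose-response: the first liter matters, the tenth barely does."""
    return min(liters, 1.0)

even = [1.0] * 10             # one liter every day
uneven = [10.0] + [0.0] * 9   # ten liters on day one, then nothing

# Both schedules have exactly the same average intake...
assert sum(even) / 10 == sum(uneven) / 10 == 1.0

# ...but very different average benefit, because the response is nonlinear.
print(sum(benefit(x) for x in even) / 10)    # 1.0 (full benefit every day)
print(sum(benefit(x) for x in uneven) / 10)  # 0.1 (one good day, nine bad)
```

This is just Jensen’s inequality for a concave response: the average of the benefit sits below the benefit of the average, and the gap grows with the volatility of consumption.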

via https://medium.com/@nntaleb/where-you-cannot-generalize-from-knowledge-of-parts-continuation-to-the-minority-rule-ce96ca3c5739

On the Difference between Binary Prediction and True Exposure with Implications for Forecasting Tournaments and Decision Making

Taleb, Tetlock, prediction, exposure, decision theory, probability, risk, decision

There are serious differences between predictions, bets, and exposures that have a yes/no type of payoff, the “binaries”, and those that have varying payoffs, which we call the “vanilla”. Real world exposures tend to belong to the vanilla category, and are poorly captured by binaries. Vanilla exposures are sensitive to Black Swan effects, model errors, and prediction problems, while the binaries are largely immune to them. The binaries are mathematically tractable, while the vanilla are much less so. Hedging vanilla exposures with binary bets can be disastrous – and because of the human tendency to engage in attribute substitution when confronted by difficult questions, decision-makers and researchers often confuse the vanilla for the binary.
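
A minimal Monte Carlo sketch of this asymmetry (the Student-t distributions, the threshold, and the sample size are my own hypothetical choices, not the paper’s): with variance held fixed, fattening the tails barely moves the value of a binary bet on a large move, while the vanilla exposure to the move itself grows several-fold.

```python
import random

# Hedged simulation of the binary/vanilla distinction: a "binary" pays 1 if
# the event occurs; a "vanilla" exposure pays the size of the move beyond K.

random.seed(0)

def student_t(df):
    """One unit-variance Student-t draw via the ratio-of-normals construction."""
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    t = z / (chi2 / df) ** 0.5
    return t / (df / (df - 2)) ** 0.5   # rescale to unit variance

K, N = 2.0, 100_000
for df, label in [(30, "thin tails"), (3, "fat tails")]:
    xs = [student_t(df) for _ in range(N)]
    binary = sum(1.0 for x in xs if x > K) / N        # P(move beyond K)
    vanilla = sum(max(x - K, 0.0) for x in xs) / N    # E[(X - K)+]
    print(f"{label:>10}: binary ≈ {binary:.4f}, vanilla ≈ {vanilla:.4f}")
```

The binary is bounded between 0 and 1 and so is largely insensitive to what happens deep in the tail; the vanilla payoff is open-ended, which is why it, not the binary, is the quantity exposed to Black Swan effects and model error.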

via http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2284964