Philosophical Transactions of the Royal... Aug 2015 (Review)
The generation of observations is a technical process, and the advances made in forensic science techniques over the last 50 years have been staggering. But science is about reasoning: about making sense of observations. For the forensic scientist, this is the challenge of interpreting a pattern of observations within the context of a legal trial. Here too, there have been major advances over recent years, and there is a broad consensus among serious thinkers, both scientific and legal, that the logical framework is furnished by Bayesian inference (Aitken et al., Fundamentals of Probability and Statistical Evidence in Criminal Proceedings). This paper shows how the paradigm has matured, centred on the notion of the balanced scientist. Progress through the courts has not always been smooth, and difficulties arising from recent judgments are discussed. Nevertheless, the future holds exciting prospects, in particular the opportunities for managing and calibrating the knowledge of the forensic scientists who assign the probabilities that are at the foundation of logical inference in the courtroom.
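The odds form of Bayes' theorem that underpins this framework can be sketched with hypothetical numbers (the prior odds and likelihood ratio below are illustrative, not taken from the paper):

```python
def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' theorem in odds form: posterior odds = prior odds * LR."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds: float) -> float:
    """Convert odds o to a probability o / (1 + o)."""
    return odds / (1.0 + odds)

# Hypothetical case: prior odds of 1:1000 that the suspect is the source
# of a trace, and evidence 10,000 times more probable under the
# prosecution hypothesis than under the defence hypothesis.
post = posterior_odds(1 / 1000, 10_000)
print(odds_to_prob(post))  # ~0.909
```

Reporting the likelihood ratio separately from the prior odds keeps the scientist's contribution distinct from the court's, which is one motivation for the "balanced scientist" framing.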
Topics: Bayes Theorem; Forensic Sciences; Jurisprudence; Logic
PubMed: 26101288
DOI: 10.1098/rstb.2014.0263
European Journal of Sport Science Jun 2022
This study analyzed the global state of groups formed by cyclists competing in three different points races and considered the behaviour of individual cyclists in those races. We measured the time difference between the front cyclist and the other cyclists when they crossed the centrelines of the home and back straights in order to quantify the global configuration of cyclists in terms of their density and features of states, extracted using principal components analysis (PCA). We examined whether the group separation and group density that characterize a cycling race can be extracted by PCA. We interpreted the PCA results to explain the separation and density of the group using the first and second principal components. Then, we defined the state of the configuration of the cyclists in each lap in the plane of the first and second principal components. Subsequently, the state transition probabilities were obtained. States 1, 2, 3, and 4 corresponded to the third, second, first, and fourth quadrants, respectively. State 1 represented a state comprising one dense group, state 2 represented one stretched group, state 3 represented a divided group, and state 4 represented an escape group far from a single dense group.
Highlights:
- An approach to understanding the collective behaviour of cycling points races through principal component analysis was effective for quantifying the configuration of the cyclists.
- Principal component analysis of the global configuration of the cyclists in the points races revealed that the fission-fusion dynamics were characterized by two components.
- The density of a group, the number of groups, and the transitions among four states were defined by these two components.
- State transition probabilities indicate that group separation states were more frequent in the latter half of the sprint interval and that it was difficult to re-combine the separated groups into one.
- Riders and coaches need to be aware of stretching and separation of the group even if it does not occur immediately before the sprint, as the positioning of a cyclist within the group would be important at that time.
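The analysis pipeline can be roughly sketched with synthetic time-gap data standing in for the race measurements (shapes and numbers below are invented, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the race data: rows = laps, columns = riders,
# values = time gap (s) to the front rider at the measurement line.
laps, riders = 40, 20
spread = np.linspace(0.5, 4.0, laps)        # the group stretches over laps
gaps = rng.random((laps, riders)) * spread[:, None]

# PCA via SVD of the mean-centred lap-by-rider matrix.
X = gaps - gaps.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S                               # each lap's coordinates on the PCs
explained = S**2 / np.sum(S**2)

# The first two components would then index group density and separation,
# and each lap's (PC1, PC2) quadrant defines one of the four states.
print(f"PC1+PC2 explain {explained[:2].sum():.0%} of the variance")
```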
Topics: Bicycling; Humans; Probability
PubMed: 33739228
DOI: 10.1080/17461391.2021.1905077
Nature Human Behaviour Nov 2019
A fundamental but rarely contested assumption in economics and neuroeconomics is that decision-makers compute subjective values of risky options by multiplying functions of reward probability and magnitude. By contrast, an additive strategy for valuation allows flexible combination of reward information required in uncertain or changing environments. We hypothesized that the level of uncertainty in the reward environment should determine the strategy used for valuation and choice. To test this hypothesis, we examined choice between risky options in humans and rhesus macaques across three tasks with different levels of uncertainty. We found that whereas humans and monkeys adopted a multiplicative strategy under risk when probabilities are known, both species spontaneously adopted an additive strategy under uncertainty when probabilities must be learned. Additionally, the level of volatility influenced relative weighting of certain and uncertain reward information, and this was reflected in the encoding of reward magnitude by neurons in the dorsolateral prefrontal cortex.
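The two valuation strategies contrasted here can be written down directly; the gambles and the additive weight below are hypothetical:

```python
def value_multiplicative(p: float, m: float) -> float:
    """Normative integration: subjective value = probability * magnitude."""
    return p * m

def value_additive(p: float, m: float, w: float = 0.5) -> float:
    """Additive heuristic: a weighted sum of the two reward attributes."""
    return w * p + (1 - w) * m

# Two hypothetical gambles (probability, magnitude scaled to [0, 1]).
a, b = (0.5, 0.5), (0.9, 0.2)
print(value_multiplicative(*a) > value_multiplicative(*b))  # True: 0.25 vs 0.18
print(value_additive(*a) > value_additive(*b))              # False: 0.50 vs 0.55
```

The point of the example: the two strategies can reverse preference between the same pair of options, which is what makes the strategy identifiable from choice behaviour.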
Topics: Adolescent; Animals; Choice Behavior; Comprehension; Decision Making; Female; Humans; Macaca mulatta; Male; Probability; Reward; Risk; Uncertainty; Young Adult
PubMed: 31501543
DOI: 10.1038/s41562-019-0714-3
Neuropsychopharmacology : Official... Mar 2015
Pathological behaviors toward drugs and food rewards have underlying commonalities. Risk-taking has a fourfold pattern varying as a function of probability and valence leading to the nonlinearity of probability weighting with overweighting of small probabilities and underweighting of large probabilities. Here we assess these influences on risk-taking in patients with pathological behaviors toward drug and food rewards and examine structural neural correlates of nonlinearity of probability weighting in healthy volunteers. In the anticipation of rewards, subjects with binge eating disorder show greater risk-taking, similar to substance-use disorders. Methamphetamine-dependent subjects had greater nonlinearity of probability weighting along with impaired subjective discrimination of probability and reward magnitude. Ex-smokers also had lower risk-taking to rewards compared with non-smokers. In the anticipation of losses, obesity without binge eating had a similar pattern to other substance-use disorders. Obese subjects with binge eating also have impaired discrimination of subjective value similar to that of the methamphetamine-dependent subjects. Nonlinearity of probability weighting was associated with lower gray matter volume in dorsolateral and ventromedial prefrontal cortex and orbitofrontal cortex in healthy volunteers. Our findings support a distinct subtype of binge eating disorder in obesity with similarities in risk-taking in the reward domain to substance use disorders. The results dovetail with the current approach of defining mechanistically based dimensional approaches rather than categorical approaches to psychiatric disorders. The relationship to risk probability and valence may underlie the propensity toward pathological behaviors toward different types of rewards.
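The inverse-S-shaped nonlinearity described here is commonly modelled with the one-parameter Tversky-Kahneman weighting function; the function and the default parameter below come from that standard model, not from fitted values in this study:

```python
def tk_weight(p: float, gamma: float = 0.61) -> float:
    """One-parameter Tversky-Kahneman probability weighting:
    w(p) = p**g / (p**g + (1 - p)**g) ** (1 / g).
    With gamma < 1 it is inverse-S shaped: small probabilities are
    overweighted and large probabilities underweighted."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

print(tk_weight(0.05) > 0.05)  # True: a rare win feels more likely than it is
print(tk_weight(0.95) < 0.95)  # True: a near-certain win feels less certain
```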
Topics: Analysis of Variance; Body Mass Index; Choice Behavior; Female; Humans; Male; Obesity; Probability; Reward; Risk Factors; Substance-Related Disorders
PubMed: 25270821
DOI: 10.1038/npp.2014.242
Health Reports Sep 2023
BACKGROUND
The lifetime probability of developing cancer and the lifetime probability of dying from cancer are useful summary statistics that describe the impact of cancer within a population. This study aims to present detailed lifetime probabilities of developing and dying from cancer by sex and cancer type and to describe changes in these probabilities over time among the Canadian population.
DATA AND METHODS
Cancer incidence data (1997 to 2018) were obtained from the Canadian Cancer Registry. All-cause and cancer mortality data (1997 to 2020) were obtained from the Canadian Vital Statistics - Death Database. Lifetime probabilities of developing and dying from cancer were calculated using the DevCan software, and trends over time were estimated using the Joinpoint software.
RESULTS
The lifetime probability of developing cancer, for all cancers combined, was 44.3% in Canada in 2018 (all results exclude Quebec). At age 60, the conditional probability of developing cancer was very similar (44.0% for males and 38.2% for females). The lifetime probability of dying from cancer was 22.5% among the Canadian population in 2020, while the probability of dying from cancer conditional on surviving until age 60 was 25.1% for males and 20.5% for females. Generally, males experienced higher lifetime probabilities of developing and dying from most specific cancers compared with females.
INTERPRETATION
The lifetime probabilities of developing and dying from cancer mirror cancer incidence and mortality rates. Cancer-specific changes in these probabilities over time are indicative of the cancer trends resulting from cancer prevention, screening, detection, and treatment. These changes provide insight into the shifting landscape of the Canadian cancer burden.
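A toy life-table calculation in the spirit of DevCan (the hazard functions below are invented for illustration; the software itself works from registry-based rates on a fine age grid):

```python
# Invented per-year hazards, rising with age.
def incidence(age):   # probability of a first cancer diagnosis this year
    return 0.00005 * 1.06 ** age

def mortality(age):   # probability of death from other causes this year
    return 0.0001 * 1.08 ** age

alive_cancer_free = 1.0   # probability of being alive and cancer-free
lifetime_prob = 0.0       # accumulated probability of ever developing cancer
for age in range(100):
    lifetime_prob += alive_cancer_free * incidence(age)         # diagnosed now
    alive_cancer_free *= 1.0 - incidence(age) - mortality(age)  # event-free

print(f"lifetime probability of developing cancer: {lifetime_prob:.1%}")
```

Competing mortality matters: the same incidence hazards yield a lower lifetime probability when other-cause mortality removes people from the at-risk pool earlier.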
Topics: Female; Male; Humans; Middle Aged; Canada; Quebec; Neoplasms; Databases, Factual; Probability
PubMed: 37729062
DOI: 10.25318/82-003-x202300900002-eng
Cognition Sep 2022
Base rate neglect refers to people's apparent tendency to underweight or even ignore base rate information when estimating posterior probabilities for events, such as the probability that a person with a positive cancer-test outcome actually does have cancer. While often replicated, almost all evidence for the phenomenon comes from studies that used problems with extremely low base rates, high hit rates, and low false alarm rates. It is currently unclear whether the effect generalizes to reasoning problems outside this "corner" of the entire problem space. Another limitation of previous studies is that they have focused on describing empirical patterns of the effect at the group level and not so much on the underlying strategies and individual differences. Here, we address these two limitations by testing participants on a broader problem space and modeling their responses at a single-participant level. We find that the empirical patterns that have served as evidence for base-rate neglect generalize to a larger problem space, albeit with large individual differences in the extent to which participants "neglect" base rates. In particular, we find a bimodal distribution consisting of one group of participants who almost entirely ignore the base rate and another group who almost entirely account for it. This heterogeneity is reflected in the cognitive modeling results: participants in the former group were best captured by a linear-additive model, while participants in the latter group were best captured by a Bayesian model. We find little evidence for heuristic models. Altogether, these results suggest that the effect known as "base-rate neglect" generalizes to a large set of reasoning problems, but varies widely across participants and may need a reinterpretation in terms of the underlying cognitive mechanisms.
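The two model classes that best captured the two participant groups can be sketched as follows (the cue weights in the linear-additive strategy are hypothetical, not the fitted values from the study):

```python
def bayes_posterior(base_rate: float, hit_rate: float, fa_rate: float) -> float:
    """P(condition | positive test) by Bayes' theorem."""
    p_positive = base_rate * hit_rate + (1.0 - base_rate) * fa_rate
    return base_rate * hit_rate / p_positive

def linear_additive(base_rate: float, hit_rate: float, fa_rate: float,
                    w_base: float = 0.0) -> float:
    """Stylised linear-additive strategy: a weighted sum of the cues.
    w_base = 0 ignores the base rate entirely (hypothetical weights)."""
    w_rest = (1.0 - w_base) / 2.0
    return w_base * base_rate + w_rest * hit_rate + w_rest * (1.0 - fa_rate)

# The classic "corner" of the problem space: rare condition, accurate test.
print(round(bayes_posterior(0.01, 0.90, 0.05), 3))   # 0.154
print(round(linear_additive(0.01, 0.90, 0.05), 3))   # 0.925: base rate ignored
```

In this corner the two strategies diverge sharply, which is why classic problems detect the effect so reliably; across a broader problem space the gap between the strategies varies, allowing them to be distinguished per participant.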
Topics: Bayes Theorem; Cognition; Humans; Probability; Problem Solving
PubMed: 35660344
DOI: 10.1016/j.cognition.2022.105160
Journal of the Royal Society, Interface Nov 2023
Symmetry arguments are frequently used, often implicitly, in mathematical modelling of natural selection. Symmetry simplifies the analysis of models and reduces the number of distinct population states to be considered. Here, I introduce a formal definition of symmetry in mathematical models of natural selection. This definition applies to a broad class of models that satisfy a minimal set of assumptions, using a framework developed in previous works. In this framework, population structure is represented by a set of sites at which alleles can live, and transitions occur via replacement of some alleles by copies of others. A symmetry is defined as a permutation of sites that preserves probabilities of replacement and mutation. The symmetries of a given selection process form a group, which acts on population states in a way that preserves the Markov chain representing selection. Applying classical results on group actions, I formally characterize the use of symmetry to reduce the states of this Markov chain, and obtain bounds on the number of states in the reduced chain.
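A minimal concrete instance of this kind of reduction (a small sketch, not the paper's general framework): a neutral Moran process on three sites, where every site permutation preserves replacement probabilities, so the chain lumps onto orbit counts:

```python
import itertools

# Neutral Moran process on n = 3 sites with alleles {0, 1}: each step,
# a uniform site i is replaced by a copy of the allele at a uniform
# site j. Every permutation of sites preserves these replacement
# probabilities, so the symmetry group is the full symmetric group
# and orbits are indexed by the number of 1-alleles.
n = 3
states = list(itertools.product([0, 1], repeat=n))

def step_prob(s, t):
    """One-step transition probability P(s -> t)."""
    p = 0.0
    for i in range(n):
        for j in range(n):
            new = list(s)
            new[i] = s[j]
            if tuple(new) == t:
                p += 1.0 / (n * n)
    return p

orbit = sum  # orbit label of a state = its number of 1-alleles

# Lumpability: all states in one orbit have identical total transition
# probability into every target orbit, so the chain reduces to n + 1 states.
for k in range(n + 1):
    members = [s for s in states if orbit(s) == k]
    rows = [tuple(round(sum(step_prob(s, t) for t in states if orbit(t) == m), 10)
                  for m in range(n + 1))
            for s in members]
    assert all(row == rows[0] for row in rows)

print(f"{len(states)} states lump to {n + 1} orbit states")
```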
Topics: Models, Genetic; Selection, Genetic; Markov Chains; Probability; Mutation
PubMed: 37963562
DOI: 10.1098/rsif.2023.0306
American Journal of Human Genetics Jul 1986
A probability can be viewed as an estimate of a variable that is sometimes 1 and sometimes 0. To have validity, the probability must equal the expected value of that variable. To have utility, the average squared deviation of the probability from the value of that variable should be small. It is shown that probabilities of paternity calculated by the use of Bayes' theorem under appropriate assumptions are valid, but they can vary in utility. In particular, a recently proposed probability of paternity has less utility than the usual one based on the paternity index. Using an arbitrary prior probability in the calculation cannot lead to a valid probability unless, by chance, the chosen prior probability happens to be appropriate. Appropriate values for both the prior probability and gene or genotypic frequencies can be estimated from prior experience.
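The validity/utility distinction can be checked numerically: for a 0/1 variable with mean p, a constant reported probability c has average squared deviation (c - p)**2 + p(1 - p), so only a valid (calibrated) probability avoids the penalty term. A simulation with a hypothetical p = 0.7:

```python
import random

random.seed(1)

# Simulate the 0/1 variable (e.g. paternity) with true rate 0.7.
outcomes = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]

def mean_sq_dev(prob: float, xs) -> float:
    """Average squared deviation of a constant reported probability
    from the realised 0/1 outcomes (the abstract's utility criterion)."""
    return sum((prob - x) ** 2 for x in xs) / len(xs)

# A valid constant probability equals E[X]; any other constant pays a
# calibration penalty of (c - p)**2 on top of the irreducible p(1 - p).
print(mean_sq_dev(0.7, outcomes))  # ~0.21 = 0.7 * 0.3
print(mean_sq_dev(0.5, outcomes))  # ~0.25 = 0.21 + (0.5 - 0.7)**2
```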
Topics: Female; Humans; Male; Paternity; Phenotype; Probability
PubMed: 3752078
DOI: No ID Found
PloS One 2022
Variability is inherent to cyber systems. Here, we introduce ideas from stochastic population biology to describe the properties of two broad kinds of cyber systems. First, we assume that each of N0 components can be in only one of two states: functional or nonfunctional. We model this situation as a Markov process that describes the transitions between functional and nonfunctional states. We derive an equation for the probability that an individual cyber component is functional and use stochastic simulation to develop intuition about the dynamics of individual cyber components. We introduce a metric of performance of the system of N0 components that depends on the numbers of functional and nonfunctional components. We numerically solve the forward Kolmogorov (or Fokker-Planck) equation for the number of functional components at time t, given the initial number of functional components. We derive a Gaussian approximation for the solution of the forward equation so that the properties of the system with many components can be determined from the transition probabilities of an individual component, allowing scaling to very large systems. Second, we consider the situation in which the operating system (OS) of cyber components is updated in time. We motivate the question of the OS in use as a function of the most recent OS release with data from a network of desktop computers. We begin the analysis by specifying a temporal schedule of OS updates and the probability of transitioning from the current OS to a more recent one. We use a stochastic simulation to capture the pattern of the motivating data, and derive the forward equation for the OS of an individual computer at any time. We then include compromise of OSs to compute the probability that a cyber component has an unexploited OS at any time. We conclude that an interdisciplinary approach to the variability of cyber systems can shed new light on the properties of those systems and offers new and exciting ways to understand them.
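The single-component model can be sketched in discrete time (the rates below are hypothetical); the closed-form solution of the two-state forward equation is checked against simulation:

```python
import random

random.seed(0)

# Hypothetical per-step rates: failure lam (functional -> nonfunctional)
# and repair mu (nonfunctional -> functional); p0 = P(functional at t=0).
lam, mu, p0 = 0.02, 0.10, 1.0

def p_functional(t: int) -> float:
    """Closed-form solution of the two-state forward equation in
    discrete time: p(t) = p_inf + (p0 - p_inf) * (1 - lam - mu)**t,
    where p_inf = mu / (lam + mu) is the stationary probability."""
    p_inf = mu / (lam + mu)
    return p_inf + (p0 - p_inf) * (1.0 - lam - mu) ** t

# Monte Carlo check of the analytic curve at t = 50 steps.
trials, t_check, hits = 20_000, 50, 0
for _ in range(trials):
    state = 1
    for _ in range(t_check):
        if state == 1 and random.random() < lam:
            state = 0
        elif state == 0 and random.random() < mu:
            state = 1
    hits += state

print(abs(hits / trials - p_functional(t_check)))  # small (< 0.02)
```

Because the components are independent in this model, system-level behaviour for N0 components follows from this single-component probability, which is what makes the Gaussian approximation and scaling arguments possible.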
Topics: Stochastic Processes; Computer Simulation; Markov Chains; Software; Probability
PubMed: 36574390
DOI: 10.1371/journal.pone.0279100
PloS One 2023
BACKGROUND
Bedaquiline is a core drug for treatment of rifampicin-resistant tuberculosis. Few genomic variants have been statistically associated with bedaquiline resistance. Alternative approaches for determining the genotypic-phenotypic association are needed to guide clinical care.
METHODS
Using published phenotype data for variants in Rv0678, atpE, pepQ and Rv1979c genes in 756 Mycobacterium tuberculosis isolates and survey data of the opinion of 33 experts, we applied Bayesian methods to estimate the posterior probability of bedaquiline resistance and corresponding 95% credible intervals.
RESULTS
Experts agreed on the role of Rv0678 and atpE, were uncertain about the role of pepQ and Rv1979c variants, and overestimated the probability of bedaquiline resistance for most variant types, resulting in lower posterior probabilities compared with prior estimates. The posterior median probability of bedaquiline resistance was low for synonymous mutations in atpE (0.1%) and Rv0678 (3.3%), high for missense mutations in atpE (60.8%) and nonsense mutations in Rv0678 (55.1%), relatively low for missense (31.5%) and frameshift (30.0%) mutations in Rv0678, and low for missense mutations in pepQ (2.6%) and Rv1979c (2.9%), but the 95% credible intervals were wide.
CONCLUSIONS
Bayesian probability estimates of bedaquiline resistance given the presence of a specific mutation could be useful for clinical decision-making, as they provide interpretable probabilities rather than standard odds ratios. For a newly emerging variant, the probability of resistance for the variant type and gene can still be used to guide clinical decision-making. Future studies should investigate the feasibility of using Bayesian probabilities for bedaquiline resistance in clinical practice.
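One way to sketch an expert-prior-plus-phenotype-data update of this kind is a conjugate Beta-binomial model (all counts and prior parameters below are invented for illustration, not the study's data or its exact method):

```python
import random

random.seed(42)

# Hypothetical expert prior on P(resistant | variant): mean 0.6,
# worth about 10 pseudo-isolates of information.
a0, b0 = 6.0, 4.0
# Hypothetical phenotype data: 12 of 40 isolates with the variant resistant.
resistant, total = 12, 40

# Conjugate Beta-binomial update.
a, b = a0 + resistant, b0 + (total - resistant)

# Posterior median and 95% credible interval by Monte Carlo.
draws = sorted(random.betavariate(a, b) for _ in range(100_000))
median = draws[len(draws) // 2]
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"posterior median {median:.2f}, 95% CrI ({lo:.2f}, {hi:.2f})")
```

Here the data pull the optimistic expert prior (mean 0.6) down toward the observed resistance fraction (0.3), mirroring the abstract's finding that experts overestimated resistance for most variant types.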
Topics: Bayes Theorem; Probability; Uncertainty; Genomics
PubMed: 37315052
DOI: 10.1371/journal.pone.0287019