31 results for: Monotone likelihood ratio property
Abstract:
In this paper we propose a highly accurate approximation procedure for ruin probabilities in the classical collective risk model, based on a quadrature/rational approximation procedure proposed in [2]. For a certain class of claim size distributions (which contains the completely monotone distributions) we give a theoretical justification for the method. We also show that, under weaker assumptions on the claim size distribution, the method may still perform reasonably well in some cases. This in particular provides an efficient alternative to a related method proposed in [3]. A number of numerical illustrations of the performance of this procedure are provided, for both completely monotone and other types of random variables.
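To make the target quantity concrete, here is a minimal sketch (not the paper's quadrature/rational approximation) of the ruin probability in the classical Cramér-Lundberg model: a Monte Carlo estimate compared against the closed form available when claim sizes are exponential. All parameter values are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    lam, mean_claim, loading = 1.0, 1.0, 0.2      # illustrative assumptions
    c = (1 + loading) * lam * mean_claim          # premium income rate
    u0 = 5.0                                      # initial capital

    def ruin_probability_mc(u0, horizon=200.0, n_paths=10000):
        """Monte Carlo estimate of the (finite-horizon) ruin probability."""
        ruined = 0
        for _ in range(n_paths):
            t, u = 0.0, u0
            while t < horizon:
                w = rng.exponential(1.0 / lam)              # inter-claim waiting time
                t += w
                u += c * w - rng.exponential(mean_claim)    # premiums in, claim out
                if u < 0:
                    ruined += 1
                    break
        return ruined / n_paths

    # Closed form for exponential claims (infinite horizon):
    # psi(u) = 1/(1+theta) * exp(-theta*u / ((1+theta)*mu))
    psi_exact = (1 / (1 + loading)) * np.exp(-loading * u0 / ((1 + loading) * mean_claim))
    print(ruin_probability_mc(u0), psi_exact)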
Abstract:
Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as they did in the past. There are two main sources of variability in the claims development process: the variability of the speed with which claims are settled and the variability of claim severity between accident years. Large changes in these processes will generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator which, first, identifies and quantifies these two influences and, second, determines which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of stochastic models is that they provide measures of variability of the reserve estimates. The first model (PDM) combines a Dirichlet-Multinomial conjugate family with the Poisson distribution. The second model (NBDM) improves on the first by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). The second model makes it possible to express the variability in reporting speed and in the development of claim severity as a function of two parameters of the above-mentioned distributions: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters were estimated by the method of moments and by maximum likelihood. The results were tested on simulated data and then on real data from three lines of business: Property/Casualty, General Liability, and Accident Insurance. These data exhibit different development patterns and specificities. The thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma distribution, the model exhibits positive correlation between past and future claims payments, which suggests the Chain-Ladder method is appropriate for the claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation implies high expected future payments and therefore high claims reserve estimates. The negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation in which claims are reported rapidly and few claims are expected subsequently. The extreme case arises when all claims are reported at the same time, leading to expected future payments of either zero or the aggregate ultimate paid amount. For this latter case, the Chain-Ladder method is not recommended.
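Since the thesis's conclusions hinge on when the Chain-Ladder method is appropriate, a minimal sketch of its deterministic form on a hypothetical cumulative run-off triangle may help fix ideas; the triangle values below are invented for illustration, and the PDM/NBDM models themselves are not reproduced here.

    import numpy as np

    # Hypothetical cumulative run-off triangle (rows: accident years,
    # columns: development years); NaN marks future cells to be predicted.
    tri = np.array([
        [100.0, 150.0, 170.0, 175.0],
        [110.0, 168.0, 190.0, np.nan],
        [120.0, 175.0, np.nan, np.nan],
        [130.0, np.nan, np.nan, np.nan],
    ])

    n = tri.shape[1]
    factors = []
    for j in range(n - 1):
        rows = ~np.isnan(tri[:, j + 1])            # rows where both columns are observed
        factors.append(tri[rows, j + 1].sum() / tri[rows, j].sum())

    full = tri.copy()
    for j in range(n - 1):
        missing = np.isnan(full[:, j + 1])
        full[missing, j + 1] = full[missing, j] * factors[j]

    # Reserve = projected ultimate minus latest observed cumulative payment.
    latest = np.array([row[~np.isnan(row)][-1] for row in tri])
    reserves = full[:, -1] - latest
    print("development factors:", np.round(factors, 3))
    print("reserves per accident year:", np.round(reserves, 1))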
Abstract:
The present paper focuses on the analysis and discussion of a likelihood ratio (LR) development for propositions at a hierarchical level known in this context as the 'offence level'. Existing literature on the topic has considered LR developments for so-called offender-to-scene transfer cases. These settings involve, in their simplest form, a single stain found at a crime scene, but with possible uncertainty about the degree to which that stain is relevant (i.e. whether it was left by the offender). Extensions to multiple stains or multiple offenders have also been reported. The purpose of this paper is to discuss a development of a LR for offence-level propositions when case settings involve potential transfer in the opposite direction, i.e. victim/scene-to-offender transfer. This setting has not previously been considered. The rationale behind the proposed LR is illustrated through graphical probability models (i.e. Bayesian networks). The role of various uncertain parameters is investigated through sensitivity analyses as well as simulations.
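For orientation only, a widely cited simplified form of the offence-level LR in the single-stain, offender-to-scene direction (the setting this paper extends, not its victim/scene-to-offender development) combines the relevance probability r of the stain with the random-match probability gamma; the numeric values below are purely illustrative.

    def offence_level_lr(r, gamma):
        """Simplified single-stain offence-level LR (offender-to-scene direction).
        r     : probability that the crime stain is relevant, i.e. left by the offender
        gamma : random-match probability of the observed profile"""
        return (r + (1 - r) * gamma) / gamma

    # Full relevance recovers 1/gamma; zero relevance gives a neutral LR of 1.
    print(offence_level_lr(1.0, 0.001))   # 1000.0
    print(offence_level_lr(0.5, 0.001))   # 500.5
    print(offence_level_lr(0.0, 0.001))   # 1.0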
Abstract:
In the last few years, a need to account for molecular flexibility in drug-design methodologies has emerged, even if the dynamic behavior of molecular properties is seldom made explicit. For a flexible molecule, it is indeed possible to compute different values for a given conformation-dependent property, and the ensemble of such values defines a property space that can be used to describe its molecular variability; the most representative case is the lipophilicity space. In this review, a number of applications of lipophilicity space and other property spaces are presented, showing that this concept can be fruitfully exploited: to investigate the constraints exerted by media of different levels of structural organization, to examine processes of molecular recognition and binding at the atomic level, to derive informative descriptors to be included in quantitative structure-activity relationships, and to analyze protein simulations, extracting the relevant information. Much molecular information is neglected in the descriptors used by medicinal chemists, and the concept of property space can fill this gap by accounting for the often-disregarded dynamic behavior of both small ligands and biomacromolecules. Property space also introduces innovative concepts such as molecular sensitivity and plasticity, which appear best suited to explore the ability of a molecule to adapt itself to its environment by modulating its property and conformational profiles. Globally, such concepts can enhance our understanding of biological phenomena, providing fruitful descriptors for drug design and the pharmaceutical sciences.
Abstract:
Two likelihood ratio (LR) approaches are presented to evaluate the strength of evidence in MDMA tablet comparisons. The first is based on a more 'traditional' comparison of MDMA tablets using distance measures (e.g., the Pearson correlation distance or a Euclidean distance). In this approach, LRs are calculated from the distribution of distances between tablets of the same batch and that between tablets of different batches. The second approach is based on methods used in some other fields of forensic comparison: here, LRs are calculated from the distribution of values of MDMA tablet characteristics within a specific batch and from all batches. The data used in this paper should be seen as examples to illustrate both methods; in future research the methods can be applied to other and more complex data. In this paper, the methods and their results are discussed, considering their performance in evidence evaluation and several practical aspects. With respect to evidence in favor of the correct hypothesis, the second method proved to be better than the first: the LRs in same-batch comparisons are generally higher than with the first method, and the LRs in different-batch comparisons are generally lower. On the other hand, for operational purposes (where quick information is needed), the first method may be preferred because it is less time consuming. With the first method, a model only has to be estimated once in a while, so only a few measurements have to be made, whereas with the second method more measurements are needed because a new model has to be estimated each time.
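A minimal sketch of the first, distance-based approach under simplified assumptions: same-batch and different-batch distance scores are simulated, their densities estimated by kernel density estimation, and the LR for a new comparison is the ratio of the two densities at the observed distance. The simulated scores and distributional choices are illustrative only; neither the real tablet data nor the feature-based second approach is reproduced.

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)

    # Simulated training scores (illustrative only): distances between tablet profiles
    # from the same batch tend to be small, different-batch distances tend to be larger.
    same_batch_scores = rng.gamma(shape=2.0, scale=0.05, size=500)
    diff_batch_scores = rng.gamma(shape=5.0, scale=0.15, size=500)

    f_same = gaussian_kde(same_batch_scores)   # density of same-batch distances
    f_diff = gaussian_kde(diff_batch_scores)   # density of different-batch distances

    def score_lr(distance):
        """Score-based LR: how much more probable the observed distance is if the
        two tablets come from the same batch than if they come from different batches."""
        return f_same(distance)[0] / f_diff(distance)[0]

    print(score_lr(0.08))   # small distance -> LR > 1 (supports same batch)
    print(score_lr(0.90))   # large distance -> LR < 1 (supports different batches)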
Abstract:
The oligomeric state of BAFF (B cell activating factor), a tumor necrosis factor (TNF) family cytokine that plays a critical role in B cell development and survival, has been the subject of recent debate. Myc-tagged BAFF starting at residue Gln136 was previously reported to crystallize as trimers at pH 4.5, whereas a histidine-tagged construct of BAFF, starting at residue Ala134, formed a virus-like cluster containing 60 monomers when crystallized at pH 9.0. The formation of the BAFF 60-mer was pH dependent, requiring pH ≥ 7.0. More recently, 60-mer formation was suggested to be artificially induced by the histidine tag, and it was proposed that BAFF, like all other TNF family members, is trimeric. We report here that a construct of BAFF with no amino-terminal tag (Ala134-BAFF) can form a 60-mer in solution. Using size exclusion chromatography and static light scattering to monitor trimer to 60-mer ratios in BAFF preparations, we find that 60-mer formation is pH dependent and requires histidine 218 within the DE loop of BAFF. Biacore measurements established that the affinity of Ala134-BAFF for the BAFF receptor BAFFR/BR3 is similar to that of myc-Gln136-BAFF, which is exclusively trimeric in solution. However, Ala134-BAFF is more efficacious than myc-Gln136-BAFF in inducing B cell proliferation in vitro. We additionally show that BAFF that is processed and secreted by 293T cells transfected with full-length BAFF, or by a histiocytic lymphoma cell line (U937) that expresses BAFF endogenously, forms a pH-dependent 60-mer in solution. Our results indicate that formation of the 60-mer in solution by the BAFF extracellular domain is an intrinsic property of the protein, and therefore that this more active form of BAFF may be physiologically relevant.
Abstract:
OBJECTIVES: This study aimed at measuring the lipophilicity and ionization constants of diastereoisomeric dipeptides, interpreting them in terms of conformational behavior, and developing statistical models to predict them. METHODS: A series of 20 dipeptides of general structure NH2-L-X-(L or D)-His-OMe was designed and synthesized. Their experimental ionization constants (pK1, pK2 and pK3) and lipophilicity parameters (log P(N) and log D(7.4)) were measured by potentiometry. Molecular modeling in three media (vacuum, water, and chloroform) was used to explore and sample their conformational space and, for each stored conformer, to calculate the radius of gyration, the virtual log P (preferably written as log P(MLP), i.e. obtained by the molecular lipophilicity potential (MLP) method) and the polar surface area (PSA). Means and ranges were calculated for these properties, as was their sensitivity (i.e., the ratio between the property range and the number of rotatable bonds). RESULTS: Marked differences between diastereoisomers were seen in their experimental ionization constants and lipophilicity parameters. These differences are explained by molecular flexibility, configuration-dependent differences in intramolecular interactions, and the accessibility of functional groups. Multiple linear equations correlated the experimental lipophilicity parameters and ionization constants with the PSA range and other calculated parameters. CONCLUSION: This study documents the differences in lipophilicity and ionization constants between diastereoisomeric dipeptides. Such configuration-dependent differences are shown to depend markedly on differences in conformational behavior and to be amenable to multiple linear regression. Chirality 24:566-576, 2012. © 2012 Wiley Periodicals, Inc.
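For reference, the standard textbook relation between the neutral-species lipophilicity and the distribution coefficient at a given pH is shown below for a monoprotic base, together with the sensitivity index defined above; this is only to make the measured quantities concrete, since the dipeptides studied carry three ionizable groups and their actual log D(7.4) expression involves all three pK values.

    \[
    \log D_{7.4} \;=\; \log P^{N} \;-\; \log\!\bigl(1 + 10^{\,pK_a - 7.4}\bigr)
    \qquad \text{(monoprotic base)},
    \]
    \[
    \text{sensitivity} \;=\; \frac{x_{\max} - x_{\min}}{N_{\text{rotatable bonds}}}.
    \]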
Abstract:
Well-developed experimental procedures currently exist for retrieving and analyzing particle evidence from the hands of individuals suspected of being associated with the discharge of a firearm. Although analytical approaches (e.g. automated scanning electron microscopy with energy-dispersive X-ray microanalysis, SEM-EDS) allow the determination of the presence of elements typically found in gunshot residue (GSR) particles, such analyses provide no information about a given particle's actual source. Possible origins that scientists may need to account for are a primary exposure to the discharge of a firearm or a secondary transfer due to a contaminated environment. In order to approach such sources of uncertainty in the context of evidential assessment, this paper studies the construction and practical implementation of graphical probability models (i.e. Bayesian networks). These can assist forensic scientists in making the issue tractable within a probabilistic perspective. The proposed models focus on likelihood ratio calculations at various levels of detail as well as on case pre-assessment.
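A minimal numeric sketch of the kind of likelihood ratio such networks compute, collapsed to a single findings node; every probability below is invented for illustration and does not come from the paper's models.

    # Hp: the person discharged a firearm (primary exposure)
    # Hd: the person did not, but may have picked up particles by secondary transfer
    p_gsr_given_discharge = 0.80    # particles detected after a primary exposure (assumed)
    p_secondary_transfer = 0.05     # chance of a contaminated-environment transfer (assumed)
    p_gsr_given_transfer = 0.60     # detection probability given such a transfer (assumed)
    p_background = 0.01             # detection from unrelated background sources (assumed)

    p_e_given_hp = p_gsr_given_discharge
    p_e_given_hd = (p_secondary_transfer * p_gsr_given_transfer
                    + (1 - p_secondary_transfer) * p_background)

    print("LR =", p_e_given_hp / p_e_given_hd)   # ~20: the findings support Hp here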
Abstract:
We modelled the future distribution in 2050 of 975 endemic plant species in southern Africa, distributed among seven life forms, incorporating new methodological insights that improve the accuracy and ecological realism of predictions in global change studies by: (i) using only endemic species as a way to capture the full realized niche of species, (ii) considering the direct impact of human pressure on landscape and biodiversity jointly with climate, and (iii) taking species' migration into account. Our analysis shows considerable promise for predicting the impacts of climate change in conjunction with land transformation. We show that the endemic flora of southern Africa decreases on average by 41% in species richness across habitats and by 39% in species distribution range under the most optimistic scenario. We also compared the patterns of species' sensitivity to global change across life forms, using the ecological and geographic characteristics of species. We demonstrate here that species and life-form vulnerability to global changes can be partly explained by the species' (i) geographical distribution along climatic and biogeographic gradients, such as climate anomalies, (ii) niche breadth, or (iii) proximity to barriers preventing migration. Our results confirm that the sensitivity of a given species to global environmental changes depends upon its geographical distribution and ecological properties, and make it possible to estimate a priori its potential sensitivity to these changes.
Abstract:
PURPOSE: All methods presented to date to map both conductivity and permittivity rely on multiple acquisitions to quantitatively compute the magnitude of the radiofrequency transmit field, B1+. In this work, we propose a method to compute both conductivity and permittivity based solely on relative receive coil sensitivities (B1-), which can be obtained in a single measurement without the need either to explicitly perform transmit/receive phase separation or to make assumptions regarding those phases. THEORY AND METHODS: To demonstrate the validity and the noise sensitivity of our method we used finite-difference electromagnetic simulations of a 16-channel transceiver array. To experimentally validate our methodology at 7 Tesla, multi-compartment phantom data were acquired using a standard 32-channel receive coil system and two-dimensional (2D) and 3D gradient echo acquisitions. The reconstructed electric properties were correlated with those measured using dielectric probes. RESULTS: The method was demonstrated both in simulations and in phantom data, with correlations to both the modeled and bench measurements being close to identity. The noise properties were modeled and understood. CONCLUSION: The proposed methodology allows the electrical properties of a sample to be determined quantitatively using any MR contrast, the only constraints being the need for 4 or more receive coils and high SNR. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
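For background, most electrical properties tomography reconstructions rest on the homogeneous-region Helmholtz relation between a measured field component and the complex permittivity; the simplified form below (with signs depending on the time-harmonic convention) is given only to make the reconstructed quantities explicit and is not the paper's specific receive-sensitivity formulation.

    \[
    \nabla^{2} B_{1} \;=\; -\,\omega^{2}\mu_{0}\,\varepsilon_{c}\,B_{1},
    \qquad
    \varepsilon_{c} \;=\; \varepsilon_{0}\varepsilon_{r} \;-\; i\,\frac{\sigma}{\omega},
    \]
    so that, in regions where the properties are locally uniform,
    \[
    \varepsilon_{r} \;=\; -\,\mathrm{Re}\!\left\{\frac{\nabla^{2} B_{1}}{\omega^{2}\mu_{0}\varepsilon_{0}\,B_{1}}\right\},
    \qquad
    \sigma \;=\; \mathrm{Im}\!\left\{\frac{\nabla^{2} B_{1}}{\omega\,\mu_{0}\,B_{1}}\right\}.
    \]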
Abstract:
The use of the Bayes factor (BF), or likelihood ratio, as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, progress towards more widespread consensus about foundational principles is still fragile, as it raises new problems on which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e. a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in a given case at hand, forensic scientists ought to offer a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.