793 results for Probabilistic Aggregation Criteria
Abstract:
This journal provides immediate open access to its content on the principle that making research freely available to the public supports a greater global exchange of knowledge.
Abstract:
This article is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. The Attribution-NonCommercial (CC BY-NC) license lets others remix, tweak, and build upon the work non-commercially, provided the new works acknowledge the original and remain non-commercial.
Abstract:
International Scientific Forum, ISF 2013, 12-14 December 2013, Tirana.
Abstract:
3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.
Abstract:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract:
Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions of how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information, or the (expected) decision loss. All of these aspects directly relate to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples. The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
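As a purely illustrative sketch of the kind of probability statement and expected-loss comparison discussed in this abstract (not the paper's procedures, which are developed with decision trees and Bayesian decision networks), the following assumes a Beta prior on the seizure proportion and invented loss values:

```python
# Minimal sketch (not the authors' code): Beta-binomial inference about a
# seizure proportion, plus a toy expected-loss comparison between decisions.
# Prior parameters, loss values and the threshold are illustrative assumptions.
from scipy import stats

def prob_theta_exceeds(successes, trials, threshold, a_prior=1.0, b_prior=1.0):
    """P(theta > threshold | data) under a Beta(a, b) prior and binomial sampling."""
    posterior = stats.beta(a_prior + successes, b_prior + trials - successes)
    return posterior.sf(threshold)

def expected_loss(decision, successes, trials, threshold, losses,
                  a_prior=1.0, b_prior=1.0):
    """Expected loss of deciding 'above' or 'below' the threshold proportion."""
    p_above = prob_theta_exceeds(successes, trials, threshold, a_prior, b_prior)
    if decision == "above":   # loss incurred only if theta is actually below the threshold
        return losses["false_above"] * (1.0 - p_above)
    return losses["false_below"] * p_above

if __name__ == "__main__":
    # 12 of 15 analysed items tested positive; is the seizure proportion > 0.5?
    print(prob_theta_exceeds(12, 15, 0.5))
    losses = {"false_above": 10.0, "false_below": 1.0}   # assumed loss values
    for d in ("above", "below"):
        print(d, expected_loss(d, 12, 15, 0.5, losses))
```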
Abstract:
This paper reviews almost four decades of contributions on the subject of supervised regionalization methods. These methods aggregate a set of areas into a predefined number of spatially contiguous regions while optimizing certain aggregation criteria. The authors present a taxonomic scheme that classifies a wide range of regionalization methods into eight groups, based on the strategy applied for satisfying the spatial contiguity constraint. The paper concludes by providing a qualitative comparison of these groups in terms of a set of characteristics, and by suggesting future lines of research for extending and improving these methods.
Abstract:
We developed a gel- and label-free proteomics platform for comparative studies of human serum. The method involves the depletion of the six most abundant proteins, protein fractionation by Off-Gel IEF and RP-HPLC, followed by tryptic digestion, LC-MS/MS, protein identification, and relative quantification using probabilistic peptide match score summation (PMSS). We evaluated the performance and reproducibility of the complete platform and of the individual dimensions, using chromatograms of the RP-HPLC runs, PMSS-based abundance scores, and abundance distributions as objective endpoints. We were also interested in whether a relationship exists between the quantity ratio and the PMSS score ratio. The complete analysis was performed four times with two sets of serum samples containing different concentrations of spiked bovine beta-lactoglobulin (0.1 and 0.3%, w/w). The two concentrations resulted in significantly differing PMSS scores when compared to the variability in PMSS scores of all other protein identifications. We identified 196 proteins, of which 116 were identified four times in corresponding fractions, and of these, 73 qualified for relative quantification. Finally, we characterized the PMSS-based protein abundance distributions with respect to the two dimensions of fractionation and discussed some interesting patterns representing discrete isoforms. We conclude that the combination of Off-Gel electrophoresis (OGE) and HPLC is a reproducible protein fractionation technique, that PMSS is applicable to relative quantification, that the number of quantifiable proteins is always smaller than the number of identified proteins, and that reproducibility of protein identifications should supplement probabilistic acceptance criteria.
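A minimal sketch of the relative-quantification idea, assuming summed peptide match scores per protein and invented example data (the published PMSS scoring and acceptance criteria are not reproduced here):

```python
# Schematic only (not the published PMSS pipeline): sum peptide match scores
# per protein in two samples and take their ratio as a relative abundance
# estimate. Protein names and score values below are invented for illustration.
from collections import defaultdict

def pmss(peptide_matches):
    """Sum peptide match scores per protein accession."""
    totals = defaultdict(float)
    for protein, score in peptide_matches:
        totals[protein] += score
    return totals

sample_a = [("BLG_BOVIN", 45.2), ("BLG_BOVIN", 38.1), ("ALBU_HUMAN", 120.0)]
sample_b = [("BLG_BOVIN", 130.5), ("BLG_BOVIN", 95.3), ("ALBU_HUMAN", 118.7)]

scores_a, scores_b = pmss(sample_a), pmss(sample_b)
for protein in scores_a.keys() & scores_b.keys():
    print(protein, scores_b[protein] / scores_a[protein])
```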
Abstract:
The Water Framework Directive uses the "One-out, all-out" (OOAO) principle in assessing water bodies (i.e., the worst status of the elements used in the assessment determines the final status of the water body). Combination of multiple parameters within a biological quality element (BQE) can be done in different ways. This study analysed several aggregation conditions within the BQE "Flora other than phytoplankton" (intertidal macroalgae, subtidal macroalgae, eelgrass beds and opportunistic blooms) using monitoring data collected along the Channel and Atlantic coastline. Four aggregation criteria were tested on two sets of data collected between 2004 and 2014: OOAO, averaging, an intermediate method between OOAO and averaging, and a method taking into account an uncertainty value at the Good/Moderate threshold. Based on the available data, the intermediate method appears to be the most suitable: it first averages the natural habitat elements and then applies OOAO between this mean and the opportunistic blooms, which are characteristic of a eutrophic environment. Expert judgment might be used in the overall interpretation of results at the waterbody level and in the classification outcomes.
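A rough sketch of the aggregation rules compared in the study, applied to normalised index values in [0, 1]; the Good/Moderate threshold, the uncertainty margin, and the exact form of the uncertainty-based rule are assumptions made here for illustration, not the study's implementation:

```python
# Illustrative aggregation rules on normalised index values (higher = better).
def one_out_all_out(values):
    return min(values)

def average(values):
    return sum(values) / len(values)

def intermediate(habitat_values, opportunistic_blooms):
    # Average the natural habitat elements first, then apply OOAO against
    # the opportunistic-bloom index (an indicator of eutrophication).
    return min(average(habitat_values), opportunistic_blooms)

def ooao_with_uncertainty(values, good_moderate=0.6, margin=0.05):
    # Assumed form of the uncertainty-based rule: ignore the worst element when
    # it lies within the uncertainty band around the Good/Moderate threshold
    # and at least one other element reaches Good status.
    worst = min(values)
    if abs(worst - good_moderate) <= margin and max(values) >= good_moderate:
        return average(values)
    return worst

habitats = [0.72, 0.65, 0.58]   # e.g. intertidal macroalgae, subtidal macroalgae, eelgrass
blooms = 0.61                   # opportunistic blooms index
print(one_out_all_out(habitats + [blooms]), average(habitats + [blooms]),
      intermediate(habitats, blooms), ooao_with_uncertainty(habitats + [blooms]))
```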
Abstract:
It is essential for organizations to compress detailed sets of information into more comprehensive sets, thereby establishing sharp data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal levels of information aggregation. Beyond the fundamental information content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Different from the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, estimating mode or median values will result in underestimations of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In chapter 3, I conduct an experimental study to compare two approaches to time estimation for cost accounting, i.e., traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations. However, I also provide evidence that TD-ABC seems better suited for cost allocations than traditional ABC.
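A toy numerical check of the aggregation bias described in chapter 2, assuming lognormal cost distributions (not the thesis's experimental materials): means add linearly, whereas summing per-task modes understates the expected total.

```python
# Assumed lognormal cost distributions for three project tasks, given as (mu, sigma)
# on the log scale. For a lognormal, mean = exp(mu + sigma^2 / 2) and
# mode = exp(mu - sigma^2), so the mode sits well below the mean for skewed costs.
import math

tasks = [(math.log(10.0), 0.8), (math.log(25.0), 0.6), (math.log(5.0), 1.0)]

means = [math.exp(mu + 0.5 * s**2) for mu, s in tasks]
modes = [math.exp(mu - s**2) for mu, s in tasks]

print("sum of means:", sum(means))   # equals the mean of the total project cost
print("sum of modes:", sum(modes))   # systematically lower for right-skewed costs
```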
Abstract:
Background: Huntington's disease (HD) is an inherited neurodegenerative disorder triggered by an expanded polyglutamine tract in huntingtin that is thought to confer a new conformational property on this large protein. The propensity of small amino-terminal fragments with mutant, but not wild-type, glutamine tracts to self-aggregate is consistent with an altered conformation, but such fragments occur relatively late in the disease process in human patients and mouse models expressing full-length mutant protein. This suggests that the altered conformational property may act within the full-length mutant huntingtin to initially trigger pathogenesis. Indeed, genotype-phenotype studies in HD have defined genetic criteria for the disease-initiating mechanism, and these are all fulfilled by phenotypes associated with expression of full-length mutant huntingtin, but not the amino-terminal fragment, in mouse models. As the in vitro aggregation of amino-terminal mutant huntingtin fragments offers a ready assay to identify small compounds that interfere with the conformation of the polyglutamine tract, we have identified a number of aggregation inhibitors and tested whether these are also capable of reversing a phenotype caused by endogenous expression of mutant huntingtin in a striatal cell line from the HdhQ111/Q111 knock-in mouse. Results: We screened the NINDS Custom Collection of 1,040 FDA-approved drugs and bioactive compounds for their ability to prevent in vitro aggregation of the Q58-htn 1-171 amino-terminal fragment. Ten compounds were identified that inhibited aggregation with IC50 < 15 µM, including gossypol, gambogic acid, juglone, celastrol, sanguinarine and anthralin. Of these, both juglone and celastrol were effective in reversing the abnormal cellular localization of full-length mutant huntingtin observed in mutant HdhQ111/Q111 striatal cells. Conclusions: At least some compounds identified as aggregation inhibitors also prevent a neuronal cellular phenotype caused by full-length mutant huntingtin, suggesting that in vitro fragment aggregation can act as a proxy for monitoring the disease-producing conformational property in HD. Thus, identification and testing of compounds that alter in vitro aggregation is a viable approach for defining potential therapeutic compounds that may act on the deleterious conformational property of full-length mutant huntingtin.
Abstract:
In this master's thesis, a probabilistic risk analysis of external threats was carried out for the interim storage facility for spent nuclear fuel, based on pool storage, located at the Olkiluoto nuclear power plant. Probabilistic risk analysis (PRA) is a commonly used approach for identifying and assessing risks at nuclear power plants. The purpose of the work was to produce an entirely new external-threats PRA, since comparable risk assessments in this research area have not previously been carried out in Finland. A further motivation for the risk assessment is the heightened role of external threats in the safety of interim storage of spent nuclear fuel, following natural disasters that have occurred around the world. The structure of the PRA was based on a methodology created at the beginning of the study. The analysis rests on the identification of possible external threats, excluding intentional man-made damage. Based on the occurrence frequencies and damage potential of the identified external threats, the threats were either screened out using the screening criteria defined in the study or analysed in more detail. The results show that data on very rare external threats are incomplete. Most of these very rare external threats have never occurred, and will probably never occur, in the area affecting Olkiluoto or even in Finland. For example, the roles of lightning strikes and oil exposure, and their effects on the availability of various components, are known only with considerable uncertainty. The results of the study can be regarded as significant overall, because they make it possible to identify the external threats whose effects should be studied in more detail. More detailed knowledge of very rarely occurring external threats would refine the estimates of initiating event frequencies.
Harsanyi’s Social Aggregation Theorem: A Multi-Profile Approach with Variable-Population Extensions
Abstract:
This paper provides new versions of Harsanyi’s social aggregation theorem that are formulated in terms of prospects rather than lotteries. Strengthening an earlier result, fixed-population ex-ante utilitarianism is characterized in a multi-profile setting with fixed probabilities. In addition, we extend the social aggregation theorem to social-evaluation problems under uncertainty with a variable population and generalize our approach to uncertain alternatives, which consist of compound vectors of probability distributions and prospects.
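Schematically, and with notation chosen here rather than taken from the paper, the fixed-population ex-ante utilitarian ranking that results of this type characterize can be written as follows:

```latex
% Schematic statement (notation assumed here, not copied from the paper):
% a prospect x assigns outcome x(s) to each state s with fixed probability p(s).
% Individual i's expected utility and the ex-ante utilitarian social ranking:
\[
  EU_i(x) \;=\; \sum_{s \in S} p(s)\, u_i\bigl(x(s)\bigr),
  \qquad
  x \succsim y \;\Longleftrightarrow\; \sum_{i=1}^{n} EU_i(x) \;\ge\; \sum_{i=1}^{n} EU_i(y).
\]
```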
Abstract:
We reconsider the problem of aggregating individual preference orderings into a single social ordering when alternatives are lotteries and individual preferences are of the von Neumann-Morgenstern type. Relative egalitarianism ranks alternatives by applying the leximin ordering to the distributions of (0-1) normalized utilities they generate. We propose an axiomatic characterization of this aggregation rule and discuss related criteria.
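A minimal sketch of the relative egalitarian ranking described above, with invented utility numbers; the 0-1 normalisation uses assumed per-individual worst and best outcomes:

```python
# Sketch only (not the authors' formal definition): rank two lotteries by applying
# the leximin order to 0-1 normalised utility profiles.
def normalise(profile, worst, best):
    """Rescale each individual's utility so that worst -> 0 and best -> 1."""
    return [(u - lo) / (hi - lo) for u, lo, hi in zip(profile, worst, best)]

def leximin_prefers(a, b):
    """True if profile a is strictly leximin-better than profile b."""
    for ua, ub in zip(sorted(a), sorted(b)):   # compare worst-off positions first
        if ua != ub:
            return ua > ub
    return False

worst, best = [0.0, 0.0, 0.0], [10.0, 20.0, 5.0]   # assumed individual utility ranges
x = normalise([4.0, 12.0, 3.0], worst, best)
y = normalise([2.0, 18.0, 4.0], worst, best)
print(leximin_prefers(x, y))
```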