958 results for Probabilistic generalization


Relevance: 10.00%

Abstract:

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, meaning that either each process obtains the item it was expecting or no process obtains any information on the inputs of the others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange.
This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
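The reachable majority condition is only named, not defined, in the abstract. As a hypothetical illustration of how such a connectivity condition could be checked, the sketch below tests, over a directed trust graph, whether every process can reach a majority of a designated trusted set; the graph encoding and the majority threshold are assumptions for illustration, not the thesis's formal definition.

```python
from collections import deque


def reachable(graph, start):
    """Return the set of nodes reachable from `start` via directed edges (BFS)."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen


def reachable_majority(graph, trusted, processes):
    """Hypothetical reading of the condition: every process can reach,
    through the trust graph, a strict majority of the trusted set."""
    need = len(trusted) // 2 + 1
    return all(len(reachable(graph, p) & trusted) >= need for p in processes)
```

For instance, a process connected to only one of three trusted modules would violate this reading of the condition, while processes reaching two or more would satisfy it.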

Relevance: 10.00%

Abstract:

Introduction: Early detection of breast cancer (BC) with mammography may cause overdiagnosis and overtreatment, detecting tumors which would have remained undiagnosed during a lifetime. The aims of this study were: first, to model invasive BC incidence trends in Catalonia (Spain) taking into account reproductive and screening data; and second, to quantify the extent of BC overdiagnosis. Methods: We modeled the incidence of invasive BC using a Poisson regression model. Explanatory variables were: age at diagnosis and cohort characteristics (completed fertility rate, percentage of women that use mammography at age 50, and year of birth). This model was also used to estimate the background incidence in the absence of screening. We used a probabilistic model to estimate the expected BC incidence if women in the population used mammography as reported in health surveys. The difference between the observed and expected cumulative incidences provided an estimate of overdiagnosis. Results: Incidence of invasive BC increased, especially in cohorts born from 1940 to 1955. The biggest increase was observed in these cohorts between the ages of 50 and 65 years, where the final BC incidence rates more than doubled the initial ones. Dissemination of mammography was significantly associated with BC incidence and overdiagnosis. Our estimates of overdiagnosis ranged from 0.4% to 46.6% for women born around 1935 and 1950, respectively. Conclusions: Our results support the existence of overdiagnosis in Catalonia attributable to mammography usage, and the limited malignant potential of some tumors may play an important role. Women should be better informed about this risk. Research should be oriented towards personalized screening and risk assessment tools.
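The overdiagnosis estimate described above, the difference between observed and expected cumulative incidence, can be sketched in a few lines; expressing the excess as a percentage of the observed incidence is an assumption for illustration, since the abstract does not state the denominator.

```python
def overdiagnosis_pct(observed_cumulative, expected_cumulative):
    """Overdiagnosis as the excess of observed cumulative incidence over the
    incidence expected under reported mammography use, expressed here as a
    percentage of the observed incidence (denominator choice is assumed)."""
    excess = observed_cumulative - expected_cumulative
    return 100.0 * excess / observed_cumulative
```

With hypothetical cumulative incidences of 500 observed vs. 450 expected cases per 100,000 women, this yields a 10% overdiagnosis estimate.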

Relevance: 10.00%

Abstract:

To permit the tracking of turbulent flow structures in an Eulerian frame from single-point measurements, we make use of a generalization of conventional two-dimensional quadrant analysis to three-dimensional octants. We characterize flow structures using the sequences of these octants and show how significance may be attached to particular sequences using statistical null models. We analyze an example experiment and show how a particular dominant flow structure can be identified from the conditional probability of octant sequences. The frequency of this structure corresponds to the dominant peak in the velocity spectra, and the structure accounts for a high proportion of the total shear stress. We link this structure explicitly to the propensity for sediment entrainment and show that greater insight into sediment entrainment can be obtained by disaggregating those octants that occur within the identified macroturbulence structure from those that do not. Hence, this work goes beyond critiques of Reynolds stress approaches to bed load entrainment that highlight the importance of outward interactions, to identifying and prioritizing the quadrants/octants that define particular flow structures.

Key Points:
- A new method for analysing single-point velocity data is presented
- Flow structures are identified by a sequence of flow states (termed octants)
- The identified structure exerts high stresses and causes bed-load entrainment
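The octant classification that generalizes two-dimensional quadrant analysis can be sketched directly: each velocity-fluctuation triple (u', v', w') maps to one of eight sign combinations. The numbering convention below is an illustrative choice, not necessarily that of the paper.

```python
def octant(u, v, w):
    """Classify a velocity-fluctuation triple (u', v', w') into one of eight
    octants by the sign of each component, generalizing the four quadrants
    of classical quadrant analysis. Octants are numbered 0-7: bit 2 is set
    when u' >= 0, bit 1 when v' >= 0, bit 0 when w' >= 0."""
    return ((u >= 0) << 2) | ((v >= 0) << 1) | (w >= 0)


def octant_sequence(samples):
    """Map a single-point time series of (u', v', w') triples to its octant
    sequence, the raw material for the sequence-probability analysis."""
    return [octant(u, v, w) for u, v, w in samples]
```

Conditional probabilities of such sequences, compared against a null model, are then what identify the dominant structure described in the abstract.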

Relevance: 10.00%

Abstract:

P. Bourdieu's class model articulates the objective dimension (class structure) with the subjective dimension (class action). This class analysis forms part of a debate on the nature of social reality, in which the central point is the connection between the field of production and the field of the reproduction of subjects. By emphasizing the relational condition of the social sphere, Bourdieu defines social action as dependent on the relational structure, since there is a necessary logical connection between the location of agents in a set of social relations and their interests, objectives and strategies of action. For this author, therefore, the class structure has a structuring efficacy over the action of social agents, and it thus constitutes a matrix of action or, more precisely, a probabilistic structure of action. It is therefore fundamental to consider the role of action in the construction of classes as such, since theoretical classes, fictitious groupings that exist only on paper, are predisposed to become classes in the Marxist sense of the term. And the passage from the class on paper to the real class is achieved only at the cost of a political labour of mobilization. Thus the existence of classes, both in theory and in reality, is a stake in struggles, since there is a social space, a space of differences, in which classes exist in a virtual state, not as something given but as something to be constructed.

Relevance: 10.00%

Abstract:

Background: Breast cancer (BC) causes more deaths than any other cancer among women in Catalonia. Early detection has contributed to the observed decline in BC mortality. However, there is debate on the optimal screening strategy. We performed an economic evaluation of 20 screening strategies taking into account the cost over time of screening and subsequent medical costs, including diagnostic confirmation, initial treatment, follow-up and advanced care. Methods: We used a probabilistic model to estimate the effect and costs over time of each scenario. The effect was measured as years of life (YL), quality-adjusted life years (QALY), and lives extended (LE). Costs of screening and treatment were obtained from the Early Detection Program and hospital databases of the IMAS-Hospital del Mar in Barcelona. The incremental cost-effectiveness ratio (ICER) was used to compare the relative costs and outcomes of different scenarios. Results: Strategies that start at ages 40 or 45 and end at 69 predominate when the effect is measured as YL or QALYs. Biennial strategies 50-69, 45-69 or annual 45-69, 40-69 and 40-74 were selected as cost-effective for both effect measures (YL or QALYs). The ICER increases considerably when moving from biennial to annual scenarios. Moving from no screening to biennial 50-69 years represented an ICER of 4,469€ per QALY. Conclusions: A reduced number of screening strategies have been selected for consideration by researchers, decision makers and policy planners. Mathematical models are useful to assess the impact and costs of BC screening in a specific geographical area.
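The incremental cost-effectiveness ratio used to compare scenarios above is the incremental cost divided by the incremental effect. A minimal sketch, with purely hypothetical figures rather than the study's data:

```python
def icer(cost_a, cost_b, effect_a, effect_b):
    """Incremental cost-effectiveness ratio of strategy A versus comparator B:
    additional cost per additional unit of effect (e.g. euros per QALY)."""
    return (cost_a - cost_b) / (effect_a - effect_b)
```

For example, a strategy costing 500 more than its comparator while gaining 0.25 QALYs has an ICER of 2,000 per QALY; this is how moving from biennial to annual screening can show a sharply rising ICER even when both costs and effects increase.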

Relevance: 10.00%

Abstract:

Background: The G1-to-S transition of the cell cycle in the yeast Saccharomyces cerevisiae involves an extensive transcriptional program driven by transcription factors SBF (Swi4-Swi6) and MBF (Mbp1-Swi6). Activation of these factors ultimately depends on the G1 cyclin Cln3. Results: To determine the transcriptional targets of Cln3 and their dependence on SBF or MBF, we first used DNA microarrays to interrogate gene expression upon Cln3 overexpression in synchronized cultures of strains lacking components of SBF and/or MBF. Second, we integrated this expression dataset together with other heterogeneous data sources into a single probabilistic model based on Bayesian statistics. Our analysis has produced more than 200 transcription factor-target assignments, validated by ChIP assays and by functional enrichment. Our predictions show higher internal coherence and predictive power than previous classifications. Our results support a model whereby SBF and MBF may be differentially activated by Cln3. Conclusions: Integration of heterogeneous genome-wide datasets is key to building accurate transcriptional networks. By such integration, we provide here a reliable transcriptional network at the G1-to-S transition in the budding yeast cell cycle. Our results suggest that to improve the reliability of predictions we need to feed our models with more informative experimental data.
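The abstract does not specify the form of the Bayesian model. As a simplified sketch of how heterogeneous evidence sources can be integrated for a candidate transcription factor-target pair, a naive-Bayes combination multiplies a prior odds by one likelihood ratio per independent data source; this is an illustrative assumption, not the paper's actual model.

```python
def posterior_prob(prior, likelihood_ratios):
    """Combine independent evidence sources for a candidate TF-target pair:
    posterior odds = prior odds x product of per-source likelihood ratios
    (naive-Bayes assumption), then convert odds back to a probability."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)
```

Starting from a 50% prior, a single source three times more likely under the "true target" hypothesis raises the posterior to 75%; sources with ratios below 1 pull it down symmetrically.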

Relevance: 10.00%

Abstract:

Multisensory memory traces established via single-trial exposures can impact subsequent visual object recognition. This impact appears to depend on the meaningfulness of the initial multisensory pairing, implying that multisensory exposures establish distinct object representations that are accessible during later unisensory processing. Multisensory contexts may be particularly effective in influencing auditory discrimination, given the purportedly inferior recognition memory in this sensory modality. The possibility of this generalization and the equivalence of effects when memory discrimination was performed in the visual vs. auditory modality were the focus of this study. First, we demonstrate that visual object discrimination is affected by the context of prior multisensory encounters, replicating and extending previous findings by controlling for the probability of multisensory contexts during initial as well as repeated object presentations. Second, we provide the first evidence that single-trial multisensory memories impact subsequent auditory object discrimination. Auditory object discrimination was enhanced when initial presentations entailed semantically congruent multisensory pairs and was impaired after semantically incongruent multisensory encounters, compared to sounds that had been encountered only in a unisensory manner. Third, the impact of single-trial multisensory memories upon unisensory object discrimination was greater when the task was performed in the auditory vs. visual modality. Fourth, there was no evidence for a correlation between effects of past multisensory experiences on visual and auditory processing, suggestive of largely independent object processing mechanisms between modalities. We discuss these findings in terms of the conceptual short-term memory (CSTM) model and predictive coding. Our results suggest differential recruitment and modulation of conceptual memory networks according to the sensory task at hand.

Relevance: 10.00%

Abstract:

Let (P, Q) be a C¹ vector field defined in an open subset U ⊂ R². We call a null divergence factor a C¹ solution V(x, y) of the equation P ∂V/∂x + Q ∂V/∂y = (∂P/∂x + ∂Q/∂y) V. In previous works it has been shown that this function plays a fundamental role in the problem of the center and in the determination of the limit cycles. In this paper we show how to construct systems with a given null divergence factor. The method presented in this paper is a generalization of the classical Darboux method to generate integrable systems.
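As a worked check of the definition (an illustrative example, not one taken from the paper), the linear system P = x, Q = y admits V = xy as a null divergence factor, since both sides of the defining equation reduce to 2xy:

```latex
% Worked example: for P = x, Q = y, the function V = xy satisfies
% P V_x + Q V_y = (P_x + Q_y) V.
P\frac{\partial V}{\partial x} + Q\frac{\partial V}{\partial y}
  = x \cdot y + y \cdot x = 2xy,
\qquad
\left(\frac{\partial P}{\partial x} + \frac{\partial Q}{\partial y}\right)V
  = (1 + 1)\,xy = 2xy .
```

The Darboux-style construction in the paper runs in the opposite direction: fix a desired V and build vector fields (P, Q) satisfying this identity.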

Relevance: 10.00%

Abstract:

To preserve the biodiversity of the forest ecosystems of Mediterranean Europe under current and future global-change scenarios through sustainable forest management, it is necessary to determine how the environment and the characteristics of the forests themselves influence the biodiversity they harbour. To this end, we analysed the influence of different environmental factors and of forest structure and composition on forest bird richness at a 1 × 1 km scale in Catalonia (NE Spain). Univariate and multivariate neural network models were built to, respectively, explore the individual response to each variable and obtain a parsimonious (ecologically interpretable) and accurate model. Forest area (with canopy cover above 5%), mean canopy cover, mean annual temperature and mean summer precipitation were the best predictors of forest bird richness. The multivariate neural network obtained showed good generalization ability except at the sites with the highest richness. In addition, forests with different degrees of canopy openness, older stands and stands more diverse in tree species composition were positively associated with higher forest bird richness. Finally, management guidelines for forest planning are provided to promote bird diversity in this region of Mediterranean Europe.
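A one-hidden-layer feedforward network of the kind used in the study, mapping predictors such as forest area, canopy cover, annual temperature and summer precipitation to a richness estimate, can be sketched as a plain forward pass. The weights below are placeholders, not the fitted model from the study.

```python
import math


def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a one-hidden-layer neural network with tanh hidden
    units and a linear output, as commonly used to relate environmental
    predictors to species richness. All weights here are illustrative."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(w, x)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    return sum(wo * h for wo, h in zip(w_out, hidden)) + b_out
```

The univariate models mentioned in the abstract correspond to running such a network with a single predictor at a time to visualize its individual response curve.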

Relevance: 10.00%

Abstract:

Random problem distributions have played a key role in the study and design of algorithms for constraint satisfaction and Boolean satisfiability, as well as in our understanding of problem hardness, beyond standard worst-case complexity. We consider random problem distributions from a highly structured problem domain that generalizes the Quasigroup Completion problem (QCP) and Quasigroup with Holes (QWH), a widely used domain that captures the structure underlying a range of real-world applications. Our problem domain is also a generalization of the well-known Sudoku puzzle: we consider Sudoku instances of arbitrary order, with the additional generalization that the block regions can have rectangular shape, in addition to the standard square shape. We evaluate the computational hardness of Generalized Sudoku instances for different parameter settings. Our experimental hardness results show that we can generate instances that are considerably harder than QCP/QWH instances of the same size. More interestingly, we show the impact of different balancing strategies on problem hardness. We also provide insights into backbone variables in Generalized Sudoku instances and how they correlate to problem hardness.
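A generalized Sudoku with R × C rectangular blocks has order n = R · C: every row, column, and block must contain each value 1..n exactly once. A minimal validity checker for completed grids (not the instance generator used in the paper) makes the structure concrete:

```python
def is_valid_generalized_sudoku(grid, block_rows, block_cols):
    """Check a completed generalized Sudoku of order n = block_rows * block_cols:
    every row, every column and every block_rows x block_cols block must
    contain each of 1..n exactly once."""
    n = block_rows * block_cols
    full = set(range(1, n + 1))
    rows_ok = all(set(row) == full for row in grid)
    cols_ok = all(set(col) == full for col in zip(*grid))
    blocks_ok = all(
        {grid[r + dr][c + dc]
         for dr in range(block_rows) for dc in range(block_cols)} == full
        for r in range(0, n, block_rows)
        for c in range(0, n, block_cols)
    )
    return rows_ok and cols_ok and blocks_ok
```

With block_rows = block_cols this reduces to standard Sudoku; the rectangular case (e.g. 2 × 3 blocks for order 6) is the additional generalization studied in the paper.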

Relevance: 10.00%

Abstract:

A method for dealing with monotonicity constraints in optimal control problems is used to generalize some results in the context of monopoly theory, also extending the generalization to a large family of principal-agent programs. Our main conclusion is that many results on diverse economic topics, achieved under assumptions of continuity and piecewise differentiability in connection with the endogenous variables of the problem, still remain valid after replacing such assumptions by two minimal requirements.

Relevance: 10.00%

Abstract:

The purpose of this study is to clarify what the audit expectation gap is, what causes it, and how it can be narrowed. First, auditing, good auditing practice, what statutory auditing and consulting comprise, and the particular features of auditing small and medium-sized enterprises are reviewed. The audit expectation gap is then examined on the basis of the literature and journal articles: what the expectation gap is, what causes it, and how it can be narrowed. The empirical part of the study follows, in which authorized auditors answered questions concerning the audit expectation gap. It can be said that the audit expectation gap is the gap between the public's expectations of an audit and what a statutory audit actually is. There are many reasons for the expectation gap, including the probabilistic nature of auditing and the public's excessive expectations. The expectation gap can be narrowed in two ways: by educating the public or by changing the nature of auditing to match the public's expectations.

Relevance: 10.00%

Abstract:

To compare the cost and effectiveness of the levonorgestrel-releasing intrauterine system (LNG-IUS) versus combined oral contraception (COC) and progestogens (PROG) in first-line treatment of dysfunctional uterine bleeding (DUB) in Spain. STUDY DESIGN: A cost-effectiveness and cost-utility analysis of LNG-IUS, COC and PROG was carried out using a Markov model based on clinical data from the literature and expert opinion. The population studied were women with a previous diagnosis of idiopathic heavy menstrual bleeding. The analysis was performed from the National Health System perspective, discounting both costs and future effects at 3%. In addition, a sensitivity analysis (univariate and probabilistic) was conducted. RESULTS: The results show that the greater efficacy of LNG-IUS translates into a gain of 1.92 and 3.89 symptom-free months (SFM) after six months of treatment versus COC and PROG, respectively (which represents an increase of 33% and 60% of symptom-free time). Regarding costs, LNG-IUS produces savings of €174.2-309.95 and €230.54-577.61 versus COC and PROG, respectively, over horizons from 6 months to 5 years. Apart from cost savings and gains in SFM, quality-adjusted life months (QALM) are also favourable to LNG-IUS in all scenarios, with a range of gains between 1 and 2 QALM compared to COC and PROG. CONCLUSIONS: The results indicate that first-line use of the LNG-IUS is the dominant therapeutic option (less costly and more effective) in comparison with first-line use of COC or PROG for the treatment of DUB in Spain. LNG-IUS as first line is also the option that provides the greatest health-related quality of life to patients.
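The Markov model underlying such an analysis can be sketched as a cohort simulation that propagates a probability distribution over health states, accumulating discounted costs and quality-adjusted time each cycle. The states, transition matrix, costs and utilities used here are hypothetical placeholders, not those of the published model.

```python
def run_markov(state_probs, transition, costs, utilities, cycles, discount=0.03):
    """Cohort Markov model sketch: at each cycle, accumulate discounted
    expected cost and QALYs given the current state distribution, then
    advance the distribution one step via the transition matrix."""
    total_cost = total_qaly = 0.0
    n = len(state_probs)
    for t in range(cycles):
        disc = 1.0 / (1.0 + discount) ** t
        total_cost += disc * sum(p * c for p, c in zip(state_probs, costs))
        total_qaly += disc * sum(p * u for p, u in zip(state_probs, utilities))
        state_probs = [sum(state_probs[i] * transition[i][j] for i in range(n))
                       for j in range(n)]
    return total_cost, total_qaly
```

Running this once per strategy (LNG-IUS, COC, PROG) yields the cost and effect pairs from which incremental ratios and dominance conclusions are derived; a probabilistic sensitivity analysis re-runs it with parameters drawn from distributions.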

Relevance: 10.00%

Abstract:

Target identification for tractography studies requires solid anatomical knowledge validated by an extensive literature review across species for each seed structure to be studied. Manual literature review to identify targets for a given seed region is tedious and potentially subjective. Therefore, complementary approaches would be useful. We propose to use text-mining models to automatically suggest potential targets from the neuroscientific literature, full-text articles and abstracts, so that they can be used for anatomical connection studies and more specifically for tractography. We applied text-mining models to three structures: two well-studied structures that are validated deep brain stimulation targets, the internal globus pallidus and the subthalamic nucleus, and the nucleus accumbens, an exploratory target for treating psychiatric disorders. We performed a systematic review of the literature to document the projections of the three selected structures and compared it with the targets proposed by text-mining models, both in rat and primate (including human). We ran probabilistic tractography on the nucleus accumbens and compared the output with the results of the text-mining models and literature review. Overall, text-mining the literature could find three times as many targets as two man-weeks of curation could. The overall efficiency of the text-mining against literature review in our study was 98% recall (at 36% precision), meaning that over all the targets for the three selected seeds, only one target was missed by text-mining. We demonstrate that connectivity for a structure of interest can be extracted from a very large amount of publications and abstracts. We believe this tool will be useful in helping the neuroscience community to facilitate connectivity studies of particular brain regions. The text-mining tools used for the study are part of the HBP Neuroinformatics Platform, publicly available at http://connectivity-brainer.rhcloud.com/.
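The reported figures (98% recall at 36% precision, with a single missed target) follow the standard definitions against a literature-review gold standard. The counts in the example below are chosen to be consistent with those figures, not taken from the paper:

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Precision and recall of text-mined targets against a gold standard:
    precision = TP / (TP + FP), recall = TP / (TP + FN)."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall
```

For instance, 49 correctly mined targets, one missed target (FN = 1) and 87 spurious suggestions reproduce 98% recall at roughly 36% precision.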

Relevance: 10.00%

Abstract:

This thesis is composed of three main parts. The first consists of a state of the art of the different notions that are significant to understanding the elements surrounding art authentication in general, and signatures in particular, and that the author deemed necessary to fully grasp the microcosm that makes up this particular market. Individuals with a solid knowledge of the art and expertise area, and who are particularly interested in the present study, are advised to advance directly to the fourth chapter. The expertise of the signature, its reliability, and the factors impacting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature and given in light of all of the exposed issues. These guidelines are specifically formulated for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination. The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and defining their relevance in order to establish the separation capacities between groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and are used to attribute their authorship to given artists. An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints when applied by signature and handwriting experts in forensic science. The outlines presented below allow a rapid overview of the study and summarize the aims and main themes addressed in each chapter.
Part I - Theory Chapter 1 exposes legal aspects surrounding the authentication of works of art by art experts. The definition of what is legally authentic, the quality and types of the experts that can express an opinion concerning the authorship of a specific painting, and standard deontological rules are addressed. The practices applied in Switzerland are specifically dealt with. Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories, so an understanding of their role in the art authentication process is vital. The added value that a signature expertise can have in comparison to other scientific techniques is also addressed. Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century on and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 are presented to show the reader the rich sources of information that can be provided to describe a painting, and how the signature is one of these sources. Chapter 4 focuses on the different hypotheses the forensic handwriting examiner (FHE) must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may have an influence on the painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure of signatures in forensic science in general, and in particular for painted signatures, is exposed.
The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously. Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of elements surrounding handwriting examinations, to, in turn, better communicate results and conclusions to an audience, is also undertaken. Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state of the art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community, and discusses the increasing number of claims of the unscientific nature of signature authentication. The necessity to aim for more scientific, comprehensive and transparent authentication methods is discussed. The theoretical part of this thesis is concluded by a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues exposed by this review, and can also be applied to the traditional examination of signatures (on paper). Part II - Experimental part Chapter 7 describes and defines the sampling, extraction and analysis phases of the research. The sampling stage of artists' signatures and their respective simulations is presented, followed by the steps that were undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study.
Finally, the analysis procedure of these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented. Chapter 8 outlines the results concerning both the artist and simulation corpuses after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood ratio evaluation are the main themes that are addressed. The discrimination power between both corpuses is illustrated through multivariate analysis. Part III - Discussion Chapter 9 discusses the materials, the methods, and the obtained results of the research. The opportunities, but also the constraints and limits, of the developed method are exposed. Future work that can be carried out subsequent to the results of the study is also presented. Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the previous chapters into traditional signature expertise procedures. Thus, the strength of this expertise is discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners. In conclusion, the research highlights the interdisciplinary aspect of the examination of signatures on paintings. The current state of knowledge of the judicial quality of art experts, along with the scientific and historical analysis of paintings and signatures, is overviewed to give the reader a feel for the different factors that have an impact on this particular subject. The temperamental acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of a better recognition of signature expertise by courts of law.
This general acceptance, however, can only be achieved by producing high-quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits that lie within this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as the second key contribution of this work, a procedure is proposed to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step towards providing the forensic, judicial and art communities with a solidly grounded reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).
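The thesis evaluates a likelihood ratio over a multivariate feature set. As a heavily simplified univariate sketch, the ratio of a measured feature's density under an "authentic" model to its density under a "simulated" model plays the same evidential role; the Gaussian models and parameters below are assumptions for illustration, not the thesis's method.

```python
import math


def gaussian_pdf(x, mu, sigma):
    """Density of a univariate normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))


def likelihood_ratio(feature, genuine_mu, genuine_sigma, forged_mu, forged_sigma):
    """Score-based likelihood ratio for one measured signature feature:
    density under the 'authentic' model divided by density under the
    'simulated' model. Values above 1 support authenticity."""
    return (gaussian_pdf(feature, genuine_mu, genuine_sigma)
            / gaussian_pdf(feature, forged_mu, forged_sigma))
```

In a full evaluation, per-feature ratios (or a joint multivariate ratio) would be combined and reported alongside the traditional examiner's conclusion, which is the incorporation strategy Chapter 10 proposes.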