17 results for probabilistic model

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

60.00%

Publisher:

Abstract:

In the PhD thesis “Sound Texture Modeling” we deal with the statistical modelling of textural sounds such as water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modelling of the resulting sequence by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (the hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter), and faithfully reproduces some of the sound classes. In terms of a more general taxonomy of natural events proposed by Gaver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that do not have textural properties in their entirety, those segmentation models can help in identifying textural portions of an audio recording useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we have developed a pattern-based synthesis system that allows the user to sonically explore a database of units by means of their representation in a perceptual feature space. Concatenative synthesis with “molecules” built from sparse atomic representations also allows capturing low-level correlations in perceptual audio features, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a more high-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds. Our research is embedded within the Metaverse 1 European project (2008-2011), where our models are contributing as low-level building blocks within a semi-automated soundscape generation system.
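
As a hedged illustration of the decomposition-plus-probabilistic-model pipeline sketched above (not the thesis code): the snippet below decomposes a recording with a wavelet transform and fits a Gaussian HMM to the coefficient sequence via EM. The file name, wavelet choice, and model sizes are assumptions, and a chain HMM stands in for the hidden Markov tree model.

```python
# Minimal sketch: wavelet decomposition of a texture sound plus a chain-HMM
# stand-in for the hidden Markov tree model described above. File name and
# all parameter values are hypothetical.
import numpy as np
import pywt
from hmmlearn import hmm
from scipy.io import wavfile

rate, x = wavfile.read("water_texture.wav")   # hypothetical recording
if x.ndim > 1:
    x = x[:, 0]                               # keep one channel
x = x.astype(float) / np.abs(x).max()         # normalize to [-1, 1]

# Multi-level wavelet decomposition of the signal.
coeffs = pywt.wavedec(x, "db4", level=5)

# Flatten detail coefficients into an observation sequence (subsampled to
# keep EM training fast in this sketch).
obs = np.concatenate(coeffs[1:])[:20_000].reshape(-1, 1)

# Fit a Gaussian HMM via EM. The thesis uses a hidden Markov *tree* model,
# which respects parent-child structure across wavelet scales; a chain HMM
# is used here only to keep the sketch short.
model = hmm.GaussianHMM(n_components=4, n_iter=50)
model.fit(obs)

# Resynthesis sketch: sample a new coefficient sequence and invert it.
sampled, _ = model.sample(len(obs))
# ... split `sampled` back into per-level arrays before pywt.waverec(...)
```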

Relevance:

60.00%

Publisher:

Abstract:

Early detection of breast cancer (BC) with mammography may cause overdiagnosis and overtreatment, detecting tumors which would remain undiagnosed during a lifetime. The aims of this study were: first, to model invasive BC incidence trends in Catalonia (Spain) taking into account reproductive and screening data; and second, to quantify the extent of BC overdiagnosis. We modeled the incidence of invasive BC using a Poisson regression model. Explanatory variables were: age at diagnosis and cohort characteristics (completed fertility rate, percentage of women that use mammography at age 50, and year of birth). This model was also used to estimate the background incidence in the absence of screening. We used a probabilistic model to estimate the expected BC incidence if women in the population used mammography as reported in health surveys. The difference between the observed and expected cumulative incidences provided an estimate of overdiagnosis. Incidence of invasive BC increased, especially in cohorts born from 1940 to 1955. The biggest increase was observed in these cohorts between the ages of 50 and 65 years, where the final BC incidence rates more than doubled the initial ones. Dissemination of mammography was significantly associated with BC incidence and overdiagnosis. Our estimates of overdiagnosis ranged from 0.4% to 46.6% for women born around 1935 and 1950, respectively. Our results support the existence of overdiagnosis in Catalonia attributed to mammography usage, and the limited malignant potential of some tumors may play an important role. Women should be better informed about this risk. Research should be oriented towards personalized screening and risk assessment tools.
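
A minimal sketch of the age-cohort Poisson regression described above, with a synthetic data frame standing in for the Catalan registry data; all variable names and coefficients are placeholders.

```python
# Sketch of the age-cohort Poisson regression described above. The data
# frame is a synthetic placeholder, not the actual registry data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 44
df = pd.DataFrame({
    "age": np.repeat(np.arange(30.0, 85.0, 5.0), 4),
    "fertility_rate": rng.uniform(1.2, 2.8, n),
    "mammo_use_50": rng.uniform(0.0, 0.9, n),   # cohort mammography use
    "person_years": rng.uniform(1e4, 5e4, n),
})
df["cases"] = rng.poisson(np.exp(-8 + 0.08 * df["age"]) * df["person_years"])

X = sm.add_constant(df[["age", "fertility_rate", "mammo_use_50"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson(),
             offset=np.log(df["person_years"])).fit()

# Background incidence in the absence of screening: predict with the
# mammography-use covariate set to zero.
X0 = X.assign(mammo_use_50=0.0)
background = fit.predict(X0, offset=np.log(df["person_years"]))
```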

Relevance:

60.00%

Publisher:

Abstract:

Given a set of images of scenes containing different object categories (e.g. grass, roads), our objective is to discover these objects in each image and to use these object occurrences to perform a scene classification (e.g. beach scene, mountain scene). We achieve this by using a supervised learning algorithm able to learn from a few images, in order to facilitate the user's task. We use a probabilistic model to recognise the objects, and we then classify the scene based on their object occurrences. Experimental results are shown and evaluated to demonstrate the validity of our proposal. Object recognition performance is compared to the approaches of He et al. (2004) and Marti et al. (2001) using their own datasets. Furthermore, an unsupervised method is implemented in order to evaluate the advantages and disadvantages of our supervised classification approach versus an unsupervised one.
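
To make the second stage concrete, here is a minimal sketch of classifying a scene from the occurrence counts of recognised objects; the object categories, counts, and classifier choice are all invented for illustration, not the authors' method or dataset.

```python
# Sketch of the second stage described above: classifying a scene from the
# occurrences of the objects recognised in it. Data are synthetic.
import numpy as np
from sklearn.naive_bayes import MultinomialNB

rng = np.random.default_rng(1)
# Rows: images; columns: counts of recognised objects
# (e.g. grass, road, sand, water).
beach = rng.poisson([0.2, 0.1, 4.0, 5.0], size=(20, 4))
mountain = rng.poisson([5.0, 1.0, 0.1, 0.5], size=(20, 4))
X = np.vstack([beach, mountain])
y = ["beach"] * 20 + ["mountain"] * 20

clf = MultinomialNB().fit(X, y)
print(clf.predict([[0, 0, 3, 6]]))   # expected: 'beach'
```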

Relevance:

60.00%

Publisher:

Abstract:

Wireless MIMO systems, employing multiple transmit and receive antennas, promise a significant increase of channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. Thus, the combination of both techniques is an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is focused instead on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby one regards the multipath-channel response as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of the Bayesian equations, a (Rao–Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits a complexity which grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.
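
As a rough illustration of the set-valued estimation idea, here is a toy bootstrap particle filter, not the paper's Rao-Blackwellized RST estimator: each particle carries a hypothesis of which taps are active and of their amplitudes, with both evolving over time. The dynamics, pilots, and noise levels are all assumed.

```python
# Toy bootstrap particle filter loosely following the idea above: each
# particle hypothesizes a *set* of active channel taps (taps may appear
# and disappear over time) plus their amplitudes. Simplified stand-in for
# the Rao-Blackwellized RST estimator; all values are assumptions.
import numpy as np

rng = np.random.default_rng(2)
N, L, T = 500, 4, 50                 # particles, max taps, time steps
p_toggle, sigma_a, sigma_n = 0.05, 0.1, 0.2

active = rng.random((N, L)) < 0.5    # per-particle set of active taps
amps = rng.normal(0.0, 1.0, (N, L))  # per-particle tap amplitudes

for t in range(T):
    pilot = rng.normal(0.0, 1.0, L)           # known pilot symbols
    y = rng.normal(0.0, 1.0)                  # stand-in observation
                                              # (really: true channel @ pilot + noise)
    active ^= rng.random((N, L)) < p_toggle   # taps appear/disappear
    amps += rng.normal(0.0, sigma_a, (N, L))  # amplitude random walk
    y_hat = (active * amps) @ pilot           # per-particle prediction
    w = np.exp(-0.5 * ((y - y_hat) / sigma_n) ** 2)
    w /= w.sum() + 1e-300
    idx = rng.choice(N, N, p=w)               # resample every step
    active, amps = active[idx], amps[idx]
    h_est = (active * amps).mean(axis=0)      # posterior-mean channel

print(h_est)
```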

Relevance:

60.00%

Publisher:

Abstract:

The objective of this study is to quantify, in monetary terms, the potential reduction in the usage of public health care outlets associated with the tenure of double (public plus private) insurance. In order to address the problem, a probabilistic model for visits to physicians is specified and estimated using data from the Catalonian Health Survey. Also, a model for the marginal cost of a visit to a physician is estimated using data from a representative sample of fee-for-service payments from a major insurer. Combining the estimates from the two models, it is possible to quantify in monetary terms the costs or savings of alternative policies which have an impact on the adoption of double insurance by the population. The results suggest that the private sector absorbs an important volume of demand which would be redirected to the public sector if consumers ceased to hold double insurance.
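
A minimal sketch of the two-part logic, with simulated data standing in for the Catalonian Health Survey and the insurer's fee records: a logistic model for the probability of a public-sector visit is combined with a placeholder marginal cost per visit to price a counterfactual drop in double coverage.

```python
# Two-part sketch of the approach described above: a probabilistic model
# for physician visits plus a cost-per-visit figure, combined to price a
# change in double-insurance coverage. All data and values are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
double = rng.integers(0, 2, n)           # holds public + private insurance
age = rng.uniform(20, 80, n)
# Hypothetical data-generating process: double coverage lowers public visits.
p_visit = 1 / (1 + np.exp(-(-1.0 + 0.02 * age - 0.8 * double)))
visit = rng.random(n) < p_visit

X = sm.add_constant(np.column_stack([age, double]))
logit = sm.Logit(visit.astype(float), X).fit(disp=0)

# Counterfactual: everyone drops double insurance.
X0 = X.copy(); X0[:, 2] = 0.0
extra_visits = (logit.predict(X0) - logit.predict(X)).sum()
cost_per_visit = 42.0                    # placeholder marginal cost estimate
print(f"Extra public-sector cost: {extra_visits * cost_per_visit:.0f} EUR")
```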

Relevance:

60.00%

Publisher:

Abstract:

Why was England first? And why Europe? We present a probabilistic model that builds on the big-push models of Murphy, Shleifer and Vishny (1989), combined with hierarchical preferences. Exogenous demographic factors (in particular the English low-pressure variant of the European marriage pattern) and redistributive institutions such as the Old Poor Law combined to make an Industrial Revolution more likely. Essentially, industrialization is the result of having a critical mass of consumers that is rich enough to afford (potentially) mass-produced goods. Our model is then calibrated to match the main characteristics of the English economy in 1750 and the observed transition until 1850. This allows us to address explicitly one of the key features of the British Industrial Revolution unearthed by economic historians over the last three decades: the slowness of productivity and output change. In our calibration, we find that the probability of Britain industrializing is five times larger than France's. Contrary to the recent argument by Pomeranz, China in the 18th century had essentially no chance to industrialize at all. This difference is decomposed into a demographic and a policy component, with the former being far more important than the latter.
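
A toy Monte Carlo rendering of the critical-mass mechanism, with an invented income distribution, price, and threshold rather than the paper's calibration:

```python
# Monte Carlo sketch of the mechanism described above: industrialization
# takes off once a critical mass of consumers is rich enough to afford
# the mass-produced good. All distributions and thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(4)

def prob_industrialize(mean_income, trials=2_000, pop=1_000,
                       price=1.0, critical_mass=0.3):
    hits = 0
    for _ in range(trials):
        income = rng.lognormal(np.log(mean_income), 0.5, pop)
        hits += (income > price).mean() > critical_mass
    return hits / trials

# A richer demography (e.g. the low-pressure marriage pattern) raises the
# probability of takeoff:
for m in (0.6, 0.8, 1.0):
    print(m, prob_industrialize(m))
```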

Relevance:

60.00%

Publisher:

Abstract:

Introduction: Early detection of breast cancer (BC) with mammography may cause overdiagnosis and overtreatment, detecting tumors which would remain undiagnosed during a lifetime. The aims of this study were: first, to model invasive BC incidence trends in Catalonia (Spain) taking into account reproductive and screening data; and second, to quantify the extent of BC overdiagnosis. Methods: We modeled the incidence of invasive BC using a Poisson regression model. Explanatory variables were: age at diagnosis and cohort characteristics (completed fertility rate, percentage of women that use mammography at age 50, and year of birth). This model was also used to estimate the background incidence in the absence of screening. We used a probabilistic model to estimate the expected BC incidence if women in the population used mammography as reported in health surveys. The difference between the observed and expected cumulative incidences provided an estimate of overdiagnosis. Results: Incidence of invasive BC increased, especially in cohorts born from 1940 to 1955. The biggest increase was observed in these cohorts between the ages of 50 and 65 years, where the final BC incidence rates more than doubled the initial ones. Dissemination of mammography was significantly associated with BC incidence and overdiagnosis. Our estimates of overdiagnosis ranged from 0.4% to 46.6% for women born around 1935 and 1950, respectively. Conclusions: Our results support the existence of overdiagnosis in Catalonia attributed to mammography usage, and the limited malignant potential of some tumors may play an important role. Women should be better informed about this risk. Research should be oriented towards personalized screening and risk assessment tools.
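
Complementing the regression sketch given for the earlier version of this abstract, here is the overdiagnosis quantification step in miniature, with placeholder incidence arrays in place of the study's estimates:

```python
# Sketch of the overdiagnosis computation described above: the relative
# difference between observed and expected (model-based) cumulative
# incidence for a birth cohort. Incidence values are placeholders.
import numpy as np

ages = np.arange(50, 70)
observed = np.full(ages.size, 250.0)   # hypothetical rates per 100,000
expected = np.full(ages.size, 210.0)   # model-expected rates under screening

cum_obs, cum_exp = observed.sum(), expected.sum()
overdiagnosis = (cum_obs - cum_exp) / cum_obs
print(f"Overdiagnosis: {overdiagnosis:.1%}")
```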

Relevance:

60.00%

Publisher:

Abstract:

Background: Breast cancer (BC) causes more deaths than any other cancer among women in Catalonia. Early detection has contributed to the observed decline in BC mortality. However, there is debate on the optimal screening strategy. We performed an economic evaluation of 20 screening strategies taking into account the cost over time of screening and subsequent medical costs, including diagnostic confirmation, initial treatment, follow-up and advanced care. Methods: We used a probabilistic model to estimate the effect and costs over time of each scenario. The effect was measured as years of life (YL), quality-adjusted life years (QALY), and lives extended (LE). Costs of screening and treatment were obtained from the Early Detection Program and hospital databases of the IMAS-Hospital del Mar in Barcelona. The incremental cost-effectiveness ratio (ICER) was used to compare the relative costs and outcomes of different scenarios. Results: Strategies that start at ages 40 or 45 and end at 69 predominate when the effect is measured as YL or QALYs. Biennial strategies 50-69, 45-69 or annual 45-69, 40-69 and 40-74 were selected as cost-effective for both effect measures (YL or QALYs). The ICER increases considerably when moving from biennial to annual scenarios. Moving from no screening to biennial 50-69 years represented an ICER of 4,469€ per QALY. Conclusions: A reduced number of screening strategies have been selected for consideration by researchers, decision makers and policy planners. Mathematical models are useful to assess the impact and costs of BC screening in a specific geographical area.
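
A minimal sketch of the incremental cost-effectiveness comparison, with invented per-woman costs and QALYs (chosen only to echo the magnitudes quoted above, not taken from the study):

```python
# Sketch of the ICER comparison described above: strategies are sorted by
# effectiveness and each ICER is computed against the previous strategy on
# the sorted list. Costs and QALYs are invented placeholders.
strategies = [          # (name, cost per woman in EUR, QALYs per woman)
    ("no screening",   0.0,   20.000),
    ("biennial 50-69", 400.0, 20.090),
    ("biennial 45-69", 550.0, 20.110),
    ("annual 45-69",   950.0, 20.125),
]
strategies.sort(key=lambda s: s[2])
for (n0, c0, e0), (n1, c1, e1) in zip(strategies, strategies[1:]):
    icer = (c1 - c0) / (e1 - e0)
    print(f"{n0} -> {n1}: {icer:,.0f} EUR per QALY")
```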

Relevance:

60.00%

Publisher:

Abstract:

Background: The G1-to-S transition of the cell cycle in the yeast Saccharomyces cerevisiae involves an extensive transcriptional program driven by the transcription factors SBF (Swi4-Swi6) and MBF (Mbp1-Swi6). Activation of these factors ultimately depends on the G1 cyclin Cln3. Results: To determine the transcriptional targets of Cln3 and their dependence on SBF or MBF, we first used DNA microarrays to interrogate gene expression upon Cln3 overexpression in synchronized cultures of strains lacking components of SBF and/or MBF. Second, we integrated this expression dataset together with other heterogeneous data sources into a single probabilistic model based on Bayesian statistics. Our analysis has produced more than 200 transcription factor-target assignments, validated by ChIP assays and by functional enrichment. Our predictions show higher internal coherence and predictive power than previous classifications. Our results support a model whereby SBF and MBF may be differentially activated by Cln3. Conclusions: Integration of heterogeneous genome-wide datasets is key to building accurate transcriptional networks. By such integration, we provide here a reliable transcriptional network at the G1-to-S transition in the budding yeast cell cycle. Our results suggest that, to improve the reliability of predictions, we need to feed our models with more informative experimental data.
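
A minimal sketch of the evidence-integration idea under a naive-Bayes independence assumption; the likelihood ratios and prior are invented placeholders, not values from the study:

```python
# Sketch of the data-integration idea described above: independent evidence
# sources (expression response, binding data, motif presence) are combined
# into a posterior probability that a gene is a target of a given factor.
import numpy as np

def posterior_target(prior, likelihood_ratios):
    """Combine per-source likelihood ratios under naive-Bayes independence."""
    log_odds = np.log(prior / (1 - prior)) + np.sum(np.log(likelihood_ratios))
    return 1 / (1 + np.exp(-log_odds))

# Hypothetical gene: responds to Cln3 overexpression (LR 6), is bound in a
# ChIP assay (LR 10), and carries an MBF-like motif (LR 3); prior edge
# probability 1%.
print(posterior_target(0.01, [6.0, 10.0, 3.0]))   # ~0.65
```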

Relevance:

60.00%

Publisher:

Abstract:

Peer-reviewed

Relevance:

30.00%

Publisher:

Abstract:

The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes that instruments are connected and can report relevant features for monitoring. Also, enough historical records of diagnosed breakdowns are required to make the probabilistic models reliable and useful for predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships and trigger predictive maintenance services based on the influence between observed features and previously documented diagnostics.
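
For a two-state (up/down) Markov model, the availability computation from estimated failure and repair rates reduces to a one-line formula; the rates below are invented for illustration:

```python
# Sketch of the availability computation described above: a two-state
# up/down Markov model with failure rate lambda and repair rate mu has
# steady-state availability mu / (lambda + mu). Rates are hypothetical.
lam = 1 / 2000.0   # failures per operating hour (from historical records)
mu = 1 / 24.0      # repairs per hour (mean repair time 24 h)
availability = mu / (lam + mu)
print(f"Steady-state availability: {availability:.4f}")   # ~0.9881
```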

Relevance:

30.00%

Publisher:

Abstract:

This paper compares two well-known scan matching algorithms: the MbICP and the pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS), and the robot displacement estimated through dead-reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions consist in: 1) using an EKF to estimate the local path traveled by the robot while gathering the scan, as well as its uncertainty, and 2) proposing a method to group all the data gathered along the robot's path into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment with satisfactory results.
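
A minimal sketch of the EKF prediction step used for dead-reckoning between scan points; the motion model and noise values are simplified assumptions, not the MSISpIC implementation:

```python
# Sketch of dead-reckoning via an EKF prediction step: propagate the robot
# pose (x, y, yaw) and its covariance from DVL velocity and MRU yaw rate.
# Motion model and noise values are illustrative assumptions.
import numpy as np

def ekf_predict(pose, P, v, w, dt, Q):
    """Propagate pose=[x, y, yaw] with body-frame speed v and yaw rate w."""
    x, y, yaw = pose
    pose_new = np.array([x + v * np.cos(yaw) * dt,
                         y + v * np.sin(yaw) * dt,
                         yaw + w * dt])
    F = np.array([[1, 0, -v * np.sin(yaw) * dt],   # Jacobian of the model
                  [0, 1,  v * np.cos(yaw) * dt],
                  [0, 0,  1]])
    return pose_new, F @ P @ F.T + Q

pose, P = np.zeros(3), np.eye(3) * 1e-4
Q = np.diag([1e-4, 1e-4, 1e-5])
pose, P = ekf_predict(pose, P, v=0.5, w=0.01, dt=0.1, Q=Q)
print(pose, np.diag(P))
```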

Relevance:

30.00%

Publisher:

Abstract:

A new model for dealing with decision making under risk, considering subjective and objective information in the same formulation, is presented here. The uncertain probabilistic weighted average (UPWA) is also presented. Its main advantage is that it unifies the probability and the weighted average in the same formulation, while considering the degree of importance that each case has in the analysis. Moreover, it is able to deal with uncertain environments represented in the form of interval numbers. We study some of its main properties and particular cases. The applicability of the UPWA is also studied, and it is seen to be very broad because all the previous studies that use the probability or the weighted average can be revised with this new approach. Focus is placed on a multi-person decision-making problem regarding the selection of strategies by using the theory of expertons.
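
A minimal sketch of the unification idea as a convex combination of a probability vector and a weight vector applied to interval numbers; the value of beta, both vectors, and the payoff intervals are illustrative assumptions:

```python
# Sketch of the aggregation described above: combine a probability vector p
# and a weight vector w through a convex combination (importance degree
# beta), applied to interval numbers represented as (low, high) pairs.
def upwa(intervals, p, w, beta):
    v = [beta * pi + (1 - beta) * wi for pi, wi in zip(p, w)]
    lo = sum(vi * a for vi, (a, _) in zip(v, intervals))
    hi = sum(vi * b for vi, (_, b) in zip(v, intervals))
    return (lo, hi)

payoffs = [(40, 60), (10, 30), (70, 90)]   # uncertain payoffs of a strategy
print(upwa(payoffs, p=[0.5, 0.3, 0.2], w=[0.4, 0.4, 0.2], beta=0.6))
# -> (35.8, 55.8)
```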

Relevance:

30.00%

Publisher:

Abstract:

Recent single-cell studies in monkeys (Romo et al., 2004) show that the activity of neurons in the ventral premotor cortex covaries with the animal's decisions in a perceptual comparison task regarding the frequency of vibrotactile events. The firing rate response of these neurons was dependent only on the frequency differences between the two applied vibrations, the sign of that difference being the determining factor for correct task performance. We present a biophysically realistic neurodynamical model that can account for the most relevant characteristics of this decision-making-related neural activity. One of the nontrivial predictions of this model is that Weber's law will underlie the perceptual discrimination behavior. We confirmed this prediction in behavioral tests of vibrotactile discrimination in humans and propose a computational explanation of perceptual discrimination that accounts naturally for the emergence of Weber's law. We conclude that the neurodynamical mechanisms and computational principles underlying the decision-making processes in this perceptual discrimination task are consistent with a fluctuation-driven scenario in a multistable regime.
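
A Monte Carlo sketch of the Weber's-law prediction, using simple Gaussian internal noise whose spread grows with the base frequency as a stand-in for the biophysical attractor network:

```python
# Sketch of the Weber's-law behavior described above: decide which of two
# vibrotactile frequencies is higher from noisy internal representations.
# The noise model is a simple stand-in, not the neurodynamical model.
import numpy as np

rng = np.random.default_rng(5)

def accuracy(f_base, df, weber_k=0.1, trials=20_000):
    s1 = rng.normal(f_base, weber_k * f_base, trials)
    s2 = rng.normal(f_base + df, weber_k * (f_base + df), trials)
    return (s2 > s1).mean()

# A constant ratio df/f_base gives roughly constant accuracy (Weber's law):
for f in (10.0, 20.0, 40.0):
    print(f, accuracy(f, 0.2 * f))
```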