858 results for Robust Probabilistic Model, Dyslexic Users, Rewriting, Question-Answering
Abstract:
A comprehensive probabilistic model for simulating microstructure formation and evolution during solidification has been developed, based on coupling a Finite Difference Method (FDM) for macroscopic modelling of heat diffusion to a modified Cellular Automaton (mCA) for microscopic modelling of nucleation, growth of microstructures and solute diffusion. The mCA model is similar to Nastac's model in its handling of solute redistribution in the liquid and solid phases, curvature and growth anisotropy, but differs in the treatment of nucleation and growth. The aim is to improve understanding of the relationship between solidification conditions and microstructure formation and evolution. A numerical algorithm coupling the FDM and mCA was developed: at the coarse scale, temperatures at the FDM nodes are calculated, while the nucleation-growth simulation is carried out at a finer scale, with the temperature at the cell locations interpolated from those of the coarser volumes. The model takes into account thermal, curvature and solute diffusion effects. It can therefore not only simulate the microstructures of alloys both on the scale of the grain size (macroscopic level) and of the dendrite tip length (mesoscopic level), but also investigate nucleation mechanisms and growth kinetics of alloys solidified with various solute concentrations and solidification morphologies. The calculated results are compared with grain sizes and solidification morphologies of microstructures obtained from a set of casting experiments on Al-Si alloys in graphite crucibles.
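A minimal sketch of the two-scale coupling this abstract describes, assuming an explicit finite-difference step with periodic boundaries and bilinear interpolation onto the finer CA grid; all parameter values are invented for illustration and this is not the authors' code.

```python
import numpy as np

def fdm_step(T, alpha, dx, dt):
    """One explicit finite-difference step of 2-D heat diffusion
    (periodic boundaries, for brevity)."""
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    return T + alpha * dt * lap

def interpolate_to_ca(T_coarse, refine):
    """Bilinear interpolation of coarse nodal temperatures onto the
    finer cellular-automaton grid (refine fine cells per coarse node)."""
    n = T_coarse.shape[0]
    coarse = np.arange(n)
    fine = np.linspace(0, n - 1, n * refine)
    rows = np.array([np.interp(fine, coarse, T_coarse[i]) for i in range(n)])
    return np.array([np.interp(fine, coarse, rows[:, j])
                     for j in range(len(fine))]).T

T = np.full((32, 32), 900.0)           # melt temperature, arbitrary units
T = fdm_step(T, alpha=1e-5, dx=1e-3, dt=0.01)
T_ca = interpolate_to_ca(T, refine=4)  # temperatures at CA cell locations
```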
Abstract:
Concept maps are graphical representations of a person's knowledge at a given moment and in a given knowledge domain. Owing to their investigative nature, they are used as support tools in pedagogical approaches that aim to promote meaningful learning. However, the process of evaluating a map tends to be costly, since it imposes a heavy cognitive processing load on the evaluator, who must map the concepts and relations in search of the nuances of knowledge present there. This research aims to raise the level of abstraction in the interactions between the evaluator and concept maps by providing an intermediate layer of computational intelligence that favours communication through questions and answers in natural language, giving the evaluator tools that allow the content of the concept map to be examined without requiring the visual mapping of the concepts and relations present in the evaluated maps. A tool is prototyped and a proof of concept presented. The analysis of the proposed architecture allowed the definition of a final architecture with characteristics that enhance the use of concept maps and facilitate several pedagogical operations with them. This research is situated in the field of question-answering systems, applying natural language processing techniques for question analysis and concept map interpretation, and applying artificial intelligence techniques to infer answers to the questions.
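A toy illustration of the underlying idea of answering questions over a concept map (the map is just concept-relation-concept triples and the matcher is deliberately trivial; names and data are hypothetical, not the thesis architecture).

```python
# A concept map stored as (concept, relation, concept) propositions.
concept_map = [
    ("photosynthesis", "occurs in", "chloroplasts"),
    ("photosynthesis", "produces", "glucose"),
    ("chloroplasts", "contain", "chlorophyll"),
]

def answer(question_concept):
    """Return every proposition that mentions the asked-about concept."""
    return [f"{s} {r} {o}" for (s, r, o) in concept_map
            if question_concept in (s, o)]

print(answer("photosynthesis"))
# ['photosynthesis occurs in chloroplasts', 'photosynthesis produces glucose']
```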
Abstract:
Dissertation submitted for the degree of Master in Civil Engineering, specialization in Structures.
Abstract:
Hand-off (or hand-over), the process whereby mobile nodes select the best access point available to transfer data, has been well studied in wireless networks. The performance of a hand-off process depends on the specific characteristics of the wireless links. In the case of low-power wireless networks, hand-off decisions must be taken carefully, by considering the unique properties of inexpensive low-power radios. This paper addresses the design, implementation and evaluation of smart-HOP, a hand-off mechanism tailored to low-power wireless networks. This work has three main contributions. First, it formulates the hard hand-off process for low-power networks (such as typical wireless sensor networks, WSNs) with a probabilistic model, to investigate the impact of the most relevant channel parameters through an analytical approach. Second, it confirms the probabilistic model through simulation and further elaborates on the impact of several hand-off parameters. Third, it fine-tunes the most relevant hand-off parameters via an extended set of experiments in a realistic experimental scenario. The evaluation shows that smart-HOP performs well in the transitional region, achieving a relative delivery ratio above 98 percent and hand-off delays on the order of a few tens of milliseconds.
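A sketch of a threshold-with-hysteresis hand-off rule in the spirit of what the abstract describes; the threshold, margin and window values below are illustrative placeholders, not the paper's tuned parameters.

```python
LOW_THRESHOLD = -90.0   # dBm: below this, start probing other access points
HYSTERESIS = 5.0        # dB margin a candidate must exceed to win
WINDOW = 4              # samples averaged to smooth link fluctuations

def should_hand_off(serving_rssi, candidate_rssi):
    """Average a short window of RSSI samples; hand off only if the
    serving link is weak AND the candidate beats it by the margin."""
    serving = sum(serving_rssi[-WINDOW:]) / WINDOW
    candidate = sum(candidate_rssi[-WINDOW:]) / WINDOW
    return serving < LOW_THRESHOLD and candidate > serving + HYSTERESIS

print(should_hand_off([-92, -93, -91, -94], [-80, -82, -81, -79]))  # True
```

The hysteresis margin is what prevents the ping-pong effect: without it, a node sitting between two access points of similar strength would hand off on every small fluctuation.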
Abstract:
The Evidence Accumulation Clustering (EAC) paradigm is a clustering ensemble method which derives a consensus partition from a collection of base clusterings obtained using different algorithms. It collects from the partitions in the ensemble a set of pairwise observations about the co-occurrence of objects in the same cluster, and it uses these co-occurrence statistics to derive a similarity matrix, referred to as the co-association matrix. The Probabilistic Evidence Accumulation for Clustering Ensembles (PEACE) algorithm is a principled approach to the extraction of a consensus clustering from the observations encoded in the co-association matrix, based on a probabilistic model for the co-association matrix parameterized by the unknown assignments of objects to clusters. In this paper we extend the PEACE algorithm by deriving a consensus solution according to a MAP approach with Dirichlet priors defined over the unknown probabilistic cluster assignments. In particular, we study the positive regularization effect of Dirichlet priors on the final consensus solution using both synthetic and real benchmark data.
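A minimal sketch of the co-association matrix that EAC/PEACE builds from an ensemble of base partitions (the example labels are made up; the consensus-extraction step itself is not shown).

```python
import numpy as np

def co_association(partitions, n):
    """C[i, j] = fraction of base partitions placing objects i and j
    in the same cluster."""
    C = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / len(partitions)

# Three base clusterings of four objects, e.g. from different algorithms:
ensemble = [[0, 0, 1, 1], [0, 0, 0, 1], [1, 1, 2, 2]]
print(co_association(ensemble, 4))
```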
Abstract:
Faced with the problem of pricing complex contingent claims, an investor seeks to make his valuations robust to model uncertainty. We construct a notion of a model-uncertainty-induced utility function and show that model uncertainty increases the investor's effective risk aversion. Using the model-uncertainty-induced utility function, we extend the "No Good Deals" methodology of Cochrane and Saá-Requejo [2000] to compute lower and upper good-deal bounds in the presence of model uncertainty. We illustrate the methodology using some numerical examples.
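A toy numeric sketch of the good-deal-bounds idea: over a family of candidate stochastic discount factors (here three hypothetical models standing in for model uncertainty), keep those whose volatility ratio sigma(m)/E[m], which caps the attainable Sharpe ratio, stays below a chosen bound, and report the extreme prices E[m * payoff]. All numbers are invented and this is far simpler than the paper's construction.

```python
import numpy as np

payoff = np.array([0.0, 1.0, 2.0])            # claim payoff in 3 states
probs = np.array([0.3, 0.4, 0.3])             # state probabilities
candidates = [np.array([1.05, 0.95, 0.85]),   # candidate SDFs, one per model
              np.array([1.10, 0.95, 0.80]),
              np.array([1.00, 0.95, 0.90])]
sharpe_cap = 0.5                              # "no good deals" restriction

prices = []
for m in candidates:
    mean = probs @ m
    vol = np.sqrt(probs @ (m - mean) ** 2)
    if vol / mean <= sharpe_cap:              # discard too-good deals
        prices.append(probs @ (m * payoff))
print(min(prices), max(prices))               # lower / upper good-deal bounds
```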
Abstract:
In population surveys in which the Schistosoma mansoni intensity of infection is low, or in localities where schistosomiasis control programs have been successful, parasitologic methods lack sensitivity. Despite some limitations, immunological methods are useful and provide valuable information under such field conditions. Thus, the prevalence of schistosomiasis in an untreated population can be determined by the detection of IgG or IgM antibodies, and the incidence by IgA antibodies, employing mainly immunofluorescence (IF) and immunoenzymatic (ELISA) assays, and to some extent hemagglutination (HA) or even skin tests. The true prevalence and incidence of schistosomiasis can be estimated using a probabilistic model equation, provided the sensitivity and specificity of the employed test are known beforehand. The sensitivity and specificity of serologic tests are higher in the younger age group, under 14. The geometric mean IF titers also correlate positively with the intensity of infection. At present there is a need for serologic tests which are economical and practical for seroepidemiologic surveys, requiring no specialized personnel to collect blood or serum from the population and allowing easy interpretation of the test results. The reagents for such tests should be stable and reproducible. Moreover, it is expected that such tests can distinguish an active infection.
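In its simplest form, the probabilistic correction alluded to above is the classical adjustment of apparent prevalence for test error (the Rogan-Gladen estimator); the numbers below are made up for illustration.

```python
def true_prevalence(apparent, sensitivity, specificity):
    """Correct the observed positive rate for test sensitivity/specificity:
    p_true = (p_apparent + spec - 1) / (sens + spec - 1)."""
    return (apparent + specificity - 1) / (sensitivity + specificity - 1)

# e.g. 12% seropositive with a 90%-sensitive, 95%-specific assay:
print(true_prevalence(0.12, 0.90, 0.95))  # ~0.082
```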
Abstract:
This research explores the advantages and disadvantages of collaborative learning on the basis of two different methodological studies. In the first, we examine in depth the reflections about group work of a student-teacher during her first experiences in a two-month practicum at Sabadell's Emily Bronte school. In the second, we analyze in a more empirical way the interaction that takes place among a trio of students engaged in a question-answering task about a text, based on a three-minute vignette recorded in January 2010.
Abstract:
In the PhD thesis "Sound Texture Modeling" we deal with the statistical modelling of textural sounds such as water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modelling of the resulting coefficient sequences by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (the hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter) and faithfully reproduces some of the sound classes. In terms of the more general taxonomy of natural events proposed by Gaver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that do not have textural properties in their entirety, those segmentation models can help in identifying textural portions of an audio recording useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we have developed a pattern-based synthesis system that allows one to sonically explore a database of units by means of their representation in a perceptual feature space. Concatenative synthesis with "molecules" built from sparse atomic representations also allows capturing low-level correlations in perceptual audio features, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a more high-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds. Our research is embedded within the Metaverse 1 European project (2008-2011), where our models contribute as low-level building blocks within a semi-automated soundscape generation system.
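A sketch of the first stage of such a pipeline, assuming the PyWavelets library is available: a multi-level wavelet decomposition of a sound frame, whose coefficient sequences would then be modelled by a hidden Markov tree (the HMT itself is not shown here, and the frame is random noise standing in for audio).

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

signal = np.random.randn(1024)                 # stand-in for an audio frame
coeffs = pywt.wavedec(signal, 'db4', level=5)  # approximation + 5 detail bands

labels = ["approx"] + [f"detail-{k}" for k in range(5, 0, -1)]
for name, band in zip(labels, coeffs):
    print(f"{name}: {len(band)} coefficients")
```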
Abstract:
Hidden Markov models (HMMs) are probabilistic models that are well adapted to many tasks in bioinformatics, for example predicting the occurrence of specific motifs in biological sequences. MAMOT is a command-line program for Unix-like operating systems, including MacOS X, that we developed to allow scientists to apply HMMs more easily in their research. One can define the architecture and initial parameters of the model in a text file and then use MAMOT for parameter optimization on example data, for decoding (e.g., predicting motif occurrences in sequences), and for the production of stochastic sequences generated according to the probabilistic model. Two examples for which models are provided are coiled-coil domains in protein sequences and protein binding sites in DNA. Useful features include the use of pseudocounts, state tying and the fixing of selected parameters during learning, as well as the inclusion of prior probabilities in decoding. AVAILABILITY: MAMOT is implemented in C++ and is distributed under the GNU General Public Licence (GPL). The software, documentation, and example model files can be found at http://bcf.isb-sib.ch/mamot
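A textbook forward-algorithm sketch of the kind of likelihood computation an HMM tool like MAMOT performs internally; this is not MAMOT's API or file format, just a generic two-state, two-symbol illustration.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Likelihood of an observation sequence under an HMM.
    pi: initial state probs, A: transition matrix, B: emission probs."""
    alpha = pi * B[:, obs[0]]          # initialize with first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate and absorb next symbol
    return alpha.sum()

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # 2 states x 2 symbols
print(forward(pi, A, B, [0, 1, 0]))
```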
Abstract:
Early detection of breast cancer (BC) with mammography may cause overdiagnosis and overtreatment: the detection of tumors which would otherwise have remained undiagnosed during a lifetime. The aims of this study were, first, to model invasive BC incidence trends in Catalonia (Spain) taking into account reproductive and screening data, and second, to quantify the extent of BC overdiagnosis. We modeled the incidence of invasive BC using a Poisson regression model. Explanatory variables were: age at diagnosis and cohort characteristics (completed fertility rate, percentage of women that use mammography at age 50, and year of birth). This model was also used to estimate the background incidence in the absence of screening. We used a probabilistic model to estimate the expected BC incidence if women in the population used mammography as reported in health surveys. The difference between the observed and expected cumulative incidences provided an estimate of overdiagnosis. Incidence of invasive BC increased, especially in cohorts born from 1940 to 1955. The biggest increase was observed in these cohorts between the ages of 50 and 65 years, where the final BC incidence rates more than doubled the initial ones. Dissemination of mammography was significantly associated with BC incidence and overdiagnosis. Our estimates of overdiagnosis ranged from 0.4% to 46.6% for women born around 1935 and 1950, respectively. Our results support the existence of overdiagnosis in Catalonia attributable to mammography usage, and the limited malignant potential of some tumors may play an important role. Women should be better informed about this risk. Research should be oriented towards personalized screening and risk assessment tools.
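A sketch of the kind of Poisson incidence regression described above, using statsmodels on simulated data: case counts offset by person-years, explained by age and a cohort covariate such as mammography uptake at 50. All data and coefficients are invented; only the model form follows the abstract.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(40, 80, n)
uptake = rng.uniform(0, 1, n)                    # mammography use at age 50
person_years = rng.uniform(1e3, 1e4, n)
rate = np.exp(-9 + 0.06 * age + 0.5 * uptake)    # assumed true rates
cases = rng.poisson(rate * person_years)         # simulated case counts

X = sm.add_constant(np.column_stack([age, uptake]))
model = sm.GLM(cases, X, family=sm.families.Poisson(),
               offset=np.log(person_years)).fit()
print(model.params)   # should land near the assumed (-9, 0.06, 0.5)
```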
Abstract:
Given a set of images of scenes containing different object categories (e.g. grass, roads), our objective is to discover these objects in each image and to use the object occurrences to perform scene classification (e.g. beach scene, mountain scene). We achieve this using a supervised learning algorithm able to learn from few images, in order to facilitate the user's task. We use a probabilistic model to recognise the objects, and we then classify the scene based on its object occurrences. Experimental results are shown and evaluated to demonstrate the validity of our proposal. Object recognition performance is compared with the approaches of He et al. (2004) and Marti et al. (2001) using their own datasets. Furthermore, an unsupervised method is implemented in order to evaluate the advantages and disadvantages of our supervised classification approach versus an unsupervised one.
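A sketch of the second stage described above, classifying a scene from its object occurrences; here it is done with a simple multinomial naive-Bayes rule, which may differ from the paper's exact model, and all counts, probabilities and class names are invented.

```python
import numpy as np

# P(object | scene), e.g. estimated from a few labelled images
objects = ["grass", "road", "sand", "water"]
p_obj = {"beach":    np.array([0.05, 0.05, 0.45, 0.45]),
         "mountain": np.array([0.55, 0.25, 0.05, 0.15])}
prior = {"beach": 0.5, "mountain": 0.5}

def classify(counts):
    """counts[k] = number of regions of objects[k] found in the image;
    pick the scene with the highest log-posterior."""
    scores = {s: np.log(prior[s]) + counts @ np.log(p_obj[s]) for s in p_obj}
    return max(scores, key=scores.get)

print(classify(np.array([10, 2, 1, 3])))  # mostly grass -> 'mountain'
```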
Abstract:
Wireless MIMO systems, employing multiple transmit and receive antennas, promise a significant increase in channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. The combination of the two techniques is thus an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and their delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is instead built on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby the multipath-channel response is regarded as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of the Bayesian equations, a (Rao-Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits a complexity which grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.
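A heavily simplified sketch of the set-valued idea: a bootstrap particle filter in which each particle carries a set of active taps whose cardinality can change over time (birth/death moves). All dynamics, noise parameters and the scalar observation model are invented; the paper's Rao-Blackwellized filter is far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)
N, MAX_TAPS, P_BIRTH, P_DEATH = 500, 6, 0.05, 0.05

# Each particle: a variable-length vector of tap amplitudes.
particles = [{"amps": rng.normal(0, 1, rng.integers(1, 4))} for _ in range(N)]

def propagate(p):
    amps = p["amps"] + rng.normal(0, 0.1, len(p["amps"]))   # amplitude drift
    if len(amps) < MAX_TAPS and rng.random() < P_BIRTH:     # tap birth
        amps = np.append(amps, rng.normal(0, 1))
    if len(amps) > 1 and rng.random() < P_DEATH:            # tap death
        amps = np.delete(amps, rng.integers(len(amps)))
    return {"amps": amps}

def likelihood(p, y):
    """Toy observation: received energy, a noisy function of tap power."""
    return np.exp(-0.5 * (y - np.sum(p["amps"] ** 2)) ** 2)

y_obs = 2.0                                   # one made-up measurement
particles = [propagate(p) for p in particles]
w = np.array([likelihood(p, y_obs) for p in particles])
w /= w.sum()
idx = rng.choice(N, size=N, p=w)              # multinomial resampling
particles = [particles[i] for i in idx]
print(np.mean([len(p["amps"]) for p in particles]))  # posterior mean tap count
```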
Abstract:
The objective of this study is to quantify, in money terms, the potential reduction in the usage of public health care outlets associated with holding double (public plus private) insurance. In order to address the problem, a probabilistic model for visits to physicians is specified and estimated using data from the Catalonian Health Survey. In addition, a model for the marginal cost of a visit to a physician is estimated using data from a representative sample of fee-for-service payments from a major insurer. Combining the estimates from the two models, it is possible to quantify in money terms the costs/savings of alternative policies which affect the adoption of double insurance by the population. The results suggest that the private sector absorbs an important volume of demand which would be redirected to the public sector if consumers ceased to hold double insurance.
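A back-of-the-envelope sketch of how the two estimated models combine into a money figure; every number below is invented for illustration, not the study's estimates.

```python
p_public_single = 0.80   # P(visit is to a public outlet | public-only cover)
p_public_double = 0.35   # P(visit is to a public outlet | double cover)
visits_per_year = 6.0    # expected physician visits per person per year
marginal_cost = 30.0     # euros per public-sector visit (from cost model)
n_double = 1_000_000     # population holding double coverage

# Extra public visits if double-coverage holders dropped private insurance:
extra = n_double * visits_per_year * (p_public_single - p_public_double)
print(f"additional public cost: {extra * marginal_cost / 1e6:.1f} M euros")
```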