903 results for Probabilistic latent semantic analysis (PLSA)


Relevance:

30.00%

Publisher:

Abstract:

The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repeated analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
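
The Bayesian interpretation advocated above is usually expressed as a likelihood ratio; a minimal sketch in standard forensic notation (E, H_p and H_d are generic symbols, not taken from this report):

    LR = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}, \qquad
    \frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)} = LR \times \frac{\Pr(H_p)}{\Pr(H_d)},

where E is the observed correspondence between the questioned and reference ink samples, H_p is the proposition that they share a common source and H_d the proposition that they do not; the ink reference database supplies the population statistics needed to evaluate the denominator.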

Relevance:

30.00%

Publisher:

Abstract:

This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses that are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in the blood of car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In the first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. The model is then used in the second part to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated through a practical example.
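
As a rough illustration of the decision component, the sketch below computes the posterior probability that the true concentration exceeds a legal threshold, assuming normally distributed measurement error with a known assay standard deviation and a flat prior on the true value; the function name and the numbers are hypothetical, not taken from the article.

    # Hedged sketch (not the authors' model): posterior probability that a true
    # THC concentration exceeds a legal threshold, assuming normal measurement
    # error with known assay standard deviation and a flat prior on the true value.
    from math import sqrt, erf
    from statistics import mean

    def prob_over_threshold(measurements, assay_sd, threshold):
        """P(true concentration > threshold | data) under the stated assumptions."""
        n = len(measurements)
        post_mean = mean(measurements)          # posterior mean (flat prior)
        post_sd = assay_sd / sqrt(n)            # posterior standard deviation
        z = (threshold - post_mean) / post_sd
        return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))   # 1 - Phi(z)

    # Example: three replicate measurements (ug/L), assay SD 0.2, threshold 1.5
    print(prob_over_threshold([1.6, 1.7, 1.55], 0.2, 1.5))

A posterior probability close to 1 would support the conclusion that the legal threshold is exceeded; how close it must be is precisely the decision question the article addresses.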

Relevance:

30.00%

Publisher:

Abstract:

The ability of Mycobacterium tuberculosis to establish a latent infection (LTBI) in humans confounds the treatment of tuberculosis. Consequently, there is a need to discover new therapeutic agents that can kill M. tuberculosis both during active disease and LTBI. The streptomycin-dependent strain of M. tuberculosis, 18b, provides a useful tool for this purpose since upon removal of streptomycin (STR) it enters a non-replicating state that mimics latency both in vitro and in animal models. The 4.41 Mb genome sequence of M. tuberculosis 18b was determined and this revealed the strain to belong to clade 3 of the ancient ancestral lineage of the Beijing family. STR-dependence was attributable to insertion of a single cytosine in the 530 loop of the 16S rRNA and to a single amino acid insertion in the N-terminal domain of initiation factor 3. RNA-seq was used to understand the genetic programme activated upon STR-withdrawal and hence to gain insight into LTBI. This revealed reconfiguration of gene expression and metabolic pathways showing strong similarities between non-replicating 18b and M. tuberculosis residing within macrophages, and with the core stationary phase and microaerophilic responses. The findings of this investigation confirm the validity of 18b as a model for LTBI, and provide insight into both the evolution of tubercle bacilli and the functioning of the ribosome.

Relevance:

30.00%

Publisher:

Abstract:

Latent variable models in finance originate both from asset pricing theory and from time series analysis. These two strands of literature appeal to two different concepts of latent structure, both useful for reducing the dimension of a statistical model specified for a multivariate time series of asset prices. In the CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns, given a small number of state variables. In this paper, we use the concept of the Stochastic Discount Factor (SDF), or pricing kernel, as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of some state variables which summarize their dynamics. In beta pricing models, it is often said that only factorial risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, conditional independence between the contemporaneous returns of a large number of assets, given a small number of factors, as in standard factor analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates and other state variables. We address the general issue of econometric specification of dynamic asset pricing models, which covers the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals. We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role in the validity of standard CAPM-like stock pricing and preference-free option pricing.
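
In standard asset pricing notation (generic, not taken from this paper), the SDF view that unifies the two latent structures can be summarised as:

    E_t\left[ m_{t+1} R_{i,t+1} \right] = 1, \qquad
    m_{t+1} = \lambda_0(s_t) + \sum_{k=1}^{K} \lambda_k(s_t)\, F_{k,t+1},

so the factors F_{k,t+1} span the SDF with coefficients \lambda_k(\cdot) that are deterministic functions of the state variables s_t, and a conditional beta pricing relation for expected returns follows from projecting returns onto the factors.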

Relevance:

30.00%

Publisher:

Abstract:

We consider a probabilistic approach to the problem of assigning k indivisible identical objects to a set of agents with single-peaked preferences. Using the ordinal extension of preferences, we characterize the class of uniform probabilistic rules by Pareto efficiency, strategy-proofness, and no-envy. We also show that in this characterization no-envy cannot be replaced by anonymity. When agents are strictly risk-averse von Neumann-Morgenstern utility maximizers, we reduce the problem of assigning k identical objects to a problem of allocating the amount k of an infinitely divisible commodity.
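
For intuition, here is a minimal sketch of the classical (deterministic) uniform rule for the divisible reformulation mentioned in the last sentence; the code and the example are illustrative, not the paper's construction.

    # Hedged sketch: the classical uniform rule for dividing an amount k of a
    # divisible commodity among agents with single-peaked preferences.
    def uniform_rule(peaks, k, tol=1e-9):
        if sum(peaks) >= k:                      # excess demand: cap agents at lam
            lo, hi = 0.0, max(peaks)
            while hi - lo > tol:
                lam = (lo + hi) / 2.0
                if sum(min(p, lam) for p in peaks) > k:
                    hi = lam
                else:
                    lo = lam
            return [min(p, lo) for p in peaks]
        else:                                    # excess supply: lift agents to mu
            lo, hi = 0.0, k
            while hi - lo > tol:
                mu = (lo + hi) / 2.0
                if sum(max(p, mu) for p in peaks) > k:
                    hi = mu
                else:
                    lo = mu
            return [max(p, lo) for p in peaks]

    print(uniform_rule([3.0, 1.0, 0.5], 2.0))    # peaks exceed supply: [0.75, 0.75, 0.5]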

Relevance:

30.00%

Publisher:

Abstract:

With the help of an illustrative computable general equilibrium (CGE) model of the Moroccan economy, we test for the significance of simulation results in the case where the exact macro closure is not known with certainty. This is done by computing lower and upper bounds for the simulation results, given a priori probabilities attached to three possible closures (Classical, Johansen, Keynesian). Our conclusion is that, when there is uncertainty about closures, several endogenous changes lack significance, which, in turn, limits the use of the model for policy prescriptions.
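
A toy sketch of the kind of bound computation described, where the closure-specific results and the prior probabilities are made-up numbers rather than the paper's:

    # Hypothetical illustration: bounds on one simulated endogenous change when
    # the macro closure is uncertain, given priors on three closures.
    results = {"Classical": 2.1, "Johansen": 0.4, "Keynesian": -1.3}   # % change
    priors  = {"Classical": 0.4, "Johansen": 0.4, "Keynesian": 0.2}

    expected = sum(priors[c] * results[c] for c in results)
    lower, upper = min(results.values()), max(results.values())
    robust_sign = lower > 0 or upper < 0      # same sign under every closure?

    print(f"expected {expected:+.2f}%, bounds [{lower:+.2f}%, {upper:+.2f}%], "
          f"robust sign: {robust_sign}")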

Relevance:

30.00%

Publisher:

Abstract:

The first two articles build procedures to simulate vectors of univariate states and estimate parameters in nonlinear and non-Gaussian state space models. We propose state space specifications that offer more flexibility in modelling dynamic relationships with latent variables. Our procedures are extensions of the HESSIAN method of McCausland [2012]. They use approximations of the posterior density of the state vector that make it possible to simulate directly from the posterior distribution of the state vector, to simulate the state vector in one block and jointly with the parameter vector, and to avoid data augmentation. These properties allow us to build posterior simulators with very high relative numerical efficiency. Being generic, they open a new path in nonlinear and non-Gaussian state space analysis with limited effort from the modeller. The third article is an essay in commodity market analysis. Private firms coexist with farmers' cooperatives in commodity markets in sub-Saharan African countries. The private firms hold the largest market share, even though some theoretical models predict their disappearance once confronted with farmers' cooperatives. Moreover, some empirical studies and observations link cooperative incidence in a region to interpersonal trust, and thus to farmers' trust towards cooperatives. We propose a model that supports these empirical facts: a model in which the cooperative's reputation is a leading factor determining the market equilibrium of price competition between a cooperative and a private firm.
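
A generic model of the kind these procedures target can be written, in standard (not thesis-specific) notation, as:

    x_t \mid x_{t-1} \sim p(x_t \mid x_{t-1}, \theta), \qquad
    y_t \mid x_t \sim p(y_t \mid x_t, \theta), \qquad t = 1, \ldots, n,

with the observation density possibly non-Gaussian and nonlinear in the state; the proposed simulators draw the full state vector x_{1:n}, and optionally \theta as well, in a single block from a close approximation of the posterior p(x_{1:n} \mid y_{1:n}, \theta).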

Relevance:

30.00%

Publisher:

Abstract:

This thesis proposes a methodology for the probabilistic simulation of matrix failure in carbon-fibre-reinforced composite materials, based on the analysis of the random distribution of the fibres. The first chapters review the state of the art on the mathematical modelling of random materials, the computation of effective properties, and transverse failure criteria for composite materials. The first step in the proposed methodology is the determination of the minimum size of a Statistical Representative Volume Element (SRVE). This determination is carried out by analysing the fibre volume fraction, the effective elastic properties, the Hill condition, the statistics of the stress and strain components, the probability density function and the statistical inter-fibre distance functions of microstructure models of different sizes. Once this minimum size has been determined, a periodic model and a random model are compared in order to assess the magnitude of the differences observed between them. A methodology is also defined for the statistical analysis of the fibre distribution in the composite, based on digital images of the transverse cross-section. This analysis is applied to four different materials. Finally, a two-scale computational method is proposed to simulate the transverse failure of unidirectional plies, which makes it possible to obtain probability density functions for the mechanical variables. Some applications and possibilities of this method are described, and the simulation results are compared with experimental values.
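
A toy sketch of the SRVE-size criterion described above, in which a hypothetical effective_modulus function stands in for the micromechanical finite-element computation on a random fibre arrangement of side L:

    # Hedged sketch (not the thesis code): choose the minimum SRVE size as the
    # smallest window for which the scatter of an effective property across
    # random realizations falls below a tolerance.
    import random
    from statistics import mean, stdev

    def effective_modulus(L, seed):               # placeholder for the real solver
        random.seed(seed)
        return 9.0 + random.gauss(0.0, 1.0) / L   # scatter shrinks with window size

    def minimum_srve_size(sizes, n_realizations=20, rel_tol=0.01):
        for L in sizes:
            samples = [effective_modulus(L, s) for s in range(n_realizations)]
            if stdev(samples) / mean(samples) < rel_tol:   # coefficient of variation
                return L
        return None

    print(minimum_srve_size([10, 20, 40, 80]))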

Relevance:

30.00%

Publisher:

Abstract:

The structure and size of the eyes generated in numerically simulated tropical cyclones and polar lows have been studied. A primitive-equation numerical model simulated systems in which the structures of the eyes formed were consistent with available observations. Whilst the tropical cyclone eyes generated were usually rapidly rotating, it appeared impossible for an eye formed in a system with a polar environment to develop this type of structure. The polar low eyes were found to be unable to warm through the subsidence of air with high values of potential temperature, as the environment was approximately statically neutral. Factors affecting the size of the eye were investigated through a series of controlled experiments. In mature tropical cyclone systems the size of the eye was insensitive to small changes in initial conditions, surface friction and latent and sensible heating from the ocean. In contrast, the eye size was strongly dependent on these parameters in the mature polar lows. Consistent with the findings, a mechanism is proposed in which the size of the eye in simulated polar lows is controlled by the strength of subsidence within it.

Relevance:

30.00%

Publisher:

Abstract:

Poverty, as defined within development discourse, does not fully capture the reality in which the poor live, which is also shaped by values and beliefs specific to a given culture and setting. This article uses a memetic approach to investigating the reality of poverty among pastoralists and urban dwellers in Kenya. By distinguishing the semantic space and the cultural context in which the definitions are framed, it enables the researcher to make sufficient generalisations while also recognising the differences between cultures. The results demonstrate how pastoralists and urban dwellers conceptualise poverty differently, particularly with regard to its causes. Further, the article suggests that development actors often utilise a Western construct which does not entirely reflect the values and beliefs of the poor.

Relevance:

30.00%

Publisher:

Abstract:

Numerous linguistic operations have been assigned to cortical brain areas, but the contributions of subcortical structures to human language processing are still being discussed. Using simultaneous EEG recordings directly from deep brain structures and the scalp, we show that the human thalamus systematically reacts to syntactic and semantic parameters of auditorily presented language in a temporally interleaved manner in coordination with cortical regions. In contrast, two key structures of the basal ganglia, the globus pallidus internus and the subthalamic nucleus, were not found to be engaged in these processes. We therefore propose that syntactic and semantic language analysis is primarily realized within cortico-thalamic networks, whereas a cohesive basal ganglia network is not involved in these essential operations of language analysis.

Relevance:

30.00%

Publisher:

Abstract:

Previous functional imaging studies have shown that facilitated processing of a visual object on repeated, relative to initial, presentation (i.e., repetition priming) is associated with reductions in neural activity in multiple regions, including fusiform/lateral occipital cortex. Moreover, activity reductions have been found, at diminished levels, when a different exemplar of an object is presented on repetition. In one previous study, the magnitude of diminished priming across exemplars was greater in the right relative to the left fusiform, suggesting greater exemplar specificity in the right. Another previous study, however, observed fusiform lateralization modulated by object viewpoint, but not object exemplar. The present fMRI study sought to determine whether the result of differential fusiform responses for perceptually different exemplars could be replicated. Furthermore, the role of the left fusiform cortex in object recognition was investigated via the inclusion of a lexical/semantic manipulation. Right fusiform cortex showed a significantly greater effect of exemplar change than left fusiform, replicating the previous result of exemplar-specific fusiform lateralization. Right fusiform and lateral occipital cortex were not differentially engaged by the lexical/semantic manipulation, suggesting that their role in visual object recognition lies predominantly in the visual discrimination of specific objects. Activation in left fusiform cortex, but not left lateral occipital cortex, was modulated by both exemplar change and the lexical/semantic manipulation, with further analysis suggesting a posterior-to-anterior progression between regions involved in processing visuoperceptual and lexical/semantic information about objects. The results are consistent with the view that the right fusiform plays a greater role in processing specific visual form information about objects, whereas the left fusiform is also involved in lexical/semantic processing. (C) 2003 Elsevier Science (USA). All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

A large volume of visual content remains inaccessible until effective and efficient indexing and retrieval of such data is achieved. In this paper, we introduce the DREAM system, a knowledge-assisted, semantic-driven, context-aware visual information retrieval system applied in the film post-production domain. We mainly focus on the automatic labelling and topic map related aspects of the framework. The use of context-related collateral knowledge, represented by a novel probabilistic visual keyword co-occurrence matrix, has been proven effective via the experiments conducted during system evaluation. The automatically generated semantic labels were fed into the Topic Map Engine, which can automatically construct ontological networks using Topic Maps technology, dramatically enhancing the indexing and retrieval performance of the system towards an even higher semantic level.
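
As a toy illustration of the collateral knowledge, the sketch below estimates a probabilistic co-occurrence matrix P(keyword_j | keyword_i) from per-shot visual keyword annotations; the data and variable names are hypothetical, not the DREAM implementation.

    # Hedged sketch: conditional co-occurrence probabilities of visual keywords.
    from collections import defaultdict
    from itertools import permutations

    shots = [                                   # hypothetical per-shot annotations
        {"car", "road", "tree"},
        {"car", "road"},
        {"tree", "sky"},
    ]

    pair_counts = defaultdict(int)
    keyword_counts = defaultdict(int)
    for labels in shots:
        for kw in labels:
            keyword_counts[kw] += 1
        for a, b in permutations(labels, 2):
            pair_counts[(a, b)] += 1

    cooccurrence = {pair: n / keyword_counts[pair[0]] for pair, n in pair_counts.items()}
    print(cooccurrence[("car", "road")])        # P(road | car) = 1.0 in this toy data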

Relevance:

30.00%

Publisher:

Abstract:

Driven by new network and middleware technologies such as mobile broadband, near-field communication, and context awareness, the so-called ambient lifestyle will foster innovative use cases in different domains. In the EU project Hydra, high-level security, trust and privacy concerns such as loss of control, profiling and surveillance are considered from the outset. By the end of this project, the Hydra middleware development platform will have been designed so as to enable developers to realise secure ambient scenarios. This paper gives a short introduction to the Hydra project and its approach to ensuring security by design. Based on the results of a focus group analysis of the user domain "building automation", typical threats are evaluated and their risks are assessed. Then, specific security requirements with respect to security, privacy, and trust are derived in order to incorporate them into the Hydra Security Meta-Model. How concepts such as context, semantic resolution of security, and virtualisation support the overall Hydra approach is introduced and illustrated on the basis of a technical building automation scenario.

Relevance:

30.00%

Publisher:

Abstract:

We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners, which operate on discriminative image intensity features that are defined on small patches and are fast to compute. A database designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line-search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, which requires a ribbon-like search region and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking, and compare its performance with other existing methods for line-search boundary detection.
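
The sketch below illustrates the line-search idea under stated assumptions: candidate points are sampled along the normal to an initial boundary estimate and the one with the highest transition score is kept. Here transition_score is a crude local-variance proxy standing in for the trained classifier, not the authors' learner.

    # Hedged sketch (not the authors' system): 1-D line search along the normal.
    import numpy as np

    def transition_score(image, point):
        """Toy proxy for a texture-transition score: variance contrast across the point."""
        y, x = int(round(point[0])), int(round(point[1]))
        left = image[max(y - 3, 0):y + 4, max(x - 7, 0):x]
        right = image[max(y - 3, 0):y + 4, x + 1:x + 8]
        if left.size == 0 or right.size == 0:
            return 0.0
        return abs(float(left.std()) - float(right.std()))

    def refine_point(image, point, normal, search_radius=5):
        """Move `point` along its unit `normal` to the highest-scoring offset."""
        offsets = np.arange(-search_radius, search_radius + 1)
        candidates = [np.asarray(point, float) + t * np.asarray(normal, float)
                      for t in offsets]
        scores = [transition_score(image, c) for c in candidates]
        return candidates[int(np.argmax(scores))]

    # Toy usage: a vertical texture boundary at column 32 of a synthetic image
    rng = np.random.default_rng(0)
    img = np.hstack([rng.normal(0, 0.1, (64, 32)), rng.normal(0, 1.0, (64, 32))])
    print(refine_point(img, (32, 28), (0, 1)))   # moves the point toward column ~32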