947 results for Markov models


Relevance: 20.00%

Abstract:

A better understanding of the factors that mould ecological community structure is required to accurately predict community composition and to anticipate threats to ecosystems due to global changes. We tested how well stacked climate-based species distribution models (S-SDMs) could predict butterfly communities in a mountain region. It has been suggested that climate is the main force driving butterfly distribution and community structure in mountain environments, and that, as a consequence, climate-based S-SDMs should yield unbiased predictions. In contrast to this expectation, at lower altitudes, climate-based S-SDMs overpredicted butterfly species richness at sites with low plant species richness and underpredicted species richness at sites with high plant species richness. According to two indices of composition accuracy, the Sorensen index and a matching coefficient considering both absences and presences, S-SDMs were more accurate in plant-rich grasslands. Butterflies display strong and often specialised trophic interactions with plants. At lower altitudes, where land use is more intense, considering climate alone without accounting for land use influences on grassland plant richness leads to erroneous predictions of butterfly presences and absences. In contrast, at higher altitudes, where climate is the main force filtering communities, there were fewer differences between observed and predicted butterfly richness. At high altitudes, even if stochastic processes decrease the accuracy of predictions of presence, climate-based S-SDMs are able to better filter out butterfly species that are unable to cope with severe climatic conditions, providing more accurate predictions of absences. Our results suggest that predictions should account for plants in disturbed habitats at lower altitudes but that stochastic processes and heterogeneity at high altitudes may limit prediction success of climate-based S-SDMs.
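A stacked SDM prediction of richness is, at its core, the per-site sum of thresholded single-species predictions. A minimal sketch of that stacking step, using hypothetical occurrence probabilities and thresholds rather than the study's data:

```python
import numpy as np

def stack_sdm_richness(prob, thresholds):
    """Stacked SDM (S-SDM) richness: binarize each species' predicted
    occurrence probability with its own threshold, then sum presences
    per site. prob: (n_sites, n_species); thresholds: (n_species,)."""
    presence = prob >= thresholds   # boolean (n_sites, n_species)
    return presence.sum(axis=1)     # predicted species richness per site

# Hypothetical probabilities for 3 sites x 4 butterfly species.
prob = np.array([[0.9, 0.2, 0.7, 0.4],
                 [0.1, 0.8, 0.6, 0.9],
                 [0.3, 0.3, 0.2, 0.1]])
thresholds = np.array([0.5, 0.5, 0.5, 0.5])
print(stack_sdm_richness(prob, thresholds))  # [2 3 0]
```

The over- and underprediction patterns reported above are deviations of these summed values from observed richness; they cannot be corrected within the climate-only predictors themselves.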

Relevance: 20.00%

Abstract:

Joint-stability in interindustry models relates to the mutual simultaneous consistency of the demand-driven and supply-driven models of Leontief and Ghosh, respectively. Previous work has claimed joint-stability to be an acceptable assumption from the empirical viewpoint, provided only small changes in exogenous variables are considered. We show in this note, however, that the issue has deeper theoretical roots and offer an analytical demonstration that shows the impossibility of consistency between demand-driven and supply-driven models.
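The two systems referred to above are the Leontief demand-driven model x = (I − A)⁻¹f and the Ghosh supply-driven model x′ = v′(I − B)⁻¹. A toy two-sector example (hypothetical transactions table, not from the note) illustrates the joint-stability issue: both models reproduce the base year exactly, but after a demand shock the fixed Ghosh coefficients no longer hold:

```python
import numpy as np

# Hypothetical 2-sector interindustry transactions table.
Z = np.array([[20.0, 30.0],    # intermediate flows z_ij
              [40.0, 10.0]])
x = np.array([100.0, 120.0])   # total output

A = Z / x            # Leontief technical coefficients a_ij = z_ij / x_j
B = Z / x[:, None]   # Ghosh allocation coefficients  b_ij = z_ij / x_i

f = x - Z.sum(axis=1)   # final demand implied by the table
v = x - Z.sum(axis=0)   # value added implied by the table

# Demand-driven: x = (I - A)^{-1} f.  Supply-driven: x' = v'(I - B)^{-1}.
x_leontief = np.linalg.solve(np.eye(2) - A, f)
x_ghosh = np.linalg.solve((np.eye(2) - B).T, v)
print(np.allclose(x_leontief, x), np.allclose(x_ghosh, x))  # True True

# Joint-stability check: shock final demand, update the economy with the
# Leontief model, then test whether the *fixed* Ghosh coefficients still
# reproduce the new outputs.
f_new = f + np.array([10.0, 0.0])
x_new = np.linalg.solve(np.eye(2) - A, f_new)   # demand-driven update
Z_new = A * x_new                               # new flows, A held fixed
v_new = x_new - Z_new.sum(axis=0)
x_ghosh_new = np.linalg.solve((np.eye(2) - B).T, v_new)
print(np.allclose(x_ghosh_new, x_new))  # False: B would have to change too
```

Keeping A fixed forces the allocation coefficients B to change with the shock, and vice versa, which is the numerical counterpart of the analytical impossibility result.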

Relevance: 20.00%

Abstract:

Interviewing preschool children who are victims of sexual abuse and/or family maltreatment: effectiveness of forensic interview models. Interviewing preschool-age children who have experienced a traumatic event is a complex task that, within forensic psychological assessment, requires a clearly delimited, well-defined and properly timed protocol. Three interview protocols were therefore selected: the Minors Protocol (PM) by Bull and Birch; the National Institute of Child Health and Human Development (NICHD) model by Michael Lamb, from which the EASI (Evaluación del Abuso Sexual Infantojuvenil) was developed; and the Cognitive Interview (CI) by Fisher and Geiselman. The starting hypothesis was to test whether these models yield different amounts of information from preschool children. Accordingly, the objectives were to determine which interview model yields the most accurate information with the fewest errors, to design our own interview model, and to reach consensus on that model. Practical outlines are included to facilitate the opening, development and closing of the forensic interview. The methodology reproduced the child/traumatic-event pairing by showing and explaining an emotionally significant event children can easily identify with: a child's bicycle accident in which the child falls, gets hurt, bleeds and is treated by his father. We then interviewed 135 children in P3, P4 and P5 (3-, 4- and 5-year-olds) using the three interview models, confronting them with a specific demand: to remember and narrate this event. We conclude that the level of correct recall, when an appropriate interview model is used with preschool children, ranges between 70% and 90%, which supports confidence in children's memories. The percentage of incorrect statements by preschool children is minimal, around 5-6%. The study stresses the need to establish the interview rules thoroughly and, finally, highlights the ineffectiveness of the memory techniques of the cognitive interview with children in P3 and P4. In P5, benefits begin to appear with the context reinstatement (CR) technique, the other techniques remaining beyond the comprehension and use of children of these ages.

Relevance: 20.00%

Abstract:

Aspergillus lentulus, an Aspergillus fumigatus sibling species, is increasingly reported in corticosteroid-treated patients. Its clinical significance is unknown, but the fact that A. lentulus shows reduced antifungal susceptibility, mainly to voriconazole, is of serious concern. Heterologous expression of cyp51A from A. fumigatus and A. lentulus was performed in Saccharomyces cerevisiae to assess differences in the interaction of Cyp51A with the azole drugs. The absence of endogenous ERG11 was efficiently complemented in S. cerevisiae by the expression of either Aspergillus cyp51A allele. There was a marked difference between the azole minimum inhibitory concentration (MIC) values of the clones expressing each Aspergillus cyp51A. Saccharomyces cerevisiae clones expressing A. lentulus alleles showed higher MICs for all of the azoles tested, supporting the hypothesis that the intrinsic azole resistance of A. lentulus could be associated with Cyp51A. Homology models of the A. fumigatus and A. lentulus Cyp51A proteins, based on the crystal structure of Cyp51p from Mycobacterium tuberculosis in complex with fluconazole, were almost identical owing to their high mutual sequence identity. Molecular dynamics (MD) was applied to both three-dimensional protein models to refine the homology modelling and to explore possible differences in the Cyp51A-voriconazole interaction. After 20 ns of MD simulation, some critical differences were observed in the putative closed form adopted by the protein upon voriconazole binding. A closer study of the putative voriconazole binding site of A. fumigatus and A. lentulus Cyp51A suggested that some major differences in the protein's BC loop could differentially affect the lock-up of voriconazole, which in turn could correlate with their different azole susceptibility profiles.

Relevance: 20.00%

Abstract:

The investigation of unexplained syncope remains a challenging clinical problem. In the present study we sought to evaluate the diagnostic value of a standardized work-up focusing on non-invasive tests in patients with unexplained syncope referred to a syncope clinic, and to determine whether certain combinations of clinical parameters are characteristic of rhythmic and reflex causes of syncope. METHODS AND RESULTS: 317 consecutive patients underwent a standardized work-up including a 12-lead ECG, physical examination, and a detailed history with screening for syncope-related symptoms using a structured questionnaire, followed by carotid sinus massage (CSM) and head-up tilt test. Invasive testing, including an electrophysiological study and implantation of a loop recorder, was only performed in those with structural heart disease or traumatic syncope. Our work-up identified an etiology in 81% of the patients. Importantly, three quarters of the causes were established non-invasively by combining head-up tilt test, CSM and hyperventilation testing. Invasive tests yielded an additional 7% of diagnoses. Logistic analysis identified age and the number of significant prodromes as the only predictive factors of rhythmic syncope. The same two factors, in addition to the duration of the ECG P-wave, were also predictive of vasovagal and psychogenic syncope. These factors, optimally combined in predictive models, showed a high negative and a modest positive predictive value. CONCLUSION: A standardized work-up focusing on non-invasive tests establishes more than three quarters of syncope causes. Predictive models based on simple clinical parameters may help to distinguish between rhythmic and other causes of syncope.
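The "high negative and modest positive predictive value" reported for the models can be read off a 2×2 confusion table: PPV = TP/(TP+FP) and NPV = TN/(TN+FN). A minimal sketch with hypothetical counts (not the study's data):

```python
def predictive_values(tp, fp, fn, tn):
    """Positive and negative predictive value from a 2x2 table."""
    ppv = tp / (tp + fp)  # P(rhythmic syncope | model says rhythmic)
    npv = tn / (tn + fn)  # P(other cause     | model says other)
    return ppv, npv

# Hypothetical counts for a rule flagging rhythmic syncope in 200 patients.
ppv, npv = predictive_values(tp=15, fp=25, fn=3, tn=157)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # PPV = 0.38, NPV = 0.98
```

A rule with this profile is useful mainly for ruling rhythmic syncope out, which matches the study's conclusion.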

Relevance: 20.00%

Abstract:

Solving multi-stage oligopoly models by backward induction can easily become a complex task when firms are multi-product and demands are derived from a nested logit framework. This paper shows that, under the assumption that within-segment firm shares are equal across segments, the analytical expression for equilibrium profits can be substantially simplified. The size of the error arising when this condition does not hold perfectly is also computed. Through numerical examples, it is shown that the error is rather small in general. Using this assumption therefore makes it possible to gain analytical tractability in a class of models that has been used to approach relevant policy questions, such as firm entry into an industry or the relation between competition and location. The simplifying approach proposed in this paper is aimed at helping to improve these types of models so that they can deliver more accurate recommendations.
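As a rough illustration of the setting (not the paper's derivation), nested logit market shares factor into a within-segment share times a segment share; the paper's simplification assumes the within-segment shares are equal across segments. A sketch with hypothetical product qualities and nesting parameter:

```python
import numpy as np

def nested_logit_shares(delta, groups, lam):
    """Nested logit market shares: s_j = s_{j|g} * s_g, with inclusive
    value D_g = sum_{j in g} exp(delta_j / lam) and s_g = D_g^lam / sum_h D_h^lam.
    No outside good in this sketch, so shares sum to one."""
    delta = np.asarray(delta, float)
    groups = np.asarray(groups)
    D = {g: np.exp(delta[groups == g] / lam).sum() for g in set(groups)}
    denom = sum(d ** lam for d in D.values())
    shares = np.zeros_like(delta)
    for j, (d_j, g) in enumerate(zip(delta, groups)):
        s_within = np.exp(d_j / lam) / D[g]   # share within the nest
        s_group = D[g] ** lam / denom         # share of the nest itself
        shares[j] = s_within * s_group
    return shares

# Hypothetical two-segment industry, two single-product firms per segment,
# with equal qualities within each segment.
delta = [1.0, 1.0, 0.5, 0.5]
groups = [0, 0, 1, 1]
s = nested_logit_shares(delta, groups, lam=0.6)
print(s)  # within-segment shares equal 1/2 in both segments
```

With symmetric within-segment qualities, each firm's conditional share is 1/2 in every segment, which is exactly the condition under which the equilibrium profit expressions collapse to the simplified form.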

Relevance: 20.00%

Abstract:

Stroke is one of the most important health challenges in our country, since the only available treatment is the administration of thrombolytics within the first 4.5 hours, and fewer than 10% of patients can benefit from it. Previous publications have shown that treating stroke with statins can reduce the extent of infarcted tissue and improve neurological function. We therefore propose an experimental study using a rat ischaemia model to establish whether combined treatment with simvastatin and rt-PA increases the benefit obtained with thrombolytic drugs alone, and to evaluate its safety when administered during the acute phase (haemorrhagic transformations and incidence of infections).

Relevance: 20.00%

Abstract:

The lymphatic vascular system, the body's second vascular system present in vertebrates, has emerged in recent years as a crucial player in normal and pathological processes. It participates in the maintenance of normal tissue fluid balance, the immune functions of cellular and antigen trafficking, and the absorption of fatty acids and lipid-soluble vitamins in the gut. Recent scientific discoveries have highlighted the role of the lymphatic system in a number of pathologic conditions, including lymphedema, inflammatory diseases, and tumor metastasis. Development of genetically modified animal models, identification of lymphatic endothelial specific markers and regulators, coupled with technological advances such as high-resolution imaging and genome-wide approaches, have been instrumental in understanding the major steps controlling growth and remodeling of lymphatic vessels. This review highlights the recent insights and developments in the field of lymphatic vascular biology.

Relevance: 20.00%

Abstract:

Models predicting species spatial distribution are increasingly applied to wildlife management issues, emphasising the need for reliable methods to evaluate the accuracy of their predictions. As many available datasets (e.g. museums, herbariums, atlases) do not provide reliable information about species absences, several presence-only based analyses have been developed. However, methods to evaluate the accuracy of their predictions are few and have never been validated. The aim of this paper is to compare existing and new presence-only evaluators to usual presence/absence measures. We use a reliable, diverse presence/absence dataset of 114 plant species to test how common presence/absence indices (Kappa, MaxKappa, AUC, adjusted D²) compare to presence-only measures (AVI, CVI, Boyce index) for evaluating generalised linear models (GLMs). Moreover, we propose a new, threshold-independent evaluator, which we call the "continuous Boyce index". All indices were implemented in the BIOMAPPER software. We show that the presence-only evaluators are fairly correlated (ρ > 0.7) with the presence/absence ones. The Boyce indices are closer to AUC than to MaxKappa and are fairly insensitive to species prevalence. In addition, the Boyce indices provide predicted-to-expected ratio curves that offer further insights into model quality: robustness, habitat suitability resolution and deviation from randomness. This information helps in reclassifying predicted maps into meaningful habitat suitability classes. The continuous Boyce index is thus both a complement to the usual evaluation of presence/absence models and a reliable measure of presence-only based predictions.
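The predicted-to-expected (P/E) logic behind a Boyce-type index can be sketched as follows: partition habitat suitability into classes, compare the share of presences falling in each class with the share of the study area in that class, and rank-correlate the P/E ratio with class rank. This is a simplified fixed-bin version (the continuous index replaces the bins with a moving window), run on hypothetical data:

```python
import numpy as np

def boyce_index(presence_suit, background_suit, nbins=10):
    """Boyce-type presence-only evaluator: Spearman correlation between
    suitability class rank and the predicted-to-expected (P/E) ratio."""
    edges = np.linspace(0.0, 1.0, nbins + 1)
    p = np.histogram(presence_suit, bins=edges)[0] / len(presence_suit)
    e = np.histogram(background_suit, bins=edges)[0] / len(background_suit)
    keep = e > 0
    pe = p[keep] / e[keep]
    class_rank = np.arange(nbins, dtype=float)[keep]

    def ranks(a):  # ranks for Spearman (ties ignored in this sketch)
        out = np.empty(len(a))
        out[np.argsort(a)] = np.arange(len(a))
        return out

    return np.corrcoef(ranks(class_rank), ranks(pe))[0, 1]

# Hypothetical data: background uniform over suitability classes,
# presences piling up in high-suitability classes => monotonic P/E.
background = np.repeat(np.linspace(0.05, 0.95, 10), 100)
presence = np.repeat(np.linspace(0.05, 0.95, 10),
                     [1, 2, 3, 5, 8, 12, 17, 23, 30, 40])
print(round(boyce_index(presence, background), 6))  # 1.0
```

A model whose P/E ratio rises monotonically with predicted suitability scores +1; random predictions score near 0, which is what makes the index usable without absence data.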

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high- resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. 
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
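The gradual-deformation idea invoked in the final paragraph can be sketched simply: two independent Gaussian realizations z₁ and z₂ are combined as z(t) = z₁·cos(t) + z₂·sin(t), which preserves the Gaussian statistics for any t while t tunes the strength of the MCMC proposal. A minimal illustration (not the thesis implementation, which operates on spatially correlated fields):

```python
import numpy as np

def gradual_deformation(z1, z2, t):
    """Combine two independent standard-Gaussian realizations.
    Because cos(t)^2 + sin(t)^2 = 1, the result is again standard
    Gaussian for any t, so t acts as a tunable proposal step size."""
    return z1 * np.cos(t) + z2 * np.sin(t)

rng = np.random.default_rng(42)
z1 = rng.standard_normal(100_000)
z2 = rng.standard_normal(100_000)
for t in (0.1, 0.5):
    z = gradual_deformation(z1, z2, t)
    # Marginal statistics preserved (~1.0); correlation with the current
    # model is ~cos(t), so small t gives gentle, high-acceptance moves.
    print(round(float(z.std()), 2), round(float(np.corrcoef(z1, z)[0, 1]), 2))
```

Small t perturbs the current model only slightly (high acceptance, slow exploration), while t near π/2 proposes an almost independent realization, which is the flexibility the thesis exploits to cut the iterations needed for independent posterior samples.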

Relevance: 20.00%

Abstract:

Report for the scientific sojourn carried out at the University of California at Berkeley, from September to December 2007. Environmental niche modelling (ENM) techniques are powerful tools for predicting species' potential distributions. In the last ten years, a plethora of novel methodological approaches and modelling techniques have been developed. During three months, I stayed at the University of California, Berkeley, working under the supervision of Dr. David R. Vieites. The aim of our work was to quantify the error committed by these techniques, and also to test how an increase in sample size affects the resulting predictions. Using the MaxEnt software we generated predictive distribution maps, from different sample sizes, of the Eurasian quail (Coturnix coturnix) in the Iberian Peninsula. The quail is a generalist species from a climatic point of view, but a habitat specialist. The resulting distribution maps were compared with the real distribution of the species, obtained from recent bird atlases of Spain and Portugal. Results show that ENM techniques can make important errors when predicting the distribution of generalist species. Moreover, an increase in sample size is not necessarily related to better model performance. We conclude that a deep knowledge of the species' biology and the variables affecting its distribution is crucial for optimal modelling; the lack of such knowledge can lead to wrong conclusions.

Relevance: 20.00%

Abstract:

In the PhD thesis "Sound Texture Modeling" we deal with the statistical modelling of textural sounds such as water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modelling of the resulting sequence by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (the hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter), and faithfully reproduces some of the sound classes. In terms of the more general taxonomy of natural events proposed by Gaver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that do not have textural properties in their entirety, those segmentation models can help in identifying textural portions of an audio recording useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we have developed a pattern-based synthesis system that allows a database of units to be explored sonically by means of their representation in a perceptual feature space. Concatenative synthesis with "molecules" built from sparse atomic representations also allows capturing low-level correlations in perceptual audio features, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a more high-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds. Our research is embedded within the Metaverse 1 European project (2008-2011), where our models contribute as low-level building blocks within a semi-automated soundscape generation system.
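The wavelet tree decomposition underlying the initial model can be illustrated with the simplest case, the Haar wavelet: the signal is recursively split into approximation and detail coefficients, and the resulting coefficient tree is what a hidden Markov tree model would be trained on. A minimal sketch (orthonormal Haar filters, hypothetical toy signal; real sound textures would use longer filters and many more levels):

```python
import numpy as np

def haar_tree(signal, depth):
    """Wavelet tree: recursively split the signal into approximation and
    detail coefficients with the orthonormal Haar filter pair."""
    tree, a = [], np.asarray(signal, float)
    for _ in range(depth):
        even, odd = a[0::2], a[1::2]
        tree.append((odd - even) / np.sqrt(2))  # detail coefficients
        a = (even + odd) / np.sqrt(2)           # approximation, halved length
    tree.append(a)                              # final coarse approximation
    return tree

x = np.sin(np.linspace(0, 8 * np.pi, 64))  # toy 'texture' signal
levels = haar_tree(x, depth=3)
print([len(c) for c in levels])  # [32, 16, 8, 8]
```

Because the Haar pair is orthonormal, the total energy of the coefficients equals that of the signal, so nothing is lost in the decomposition that the probabilistic model is fitted to.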

Relevance: 20.00%

Abstract:

Predictive species distribution modelling (SDM) has become an essential tool in biodiversity conservation and management. The choice of grain size (resolution) of the environmental layers used in modelling is one important factor that may affect predictions. We applied 10 distinct modelling techniques to presence-only data for 50 species in five different regions, to test whether: (1) a 10-fold coarsening of resolution affects the predictive performance of SDMs, and (2) any observed effects depend on the type of region, modelling technique, or species considered. Results show that a 10-fold change in grain size does not severely affect predictions from species distribution models. The overall trend is towards degradation of model performance, but improvement can also be observed. Changing grain size does not equally affect models across regions, techniques, and species types. The strongest effect is on regions and species types, with tree species in the data sets (regions) with the highest locational accuracy being most affected. Changing grain size had little influence on the ranking of techniques: boosted regression trees remain best at both resolutions. The number of occurrences used for model training had an important effect, with larger sample sizes resulting in better models, which tended to be more sensitive to grain. The effect of grain change was only noticeable for models reaching sufficient performance and/or with initial data whose intrinsic error is smaller than the coarser grain size.
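The 10-fold coarsening of resolution tested above amounts to block-aggregating each environmental layer. A minimal sketch of that degradation step on a hypothetical raster:

```python
import numpy as np

def coarsen(grid, factor):
    """Aggregate an environmental raster to a coarser grain by block
    averaging: a 'factor'-fold degradation of resolution."""
    r, c = grid.shape
    assert r % factor == 0 and c % factor == 0, "grid must tile evenly"
    return grid.reshape(r // factor, factor,
                        c // factor, factor).mean(axis=(1, 3))

# Hypothetical 20x20 elevation layer coarsened 10-fold to 2x2.
fine = np.arange(400, dtype=float).reshape(20, 20)
coarse = coarsen(fine, 10)
print(coarse.shape)  # (2, 2)
```

Block averaging suits continuous layers such as temperature or elevation; categorical layers (e.g. land cover) would instead take the modal class per block, which is one reason grain change can affect species types differently.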