888 results for Web modelling methods


Relevance:

30.00%

Publisher:

Abstract:

The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability rather than concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from the data, providing the optimal mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide an efficient means to model local anomalies that typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs-137 activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
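The abstract describes a multi-scale SVR in which short- and large-scale processes are modelled jointly. As a rough illustration only (not the authors' algorithm), the sketch below builds an SVR whose kernel is a weighted sum of a short-range and a long-range RBF kernel, using scikit-learn and synthetic spatial data; all parameter values are assumptions made for the example.

```python
# Minimal sketch of a two-scale SVR: the kernel is a weighted sum of a
# short-range and a long-range RBF kernel, so that a local anomaly and the
# broad trend are modelled jointly. Illustration with scikit-learn only,
# not the authors' multi-scale SVR implementation.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

def two_scale_kernel(X, Y, gamma_short=10.0, gamma_large=0.1, w=0.5):
    """Weighted mixture of a short-scale and a large-scale RBF kernel."""
    return w * rbf_kernel(X, Y, gamma=gamma_short) + (1 - w) * rbf_kernel(X, Y, gamma=gamma_large)

# Synthetic spatial data: a smooth trend plus a local anomaly (hotspot).
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(300, 2))             # x, y coordinates
trend = np.sin(0.3 * coords[:, 0])                      # large-scale process
anomaly = 2.0 * np.exp(-((coords - 5.0) ** 2).sum(1))   # short-scale hotspot at (5, 5)
z = trend + anomaly + 0.1 * rng.standard_normal(300)

model = SVR(kernel=two_scale_kernel, C=10.0, epsilon=0.05)
model.fit(coords, z)
print("prediction at hotspot centre:", model.predict([[5.0, 5.0]]))
```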

Relevance:

30.00%

Publisher:

Abstract:

Macroeconomists working with multivariate models typically face uncertainty over which (if any) of their variables have long-run steady states that are subject to breaks. Furthermore, the nature of the break process is often unknown. In this paper, we draw on methods from the Bayesian clustering literature to develop an econometric methodology which: i) finds groups of variables that have the same number of breaks; and ii) determines the nature of the break process within each group. We present an application involving a five-variate steady-state VAR.
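The methodology groups variables by the number and nature of their breaks. The sketch below illustrates only the grouping-by-break-count step on toy series, using the `ruptures` change-point package as an assumed stand-in for the paper's Bayesian clustering machinery; penalties and data are purely illustrative.

```python
# Illustrative sketch only: group macro series by the number of detected
# mean breaks. The paper's Bayesian clustering over break processes is more
# involved; here the break count per variable comes from the `ruptures`
# change-point package (an assumed stand-in, not the authors' method).
import numpy as np
import ruptures as rpt
from collections import defaultdict

rng = np.random.default_rng(1)
T = 200
series = {
    "inflation":  np.r_[rng.normal(0, 1, 100), rng.normal(3, 1, 100)],  # one mean break
    "output_gap": rng.normal(0, 1, T),                                   # no break
    "rate":       np.r_[rng.normal(0, 1, 100), rng.normal(2, 1, 100)],  # one mean break
}

groups = defaultdict(list)
for name, y in series.items():
    algo = rpt.Binseg(model="l2").fit(y)
    # predict() returns break indices plus the series end; the penalty controls sensitivity
    breaks = algo.predict(pen=10 * np.log(T))
    groups[len(breaks) - 1].append(name)

for n_breaks, names in sorted(groups.items()):
    print(f"{n_breaks} break(s): {names}")
```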

Relevance:

30.00%

Publisher:

Abstract:

The ability to model biodiversity patterns is of prime importance in this era of severe environmental crisis. Species assemblages along environmental gradients are shaped by the interplay of biotic interactions and abiotic environmental filtering. Accounting for complex biotic interactions across a wide array of species has so far remained challenging. Here, we propose to use food web models that can infer the potential interaction links between species as a constraint in species distribution models. Using a plant-herbivore (butterfly) interaction dataset, we demonstrate that this combined approach is able to improve both species distribution and community forecasts. Most importantly, the combined approach is particularly useful for modelling more generalist species that have multiple potential interaction links, for which gaps in the literature are common. Our combined approach points to a promising way forward for modelling the spatial variation of entire species interaction networks. Our work has implications for studies of range-shifting species and invasion biology, where it may be unknown how a given biota will interact with a potential invader or under a future climate.
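To make the general idea concrete, the sketch below adds the predicted suitability of a potential host plant as an extra covariate in a butterfly's distribution model; it is a toy illustration with scikit-learn and hypothetical variable names, not the authors' model.

```python
# Minimal sketch of the general idea (not the authors' model): the predicted
# habitat suitability of a butterfly's potential host plant is added as an
# extra covariate in the butterfly's own distribution model, so that the
# trophic constraint informs the prediction. All names/data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_sites = 500
climate = rng.normal(size=(n_sites, 3))             # abiotic predictors per site

# Step 1: plain SDM for the host plant (presence/absence vs. climate).
plant_presence = (climate[:, 0] + 0.5 * climate[:, 1] + rng.normal(0, 1, n_sites) > 0).astype(int)
plant_sdm = LogisticRegression().fit(climate, plant_presence)
plant_suitability = plant_sdm.predict_proba(climate)[:, 1]

# Step 2: butterfly SDM with climate plus the host-plant suitability as a biotic constraint.
butterfly_presence = (0.8 * plant_suitability + 0.3 * climate[:, 2]
                      + rng.normal(0, 0.5, n_sites) > 0.7).astype(int)
X_combined = np.column_stack([climate, plant_suitability])
butterfly_sdm = LogisticRegression().fit(X_combined, butterfly_presence)
print("coefficient on host-plant suitability:", butterfly_sdm.coef_[0, -1].round(2))
```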

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The aim of this study was to assess, at the European level and using digital technology, the inter-pathologist reproducibility of the ISHLT 2004 system and to compare it with the 1990 system. We also assessed the reproducibility of the morphologic criteria for the diagnosis of antibody-mediated rejection detailed in the 2004 grading system. METHODS: The hematoxylin-eosin-stained sections of 20 sets of endomyocardial biopsies were pre-selected and graded by two pathologists (A.A. and M.B.) and digitized using a telepathology digital pathology system (Aperio ImageScope System; for details refer to http://aperio.com/). Their diagnoses were considered the index diagnoses, which covered all grades of acute cellular rejection (ACR), early ischemic lesions, Quilty lesions, late ischemic lesions and (in the 2005 system) antibody-mediated rejection (AMR). Eighteen pathologists from 16 heart transplant centers in 7 European countries participated in the study. Inter-observer reproducibility was assessed using Fleiss's kappa and Krippendorff's alpha statistics. RESULTS: The combined kappa value of all grades diagnosed by all 18 pathologists was 0.31 for the 1990 grading system and 0.39 for the 2005 grading system, with alpha statistics of 0.57 and 0.55, respectively. Kappa values by grade for 1990/2005, respectively, were: 0 = 0.52/0.51; 1A/1R = 0.24/0.36; 1B = 0.15; 2 = 0.13; 3A/2R = 0.29/0.29; 3B/3R = 0.13/0.23; and 4 = 0.18. For the 2 cases of AMR, 6 of 18 pathologists correctly suspected AMR on the hematoxylin-eosin slides, whereas in 17 of the 18 AMR-negative cases a small percentage of pathologists (range 5% to 33%) over-interpreted the findings as suggestive of AMR. CONCLUSIONS: Reproducibility studies of cardiac biopsies by pathologists in different centers at the international level were feasible using digitized slides rather than conventional histology glass slides. There was a small improvement in inter-observer agreement between pathologists of different European centers when moving from the 1990 ISHLT classification to the "new" 2005 ISHLT classification. Morphologic suspicion of AMR in the 2004 system, based on hematoxylin-eosin-stained slides only, showed poor agreement, highlighting the need for better standardization of the morphologic criteria for AMR. Ongoing educational programs are needed to ensure standardization of the diagnosis of both acute cellular and antibody-mediated rejection.
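For readers unfamiliar with the agreement statistic reported above, the sketch below shows how a multi-rater Fleiss's kappa can be computed with statsmodels; the rating table is made up for illustration, not taken from the study.

```python
# How an inter-observer agreement statistic of the kind reported here
# (Fleiss's kappa across many raters) can be computed. Toy data, not the
# study's ratings. Uses statsmodels' inter-rater module.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Rows = biopsy cases, columns = pathologists, values = assigned grade
# (e.g. 0R, 1R, 2R, 3R encoded as 0..3). Toy example: 5 cases, 6 raters.
ratings = np.array([
    [0, 0, 0, 1, 0, 0],
    [1, 1, 2, 1, 1, 1],
    [2, 2, 2, 2, 3, 2],
    [0, 1, 0, 0, 0, 1],
    [3, 3, 2, 3, 3, 3],
])

# aggregate_raters turns the cases x raters table into counts per category.
counts, categories = aggregate_raters(ratings)
print("Fleiss's kappa:", round(fleiss_kappa(counts), 3))
```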

Relevance:

30.00%

Publisher:

Abstract:

Several analysis methods perform a global clustering of the series of microarray samples, such as Self-Organizing Maps, or perform local clusterings that consider only a subset of co-expressed genes, such as biclustering, among others. In this project a web application has been developed, PCOPSamplecl, a tool that belongs to the local clustering methods and that does not look for subsets of co-expressed genes (analysis of linear relationships), but rather for pairs of genes whose expression relationship fluctuates under phenotypic changes. The results of PCOPSamplecl are the different final cluster distributions and the gene pairs involved in these phenotypic changes. These gene pairs can then be studied to find the cause and effect of the phenotypic change. In addition, the tool facilitates the study of the dependencies between the different cluster distributions provided by the application, in order to examine the intersection between clusters or the appearance of subclusters (two clusters of the same cluster distribution may be subclusters of other clusters from different cluster distributions). The tool is available at: http://revolutionresearch.uab.es/
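To illustrate the underlying notion of gene pairs whose relationship shifts with phenotype, the sketch below flags pairs whose correlation differs strongly between two sample groups; this is a toy illustration with synthetic data, not the PCOPSamplecl algorithm.

```python
# Illustrative sketch only (not the PCOPSamplecl algorithm): flag gene pairs
# whose expression correlation changes markedly between two sample groups,
# i.e. pairs whose relationship "fluctuates" with a phenotypic change.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n_genes, n_samples = 20, 40
expr = rng.normal(size=(n_genes, n_samples))
group = np.array([0] * 20 + [1] * 20)            # hypothetical phenotype labels

# Make genes 0 and 1 correlated only in group 1.
expr[1, group == 1] = expr[0, group == 1] + 0.1 * rng.normal(size=20)

shifting_pairs = []
for i, j in combinations(range(n_genes), 2):
    r0 = np.corrcoef(expr[i, group == 0], expr[j, group == 0])[0, 1]
    r1 = np.corrcoef(expr[i, group == 1], expr[j, group == 1])[0, 1]
    if abs(r1 - r0) > 0.8:                       # arbitrary threshold
        shifting_pairs.append((i, j, round(r0, 2), round(r1, 2)))

print("gene pairs with a correlation shift:", shifting_pairs)
```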

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: Darunavir is a protease inhibitor that is administered with low-dose ritonavir to enhance its bioavailability. It is prescribed at standard dosage regimens of 600/100 mg twice daily in treatment-experienced patients and 800/100 mg once daily in naive patients. A population pharmacokinetic approach was used to characterize the pharmacokinetics of both drugs and their interaction in a cohort of unselected patients, and to compare the darunavir exposure expected under alternative dosage regimens. METHODS: The study population included 105 HIV-infected individuals who provided darunavir and ritonavir plasma concentrations. First, a population pharmacokinetic analysis for darunavir and ritonavir was conducted, with inclusion of patients' demographic, clinical and genetic characteristics as potential covariates (NONMEM(®)). Then, the interaction between darunavir and ritonavir was studied while incorporating levels of both drugs into different inhibitory models. Finally, model-based simulations were performed to compare trough concentrations (Cmin) between the recommended dosage regimen and alternative combinations of darunavir and ritonavir. RESULTS: A one-compartment model with first-order absorption adequately characterized darunavir and ritonavir pharmacokinetics. The between-subject variability in both compounds was substantial [coefficient of variation (CV%) 34% and 47% for darunavir and ritonavir clearance, respectively]. Lopinavir and ritonavir exposure (AUC) affected darunavir clearance, while body weight and darunavir AUC influenced ritonavir elimination. None of the tested genetic variants showed any influence on darunavir or ritonavir pharmacokinetics. The simulations predicted darunavir Cmin well above the IC50 thresholds for wild-type and protease inhibitor-resistant HIV-1 strains (55 and 550 ng/mL, respectively) under standard dosing in >98% of experienced and naive patients. Alternative regimens of darunavir/ritonavir 1200/100 or 1200/200 mg once daily also yielded adequate predicted Cmin (>550 ng/mL) in 84% and 93% of patients, respectively. Reduction of the darunavir/ritonavir dosage to 600/50 mg twice daily led to a 23% reduction in average Cmin, still with only 3.8% of patients having concentrations below the IC50 for resistant strains. CONCLUSIONS: The substantial variability in darunavir and ritonavir pharmacokinetics is poorly explained by clinical covariates and genetic influences. In experienced patients, treatment simplification strategies guided by drug level measurements and adherence monitoring could be proposed.
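As a worked illustration of the kind of simulation described (steady-state trough concentrations under different regimens from a one-compartment model with first-order absorption), the sketch below uses the standard superposition formula; the parameter values are placeholders, not the study's population estimates.

```python
# Sketch of the simulation idea: steady-state trough (Cmin) of a
# one-compartment model with first-order absorption under different dosing
# regimens. Parameter values are illustrative placeholders, not the
# population estimates reported in the study.
import numpy as np

def cmin_steady_state(dose_mg, tau_h, ka, ke, V_L, F=1.0):
    """Steady-state concentration at the end of the dosing interval (trough), ng/mL."""
    t = tau_h
    term = (np.exp(-ke * t) / (1 - np.exp(-ke * tau_h))
            - np.exp(-ka * t) / (1 - np.exp(-ka * tau_h)))
    return F * dose_mg * ka / (V_L * (ka - ke)) * term * 1000  # mg/L -> ng/mL

# Illustrative parameters: absorption rate (1/h), elimination rate (1/h), volume (L).
ka, ke, V = 1.0, 0.06, 100.0

for dose, tau, label in [(600, 12, "600 mg twice daily"),
                         (800, 24, "800 mg once daily"),
                         (1200, 24, "1200 mg once daily")]:
    print(f"{label}: predicted Cmin ~ {cmin_steady_state(dose, tau, ka, ke, V):.0f} ng/mL")
```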

Relevance:

30.00%

Publisher:

Abstract:

The paper presents an approach for mapping of precipitation data. The main goal is to perform spatial predictions and simulations of precipitation fields using geostatistical methods (ordinary kriging, kriging with external drift) as well as machine learning algorithms (neural networks). More practically, the objective is to reproduce simultaneously both the spatial patterns and the extreme values. This objective is best reached by models integrating geostatistics and machine learning algorithms. To demonstrate how such models work, two case studies have been considered: first, a 2-day accumulation of heavy precipitation and second, a 6-day accumulation of extreme orographic precipitation. The first example is used to compare the performance of two optimization algorithms (conjugate gradients and Levenberg-Marquardt) of a neural network for the reproduction of extreme values. Hybrid models, which combine geostatistical and machine learning algorithms, are also treated in this context. The second dataset is used to analyze the contribution of radar Doppler imagery when used as external drift or as input in the models (kriging with external drift and neural networks). Model assessment is carried out by comparing independent validation errors as well as analyzing data patterns.
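A common way to build the hybrid models mentioned above is to let a machine learning model capture the trend and to krige its residuals. The sketch below shows that pattern with scikit-learn and pykrige on synthetic data; it is an assumed stand-in for the paper's models, not their implementation, and radar information could enter as extra input columns.

```python
# Sketch of a hybrid geostatistics / machine learning model: a neural network
# captures the trend of the precipitation field and ordinary kriging
# interpolates its residuals. pykrige and scikit-learn are assumed stand-ins,
# not the authors' implementation; data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(4)
x, y = rng.uniform(0, 100, 200), rng.uniform(0, 100, 200)
rain = 5 + 0.05 * x + 2 * np.sin(y / 10) + rng.normal(0, 0.5, 200)  # synthetic field

# 1) Neural network trend on the coordinates (external drift such as radar
#    estimates could be appended as extra input columns).
X = np.column_stack([x, y])
nn = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0).fit(X, rain)
residuals = rain - nn.predict(X)

# 2) Ordinary kriging of the residuals onto a regular grid.
ok = OrdinaryKriging(x, y, residuals, variogram_model="spherical")
gridx = np.linspace(0, 100, 50)
gridy = np.linspace(0, 100, 50)
kriged_res, _ = ok.execute("grid", gridx, gridy)

# 3) Hybrid prediction = NN trend + kriged residual on the grid.
gx, gy = np.meshgrid(gridx, gridy)
trend_grid = nn.predict(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
prediction = trend_grid + kriged_res
print("hybrid prediction grid shape:", prediction.shape)
```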

Relevance:

30.00%

Publisher:

Abstract:

Protecting native biodiversity against alien invasive species requires powerful methods to anticipate these invasions and to protect the native species assumed to be at risk. Here, we describe how species distribution models (SDMs) can be used to identify areas predicted as suitable for rare native species and also predicted as highly susceptible to invasion by alien species, at present and under future climate and land-use scenarios. To assess the condition and dynamics of such conflicts, we developed a combined predictive modelling (CPM) approach, which predicts species distributions by combining two SDMs fitted using subsets of predictors classified as acting at either regional or local scales. We illustrate the CPM approach for an alien invader and a rare species associated with similar habitats in northwest Portugal. Combined models predict a wider variety of potential species responses, providing more informative projections of species distributions and future dynamics than traditional, non-combined models. They also provide more informative insight regarding current and future rare-invasive conflict areas. For our studied species, the conflict areas of highest conservation relevance are predicted to decrease over the next decade, supporting previous reports that some invasive species may contract their geographic range and impact due to climate change. More generally, our results highlight the more informative character of the combined approach for addressing practical issues in conservation and management programs, especially those aimed at mitigating the impact of invasive plants, land use and climate change in sensitive regions.
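The core of the CPM idea is fitting two SDMs on predictor subsets acting at different scales and combining their outputs. The sketch below illustrates that with two logistic SDMs and a simple product rule on toy data; the combination rule and data are illustrative choices, not necessarily the authors' exact scheme.

```python
# Sketch of the combined predictive modelling (CPM) idea: one SDM is fitted
# on regional-scale predictors (e.g. climate), a second on local-scale
# predictors (e.g. land use), and their suitability predictions are combined.
# The product rule and the toy data are illustrative choices only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 400
regional = rng.normal(size=(n, 2))     # e.g. temperature, precipitation
local = rng.normal(size=(n, 2))        # e.g. land cover, soil indices
presence = ((regional[:, 0] + local[:, 0] + rng.normal(0, 1, n)) > 0).astype(int)

sdm_regional = LogisticRegression().fit(regional, presence)
sdm_local = LogisticRegression().fit(local, presence)

p_regional = sdm_regional.predict_proba(regional)[:, 1]
p_local = sdm_local.predict_proba(local)[:, 1]
combined_suitability = p_regional * p_local     # both filters must be favourable

# Conflict areas would be flagged by repeating this for the rare native and
# the alien invader and intersecting the two suitability maps.
print("mean combined suitability:", combined_suitability.mean().round(3))
```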

Relevance:

30.00%

Publisher:

Abstract:

Background: The imatinib trough plasma concentration (C(min)) correlates with clinical response in cancer patients. Therapeutic drug monitoring (TDM) of plasma C(min) is therefore suggested. In practice, however, blood sampling for TDM is often not performed at trough. The corresponding measurement is thus only remotely informative about C(min) exposure. Objectives: The objectives of this study were to improve the interpretation of randomly measured concentrations by using a Bayesian approach for the prediction of C(min), incorporating the correlation between pharmacokinetic parameters, and to compare the predictive performance of this method with alternative approaches, by comparing predictions with actual measured trough levels and with predictions obtained by a reference method, respectively. Methods: A Bayesian maximum a posteriori (MAP) estimation method accounting for correlation (MAP-ρ) between pharmacokinetic parameters was developed on the basis of a population pharmacokinetic model, which was validated on external data. Thirty-one paired random and trough levels, observed in gastrointestinal stromal tumour patients, were then used for the evaluation of the Bayesian MAP-ρ method: individual C(min) predictions, derived from single random observations, were compared with actual measured trough levels for assessment of predictive performance (accuracy and precision). The method was also compared with alternative approaches: classical Bayesian MAP estimation assuming uncorrelated pharmacokinetic parameters, linear extrapolation along the typical elimination constant of imatinib, and non-linear mixed-effects modelling (NONMEM) first-order conditional estimation (FOCE) with interaction. Predictions of all methods were finally compared with 'best-possible' predictions obtained by a reference method (NONMEM FOCE, using both random and trough observations for individual C(min) prediction). Results: The developed Bayesian MAP-ρ method accounting for correlation between pharmacokinetic parameters allowed unbiased prediction of imatinib C(min) with a precision of ±30.7%. This predictive performance was similar for the alternative methods applied. The range of relative prediction errors was, however, smallest for the Bayesian MAP-ρ method and largest for the linear extrapolation method. When compared with the reference method, predictive performance was comparable for all methods. The time interval between random and trough sampling did not influence the precision of the Bayesian MAP-ρ predictions. Conclusion: The clinical interpretation of randomly measured imatinib plasma concentrations can be assisted by Bayesian TDM. Classical Bayesian MAP estimation can be applied even without consideration of the correlation between pharmacokinetic parameters. Individual C(min) predictions are expected to vary less with Bayesian TDM than with linear extrapolation. Bayesian TDM could be developed in the future for other targeted anticancer drugs and for the prediction of other pharmacokinetic parameters that have been correlated with clinical outcomes.
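The sketch below illustrates the core of a MAP estimation with a correlated parameter prior, as described for the MAP-ρ approach: given one randomly timed concentration, individual clearance and volume are estimated by minimising a simplified negative log posterior and the trough is then predicted. Population values, error terms and the one-compartment structure are illustrative assumptions, not the study's model.

```python
# Sketch of Bayesian MAP estimation with a correlated parameter prior: given
# one randomly timed imatinib concentration, individual clearance and volume
# are estimated and the trough is predicted. Values are placeholders, and
# constant likelihood terms are omitted; this is not the study's model.
import numpy as np
from scipy.optimize import minimize

# Population (prior) parameters: typical values, variabilities and correlation.
pop_cl, pop_v = 14.0, 350.0             # L/h, L (placeholders)
omega_cl, omega_v, rho = 0.35, 0.30, 0.6
cov = np.array([[omega_cl**2, rho * omega_cl * omega_v],
                [rho * omega_cl * omega_v, omega_v**2]])
cov_inv = np.linalg.inv(cov)
sigma_prop = 0.25                       # proportional residual error

dose, tau, ka, F = 400.0, 24.0, 0.6, 1.0
t_obs, c_obs = 6.0, 2.1                 # hours after dose, mg/L (random sample)

def conc(t, cl, v):
    """Steady-state one-compartment oral model concentration (mg/L)."""
    ke = cl / v
    return (F * dose * ka / (v * (ka - ke))
            * (np.exp(-ke * t) / (1 - np.exp(-ke * tau))
               - np.exp(-ka * t) / (1 - np.exp(-ka * tau))))

def neg_log_posterior(eta):
    """eta = individual log-deviations from the population CL and V."""
    cl, v = pop_cl * np.exp(eta[0]), pop_v * np.exp(eta[1])
    pred = conc(t_obs, cl, v)
    residual = ((c_obs - pred) / (sigma_prop * pred)) ** 2  # weighted residual, constants omitted
    prior = eta @ cov_inv @ eta                             # correlated prior term
    return residual + prior

eta_map = minimize(neg_log_posterior, x0=[0.0, 0.0]).x
cl_i, v_i = pop_cl * np.exp(eta_map[0]), pop_v * np.exp(eta_map[1])
print(f"predicted Cmin: {conc(tau, cl_i, v_i) * 1000:.0f} ng/mL")
```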

Relevance:

30.00%

Publisher:

Abstract:

The Internet is increasingly used as a source of information on health issues and is probably a major source of patient empowerment. This process is, however, limited by the frequently poor quality of web-based health information designed for consumers. A better diffusion of information about the criteria defining the quality of website content, and about useful methods for searching for such information, could be particularly useful to patients and their relatives. A brief, six-item DISCERN version, characterized by a high specificity for detecting websites with good or very good content quality, was recently developed. This tool could facilitate the identification of high-quality information on the web by patients and may improve the empowerment process initiated by the development of the health-related web.

Relevance:

30.00%

Publisher:

Abstract:

QUESTIONS UNDER STUDY: Our aim was to identify the barriers young men face in consulting a health professional when they encounter sexual dysfunctions and, when they do seek answers, where they turn to. METHODS: We conducted an exploratory qualitative study including 12 young men aged 16-20 years, seen in two focus groups. Discussions were triggered by vignettes about sexual dysfunction. RESULTS: Young men preferred not to talk about sexual dysfunction problems with anyone and to solve them alone, as these are considered an intimate and embarrassing subject that can negatively impact their masculinity. Confidentiality appeared to be the most important criterion in disclosing an intimate subject to a health professional. Participants raised the problem of young men's access to services and their lack of reasons to consult. Two criteria for addressing the problem were whether it was long-lasting or considered physical. The Internet was unanimously considered an initial means of solving a problem, which could guide them to a face-to-face consultation if necessary. CONCLUSIONS: The results suggest that Internet-based tools should be developed to become an easily accessible door to sexual health services for young men. Wherever they consult and for whatever problem, sexual health must be on the agenda.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The Internet is increasingly used as a source of information for mental health issues. The burden of obsessive compulsive disorder (OCD) may lead persons with diagnosed or undiagnosed OCD, and their relatives, to search for good quality information on the Web. This study aimed to evaluate the quality of Web-based information on English-language sites dealing with OCD and to compare the quality of websites found through a general and a medically specialized search engine. METHODS: Keywords related to OCD were entered into Google and OmniMedicalSearch. Websites were assessed on the basis of accountability, interactivity, readability, and content quality. The "Health on the Net" (HON) quality label and the Brief DISCERN scale score were used as possible content quality indicators. Of the 235 links identified, 53 websites were analyzed. RESULTS: The content quality of the OCD websites examined was relatively good. The use of a specialized search engine did not offer an advantage in finding websites with better content quality. A score ≥16 on the Brief DISCERN scale is associated with better content quality. CONCLUSION: This study shows the acceptability of the content quality of OCD websites. There is no advantage in searching for information with a specialized search engine rather than a general one. Practical implications: The Internet offers a number of high quality OCD websites. It remains critical, however, to have a provider-patient talk about the information found on the Web.

Relevance:

30.00%

Publisher:

Abstract:

The paper describes how to integrate audience measurement and site visibility, the main research approaches in outdoor advertising research, into a single concept. Details are given on how GPS is used on a large scale in Switzerland for mobility analysis and audience measurement. Furthermore, the development of a software solution is introduced that allows the integration of all mobility data and poster location information. Finally, a model and its results are presented for the calculation of the coverage of individual poster campaigns and of the number of contacts generated by each billboard.
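To make the coverage and contact calculations concrete, the sketch below counts, from synthetic GPS traces, how many distinct persons pass within a visibility radius of any billboard (coverage) and how many passages each billboard collects (contacts); radius, data and the point-based exposure definition are hypothetical assumptions, not the paper's model.

```python
# Illustrative sketch of coverage and contact computations from GPS traces:
# coverage = share of distinct persons passing within a visibility radius of
# any poster of the campaign; contacts = exposed trace points per billboard.
# Radius, data and exposure definition are hypothetical.
import numpy as np

rng = np.random.default_rng(6)
n_persons, points_per_person = 100, 50
# GPS points per person (metres in a local projection).
traces = rng.uniform(0, 5000, size=(n_persons, points_per_person, 2))
billboards = np.array([[1000.0, 1000.0], [3000.0, 2500.0], [4500.0, 500.0]])
visibility_radius = 50.0  # metres

# Distance of every trace point to every billboard: persons x points x billboards.
d = np.linalg.norm(traces[:, :, None, :] - billboards[None, None, :, :], axis=-1)
seen = d <= visibility_radius

contacts_per_billboard = seen.sum(axis=(0, 1))   # exposed trace points per poster
reached = seen.any(axis=(1, 2))                  # was each person reached at all?
coverage = reached.mean()

print("contacts per billboard:", contacts_per_billboard)
print(f"campaign coverage: {coverage:.1%}")
```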

Relevance:

30.00%

Publisher:

Abstract:

This work defines the life cycle and the methodology to be followed, according to user-centred design, for a web project aimed at content specifically for elderly people. To this end, it determines and justifies which usability evaluation methods and techniques are the most appropriate for designing a website of this kind.

Relevance:

30.00%

Publisher:

Abstract:

Aim, Location Although the alpine mouse Apodemus alpicola has been given species status since 1989, no distribution map has ever been constructed for this endemic alpine rodent in Switzerland. Based on re-determined museum material and using Ecological-Niche Factor Analysis (ENFA), habitat-suitability maps were computed for A. alpicola, and also for the co-occurring A. flavicollis and A. sylvaticus. Methods In the particular case of habitat suitability models, classical approaches (GLMs, GAMs, discriminant analysis, etc.) generally require presence and absence data. The presence records provided by museums can clearly give useful information about species distribution and ecology and have already been used for knowledge-based mapping. In this paper, we apply the ENFA, which requires only presence data, to build a habitat-suitability map of three species of Apodemus on the basis of museum skull collections. Results Interspecific niche comparisons showed that A. alpicola is very specialized with regard to habitat selection, meaning that its habitat differs unequivocally from the average conditions in Switzerland, while both A. flavicollis and A. sylvaticus can be considered 'generalists' in the study area. Main conclusions Although an adequate sampling design is the best way to collect ecological data for predictive modelling, this is a time-consuming and costly process, and there are cases where time is simply not available, as for instance in endangered species conservation. On the other hand, museums, herbariums and other similar institutions hold huge presence data sets. By applying the ENFA to such data it is possible to rapidly construct a habitat-suitability model. The ENFA method not only provides two key measurements of the niche of a species (i.e. marginality and specialization), but also has ecological meaning and allows the scientist to compare directly the niches of different species.
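For readers unfamiliar with the two ENFA summary measures mentioned, the sketch below computes simple per-variable versions of marginality and specialization from presence-only records against the statistics of the whole study area; it uses synthetic data and does not reproduce the full factor analysis of the ENFA.

```python
# Minimal illustration of the two ENFA summary measures mentioned:
# marginality (how far the species' mean habitat lies from the average
# conditions of the study area) and specialization (how narrow its niche is
# relative to the global variability). Per-variable summaries only, on
# synthetic data; not the full factor analysis.
import numpy as np

rng = np.random.default_rng(7)
# Environmental values over the whole study area (e.g. elevation, forest cover).
global_env = rng.normal(loc=[1200.0, 0.4], scale=[600.0, 0.2], size=(10000, 2))
# Environmental values at presence records (e.g. museum localities): higher,
# more forested sites with little spread.
presence_env = rng.normal(loc=[2000.0, 0.6], scale=[150.0, 0.05], size=(80, 2))

global_mean, global_sd = global_env.mean(axis=0), global_env.std(axis=0)
species_mean, species_sd = presence_env.mean(axis=0), presence_env.std(axis=0)

marginality = np.abs(species_mean - global_mean) / global_sd   # large -> atypical habitat
specialization = global_sd / species_sd                        # large -> narrow niche

print("marginality   :", marginality.round(2))
print("specialization:", specialization.round(2))
```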