75 results for Native Vegetation Condition, Benchmarking, Bayesian Decision Framework, Regression, Indicators


Relevance:

30.00%

Publisher:

Abstract:

In occupational exposure assessment of airborne contaminants, exposure levels can be estimated through repeated measurements of the pollutant concentration in air, through expert judgment, or through exposure models that use information on the conditions of exposure as input. In this report, we propose an empirical hierarchical Bayesian model to unify these approaches. Prior to any measurement, the hygienist conducts an assessment to generate prior distributions of exposure determinants. Monte Carlo samples from these distributions feed two level-2 models: a physical, two-compartment model and a non-parametric, neural network model trained on existing exposure data. The outputs of these two models are weighted according to the expert's assessment of their relevance to yield predictive distributions of the long-term geometric mean and geometric standard deviation of the worker's exposure profile (the level-1 model). Bayesian inferences are then drawn iteratively from subsequent measurements of worker exposure. Any traditional decision strategy based on a comparison with occupational exposure limits (e.g. mean exposure or exceedance strategies) can then be applied. Data on 82 workers exposed to 18 contaminants in 14 companies were used to validate the model with cross-validation techniques. A user-friendly program running the model is available upon request.
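As a rough illustration of the weighting step described above, the following sketch draws Monte Carlo samples of two exposure determinants from lognormal priors, pushes them through a stand-in physical model and a stand-in data-driven model, and mixes the two outputs with an expert weight. Every functional form and number here is a hypothetical placeholder, not the report's actual model:

```python
import math
import random

random.seed(0)

def physical_model(emission_rate, ventilation):
    # Toy two-compartment stand-in: concentration ~ emission / ventilation.
    return emission_rate / ventilation

def data_driven_model(emission_rate, ventilation):
    # Stand-in for the trained neural network: a different empirical mapping.
    return 0.8 * emission_rate / ventilation + 0.05

def predictive_gm_samples(n=10000, w_physical=0.6):
    # Monte Carlo samples of exposure determinants drawn from (assumed)
    # lognormal priors elicited by the hygienist.
    draws = []
    for _ in range(n):
        e = random.lognormvariate(math.log(2.0), 0.4)   # emission rate
        q = random.lognormvariate(math.log(10.0), 0.3)  # ventilation rate
        # Expert-weighted mixture of the two level-2 models.
        if random.random() < w_physical:
            draws.append(physical_model(e, q))
        else:
            draws.append(data_driven_model(e, q))
    return draws

gm_draws = predictive_gm_samples()  # predictive distribution of the geometric mean
```

In a real application the mixture weight would come from the hygienist's relevance assessment, and the resulting draws would serve as the prior that subsequent exposure measurements update.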

Relevance:

30.00%

Publisher:

Abstract:

A wide range of numerical models and tools has been developed over the last decades to support decision making in environmental applications, ranging from physical models to a variety of statistically based methods. In this study, a landslide susceptibility map of part of the Three Gorges Reservoir region of China was produced using binary logistic regression analysis. The available information includes a digital elevation model of the region, a geological map and several GIS layers, including land cover data obtained from satellite imagery. The landslides were observed and documented during field studies. A validation analysis was carried out to assess the quality of the mapping.
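Binary logistic regression of the kind used for such susceptibility maps can be sketched in a few lines. The predictor values and labels below are invented toy data (slope and vegetation cover standing in for the DEM and land-cover layers):

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    # Plain gradient-descent fit of a binary logistic regression:
    # P(landslide | x) = sigmoid(b0 + b1*slope + b2*cover).
    n_feat = len(X[0])
    w = [0.0] * (n_feat + 1)  # intercept + coefficients
    for _ in range(epochs):
        grad = [0.0] * (n_feat + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = sigmoid(z) - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

# Hypothetical grid cells: (slope in degrees / 10, vegetation cover fraction),
# label 1 = landslide observed, 0 = stable.
X = [(0.5, 0.9), (3.0, 0.2), (2.8, 0.1), (0.8, 0.7), (3.5, 0.3), (1.0, 0.8)]
y = [0, 1, 1, 0, 1, 0]
w = fit_logistic(X, y)

p_steep_bare = sigmoid(w[0] + w[1] * 3.2 + w[2] * 0.15)   # steep, bare cell
p_gentle_green = sigmoid(w[0] + w[1] * 0.6 + w[2] * 0.85)  # gentle, vegetated cell
```

Mapping the fitted probability over every cell of the study area is what yields the susceptibility map; the validation step then compares predicted probabilities against the documented landslide inventory.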

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making, yet its use in clinical practice remains low. High-impact-factor journals might represent an efficient channel for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high-impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factors that publish original articles, letters and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text; these are referred to as SDM publications. A polynomial Poisson regression model with a logarithmic link function was used to assess how the number of SDM publications evolved over the period according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in the 15 journals from 1996 to 2011. The absolute number of SDM publications per journal ranged from 2 to 273 over the 16 years. SDM publications increased both in absolute and in relative numbers per year, from 46 (0.32% of all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, or 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials. The increase in research publications across time was linear. Full-text search retrieved ten times more SDM publications than a comparable PubMed search (1285 vs. 119). CONCLUSION: This full-text review showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. This growth might reflect increased dissemination of the SDM concept to the medical community.
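A Poisson regression with a logarithmic link, as used here, models the log of the expected count as a linear function of time, so a constant positive slope corresponds to exponential growth. The sketch below fits such a model by gradient ascent on the Poisson log-likelihood; only the 1996 and 2011 counts (46 and 165) come from the abstract, and the intermediate yearly counts are invented to follow a smooth trend:

```python
import math

# Illustrative yearly SDM publication counts. Only the endpoints (46 in 1996,
# 165 in 2011) come from the abstract; the in-between values are made up.
counts = [46, 50, 55, 60, 65, 71, 77, 84, 91, 99, 108, 117, 128, 139, 151, 165]
t = [i / 15 for i in range(16)]  # years 1996..2011 rescaled to [0, 1]

def fit_poisson_loglinear(t, y, lr=1e-3, epochs=20000):
    # Gradient ascent on the Poisson log-likelihood with a log link:
    # log E[y] = b0 + b1 * t, so a constant b1 means exponential growth.
    b0, b1 = math.log(y[0]), 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for ti, yi in zip(t, y):
            mu = math.exp(b0 + b1 * ti)
            g0 += yi - mu
            g1 += (yi - mu) * ti
        b0 += lr * g0 / len(t)
        b1 += lr * g1 / len(t)
    return b0, b1

b0, b1 = fit_poisson_loglinear(t, counts)
annual_growth = math.exp(b1 / 15)  # multiplicative growth per calendar year
```

With these illustrative counts the fitted growth factor is roughly 9% per year, which is the kind of rate that takes 46 publications to about 165 over fifteen years.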

Relevance:

30.00%

Publisher:

Abstract:

Background: The 'database search problem', that is, the strengthening of a case - in terms of probative value - against an individual who is found through a database search, has been approached over the last two decades with substantial mathematical analyses, accompanied by lively debate and diametrically opposed conclusions. This represents a challenging obstacle in teaching and also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. The derivation of networks that capture the probabilistic arguments underlying the database search problem is outlined in detail; the resulting networks offer a distinct, clarifying view of the main debated issues. Methods: As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular Bayesian networks. This graphical modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results: This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing, exclusively formulaic approaches. Conclusions: The proposed Bayesian networks make it possible to capture and analyze the currently best-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond traditional, purely formulaic expressions. The method's graphical environment, along with its computational and probabilistic architecture, offers analysts and discussants additional modes of interaction, concise representation, and coherent communication.
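The core probabilistic argument behind the database search problem can be reproduced without any Bayesian network software by exact enumeration over a small set of hypotheses. The numbers below (population size, database size, random match probability) are purely illustrative:

```python
# Tiny exact-enumeration version of the database-search calculation.
# Assumptions (hypothetical numbers): a population of N potential sources,
# a database holding n of them, and a random-match probability gamma for
# a non-source. Observed event: exactly one database member matches the stain.

N = 1000       # size of the population of potential sources
n = 100        # database size
gamma = 0.001  # probability that a non-source matches by chance

prior = 1.0 / N  # each individual equally likely to be the source a priori

# Likelihood of the data ("only individual i in the database matches") under
# the two hypotheses that have non-zero likelihood (assuming no false
# negatives, a source inside the database who failed to match is impossible):
lik_source = (1 - gamma) ** (n - 1)              # i is the source
lik_not_source = gamma * (1 - gamma) ** (n - 1)  # i matched by chance,
                                                 # source is outside the database

# Posterior that the matched database member is the true source.
num = prior * lik_source
den = num + (N - n) * prior * lik_not_source
posterior = num / den
```

The (1 - gamma)^(n-1) factor cancels, leaving posterior = 1 / (1 + (N - n) * gamma): the n - 1 observed non-matches exclude part of the population and so strengthen, not weaken, the case against the matching individual, which is exactly the counter-intuitive result the debated analyses converge on.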

Relevance:

30.00%

Publisher:

Abstract:

Ubiquitous assessment of swimming velocity (the main performance metric) is essential for coaches to provide tailored feedback to trainees. We present a probabilistic framework for data-driven estimation of the swimming velocity at every cycle using a low-cost wearable inertial measurement unit (IMU). Statistical validation of the method on 15 swimmers shows that an average relative error of 0.1 ± 9.6% and a high correlation with the tethered reference system (r = 0.91) are achievable. In addition, a simple tool to analyze the influence of sacrum kinematics on performance is provided.

Relevance:

30.00%

Publisher:

Abstract:

Rare species have restricted geographic ranges, habitat specialization, and/or small population sizes. Datasets on rare species distributions usually have few observations, limited spatial accuracy and no valid absences; conversely, they provide comprehensive views of species distributions, allowing most of their realized environmental niche to be captured realistically. Rare species are thus the most in need of predictive distribution modelling but also the most difficult to model. We refer to this contrast as the "rare species modelling paradox" and propose as a solution modelling approaches that can handle a sufficiently large set of predictors while ensuring that the statistical models are not overfitted. Our novel approach fulfils this condition by fitting a large number of bivariate models and averaging them with a weighted ensemble approach. We further propose that this ensemble forecasting be conducted within a hierarchical multi-scale framework. We present two ensemble models for a test species, one at regional and one at local scale, each based on the combination of 630 models. In both cases, we obtained excellent spatial projections, which is unusual when modelling rare species. The model results highlight, within a statistically sound approach, the effects of multiple drivers in the same modelling framework and at two distinct scales. With this added information, regional models can support accurate forecasts of range dynamics under climate change scenarios, whereas local models allow the assessment of isolated or synergistic impacts of changes in multiple predictors. This novel framework provides a baseline for adaptive conservation, management and monitoring of rare species at distinct spatial and temporal scales.
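The bivariate-ensemble idea can be sketched as follows: fit a logistic model for every pair of predictors, score each model, and average the predicted probabilities with the scores as weights. The data, the in-sample accuracy score and all settings below are invented stand-ins, far simpler than the 630-model ensembles described above:

```python
import itertools
import math
import random

random.seed(2)

# Synthetic presence/absence data with 4 predictors (a stand-in for the
# rare-species case; only predictors 0 and 1 actually matter here).
X, y = [], []
for _ in range(150):
    x = [random.gauss(0, 1) for _ in range(4)]
    p = 1 / (1 + math.exp(-(1.5 * x[0] - 1.2 * x[1])))
    X.append(x)
    y.append(1 if random.random() < p else 0)

def fit_bivariate_logistic(i, j, lr=0.5, epochs=400):
    # Logistic model restricted to the predictor pair (i, j).
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        g = [0.0, 0.0, 0.0]
        for x, t in zip(X, y):
            p = 1 / (1 + math.exp(-(w[0] + w[1] * x[i] + w[2] * x[j])))
            e = p - t
            g = [g[0] + e, g[1] + e * x[i], g[2] + e * x[j]]
        w = [wk - lr * gk / len(X) for wk, gk in zip(w, g)]
    return w

def predict(w, i, j, x):
    return 1 / (1 + math.exp(-(w[0] + w[1] * x[i] + w[2] * x[j])))

# Fit every bivariate model, score it (in-sample accuracy here, for brevity),
# and use the scores as ensemble weights.
models = []
for i, j in itertools.combinations(range(4), 2):
    w = fit_bivariate_logistic(i, j)
    acc = sum((predict(w, i, j, x) > 0.5) == t for x, t in zip(X, y)) / len(X)
    models.append((acc, w, i, j))

def ensemble_predict(x):
    tot = sum(a for a, *_ in models)
    return sum(a * predict(w, i, j, x) for a, w, i, j in models) / tot

p_good = ensemble_predict([2.0, -2.0, 0.0, 0.0])  # favourable conditions
p_bad = ensemble_predict([-2.0, 2.0, 0.0, 0.0])   # unfavourable conditions
```

Because every component model sees only two predictors, no single model can overfit a large predictor set, while the weighted average still lets the informative pairs dominate the ensemble.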

Relevance:

30.00%

Publisher:

Abstract:

Ground-penetrating radar (GPR) has the potential to provide valuable information on hydrological properties of the vadose zone because of its strong sensitivity to soil water content. In particular, recent evidence has suggested that the stochastic inversion of crosshole GPR data within a coupled geophysical-hydrological framework may allow for effective estimation of subsurface van Genuchten-Mualem (VGM) parameters and their corresponding uncertainties. An important and still unresolved issue, however, is how best to integrate GPR data into a stochastic inversion in order to estimate the VGM parameters and their uncertainties and thereby improve hydrological predictions. Recognizing the importance of this issue, the research presented in this thesis first introduces a fully Bayesian, Markov-chain-Monte-Carlo (MCMC) strategy to perform the stochastic inversion of steady-state GPR data to estimate the VGM parameters and their uncertainties. Within this study, the choice of the prior parameter probability distributions, from which potential model configurations are drawn and tested against observed data, was also investigated. Analysis of both synthetic data and field data collected at the Eggborough (UK) site indicates that the geophysical data alone contain valuable information regarding the VGM parameters; however, significantly better results are obtained when these data are combined with a realistic, informative prior. A subsequent study explores the dynamic infiltration case in detail, specifically the extent to which time-lapse ZOP GPR data, collected during a forced infiltration experiment at the Arrenaes field site (Denmark), can help to quantify VGM parameters and their uncertainties using the MCMC inversion strategy. The findings indicate that the stochastic inversion of time-lapse GPR data does indeed allow for a substantial refinement of the inferred posterior VGM parameter distributions. In turn, this significantly improves knowledge of the hydraulic properties that are required to predict hydraulic behaviour. Finally, another aspect that needed to be addressed was the comparison of time-lapse GPR data collected under different infiltration conditions (i.e., natural loading and forced infiltration) for estimating the VGM parameters with the MCMC inversion strategy. The results show that, for the synthetic example, data collected during a forced infiltration test help to refine the soil hydraulic properties better than data collected under natural infiltration conditions. When investigating data collected at the Arrenaes field site, further complications arose due to model error, demonstrating the importance of including a rigorous analysis of the propagation of model error with time and depth when considering time-lapse data. Although the efforts in this thesis focused on GPR data, the corresponding findings are likely to have general applicability to other types of geophysical data and field environments. Moreover, the results obtained inspire confidence in future developments in the integration of geophysical data with stochastic inversions to improve the characterization of the unsaturated zone, but they also reveal important issues linked with stochastic inversions, namely model errors, that should be addressed in future research.
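The MCMC machinery behind such a stochastic inversion can be illustrated with a one-parameter toy problem: a Metropolis-Hastings random walk that recovers a single parameter of a hypothetical forward model from noisy synthetic data. The forward model, prior and noise level are placeholders, not the thesis' GPR/VGM setup:

```python
import math
import random

random.seed(3)

def forward(theta, t):
    # Toy forward model standing in for the geophysical simulator.
    return math.exp(-theta * t)

# Synthetic "observed" data generated from a known parameter plus noise.
true_theta, sigma = 0.7, 0.02
times = [0.5 * k for k in range(1, 11)]
data = [forward(true_theta, t) + random.gauss(0, sigma) for t in times]

def log_post(theta):
    if not 0.0 < theta < 5.0:          # uniform prior on (0, 5)
        return -math.inf
    sse = sum((d - forward(theta, t)) ** 2 for d, t in zip(data, times))
    return -sse / (2 * sigma ** 2)     # Gaussian likelihood, flat prior

theta, lp = 2.0, log_post(2.0)         # deliberately poor starting point
samples = []
for step in range(20000):
    prop = theta + random.gauss(0, 0.05)          # random-walk proposal
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    if step >= 5000:                              # discard burn-in
        samples.append(theta)

posterior_mean = sum(samples) / len(samples)
```

The retained samples approximate the posterior distribution of the parameter, so their spread is a direct uncertainty estimate, which is the property that makes the stochastic inversion attractive for the VGM parameters.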

Relevance:

30.00%

Publisher:

Abstract:

The development of forensic intelligence relies on the expression of suitable models that better represent the contribution of forensic intelligence to the criminal justice system, policing and security. Such models assist in comparing and evaluating methods and new technologies, provide transparency and foster the development of new applications. Interestingly, strong similarities between two separate projects focusing on specific forensic science areas were recently observed. These observations led to the induction of a general model (Part I) that could guide the use of any forensic science case data in an intelligence perspective. The present article builds upon this general approach by focusing on decisional and organisational issues. It investigates the comparison process and evaluation system that lie at the heart of the forensic intelligence framework, advocating scientific decision criteria and a structured but flexible and dynamic architecture. These building blocks are crucial and clearly lie within the expertise of forensic scientists; however, they are only part of the problem. Forensic intelligence includes other blocks with their respective interactions, decision points and tensions (e.g. regarding how to guide detection and how to integrate forensic information with other information). Formalising these blocks identifies many questions and potential answers. Addressing these questions is essential for the progress of the discipline, and doing so requires clarifying the role and place of the forensic scientist within the whole process and their relationship to other stakeholders.

Relevance:

30.00%

Publisher:

Abstract:

Discussions at the inaugural meeting of a Trans-European Pedagogic Research Group for Anatomical Sciences highlighted the fact that considerable variations exist in the legal and ethical frameworks throughout Europe concerning body bequests for anatomical examination. Such differences appear to reflect cultural and religious variations as well as different legal and constitutional frameworks. For example, there are different views concerning the "ownership" of cadavers and concerning the need (as perceived by different societies and national politicians) for legislation specifically related to anatomical dissection. Furthermore, there are different views concerning the acceptability of using unclaimed bodies for which informed consent has not been given. Given that in Europe there have been a series of controversial anatomical exhibitions as well as a public (televised) dissection/autopsy, and given that the commercial sale or transport of anatomical material across national boundaries is strongly debated, it would seem appropriate to "harmonise" the situation (at least in the European Union). This paper summarises the legal situation in a variety of European countries and suggests examples of good practice. In particular, it recommends that all countries adopt clear legal frameworks to regulate the acceptance of donations for medical education and research. It stresses the need for informed consent, with donors being given clear information upon which to base their decision and with intentions to bequeath being declared by the donor before death, and it encourages donors to discuss their wishes to bequeath with relatives prior to death. Departments are encouraged, where they feel it appropriate, to hold Services of Thanksgiving and Commemoration for those who have donated their bodies. Finally, legislation is needed to regulate the transport of bodies or body parts across national borders, and any moves towards commercialisation in relation to bequests should be discouraged.

Relevance:

30.00%

Publisher:

Abstract:

Almost 30 years ago, Bayesian networks (BNs) were developed in the field of artificial intelligence as a framework to assist researchers and practitioners in applying probability theory to inference problems of more substantive size and, thus, to more realistic and practical problems. Since the late 1980s, Bayesian networks have also attracted researchers in forensic science, and this tendency has intensified considerably throughout the last decade. This review article provides an overview of the scientific literature describing research on Bayesian networks as a tool for studying, developing and implementing probabilistic procedures to evaluate the probative value of particular items of scientific evidence in forensic science. Primary attention is given to evaluative issues pertaining to forensic DNA profiling evidence, because this is one of the main categories of evidence whose assessment has been studied through Bayesian networks. The scope of topics is large and includes almost any aspect that relates to forensic DNA profiling. Typical examples are inference of source (or 'criminal identification'), relatedness testing, database searching and special trace evidence evaluation (such as mixed DNA stains or stains with low quantities of DNA). The perspective of this review is not restricted exclusively to DNA evidence; it also includes relevant references and discussion on both the concept of Bayesian networks and their general usage in legal sciences as one among several graphical approaches to evidence evaluation.

Relevance:

30.00%

Publisher:

Abstract:

The Baldwin effect can be observed if phenotypic learning influences the evolutionary fitness of individuals, which can in turn accelerate or decelerate evolutionary change. Evidence for both learning-induced acceleration and deceleration can be found in the literature. Although the results for both outcomes were supported by specific mathematical or simulation models, no general predictions have been achieved so far. Here we propose a general framework to predict whether evolution benefits from learning or not. It is formulated in terms of the gain function, which quantifies the proportional change of fitness due to learning depending on the genotype value. With an inductive proof we show that a positive gain-function derivative implies that learning accelerates evolution, and a negative one implies deceleration under the condition that the population is distributed on a monotonic part of the fitness landscape. We show that the gain-function framework explains the results of several specific simulation models. We also use the gain-function framework to shed some light on the results of a recent biological experiment with fruit flies.
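The gain-function prediction can be checked numerically with a one-generation selection experiment: reweight a population distributed on a monotone fitness landscape, with learning entering as a multiplicative gain factor of the genotype value. All functional forms below are illustrative choices, not the paper's models:

```python
import math
import random

random.seed(4)

# Fitness without learning is f(x) = exp(x), a monotone landscape. Learning
# multiplies fitness by (1 + g(x)), where g is the gain function. The
# prediction: g'(x) > 0 accelerates the selection response, g'(x) < 0 slows it.

def mean_after_selection(pop, gain):
    # One generation of fitness-proportionate selection: the post-selection
    # mean genotype is the fitness-weighted average of the population.
    w = [math.exp(x) * (1 + gain(x)) for x in pop]
    tot = sum(w)
    return sum(x * wi for x, wi in zip(pop, w)) / tot

pop = [random.gauss(0.0, 0.5) for _ in range(20000)]
base = sum(pop) / len(pop)

shift_no_learning = mean_after_selection(pop, lambda x: 0.0) - base
shift_incr_gain = mean_after_selection(pop, lambda x: 0.5 * (1 + math.tanh(x))) - base
shift_decr_gain = mean_after_selection(pop, lambda x: 0.5 * (1 - math.tanh(x))) - base
```

The increasing gain function tilts the selection weights further toward high genotype values and so enlarges the per-generation shift of the mean (acceleration), while the decreasing gain function shrinks it (deceleration), matching the sign rule stated above.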

Relevance:

30.00%

Publisher:

Abstract:

Testosterone abuse is conventionally assessed by the urinary testosterone/epitestosterone (T/E) ratio, levels above 4.0 being considered suspicious. A deletion polymorphism in the gene coding for UGT2B17 is strongly associated with reduced testosterone glucuronide (TG) levels in urine. Many individuals devoid of the gene would not reach a T/E ratio of 4.0 after testosterone intake. Future testing programs will most likely shift from population-based to individual-based T/E cut-off ratios using Bayesian inference. Such a longitudinal analysis depends on an individual's true negative baseline T/E ratio. The aim was to investigate whether it is possible to increase the sensitivity and specificity of the T/E test by adding UGT2B17 genotype information in a Bayesian framework. A single intramuscular dose of 500 mg testosterone enanthate was given to 55 healthy male volunteers with two, one or no alleles (ins/ins, ins/del or del/del) of the UGT2B17 gene. Urinary excretion of TG and the T/E ratio were measured over 15 days. A Bayesian analysis was conducted to calculate each individual's T/E cut-off ratio. When the genotype information was added, the program returned lower individual cut-off ratios for all del/del subjects, increasing the sensitivity of the test considerably. It will be difficult, if not impossible, to discriminate between a true negative baseline T/E value and a false negative one without knowledge of the UGT2B17 genotype. UGT2B17 genotype information is therefore crucial, both for deciding which initial cut-off ratio to use for an individual and for increasing the sensitivity of the Bayesian analysis.
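The kind of individual, genotype-informed cut-off described here can be sketched with a conjugate normal-normal model on log(T/E): the genotype shifts the prior, the athlete's baseline values update it, and the cut-off is an upper percentile of the posterior predictive distribution. All numbers below are invented for illustration and are not the study's actual priors:

```python
import math

def posterior(prior_mu, prior_sd, obs, obs_sd):
    # Conjugate update of N(prior_mu, prior_sd^2) with i.i.d. N(., obs_sd^2) data.
    n = len(obs)
    prec = 1 / prior_sd ** 2 + n / obs_sd ** 2
    mu = (prior_mu / prior_sd ** 2 + sum(obs) / obs_sd ** 2) / prec
    return mu, math.sqrt(1 / prec)

def cutoff(mu, sd, obs_sd, z=2.326):
    # 99th percentile of the posterior predictive for the next log(T/E),
    # mapped back to the T/E scale.
    return math.exp(mu + z * math.sqrt(sd ** 2 + obs_sd ** 2))

# Hypothetical athlete with three clean baseline T/E measurements.
obs = [math.log(v) for v in (0.4, 0.5, 0.45)]
obs_sd = 0.3  # assumed within-individual variability on the log scale

# Population-wide prior (genotype unknown) vs a del/del-specific prior with
# a lower mean, mimicking the reduced TG excretion of deletion carriers.
mu_pop, sd_pop = posterior(math.log(1.0), 1.0, obs, obs_sd)
mu_dd, sd_dd = posterior(math.log(0.4), 0.5, obs, obs_sd)

cut_pop = cutoff(mu_pop, sd_pop, obs_sd)
cut_dd = cutoff(mu_dd, sd_dd, obs_sd)
```

Both individual cut-offs land far below the population threshold of 4.0, and the genotype-informed prior pulls the cut-off lower still, which is the mechanism by which genotype information raises the test's sensitivity for del/del subjects.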

Relevance:

30.00%

Publisher:

Abstract:

Presented is an accurate swimming velocity estimation method using an inertial measurement unit (IMU) by employing a simple biomechanical constraint of motion along with Gaussian process regression to deal with sensor inherent errors. Experimental validation shows a velocity RMS error of 9.0 cm/s and high linear correlation when compared with a commercial tethered reference system. The results confirm the practicality of the presented method to estimate swimming velocity using a single low-cost, body-worn IMU.
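Gaussian process regression of the sort mentioned here can be written out compactly for a one-dimensional toy problem: build an RBF kernel matrix over the training inputs, solve for the weights, and predict at a new input. The data (a sine curve) and hyper-parameters are illustrative and unrelated to the IMU signals in the study:

```python
import math

def rbf(a, b, ell=1.0):
    # Squared-exponential (RBF) kernel with length scale ell.
    return math.exp(-0.5 * ((a - b) / ell) ** 2)

def solve(A, b):
    # Plain Gaussian elimination with partial pivoting.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-6):
    # GP posterior mean: k_*^T (K + noise*I)^{-1} y.
    K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(xs)]
         for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xi) * ai for xi, ai in zip(xs, alpha))

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [math.sin(x) for x in xs]
pred = gp_predict(xs, ys, 2.5)  # smooth interpolation between training points
```

In a velocity-estimation setting the training inputs would be IMU-derived features and the targets reference velocities from the tethered system; the GP then supplies a smooth, data-driven correction for the sensor's inherent errors.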

Relevance:

30.00%

Publisher:

Abstract:

Intensive agriculture, in which detrimental farming practices lessen food abundance and/or reduce food accessibility for many animal species, has led to a widespread collapse of farmland biodiversity. Vineyards in central and southern Europe are intensively cultivated; though they may still harbour several rare plant and animal species, they remain little studied. Over the past decades, there has been a considerable reduction in the application of insecticides in wine production, with a progressive shift to biological control (integrated production) and, to a lesser extent, organic production. Spraying of herbicides has also diminished, which has led to more vegetation cover on the ground, although most vineyards remain bare, especially in southern Europe. The effects of these potentially positive environmental trends upon biodiversity remain mostly unknown as regards vertebrates. The Woodlark (Lullula arborea) is an endangered, short-distance migratory bird that forages and breeds on the ground. In southern Switzerland (Valais), it occurs mostly in vineyards. We used radiotracking and mixed-effects logistic regression models to assess the Woodlark's response to modern vineyard farming practices, to study the factors driving foraging micro-habitat selection, and to determine the optimal habitat profile to inform management. The presence of ground vegetation cover was the main factor dictating the selection of foraging locations, with an optimum around 55% at the foraging patch scale. These conditions are met in integrated production vineyards, but only when grass is tolerated on part of the ground surface, which is the case on ca. 5% of the total Valais vineyard area. In contrast, conventionally managed vineyards, covering ≥95% of the vineyard area, are too bare because of the systematic application of herbicides over the whole ground surface, whilst the rare organic vineyards usually have a too-dense sward. The optimal mosaic with ca. 50% ground vegetation cover is currently achieved in integrated production vineyards where herbicide is applied every second row. In organic production, ca. 50% ground vegetation cover should be promoted, which requires regular mechanical removal of ground vegetation. These measures are likely to benefit overall biodiversity in vineyards.

Relevance:

30.00%

Publisher:

Abstract:

Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate them using simulation studies. Our comparative analysis uses methods including generalized least squares, spatial filters, wavelet revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some that violate standard regression assumptions. We assess the performance of each method using two measures, together with statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but its poorly controlled Type I error rates meant it did not show the improvements in performance under model selection seen with the methods above. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; hence extrapolation of these findings to other situations should be performed cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in the future.
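For one of the well-performing families, generalized least squares, the special case of AR(1)-correlated errors admits a simple whitening transform that avoids any matrix inversion. The sketch below simulates a one-dimensional transect with autocorrelated errors and compares ordinary least squares with this GLS estimate; the autocorrelation value and data-generating model are illustrative:

```python
import math
import random

random.seed(5)

# Simulate a transect: y = beta * x + e, with AR(1) spatially correlated errors.
n, rho, beta = 400, 0.8, 2.0
x = [random.gauss(0, 1) for _ in range(n)]
e = [random.gauss(0, 1)]
for _ in range(n - 1):
    e.append(rho * e[-1] + random.gauss(0, math.sqrt(1 - rho ** 2)))
y = [beta * xi + ei for xi, ei in zip(x, e)]

def ols_slope(xs, ys):
    # No-intercept OLS slope, sufficient for this zero-mean sketch.
    sxx = sum(v * v for v in xs)
    sxy = sum(a * b for a, b in zip(xs, ys))
    return sxy / sxx

beta_ols = ols_slope(x, y)

# GLS via the AR(1) (Prais-Winsten) whitening transform: scale the first
# observation by sqrt(1 - rho^2) and quasi-difference the rest, then run OLS
# on the transformed data, whose errors are uncorrelated.
xt = [math.sqrt(1 - rho ** 2) * x[0]] + [x[i] - rho * x[i - 1] for i in range(1, n)]
yt = [math.sqrt(1 - rho ** 2) * y[0]] + [y[i] - rho * y[i - 1] for i in range(1, n)]
beta_gls = ols_slope(xt, yt)
```

Both estimators are unbiased here, but the whitened (GLS) regression uses the known error structure and therefore delivers the lower-variance coefficient estimates and honest standard errors that the simulation comparison above rewards.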