906 results for Probabilistic estimation


Relevance: 20.00%

Abstract:

We describe the version of the GPT planner to be used in the planning competition. This version, called mGPT, solves MDPs specified in the PPDDL language by extracting and using different classes of lower bounds, along with various heuristic-search algorithms. The lower bounds are extracted from deterministic relaxations of the MDP in which the alternative probabilistic effects of an action are mapped into different, independent, deterministic actions. The heuristic-search algorithms, in turn, use these lower bounds to focus the updates and to deliver a consistent value function over all states reachable from the initial state under the greedy policy.
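
As a rough sketch of the determinization idea described here (the data structures are hypothetical and not mGPT's actual representation), each probabilistic outcome of an action becomes its own deterministic action, so that heuristics computed on the relaxation lower-bound the true expected costs:

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    name: str
    precondition: frozenset
    # each entry: (probability, add-effects, delete-effects)
    effects: list = field(default_factory=list)

def determinize(actions):
    """All-outcomes determinization: every probabilistic effect of an
    action becomes an independent deterministic action, so classical
    heuristics on the relaxation lower-bound the MDP costs."""
    det = []
    for a in actions:
        for i, (_, adds, dels) in enumerate(a.effects):
            det.append(Action(f"{a.name}_o{i}", a.precondition,
                              [(1.0, adds, dels)]))
    return det

# Toy usage: a move action that succeeds with probability 0.8.
a = Action("move", frozenset({"at-A"}),
           [(0.8, {"at-B"}, {"at-A"}), (0.2, set(), set())])
print([d.name for d in determinize([a])])   # ['move_o0', 'move_o1']
```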

Relevance: 20.00%

Abstract:

The comparison of cancer prevalence with cancer mortality can, under some hypotheses, lead to an estimate of the registration rate. A method is proposed in which the cases with cancer as a cause of death are divided into 3 categories: (1) cases already known to the registry; (2) unknown cases having occurred before the registry's creation date; (3) unknown cases occurring while the registry operates. The estimate is then the number of cases in the first category divided by the total of those in categories 1 and 3 (only these are to be registered). An application is performed on the data of the Canton de Vaud. Survival rates of the Norwegian Cancer Registry are used for computing the number of unknown cases to be included in the second and third categories, respectively. The discussion focuses on the possible determinants of the obtained comprehensiveness rates for various cancer sites.
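
In symbols (the notation N1, N3 is introduced here for illustration; the abstract itself gives no formula), the estimated registration rate is the proportion of death-certificate cases already known to the registry among those it could have registered:

```latex
\hat{R} \;=\; \frac{N_1}{N_1 + N_3}
```

Category 2 cases are excluded from the denominator because, having occurred before the registry existed, they were never eligible for registration.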

Relevance: 20.00%

Abstract:

This paper presents a very fine grid hydrological model based on the spatiotemporal distribution of precipitation and on the topography. The goal is to estimate the flood in a catchment area, using a Probable Maximum Precipitation (PMP) leading to a Probable Maximum Flood (PMF). The spatiotemporal distribution of the precipitation was realized using six clouds modeled by the advection-diffusion equation. The equation describes the movement of the clouds over the terrain and also gives the evolution of the rain intensity in time. This hydrological modeling is followed by a hydraulic modeling of the surface and subterranean flows, carried out considering the factors that contribute to the hydrological cycle, such as infiltration, exfiltration, and snowmelt. The model was applied to several Swiss basins using measured rain, with results showing a good correlation between the simulated and observed flows. This good correlation indicates that the model is valid and gives us confidence that the results can be extrapolated to phenomena of extreme rainfall of the PMP type. In this article we present some results obtained using a PMP rainfall and the developed model.
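
The abstract does not spell the equation out; for a rain-cell intensity field c(x, t) advected by a wind field u with diffusivity D (symbols assumed here), the standard advection-diffusion form is:

```latex
\frac{\partial c}{\partial t} + \mathbf{u} \cdot \nabla c \;=\; \nabla \cdot \left( D \, \nabla c \right)
```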

Relevance: 20.00%

Abstract:

The aim of this paper is to describe the process and challenges in building exposure scenarios for engineered nanomaterials (ENM), using an exposure scenario format similar to that used for the European Chemicals regulation (REACH). Over 60 exposure scenarios were developed based on information from publicly available sources (literature, books, and reports), publicly available exposure estimation models, occupational sampling campaign data from partnering institutions, and information from industrial partners regarding their own facilities. The primary focus was on carbon-based nanomaterials, nano-silver (nano-Ag), and nano-titanium dioxide (nano-TiO2), and included occupational and consumer uses of these materials with consideration of the associated environmental release. The process of building exposure scenarios illustrated the availability and limitations of existing information and exposure assessment tools for characterizing exposure to ENM, particularly as it relates to risk assessment. This article describes the gaps in the information reviewed, recommends future areas of ENM exposure research, and proposes types of information that should, at a minimum, be included when reporting the results of such research, so that the information is useful in a wider context.

Relevance: 20.00%

Abstract:

Chromosomal anomalies such as Robertsonian and reciprocal translocations represent a serious problem in cattle breeding, as their presence induces a well-documented fertility reduction in carrier subjects. In cattle, reciprocal translocations (RCPs, chromosome abnormalities caused by an exchange of material between nonhomologous chromosomes) are considered rare, as only 19 have been described to date. It is common knowledge that Robertsonian translocations represent the most common cytogenetic anomalies in cattle, probably because of the endemic 1;29 Robertsonian translocation. However, these considerations are based on data obtained using techniques that are unable to identify all reciprocal translocations, and thus their frequency is clearly underestimated. The purpose of this work is to provide a first realistic estimate of the impact of RCPs in the cattle population studied, trying to eliminate the factors which have caused their frequency to be underestimated so far. We performed this work using a mathematical as well as a simulation approach and, as biological data, we considered the cytogenetic results obtained over the last 15 years. The results show that only 16% of reciprocal translocations can be detected using simple Giemsa techniques; consequently, RCPs could be present in no less than 0.14% of cattle, a frequency five times higher than that shown by de novo Robertsonian translocations. These data are useful to open a debate about the need to introduce a more efficient method to identify RCPs in cattle.
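
A back-of-the-envelope version of the correction implied by these figures (a sketch using the abstract's numbers; the observed frequency below is invented for illustration and this is not the authors' simulation):

```python
# If plain Giemsa staining detects only a fraction p of reciprocal
# translocations, an observed carrier frequency f_obs understates the
# true frequency by the same factor: f_true ~= f_obs / p.
p_detect = 0.16            # fraction of RCPs visible with simple Giemsa
f_obs = 0.00022            # hypothetical observed frequency (illustrative)
f_true = f_obs / p_detect
print(f"estimated true RCP frequency: {f_true:.4f}")  # ~0.0014, i.e. 0.14%
```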

Relevance: 20.00%

Abstract:

Lutetium zoning in garnet within eclogites from the Zermatt-Saas Fee zone, Western Alps, reveals sharp, exponentially decreasing central peaks. These can be used to constrain maximum Lu volume diffusion in garnet. A prograde garnet growth temperature interval of 450-600 °C has been estimated based on pseudosection calculations and garnet-clinopyroxene thermometry. The maximum pre-exponential diffusion coefficient which fits the measured central peak is on the order of D₀ = 5.7 × 10⁻⁶ m²/s, taking an estimated activation energy of 270 kJ/mol based on diffusion experiments for other rare earth elements in garnet. This corresponds to a maximum diffusion rate of D(600 °C) = 4.0 × 10⁻²² m²/s. The diffusion estimate of Lu can be used to estimate the minimum closure temperature, T_c, for Sm-Nd and Lu-Hf age data that have been obtained in eclogites of the Western Alps, postulating, based on a literature review, that D(Hf) < D(Nd) < D(Sm) ≤ D(Lu). T_c calculations using the Dodson equation yielded minimum closure temperatures of about 630 °C, assuming a rapid initial cooling rate of 50 °C/m.y. and an average garnet crystal size of r = 1 mm. This suggests that Sm/Nd and Lu/Hf isochron age differences in eclogites from the Western Alps, where peak temperatures rarely exceeded 600 °C, must be interpreted in terms of prograde metamorphism.
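
Both quoted numbers can be reproduced from the standard Arrhenius relation and Dodson's closure-temperature equation; the sketch below assumes a spherical geometry factor A = 55 and reads the quoted 50 °C/m.y. as the cooling rate:

```python
import math

R = 8.314        # J/(mol K), gas constant
E = 270e3        # J/mol, activation energy (from REE diffusion data)
D0 = 5.7e-6      # m^2/s, pre-exponential factor fitted to the Lu peak

# Arrhenius relation: D(T) = D0 * exp(-E / (R*T))
T = 600 + 273.15
print(f"D(600 C) = {D0 * math.exp(-E / (R * T)):.1e} m^2/s")  # ~4.0e-22

# Dodson closure temperature, solved by fixed-point iteration.
A = 55                         # geometry factor for a sphere (assumed)
a = 1e-3                       # m, garnet radius (r = 1 mm)
dTdt = 50 / 3.156e13           # K/s, 50 degrees per million years
Tc = 900.0                     # K, initial guess
for _ in range(50):
    Tc = (E / R) / math.log(A * R * Tc**2 * (D0 / a**2) / (E * dTdt))
print(f"T_c ~ {Tc - 273.15:.0f} C")                           # ~630 C
```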

Relevance: 20.00%

Abstract:

A clear and rigorous definition of muscle moment-arms in the context of musculoskeletal systems modelling is presented, using classical mechanics and screw theory. The definition provides an alternative to the tendon excursion method, which can lead to incorrect moment-arms if used inappropriately due to its dependency on the choice of joint coordinates. The definition of moment-arms, and the presented construction method, apply to musculoskeletal models in which the bones are modelled as rigid bodies, the joints are modelled as ideal mechanical joints and the muscles are modelled as massless, frictionless cables wrapping over the bony protrusions, approximated using geometric surfaces. In this context, the definition is independent of any coordinate choice. It is then used to solve a muscle-force estimation problem for a simple 2D conceptual model and compared with an incorrect application of the tendon excursion method. The relative errors between the two solutions vary between 0% and 100%.
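
For a straight-line muscle and a fixed revolute axis, the coordinate-free scalar moment arm reduces to the moment of a unit force along the muscle's line of action, projected onto the joint axis. The sketch below shows this special case (an illustration of the general idea, not the paper's full screw-theoretic construction):

```python
import numpy as np

def moment_arm(axis_point, axis_dir, origin, insertion):
    """Scalar moment arm of a straight-line muscle about a revolute
    joint axis: the moment of a unit force along the muscle line,
    projected onto the joint axis. Independent of any choice of
    joint coordinates."""
    a = axis_dir / np.linalg.norm(axis_dir)       # unit joint axis
    f = insertion - origin
    f = f / np.linalg.norm(f)                     # unit line of action
    r = origin - axis_point                       # lever from the axis
    return np.dot(np.cross(r, f), a)

# Hypothetical 2D-style example: hinge along z, muscle in the x-y plane.
d = moment_arm(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
               np.array([0.05, 0.0, 0.0]), np.array([0.0, 0.30, 0.0]))
print(f"moment arm = {d:.3f} m")                  # ~0.049 m
```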

Relevance: 20.00%

Abstract:

Although extensive research has been conducted on urban freeway capacity estimation methods, minimal research has been carried out for rural highway sections, especially sections within work zones. This study attempted to fill that void by estimating the capacity of rural highway work zones in Kansas. Six work zone locations were selected for data collection and analysis. An average of six days' worth of field data was collected at each of these work zone sites, from mid-October 2013 to late November 2013. Two capacity estimation methods were utilized: the Maximum Observed 15-minute Flow Rate Method and the Platooning Method applied over 15-minute intervals. The Maximum Observed 15-minute Flow Rate Method provided an average capacity of 1469 passenger cars per hour per lane (pcphpl) with a standard deviation of 141 pcphpl, while the Platooning Method provided a maximum average capacity of 1195 pcphpl with a standard deviation of 28 pcphpl. Based on the observed data and the analysis carried out in this study, a maximum capacity of 1500 pcphpl is suggested when designing work zones for rural highways in Kansas. This proposed standard value of rural highway work zone capacity could be utilized by engineers and planners to effectively mitigate congestion at or near work zones that would otherwise occur due to construction/maintenance.
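
A minimal sketch of the first method (the heavy-vehicle adjustment shown is the standard HCM form, assumed here; the abstract does not give the study's exact conversion):

```python
def max_flow_rate_pcphpl(counts_15min, lanes, p_trucks=0.0, e_truck=2.0):
    """Maximum observed 15-minute flow-rate method: expand the largest
    15-minute vehicle count to an hourly rate, then convert to passenger
    cars per hour per lane with a heavy-vehicle factor
    f_HV = 1 / (1 + P_T * (E_T - 1))."""
    f_hv = 1.0 / (1.0 + p_trucks * (e_truck - 1.0))
    return max(counts_15min) * 4 / (lanes * f_hv)

# Hypothetical counts through a single open lane of a work zone:
print(max_flow_rate_pcphpl([310, 342, 355, 330], lanes=1, p_trucks=0.15))
```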

Relevance: 20.00%

Abstract:

PURPOSE: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map indoor radon concentrations (IRC) in Switzerland, taking into account all of the following: architectural factors, spatial relationships between the measurements, and geological information. METHODS: We looked at about 240,000 IRC measurements carried out in about 150,000 houses. As predictor variables we included building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature, and lithology in the kernel estimation models. We developed predictive maps as well as a map of the local probability of exceeding 300 Bq/m³. Additionally, we developed a map of a confidence index in order to estimate the reliability of the probability map. RESULTS: Our models were able to explain 28% of the variation in the IRC data. All variables added information to the model. The model estimation yielded a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimate. Furthermore, we assessed the mapping characteristics of kernel estimation overall as well as by municipality. Overall, our model reproduces spatial IRC patterns already obtained earlier. At the municipal level, we could show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps: maps corresponding to detached houses with concrete foundations indicate systematically smaller IRC than maps corresponding to farms with earth foundations. CONCLUSIONS: IRC mapping based on kernel estimation is a powerful tool to predict and analyze IRC on a large scale as well as at a local level. This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions while at the same time accounting for geological information and spatial relations between IRC measurements.
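
A minimal sketch of kernel regression with one bandwidth per predictor, in the spirit of the per-variable bandwidths described above (a Nadaraya-Watson estimator with a product Gaussian kernel is assumed; the study's exact estimator is not given in the abstract):

```python
import numpy as np

def nadaraya_watson(X, y, x0, bandwidths):
    """Kernel regression estimate at x0. Each predictor gets its own
    bandwidth; a small bandwidth means the variable strongly localizes
    the estimate, a large one means it barely matters."""
    U = (X - x0) / bandwidths                 # scale each variable
    w = np.exp(-0.5 * np.sum(U**2, axis=1))   # product Gaussian kernel
    return np.sum(w * y) / np.sum(w)

# Toy example with two predictors (e.g. altitude, year of construction).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(500, 2))
y = 100 + 50 * X[:, 0] + rng.normal(0, 5, 500)
print(nadaraya_watson(X, y, x0=np.array([0.5, 0.5]),
                      bandwidths=np.array([0.1, 0.3])))
```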

Relevance: 20.00%

Abstract:

Image registration has been proposed as an automatic method for recovering cardiac displacement fields from Tagged Magnetic Resonance Imaging (tMRI) sequences. Initially performed as a set of pairwise registrations, these techniques have evolved toward the use of 3D+t deformation models, requiring metrics of joint image alignment (JA). However, only linear combinations of cost functions defined with respect to the first frame have been used. In this paper, we have applied k-Nearest Neighbor Graph (kNNG) estimators of the α-entropy (Hα) to measure the joint similarity between frames, and to combine the information provided by different cardiac views in a unified metric. Experiments performed on six subjects showed a significantly higher accuracy (p < 0.05) with respect to a standard pairwise alignment (PA) approach in terms of mean positional error and variance with respect to manually placed landmarks. The developed method was used to study strains in patients with myocardial infarction, showing a consistency between strain, infarction location, and coronary occlusion. This paper also presents an interesting clinical application of graph-based metric estimators, showing their value for solving practical problems found in medical imaging.
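
A sketch of a kNN-graph entropy estimator of the kind referred to above (Hero-style estimator, up to an additive constant depending only on the dimension and the edge exponent; this is an illustration, not the paper's implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_alpha_entropy(X, k=4, gamma=1.0):
    """Renyi alpha-entropy estimate from a k-nearest-neighbor graph:
    the total edge-length functional L = sum(edge_length**gamma)
    satisfies log(L / n**alpha) / (1 - alpha) -> H_alpha + const,
    with alpha = (d - gamma) / d."""
    n, d = X.shape
    alpha = (d - gamma) / d
    dist, _ = cKDTree(X).query(X, k=k + 1)   # first column is self (0)
    L = np.sum(dist[:, 1:] ** gamma)
    return np.log(L / n**alpha) / (1.0 - alpha)

# Joint similarity between two frames: stack corresponding pixel
# intensities as 2-D samples; lower joint entropy = better alignment.
rng = np.random.default_rng(1)
frame_a = rng.normal(size=2000)
samples = np.column_stack([frame_a, frame_a + rng.normal(0, 0.1, 2000)])
print(knn_alpha_entropy(samples))
```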

Relevance: 20.00%

Abstract:

This paper presents a new and original variational framework for atlas-based segmentation. The proposed framework integrates both the active contour framework and the dense deformation fields of the optical flow framework. It is quite general and encompasses many state-of-the-art atlas-based segmentation methods. It also allows the registration of atlas and target images to be driven by only selected structures of interest. The versatility and potential of the proposed framework are demonstrated through three diverse applications. In the first application, we show how the proposed framework can be used to simulate the growth of structures inconsistent with the atlas, such as a tumor. In the second application, we estimate the position of non-visible brain structures based on the surrounding structures and validate the results by comparison with other methods. In the final application, we present the segmentation of lymph nodes in head-and-neck CT images and demonstrate how multiple registration forces can be used in this framework in a hierarchical manner.
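
The abstract does not state the functional itself; a generic member of this family, combining an intensity-matching term over a dense deformation u with a smoothness regularizer (and, in the paper's setting, an additional active-contour term restricting the match to selected structures), would read:

```latex
E(u) \;=\; \int_{\Omega} \big( I_{\mathrm{target}}(x) - I_{\mathrm{atlas}}(x + u(x)) \big)^2 \, dx \;+\; \lambda \int_{\Omega} \lVert \nabla u(x) \rVert^2 \, dx
```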

Relevance: 20.00%

Abstract:

Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, and (iii) the contribution of each individual item within a given group of pieces of evidence still represent fundamental areas of research. To some degree this is remarkable, since both forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items, if not masses, of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in some viable way. Based on a review of the main existing contributions in this area, the article presents instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation, and their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy, and directional change. These distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have a bearing on how multiple items of evidence are meaningfully and appropriately set into context.
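
A minimal two-evidence Bayesian network, given as a hedged illustration of the kind of joint evaluation discussed above (the probabilities are invented, and this is not one of the paper's case studies):

```python
# Network: H (proposition) -> E1, E2 (items of evidence), with E1 and E2
# conditionally independent given H in this toy model.
p_e1 = {True: 0.95, False: 0.02}   # P(E1 | H), P(E1 | not-H)
p_e2 = {True: 0.80, False: 0.10}   # P(E2 | H), P(E2 | not-H)

def likelihood_ratio(e1_seen, e2_seen):
    """Joint likelihood ratio for both items. Under conditional
    independence it factorizes into the product of the individual LRs;
    dependence between items (redundancy, synergy) breaks this
    factorization, which is what the graphical model makes explicit."""
    def lik(h):
        l1 = p_e1[h] if e1_seen else 1 - p_e1[h]
        l2 = p_e2[h] if e2_seen else 1 - p_e2[h]
        return l1 * l2
    return lik(True) / lik(False)

print(likelihood_ratio(True, True))   # 0.95*0.80 / (0.02*0.10) = 380
```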