931 results for probabilistic roadmap
Abstract:
Convection-permitting modelling has led to a step change in forecasting convective events. However, convection occurs within different regimes which exhibit different forecast behaviour. A convective adjustment timescale can be used to distinguish between these regimes and examine their associated predictability. The convective adjustment timescale is calculated from radiosonde ascents and found to be consistent with that derived from convection-permitting model forecasts. The model-derived convective adjustment timescale is then examined for three summers in the British Isles to determine characteristics of the convective regimes for this maritime region. Convection in the British Isles is predominantly in convective quasi-equilibrium with 85% of convection having a timescale less than or equal to three hours. This percentage varies spatially with more non-equilibrium events occurring in the south and southwest. The convective adjustment timescale exhibits a diurnal cycle over land. The non-equilibrium regime occurs more frequently at mid-range wind speeds and with winds from southerly to westerly sectors. Most non-equilibrium convective events in the British Isles are initiated near large coastal orographic gradients or on the European continent. Thus, the convective adjustment timescale is greatest when the location being examined is immediately downstream of large orographic gradients and decreases with distance from the convective initiation region. The dominance of convective quasi-equilibrium conditions over the British Isles argues for the use of large-member ensembles in probabilistic forecasts for this region.
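The convective adjustment timescale discussed above is commonly estimated from CAPE and the rate at which precipitation consumes it. The sketch below is a rough, hedged illustration only: it follows the widely quoted scaling tau_c ~ (c_p rho_0 T_0 / (L_v g)) CAPE / P, but published implementations differ by an O(1) constant factor and the abstract does not give this study's exact formulation.

```python
# Hedged sketch: estimating the convective adjustment timescale tau_c from CAPE
# and the rain rate, using the commonly quoted scaling
#   tau_c ~ (c_p * rho_0 * T_0 / (L_v * g)) * CAPE / P.
# Published definitions differ by an O(1) constant, so the prefactor is
# illustrative rather than this study's exact formulation.

C_P = 1005.0   # specific heat of dry air at constant pressure [J kg-1 K-1]
RHO_0 = 1.2    # reference air density [kg m-3]
T_0 = 273.15   # reference temperature [K]
L_V = 2.5e6    # latent heat of vaporisation [J kg-1]
G = 9.81       # gravitational acceleration [m s-2]

def convective_adjustment_timescale(cape: float, precip_rate: float) -> float:
    """tau_c in hours; cape in J/kg, precip_rate in kg m-2 s-1 (= mm/s of rain)."""
    if precip_rate <= 0.0:
        return float("inf")   # no convective consumption of CAPE
    tau_seconds = (C_P * RHO_0 * T_0 / (L_V * G)) * cape / precip_rate
    return tau_seconds / 3600.0

# Example: CAPE of 800 J/kg consumed by 2 mm/h of rain (2 / 3600 kg m-2 s-1).
# Timescales <= ~3 h would fall in the quasi-equilibrium regime described above.
print(convective_adjustment_timescale(800.0, 2.0 / 3600.0))
```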
Abstract:
The Plant–Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme, whose only stochastic element comes from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant–Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant–Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.
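The abstract does not state which probabilistic measures were used; one standard choice for threshold-exceedance precipitation forecasts is the Brier score, sketched below on synthetic data purely for illustration.

```python
# Hedged sketch: a Brier score for ensemble precipitation forecasts at a fixed
# threshold. The specific scores used in the study are not given in the abstract;
# this is just one common measure for threshold exceedance probabilities.
import numpy as np

def brier_score(ensemble_precip: np.ndarray, observed: np.ndarray,
                threshold_mm: float) -> float:
    """ensemble_precip: (n_members, n_cases); observed: (n_cases,)."""
    forecast_prob = (ensemble_precip >= threshold_mm).mean(axis=0)
    outcome = (observed >= threshold_mm).astype(float)
    return float(np.mean((forecast_prob - outcome) ** 2))

rng = np.random.default_rng(0)
fake_ens = rng.gamma(shape=0.8, scale=3.0, size=(24, 1000))  # 24 members, as in the ensemble above
fake_obs = rng.gamma(shape=0.8, scale=3.0, size=1000)
print(brier_score(fake_ens, fake_obs, threshold_mm=1.0))
```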
Abstract:
Objective: To assess time trends in the contribution of processed foods to food purchases made by Brazilian households and to explore the potential impact on the overall quality of the diet. Design: Application of a new classification of foodstuffs based on extent and purpose of food processing to data collected by comparable probabilistic household budget surveys. The classification assigns foodstuffs to the following groups: unprocessed/minimally processed foods (Group 1); processed culinary ingredients (Group 2); or ultra-processed ready-to-eat or ready-to-heat food products (Group 3). Setting: Eleven metropolitan areas of Brazil. Subjects: Households; n 13 611 in 1987-8, n 16 014 in 1995-6 and n 13 848 in 2002-3. Results: Over the last three decades, the household consumption of Group 1 and Group 2 foods has been steadily replaced by consumption of Group 3 ultra-processed food products, both overall and in lower- and upper-income groups. In the 2002-3 survey, Group 3 items represented more than one-quarter of total energy (more than one-third for higher-income households). The overall nutrient profile of Group 3 items, compared with that of Group 1 and Group 2 items, revealed more added sugar, more saturated fat, more sodium, less fibre and much higher energy density. Conclusions: The high energy density and the unfavourable nutrient profile of Group 3 food products, and also their potential harmful effects on eating and drinking behaviours, indicate that governments and health authorities should use all possible methods, including legislation and statutory regulation, to halt and reverse the replacement of minimally processed foods and processed culinary ingredients by ultra-processed food products.
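As a hedged illustration of the kind of calculation the Design describes, the sketch below assigns purchase records to the three groups and computes each group's share of total energy; the items, group assignments and energy figures are invented, not the surveys' data.

```python
# Hedged sketch: applying a three-group food classification to purchase records
# and computing each group's share of total purchased energy. All records below
# are illustrative placeholders.
from collections import defaultdict

# (item, group, energy purchased in kcal)
purchases = [
    ("rice", 1, 5200.0),
    ("dried beans", 1, 2100.0),
    ("sugar", 2, 1800.0),
    ("vegetable oil", 2, 2600.0),
    ("soft drink", 3, 1500.0),
    ("packaged biscuits", 3, 2300.0),
]

energy_by_group = defaultdict(float)
for _, group, kcal in purchases:
    energy_by_group[group] += kcal

total = sum(energy_by_group.values())
for group in sorted(energy_by_group):
    share = 100.0 * energy_by_group[group] / total
    print(f"Group {group}: {share:.1f}% of purchased energy")
```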
Abstract:
The Prospective and Retrospective Memory Questionnaire (PRMQ) has been shown to have acceptable reliability and factorial, predictive, and concurrent validity. However, the PRMQ has never been administered in a probability sample survey representative of all ages in adulthood, nor have previous studies controlled for factors that are known to influence metamemory, such as affective status. Here, the PRMQ was applied in a survey adopting a probabilistic three-stage cluster sample representative of the population of Sao Paulo, Brazil, according to gender, age (20-80 years), and economic status (n=1042). After excluding participants who had conditions that impair memory (depression, anxiety, use of psychotropics, and/or neurological/psychiatric disorders), in the remaining 664 individuals we (a) used confirmatory factor analyses to test competing models of the latent structure of the PRMQ, and (b) studied effects of gender, age, schooling, and economic status on prospective and retrospective memory complaints. The model with the best fit confirmed the same tripartite structure (general memory factor and two orthogonal prospective and retrospective memory factors) previously reported. Women complained more of general memory slips, especially those in the first 5 years after menopause, and there were more complaints of prospective than retrospective memory, except in participants with lower family income.
Abstract:
When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated. (C) 2011 Elsevier B.V. All rights reserved.
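As a hedged illustration of the underlying idea (not the paper's factored representation or its approximate nonlinear solver), the sketch below performs flat robust value iteration over interval transition probabilities, where the inner step, nature choosing the worst-case distribution from a box credal set, is solved greedily.

```python
# Hedged sketch: robust (worst-case) value iteration for a flat MDP with
# interval transition probabilities, the simplest analogue of the MDP-IP
# setting described above.
import numpy as np

def worst_case_expectation(lower, upper, values):
    """Min of p.values over {p : lower <= p <= upper, sum(p) = 1}.

    Greedy: start from the lower bounds, then push the remaining probability
    mass toward the successors with the smallest value.
    """
    p = lower.copy()
    slack = 1.0 - p.sum()
    for s in np.argsort(values):              # cheapest successors first
        add = min(upper[s] - p[s], slack)
        p[s] += add
        slack -= add
    return float(p @ values)

def robust_value_iteration(P_lo, P_hi, R, gamma=0.95, iters=500):
    """P_lo, P_hi: (A, S, S) interval bounds; R: (S, A) rewards."""
    S, A = R.shape
    V = np.zeros(S)
    for _ in range(iters):
        Q = np.array([[R[s, a] + gamma * worst_case_expectation(P_lo[a, s], P_hi[a, s], V)
                       for a in range(A)] for s in range(S)])
        V = Q.max(axis=1)
    return V

# Tiny 2-state, 2-action example with +/-0.1 uncertainty around a nominal model.
nominal = np.array([[[0.9, 0.1], [0.2, 0.8]],
                    [[0.5, 0.5], [0.7, 0.3]]])
P_lo = np.clip(nominal - 0.1, 0.0, 1.0)
P_hi = np.clip(nominal + 0.1, 0.0, 1.0)
R = np.array([[1.0, 0.0], [0.0, 2.0]])
print(robust_value_iteration(P_lo, P_hi, R))
```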
Abstract:
In this paper a new approach is considered for studying the triangular distribution, using the theoretical development behind skew distributions. The new triangular distributions are obtained by a reparametrization of the usual triangular distribution. The main probabilistic properties of the distribution are studied, including moments, asymmetry and kurtosis coefficients, and a stochastic representation, which provides a simple and efficient method for generating random variables. Moment estimation is also implemented. Finally, a simulation study is conducted to illustrate the behavior of the proposed estimation approach.
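The paper's own stochastic representation is not reproduced in the abstract; for comparison, the sketch below uses the classical inverse-CDF generator for a Triangular(a, c, b) distribution, which is the baseline the reparametrization builds on.

```python
# Hedged sketch: classical inverse-transform sampling from a triangular
# distribution with minimum a, mode c and maximum b (not the paper's
# skew-distribution-based representation).
import random

def rtriangular(a: float, c: float, b: float) -> float:
    """Draw one sample from Triangular(a, c, b) via the inverse CDF."""
    u = random.random()
    f = (c - a) / (b - a)                    # CDF value at the mode
    if u < f:
        return a + (u * (b - a) * (c - a)) ** 0.5
    return b - ((1.0 - u) * (b - a) * (b - c)) ** 0.5

samples = [rtriangular(0.0, 0.3, 1.0) for _ in range(10_000)]
print(sum(samples) / len(samples))           # near (a + b + c) / 3 = 0.433...
```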
Abstract:
The Birnbaum-Saunders (BS) model is a positively skewed statistical distribution that has received great attention in recent decades. A generalized version of this model, named the generalized BS (GBS) distribution, was derived based on symmetric distributions on the real line. The R package named gbs was developed to analyze data from GBS models. This package contains probabilistic and reliability indicators and random number generators from GBS distributions. Parameter estimates for censored and uncensored data can also be obtained by means of likelihood methods from the gbs package. Goodness-of-fit and diagnostic methods were also implemented in this package in order to check the suitability of the GBS models. In this article, the capabilities and features of the gbs package are illustrated by using simulated and real data sets. Shape and reliability analyses for GBS models are presented. A simulation study for evaluating the quality and sensitivity of the estimation method developed in the package is provided and discussed. (C) 2008 Elsevier B.V. All rights reserved.
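The gbs package itself is written in R; the hedged Python sketch below only illustrates the classical BS(alpha, beta) sampler, which the GBS family generalizes by replacing the normal kernel with another symmetric distribution.

```python
# Hedged sketch: sampling from the classical Birnbaum-Saunders distribution
# BS(alpha, beta) via its normal representation; the GBS variants swap the
# standard normal Z below for another symmetric law (e.g. Student-t).
import numpy as np

def rbs(n: int, alpha: float, beta: float, rng=None) -> np.ndarray:
    """Draw n samples from BS(alpha, beta)."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal(n)
    w = alpha * z / 2.0
    return beta * (w + np.sqrt(w ** 2 + 1.0)) ** 2

x = rbs(10_000, alpha=0.5, beta=1.0)
print(x.mean())   # should be close to beta * (1 + alpha**2 / 2) = 1.125
```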
Abstract:
Connecting different electrical, network and data devices with the minimum cost and shortest path is a complex job. In huge buildings, where the devices are placed at different locations on different floors and only some specific routes are available to pass the cables and buses, the shortest path search becomes even more complex. The aim of this thesis project is to develop an application which identifies the best path to connect all objects or devices by following the specific routes. To address the above issue we adopted three algorithms, greedy search, simulated annealing and exhaustive search, and analyzed their results. The given problem is similar to the Travelling Salesman Problem. Exhaustive search is the best algorithm to solve this problem in principle, as it checks each and every possibility and gives the exact result, but it is an impractical solution because of its huge time consumption: if the number of objects increases beyond 12, it takes hours to find the shortest path. Simulated annealing emerged with promising results at a lower time cost. Because of its probabilistic nature, simulated annealing may return a non-optimal solution, but it gives a near-optimal solution in a reasonable duration. The greedy algorithm is not a good choice for this problem. So, simulated annealing proved to be the best algorithm for this problem. The project has been implemented in the C language and takes its input from, and stores its output in, an Excel workbook.
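The thesis implementation is in C and works against an Excel workbook; the sketch below is only a minimal, illustrative simulated-annealing loop in Python for a TSP-like routing problem, with made-up coordinates and a temperature schedule that is not the thesis's.

```python
# Hedged sketch: simulated annealing for a TSP-like routing problem over a
# distance matrix, illustrating the quality/runtime trade-off discussed above.
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing(dist, t0=10.0, cooling=0.999, iters=20_000, seed=1):
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    best, best_len, t = tour[:], tour_length(tour, dist), t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt style reversal
        delta = tour_length(cand, dist) - tour_length(tour, dist)
        if delta < 0 or rng.random() < math.exp(-delta / t):   # sometimes accept worse moves
            tour = cand
            if tour_length(tour, dist) < best_len:
                best, best_len = tour[:], tour_length(tour, dist)
        t *= cooling
    return best, best_len

# 12 random device locations, roughly the size at which the thesis reports
# exhaustive search becoming impractically slow.
rng = random.Random(0)
pts = [(rng.random(), rng.random()) for _ in range(12)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
print(simulated_annealing(dist)[1])
```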
Abstract:
The main idea of this research is to solve the problem of inventory management for the paper industry SPM PVT Limited. The aim of this research was to find a methodology by which the inventory of raw material could be kept at a minimum level by means of a buffer stock level. The main objective then lies in finding the minimum level of buffer stock according to the daily consumption of raw material, finding the Economic Order Quantity (EOQ) and reorder point, and determining how many orders will be placed in a year to control the shortage of raw material. In this project, we discuss a continuous review model (deterministic EOQ model) that includes the probabilistic demand directly in the formulation. From the formulation, we obtain the reorder point and the order-up-to model. The problem was tackled mathematically, and simulation modeling was used where a mathematically tractable solution was not possible. The simulation modeling was done with the Awesim software for developing the simulation network. This simulation network has the ability to predict the buffer stock level based on the variable consumption of raw material and lead time. The data for this simulation network were collected from the industrial engineering personnel and the departmental studies of the concerned factory. At the end, we find the optimum order quantity, reorder point and order days.
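For reference, the textbook continuous-review quantities the abstract mentions (EOQ, reorder point with safety stock) are sketched below; the figures are illustrative placeholders, not SPM PVT's data, and the Awesim simulation network is not reproduced.

```python
# Hedged sketch: standard continuous-review inventory formulas with
# placeholder numbers, illustrating the quantities discussed above.
import math

def eoq(annual_demand: float, order_cost: float, holding_cost: float) -> float:
    """Economic Order Quantity: sqrt(2 * D * S / H)."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_demand_mean: float, daily_demand_std: float,
                  lead_time_days: float, z_service: float = 1.65) -> float:
    """Expected lead-time demand plus a buffer (safety) stock at ~95% service."""
    safety_stock = z_service * daily_demand_std * math.sqrt(lead_time_days)
    return daily_demand_mean * lead_time_days + safety_stock

D, S, H = 12_000.0, 500.0, 4.0   # demand/year, cost per order, holding cost/unit/year
q = eoq(D, S, H)
r = reorder_point(daily_demand_mean=D / 365.0, daily_demand_std=6.0, lead_time_days=7.0)
print(f"EOQ ~ {q:.0f}, reorder point ~ {r:.0f}, ~{D / q:.1f} orders/year")
```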
Abstract:
A simple way to improve solar cell efficiency is to enhance the absorption of light and reduce the shading losses. One of the main objectives of the photovoltaic roadmap is the reduction of the metalized area on the front side of the solar cell by using fine lines. Industrial solar cell production uses screen-printing of metal pastes, with a limit in line width of 70-80 μm. This paper will show that a combination of the laser grooved buried contact (LGBC) technique and screen-printing is able to achieve finer lines and a higher aspect ratio. Laser grooving is a technique to bury the contact into the surface of the silicon wafer. Metallization is normally done with an electroless or electrolytic plating method, which has a high cost. To decrease the relative cost of this more complex manufacturing process, in this project the standard process for buried contact solar cells has been optimized in order to obtain a laser grooved buried contact solar cell concept with fewer processing steps. The laser scribing process is set as the first step, on the raw mono-crystalline silicon wafer. The texturing etch, phosphorus diffusion and SiNx passivation processes are then each needed only once. At the same time, the laser scribing process was optimized to get better results in the screen-printing process, with fewer difficulties in filling the laser groove. This project has been carried out to make the whole production of buried contact solar cells possible with fewer steps, and could present a cost-effective opportunity for the solar cell industry.
Abstract:
This paper is concerned with the cost efficiency of achieving the Swedish national air quality objectives under uncertainty. To realize an ecologically sustainable society, the parliament has approved a set of interim and long-term pollution reduction targets. However, there are considerable quantification uncertainties about the effectiveness of the proposed pollution reduction measures. In this paper, we develop a multivariate stochastic control framework to deal with the cost efficiency problem with multiple pollutants. Based on the cost and technological data collected by several national authorities, we explore the implications of alternative probabilistic constraints. It is found that a composite probabilistic constraint induces a considerably lower abatement cost than separable probabilistic restrictions. The trend is reinforced by the presence of positive correlations between reductions in the multiple pollutants.
Abstract:
The purpose of this thesis is to identify the destination site selection criteria for international conferences from the perspectives of the three main players of the conference industry, conference buyers (organizers and delegates) and suppliers. Additionally, the research identifies the strengths and weaknesses of the congress cities of Stockholm and Vienna. Through a comparison with Vienna, the top city for hosting international conferences, a roadmap for Stockholm has been designed to strengthen its congress tourism opportunities, thus obtaining a higher status as an international congress city. This qualitative research has combined both primary and secondary data methods, through semi-standardized expert interviews and secondary studies respectively, to fulfil the study’s aim. The data have been analysed by applying the techniques of qualitative content analysis; the secondary data adopting an inductive approach according to Mayring (2003) while the expert interviews using a deductive approach according to Meuser & Nagel (2009). The conclusions of the secondary data have been further compared and contrasted with the outcomes of the primary data, to propose fresh discoveries, clarifications, and concepts related to the site selection criteria for international conferences, and for the congress tourism industry of Stockholm. The research discusses the discoveries of the site selection criteria, the implications of the strengths and weaknesses of Stockholm in comparison to Vienna, recommendations for Stockholm via a road map, and future research areas in detail. The findings and recommendations not only provide specific steps and inceptions that Stockholm as an international conference city can apply, but also propose findings which can aid conference buyers and suppliers to cooperate, to strengthen their marketing strategies and develop successful international conferences and destinations to help achieve a greater competitive advantage.
Abstract:
A crucial aspect of evidential reasoning in crime investigation involves comparing the support that evidence provides for alternative hypotheses. Recent work in forensic statistics has shown how Bayesian Networks (BNs) can be employed for this purpose. However, the specification of BNs requires conditional probability tables describing the uncertain processes under evaluation. When these processes are poorly understood, it is necessary to rely on subjective probabilities provided by experts. Accurate probabilities of this type are normally hard to acquire from experts. Recent work in qualitative reasoning has developed methods to perform probabilistic reasoning using coarser representations. However, the latter types of approaches are too imprecise to compare the likelihood of alternative hypotheses. This paper examines this shortcoming of the qualitative approaches when applied to the aforementioned problem, and identifies and integrates techniques to refine them.
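At its core, the comparison of support for alternative hypotheses that a forensic Bayesian network encodes reduces to likelihood ratios built from conditional probability tables; the hedged sketch below shows a single-node version with invented numbers, the kind of subjective probabilities that the paper notes are hard to elicit from experts.

```python
# Hedged sketch: a one-node likelihood-ratio comparison of two hypotheses given
# a single piece of evidence. The CPT values are invented for illustration and
# stand in for the expert-elicited probabilities discussed above.

# P(evidence | hypothesis), e.g. probability of observing a matching fibre
cpt = {
    "suspect_present": 0.70,   # hypothesis H1
    "suspect_absent": 0.05,    # hypothesis H2
}
prior = {"suspect_present": 0.5, "suspect_absent": 0.5}

likelihood_ratio = cpt["suspect_present"] / cpt["suspect_absent"]

# Posterior odds = likelihood ratio * prior odds (Bayes' theorem in odds form)
posterior_odds = likelihood_ratio * (prior["suspect_present"] / prior["suspect_absent"])
posterior_h1 = posterior_odds / (1.0 + posterior_odds)

print(f"LR = {likelihood_ratio:.1f}, P(H1 | evidence) = {posterior_h1:.2f}")
```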