833 results for Probabilistic methodology
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the outputs of two independent implementations in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
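A minimal sketch of the back-to-back idea (not the paper's observer-based isolation scheme) is shown below: two hypothetical implementations, `impl_a` and `impl_b`, are driven with identical inputs, and the residuals between their outputs flag a seeded coding error.

```python
import numpy as np

def back_to_back_residuals(impl_a, impl_b, inputs):
    """Drive two independent implementations (hypothetical callables
    impl_a, impl_b) with identical inputs and return the residual
    sequence r_k = y_a(k) - y_b(k)."""
    return np.array([np.asarray(impl_a(u)) - np.asarray(impl_b(u))
                     for u in inputs])

# A residual that is consistently nonzero in a fixed direction points at
# a coding error; in the paper, a modified observer makes each error
# excite a known subspace so it can be traced to a specific equation.
inputs = [np.array([t, np.sin(t)]) for t in np.linspace(0.0, 1.0, 11)]
f_a = lambda u: np.array([u[0] + u[1], u[0] * u[1]])
f_b = lambda u: np.array([u[0] + u[1], u[0] * u[1] + 0.1])  # seeded error
print(back_to_back_residuals(f_a, f_b, inputs))
```

Here the second output channel carries a constant 0.1 offset, so all residuals lie in the subspace spanned by (0, 1), which is the kind of geometric signature the methodology exploits.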
Abstract:
Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting using seasonal climate forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to the assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented is also relevant to measuring the quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast, and of avoiding simplifying but erroneous distortions, is discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. Copyright (C) 2003 Royal Meteorological Society.
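As a hedged illustration of the three quality dimensions, the sketch below computes a reliability slope from forecast probability levels and realized frequencies, plus distribution shift and dispersion change against a climatological reference. The function names and toy data are assumptions, not the paper's case-study data or exact formulas.

```python
import numpy as np

def reliability_slope(prob_levels, realized_freq):
    """Slope of realized frequency vs forecast probability level;
    a slope near 1 indicates a reliable (unbiased) forecast system."""
    slope, _ = np.polyfit(prob_levels, realized_freq, 1)
    return slope

def shift_and_dispersion(forecast, reference):
    """Distribution shift as the difference in means, and dispersion
    change as the ratio of standard deviations, both relative to the
    full climatological reference distribution."""
    shift = np.mean(forecast) - np.mean(reference)
    dispersion_ratio = np.std(forecast) / np.std(reference)
    return shift, dispersion_ratio

# Illustrative data: analogue-year commodity forecast vs climatology.
reference = np.random.default_rng(0).normal(2.0, 0.6, 100)  # e.g. t/ha
forecast = reference[reference > 1.8]                        # analogue subset
print(shift_and_dispersion(forecast, reference))

levels = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
freqs = np.array([0.12, 0.28, 0.52, 0.66, 0.91])
print(reliability_slope(levels, freqs))  # close to 1 -> reliable
```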
Abstract:
Nowadays, despite improvements in usability and intuitiveness, users still have to adapt to the systems offered to them in order to satisfy their needs. For instance, they must learn how to achieve tasks, how to interact with the system, and how to fulfil the system's specifications. This paper proposes an approach to improve this situation by enabling graphical user interface redefinition through virtualization and computer vision, with the aim of increasing the system's usability. To achieve this goal, the approach is based on enriched task models, virtualization, and picture-driven computing.
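The picture-driven step could, for instance, rely on template matching to locate widgets in a rendered interface. The sketch below is a minimal OpenCV example under that assumption; the file names `screenshot.png` and `button.png` and the 0.8 threshold are hypothetical, and this is not the paper's actual pipeline.

```python
import cv2

# Locate a known widget (e.g. a button) inside a screenshot by
# normalized cross-correlation template matching; the match position
# could then be remapped onto a redefined, virtualized interface.
screen = cv2.imread("screenshot.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
widget = cv2.imread("button.png", cv2.IMREAD_GRAYSCALE)      # hypothetical file

result = cv2.matchTemplate(screen, widget, cv2.TM_CCOEFF_NORMED)
_, score, _, top_left = cv2.minMaxLoc(result)
if score > 0.8:  # assumed confidence threshold
    print("widget found at", top_left)
```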
Abstract:
This paper presents a study carried out to evaluate students' perception of the development and use of remote Control and Automation education kits developed by two universities. Three projects, based on real-world environments, were implemented and operated both locally and remotely. Students implemented the kits using their theoretical and practical knowledge, with the teachers acting as catalysts in the learning process. Once the kits were operational, end-user students became acquainted with them in the curricular units of their courses. It is the authors' belief that successful results were achieved not only in the learning progress in the Automation and Control fields (hard skills) but also in the development of the students' soft skills, leading to encouraging and rewarding goals, motivating their future decisions and promoting synergies in their work. The design of experimental learning kits by students, under teacher supervision, for future use in course curricula by end-user students is an advantageous and rewarding experience.
Abstract:
The current level of customer demand in the electronics industry requires the production of parts with an extremely high level of reliability and quality to ensure the complete confidence of the end customer. Automatic Optical Inspection (AOI) machines play an important role in the monitoring and detection of errors during the manufacturing process for printed circuit boards. These machines present images of products with probable assembly mistakes to an operator, who decides whether the product has a real defect or whether it was an automated false detection. Operator training is an important aspect in obtaining a lower rate of evaluation failures by the operator and, consequently, a lower rate of actual defects that slip through to the following processes. The Gage R&R methodology for attributes is part of a Six Sigma strategy to examine the repeatability and reproducibility of an evaluation system, thus giving important feedback on the suitability of each operator in classifying defects. This methodology has already been applied in several industry sectors and services across different processes, with excellent results in the evaluation of subjective parameters. An application for training operators of AOI machines was developed in order to check their fitness and improve future evaluation performance. This application provides a better understanding of the specific training needs of each operator, and also accompanies the evolution of the training program for new components, which in turn present additional difficulties for operator evaluation. The use of this application will contribute to reducing the number of defects misclassified by the operators and passed on to the following steps in the production process. This defect reduction will also contribute to the continuous improvement of operator evaluation performance, which is seen as a quality management goal.
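A minimal sketch of attribute-agreement statistics in the spirit of Gage R&R is given below: it computes an operator's repeatability (consistency across repeated trials on the same boards) and agreement with a reference classification. The function name and toy data are illustrative, not the developed application's API.

```python
import numpy as np

def attribute_gage_rr(trials, reference):
    """trials: array (n_parts, n_trials) of an operator's defect/no-defect
    calls over repeated inspections of the same boards;
    reference: array (n_parts,) of true classifications.
    Returns (repeatability %, agreement-with-reference %)."""
    trials = np.asarray(trials)
    # Repeatability: operator gives the same call on every trial of a part.
    consistent = np.all(trials == trials[:, [0]], axis=1)
    repeatability = 100.0 * consistent.mean()
    # Agreement: consistent calls that also match the reference truth.
    matches_ref = consistent & (trials[:, 0] == np.asarray(reference))
    accuracy = 100.0 * matches_ref.mean()
    return repeatability, accuracy

# Illustrative data: 1 = real defect, 0 = false detection, 3 trials each.
calls = [[1, 1, 1], [0, 0, 1], [1, 1, 1], [0, 0, 0]]
truth = [1, 0, 0, 1]
print(attribute_gage_rr(calls, truth))  # -> (75.0, 25.0)
```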
Abstract:
LUDA is a research project of Key Action 4, "City of Tomorrow & Cultural Heritage", of the programme "Energy, Environment and Sustainable Development" within the Fifth Framework Programme of the European Commission.
Abstract:
Considering that in most developing countries there are still no comprehensive lists of addresses for a given geographical area, there has always been a problem in drawing samples from the community while ensuring randomisation in the selection of subjects. This article discusses the geographical stratification by socio-economic status used to draw a multistage random sample from a community-based elderly population living in a city like São Paulo, Brazil. Particular attention is given to the fact that the proportion of elderly people in the total population of a given area appeared to be a good discriminatory variable for such stratification. The validity of the stratification method is analysed in the light of the socio-economic results obtained in the survey.
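A hedged sketch of the multistage idea follows, assuming hypothetical census areas scored by their proportion of elderly residents: areas are stratified by that proportion (as a socio-economic proxy, as in the study) and drawn at random within each stratum, with households then drawn within the sampled areas. Stratum cut-offs and sample sizes are invented for illustration.

```python
import random

random.seed(1)

# Hypothetical census areas: (area_id, proportion of elderly residents).
areas = [(i, random.uniform(0.02, 0.20)) for i in range(200)]

# Stage 1: stratify areas by elderly proportion and draw areas at
# random within each stratum (assumed cut-offs).
strata = {"low": [], "mid": [], "high": []}
for area_id, p_old in areas:
    key = "low" if p_old < 0.08 else "mid" if p_old < 0.14 else "high"
    strata[key].append(area_id)
sampled_areas = {k: random.sample(v, 5) for k, v in strata.items()}

# Stage 2: within each sampled area, households would be enumerated and
# drawn at random; here we just report the selected areas per stratum.
print(sampled_areas)
```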
Abstract:
Exposure assessment is an important step of the risk assessment process and has evolved more quickly than perhaps any other aspect of the four-step risk paradigm (hazard identification, exposure assessment, dose-response analysis, and risk characterization). Nevertheless, some epidemiological studies have associated adverse health effects with a chemical exposure based on inadequate or absent exposure quantification. In addition to the metric used, how truly representative the measurements are of exposure depends on the sampling strategy, the random collection of measurements, and the similarity between the measured and unmeasured exposure groups. Two environmental monitoring methodologies for occupational exposure to formaldehyde were used to assess the influence of metric selection on exposure assessment and, consequently, on the risk assessment process.
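As a simple illustration of how metric choice shapes the exposure estimate, the sketch below derives a full-shift time-weighted average and a short-term peak metric from the same set of task-based measurements; the concentrations and durations are invented, and the study's actual monitoring methodologies are not reproduced here.

```python
import numpy as np

def twa_and_peak(concentrations, durations):
    """Full-shift time-weighted average (TWA) and short-term peak
    metric from the same task-based measurements.
    concentrations in ppm, durations in minutes."""
    c = np.asarray(concentrations, dtype=float)
    t = np.asarray(durations, dtype=float)
    twa = np.sum(c * t) / np.sum(t)   # duration-weighted mean
    peak = c.max()                    # ceiling-type metric
    return twa, peak

# Illustrative formaldehyde task measurements over an 8-hour shift.
conc = [0.05, 0.30, 0.08, 0.55, 0.04]   # ppm per task
mins = [120, 30, 180, 15, 135]          # task durations (sum to 480 min)
twa, peak = twa_and_peak(conc, mins)
print(f"8-h TWA = {twa:.3f} ppm, peak = {peak:.2f} ppm")
```

The same campaign can thus look compliant under one metric and problematic under the other, which is exactly why metric selection propagates into risk characterization.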
Abstract:
Collaborative networks are typically formed by heterogeneous and autonomous entities, so it is natural that each member has its own set of core-values. Since these values somehow drive the behaviour of the involved entities, the ability to quickly identify partners with compatible or common core-values represents an important element for the success of collaborative networks. However, tools to assess or measure the level of alignment of core-values are lacking. Since the concept of 'alignment' in this context is still ill-defined and multifaceted, three perspectives are discussed. The first uses a causal-maps approach to capture, structure, and represent the influence relationships among core-values; this representation provides the basis for measuring alignment in terms of the structural similarity and influence among value systems. The second perspective considers the compatibility and incompatibility among core-values in order to define the alignment level. Under this perspective we propose a fuzzy inference system to estimate the alignment level, since this approach can deal with variables that are vaguely defined and whose inter-relationships are difficult to specify; it also makes it possible to incorporate expert human judgment in the definition of the alignment level. The last perspective uses a belief Bayesian network method, selected in order to assess the alignment level based on members' past behaviour. An example of application is presented in which the details of each method are discussed.
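A minimal sketch of the second perspective follows, assuming a tiny two-rule Mamdani-style fuzzy system with triangular memberships; the actual rule base and membership functions of the proposed system are not reproduced here.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def alignment_level(compat, incompat):
    """Two illustrative rules mapping core-value compatibility and
    incompatibility (both in [0, 1]) to an alignment level in [0, 1],
    defuzzified by centroid."""
    z = np.linspace(0.0, 1.0, 101)
    # Rule 1: IF compatibility high AND incompatibility low THEN alignment high.
    w1 = min(tri(compat, 0.5, 1.0, 1.5), tri(incompat, -0.5, 0.0, 0.5))
    # Rule 2: IF incompatibility high THEN alignment low.
    w2 = tri(incompat, 0.5, 1.0, 1.5)
    agg = np.maximum(np.minimum(w1, tri(z, 0.5, 1.0, 1.5)),
                     np.minimum(w2, tri(z, -0.5, 0.0, 0.5)))
    return float(np.sum(z * agg) / np.sum(agg)) if agg.sum() > 0 else 0.5

print(alignment_level(compat=0.8, incompat=0.1))  # high alignment
```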
Abstract:
INTRODUCTION: Previous cross-sectional studies have shown a high prevalence of chronic disease and disability among the elderly. Given Brazil's rapid aging process and the obvious consequences of the growing number of old people with chronic diseases and associated disabilities for the provision of health services, a need was felt for a study that would overcome the limitations of cross-sectional data and shed some light on the main factors determining whether a person will live longer and free of disabling diseases, the so-called successful aging. The methodology of the first follow-up study of elderly residents in Brazil is presented. METHOD: The profile of the initial cohort is compared with previous cross-sectional data, and an in-depth analysis of nonresponse is carried out in order to assess the validity of future longitudinal analyses. The EPIDOSO (Epidemiologia do Idoso) Study conducted a two-year follow-up of 1,667 elderly people (65+) living in São Paulo. The study consisted of two waves, each comprising household, clinical, and biochemical surveys. RESULTS AND CONCLUSIONS: In general, the initial cohort showed a profile similar to previous cross-sectional samples in São Paulo: a majority of women, mostly widows, living in multigenerational households, and a high prevalence of chronic illnesses, psychiatric disturbances, and physical disabilities. Despite all the difficulties inherent in follow-up studies, there was a fairly low rate of nonresponse to the household survey after two years, which did not materially affect the representativeness of the cohort at the final household assessment, making unbiased longitudinal analysis possible. Concerning the clinical and blood-sampling surveys, the respondents tended to be younger and less disabled than the nonrespondents, limiting the use of the clinical and laboratory data to longitudinal analyses of a healthier cohort. It is worth mentioning that gender, education, family support, and socioeconomic status were not important determinants of nonresponse, as is often the case.
Abstract:
There are complex and diverse methodological problems involved in the clinical and epidemiological study of respiratory diseases and their etiological factors. The association of urban growth, industrialization, and environmental deterioration with respiratory diseases makes it necessary to pay more attention to this research area with a multidisciplinary approach. Appropriate study designs and statistical techniques must be implemented to analyze and improve our understanding of pathological events and their causes, so as to reduce the growing morbidity and mortality through better preventive actions and health programs. The objective of this article is to review the most common methodological problems in this research area and to present the most commonly used statistical tools.
Abstract:
This paper presents a methodology that aims to increase the probability of delivering power to any load point of the electrical distribution system by identifying new investments in distribution components. The methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modelling for system component outage parameters. Fuzzy membership functions of system component outage parameters are obtained from statistical records. A mixed-integer non-linear optimization technique is developed to identify adequate investments in distribution network components that increase the availability level for any customer in the distribution system at minimum cost to the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a real distribution network.
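A hedged sketch of one building block follows, assuming triangular membership functions fitted from failure-rate records; the paper's actual fitting procedure and the mixed-integer non-linear formulation are not reproduced.

```python
import numpy as np

def triangular_from_records(records):
    """Build a triangular fuzzy number (low, mode, high) for a component
    outage parameter from its statistical record, as a simple stand-in
    for membership-function fitting."""
    r = np.asarray(records, dtype=float)
    # Mode approximated by the median of the observed failure rates.
    return r.min(), float(np.median(r)), r.max()

def alpha_cut(tfn, alpha):
    """Interval of a triangular fuzzy number at membership level alpha;
    such intervals feed interval availability calculations."""
    low, mode, high = tfn
    return low + alpha * (mode - low), high - alpha * (high - mode)

# Illustrative failure-rate records for one feeder (failures/year).
rates = [0.10, 0.14, 0.12, 0.18, 0.11, 0.16]
tfn = triangular_from_records(rates)
print(tfn, alpha_cut(tfn, 0.5))
```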
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator built to model market players and simulate their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error of the next forecast and uses it to adapt the actual forecast. The goal is to approximate the forecast to the real value, reducing the forecasting error.
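As a rough sketch of the error-correction step, the code below adjusts a raw price forecast by the expected error, here estimated as a recent-error average; the paper's method recognizes richer error patterns around an ANN forecaster, so this stand-in is only illustrative.

```python
import numpy as np

def adapt_forecast(raw_forecast, past_errors, window=5):
    """Adjust a raw price forecast by the expected error, estimated here
    as the mean of the most recent forecast errors (forecast - real)."""
    expected_error = np.mean(past_errors[-window:])
    return raw_forecast - expected_error

# Illustrative: the forecaster has been running ~2 EUR/MWh high lately,
# so the adapted forecast is pulled down toward the real value.
errors = [1.8, 2.3, 2.1, 1.7, 2.0]      # forecast - real, EUR/MWh
print(adapt_forecast(raw_forecast=45.0, past_errors=errors))  # ~43.0
```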
Abstract:
In recent years the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has increased significantly. Power systems aim at lowering operational costs, which requires adequate energy resource management. In this context, load consumption management plays an important role, and optimization strategies are necessary to adjust consumption to the supply profile. These optimization strategies can be integrated into demand response programs. Controlling the energy consumption of an intelligent house aims at optimizing its load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep the power consumption at, or below, a specified consumption limit. This limit is determined according to the consumer's strategy, taking into account renewable-based microgeneration, energy prices, supplier solicitations, and the consumer's preferences. The proposed approach is compared with a mixed-integer non-linear approach.
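A minimal genetic-algorithm sketch under assumed data follows: a chromosome keeps or curtails each controllable load, and the fitness penalizes exceeding the consumption limit heavily and curtailing preferred loads lightly. Loads, preference weights, and GA settings are invented for illustration, not taken from the paper or its SCADA system.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical controllable loads (kW) and consumer preference weights
# (higher weight = the consumer would rather keep that load running).
loads = np.array([2.0, 1.5, 3.0, 0.8, 1.2])
prefs = np.array([5.0, 1.0, 3.0, 1.0, 2.0])
limit = 5.0  # consumption limit (kW) from price/supplier/microgeneration

def fitness(chrom):
    """chrom[i] = 1 keeps load i on, 0 curtails it."""
    power = float(loads @ chrom)
    discomfort = float(prefs @ (1 - chrom))
    penalty = 1000.0 * (power - limit) if power > limit else 0.0
    return -discomfort - penalty

pop = rng.integers(0, 2, size=(30, loads.size))
for _ in range(100):
    scores = np.array([fitness(c) for c in pop])
    elite = pop[np.argsort(scores)[-10:]]            # selection
    moms = elite[rng.integers(0, 10, 30)]
    dads = elite[rng.integers(0, 10, 30)]
    cross = rng.random((30, loads.size)) < 0.5       # uniform crossover
    pop = np.where(cross, moms, dads)
    mutate = rng.random(pop.shape) < 0.05            # bit-flip mutation
    pop = np.where(mutate, 1 - pop, pop)

best = pop[np.argmax([fitness(c) for c in pop])]
print("loads kept:", best, "| total:", float(loads @ best), "kW")
```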
Abstract:
This paper presents a methodology for distribution network reconfiguration in the presence of outages, in order to choose the reconfiguration with the lowest power losses. The methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modelling for system component outage parameters. Fuzzy membership functions of system component outage parameters are obtained from statistical records. A hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models allows capturing both the randomness and the fuzziness of component outage parameters. Once the system states are obtained by Monte Carlo simulation, a logical programming algorithm is applied to obtain all possible reconfigurations for every system state. In order to evaluate line flows and bus voltages, and to identify any overloading and/or voltage violations, a distribution power flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology to a practical case, the paper includes a case study that considers a real distribution network.
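A hedged sketch of the hybrid fuzzy/Monte Carlo sampling step follows, assuming triangular fuzzy unavailabilities per line; the reconfiguration and power-flow stages are not reproduced. Each draw picks a crisp unavailability from the fuzzy number, then samples the line's up/down state.

```python
from collections import Counter
import numpy as np

rng = np.random.default_rng(7)

# Triangular fuzzy unavailabilities (low, mode, high) per line, as could
# be fitted from statistical failure/repair records.
fuzzy_U = [(0.01, 0.02, 0.05), (0.02, 0.04, 0.08), (0.01, 0.03, 0.06)]

def sample_system_state():
    """One Monte Carlo draw: pick each line's unavailability from its
    triangular fuzzy number, then decide up (1) / down (0)."""
    state = []
    for low, mode, high in fuzzy_U:
        u = rng.triangular(low, mode, high)   # fuzziness of the parameter
        state.append(int(rng.random() >= u))  # randomness of the outage
    return tuple(state)

# Collect states; each distinct state would then be fed to the logical
# reconfiguration algorithm and a distribution power flow.
states = Counter(sample_system_state() for _ in range(10000))
print(states.most_common(3))
```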