868 results for "methodology for cataloguing"
Abstract:
In the last 7 years, a method has been developed in Brazil to analyse building energy performance using computer simulation. The method combines analysis of building design plans and documentation, walk-through visits, electric and thermal measurements, and the use of an energy simulation tool (the DOE-2.1E code). The method was used to model more than 15 office buildings (more than 200,000 m²) located between 12.5° and 27.5° South latitude. The paper describes the basic methodology, with data for one building, and presents additional results for six other cases. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the outputs of two independent implementations in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
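The core of back-to-back testing can be illustrated with a minimal sketch (not the paper's observer-based method): two independently written implementations of the same model equation are driven with identical inputs, and any systematic nonzero residual flags a coding error. The model equation, the seeded error, and all names below are invented for illustration.

```python
import random

# Two hypothetical independent implementations of the same model equation
# (a first-order thermal response). impl_b carries a deliberate coding
# error in the gain term, standing in for a real mistake to be detected.
def impl_a(t_in, t_out, k=0.3):
    return t_in + k * (t_out - t_in)

def impl_b(t_in, t_out, k=0.3):
    return t_in + k * (t_out - t_in) * 1.05  # seeded error

def back_to_back(inputs):
    """Drive both implementations with identical inputs and return the
    residual sequence (output_a - output_b)."""
    return [impl_a(t_in, t_out) - impl_b(t_in, t_out)
            for t_in, t_out in inputs]

random.seed(1)
inputs = [(random.uniform(18, 24), random.uniform(-5, 35))
          for _ in range(100)]
res = back_to_back(inputs)

# A fault-free pair would give residuals that are identically zero;
# systematic nonzero structure reveals (and helps localize) the error.
print(max(abs(r) for r in res) > 0)
```

In the paper's full methodology, the structure imposed on these residuals additionally allows each error to be isolated to a specific equation; the sketch above only shows the detection step.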
Abstract:
Nowadays, despite improvements in usability and intuitiveness, users still have to adapt to the systems proposed to them in order to satisfy their needs. For instance, they must learn how to achieve tasks, how to interact with the system, and how to fulfil the system's specifications. This paper proposes an approach to improve this situation by enabling graphical user interface redefinition through virtualization and computer vision, with the aim of increasing the system's usability. To achieve this goal, the approach is based on enriched task models, virtualization, and picture-driven computing.
Abstract:
This paper presents a study carried out to evaluate students' perception of the development and use of remote Control and Automation education kits developed by two universities. Three projects, based on real-world environments, were implemented and operated both locally and remotely. Students implemented the kits using their theoretical and practical knowledge, with the teachers acting as catalysts in the learning process. Once the kits were operational, end-user students became acquainted with them in the course curricular units. It is the authors' belief that successful results were achieved not only in learning progress in the Automation and Control fields (hard skills) but also in the development of the students' soft skills, leading to encouraging and rewarding goals, motivating their future decisions, and promoting synergies in their work. The design of experimental learning kits by students, under teacher supervision, for future use in course curricula by end-user students is an advantageous and rewarding experience.
Abstract:
The current level of customer demand in the electronics industry requires the production of parts with an extremely high level of reliability and quality to ensure the complete confidence of the end customer. Automatic Optical Inspection (AOI) machines play an important role in the monitoring and detection of errors during the manufacturing process for printed circuit boards. These machines present images of products with probable assembly mistakes to an operator, who decides whether the product has a real defect or whether the detection was a false alarm. Operator training is an important factor in obtaining a lower rate of evaluation failure by the operator and, consequently, a lower rate of actual defects that slip through to the following processes. The Gage R&R methodology for attributes is part of a Six Sigma strategy to examine the repeatability and reproducibility of an evaluation system, thus giving important feedback on the suitability of each operator in classifying defects. This methodology has already been applied in several industry sectors and services at different processes, with excellent results in the evaluation of subjective parameters. An application for training operators of AOI machines was developed in order to check their fitness and improve future evaluation performance. This application will provide a better understanding of the specific training needs of each operator and will also accompany the evolution of the training programme for new components, which in turn present additional difficulties for operator evaluation. The use of this application will help reduce the number of defects misclassified by operators and passed on to the following steps in the production process. This defect reduction will also contribute to the continuous improvement of operator evaluation performance, which is seen as a quality management goal.
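The repeatability and agreement-with-reference measures at the heart of attribute Gage R&R can be sketched as follows. The operators, boards, and classifications below are invented for illustration, and only two of the many standard attribute-study statistics are computed.

```python
# Hypothetical attribute Gage R&R data: two AOI operators classify the
# same 10 boards twice each ('defect' / 'ok'); `reference` holds the
# true classification established by an expert.
reference = ['defect', 'ok', 'ok', 'defect', 'ok',
             'defect', 'ok', 'ok', 'defect', 'ok']

trials = {
    'op1': [['defect', 'ok', 'ok', 'defect', 'ok',
             'defect', 'ok', 'ok', 'defect', 'ok'],
            ['defect', 'ok', 'ok', 'defect', 'ok',
             'defect', 'ok', 'ok', 'defect', 'ok']],
    'op2': [['defect', 'ok', 'defect', 'defect', 'ok',
             'defect', 'ok', 'ok', 'ok', 'ok'],
            ['defect', 'ok', 'ok', 'defect', 'ok',
             'defect', 'ok', 'ok', 'defect', 'ok']],
}

def repeatability(op_trials):
    """Share of parts on which the operator agrees with him/herself."""
    t1, t2 = op_trials
    return sum(a == b for a, b in zip(t1, t2)) / len(t1)

def effectiveness(op_trials, ref):
    """Share of parts rated consistently AND matching the reference."""
    t1, t2 = op_trials
    return sum(a == b == r for a, b, r in zip(t1, t2, ref)) / len(ref)

for op, t in trials.items():
    print(op, repeatability(t), effectiveness(t, reference))
```

An operator with low repeatability or low agreement with the reference is flagged for targeted retraining, which is exactly the feedback the training application described above exploits.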
Abstract:
LUDA is a research project of Key Action 4, "City of Tomorrow & Cultural Heritage", of the programme "Energy, Environment and Sustainable Development" within the Fifth Framework Programme of the European Commission.
Abstract:
Considering that in most developing countries there are still no comprehensive lists of addresses for a given geographical area, there has always been a problem in drawing samples from the community while ensuring randomisation in the selection of subjects. This article discusses the geographical stratification by socio-economic status used to draw a multistage random sample from a community-based elderly population living in a city like S. Paulo, Brazil. Particular attention is given to the fact that the proportion of elderly people in the total population of a given area appeared to be a good discriminatory variable for such stratification. The validity of the stratification method is analysed in the light of the socio-economic results obtained in the survey.
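The two-stage design described above can be sketched in a few lines. The district data, stratum boundaries, and sample sizes below are all invented; the point is only the mechanics of stratifying on the proportion of elderly residents and then sampling within strata.

```python
import random

random.seed(42)

# Invented census-district data: each district carries the proportion of
# elderly residents, the discriminatory variable used for stratification.
districts = [{'id': i, 'pct_elderly': random.uniform(0.02, 0.18)}
             for i in range(200)]

# Stage 1: rank districts by proportion of elderly people, split them
# into three strata, and draw districts at random within each stratum.
ranked = sorted(districts, key=lambda d: d['pct_elderly'])
strata = [ranked[:66], ranked[66:133], ranked[133:]]
sampled_districts = [d for s in strata for d in random.sample(s, 5)]

# Stage 2: within each sampled district, households are drawn at random
# (represented here by household indices only).
sample = {d['id']: random.sample(range(1000), 20)
          for d in sampled_districts}

print(len(sampled_districts))               # 15 districts
print(sum(len(h) for h in sample.values())) # 300 households
```

Sampling within strata in this way guarantees that each socio-economic band contributes to the sample even without a complete address list for the whole city.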
Abstract:
Exposure assessment is an important step in the risk assessment process and has evolved more quickly than perhaps any other aspect of the four-step risk paradigm (hazard identification, exposure assessment, dose-response analysis, and risk characterization). Nevertheless, some epidemiological studies have associated adverse health effects with a chemical exposure despite inadequate or absent exposure quantification. In addition to the metric used, how truly measurements represent exposure depends on the sampling strategy, the random collection of measurements, and the similarity between the measured and unmeasured exposure groups. Two environmental monitoring methodologies for occupational exposure to formaldehyde were used to assess the influence of metric selection on exposure assessment and, consequently, on the risk assessment process.
Abstract:
Collaborative networks are typically formed by heterogeneous and autonomous entities, so it is natural that each member has its own set of core-values. Since these values somehow drive the behaviour of the involved entities, the ability to quickly identify partners with compatible or common core-values is an important element for the success of collaborative networks. However, tools to assess or measure the level of alignment of core-values are lacking. Since the concept of 'alignment' in this context is still ill-defined and multifaceted, three perspectives are discussed. The first uses a causal maps approach to capture, structure, and represent the influence relationships among core-values. This representation provides the basis for measuring alignment in terms of the structural similarity and influence among value systems. The second perspective considers the compatibility and incompatibility among core-values in order to define the alignment level. Under this perspective we propose a fuzzy inference system to estimate the alignment level, since this approach can deal with variables that are vaguely defined and whose inter-relationships are difficult to establish. Another advantage of this method is the possibility of incorporating expert human judgement in the definition of the alignment level. The last perspective uses a Bayesian belief network method, selected to assess the alignment level based on members' past behaviour. An application example is presented in which the details of each method are discussed.
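To make the second (fuzzy) perspective concrete, here is a minimal sketch of a fuzzy-style estimator: compatibility and incompatibility scores in [0, 1] are fuzzified with triangular membership functions and combined through a small zero-order Sugeno-style rule base. The membership functions, rule base, and numeric outputs are invented for illustration and are not the paper's actual inference system.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def low(x):  return tri(x, -0.5, 0.0, 0.5)
def high(x): return tri(x, 0.5, 1.0, 1.5)

def alignment(compat, incompat):
    """Estimate an alignment level in [0, 1] from fuzzy rules
    (illustrative rule base, weighted-average defuzzification)."""
    rules = [
        (min(high(compat), low(incompat)),  1.0),  # aligned
        (min(high(compat), high(incompat)), 0.5),  # mixed signals
        (min(low(compat),  low(incompat)),  0.5),  # little evidence
        (min(low(compat),  high(incompat)), 0.0),  # misaligned
    ]
    w = sum(strength for strength, _ in rules)
    return sum(s * v for s, v in rules) / w if w else 0.5

print(alignment(0.9, 0.1))  # strongly compatible partners
print(alignment(0.2, 0.8))  # incompatible core-values
```

The vagueness-tolerance the abstract mentions shows up here directly: inputs need not be crisp judgements, and an expert can tune behaviour simply by editing the rule consequents.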
Abstract:
INTRODUCTION: Previous cross-sectional studies have shown a high prevalence of chronic disease and disability among the elderly. Given Brazil's rapid aging process and the obvious consequences of the growing number of old people with chronic diseases and associated disabilities for the provision of health services, a need was felt for a study that would overcome the limitations of cross-sectional data and shed some light on the main factors determining whether a person will live longer and free of disabling diseases, the so-called successful aging. The methodology of the first follow-up study of elderly residents in Brazil is presented. METHOD: The profile of the initial cohort is compared with previous cross-sectional data, and an in-depth analysis of nonresponse is carried out in order to assess the validity of future longitudinal analyses. The EPIDOSO (Epidemiologia do Idoso) Study conducted a two-year follow-up of 1,667 elderly people (65+) living in S. Paulo. The study consisted of two waves, each comprising household, clinical, and biochemical surveys. RESULTS AND CONCLUSIONS: In general, the initial cohort showed a profile similar to previous cross-sectional samples in S. Paulo: a majority of women, mostly widows, living in multigenerational households, and a high prevalence of chronic illnesses, psychiatric disturbances, and physical disabilities. Despite all the difficulties inherent in follow-up studies, there was a fairly low rate of nonresponse to the household survey after two years, which did not materially affect the representativeness of the cohort at the final household assessment, making unbiased longitudinal analysis possible. Concerning the clinical and blood-sampling surveys, respondents tended to be younger and less disabled than nonrespondents, limiting the use of the clinical and laboratory data to longitudinal analyses of a healthier cohort. It is worth mentioning that gender, education, family support, and socioeconomic status were not important determinants of nonresponse, as is often the case.
Abstract:
There are complex and diverse methodological problems involved in the clinical and epidemiological study of respiratory diseases and their etiological factors. The association of urban growth, industrialization, and environmental deterioration with respiratory diseases makes it necessary to pay more attention to this research area with a multidisciplinary approach. Appropriate study designs and statistical techniques to analyze and improve our understanding of pathological events and their causes must be implemented to reduce the growing morbidity and mortality through better preventive actions and health programs. The objective of this article is to review the most common methodological problems in this research area and to present the most commonly used statistical tools.
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator used to model market players and simulate their operation in the market. Market players are entities with specific characteristics and objectives, making their decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. This method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error of the next forecast and uses it to adapt the actual forecast. The goal is to bring the forecast closer to the real value, reducing the forecasting error.
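The adaptation step can be sketched very simply. The stand-in below predicts the next error as the mean of the last few forecasting errors, which is only a crude proxy for the pattern-recognition described in the abstract; the price figures are invented.

```python
def adapted_forecast(raw_forecast, past_forecasts, past_reals, window=3):
    """Correct a raw price forecast by the mean error over the last
    `window` periods (a simple stand-in for error-pattern recognition).
    Errors are defined as real minus forecast."""
    errors = [r - f for f, r in zip(past_forecasts, past_reals)]
    expected_error = sum(errors[-window:]) / min(window, len(errors))
    return raw_forecast + expected_error

# Invented example: the tool has been under-forecasting by ~2 EUR/MWh,
# so the adapted forecast is shifted upwards toward the real value.
past_f = [40.0, 42.0, 41.0, 43.0]
past_r = [42.0, 44.5, 42.5, 45.0]
print(adapted_forecast(44.0, past_f, past_r))  # 46.0
```

A richer pattern model (as in the paper's method) would replace the moving average with a learned mapping from recent error sequences to the expected next error, but the correction step itself stays the same.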
Abstract:
In recent years the use of several new resources in power systems, such as distributed generation, demand response, and, more recently, electric vehicles, has significantly increased. Power systems aim at lowering operational costs, requiring adequate energy resource management. In this context, load consumption management plays an important role, making it necessary to use optimization strategies to adjust consumption to the supply profile. These optimization strategies can be integrated into demand response programs. The control of the energy consumption of an intelligent house aims to optimize load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer strategy, taking into account renewable-based micro-generation, energy price, supplier solicitations, and consumer preferences. The proposed approach is compared with a mixed-integer non-linear approach.
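A genetic algorithm for this kind of load curtailment can be sketched as follows. The loads, comfort weights, consumption limit, and GA parameters are all invented, and the fitness function is a deliberately simple proxy (keep as much comfort-weighted load as possible without exceeding the limit) rather than the authors' actual formulation.

```python
import random

random.seed(0)

# Hypothetical household loads: (name, power in W, comfort weight);
# higher weight means the consumer prefers to keep the load running.
loads = [('heater', 1500, 0.9), ('oven', 2000, 0.6), ('dryer', 1200, 0.3),
         ('tv', 150, 0.8), ('lights', 200, 1.0), ('pool_pump', 800, 0.1)]
LIMIT = 3000  # consumption limit in W

def fitness(chromosome):
    """Reward kept comfort-weighted load; hard-penalize limit violations."""
    power = sum(p for (_, p, _), on in zip(loads, chromosome) if on)
    comfort = sum(w * p for (_, p, w), on in zip(loads, chromosome) if on)
    return comfort - (1e6 if power > LIMIT else 0.0)

def genetic_algorithm(pop_size=30, generations=60, p_mut=0.1):
    """Elitist GA over binary on/off chromosomes, one gene per load."""
    pop = [[random.randint(0, 1) for _ in loads] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)           # parent selection
            cut = random.randrange(1, len(loads))    # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < p_mut else g
                             for g in child])        # bit-flip mutation
        pop = elite + children
    return max(pop, key=fitness)

best = genetic_algorithm()
kept = [name for (name, _, _), on in zip(loads, best) if on]
print(kept, sum(p for (_, p, _), on in zip(loads, best) if on))
```

In the paper's setting the chromosome would also encode load *reduction* levels, not just on/off curtailment, and the limit would come from the SCADA system's consumer strategy rather than a constant.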
Abstract:
In many countries the use of renewable energy is increasing due to the introduction of new energy and environmental policies. Thus, the focus on the efficient integration of renewable energy into electric power systems is becoming extremely important. Several European countries have already achieved high penetration of wind-based electricity generation and are gradually evolving towards intensive use of this generation technology. The introduction of wind-based generation poses new challenges for power system operators, mainly due to the variability and uncertainty of weather conditions and, consequently, of wind-based generation. In order to deal with this uncertainty and to improve power system efficiency, adequate wind forecasting tools must be used. This paper proposes a data-mining-based methodology for very short-term wind forecasting, suitable for dealing with large real databases. The paper includes a case study based on a real database covering the last three years of wind speed data, and presents results for wind speed forecasting at 5-minute intervals.
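One common data-mining approach to very short-term forecasting, shown here as an illustrative sketch (not necessarily the paper's method), is nearest-neighbour pattern matching: the latest window of 5-minute readings is compared against historical windows, and the values that followed the closest matches are averaged. The wind-speed series below is synthetic.

```python
import math
import random

random.seed(7)
# Synthetic 5-minute wind-speed series (m/s): slow oscillation plus noise.
series = [8 + 3 * math.sin(i / 20) + random.uniform(-0.5, 0.5)
          for i in range(2000)]

def knn_forecast(history, window=6, k=5):
    """Forecast the next 5-minute value by averaging the values that
    followed the k historical windows most similar to the latest one."""
    pattern = history[-window:]
    candidates = []
    for i in range(len(history) - window - 1):
        past = history[i:i + window]
        dist = sum((a - b) ** 2 for a, b in zip(past, pattern))
        candidates.append((dist, history[i + window]))
    candidates.sort(key=lambda c: c[0])
    return sum(v for _, v in candidates[:k]) / k

forecast = knn_forecast(series[:-1])
print(round(forecast, 2), round(series[-1], 2))  # forecast vs actual
```

This brute-force scan is linear in the database size; on the large real databases the paper targets, an indexed or tree-based neighbour search would replace the inner loop.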