853 results for Calculation methodology


Relevance: 20.00%

Abstract:

Bound and resonance states of HO2 have been calculated quantum mechanically by the Lanczos homogeneous filter diagonalization method [Zhang and Smith, Phys. Chem. Chem. Phys. 3, 2282 (2001); J. Chem. Phys. 115, 5751 (2001)] for nonzero total angular momentum J = 1, 2, 3. For the lower bound states, agreement between the results in this paper and previous work is quite satisfactory, while for the high-lying bound states and the resonances these are the first reported results. A helicity quantum number V assignment (within the helicity conserving approximation) is performed, and the results indicate that for the lower bound states the V quantum numbers can be assigned unambiguously, but for the resonances the V helicity quantum numbers cannot be assigned due to strong mixing. In fact, this mixing already appears for the high-lying bound states. These results indicate that the helicity conserving approximation is not adequate for resonance state calculations and that exact quantum calculations are needed to describe the reaction dynamics of the HO2 system accurately. Analysis of the resonance widths shows that most of the resonances are overlapping, and interference between them leads to large fluctuations from one resonance to another. In accord with the conclusions from earlier J = 0 calculations, this indicates that the dissociation of HO2 is essentially irregular. (C) 2003 American Institute of Physics.
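
For context, the core of any Lanczos-based filter diagonalization scheme is the basic Lanczos recursion, which reduces a large Hamiltonian matrix to tridiagonal form using only matrix-vector products. A minimal Python sketch, assuming a real symmetric Hamiltonian H supplied as a NumPy array (names are illustrative, not taken from the cited papers):

import numpy as np

def lanczos_ritz_values(H, v0, m):
    """Run m Lanczos steps and return Ritz values approximating H's spectrum."""
    n = len(v0)
    V = np.zeros((m, n))              # Lanczos vectors
    alpha = np.zeros(m)               # diagonal of the tridiagonal matrix T
    beta = np.zeros(m - 1)            # off-diagonal of T
    V[0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = H @ V[j]                  # the only operation that touches H
        alpha[j] = V[j] @ w
        w = w - alpha[j] * V[j]
        if j > 0:
            w = w - beta[j - 1] * V[j - 1]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)

# Example: approximate the spectrum of a random symmetric test matrix
A = np.random.rand(500, 500); A = (A + A.T) / 2
print(lanczos_ritz_values(A, np.random.rand(500), 80)[:5])

Filter diagonalization refines this idea by applying spectral filters within the Krylov space to target a chosen energy window, which is what makes high-lying bound states and resonances accessible.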

Relevance: 20.00%

Abstract:

In modern magnetic resonance imaging (MRI), patients are exposed to strong, nonuniform static magnetic fields outside the central imaging region, where movement of the body may induce electric currents in tissues that could potentially be harmful. This paper presents theoretical investigations into the spatial distribution of the electric fields and currents induced in the patient when moving into the MRI scanner, and also for head motion at various positions in the magnet. The numerical calculations are based on an efficient, quasi-static, finite-difference scheme and an anatomically realistic, full-body, male model. 3D field profiles from an actively shielded 4T magnet system are used, and the body model is projected through the field profile at a range of velocities. The simulations show that it is possible to induce electric fields and currents near the level of physiological significance under some circumstances, and they provide insight into the spatial characteristics of the induced fields. The results are extrapolated to very high field strengths, and tabulated data show how the expected induced currents and fields vary with both movement velocity and field strength. (C) 2003 Elsevier Science (USA). All rights reserved.
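
The underlying mechanism is easy to state: a body moving with velocity v through a nonuniform static field sees, in its own frame, a changing field dB/dt ≈ (v·∇)B, and Faraday's law then gives the induced electric field and, through tissue conductivity, the current density. A rough order-of-magnitude sketch in Python (all numbers assumed for illustration; this is not the paper's quasi-static finite-difference model):

# Illustrative estimate: a patient moving through a magnet's fringe-field gradient
v = 0.5              # body velocity along the bore axis, m/s (assumed)
dB_dz = 2.0          # fringe-field gradient, T/m (assumed)
r = 0.15             # effective conducting loop radius in tissue, m (assumed)
sigma = 0.2          # tissue conductivity, S/m (assumed)

dB_dt = v * dB_dz                 # field rate of change seen by the body, T/s
E_induced = 0.5 * r * dB_dt       # Faraday's law for a circular loop, V/m
J = sigma * E_induced             # induced current density, A/m^2
print(f"E = {E_induced:.3f} V/m, J = {J * 1000:.1f} mA/m^2")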

Relevance: 20.00%

Abstract:

This study compared accumulated oxygen deficit data derived from two different exercise protocols, with the aim of producing a less time-consuming test specifically for use with athletes. Six road and four track male endurance cyclists performed two series of cycle ergometer tests. The first series involved five 10 min submaximal cycle exercise bouts, a V̇O2peak test and a 115% V̇O2peak test. Data from these tests were used to estimate the accumulated oxygen deficit according to the calculations of Medbø et al. (1988). In the second series of tests, participants performed a 15 min incremental cycle ergometer test followed, 2 min later, by a 2 min variable-resistance test in which they completed as much work as possible while pedalling at a constant rate. Analysis revealed that the accumulated oxygen deficit calculated from the first series of tests was higher (P < 0.02) than that calculated from the second series: 52.3 +/- 11.7 and 43.9 +/- 6.4 ml·kg⁻¹, respectively (mean +/- s). Other significant differences between the two protocols were observed for V̇O2peak, total work and maximal heart rate; all were higher during the modified protocol (P
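
For readers unfamiliar with the Medbø et al. (1988) procedure, the deficit is estimated by regressing steady-state V̇O2 on power across the submaximal bouts, extrapolating the regression to the supramaximal work rate to obtain the O2 demand, and subtracting the O2 actually consumed. A minimal Python sketch with made-up numbers:

import numpy as np

# Submaximal bouts: power (W) vs steady-state VO2 (L/min) -- illustrative data
power = np.array([100, 150, 200, 250, 300])
vo2   = np.array([1.6, 2.2, 2.8, 3.4, 4.0])

slope, intercept = np.polyfit(power, vo2, 1)   # linear O2 cost of exercise

supra_power = 400        # supramaximal work rate, W (assumed)
duration_min = 2.5       # time to exhaustion, min (assumed)
measured_o2 = 9.0        # accumulated O2 uptake during the bout, L (assumed)

demand = (slope * supra_power + intercept) * duration_min  # estimated O2 demand, L
aod = demand - measured_o2                                 # accumulated O2 deficit, L
print(f"AOD = {aod:.2f} L")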

Relevance: 20.00%

Abstract:

For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the outputs of two independent implementations in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that these geometric properties extend to multiple errors in either of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
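
As a concrete illustration of plain back-to-back testing (a toy sketch, not the paper's observer-based scheme): two independent implementations of the same model are driven by identical inputs and their outputs differenced, so a nonzero residual flags a coding error. The paper's modified observer goes further by structuring the residual so the error can be traced to a specific equation.

import numpy as np

def simulate_a(x0, u, dt):
    """Reference implementation of a simple model (illustrative dynamics)."""
    xs = [x0]
    for uk in u:
        xs.append(xs[-1] + dt * (-0.5 * xs[-1] + uk))  # explicit Euler step
    return np.array(xs)

def simulate_b(x0, u, dt):
    """Independent re-implementation; a coding error is planted here."""
    xs = [x0]
    for uk in u:
        xs.append(xs[-1] + dt * (-0.5 * xs[-1] - uk))  # BUG: sign of the input
    return np.array(xs)

u = np.sin(np.linspace(0, 10, 200))                        # identical test input
res = simulate_a(1.0, u, 0.05) - simulate_b(1.0, u, 0.05)  # residual sequence
print("max |residual|:", np.abs(res).max())   # nonzero -> implementations disagree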

Relevance: 20.00%

Abstract:

This master's research aimed primarily to investigate the mental calculation strategies used by students in a 5th grade/6th year class of elementary school when solving addition and subtraction calculations. To reach this goal we sought to answer the following questions: Which mental calculation strategies do 5th grade/6th year students employ when solving addition and subtraction calculations? What relationships exist between the type of calculation involved and the strategy adopted to solve it? To answer these questions we followed a qualitative methodology, configured as an ethnographic case study. The fieldwork was carried out in a 5th grade/6th year elementary school class at a state public school in the municipality of Serra. The research took place from May to December 2013. Eight students solved a diagnostic activity composed of four sequences of mental calculations, namely, basic facts of the numbers 5, 10, 20 and 100, among additions and subtractions close to those results. All students took part in the interview stage. Of the eight students, data were selected from three who took part in the other stages of the research. The records produced by the students in the class observation stage, the diagnostic stage and the didactic intervention stage, the notes in the field notebook and some audio recordings served as data sources. We used the strategies identified by Beishuizen (1997), Klein and Beishuizen (1998), Thompson (1999, 2000) and Lucangeli et al. (2003) as analysis categories. Through the data analysis, we found that students' choices of mental calculation strategies varied with the type of calculation sequence, the arithmetic operation (addition or subtraction) and their emotional state during the activity. It was possible to identify the use of two combined strategies, the mental algorithm and finger-counting strategies, in a large share of the calculations. The use of the mental algorithm proved to be a procedure with a heavy mental load and, in some additions without carrying, served only as a support for numerical visualization, being executed by the student from left to right, similarly to the numerical decomposition strategy. The data from this study point to: (i) the need to work on basic addition and subtraction number facts through mental calculation in a systematic way in the classroom; (ii) the need to teach authentic mental calculation strategies so that students do not become dependent on strategies such as counting and the mental algorithm, which are harder to execute successfully; (iii) the importance of interviewing students individually in order to understand and assess their development in mental calculation tasks.

Relevance: 20.00%

Abstract:

Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or as a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative for slicing functional programs.
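
The projection-based criterion is easy to picture even outside a full calculational setting. In this toy Python sketch (illustrative names, not from the paper), the program computes a record of results and the slicing criterion is a projection function; composing the two and simplifying would discard every computation the projection cannot observe:

from typing import Callable, Dict

def program(xs: list) -> Dict[str, float]:
    """Original program: computes several independent results."""
    return {
        "sum": sum(xs),
        "mean": sum(xs) / len(xs),
        "maximum": max(xs),
    }

def projection(out: Dict[str, float]) -> float:
    """Slicing criterion: we only care about the mean."""
    return out["mean"]

# The slice is the composition; calculationally simplifying
# projection . program would eliminate the sum/maximum code.
slice_fn: Callable[[list], float] = lambda xs: projection(program(xs))
print(slice_fn([1.0, 2.0, 3.0]))  # 2.0

The point of a calculational formalism such as Bird-Meertens is that this composition can be simplified by equational reasoning rather than by traversing a dependence graph.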


Relevance: 20.00%

Abstract:

Purpose – Casting defects are usually easy to characterize, but eradicating them can be a difficult task. In many cases, defects are caused by the combined effect of different factors whose identification is often difficult. Moreover, the real non-quality costs are usually unknown, and even neglected. This paper describes the development of a modular tool for quality improvement in foundries; its main objective is to present the tool's application potential and the foundry process areas that it covers. Design/methodology/approach – The integrated model was conceived as an expert system, designated Qualifound, which performs both qualitative and quantitative analyses. In the qualitative analysis mode, the nomenclature and description of defects are based on the classification suggested by the International Committee of the Foundry Technical Association. A database of defects was thus established, enabling defects to be associated with the relevant process operations and their possible causes to be identified. The quantitative analysis mode deals with the numbers of produced and rejected castings and includes the calculation of the non-quality costs. Findings – The validation of Qualifound was carried out in a Portuguese foundry whose quality system had been certified according to the ISO 9000 standards. Qualifound was used in every management area, and it was concluded that the application had the technological requisites needed to provide the information the foundry management requires to improve process quality. Originality/value – The paper presents a successful application of a software tool for quality improvement in foundries.
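
At its simplest, the quantitative mode described above reduces to bookkeeping over production and rejection counts. A minimal sketch of a non-quality cost calculation (the field names and cost model are assumptions for illustration, not Qualifound's actual schema):

from dataclasses import dataclass

@dataclass
class CastingBatch:
    produced: int        # castings produced
    rejected: int        # castings rejected for defects
    unit_cost: float     # production cost per casting
    rework_cost: float   # average cost to rework one recoverable reject

    def rejection_rate(self) -> float:
        return self.rejected / self.produced

    def non_quality_cost(self, reworkable_fraction: float = 0.4) -> float:
        """Scrap cost for unrecoverable rejects plus rework cost for the rest."""
        reworked = self.rejected * reworkable_fraction
        scrapped = self.rejected - reworked
        return scrapped * self.unit_cost + reworked * self.rework_cost

batch = CastingBatch(produced=1200, rejected=54, unit_cost=35.0, rework_cost=12.0)
print(f"rejection rate: {batch.rejection_rate():.1%}")
print(f"non-quality cost: {batch.non_quality_cost():.2f}")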

Relevance: 20.00%

Abstract:

Nowadays, despite improvements in usability and intuitiveness, users still have to adapt to the systems they are offered in order to satisfy their needs: they must learn how to achieve tasks, how to interact with the system, and how to conform to the system's specifications. This paper proposes an approach to improve this situation by enabling graphical user interface redefinition through virtualization and computer vision, with the aim of increasing the system's usability. To achieve this goal, the approach is based on enriched task models, virtualization and picture-driven computing.

Relevance: 20.00%

Abstract:

This paper presents a study carried out to evaluate students' perception of the development and use of remote Control and Automation education kits developed by two universities. Three projects, based on real-world environments, were implemented and can be operated both locally and remotely. Students implemented the kits using their theoretical and practical knowledge, with the teachers acting as catalysts in the learning process. Once the kits were operational, end-user students became acquainted with them in the course curricular units. It is the authors' belief that successful results were achieved not only in learning progress in the Automation and Control fields (hard skills) but also in the development of the students' soft skills, leading to encouraging and rewarding goals, motivating their future decisions and promoting synergies in their work. The design of experimental learning kits by students, under teacher supervision, for future use in course curricula by end-user students is an advantageous and rewarding experience.

Relevance: 20.00%

Abstract:

The current level of customer demand in the electronics industry requires the production of parts with an extremely high level of reliability and quality, to ensure the complete confidence of the end customer. Automatic Optical Inspection (AOI) machines play an important role in monitoring and detecting errors during the manufacturing process for printed circuit boards. These machines present images of products with probable assembly mistakes to an operator, who decides whether the product has a real defect or whether this was a false automated detection. Operator training is an important factor in obtaining a lower rate of evaluation failure by the operator and, consequently, a lower rate of actual defects that slip through to the following processes. The Gage R&R methodology for attributes is part of a Six Sigma strategy to examine the repeatability and reproducibility of an evaluation system, thus giving important feedback on the suitability of each operator for classifying defects. This methodology has already been applied in several industry sectors and services, at different processes, with excellent results in the evaluation of subjective parameters. An application for training operators of AOI machines was developed in order to check their fitness and improve future evaluation performance. This application provides a better understanding of the specific training needs of each operator, and it also tracks the evolution of the training programme for new components, which in turn present additional difficulties for operator evaluation. The use of this application will help reduce the number of defects misclassified by the operators and passed on to the following steps in the production process. This defect reduction will also contribute to the continuous improvement of operator evaluation performance, which is seen as a quality management goal.
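
In an attribute Gage R&R study of this kind, each operator classifies the same set of reference images several times, and agreement is scored within operators (repeatability) and against the known standard (effectiveness). A minimal scoring sketch with made-up data (not the application's actual implementation):

# Reference verdicts for 8 inspection images: True = real defect
reference = [True, False, True, True, False, False, True, False]

# Each operator judged every image twice (two trials)
trials = {
    "op1": [[True, False, True, True, False, False, True, False],
            [True, False, True, False, False, False, True, False]],
    "op2": [[True, True, True, True, False, False, True, False],
            [True, True, True, True, False, False, True, False]],
}

for op, (t1, t2) in trials.items():
    repeat = sum(a == b for a, b in zip(t1, t2)) / len(reference)
    effective = sum(a == b == r for a, b, r in zip(t1, t2, reference)) / len(reference)
    print(f"{op}: repeatability {repeat:.0%}, effectiveness vs standard {effective:.0%}")

A full attribute Gage R&R study would also score between-operator agreement (reproducibility) and attach confidence intervals; the sketch shows only the per-operator core.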

Relevance: 20.00%

Abstract:

LUDA is a research project of Key Action 4, "City of Tomorrow & Cultural Heritage", of the programme "Energy, Environment and Sustainable Development" within the Fifth Framework Programme of the European Commission.

Relevance: 20.00%

Abstract:

Considering that most developing countries still have no comprehensive lists of addresses for a given geographical area, ensuring randomisation in the selection of subjects has always been a problem when drawing samples from the community. This article discusses the geographical stratification by socio-economic status used to draw a multistage random sample from a community-based elderly population living in a city like S. Paulo, Brazil. Particular attention is given to the fact that the proportion of elderly people in the total population of an area appeared to be a good discriminatory variable for such stratification. The validity of the stratification method is analysed in the light of the socio-economic results obtained in the survey.
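
A stratified multistage draw of the kind described can be sketched in a few lines: census tracts are first stratified by a proxy variable (here, the proportion of elderly residents, as the article suggests), then tracts are sampled within strata and households within tracts. Illustrative Python code, not the survey's actual design (all frame data and cut-offs are invented):

import random

random.seed(42)

# Illustrative sampling frame: (tract_id, proportion_elderly, n_households)
tracts = [(i, random.uniform(0.02, 0.25), random.randint(200, 800))
          for i in range(300)]

# Stage 0: stratify tracts by the proportion of elderly residents
strata = {"low": [], "mid": [], "high": []}
for t in tracts:
    key = "low" if t[1] < 0.08 else "mid" if t[1] < 0.15 else "high"
    strata[key].append(t)

sample = []
for key, members in strata.items():
    # Stage 1: sample tracts within each stratum
    chosen = random.sample(members, k=min(5, len(members)))
    for tract_id, _, n_hh in chosen:
        # Stage 2: sample households within each selected tract
        sample += [(tract_id, hh) for hh in random.sample(range(n_hh), k=20)]

print(len(sample), "households selected")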

Relevance: 20.00%

Abstract:

The main intent of this work is to determine the Specific Absorption Rate (SAR) in human head tissues exposed to radiation from 900 and 1800 MHz sources, since these are the typical frequencies of today's mobile communication systems. To determine the SAR, the FDTD (Finite-Difference Time-Domain) method was used; it is a numerical time-domain method obtained from Maxwell's equations in differential form. To this end, a two-dimensional computational model of the human head was implemented with cells of the smallest possible size, respecting the limits of the available computational resources. It was possible to verify the very good efficiency of the FDTD method in solving this type of problem.
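
For context, the heart of FDTD is a leapfrog update of the E and H fields on a staggered grid, derived directly from the curl equations. A minimal 1D vacuum sketch in normalized units (illustrative, far simpler than the paper's 2D head model):

import numpy as np

n_cells, n_steps = 200, 400
ez = np.zeros(n_cells)          # electric field samples
hy = np.zeros(n_cells - 1)      # magnetic field, staggered half a cell
c = 0.5                         # Courant number (c0*dt/dx), <= 1 for stability

for step in range(n_steps):
    hy += c * (ez[1:] - ez[:-1])            # update H from the curl of E
    ez[1:-1] += c * (hy[1:] - hy[:-1])      # update E from the curl of H
    ez[n_cells // 2] += np.exp(-((step - 30) / 10) ** 2)  # soft Gaussian source

print("peak |Ez| =", np.abs(ez).max())

In an actual SAR computation, the field in each tissue cell is then converted via SAR = σ|E|²/(2ρ), with σ the tissue conductivity, ρ the tissue density, and |E| the peak field amplitude.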

Relevance: 20.00%

Abstract:

Exposure assessment is an important step of the risk assessment process and has evolved more quickly than perhaps any other aspect of the four-step risk paradigm (hazard identification, exposure assessment, dose-response analysis, and risk characterization). Nevertheless, some epidemiological studies have associated adverse health effects with a chemical exposure on the basis of inadequate or absent exposure quantification. In addition to the metric used, how truly measurements represent exposure depends on the sampling strategy, the random collection of measurements, and the similarity between the measured and unmeasured exposure groups. Two environmental monitoring methodologies for occupational exposure to formaldehyde were used to assess the influence of metric selection on exposure assessment and, consequently, on the risk assessment process.
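
The "metric" at issue is typically how measurements are aggregated: a full-shift time-weighted average smooths over short peaks, while a ceiling or short-term metric preserves them, and the same shift can look very different under each. A small Python illustration with made-up formaldehyde readings (not the study's data):

# Made-up task samples over one shift: (duration in hours, concentration in ppm)
samples = [(2.0, 0.05), (0.5, 0.90), (3.5, 0.04), (2.0, 0.06)]

total_time = sum(t for t, _ in samples)               # 8.0 h shift
twa = sum(t * c for t, c in samples) / total_time     # time-weighted average
peak = max(c for _, c in samples)                     # short-term/ceiling metric

print(f"8-h TWA: {twa:.3f} ppm, peak: {peak:.2f} ppm")
# The same shift can comply with a TWA limit yet exceed a ceiling limit,
# which is why metric selection changes the risk-assessment outcome.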