884 results for Training and Function Description Analysis
Abstract:
The current level of customer demand in the electronics industry requires the production of parts with an extremely high level of reliability and quality, to ensure complete confidence of the end customer. Automatic Optical Inspection (AOI) machines play an important role in monitoring and detecting errors during the manufacturing process of printed circuit boards. These machines present images of products with probable assembly mistakes to an operator, who decides whether the product has a real defect or whether the detection was an automated false alarm. Operator training is an important factor in obtaining a lower rate of evaluation failures by the operator and, consequently, a lower rate of actual defects that slip through to the following processes. The Gage R&R methodology for attributes is part of a Six Sigma strategy to examine the repeatability and reproducibility of an evaluation system, thus giving important feedback on the suitability of each operator for classifying defects. This methodology has already been applied in several industry sectors and services, at different processes, with excellent results in the evaluation of subjective parameters. An application for training operators of AOI machines was developed in order to check their fitness and improve future evaluation performance. This application provides a better understanding of the specific training needs of each operator, and also makes it possible to follow the evolution of the training program for new components, which in turn present additional difficulties for operator evaluation. The use of this application will contribute to reducing the number of defects misclassified by the operators that are passed on to the following steps in the production process. This defect reduction will also contribute to the continuous improvement of operator evaluation performance, which is seen as a quality management goal.
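As a rough illustration of the kind of attribute agreement scores a Gage R&R study for attributes produces, the sketch below computes within-appraiser (repeatability) and appraiser-versus-reference agreement in Python. The operator ratings and the scoring rules are hypothetical, not those of the developed application.

```python
# Minimal sketch of attribute Gage R&R style agreement scores.
# All data below are hypothetical; the actual application's scoring
# rules are not described in the abstract.

def within_appraiser_agreement(trials):
    """Repeatability: fraction of parts the operator rated identically
    on every repeated trial. `trials` is a list of per-trial rating lists."""
    n_parts = len(trials[0])
    consistent = sum(
        1 for part in range(n_parts)
        if len({trial[part] for trial in trials}) == 1
    )
    return consistent / n_parts

def appraiser_vs_reference(trials, reference):
    """Effectiveness: fraction of parts rated consistently *and* in
    agreement with the known reference classification."""
    n_parts = len(reference)
    correct = sum(
        1 for part in range(n_parts)
        if all(trial[part] == reference[part] for trial in trials)
    )
    return correct / n_parts

# Hypothetical example: one operator, two repeated trials over 8 AOI images,
# labels are "D" (real defect) or "F" (false detection).
trial_1   = ["D", "F", "F", "D", "D", "F", "D", "F"]
trial_2   = ["D", "F", "D", "D", "D", "F", "D", "F"]
reference = ["D", "F", "F", "D", "F", "F", "D", "F"]

print(within_appraiser_agreement([trial_1, trial_2]))        # repeatability
print(appraiser_vs_reference([trial_1, trial_2], reference)) # vs. reference
```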
Abstract:
This paper analyzes the risk-return trade-off in European equities, considering both temporal and cross-sectional dimensions. In our analysis we introduce not only the market portfolio but also 15 industry portfolios comprising the entire market. Several bivariate GARCH models are estimated to obtain the covariance matrix between excess market returns and the industry portfolios, and the existence of a risk-return trade-off is analyzed through a cross-sectional approach using the information in all portfolios. We obtain evidence of a positive and significant risk-return trade-off in the European market. This conclusion is robust across different GARCH specifications and is even more evident after controlling for the main financial crisis during the sample period.
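A minimal sketch of the cross-sectional step described above, assuming the conditional covariances between each industry portfolio and the market have already been produced by the bivariate GARCH estimation. The synthetic data and the simple pooled, no-intercept regression are illustrative assumptions, not the paper's estimator.

```python
# Minimal sketch of the cross-sectional risk-return regression
#   E[r_i,t] = lambda * Cov_t(r_i, r_m),
# assuming the conditional covariances have already been produced by
# bivariate GARCH estimation. All numbers below are synthetic.

import numpy as np

rng = np.random.default_rng(0)

n_portfolios, n_periods = 15, 500
true_lambda = 2.5

# Synthetic conditional covariances and excess returns.
cond_cov = rng.uniform(0.0005, 0.003, size=(n_periods, n_portfolios))
excess_ret = true_lambda * cond_cov + rng.normal(0, 0.01, size=(n_periods, n_portfolios))

# Pooled OLS of excess returns on conditional covariances (no intercept),
# i.e. a rough estimate of the risk-return trade-off coefficient.
x = cond_cov.ravel()
y = excess_ret.ravel()
lambda_hat = (x @ y) / (x @ x)

print(f"estimated risk-return trade-off lambda: {lambda_hat:.2f}")
```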
Abstract:
Translator training and assessment have made use of more and more tools and innovative strategies over the years. The goal to achieve, however, has not changed much: translation quality. To accomplish it, the translator and all the tasks and processes he or she carries out are crucial, with pre-translation and post-translation processes being as important as the translation itself, namely as far as autonomy and reflexive and critical skills are concerned. Finally, the need for and relevance of collaborative tasks and networks among virtual translation communities led us to the decision of implementing ePortfolios as a tool to develop the required skills and extend the use of the Internet in translation. In this paper we describe a case study of a pilot experiment on the use of e-portfolios as a translation training tool and discuss their role in the definition of a clear set of objectives and phases for the completion of each task, by helping students manage project deadlines, improving their knowledge of the construction and management of translation resources, and deepening their awareness of the concepts related to the development of e-portfolios.
Abstract:
This paper analyzes DNA information using entropy and phase plane concepts. First, the DNA code is converted into a numerical format by means of histograms that capture DNA sequences of length ranging from one up to ten bases. This strategy measures dynamical evolutions from 4 up to 4^10 signal states. The resulting histograms are analyzed using three distinct entropy formulations, namely the Shannon, Rényi, and Tsallis definitions. Charts of entropy versus sequence length are applied to a set of twenty-four species, characterizing 486 chromosomes. The information is synthesized and visualized by adapting phase plane concepts, leading to a categorical representation of chromosomes and species.
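For illustration, here is a minimal Python sketch of the three entropy formulations applied to a histogram of fixed-length DNA subsequences. The toy sequence, word lengths, and entropy orders are assumptions, not the paper's data.

```python
# Minimal sketch of the three entropy formulations applied to a histogram
# of DNA subsequences. The sequence and parameters are illustrative only.

import math
from collections import Counter

def histogram(seq, n):
    """Relative frequencies of the length-n subsequences (up to 4**n states)."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return [c / total for c in counts.values()]

def shannon(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q):
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

seq = "ATGCGATACGCTTGCATGCAATGCGGTA" * 20   # toy DNA string
for n in range(1, 4):                       # word lengths 1..3 bases
    p = histogram(seq, n)
    print(n, shannon(p), renyi(p, 2.0), tsallis(p, 2.0))
```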
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is, consequently, imperative to estimate the value of LVEF precisely, a process that can be done with myocardial perfusion scintigraphy. Therefore, the present study aimed to establish and compare the estimation performance of the quantitative parameters of the reconstruction methods filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). Methods: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. Results: With FBP, the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is observed with the OSEM reconstruction. However, OSEM gives a more precise estimation of the quantitative parameters, especially with the combinations 2 iterations × 10 subsets and 2 iterations × 12 subsets. Conclusion: The OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM, and a cutoff frequency of 0.5 cycles per pixel with orders 5, 10, or 15 for FBP, as the best estimations for the left ventricular volumes and ejection fraction quantification in myocardial perfusion scintigraphy.
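The arithmetic underlying the quantification is the standard ejection-fraction relation LVEF = (EDV - ESV) / EDV. A minimal sketch with hypothetical phantom volumes (the paper's exact reference values are not given in the abstract):

```python
# Minimal sketch of the ejection-fraction arithmetic:
# LVEF = (EDV - ESV) / EDV * 100, with illustrative volumes in mL.

def ejection_fraction(edv_ml, esv_ml):
    """Left ventricular ejection fraction (%) from end-diastolic and
    end-systolic volumes."""
    stroke_volume = edv_ml - esv_ml
    return 100.0 * stroke_volume / edv_ml

# Hypothetical phantom values, for illustration only.
print(ejection_fraction(edv_ml=120.0, esv_ml=50.0))  # ~58.3 %
```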
Abstract:
The increasing use of ionizing radiation for medical purposes raises concerns about the safety and justification of its use, particularly in connection with new, high-dose X-ray technology (especially CT). According to the UNSCEAR 2010 Report, the total number of diagnostic medical examinations (both medical and dental) is estimated to have risen from 2.4 billion (period 1991-1996) to 3.6 billion (period 1997-2008), a marked increase in collective doses. Appropriate use of technology for diagnosis or therapy, respecting the ALARA principle, is a mandatory requisite to safely perform any radiological procedure. Radiation protection is thus a concern of all specialists in the radiology field (radiologists, radiographers, medical physicists, among other professional groups). The importance of the education and training of these professionals in reducing patients' doses, while maintaining the desired level of quality in medical exposures as well as precise therapeutic treatments, is well recognized. Education, training, and continuing professional development (CPD) constitute a triad pointing towards the radiographers' development of competences in the radiation protection field. This presentation excludes the radiographer's role and competences in the fields of ultrasonography and MRI.
Abstract:
The kraft pulps produced from heartwood and sapwood of Eucalyptus globulus at 130 °C, 150 °C, and 170 °C were characterized by wet chemistry (total lignin as the sum of Klason and soluble lignin fractions) and by pyrolysis (total lignin denoted as py-lignin). The total lignin content obtained with both methods was similar. In the course of delignification, the py-lignin values were higher (by 2 to 5%) than the Klason values, which is in line with the importance of soluble lignin for total lignin determination. Pyrolysis analysis presents advantages over wet chemical procedures, and it can be applied to wood and pulps to determine lignin contents at different stages of the delignification process. The py-lignin values were used for kinetic modelling of delignification, with very high predictive value and results similar to those of modelling using wet chemical determinations.
Abstract:
Non-suicidal self-injury (NSSI) is the deliberate, self-inflicted destruction of body tissue without suicidal intent, and it is an important clinical phenomenon. Rates of NSSI appear to be disproportionately high in adolescents and young adults, and NSSI is a risk factor for suicidal ideation and behavior. The present study reports the psychometric properties of the Impulse, Self-harm and Suicide Ideation Questionnaire for Adolescents (ISSIQ-A), a measure designed to comprehensively assess impulsivity, NSSI behaviors, and suicidal ideation. An additional module of this questionnaire assesses the functions of NSSI. Results of a Confirmatory Factor Analysis (CFA) of the scale on 1722 youths showed the items' suitability and confirmed a model of four different dimensions (Impulse, Self-harm, Risk-behavior, and Suicide ideation) with good fit and validity. Further analysis showed that youths' engagement in self-harm may serve two different functions: to create or alleviate emotional states, and to influence social relationships. Our findings contribute to research and assessment on non-suicidal self-injury, suggesting that the ISSIQ-A is a valid and reliable measure to assess impulse, self-harm, and suicidal thoughts in adolescence.
Abstract:
The purpose of this study is to analyse the interlimb relation and the influence of mechanical energy on metabolic energy expenditure during gait. In total, 22 subjects were monitored for electromyographic activity, ground reaction forces, and VO2 consumption (metabolic power) during gait. The results demonstrate a moderate negative correlation between the activity of the tibialis anterior, biceps femoris, and vastus medialis of the trailing limb during the transition between mid-stance and double support and that of the same muscles of the leading limb during double support, and between these and the gastrocnemius medialis and soleus of the trailing limb during double support. Trailing-limb soleus activity during the transition between mid-stance and double support was positively correlated with leading-limb tibialis anterior, vastus medialis, and biceps femoris activity during double support. Also, the trailing-limb centre of mass mechanical work was strongly influenced by the leading limb, although only the mechanical power related to forward progression of both limbs was correlated with metabolic power. These findings demonstrate a consistent interlimb relation in terms of electromyographic activity and centre of mass mechanical work, with the relations in the plane of forward progression being the most important for gait energy expenditure.
Abstract:
LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with a smaller laxity, is known to be an optimal preemptive scheduling algorithm on a single-processor platform. However, little work has been done to illuminate its characteristics on multiprocessor platforms. In this paper, we identify the dynamics of laxity from the system's viewpoint and translate these dynamics into an LLF multiprocessor schedulability analysis. More specifically, we first characterize laxity properties under LLF scheduling, focusing on the laxity dynamics associated with a deadline miss. These laxity dynamics describe a lower bound, which leads to the deadline miss, on the number of tasks with certain laxity values at certain time instants. This lower bound is significant because it represents invariants for highly dynamic system parameters (laxity values). Since the laxity of a task depends on the amount of interference from higher-priority tasks, we can then derive a set of conditions to check whether a given task system can enter the laxity dynamics leading towards a deadline miss. In this way, to the authors' best knowledge, we propose the first LLF multiprocessor schedulability test based on LLF's own laxity properties. We also develop an improved schedulability test that exploits slack values. We mathematically prove that the proposed LLF tests dominate the state-of-the-art EDZL tests. We also present simulation results to evaluate the schedulability performance of both the original and improved LLF tests in a quantitative manner.
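As background, a minimal sketch of how laxity is computed and how LLF selects jobs on m processors. The task parameters are illustrative, and this sketch is the basic scheduling rule, not the paper's schedulability test.

```python
# Minimal sketch of laxity computation and LLF priority selection on m
# processors. Task parameters are illustrative only.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    remaining: float      # remaining execution time
    deadline: float       # absolute deadline

def laxity(job: Job, now: float) -> float:
    """Laxity = time to deadline minus remaining work; a negative value
    means the job can no longer meet its deadline."""
    return (job.deadline - now) - job.remaining

def llf_pick(jobs, now, m):
    """Select the m jobs with the smallest laxity (highest LLF priority)."""
    return sorted(jobs, key=lambda j: laxity(j, now))[:m]

jobs = [Job("A", 2.0, 5.0), Job("B", 1.0, 2.0), Job("C", 3.0, 9.0)]
print([j.name for j in llf_pick(jobs, now=0.0, m=2)])   # ['B', 'A']
```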
Abstract:
This paper studies the information content of the chromosomes of twenty-three species. Several statistics considering different numbers of bases for alphabet character encoding are derived. Based on the resulting histograms, word delimiters and character relative frequencies are identified. Knowledge of these data allows moving along each chromosome while evaluating the flow of characters and words. The resulting flux of information is captured by means of Shannon entropy. The results are explored from the perspective of power-law relationships, allowing a quantitative evaluation of the DNA of the species.
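As a rough illustration of evaluating an entropy "flux" along a sequence and then looking for a power-law relationship, the sketch below slides a window along a toy sequence and fits S ≈ a·L^b in log-log space. The sequence, window sizes, and fitting choice are assumptions, not the paper's procedure.

```python
# Minimal sketch of a sliding-window entropy flux and a log-log power-law
# fit. The sequence and window sizes are illustrative only.

import math
import numpy as np
from collections import Counter

def shannon(seq):
    counts = Counter(seq)
    total = len(seq)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

sequence = "ATGCGATACGCTTGCATGCAATGCGGTAACGT" * 50   # toy "chromosome"

# Entropy flux: Shannon entropy evaluated while moving along the sequence.
window = 64
flux = [shannon(sequence[i:i + window]) for i in range(0, len(sequence) - window, window)]
print(f"mean windowed entropy: {sum(flux) / len(flux):.3f} bits")

# Power-law relationship S ~ a * L^b between entropy and segment length L,
# fitted by least squares in log-log space.
lengths = np.array([8, 16, 32, 64, 128, 256])
entropies = np.array([shannon(sequence[:L]) for L in lengths])
b, log_a = np.polyfit(np.log(lengths), np.log(entropies), 1)
print(f"power-law exponent b ~ {b:.3f}, prefactor a ~ {math.exp(log_a):.3f}")
```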
Abstract:
Modeling the fundamental performance limits of Wireless Sensor Networks (WSNs) is of paramount importance for understanding their behavior under worst-case conditions and for making appropriate design choices. This is particularly relevant for time-sensitive WSN applications, where the timing behavior of the network protocols (message transmissions must respect deadlines) impacts the correct operation of these applications. In that direction, this paper contributes a methodology based on Network Calculus which enables quick and efficient worst-case dimensioning of static or even dynamically changing cluster-tree WSNs where the data sink can be either static or mobile. We propose closed-form recurrent expressions for computing the worst-case end-to-end delays, buffering, and bandwidth requirements across any source-destination path in a cluster-tree WSN. We show how to apply our methodology to the case of IEEE 802.15.4/ZigBee cluster-tree WSNs. Finally, we demonstrate the validity and analyze the accuracy of our methodology through a comprehensive experimental study using commercially available technology, namely TelosB motes running TinyOS.
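For context, here is a minimal sketch of standard per-hop Network Calculus bounds under a leaky-bucket arrival curve and a rate-latency service curve. These generic textbook bounds are stated for illustration only; they are not the paper's closed-form recurrent cluster-tree expressions.

```python
# Minimal sketch of per-hop Network Calculus bounds, assuming a leaky-bucket
# arrival curve alpha(t) = b + r*t and a rate-latency service curve
# beta(t) = R*(t - T)+.

def delay_bound(b, r, R, T):
    """Worst-case delay: horizontal deviation between alpha and beta."""
    assert R >= r, "stability requires service rate >= arrival rate"
    return T + b / R

def buffer_bound(b, r, R, T):
    """Worst-case backlog: vertical deviation between alpha and beta."""
    assert R >= r, "stability requires service rate >= arrival rate"
    return b + r * T

# Illustrative numbers for one cluster head (bits, bits/s, seconds).
b, r = 4000.0, 2000.0          # burst and sustained rate of incoming traffic
R, T = 10000.0, 0.05           # guaranteed rate and latency of the service
print(delay_bound(b, r, R, T), buffer_bound(b, r, R, T))
```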
Abstract:
Measurements in civil engineering load tests usually require considerable time and complex procedures. Therefore, measurements are usually constrained by the number of sensors, resulting in a restricted monitored area. Image processing analysis is an alternative that enables measurement of the complete area of interest with a simple and effective setup. In this article, photo sequences taken during load-displacement tests were captured by a digital camera and processed with image correlation algorithms. Three different image processing algorithms were used with real images taken from tests using specimens of PVC and Plexiglas. The data obtained from the image processing algorithms were also compared with the data from physical sensors. Complete displacement and strain maps were obtained. Results show that the accuracy of the measurements obtained by photogrammetry is equivalent to that of the physical sensors, but with much less equipment and fewer setup requirements.
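As an illustration of displacement measurement by image correlation, below is a minimal sketch using normalized cross-correlation template matching with OpenCV. This generic approach, and the placeholder file names, are assumptions; it is not necessarily one of the three algorithms used in the article.

```python
# Minimal sketch of displacement measurement by image correlation, using
# normalized cross-correlation template matching.

import cv2

def track_subset(ref_img, def_img, x, y, half=15):
    """Locate the subset centred at (x, y) of the reference image inside the
    deformed image and return the (dx, dy) displacement in pixels."""
    template = ref_img[y - half:y + half + 1, x - half:x + half + 1]
    response = cv2.matchTemplate(def_img, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(response)
    new_x, new_y = max_loc[0] + half, max_loc[1] + half
    return new_x - x, new_y - y

# Usage (file names are placeholders):
# ref = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
# cur = cv2.imread("frame_120.png", cv2.IMREAD_GRAYSCALE)
# print(track_subset(ref, cur, x=200, y=150))
```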
Abstract:
Four Cynara cardunculus clones, two from Portugal and two from Spain, were studied for biomass production, and their lignin was characterized. The clones differed in biomass partitioning: the Spanish clones produced more capitula (54.5% vs. 43.9%) and the Portuguese clones more stalks (37.2% vs. 25.6%). The heating values (HHV0) of the stalks were similar, ranging from 17.1 to 18.4 MJ/kg. Lignin was studied by analytical pyrolysis (Py-GC/MS(FID)), separately in depithed stalks (stalksDP) and pith. StalksDP had on average higher relative proportions of lignin-derived compounds than pith (23.9% vs. 21.8%), with slightly different lignin monomeric composition: pith samples were richer in syringyl units than stalksDP (64% vs. 53%), with S/G ratios of 2.1 and 1.3, respectively. The H:G:S composition was 7:40:53 in stalksDP and 7:29:64 in pith. The lignin content ranged from 18.8% to 25.5%, enabling a differentiation between clones and provenances.
Abstract:
Communication presented at the 18th International Conference of Health Promotion Hospitals & Health Services, "Tackling causes and consequences of inequalities in health: contributions of health services and the HPH network", Manchester, 14-16 April 2010.