922 results for methods and measurement
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods of little practical relevance and sophisticated mathematical theory that is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to companies in which the materials, bought-in parts and services change substantially and in which a number of plants and inter-related units provide components for final assembly. Reviews and comparisons of productivity measurement, dealing with alternative indices and their problems, have been made, and appropriate solutions are put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. These cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model is based on the assumptions of statistical independence between individual variables and of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to British motor vehicle manufacturing companies are presented and discussed.
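The thesis itself is not reproduced here, but the stated ingredients of the risk simulation model (degree-four polynomial forecasts of component variables, independent normally distributed variables, Monte Carlo sampling of a productivity distribution) can be sketched. The following is a minimal illustration only: the productivity ratio (added value over labour cost), the residual-based noise, and all data values are assumptions, not the thesis's actual specification.

```python
# Illustrative sketch of a risk-simulation model of the kind described above.
# Assumptions (not from the source): productivity = added value / labour cost,
# residual-based noise, and the hypothetical data below.
import numpy as np

rng = np.random.default_rng(0)

def quartic_forecast(years, series, target_year):
    """Fit a degree-4 polynomial to a historical series (as the thesis does)
    and return the point forecast plus the residual standard deviation."""
    coeffs = np.polyfit(years, series, deg=4)
    resid = series - np.polyval(coeffs, years)
    return np.polyval(coeffs, target_year), resid.std(ddof=1)

def risk_simulation(years, added_value, labour_cost, target_year, n=10_000):
    """Monte Carlo draw of a productivity distribution under the model's
    stated assumptions: independent, normally distributed variables."""
    av_mu, av_sd = quartic_forecast(years, added_value, target_year)
    lc_mu, lc_sd = quartic_forecast(years, labour_cost, target_year)
    av = rng.normal(av_mu, av_sd, n)
    lc = rng.normal(lc_mu, lc_sd, n)
    return av / lc  # simulated productivity ratios

# Hypothetical historical data (units: GBP million per year)
years = np.arange(1970, 1980)
added_value = np.array([90, 95, 99, 104, 110, 113, 119, 126, 131, 138.0])
labour_cost = np.array([60, 62, 65, 67, 70, 72, 75, 79, 82, 85.0])
sims = risk_simulation(years, added_value, labour_cost, 1981)
print(f"mean productivity {sims.mean():.2f}, 90% interval "
      f"[{np.percentile(sims, 5):.2f}, {np.percentile(sims, 95):.2f}]")
```

The output is a class interval of productivity values rather than a single point estimate, which is precisely the distinction the abstract draws between the deterministic and stochastic models.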
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After discussing the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. Its fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends; this approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model: the first part tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
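The "skeleton plus library of swappable segments" architecture can be illustrated with a brief sketch. All class and method names below are hypothetical; the original package predates Python and its interfaces are not given in the abstract.

```python
# Minimal sketch of a modular simulation skeleton with swappable segments
# and built-in measurement, per the methodology described above.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Skeleton:
    """Fixed simulation driver; behaviour comes from pluggable segments."""
    segments: dict[str, Callable[[float], float]] = field(default_factory=dict)
    measurements: list[tuple[str, float, float]] = field(default_factory=list)

    def swap(self, name: str, segment: Callable[[float], float]) -> None:
        # Replace (or add) a segment, mirroring the swappable-segment design.
        self.segments[name] = segment

    def run(self, t_end: float, dt: float = 1.0) -> None:
        t = 0.0
        while t < t_end:
            for name, seg in self.segments.items():
                # Measuring is carried out alongside the model itself,
                # as the methodology requires: every segment call is logged.
                self.measurements.append((name, t, seg(t)))
            t += dt

# Hypothetical segments for a toy two-processor model
model = Skeleton()
model.swap("cpu_scheduler", lambda t: t % 2)   # round-robin between 2 CPUs
model.swap("io_channel", lambda t: 0.5 * t)    # trivial I/O load curve
model.run(t_end=5)
print(model.measurements[:4])
```

Swapping in a different "cpu_scheduler" segment changes the modelled system without touching the driver, which is the adaptability property the methodology emphasises.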
Abstract:
The industrial use of heavy metals and chemical products has expanded. Significant amounts of pollutants in industrial wastewater can pose serious risks to the environment and to human health; among heavy metals, chromium is one example that threatens aquatic environments. Chromium is an essential element in the diet, but at high doses it is very dangerous. Chemical methods can therefore be used as a tool for removing metals from wastewater. The aim of this study was to investigate mineral kaolin as an adsorbent for the removal of chromium from water. The effects of different adsorbent concentrations on the adsorption of trace amounts of chromium were studied, along with the effects of varying temperature, pH and electrolytes. A UV-VIS spectrophotometer (Varian) was used throughout the investigation. The mechanism of chromium adsorption by kaolin was compared with that of nano-kaolin. Under the same conditions of temperature, pH and shaking rate, chromium adsorption by nano-kaolin was found to be much greater than by kaolin. Its use as an abundant, cheap, accessible, efficient and effective adsorbent is therefore proposed.
In Situ Characterization of Optical Absorption by Carbonaceous Aerosols: Calibration and Measurement
Abstract:
Light absorption by aerosols has a great impact on climate change. A photoacoustic spectrometer (PA) coupled with aerosol-based classification techniques represents an in situ method that can quantify light absorption by aerosols in real time, yet significant differences have been reported between this method and filter-based methods or the so-called difference method based upon light extinction and light scattering measurements. This dissertation focuses on developing calibration techniques for instruments used in measuring the light absorption cross section, including both particle diameter measurements by the differential mobility analyzer (DMA) and light absorption measurements by PA. Appropriate reference materials were explored for the calibration/validation of both measurements. The light absorption of carbonaceous aerosols was also investigated to provide a fundamental understanding of the absorption mechanism. The first topic of interest in this dissertation is the development of calibration nanoparticles. In this study, bionanoparticles were confirmed to be a promising reference material for particle diameter as well as ion mobility. Experimentally, bionanoparticles demonstrated outstanding homogeneity in mobility compared to currently used calibration particles. A numerical method was developed to calculate the true distribution and to explain the broadening of the measured distribution. The high stability of bionanoparticles was also confirmed. For PA measurement, three aerosols with spherical or near-spherical shapes were investigated as possible candidates for a reference standard: C60, copper and silver. Comparisons were made between experimental photoacoustic absorption data and Mie theory calculations. This resulted in the identification of C60 particles with a mobility diameter of 150 nm to 400 nm as an absorbing standard at wavelengths of 405 nm and 660 nm. Copper particles with a mobility diameter of 80 nm to 300 nm are also shown to be a promising reference candidate at a wavelength of 405 nm. The second topic of this dissertation focuses on the investigation of light absorption by carbonaceous particles using PA. Optical absorption spectra of size- and mass-selected laboratory-generated aerosols consisting of black carbon (BC), BC with a non-absorbing coating (ammonium sulfate and sodium chloride) and BC with a weakly absorbing coating (brown carbon derived from humic acid) were measured across the visible to near-IR (500 nm to 840 nm). The manner in which BC mixed with each coating material was investigated. The absorption enhancement of BC was determined to be wavelength dependent. Optical absorption spectra were also taken for size- and mass-selected smoldering smoke produced from six common types of wood in a laboratory-scale apparatus.
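The Mie-theory comparison step can be sketched with a public implementation. This is not the dissertation's code: the refractive index below is a placeholder assumption, and miepython is simply one available Mie solver (its mie(m, x) call returns the extinction, scattering and backscattering efficiencies and the asymmetry parameter).

```python
# Hedged sketch of comparing mobility-selected particles to Mie theory.
# The complex refractive index for C60 is a placeholder, not a sourced value.
import numpy as np
import miepython

wavelength_nm = 405.0
m = 2.0 - 0.8j                 # assumed index, n - ik (miepython convention)
diameters_nm = np.arange(150, 401, 50)   # mobility diameters as size proxy

for d in diameters_nm:
    x = np.pi * d / wavelength_nm               # size parameter
    qext, qsca, qback, g = miepython.mie(m, x)
    c_abs = (qext - qsca) * np.pi * (d / 2) ** 2  # absorption cross-section, nm^2
    print(f"d = {d:3.0f} nm  C_abs = {c_abs:9.1f} nm^2")
```

Plotting the measured photoacoustic cross-sections against such a curve is the kind of comparison that identified C60 and copper as candidate absorbing standards.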
Abstract:
Background: A cross-cultural, randomized study was designed to observe the effects of a school-based intervention to promote physical activity and healthy eating among high school students in 2 cities from different regions of Brazil: Recife and Florianopolis. The objective of this article is to describe the methodology and the subjects enrolled in the project. Methods: Ten schools from each region were matched and randomized into intervention and control conditions. A questionnaire and anthropometry were used to collect data in the first and last month of the 2006 school year. The sample (n = 2155 at baseline; 55.7% female; 49.1% in the experimental group) included students aged 15 to 24 years attending nighttime classes. The intervention focused on simple environmental/organizational changes, diet and physical activity education, and personnel training. Results: The central aspects of the intervention were implemented in all 10 intervention schools. Problems during the intervention included teachers' strikes at both sites and a lack of involvement of the school canteen owners. Conclusions: The Saude na Boa study provides evidence that public high schools in Brazil represent an important environment for health promotion. Its design and simple measurements increase the chances of it being sustained and disseminated to similar schools in Brazil.
Abstract:
Correlations of charged hadrons of 1 < pT < 10 GeV/c with high-pT direct photons and pi0 mesons in the range 5 < pT < 15 GeV/c are used to study jet fragmentation in the gamma + jet and dijet channels, respectively. The magnitude of the partonic transverse momentum, kT, is obtained by comparing to a model incorporating a Gaussian kT smearing. The sensitivity of the associated charged hadron spectra to the underlying fragmentation function is tested, and the data are compared to calculations using recent global fit results. The shape of the direct-photon-associated hadron spectrum as well as its charge asymmetry are found to be consistent with a sample dominated by quark-gluon Compton scattering. No significant evidence of fragmentation-photon correlated production is observed within experimental uncertainties.
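The Gaussian kT smearing idea can be illustrated with a toy Monte Carlo: an ideally back-to-back photon-jet pair receives a Gaussian transverse kick, which broadens the away-side azimuthal correlation. This is only a cartoon of the technique, not the experiment's analysis; every parameter value below is invented.

```python
# Toy illustration of Gaussian kT smearing of a back-to-back pair.
# Parameter values are made up; this is not the published analysis.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
pt_trig = 8.0     # trigger photon pT, GeV/c (assumed)
sigma_kt = 1.0    # Gaussian kT width per component, GeV/c (assumed)

# kT kick of the hard-scattering pair: independent Gaussian components
kx = rng.normal(0.0, sigma_kt, n)
ky = rng.normal(0.0, sigma_kt, n)

# An unsmeared pair sits exactly at delta-phi = pi; the kick deflects
# the away-side axis by roughly arctan(ky / pT) for small kicks.
dphi = np.pi + np.arctan2(ky, pt_trig + kx)

print(f"away-side width: {dphi.std():.3f} rad (0 rad without kT smearing)")
```

Fitting the measured away-side width with such a smeared model is, schematically, how a value of |kT| is extracted from the correlation data.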
Abstract:
This paper deals with the use of simplified methods to predict methane generation in tropical landfills. Methane recovery data obtained on site as part of a research program being carried out at the Metropolitan Landfill, Salvador, Brazil, are analyzed and used to obtain field methane generation over time. Laboratory data from MSW samples of different ages are presented and discussed, and simplified procedures are applied to estimate the methane generation potential, L0, and the constant related to the biodegradation rate, k. The first-order decay method is used to fit field and laboratory results. It is demonstrated that, despite the assumptions and the simplicity of the adopted laboratory procedures, the values of L0 and k obtained are very close to those measured in the field, making this kind of analysis very attractive for first-approach purposes.
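The first-order decay method expresses the methane generation rate per unit waste mass as Q(t) = L0 * k * exp(-k t), so L0 and k can be estimated by a least-squares fit to generation-versus-waste-age data. The sketch below shows that fit; the data points and fitted values are invented for illustration and are not the Salvador results.

```python
# Sketch of the first-order decay fit described above, with invented data.
import numpy as np
from scipy.optimize import curve_fit

def first_order_decay(t, L0, k):
    """Methane generation rate per unit waste mass: Q(t) = L0 * k * exp(-k t).
    L0: methane generation potential (m^3 CH4 / Mg waste)
    k:  first-order biodegradation constant (1/yr)"""
    return L0 * k * np.exp(-k * t)

# Hypothetical waste ages (years) and measured generation rates (m^3/Mg/yr)
t = np.array([1, 2, 4, 6, 8, 10], dtype=float)
q = np.array([14.0, 11.5, 8.0, 5.5, 3.8, 2.6])

(L0_fit, k_fit), _ = curve_fit(first_order_decay, t, q, p0=(80.0, 0.2))
print(f"L0 = {L0_fit:.0f} m^3/Mg, k = {k_fit:.2f} 1/yr")
```

Agreement between laboratory-derived and field-fitted (L0, k) pairs is what the paper uses to argue the simplified procedure is adequate for first-approach estimates.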
Abstract:
Traumatic brain injury (TBI) may result in a variety of cognitive, behavioural and physical impairments. Dizziness has been reported in up to 80% of cases within the first few days after injury. The literature was reviewed to delineate the prevalence of dizziness as a symptom, the impairments causing it, the functional limitations it produces and its measurement. The literature provides widely differing estimates of prevalence, and vestibular system dysfunction appears to be the best-reported of the impairments contributing to this symptom. The variety of results is discussed, and other possible causes of dizziness are reviewed. Functional difficulties caused by dizziness have not been reported for this population in the literature, and a review of cognitive impairments suggests that existing measurement tools for dizziness may be problematic in this population. Research on the functional impact of dizziness in the TBI population and on the measurement of these symptoms appears to be warranted.
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled "Functional magnetic resonance imaging: From methods to madness", presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada, in June 2004. The special issue's theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner's guide to fMRI and the BOLD effect (perhaps an alternative title might have been "fMRI for dummies"). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate the cerebral mechanisms responsible for new learning. The next 4 articles address this directly: the first (Little and Thulborn) is an overview of data from fMRI studies of category learning, and the second, from the same laboratory (Little, Shin, Siscol, and Thulborn), is an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, not surprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining the effects of stimulus modality during verbal memory encoding. Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning over a longer-than-usual period, achieved by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, describing instead how neuroimaging studies might be conducted to both inform and constrain information processing models of cognition.
Coronary CT angiography using 64 detector rows: methods and design of the multi-centre trial CORE-64
Abstract:
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and the high sensitivity and negative predictive value found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios, suggestive of small-study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography Using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography at nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental CORE-64 trial has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
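The per-patient performance measures the trial reports (sensitivity, negative predictive value, diagnostic odds ratio) all derive from a 2x2 table of CT angiography against the conventional angiography reference. A minimal sketch follows; the counts are invented, not CORE-64 results.

```python
# Standard diagnostic-performance metrics from a 2x2 table; counts are
# hypothetical and do not come from the CORE-64 trial.
def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)        # sensitivity: diseased patients detected
    spec = tn / (tn + fp)        # specificity: healthy patients cleared
    npv = tn / (tn + fn)         # negative predictive value
    dor = (tp * tn) / (fp * fn)  # diagnostic odds ratio
    return sens, spec, npv, dor

# Hypothetical per-patient table: 64-slice CT vs conventional angiography
sens, spec, npv, dor = diagnostic_metrics(tp=90, fp=15, fn=10, tn=185)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, "
      f"NPV {npv:.2f}, DOR {dor:.1f}")
```

The small-study bias flagged by the meta-analysis shows up precisely in this DOR statistic, which is why a larger multi-centre sample was needed.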