136 results for Lean Sigma


Relevance:

10.00%

Publisher:

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements.

A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because no readily available code included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions, and in comparisons of model results with direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool.

A study of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application; the most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers.

To apply the Monte Carlo code effectively to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques.

Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established and used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements.

The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, indicating the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and its precision is poorer (approximately twice the coefficient of variation) than that of standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
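
To make the two-component decomposition concrete: to first order, the log-attenuation of a narrow beam at each of the two energies is a linear combination of the areal densities of the two tissue components, so a dual-energy measurement yields a 2x2 linear system. The Python sketch below solves that system; the attenuation coefficients are illustrative placeholders rather than values from the thesis, and the DPA(+) extension would add a third equation based on the measured path length.

```python
import numpy as np

# Minimal sketch of two-component dual-energy decomposition.
# The mass attenuation coefficients below are illustrative placeholders,
# not values from the thesis.
MU = np.array([
    [0.573, 0.302],   # mu/rho at low energy:  [bone mineral, soft tissue] (cm^2/g)
    [0.264, 0.205],   # mu/rho at high energy: [bone mineral, soft tissue] (cm^2/g)
])

def decompose(i0, i):
    """Solve ln(I0/I) = MU @ areal_densities for the two tissue components.

    i0, i: transmitted intensities without/with the object at the two energies.
    Returns areal densities (g/cm^2) of [bone mineral, soft tissue].
    """
    log_att = np.log(np.asarray(i0) / np.asarray(i))  # one equation per energy
    return np.linalg.solve(MU, log_att)

# Example: intensities measured with the low- and high-energy beams.
print(decompose(i0=[1.0, 1.0], i=[0.35, 0.55]))
```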

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this study was to compare the amount of exercise prescribed with the amount completed between two different modes of training intervention and between the sexes. Thirty-two men (mean age = 39.1 years, body mass index = 32.9 kg·m⁻²) and women (mean age = 39.6 years, body mass index = 32.1 kg·m⁻²) were prescribed traditional resistance training or light-resistance circuit training for 16 weeks. Lean mass and fat mass were determined by dual-energy X-ray absorptiometry at weeks 1 and 16. A completion index was calculated to provide a measure of the extent to which participants completed exercise training relative to the amount of exercise prescribed. The absolute amount of exercise completed by the circuit training group was significantly greater than the amount prescribed (P < 0.0001). The resistance training group consistently under-completed relative to the amount prescribed, but the difference was not significant. The completion index for the circuit training group (26.0 ± 21.7%) was significantly different from that of the resistance training group (-7.4 ± 3.0%). The completion index was not significantly different between men and women in either group. These data suggest that overweight and obese individuals participating in light-resistance circuit training complete more exercise than is prescribed. Men and women do not differ in the extent to which they over- or under-complete prescribed exercise.
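
The abstract does not give the completion index formula. A definition consistent with the reported signs and magnitudes (positive for over-completion, negative for under-completion) would be the percentage difference between completed and prescribed exercise, sketched below; the paper may define it differently.

```python
def completion_index(completed, prescribed):
    """Percentage over- (+) or under- (-) completion relative to prescription.

    This formula is an assumption consistent with the reported values
    (e.g. -7.4% for under-completion); it is not quoted from the paper.
    """
    return 100.0 * (completed - prescribed) / prescribed

print(completion_index(completed=126.0, prescribed=100.0))  # 26.0 -> over-completion
print(completion_index(completed=92.6, prescribed=100.0))   # -7.4 -> under-completion
```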

Relevance:

10.00%

Publisher:

Abstract:

Climbing guidebooks have been in existence ever since people started climbing cliffs for recreation. Only recently have these guidebooks started to include photographs to help with the identification of climbs. To date, there are very few interactive guidebooks available online which include the ability to filter climbs and climbing areas based upon specific characteristics. Being able to interrogate a database of climbs and climbing areas by grade, style of climbing, quality of climbing, and length of climbs would be a significant addition to the guidebooks that are currently available. Integrating a fully illustrated database of climbs with online mapping services such as Google Maps would extend the utility of current guidebooks significantly. As portable devices become more commonplace, the ability to further combine these guidebooks with GPS technology would make the location and identification of climbs much simpler. This study compares conventional hardcopy guidebooks with several online guidebooks. In addition, several Decision Support Systems are analysed to assess the ways in which Geographic Information Systems are integrated to assist in decision making. A prototype interactive guidebook was developed after surveying a group of climbers to assess what they would find useful in an online resource. This survey found that most climbers would like to see climbs represented on a map of the climbing site to aid in locating them. They also suggested that being able to filter climbs by various criteria would be useful. These features were subsequently integrated into the prototype. Review by several climbers found that this system has many benefits over conventional hardcopy guidebooks; however, it was also noted that further work is needed to improve the functionality of the prototype, including the ability to print a selection of climbs from the search results.
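
A minimal sketch of the kind of filterable climb database the prototype describes. The schema below (grade, style, quality, length, coordinates) is hypothetical, not the prototype's actual data model; the coordinates are the sort of data a Google Maps marker layer would consume.

```python
from dataclasses import dataclass

@dataclass
class Climb:
    # Field names are hypothetical; the prototype's schema is not given.
    name: str
    grade: int          # e.g. Ewbank grade, as used in Australia
    style: str          # "sport", "trad", ...
    quality: int        # star rating, 0-3
    length_m: int
    lat: float
    lon: float

def filter_climbs(climbs, min_grade=None, max_grade=None, style=None, min_quality=None):
    """Return climbs matching the given criteria; None means 'any'."""
    result = []
    for c in climbs:
        if min_grade is not None and c.grade < min_grade:
            continue
        if max_grade is not None and c.grade > max_grade:
            continue
        if style is not None and c.style != style:
            continue
        if min_quality is not None and c.quality < min_quality:
            continue
        result.append(c)
    return result

crag = [Climb("Example Route", 18, "sport", 2, 25, -27.47, 153.03)]
for c in filter_climbs(crag, min_grade=16, max_grade=20, style="sport"):
    print(c.name, c.lat, c.lon)  # coordinates could feed a map marker
```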

Relevance:

10.00%

Publisher:

Abstract:

Background: Violence in health care has been widely reported, and health care workers, particularly nurses in acute care settings, are ill-equipped to manage patients who exhibit aggressive traits. Aim: The aim of this systematic review was to establish best practice in the prevention and management of aggressive behaviours in patients admitted to acute hospital settings. Data Sources: An extensive search of the major databases was conducted from 1990 to 2007. The search included published and unpublished studies and papers in English. Review Methods: This review considered any quantitative research study design that evaluated the effectiveness of interventions in the prevention and management of patients who exhibit aggressive behaviours in an acute hospital setting. Each included study was quality assessed by two independent reviewers, and data were extracted using the relevant tools developed by the Joanna Briggs Institute. Results: Ten studies met the inclusion criteria and were included in the review. The evidence identified from the studies includes: the benefit of education and training of acute care nurses in aggression management techniques; that use of “as required” medications is effective in minimising harm to patients and staff; and that specific interventions such as physical restraint may play a role in managing aggressive behaviours from patients in the acute care setting. Conclusions: This review makes several recommendations for the prevention and management of aggressive behaviours in acute hospital patients. However, due to the lack of high-quality studies conducted in the acute care setting, there is considerable scope for future research in this area.

Relevance:

10.00%

Publisher:

Abstract:

Objectives: The effects of 30 min of exercise on postprandial lipaemia in the overweight and obese are unknown, as previous studies have only investigated bouts of at least 60 min in lean, healthy individuals. The aim of this study was to investigate whether a single 30-min bout of resistance, aerobic or combined exercise at moderate intensity would decrease postprandial lipaemia, glucose and insulin levels, as well as increase resting energy expenditure and fat oxidation, following a high-fat meal consumed 14 h after the exercise bout in overweight and obese individuals compared to no exercise. We also compared the effects of the different exercise modalities. Methods: This study was a randomized cross-over design which examined the postprandial effects of 30 min of different types of exercise performed in the evening prior to a breakfast meal in overweight and obese men and women. Participants were randomized on four occasions, each one week apart, to each condition: either no exercise, aerobic exercise, resistance exercise, or a combination of aerobic and resistance exercise. Results: An acute bout of combination training did not have any significant effect on postprandial measurements compared to no exercise. However, aerobic exercise significantly reduced postprandial triglyceride levels by 8% compared to no exercise (p = 0.02), and resistance exercise decreased postprandial insulin levels by 30% compared to aerobic exercise (p = 0.01). Conclusion: These results indicate that a single moderate-intensity 30-min bout of aerobic or resistance exercise improves risk factors associated with cardiovascular disease in overweight and obese individuals.

Relevance:

10.00%

Publisher:

Abstract:

Shrinking product lifecycles, tough international competition, swiftly changing technologies, ever-increasing customer quality expectations and demand for high-variety options are some of the forces that drive the next generation of development processes. To overcome these challenges, the design cost and development time of products have to be reduced and quality improved. Design reuse is considered one of the lean strategies for winning the race in this competitive environment: it can reduce product development time and cost as well as the number of defects, all of which ultimately influence product performance in cost, time and quality. However, little or no work has been carried out on quantifying the effectiveness of design reuse in product development performance measures such as design cost, development time and quality. Therefore, in this study we propose a systematic design-reuse-based product design framework and develop a design leanness index (DLI) as a measure of the effectiveness of design reuse. The DLI is a representative measure of reuse effectiveness in cost, development time and quality. Through this index, a clear relationship between the reuse measure and product development performance metrics is established. Finally, a cost-based model is developed to maximise the design leanness index for a product within a given set of constraints, achieving leanness in the design process.
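
The abstract does not reproduce the DLI formula. One plausible form, sketched below purely for illustration, aggregates normalized reuse savings in cost, time and defects into a single index in [0, 1]; the weighted-sum structure, the weights and the normalization are assumptions, not the paper's model.

```python
def design_leanness_index(cost_saved, time_saved, defects_avoided,
                          cost_base, time_base, defects_base,
                          w_cost=1/3, w_time=1/3, w_quality=1/3):
    """Hypothetical design leanness index (DLI) in [0, 1].

    The abstract defines the DLI only as a measure of design-reuse
    effectiveness across cost, development time and quality; this
    weighted aggregate of normalized savings is an illustrative
    assumption, not the paper's formula.
    """
    assert abs(w_cost + w_time + w_quality - 1.0) < 1e-9
    return (w_cost * cost_saved / cost_base
            + w_time * time_saved / time_base
            + w_quality * defects_avoided / defects_base)

# Reuse saves 40% of cost, 30% of time and avoids 50% of baseline defects.
print(design_leanness_index(40, 30, 5, 100, 100, 10))  # ~0.40
```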

Relevance:

10.00%

Publisher:

Abstract:

Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering which occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted on traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT and RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, discrete-time Fourier, and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. Design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
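
As a small taste of the Part I material, the sketch below illustrates discrete convolution, the core time-domain operation listed among the book's topics, by smoothing a noisy sinusoid with a 5-tap moving-average (low-pass) filter. The signal and filter are illustrative, not examples taken from the book.

```python
import numpy as np

# Discrete convolution: y[n] = sum_k h[k] * x[n-k].
# Here h is a 5-tap moving-average filter applied to a noisy tone.
rng = np.random.default_rng(0)
n = np.arange(200)
clean = np.sin(2 * np.pi * 0.02 * n)                  # underlying tone
x = clean + 0.3 * rng.standard_normal(n.size)         # noisy observation

h = np.ones(5) / 5.0                 # impulse response of the averager
y = np.convolve(x, h, mode="same")   # filtered (smoothed) signal

print(float(np.std(x - clean)),      # residual noise before filtering
      float(np.std(y - clean)))      # reduced after filtering
```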

Relevance:

10.00%

Publisher:

Abstract:

Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost. In order to reach these goals, they need good-quality components from suppliers at optimum price and lead time. This has forced companies to adopt improvement practices such as lean manufacturing, Just in Time (JIT) and effective supply chain management. Applying new improvement techniques and tools causes higher establishment costs and more Information Delay (ID); on the other hand, these new techniques may reduce the risk of stock-outs and improve supply chain flexibility, giving better overall performance. But practitioners are unable to measure the overall effects of those improvement techniques without a standard evaluation model, so an effective overall supply chain performance evaluation model is essential for suppliers as well as manufacturers to assess their companies under different supply chain strategies. However, the literature on lean supply chain performance evaluation is comparatively limited, and most existing models assume random values for performance variables. The purpose of this paper is to propose an effective supply chain performance evaluation model using triangular linguistic fuzzy numbers and to recommend optimum ranges for performance variables for lean implementation. The model initially considers all the supply chain performance criteria (input, output and flexibility), converts the values to triangular linguistic fuzzy numbers and evaluates overall supply chain performance under different situations. Results show that with the proposed performance measurement model, the improvement area for each variable can be accurately identified.
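
A minimal sketch of the paper's central device, triangular linguistic fuzzy numbers: linguistic ratings map to triples (lower, modal, upper), are aggregated component-wise with criterion weights, and are defuzzified to a crisp score. The linguistic scale and equal weights below are illustrative assumptions, not the paper's values.

```python
# Triangular fuzzy number (a, b, c) = (lower, modal, upper) support points.
# This linguistic scale is an illustrative assumption, not the paper's scale.
SCALE = {
    "poor":      (0.0, 0.0, 0.3),
    "fair":      (0.2, 0.5, 0.8),
    "good":      (0.5, 0.8, 1.0),
    "excellent": (0.8, 1.0, 1.0),
}

def weighted_fuzzy_mean(ratings, weights):
    """Weighted average of triangular fuzzy numbers (component-wise)."""
    total = sum(weights)
    return tuple(
        sum(w * SCALE[r][i] for r, w in zip(ratings, weights)) / total
        for i in range(3)
    )

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Input, output and flexibility criteria rated linguistically, equally weighted.
overall = weighted_fuzzy_mean(["good", "fair", "excellent"], [1.0, 1.0, 1.0])
print(overall, defuzzify(overall))
```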

Relevance:

10.00%

Publisher:

Abstract:

The idea of body weight regulation implies that a biological mechanism exerts control over energy expenditure and food intake. This is a central tenet of energy homeostasis. However, the source and identity of the controlling mechanism have not been identified, although it is often presumed to be some long-acting signal related to body fat, such as leptin. Using a comprehensive experimental platform, we have investigated the relationship between biological and behavioural variables in two separate studies over a 12-week intervention period in obese adults (total n = 92). All variables were measured objectively and with a similar degree of scientific control and precision, including anthropometric factors, body composition, RMR and accumulative energy consumed at individual meals across the whole day. Results showed that meal size and daily energy intake (EI) were significantly correlated with fat-free mass (FFM; P values 0.02–0.05) but not with fat mass (FM) or BMI (P values 0.11–0.45) (study 1, n = 58). In study 2 (n = 34), FFM (but not FM or BMI) predicted meal size and daily EI under two distinct dietary conditions (high-fat and low-fat). These data appear to indicate that, under these circumstances, some signal associated with lean mass (but not FM) exerts a determining effect over self-selected food consumption. This signal may be postulated to interact with a separate class of signals generated by FM. This finding may have implications for investigations of the molecular control of food intake and body weight, and for the management of obesity.

Relevance:

10.00%

Publisher:

Abstract:

Six Sigma has proven itself as a major quality initiative over the last two decades. It is a philosophy which provides a systematic approach to applying numerous tools within the framework of several quality improvement methodologies. The most widely used Six Sigma methodology is DMAIC, which is best suited to improving existing processes. In order to build quality into the product or service, a proactive approach like Design for Six Sigma (DFSS) is required. This paper provides an overview of DFSS, product innovation, and service innovation. The emphasis is on comparing how DFSS is applied differently in product and service innovation. This paper contributes by analysing the existing literature on DFSS in product and service innovation. The major findings are that the DFSS approach in services and products can be differentiated along the following three dimensions: methodology, characteristics, and technology.

Relevance:

10.00%

Publisher:

Abstract:

Solo Show is a to-scale model of the Metro Arts gallery in which it was exhibited. Set upon a timber frame, the model depicts a miniature ‘installation’ within the ‘space’: a foam block that obstructs one of the gallery’s walkways. Developed and produced for a group exhibition that explored the relationship between humour and art, this work explores and pokes fun at ideas of the institution, scale and the artist’s ego, as well as communicating feelings of emergence, insecurity and hesitancy. The work was included in the group show ‘Lean Towards Indifference!’ at Metro Arts, Brisbane, curated by art collective No Frills.

Relevance:

10.00%

Publisher:

Abstract:

Contamination of packaged foods due to micro-organisms entering through air leaks can cause serious public health issues and cost companies large amounts of money through product recalls, compensation claims and subsequent loss of market share. The main source of contamination is leaks in packaging, which allow air, moisture and micro-organisms to enter the package. The cost of leaky packages to Australian food industries is estimated at close to AUD $35 million per year. In the food processing and packaging industry worldwide, there is an increasing demand for cost-effective, state-of-the-art inspection technologies that are capable of reliably detecting leaky seals and delivering products at six-sigma quality. This project develops non-destructive testing technology using digital imaging and sensing combined with a differential vacuum technique to assess the seal integrity of food packages on a high-speed production line.

Flexible plastic packages are widely used and are the least expensive form of retaining the quality of the product. These packages can be used to seal, and therefore maximise, the shelf life of both dry and moist products. The seals of food packages need to be airtight so that the food content is not contaminated through contact with micro-organisms that enter as a result of air leakage. Airtight seals also extend the shelf life of packaged foods, and manufacturers attempt to prevent food products with leaky seals being sold to consumers. There are many current NDT (non-destructive testing) methods of testing the seal of flexible packages that are best suited to random sampling and laboratory purposes. The three most commonly used methods are vacuum/pressure decay, the bubble test, and helium leak detection. Although these methods can detect very fine leaks, they are limited by their high processing time and are not viable on a production line. Two non-destructive in-line packaging inspection machines are currently available and are discussed in the literature review.

The detailed design and development of the High-Speed Sensing and Detection System (HSDS) is the fundamental requirement of this project and of the future prototype and production unit. Successful laboratory testing was completed, and a methodical design procedure was needed for a successful concept. The mechanical tests confirmed the vacuum hypothesis and seal integrity, with good, consistent results. Electrically, the testing also provided solid results, enabling the researcher to move the project forward with a degree of confidence. The laboratory design testing allowed the researcher to confirm theoretical assumptions before moving into the detailed design phase. Discussion of the development of the alternative concepts in both the mechanical and electrical disciplines enabled the researcher to make informed decisions. Each major mechanical and electrical component is detailed through the research and design process. The design procedure methodically works through the major functions from both a mechanical and an electrical perspective, and opens up alternative ideas for the major components which, although sometimes not practical in this application, show that the researcher has exhausted all engineering and functionality options. Further concepts were then designed and developed for the entire HSDS unit based on previous practice and theory. It is envisaged that both the prototype and production versions of the HSDS would use standard industry-available components, manufactured and distributed locally. Future research and testing of the prototype unit could result in a successful trial unit being incorporated into a working food processing production environment. Recommendations and future work are discussed, along with options in other food processing and packaging disciplines, and in other areas of the non-food processing industry.
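
The decision logic behind a vacuum-decay style leak check, the general principle underlying the differential vacuum technique described above, can be sketched simply: evacuate a chamber around the package and flag the package if chamber pressure rises faster than the baseline drift of a sound package. The threshold, sample rate and data values below are illustrative assumptions, not the HSDS design parameters.

```python
def is_leaking(pressure_samples_kpa, sample_interval_s, max_rise_kpa_per_s=0.05):
    """Flag a package as leaking if chamber pressure rises too fast.

    After the chamber around the package is evacuated, a leaky seal lets
    air escape the package into the chamber, so chamber pressure climbs
    faster than the baseline drift seen with a sound package.
    """
    first, last = pressure_samples_kpa[0], pressure_samples_kpa[-1]
    duration = sample_interval_s * (len(pressure_samples_kpa) - 1)
    rise_rate = (last - first) / duration
    return rise_rate > max_rise_kpa_per_s

good_pkg = [2.00, 2.01, 2.01, 2.02]   # kPa absolute: slow baseline drift
bad_pkg  = [2.00, 2.20, 2.45, 2.70]   # rapid rise: seal leak
print(is_leaking(good_pkg, 1.0), is_leaking(bad_pkg, 1.0))  # False True
```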

Relevance:

10.00%

Publisher:

Abstract:

Background: Evidence-based practice (EBP) is embraced internationally as an ideal approach to improving patient outcomes and providing cost-effective care. However, despite the support for and apparent benefits of evidence-based practice, it has been shown to be complex and difficult to incorporate into the clinical setting. Research exploring the implementation of evidence-based practice has highlighted many internal and external barriers, including clinicians’ lack of knowledge and confidence to integrate EBP into their day-to-day work. Nurses in particular often feel ill-equipped, with little confidence to find, appraise and implement evidence. Aims: This study aimed to undertake preliminary testing of the psychometric properties of tools that measure nurses’ self-efficacy and outcome expectancy in regard to evidence-based practice. Methods: A survey design was utilised in which nurses who had either completed an EBP unit or were randomly selected from a major tertiary referral hospital in Brisbane, Australia were sent two newly developed tools: 1) the Self-efficacy in Evidence-Based Practice (SE-EBP) scale and 2) the Outcome Expectancy for Evidence-Based Practice (OE-EBP) scale. Results: Principal Axis Factoring found three factors with eigenvalues above one for the SE-EBP, explaining 73% of the variance, and one factor for the OE-EBP scale, explaining 82% of the variance. Cronbach’s alpha values for the SE-EBP, the three SE-EBP factors and the OE-EBP were all > .91, suggesting some item redundancy. The SE-EBP was able to distinguish between those with no prior exposure to EBP and those who had completed an introductory EBP unit. Conclusions: While further investigation of the validity of these tools is needed, preliminary testing indicates that the SE-EBP and OE-EBP scales are valid and reliable instruments for measuring health professionals’ confidence in the process and the outcomes of basing their practice on evidence.
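
For reference, the reliability statistic reported here is straightforward to compute. The sketch below implements the standard Cronbach's alpha formula on made-up Likert-scale responses (the data are illustrative, not from the study).

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    Values near 1 indicate high internal consistency; very high values
    (such as the > .91 reported here) can signal item redundancy.
    """
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Illustrative 5-point Likert responses: four respondents, three items.
print(cronbach_alpha([[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 2, 3]]))  # ~0.92
```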

Relevance:

10.00%

Publisher:

Abstract:

Traditional treatments for weight management have focussed on prescribed dietary restriction or regular exercise, or a combination of both. However, recidivism for such prescribed treatments remains high, particularly among the overweight and obese. The aim of this thesis was to investigate voluntary dietary changes in the presence of prescribed mixed-mode exercise, conducted over 16 weeks. With the implementation of a single lifestyle change (exercise), it was postulated that the onerous burden of concomitant dietary and exercise compliance would be reduced, leading to voluntary lifestyle changes in areas such as diet. In addition, the failure of exercise as a single weight loss treatment has been attributed to compensatory energy intakes, although much of the evidence is from acute exercise studies, necessitating investigation of compensatory intakes during a long-term exercise intervention. Following 16 weeks of moderate-intensity exercise, 30 overweight and obese (BMI ≥ 25.00 kg·m⁻²) men and women showed small but statistically significant decreases in mean dietary fat intakes, without compensatory increases in other macronutrient or total energy intakes. Indeed, total energy intakes were significantly lower for men and women following the exercise intervention, owing to the decreases in dietary fat intakes. There was a risk that acceptance of the statistical validity of the small changes to dietary fat intakes may have constituted a Type I error, with false rejection of the null hypothesis. Oro-sensory perceptions of changes in fat loads were therefore investigated to determine whether the measured dietary fat changes were detectable by the human palate. The ability to detect small changes in dietary fat provides sensory feedback for self-initiated dietary changes, but lean and overweight participants were unable to distinguish changes to fat loads of a similar magnitude to those measured in the exercise intervention study. Accuracy of the dietary measurement instrument was improved, and the effects of random error (day-to-day variability) minimised, with the use of a statistically validated 8-day, multiple-pass, 24-hour dietary recall instrument. However, systematic error (under-reporting) may have masked the magnitude of dietary change, particularly the reduction in dietary fat intakes. A purported biomarker, plasma apolipoprotein A-IV (apoA-IV), was subsequently investigated to monitor systematic error in self-reported dietary intakes. Changes in plasma apoA-IV concentrations were directly correlated with increases and decreases in dietary fat intakes, suggesting that this objective marker may be a useful tool for improving the accuracy of dietary measurement in overweight and obese populations, who are susceptible to dietary under-reporting.