863 results for Design quality
Resumo:
Atomisation of an aqueous solution for tablet film coating is a complex process in which multiple factors determine droplet formation and properties. The importance of droplet size for an efficient process and a high-quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogeneous coatings whilst avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film-coat characteristics was investigated using X-ray microcomputed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality-by-design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for production of droplets at a small (20 μm) and a large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction, with the volume median diameter taken as the response. DOE yielded information on the influence that three critical process parameters (pump rate, atomisation pressure and coating-polymer concentration) had on droplet size. The model generated was robust, scoring highly for model fit (R2 = 0.977), predictability (Q2 = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or a quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised-bed coating of tablet cores was performed with either small or large droplets, followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence that smaller droplets formed thinner, more uniform and less porous film coats.
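As a rough illustration of the DOE modelling step described above, the sketch below fits a quadratic response-surface model with a pump-rate × atomisation-pressure interaction by ordinary least squares. The factor levels and droplet sizes are invented for illustration only; they are not the study's measured data.

```python
import numpy as np

# Hypothetical droplet-size data (volume median diameter, um), NOT the
# measured values from the study. Factors: pump rate (g/min),
# atomisation pressure (bar), coating-polymer concentration (% w/w).
X = np.array([
    [10, 1.0, 10], [30, 1.0, 10], [10, 3.0, 10], [30, 3.0, 10],
    [10, 1.0, 20], [30, 1.0, 20], [10, 3.0, 20], [30, 3.0, 20],
    [20, 2.0, 15],
], dtype=float)
y = np.array([35, 60, 22, 30, 45, 75, 28, 40, 42], dtype=float)

def design_matrix(X):
    """Quadratic response-surface terms with a pump x pressure interaction."""
    p, a, c = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), p, a, c, p * a, p ** 2, a ** 2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
y_hat = design_matrix(X) @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R2 = {r2:.3f}")
```

In a real DOE workflow the same model matrix would be generated by the experimental-design software, and predictability (Q2) would be estimated by cross-validation rather than from the training fit.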
Resumo:
Dimensional and form inspections are key to the manufacturing and assembly of products. Product verification can involve a number of different measuring instruments, each operated using its dedicated software. Typically, each of these instruments with its associated software is better suited to verifying a particular pre-specified quality characteristic of the product than the others. The number of different systems and software applications needed to perform a complete measurement of products and assemblies within a manufacturing organisation is therefore expected to be large, and it grows further as advances in measurement technologies are made. A universal software application for any instrument still appears to be only a theoretical possibility, so a need for information integration is apparent. In this paper, the design of an information system to consistently manage (store, search, retrieve, secure) measurement results from various instruments and software applications is introduced. The proposed system rests on two main ideas. First, the structures and formats of measurement files are abstracted away from the data, so that the complexity of, and incompatibility between, different approaches to measurement data modelling are avoided. Second, the information within a file is enriched with meta-information to facilitate its consistent storage and retrieval. To demonstrate the designed information system, a web application is implemented. © Springer-Verlag Berlin Heidelberg 2010.
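The two ideas can be sketched as a toy store that keeps each measurement file as an opaque, format-agnostic blob and attaches searchable meta-information separately. The class and field names below are illustrative assumptions, not the schema from the paper.

```python
import hashlib
from datetime import datetime, timezone

class MeasurementStore:
    """Toy store: raw files stay opaque; metadata drives search/retrieval."""

    def __init__(self):
        self._blobs = {}   # file_id -> raw bytes (format never interpreted)
        self._meta = {}    # file_id -> metadata dict

    def store(self, raw_bytes, instrument, characteristic, part_id):
        # content-addressed id; format of the file is irrelevant here
        file_id = hashlib.sha256(raw_bytes).hexdigest()[:12]
        self._blobs[file_id] = raw_bytes
        self._meta[file_id] = {
            "instrument": instrument,
            "characteristic": characteristic,
            "part_id": part_id,
            "stored_at": datetime.now(timezone.utc).isoformat(),
        }
        return file_id

    def search(self, **criteria):
        """Return file ids whose metadata matches all given criteria."""
        return [fid for fid, m in self._meta.items()
                if all(m.get(k) == v for k, v in criteria.items())]

store = MeasurementStore()
fid = store.store(b"<raw CMM output>", "CMM-01", "flatness", "part-42")
print(store.search(characteristic="flatness"))
```

Because search runs only on the enriched metadata, adding a new instrument or file format never requires changing the storage or query logic.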
Resumo:
X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].
Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.
As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a shared responsibility to optimize the radiation dose delivered in CT examinations. The key to dose optimization is to determine the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics that characterize the radiation dose and image quality of a CT exam. Moreover, if the radiation dose and image quality could be accurately predicted before the exam begins, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models that prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models in clinical practice by developing an organ-based dose monitoring system and an image-based noise-addition software tool for protocol optimization.
More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current condition. The study effectively modeled the anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distribution. The dependence of organ dose coefficients on patient size and scanner models was further evaluated. Distinct from prior work, these studies use the largest number of patient models to date with representative age, weight percentile, and body mass index (BMI) range.
With effective quantification of organ dose under constant tube current conditions, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with doses estimated from Monte Carlo simulations in which the TCM function is explicitly modeled.
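A minimal sketch of the convolution idea is shown below: a z-axis tube-current (TCM) profile is convolved with a scatter kernel so that the dose at each slice includes contributions from neighbouring slices. The Gaussian kernel width, the mA profile, and the organ extent are all invented for illustration and are not the thesis's values.

```python
import numpy as np

z = np.arange(0, 400, 5.0)                         # scan range (mm)
tcm = 150 + 100 * np.exp(-((z - 200) / 80) ** 2)   # hypothetical mA profile

sigma = 30.0                                       # assumed scatter spread (mm)
kernel_z = np.arange(-90, 95, 5.0)
kernel = np.exp(-0.5 * (kernel_z / sigma) ** 2)
kernel /= kernel.sum()                             # normalise: preserves total

# relative dose per slice = TCM profile blurred by the scatter kernel
dose_field = np.convolve(tcm, kernel, mode="same")

# mean relative dose over a hypothetical organ spanning 150-250 mm
organ = (z >= 150) & (z <= 250)
print(f"relative organ dose: {dose_field[organ].mean():.1f}")
```

In the thesis's framework the kernel and scaling would come from Monte Carlo-derived dose spread functions per scanner and phantom; the convolution step itself is what makes the prediction fast enough to run prospectively.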
Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on a commercial software platform (Dose Watch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations, so the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on CT protocol and patient size as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.
With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed quantum noise in clinical images and further validated the correspondence between phantom-based measurements and expected clinical image quality as a function of patient size and scanner attributes.
Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
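Step (1) above rests on the fact that CT quantum noise scales roughly as one over the square root of dose. A minimal sketch of that noise-addition step is given below, using white Gaussian noise as a simplifying assumption; a real implementation, as in the thesis, would shape the injected noise by the scanner's noise power spectrum. All numbers are illustrative.

```python
import numpy as np

def simulate_low_dose(image, sigma_orig, dose_fraction, rng=None):
    """Emulate a scan at dose_fraction of the original dose by injecting
    zero-mean Gaussian noise so total noise follows the 1/sqrt(dose) law."""
    rng = np.random.default_rng(rng)
    # target noise sigma_orig/sqrt(f); variance to add is the difference
    sigma_add = sigma_orig * np.sqrt(1.0 / dose_fraction - 1.0)
    return image + rng.normal(0.0, sigma_add, size=image.shape)

rng = np.random.default_rng(0)
clean = np.full((256, 256), 40.0)                    # uniform phantom (HU)
full_dose = clean + rng.normal(0, 10, clean.shape)   # sigma ~ 10 HU
half_dose = simulate_low_dose(full_dose, 10.0, 0.5, rng=1)
print(f"half-dose noise: {half_dose.std():.1f} HU")  # ~ 10*sqrt(2) = 14.1
```

The key design choice is that only the *additional* noise variance is injected, so the existing clinical image (with its real anatomy and full-dose noise) is reused rather than re-simulated from raw data.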
Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.
Resumo:
Navigation devices used to be bulky and expensive and were not widely commercialized for personal use. Nowadays, electronic devices are increasingly handheld so that they can be used conveniently anytime and anywhere. Almost any mobile phone used today has strong navigational capabilities that work efficiently anywhere on the globe: no matter where you are, you can easily determine your exact location and make your way smoothly to wherever you would like to go. This would not have been possible without efficient, small microwave circuits responsible for the transmission and reception of high-quality navigation signals. This thesis is mainly concerned with the design of novel, highly miniaturized, and efficient filtering components working in the Global Navigation Satellite Systems (GNSS) frequency band, to be integrated within an efficient radio-frequency (RF) front-end module (FEM). A System-on-Package (SoP) integration technique is adopted for the design of all components in this thesis. Two novel miniaturized filters are designed. The first is a wideband filter targeting the complete GNSS band, with a fractional bandwidth of almost 50% at a center frequency of 1.385 GHz; it utilizes a direct inductive-coupling topology to achieve the required wideband performance and also has very good out-of-band rejection and low insertion loss (IL). The second is a dual-band filter covering only the lower and upper GNSS bands, with a rejection notch between the two and very good inter-band rejection. The well-known "divide and conquer" design methodology was applied to this filter to save valuable design and optimization time. Moreover, the performance of two commercially available ultra-low-noise amplifiers (LNAs) is studied.
The complete RF FEM showed promising preliminary performance in terms of noise figure, gain, and bandwidth, outperforming other commercial front-ends in these three aspects. All the designed circuits were fabricated and tested, and the measured results are in good agreement with the simulations.
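The wideband filter's quoted figures can be sanity-checked with simple arithmetic: a fractional bandwidth near 50% at a 1.385 GHz centre frequency implies the passband edges computed below. The edge values are derived here for illustration, not taken from the thesis.

```python
# Fractional bandwidth: FBW = (f_high - f_low) / f0
f0 = 1.385e9          # centre frequency (Hz)
fbw = 0.50            # fractional bandwidth (~50%)
bw = fbw * f0         # absolute bandwidth (Hz)
f_low = f0 - bw / 2
f_high = f0 + bw / 2
print(f"passband: {f_low/1e9:.3f}-{f_high/1e9:.3f} GHz ({bw/1e6:.0f} MHz wide)")
```

A passband of roughly 1.04 to 1.73 GHz comfortably spans the GNSS allocations (approximately 1.16 to 1.61 GHz), which is consistent with the filter targeting the complete GNSS band.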
Resumo:
This thesis presents details of the design and development of novel tools and instruments for scanning tunneling microscopy (STM), and may be considered a repository for several years' worth of development work. The author presents design goals and implementations for two microscopes. First, a novel Pan-type STM was built that could be operated in an ambient environment as a liquid-phase STM. Unique features of this microscope include a unibody frame for increased rigidity, a novel slider component with a large Z-range, a unique wiring scheme and damping mechanism, and a removable liquid cell. The microscope exhibits a high level of mechanical isolation at the tunnel junction and operates excellently as an ambient tool; experiments in liquid are ongoing. Simultaneously, the author worked on designs for a novel low-temperature, ultra-high-vacuum (LT-UHV) instrument, and these are presented as well. A novel stick-slip vertical coarse-approach motor was designed and built. To gauge the performance of the motor, an in situ motion-sensing apparatus was implemented that could measure the step size of the motor to high precision. A new driving circuit for stick-slip inertial motors is also presented, which offers improved performance over the previous driving circuit at a fraction of the cost; the circuit was shown to increase step-size performance by 25%. Finally, a horizontal sample stage was implemented in this microscope. The build of this UHV instrument is currently being finalized. In conjunction with the above design projects, the author was involved in a collaborative project characterizing N-heterocyclic carbene (NHC) self-assembled monolayers (SAMs) on Au(111) films. STM was used to characterize Au substrate quality, both for commercial substrates and for those manufactured via a unique atomic layer deposition (ALD) process by collaborators.
Ambient and UHV STM were then used to characterize the NHC/Au(111) films themselves, and several key properties of these films are discussed. During this study, the author discovered an unexpected surface contaminant, and details of this are also presented. Finally, two models are presented for the nature of the NHC-Au(111) surface interaction based on the observed film properties, and some preliminary theoretical work by collaborators is presented.
Resumo:
This paper presents an FEM analysis conducted to optimally design end mill cutters by verifying the cutting tool forces and stresses for milling the titanium alloy Ti-6Al-4V. Initially, the theoretical tool forces are calculated by considering the cutting edge on a cutting tool as the curve of intersection over a spherical/flat surface, based on the model developed by Lee & Altintas [1]. The cutting-tool parameters giving the lowest tool forces are selected, and the optimal end mill design is decided for different sizes. 3D CAD models of the end mills are then developed and used in finite element analysis to verify the cutting forces for milling Ti-6Al-4V. The cutting tool forces, stresses, strain concentrations, tool wear, and temperature of cutting tools with different geometric shapes are simulated with Ti-6Al-4V as the workpiece material. Finally, the simulated and theoretical values are compared, and the optimal cutting tool designs for different sizes are validated. The present approach aims to improve machined-surface quality and tool life by accounting for the various parameters of the oblique cutting process, namely the axial, radial, and tangential forces. Various simulated test cases are presented to highlight the approach to optimally designing end mill cutters.
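A minimal sketch of the mechanistic force model this kind of analysis builds on (in the Lee & Altintas style) is shown below: tangential, radial, and axial force components on each edge element scale with the instantaneous uncut chip thickness, plus an edge (ploughing) term. The cutting coefficients are placeholder values for a Ti-6Al-4V-like case, not the paper's data.

```python
import numpy as np

# Assumed cutting coefficients (N/mm^2) and edge coefficients (N/mm)
Ktc, Krc, Kac = 1900.0, 760.0, 260.0
Kte, Kre, Kae = 24.0, 43.0, 2.0

def edge_forces(h, db):
    """Force components (N) on one edge element.
    h  : instantaneous uncut chip thickness (mm)
    db : edge element length (mm)
    """
    Ft = Ktc * h * db + Kte * db   # tangential
    Fr = Krc * h * db + Kre * db   # radial
    Fa = Kac * h * db + Kae * db   # axial
    return Ft, Fr, Fa

# Chip thickness varies with immersion angle: h = fz * sin(phi)
fz, db = 0.05, 1.0                     # feed per tooth (mm), element (mm)
phi = np.linspace(0, np.pi, 180)       # full-slot immersion sweep
Ft = np.array([edge_forces(fz * np.sin(p), db)[0] for p in phi])
print(f"peak tangential force per element: {Ft.max():.1f} N")
```

In the paper's workflow, such per-element forces would be integrated along the (helical) cutting edge to obtain total tool forces, which the FEM simulation then verifies.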
Resumo:
Efficient design of material supply processes is a decisive prerequisite for ensuring high availability of materials in assembly. The selection of adequate supply strategies must always be guided by the requirements of the material supply process. The performance requirements for effective material supply are largely determined by the assembly process, and a precisely matching material supply strategy must be set against these requirements. The performance requirements can be formulated in qualitative or quantitative form. Considering quantitative data alone is insufficient: at the time of planning, reliable quantitative data are often unavailable, and the effort to obtain them rarely appears justified. Moreover, the conventional methods frequently used for selecting material supply strategies have the drawback that failure to meet one performance requirement can be compensated by particularly good fulfilment of another (e.g. time vs. quality). To select a material supply strategy while taking both qualitative and quantitative requirements into account, the method of Fuzzy Axiomatic Design is particularly suitable. This method allows the requirements of the material supply process to be matched against the suitability of different material supply strategies.
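The information-axiom calculation behind (fuzzy) axiomatic design can be sketched briefly: each strategy's information content is I = log2(system range / common range) per requirement, and the strategy with the lowest total I best matches the requirements, with no compensation between requirements (an unmet requirement gives infinite I). For brevity the sketch uses crisp intervals; the fuzzy variant replaces them with membership functions. All numbers and strategy names are illustrative.

```python
import math

def info_content(design_range, system_range):
    """Information content I = log2(system range / common range)."""
    lo = max(design_range[0], system_range[0])
    hi = min(design_range[1], system_range[1])
    common = max(0.0, hi - lo)
    if common == 0:
        return float("inf")          # requirement cannot be met at all
    return math.log2((system_range[1] - system_range[0]) / common)

# Requirement: replenishment time (min) must lie in the design range.
design_range = (0, 30)
strategies = {"kanban": (10, 40), "kitting": (20, 80)}   # delivered ranges
scores = {name: info_content(design_range, sr) for name, sr in strategies.items()}
best = min(scores, key=scores.get)
print(best, scores)
```

Because I is summed per requirement and an infinite value can never be offset, a strategy cannot compensate a failed requirement with excellent performance elsewhere, which is exactly the weakness of conventional scoring methods noted above.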
Resumo:
Food drying is frequently used to preserve a product for longer periods. Owing to their high water content, fruit and vegetables spoil easily through biochemical processes within the product, improper storage, and inadequate transport. To avoid such losses, direct drying, the oldest method of long-term preservation, is used; this method is, however, outdated and cannot meet today's challenges. In the present work, a new batch dryer was developed with a diagonal airflow channel along the length of the drying chamber and without baffles. Notwithstanding the undeniable benefit of baffles, they increase construction costs and also raise the pressure drop, so that more energy is consumed in the drying process. To achieve spatially uniform drying without baffles, the food trays were placed diagonally along the length of the dryer. The primary aim of the diagonal channel was to direct the incoming warm air uniformly onto the entire product. The airflow simulation was carried out with ANSYS Fluent on the ANSYS Workbench platform. Two different drying-chamber geometries, diagonal and non-diagonal, were modelled, and the results showed a uniform air distribution for the diagonal airflow design. A series of experiments was conducted to evaluate the design, with potato slices as the drying material. The statistical results show a good correlation coefficient for the airflow distribution (87.09%) between the average predicted and the average measured flow velocity.
To evaluate the effect of uniform air distribution on quality changes, the colour of the product along the entire length of the drying chamber was determined contact-free in an on-line procedure. For this purpose, an imaging box consisting of a camera and lighting was developed. Spatial differences in this quality parameter were chosen as the criterion for assessing the uniformity of drying in the chamber. A decisive factor for a food batch dryer is its energy consumption, so thermodynamic analyses of the dryer were carried out. The energy efficiency of the system under the chosen drying conditions was calculated as 50.16%. The average energy used in the form of electricity to produce 1 kg of dried potatoes was calculated as less than 16.24 MJ/kg, and less than 4.78 MJ per kg of water evaporated, at a drying temperature of 65 °C and a slice thickness of 5 mm. The energy and exergy analyses for the diagonal batch dryer were also compared with those of other batch dryers. The choice of drying temperature, mass flow rate of the drying air, dryer capacity, and heater type are the important parameters for evaluating the energy use of batch dryers. The development of the diagonal batch dryer is a useful and effective way to increase drying homogeneity: the design allows the entire product in the drying chamber to be exposed to uniform air conditions, instead of passing the air from one tray to the next.
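The reported energy figures are specific energy consumptions: electrical energy input divided by the mass of dried product, or by the mass of water evaporated. A quick worked example, with input values invented only to mirror the reported upper bounds:

```python
# Assumed batch figures, chosen for illustration (not measured data):
energy_in_MJ = 15.8          # electrical energy used per batch (MJ)
dried_product_kg = 1.0       # mass of dried potato produced (kg)
water_evaporated_kg = 3.4    # mass of water removed during drying (kg)

sec_product = energy_in_MJ / dried_product_kg    # MJ per kg dried product
sec_water = energy_in_MJ / water_evaporated_kg   # MJ per kg water evaporated
print(f"{sec_product:.2f} MJ/kg product, {sec_water:.2f} MJ/kg water")
```

Both quotients fall under the reported bounds (<16.24 MJ/kg product, <4.78 MJ/kg water), illustrating how the two specific-energy figures relate through the amount of water removed per kilogram of product.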
Resumo:
Aims. To validate the Swedish version of the Sheffield Care Environment Assessment Matrix (S-SCEAM). The instrument's items measure environmental elements important for supporting the needs of older people, conceptualized within eight domains. Methods. Item relevance was assessed by a group of experts and measured using the content validity index (CVI). Test-retest and inter-rater reliability tests were performed. The domain structure was assessed by the inter-rater agreement of a second group of experts and measured using Fleiss' kappa. Results. All items attained a CVI above 0.78, the suggested criterion for excellent content validity. Test-retest reliability showed high stability (96% and 95% for two independent raters, respectively), and inter-rater reliability demonstrated high levels of agreement (95% and 94% on two separate rating occasions). Kappa values were very good for test-retest (κ = 0.903 and 0.869) and inter-rater reliability (κ = 0.851 and 0.832). Domain structure was good; Fleiss' kappa was 0.63 (range 0.45 to 0.75). Conclusion. The S-SCEAM, comprising 210 items and eight domains, showed good content validity and construct validity. The instrument is suggested for use in measuring the quality of the physical environment in residential care facilities for older persons.
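The item-level content validity index used here is simple to compute: the proportion of experts rating an item as relevant (typically 3 or 4 on a 4-point scale). A small sketch, with made-up ratings:

```python
def item_cvi(ratings, relevant=(3, 4)):
    """I-CVI: share of experts rating the item 3 or 4 on a 4-point scale."""
    return sum(r in relevant for r in ratings) / len(ratings)

expert_ratings = [4, 3, 4, 4, 2, 4, 3, 4, 4]   # nine hypothetical experts
cvi = item_cvi(expert_ratings)
print(f"I-CVI = {cvi:.2f}")   # values above 0.78 count as excellent
```

With nine or more experts, an I-CVI above 0.78 is the conventional threshold for excellent content validity, which is the criterion the study applied to all 210 items.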
Resumo:
Thesis (Ph.D.)--University of Washington, 2016-08
Resumo:
Thesis (Master's)--University of Washington, 2016-08
Resumo:
Thesis (Master's)--University of Washington, 2016-06
Resumo:
Background: Improving the transparency of information about the quality of health care providers is one way to improve health care quality. It is assumed that Internet information steers patients toward better-performing health care providers and motivates providers to improve quality. However, the effect of public reporting on hospital quality is still small. One reason is that users find it difficult to understand the formats in which information is presented. Objective: We analyzed the presentation of risk-adjusted mortality rate (RAMR) for coronary angiography in the 10 most commonly used German public report cards to analyze the impact of information presentation features on their comprehensibility. We wanted to determine which information presentation features were utilized, were preferred by users, led to better comprehension, and had effects similar to those reported in evidence-based recommendations described in the literature. Methods: The study consisted of 5 steps: (1) identification of best-practice evidence about the presentation of information on hospital report cards; (2) selection of a single risk-adjusted quality indicator; (3) selection of a sample of designs adopted by German public report cards; (4) identification of the information presentation elements used in public reporting initiatives in Germany; and (5) an online questionnaire, completed by an online panel, conducted to determine whether respondents were able to identify the hospital with the lowest RAMR and whether respondents' hospital choices were associated with particular information design elements.
Results: Evidence-based recommendations were made relating to the following information presentation features relevant to report cards: evaluative tables with symbols, tables without symbols, bar charts, bar charts without symbols, bar charts with symbols, symbols, evaluative word labels, highlighting, order of providers, high values to indicate good performance, explicit statements of whether high or low values indicate good performance, and incomplete data ("N/A" as a value). When investigating the RAMR in a sample of 10 hospitals' report cards, 7 of these information presentation features were identified. Of these, 5 information presentation features improved comprehensibility in a manner reported previously in the literature. Conclusions: To our knowledge, this is the first study to systematically analyze the most commonly used public report card designs in Germany. Best-practice evidence identified in the international literature was in agreement with 5 findings about German report card designs: (1) avoid tables without symbols, (2) include bar charts with symbols, (3) state explicitly whether high or low values indicate good performance, or provide a "good quality" range, (4) avoid incomplete data (N/A given as a value), and (5) rank hospitals by performance. However, these findings are preliminary and should be the subject of further evaluation. The implementation of 4 of these recommendations should not present insurmountable obstacles; ranking hospitals by performance, however, may present substantial difficulties.
Resumo:
This project arose from the need to form a multidisciplinary team, including a person responsible for quality, for the development of a new product at the factory. The product is the metal structure of a car seat backrest. The Advanced Product Quality Planning (APQP) method, widely used in the automotive sector, was applied to the product's development. To better understand the problems that may arise with the new product, two existing products were studied: one manufactured at the internship site and another produced at a different factory of the same group, which share some similarities in process and in design, respectively. Several quality tools were used to explore the problems existing at the factory with the current product and to verify whether some proposed improvement actions were successful. From the outputs of the analysis of internal and external defects, the Control Plan was drawn up, listing the controls needed to be able to predict the quality of the new product.
Resumo:
Thesis (Master's)--University of Washington, 2016-06