868 results for design quality
Abstract:
FIU's campus master plan should portray an overall concept of the University's vision. Its design should represent a distinctive sense of institutional purpose, and its architecture should support the campus design in realizing an ideal academic environment. The present master plan of Florida International University (FIU) offers neither a clear typology of architectural elements nor adequate relationships and connections between buildings. FIU needs to enhance its master plan with an architectural and urban vocabulary that creates a better environment. This thesis will examine FIU's present master plan, explaining the history of its development. Further, it will critically examine the quality of the campus, highlighting the successes and failures of its various parts. The unrealized potential of the campus' original vision will be juxtaposed with the built reality. In addition, FIU's planning strategies will be compared with the master plans of several American universities. Finally, this thesis will propose a set of criteria for the inclusion of a new building in the campus master plan. The Center for International Studies will be the catalyst that brings the university's vision into focus. To demonstrate the validity of these criteria, a new location for the center will be selected, and a schematic architectural proposal will be made.
Abstract:
This thesis chronicles the design and implementation of an Internet/intranet- and database-based application for the quality control of hurricane surface wind observations. A quality-control session consists of selecting the observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an object-oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.
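The storm-relative view described above amounts to shifting each observation by the interpolated storm-center position at the observation's own time. The application itself was written in Java; the Python sketch below, with a made-up linear track and hypothetical names, illustrates only this transformation, not the thesis code.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    lat: float    # degrees north
    lon: float    # degrees east
    time: float   # hours relative to the analysis (storm-center) time

def storm_relative(obs, track):
    """Return the observation's offset from the storm center at its own time.

    `track` maps a time to an interpolated (lat, lon) storm-center position,
    so observations taken at different times line up in one storm-relative view.
    """
    center_lat, center_lon = track(obs.time)
    return obs.lat - center_lat, obs.lon - center_lon

# Hypothetical linear track: center at (25N, 75W) moving west at 0.5 deg/hour.
track = lambda t: (25.0, -75.0 - 0.5 * t)
print(storm_relative(Observation(lat=26.0, lon=-76.5, time=2.0), track))
```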
Abstract:
Despite research showing the benefits of glycemic control, it remains suboptimal among adults with diabetes in the United States. Possible reasons include unaddressed risk factors as well as lack of awareness of its immediate and long-term consequences. The objectives of this study were, using cross-sectional data, to 1) ascertain the association between suboptimal (hemoglobin A1c (HbA1c) ≥7%), borderline (HbA1c 7-8.9%), and poor (HbA1c ≥9%) glycemic control and potentially new risk factors (e.g., work characteristics), and 2) assess whether aspects of poor health and well-being, such as poor health-related quality of life (HRQOL), unemployment, and missed work, are associated with glycemic control; and, using prospective data, 3) assess the relationship between mortality risk and glycemic control in US adults with type 2 diabetes. Data from the 1988-1994 and 1999-2004 National Health and Nutrition Examination Surveys were used. HbA1c values were used to create dichotomous glycemic control indicators. Binary logistic regression models were used to assess relationships between risk factors, employment status, and glycemic control. Multinomial logistic regression analyses were conducted to assess relationships between glycemic control and HRQOL variables. Zero-inflated Poisson regression models were used to assess relationships between missed work days and glycemic control. Cox proportional hazards models were used to assess effects of glycemic control on mortality risk. Using STATA software, analyses were weighted to account for the complex survey design and non-response. Multivariable models adjusted for socio-demographics and body mass index, among other variables. Results revealed that being a farm worker and working over 40 hours/week were risk factors for suboptimal glycemic control. A greater number of days of poor mental health was associated with suboptimal, borderline, and poor glycemic control. A greater number of days of inactivity was associated with poor glycemic control, while a greater number of days of poor physical health was associated with borderline glycemic control. There were no statistically significant relationships between glycemic control and self-reported general health, employment, or missed work. Finally, having an HbA1c value less than 6.5% was protective against mortality. The findings suggest that work-related factors are important in a person's ability to reach optimal diabetes management levels. Poor glycemic control appears to have significant detrimental effects on HRQOL.
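The analyses were run in STATA with full survey-design weighting. As a simplified illustration of the first modelling step (dichotomizing HbA1c and fitting a weighted binary logistic regression), here is a Python sketch using statsmodels on made-up data with integer pseudo-weights; the strata and PSU structure of NHANES is not modeled here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "hba1c": rng.normal(7.5, 1.5, 500),             # made-up HbA1c values (%)
    "hours_per_week": rng.integers(20, 61, 500),    # made-up work hours
    "pseudo_weight": rng.integers(1, 5, 500),       # made-up survey weights
})

# Dichotomous glycemic control indicators as defined above.
df["suboptimal"] = (df["hba1c"] >= 7).astype(int)   # HbA1c >= 7%
df["poor"] = (df["hba1c"] >= 9).astype(int)         # HbA1c >= 9%

# Weighted binary logistic regression of suboptimal control on work hours.
X = sm.add_constant(df[["hours_per_week"]])
fit = sm.GLM(df["suboptimal"], X, family=sm.families.Binomial(),
             freq_weights=df["pseudo_weight"]).fit()
print(fit.params)  # coefficients on the log-odds scale
```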
Abstract:
The different characteristics and needs of mobile device users, the situations in which these devices are operated, and the limitations and characteristics of the devices themselves are all factors that influence usability and ergonomics, two elements essential for achieving successful interaction between users and devices. This research aims to identify characteristics of interface design for mobile device applications, focusing on design, visual publishing, content editing, and the actual process of creating these interfaces, with a view to guaranteeing quality interaction through touch technology while observing service limitations, the opportunities offered by the devices, and the application requirements. The study examines the interface of the mobile device application "Brasil 247", which provides news broadcasts, applying the concepts of usability and ergonomics mainly to the adaptation, searching, and browsing of news articles, and clarifies the processes and techniques necessary to carry out interaction tests that evaluate the usability of the interface.
Abstract:
Using survey data from 358 online customers, the study finds that the e-service quality construct conforms to the structure of a third-order factor model that links online service quality perceptions to distinct and actionable dimensions, including (1) website design, (2) fulfilment, (3) customer service, and (4) security/privacy. Each dimension is found to consist of several attributes that define the basis of e-service quality perceptions. A comprehensive specification of the construct, which includes attributes not covered in existing scales, is developed. The study contrasts a formative model consisting of 4 dimensions and 16 attributes against a reflective conceptualization. The results of this comparison indicate that studies using an incorrectly specified model overestimate the importance of certain e-service quality attributes. Global fit criteria are also found to support the detection of measurement misspecification. Meta-analytic data from 31,264 online customers are used to show that the developed measurement predicts customer behavior better than widely used scales, such as WebQual and E-S-Qual. The results show that the new measurement enables managers to assess e-service quality more accurately and predict customer behavior more reliably.
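Estimating a formative third-order model properly requires SEM or PLS machinery. As a rough illustration of how such a measure is used managerially, the sketch below builds mean composites for the four dimensions from made-up attribute ratings and regresses a synthetic behavior proxy on them; all names and weights are invented.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Made-up 7-point ratings: 16 attributes feeding the 4 formative dimensions.
items = pd.DataFrame(rng.integers(1, 8, (200, 16)),
                     columns=[f"attr{i+1}" for i in range(16)])
X = pd.DataFrame({
    "website_design": items.iloc[:, 0:4].mean(axis=1),
    "fulfilment": items.iloc[:, 4:8].mean(axis=1),
    "customer_service": items.iloc[:, 8:12].mean(axis=1),
    "security_privacy": items.iloc[:, 12:16].mean(axis=1),
})
# Synthetic behavior proxy built with known dimension weights.
true_w = np.array([0.4, 0.3, 0.2, 0.1])
y = X.values @ true_w + rng.normal(0, 0.3, 200)

model = LinearRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_.round(2))))  # recovered dimension weights
```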
Abstract:
Atomisation of an aqueous solution for tablet film coating is a complex process with multiple factors determining droplet formation and properties. The importance of droplet size for an efficient process and a high quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogenous coatings whilst simultaneously avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film coat characteristics was investigated using X-ray microcomputed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality by design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for production of droplets at a small (20 μm) and large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction and the volume median diameter taken as a response. DOE yielded information on the relationship that three critical process parameters (pump rate, atomisation pressure, and coating-polymer concentration) had with droplet size. The model generated was robust, scoring highly for model fit (R2 = 0.977), predictability (Q2 = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised bed coating of tablet cores was performed with either small or large droplets followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence revealing that smaller droplets formed thinner, more uniform and less porous film coats.
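The reported model structure (linear and quadratic effects plus a pump-rate by atomisation-pressure interaction) can be reproduced in miniature with ordinary least squares. In the Python sketch below, the factor ranges, coefficients, and responses are all invented, so its fit statistics are not those of the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 27  # made-up number of DOE runs
df = pd.DataFrame({
    "pump": rng.uniform(5, 25, n),         # pump rate, assumed units g/min
    "pressure": rng.uniform(0.5, 3.0, n),  # atomisation pressure, assumed bar
    "conc": rng.uniform(5, 20, n),         # coating-polymer concentration, % w/w
})
# Synthetic droplet size with a quadratic pressure effect and a
# pump x pressure interaction, mirroring the reported model structure.
df["d50"] = (60 - 14 * df["pressure"] + 1.1 * df["pump"] + 0.7 * df["conc"]
             + 2.0 * df["pressure"] ** 2
             - 0.35 * df["pump"] * df["pressure"]
             + rng.normal(0, 2, n))

fit = smf.ols("d50 ~ pump + pressure + conc + I(pressure**2) + pump:pressure",
              data=df).fit()
print(fit.rsquared)  # fit of the synthetic data, not the study's R2 = 0.977
```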
Abstract:
Dimensional and form inspections are key to the manufacturing and assembly of products. Product verification can involve a number of different measuring instruments operated using their dedicated software. Typically, each of these instruments with its associated software is more suitable than the others for verifying a pre-specified quality characteristic of the product. The number of different systems and software applications needed to perform a complete measurement of products and assemblies within a manufacturing organisation is therefore expected to be large, and it grows larger as advances in measurement technologies are made. The idea of a universal software application for any instrument still appears to be only a theoretical possibility, so a need for information integration is apparent. In this paper, the design of an information system to consistently manage (store, search, retrieve, secure) measurement results from various instruments and software applications is introduced. The first of the two main ideas underlying the proposed system is to abstract the structures and formats of measurement files away from the data, so that complexity and incompatibility between different approaches to measurement data modelling are avoided. The second is to enrich the information within a file with meta-information to facilitate its consistent storage and retrieval. To demonstrate the designed information system, a web application is implemented.
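The two ideas, opaque storage of the native file plus searchable meta-information, suggest a very small data model. The following Python sketch is a hypothetical illustration, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class MeasurementRecord:
    """One measurement file kept as an opaque blob plus searchable metadata."""
    raw_bytes: bytes       # native file content, structure deliberately untouched
    file_format: str       # identifier of the originating instrument software
    metadata: dict = field(default_factory=dict)  # enrichment used for retrieval

store = []
store.append(MeasurementRecord(
    raw_bytes=b"...",                 # placeholder content
    file_format="cmm-vendor-x",       # hypothetical instrument software
    metadata={"part": "housing-17", "characteristic": "flatness",
              "instrument": "CMM", "date": "2010-05-04"},
))

# Retrieval never parses raw_bytes; it filters on the enriched metadata only.
hits = [r for r in store if r.metadata.get("characteristic") == "flatness"]
print(len(hits))
```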
Abstract:
X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].
Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.
As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a shared responsibility to optimize the radiation dose of CT examinations. The key to dose optimization is determining the minimum amount of radiation that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics that characterize the radiation dose and image quality of a CT exam. Moreover, if accurate predictions of radiation dose and image quality were possible before the exam begins, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models that prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to translate the theoretical models into clinical practice by developing an organ-based dose monitoring system and an image-based noise-addition software for protocol optimization.
More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current. Distinct from prior work, the study modeled anatomical diversity and complexity using the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges, and further evaluated the dependence of organ dose coefficients on patient size and scanner model.
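Organ dose coefficients of this kind are commonly reported as a scanner-output multiplier that falls roughly exponentially with patient diameter. The Python sketch below shows that general parameterization with illustrative constants; these are not fitted values from the chapter.

```python
import math

def organ_dose(ctdi_vol_mgy, diameter_cm, h0=1.2, b=0.04, d0=30.0):
    """Organ dose (mGy) as scanner output times a size-dependent coefficient.

    h0 is the coefficient for a reference patient of effective diameter d0;
    the exponential decrease with diameter follows the form commonly fitted
    in the organ-dose literature. All constants here are illustrative only.
    """
    h = h0 * math.exp(-b * (diameter_cm - d0))
    return ctdi_vol_mgy * h

# Same scanner output, two patient sizes: smaller patients absorb more dose.
print(organ_dose(10.0, 25.0))  # ~14.7 mGy
print(organ_dose(10.0, 35.0))  # ~9.8 mGy
```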
With effective quantification of organ dose under constant tube current condition, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated based on Monte Carlo simulations with TCM function explicitly modeled.
Chapter 5 aims to implement the organ dose-estimation framework in clinical practice to develop an organ dose-monitoring program based on a commercial software (Dose Watch, GE Healthcare, Waukesha, WI). In the first phase of the study we focused on body CT examinations, and so the patient’s major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on CT protocol and patient size as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.
With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.
Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
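Step (1) rests on the fact that quantum noise scales inversely with the square root of dose, so an image at dose fraction f can be emulated by adding zero-mean noise with variance sigma_full^2 * (1/f - 1). A minimal Python sketch of this magnitude scaling follows; real noise-insertion tools also reproduce the correlated noise texture, which this ignores.

```python
import numpy as np

def simulate_low_dose(image, sigma_full, dose_fraction, rng=None):
    """Add zero-mean Gaussian noise so the image mimics a reduced-dose scan.

    Quantum noise scales as 1/sqrt(dose), so the extra variance needed on top
    of the full-dose image is sigma_full**2 * (1/dose_fraction - 1).
    """
    rng = rng or np.random.default_rng()
    sigma_add = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
    return image + rng.normal(0.0, sigma_add, image.shape)

rng = np.random.default_rng(3)
full = 40.0 + rng.normal(0.0, 10.0, (64, 64))    # toy image, sigma ~ 10 HU
half = simulate_low_dose(full, 10.0, 0.5, rng)   # emulated 50% dose
print(full.std(), half.std())                    # ~10 vs ~14 (10 * sqrt 2)
```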
Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.
Abstract:
Navigation devices used to be bulky and expensive and were not widely commercialized for personal use. Nowadays, useful electronic devices are becoming handheld so that they can be conveniently used anytime and anywhere. Almost any mobile phone used today has strong navigational capabilities that work efficiently anywhere on the globe: no matter where you are, you can easily determine your exact location and make your way smoothly to wherever you would like to go. This would not have been possible without efficient and small microwave circuits responsible for the transmission and reception of high-quality navigation signals. This thesis is mainly concerned with the design of novel, highly miniaturized, and efficient filtering components working in the Global Navigation Satellite Systems (GNSS) frequency band, to be integrated within an efficient radio frequency (RF) front-end module (FEM). A System-on-Package (SoP) integration technique is adopted for the design of all the components in this thesis. Two novel miniaturized filters are designed. One is a wideband filter targeting the complete GNSS band, with a fractional bandwidth of almost 50% at a center frequency of 1.385 GHz; it utilizes a direct inductive coupling topology to achieve the required wideband performance and also has very good out-of-band rejection and low insertion loss (IL). The other is a dual-band filter covering only the lower and upper GNSS bands, with a rejection notch between the two bands and very good inter-band rejection. The well-known "divide and conquer" design methodology was applied to this filter to save valuable design and optimization time. Moreover, the performance of two commercially available ultra-low-noise amplifiers (LNAs) is studied. The complete RF FEM showed promising preliminary performance in terms of noise figure, gain, and bandwidth, outperforming other commercial front-ends in all three aspects. All the designed circuits are fabricated and tested, and the measured results are found to be in good agreement with the simulations.
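As a check on the wideband specification, fractional bandwidth is simply the passband width divided by the center frequency. The band edges below are hypothetical, chosen only to reproduce the quoted figures of roughly 50% centered at 1.385 GHz.

```python
def fractional_bandwidth(f_low_ghz, f_high_ghz):
    """Fractional bandwidth relative to the arithmetic center frequency."""
    f_center = (f_low_ghz + f_high_ghz) / 2.0
    return (f_high_ghz - f_low_ghz) / f_center

# Hypothetical band edges: about 50% fractional bandwidth at 1.385 GHz center.
print(fractional_bandwidth(1.039, 1.731))  # ~0.50
```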
Abstract:
This thesis presents details of the design and development of novel tools and instruments for scanning tunneling microscopy (STM), and may be considered as a repository for several years' worth of development work. The author presents design goals and implementations for two microscopes. First, a novel Pan-type STM was built that could be operated in an ambient environment as a liquid-phase STM. Unique features of this microscope include a unibody frame, for increased microscope rigidity, a novel slider component with large Z-range, a unique wiring scheme and damping mechanism, and a removable liquid cell. The microscope exhibits a high level of mechanical isolation at the tunnel junction, and operates excellently as an ambient tool. Experiments in liquid are on-going. Simultaneously, the author worked on designs for a novel low temperature, ultra-high vacuum (LT-UHV) instrument, and these are presented as well. A novel stick-slip vertical coarse approach motor was designed and built. To gauge the performance of the motor, an in situ motion sensing apparatus was implemented, which could measure the step size of the motor to high precision. A new driving circuit for stick-slip inertial motors is also presented, that offers improved performance over our previous driving circuit, at a fraction of the cost. The circuit was shown to increase step size performance by 25%. Finally, a horizontal sample stage was implemented in this microscope. The build of this UHV instrument is currently being finalized. In conjunction with the above design projects, the author was involved in a collaborative project characterizing N-heterocyclic carbene (NHC) self-assembled monolayers (SAMs) on Au(111) films. STM was used to characterize Au substrate quality, for both commercial substrates and those manufactured via a unique atomic layer deposition (ALD) process by collaborators. Ambient and UHV STM was then also used to characterize the NHC/Au(111) films themselves, and several key properties of these films are discussed. During this study, the author discovered an unexpected surface contaminant, and details of this are also presented. Finally, two models are presented for the nature of the NHC-Au(111) surface interaction based on the observed film properties, and some preliminary theoretical work by collaborators is presented.
Abstract:
This paper presents an FEM analysis conducted to optimally design end mill cutters by verifying the cutting tool forces and stresses for milling the titanium alloy Ti-6Al-4V. Initially, the theoretical tool forces are calculated by considering the cutting edge of a cutting tool as the curve of an intersection over a spherical/flat surface, based on the model developed by Lee & Altintas [1]. The cutting tool parameters giving the lowest tool forces are taken, and an optimal end mill design is decided for different sizes. Then 3D CAD models of the end mills are developed and used in the finite element method to verify the cutting forces for milling Ti-6Al-4V. The cutting tool forces, stress and strain concentrations, tool wear, and temperature of cutting tools with different geometric shapes are simulated with Ti-6Al-4V as the workpiece material. Finally, the simulated and theoretical values are compared, and the optimal cutting tool designs for different sizes are validated. The present approach aims to improve machined surface quality and tool life by accounting for the various parameters of the oblique cutting process, namely the axial, radial, and tangential forces. Various simulated test cases are presented to highlight the approach of optimally designing end mill cutters.
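Mechanistic force models of the Lee & Altintas type resolve the cutting force into tangential, radial, and axial components, each the sum of a shearing term proportional to chip thickness and an edge term. A Python sketch of the tangential component with illustrative coefficients, not values from the paper:

```python
def tangential_force(Ktc, Kte, a_p, h):
    """Mechanistic tangential milling force: shearing plus edge contribution.

    Ktc: cutting force coefficient (N/mm^2), Kte: edge coefficient (N/mm),
    a_p: axial depth of cut (mm), h: instantaneous chip thickness (mm).
    Radial and axial components take the same form with their own
    Krc/Kre and Kac/Kae coefficients.
    """
    return Ktc * a_p * h + Kte * a_p

# Illustrative coefficients of the right order of magnitude for a titanium
# alloy; they are not values taken from the paper.
print(tangential_force(Ktc=1900.0, Kte=24.0, a_p=2.0, h=0.08))  # newtons
```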
Abstract:
Efficient design of material supply processes is a decisive prerequisite for ensuring high availability of materials in assembly. The selection of adequate supply strategies must always be guided by the requirements of the material supply process. The performance requirements for effective material supply are largely determined by the assembly process, and these requirements must be matched with a precisely fitting material supply strategy. The performance requirements can be formulated in qualitative or quantitative form. Considering quantitative data alone is insufficient: at the time of planning, reliable quantitative data are often unavailable, and the effort of obtaining them rarely seems justified. Moreover, the conventional methods frequently used for selecting material supply strategies have the drawback that failure to meet one performance requirement can be compensated by particularly good fulfilment of another (time vs. quality). To select a material supply strategy while accounting for both qualitative and quantitative requirements, the method of Fuzzy Axiomatic Design is particularly suitable: it allows the requirements of the material supply process to be matched against the suitability of different material supply strategies.
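Axiomatic design scores a candidate by its information content, I = log2(system range / common range), where the common range is the overlap between what the candidate delivers and what the requirement demands; the fuzzy variant replaces these crisp ranges with fuzzy numbers. A crisp-case Python sketch with invented lead-time figures:

```python
import math

def information_content(deliver_low, deliver_high, req_low, req_high):
    """Axiomatic-design information content I = log2(system/common range).

    The system range is what a supply strategy can deliver; the common range
    is its overlap with the requirement. A smaller overlap means a higher I,
    i.e. a worse candidate; I = 0 means the requirement is always met.
    """
    system = deliver_high - deliver_low
    common = max(0.0, min(deliver_high, req_high) - max(deliver_low, req_low))
    if common == 0.0:
        return math.inf  # the strategy can never satisfy the requirement
    return math.log2(system / common)

# Invented requirement: replenishment lead time of 0 to 2 hours.
print(information_content(0.5, 2.5, 0.0, 2.0))  # kanban-style supply, I ~ 0.42
print(information_content(0.0, 1.0, 0.0, 2.0))  # line-side stock, I = 0
```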
Abstract:
Food drying is frequently used to make a product keep for longer. Because of their high water content, fruit and vegetables are highly perishable through biochemical processes within the product, improper storage, and inadequate transport. To avoid such losses, direct sun drying, the oldest method of long-term preservation, is used; this method, however, is outdated and cannot meet today's challenges. In the present work a new batch dryer was developed, with a diagonal airflow channel along the length of the drying chamber and without baffles. Notwithstanding the undeniable benefit of baffles, they increase construction costs and raise the pressure drop, so the drying process consumes more energy. To achieve spatially uniform drying without baffles, the food trays were placed diagonally along the length of the dryer. The primary aim of the diagonal channel was to direct the incoming warm air uniformly onto the entire product. The airflow was simulated with ANSYS Fluent on the ANSYS Workbench platform. Two different drying-chamber geometries, diagonal and non-diagonal, were modelled, and the results showed a uniform air distribution for the diagonal airflow design. A series of experiments was conducted to evaluate the design, with potato slices as the drying material. The statistical results show a good correlation coefficient (87.09%) between the average predicted and the average measured flow velocities. To evaluate the effect of the uniform air distribution on quality changes, the colour of the product was determined contact-free and on-line along the entire length of the drying chamber; for this purpose an imaging box consisting of a camera and illumination was developed. Spatial differences in this quality parameter were chosen as the criterion for assessing the uniformity of drying in the chamber. A decisive aspect of a food batch dryer is its energy consumption, so thermodynamic analyses of the dryer were carried out. The energy efficiency of the system under the chosen drying conditions was calculated as 50.16%. The average energy used, in the form of electricity, was calculated as less than 16.24 MJ per kg of dried potatoes and less than 4.78 MJ per kg of water evaporated, at a very high temperature of 65°C and a slice thickness of 5 mm. The energy and exergy analyses of the diagonal batch dryer were also compared with those of other batch dryers. Drying temperature, mass flow rate of the drying air, dryer capacity, and heater type are the important parameters for evaluating the energy use of batch dryers. The diagonal batch dryer is a useful and effective development for increasing drying homogeneity: the design exposes the entire product in the drying chamber to uniform air conditions, instead of conducting the air from one tray to the next.
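The reported energy figures can be related through two simple ratios: the specific energy per kilogram of product or of water removed, and the share of the electrical input that went into evaporation. The batch values in the Python sketch below are invented, chosen only to be of the same order as the figures above.

```python
LATENT_HEAT_MJ_PER_KG = 2.26  # approximate enthalpy of vaporization of water

def specific_energy(electric_mj, dried_kg, water_removed_kg):
    """Energy per kg of dried product and per kg of water removed."""
    return electric_mj / dried_kg, electric_mj / water_removed_kg

def energy_efficiency(electric_mj, water_removed_kg):
    """Fraction of the electrical input that went into evaporating water."""
    return water_removed_kg * LATENT_HEAT_MJ_PER_KG / electric_mj

# Invented batch: 100 MJ of electricity, 6.2 kg dried potatoes, 21 kg water.
per_product, per_water = specific_energy(100.0, 6.2, 21.0)
print(per_product, per_water)           # ~16.1 and ~4.8 MJ/kg
print(energy_efficiency(100.0, 21.0))   # ~0.47
```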
Abstract:
Aims. To validate the Swedish version of the Sheffield Care Environment Assessment Matrix (S-SCEAM). The instrument's items measure environmental elements important for supporting the needs of older people, conceptualized within eight domains. Methods. Item relevance was assessed by a group of experts and measured using the content validity index (CVI). Test-retest and inter-rater reliability tests were performed. The domain structure was assessed by the inter-rater agreement of a second group of experts, measured using Fleiss' kappa. Results. All items attained a CVI above 0.78, the suggested criterion for excellent content validity. Test-retest reliability showed high stability (96% and 95% for two independent raters, respectively), and inter-rater reliability demonstrated high levels of agreement (95% and 94% on two separate rating occasions). Kappa values were very good for test-retest (κ = 0.903 and 0.869) and inter-rater reliability (κ = 0.851 and 0.832). The domain structure was good; Fleiss' kappa was 0.63 (range 0.45 to 0.75). Conclusion. The S-SCEAM, comprising 210 items and eight domains, showed good content and construct validity. The instrument is suggested for measuring the quality of the physical environment in residential care facilities for older persons.
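The item-level CVI used here is simply the share of experts rating an item 3 or 4 on a 4-point relevance scale, judged against the 0.78 criterion cited above. A small Python sketch with a hypothetical nine-expert panel:

```python
def item_cvi(ratings, relevant=(3, 4)):
    """Item-level content validity index: the share of experts who rate an
    item 3 or 4 on a 4-point relevance scale."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Hypothetical nine-expert panel rating one environmental item.
print(item_cvi([4, 4, 3, 4, 3, 4, 4, 2, 4]))  # 8/9 ~ 0.89, above the 0.78 cutoff
```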
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08