878 results for Critical current degradation
Abstract:
Advancements in retinal imaging technologies have drastically improved the quality of eye care in the past two decades. Scanning laser ophthalmoscopy (SLO) and optical coherence tomography (OCT) are two examples of critical imaging modalities for the diagnosis of retinal pathologies. However, current-generation SLO and OCT systems have limited diagnostic capability due to the following factors: the use of bulky tabletop systems, monochromatic imaging, and resolution degradation due to ocular aberrations and diffraction.
Bulky tabletop SLO and OCT systems are incapable of imaging patients who are supine, under anesthesia, or otherwise unable to maintain the required posture and fixation. Monochromatic SLO and OCT imaging prevents the identification of various color-specific diagnostic markers visible with color fundus photography, such as those of neovascular age-related macular degeneration. Resolution degradation due to ocular aberrations and diffraction has prevented the imaging of photoreceptors close to the fovea without the use of adaptive optics (AO), which requires bulky and expensive components that limit the potential for widespread clinical use.
In this dissertation, techniques for extending the diagnostic capability of SLO and OCT systems are developed. These techniques include design strategies for miniaturizing and combining SLO and OCT to permit multi-modal, lightweight handheld probes to extend high quality retinal imaging to pediatric eye care. In addition, a method for extending true color retinal imaging to SLO to enable high-contrast, depth-resolved, high-fidelity color fundus imaging is demonstrated using a supercontinuum light source. Finally, the development and combination of SLO with a super-resolution confocal microscopy technique known as optical photon reassignment (OPRA) is demonstrated to enable high-resolution imaging of retinal photoreceptors without the use of adaptive optics.
Abstract:
Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels and heat pumps, as they have the potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, so does the requirement for maintenance management. The standard routine for building maintenance is inspection, which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which incur costs in component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation-based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes; these processes determine the limits based on the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The identified degradation points for each case study, from a PF, a GP and a hybrid (PF and GP combined) DbM implementation, are assessed against known degradation points.
The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components which are in continuous operation. For components which have seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine the GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from the operation of BSCs.
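As a generic illustration of the particle-filter side of DbM (a minimal sketch, not the thesis's implementation; the drift, noise and limit values are hypothetical), a bootstrap particle filter can track a noisy degradation metric and flag the time step at which the estimated state first crosses a maintenance limit:

```python
import math
import random

def particle_filter_degradation(observations, n_particles=500,
                                drift=0.05, process_noise=0.02,
                                obs_noise=0.1, limit=1.0, seed=42):
    """Track a scalar degradation metric with a bootstrap particle filter.

    Returns the per-step posterior mean estimates and the first step at
    which the estimate crosses `limit` (None if it never does).
    """
    rng = random.Random(seed)
    particles = [0.0] * n_particles
    estimates, alarm_step = [], None
    for t, y in enumerate(observations):
        # Propagate: assumed linear drift plus Gaussian process noise.
        particles = [p + drift + rng.gauss(0.0, process_noise)
                     for p in particles]
        # Weight each particle by the Gaussian likelihood of the observation.
        weights = [math.exp(-0.5 * ((y - p) / obs_noise) ** 2)
                   for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Posterior mean estimate of the degradation state.
        est = sum(w * p for w, p in zip(weights, particles))
        estimates.append(est)
        if alarm_step is None and est >= limit:
            alarm_step = t
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates, alarm_step
```

Feeding the filter a steadily drifting observation series raises the maintenance alarm near the step where the underlying metric reaches the limit, which is the behaviour the DbM scheduling relies on.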
Abstract:
Critical bed shear stress for incipient motion has been determined for biogenic free-living coralline algae known as maërl. Maërl from three different sedimentary environments (beach, intertidal, and open marine) in Galway Bay, on the west coast of Ireland, have been analysed in a rotating annular flume and a linear flume. Velocity profile measurements of the benthic boundary layer, using an Acoustic Doppler Velocimeter, have been obtained in four different velocity experiments. The bed shear stress has been determined using three methods: Law of the Wall, Turbulent Kinetic Energy and Reynolds Stress. The critical Shields parameter has been estimated as a non-dimensional mobility number and the results have been compared with the Shields curve for natural sand. Maërl particles fall below this curve because their greater angularity allows grains to be mobilised more easily than hydraulically equivalent particles. From previous work, the relationship between grain shape and the settling velocity of maërl suggests that the roughness is greatest for intertidal maërl particles. During critical shear stress determinations, beds of such rough particles exhibited the greatest critical shear stress, probably because the particle thalli interlocked and resisted entrainment. The Turbulent Kinetic Energy methodology gives the most consistent results, agreeing with previous comparative studies. Rarely documented maërl megaripples were observed in the rotating annular flume and are hypothesised to form at velocities ~10 cm s⁻¹ higher than the critical threshold velocity, where tidal currents, oscillatory flow or combined wave-current interaction results in the preferential transport of maërl. A determination of the critical bed shear stress of maërl allows its mobility and rate of erosion and deposition to be evaluated spatially in subsequent applications to biological conservation management.
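The Turbulent Kinetic Energy method referenced above can be sketched as follows (a textbook formulation, not code from this study; the proportionality constant C1 ≈ 0.19 and a seawater density of 1025 kg m⁻³ are commonly used assumptions):

```python
import statistics

def tke_bed_shear_stress(u, v, w, rho=1025.0, c1=0.19):
    """Estimate bed shear stress (Pa) from ADV velocity time series (m/s).

    Uses the turbulent kinetic energy method: tau = C1 * rho * TKE,
    where TKE = 0.5 * (var(u') + var(v') + var(w')) per unit mass and
    C1 ~ 0.19 is an empirically derived proportionality constant.
    """
    tke = 0.5 * (statistics.pvariance(u)
                 + statistics.pvariance(v)
                 + statistics.pvariance(w))
    return c1 * rho * tke
```

In practice the three velocity series come from a single-point ADV record within the benthic boundary layer, and the variances capture the turbulent fluctuations about the mean flow.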
Abstract:
The commodification of natural resources and the pursuit of continuous growth have resulted in environmental degradation, depletion, and disparity in access to these life-sustaining resources, including water. Utility-based objectification and exploitation of water in some societies has brought us to the brink of crisis through an apathetic disregard for present and future generations. The ongoing depletion and degradation of the world’s water sources, coupled with a reliance on Western knowledge and the continued omission of Indigenous knowledge to manage our relationship with water, have unduly burdened many, but particularly so for Indigenous communities. The goal of my thesis research is to call attention to and advance the value and validity of using both Indigenous and Western knowledge systems (also known as Two-Eyed Seeing) in water research and management to better care for water. To achieve this goal, I used a combined systematic and realist review method to identify and synthesize the peer-reviewed, integrative water literature, followed by semi-structured interviews with first authors of the exemplars from the included literature to identify the challenges and insights that researchers have experienced in conducting integrative water research. Findings suggest that these authors recognize that many previous attempts to integrate Indigenous knowledges have been tokenistic rather than meaningful, and that new methods for knowledge implementation are needed. Community-based participatory research methods, and the associated tenets of balancing power, fostering trust, and community ownership over the research process, emerged as a pathway towards the meaningful implementation of Indigenous and Western knowledge systems. Data also indicate that engagement and collaborative governance structures developed from a position of mutual respect are integral to the realization of a given project.
The recommendations generated from these findings offer support for future Indigenous-led research and partnerships through the identification and examination of approaches that facilitate the meaningful implementation of Indigenous and Western knowledge systems in water research and management. Asking Western science questions and seeking Indigenous science solutions does not appear to be working; instead, the co-design of research projects and asking questions directed at the problem rather than the solution better lends itself to the strengths of Indigenous science.
Abstract:
With the objective of improving reactor physics calculations for 2D and 3D nuclear reactors via the Diffusion Equation, an adaptive automatic finite element remeshing method, based on elementary area (2D) or volume (3D) constraints, has been developed. The adaptive remeshing technique, guided by an a posteriori error estimator, makes use of two external mesh generator programs: Triangle and TetGen. The use of these free external finite element mesh generators and an adaptive remeshing technique based on the current field continuity shows that they are powerful tools for improving the neutron flux distribution calculation and, consequently, the power solution of the reactor core, even though they have a minor influence on the critical coefficient of the calculated reactor core examples. Two numerical examples are presented: the 2D IAEA reactor core numerical benchmark and the 3D model of the Argonauta research reactor, built in Brazil.
Abstract:
In this work, the existing understanding of flame spread dynamics is enhanced through an extensive study of the heat transfer from flames spreading vertically upwards across 5 cm wide, 20 cm tall samples of extruded poly(methyl methacrylate) (PMMA). These experiments have provided highly spatially resolved measurements of flame-to-surface heat flux and material burning rate at the critical length scale of interest, with a level of accuracy and detail unmatched by previous empirical or computational studies. Using these measurements, a wall flame model was developed that describes a flame’s heat feedback profile (both in the continuous flame region and the thermal plume above) solely as a function of material burning rate. Additional experiments were conducted to measure flame heat flux and sample mass loss rate as flames spread vertically upwards over the surface of seven other commonly used polymers, two of which are glass-reinforced composite materials. Using these measurements, our wall flame model has been generalized such that it can predict heat feedback from flames supported by a wide range of materials. For the seven materials tested here – which present a varied range of burning behaviors including dripping, polymer melt flow, sample burnout, and heavy soot formation – model-predicted flame heat flux has been shown to match experimental measurements (taken across the full length of the flame) with an average accuracy of 3.9 kW m⁻² (approximately 10–15% of peak measured flame heat flux). This flame model has since been coupled with a powerful solid phase pyrolysis solver, ThermaKin2D, which computes the transient rate of gaseous fuel production of constituents of a pyrolyzing solid in response to an external heat flux, based on fundamental physical and chemical properties. Together, this unified model captures the two fundamental controlling mechanisms of upward flame spread – gas phase flame heat transfer and solid phase material degradation.
This has enabled simulations of flame spread dynamics with a reasonable computational cost and accuracy beyond that of current models. This unified model of material degradation provides the framework to quantitatively study material burning behavior in response to a wide range of common fire scenarios.
Abstract:
This thesis is devoted to the development, synthesis, properties, and applications of nanomaterials for critical technologies, covering three areas: (1) Microbial contamination of drinking water is a serious problem of global significance. About 51% of the waterborne disease outbreaks in the United States can be attributed to contaminated ground water. Development of metal oxide nanoparticles as viricidal materials is of technological and fundamental scientific importance. Nanoparticles with high surface areas and ultra-small particle sizes have dramatically enhanced efficiency and capacity of virus inactivation, which cannot be achieved by their bulk counterparts. A series of metal oxide nanoparticles, such as iron oxide nanoparticles, zinc oxide nanoparticles and iron oxide-silver nanoparticles, coated on fiber substrates was developed in this research for evaluation of their viricidal activity. We also characterised these nanoparticles by XRD, TEM, SEM, XPS, surface area and zeta potential measurements. MS2 virus inactivation experiments showed that these metal oxide nanoparticle coated fibers were extremely powerful viricidal materials. Results from this research suggest that zinc oxide nanoparticles with a diameter of 3.5 nm, showing an isoelectric point (IEP) at 9.0, were well dispersed on fiberglass. These fibers offer an increase in capacity by orders of magnitude over all other materials. Compared to iron oxide nanoparticles, zinc oxide nanoparticles did not show an improvement in inactivation kinetics, but inactivation capacities did increase by two orders of magnitude to 99.99%. Furthermore, zinc oxide nanoparticles have a higher affinity to viruses than the iron oxide nanoparticles in the presence of competing ions. The advantages of zinc oxide derive from its high surface charge density, small particle size and capability of generating reactive oxygen species.
The research at its present stage of development appears to offer the best avenue to remove viruses from water. Without additional chemicals and energy input, this system can be implemented as both point-of-use (POU) and large-scale water treatment technology, which will have a significant impact on the water purification industry. (2) A new family of aliphatic polyester lubricants has been developed for use in micro-electromechanical systems (MEMS), specifically for hard disk drives that operate at high spindle speeds (>15,000 rpm). Our program was initiated to address current problems with spin-off of the perfluoropolyether (PFPE) lubricants. The new polyester lubricant appears to alleviate spin-off problems and at the same time improves the chemical and thermal stability. This new system provides a low cost alternative to PFPE along with improved adhesion to the substrates. In addition, it displays a much lower viscosity, which may be of importance to stiction related problems. The synthetic route is readily scalable in case additional interest emerges in other areas including small motors. (3) The demand for increased signal transmission speed and device density for the next generation of multilevel integrated circuits has placed stringent demands on materials performance. Currently, integration of the ultra-low-k materials in dual Damascene processing requires chemical mechanical polishing (CMP) to planarize the copper. Unfortunately, none of the commercially proposed dielectric candidates display the desired mechanical and thermal properties for successful CMP. A new polydiacetylene thermosetting polymer (DEB-TEB), which displays a low dielectric constant (low-k) of 2.7, was recently developed. This novel material appears to offer the only avenue for designing an ultra-low-k dielectric (k = 1.85) which can still display the desired modulus (7.7 GPa) and hardness (2.0 GPa) sufficient to withstand the process of CMP.
We focused on further characterization of the thermal properties of spin-on poly(DEB-TEB) ultra-thin films. These include the coefficient of thermal expansion (CTE), biaxial thermal stress, and thermal conductivity. The CTE is 2.0×10⁻⁵ K⁻¹ in the perpendicular direction and 8.0×10⁻⁶ K⁻¹ in the planar direction. The low CTE provides a better match to the Si substrate, which minimizes interfacial stress and greatly enhances the reliability of the microprocessors. Initial experiments with oxygen plasma etching suggest a high probability of success for achieving vertical profiles.
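As a rough illustration of how a CTE mismatch translates into film stress, the standard thin-film biaxial stress formula can be applied; only the 7.7 GPa modulus comes from this abstract, while the substrate CTE, Poisson's ratio and temperature excursion below are illustrative assumptions:

```python
def biaxial_thermal_stress(e_film_gpa, nu_film, alpha_film, alpha_sub,
                           delta_t_k):
    """Biaxial thermal stress (MPa) in a thin film on a thick substrate.

    sigma = E / (1 - nu) * (alpha_film - alpha_sub) * dT,
    with E in GPa (converted to MPa), CTEs in 1/K and dT in K.
    """
    return e_film_gpa * 1e3 / (1.0 - nu_film) \
        * (alpha_film - alpha_sub) * delta_t_k

# Assumed values: planar film CTE from the abstract (8.0e-6 1/K),
# a typical Si CTE (~2.6e-6 1/K), a guessed Poisson's ratio of 0.3,
# and a hypothetical 400 K processing excursion.
sigma_mpa = biaxial_thermal_stress(7.7, 0.3, 8.0e-6, 2.6e-6, 400.0)
```

A smaller film-substrate CTE difference directly shrinks this stress, which is the mechanism behind the reliability benefit claimed above.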
Abstract:
Lantana camara L. is a significant weed of which there are some 650 varieties in over 60 countries or island groups. It has been the focus of biological control attempts for a century, yet still poses major problems in many regions. Lantana has a significant impact on economic and environmental areas and is difficult to control. The key to good management of lantana is constant vigilance. Repeated control of new regrowth is critical to success. Control of new infestations should be a priority because the species is able to expand its range during good seasons, but does not die out during poor conditions. This book is a resource for land managers and researchers on methods of lantana control, particularly biocontrol.
Abstract:
Purpose of review: Health-related quality of life (HRQoL) is an important patient-reported outcome measure following critical illness. ‘Validated’ and professionally endorsed generic measures are widely used to evaluate critical care intervention and guide practice, policy and research. Although recognizing that they are ‘here to stay’, leading QoL researchers are beginning to question their ‘fitness for purpose’. It is therefore timely to review critiques of their limitations in the wider healthcare and social science literatures and to examine the implications for critical care research including, in particular, emerging interventional studies in which HRQoL is the primary outcome of interest. Recent findings: Generic HRQoL measures have provided important yet limited insights into HRQoL among survivors of critical illness. They are rarely developed or validated in collaboration with patients and cannot therefore be assumed to reflect their experiences and perspectives. Summary: Collaboration with patients is advocated in order to improve the interpretation and utility of such data. Failure to do so may result in important study effects being overlooked and the dismissal of potentially useful interventions.
Abstract:
This thesis is an examination of the ASEAN’s prospects in establishing regional competition policy in the Southeast Asia region, a topic of contemporary relevance in light of the ASEAN’s recent foray into the economic integration field on 31 December 2015. It questions whether the current approach undertaken by the ASEAN could contribute to an effective regional competition policy under the regional market integration. In answering this question, the thesis first critically surveys the current terrain of regional competition laws and policies in order to determine the possible existence of an optimal template. It argues that although the EU model is often used as a source of inspiration, each regional organisation conceives different configurations of the model in order to best adjust to the local regional contexts. The thesis makes an inquiry into the narratives of the ASEAN’s competition policy, as well as the ASEAN’s specific considerations in the development of competition policy, before comparing the findings to the actual approaches taken by the ASEAN in its pursuit of regional competition policy. This thesis reveals that the actual approach taken by the ASEAN demonstrates an important discrepancy from the economic integration goal. The ASEAN applies a soft harmonisation approach regarding substantive competition law while refraining from establishing a centralised institution or a representative institution. The sole organ with regard to competition policy at the regional level is an expert organ. The thesis also conducts an investigation into the reception of the ASEAN’s regional policy by the member states in order to ascertain the possibility of the achievement of the ASEAN’s aspiration of regional competition policy. The study reveals that despite some shared similarities in the broad principles of competition law amongst the member states, the various competition law regimes are not harmonised, thus creating a challenging obstacle to the ASEAN’s ambition.
The thesis then concludes that the ASEAN’s approach to regional competition law is unlikely to be effective.
Abstract:
Polymer aluminum electrolytic capacitors were introduced to provide an alternative to liquid electrolytic capacitors. Polymer electrolytic capacitor electric parameters of capacitance and ESR are less temperature dependent than those of liquid aluminum electrolytic capacitors. Furthermore, the electrical conductivity of the polymer used in these capacitors (poly(3,4-ethylenedioxythiophene), PEDOT) is orders of magnitude higher than the electrolytes used in liquid aluminum electrolytic capacitors, resulting in capacitors with much lower equivalent series resistance which are suitable for use in high ripple-current applications. The presence of the moisture-sensitive polymer PEDOT raises concerns about the reliability of polymer aluminum capacitors in high humidity conditions. Highly accelerated stress testing (HAST) at 110 °C and 85% relative humidity, in which the parts were subjected to unbiased HAST conditions for 700 hours, was performed to understand the design factors that contribute to the susceptibility to degradation of a polymer aluminum electrolytic capacitor exposed to HAST conditions. A large scale study involving capacitors of different electrical ratings (2.5–16 V, 100–470 µF), mounting types (surface-mount and through-hole) and manufacturers (6 different manufacturers) was conducted to determine a relationship between package geometry and reliability in high temperature-humidity conditions. A geometry-based HAST test, in which the part selection limited variations between capacitor samples to geometric differences only, was performed to analyze the effect of package geometry on humidity-driven degradation more closely. Raman spectroscopy, x-ray imaging, environmental scanning electron microscopy, and destructive analysis of the capacitors after HAST exposure were used to determine the failure mechanisms of polymer aluminum capacitors under high temperature-humidity conditions.
Abstract:
In this thesis, we present a quantitative approach using probabilistic verification techniques for the analysis of reliability, availability, maintainability, and safety (RAMS) properties of satellite systems. The subject of our research is satellites used in mission-critical industrial applications. Our verification results make a strong case for using probabilistic model checking to support RAMS analysis of satellite systems. This study is intended to build a foundation to help reliability engineers with a basic background in model checking to apply probabilistic model checking to small satellite systems. We make two major contributions. One is the application of RAMS analysis to satellite systems. In the past, RAMS analysis has been extensively applied in the field of electrical and electronics engineering. It allows system designers and reliability engineers to predict the likelihood of failures from the indication of historical or current operational data. There is a high potential for the application of RAMS analysis in the field of space science and engineering. However, there is a lack of standardisation and suitable procedures for the correct study of RAMS characteristics for satellite systems. This thesis considers the promising application of RAMS analysis to the case of satellite design, use, and maintenance, focusing on its system segments. Data collection and verification procedures are discussed, and a number of considerations are also presented on how to predict the probability of failure. Our second contribution is leveraging the power of probabilistic model checking to analyse satellite systems. We present techniques for analysing satellite systems that differ from the more common quantitative approaches based on traditional simulation and testing. These techniques have not been applied in this context before. We present the use of probabilistic techniques via a suite of detailed examples, together with their analysis.
Our presentation is done in an incremental manner: in terms of complexity of application domains and system models, and a detailed PRISM model of each scenario. We also provide results from practical work together with a discussion about future improvements.
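As a sketch of the core computation behind PRISM-style probabilistic model checking (a generic value-iteration routine for a property like P=? [ F "failed" ]; the Markov chain and state names below are hypothetical, not a model from this thesis):

```python
def reachability_probability(transitions, targets, tol=1e-10):
    """For each state of a discrete-time Markov chain, compute the
    probability of eventually reaching a target ('failed') state.

    `transitions`: dict mapping state -> list of (next_state, probability).
    Uses value iteration, the workhorse behind probabilistic model
    checkers' unbounded-reachability queries.
    """
    probs = {s: (1.0 if s in targets else 0.0) for s in transitions}
    while True:
        delta = 0.0
        for s in transitions:
            if s in targets:
                continue  # target states keep probability 1
            new = sum(p * probs[t] for t, p in transitions[s])
            delta = max(delta, abs(new - probs[s]))
            probs[s] = new
        if delta < tol:
            return probs

# Hypothetical satellite-subsystem chain: 'ok' can degrade or fail,
# 'degraded' may be repaired or fail, 'failed'/'repaired' are absorbing.
model = {
    "ok":       [("ok", 0.9), ("degraded", 0.05), ("failed", 0.05)],
    "degraded": [("degraded", 0.5), ("repaired", 0.3), ("failed", 0.2)],
    "failed":   [("failed", 1.0)],
    "repaired": [("repaired", 1.0)],
}
```

For this toy chain the fixed point can be checked by hand: from "degraded", p = 0.5p + 0.2 gives p = 0.4, and from "ok", p = 0.9p + 0.05·0.4 + 0.05 gives p = 0.7.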
Abstract:
The discussions about social justice date from ancient times, but despite the enduring interest in the topic and the progress made, we are still witnessing injustices throughout the world. Thus, the search for social justice, under some form, is an inseparable part of our lives. In general, social justice may be considered as a critical idea that challenges us to reform our institutions and practices in the name of greater fairness (Miller 1999, p. x). In political and policy debates, social justice is often related to fair access (Brown, 2013) but at the same time its meanings seem to vary when we consider different definitions, perspectives and social theories (Zajda, Majhanovich, & Rust, 2006). When seen in the context of higher education, social justice appears in relevant literature as a buzzword (Patton, Shahjahan, Riyad, & Osei-Kofi, 2010). Within the recent studies of higher education and public debates related to the development of higher education, more emphasis is placed on the link between higher education and economic growth and how higher education could be more responsive to labour market demands, and little emphasis has been put on social justice. Given this, the present study attempts to at least partially fill the gap with regard to this apparently very topical issue, especially in the context of the unprecedented worldwide expansion of higher education in the last century (Schofer & Meyer, 2005), an expansion that is expected to continue in the next decades. More specifically, the expansion of higher education intensified in the second part of the 20th century, especially after World War II. It was seen as a result of the intertwined dynamics related to demographic, economic and political pressures (Goastellec, 2008a). This trend undoubtedly contributed to the growth of the student body.
To illustrate this trend, we may point out that in the period between 2000 and 2007, the number of tertiary students in the world increased from 98,303,539 to 150,656,459 (UNESCO, 2009, p. 205). This growth occurred in all regions of the world, including Central and Eastern Europe, North America and Western Europe, and contributed to raising the number of tertiary graduates. Thus, in the period between 2000 and 2008, the total number of tertiary graduates in the European Union (EU) 27 increased by a total of 35 percent (or 4.5 percent per year). However, this growth was very uneven, ranging from 21.1 percent in Romania to 0.7 percent in Hungary (European Commission staff working document, 2011). The increase of the number of students and graduates was seen as enhancing social justice in higher education, since it is assumed that expansion “extends a valued good to a broader spectrum of the population” (Arum, Gamoran, & Shavit, 2007, p. 29). However, concerns about a deep contradiction for 21st-century higher education also emerged with regard to its expansion.
Abstract:
The service of a critical infrastructure, such as a municipal wastewater treatment plant (MWWTP), is taken for granted until a flood or another low-frequency, high-consequence crisis brings its fragility to attention. The unique aspects of the MWWTP call for a method to quantify the flood stage-duration-frequency relationship. By developing a bivariate joint distribution model of flood stage and duration, this study adds a second dimension, time, into flood risk studies. A new parameter, inter-event time, is developed to further illustrate the effect of event separation on the frequency assessment. The method is tested on riverine, estuarine and tidal sites in the Mid-Atlantic region. Equipment damage functions are characterized by linear and step damage models. The Expected Annual Damage (EAD) of the underground equipment is further estimated by the parametric joint distribution model, which is a function of both flood stage and duration, demonstrating the application of the bivariate model in risk assessment. Flood likelihood may change due to climate change. A sensitivity analysis method is developed to assess future flood risk by estimating flood frequency under conditions of higher sea level and stream flow response to increased precipitation intensity. Scenarios based on steady and unsteady flow analysis are generated for the current climate, future climate within this century, and future climate beyond this century, consistent with the MWWTP planning horizons. The spatial extent of flood risk is visualized by inundation mapping and a GIS-Assisted Risk Register (GARR). This research will help the stakeholders of the critical infrastructure be aware of the flood risk, vulnerability, and the inherent uncertainty.
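The EAD calculation described above can be sketched as a discrete sum over stage-duration bins, paired with a step damage model of the kind the study mentions (a simplified illustration; the probabilities, threshold and cost below are hypothetical, not values from this work):

```python
def expected_annual_damage(events, damage):
    """Approximate Expected Annual Damage (EAD).

    Sums, over discrete (stage, duration) bins, the annual probability
    mass of each bin times the damage incurred in that bin.
    `events`: iterable of (annual_probability, stage_m, duration_h).
    `damage`: function (stage_m, duration_h) -> damage in currency units.
    """
    return sum(p * damage(s, d) for p, s, d in events)

def step_damage(stage_m, duration_h, threshold_m=1.5, cost=250000.0):
    """Illustrative step damage model: full equipment loss once the
    flood stage exceeds a threshold, regardless of duration."""
    return cost if stage_m >= threshold_m else 0.0
```

In the full bivariate treatment the event probabilities would come from the joint stage-duration distribution, and a duration-sensitive damage function would replace the step model, but the summation structure is the same.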
Abstract:
The topic of the thesis is media discourse about the current state of income inequality in the US, and the political ideologies that influence that discourse. The data consist of four opinion articles, two from CNN and two from Fox News. The purpose of the study was to examine how the media represent income inequality as an issue, and whether the attitudes conveyed are concerned or indifferent. Previous studies have indicated that the level of income is often seen as a personal responsibility, a perspective that can be linked with Republican ideology. In contrast, Democrats typically express more concern about the consequences of inequality. CNN has previously been considered to have a Democratic bias, and Fox News a Republican bias, which is one reason why these two news channels were chosen as the sources of the data. The study is a critical discourse analysis, and the methods applied were the sociocognitive approach, which analyzes the social and cognitive factors affecting the discourse, and the appraisal framework, which was applied to scrutinize the expressed attitudes more closely by identifying specific linguistic features. The appraisal framework includes studying such features as affect, judgment and appreciation, which offer a more detailed analysis of the attitudes present in the articles. The sociocognitive approach, additionally, offers a way of analyzing the broader context affecting the articles. The findings were then compared to see if there are differences between the articles, or between the news sites with alleged bias. The findings showed that CNN, with its alleged Democratic bias, had a more sympathetic attitude towards income inequality, whereas Fox News, with more Republican views, showed clearly less concern about the issue.
Moreover, the Fox News articles made claims dubious enough that the underlying ideology could even be read as supportive of income inequality, in that it allows the rich to pursue all the wealth they can without having to give anything away. The results thus suggest that political ideologies may have a significant effect on media discourse, which, in turn, may have a significant effect on public attitudes towards major issues that could require prompt measures.