884 results for reliability-cost evaluation


Relevance:

30.00%

Publisher:

Abstract:

The objective of this evaluation of the weather forecasting services used by the Iowa Department of Transportation is to ascertain the accuracy of the forecasts given to maintenance personnel, to determine whether the forecasts are useful in the decision-making process, and to assess whether they have the potential to improve the level of service. The Iowa Department of Transportation has estimated the average cost of fighting a winter storm to be about $60,000 to $70,000 per hour. This final report describes the collection of weather data and information associated with the weather forecasting services provided to the Iowa Department of Transportation and its maintenance activities, and evaluates their impact on winter maintenance decision-making.

Relevance:

30.00%

Publisher:

Abstract:

Surface flow types (SFTs) are advocated as ecologically relevant hydraulic units, often mapped visually from the bankside to characterise rapidly the physical habitat of rivers. SFT mapping is simple, non-invasive and cost-efficient. However, it is also qualitative, subjective and plagued by difficulties in recording accurately the spatial extent of SFT units. Quantitative validation of the underlying physical habitat parameters is often lacking and, where it exists, does not consistently differentiate between SFTs. Here, we explicitly investigate the accuracy, reliability and statistical separability of traditionally mapped SFTs as indicators of physical habitat, using independent hydraulic and topographic data collected during three surveys of a c. 50 m reach of the River Arrow, Warwickshire, England. We also explore the potential of a novel remote sensing approach, comprising a small unmanned aerial system (sUAS) and Structure-from-Motion photogrammetry (SfM), as an alternative method of physical habitat characterisation. Our key findings indicate that SFT mapping accuracy is highly variable, with overall mapping accuracy not exceeding 74%. Analysis of similarity (ANOSIM) tests found that strong differences did not exist between all SFT pairs. This leads us to question the suitability of SFTs for characterising physical habitat for river science and management applications. In contrast, the sUAS-SfM approach provided high resolution, spatially continuous, spatially explicit, quantitative measurements of water depth and point cloud roughness at the microscale (spatial scales ≤ 1 m). Such data are acquired rapidly and inexpensively, and provide new opportunities for examining the heterogeneity of physical habitat over a range of spatial and temporal scales. Whilst continued refinement of the sUAS-SfM approach is required, we propose that this method offers an opportunity to move away from broad, mesoscale classifications of physical habitat (spatial scales 10-100 m), and towards continuous, quantitative measurements of the continuum of hydraulic and geomorphic conditions which actually exists at the microscale.
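
For readers unfamiliar with how an overall mapping accuracy figure such as the 74% quoted above is typically derived, the sketch below computes overall and per-class agreement from a confusion matrix of mapped versus validated SFT classes. The class names and counts are invented for illustration only; they are not the River Arrow survey data.

```python
# Hypothetical sketch: overall accuracy of visually mapped surface flow types (SFTs)
# against independently validated classes, from a confusion matrix.
import numpy as np

sft_classes = ["smooth", "rippled", "unbroken wave", "broken wave"]
# rows = mapped class, columns = validated class (made-up counts)
confusion = np.array([
    [42,  6,  2,  0],
    [ 9, 35,  7,  1],
    [ 3,  8, 28,  5],
    [ 0,  2,  6, 21],
])

overall_accuracy = np.trace(confusion) / confusion.sum()
# proportion of each validated class that was mapped to the same class
per_class_agreement = np.diag(confusion) / confusion.sum(axis=0)

print(f"Overall mapping accuracy: {overall_accuracy:.1%}")
for name, acc in zip(sft_classes, per_class_agreement):
    print(f"  {name}: {acc:.1%}")
```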

Relevance:

30.00%

Publisher:

Abstract:

Background: Primary total knee replacement is a common operation that is performed to provide pain relief and restore functional ability. Inpatient physiotherapy is routinely provided after surgery to enhance recovery prior to hospital discharge. However, international variation exists in the provision of outpatient physiotherapy after hospital discharge. While evidence indicates that outpatient physiotherapy can improve short-term function, the longer term benefits are unknown. The aim of this randomised controlled trial is to evaluate the long-term clinical effectiveness and cost-effectiveness of a 6-week group-based outpatient physiotherapy intervention following knee replacement. Methods/design: Two hundred and fifty-six patients waiting for knee replacement because of osteoarthritis will be recruited from two orthopaedic centres. Participants randomised to the usual-care group (n = 128) will be given a booklet about exercise and referred for physiotherapy if deemed appropriate by the clinical care team. The intervention group (n = 128) will receive the same usual care and additionally be invited to attend a group-based outpatient physiotherapy class starting 6 weeks after surgery. The 1-hour class will be run on a weekly basis over 6 weeks and will involve task-orientated and individualised exercises. The primary outcome will be the Lower Extremity Functional Scale at 12 months post-operative. Secondary outcomes include quality of life, knee pain and function, depression, anxiety and satisfaction. Data collection will be by questionnaire prior to surgery and 3, 6 and 12 months after surgery and will include a resource-use questionnaire to enable a trial-based economic evaluation. Trial participation and satisfaction with the classes will be evaluated through structured telephone interviews. The primary statistical and economic analyses will be conducted on an intention-to-treat basis with and without imputation of missing data. The primary economic result will estimate the incremental cost per quality-adjusted life year gained from this intervention from a National Health Service (NHS) and personal social services perspective. Discussion: This research aims to benefit patients and the NHS by providing evidence on the long-term effectiveness and cost-effectiveness of outpatient physiotherapy after knee replacement. If the intervention is found to be effective and cost-effective, implementation into clinical practice could lead to improvement in patients’ outcomes and improved health care resource efficiency.
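
As a pointer to how the primary economic result would be expressed, the sketch below computes an incremental cost-effectiveness ratio (incremental cost per QALY gained) from arm-level means. All figures are hypothetical placeholders, not trial data, and the actual trial analysis will involve much more (imputation, discounting, uncertainty analysis).

```python
# Illustrative sketch of an incremental cost per QALY calculation from an NHS and
# personal social services perspective. All numbers are hypothetical placeholders.
mean_cost_intervention = 5200.0    # mean per-patient cost, intervention arm (GBP)
mean_cost_usual_care = 4900.0      # mean per-patient cost, usual-care arm (GBP)
mean_qaly_intervention = 0.78      # mean QALYs over follow-up, intervention arm
mean_qaly_usual_care = 0.75        # mean QALYs over follow-up, usual-care arm

incremental_cost = mean_cost_intervention - mean_cost_usual_care
incremental_qaly = mean_qaly_intervention - mean_qaly_usual_care
icer = incremental_cost / incremental_qaly   # incremental cost-effectiveness ratio

print(f"ICER: GBP {icer:,.0f} per QALY gained")
```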

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

This work presents the description and investigation of the evaluation of vetiver (Chrysopogon zizanioides) and elephant grass (Pennisetum purpureum) in the design of constructed wetlands for the treatment of domestic wastewater, vegetation being one of the main components of these non-conventional treatment systems. Many "natural systems" are being considered for wastewater treatment and water pollution control because of their high environmental reliability and their low construction and maintenance costs; constructed wetlands are one such case. Interest in natural systems is based on the conservation of the resources associated with these systems, as opposed to conventional wastewater treatment processes, which are intensive in their use of energy and chemicals. Constructed wetlands are an attractive treatment alternative owing to their high pollutant-removal efficiency, their low installation and maintenance costs and their high environmental reliability. A constructed wetland generally consists of a support medium, usually sand or gravel, vegetation, and microorganisms or biofilm, which carry out the various biochemical processes that remove pollutants from the influent. The general objective of this work was to evaluate the removal efficiency of organic matter, solids, total nitrogen and total phosphorus of two plant species, vetiver (Chrysopogon zizanioides) and elephant grass (Pennisetum purpureum), in the design of constructed wetlands for the treatment of domestic wastewater. The constructed wetlands, or pilot systems, are located at the Universidad de Medellín and receive a synthetic water preparation that resembles the characteristics of domestic wastewater. This work evaluates the percentage removal of the organic load of wastewater in a constructed-wetland treatment system with two plant species. The system was designed with four modules installed side by side. The first contains no plant species, only the substrate medium, and constitutes the blank (-); in the second, individuals of vetiver (Chrysopogon zizanioides) were planted; in the third pilot system, individuals of elephant grass (Pennisetum purpureum); and in the fourth, individuals of Japanese papyrus (Cyperus alternifolius), which constitute the positive control (+). The experimental modules were cleaned, cut and prepared according to the initial planting layout and the space required for their arrangement. Each pilot system was filled with a support medium consisting of gravel (5 to 10 cm) and sand (15 to 20 cm), and the substrate was evaluated and characterised by its nominal diameter. The species were then planted in each system over a 3x3 area, and each wetland was conditioned for two weeks under Hoagland and Arnon solution and a moisture regime. The following parameters were analysed in the synthetic water: pH, total solids, total suspended solids, total dissolved solids, chemical oxygen demand (COD), biochemical oxygen demand (BOD5), total nitrogen (TKN) and total phosphorus (TP). Plant growth was also determined from the increase in biomass and root porosity, and TKN and TP were likewise determined.
The results showed that the system is an option for the removal of the organic load and nutrients from domestic wastewater, with low operating and maintenance costs. In particular, plants that grow under aquic and ustic moisture regimes tend to establish and adapt better in the pilot constructed wetlands; this is the case of elephant grass (Pennisetum purpureum), which showed the highest removal rates of pollutants and nutrients from the influent, followed by Japanese papyrus (Cyperus alternifolius) and vetiver (Chrysopogon zizanioides). The pollutants with the highest removal, in order, were solids, followed by biochemical oxygen demand (BOD5), chemical oxygen demand (COD), total nitrogen (TKN) and total phosphorus (TP). The last two showed low removal rates, owing to the nature of the pollutant itself, the organisms responsible for removal and uptake, and the chosen retention time, which influences the pollutant removal rate and is lowest for the phosphorus concentration; nevertheless, the values fall within the range expected for these non-conventional treatment systems.
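
To make the removal-rate comparison concrete, here is a minimal sketch of the standard removal-efficiency calculation, removal % = (influent − effluent) / influent × 100, applied to the parameters listed above. The concentrations are hypothetical placeholders, not measurements from the pilot systems.

```python
# Minimal sketch of the removal-efficiency calculation used to compare the pilot
# wetlands. Concentrations below are hypothetical, not values from the study.
influent = {"BOD5": 220.0, "COD": 450.0, "TSS": 180.0, "TKN": 40.0, "TP": 8.0}  # mg/L
effluent = {"BOD5": 35.0, "COD": 110.0, "TSS": 20.0, "TKN": 22.0, "TP": 6.0}    # mg/L

for parameter, c_in in influent.items():
    c_out = effluent[parameter]
    removal = (c_in - c_out) / c_in * 100.0
    print(f"{parameter}: {removal:.1f}% removal")
```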

Relevance:

30.00%

Publisher:

Abstract:

Cognitive deficits are present in patients with cancer. Cognitive tests such as the Montreal Cognitive Assessment have proven to be insufficiently specific, unable to detect mild deficits, and not linear. To overcome these limitations, we developed a simple, brief cognitive questionnaire adapted to the cognitive dimensions affected in patients with cancer, the FaCE ("The Fast Cognitive Evaluation"), using Rasch modelling (RM). RM is a probabilistic mathematical method that establishes the conditions under which a tool can be considered a measurement scale, and it is independent of the sample. If the results fit the model, the measurement scale is linear with equal intervals. Responses are based on the ability of the subjects and the difficulty of the items. The item map makes it possible to select the items best suited to assessing each cognitive aspect and to reduce their number to a minimum. The unidimensionality analysis assesses whether the tool measures a dimension other than the intended one. The results of analyses conducted on 165 patients show that the FaCE distinguishes patients' abilities with excellent reliability and at sufficiently distinct levels (person-reliability index = 0.86; person-separation index = 2.51). The population size and the number of items are sufficient for the items to be ranked reliably and precisely (item reliability = 0.99; item-separation index = 8.75). The item map shows a good spread of items and a linear score without a ceiling effect. Finally, unidimensionality is respected and the mean completion time is about 6 minutes. By definition, RM ensures the linearity and continuity of the measurement scale. We succeeded in developing a brief, simple, quick questionnaire adapted to the cognitive deficits of patients with cancer. The FaCE could also serve as a reference measure for future research in the field.
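
For context, the sketch below shows the dichotomous form of the Rasch model that underlies this kind of analysis: the probability of success on an item depends only on the difference between person ability and item difficulty, both on the same logit scale. The ability and difficulty values are illustrative, and the FaCE analysis itself may use a polytomous variant.

```python
# Minimal sketch of the dichotomous Rasch model: P(correct) depends only on the
# difference between person ability (theta) and item difficulty (b), in logits.
import math

def rasch_probability(theta: float, b: float) -> float:
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

for theta in (-1.0, 0.0, 1.0):      # illustrative person abilities (logits)
    for b in (-0.5, 0.5):           # illustrative item difficulties (logits)
        p = rasch_probability(theta, b)
        print(f"ability={theta:+.1f}, difficulty={b:+.1f} -> P(correct)={p:.2f}")
```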

Relevance:

30.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has been traditionally used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
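
To illustrate the neighborhood-centric programming style described for NSCALE, as opposed to vertex-centric models, the sketch below hands a user function an entire 1-hop ego network rather than a single vertex's state. This is not NSCALE's actual API; it is a toy, single-machine analogue using networkx, with a triangle-counting task standing in for motif counting.

```python
# Hypothetical sketch of a neighborhood-centric user program: the function receives
# a whole ego network (subgraph), not just one vertex's state.
import networkx as nx

def ego_program(ego_net: nx.Graph, center) -> int:
    """Example analysis task: count triangles passing through the ego-network center."""
    return nx.triangles(ego_net, center)

graph = nx.karate_club_graph()   # toy stand-in for a large graph a system like NSCALE would partition
results = {}
for node in graph.nodes:
    ego_net = nx.ego_graph(graph, node, radius=1)   # the neighborhood handed to the user program
    results[node] = ego_program(ego_net, node)

print("triangles per ego network:", results)
```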

Relevance:

30.00%

Publisher:

Abstract:

Groundnut cake (GNC) meal is an important source of dietary protein for domestic animals, with a cost advantage over the conventional animal protein sources used in aquaculture feed production. It would be useful to evaluate the effects of GNC processing methods on the density and nutritional values of processed GNC meals. The use of processed GNC meals in the diets of Clarias gariepinus fingerlings was evaluated. Seven iso-proteic and iso-caloric diets were formulated, replacing fish meal with roasted and boiled GNC meals, each at three inclusion levels of 30%, 35%, and 40%. Diet I is 100% fishmeal, Diet II is 30% roasted GNC meal, Diet III is 35% roasted GNC meal, Diet IV is 40% roasted GNC meal, Diet V is 30% boiled GNC meal, Diet VI is 35% boiled GNC meal and Diet VII is 40% boiled GNC meal. Results showed that the crude protein content of GNC meals was 40.5% and 40.8% in boiled and roasted GNC meals, respectively; the lower protein content of processed GNC meals might be due to heat denaturation of the seed protein, with boiled GNC meal being more adversely affected. The mean weight gain of fingerlings fed roasted GNC meals ranged from 5.29 to 5.64, while for boiled GNC meals it ranged from 4.60 to 5.22. Generally, fish performed better when fed diets containing roasted GNC meals than boiled GNC meals, and compared favorably with fish fed the fish meal-based diet. Body mass increase, total feed intake, protein efficiency ratio and specific growth rate of C. gariepinus fingerlings showed no significant differences across diets, suggesting that processed GNC meals could partially replace fish meal in diets for C. gariepinus fingerlings without adverse consequences. This study showed that processed GNC meals could partially replace fish meal up to 30% without significantly influencing fingerling growth and health. It is recommended that the use of fish meal as the main basal ingredient for fingerling diets could be discontinued, since GNC meal is a cheaper alternative and could replace fish meal up to 35% without any significant adverse effect on fingerling performance. KEYWORDS: Clarias gariepinus, Fingerlings, Groundnut cake meal, Nutrient utilization, Performance.
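
For reference, the sketch below shows how two of the growth indices named above, specific growth rate (SGR) and protein efficiency ratio (PER), are conventionally computed. The weights, feeding period and protein intake are made-up example values, not data from this study.

```python
# Illustrative sketch of two standard fish-growth indices: SGR and PER.
# All input values are hypothetical examples.
import math

initial_weight = 2.10     # g, mean initial fingerling weight (hypothetical)
final_weight = 7.45       # g, mean final weight (hypothetical)
days = 56                 # feeding period in days (hypothetical)
protein_intake = 4.8      # g crude protein consumed per fish (hypothetical)

weight_gain = final_weight - initial_weight
sgr = (math.log(final_weight) - math.log(initial_weight)) / days * 100.0  # % per day
per = weight_gain / protein_intake

print(f"Mean weight gain: {weight_gain:.2f} g")
print(f"Specific growth rate: {sgr:.2f} %/day")
print(f"Protein efficiency ratio: {per:.2f}")
```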

Relevance:

30.00%

Publisher:

Abstract:

The recently developed network-wide real-time signal control strategy TUC has been implemented in three traffic networks with quite different traffic and control infrastructure characteristics: Chania, Greece (23 junctions); Southampton, UK (53 junctions); and Munich, Germany (25 junctions), where it has been compared to the respective resident real-time signal control strategies TASS, SCOOT and BALANCE. After a short outline of TUC, the paper describes the three application networks; the application, demonstration and evaluation conditions; as well as the comparative evaluation results. The main conclusion drawn from this high-effort inter-European undertaking is that TUC is an easy-to-implement, interoperable, low-cost real-time signal control strategy whose performance, after very limited fine-tuning, proved to be better than, or at least similar to, that achieved by long-standing strategies that were in most cases very well fine-tuned over the years in the specific networks.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes how the recently developed network-wide real-time signal control strategy TUC has been implemented in three traffic networks with quite different traffic and control infrastructure characteristics: Chania, Greece (23 junctions); Southampton, U.K. (53 junctions); and Munich, Germany (25 junctions), where it has been compared to the respective resident real-time signal control strategies TASS, SCOOT and BALANCE. After a short outline of TUC, the paper describes the three application networks; the application, demonstration and evaluation conditions; as well as the comparative evaluation results. The main conclusion drawn from this high-effort inter-European undertaking is that TUC is an easy-to-implement, interoperable, low-cost real-time signal control strategy whose performance, after limited fine-tuning, proved to be better than, or at least similar to, that achieved by long-standing strategies that were in most cases very well fine-tuned over the years in the specific networks.

Relevance:

30.00%

Publisher:

Abstract:

The PhD project addresses the potential of using concentrating solar power (CSP) plants as a viable alternative energy producing system in Libya. Exergetic, energetic, economic and environmental analyses are carried out for a particular type of CSP plant. Although the study addresses a particular configuration, a 50 MW parabolic trough CSP plant, it is sufficiently general to be applied to other configurations. The novelty of the study, in addition to the modeling and analysis of the selected configuration, lies in the use of a state-of-the-art exergetic analysis combined with Life Cycle Assessment (LCA). The modeling and simulation of the plant are carried out in Chapter Three and are divided into two parts, namely the power cycle and the solar field. The computer model developed for the analysis of the plant is based on algebraic equations describing the power cycle and the solar field. The model was solved using the Engineering Equation Solver (EES) software and is designed to define the properties at each state point of the plant and then, sequentially, to determine the energy, efficiency and irreversibility of each component. The developed model has the potential to be used in the preliminary design of CSP plants and, in particular, for the configuration of the solar field based on existing commercial plants. Moreover, it can analyze the energetic, economic and environmental feasibility of using CSP plants in different regions of the world, which is illustrated for the Libyan region in this study. The overall feasibility scenario is completed through an hourly analysis on an annual basis in Chapter Four. This analysis allows the comparison of different systems and, eventually, a particular selection, and it includes both the economic and energetic components using the "greenius" software. The analysis also examined the impact of project financing and incentives on the cost of energy. The main technological finding of this analysis is a higher performance and lower levelized cost of electricity (LCE) for Libya as compared to Southern Europe (Spain). Therefore, Libya has the potential to become attractive for the establishment of CSP plants in its territory and, in this way, to facilitate the target of several European initiatives that aim to import electricity generated by renewable sources from North African and Middle East countries. The analysis also presents a brief review of the current cost of energy and the potential for reducing the cost of electricity from parabolic trough CSP plants. Exergetic and environmental life cycle assessment analyses are conducted for the selected plant in Chapter Five; the objectives are 1) to assess the environmental impact and cost, in terms of exergy, of the life cycle of the plant; 2) to find the points of weakness in terms of the irreversibility of the process; and 3) to verify whether solar power plants can reduce the environmental impact and cost of electricity generation, by comparing them with fossil fuel plants, in particular a Natural Gas Combined Cycle (NGCC) plant and an oil thermal power plant. The study also includes a thermoeconomic analysis using the specific exergy costing (SPECO) method to evaluate the cost caused by exergy destruction. The main technological findings are that the most important impact contribution lies with the solar field, which accounts for 79%, and that the materials with the highest impact are steel (47%), molten salt (25%) and synthetic oil (21%).
The “Human Health” damage category presents the highest impact (69%), followed by the “Resource” damage category (24%). In addition, the highest exergy demand is linked to the steel (47%), and there is a considerable exergetic demand related to the molten salt and synthetic oil, with values of 25% and 19%, respectively. Finally, in the comparison with fossil fuel power plants (NGCC and oil), the CSP plant presents the lowest environmental impact, while the worst environmental performance corresponds to the oil power plant, followed by the NGCC plant. The solar field presents the largest cost rate, while the boiler is the component with the highest cost rate among the power cycle components. Thermal storage allows CSP plants to overcome solar irradiation transients, to respond to electricity demand independent of weather conditions, and to extend electricity production beyond the availability of daylight. A numerical analysis of the thermal transient response of a thermocline storage tank is carried out for the charging phase. The system of equations describing the numerical model is solved using time-implicit and space-backward finite differences and is encoded within the Matlab environment. The analysis produced the following findings: the predictions agree well with the experiments for the time evolution of the thermocline region, particularly for the regions away from the top inlet. The deviations observed near the inlet are most likely due to the high level of turbulence in this region resulting from the localized mixing; a simple analytical model taking this increased turbulence level into consideration was developed and leads to some improvement of the predictions. This approach requires practically no additional computational effort and relates the effective thermal diffusivity to the mean effective velocity of the fluid at each particular height of the system. Altogether, the study indicates that the selected parabolic trough CSP plant has the edge over alternative competing technologies for locations where DNI is high and where land usage is not an issue, such as the shoreline of Libya.
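
As a rough illustration of the kind of thermocline charging model described (time-implicit, space-backward finite differences), the sketch below solves a 1-D advection-diffusion equation for the tank temperature profile. The geometry, velocity, effective diffusivity and temperatures are assumed placeholder values, and the scheme is a simplified single-equation analogue of the thesis model, written in Python rather than Matlab.

```python
# Minimal sketch, under simplifying assumptions, of a 1-D thermocline charging model:
# dT/dt + u dT/dx = alpha d2T/dx2, discretised with time-implicit and space-backward
# (upwind) finite differences. All parameter values are placeholders.
import numpy as np

n, length = 200, 10.0                 # grid points, tank height (m)
dx = length / (n - 1)
dt = 5.0                              # time step (s)
u = 0.001                             # mean charging velocity from the inlet (m/s)
alpha = 5e-6                          # effective thermal diffusivity (m^2/s)
T_hot, T_cold = 560.0, 290.0          # inlet and initial temperatures (C)

c_adv, c_dif = u * dt / dx, alpha * dt / dx**2
A = np.zeros((n, n))
A[0, 0] = 1.0                                      # fixed inlet temperature
for i in range(1, n - 1):
    A[i, i - 1] = -(c_adv + c_dif)
    A[i, i] = 1.0 + c_adv + 2.0 * c_dif
    A[i, i + 1] = -c_dif
A[-1, -2], A[-1, -1] = -1.0, 1.0                   # zero-gradient outlet

T = np.full(n, T_cold)
for _ in range(720):                               # 720 steps of 5 s = one hour of charging
    rhs = T.copy()
    rhs[0], rhs[-1] = T_hot, 0.0                   # boundary-condition rows
    T = np.linalg.solve(A, rhs)

print("Temperature profile after 1 h of charging, inlet to outlet (C):")
print(np.round(T[::20], 1))
```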

Relevance:

30.00%

Publisher:

Abstract:

Happier employees are more productive. Organizations across industries, no doubt, try to improve their employees’ happiness with the objective of achieving higher profitability and company value. While this issue has drawn increasing attention in high tech and other industries, little is known about the happiness of project management professionals. More research is needed to explore the current situation of workplace happiness of project management professionals and the driving factors behind it. This thesis explores the workplace happiness (subjective well-being) of project management professionals based on the exploratory statistical analysis of a survey of 225 professionals in the state of Maryland, conducted in October 2014. The thesis applies Structural Equation Modeling and multiple regression analysis to the dataset and shows no significant impact of gender, age, work experience, or other demographic traits on workplace happiness, also referred to as well-being. Statistically significant factors for workplace happiness include creating a pleasant work environment, promoting an open organization and a well-managed team, and being a good organization to work for. With respect to the reliability of self-reporting, the study finds that the comprehensive appraisal tool designed by Happiness Works and the New Economics Foundation can give a more reliable happiness evaluation. Two key factors, i.e. career prospects and being free to be oneself, can help alleviate overconfidence in reported workplace happiness.
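
Purely as an illustration of the multiple-regression component of such an analysis, the sketch below fits an ordinary least squares model of a well-being score on a few stand-in predictors using synthetic data. The variable names and coefficients are invented and do not reproduce the thesis model or the survey items.

```python
# Illustrative OLS sketch only: synthetic data, stand-in predictor names.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 225                                       # sample size quoted in the abstract
pleasant_environment = rng.normal(size=n)     # synthetic stand-in predictors
open_organization = rng.normal(size=n)
age = rng.integers(22, 65, size=n)

# Synthetic well-being score: depends on the two workplace factors, not on age.
wellbeing = (0.5 * pleasant_environment + 0.4 * open_organization
             + rng.normal(scale=0.8, size=n))

names = ["const", "pleasant_environment", "open_organization", "age"]
X = sm.add_constant(np.column_stack([pleasant_environment, open_organization, age]))
model = sm.OLS(wellbeing, X).fit()

for name, coef, p in zip(names, model.params, model.pvalues):
    print(f"{name:>22s}: coef={coef:+.3f}, p={p:.3f}")
```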

Relevance:

30.00%

Publisher:

Abstract:

This thesis is devoted to the development, synthesis, properties, and applications of nanomaterials for critical technologies, covering three areas: (1) Microbial contamination of drinking water is a serious problem of global significance. About 51% of the waterborne disease outbreaks in the United States can be attributed to contaminated ground water. The development of metal oxide nanoparticles as viricidal materials is of technological and fundamental scientific importance. Nanoparticles with high surface areas and ultra-small particle sizes have dramatically enhanced efficiency and capacity for virus inactivation, which cannot be achieved by their bulk counterparts. A series of metal oxide nanoparticles, such as iron oxide nanoparticles, zinc oxide nanoparticles and iron oxide-silver nanoparticles, coated on fiber substrates was developed in this research for evaluation of their viricidal activity. We also carried out XRD, TEM, SEM, XPS, surface area and zeta potential measurements on these nanoparticles. MS2 virus inactivation experiments showed that these metal oxide nanoparticle coated fibers were extremely powerful viricidal materials. Results from this research suggest that zinc oxide nanoparticles with a diameter of 3.5 nm, showing an isoelectric point (IEP) at 9.0, were well dispersed on fiberglass. These fibers offer an increase in capacity by orders of magnitude over all other materials. Compared to iron oxide nanoparticles, zinc oxide nanoparticles did not show an improvement in inactivation kinetics, but inactivation capacities did increase by two orders of magnitude, to 99.99%. Furthermore, zinc oxide nanoparticles have a higher affinity for viruses than iron oxide nanoparticles in the presence of competing ions. The advantages of zinc oxide derive from its high surface charge density, small nanoparticle size and capability of generating reactive oxygen species. The research at its present stage of development appears to offer the best avenue for removing viruses from water. Without additional chemicals and energy input, this system can be implemented as both point-of-use (POU) and large-scale water treatment technology, which will have a significant impact on the water purification industry. (2) A new family of aliphatic polyester lubricants has been developed for use in micro-electromechanical systems (MEMS), specifically for hard disk drives that operate at high spindle speeds (>15,000 rpm). Our program was initiated to address current problems with spin-off of the perfluoropolyether (PFPE) lubricants. The new polyester lubricant appears to alleviate spin-off problems and at the same time improves chemical and thermal stability. This new system provides a low-cost alternative to PFPE along with improved adhesion to the substrates. In addition, it displays a much lower viscosity, which may be of importance for stiction-related problems. The synthetic route is readily scalable should additional interest emerge in other areas, including small motors. (3) The demand for increased signal transmission speed and device density for the next generation of multilevel integrated circuits has placed stringent demands on materials performance. Currently, integration of ultra low-k materials in dual Damascene processing requires chemical mechanical polishing (CMP) to planarize the copper. Unfortunately, none of the commercially proposed dielectric candidates display the desired mechanical and thermal properties for successful CMP.
A new polydiacetylene thermosetting polymer (DEB-TEB), which displays a low dielectric constant (low-k) of 2.7, was recently developed. This novel material appears to offer the only avenue for designing an ultra low-k dielectric (k = 1.85) that can still display the desired modulus (7.7 GPa) and hardness (2.0 GPa) sufficient to withstand the CMP process. We focused on further characterization of the thermal properties of spin-on poly(DEB-TEB) ultra-thin films. These include the coefficient of thermal expansion (CTE), biaxial thermal stress, and thermal conductivity. The CTE is 2.0 × 10⁻⁵ K⁻¹ in the perpendicular direction and 8.0 × 10⁻⁶ K⁻¹ in the planar direction. The low CTE provides a better match to the Si substrate, which minimizes interfacial stress and greatly enhances the reliability of the microprocessors. Initial experiments with oxygen plasma etching suggest a high probability of success in achieving vertical profiles.
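
To connect the quoted CTE and modulus values to the interfacial-stress argument, the sketch below applies the standard biaxial thin-film thermal stress estimate, σ = [E_f / (1 − ν_f)] (α_f − α_s) ΔT. The Poisson ratio and temperature excursion are assumed for illustration; only the modulus and in-plane CTE come from the abstract.

```python
# Rough sketch of the standard biaxial thermal-stress estimate for a thin film on a
# substrate. Poisson ratio and temperature excursion are assumed example values.
E_film = 7.7e9          # film modulus quoted in the abstract (Pa)
nu_film = 0.3           # assumed Poisson ratio of the film
alpha_film = 8.0e-6     # in-plane CTE of poly(DEB-TEB) from the abstract (1/K)
alpha_si = 2.6e-6       # CTE of the silicon substrate (1/K)
delta_T = 375.0         # assumed cool-down from cure to room temperature (K)

sigma = E_film / (1.0 - nu_film) * (alpha_film - alpha_si) * delta_T
print(f"Estimated biaxial thermal stress: {sigma / 1e6:.1f} MPa")
```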

Relevance:

30.00%

Publisher:

Abstract:

The problem: Around 300 million people worldwide have asthma and prevalence is increasing. Support for optimal self-management can be effective in improving a range of outcomes and is cost effective, but is underutilised as a treatment strategy. Supporting optimum self-management using digital technology shows promise, but how best to do this is not clear. Aim: The purpose of this project was to explore the potential role of a digital intervention in promoting optimum self-management in adults with asthma. Methods: Following the MRC Guidance on the Development and Evaluation of Complex Interventions, which advocates using theory, evidence, user testing and appropriate modelling and piloting, this project had three phases. Phase 1: Examination of the literature to inform phases 2 and 3, using systematic review methods and focussed literature searching. Phase 2: Development of the Living Well with Asthma website. A prototype (paper-based) version of the website was developed iteratively with input from a multidisciplinary expert panel, empirical evidence from the literature (from phase 1), and potential end users via focus groups (adults with asthma and practice nurses). Implementation and behaviour change theories informed this process. The paper-based designs were converted to the website through an iterative user-centred process (think aloud studies with adults with asthma). Participants considered contents, layout, and navigation. Development was agile, with feedback from the think aloud sessions used immediately to inform the design and subsequent think aloud sessions. Phase 3: A pilot randomised controlled trial over 12 weeks to evaluate the feasibility of a Phase 3 trial of Living Well with Asthma to support self-management. Primary outcomes were 1) recruitment and retention; 2) website use; 3) Asthma Control Questionnaire (ACQ) score change from baseline; 4) Mini Asthma Quality of Life Questionnaire (mini-AQLQ) score change from baseline. Secondary outcomes were patient activation, adherence, lung function, fractional exhaled nitric oxide (FeNO), a generic quality of life measure (EQ-5D), medication use, prescribing and health services contacts. Results: Phase 1 demonstrated that while digital interventions show promise, with some evidence of effectiveness in certain outcomes, participants were poorly characterised, telling us little about the reach of these interventions. The interventions themselves were poorly described, making it impossible to draw definitive conclusions about what worked and what did not. Phase 2: The literature indicated that important aspects to cover in any self-management intervention (digital or not) included: asthma action plans, regular health professional review, trigger avoidance, psychological functioning, self-monitoring, inhaler technique, and goal setting. The website asked users to aim to be symptom free. Key behaviours targeted to achieve this include: optimising medication use (including inhaler technique); attending primary care asthma reviews; using asthma action plans; increasing physical activity levels; and stopping smoking. The website had 11 sections, plus email reminders, which promoted these behaviours. Feedback during think aloud studies was mainly positive, with most changes focussing on clarification of language, order of pages and usability issues mainly relating to navigation difficulties. Phase 3: To achieve our recruitment target, 5383 potential participants were invited, leading to 51 participants being randomised (25 to the intervention group).
Age range was 16-78 years; 75% were female; 28% were from the most deprived quintile. Nineteen (76%) of the intervention group used the website, for an average of 23 minutes. Non-significant improvements in favour of the intervention group were observed in the ACQ score (-0.36; 95% confidence interval: -0.96, 0.23; p = 0.225) and mini-AQLQ scores (0.38; -0.13, 0.89; p = 0.136). A significant improvement was observed in the activity limitation domain of the mini-AQLQ (0.60; 0.05 to 1.15; p = 0.034). Secondary outcomes showed increased patient activation and reduced reliance on reliever medication. There was no significant difference in the remaining secondary outcomes. There were no adverse events. Conclusion: Living Well with Asthma has been shown to be acceptable to potential end users and has potential for effectiveness. This intervention merits further development and subsequent evaluation in a Phase III full-scale RCT.