920 results for fidelity of implementation
Abstract:
Miami-Dade County implemented a series of water conservation programs, including rebate/exchange incentives to encourage the use of high-efficiency aerators (AR), showerheads (SH), toilets (HET), and clothes washers (HEW), in response to environmental sustainability concerns in urban areas. This study first used panel data analysis of water consumption to evaluate the performance and actual water savings of the individual programs. An integrated water demand model was also developed to incorporate properties' physical characteristics into the water consumption profiles. A life cycle assessment (with emphasis on the end-use stage of the water system) of water-intensive appliances was conducted to determine the environmental impacts of each practice. Approximately 6 to 10% of water was saved in the first and second years of implementation of high-efficiency appliances, with continuing savings in the third and fourth years. Water savings (gallons per household per day) were observed at 28 (11.1%) for SH, 34.7 (13.3%) for HET, and 39.7 (14.5%) for HEW. Furthermore, the estimated contributions of high-efficiency appliances to reducing water demand in the integrated water demand model were between 5 and 19% (highest in the AR program). Results indicated that adoption of more than one type of water-efficient appliance could significantly reduce residential water demand. For sustainable water management, the appropriate water conservation rate was projected at 1 to 2 million gallons per day (MGD) through 2030. With 2 MGD of water savings, the estimated per capita water use could be reduced from approximately 140 to 122 gallons per capita per day (GPCD). Additional efforts are needed to reach the US EPA WaterSense conservation level of 70 GPCD by 2030. Life cycle assessment results showed that the environmental impacts (water and energy demands and greenhouse gas emissions) of the end-use and demand phases are the most significant within the water system, particularly due to water heating (73% for the clothes washer and 93% for the showerhead). Estimates of optimal appliance lifespans (8 to 21 years) implied that earlier replacement with high-efficiency models is advisable to minimize the environmental impacts of current practice.
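As a quick plausibility check on the per-capita figures above, the quoted 2 MGD saving and the drop from ~140 to ~122 GPCD jointly imply a service population of roughly 111,000. A minimal sketch of that arithmetic, using only numbers from the abstract:

```python
# Back-of-the-envelope check: a 2 MGD saving that moves per capita use from
# ~140 to ~122 GPCD implies the service population below.
savings_gpd = 2.0e6                      # 2 MGD in gallons per day
gpcd_before, gpcd_after = 140.0, 122.0   # gallons per capita per day

population = savings_gpd / (gpcd_before - gpcd_after)
print(f"implied service population: ~{population:,.0f}")  # ~111,111
```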
Abstract:
The purpose of the study was to investigate the physiological and psychological benefits provided by a self-selected health and wellness course to a racially and ethnically diverse student population. It was designed to determine whether students at a 2-year Hispanic-serving institution (HSI) in a large metropolitan area would enhance their capacity to perform physical activities, increase their knowledge of health topics, and raise their exercise self-efficacy after completing a course that included educational and activity components over a period of 16 weeks. A total of 185 students voluntarily agreed to participate in the study. An experimental group was selected from six sections of a health and wellness course, and a comparison group from students in a student life skills course. All participants completed anthropometric tests of physical fitness, a knowledge test, and an exercise self-efficacy scale at the beginning and at the conclusion of the semester. ANCOVA analyses, with the pretest scores as the covariate and the difference score as the dependent variable, indicated a significant improvement of the experimental group over the comparison group in five of the seven anthropometric tests. In addition, the experimental group improved on two of the three sections of the exercise self-efficacy scale, indicating greater confidence than the comparison group to participate in physical activities despite barriers. The experimental group also gained more knowledge of health-related topics than the comparison group at the .05 significance level. Results indicated beneficial outcomes for students enrolled in a 16-week health and wellness course. The study has several implications for practitioners, faculty members, educational policy makers, and researchers in terms of implementing strategies to promote healthy behaviors in college students and encouraging them to engage in regular physical activity throughout their college years.
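For readers unfamiliar with the design, here is a minimal sketch of this kind of ANCOVA (difference score as the outcome, pretest as the covariate, group as the factor) in Python with statsmodels; the data frame and column names below are synthetic illustrations, not the study's data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic example of an ANCOVA with pretest as covariate and the
# pre/post difference score as the dependent variable.
df = pd.DataFrame({
    "pretest":  [20, 22, 19, 25, 21, 24, 18, 23],
    "posttest": [26, 25, 20, 31, 22, 30, 19, 29],
    "group":    ["exp", "exp", "ctl", "exp", "ctl", "exp", "ctl", "ctl"],
})
df["diff"] = df["posttest"] - df["pretest"]

# group effect on the difference score, adjusted for pretest
model = smf.ols("diff ~ pretest + C(group)", data=df).fit()
print(model.summary().tables[1])
```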
Abstract:
A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using Lattice Boltzmann methods (LBMs). These methods contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed, and implementation issues are discussed. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains. The LBM solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. An LBM-based macroscopic flow solver (Darcy's law-based) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D). However, the new LBM-based model retains the ability to solve the inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches. The analogy between Fick's second law (the diffusion equation) and the transient ground water flow equation is used to solve for the transient head distribution. An altered-velocity flow solver with a source/sink term is applied to simulate a drawdown curve. Hydraulic parameters such as transmissivity and storage coefficient are linked with LB parameters. These capabilities complete the LBM's effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
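To make the diffusion/ground-water-flow analogy concrete, here is a minimal lattice Boltzmann sketch (a standard D1Q2 scheme in lattice units) that relaxes an initial pulse according to the 1-D diffusion equation; the grid size, relaxation time, and initial condition are illustrative choices, not values from the dissertation:

```python
import numpy as np

# Minimal D1Q2 lattice Boltzmann solver for the 1-D diffusion equation,
# illustrating the Fick's-second-law / transient-head analogy. Lattice units;
# the standard D1Q2 result gives D = (tau - 0.5).
nx, nt = 200, 500
tau = 1.0
D = tau - 0.5

rho = np.zeros(nx)
rho[nx // 2] = 1.0             # initial pulse (solute mass or head perturbation)

f = np.zeros((2, nx))          # populations streaming right (+1) and left (-1)
f[0] = f[1] = rho / 2.0        # start at equilibrium

for _ in range(nt):
    rho = f[0] + f[1]
    feq = rho / 2.0            # equilibrium is isotropic for pure diffusion
    f[0] += (feq - f[0]) / tau # BGK collision
    f[1] += (feq - f[1]) / tau
    f[0] = np.roll(f[0], 1)    # stream right
    f[1] = np.roll(f[1], -1)   # stream left (periodic boundaries)

# rho now approximates a Gaussian of variance 2*D*nt, matching the
# analytical solution of the diffusion equation.
print("total mass conserved:", rho.sum())
```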
Abstract:
INTRODUCTION: The ability to reproducibly identify clinically equivalent patient populations is critical to the vision of learning health care systems that implement and evaluate evidence-based treatments. The use of common or semantically equivalent phenotype definitions across research and health care use cases will support this aim. Currently, there is no single consolidated repository for computable phenotype definitions, making it difficult to find all definitions that already exist, and also hindering the sharing of definitions between user groups. METHOD: Drawing from our experience in an academic medical center that supports a number of multisite research projects and quality improvement studies, we articulate a framework that will support the sharing of phenotype definitions across research and health care use cases, and highlight gaps and areas that need attention and collaborative solutions. FRAMEWORK: An infrastructure for re-using computable phenotype definitions and sharing experience across health care delivery and clinical research applications includes: access to a collection of existing phenotype definitions, information to evaluate their appropriateness for particular applications, a knowledge base of implementation guidance, supporting tools that are user-friendly and intuitive, and a willingness to use them. NEXT STEPS: We encourage prospective researchers and health administrators to re-use existing EHR-based condition definitions where appropriate and share their results with others to support a national culture of learning health care. There are a number of federally funded resources to support these activities, and research sponsors should encourage their use.
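For illustration only, here is a deliberately simple sketch of what a shareable computable phenotype definition can look like in code; the condition, codes, and matching logic below are hypothetical and are not drawn from the article or from any standard:

```python
from dataclasses import dataclass

# Hypothetical, minimal representation of a computable phenotype definition.
# Real definitions typically reference curated value sets and richer logic.
@dataclass
class PhenotypeDefinition:
    name: str
    code_system: str
    inclusion_codes: set
    min_occurrences: int = 1   # e.g., require repeated qualifying encounters

    def matches(self, patient_codes: list) -> bool:
        hits = sum(1 for c in patient_codes if c in self.inclusion_codes)
        return hits >= self.min_occurrences

t2dm = PhenotypeDefinition(
    name="type 2 diabetes (illustrative only)",
    code_system="ICD-10-CM",
    inclusion_codes={"E11.9", "E11.65"},
    min_occurrences=2,
)
print(t2dm.matches(["E11.9", "I10", "E11.9"]))  # True
```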
Abstract:
Automated acceptance testing is high-level testing that uses scripts separate from the software itself to verify that the system abides by the requirements of the business clients. This project is a study of the feasibility of acceptance tests written according to the Behavior Driven Development principle. The project includes an implementation part in which automated acceptance tests were written for the Touch-point web application, developed by Dewire (a software consultancy) for Telia (a telecom company), based on the requirements received from the customer (Telia). The automated acceptance tests use the Cucumber-Selenium framework, which enforces Behavior Driven Development principles. The purpose of the implementation was to verify the practicability of this style of acceptance testing. From the completed implementation, it was concluded that all real-world customer requirements can be converted into executable specifications, and that the process was neither time-consuming nor difficult for an inexperienced programmer such as the author. The project also includes a survey to measure the learnability and understandability of Gherkin, the language that Cucumber understands. The survey consisted of Gherkin examples followed by questions that included making changes to those examples. The survey had three parts: the first easy, the second medium, and the third most difficult. It also included a linear scale from 1 (very easy) to 5 (very difficult) for rating the difficulty of each part. The time at which participants began the survey was recorded in order to calculate the total time they took to learn the material and answer the questions. The survey was taken by 18 Dewire employees whose primary role was programmer, tester, or project manager. In the results, testers and project managers were grouped as non-programmers. The survey concluded that Gherkin is very easy and quick to learn, and participants rated it as very easy.
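To illustrate the executable-specification style the thesis describes, here is a minimal sketch using the Python BDD framework behave (a stand-in for the Cucumber-Selenium stack used in the project); the feature text, step names, and login flow are invented for illustration:

```python
# features/steps/login_steps.py -- hypothetical step definitions for the
# Gherkin scenario below (which would live in features/login.feature):
#
#   Feature: Touch-point login
#     Scenario: Registered user signs in
#       Given a registered user "alice"
#       When she signs in with a valid password
#       Then she sees her dashboard

from behave import given, when, then

@given('a registered user "{name}"')
def step_registered_user(context, name):
    context.user = name            # a real suite would seed test data here

@when('she signs in with a valid password')
def step_sign_in(context):
    # a real suite would drive the browser here, e.g. via Selenium WebDriver
    context.signed_in = context.user == "alice"

@then('she sees her dashboard')
def step_sees_dashboard(context):
    assert context.signed_in
```

Running `behave` maps each plain-language Gherkin step to one of these functions, which is what makes the customer's requirements directly executable.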
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Commercial computer games contain “physics engine” components, responsible for providing realistic interactions among game objects. The question naturally arises of whether these engines can be used to develop educational materials for high school and university physics education. To answer this question, the author's group recently conducted a detailed scientific investigation of the physics engine of Unreal Tournament 2004 (UT2004). This article presents their motivation, methodology, and results. The author presents the findings of experiments that probed the accessibility and fidelity of UT2004's physics engine, examples of educational materials developed, and an evaluation of their use in high school classes. The associated pedagogical implications of this approach are discussed, and the author suggests guidelines for educators on how to deploy the approach. Key resources are presented on an associated Web site.
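As an illustration of the kind of fidelity probe described above, the sketch below compares a fixed-step, game-engine-style integrator against the analytical projectile solution; the time step, gravity value, and integrator choice are assumptions for illustration, not UT2004 internals:

```python
import numpy as np

# Compare a fixed-step, engine-style integrator with closed-form kinematics
# for a projectile; the deviation quantifies integrator "fidelity".
g, dt, steps = 9.81, 1.0 / 60.0, 120   # 2 seconds at 60 Hz (assumed)
vy0 = 15.0                             # initial vertical velocity (m/s)

t = np.arange(steps + 1) * dt
y_true = vy0 * t - 0.5 * g * t**2      # analytical trajectory

# semi-implicit (symplectic) Euler, common in game physics engines
y, vy = 0.0, vy0
y_sim = [y]
for _ in range(steps):
    vy -= g * dt
    y += vy * dt
    y_sim.append(y)

err = np.max(np.abs(np.array(y_sim) - y_true))
print(f"max vertical deviation over {steps} steps: {err:.3f} m")  # ~0.16 m
```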
Abstract:
One possible laser source for the Laser Interferometer Space Antenna (LISA) consists of an ytterbium-doped fiber amplifier, originally developed for inter-satellite communication, seeded by the laser used for the technology demonstrator mission LISA Pathfinder. LISA needs to transmit clock information between its three spacecraft to correct for phase noise between the clocks on the individual spacecraft. For this purpose, phase modulation sidebands at GHz frequencies will be imprinted on the laser beams between spacecraft. Differential phase noise between the carrier and a sideband introduced within the optical chain must be very low. We report on a transportable setup to measure the phase fidelity of optical amplifiers.
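For context, the sideband structure that such phase modulation imprints on the carrier follows from the standard Jacobi-Anger expansion (a textbook result, not taken from the abstract): for carrier frequency \omega_c, modulation index m, and modulation frequency \Omega,

```latex
E(t) = E_0 \, e^{i(\omega_c t + m \sin \Omega t)}
     = E_0 \sum_{k=-\infty}^{\infty} J_k(m) \, e^{i(\omega_c + k\Omega) t}
```

so the clock information travels in the k = ±1 sidebands at \omega_c ± \Omega, and the measurement described above tracks the differential phase between the k = 0 carrier and a sideband.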
Abstract:
There is variation in how teachers and schools implement bullying prevention programs. Although this variation has been discussed, there has been little empirical research on the relationship between implementation fidelity and program outcomes. This thesis contains three studies, each in the context of implementing the KiVa antibullying program, and examines teachers' actions in preventing and intervening in school bullying. The first aim of this thesis is to examine the degree of implementation of the KiVa curriculum and its association with reductions in victimization and bullying perpetration (Study I). The second aim is to clarify why teachers displayed different degrees of adherence to the KiVa curriculum during a school year (Study II). The third is to investigate whether recognizing victimization can be difficult for school staff (Study III). In addition to these peer-reviewed studies, the thesis includes an unpublished qualitative analysis of teachers' open answers concerning their implementation experiences. The data were collected from elementary school teachers (Studies I-II; the unpublished study), elementary school students (Study I), and students at the elementary and middle school levels (Study III) during the evaluation of the effectiveness of the KiVa antibullying program between 2007 and 2009. The findings demonstrate that a larger reduction in victimization can be achieved in classrooms where teachers display higher levels of adherence to the KiVa curriculum and invest more time in preparing the lessons. Bullying perpetration, however, was not equally affected by the level of curriculum implementation. With respect to the implementation process over one year, there was significant variation in individual teachers' activity, ranging from systematic, high implementation to delivery that declined from lesson to lesson. Sustained action (high and moderate levels of implementation) was premised on principal support for antibullying work. Lesson preparation was associated with keeping implementation high throughout the school year. The findings also implied that belief in the effectiveness of the program is important for a higher degree of implementation at the start of the process. Finally, there are severe flaws in teachers' ability to identify students who are victimized: it is possible that only one-fourth of chronically victimized students are helped by school staff. Reaching out to victims is especially difficult when they are middle-school-aged girls, when they bully others themselves, or when they do not tell adults about the bullying. Implementation and dissemination of research-based interventions take a good deal of time and effort. The findings demonstrate that active implementation is important for improving program outcomes. They also show how implementation can be sustained: both individual and interpersonal factors facilitate or inhibit high-quality implementation. Implications for future research on the implementation of school-based programs are suggested.
Abstract:
The first part of the study presents the types of barriers to tourism development that may occur during the planning phase of this development and during the implementation of those plans, including endogenous and exogenous barriers. The second part presents the results of research on the factors hindering the development of tourism identified in a selected region of Wielkopolska Province (Poland). The article gives a detailed description of the categories of tourism barriers, which include political and legal, economic, infrastructural, social, geographical, and organizational problems. The final part shows that the understanding of these problems differs between stakeholder groups, which leads to the conclusion that the opinions of different stakeholder categories must be gathered in order to identify problematic issues precisely. Only such an approach can lead to a development strategy without areas of uncertainty (i.e., «gaps» in the identification of problem areas).
Abstract:
The history of comitology – the system of implementation committees that control the Commission in the execution of delegated powers – has been characterised by institutional tensions. The crux of these tensions has often been the role of the European Parliament and its quest to be granted powers equal to those of the Council. Over time this tension has been resolved through a series of inter-institutional agreements and Comitology Decisions, essentially giving the Parliament incremental increases in power. This process came to a head with the 2006 Comitology reform and the introduction of the regulatory procedure with scrutiny (RPS). After just over three years of experience with the RPS procedure, and after revision of the entire acquis communautaire, the Treaty of Lisbon has made the procedure redundant through the creation of Delegated Acts (Article 290 TFEU), which give the Parliament equal rights of oversight. This article aims to evaluate the practical implications that Delegated Acts will entail for the Parliament, principally by using the four years of experience with the RPS to better understand the challenges ahead. This analysis will be of interest to those following the study of comitology and of formal and informal inter-institutional relations, and also to practitioners who will have to work with Delegated Acts in the future.
Abstract:
Implementation of human rights is often criticized because it is perceived as being imposed on the rest of the world. In this case, human rights come to be seen as a mere abstraction, an empty word. What are the theoretical arguments behind these criticisms, and can we find any historical grounds for them? In this paper, I point to similar criticisms raised after the French Revolution – such as those of the Historical School and Hegel – and try to show whether some of them are still relevant. I then compare these criticisms with the contemporary arguments of cultural relativists. There are different streams and categorizations of human rights theories in today's world; what differentiates them is essentially the source of human rights. After the French Revolution, the Historical School criticized individuation, and Hegel criticized the formal freedom that was, according to him, a consequence of the Revolution. In this context Hegel drew a distinction between real freedom and formal freedom. Besides theories of sources, theories of implementation, such as human rights as a model of learning or human rights as the result of a historical process, deserve attention. The crucial point is to integrate human rights as an inner process rather than to use them as a tool for intervention in other countries, as we observe in today's world. This is exactly why I find the discussion of sources more important: it can help show how the inner evaluation of a society makes the realization of human rights possible and how the abstraction and misuse mentioned above can be avoided.
Abstract:
The studied company is a Finnish firm that manufactures and sells paints and lacquers internationally. In 2010 the company adopted new production and supply chain objectives and plans, and this study is part of that comprehensive development effort. The study examines OEE, a tool for measuring and improving the effectiveness of production and maintenance, and SMED, a tool for reducing product changeover times. The theoretical part of the work is based mainly on academic publications, but also on interviews, books, websites, and one annual report. In the empirical part, the problems and successes of the OEE implementation were studied with a repeatable user survey. The potential and implementation of OEE were also examined using production and availability data collected from a production line. SMED was studied with the aid of a computer program based on it; it was examined at a theoretical level and has not yet been implemented in practice. According to the results, OEE and SMED suit the case company well and hold considerable potential. OEE reveals not only the amount of availability losses but also their structure. With OEE results, the company can direct its limited production and maintenance improvement resources to the right places. The production line examined in this work produced nothing during 56% of all planned production time in April 2016, and 44% of the line's stoppage time was caused by changeover, start-up, or shutdown work. It can be concluded that availability losses are a serious problem for the company's production effectiveness and that reducing changeover work is an important target for improvement. Changeover time could be reduced by ~15% through simple and inexpensive changes to work order and tooling identified with SMED; the improvement would be even greater with more comprehensive changes. SMED's greatest potential, however, may lie not in shortening changeover times but in standardising them.
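To make the OEE figures concrete, here is a minimal sketch of the standard OEE decomposition (availability x performance x quality) seeded with the April 2016 availability figure from the abstract; the performance and quality rates below are placeholders, not measured values:

```python
# Standard OEE decomposition: OEE = availability * performance * quality.
stopped_share = 0.56                 # line produced nothing 56% of planned time
availability = 1.0 - stopped_share   # 0.44, from the abstract

performance = 0.90                   # placeholder: speed losses vs. ideal rate
quality = 0.98                       # placeholder: share of good product

oee = availability * performance * quality
changeover_share_of_stops = 0.44     # from the abstract

print(f"OEE ~ {oee:.1%}")
print(f"changeover/start/stop share of stoppages: {changeover_share_of_stops:.0%}")
```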
Abstract:
Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more chemically complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite its effectiveness, TEM can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method is proposed, named dictionary learning electron tomography (DLET). DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) adaptively and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, sparsity is applied to overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulated and real ET experiments on several morphologies are performed with a variety of setups. Reconstruction results validate the method's efficiency in both noiseless and noisy cases and show that it yields improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select or whether the images strictly satisfy the preconditions of a certain transform (e.g., strictly piecewise constant for Total Variation minimisation). This also avoids artifacts that can be introduced by specific sparsifying transforms (e.g., the staircase artifacts that may result from Total Variation minimisation).
Moreover, this thesis shows how reliable, elementally sensitive tomography using electron energy loss spectroscopy (EELS) is possible with the aid of both the appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss EELS from nanoparticles of an industrially important material. Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
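As a toy illustration of the patch-based dictionary-learning idea underlying DLET, the sketch below learns a dictionary from overlapping patches of a noisy 2-D slice and reconstructs a denoised estimate; this is generic sparse coding with scikit-learn, not the authors' full alternating tomographic solver, and all sizes and parameters are illustrative:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)

# Synthetic noisy "slice": a square particle plus Gaussian noise.
rng = np.random.default_rng(0)
image = np.zeros((64, 64))
image[16:48, 16:48] = 1.0
noisy = image + 0.2 * rng.standard_normal(image.shape)

# Learn a dictionary on overlapping patches (favours local structure).
patch_size = (8, 8)
patches = extract_patches_2d(noisy, patch_size)
X = patches.reshape(len(patches), -1)
mean = X.mean(axis=1, keepdims=True)
X = X - mean                                   # sparse-code the fluctuations

dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
codes = dico.fit(X).transform(X)               # sparse codes per patch (OMP)
X_hat = codes @ dico.components_ + mean        # denoised patches

# Average the overlapping denoised patches back into an image.
denoised = reconstruct_from_patches_2d(X_hat.reshape(patches.shape), image.shape)
print("residual RMSE:", np.sqrt(np.mean((denoised - image) ** 2)))
```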
Abstract:
Human immunodeficiency virus (HIV) rapidly evolves through the generation and selection of mutants that can escape drug therapy. This process is fueled, in part, by the presumably highly error-prone polymerase reverse transcriptase (RT). The fidelity of polymerases can be influenced by cation co-factors. Physiologically, magnesium (Mg2+) is used as a co-factor by RT to perform catalysis; however, alternative cations including manganese (Mn2+), cobalt (Co2+), and zinc (Zn2+) can also be used. I demonstrate here that the fidelity and inhibition of HIV RT can be influenced differently, in vitro, by divalent cations depending on their concentration. The reported mutation frequency for purified HIV RT in vitro is typically in the 10^-4 range (per nucleotide addition), making the enzyme several-fold less accurate than most polymerases. Paradoxically, results examining HIV replication in cells indicate an error frequency that is ~10 times lower than the error rate obtained in the test tube. Here, I reconcile, at least in part, these discrepancies by showing that HIV RT fidelity in vitro is in the same range as the cellular results at physiological concentrations of free Mg2+ (~0.25 mM). At low Mg2+, mutation rates were 5 to 10 times lower than under high Mg2+ conditions (5-10 mM). Alternative divalent cations also have a concentration-dependent effect on RT fidelity. The presumed promutagenic cations Mn2+ and Co2+ decreased the fidelity of RT only at elevated concentrations, and Zn2+, when present at low concentration, increased the fidelity of HIV-1 RT ~2.5-fold compared to Mg2+. HIV-1 and HIV-2 RT inhibition by nucleoside (NRTIs) and non-nucleoside RT inhibitors (NNRTIs) in vitro is also affected by the Mg2+ concentration. NRTIs lacking a 3′-OH group inhibited both enzymes less efficiently in low Mg2+ than in high Mg2+, whereas inhibition by the "translocation-defective RT inhibitor", which retains the 3′-OH, was unaffected by Mg2+ concentration, suggesting that NRTIs with a 3′-OH group may be more potent than other NRTIs. In contrast, NNRTIs were more effective in low than in high Mg2+ conditions. Overall, the studies presented reveal strategies for designing novel RT inhibitors and strongly emphasize the need to study HIV RT and RT inhibitors under physiologically relevant low Mg2+ conditions.
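To put the two error frequencies above in perspective, here is a minimal sketch of what they imply per replication cycle; the ~9.7 kb HIV-1 genome length is an assumption from general knowledge, not a number from the abstract:

```python
# Expected mutations per genome per replication cycle at the two error rates
# discussed above. Genome length (~9,700 nt) is an assumed, approximate value.
genome_nt = 9_700

for label, rate in [("in vitro, high Mg2+ (~1e-4/nt)", 1e-4),
                    ("cellular / low Mg2+ (~1e-5/nt)", 1e-5)]:
    print(f"{label}: ~{rate * genome_nt:.2f} mutations per genome per cycle")
```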