768 results for reliability-cost evaluation


Relevance: 30.00%

Abstract:

Different tools have been used to set up and adapt the model for the fulfillment of the objective of this research.

1. The model. The base model is the Analytic Hierarchy Process (AHP), adapted to perform a benefit-cost analysis. The AHP, developed by Thomas Saaty, is a multicriteria decision-making technique that decomposes a complex problem into a hierarchy. It is used to derive ratio scales from both discrete and continuous paired comparisons in multilevel hierarchic structures. These comparisons may be taken from actual measurements or from a fundamental scale that reflects the relative strength of preferences and feelings.

2. Tools and methods.
2.1. The Expert Choice software. Expert Choice is a tool that allows each operator to easily implement the AHP model at every stage of the problem.
2.2. Personal interviews with the farms. For this research, the EMAS-certified farms of the Emilia-Romagna region were identified; the information was provided by the EMAS centre in Vienna. Personal interviews were carried out at each farm in order to obtain a complete and realistic judgment on each criterion of the hierarchy.
2.3. Questionnaire. A supporting questionnaire was also delivered and used during the interviews.

3. Elaboration of the data. After data collection, the data were processed with the support of the Expert Choice software.

4. Results of the analysis. The figures above (see other document) yield a series of numbers that are fractions of unity; each is to be interpreted as the relative contribution of an element to the fulfillment of its objective.
Calculating the benefit/cost ratio for each alternative gives the following. Alternative one, implement EMAS: benefits ratio 0.877, costs ratio 0.815, benefit/cost ratio 0.877/0.815 = 1.08. Alternative two, do not implement EMAS: benefits ratio 0.123, costs ratio 0.185, benefit/cost ratio 0.123/0.185 = 0.66. As stated above, the alternative with the highest ratio is the best solution for the organization; the model implemented here therefore suggests that EMAS adoption is the best alternative for the agricultural sector. It has to be noted, however, that the ratio of 1.08 is a relatively low positive value. This shows the fragility of the conclusion and suggests a careful examination of the benefits and costs of each individual farm before adopting the scheme. On the other hand, the result should be taken into consideration by policy makers in order to strengthen their interventions supporting scheme adoption in the agricultural sector. According to the AHP elaboration of the judgments, the main considerations on benefits are:
- Legal compliance appears to be the most important benefit for the agricultural sector, with a rank of 0.471.
- The next two most important benefits are improved internal organization (ranking 0.230) followed by competitive advantage (ranking 0.221), the latter mostly due to the sub-element improved image (ranking 0.743).
Finally, even though incentives are not ranked among the most important elements, the financial ones seem to have been decisive in the decision-making process. The main considerations on costs are:
- External costs appear to be far more important than internal ones (ranking 0.857 versus 0.143), suggesting that EMAS consultancy and verification costs remain the biggest obstacle.
- The implementation of the EMS is the most challenging element among the internal costs (ranking 0.750).
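The benefit/cost arithmetic above can be reproduced directly; in this minimal Python sketch only the four ratios come from the text, and the names of the alternatives are paraphrased:

```python
# AHP benefit and cost priorities for the two alternatives (values from the text).
benefits = {"Implement EMAS": 0.877, "Do not implement EMAS": 0.123}
costs = {"Implement EMAS": 0.815, "Do not implement EMAS": 0.185}

# Benefit/cost ratio per alternative; the highest ratio wins, as stated above.
bc_ratio = {alt: benefits[alt] / costs[alt] for alt in benefits}
best = max(bc_ratio, key=bc_ratio.get)

for alt, r in bc_ratio.items():
    print(f"{alt}: B/C = {r:.2f}")
print("Preferred alternative:", best)
```

Running this reproduces the ratios 1.08 and 0.66 and selects "Implement EMAS" as the preferred alternative.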

Relevance: 30.00%

Abstract:

The term congenital nystagmus (early-onset nystagmus or infantile nystagmus syndrome) refers to a pathology characterised by an involuntary movement of the eyes, which often seriously reduces a subject's vision. Congenital nystagmus (CN) is a specific kind of nystagmus within the wider classification of infantile nystagmus, best recognized and classified by means of a combination of clinical investigations and motility analysis; in some cases, eye-movement recording and analysis are indispensable for diagnosis. However, the interpretation of eye-movement recordings still lacks complete reliability; hence new analysis techniques and the precise identification of concise parameters directly related to visual acuity are necessary to further support physicians' decisions. To this aim, an index computed from eye-movement recordings and related to the visual acuity of a subject is proposed in this thesis. This estimator is based on two parameters: the time spent by a subject effectively viewing a target (foveation time, Tf) and the standard deviation of the eye position (SDp). Moreover, since previous studies have shown that visual acuity largely depends on SDp, a data-collection pilot study was also conducted with the purpose of identifying any slow rhythmic component in the eye position and of characterising SDp in more detail. The results are presented in this thesis. In addition, some oculomotor-system models are reviewed, and a new approach to those models, i.e. the recovery of periodic orbits of the oculomotor system in patients with CN, is tested on real patient data. In conclusion, the results obtained within this research allow the slow rhythmic component sometimes present in eye-position recordings of CN subjects to be completely and reliably characterised, and the different kinds of CN waveforms to be better classified. These findings can support clinicians in therapy planning and treatment-outcome evaluation.
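As a rough illustration of the two parameters the proposed estimator rests on, the sketch below computes a foveation time and the standard deviation of position from a sampled eye-position trace. The position window and velocity threshold are illustrative assumptions, not values taken from the thesis:

```python
import statistics

def foveation_time(position_deg, fs_hz, target_deg=0.0,
                   pos_win_deg=0.5, vel_thresh_deg_s=4.0):
    """Time (s) the eye is effectively on target (Tf): samples close to the
    target position with low velocity. Thresholds are illustrative only."""
    dt = 1.0 / fs_hz
    n_fov = 0
    for i in range(1, len(position_deg)):
        vel = abs(position_deg[i] - position_deg[i - 1]) / dt  # deg/s
        if abs(position_deg[i] - target_deg) <= pos_win_deg and vel <= vel_thresh_deg_s:
            n_fov += 1
    return n_fov * dt

def sd_position(position_deg):
    """SDp: standard deviation of the eye-position trace."""
    return statistics.pstdev(position_deg)
```

A steady fixation trace yields a large Tf and near-zero SDp, while an oscillating nystagmus waveform yields the opposite, which is the intuition behind relating these two quantities to visual acuity.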

Relevance: 30.00%

Abstract:

Hypochondriasis places a considerable burden on those affected, significantly impairs them, and is moreover of high health-policy relevance. This creates the need to develop and evaluate effective treatment approaches. The present investigation is the most comprehensive study to date on the efficacy of group-therapy interventions for patients with hypochondriasis. A total of 35 patients who met the DSM-IV criteria for hypochondriasis took part in the study. The treatment consisted of eight group sessions and six individual sessions. Treatment success was assessed with standardized questionnaires and ratings by the treating therapists. In addition, the patients' implicit anxiousness was measured before and after treatment with the anxiety IAT (Egloff & Schmukle, 2002). Questionnaire data were collected at four measurement points. A subgroup of the patients (n = 10) could additionally be surveyed over a two-month waiting period. Overall, the therapy was well accepted by the patients. Over the course of treatment, the self-report measures showed extensive changes in the patients' experience and behaviour: a reduction of illness-related cognitions and fears, a decrease in illness behaviour, and an increase in knowledge about the disorder and its management. The reduction of hypochondriacal symptoms proved to be clinically relevant. There was also a reduction in general distress and anxiousness as well as in depressive and somatic symptoms. The ratings of the treating therapists confirmed the questionnaire findings. The anxiety IAT demonstrated a change in the anxiety-related self-concept.
During a waiting-list control period, only minor reductions in hypochondriacal symptoms and no meaningful reductions in general psychopathology were observed. The results of the combined treatment are comparable to previous evaluations of the effectiveness of individual therapy for hypochondriasis. The findings underline that more economical group-therapy interventions are equivalent in the treatment of hypochondriasis.

Relevance: 30.00%

Abstract:

Reliable electronic systems, namely sets of reliable electronic devices connected to each other and working correctly together towards the same functionality, represent an essential ingredient for the large-scale commercial implementation of any technological advancement. Microelectronics technologies and new powerful integrated circuits provide noticeable improvements in performance and cost-effectiveness, and allow electronic systems to be introduced in increasingly diversified contexts. On the other hand, the opening of new fields of application leads to new, unexplored reliability issues. The development of semiconductor-device and electrical models (such as the well-known SPICE models) able to describe the electrical behavior of devices and circuits is a useful means of simulating and analyzing the functionality of new electronic architectures and new technologies. Moreover, it represents an effective way to point out the reliability issues raised by the employment of advanced electronic systems in new application contexts. In this thesis, the modeling and design of both advanced reliable circuits for general-purpose applications and devices for energy efficiency are considered. In more detail, the following activities have been carried out. First, reliability issues concerning the security of standard communication protocols in wireless sensor networks are discussed, and a new communication protocol that increases network security is introduced. Second, a novel scheme for the on-die measurement of either clock jitter or process-parameter variations is proposed; the developed scheme allows both jitter and process-parameter variations to be evaluated at low cost. Then, reliability issues in the field of energy-scavenging systems are analyzed, with an accurate analysis and modeling of the effects of faults affecting circuits for energy harvesting from mechanical vibrations.
Finally, the problem of modeling the electrical and thermal behavior of photovoltaic (PV) cells under hot-spot conditions is addressed with the development of an electrical and thermal model.
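The kind of electrical model referred to in the last point can be sketched with the textbook single-diode PV equation (series resistance omitted for simplicity). All parameter values are illustrative defaults, not the model developed in the thesis:

```python
import math

def pv_cell_current(v, iph=5.0, i0=1e-9, n=1.3, t_k=298.15, rsh=50.0):
    """Single-diode PV cell model without series resistance:
        I(V) = Iph - I0*(exp(V/(n*Vt)) - 1) - V/Rsh
    where Vt = kT/q is the thermal voltage. Iph is the photocurrent,
    I0 the diode saturation current, n the ideality factor and Rsh the
    shunt resistance. Parameter values are illustrative only."""
    vt = 1.380649e-23 * t_k / 1.602176634e-19  # thermal voltage kT/q (V)
    return iph - i0 * (math.exp(v / (n * vt)) - 1.0) - v / rsh

# Sweep a few operating points: current falls off as V approaches open circuit.
for v in (0.0, 0.3, 0.6):
    print(f"V = {v:.1f} V -> I = {pv_cell_current(v):.3f} A")
```

Under reverse bias (negative V, the hot-spot situation), this same equation makes the shaded cell a power sink rather than a source, which is why a coupled thermal model is needed on top of the electrical one.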

Relevance: 30.00%

Abstract:

Several clinical tests have been developed to qualitatively describe complex motor tasks through functional testing; however, these methods often depend on the clinician's interpretation, experience and training, which makes the assessment results inconsistent and deprives them of the precision required to objectively assess the effect of a rehabilitative intervention. A more detailed characterization is required to fully capture the various aspects of motor control and performance during complex movements of the lower and upper limbs. Cost-effective and clinically applicable instrumented tests would enable quantitative assessment of performance on a subject-specific basis, overcoming the lack of objectivity inherent in individual judgment and possibly disclosing subtle alterations that are not clearly visible to the observer. Postural motion measurements at additional locations, such as the lower and upper limbs and the trunk, may be necessary in order to obtain information about inter-segmental coordination during the different functional tests used in clinical practice. With these considerations in mind, this thesis aims: i) to propose a novel quantitative assessment tool for the kinematic and dynamic evaluation of a multi-link kinematic chain during several functional motor tasks (i.e. squat, sit-to-stand, postural sway), using one single-axis accelerometer per segment; ii) to present a novel quantitative technique for upper-limb joint kinematics estimation, considering a 3-link kinematic chain during the Fugl-Meyer Motor Assessment and using one inertial measurement unit per segment. The suggested methods could provide several benefits in clinical practice.
The use of objective biomechanical measurements, provided by inertial-sensor-based techniques, may help clinicians to: i) objectively track changes in motor ability; ii) provide timely feedback about the effectiveness of administered rehabilitation interventions; iii) enable intervention strategies to be modified or changed if found to be ineffective; and iv) speed up the experimental sessions when several subjects are asked to perform different functional tests.
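A single-axis accelerometer can yield segment inclination in quasi-static tasks (e.g. postural sway, slow sit-to-stand) because the sensed gravity component along the axis is g·sin(θ). The sketch below shows only this basic sensing principle, not the thesis' estimation pipeline:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def segment_tilt_deg(accel_ms2):
    """Static inclination of a body segment from one single-axis
    accelerometer mounted along the segment: the gravity component
    on the sensing axis is g*sin(theta), so theta = asin(a/g).
    Only valid when segment acceleration is negligible vs gravity."""
    a = max(-G, min(G, accel_ms2))  # clamp so sensor noise can't leave asin's domain
    return math.degrees(math.asin(a / G))

# Example readings from a tilting trunk segment (hypothetical values):
for a in (0.0, G / 2, G):
    print(f"a = {a:.2f} m/s^2 -> tilt = {segment_tilt_deg(a):.1f} deg")
```

Chaining one such angle per segment gives the inter-segmental coordination picture for a multi-link kinematic chain; for dynamic tasks, the full inertial-measurement-unit approach of aim ii) is needed instead.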

Relevance: 30.00%

Abstract:

The present work falls within the context of sustainability assessment in the construction field and is aimed at estimating and analyzing the life-cycle cost of the existing reinforced-concrete bridge "Viadotto delle Capre" over its entire life. This was accomplished through comprehensive data collection and evaluation of the results, and an economic analysis of the project was performed. The work investigated possible design alternatives for maintenance/rehabilitation and end-of-life operations when structural, functional, economic and also environmental requirements have to be fulfilled. In detail, the economic impact of different design options for the given reinforced-concrete bridge was assessed, whereupon the most economically, structurally and environmentally efficient scenario was chosen. The Integrated Life-Cycle Analysis procedure and Environmental Impact Assessment are also discussed in this work. The scope of this thesis is to illustrate that Life Cycle Cost analysis, as part of a Life Cycle Assessment approach, can be effectively used to drive the design and management strategy of new and existing structures. The final objective of this contribution is to show how an economic analysis can influence decision-making in the definition of the most sustainable design alternatives: designers can monitor the economic impact of different design strategies in order to identify the most appropriate option.
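At its core, comparing the life-cycle cost of design alternatives reduces to discounting each scenario's cash flows to present value. A minimal sketch, with cash flows and discount rate invented for illustration (the thesis' actual figures are not shown here):

```python
def life_cycle_cost(initial_cost, annual_costs, discount_rate):
    """Discounted life-cycle cost: initial outlay plus the present value of
    each year's maintenance/rehabilitation/end-of-life cost."""
    pv = initial_cost
    for year, cost in enumerate(annual_costs, start=1):
        pv += cost / (1.0 + discount_rate) ** year
    return pv

# Two hypothetical strategies for the same bridge over a 20-year horizon:
do_minimum = life_cycle_cost(0.0, [5.0] * 20, 0.04)     # frequent small repairs
rehabilitate = life_cycle_cost(40.0, [1.0] * 20, 0.04)  # upfront intervention
print(f"do minimum:   {do_minimum:.1f}")
print(f"rehabilitate: {rehabilitate:.1f}")
```

The scenario with the lower discounted total is the economically preferable one; structural and environmental requirements then act as constraints on that choice, as described above.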

Relevance: 30.00%

Abstract:

How to evaluate the cost-effectiveness of repair/retrofit interventions versus demolition/replacement, and what level of shaking intensity the chosen repair/retrofit technique can sustain, are open questions affecting the pre-earthquake prevention, post-earthquake emergency and reconstruction phases alike. The (mis)conception that the cost of retrofit interventions increases linearly with the achieved seismic performance (%NBS) often discourages stakeholders from considering repair/retrofit options in a post-earthquake damage situation. Similarly, in the pre-earthquake phase, only the minimum (by-law) level of %NBS might be targeted, leading in some cases to no action. Furthermore, the performance measure compelling owners to take action, the %NBS, is generally evaluated deterministically. Since it does not directly reflect epistemic and aleatory uncertainties, the assessment can result in misleading confidence in the expected performance. The present study aims to contribute to the delicate decision-making process of repair/retrofit versus demolition/replacement by developing a framework that assists stakeholders in evaluating the long-term losses and benefits of an increment in their initial investment (targeted retrofit level), and by highlighting the uncertainties hidden behind a deterministic approach. For a pre-1970 case-study building, different retrofit solutions targeting different levels of %NBS are considered, and the actual probability of reaching collapse under a suite of ground motions is evaluated, providing a correlation between %NBS and risk. Both simplified and probabilistic loss modelling are then undertaken to study the relationship between %NBS and expected direct and indirect losses.
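The probabilistic side of such an assessment can be illustrated by counting, over a ground-motion suite, how often demand exceeds the capacity associated with a retrofit level. Demands, capacities and retrofit labels below are invented for illustration and are not the case-study values:

```python
def collapse_probability(demands, capacity):
    """Fraction of ground motions in the suite whose demand exceeds the
    building's capacity at a given retrofit level (a frequentist stand-in
    for the risk figure discussed above; all numbers are hypothetical)."""
    return sum(1 for d in demands if d > capacity) / len(demands)

# Larger targeted capacity (higher %NBS) -> fewer exceedances in the suite.
suite = [0.4, 0.7, 0.9, 1.1, 1.3, 1.6]  # spectral demands, hypothetical units
for label, capacity in [("low retrofit", 0.8), ("mid retrofit", 1.2), ("high retrofit", 1.7)]:
    print(label, collapse_probability(suite, capacity))
```

Multiplying such a probability by a consequence (replacement cost, downtime) gives the expected-loss terms that the study's loss modelling relates back to %NBS.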

Relevance: 30.00%

Abstract:

Tape drives have so far been the dominant technology for storing the data volumes that accumulate in archival systems. With access patterns becoming ever more active, and with storage media such as hard disks catching up in cost, the architecture of archival storage systems has to be reconsidered. Reliability, integrity and durability are the core properties of digital archiving; however, access speed also gains importance when active archives make their entire contents available for direct access. A tape-based system cannot deliver the parallelism, latency and throughput required for this, which is usually compensated for by disk-based systems acting as a cache.
In this work we examine the challenges and opportunities of developing a disk-based storage system that aims at high reliability and energy efficiency and is suitable for both active and cold archival environments. First, we analyse the storage systems and access patterns of a large digital archive, thereby presenting a possible field of application for our architecture. We then introduce mechanisms to improve the reliability of an individual hard disk, and present and evaluate a new, energy-efficient, two-dimensional RAID approach optimized for write-once, read-many access. Finally, we present logging and caching mechanisms that support the underlying goals, and evaluate the RAID system in a file-system environment.
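The two-dimensional RAID idea (row and column parity over a grid of blocks, so a lost block can be rebuilt from either its row or its column) can be sketched with XOR parity. This is the generic 2D-parity scheme, not necessarily the exact layout evaluated in the dissertation:

```python
from functools import reduce

def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def parity_2d(grid):
    """Compute row and column XOR parities over a rectangular grid of
    equal-length data blocks (a sketch of two-dimensional RAID parity)."""
    row_par = [reduce(xor_bytes, row) for row in grid]
    col_par = [reduce(xor_bytes, col) for col in zip(*grid)]
    return row_par, col_par

# Four data blocks arranged in a 2x2 grid:
grid = [[b"\x0f", b"\xf0"],
        [b"\xaa", b"\x55"]]
row_par, col_par = parity_2d(grid)
# A lost block is the XOR of its row parity with the row's survivors:
rebuilt = xor_bytes(row_par[0], grid[0][1])
print(rebuilt == grid[0][0])
```

For write-once, read-many archival access this is attractive because parities are computed once at ingest; a second failure in the same row is still covered by the column parity.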

Relevance: 30.00%

Abstract:

Objective: To compare two scoring systems, the Huddart/Bodenham system (HB system) and the Bauru-BCLP yardstick (BCLP yardstick), which classify treatment outcome in terms of dental arch relationships in patients with complete bilateral cleft lip and palate (CBCLP). The predictive value of these scoring systems for treatment outcome was also evaluated.
Design: Retrospective longitudinal study.
Patients: Dental arch relationships of 43 CBCLP patients were evaluated at 6, 9 and 12 years.
Setting: Treatment outcome in BCLP patients using two scoring systems.
Main Outcome Measures: For each age group, the HB scores were correlated with the BCLP yardstick scores using Spearman's correlation coefficient. The predictive value of the two scoring systems was evaluated by backward regression analysis.
Results: Intraobserver kappa values for the BCLP yardstick scoring were .506 and .627 for the two observers, and the interobserver reliability ranged from .427 to .581. The intraobserver reliability for the HB system ranged from .92 to .97 and the interobserver reliability from .88 to .96. The BCLP yardstick scores at 6 and 9 years together were predictors of the outcome at 12 years (explained variance 41.3%). Adding the incisor and lateral HB scores to the regression model increased the explained variance to 67%.
Conclusions: The BCLP yardstick and the HB system are reliable scoring systems for the evaluation of dental arch relationships in CBCLP patients. The HB system categorizes treatment outcome into similar categories as the BCLP yardstick. If a more sensitive measure of treatment outcome is needed, both scoring systems should be used.
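Spearman's coefficient, used here to correlate the HB and BCLP yardstick scores, is simply Pearson's correlation applied to ranks. A stdlib-only sketch with toy scores (not the study's data):

```python
def average_ranks(values):
    """Rank values from 1..n; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical HB vs yardstick scores for five patients:
print(round(spearman([3, 1, 4, 2, 5], [2, 1, 4, 3, 5]), 2))
```

A coefficient near 1 would indicate that the two systems order treatment outcomes consistently, which is the question the study addresses.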

Relevance: 30.00%

Abstract:

Owing to its optimal nuclear properties, ready availability, low cost and favourable dosimetry, (99m)Tc continues to be the ideal radioisotope for medical-imaging applications. Bifunctional chelators based on a tetraamine framework exhibit facile complexation with Tc(V)O(2) to form monocationic species with high in vivo stability and significant hydrophilicity, which leads to favourable pharmacokinetics. The synthesis of a series of 1,4,8,11-tetraazaundecane derivatives (01-06) containing different functional groups at the 6-position for the conjugation of biomolecules and subsequent labelling with (99m)Tc is described herein. The chelator 01 was used as a starting material for the facile synthesis of chelators functionalised with OH (02), N(3) (04) and O-succinyl ester (05) groups. A straightforward and easy synthesis of the carboxyl-functionalised tetraamine-based chelator 06 was achieved by using inexpensive and commercially available starting materials. Conjugation of 06 to a potent bombesin-antagonist peptide and subsequent labelling with (99m)Tc afforded the radiotracer (99m)Tc-N4-BB-ANT, with radiolabelling yields of >97% at a specific activity of 37 GBq μmol(-1). An IC(50) value of (3.7±1.3) nM was obtained, which confirmed the high affinity of the conjugate for the gastrin-releasing-peptide receptor (GRPr). Immunofluorescence and calcium mobilisation assays confirmed the strong antagonist properties of the conjugate. In vivo pharmacokinetic studies of (99m)Tc-N4-BB-ANT showed high and specific uptake in PC3 xenografts and in other GRPr-positive organs. The tumour uptake was (22.5±2.6)% injected activity per gram (% IA g(-1)) at 1 h post injection (p.i.) and increased to (29.9±4.0)% IA g(-1) at 4 h p.i. The SPECT/computed tomography (CT) images showed high tumour uptake, a clear background and negligible radioactivity in the abdomen. These promising preclinical results warrant the potential candidature of (99m)Tc-N4-BB-ANT for clinical translation.

Relevance: 30.00%

Abstract:

Detailed evaluation and cost analysis of a cranial contrast-enhanced MRI (c-ceMRI) in outpatients, inpatients, patients in an intensive care unit and children under anesthesia.

Relevance: 30.00%

Abstract:

The aim of this study was to evaluate, using visual assessment, an experimental optical sensor measuring perpendicular reflection intensity (PRI) as an indicator of enamel caries lesion activity/inactivity. Forty teeth with either an active or an inactive enamel lesion were selected from a pool of extracted teeth. Each tooth was cut into halves, with a clinically sound half and a half with a non-cavitated enamel lesion. After gentle plaque removal, the teeth were kept moistened. The lesions were then photographed and a defined measuring site per lesion was chosen and indicated with an arrow on a printout. Independently, the chosen site was visually assessed for lesion activity, and its glossiness was measured with PRI assessment. Surface roughness (SR) was assessed with optical profilometry using a confocal microscope. Visual assessment and PRI were repeated after several weeks and a reliability analysis was performed. For enamel lesions visually scored as active versus inactive, significantly different values were obtained with both PRI and SR. PRI values of the clinically sound control surfaces were significantly different only from active lesions. Generally, inactive lesions had the same glossiness and the same roughness as the sound control surfaces. The reliabilities for visual assessment (κ = 0.89) and for PRI (ICC = 0.86) were high. It is concluded that, within the limits of this study, PRI can be regarded as a promising tool for quantitative enamel lesion activity assessment. There is scope and potential for the PRI device to be considerably improved for in vivo use.

Relevance: 30.00%

Abstract:

Data gathering, whether for event recognition or for monitoring applications, is the primary purpose of sensor-network deployments. In many cases, data is acquired periodically and autonomously and simply logged onto secondary storage (e.g. flash memory), either for delayed offline analysis or for on-demand burst transfer. Moreover, operational data such as connectivity information and node and network state is typically kept as well. Naturally, measurement and/or connectivity logging comes at a cost, and the space for doing so is limited. Finding a good representative model for the data and providing clever coding of the information, that is, data compression, may be a means to put the available space to its best use. In this paper, we explore the design space of data compression for wireless sensor and mesh networks by profiling common, publicly available algorithms. Several goals, such as low overhead in terms of memory used and compression time as well as a decent compression ratio, have to be well balanced in order to find a simple yet effective compression scheme.
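Profiling of the kind described, compression ratio against time overhead, can be prototyped with Python's stdlib codecs. The payload below is a stand-in for real sensor logs, and per-node memory cost (a key constraint on sensor hardware) is not captured by this desktop sketch:

```python
import bz2
import lzma
import time
import zlib

def profile(codecs, payload):
    """Measure compression ratio and wall-clock time for each codec on a
    sample payload, returning {name: (ratio, seconds)}."""
    results = {}
    for name, compress in codecs.items():
        t0 = time.perf_counter()
        out = compress(payload)
        results[name] = (len(payload) / len(out), time.perf_counter() - t0)
    return results

codecs = {"zlib": zlib.compress, "bz2": bz2.compress, "lzma": lzma.compress}
payload = b"temp=21.5;rssi=-70;seq=42\n" * 200  # repetitive sensor log, hypothetical format
for name, (ratio, secs) in profile(codecs, payload).items():
    print(f"{name}: ratio {ratio:.1f}x in {secs * 1e3:.2f} ms")
```

On highly repetitive telemetry all three codecs compress well, but their time (and, on real nodes, memory) costs differ, which is exactly the ratio-versus-overhead balance the paper explores.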

Relevance: 30.00%

Abstract:

Background: Abstractor training is a key element in creating valid and reliable data collection procedures. The choice between in-person and remote, or simultaneous and sequential, abstractor training has considerable consequences for time and resource utilization. We conducted a web-based (webinar) abstractor training session to standardize training across six individual Cancer Research Network (CRN) sites for a study of breast cancer treatment effects in older women (BOWII). The goals of this manuscript are to describe the training session, its participants, and the participants' evaluation of webinar technology for abstraction training.
Findings: A webinar was held for all six sites with the primary purpose of simultaneously training staff and ensuring consistent abstraction across sites. The training session involved sequential review of over 600 data elements outlined in the coding manual, in conjunction with the display of data entry fields in the study's electronic data collection system. Post-training evaluation was conducted via Survey Monkey©. Inter-rater reliability measures for abstractors within each site were conducted three months after the commencement of data collection. Ten of the 16 people who participated in the training completed the online survey. Almost all (90%) of the 10 trainees had previous medical record abstraction experience, and nearly two-thirds reported over 10 years of experience. Half of the respondents had previously participated in a webinar, among whom three had participated in a webinar for training purposes. All rated the knowledge and information delivered through the webinar as useful and reported that it adequately prepared them for data collection. Moreover, all participants would recommend this platform for multi-site abstraction training. Consistent with the participant-reported training effectiveness, inter-rater agreement for data collection within sites ranged from 89 to 98%, with a weighted average of 95% agreement across sites.
Conclusions: Conducting training via web-based technology was an acceptable and effective approach to standardizing medical record review across multiple sites for this group of experienced abstractors. Given the substantial time and cost savings achieved with the webinar, coupled with the participants' positive evaluation of the training session, researchers should consider this instructional method as part of training efforts to ensure high-quality data collection in multi-site studies.

Relevance: 30.00%

Abstract:

Objectives: To compare the use of pair-wise meta-analysis methods with multiple treatment comparison (MTC) methods in evidence-based health-care evaluation, for estimating the effectiveness and cost-effectiveness of alternative health-care interventions on the basis of the available evidence.
Methods: Pair-wise meta-analysis and more complex evidence syntheses incorporating an MTC component are applied to three examples: 1) the clinical effectiveness of interventions for preventing strokes in people with atrial fibrillation; 2) the clinical and cost-effectiveness of drug-eluting stents in percutaneous coronary intervention in patients with coronary artery disease; and 3) the clinical and cost-effectiveness of neuraminidase inhibitors in the treatment of influenza. We compare the two synthesis approaches with respect to the assumptions made, the empirical estimates produced and the conclusions drawn.
Results: The difference between the point estimates of effectiveness produced by the pair-wise and MTC approaches was generally unpredictable, sometimes agreeing closely while in other instances differing considerably. In all three examples, the MTC approach allowed the inclusion of randomized controlled trial evidence ignored in the pair-wise meta-analysis approach. This generally increased the precision of the effectiveness estimates from the MTC model.
Conclusions: The MTC approach to synthesis allows the evidence base on clinical effectiveness to be treated as a coherent whole, includes more data, and sometimes relaxes the assumptions made in the pair-wise approaches. However, MTC models are necessarily more complex than those developed for pair-wise meta-analysis and could thus be seen as less transparent. It is therefore important that model details and the assumptions made are carefully reported alongside the results.
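The pair-wise building block of such syntheses is fixed-effect inverse-variance pooling; an MTC gains precision essentially by admitting more trials (direct plus indirect evidence), which increases the total weight and shrinks the pooled variance. A sketch with invented effect estimates, not data from the three examples:

```python
def pooled_estimate(effects, variances):
    """Fixed-effect inverse-variance pooling: each trial's effect estimate is
    weighted by 1/variance; returns (pooled effect, pooled variance)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# Two hypothetical direct trials of an intervention (log odds ratios):
est_direct, var_direct = pooled_estimate([-0.30, -0.50], [0.04, 0.09])
# Admitting a third (indirect) comparison, as an MTC would, adds weight:
est_all, var_all = pooled_estimate([-0.30, -0.50, -0.40], [0.04, 0.09, 0.06])
print(f"direct only: {est_direct:.3f} (var {var_direct:.4f})")
print(f"with extra evidence: {est_all:.3f} (var {var_all:.4f})")
```

The pooled variance always decreases when a trial is added, which mirrors the precision gain reported for the MTC models; the point estimate, however, can shift unpredictably, as the results above also note.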