936 results for performance assessment


Relevance: 30.00%

Abstract:

This paper reports the findings of research on the environmental performance of two case-study houses, a retrofit and a new build. The retrofit was completed to the Passivhaus standard, while the new build was completed to current Irish building regulations. Environmental performance of the retrofit and new build was measured using life-cycle assessment, examining the assembly, operational and end-of-life stages over life spans of 50 and 80 years. Using primary information, life-cycle assessment software and life-cycle assessment databases, the environmental impacts of each stage were modelled. The operational stage of both case studies was found to be the source of the most significant environmental damage, followed by the assembly and end-of-life stages respectively. The relative importance of the assembly and end-of-life stages decreased as the life span increased. The retrofit house studied outperformed the new build in the assembly and operational stages, whereas the new build performed better in the end-of-life stage; this comparison is, however, highly sensitive to the standards to which each house is completed. Operational energy savings pre- and post-retrofit were significant, indicating the future potential for adoption of high-quality retrofitting practices.
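The staged arithmetic behind such a comparison can be sketched briefly. The figures and function below are hypothetical placeholders, not values from the study; the sketch simply illustrates why the relative weight of the assembly and end-of-life stages falls as the life span grows.

```python
def whole_life_impact(assembly, operational_per_year, end_of_life, life_span_years):
    """Return total life-cycle impact and each stage's share of it."""
    operational = operational_per_year * life_span_years
    total = assembly + operational + end_of_life
    shares = {
        "assembly": assembly / total,
        "operational": operational / total,
        "end_of_life": end_of_life / total,
    }
    return total, shares

for years in (50, 80):
    total, shares = whole_life_impact(
        assembly=40_000,             # hypothetical embodied (assembly) impact, kg CO2-eq
        operational_per_year=1_500,  # hypothetical annual operational impact
        end_of_life=5_000,           # hypothetical demolition/disposal impact
        life_span_years=years,
    )
    print(years, round(total), {k: round(v, 2) for k, v in shares.items()})
```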

Relevance: 30.00%

Abstract:

This article addresses the relative absence of class-based analysis in theatre and performance studies, and suggests reconfiguring class as performance rather than, as it is traditionally conceived, as an identity predicated solely on economic stratification. It engages with the occlusion of class by the ascendancy of identity politics based on race, gender and sexuality, and by its attendant theoretical counterparts in deconstruction and post-structuralism, which became axiomatic as they displaced earlier methodologies to become hegemonic in the arts and humanities. The article proceeds to an assessment of the development of sociological approaches to theatre, particularly the legacy of Raymond Williams and Pierre Bourdieu. The argument concludes by applying an approach that reconfigures class as performance to the production of Declan Hughes's 2003 play Shiver, which dramatizes the consequences of the dot-com bubble of the late 1990s for ambitious members of the Irish middle class.

Relevance: 30.00%

Abstract:

A full-scale seven-storey in-situ advanced reinforced concrete building frame was constructed in the Building Research Establishment's Cardington laboratory encompassing a range of different concrete mixes and construction techniques. This provided an opportunity to use in-situ non-destructive test methods, namely Lok and CAPO tests, on a systematic basis during the construction of the building. They were used in conjunction with both standard and temperature-matched cube specimens to assess their practicality and their individual capabilities under field conditions. Results have been analysed and presented to enable comparisons of the performance of the individual test methods employed.
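In-situ pull-out results of this kind are usually interpreted through a correlation curve against companion cube strengths. The sketch below illustrates that step with made-up paired values, not data from the Cardington frame; numpy.polyfit supplies the usual linear fit.

```python
import numpy as np

# Hypothetical paired measurements: in-situ pull-out force (kN) from Lok/CAPO
# tests and the compressive strength (MPa) of companion cube specimens.
pullout_force = np.array([25.0, 28.5, 31.0, 34.2, 38.1, 41.5])
cube_strength = np.array([30.0, 34.0, 38.5, 42.0, 47.5, 52.0])

# Fit the usual linear correlation curve and report its quality.
slope, intercept = np.polyfit(pullout_force, cube_strength, 1)
r = np.corrcoef(pullout_force, cube_strength)[0, 1]
print(f"strength ~= {slope:.2f} * force + {intercept:.2f}  (r = {r:.3f})")
```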

Relevance: 30.00%

Abstract:

Off-design performance is now of key importance in the design of automotive turbocharger turbines. Because of automotive drive cycles, a turbine that can extract more energy at high pressure ratios and lower rotational speeds is desirable. Typically a radial turbine provides peak efficiency at a U/C value of about 0.7, but at high pressure ratios and low rotational speeds the U/C value will be low and the rotor will experience high values of positive incidence at the inlet. The positive incidence causes high blade loading, resulting in additional tip leakage flow in the rotor as well as flow separation on the suction surface of the blade. An experimental assessment has been performed on a scaled automotive VGS (variable geometry system). Three different stator vane positions were analyzed: the minimum, 25% and maximum flow positions. The first tests established whether positioning the endwall clearance on the hub or shroud side of the stator vanes produced a different impact on turbine efficiency. Following this, a back-swept rotor was tested to establish the potential gains to be achieved during off-design operation. A single-passage CFD model of the test rig was developed and used to provide information on the flow features affecting performance in both the stator vanes and the turbine. Off-design performance was improved by implementing clearance on the hub side of the stator vanes rather than on the shroud side. Through CFD analysis and tests, it was seen that two leakage vortices form, one at the leading edge and one after the spindle of the stator vane. The vortices affect the flow angle at the inlet to the rotor in the hub region, shifting it to more negative values of incidence, which is beneficial at off-design conditions but detrimental at the design point. The back-swept rotor was tested with the hub-side stator vane clearance configuration. The efficiency and mass flow rate (MFR) were increased at the minimum and 25% stator vane positions; at the design point, the efficiency and MFR were decreased. The CFD investigation showed that the incidence angle was improved at off-design conditions for the back-swept rotor. This reduction in the positive incidence angle, along with the improvement caused by the stator vane tip leakage flow, reduced flow separation on the suction surface of the rotor. At the design point, both the tip leakage flow of the stator vanes and the back-swept blade angle caused flow separation on the pressure surface of the rotor, resulting in additional blockage at the throat of the rotor and reducing MFR and efficiency.
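The U/C value quoted above is the blade speed ratio: rotor tip speed U over the isentropic spouting velocity C obtained from the stage pressure ratio. A small sketch, assuming ideal-gas exhaust properties and using illustrative numbers rather than rig data, is shown below.

```python
from math import pi, sqrt

def blade_speed_ratio(rpm, tip_diameter_m, inlet_temp_K, expansion_ratio,
                      cp=1148.0, gamma=1.33):
    """U/C for a radial turbine stage, assuming ideal-gas exhaust properties."""
    U = pi * tip_diameter_m * rpm / 60.0  # rotor tip speed, m/s
    # Isentropic spouting velocity from the total-to-static expansion ratio.
    C = sqrt(2.0 * cp * inlet_temp_K *
             (1.0 - expansion_ratio ** (-(gamma - 1.0) / gamma)))
    return U / C

# Near design: high speed, moderate expansion ratio -> U/C close to 0.7.
print(round(blade_speed_ratio(150_000, 0.050, 873.0, 1.8), 2))
# Off-design: low speed, high expansion ratio -> low U/C and positive incidence.
print(round(blade_speed_ratio(60_000, 0.050, 873.0, 3.0), 2))
```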

Relevance: 30.00%

Abstract:

Utilising cameras as a means to survey the surrounding environment is becoming increasingly popular in a number of different research areas and applications. Central to using camera sensors as input to a vision system, is the need to be able to manipulate and process the information captured in these images. One such application, is the use of cameras to monitor the quality of airport landing lighting at aerodromes where a camera is placed inside an aircraft and used to record images of the lighting pattern during the landing phase of a flight. The images are processed to determine a performance metric. This requires the development of custom software for the localisation and identification of luminaires within the image data. However, because of the necessity to keep airport operations functioning as efficiently as possible, it is difficult to collect enough image data to develop, test and validate any developed software. In this paper, we present a technique to model a virtual landing lighting pattern. A mathematical model is postulated which represents the glide path of the aircraft including random deviations from the expected path. A morphological method has been developed to localise and track the luminaires under different operating conditions. © 2011 IEEE.
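A rough sketch of the kind of morphological localisation step described above is given here using OpenCV. The thresholding choice, kernel size and file name are assumptions for illustration, not the authors' implementation.

```python
import cv2

def localise_luminaires(image_path, min_area=4):
    """Return centroids of bright blobs (candidate luminaires) in a greyscale frame."""
    grey = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

    # Luminaires appear as small bright blobs: threshold (Otsu), then apply a
    # morphological opening to suppress isolated noise pixels.
    _, binary = cv2.threshold(grey, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

    # Connected-component analysis gives one centroid per remaining blob.
    n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(cleaned)
    return [tuple(centroids[i]) for i in range(1, n_labels)   # label 0 is background
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

# centres = localise_luminaires("approach_frame_0001.png")  # hypothetical file name
```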

Relevance: 30.00%

Abstract:

High effectiveness and leanness of modern supply chains (SCs) increase their vulnerability, i.e. susceptibility to disturbances reflected in non-robust SC performances. Both the SC management literature and SC professionals indicate the need for the development of SC vulnerability assessment tools. In this article, a new method for vulnerability assessment, the VULA method, is presented. The VULA method helps to identify how much a company would underperform on a specific Key Performance Indicator in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision about whether process redesign is appropriate and what kind of redesign strategies should be used in order to increase the SC's robustness. The applicability of the VULA method is demonstrated in the context of a meat SC using discrete-event simulation to conduct the performance analysis.
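A bare-bones sketch of the kind of indicators the VULA method reports (how far a KPI falls short, how often, and for how long) is given below. The threshold logic and example numbers are illustrative assumptions, not the published method.

```python
import numpy as np

def vulnerability_summary(kpi_series, norm, tolerance=0.05):
    """Shortfall size, frequency and duration of under-performance on one KPI."""
    kpi = np.asarray(kpi_series, dtype=float)
    floor = norm * (1.0 - tolerance)      # lowest still-acceptable KPI value
    below = kpi < floor

    # Magnitude: average shortfall while under-performing.
    mean_shortfall = float(np.mean(floor - kpi[below])) if below.any() else 0.0

    # Frequency and duration: runs of consecutive under-performing periods.
    runs, current = [], 0
    for flag in below:
        if flag:
            current += 1
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)

    return {"episodes": len(runs),
            "mean_duration": float(np.mean(runs)) if runs else 0.0,
            "mean_shortfall": mean_shortfall}

# Hypothetical daily service-level KPI assessed against a norm of 0.98:
print(vulnerability_summary([0.99, 0.97, 0.90, 0.88, 0.99, 0.92, 0.99], norm=0.98))
```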

Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the impedance cardiogram recorded by an automated external defibrillator during cardiac arrest, with a view to facilitating emergency care by lay persons. Lay persons are poor at emergency pulse checks (sensitivity 84%, specificity 36%); guidelines recommend that they should not be performed. The impedance cardiogram (dZ/dt) is used to indicate stroke volume. Can an impedance cardiogram algorithm in a defibrillator rapidly determine circulatory arrest and facilitate prompt initiation of external cardiac massage?

DESIGN: Clinical study.

SETTING: University hospital.

PATIENTS: In phase 1, patients attended for myocardial perfusion imaging. In phase 2, patients were recruited during cardiac arrest; this group included non-arrest controls.

INTERVENTIONS: The impedance cardiogram was recorded through defibrillator/electrocardiographic pads oriented in the standard cardiac arrest position.

MEASUREMENTS AND MAIN RESULTS: Phase 1: Stroke volumes from gated myocardial perfusion imaging scans were correlated with parameters from the impedance cardiogram system (dZ/dt(max) and the peak amplitude of the Fast Fourier Transform of dZ/dt between 1.5 Hz and 4.5 Hz). Multivariate analysis was performed to fit stroke volumes from gated myocardial perfusion imaging scans with linear and quadratic terms for dZ/dt(max) and the Fast Fourier Transform, to identify significant parameters for incorporation into a cardiac arrest diagnostic algorithm. The square of the peak amplitude of the Fast Fourier Transform of dZ/dt was the best predictor of reduction in stroke volumes from gated myocardial perfusion imaging scans (range = 33-85 mL; p = .016). Having established that the two-pad impedance cardiogram system could detect differences in stroke volumes from gated myocardial perfusion imaging scans, we assessed its performance in diagnosing cardiac arrest. Phase 2: The impedance cardiogram was recorded in 132 "cardiac arrest" patients (53 training, 79 validation) and 97 controls (47 training, 50 validation): the diagnostic algorithm indicated cardiac arrest with sensitivities and specificities (+/- exact 95% confidence intervals) of 89.1% (85.4-92.1) and 99.6% (99.4-99.7) in training, and 81.1% (77.6-84.3) and 97% (96.7-97.4) in validation.

CONCLUSIONS: The impedance cardiogram algorithm is a significant marker of circulatory collapse. Automated defibrillators with an integrated impedance cardiogram could improve emergency care by lay persons, enabling rapid and appropriate initiation of external cardiac massage.
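The spectral feature described in the results (the squared peak FFT amplitude of dZ/dt between 1.5 and 4.5 Hz) can be computed along the following lines. This is a sketch only; the windowing, filtering and decision thresholds of the actual diagnostic algorithm are not reproduced, and the test signals are synthetic.

```python
import numpy as np

def icg_band_peak_squared(dzdt, fs, f_low=1.5, f_high=4.5):
    """Square of the peak FFT amplitude of dZ/dt within a frequency band."""
    dzdt = np.asarray(dzdt, dtype=float)
    spectrum = np.abs(np.fft.rfft(dzdt - dzdt.mean()))
    freqs = np.fft.rfftfreq(dzdt.size, d=1.0 / fs)
    band = (freqs >= f_low) & (freqs <= f_high)
    return float(spectrum[band].max() ** 2)

# Synthetic 10 s segments at 100 Hz: a 2 Hz "cardiac" oscillation sits inside
# the band and yields a far larger feature value than low-amplitude noise alone.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
with_output = 0.5 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)
no_output = 0.05 * np.random.randn(t.size)
print(icg_band_peak_squared(with_output, fs), icg_band_peak_squared(no_output, fs))
```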

Relevance: 30.00%

Abstract:

Background: Neuropsychological deficits have been reported in association with first-episode psychosis (FEP). Reductions in grey matter (GM) volumes have been documented in FEP subjects compared to healthy controls. However, the possible inter-relationship between the findings of these two lines of research has scarcely been investigated.

Objective: To investigate the relationship between neuropsychological deficits and GM volume abnormalities in a population-based sample of FEP patients compared to healthy controls from the same geographical area.

Methods: FEP patients (n = 88) and control subjects (n = 86) were evaluated by neuropsychological assessment (Controlled Oral Word Association Test, forward and backward digit span tests) and magnetic resonance imaging using voxel-based morphometry.

Results: Single-group analyses showed that prefrontal and temporo-parietal GM volumes correlated significantly (p < 0.05, corrected) with cognitive performance in FEP patients. A similar pattern of direct correlations between neocortical GM volumes and cognitive impairment was seen in the schizophrenia subgroup (n = 48). In the control group, cognitive performance was directly correlated with GM volume in the right dorsal anterior cingulate cortex and inversely correlated with parahippocampal gyral volumes bilaterally. Interaction analyses with "group status" as a predictor variable showed significantly greater positive correlation within the left inferior prefrontal cortex (BA46) in the FEP group relative to controls, and significantly greater negative correlation within the left parahippocampal gyrus in the control group relative to FEP patients.

Conclusion: Our results indicate that cognitive deficits are directly related to brain volume abnormalities in frontal and temporo-parietal cortices in FEP subjects, most specifically in inferior portions of the dorsolateral prefrontal cortex. (C) 2009 Elsevier B.V. All rights reserved.
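The correlational core of such an analysis can be sketched in a few lines: a voxelwise Pearson correlation between grey-matter values and a cognitive score across subjects. This is a bare-bones illustration under simplifying assumptions; the study itself used voxel-based morphometry with appropriate covariates and corrected statistics, and the variable names below are hypothetical.

```python
import numpy as np

def voxelwise_correlation(gm_maps, scores):
    """Pearson r between grey-matter value and a cognitive score at each voxel.

    gm_maps: (n_subjects, n_voxels) array of preprocessed GM values.
    scores:  (n_subjects,) array of cognitive performance scores.
    """
    gm = np.asarray(gm_maps, dtype=float)
    s = np.asarray(scores, dtype=float)
    gm_c = gm - gm.mean(axis=0)            # centre each voxel across subjects
    s_c = s - s.mean()                     # centre the cognitive scores
    num = gm_c.T @ s_c
    den = np.sqrt((gm_c ** 2).sum(axis=0) * (s_c ** 2).sum()) + 1e-12
    return num / den

# r_map = voxelwise_correlation(gm_maps, verbal_fluency_scores)  # hypothetical inputs
```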

Relevance: 30.00%

Abstract:

A plethora of studies have described the in vitro assessment of dissolving microneedle (MN) arrays for enhanced transdermal drug delivery, utilising a wide variety of model membranes as representations of the skin barrier. However, to date, no discussion has taken place with regard to the choice of model skin membrane and the impact this may have on the evaluation of MN performance. In this study, we have, for the first time, critically assessed the most common types of in vitro skin permeation models, namely a synthetic hydrophobic membrane (Silescol®, 75 µm) and neonatal porcine skin of definable thickness (300-350 µm and 700-750 µm), for evaluating the performance of drug-loaded dissolving poly(methyl vinyl ether-co-maleic acid) (PMVE/MA) MN arrays. It was found that the choice of in vitro skin model had a significant effect on the permeation of a wide range of small hydrophilic molecules released from dissolving MNs. For example, when Silescol® was used as the model membrane, the cumulative percentage permeation of methylene blue 24 h after the application of dissolvable MNs was found to be only approximately 3.7% of the total methylene blue loaded into the MN device. In comparison, when dermatomed and full-thickness neonatal porcine skin were used as skin models, approximately 67.4% and 47.5%, respectively, of the methylene blue loaded into the MN device was delivered across the skin 24 h after the application of the MN arrays. The application of methylene blue-loaded MN arrays in a rat model in vivo revealed that the extent of MN-mediated percutaneous delivery achieved was most similar to that predicted by the in vitro investigations employing dermatomed neonatal porcine skin (300-350 µm) as the model skin membrane. On the basis of these results, a wider discussion within the MN community will be necessary to standardise the experimental protocols used for the evaluation and comparison of MN devices.
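The cumulative-percentage figures quoted above come from the usual receptor-compartment book-keeping in Franz-cell style permeation experiments. The sketch below assumes that convention (sample withdrawal with back-correction) and uses made-up numbers; it is not the paper's protocol.

```python
def cumulative_permeation_percent(sample_conc, receptor_volume_ml, sample_volume_ml, dose_loaded_ug):
    """Cumulative % of the loaded dose permeated at each sampling point."""
    cumulative, removed_ug = [], 0.0
    for conc in sample_conc:                              # µg/mL at each time point
        amount_ug = conc * receptor_volume_ml + removed_ug
        cumulative.append(100.0 * amount_ug / dose_loaded_ug)
        removed_ug += conc * sample_volume_ml             # drug withdrawn with this sample
    return cumulative

# Hypothetical methylene-blue run: 5 mL receptor, 0.5 mL samples, 500 µg loaded.
print(cumulative_permeation_percent([2.0, 8.0, 20.0, 35.0], 5.0, 0.5, 500.0))
```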

Relevance: 30.00%

Abstract:

OBJECTIVES: We assessed the effectiveness of ToT (transfer of training) from VR laparoscopic simulation training in two studies; in the second study, we also assessed the TER (transfer effectiveness ratio). ToT is a detectable performance improvement between equivalent groups, and TER is the observed percentage performance difference between two matched groups carrying out the same task, with one group pretrained on VR simulation. Concordance between simulated and in vivo procedure performance was also assessed. DESIGN: Prospective, randomized and blinded. PARTICIPANTS: In Study 1, experienced laparoscopic surgeons (n = 195), and in Study 2, laparoscopic novices (n = 30) were randomized either to train on VR simulation before completing an equivalent real-world task or to complete the real-world task only. RESULTS: Experienced laparoscopic surgeons and novices who trained on the simulator performed significantly better than their controls, thus demonstrating ToT. Their performance showed a TER of between 7% and 42% from the virtual to the real tasks. Simulation training had its greatest impact on procedural error reduction in both studies (32-42%). The correlation observed between VR and real-world task performance was r > 0.96 (Study 2). CONCLUSIONS: VR simulation training offers a powerful and effective platform for training safer skills.
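Following the paper's description of TER as a percentage performance difference between matched groups, a minimal calculation might look like the sketch below; the scoring metric and example numbers are assumptions for illustration.

```python
def transfer_effectiveness_ratio(trained_scores, control_scores):
    """Percentage performance difference between a VR-pretrained group and controls."""
    trained = sum(trained_scores) / len(trained_scores)
    control = sum(control_scores) / len(control_scores)
    return 100.0 * (trained - control) / control

# Hypothetical mean real-task scores (higher is better): 71 vs 55 gives ~29%.
print(round(transfer_effectiveness_ratio([68, 74, 71], [52, 58, 55]), 1))
```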

Relevance: 30.00%

Abstract:

The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQ, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature, we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or disturbance-impact-reducing redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies, repeating the VULA method for the KPIs defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas: SC robustness and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as the basis of the VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.
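As a closing illustration of the simulation-based assessment step, the toy model below generates a daily fill-rate KPI under random capacity disturbances; the resulting series could then be scored with VULA-style indicators such as those sketched earlier. All probabilities and capacities are arbitrary assumptions, not parameters from the case studies.

```python
import random

def simulate_daily_fill_rate(days=365, capacity=100, demand=90,
                             p_disturbance=0.03, disruption_days=(2, 5), seed=1):
    """Toy simulation of a daily fill-rate KPI under random capacity disturbances."""
    random.seed(seed)
    series, remaining = [], 0
    for _ in range(days):
        if remaining == 0 and random.random() < p_disturbance:
            remaining = random.randint(*disruption_days)    # a disturbance starts
        todays_capacity = capacity // 2 if remaining else capacity
        remaining = max(0, remaining - 1)
        series.append(min(1.0, todays_capacity / demand))   # fill rate for the day
    return series

kpi = simulate_daily_fill_rate()
print(sum(value < 0.95 for value in kpi), "days below a 95% fill-rate norm")
```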

Relevance: 30.00%

Abstract:

Prostatic intraepithelial neoplasia (PIN) diagnosis and grading are affected by uncertainties which arise from the fact that almost all knowledge of PIN histopathology is expressed in concepts, descriptive linguistic terms, and words. A Bayesian belief network (BBN) was therefore used to reduce the problem of uncertainty in diagnostic clue assessment, while still considering the dependencies between elements in the reasoning sequence. A shallow network was used with an open-tree topology, with eight first-level descendant nodes for the diagnostic clues (evidence nodes), each independently linked by a conditional probability matrix to a root node containing the diagnostic alternatives (decision node). One of the evidence nodes was based on the tissue architecture and the others were based on cell features. The system was designed to be interactive, in that the histopathologist entered evidence into the network in the form of likelihood ratios for outcomes at each evidence node. The efficiency of the network was tested on a series of 110 prostate specimens, subdivided as follows: 22 cases of non-neoplastic prostate or benign prostatic tissue (NP), 22 PINs of low grade (PINlow), 22 PINs of high grade (PINhigh), 22 prostatic adenocarcinomas with cribriform pattern (PACcri), and 22 prostatic adenocarcinomas with large acinar pattern (PAClgac). The results obtained in the benign and malignant categories showed that the belief for the diagnostic alternatives is very high, the values being in general more than 0.8 and often close to 1.0. When considering the PIN lesions, the network classified and graded most of the cases with high certainty. However, there were some cases which showed values less than 0.8 (13 cases out of 44), thus indicating that there are situations in which the feature changes are intermediate between contiguous categories or grades. Discrepancy between morphological grading and the BBN results was observed in four out of 44 PIN cases: one PINlow was classified as PINhigh and three PINhigh were classified as PINlow. In conclusion, the network can grade PIN lesions and differentiate them from other prostate lesions with certainty. In particular, it offers a descriptive classifier which is readily implemented and which allows the use of linguistic, fuzzy variables.
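The belief propagation in such a shallow, open-tree network amounts to multiplying a prior over the diagnostic alternatives by one likelihood per evidence node and renormalising. The sketch below illustrates that update; the node names, priors and likelihoods are illustrative, not the published conditional probability matrices.

```python
def posterior_beliefs(priors, evidence_likelihoods):
    """Belief update for a root decision node with independent evidence nodes."""
    unnormalised = {}
    for diagnosis, prior in priors.items():
        belief = prior
        for node in evidence_likelihoods:        # one likelihood dict per evidence node
            belief *= node[diagnosis]
        unnormalised[diagnosis] = belief
    total = sum(unnormalised.values())
    return {d: b / total for d, b in unnormalised.items()}

# Two illustrative evidence nodes (tissue architecture and one cell feature)
# over three of the diagnostic alternatives.
priors = {"NP": 0.4, "PINlow": 0.3, "PINhigh": 0.3}
evidence = [
    {"NP": 0.1, "PINlow": 0.4, "PINhigh": 0.8},
    {"NP": 0.2, "PINlow": 0.5, "PINhigh": 0.9},
]
print(posterior_beliefs(priors, evidence))
```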

Relevance: 30.00%

Abstract:

Background: There are many issues regarding the use of real patients in objective structured clinical examinations (OSCEs). In dermatology OSCE stations, standardised patients (SPs) with clinical photographs are often used. Temporary transfer tattoos can potentially simulate skin lesions when applied to an SP. This study aims to appraise the use of temporary malignant melanoma tattoos within an OSCE framework. Method: Within an 11-station OSCE, a temporary malignant melanoma tattoo was developed and applied to SPs in a 'skin lesion' OSCE station. A questionnaire captured the opinions of the candidate, SP and examiners, and the degree of perceived realism of each station was determined. Standard post hoc OSCE analysis determined the psychometric reliability of the stations. Results: The response rates were 95.9 per cent of candidates and 100 per cent of the examiners and SPs. The 'skin lesion' station achieved the highest realism score compared with other stations: 89.0 per cent of candidates felt that the skin lesion appeared realistic; only 28 per cent of candidates had ever seen a melanoma before in training. The psychometric performance of the melanoma station was comparable with, and in many instances better than, other OSCE stations. Discussion: Transfer tattoo technology facilitates a realistic dermatology OSCE station encounter. Temporary tattoos, alongside trained SPs, provide an authentic, standardised and reliable experience, allowing the assessment of integrated dermatology clinical skills.

Relevance: 30.00%

Abstract:

The aim of this study was to assess a novel semisolid material as a potential topical drug delivery system for acute laceration. The objectives were to correlate physical characterization data using rheologic studies and to compare with clinical assessment of performance in an emergency department (ED).

Relevance: 30.00%

Abstract:

Multiplexed immunochemical detection platforms offer the potential to decrease labour demands, increase sample throughput and decrease overall time to result. A prototype four-channel multiplexed high-throughput surface plasmon resonance biosensor was previously developed for the detection of food-related contaminants. A study focused on determining the instrument's performance characteristics was undertaken. This was followed by the development of a multiplexed assay for four high-molecular-weight proteins. The protein levels were simultaneously evaluated in serum samples from 10-week-old veal calves (n = 24) using multiple sample preparation methods. Each of the biosensor's four channels was shown to be independent of the others, and the assay produced multiplexed within-run repeatability (n = 6) ranging from 2.0 to 6.7% CV for the four tested proteins, whilst between-run reproducibility (n = 4) ranged from 1.5 to 8.9% CV. Four calibration curves were successfully constructed before serum sample preparation was optimised for each protein. Multiplexed concentration analysis was successfully performed on four channels, revealing that each protein's concentration was consistent across the twenty-four tested animals. Signal reproducibility (n > 19) in a further long-term study revealed coefficients of variation ranging from 1.1% to 7.3% and showed that the multiplexed assay was stable for at least 480 cycles. These findings indicate that the performance characteristics fall within the range of previously published data for singleplex optical biosensors and that the multiplexing biosensor is fit for purpose for simultaneous concentration analysis in many different types of applications, such as the multiplexed detection of markers of growth-promoter abuse and of residues of concern in food safety. © 2013 Elsevier B.V.
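The repeatability and reproducibility figures quoted are coefficients of variation; the calculation is sketched below with made-up replicate responses rather than instrument data.

```python
import statistics

def percent_cv(values):
    """Coefficient of variation (%CV) of replicate biosensor responses."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate SPR responses (arbitrary response units):
within_run = [102.1, 99.8, 101.5, 103.0, 100.2, 98.9]    # n = 6 within one run
between_run_means = [101.0, 99.5, 102.3, 100.1]          # n = 4 run means

print(round(percent_cv(within_run), 1), round(percent_cv(between_run_means), 1))
```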