Abstract:
To evaluate an investment project in the competitive electricity market, several key factors affect the project's value: the present value the project could bring to the investor, the investor's possible future courses of action, and the project's management flexibility. The traditional net present value (NPV) criterion captures the present value of the project's future cash flows, but it fails to assess the value created by market uncertainty and management flexibility. In contrast to NPV, the real options approach (ROA) has the advantage of combining uncertainty and flexibility in the evaluation process. In this paper, a framework for using ROA to evaluate generation investment opportunities is proposed. Through a detailed case study, the proposed framework is compared with NPV and is shown to yield different results.
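As a point of reference for the NPV criterion discussed above, a minimal sketch of the calculation; the discount rate and cash flows are hypothetical, chosen only for illustration:

```python
def npv(rate, cashflows):
    """Discount a series of cash flows to present value.

    cashflows[0] is the initial outlay at t=0 (usually negative);
    cashflows[t] is the net cash flow received at the end of year t.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical generation project: 100 upfront, then 30/year for 5 years,
# discounted at 10% per year.
project = [-100] + [30] * 5
print(round(npv(0.10, project), 2))  # → 13.72
```

A positive NPV accepts the project under the traditional criterion; the ROA critique in the abstract is precisely that this single number ignores the option value of waiting or abandoning as uncertainty resolves.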
Abstract:
We conducted a 15-month feasibility study of telepaediatrics. A novel service was offered to two hospitals in Queensland (Mackay and Hervey Bay). We used data from all other hospitals throughout the state as the control group. Although both intervention hospitals were provided with the same service, the telepaediatric activity generated and the effect on admissions and outpatient activity were markedly different. There was a significant decrease in the number of patient admissions to Brisbane from the Mackay region. In addition, there was an increase in the number of Mackay patients treated locally (as outpatients). In contrast, little change was observed in Hervey Bay. We assessed whether the observed differences between the two hospitals were due to various factors which influenced the use of the telepaediatric service. These factors included the method of screening patients before transfer to the tertiary centre and the physical distance between each facility and the tertiary centre. We believe that the screening method used for patient referrals was the most important determinant of the use of the telepaediatric service.
Abstract:
Background Medicines reconciliation (identifying and maintaining an accurate list of a patient's current medications) should be undertaken at all transitions of care and be available to all patients. Objective A self-completion web survey of chief pharmacists (or equivalent) was conducted to evaluate medicines reconciliation levels in secondary care mental health organisations. Setting The survey was sent to secondary care mental health organisations in England, Scotland, Northern Ireland and Wales. Method The survey was launched via Bristol Online Surveys. Quantitative data were analysed using descriptive statistics and qualitative data were collected through respondents' free-text answers to specific questions. Main outcome measures To investigate how medicines reconciliation is delivered, to incorporate a clear description of the role of pharmacy staff and to identify areas of concern. Results Forty-two surveys (52 % response rate) were completed. Thirty-seven (88.1 %) organisations have a formal policy for medicines reconciliation with defined steps. Results show that the pharmacy team (pharmacists and pharmacy technicians) are the main professionals involved in medicines reconciliation, with doctors also frequently involved. Training procedures frequently include an induction by pharmacy for doctors, whilst the pharmacy team are generally trained by another member of pharmacy. Mental health organisations estimate that nearly 80 % of medicines reconciliation is carried out within 24 h of admission. A full medicines reconciliation is not carried out on patient transfer between mental health wards; instead, quicker and less exhaustive variations are implemented. 71.4 % of organisations estimate that pharmacy staff conduct daily medicines reconciliation for acute admission wards (Monday to Friday). However, only 38 % of organisations report that pharmacy staff reconcile patients' medication for other teams that admit from primary care.
Conclusion Most mental health organisations appear to be complying with NICE guidance on medicines reconciliation for their acute admission wards. However, medicines reconciliation is conducted less frequently on other units that admit from primary care and is rarely completed on transfer, where it differs significantly from reconciliation on admission. Formal training and competency assessments on medicines reconciliation should be considered, as current training varies and adherence to best practice is questionable.
Abstract:
The overall objective of this work was to compare the effect of pre-treatment and catalysts on the quality of liquid products from fast pyrolysis of biomass. This study investigated the upgrading of bio-oil in terms of its quality as a bio-fuel and/or source of chemicals. Bio-oil used directly as a biofuel for heat or power needs to be improved, particularly in terms of temperature sensitivity, oxygen content, chemical instability, solid content, and heating values. Chemicals produced from bio-oil need to meet product specifications for market acceptability. There were two main objectives in this research. The first was to examine the influence of pre-treatment of biomass on the fast pyrolysis process and liquid quality. The relationship between the method of pre-treatment of the biomass feedstock and fast pyrolysis oil quality was studied. The thermal decomposition behaviour of untreated and pretreated feedstocks was studied using TGA (thermogravimetric analysis) and Py-GC/MS (pyroprobe-gas chromatography/mass spectrometry). Laboratory scale reactors (100g/h, 300g/h, 1kg/h) were used to process untreated and pretreated feedstocks by fast pyrolysis. The second objective was to study the influence of numerous catalysts on fast pyrolysis liquids from wheat straw. The first step applied analytical pyrolysis (Py-GC/MS) to determine which catalysts had an effect on the fast pyrolysis liquid, in order to select catalysts for further laboratory fast pyrolysis. The effect of activation, temperature, and biomass pre-treatment on the catalysts was also investigated. Laboratory experiments were also conducted using the existing 300g/h fluidised bed reactor system with a secondary catalytic fixed bed reactor. The screening of catalysts showed that CoMo was a highly active catalyst, which particularly reduced the higher molecular weight products of fast pyrolysis. From these screening tests, the CoMo catalyst was selected for larger scale laboratory experiments.
Regarding the effect of pre-treatment on the fast pyrolysis process, pre-treatment had a significant effect on the thermal decomposition of the biomass, as well as on the composition of the pyrolysis products and the proportion of key components in the bio-oil. Torrefaction proved to have a mild influence on pyrolysis products compared with aquathermolysis and steam pre-treatment.
Abstract:
Bone marrow mesenchymal stem cells (MSCs) promote nerve growth and functional recovery in animal models of spinal cord injury (SCI) to varying degrees. The authors have tested high-content screening to examine the effects of MSC-conditioned medium (MSC-CM) on neurite outgrowth from the human neuroblastoma cell line SH-SY5Y and from explants of chick dorsal root ganglia (DRG). These analyses were compared with previously published methods that involved hand-tracing individual neurites. Both methods demonstrated that MSC-CM promoted neurite outgrowth. Both showed that the proportion of SH-SY5Y cells with neurites increased by ~200% in MSC-CM within 48 h, and that the number of neurites per SH-SY5Y cell was significantly increased in MSC-CM compared with control medium. For high-content screening, the analysis was performed within minutes, testing multiple samples of MSC-CM and in each case measuring >15,000 SH-SY5Y cells. In contrast, the manual measurement of neurite outgrowth from >200 SH-SY5Y cells in a single sample of MSC-CM took at least 1 h. High-content analysis provided additional measures of increased neurite branching in MSC-CM compared with control medium. MSC-CM was also found to stimulate neurite outgrowth in DRG explants using either method. The application of the high-content analysis was less well optimized for measuring neurite outgrowth from DRG explants than from SH-SY5Y cells.
Abstract:
A novel dissolution method was developed, suitable for powder mixtures, based on the USP basket apparatus. The baskets were modified such that the powder mixtures were retained within the baskets and not dispersed, a potential difficulty that may arise when using conventional USP basket and paddle apparatus. The advantages of this method were that the components of the mixtures were maintained in close proximity, maximizing any drug:excipient interaction and leading to more linear dissolution profiles. Two weakly acidic model drugs, ibuprofen and acetaminophen, and a selection of pharmaceutical excipients, including potential dissolution-enhancing alkalizing agents, were chosen for investigation. Dissolution profiles were obtained for simple physical mixtures. The f1 fit factor values, calculated using pure drug as the reference material, demonstrated a trend in line with expectations, with several dissolution enhancers apparent for both drugs. Also, the dissolution rates were linear over substantial parts of the profiles. For both drugs, a rank order comparison between the f1 fit factor and calculated dissolution rate, obtained from the linear section of the dissolution profile, demonstrated a correlation using a significance level of P=0.05. The method was proven to be suitable for discriminating between the effects of excipients on the dissolution of the model drugs. The method design produced dissolution profiles where the dissolution rate was linear for a substantial time, allowing determination of the dissolution rate without mathematical transformation of the data. This method may be suitable as a preliminary excipient-screening tool in the drug formulation development process.
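The f1 fit factor used above to compare dissolution profiles can be computed as in this minimal sketch (the standard f1 difference factor, with profiles compared at matched time points; the percentage-dissolved values are hypothetical, for illustration only):

```python
def f1_fit_factor(reference, test):
    """f1 difference factor between two dissolution profiles.

    reference and test are % dissolved at matched time points;
    f1 = 0 for identical profiles, and grows with the difference.
    """
    assert len(reference) == len(test), "profiles must share time points"
    return 100 * sum(abs(r - t) for r, t in zip(reference, test)) / sum(reference)

# Hypothetical profiles: pure drug (reference) vs drug + alkalizing excipient
pure_drug = [10, 25, 45, 70, 85]
with_excipient = [15, 35, 60, 82, 92]
print(round(f1_fit_factor(pure_drug, with_excipient), 1))  # → 20.9
```

Ranking excipient mixtures by this factor against the pure-drug reference is how the abstract's dissolution enhancers would surface: larger f1 here means a larger departure from the unenhanced profile.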
Abstract:
Recently, we introduced a new 'GLM-beamformer' technique for MEG analysis that enables accurate localisation of both phase-locked and non-phase-locked neuromagnetic effects, and their representation as statistical parametric maps (SPMs). This provides a useful framework for comparison of the full range of MEG responses with fMRI BOLD results. This paper reports a 'proof of principle' study using a simple visual paradigm (static checkerboard). The five subjects each underwent both MEG and fMRI paradigms. We demonstrate, for the first time, the presence of a sustained (DC) field in the visual cortex, and its co-localisation with the visual BOLD response. The GLM-beamformer analysis method is also used to investigate the main non-phase-locked oscillatory effects: an event-related desynchronisation (ERD) in the alpha band (8-13 Hz) and an event-related synchronisation (ERS) in the gamma band (55-70 Hz). We show, using SPMs and virtual electrode traces, the spatio-temporal covariance of these effects with the visual BOLD response. Comparisons between MEG and fMRI data sets generally focus on the relationship between the BOLD response and the transient evoked response. Here, we show that the stationary field and changes in oscillatory power are also important contributors to the BOLD response, and should be included in future studies on the relationship between neuronal activation and the haemodynamic response.
Abstract:
In recent work we have developed a novel variational inference method for partially observed systems governed by stochastic differential equations. In this paper we provide a comparison of the Variational Gaussian Process Smoother with an exact solution computed using a Hybrid Monte Carlo approach to path sampling, applied to a stochastic double well potential model. It is demonstrated that the variational smoother provides a very accurate estimate of the mean path, while the conditional variance is slightly underestimated. We conclude with some remarks on the advantages and disadvantages of the variational smoother.
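The stochastic double well potential model referred to above can be simulated by Euler-Maruyama discretisation; a minimal sketch, assuming the standard double-well potential V(x) = (1 - x^2)^2 and illustrative step size and noise level (the paper's actual parameters are not given here):

```python
import math
import random

def simulate_double_well(x0=0.0, dt=0.01, n_steps=1000, sigma=0.7, seed=1):
    """Euler-Maruyama sample path of dx = 4*x*(1 - x**2) dt + sigma dW.

    The drift is -V'(x) for the double-well potential V(x) = (1 - x**2)**2,
    which has stable wells at x = -1 and x = +1.
    """
    random.seed(seed)
    x, path = x0, [x0]
    for _ in range(n_steps):
        x += 4 * x * (1 - x ** 2) * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
        path.append(x)
    return path

path = simulate_double_well()
print(len(path))  # → 1001 states, including the initial condition
```

Paths like this, observed partially and with noise, are the inputs on which a variational smoother and an HMC path sampler can be compared.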
Abstract:
OBJECTIVE: To assess the effect of using different risk calculation tools on how general practitioners and practice nurses evaluate the risk of coronary heart disease with clinical data routinely available in patients' records. DESIGN: Subjective estimates of the risk of coronary heart disease and the results of four different methods of risk calculation were compared with each other and with a reference standard calculated with the Framingham equation; calculations were based on a sample of patients' records, randomly selected from groups at risk of coronary heart disease. SETTING: General practices in central England. PARTICIPANTS: 18 general practitioners and 18 practice nurses. MAIN OUTCOME MEASURES: Agreement of results of risk estimation and risk calculation with the reference calculation; agreement of general practitioners with practice nurses; sensitivity and specificity of the different methods of risk calculation in detecting patients at high or low risk of coronary heart disease. RESULTS: Only a minority of patients' records contained all of the risk factors required for the formal calculation of the risk of coronary heart disease (concentrations of high density lipoprotein (HDL) cholesterol were present in only 21%). Agreement of risk calculations with the reference standard was moderate (kappa=0.33-0.65 for both practice nurses and general practitioners, depending on the calculation tool), with a trend towards underestimation of risk. Moderate agreement was seen between the risks calculated by general practitioners and practice nurses for the same patients (kappa=0.47-0.58). The British charts gave the most sensitive results for risk of coronary heart disease (practice nurses 79%, general practitioners 80%) and the most specific results for practice nurses (100%), whereas the Sheffield table was the most specific method for general practitioners (89%).
CONCLUSIONS: Routine calculation of the risk of coronary heart disease in primary care is hampered by poor availability of data on risk factors. General practitioners and practice nurses are able to evaluate the risk of coronary heart disease with only moderate accuracy. Data about risk factors need to be collected systematically, to allow the use of the most appropriate calculation tools.
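The kappa statistic used above to quantify rater agreement can be computed as in this minimal sketch (Cohen's kappa for two raters; the high/low risk categorisations are hypothetical, for illustration only):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical risk categorisations of 8 patients by a GP and a practice nurse
gp    = ["high", "high", "low", "low", "high", "low", "low",  "high"]
nurse = ["high", "low",  "low", "low", "high", "low", "high", "high"]
print(round(cohens_kappa(gp, nurse), 2))  # → 0.5
```

Kappa of 0 would mean agreement no better than chance and 1 perfect agreement, so the 0.33-0.65 range reported above corresponds to the "moderate" label the authors use.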
Abstract:
Two contrasting multivariate statistical methods, viz., principal components analysis (PCA) and cluster analysis were applied to the study of neuropathological variations between cases of Alzheimer's disease (AD). To compare the two methods, 78 cases of AD were analyzed, each characterised by measurements of 47 neuropathological variables. Both methods of analysis revealed significant variations between AD cases. These variations were related primarily to differences in the distribution and abundance of senile plaques (SP) and neurofibrillary tangles (NFT) in the brain. Cluster analysis classified the majority of AD cases into five groups which could represent subtypes of AD. However, PCA suggested that variation between cases was more continuous with no distinct subtypes. Hence, PCA may be a more appropriate method than cluster analysis in the study of neuropathological variations between AD cases.
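The contrast drawn above (discrete clusters versus continuous variation) can be examined by projecting cases onto their leading principal components and inspecting the score plot for gaps; a minimal sketch, with randomly generated data standing in for the 78 x 47 matrix of neuropathological measurements:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project rows of X onto the leading principal components,
    via SVD of the mean-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Random stand-in data: 78 cases x 47 variables, as in the study design
rng = np.random.default_rng(0)
X = rng.normal(size=(78, 47))
scores = pca_scores(X)
print(scores.shape)  # → (78, 2): one 2-D coordinate per case
```

If cases formed the five subtypes suggested by cluster analysis, the score plot would show separated groups; a continuous cloud, as PCA suggested for these AD cases, shows no such gaps.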
Abstract:
Counts of Pick bodies (PB), Pick cells (PC), senile plaques (SP) and neurofibrillary tangles (NFT) were made in the frontal and temporal cortex from patients with Pick's disease (PD). Lesions were stained histologically with hematoxylin and eosin (HE) and the Bielschowsky silver impregnation method and labeled immunohistochemically with antibodies raised to ubiquitin and tau. The greatest numbers of PB were revealed by immunohistochemistry. Counts of PB revealed by ubiquitin and tau were highly positively correlated which suggested that the two antibodies recognized virtually identical populations of PB. The greatest numbers of PC were revealed by HE followed by the anti-ubiquitin antibody. However, the correlation between counts was poor, suggesting that HE and ubiquitin revealed different populations of PC. The greatest numbers of SP and NFT were revealed by the Bielschowsky method indicating the presence of Alzheimer-type lesions not revealed by the immunohistochemistry. In addition, more NFT were revealed by the anti-ubiquitin compared with the anti-tau antibody. The data suggested that in PD: (i) the anti-ubiquitin and anti-tau antibodies were equally effective at labeling PB; (ii) both HE and anti-ubiquitin should be used to quantitate PC; and (iii) the Bielschowsky method should be used to quantitate SP and NFT.
Abstract:
Objective: Qualitative research is increasingly valued as part of the evidence for policy and practice, but how it should be appraised is contested. Various appraisal methods, including checklists and other structured approaches, have been proposed but rarely evaluated. We aimed to compare three methods for appraising qualitative research papers that were candidates for inclusion in a systematic review of evidence on support for breast-feeding. Method: A sample of 12 research papers on support for breast-feeding was appraised by six qualitative reviewers using three appraisal methods: unprompted judgement, based on expert opinion; a UK Cabinet Office quality framework; and CASP, a Critical Appraisal Skills Programme tool. Papers were assigned, following appraisals, to 1 of 5 categories, which were dichotomized to indicate whether or not papers should be included in a systematic review. Patterns of agreement in categorization of papers were assessed quantitatively using κ statistics, and qualitatively using cross-case analysis. Results: Agreement in categorizing papers across the three methods was slight (κ=0.13; 95% CI 0.06-0.24). Structured approaches did not appear to yield higher agreement than unprompted judgement. Qualitative analysis revealed reviewers' dilemmas in deciding between the potential impact of findings and the quality of the research execution or reporting practice. Structured instruments appeared to make reviewers more explicit about the reasons for their judgements. Conclusions: Structured approaches may not produce greater consistency of judgements about whether to include qualitative papers in a systematic review. Future research should address how appraisals of qualitative research should be incorporated in systematic reviews.
Abstract:
We present experimental studies and numerical modeling based on a combination of the Bidirectional Beam Propagation Method and Finite Element Modeling that completely describes the wavelength spectra of point-by-point femtosecond laser inscribed fiber Bragg gratings, showing excellent agreement with experiment. We have investigated the dependence of different spectral parameters such as insertion loss, all dominant cladding and ghost modes and their shape relative to the position of the fiber Bragg grating in the core of the fiber. Our model is validated by comparing model predictions with experimental data and allows for predictive modeling of the gratings. We expand our analysis to more complicated structures, where we introduce symmetry breaking; this highlights the importance of centered gratings and how maintaining symmetry contributes to the overall spectral quality of the inscribed Bragg gratings. Finally, the numerical modeling is applied to superstructure gratings and a comparison with experimental results reveals a capability for dealing with complex grating structures that can be designed with particular wavelength characteristics.
Abstract:
Lipidome profiling of fluids and tissues is a growing field as the role of lipids as signaling molecules is increasingly understood, and it relies on an effective and representative extraction of the lipids present. A number of solvent systems suitable for lipid extraction are in common use, though no comprehensive investigation of their effectiveness across multiple lipid classes has been carried out. To address this, human LDL from normolipidemic volunteers was used to evaluate five different solvent extraction protocols [Folch; Bligh and Dyer; acidified Bligh and Dyer; methanol (MeOH)-tert-butyl methyl ether (TBME); and hexane-isopropanol], and the extracted lipids were analyzed by LC-MS in a high-resolution instrument equipped with polarity switching. Overall, more than 350 different lipid species from 19 lipid subclasses were identified. Solvent composition had a small effect on the extraction of the predominant lipid classes (triacylglycerides, cholesterol esters, and phosphatidylcholines). In contrast, extraction of less abundant lipids (phosphatidylinositols, lyso-lipids, ceramides, and cholesterol sulfates) was greatly influenced by the solvent system used. Overall, the Folch method was most effective for the extraction of a broad range of lipid classes in LDL, although the hexane-isopropanol method was best for apolar lipids and the MeOH-TBME method was suitable for lactosyl ceramides.
Abstract:
There is increasing evidence that non-enzymatic post-translational protein modifications might play key roles in various diseases. These protein modifications can be caused by free radicals generated during oxidative stress or by their products generated during lipid peroxidation. 4-Hydroxynonenal (HNE), a major biomarker of oxidative stress and lipid peroxidation, has been recognized as an important molecule in the pathology as well as the physiology of living organisms. Therefore, its detection and quantification can be considered a valuable tool for evaluating various pathophysiological conditions. The HNE-protein adduct ELISA is a method to detect HNE bound to proteins, which is considered the most likely form of HNE occurrence in living systems. Since the earlier described ELISA was validated for cell lysates and the antibody used for detection of HNE-protein adducts is non-commercial, the aim of this work was to adapt the ELISA to a commercial antibody and to apply it to the analysis of human plasma samples. After modification and validation of the protocol for both antibodies, samples from two groups were analyzed: apparently healthy obese subjects (n=62) and non-obese controls (n=15). Although the absolute values of HNE-protein adducts detected differed depending on the antibody used, both ELISA methods showed significantly higher values of HNE-protein adducts in the obese group.