894 results for Confusion Assessment Method
Abstract:
Objectives: To identify criteria by which patients can assess the communication skills of pharmacy students. Method: Potential assessment criteria were generated from two main sources: a literature review and a focus group discussion. A modified two-round Delphi survey was subsequently conducted with 35 professionals who were actively involved in teaching and assessing the communication skills of pharmacy students, to determine the importance and reliability of each criterion. Results: Consensus ratings identified 7 criteria that were important measures of pharmacy students' communication skills and could be reliably assessed by patients. Conclusions: A modified two-round Delphi consultation survey successfully identified criteria that can be used by patients to assess the communication skills of pharmacy undergraduates. Future work will examine the feasibility of using patients as assessors of the communication skills of pharmacy students, preregistration pharmacists, and qualified pharmacists.
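As an illustration of the consensus step described above, the following hypothetical Python sketch retains a criterion only when both its median rating and the proportion of high ratings from the panel exceed pre-set cut-offs; the criteria, ratings and thresholds are invented for the example and are not taken from the study.

```python
# Hypothetical sketch of a Delphi-style consensus check: experts rate each
# candidate criterion on a 1-5 scale and a criterion is retained when the
# median rating and the proportion of high ratings both exceed thresholds.
# Ratings and thresholds here are illustrative assumptions.
import numpy as np

ratings = {
    "Listens to the patient": [5, 5, 4, 5, 4, 5],
    "Uses plain language":    [4, 5, 5, 4, 5, 4],
    "Checks understanding":   [3, 2, 4, 3, 3, 2],
}

def reaches_consensus(scores, median_cutoff=4, agreement_cutoff=0.75):
    scores = np.asarray(scores)
    high_agreement = np.mean(scores >= 4)          # share of ratings >= 4
    return np.median(scores) >= median_cutoff and high_agreement >= agreement_cutoff

retained = [criterion for criterion, scores in ratings.items() if reaches_consensus(scores)]
print("Criteria retained after this round:", retained)
```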
Abstract:
When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often not available, or not even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how the final elicited distributions and confusion matrices can be represented using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
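To illustrate the kind of elicitation the first tool supports, the sketch below fits a parametric distribution to an expert's elicited quartiles, broadly in the spirit of the SHELF approach that the tool extends; the chosen log-normal family, the quantile values and the fitting routine are all illustrative assumptions rather than the UncertWeb implementation.

```python
# Hypothetical sketch: fitting a distribution to expert-elicited quantiles.
# The parameter being elicited, the quartile values and the log-normal family
# are illustrative assumptions, not taken from the UncertWeb tool.
import numpy as np
from scipy import stats, optimize

# Expert's elicited judgements about an uncertain model input:
# cumulative probabilities and the corresponding quantiles.
probs = np.array([0.25, 0.50, 0.75])
elicited_quantiles = np.array([0.8, 1.2, 1.9])

def quantile_misfit(params):
    """Squared distance between elicited quantiles and those of a candidate
    log-normal distribution with parameters (mu, sigma)."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    model_q = stats.lognorm.ppf(probs, s=sigma, scale=np.exp(mu))
    return np.sum((model_q - elicited_quantiles) ** 2)

# Least-squares fit of the distribution parameters to the expert's judgements.
result = optimize.minimize(quantile_misfit, x0=[0.0, 0.5], method="Nelder-Mead")
mu_hat, sigma_hat = result.x
fitted = stats.lognorm(s=sigma_hat, scale=np.exp(mu_hat))

print("Fitted 95% credible interval:", fitted.ppf([0.025, 0.975]))
```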
Abstract:
Three British bituminous coals (Gedling, Cresswell, and Cortonwood Silkstone) were selected for study. Procedures were developed, using phase transfer catalysts (PTCs), to degrade the solvent-insoluble fractions of the coals. PTCs are of interest because they have the potential to bring about selective high-conversion reactions under mild conditions (in the past, severe reaction conditions often had to be used to degrade the coals, which in turn resulted in the loss of much of the structural information). We have applied a variety of physical and chemical techniques to maximise the amount of structural information, including elemental analysis, 1H-NMR, 13C-CPMAS-NMR, GPC, GC-MS, FTIR spectroscopy, DRIFT spectroscopy, and gas adsorption measurements. The main conclusions from the work are listed below:
(1) PTC O-methylation: This reaction removes hydrogen bonds within the coal matrix by 'capping' the phenolic groups. It was found that the polymer-like matrix could be made more flexible, but not significantly more soluble, by O-methylation; i.e. the trapped or 'mobile' phase of the coals could be removed at a faster rate after this reaction had been carried out.
(2) PTC Reductive and Acidic Ether Cleavage: The three coals were found to contain insignificant amounts of dialkyl and alkyl aryl ethers. The number of diaryl ethers could not be estimated by reductive ether cleavage (even though a high proportion of all three coals was solubilised). The majority of the ethers present in the coals were inert to both cleavage methods, and are therefore assumed to be heterocyclic ethers.
(3) Trifluoroperacetic Acid Oxidation: This oxidant was used to study the aliphatic portions of the polymer-like macromolecular matrix of the coals. Normally this reagent will only solubilise low-rank coals; we have, however, developed a method whereby trifluoroperacetic acid can be used to degrade high-rank bituminous coals.
(4) PTC/Permanganate Oxidation: This reagent has been found to be much more selective than the traditional alkaline permanganate oxidation, with much more structural information being retained within the various fractions. This degradative method therefore has the potential to yield new information about the molecular structure of coals.
Abstract:
The Octopus Automated Perimeter was validated in a comparative study and found to offer many advantages in the assessment of the visual field. The visual evoked potential was investigated in an extensive study using a variety of stimulus parameters to simulate hemianopia and central visual field defects. The scalp topography was recorded, and a technique to compute the source derivation of the scalp potential was developed. This enabled clarification of the expected scalp distribution to half-field stimulation using different electrode montages. The visual evoked potential following full-field stimulation was found to be asymmetrical around the midline, with a bias over the left occiput, particularly when the foveal polar projections of the occipital cortex were preferentially stimulated. The half-field response reflected this distribution asymmetry. Masking of the central 3° resulted in a response that was approximately symmetrical around the midline, but there was no evidence of the PNP-complex. A method for visual field quantification was developed based on the neural representation of visual space (Drasdo and Peaston 1982) in an attempt to relate visual field deprivation to the resultant visual evoked potentials. There was no form of simple, diffuse summation between the scalp potential and the cortical generators. It was, however, possible to quantify the degree of scalp potential attenuation for M-scaled full-field stimuli. The results obtained from patients exhibiting pre-chiasmal lesions suggested that the PNP-complex is not scotomatous in nature but confirmed that it is most likely to be related to specific diseases (Harding and Crews 1982). There was a strong correlation between the percentage information loss of the visual field and the diagnostic value of the visual evoked potential in patients exhibiting chiasmal lesions.
Abstract:
PURPOSE. To establish an alternative method, sequential and diameter response analysis (SDRA), to determine dynamic retinal vessel responses and their time course in serial stimulation, compared with the established method of averaged diameter responses and standard static assessment. METHODS. SDRA focuses on individual time and diameter responses, taking into account the fluctuation in baseline diameter, providing improved insight into reaction patterns when compared with established methods as delivered by retinal vessel analyzer (RVA) software. SDRA patterns were developed with measurements from 78 healthy nonsmokers and subsequently validated in a group of 21 otherwise healthy smokers. Fundus photography and retinal vessel responses were assessed by RVA, intraocular pressure by contact tonometry, and blood pressure by sphygmomanometry. RESULTS. Compared with the RVA software method, SDRA demonstrated a marked difference in retinal vessel responses to flickering light (P < 0.05). As a validation of that finding, SDRA showed a strong relation between baseline retinal vessel diameter and subsequent dilatory response in both healthy subjects and smokers (P < 0.001). The RVA software was unable to detect this difference or to find a difference in retinal vessel arteriovenous ratio between smokers and nonsmokers (P = 0.243). However, SDRA revealed that smokers' vessels showed both an increased level of arterial baseline diameter fluctuation before flicker stimulation (P = 0.005) and an increased stiffness of retinal arterioles (P = 0.035) compared with those in nonsmokers. These differences were unrelated to intraocular pressure or systemic blood pressure. CONCLUSIONS. SDRA shows promise as a tool for the assessment of vessel physiology. Further studies are needed to explore its application in patients with vascular diseases.
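A minimal sketch of the sequential, baseline-referenced response calculation that distinguishes SDRA from an averaged diameter analysis might look as follows; the window lengths, sampling rate and synthetic data are illustrative assumptions, not the published protocol.

```python
# Hypothetical sketch of a sequential diameter-response calculation in the
# spirit of SDRA: each flicker response is expressed relative to the baseline
# segment immediately preceding it, rather than a single averaged baseline.
# All variable names, window lengths and data are illustrative assumptions.
import numpy as np

def sequential_responses(diameter, time, flicker_onsets, baseline_s=10, response_s=20):
    """Return peak dilation (%) relative to the immediately preceding baseline
    for each flicker stimulation cycle in a vessel-diameter time series."""
    responses = []
    for onset in flicker_onsets:
        baseline_mask = (time >= onset - baseline_s) & (time < onset)
        response_mask = (time >= onset) & (time < onset + response_s)
        baseline = diameter[baseline_mask].mean()
        peak = diameter[response_mask].max()
        responses.append(100.0 * (peak - baseline) / baseline)
    return np.array(responses)

# Example with synthetic data: three flicker cycles at t = 50, 130 and 210 s.
t = np.arange(0, 300, 0.5)
d = 120 + 2 * np.sin(0.05 * t) + np.random.normal(0, 0.5, t.size)
print(sequential_responses(d, t, flicker_onsets=[50, 130, 210]))
```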
Substances hazardous to health: the nature of the expertise associated with competent risk assessment
Abstract:
This research investigated expertise in hazardous substance risk assessment (HSRA). Competent, pro-active risk assessment is needed to prevent occupational ill-health caused by hazardous substance exposure occurring in the future. In recent years there has been a strong demand for HSRA expertise and a shortage of expert practitioners. The discipline of Occupational Hygiene was identified as the key repository of knowledge and skills for HSRA, and one objective of this research was to develop a method to elicit this expertise from experienced occupational hygienists. In the study of generic expertise, many methods of knowledge elicitation (KE) have been investigated, since this has been relevant to the development of 'expert systems' (thinking computers). Here, knowledge needed to be elicited from human experts, and this stage was often a bottleneck in system development, since experts could not explain the basis of their expertise. At an intermediate stage, the information collected was used to structure a basic model of hazardous substance risk assessment activity (HSRA Model B), and this formed the basis of tape transcript analysis in the main study, with derivation of a 'classification' and a 'performance matrix'. The study aimed to elicit the expertise of occupational hygienists and compare their performance with that of other health and safety professionals (occupational health physicians, occupational health nurses, health and safety practitioners and trainee health and safety inspectors), as evaluated using the matrix. As a group, the hygienists performed best in the exercise, and were particularly good at process elicitation and at recommending specific control measures, although the other groups also performed well in selected aspects of the matrix and the work provided useful findings and insights. From the research, two models of HSRA and an HSRA aid have been derived, together with a novel videotape KE technique and interesting research findings. The implications are discussed with respect to the future training of health and safety professionals and the wider application of the videotape KE method.
Abstract:
Hierarchical knowledge structures are frequently used within clinical decision support systems as part of the model for generating intelligent advice. The nodes in the hierarchy inevitably have varying influence on the decision-making processes, which needs to be reflected by parameters. If the model has been elicited from human experts, it is not feasible to ask them to estimate the parameters because there will be so many in even moderately sized structures. This paper describes how the parameters could be obtained from data instead, using only a small number of cases. The original method [1] is applied to a particular web-based clinical decision support system called GRiST, which uses its hierarchical knowledge to quantify the risks associated with mental-health problems. The knowledge was elicited from multidisciplinary mental-health practitioners, but the tree has several thousand nodes, all requiring an estimation of their relative influence on the assessment process. The method described in the paper shows how they can be obtained from about 200 cases instead. It greatly reduces the experts' elicitation tasks and has the potential to be generalised to similar knowledge-engineering domains where relative weightings of node siblings are part of the parameter space.
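A hypothetical sketch of how relative sibling weights could be recovered from a small set of cases is given below, assuming each parent node's score is approximately a weighted average of its children's scores; the data and the non-negative least-squares formulation are illustrative and are not the GRiST implementation.

```python
# Hypothetical sketch of estimating the relative influence (weights) of sibling
# nodes in a hierarchical knowledge structure from case data. Names, scores and
# the modelling assumption (parent score ~ weighted average of children) are
# illustrative, not the method of the paper.
import numpy as np
from scipy.optimize import nnls

# Each row: the children's scores for one assessed case (0-1 scale).
child_scores = np.array([
    [0.2, 0.8, 0.5],
    [0.9, 0.4, 0.7],
    [0.1, 0.3, 0.2],
    [0.6, 0.9, 0.8],
])
# Clinician's overall judgement for the parent node on the same cases.
parent_judgement = np.array([0.55, 0.70, 0.20, 0.80])

# Non-negative least squares keeps weights >= 0; normalise so siblings sum to 1.
raw_weights, _ = nnls(child_scores, parent_judgement)
weights = raw_weights / raw_weights.sum()
print("Estimated sibling weights:", weights)
```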
Abstract:
Biological soil crusts (BSCs) are formed by aggregates of soil particles and communities of microbial organisms and are common in all drylands. The role of BSCs in infiltration remains uncertain due to the lack of data on how they affect soil physical properties such as porosity and structure. Quantitative assessment of these properties is primarily hindered by the fragile nature of the crusts. Here we show how the combination of non-destructive X-ray microtomography (XMT) imaging and the Lattice Boltzmann method (LBM) enables quantification of key soil physical parameters and the modeling of water flow through BSC samples from Kalahari Sands, Botswana. We quantify porosity and flow changes resulting from mechanical disturbance of such a fragile cyanobacteria-dominated crust. Results show significant variations in porosity between different types of crusts and in how they affect flow, and that disturbance of a cyanobacteria-dominated crust results in the breakdown of larger pore spaces and reduced flow rates through the surface layer. We conclude that the XMT–LBM approach is well suited to the study of fragile surface crust samples whose physical and hydraulic properties cannot be easily quantified using conventional methods.
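As a minimal illustration of the imaging side of this workflow, the sketch below estimates bulk porosity and a slice-by-slice porosity profile from a segmented (binary) XMT volume; the synthetic volume and the assumption that pore voxels are labelled 1 are illustrative, and the LBM flow simulation itself is not shown.

```python
# Hypothetical sketch: estimating porosity from a segmented (binary) X-ray
# microtomography volume, where voxels equal to 1 are pore space and 0 are
# solid. The synthetic array below is an illustrative assumption.
import numpy as np

def porosity(binary_volume):
    """Fraction of pore voxels in a segmented XMT volume."""
    return binary_volume.mean()

def porosity_depth_profile(binary_volume, axis=0):
    """Porosity of each horizontal slice, useful for comparing the crust
    layer with the underlying sand."""
    other_axes = tuple(i for i in range(binary_volume.ndim) if i != axis)
    return binary_volume.mean(axis=other_axes)

# Example with a synthetic 100^3 volume (random pores, ~35% porosity).
volume = (np.random.rand(100, 100, 100) < 0.35).astype(np.uint8)
print("Bulk porosity:", porosity(volume))
print("Top 5 slice porosities:", porosity_depth_profile(volume)[:5])
```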
Abstract:
Biomass-To-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called "second generation biofuels" that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies had been compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. production cost). This was the first time that an uncertainty analysis had been included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
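The following hypothetical sketch shows the shape of such a Monte Carlo uncertainty analysis: uncertain cost-model inputs are sampled from assumed distributions and propagated to a production-cost estimate; every distribution, parameter value and the simplified cost formula are illustrative assumptions, not figures from the study.

```python
# Hypothetical sketch of a Monte Carlo uncertainty analysis on a simplified
# production-cost model. All distributions, parameter names and the cost
# formula are illustrative assumptions, not values from the study.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Uncertain inputs (illustrative): capital cost (GBP), feedstock price
# (GBP per dry tonne), annual fuel output (t/yr) and plant availability.
capital_cost = rng.triangular(250, 300, 400, n) * 1e6
feedstock_price = rng.normal(60, 10, n)
fuel_output = rng.normal(110_000, 10_000, n)
availability = rng.uniform(0.85, 0.95, n)

annual_feedstock = 500_000            # dry t/yr, illustrative
capital_charge = 0.13                 # annuity factor, illustrative
fixed_opex = 0.04 * capital_cost      # per year, illustrative

annual_cost = capital_charge * capital_cost + fixed_opex + feedstock_price * annual_feedstock
production_cost = annual_cost / (fuel_output * availability)   # GBP per tonne of fuel

print("Median production cost: %.0f GBP/t" % np.median(production_cost))
print("90%% interval: %.0f to %.0f GBP/t" % tuple(np.percentile(production_cost, [5, 95])))
```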
Abstract:
Data Envelopment Analysis (DEA) is recognized as a modern approach to assessing the performance of a set of homogeneous Decision Making Units (DMUs) that use similar inputs to produce similar outputs. While DEA is commonly used with precise data, several approaches have recently been introduced for evaluating DMUs with uncertain data. In the existing approaches much information on the uncertainties is lost. For example, in defuzzification, the α-level and fuzzy ranking approaches are not considered. In the tolerance approach the inequality or equality signs are fuzzified, but the fuzzy coefficients (inputs and outputs) are not treated directly. The purpose of this paper is to develop a new model to evaluate DMUs under uncertainty using fuzzy DEA and to incorporate α-levels into the model under a fuzzy environment. An example is given to illustrate this method in detail.
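For orientation, the sketch below solves the crisp input-oriented CCR multiplier model for a small set of DMUs with a linear program; a fuzzy extension such as the one proposed here would replace the crisp inputs and outputs with interval bounds at each α-cut. The data are illustrative.

```python
# Hypothetical sketch of a crisp (non-fuzzy) CCR DEA efficiency score solved as
# a linear program. The input/output data for four DMUs are illustrative.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0], [5.0, 4.0]])   # inputs (DMUs x m)
Y = np.array([[10.0], [8.0], [9.0], [12.0]])                     # outputs (DMUs x s)
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR multiplier model for DMU o."""
    # Decision variables: output weights u (length s), then input weights v (length m).
    c = np.concatenate([-Y[o], np.zeros(m)])              # maximise u'y_o
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]                                          # normalisation v'x_o = 1
    A_ub = np.hstack([Y, -X])                             # u'y_j - v'x_j <= 0 for all j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```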
Abstract:
In the face of global population growth and the uneven distribution of water supply, a better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method to detect water surfaces automatically, and to monitor them in near-real time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. The validation of the algorithm showed very few omission errors and no commission errors, demonstrating its ability to perform as effectively as human interpretation of the images. The validation of the permanent water surface product against an independent dataset derived from high-resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method have been identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient for a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without too many difficulties.
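A minimal sketch of the colour-space idea is given below: an RGB composite is converted to HSV and water pixels are flagged with simple hue and value thresholds; the band composite, threshold values and helper name are illustrative assumptions, not the operational parameters of the method.

```python
# Hypothetical sketch of RGB -> HSV conversion for water detection: a composite
# scaled to [0, 1] is transformed to hue/saturation/value and water pixels are
# flagged with simple thresholds. Thresholds and the synthetic image are
# illustrative assumptions.
import numpy as np
from matplotlib.colors import rgb_to_hsv

def detect_water(rgb_composite, hue_range=(0.5, 0.7), max_value=0.6):
    """Return a boolean water mask from an (H, W, 3) composite in [0, 1]."""
    hsv = rgb_to_hsv(rgb_composite)
    hue, sat, val = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    return (hue >= hue_range[0]) & (hue <= hue_range[1]) & (val <= max_value)

# Example with a small synthetic image.
img = np.random.rand(64, 64, 3)
mask = detect_water(img)
print("Water fraction:", mask.mean())
```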
Abstract:
Objective: Development and validation of a selective and sensitive LC-MS method for the determination of methotrexate polyglutamates in dried blood spots (DBS). Methods: DBS samples (spiked or patient samples) were prepared by applying blood to Guthrie cards, which were then dried at room temperature. The method utilised 6-mm disks punched from the DBS samples (equivalent to approximately 12 μl of whole blood). The simple treatment procedure was based on protein precipitation using perchloric acid, followed by solid phase extraction using MAX cartridges. The extracted sample was chromatographed using a reversed-phase system involving an Atlantis T3-C18 column (3 μm, 2.1 × 150 mm) preceded by an Atlantis guard column of matching chemistry. Analytes were subjected to LC-MS analysis using positive electrospray ionization. Key Results: The method was linear over the range 5-400 nmol/L. The limits of detection and quantification were 1.6 and 5 nmol/L for individual polyglutamates and 1.5 and 4.5 nmol/L for total polyglutamates, respectively. The method was applied successfully to the determination of DBS finger-prick samples from 47 paediatric patients, and the results were confirmed against concentrations measured in matched RBC samples using a conventional HPLC-UV technique. Conclusions and Clinical Relevance: The methodology has potential for application in a range of clinical studies (e.g. pharmacokinetic evaluations or medication adherence assessment) since it is minimally invasive and easy to perform, potentially allowing parents to take blood samples at home. The feasibility of using DBS sampling could be of major value for future clinical trials or clinical care in paediatric rheumatology. © 2014 Hawwa et al.
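As a generic illustration of how linearity, LOD and LOQ figures such as those above are typically derived, the sketch below fits a calibration line over the validated range and applies the ICH-style 3.3σ/slope and 10σ/slope estimates; the spiked concentrations and peak areas are invented, not data from this study.

```python
# Hypothetical sketch of a calibration-curve check for an LC-MS assay: fit the
# response over the validated range and estimate LOD/LOQ from the residual
# standard deviation (ICH-style 3.3*sigma/slope and 10*sigma/slope).
# The spiked concentrations and peak areas below are illustrative.
import numpy as np
from scipy import stats

conc = np.array([5, 10, 25, 50, 100, 200, 400])           # nmol/L
peak_area = np.array([410, 820, 2010, 4100, 8150, 16300, 32500])

fit = stats.linregress(conc, peak_area)
residual_sd = np.std(peak_area - (fit.intercept + fit.slope * conc), ddof=2)

lod = 3.3 * residual_sd / fit.slope
loq = 10 * residual_sd / fit.slope
print(f"r = {fit.rvalue:.4f}, LOD ~ {lod:.1f} nmol/L, LOQ ~ {loq:.1f} nmol/L")
```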
Abstract:
An increasing number of publications on the dried blood spot (DBS) sampling approach for the quantification of drugs and metabolites have been spurred on by the inherent advantages of this sampling technique. In the present research, a selective and sensitive high-performance liquid chromatography method for the concurrent determination of multiple antiepileptic drugs (AEDs) [levetiracetam (LVT), lamotrigine (LTG), phenobarbital (PHB), carbamazepine (CBZ) and its active metabolite carbamazepine-10,11-epoxide (CBZE)] in a single DBS has been developed and validated. Whole blood was spotted onto Guthrie cards and dried. Using a standard punch (6 mm diameter), a circular disc was punched from the card and extracted with methanol:acetonitrile (3:1, v/v) containing hexobarbital (internal standard) and sonicated prior to evaporation. The extract was then dissolved in water and vortex-mixed before undergoing solid phase extraction using HLB cartridges. Chromatographic separation of the AEDs was achieved using a Waters XBridge™ C18 column with a gradient system. The developed method was linear over the concentration ranges studied, with r ≥ 0.995 for all compounds. The lower limits of quantification (LLOQs) were 2, 1, 2, 0.5 and 1 μg/mL for LVT, LTG, PHB, CBZE and CBZ, respectively. Accuracy (%RE) and precision (%CV) values within and between days were <20% at the LLOQs and <15% at all other concentrations tested. This method was successfully applied to the analysis of the AEDs in DBS samples taken from children with epilepsy for the assessment of their adherence to prescribed treatments.
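The accuracy (%RE) and precision (%CV) figures quoted above are usually computed from replicate QC measurements; the hypothetical sketch below shows the calculation with invented replicates for one analyte.

```python
# Hypothetical sketch of the accuracy (%RE) and precision (%CV) calculations
# typically reported in such a validation; the replicate QC measurements below
# are illustrative, not data from the study.
import numpy as np

def accuracy_precision(measured, nominal):
    """Return (%RE, %CV) for replicate measurements of a QC sample."""
    measured = np.asarray(measured, dtype=float)
    re_pct = 100.0 * (measured.mean() - nominal) / nominal
    cv_pct = 100.0 * measured.std(ddof=1) / measured.mean()
    return re_pct, cv_pct

# Example: within-day replicates of a 2 ug/mL lamotrigine QC (illustrative).
re_pct, cv_pct = accuracy_precision([1.86, 2.11, 1.95, 2.04, 1.90], nominal=2.0)
print(f"%RE = {re_pct:.1f}, %CV = {cv_pct:.1f}")
```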
Abstract:
The goal of no-reference (NR) image quality assessment (IQA) is to establish a computational model to predict the visual quality of an image. A prominent existing method is based on natural scene statistics (NSS): it uses the joint and marginal distributions of wavelet coefficients for IQA. However, this method is only applicable to JPEG2000-compressed images. Since the wavelet transform fails to capture the directional information of images, an improved NSS model is established using contourlets. In this paper, the contourlet transform is used to model the NSS of images, and the relationship between contourlet coefficients is represented by their joint distribution. The statistics of contourlet coefficients are able to indicate variations in image quality. In addition, an image-dependent threshold is adopted to reduce the effect of image content on the statistical model. Finally, image quality can be evaluated by combining the extracted features in each subband nonlinearly. Our algorithm is trained and tested on the LIVE database II. Experimental results demonstrate that the proposed algorithm is superior to the conventional NSS model and can be applied to different distortions. © 2009 Elsevier B.V. All rights reserved.
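A hypothetical sketch of NSS-style subband feature extraction is shown below; a wavelet decomposition (PyWavelets) stands in for the contourlet transform used in the paper, which has no standard Python implementation, and the choice of per-subband variance and kurtosis as features is illustrative.

```python
# Hypothetical sketch of NSS-style feature extraction from subband coefficients.
# A wavelet decomposition stands in here for the contourlet transform; the
# feature choice (variance and kurtosis per subband) is an illustrative assumption.
import numpy as np
import pywt
from scipy.stats import kurtosis

def subband_features(image):
    """Variance and kurtosis of coefficients in each detail subband."""
    coeffs = pywt.wavedec2(image, wavelet="db2", level=3)
    features = []
    for detail_level in coeffs[1:]:            # skip the approximation subband
        for band in detail_level:              # horizontal, vertical, diagonal
            c = band.ravel()
            features.extend([np.var(c), kurtosis(c)])
    return np.array(features)

# Example: features for a synthetic grayscale image; a quality predictor
# (e.g. a regression model) would then be trained on such features.
img = np.random.rand(256, 256)
print(subband_features(img).shape)
```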
Abstract:
In this paper, a model is proposed for investigating how selected groups of compression methods influence the information-security coefficient of selected groups of objects exposed to selected groups of attacks. Using multi-criteria evaluation methods, the groups of compression methods with the lowest risk with respect to information security are selected. Recommendations for future investigations are proposed.
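A minimal sketch of a weighted-sum multi-criteria evaluation of compression-method groups is given below; the criteria, weights and scores are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of a weighted-sum multi-criteria evaluation: each group of
# compression methods is scored against security-related criteria and the group
# with the lowest weighted risk is selected. All names and numbers are illustrative.
import numpy as np

methods = ["compression group A", "compression group B", "compression group C"]
criteria_weights = np.array([0.5, 0.3, 0.2])   # e.g. risk, overhead, robustness

# Rows: method groups; columns: criterion scores normalised to [0, 1], lower is better.
scores = np.array([
    [0.2, 0.4, 0.3],
    [0.5, 0.2, 0.4],
    [0.7, 0.1, 0.6],
])

weighted_risk = scores @ criteria_weights
best = methods[int(np.argmin(weighted_risk))]
print(dict(zip(methods, weighted_risk.round(3))), "->", best)
```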