803 results for preference-based measures
Abstract:
This paper is in two parts and addresses two ways of getting more information out of the RF signal from three-dimensional (3D) mechanically-swept medical ultrasound. The first topic is the use of non-blind deconvolution to improve the clarity of the data, particularly in the direction orthogonal to the individual B-scans. The second topic is strain imaging. We present a robust and efficient approach to the estimation and display of axial strain information. For deconvolution, we calculate an estimate of the point-spread function at each depth in the image using Field II. This is used as part of an Expectation Maximisation (EM) framework in which the ultrasound scatterer field is modelled as the product of (a) a piecewise smooth function and (b) a fine-grain varying function. In the E step, a Wiener filter is used to estimate the scatterer field based on an assumed piecewise smooth component. In the M step, wavelet de-noising is used to estimate the piecewise smooth component from the scatterer field. For strain imaging, we use a quasi-static approach with efficient phase-based algorithms. Our contributions lie in robust and efficient 3D displacement tracking, point-wise quality-weighted strain estimation, and a stable display that shows not only strain but also an indication of the quality of the data at each point in the image. This enables clinicians to see where the strain estimate is reliable and where it is mostly noise. For deconvolution, we present in-vivo images and simulations with quantitative performance measures. With the blurred 3D data taken as 0 dB, we get an improvement in signal-to-noise ratio of 4.6 dB with a Wiener filter alone, 4.36 dB with the ForWaRD algorithm and 5.18 dB with our EM algorithm. For strain imaging, we show images based on 2D and 3D data and describe how full 3D analysis can be performed in about 20 seconds on a typical PC. We will also present initial results of our clinical study to explore the applications of our system in our local hospital. © 2008 IEEE.
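As a rough illustration of the EM framework described in this abstract, the following sketch alternates a Wiener-filter E step with a wavelet de-noising M step on a 1-D RF line. The PSF handling, noise-to-signal ratio, wavelet choice, and the way the smooth component feeds back into the E step are simplified assumptions, not the authors' actual implementation.

```python
import numpy as np
import pywt  # PyWavelets

def wiener_deconvolve(rf, h, nsr):
    """E step: Wiener-filter estimate of the scatterer field."""
    n = len(rf)
    H = np.fft.fft(h, n)
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)      # regularised inverse filter
    return np.real(np.fft.ifft(G * np.fft.fft(rf)))

def wavelet_smooth(x, wavelet="db4", level=4, thresh=0.1):
    """M step: wavelet de-noising extracts the piecewise smooth component."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def em_deconvolve(rf, h, noise_var=1e-2, iters=5):
    smooth = np.ones_like(rf, dtype=float)       # (a) piecewise smooth part
    for _ in range(iters):
        # E step: the current smooth component sets the assumed signal power.
        nsr = noise_var / max(np.mean(smooth ** 2), 1e-12)
        scat = wiener_deconvolve(rf, h, nsr)
        # M step: de-noise the envelope to re-estimate the smooth component.
        smooth = np.maximum(wavelet_smooth(np.abs(scat)), 1e-6)
    return scat, smooth, scat / smooth           # (b) fine-grain component
```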
Abstract:
The current procedures in post-earthquake safety and structural assessment are performed manually by a skilled triage team of structural engineers/certified inspectors. These procedures, and particularly the physical measurement of the damage properties, are time-consuming and qualitative in nature. This paper proposes a novel method that automatically detects spalled regions on the surface of reinforced concrete columns and measures their properties in image data. Spalling has been accepted as an important indicator of significant damage to structural elements during an earthquake. According to this method, the region of spalling is first isolated by way of a local entropy-based thresholding algorithm. Following this, the exposure of longitudinal reinforcement (depth of spalling into the column) and length of spalling along the column are measured using a novel global adaptive thresholding algorithm in conjunction with image processing methods in template matching and morphological operations. The method was tested on a database of damaged RC column images collected after the 2010 Haiti earthquake, and comparison of the results with manual measurements indicates the validity of the method.
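A minimal sketch of the first stage described above, assuming scikit-image's local entropy filter and an Otsu cut stand in for the paper's local entropy-based thresholding. The window radius and clean-up morphology are illustrative; the subsequent global adaptive thresholding, template matching and measurement steps are not shown.

```python
from skimage import color, io, util
from skimage.filters import threshold_otsu
from skimage.filters.rank import entropy
from skimage.morphology import binary_closing, binary_opening, disk

def spalling_mask(path, radius=9):
    """Candidate spalled regions via local entropy-based thresholding."""
    gray = util.img_as_ubyte(color.rgb2gray(io.imread(path)))  # assumes RGB
    ent = entropy(gray, disk(radius))      # local texture entropy per pixel
    mask = ent > threshold_otsu(ent)       # global cut on the entropy map
    # small morphological clean-up of the binary mask
    return binary_closing(binary_opening(mask, disk(3)), disk(5))
```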
Abstract:
The design, 3D FEM modelling and measurement results of a novel high temperature, low power SOI CMOS MEMS thermal conductivity gas sensor are presented here. The sensor consists of a circular membrane with an embedded tungsten micro-heater. The sensing capability is based on the temperature sensitivity of the resistive heating element. The sensor was fabricated at a commercial foundry using a 1 μm process and measures only 1×1 mm2. The circular membrane has a 600 μm diameter while the heating element has a 320 μm diameter. Measurement results show that for a constant power consumption of 75 mW the heater temperature was 562.4°C in air, 565.9°C in N2, 592.5°C for 1% H2 in Ar and 599.5°C in Ar. © 2013 IEEE.
Abstract:
The design, FEM modelling and characterization of a novel dual mode thermal conductivity and infrared absorption sensor using SOI CMOS technology are reported. The dual mode sensing capability is based on the temperature sensitivity and wideband infrared radiation emission of the resistive heating element. The sensor was fabricated at a commercial foundry using a 1 μm process and measures only 1×1 mm2. Infrared detectors usually use thermopiles in addition to a separate IR source. Here, a single highly responsive dual mode source and sensing element targets not only low molecular mass gases but also greenhouse gases while consuming 40 mW at 700°C in synthetic air, making this sensor particularly viable for battery powered handheld devices. © 2013 IEEE.
Abstract:
With concerns over climate change and the escalation in worldwide population, sustainable development is attracting more and more attention from academia, policy makers, and businesses in many countries. Sustainable manufacturing is an essential measure for achieving sustainable development, since manufacturing is one of the main energy consumers and greenhouse gas contributors. Previous research on production planning of manufacturing systems has rarely considered environmental factors. This paper investigates the production planning problem under performance measures of economy and environment with respect to seru production systems, a new manufacturing system praised as Double E (ecology and economy) in Japanese manufacturing industries. We propose a mathematical model with two objectives, minimizing carbon dioxide emission and makespan, for processing all product types by a seru production system. To solve this model, we develop an algorithm based on the non-dominated sorting genetic algorithm II (NSGA-II). The computational results and analysis of three numerical examples confirm the effectiveness of the proposed algorithm. © 2014 Elsevier Ltd. All rights reserved.
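The core of NSGA-II is fast non-dominated sorting of the population by the objectives. The sketch below shows that step for (emission, makespan) pairs; it is a generic textbook version assuming both objectives are minimised, with the paper's solution encoding, crowding distance and genetic operators omitted.

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_sort(objs):
    """Return a list of fronts (lists of indices), best front first."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    dom_count = [0] * n                     # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# e.g. objs = [(emission, makespan), ...] for a population of seru schedules
print(non_dominated_sort([(3, 9), (2, 7), (4, 5), (1, 8), (2, 7)]))
```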
Abstract:
The objectives of this work were to study the effects of several feeding stimulants on gibel carp fed diets with or without replacement of fish meal by meat and bone meal (MBM). The feeding stimulants tested were betaine, glycine, L-lysine, L-methionine, L-phenylalanine, and a commercial squid extract. Three inclusion levels were tested for each stimulant (0.1, 0.5 and 1% for betaine and 0.1, 0.25 and 0.5% for the other stimulants). Two basal diets (40% crude protein) were used: one with 26% fish meal (FM), and the other with 21% fish meal and 6% MBM. Betaine at 0.1% in the fish meal group and at 0.5% in the meat and bone meal group was used in all experiments for comparison among stimulants. In the experiment on each stimulant, six tanks of fish were equally divided into two groups, one fed the FM diet and the other fed the MBM diet. After 7 days' adaptation to the basal diet, in which the fish were fed to satiation twice a day, the fish were fed for another 7 days an equal mixture of diets containing varying levels of stimulants. Each diet contained a unique rare earth oxide as an inert marker (Y2O3, Yb2O3, La2O3, Sm2O3 or Nd2O3). During the last 3 days of the experiment, faeces from each tank were collected. Preference for each diet was estimated based on the relative concentration of each marker in the faeces. Gibel carp fed the FM diet had higher intake than those fed the MBM diet, but the difference was significant only in the experiments on betaine, glycine and L-methionine. None of the feeding stimulants tested showed feeding-enhancing effects in FM diets. All feeding stimulants showed feeding-enhancing effects in MBM diets, and the optimum inclusion level was 0.5% for betaine, 0.1% for glycine, 0.25% for L-lysine, 0.1% for L-methionine, 0.25% for L-phenylalanine, and 0.1% for squid extract. The squid extract had the strongest stimulating effect among all the stimulants tested. (C) 2001 Elsevier Science B.V. All rights reserved.
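A toy illustration of the preference estimate described above: each diet's share of intake is read off from the relative concentration of its marker in the pooled faeces. The concentrations below are invented numbers, and a real analysis would also correct for each diet's marker content and marker recovery rates.

```python
# Hypothetical marker concentrations (mg/kg) measured in pooled faeces.
marker_in_faeces = {"Y2O3": 3.2, "Yb2O3": 1.1, "La2O3": 0.9}

total = sum(marker_in_faeces.values())
preference = {marker: conc / total for marker, conc in marker_in_faeces.items()}
print(preference)  # fraction of intake attributed to each marked diet
```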
Abstract:
Numerous measures are used in the literature to describe the grain-size distribution of sediments. Consideration of these measures indicates that parameters computed from quartiles may not be as significant as those based on more rigorous statistical concepts. In addition, the lack of standardization of descriptive measures has resulted in limited application of the findings from one locality to another. The use of five parameters that serve as approximate graphic analogies to the moment measures commonly employed in statistics is recommended. The parameters are computed from five percentile diameters obtained from the cumulative size-frequency curve of a sediment. They include the mean (or median) diameter, standard deviation, kurtosis, and two measures of skewness, the second measure being sensitive to the skew properties of the "tails" of the sediment distribution. If the five descriptive measures are listed for a sediment, it is possible to compute the five percentile diameters on which they are based (φ5, φ16, φ50, φ84, and φ95), and hence five significant points on the cumulative curve of the sediment. This increases the value of the data listed for a sediment in a report, and in many cases eliminates the necessity of including the complete mechanical analysis of the sediment. The degree of correlation of the graphic parameters to the corresponding moment measures decreases as the distribution becomes more skewed. However, for a fairly wide range of distributions, the first three moment measures can be ascertained from the graphic parameters with about the same degree of accuracy as is obtained by computing rough moment measures.
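As a worked example of such graphic parameters, the sketch below computes one common (Inman-style) set of definitions from the five percentile diameters; these formulas are stated as an assumption for illustration, not necessarily the exact ones recommended in the paper.

```python
def graphic_measures(p5, p16, p50, p84, p95):
    """Graphic size-distribution parameters from five phi percentiles."""
    median = p50
    mean = (p16 + p84) / 2
    sigma = (p84 - p16) / 2                     # graphic standard deviation
    skew1 = (mean - median) / sigma             # primary skewness
    skew2 = ((p5 + p95) / 2 - median) / sigma   # tail-sensitive skewness
    kurt = ((p95 - p5) / 2 - sigma) / sigma     # graphic kurtosis
    return {"median": median, "mean": mean, "sigma": sigma,
            "skewness": skew1, "tail_skewness": skew2, "kurtosis": kurt}

print(graphic_measures(-1.0, 0.0, 1.0, 2.2, 3.5))  # phi units, toy values
```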
Abstract:
We have argued elsewhere that first order inference can be made more efficient by using non-standard syntax for first order logic. In this paper we show how a fragment of English syntax under Montague semantics provides the foundation of a new inference procedure. This procedure seems more effective than corresponding procedures based on either classical syntax or our previously proposed taxonomic syntax. This observation may provide a functional explanation for some of the syntactic structure of English.
Abstract:
An appearance-based framework for 3D hand shape classification and simultaneous camera viewpoint estimation is presented. Given an input image of a segmented hand, the most similar matches from a large database of synthetic hand images are retrieved. The ground truth labels of those matches, containing hand shape and camera viewpoint information, are returned by the system as estimates for the input image. Database retrieval is done hierarchically, by first quickly rejecting the vast majority of all database views, and then ranking the remaining candidates in order of similarity to the input. Four different similarity measures are employed, based on edge location, edge orientation, finger location and geometric moments.
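One plausible form for the edge-location measure mentioned above is a symmetric chamfer distance between binary edge maps, sketched here with SciPy distance transforms. The edge detector and the way the four measures would be combined into a single ranking score are assumptions, not the system's actual design.

```python
from scipy.ndimage import distance_transform_edt

def chamfer(edges_a, edges_b):
    """Mean distance from edge pixels of A to the nearest edge pixel of B."""
    dist_to_b = distance_transform_edt(~edges_b)   # zero at B's edge pixels
    return dist_to_b[edges_a].mean()

def edge_location_similarity(edges_a, edges_b):
    """Symmetric chamfer score in (0, 1]; higher means more similar."""
    d = 0.5 * (chamfer(edges_a, edges_b) + chamfer(edges_b, edges_a))
    return 1.0 / (1.0 + d)

# e.g. edges_a, edges_b = boolean edge maps from skimage.feature.canny(...)
```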
Abstract:
The problem of discovering frequent arrangements of temporal intervals is studied. It is assumed that the database consists of sequences of events, where an event occurs during a time-interval. The goal is to mine temporal arrangements of event intervals that appear frequently in the database. The motivation of this work is the observation that in practice most events are not instantaneous but occur over a period of time and different events may occur concurrently. Thus, there are many practical applications that require mining such temporal correlations between intervals including the linguistic analysis of annotated data from American Sign Language as well as network and biological data. Two efficient methods to find frequent arrangements of temporal intervals are described; the first one is tree-based and uses depth first search to mine the set of frequent arrangements, whereas the second one is prefix-based. The above methods apply efficient pruning techniques that include a set of constraints consisting of regular expressions and gap constraints that add user-controlled focus into the mining process. Moreover, based on the extracted patterns a standard method for mining association rules is employed that applies different interestingness measures to evaluate the significance of the discovered patterns and rules. The performance of the proposed algorithms is evaluated and compared with other approaches on real (American Sign Language annotations and network data) and large synthetic datasets.
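Arrangement mining of this kind rests on classifying the pairwise temporal relation between two event intervals. The sketch below uses Allen-style relation names as a common convention for such work; the paper's own arrangement enumeration, tree/prefix search and pruning are not shown.

```python
def relation(a_start, a_end, b_start, b_end):
    """Classify how interval A relates in time to interval B."""
    if a_end < b_start:   return "before"
    if a_end == b_start:  return "meets"
    if a_start == b_start and a_end == b_end: return "equals"
    if a_start < b_start and a_end > b_end:   return "contains"
    if a_start < b_start < a_end < b_end:     return "overlaps"
    if a_start == b_start: return "starts"
    if a_end == b_end:     return "finishes"
    return "other"        # remaining cases, e.g. the inverse relations

# e.g. two annotated ASL events: a sign over t=0..5 and a head nod t=3..8
print(relation(0, 5, 3, 8))   # -> "overlaps"
```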
Abstract:
An aim of proactive risk management strategies is the timely identification of safety related risks. One way to achieve this is by deploying early warning systems. Early warning systems aim to provide timely and useful information on the presence of potential threats to the system, the level of vulnerability of a system, or both. This information can then be used to take proactive safety measures. The United Nations has recommended that any early warning system needs to have four essential elements: the risk knowledge element, a monitoring and warning service, dissemination and communication, and a response capability. This research deals with the risk knowledge element of an early warning system. The risk knowledge element of an early warning system contains models of possible accident scenarios. These accident scenarios are created by using hazard analysis techniques, which can be categorised as traditional and contemporary. Traditional hazard analysis techniques assume that accidents occur due to a sequence of events, whereas contemporary hazard analysis techniques assume that safety is an emergent property of complex systems. The problem is that no software editor is available that analysts can use both to create models of accident scenarios based on contemporary hazard analysis techniques and to generate computer code that represents those models. This research aims to enhance the process of generating computer code based on graphical models that associate early warning signs and causal factors with a hazard, based on contemporary hazard analysis techniques. For this purpose, the thesis investigates the use of Domain Specific Modeling (DSM) technologies. The contribution of this thesis is the design and development of a set of three graphical Domain Specific Modeling Languages (DSMLs) that, when combined, provide all of the necessary constructs to enable safety experts and practitioners to conduct hazard and early warning analysis based on a contemporary hazard analysis approach. The languages represent those elements and relations necessary to define accident scenarios and their associated early warning signs. The three DSMLs were incorporated into a prototype software editor that enables safety scientists and practitioners to create and edit hazard and early warning analysis models in a usable manner and, as a result, to generate executable code automatically. This research shows that DSM technologies can be used to develop a set of three DSMLs that allow users to conduct hazard and early warning analysis in a more usable manner. Furthermore, the three DSMLs and their dedicated editor, as presented in this thesis, may provide a significant enhancement to the process of creating the risk knowledge element of computer-based early warning systems.
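A toy illustration of the model-to-code idea at the heart of the thesis: a tiny in-memory model of a hazard and its early warning signs, from which monitoring code is generated automatically. The class names and the generated rule format are invented for illustration; the actual DSMLs and their editor are far richer.

```python
from dataclasses import dataclass, field

@dataclass
class EarlyWarningSign:
    name: str
    threshold: float          # signal level that triggers a warning

@dataclass
class Hazard:
    name: str
    signs: list = field(default_factory=list)

def generate_monitor(hazard):
    """Generate Python source for a monitor that checks each warning sign."""
    lines = [f"def monitor_{hazard.name}(readings):",
             "    warnings = []"]
    for s in hazard.signs:
        lines.append(f"    if readings.get('{s.name}', 0) > {s.threshold}:")
        lines.append(f"        warnings.append('{s.name}')")
    lines.append("    return warnings")
    return "\n".join(lines)

h = Hazard("overpressure", [EarlyWarningSign("tank_pressure", 8.5)])
print(generate_monitor(h))    # emits executable code from the model
```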
Abstract:
The topic of this thesis is impulsivity. The meaning and measurement of impulse control is explored, with a particular focus on forensic settings. Impulsivity is central to many areas of psychology; it is one of the most common diagnostic criteria of mental disorders and is fundamental to the understanding of forensic personalities. Despite this widespread importance there is little agreement as to the definition or structure of impulsivity, and its measurement is fraught with difficulty owing to a reliance on self-report methods. This research aims to address this problem by investigating the viability of using simple computerised cognitive performance tasks as complementary components of a multi-method assessment strategy for impulse control. Ultimately, the usefulness of this measurement strategy for a forensic sample is assessed. Impulsivity is found to be a multifaceted construct comprised of a constellation of distinct sub-dimensions. Computerised cognitive performance tasks are valid and reliable measures that can assess impulsivity at a neuronal level. Self-report and performance task methods assess distinct components of impulse control and, for the optimal assessment of impulse control, a multi-method battery of self-report and performance task measures is advocated. Such a battery is shown to have demonstrated utility in a forensic sample, and recommendations for forensic assessment in the Irish context are discussed.
Abstract:
Aim: To investigate the value of using PROMs as quality improvement tools. Methods: Two systematic reviews were undertaken. The first reviewed the quantitative literature on the impact of PROMs feedback and the second reviewed the qualitative literature on the use of PROMs in practice. These reviews informed the focus of the primary research. A cluster randomised controlled trial (PROFILE) examined the impact of providing peer-benchmarked PROMs feedback to consultant orthopaedic surgeons on improving outcomes for hip replacement surgery. Qualitative interviews with surgeons in the intervention arm of the trial examined their views of and reactions to the feedback. Results: The quantitative review of 17 studies found weak evidence to suggest that providing PROMs feedback to professionals improves patient outcomes. The qualitative review of 16 studies identified the barriers and facilitators to the use of PROMs based on four themes: practical considerations, attitudes towards the data, methodological concerns and the impact of feedback on care. The PROFILE trial included 11 surgeons and 215 patients in the intervention arm, and 10 surgeons and 217 patients in the control arm. The trial found no significant difference in the Oxford Hip Score between the arms (-0.7, 95% CI -1.9 to 0.5, p=0.2). Interviews with surgeons revealed mixed opinions about the value of the PROMs feedback, and the information did not promote explicit changes to their practice. Conclusion: It is important to use PROMs which have been validated for the specific purpose of performance measurement, consult with professionals when developing a PROMs feedback intervention, communicate with professionals about the objectives of the data collection, educate professionals on the properties and interpretation of the data, and support professionals in using the information to improve care. It is also imperative that the burden of data collection and dissemination of the information is minimised.
Abstract:
BACKGROUND: Outcome assessment can support the therapeutic process by providing a way to track symptoms and functionality over time, providing insights to clinicians and patients, as well as offering a common language to discuss patient behavior/functioning. OBJECTIVES: In this article, we examine the patient-based outcome assessment (PBOA) instruments that have been used to determine outcomes in acupuncture clinical research and highlight measures that are feasible, practical, economical, reliable, valid, and responsive to clinical change. The aims of this review were to assess and identify the commonly available PBOA measures, describe a framework for identifying appropriate sets of measures, and address the challenges associated with these measures and acupuncture. Instruments were evaluated in terms of feasibility, practicality, economy, reliability, validity, and responsiveness to clinical change. METHODS: This study was a systematic review. A total of 582 abstracts were reviewed using PubMed (from inception through April 2009). RESULTS: A total of 582 citations were identified. After screening of title/abstract, 212 articles were excluded. From the remaining 370 citations, 258 manuscripts identified explicit PBOA; 112 abstracts did not include any PBOA. The five most common PBOA instruments identified were the Visual Analog Scale, Symptom Diary, Numerical Pain Rating Scales, SF-36, and depression scales such as the Beck Depression Inventory. CONCLUSIONS: The way a questionnaire or scale is administered can have an effect on the outcome. Also, developing and validating outcome measures can be costly and difficult. Therefore, reviewing the literature on existing measures before creating or modifying PBOA instruments can significantly reduce the burden of developing a new measure.
Abstract:
Numerical approximation of the long time behavior of a stochastic differential equation (SDE) is considered. Error estimates for time-averaging estimators are obtained and then used to show that the stationary behavior of the numerical method converges to that of the SDE. The error analysis is based on using an associated Poisson equation for the underlying SDE. The main advantages of this approach are its simplicity and universality. It works equally well for a range of explicit and implicit schemes, including those with simple simulation of random variables, and for hypoelliptic SDEs. To simplify the exposition, we consider only the case where the state space of the SDE is a torus, and we study only smooth test functions. However, we anticipate that the approach can be applied more widely. An analogy between our approach and Stein's method is indicated. Some practical implications of the results are discussed. Copyright © by SIAM. Unauthorized reproduction of this article is prohibited.
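A small sketch of the object being analysed above: a time-averaging estimator of a stationary expectation for an SDE on the torus, discretised by Euler-Maruyama. The drift, test function and step size are illustrative choices only, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def time_average(drift, test_fn, x0=0.0, dt=1e-2, n_steps=200_000):
    """Euler-Maruyama time-averaging estimator of a stationary mean."""
    x, acc = x0, 0.0
    for _ in range(n_steps):
        x += drift(x) * dt + np.sqrt(dt) * rng.standard_normal()
        x %= 2 * np.pi                 # keep the state on the torus [0, 2*pi)
        acc += test_fn(x)
    return acc / n_steps

# e.g. dX = -sin(X) dt + dW on the torus, with smooth test function cos
print(time_average(lambda x: -np.sin(x), np.cos))
```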