55 results for reliability measurement
Abstract:
We present experiments in which the laterally confined flow of a surfactant film driven by controlled surface tension gradients causes the subtended liquid layer to self-organize into an inner upstream microduct surrounded by the downstream flow. The anomalous interfacial flow profiles and the concomitant backflow are a result of the feedback between two-dimensional and three-dimensional microfluidics realized during flow in open microchannels. Bulk and surface particle image velocimetry data combined with an interfacial hydrodynamics model explain the dependence of the observed phenomena on channel geometry.
Abstract:
Background: In an agreement assay, it is of interest to evaluate the degree of agreement between the different methods (devices, instruments or observers) used to measure the same characteristic. We propose in this study a technical simplification for inference about the total deviation index (TDI) estimate to assess agreement between two devices with normally distributed measurements, and describe its utility for evaluating inter- and intra-rater agreement if more than one reading per subject is available for each device.
Methods: We propose to estimate the TDI by constructing a probability interval of the difference in paired measurements between devices, and thereafter we derive a tolerance interval (TI) procedure as a natural way to make inferences about probability limit estimates. We also describe how the proposed method can be used to compute bounds on the coverage probability.
Results: The approach is illustrated in a real case example in which the agreement between two instruments, a handheld mercury sphygmomanometer and an OMRON 711 automatic device, is assessed in a sample of 384 subjects whose systolic blood pressure was measured twice by each device. A simulation study is implemented to evaluate and compare the accuracy of the approach against two established methods, showing that the TI approximation produces accurate empirical confidence levels which are reasonably close to the nominal confidence level.
Conclusions: The proposed method is straightforward, since the TDI estimate is derived directly from a probability interval of a normally distributed variable in its original scale, without further transformations. Thereafter, a natural way of making inferences about this estimate is to derive the appropriate TI. Construction of TIs based on normal populations is implemented in most standard statistical packages, thus making it simple for any practitioner to apply our proposal to assess agreement.
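The procedure above can be sketched in a few lines. This is an illustration, not the authors' code: the function name, the simulated blood-pressure differences, and the use of Howe's approximation for the two-sided normal tolerance factor are all assumptions.

```python
import numpy as np
from scipy import stats

def tdi_with_tolerance_bound(d, p=0.90, conf=0.95):
    """Estimate the total deviation index (TDI) for paired differences d,
    assumed normal: the point estimate is the larger absolute bound of a
    p-probability interval of d, and an upper confidence bound follows
    from a two-sided normal tolerance interval (Howe's approximation)."""
    d = np.asarray(d, float)
    n, m, s = d.size, d.mean(), d.std(ddof=1)
    z = stats.norm.ppf((1 + p) / 2)
    # Point estimate from the p-probability interval of the differences.
    tdi_hat = max(abs(m - z * s), abs(m + z * s))
    # Howe's tolerance factor widens the interval so that it covers at
    # least a proportion p of the population with confidence conf.
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / stats.chi2.ppf(1 - conf, n - 1))
    tdi_upper = max(abs(m - k * s), abs(m + k * s))
    return tdi_hat, tdi_upper

rng = np.random.default_rng(0)
d = rng.normal(1.0, 5.0, 384)   # simulated SBP differences in mmHg
est, upper = tdi_with_tolerance_bound(d)
print(round(est, 1), round(upper, 1))
```

Because the tolerance factor k always exceeds the plain normal quantile z, the confidence bound is necessarily wider than the point estimate.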
Abstract:
This paper reviews the concept of presence in immersive virtual environments: the sense of being there, signalled by people acting and responding realistically to virtual situations and events. We argue that presence is a unique phenomenon that must be distinguished from the degree of engagement, i.e. involvement in the portrayed environment. We argue that there are three necessary conditions for presence: (a) a consistent, low-latency sensorimotor loop between sensory data and proprioception; (b) statistical plausibility: images must be statistically plausible in relation to the probability distribution of images over natural scenes, and a constraint on this plausibility is the level of immersion; (c) behaviour-response correlations: presence may be enhanced and maintained over time by appropriate correlations between the state and behaviour of participants and responses within the environment, correlations that show appropriate responses to the activity of the participants. We conclude with a discussion of methods for assessing whether presence occurs; in particular, we recommend the approach of comparison with ground truth, and give some examples of this.
Abstract:
We report on a field-effect light emitting device based on silicon nanocrystals in silicon oxide deposited by plasma-enhanced chemical vapor deposition. The device shows high power efficiency and long lifetime. The power efficiency is enhanced up to 0.1% by the presence of a silicon nitride control layer. The leakage current reduction induced by this nitride buffer effectively increases the power efficiency by two orders of magnitude with regard to similarly processed devices with oxide only. In addition, the nitride cools down the electrons that reach the polycrystalline silicon gate, lowering the formation of defects, which significantly reduces the device degradation.
Abstract:
Prompt production of charmonium χ c0, χ c1 and χ c2 mesons is studied using proton-proton collisions at the LHC at a centre-of-mass energy of √s = 7 TeV. The χ c mesons are identified through their decay to J/ψγ, with J/ψ → μ+μ−, using photons that converted in the detector. A data sample, corresponding to an integrated luminosity of 1.0 fb−1 collected by the LHCb detector, is used to measure the relative prompt production rate of χ c1 and χ c2 in the rapidity range 2.0 < y < 4.5 as a function of the J/ψ transverse momentum from 3 to 20 GeV/c. First evidence for χ c0 meson production at a high-energy hadron collider is also presented.
Abstract:
In this thesis (TFG), the results of a comparison of three assays for the measurement of AhR ligand activity are presented. This study was part of a collaborative project aiming at the characterization of the AhR signaling activities of known naturally occurring compounds, to explore the potential of using non-toxic compounds to treat inflammatory diseases via oral administration. The first goal of this project was to find an assay able to measure AhR activity, so different assays were compared in order to find the most suitable one in terms of efficiency, sensitivity and precision. Operational factors, such as price, toxicity of components and ease of use, were also considered. Using compounds known from the literature to be AhR ligands, three assays were tested: (1) the P450-GloTM CYP1A2 Induction/Inhibition assay, (2) quantitative Polymerase Chain Reaction (qPCR) and (3) the DR. CALUX® Bioassay. Moreover, an additional experiment using the last assay was performed for the in vivo study of the transport of the tested compounds. The results of the TFG suggested the DR. CALUX® Bioassay as the most promising assay for screening samples as AhR ligands, because it is quicker, easier to handle and less expensive than qPCR, and more reproducible than the CYP1A2 Induction/Inhibition assay. Moreover, the use of this assay gave a first indication of which compounds are taken up by the epithelial barrier and in which direction the transport happens.
Abstract:
The objective of this study was to evaluate the methodological characteristics of cost-effectiveness evaluations carried out in Spain since 1990 that include life-years gained (LYG) as an outcome used to measure the incremental cost-effectiveness ratio. METHODS: A systematic review of published studies was conducted, describing their characteristics and methodological quality. We analyse the cost-per-LYG results in relation to a commonly accepted Spanish cost-effectiveness threshold, and their possible relation to the cost per quality-adjusted life year (QALY) gained when both were calculated for the same economic evaluation. RESULTS: A total of 62 economic evaluations fulfilled the selection criteria, 24 of them also including the cost per QALY gained. The methodological quality of the studies was good (55%) or very good (26%). A total of 124 cost-per-LYG results were obtained with a mean ratio of 49,529
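The cost-per-LYG figure compared against the threshold is a plain incremental cost-effectiveness ratio. A minimal sketch with hypothetical costs and effects (the numbers below are invented, not results from this review; €30,000 per QALY/LYG is the threshold commonly cited for Spain):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect (here, per life-year gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical intervention: 12,000 EUR for 9.2 life-years,
# versus 4,000 EUR for 8.8 life-years with the comparator.
cost_per_lyg = icer(12_000, 4_000, 9.2, 8.8)
THRESHOLD = 30_000  # EUR per LYG, commonly cited for Spain
print(round(cost_per_lyg), cost_per_lyg <= THRESHOLD)  # → 20000 True
```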
Abstract:
The most suitable method for the estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables, such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches for the estimation of the pdf are compared: parametric methods, assuming that data come from a determinate family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions have been used to generate simulated samples using estimated parameters from real samples. Nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a similar shape to the real one. Special attention is given to data standardization. The division of data by the sample geometric mean is proposed as the most suitable standardization method, which shows additional advantages: the same size diversity value is obtained when using original size or log-transformed data, and size measurements with different dimensionality (lengths, areas, volumes or biomasses) may be immediately compared with the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation, after data standardization by division by the sample geometric mean, arises as the most reliable and generalizable method of size diversity evaluation.
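The recommended route (geometric-mean standardization, then kernel estimation of the pdf, then the continuous Shannon integral) can be sketched as follows. This is a minimal illustration assuming SciPy's Gaussian KDE and simulated log-normal sizes, not the authors' implementation; the grid size and padding are arbitrary choices.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

def size_diversity(sizes, grid_n=2048):
    """Shannon size diversity H = -integral of f(x) ln f(x) dx, with the
    pdf f estimated by a Gaussian kernel after dividing the sizes by the
    sample geometric mean."""
    x = np.asarray(sizes, float)
    x = x / np.exp(np.mean(np.log(x)))   # standardize by geometric mean
    kde = gaussian_kde(x)
    pad = 3 * kde.factor * x.std()       # extend grid a bit beyond the data
    grid = np.linspace(x.min() - pad, x.max() + pad, grid_n)
    f = kde(grid)
    with np.errstate(divide="ignore", invalid="ignore"):
        integrand = np.where(f > 0, f * np.log(f), 0.0)
    return -trapezoid(integrand, grid)

rng = np.random.default_rng(1)
sizes = rng.lognormal(mean=0.0, sigma=1.0, size=500)
print(round(size_diversity(sizes), 2))
```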
Abstract:
Background: Effective treatment for breast cancer requires accurate preoperative planning, developing and implementing a consistent definition of margin clearance, and using tools that provide detailed real-time intraoperative information on margin status. Intraoperative ultrasound (IOUS) may fulfil these requirements and may offer advantages that other preoperative localization and intraoperative margin assessment techniques do not.
Purpose: The goal of the present work is to determine how accurate intraoperative ultrasound must be to achieve complete surgical excision with negative histological margins in patients undergoing breast-conserving surgery.
Design: A diagnostic test study with a cross-sectional design carried out in a tertiary referral hospital in Girona, within a Breast Pathology Unit.
Participants: Women diagnosed with breast cancer undergoing breast-conserving surgery in the Breast Pathology Unit at Hospital Universitari de Girona Dr. Josep Trueta.
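Since the IOUS margin call would be judged against definitive histology as the reference standard, the accuracy question reduces to the usual 2×2 diagnostic measures. A minimal sketch with hypothetical counts (invented for illustration, not data from this study):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard accuracy measures for a diagnostic test (here, an IOUS
    margin call) against the reference standard (histological margin)."""
    return {
        "sensitivity": tp / (tp + fn),  # positive margins correctly flagged
        "specificity": tn / (tn + fp),  # clear margins correctly cleared
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical 2x2 table: IOUS call vs definitive histology.
m = diagnostic_accuracy(tp=42, fp=6, fn=3, tn=49)
print({k: round(v, 2) for k, v in m.items()})
```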
Abstract:
The aim of this project is to become familiar with another kind of programming. Until now, I have used very complex programming languages to develop applications and even to program microcontrollers, but the PicoCricket system is evidence that we do not need such complex development tools to build functional devices. The PicoCricket system is a clear example of simple programming that makes devices work the way we program them. It offers an easy but effective way to program small devices by simply stating what we want them to do. We cannot implement complex algorithms or mathematical operations, but we can program these devices in a short time. Nowadays, the easier and faster we produce, the more we earn, so the tendency is to develop fast, cheap and easy, and the PicoCricket system makes that possible.