891 results for sampling error


Relevance:

20.00%

Publisher:

Abstract:

In many European countries, image quality for digital x-ray systems used in screening mammography is currently specified using a threshold-detail detectability method. This is a two-part study that proposes an alternative method based on calculated detectability for a model observer: the first part of the work presents a characterization of the systems. Eleven digital mammography systems were included in the study: four computed radiography (CR) systems and a group of seven digital radiography (DR) detectors, composed of three amorphous selenium-based detectors, three caesium iodide scintillator systems and a silicon wafer-based photon-counting system. The technical parameters assessed included the system response curve, detector uniformity error, pre-sampling modulation transfer function (MTF), normalized noise power spectrum (NNPS) and detective quantum efficiency (DQE). The approximate quantum-noise-limited exposure range was examined using a separation of noise sources based upon standard deviation. Noise separation showed that electronic noise was the dominant noise at low detector air kerma for three systems; the remaining systems showed quantum-noise-limited behaviour between 12.5 and 380 µGy. Greater variation in detector MTF was found for the DR group than for the CR systems; MTF at 5 mm⁻¹ varied from 0.08 to 0.23 for the CR detectors against a range of 0.16-0.64 for the DR units. The needle CR detector had a higher MTF, lower NNPS and higher DQE at 5 mm⁻¹ than the powder CR phosphors. DQE at 5 mm⁻¹ ranged from 0.02 to 0.20 for the CR systems, while DQE at 5 mm⁻¹ for the DR group ranged from 0.04 to 0.41, indicating higher DQE for the DR detectors and the needle CR system than for the powder CR phosphor systems. The technical evaluation section of the study showed that the digital mammography systems were well set up and exhibited typical performance for the detector technology employed in the respective systems.
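For readers less familiar with how these detector metrics combine, the sketch below applies the standard relation DQE(f) = MTF(f)² / (q · NNPS(f)); the spatial-frequency grid, MTF shape, NNPS level and photon fluence q are hypothetical placeholders, not values from the study.

```python
import numpy as np

# Minimal sketch of the standard detector relation DQE(f) = MTF(f)^2 / (q * NNPS(f)),
# where q is the photon fluence (photons/mm^2) incident on the detector for the
# exposure used. All values below are illustrative placeholders.

f = np.linspace(0.5, 5.0, 10)        # spatial frequencies in mm^-1
mtf = np.exp(-0.35 * f)              # hypothetical pre-sampling MTF
nnps = np.full_like(f, 2.0e-6)       # hypothetical NNPS in mm^2
q = 5.0e4                            # hypothetical photon fluence in photons/mm^2

dqe = mtf**2 / (q * nnps)
print(f"DQE at 5 mm^-1: {dqe[-1]:.3f}")   # last grid point is f = 5 mm^-1
```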

Relevance:

20.00%

Publisher:

Abstract:

Because of the various matrices available for forensic investigations, the development of versatile analytical approaches allowing the simultaneous determination of drugs is challenging. The aim of this work was to assess a liquid chromatography-tandem mass spectrometry (LC-MS/MS) platform allowing the rapid quantification of colchicine in body fluids and tissues collected in the context of a fatal overdose. For this purpose, filter paper was used as a sampling support and was associated with an automated 96-well plate extraction performed by the LC autosampler itself. The developed method features a 7-min total run time including automated filter paper extraction (2 min) and chromatographic separation (5 min). The sample preparation was reduced to a minimum regardless of the matrix analyzed. This platform was fully validated for dried blood spots (DBS) in the toxic concentration range of colchicine. The DBS calibration curve was applied successfully to quantification in all other matrices (body fluids and tissues) except for bile, where an excessive matrix effect was found. The distribution of colchicine for a fatal overdose case was reported as follows: peripheral blood, 29 ng/ml; urine, 94 ng/ml; vitreous humour and cerebrospinal fluid, < 5 ng/ml; pericardial fluid, 14 ng/ml; brain, < 5 pg/mg; heart, 121 pg/mg; kidney, 245 pg/mg; and liver, 143 pg/mg. Although filter paper is usually employed for DBS, we report here the extension of this alternative sampling support to the analysis of other body fluids and tissues. The developed platform represents a rapid and versatile approach for drug determination in multiple forensic media.
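As an illustration of the quantification step, the sketch below fits a linear DBS calibration curve and back-calculates concentrations from peak-area ratios; the calibrator levels, ratios and the quantify helper are hypothetical, not taken from the validated method.

```python
import numpy as np

# Minimal sketch, assuming a linear DBS calibration (analyte/internal-standard
# peak-area ratio versus concentration) that is then applied to other matrices,
# as described above. Calibrator levels and ratios are illustrative placeholders.

cal_conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])       # ng/ml
cal_ratio = np.array([0.04, 0.09, 0.22, 0.45, 0.91])      # peak-area ratios

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

def quantify(ratio):
    """Back-calculate a concentration (ng/ml) from a measured peak-area ratio."""
    return (ratio - intercept) / slope

print(f"Example sample: {quantify(0.27):.1f} ng/ml")
```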

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To evaluate the accuracy and reproducibility of flow velocity and volume measurements in a phantom and in human coronary arteries using breath-hold velocity-encoded (VE) MRI with spiral k-space sampling at 3 Tesla. MATERIALS AND METHODS: Flow velocity assessment was performed using VE MRI with spiral k-space sampling. The accuracy of VE MRI was tested in vitro at five constant flow rates. Reproducibility was investigated in 19 healthy subjects (mean age 25.4 +/- 1.2 years, 11 men) by repeated acquisition in the right coronary artery (RCA). RESULTS: MRI-measured flow rates correlated strongly with volumetric collection (Pearson correlation r = 0.99; P < 0.01). Due to limited sample resolution, VE MRI overestimated the flow rate by 47% on average when nonconstricted region-of-interest segmentation was used. Using constricted region-of-interest segmentation with lumen size equal to the ground-truth luminal size, less than 13% error in flow rate was found. In vivo RCA flow velocity assessment was successful in 82% of the applied studies. High interscan, intraobserver and interobserver agreement was found for almost all indices describing coronary flow velocity. Reproducibility for repeated acquisitions varied by less than 16% for peak velocity values and by less than 24% for flow volumes. CONCLUSION: 3T breath-hold VE MRI with spiral k-space sampling enables accurate and reproducible assessment of RCA flow velocity.
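The sketch below illustrates how a volumetric flow rate is typically derived from a velocity-encoded image, by summing through-plane velocity over the segmented lumen; the pixel size, velocity map and ROI mask are hypothetical placeholders rather than the study's acquisition parameters.

```python
import numpy as np

# Minimal sketch of how a volumetric flow rate is derived from a velocity-encoded
# image: sum of (through-plane velocity x pixel area) over the lumen ROI.
# Pixel size, velocity map and segmentation below are hypothetical placeholders.

pixel_area_mm2 = 0.6 * 0.6                              # assumed in-plane resolution
velocity = np.full((64, 64), 0.0)                       # cm/s
velocity[30:34, 30:34] = 15.0                           # placeholder lumen velocities
roi = velocity > 0                                      # placeholder lumen segmentation

# cm/s * mm^2 -> multiply by 0.01 to convert mm^2 to cm^2, giving cm^3/s = ml/s
flow_ml_per_s = np.sum(velocity[roi]) * pixel_area_mm2 * 0.01
print(f"Flow rate: {flow_ml_per_s:.2f} ml/s")
```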

Relevance:

20.00%

Publisher:

Abstract:

A simple wipe sampling procedure was developed for the determination of surface contamination by ten cytotoxic drugs: cytarabine, gemcitabine, methotrexate, etoposide phosphate, cyclophosphamide, ifosfamide, irinotecan, doxorubicin, epirubicin and vincristine. Wiping was performed using Whatman filter paper on different surfaces such as stainless steel, polypropylene, polystyrene, glass, latex gloves, a computer mouse and coated paperboard. Wiping and desorption procedures were investigated; the same solution, containing 20% acetonitrile and 0.1% formic acid in water, gave the best results for both. After ultrasonic desorption and centrifugation, samples were analysed by a validated liquid chromatography-tandem mass spectrometry (LC-MS/MS) method in selected reaction monitoring mode. The whole analytical strategy, from wipe sampling to LC-MS/MS analysis, was evaluated to determine its quantitative performance. A lower limit of quantification of 10 ng per wipe sample (i.e. 0.1 ng/cm²) was determined for the ten investigated cytotoxic drugs. The relative standard deviation for intermediate precision was always below 20%. As recovery depended on the tested surface for each drug, a correction factor was determined and applied to real samples. The method was then successfully applied at the cytotoxic production unit of the Geneva University Hospitals pharmacy.
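To make the recovery correction concrete, the sketch below converts a measured amount per wipe into a surface load in ng/cm², divided by a surface-specific recovery; the recovery values, wiped area and the surface_load helper are hypothetical, not the factors determined in the study.

```python
# Minimal sketch of the recovery correction described above. Recovery factors and
# the wiped area are hypothetical placeholders; the study determines them per drug
# and per surface type.

WIPED_AREA_CM2 = 100.0                                   # assumed 10 x 10 cm wipe
recovery = {"stainless steel": 0.85, "glass": 0.90, "polypropylene": 0.70}

def surface_load(measured_ng, surface):
    """Recovery-corrected contamination in ng/cm^2 for one wipe sample."""
    corrected_ng = measured_ng / recovery[surface]
    return corrected_ng / WIPED_AREA_CM2

print(f"{surface_load(25.0, 'glass'):.3f} ng/cm^2")      # 25 ng per wipe on glass
```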

Relevance:

20.00%

Publisher:

Abstract:

We propose a restoration algorithm for band-limited images that considers irregular (perturbed) sampling, denoising, and deconvolution. We explore the application of a family of regularizers that allow control of the spectral behavior of the solution, combined with the irregular-to-regular sampling algorithms proposed by H.G. Feichtinger, K. Gröchenig, M. Rauth and T. Strohmer. Moreover, the constraints given by the image acquisition model are incorporated as a set of local constraints, and the analysis of these constraints leads to an early stopping rule meant to improve the speed of the algorithm. Finally, we present experiments focused on the restoration of satellite images, where micro-vibrations are responsible for the type of distortions considered here. We compare results of the proposed method with previous methods and show an extension to zoom.
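The sketch below is a generic illustration of the ingredients mentioned above (a spectral, Tikhonov-type regularizer and an early-stopping rule tied to the data-fit constraint), not the authors' algorithm; the restore function, blur model and noise level are assumptions for illustration.

```python
import numpy as np

# Generic sketch, not the authors' algorithm: Fourier-domain gradient descent for
# deconvolution with a Tikhonov-type (spectral) regularizer and an early-stopping
# rule that halts once the data fit reaches the noise level.

def restore(y, h_fft, lam=1e-2, sigma=1e-2, max_iter=200, step=0.5):
    """Approximately minimize ||h*x - y||^2 + lam*||x||^2; y is the observed image,
    h_fft the transfer function of the blur, sigma the per-pixel noise level."""
    Y = np.fft.fft2(y)
    X = Y.copy()
    for _ in range(max_iter):
        grad = np.conj(h_fft) * (h_fft * X - Y) + lam * X
        X = X - step * grad
        residual_rms = np.linalg.norm(h_fft * X - Y) / y.size   # per-pixel residual
        if residual_rms <= sigma:        # early stopping at the noise level
            break
    return np.real(np.fft.ifft2(X))
```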

Relevance:

20.00%

Publisher:

Abstract:

Cultural variation in a population is affected by the rate of occurrence of cultural innovations, whether such innovations are preferred or eschewed, how they are transmitted between individuals in the population, and the size of the population. An innovation, such as a modification in an attribute of a handaxe, may be lost or may become a property of all handaxes, which we call "fixation of the innovation." Alternatively, several innovations may attain appreciable frequencies, in which case properties of the frequency distribution (for example, of handaxe measurements) are important. Here we apply the Moran model from the stochastic theory of population genetics to study the evolution of cultural innovations. We obtain the probability that an initially rare innovation becomes fixed, and the expected time this takes. When variation in cultural traits is due to recurrent innovation, copy error, and sampling from generation to generation, we describe properties of this variation, such as the level of heterogeneity expected in the population. For all of these, we determine the effect of the mode of social transmission: conformist, where there is a tendency for each naïve newborn to copy the most popular variant; pro-novelty bias, where the newborn prefers a specific variant if it exists among those it samples; and one-to-many transmission, where the variant one individual carries is copied by all newborns while that individual remains alive. We compare our findings with those predicted by prevailing theories for rates of cultural change and the distribution of cultural variation.
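As a companion to the analytical results described above, the sketch below simulates a neutral Moran model and estimates by brute force the fixation probability of an initially rare innovation; the population size and number of replicates are illustrative choices, not the paper's.

```python
import random

# Minimal sketch of a neutral Moran model: estimate by simulation the probability
# that a single-copy innovation goes to fixation. Population size and replicate
# count are illustrative; the paper treats these quantities analytically.

def fixation_probability(pop_size=50, replicates=2000):
    fixed = 0
    for _ in range(replicates):
        carriers = 1                                   # one individual carries the innovation
        while 0 < carriers < pop_size:
            # one birth (copying a uniformly chosen individual) and one death per step
            birth_is_carrier = random.random() < carriers / pop_size
            death_is_carrier = random.random() < carriers / pop_size
            carriers += int(birth_is_carrier) - int(death_is_carrier)
        fixed += carriers == pop_size
    return fixed / replicates

# Neutral theory predicts a fixation probability of 1/N = 0.02 for N = 50.
print(fixation_probability())
```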

Relevance:

20.00%

Publisher:

Abstract:

In the assessment of medical malpractice, imaging methods can be used to document crucial morphological findings that argue for or against an iatrogenically caused injury. The clarification of deaths in this context can be usefully supported by postmortem imaging (primarily unenhanced computed tomography, angiography, and magnetic resonance imaging). Compared with autopsy, postmortem imaging offers significant additional information for detecting iatrogenic air embolisms and for documenting misplaced medical devices before dissection, which carries an inherent risk of dislocating them. Postmortem imaging also supplies additional information in the search for sources of bleeding and in the documentation of perfusion after cardiovascular surgery. Key criteria for the decision to perform postmortem imaging can be obtained from the necessary preliminary review of the clinical documentation.

Relevance:

20.00%

Publisher:

Abstract:

Many complex systems may be described not by one but by a number of complex networks mapped onto each other in a multi-layer structure. Because of the interactions and dependencies between these layers, the state of a single layer does not necessarily reflect the state of the entire system well. In this paper we study the robustness of five examples of two-layer complex systems: three real-life data sets from the fields of communication (the Internet), transportation (the European railway system), and biology (the human brain), and two models based on random graphs. In order to cover the whole range of features specific to these systems, we focus on two extreme policies of the system's response to failures: no rerouting and full rerouting. Our main finding is that multi-layer systems are much more vulnerable to errors and intentional attacks than they appear from a single-layer perspective.
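The sketch below gives a toy version of the comparison described above, using random graphs rather than the paper's data sets: nodes are removed and the surviving giant component is measured once from a single-layer view and once under a crude inter-layer dependency rule; the graph sizes, attack fraction and dependency rule are assumptions for illustration.

```python
import random
import networkx as nx

# Toy sketch, not the paper's data sets: remove a fraction of nodes and compare the
# damage as seen from a single layer with the damage under a crude inter-layer
# dependency (a node in layer B only works if its counterpart survives in layer A's
# giant component). Graph sizes and the attack fraction are illustrative.

n, attack_fraction = 500, 0.2
layer_a = nx.erdos_renyi_graph(n, 0.02, seed=1)
layer_b = nx.erdos_renyi_graph(n, 0.02, seed=2)

failed = set(random.sample(range(n), int(attack_fraction * n)))

def giant_fraction(g, removed):
    """Fraction of all n nodes left in the largest connected component."""
    h = g.copy()
    h.remove_nodes_from(removed)
    if h.number_of_nodes() == 0:
        return 0.0
    return len(max(nx.connected_components(h), key=len)) / n

# Single-layer view: only layer A is affected by the failures.
print("layer A alone:", giant_fraction(layer_a, failed))

# Two-layer view: nodes outside layer A's surviving giant component also fail in B.
surviving_a = max(nx.connected_components(layer_a.subgraph(set(range(n)) - failed)),
                  key=len)
failed_in_b = set(range(n)) - set(surviving_a)
print("coupled system:", giant_fraction(layer_b, failed_in_b))
```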

Relevance:

20.00%

Publisher:

Abstract:

A vehicle may leave its travel lane for a number of reasons, such as driver error, poor surface conditions, or avoidance of a collision with another vehicle in the travel lane. When a vehicle leaves the travel lane, pavement edge drop-off poses a potential safety hazard because significant vertical differences between surfaces can affect vehicle stability and reduce a driver's ability to handle the vehicle. Numerous controlled studies have tested driver response to encountering drop-offs under various conditions, including different speeds, vehicle types, drop-off height and shape, and tire scrubbing versus non-scrubbing conditions. The studies evaluated the drivers' ability to return to and recover within their own travel lane after leaving the roadway and encountering a drop-off. Many of these studies, however, have used professional drivers as test subjects, so results may not always apply to the population of average drivers. Furthermore, test subjects are always briefed on what generally is to be expected and how to respond; thus, the sense of surprise that a truly naïve driver may experience upon realizing that one or two of his or her tires have just dropped off the edge of the pavement is very likely diminished. Additionally, the studies were carried out under controlled conditions. The actual impact of pavement edge drop-off on drivers' ability to recover safely once they leave the roadway, however, is not well understood under actual driving conditions. Additionally, little information is available that quantifies the number or severity of crashes that occur where pavement edge drop-off may have been a contributing factor. Without sufficient information about the frequency of edge drop-off-related crashes, agencies are not fully able to measure the economic benefits of investment decisions, evaluate the effectiveness of different treatments to mitigate edge drop-off, or focus maintenance resources. To address these issues, this report details research to quantify the contribution of pavement edge drop-off to crash frequency and severity. Additionally, the study evaluated federal and state guidance on sampling and addressing pavement edge drop-off and quantified the extent of pavement edge drop-off in two states. This study focused on rural two-lane paved roadways with unpaved shoulders, since they are often high-speed facilities (55+ mph), have varying levels of maintenance, and are likely to be characterized by adverse roadway conditions such as narrow lanes or no shoulders.

Relevance:

20.00%

Publisher:

Abstract:

Medical errors compromise patient safety in ambulatory practice. These errors must be addressed within a framework that reduces their consequences for patients to a minimum. This approach relies on the implementation of a new culture, free of stigmatization, in which errors are disclosed to the patients; this culture implies building up an error-reporting system combined with an in-depth analysis of the system, looking for root causes and insufficient barriers with the aim of fixing them. A useful educational tool is the "critical situations" meeting, during which physicians are encouraged to openly present adverse events and "near misses". Their analysis, with a supportive attitude towards the staff members involved, makes it possible to reveal system failures within the institution or the private practice.

Relevance:

20.00%

Publisher:

Abstract:

Small sample properties are of fundamental interest when only limited data are available. Exact inference is limited by constraints imposed by specific nonrandomized tests and, of course, also by the lack of more data. These effects can be separated, as we propose to evaluate a test by comparing its type II error to the minimal type II error among all tests for the given sample. Game theory is used to establish this minimal type II error; the associated randomized test is characterized as part of a Nash equilibrium of a fictitious game against nature. We use this method to investigate sequential tests for the difference between two means when outcomes are constrained to belong to a given bounded set. Tests of inequality and of noninferiority are included. We find that inference in terms of type II error based on a balanced sample cannot be improved by sequential sampling, or even by observing counterfactual evidence, provided there is a reasonable gap between the hypotheses.
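As a purely illustrative companion (not the paper's game-theoretic construction), the sketch below estimates by simulation the type II error of a simple nonrandomized test for the difference of two bounded means; the sample size, gap, rejection threshold and outcome distributions are hypothetical.

```python
import random

# Purely illustrative sketch, not the paper's construction: estimate the type II
# error of a simple nonrandomized test for the difference of two means when
# outcomes are bounded in [0, 1]. All parameters below are hypothetical.

def type_ii_error(n=20, gap=0.2, threshold=0.1, replicates=5000):
    """P(fail to reject H0: equal means) when the true difference equals `gap`."""
    misses = 0
    for _ in range(replicates):
        x = [random.betavariate(2, 2) for _ in range(n)]                  # mean 0.5
        y = [min(1.0, random.betavariate(2, 2) + gap) for _ in range(n)]  # mean ~0.7
        diff = sum(y) / n - sum(x) / n
        if diff <= threshold:          # the test rejects only when diff > threshold
            misses += 1
    return misses / replicates

print(type_ii_error())
```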

Relevance:

20.00%

Publisher:

Abstract:

The experiential sampling method (ESM) was used to collect data from 74 part-time students who described and assessed the risks involved in their current activities when interrupted at random moments by text messages. The major categories of perceived risk were short-term in nature and involved loss of time or materials related to work and physical damage (e.g., from transportation). Using techniques of multilevel analysis, we demonstrate effects of gender, emotional state, and types of risk on assessments of risk. Specifically, females do not differ from males in assessing the potential severity of risks but they see these as more likely to occur. Also, participants assessed risks to be lower when in more positive self-reported emotional states. We further demonstrate the potential of ESM by showing that risk assessments associated with current actions exceed those made retrospectively. We conclude by noting advantages and disadvantages of ESM for collecting data about risk perceptions.

Relevance:

20.00%

Publisher:

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function, and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
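The sketch below illustrates the label-flipping computation mentioned in the last sentence: ordinary empirical risk minimization on the training data with the second half's labels flipped yields an estimate of the maximal discrepancy penalty; the data, the logistic-regression surrogate and the sample sizes are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch of the label-flipping computation: ERM on the training data with the
# second half's labels flipped yields the maximal-discrepancy penalty estimate.
# Data, sample sizes and the logistic-regression surrogate are placeholders.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

X1, y1, X2, y2 = X[:100], y[:100], X[100:], y[100:]

# Empirical risk minimization on (first half as-is, second half with flipped labels)
clf = LogisticRegression().fit(np.vstack([X1, X2]), np.concatenate([y1, 1 - y2]))

err1 = np.mean(clf.predict(X1) != y1)      # error on the first half (true labels)
err2 = np.mean(clf.predict(X2) != y2)      # error on the second half (true labels)
print(f"maximal discrepancy estimate: {err2 - err1:.3f}")
```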