951 results for measurement data


Relevance: 30.00%

Abstract:

The measurement of charged-particle event shape variables is presented in inclusive inelastic pp collisions at a center-of-mass energy of 7 TeV using the ATLAS detector at the LHC. The observables studied are the transverse thrust, thrust minor, and transverse sphericity, each defined using the final-state charged particles' momentum components perpendicular to the beam direction. Events with at least six charged particles are selected by a minimum-bias trigger. In addition to the differential distributions, the evolution of each event shape variable as a function of the leading charged-particle transverse momentum, charged-particle multiplicity, and summed transverse momentum is presented. Predictions from several Monte Carlo models show significant deviations from data.
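The observables named in this abstract have compact standard definitions built from the charged particles' transverse momentum components. The sketch below (not the ATLAS analysis code) computes transverse sphericity from the 2x2 transverse momentum tensor and transverse thrust by scanning candidate thrust axes; the published measurement may use a slightly different weighting convention.

```python
# Minimal sketch of two transverse event shape variables, assuming one common
# convention; this is not the ATLAS analysis code.
import numpy as np

def transverse_sphericity(px, py):
    """S_T = 2*lambda_2 / (lambda_1 + lambda_2) from the 2x2 transverse momentum tensor."""
    p = np.vstack([px, py])                    # shape (2, n_particles)
    s = p @ p.T                                # sum_i p_i p_i^T
    lam = np.sort(np.linalg.eigvalsh(s))       # lam[0] <= lam[1]
    return 2.0 * lam[0] / (lam[0] + lam[1])

def transverse_thrust(px, py, n_steps=360):
    """T_perp = max_n sum_i |p_Ti . n| / sum_i p_Ti, maximized over unit axes n."""
    pt_sum = np.sum(np.hypot(px, py))
    phis = np.linspace(0.0, np.pi, n_steps, endpoint=False)
    proj = np.abs(np.outer(np.cos(phis), px) + np.outer(np.sin(phis), py))
    return proj.sum(axis=1).max() / pt_sum

# Toy event with six charged particles (matching the >= 6 charged-particle selection).
rng = np.random.default_rng(0)
px, py = rng.normal(size=6), rng.normal(size=6)
print(transverse_sphericity(px, py), transverse_thrust(px, py))
```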

Relevance: 30.00%

Abstract:

Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen et al., 2004). The algorithm tests for change-points using a maximal t-statistic with a permutation reference distribution to obtain the corresponding p-value. The number of computations required for the maximal test statistic is O(N^2), where N is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the p-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the "DNAcopy" package of the Bioconductor project (Gentleman et al., 2004). The proposed hybrid method for the p-value is available in version 1.2.1 or higher, and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
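The O(N^2) cost mentioned above comes from scanning all pairs of candidate change-points. The simplified sketch below illustrates that scan only; it omits the permutation reference distribution, the hybrid p-value approximation, and the early stopping rule described in the abstract (see the DNAcopy package for the real implementation), and it uses a z-like statistic where CBS uses a t-statistic with an estimated variance.

```python
# Illustrative O(N^2) scan over arcs (i, j] versus their complement; not the
# DNAcopy/CBS implementation.
import numpy as np

def max_segment_statistic(x):
    n = len(x)
    s = np.concatenate([[0.0], np.cumsum(x)])       # partial sums: s[k] = sum(x[:k])
    best = 0.0
    for i in range(n - 1):                          # O(N^2) pairs of candidate change-points
        for j in range(i + 1, n):
            m = j - i                               # number of points inside the arc (i, j]
            mean_in = (s[j] - s[i]) / m
            mean_out = (s[n] - (s[j] - s[i])) / (n - m)
            stat = abs(mean_in - mean_out) * np.sqrt(m * (n - m) / n)
            best = max(best, stat)
    return best

rng = np.random.default_rng(1)
x = np.concatenate([np.zeros(50), 0.8 * np.ones(20), np.zeros(30)]) + rng.normal(0, 0.3, 100)
print(max_segment_statistic(x))
```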

Relevance: 30.00%

Abstract:

With recent advances in mass spectrometry techniques, it is now possible to investigate proteins over a wide range of molecular weights in small biological specimens. This advance has generated data-analytic challenges in proteomics, similar to those created by microarray technologies in genetics, namely, discovery of "signature" protein profiles specific to each pathologic state (e.g., normal vs. cancer) or differential profiles between experimental conditions (e.g., treated by a drug of interest vs. untreated) from high-dimensional data. We propose a data-analytic strategy for discovering protein biomarkers based on such high-dimensional mass-spectrometry data. A real biomarker-discovery project on prostate cancer is taken as a concrete example throughout the paper: the project aims to identify proteins in serum that distinguish cancer, benign hyperplasia, and normal states of the prostate using Surface Enhanced Laser Desorption/Ionization (SELDI) technology, a recently developed mass spectrometry technique. Our data-analytic strategy takes properties of the SELDI mass spectrometer into account: the SELDI output of a specimen contains about 48,000 (x, y) points, where x is the protein mass divided by the number of charges introduced by ionization and y is the protein intensity at the corresponding mass-per-charge value, x, in that specimen. Given the high coefficients of variation and other characteristics of the protein intensity measures (y values), we reduce the measures of protein intensities to a set of binary variables that indicate peaks in the y-axis direction in the nearest neighborhoods of each mass-per-charge point in the x-axis direction. We then account for a shifting (measurement error) problem of the x-axis in the SELDI output. After this pre-analysis processing of the data, we combine the binary predictors to generate classification rules for cancer, benign hyperplasia, and normal states of the prostate. Our approach is to apply the boosting algorithm to select binary predictors and construct a summary classifier. We empirically evaluate the sensitivity and specificity of the resulting summary classifiers with a test dataset that is independent of the training dataset used to construct the summary classifiers. The proposed method performed nearly perfectly in distinguishing cancer and benign hyperplasia from normal. In the classification of cancer vs. benign hyperplasia, however, an appreciable proportion of the benign specimens were classified incorrectly as cancer. We discuss practical issues associated with our proposed approach to the analysis of SELDI output and its application in cancer biomarker discovery.
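The sketch below illustrates the kind of peak binarization the abstract describes: each m/z point is flagged 1 if its intensity is a local maximum within a small neighborhood along the m/z (x) axis. The window size and the absence of any intensity threshold or x-axis alignment step are simplifying assumptions, not the authors' exact procedure.

```python
# Hedged illustration of reducing a spectrum to binary local-maximum indicators.
import numpy as np

def binary_peak_indicators(intensity, half_window=5):
    """Return a 0/1 array marking local maxima of `intensity` within +/- half_window points."""
    n = len(intensity)
    flags = np.zeros(n, dtype=int)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        if intensity[i] == intensity[lo:hi].max():
            flags[i] = 1
    return flags

# Toy spectrum: low-level noise plus a few sharp peaks.
rng = np.random.default_rng(2)
y = rng.random(200) * 0.1
y[[40, 90, 150]] += 5.0
print(binary_peak_indicators(y).nonzero()[0])   # indices flagged as peaks
```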

Relevance: 30.00%

Abstract:

OBJECTIVES: This study was designed to apply the rapid Elecsys® S100 immunoassay for real-time measurement of S100 protein serum levels indicating acute brain damage in patients undergoing carotid artery stenting (CAS) or endarterectomy (CEA). DESIGN AND METHODS: Data of 14 CAS patients were compared to those of 43 CEA and 14 control patients undergoing coronary angiography (CA). S100 serum levels were measured by the fully automated Elecsys® S100 immunoassay and compared to those obtained by the well-established LIA-mat® S100 system. RESULTS: In contrast to CAS and CA patients, median S100 serum levels of CEA patients significantly increased to 0.24 ng/mL before declamping, but subsequently returned to baseline. Three CEA patients with neurological deficits showed sustained elevated S100 levels 6 h after extubation. Absolute S100 values were not significantly different between the two methods. Bland-Altman plot analyses displayed good agreement, mostly indicating slightly smaller values with the Elecsys® S100 system. CONCLUSIONS: The Elecsys® S100 system appears to be suitable for rapid real-time detection of neurological deficits in patients undergoing CAS and CEA. Persistent elevations of Elecsys® S100 levels during CEA were associated with prolonged neurological disorders, whereas transient increases seem to represent impaired blood-brain barrier integrity without neurological deficits.
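For readers unfamiliar with the Bland-Altman agreement analysis mentioned above, the minimal sketch below computes the bias (mean difference) and 95% limits of agreement between two assays. The paired values are made up for illustration and are not the study data.

```python
# Minimal Bland-Altman sketch on hypothetical paired S100 measurements (ng/mL).
import numpy as np

def bland_altman(method_a, method_b):
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa          # bias, lower LoA, upper LoA

elecsys = [0.08, 0.12, 0.24, 0.31, 0.18, 0.10]   # hypothetical Elecsys values
lia_mat = [0.09, 0.14, 0.26, 0.33, 0.20, 0.12]   # hypothetical LIA-mat values
print(bland_altman(elecsys, lia_mat))            # slightly negative bias -> Elecsys reads lower
```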

Relevance: 30.00%

Abstract:

In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In the last few years, the high-throughput measurement of gene expression has been the most popular application of microarray technology. For this application, various groups have demonstrated that the use of modern statistical methodology can substantially improve the accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Currently, other applications of microarrays are becoming more and more popular. In this paper we describe a preprocessing methodology for a technology designed for the identification of DNA sequence variants in specific genes or regions of the human genome that are associated with phenotypes of interest such as disease. In particular, we describe methodology useful for preprocessing Affymetrix SNP chips and obtaining genotype calls with the preprocessed data. We demonstrate how our procedure improves existing approaches using data from three relatively large studies, including one in which a large number of independent calls are available. Software implementing these ideas is available in the Bioconductor oligo package.
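As a generic illustration of the kind of statistical preprocessing the abstract refers to, the sketch below applies quantile normalization, which forces every array to share the same intensity distribution. This is a common preprocessing step, not the specific SNP-chip genotyping method implemented in the Bioconductor oligo package.

```python
# Hedged example of one common microarray preprocessing step: quantile normalization.
import numpy as np

def quantile_normalize(x):
    """x: (n_probes, n_arrays) raw intensities -> quantile-normalized copy."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)      # rank of each probe within its array
    mean_quantiles = np.sort(x, axis=0).mean(axis=1)       # average distribution across arrays
    return mean_quantiles[ranks]

raw = np.array([[5.0, 4.0, 3.0],
                [2.0, 1.0, 4.0],
                [3.0, 4.5, 6.0],
                [4.0, 2.0, 5.0]])
print(quantile_normalize(raw))
```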

Relevance: 30.00%

Abstract:

Fuel cells are a promising alternative energy technology. One of the biggest problems in fuel cells is water management. A better understanding of wettability characteristics in fuel cells is needed to alleviate the problem of water management. Contact angle data on gas diffusion layers (GDL) of fuel cells can be used to characterize the wettability of the GDL. A contact angle measurement program has been developed to measure the contact angle of sessile drops from drop images. Digitization of drop images induces pixel errors in the contact angle measurement process, and the resulting uncertainty in the measured contact angle has been analyzed. An experimental apparatus has been developed for contact angle measurements at different temperatures, with the capability to measure advancing and receding contact angles on gas diffusion layers of fuel cells.
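The sketch below shows one simple way to estimate a sessile-drop contact angle from drop dimensions, assuming a spherical cap (theta = 2*arctan(h/r) from drop height h and contact-line radius r), together with a crude +/- 1 pixel perturbation to illustrate how digitization error propagates into the angle. The measurement program described above may use a different (e.g., profile-fitting) method.

```python
# Spherical-cap contact angle and a simple pixel-error propagation sketch.
import math

def contact_angle_deg(height_px, base_radius_px):
    """Spherical-cap contact angle from drop height and contact-line radius (in pixels)."""
    return math.degrees(2.0 * math.atan2(height_px, base_radius_px))

def pixel_uncertainty_deg(height_px, base_radius_px, err_px=1.0):
    """Spread of the angle when h and r are each perturbed by +/- err_px pixels."""
    angles = [contact_angle_deg(height_px + dh, base_radius_px + dr)
              for dh in (-err_px, 0.0, err_px)
              for dr in (-err_px, 0.0, err_px)]
    return max(angles) - min(angles)

h, r = 120.0, 150.0                       # hypothetical drop dimensions in pixels
print(contact_angle_deg(h, r))            # ~77 degrees
print(pixel_uncertainty_deg(h, r))        # angle spread from +/- 1 pixel digitization error
```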

Relevance: 30.00%

Abstract:

In the current market system, power systems are operated at higher loads for economic reasons. Power system stability becomes a genuine concern in such operating conditions. In case of failure of any larger component, the system may become stressed. Such events may start cascading failures, which may lead to blackouts. One of the main causes of the major recorded blackout events has been the unavailability of system-wide information. Synchrophasor technology has the capability to provide system-wide real-time information. Phasor Measurement Units (PMUs) are the basic building block of this technology; they provide Global Positioning System (GPS) time-stamped voltage and current phasor values along with the frequency. It is assumed that synchrophasor data from all buses are available and thus the whole system is fully observable. This information can be used to initiate islanding or system separation to avoid blackouts. A system separation strategy using synchrophasor data has been developed to answer the three main aspects of system separation: (1) When to separate: One-class support vector machines (OC-SVM) are primarily used for anomaly detection; here, OC-SVM was used to detect wide-area instability. OC-SVM was tested on different stable and unstable cases, and it was found to be capable of detecting wide-area instability and thus of answering the question of when the system should be separated. (2) Where to separate: The agglomerative clustering technique was used to find groups of coherent buses. The lines connecting different groups of coherent buses form the separation surface. The rate of change of the bus voltage phase angles was used as the input to this technique. This technique has the potential to exactly identify the lines to be tripped for system separation. (3) What to do after separation: Load shedding approximately equal to the sum of the power flows along the candidate separation lines should be performed before tripping these lines. It is therefore recommended that load shedding be initiated before tripping the lines for system separation.
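The sketch below illustrates the anomaly-detection idea in step (1): train a one-class SVM on feature vectors derived from synchrophasor measurements during stable operation, then flag wide-area instability when new samples fall outside the learned region. The feature construction (rates of change of bus voltage phase angles) and all parameter values are illustrative assumptions, not the exact settings of the work described above.

```python
# Hedged OC-SVM sketch for flagging wide-area instability from synchrophasor features.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)
n_buses = 10

# Training data: phase-angle rate-of-change vectors (deg/s) during stable operation.
stable = rng.normal(0.0, 0.05, size=(500, n_buses))

detector = OneClassSVM(kernel="rbf", nu=0.01, gamma="scale").fit(stable)

# New snapshots: one stable-looking sample and one with large, diverging angle rates.
stable_sample = rng.normal(0.0, 0.05, size=(1, n_buses))
unstable_sample = rng.normal(2.0, 0.5, size=(1, n_buses))

print(detector.predict(stable_sample))    # +1 -> normal operation
print(detector.predict(unstable_sample))  # -1 -> flagged as wide-area instability
```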

Relevance: 30.00%

Abstract:

In-cylinder pressure transducers have been used for decades to record combustion pressure inside a running engine. However, due to the extreme operating environment, transducer design and installation must be considered in order to minimize measurement error. One such error is caused by thermal shock, where the pressure transducer experiences a high heat flux that can distort the transducer diaphragm and also change the crystal sensitivity. This research focused on investigating the effects of thermal shock on in-cylinder pressure transducer data quality using a 2.0 L, four-cylinder, spark-ignited, direct-injected, turbocharged GM engine. Cylinder four was modified with five ports to accommodate pressure transducers of different manufacturers. They included an AVL GH14D, an AVL GH15D, a Kistler 6125C, and a Kistler 6054AR. The GH14D, GH15D, and 6054AR were M5-size transducers; the 6125C was a larger, 6.2 mm transducer. Note that both of the AVL pressure transducers utilized a PH03 flame arrestor. Sweeps of ignition timing (spark sweep), engine speed, and engine load were performed to study the effects of thermal shock on each pressure transducer. The project consisted of two distinct phases: experimental engine testing and simulation using a commercially available software package. A comparison was performed to characterize the quality of the data between the actual cylinder pressure and the simulated results; this comparison was valuable because the simulation results did not include thermal shock effects. All three sets of tests showed that the peak cylinder pressure was essentially unaffected by thermal shock, and comparison of the experimental data with the simulated results showed very good correlation. The spark sweep was performed at 1300 RPM and 3.3 bar NMEP and showed that the differences between the simulated results (no thermal shock) and the experimental data for the indicated mean effective pressure (IMEP) and the pumping mean effective pressure (PMEP) were significantly less than the published accuracies. All transducers had an IMEP percent difference of less than 0.038% and less than 0.32% for PMEP; Kistler and AVL publish that the accuracy of their pressure transducers is within plus or minus 1% for the IMEP (AVL 2011; Kistler 2011). In addition, the difference in average exhaust absolute pressure between the simulated results and experimental data was the greatest for the two Kistler pressure transducers; the location and the lack of a flame arrestor are believed to be the cause of the increased error. For the engine speed sweep, the torque output was held constant at 203 Nm (150 ft-lbf) from 1500 to 4000 RPM. The difference in IMEP was less than 0.01% and in PMEP less than 1%, except for the AVL GH14D (5%) and the AVL GH15DK (2.25%). A noticeable error in PMEP appeared as the load increased during the engine speed sweeps, as expected. The load sweep was conducted at 2000 RPM over a range of NMEP from 1.1 to 14 bar. The differences in IMEP values were less than 0.08%, while the PMEP values were below 1%, except for the AVL GH14D (1.8%) and the AVL GH15DK (1.25%). In-cylinder pressure transducer data quality was effectively analyzed using a combination of experimental data and simulation results. Several criteria can be used to investigate the impact of thermal shock on data quality as well as to determine the best location and thermal protection for various transducers.
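The headline comparison metric above, IMEP, is by definition the cyclic integral of p dV divided by the displacement volume. The sketch below computes it from sampled pressure and volume traces; the toy traces are made up and are not engine data.

```python
# Hedged sketch of the IMEP calculation from crank-angle-resolved p and V.
import numpy as np

def imep_kpa(pressure_kpa, volume_m3, displacement_m3):
    """Net IMEP over one full cycle from sampled p(theta) and V(theta)."""
    p_mid = 0.5 * (pressure_kpa[1:] + pressure_kpa[:-1])        # trapezoid rule for the
    work_j = np.sum(p_mid * 1e3 * np.diff(volume_m3))           # cyclic integral of p dV (J)
    return work_j / displacement_m3 / 1e3                       # J/m^3 -> kPa

# Toy 720-degree cycle for one cylinder of a 2.0 L four-cylinder engine
# (0.5 L displacement per cylinder, 0.05 L clearance volume).
theta = np.linspace(0.0, 4.0 * np.pi, 1441)                     # crank angle, TDC at 0 and 2*pi
volume = 0.05e-3 + 0.25e-3 * (1.0 - np.cos(theta))              # m^3
pressure = 100.0 + 3000.0 * np.exp(-((theta - (2.0 * np.pi + 0.3)) / 0.5) ** 2)  # kPa, crude firing peak
print(f"IMEP = {imep_kpa(pressure, volume, 0.5e-3):.0f} kPa")
```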

Relevance: 30.00%

Abstract:

BACKGROUND: Bleeding is a frequent complication during surgery. The intraoperative administration of blood products, including packed red blood cells, platelets and fresh frozen plasma (FFP), is often life saving. Complications of blood transfusions contribute considerably to perioperative costs, and blood product resources are limited. Consequently, strategies to optimize the decision to transfuse are needed. Bleeding during surgery is a dynamic process and may result in major blood loss and coagulopathy due to dilution and consumption. The indication for transfusion should be based on reliable coagulation studies. While hemoglobin levels and platelet counts are available within 15 minutes, standard coagulation studies require one hour. Therefore, the decision to administer FFP has to be made in the absence of coagulation data. Point-of-care testing of prothrombin time ensures that one major parameter of coagulation is available in the operating theatre within minutes. It is fast, easy to perform, inexpensive and may enable physicians to rationally determine the need for FFP. METHODS/DESIGN: The objective of the POC-OP trial is to determine the effectiveness of point-of-care prothrombin time testing to reduce the administration of FFP. It is a patient- and assessor-blind, single-center, randomized controlled parallel-group trial in 220 patients aged between 18 and 90 years undergoing major surgery (any type, except cardiac surgery and liver transplantation) with an estimated blood loss during surgery exceeding 20% of the calculated total blood volume or a requirement of FFP according to the judgment of the physicians in charge. Patients are randomized to usual care plus point-of-care prothrombin time testing or to usual care alone without point-of-care testing. The primary outcome is the relative risk of receiving any FFP perioperatively. The inclusion of 110 patients per group will yield more than 80% power to detect a clinically relevant relative risk of 0.60 for receiving FFP in the experimental group as compared with the control group. DISCUSSION: Point-of-care prothrombin time testing in the operating theatre may reduce the administration of FFP considerably, which in turn may decrease the costs and complications usually associated with the administration of blood products. TRIAL REGISTRATION: NCT00656396.
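As a rough illustration of the power statement above (110 patients per group, more than 80% power, relative risk 0.60), the sketch below runs a generic two-proportion power calculation. The control-group FFP rate is not given in the abstract; the 60% figure is purely an illustrative assumption, so the printed power will not exactly match the trial's own calculation.

```python
# Hedged two-proportion power sketch; the control-group rate is an assumption.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p_control = 0.60                          # assumed FFP rate under usual care (illustrative)
p_experimental = 0.60 * p_control         # relative risk of 0.60 -> 36%

effect = abs(proportion_effectsize(p_experimental, p_control))   # Cohen's h
power = NormalIndPower().power(effect_size=effect, nobs1=110, alpha=0.05, ratio=1.0)
print(f"approximate power with 110 patients per group: {power:.2f}")
```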

Relevance: 30.00%

Abstract:

Epidemiological data on halitosis are rare. In this study we evaluated the prevalence of halitosis in the population of the city of Bern, Switzerland, using a standardized questionnaire and clinical examination. First, a standardized questionnaire was filled out by all 419 participants. In the clinical examination, 'objective' values for halitosis were gathered through two different organoleptic assessments and by the measurement of volatile sulfur compounds (VSC). Additionally, tongue coating and the modified periodontal screening index (PSI) were evaluated for each participant. The questionnaire revealed that 32% of all subjects sometimes or often experienced halitosis. The organoleptic evaluation (grades 0-5) identified 48 persons with grade 3 or higher. Measurement of VSC identified 117 subjects (28%) with readings of ≥75 parts per billion (ppb). Tongue coating, modified PSI, and smoking were significantly associated with higher organoleptic scores, and tongue coating and smoking were associated with higher VSC values. For about one-third of the Bernese city population, halitosis seems to pose an oral health problem. Only a weak correlation between self-reported halitosis and either organoleptic or VSC measurements could be detected.

Relevance: 30.00%

Abstract:

BACKGROUND: Sound epidemiologic data on halitosis are rare. We evaluated the prevalence of halitosis in a young male adult population in Switzerland using a standardized questionnaire and clinical examination. METHODS: Six hundred twenty-six Swiss Army recruits aged 18 to 25 years (mean: 20.3 years) were selected as study subjects. First, a standardized questionnaire focusing on dental hygiene, self-reported halitosis, smoking, and alcohol consumption was filled out by all participants. In the clinical examination, objective values for the presence of halitosis were gathered through an organoleptic assessment of the breath odor and the measurement of volatile sulfur compounds (VSCs). Additionally, tongue coating, plaque index, and probing depths were evaluated for each recruit. RESULTS: The questionnaire revealed that only 17% of all included recruits had never experienced halitosis. The organoleptic evaluation (grades 0 to 3) identified eight persons with grade 3, 148 persons with grade 2, and 424 persons with grade 1 or 0. The calculation of the Pearson correlation coefficient to evaluate the relationship among the three methods of assessing halitosis revealed little to no correlation. The organoleptic score showed high reproducibility (kappa = 0.79). Tongue coating was the only influencing factor found to contribute to higher organoleptic scores and higher VSC values. CONCLUSIONS: Oral malodor seemed to pose an oral health problem for about one-fifth of 20-year-old Swiss males questioned. No correlation between self-reported halitosis and organoleptic or VSC measurements could be detected. Although the organoleptic method described here offers a high reproducibility, the lack of correlation between VSC values and organoleptic scores has to be critically addressed. For further studies assessing new organoleptic scores, a validated index should always be included as a direct control.

Relevance: 30.00%

Abstract:

Absolute quantitation of clinical ¹H-MR spectra is virtually always incomplete for single subjects because the separate determination of spectrum, baseline, and transverse and longitudinal relaxation times in single subjects is prohibitively long. Integrated Processing and Acquisition of Data (IPAD), based on a combined 2-dimensional experimental and fitting strategy, is suggested to substantially improve the information content from a given measurement time. A series of localized saturation-recovery spectra was recorded and combined with 2-dimensional prior-knowledge fitting to simultaneously determine metabolite T1 (from analysis of the saturation-recovery time course), metabolite T2 (from lineshape analysis based on metabolite and water peak shapes), the macromolecular baseline (based on T1 differences and analysis of the saturation-recovery time course), and metabolite concentrations (using prior-knowledge fitting and conventional procedures of absolute standardization). The procedure was tested on metabolite solutions and applied in 25 subjects (15-78 years old). Metabolite content was comparable to previously reported values. Interindividual variation was larger than intraindividual variation in repeated spectra for metabolite content as well as for some relaxation times. Relaxation times were different for various metabolite groups. Part of the interindividual variation could be explained by a significant age dependence of relaxation times.
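The T1 part of the analysis described above rests on the standard saturation-recovery relation S(TR) = S0*(1 - exp(-TR/T1)). The sketch below fits that single curve to made-up amplitudes; the actual IPAD scheme fits a full 2-dimensional prior-knowledge model rather than one curve per metabolite.

```python
# Hedged sketch of a saturation-recovery T1 fit on hypothetical data.
import numpy as np
from scipy.optimize import curve_fit

def saturation_recovery(tr, s0, t1):
    return s0 * (1.0 - np.exp(-tr / t1))

tr = np.array([0.2, 0.5, 1.0, 2.0, 4.0, 8.0])            # recovery times in seconds
signal = np.array([0.13, 0.30, 0.50, 0.75, 0.93, 0.99])  # hypothetical normalized amplitudes

(p_s0, p_t1), _ = curve_fit(saturation_recovery, tr, signal, p0=[1.0, 1.0])
print(f"fitted S0 = {p_s0:.2f}, T1 = {p_t1:.2f} s")
```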

Relevance: 30.00%

Abstract:

Because of the important morbidity and mortality associated with osteoporosis, it is essential to detect subjects at risk by screening methods such as bone quantitative ultrasound (QUS). Several studies have shown that QUS can predict fractures. None, however, has prospectively compared different QUS devices, and few quality control (QC) data have been published. The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk is a prospective multicenter study that compared three QUS devices for the assessment of hip fracture risk in a population of 7609 women aged ≥70 yr. Because the inclusion phase lasted 20 months and 10 centers participated in the study, QC became a major issue. We therefore developed a QC procedure to assess the stability and precision of the devices and for their cross-calibration. Our study focuses on the two heel QUS devices. The water-bath system (Achilles+) had a higher precision than the dry system (Sahara). The QC results were highly dependent on temperature. QUS stability was acceptable, but the Sahara must be calibrated regularly. Sufficient homogeneity among all the Sahara devices could be demonstrated, whereas significant differences were found among the Achilles+ devices. For speed of sound, 52% of the differences among the Achilles+ devices were explained by the water's temperature. However, for broadband ultrasound attenuation, a maximal difference of 23% persisted after adjustment for temperature. Because such differences could influence measurements in vivo, it is crucial to develop standardized phantoms to be used in prospective multicenter studies.

Relevance: 30.00%

Abstract:

The variability of toxicity data contained within databases was investigated using the widely used US EPA ECOTOX database as an example. Fish acute lethality (LC50) values for 44 compounds (for which at least 10 data entries existed) were extracted from the ECOTOX database yielding a total of 4654 test records. Significant variability of LC50 test results was observed, exceeding several orders of magnitude. In an attempt to systematically explore potential causes of the data variability, the influence of biological factors (such as test species or life stages) and physical factors (such as water temperature, pH or water hardness) were examined. Even after eliminating the influence of these inherent factors, considerable data variability remained, suggesting an important role of factors relating to technical and measurement procedures. The analysis, however, was limited by pronounced gaps in the test documentation. Of the 4654 extracted test reports, 66.5% provided no information on the fish life stage used for testing. Likewise, water temperature, hardness or pH were not recorded in 19.6%, 48.2% and 41.2% of the data entries, respectively. From these findings, we recommend the rigorous control of data entries ensuring complete recording of testing conditions. A more consistent database will help to better discriminate between technical and natural variability of the test data, which is of importance in ecological risk assessment for extrapolation from laboratory tests to the field, and also might help to develop correction factors that account for systematic differences in test results caused by species, life stage or test conditions.
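The sketch below shows one simple way to quantify the kind of variability the abstract describes: for each compound, compute the spread of log10(LC50) across test records, so a spread of 2 corresponds to values differing by two orders of magnitude. The column names and values are hypothetical, not the actual ECOTOX export schema or data.

```python
# Hedged sketch of a per-compound log10(LC50) spread summary on made-up records.
import numpy as np
import pandas as pd

records = pd.DataFrame({
    "compound": ["A", "A", "A", "B", "B", "B"],
    "lc50_mg_per_l": [0.02, 0.5, 3.0, 12.0, 15.0, 9.0],   # made-up fish 96-h LC50 values
})

records["log10_lc50"] = np.log10(records["lc50_mg_per_l"])
spread = (records.groupby("compound")["log10_lc50"]
                 .agg(n="count", orders_of_magnitude=lambda s: s.max() - s.min()))
print(spread)
```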