973 results for Data Coordinating Center


Relevance: 30.00%

Abstract:

This research examines the prevalence of alcohol and illicit substance use in the United States and Mexico and the associated socio-demographic characteristics. The sources of data for this study are public domain data from the U.S. National Household Survey of Drug Abuse, 1988 (n = 8,814), and the Mexican National Survey of Addictions, 1988 (n = 12,579). In addition, this study discusses methodologic issues in the cross-cultural and cross-national comparison of behavioral and epidemiologic data from population-based samples. The extent to which patterns of substance abuse vary among subgroups of the U.S. and Mexican populations is assessed, as well as the comparability and equivalence of measures of alcohol and drug use in these national samples. The prevalence of alcohol use was broadly similar in the two countries for all three measures of use, lifetime, past year, and past year heavy use: 85.0%, 68.1%, and 39.6% for the U.S., and 72.6%, 47.7%, and 45.8% for Mexico, respectively. The use of illegal substances varied widely between countries, with U.S. respondents reporting significantly higher levels of use than their Mexican counterparts. For example, reported use of any illicit substance in lifetime and past year was 34.2% and 11.6% for the U.S., and 3.3% and 0.6% for Mexico. Despite these differences in prevalence, two demographic characteristics, gender and age, were important correlates of use in both countries. Men in both countries were more likely than women to report use of alcohol and illicit substances. Generally speaking, a greater proportion of respondents 18 years of age or older in both countries reported use of alcohol on all three measures than younger respondents, and a greater proportion of respondents between the ages of 18 and 34 years reported lifetime and past-year use of illicit substances than any other age group. Additional substantive research investigating population-based samples and at-risk subgroups is needed to understand the underlying mechanisms of these associations. Further development of cross-culturally meaningful survey methods is warranted to validate comparisons of substance use across countries and societies.

Relevance: 30.00%

Abstract:

Cancer is a chronic disease that often necessitates recurrent hospitalizations, a costly pattern of medical care utilization. In chronically ill patients, most readmissions are for treatment of the same condition that caused the preceding hospitalization. There is concern that rather than reducing costs, earlier discharge may shift costs from the initial hospitalization to emergency center visits. This is the first descriptive study to measure the incidence of emergency center visits (ECVs) after hospitalization at The University of Texas M. D. Anderson Cancer Center (UTMDACC), to identify the risk factors for and outcomes of these ECVs, and to compare 30-day all-cause mortality and costs for episodes of care with and without ECVs. We identified all hospitalizations at UTMDACC with admission dates from September 1, 1993 through August 31, 1997 that met the inclusion criteria. Data were obtained electronically, primarily from UTMDACC's institutional database. Demographic factors, clinical factors, duration of the index hospitalization, method of payment for care, and year of hospitalization were determined for each hospitalization. The overall incidence of ECVs was 18%. Forty-five percent of ECVs resulted in hospital readmission (8% of all hospitalizations). In 1% of ECVs the patient died in the emergency center, and in the remaining 54% of ECVs the patient was discharged home. Risk factors for ECVs were marital status, type of index hospitalization, cancer type, and duration of the index hospitalization. The overall 30-day all-cause mortality rate was 8.6% for hospitalizations with an ECV and 5.3% for those without. In all subgroups, the 30-day all-cause mortality rate was higher for groups with ECVs than for those without. The most important factor increasing cost was having an ECV: in every patient subgroup, the cost per episode of care with an ECV was at least 1.9 times the cost per episode without one. The higher costs and poorer outcomes of episodes of care with ECVs and hospital readmissions suggest that interventions to avoid these ECVs or mitigate their costs are needed. Further research is also needed to clarify the methodological issues involved in studying health care utilization by cancer patients.

Relevance: 30.00%

Abstract:

Most statistical analysis, in both theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful, as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme to capture the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often fail to reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application to a small-sample, normally distributed, repeated-measures growth curve data set is presented.
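
As an illustration of the kind of Gibbs computation the abstract refers to (not the dissertation's actual model), the following minimal Python sketch runs conjugate Gibbs updates for a normal random-intercept model of repeated measures; the simulated data, prior settings, and iteration counts are all assumptions chosen for the example.

```python
# Minimal Gibbs sampler for a normal random-intercept model:
#   y[i, j] ~ N(mu + b[i], sigma2),  b[i] ~ N(0, tau2)
# Flat prior on mu; vague InvGamma(0.01, 0.01) priors on sigma2 and tau2 (assumed).
import numpy as np

rng = np.random.default_rng(0)

# Simulated small-sample repeated-measures data: m subjects, n occasions each.
m, n = 10, 4
true_b = rng.normal(0.0, 1.0, size=m)
y = 5.0 + true_b[:, None] + rng.normal(0.0, 0.5, size=(m, n))

mu, sigma2, tau2 = 0.0, 1.0, 1.0
b = np.zeros(m)
a0 = b0 = 0.01  # inverse-gamma hyperparameters (an assumption)

draws = []
for it in range(2000):
    # b[i] | rest: precision-weighted combination of data and prior.
    prec = n / sigma2 + 1.0 / tau2
    mean = ((y - mu).sum(axis=1) / sigma2) / prec
    b = rng.normal(mean, np.sqrt(1.0 / prec))

    # mu | rest (flat prior): normal around the mean residual.
    resid = y - b[:, None]
    mu = rng.normal(resid.mean(), np.sqrt(sigma2 / (m * n)))

    # sigma2 | rest ~ InvGamma; sampled as the reciprocal of a gamma draw.
    sse = ((y - mu - b[:, None]) ** 2).sum()
    sigma2 = 1.0 / rng.gamma(a0 + m * n / 2.0, 1.0 / (b0 + sse / 2.0))

    # tau2 | rest ~ InvGamma.
    tau2 = 1.0 / rng.gamma(a0 + m / 2.0, 1.0 / (b0 + (b ** 2).sum() / 2.0))

    if it >= 1000:  # discard burn-in
        draws.append((mu, sigma2, tau2))

print(np.mean(draws, axis=0))  # posterior means of (mu, sigma2, tau2)
```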

Relevance: 30.00%

Abstract:

Many studies in biostatistics deal with binary data. Some of these studies involve correlated observations, which can complicate the analysis of the resulting data. Studies of this kind typically arise when a high degree of commonality exists between test subjects; two examples are measurements on identical twins and studies of symmetrical organs or appendages, as in ophthalmic studies. If a natural hierarchy exists in the data, multilevel analysis is an appropriate tool. Although this type of matching appears ideal for the purposes of comparison, analyzing the resulting data while ignoring the effect of intra-cluster correlation has been shown to produce biased results. This paper explores multilevel modeling of simulated binary data with predetermined levels of correlation. Data are generated using the Beta-Binomial method with varying degrees of correlation between the lower-level observations and are analyzed using the multilevel software package MLwiN (Woodhouse et al., 1995). Comparisons between the specified intra-cluster correlation of these data and the correlations estimated by multilevel analysis are used to examine the accuracy of this technique for analyzing data of this type.
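
The Beta-Binomial generation step is easy to reproduce. Below is a Python sketch (the study itself used MLwiN, not this code) that simulates clustered binary data with a specified intra-cluster correlation, using the fact that for cluster risks drawn from Beta(α, β) the ICC equals 1/(α + β + 1), and checks the result with a standard ANOVA-type estimator; all sample sizes and parameter values are illustrative.

```python
# Generate clustered binary data with a specified intra-cluster correlation
# via the Beta-Binomial construction, then recover the ICC from the data.
import numpy as np

rng = np.random.default_rng(42)

def beta_binomial_clusters(k, n, pi, rho):
    """k clusters of size n, marginal prevalence pi, intra-cluster corr rho."""
    s = (1.0 - rho) / rho                 # alpha + beta implied by the target ICC
    alpha, beta = pi * s, (1.0 - pi) * s
    p = rng.beta(alpha, beta, size=k)     # cluster-level risks
    return rng.binomial(1, p[:, None], size=(k, n))

def anova_icc(y):
    """ANOVA estimator of the ICC for equal-sized clusters."""
    k, n = y.shape
    cluster_means = y.mean(axis=1)
    msb = n * ((cluster_means - y.mean()) ** 2).sum() / (k - 1)
    msw = ((y - cluster_means[:, None]) ** 2).sum() / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

# Clusters of size 2, as with twins or pairs of eyes.
y = beta_binomial_clusters(k=500, n=2, pi=0.3, rho=0.4)
print(anova_icc(y))  # should be near the specified rho = 0.4
```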

Relevance: 30.00%

Abstract:

(1) A mathematical theory for computing the probabilities of various nucleotide configurations is developed, and the probability of obtaining the correct phylogenetic tree (model tree) from sequence data is evaluated for six tree-making methods (UPGMA, the distance Wagner method, the transformed distance method, the Fitch-Margoliash method, the maximum parsimony method, and the compatibility method). The number of nucleotides (m*) necessary to obtain the correct tree with a probability of 95% is estimated with special reference to the human, chimpanzee, and gorilla divergence. m* is at least 4,200, but the availability of outgroup species greatly reduces m* for all methods except UPGMA. m* increases if transitions occur more frequently than transversions, as in the case of mitochondrial DNA. (2) A new tree-making method called the neighbor-joining method is proposed. This method is applicable to either distance data or character-state data. Computer simulation has shown that, when distance data are used, the neighbor-joining method is generally better than UPGMA, Farris' method, Li's method, and the modified Farris method at recovering the true topology. A related method, the simultaneous partitioning method, is also discussed. (3) The maximum likelihood (ML) method for phylogeny reconstruction under the assumptions of both constant and varying evolutionary rates is studied, and a new algorithm for obtaining the ML tree is presented. This method gives a tree similar to that obtained by UPGMA when a constant evolutionary rate is assumed, whereas it gives a tree similar to those obtained by the maximum parsimony and neighbor-joining methods when varying evolutionary rates are assumed.
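
For readers unfamiliar with the neighbor-joining method named in part (2), the following Python sketch shows a single agglomeration step on a small, made-up distance matrix; the Q-criterion and branch-length formulas are the standard ones, while the matrix itself is purely illustrative.

```python
# One agglomeration step of the neighbor-joining method: choose the pair
# (i, j) minimizing Q(i, j) = (n - 2) d(i, j) - R_i - R_j, where R_i is the
# row sum of the distance matrix, then compute the two new branch lengths.
import numpy as np

# Illustrative distance matrix for four taxa A, B, C, D (made up).
d = np.array([[0.,  5.,  9.,  9.],
              [5.,  0., 10., 10.],
              [9., 10.,  0.,  8.],
              [9., 10.,  8.,  0.]])
n = d.shape[0]
r = d.sum(axis=1)

q = (n - 2) * d - r[:, None] - r[None, :]
np.fill_diagonal(q, np.inf)                    # ignore self-pairs
i, j = np.unravel_index(np.argmin(q), q.shape)

# Branch lengths from the joined pair to the new internal node u.
li = 0.5 * d[i, j] + (r[i] - r[j]) / (2 * (n - 2))
lj = d[i, j] - li

# Distances from u to the other taxa (rows/columns i and j are then removed).
du = 0.5 * (d[i, :] + d[j, :] - d[i, j])

print(i, j, li, lj)  # repeating this step until two nodes remain yields the tree
```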

Relevance: 30.00%

Abstract:

Electrophysiological experiments were performed on 96 male New Zealand white rabbits anesthetized with urethane. Glass electrodes filled with 2 M NaCl were used for microstimulation of three fiber pathways projecting from "limbic" centers to the ventromedial nucleus of the hypothalamus (VMH), and unitary and field potential recordings were made in the VMH after stimulation. Stimulation of the lateral portion of the fimbria, which carries fibers from the ventral subiculum of the hippocampal formation, evokes predominantly an inhibition of neurons located medially in the VMH and an excitation of neurons located laterally. Stimulation of the dorsal portion of the stria terminalis, which carries fibers from the cortical nucleus of the amygdala, likewise produces predominantly inhibition of cells medially and excitation laterally. Stimulation of the ventral component of the stria terminalis, which carries fibers from the medial nucleus of the amygdala, evokes excitation of cells medially, with little or no response seen laterally. Cells recorded medially in the VMH received convergent inputs from each of the three fiber systems: inhibition from fimbria and dorsal stria stimulation, and excitation from ventral stria stimulation. The excitatory unitary responses recorded medially to ventral stria stimulation, and laterally to fimbria and dorsal stria stimulation, were subjected to a series of threshold stimulus intensities. From these tests it was determined that each of these three projections terminates monosynaptically on VMH neurons. The evidence for convergence upon single VMH neurons of projections from the amygdala and the hippocampal formation suggests that this area of the brain is important for integrating information from these two limbic centers. The VMH has been implicated in a number of behavioral states (eating, reproduction, defense, and aggression) and has further been linked to control of the anterior pituitary. These data provide a functional circuit through which the amygdaloid complex and the hippocampal formation can channel information from higher cortical centers into a hypothalamic area capable of coordinating behavioral and hormonal responses.

Relevance: 30.00%

Abstract:

A generic search for anomalous production of events with at least three charged leptons is presented. The search uses a pp collision data sample at a center-of-mass energy of √s = 7 TeV, corresponding to 4.6 fb⁻¹ of integrated luminosity collected in 2011 by the ATLAS detector at the CERN Large Hadron Collider. Events are required to contain at least two electrons or muons, while the third lepton may either be an additional electron or muon, or a hadronically decaying tau lepton. Events are categorized by the presence or absence of a reconstructed tau-lepton or Z-boson candidate decaying to leptons. No significant excess above backgrounds expected from Standard Model processes is observed. Results are presented as upper limits on event yields from non-Standard-Model processes producing at least three prompt, isolated leptons, given as functions of lower bounds on several kinematic variables. Fiducial efficiencies for model testing are also provided. The use of the results is illustrated by setting upper limits on the production of doubly charged Higgs bosons decaying to same-sign lepton pairs.

Relevance: 30.00%

Abstract:

BACKGROUND AND PURPOSE Intensity-modulated radiotherapy (IMRT) credentialing for an EORTC study was performed using an anthropomorphic head phantom from the Radiological Physics Center (RPC; RPC(PH)). Institutions were retrospectively requested to irradiate their institutional phantom (INST(PH)) using the same treatment plan, in the framework of a Virtual Phantom Project (VPP) for IMRT credentialing. MATERIALS AND METHODS The CT data set of the institutional phantom and the measured 2D dose matrices were requested from centers and sent to a dedicated secure EORTC uploader. Data from the RPC(PH) and INST(PH) were then centrally analyzed and intercompared by the QA team using commercially available software (RIT, ver. 5.2; Colorado Springs, USA). RESULTS Eighteen institutions participated in the VPP. The measurements of 6 (33%) institutions could not be analyzed centrally. All other centers passed both the VPP and the RPC ±7%/4 mm credentialing criteria. At the 5%/5 mm gamma criterion (90% of pixels passing), 11 (92%) centers passed the credentialing process with the RPC(PH), compared with 12 (100%) with the INST(PH) (p = 0.29). The corresponding numbers for the 3%/3 mm gamma criterion (90% of pixels passing) were 2 (17%) and 9 (75%) centers (p = 0.01). CONCLUSIONS IMRT dosimetry gamma evaluations in a single plane for an H&N prospective trial using the INST(PH) measurements showed agreement at the gamma index criterion of ±5%/5 mm (90% of pixels passing) for a small number of VPP measurements. Using more stringent criteria, the RPC(PH) and INST(PH) comparison showed disagreement. More data are warranted and urgently required within the framework of prospective studies.
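
The gamma criteria quoted above combine a dose-difference tolerance with a distance-to-agreement tolerance. As a rough illustration (the study used the commercial RIT software, not this code), a brute-force global 2D gamma evaluation might look like the following Python sketch; the grid size, dose distributions, and normalization to the reference maximum are assumptions.

```python
# Brute-force 2D gamma index: a measured point passes if some reference
# point is simultaneously close in dose and in distance, i.e.
#   min over r of sqrt(dist^2 / DTA^2 + dose_diff^2 / DD^2) <= 1.
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dd_pct=5.0, dta_mm=5.0):
    """Global gamma: dose criterion as a percent of the reference maximum."""
    ny, nx = ref.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    pts = np.stack([yy.ravel(), xx.ravel()], axis=1) * spacing_mm
    dose_tol = dd_pct / 100.0 * ref.max()

    gamma = np.empty(meas.size)
    for k, (p, dm) in enumerate(zip(pts, meas.ravel())):
        dist2 = ((pts - p) ** 2).sum(axis=1) / dta_mm ** 2
        dose2 = (ref.ravel() - dm) ** 2 / dose_tol ** 2
        gamma[k] = np.sqrt((dist2 + dose2).min())
    return (gamma <= 1.0).mean()

# Toy example: a Gaussian "dose" plane and a 3%-rescaled copy of it.
y, x = np.mgrid[0:50, 0:50]
ref = np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 200.0)
meas = 1.03 * ref
print(gamma_pass_rate(ref, meas, spacing_mm=1.0))  # 5%/5 mm criterion
```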

Relevance: 30.00%

Abstract:

Therapeutic resistance remains the principal problem in acute myeloid leukemia (AML). We used areas under receiver operating characteristic curves (AUCs) to quantify our ability to predict therapeutic resistance in individual patients, where AUC = 1.0 denotes perfect prediction and AUC = 0.5 denotes a coin flip, using data from 4601 patients with newly diagnosed AML given induction therapy with 3+7 or more intense standard regimens in UK Medical Research Council/National Cancer Research Institute, Dutch–Belgian Cooperative Trial Group for Hematology/Oncology/Swiss Group for Clinical Cancer Research, US cooperative group SWOG, and MD Anderson Cancer Center studies. Age, performance status, white blood cell count, secondary disease, cytogenetic risk, and FLT3-ITD/NPM1 mutation status were each independently associated with failure to achieve complete remission despite no early death (‘primary refractoriness’). However, the AUC of a bootstrap-corrected multivariable model predicting this outcome was only 0.78, indicating only fair predictive ability. Removal of FLT3-ITD and NPM1 information only slightly decreased the AUC (0.76). Prediction of resistance, defined as primary refractoriness or short relapse-free survival, was even more difficult. Our limited ability to forecast resistance based on routinely available pretreatment covariates provides a rationale for continued randomization between standard and new therapies and supports further examination of genetic and posttreatment data to optimize resistance prediction in AML.
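
For readers unfamiliar with the AUC scale used above, the statistic can be computed directly from ranks: it equals the probability that a randomly chosen resistant patient receives a higher predicted risk than a randomly chosen non-resistant one. A minimal Python sketch with simulated data (not the study's model or data) follows.

```python
# Rank-based (Mann-Whitney) computation of the AUC.
import numpy as np
from scipy.stats import rankdata

def auc(y_true, risk_score):
    """AUC via the Mann-Whitney U statistic; ties get average ranks."""
    y_true = np.asarray(y_true, dtype=bool)
    ranks = rankdata(risk_score)
    n_pos, n_neg = y_true.sum(), (~y_true).sum()
    return (ranks[y_true].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(1)
y = rng.binomial(1, 0.3, size=1000)          # 1 = primary refractory (simulated)
score = 0.8 * y + rng.normal(size=1000)      # deliberately imperfect predictor
print(auc(y, score))  # moderate discrimination, well below the perfect 1.0
```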

Relevance: 30.00%

Abstract:

This paper addresses the problem of fully automatic localization and segmentation of 3D intervertebral discs (IVDs) from MR images. Our method consists of two steps: we first localize the center of each IVD, and then segment the IVDs by classifying image pixels around each disc center as foreground (disc) or background. The disc localization is done by estimating the image displacements from a set of randomly sampled 3D image patches to the disc center. The image displacements are estimated by jointly optimizing the training and test displacement values in a data-driven way, taking into consideration both the training data and the geometric constraint on the test image. After the disc centers are localized, we segment the discs by classifying image pixels around the disc centers as background or foreground. The classification uses a data-driven approach similar to the one used for localization, except that in the segmentation case we aim to estimate the foreground/background probability of each pixel instead of the image displacements. In addition, an extra neighborhood smoothness constraint is introduced to enforce local smoothness of the label field. Our method is validated on 3D T2-weighted turbo spin echo MR images of 35 patients from two different studies. Experiments show that our method achieves results better than or comparable to the state of the art. Specifically, we achieve a mean localization error of 1.6-2.0 mm, and for segmentation a mean Dice metric of 85%-88% and a mean surface distance of 1.3-1.4 mm.
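
Both reported evaluation metrics are standard and compact to state in code. The following Python sketch (not the authors' implementation) computes the Dice metric and a symmetric mean surface distance between two binary masks using distance transforms; the toy volumes and the voxel spacing are assumptions.

```python
# Dice overlap and symmetric mean surface distance for binary segmentations.
import numpy as np
from scipy import ndimage

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * (a & b).sum() / (a.sum() + b.sum())

def mean_surface_distance(a, b, spacing=(1.0, 1.0, 1.0)):
    a, b = a.astype(bool), b.astype(bool)
    surf_a = a & ~ndimage.binary_erosion(a)   # boundary voxels of each mask
    surf_b = b & ~ndimage.binary_erosion(b)
    # Distance from every voxel to the nearest boundary voxel of the other mask.
    dt_a = ndimage.distance_transform_edt(~surf_a, sampling=spacing)
    dt_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
    return 0.5 * (dt_b[surf_a].mean() + dt_a[surf_b].mean())

# Toy example: two slightly shifted boxes inside a small volume.
vol_a = np.zeros((40, 40, 40), bool); vol_a[10:30, 10:30, 10:30] = True
vol_b = np.zeros((40, 40, 40), bool); vol_b[12:32, 10:30, 10:30] = True
print(dice(vol_a, vol_b), mean_surface_distance(vol_a, vol_b))
```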

Relevance: 30.00%

Abstract:

Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed that combine multiple predictors to produce results superior to those of single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models: an autoregressive model with output correction (cARX) and a recurrent neural network (RNN). Data fusion techniques based on (i) Dempster-Shafer Evidential Theory (DST), (ii) Genetic Algorithms (GA), and (iii) Genetic Programming (GP) were used to merge the complementary strengths of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance, with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before the occurrence of events were 13.0 and 12.1 min for hypoglycemic and hyperglycemic events, respectively. Compared to the cARX and RNN models, and to a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
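
Of the three fusion techniques, the Dempster-Shafer combination is the simplest to illustrate. The Python sketch below (not the paper's implementation) applies Dempster's rule to two mass functions restricted to singleton hypotheses over an assumed frame {hypo, normal, hyper}; the mass values are hypothetical.

```python
# Dempster's rule of combination for two mass functions over the same
# singleton hypotheses: agreeing mass is kept, conflicting mass is
# discarded, and the remainder is renormalized.
def dempster_combine(m1, m2):
    combined = {h: m1[h] * m2[h] for h in m1}   # mass where both agree
    k = 1.0 - sum(combined.values())            # conflict between the sources
    if k >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    return {h: m / (1.0 - k) for h, m in combined.items()}

# Hypothetical masses derived from the cARX and RNN predictions.
m_carx = {"hypo": 0.6, "normal": 0.3, "hyper": 0.1}
m_rnn  = {"hypo": 0.5, "normal": 0.4, "hyper": 0.1}
fused = dempster_combine(m_carx, m_rnn)
print(fused)  # agreement on "hypo" is reinforced: its fused mass exceeds both inputs
```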

Relevance: 30.00%

Abstract:

Purpose: Ophthalmologists are confronted with a set of different image modalities to diagnose eye tumors, e.g., fundus photography, CT, and MRI. However, these images are often complementary and represent pathologies differently; some aspects of tumors can only be seen in a particular modality. A fusion of modalities would improve the contextual information for diagnosis. The presented work attempts to register color fundus photography with MRI volumes, complementing the low-resolution 3D information in the MRI with high-resolution 2D fundus images. Methods: MRI volumes were acquired from 12 infants under the age of 5 with unilateral retinoblastoma. The contrast-enhanced T1-FLAIR sequence was performed with an isotropic resolution of less than 0.5 mm. Fundus images were acquired with a RetCam camera. For healthy eyes, two landmarks were used: the optic disk and the fovea. The eyes were detected and extracted from the MRI volume using a 3D adaptation of the Fast Radial Symmetry Transform (FRST), and the cropped volume was automatically segmented using the Split Bregman algorithm. The optic nerve was enhanced by a Frangi vessel filter; by intersecting the nerve with the retina, the optic disk was found. The fovea position was estimated by constraining it with the angle between the optic and visual axes as well as with the distance from the optic disk. The optical axis was detected automatically by fitting a parabola to the lens surface. On the fundus, the optic disk and the fovea were detected using the method of Budai et al. Finally, the image was projected onto the segmented surface using the lens position as the camera center. In tumor-affected eyes, the manually segmented tumors were used instead of the optic disk and macula for the registration. Results: In all 12 MRI volumes tested, all 24 eyes were detected correctly, including healthy and pathological cases. In healthy eyes the optic nerve head was found in all tested eyes, with an error of 1.08 ± 0.37 mm. A successful registration can be seen in figure 1. Conclusions: The presented method is a step toward automatic fusion of modalities in ophthalmology. The combination enhances the MRI volume with the higher resolution of the color fundus image on the retina. Tumor treatment planning is improved by avoiding critical structures, and disease progression monitoring is made easier.
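
The final projection step, using the lens position as the camera center, is essentially a pinhole projection. A minimal Python sketch of that general idea follows (not the authors' code); all coordinates, the focal length, and the axis conventions are made-up assumptions.

```python
# Pinhole projection of 3D retina surface points onto a 2D image plane,
# with the optical center c at the estimated lens position and the optical
# axis as the viewing direction.
import numpy as np

def project(points, c, axis, up, focal):
    """Project 3D points through camera center c onto an image plane."""
    z = axis / np.linalg.norm(axis)                # viewing direction
    x = np.cross(up, z); x /= np.linalg.norm(x)    # image-plane basis vectors
    y = np.cross(z, x)
    rel = points - c
    depth = rel @ z                                # distance along the axis
    return focal * np.stack([rel @ x, rel @ y], axis=1) / depth[:, None]

retina_pts = np.array([[1.0, 2.0, 20.0],           # segmented surface points
                       [-3.0, 0.5, 22.0]])         # (coordinates made up)
uv = project(retina_pts, c=np.zeros(3), axis=np.array([0.0, 0.0, 1.0]),
             up=np.array([0.0, 1.0, 0.0]), focal=17.0)
print(uv)  # 2D fundus-plane coordinates for texturing the segmented surface
```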

Relevance: 30.00%

Abstract:

A measurement of the production processes of the recently discovered Higgs boson is performed in the two-photon final state using 4.5 fb⁻¹ of proton-proton collision data at √s = 7 TeV and 20.3 fb⁻¹ at √s = 8 TeV collected by the ATLAS detector at the Large Hadron Collider. The number of observed Higgs boson decays to diphotons divided by the corresponding Standard Model prediction, called the signal strength, is found to be μ = 1.17 ± 0.27 at the value of the Higgs boson mass measured by ATLAS, m_H = 125.4 GeV. The analysis is optimized to measure the signal strengths for individual Higgs boson production processes at this value of m_H. They are found to be μ_ggF = 1.32 ± 0.38, μ_VBF = 0.8 ± 0.7, μ_WH = 1.0 ± 1.6, μ_ZH = 0.1 +3.7/−0.1, and μ_ttH = 1.6 +2.7/−1.8, for Higgs boson production through gluon fusion, vector-boson fusion, and in association with a W or Z boson or a top-quark pair, respectively. Compared with the previously published ATLAS analysis, the results reported here also benefit from a new energy calibration procedure for photons and the subsequent reduction of the systematic uncertainty on the diphoton mass resolution. No significant deviations from the predictions of the Standard Model are found.

Relevance: 30.00%

Abstract:

An improved measurement of the mass of the Higgs boson is derived from a combined fit to the reconstructed invariant mass spectra of the decay channels H→γγ and H→ZZ*→4ℓ. The analysis uses the pp collision data sample recorded by the ATLAS experiment at the CERN Large Hadron Collider at center-of-mass energies of 7 TeV and 8 TeV, corresponding to an integrated luminosity of 25 fb⁻¹. The measured value of the Higgs boson mass is m_H = 125.36 ± 0.37 (stat) ± 0.18 (syst) GeV. This result is based on improved energy-scale calibrations for photons, electrons, and muons as well as other analysis improvements, and supersedes the previous result from ATLAS. Upper limits on the total width of the Higgs boson are derived from fits to the invariant mass spectra of the H→γγ and H→ZZ*→4ℓ decay channels.

Relevance: 30.00%

Abstract:

BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, like a social security number, is not available, or non-unique person-identifiable information, like names, is privacy protected and cannot be accessed. A solution to protect privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely even if the plain names differ by only a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS In this Privacy Preserving Probabilistic Record Linkage method we apply a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption, and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e., the data structure) needed to create the templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with the encrypted person-identifiable information and the plain non-sensitive variables. RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods to preserve the privacy of the persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. This method is suitable not just for epidemiological research but for any setting with similar challenges.
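
The Bloom-filter step that makes similarity computation on encrypted names possible can be sketched compactly. The following Python example follows the general bigram-hashing construction (the parameters and the use of plain SHA-256 instead of keyed HMACs are simplifying assumptions, not the P3RL specification) and compares two encodings with the Dice coefficient.

```python
# Bloom-filter encoding of a name's character bigrams: each bigram is hashed
# with several seeded hash functions into a fixed-length bit set, and the
# similarity of two encodings is measured with the Dice coefficient.
import hashlib

BITS, N_HASH = 1000, 15  # illustrative filter length and hash count

def bigrams(name):
    padded = f"_{name.lower()}_"
    return {padded[i:i + 2] for i in range(len(padded) - 1)}

def bloom(name):
    bits = set()
    for gram in bigrams(name):
        for seed in range(N_HASH):  # real deployments would use keyed HMACs
            digest = hashlib.sha256(f"{seed}:{gram}".encode()).hexdigest()
            bits.add(int(digest, 16) % BITS)
    return bits

def dice(b1, b2):
    return 2 * len(b1 & b2) / (len(b1) + len(b2))

# A single-character difference still yields a high similarity score,
# which is what makes probabilistic linkage on encrypted names possible.
print(dice(bloom("meier"), bloom("meyer")))   # high (around 0.6-0.7)
print(dice(bloom("meier"), bloom("garcia")))  # low
```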