936 results for Consistent labeling
Abstract:
BACKGROUND Cardiac events (CEs) are among the most serious late effects following childhood cancer treatment. To establish accurate risk estimates for the occurrence of CEs, it is essential that they are graded in a valid and consistent manner, especially for international studies. We therefore developed a data-extraction form and a set of flowcharts to grade CEs and tested the validity and consistency of this approach in a series of patients. METHODS The Common Terminology Criteria for Adverse Events versions 3.0 and 4.0 were used to define the CEs. Forty patients were randomly selected from a cohort of 72 subjects with known CEs that had been graded by a physician for an earlier study. To establish whether the new method was valid for appropriate grading, a non-physician graded the CEs by using the new method. To evaluate consistency of the grading, the same charts were graded again by two other non-physicians, one receiving a brief introduction and one receiving extensive training on the new method. We calculated weighted Kappa statistics to quantify inter-observer agreement. RESULTS The inter-observer agreement was 0.92 (95% CI 0.80-1.00) for validity, and 0.88 (0.79-0.98) and 0.99 (0.96-1.00) for consistency with the outcome assessors who had the brief introduction and the extensive training, respectively. CONCLUSIONS The newly developed standardized method to grade CEs using data from medical records has shown excellent validity and consistency. The study showed that the method can be correctly applied by researchers without a medical background, provided that they receive adequate training.
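The weighted Kappa agreement statistic reported above can be sketched in a few lines; a minimal NumPy implementation assuming integer ordinal grades 0..n_grades-1 (the function name and the linear/quadratic weighting options are illustrative choices, not taken from the study):

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_grades, weights="linear"):
    """Weighted Kappa for two raters assigning ordinal grades 0..n_grades-1."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    observed = np.zeros((n_grades, n_grades))
    for i, j in zip(a, b):          # contingency table of paired gradings
        observed[i, j] += 1
    n = observed.sum()
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / n
    grades = np.arange(n_grades)
    w = np.abs(grades[:, None] - grades[None, :])  # linear disagreement weights
    if weights == "quadratic":
        w = w ** 2
    # 1 - (weighted observed disagreement) / (weighted chance disagreement)
    return 1.0 - (w * observed).sum() / (w * expected).sum()
```

Perfect agreement yields 1.0, and disagreements further apart on the grading scale are penalized more heavily, which is what makes a weighted Kappa suitable for ordinal CTCAE-style grades.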
Abstract:
BACKGROUND Unilateral ischemic stroke disrupts the well-balanced interactions within bilateral cortical networks. Restitution of interhemispheric balance is thought to contribute to post-stroke recovery. Longitudinal measurements of cerebral blood flow (CBF) changes might act as a surrogate marker for this process. OBJECTIVE To quantify longitudinal CBF changes using arterial spin labeling MRI (ASL) and interhemispheric balance within the cortical sensorimotor network, and to assess their relationship with motor hand function recovery. METHODS Longitudinal CBF data were acquired in 23 patients at 3 and 9 months after cortical sensorimotor stroke and in 20 healthy controls using pulsed ASL. Recovery of grip force and manual dexterity was assessed with tasks requiring power and precision grips. Voxel-based analysis was performed to identify areas of significant CBF change. Region-of-interest analyses were used to quantify the interhemispheric balance across nodes of the cortical sensorimotor network. RESULTS Dexterity was more affected, and recovered at a slower pace, than grip force. In patients with successful recovery of dexterous hand function, CBF decreased over time in the contralesional supplementary motor area, paralimbic anterior cingulate cortex and superior precuneus, and interhemispheric balance returned to healthy control levels. In contrast, patients with poor recovery presented with sustained hypoperfusion in the sensorimotor cortices encompassing the ischemic tissue, and CBF remained lateralized to the contralesional hemisphere. CONCLUSIONS Sustained perfusion imbalance within the cortical sensorimotor network, as measured with task-unrelated ASL, is associated with poor recovery of dexterous hand function after stroke. CBF at rest might be used to monitor recovery and gain prognostic information.
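Interhemispheric balance of ROI-averaged CBF is commonly summarized with a laterality index; the sketch below uses one generic formulation (the function name and sign convention are assumptions for illustration, not necessarily the metric used in this study):

```python
import numpy as np

def laterality_index(cbf_contra, cbf_ipsi):
    """Laterality of perfusion between hemispheres, in [-1, 1].

    Positive values mean CBF is lateralized toward the contralesional
    hemisphere; 0 means interhemispheric balance."""
    contra = float(np.mean(cbf_contra))  # ROI-averaged contralesional CBF
    ipsi = float(np.mean(cbf_ipsi))      # ROI-averaged ipsilesional CBF
    return (contra - ipsi) / (contra + ipsi)
```

Under this convention, a persistently positive index across the 3- and 9-month sessions would correspond to the sustained contralesional lateralization described for poorly recovering patients, while a return toward 0 would track successful recovery.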
Abstract:
A rapid and simple DNA labeling system has been developed for disposable microarrays and has been validated for the detection of 117 antibiotic resistance genes abundant in Gram-positive bacteria. The DNA was fragmented and amplified using phi-29 polymerase and random primers with linkers. Labeling and further amplification were then performed by classic PCR amplification using biotinylated primers specific for the linkers. The microarray developed by Perreten et al. (Perreten, V., Vorlet-Fawer, L., Slickers, P., Ehricht, R., Kuhnert, P., Frey, J., 2005. Microarray-based detection of 90 antibiotic resistance genes of gram-positive bacteria. J. Clin. Microbiol. 43, 2291-2302.) was improved by additional oligonucleotides. A total of 244 oligonucleotides (26 to 37 nucleotides in length, with similar melting temperatures) were spotted on the microarray, including genes conferring resistance to clinically important antibiotic classes like β-lactams, macrolides, aminoglycosides, glycopeptides and tetracyclines. Each antibiotic resistance gene is represented by at least 2 oligonucleotides designed from consensus sequences of gene families. The specificity of the oligonucleotides and the quality of the amplification and labeling were verified by analysis of a collection of 65 strains belonging to 24 species. Association between genotype and phenotype was verified for 6 antibiotics using 77 Staphylococcus strains belonging to different species and revealed 95% test specificity and a 93% predictive value of a positive test. The DNA labeling and amplification is independent of the species and of the target genes and could be used for different types of microarrays. This system also has the advantage of detecting several genes within one bacterium at once, as in Staphylococcus aureus strain BM3318, in which up to 15 genes were detected.
This new microarray-based detection system offers a large potential for applications in clinical diagnostics, basic research, food safety and surveillance programs for antimicrobial resistance.
Abstract:
Thyroid transcription factor 1 (TTF-1) is encoded by the NKX2-1 homeobox gene. Besides specifying thyroid and pulmonary organogenesis, it is also temporarily expressed during embryonic development of the ventral forebrain. We recently observed widespread immunoreactivity for TTF-1 in a case of subependymal giant cell astrocytoma (SEGA, WHO grade I) – a defining lesion of the tuberous sclerosis complex (TSC). This prompted us to investigate additional SEGAs in this regard. We found tumor cells in all 7 specimens analyzed to be TTF-1 positive. In contrast, we did not find TTF-1 immunoreactivity in a cortical tuber or two renal angiomyolipomas resected from TSC patients. We propose our finding of consistent TTF-1 expression in SEGAs to indicate lineage-committed derivation of these tumors from a regionally specified cell of origin. The medial ganglionic eminence, ventral septal region, and preoptic area of the developing brain may represent candidates for the origin of SEGAs. Such lineage-restricted histogenesis may also explain the stereotypic distribution of SEGAs along the caudate nucleus in the lateral ventricles.
Abstract:
The paper considers panel data methods for estimating ordered logit models with individual-specific correlated unobserved heterogeneity. We show that a popular approach is inconsistent, whereas some consistent and efficient estimators are available, including minimum distance and generalized method-of-moment estimators. A Monte Carlo study reveals the good properties of an alternative estimator that has not been considered in econometric applications before, is simple to implement and almost as efficient. An illustrative application based on data from the German Socio-Economic Panel confirms the large negative effect of unemployment on life satisfaction that has been found in the previous literature.
Abstract:
In this article, the realization of a global terrestrial reference system (TRS) based on a consistent combination of Global Navigation Satellite System (GNSS) and Satellite Laser Ranging (SLR) is studied. Our input data consist of normal equation systems from 17 years (1994–2010) of homogeneously reprocessed GPS, GLONASS and SLR data. This effort used common state-of-the-art reduction models and the same processing software (Bernese GNSS Software) to ensure the highest consistency when combining GNSS and SLR. Residual surface load deformations are modeled with a spherical harmonic approach. The estimated degree-1 surface load coefficients have a strong annual signal for which the GNSS- and SLR-only solutions show very similar results. A combination including these coefficients reduces systematic uncertainties in comparison to the single-technique solution. In particular, uncertainties due to solar radiation pressure modeling in the coefficient time series can be reduced up to 50 % in the GNSS+SLR solution compared to the GNSS-only solution. In contrast to the ITRF2008 realization, no local ties are used to combine the different geodetic techniques. We combine the pole coordinates as global ties and apply minimum constraints to define the geodetic datum. We show that a common origin, scale and orientation can be reliably realized from our combination strategy in comparison to the ITRF2008.
Abstract:
The Jovian moon, Europa, hosts a thin neutral gas atmosphere, which is tightly coupled to Jupiter's magnetosphere. Magnetospheric ions impacting the surface sputter off neutral atoms, which, upon ionization, carry currents that modify the magnetic field around the moon. The magnetic field in the plasma is also affected by Europa's induced magnetic field. In this paper we investigate the environment of Europa using our multifluid MHD model and focus on the effects introduced by both the magnetospheric and the pickup ion populations. The model self-consistently derives the electron temperature that governs the electron impact ionization process, which is the major source of ionization in this environment. The resulting magnetic field is compared to measurements performed by the Galileo magnetometer, the bulk properties of the modeled thermal plasma population are compared to the Galileo Plasma Subsystem observations, and the modeled surface precipitation fluxes are compared to Galileo Ultraviolet Spectrometer observations. The model shows good agreement with the measured magnetic field and reproduces the basic features of the plasma interaction observed at the moon for both the E4 and the E26 flybys of the Galileo spacecraft. The simulation also produces perturbations asymmetric about the flow direction that account for observed asymmetries.
Abstract:
This study presents static measurements of the Ca isotopic composition of standard reference materials SRM 915a/b on a Triton Plus™ thermal ionization mass spectrometer with a specially developed Faraday cup array allowing simultaneous measurement of 40Ca and 48Ca. The total amount of Ca in all analyses was kept < 1 µg. With this setup the measurement uncertainties were 0.06 ‰ for 40Ca/44Ca and 0.12 ‰ for 48Ca/40Ca. Measuring all isotopes simultaneously allows a better test of the internal consistency of different Ca isotope abundances reported in the literature. The exponential law was observed to correct instrumental mass fractionation incompletely. An improved fractionation correction based on the exponential law is proposed. It changes the 40Ca/44Ca ratio of SRM 915a (corrected relative to 42Ca/44Ca = 0.31221; 48Ca/44Ca = 0.08871) from 47.1635 ± 0.0028 to 47.1649 ± 0.0047. The measurements of SRM 915b were performed with different analytical conditions (runs were prolonged until complete filament load depletion). Even if the 40Ca/44Ca ratio of SRM 915b, when corrected with the simple exponential law, appears different (47.1532 ± 0.0038) from that of SRM 915a, it becomes coincident (47.1613 ± 0.0028) when corrected with a second-order refinement. This supports the use of the improved exponential law to obtain internally consistent Ca isotope ratios for natural samples.
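The simple (first-order) exponential-law correction discussed above can be sketched as follows. The atomic masses are approximate values from standard tables, the 42Ca/44Ca normalizing value is the one quoted in the abstract, and the authors' second-order refinement is not reproduced here:

```python
import math

# Approximate atomic masses in u (assumed from standard tables).
M40, M42, M44 = 39.96259, 41.95862, 43.95548
R42_44_REF = 0.31221  # normalizing 42Ca/44Ca ratio quoted above

def beta_exponential(r42_44_measured):
    """Fractionation exponent from the normalizing pair.

    Exponential (Russell) law: R_measured = R_true * (m1/m2)**beta."""
    return math.log(r42_44_measured / R42_44_REF) / math.log(M42 / M44)

def correct_40_44(r40_44_measured, beta):
    """Exponential-law corrected 40Ca/44Ca ratio."""
    return r40_44_measured / (M40 / M44) ** beta
```

The abstract's point is precisely that this first-order correction is incomplete at high precision, which motivates the proposed second-order refinement.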
Abstract:
The objective of this research has been to study the molecular basis for chromosome aberration formation. Predicated on a variety of data, Mitomycin C (MMC)-induced DNA damage has been postulated to cause the formation of chromatid breaks (and gaps) by preventing the replication of regions of the genome prior to mitosis. The basic protocol for these experiments involved treating synchronized HeLa cells in G1-phase with a 1 µg/ml dose of MMC for one hour. After removing the drug, cells were then allowed to progress to mitosis and were harvested for analysis by selective detachment. Utilizing the alkaline elution assay for DNA damage, evidence was obtained to support the conclusion that HeLa cells can progress through S-phase into mitosis with intact DNA-DNA interstrand crosslinks. A higher level of crosslinking was observed in those cells remaining in interphase compared to those able to reach mitosis at the time of analysis. Dual radioisotope labeling experiments revealed that, at this dose, these crosslinks were associated to the same extent with both parental and newly replicated DNA. This finding was shown not to be the result of a two-step crosslink formation mechanism in which crosslink levels increase with time after drug treatment. It was also shown not to be an artefact of the double-labeling protocol. Using neutral CsCl density gradient ultracentrifugation of mitotic cells containing BrdU-labeled newly replicated DNA, control cells exhibited one major peak at a heavy/light density. However, MMC-treated cells had this same major peak at the heavy/light density, in addition to another minor peak at a density characteristic for light/light DNA. This was interpreted as indicating either: (1) that some parental DNA had not been replicated in the MMC-treated sample; or (2) that a recombination repair mechanism was operational.
To distinguish between these two possibilities, flow cytometric DNA fluorescence (i.e., DNA content) measurements of MMC-treated and control cells were made. These studies revealed that the mitotic cells that had been treated with MMC while in G1-phase displayed a 10-20% lower DNA content than untreated control cells when measured under conditions that neutralize chromosome condensation effects (i.e., hypotonic treatment). These measurements were made under conditions in which the binding of the drug, MMC, was shown not to interfere with the stoichiometry of the ethidium bromide-mithramycin stain. At the chromosome level, differential staining techniques were used in an attempt to visualize unreplicated regions of the genome, but staining indicative of large unreplicated regions was not observed. These results are best explained by a recombinogenic mechanism. A model consistent with these results has been proposed.
Abstract:
Material Safety Data Sheets (MSDSs) are an integral component of occupational hazard communication systems. These documents are used to disseminate hazard information to workers on chemical substances. The primary purpose of this study was to investigate the comprehensibility of MSDSs by workers at an international level. A total of 117 employees of a multi-national petrochemical company participated; thirty-nine (39) each in the United States, Canada and the United Kingdom. The overall participation rate among those approached was 82%. These countries were selected as they each utilize one of the three major existing hazard communication systems for fixed workplaces. The systems are comprised of the Occupational Safety and Health Administration's Hazard Communication Standard in the United States, the Workplace Hazardous Materials Information System (WHMIS) in Canada, and the compilation of several European Union directives addressing classification, labeling of substances and preparations, and MSDSs in Europe. A pretest-posttest randomized study design was used, with the posttest being comparable to an open-book test. The results of this research indicated that only about two-thirds of the information on the MSDSs was comprehended by the workers, with a significant difference identified among study participants based on country comparisons. These data were fairly consistent with the results of previous MSDS comprehensibility studies conducted in the United States. There was no significant difference in the comprehension level among study participants when taking into account the international hazard communication standard that the MSDS complied with. Age, education level and experience level had, at most, a marginal impact on the comprehension level. Participants did find MSDSs to be satisfactory in providing the information needed to protect them, regardless of their views on the readability and formatting of MSDSs.
The health-related information was the least comprehended: less than half of it was understood on the basis of the responses. The findings from this research suggest that much work is still needed to make MSDSs more comprehensible on a global basis, particularly regarding health-related information.
Abstract:
Arterial spin labeling (ASL) is a technique for noninvasively measuring cerebral perfusion using magnetic resonance imaging. Clinical applications of ASL include functional activation studies, evaluation of the effect of pharmaceuticals on perfusion, and assessment of cerebrovascular disease, stroke, and brain tumor. The use of ASL in the clinic has been limited by poor image quality when large anatomic coverage is required and by the time required for data acquisition and processing. This research sought to address these difficulties by optimizing the ASL acquisition and processing schemes. To improve data acquisition, optimal acquisition parameters were determined through simulations, phantom studies and in vivo measurements. The scan time for ASL data acquisition was limited to fifteen minutes to reduce potential subject motion. A processing scheme was implemented that rapidly produced regional cerebral blood flow (rCBF) maps with minimal user input. To provide a measure of the precision of the rCBF values produced by ASL, bootstrap analysis was performed on a representative data set. The bootstrap analysis of single gray and white matter voxels yielded coefficients of variation of 6.7% and 29%, respectively, implying that the calculated rCBF value is far more precise for gray matter than for white matter. Additionally, bootstrap analysis was performed to investigate the sensitivity of the rCBF data to the input parameters and provide a quantitative comparison of several existing perfusion models. This study guided the selection of the optimum perfusion quantification model for further experiments. The optimized ASL acquisition and processing schemes were evaluated with two ASL acquisitions on each of five normal subjects. The gray-to-white matter rCBF ratios for nine of the ten acquisitions were within ±10% of 2.6 and none were statistically different from 2.6, the typical ratio produced by a variety of quantitative perfusion techniques.
Overall, this work produced an ASL data acquisition and processing technique for quantitative perfusion and functional activation studies, while revealing the limitations of the technique through bootstrap analysis.
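A bootstrap precision estimate of the kind described above can be sketched generically as resampling repeated measurements of a voxel and examining the spread of the recomputed statistic (the function name, resample count, and the choice to bootstrap the mean are illustrative assumptions, not the study's exact pipeline):

```python
import numpy as np

def bootstrap_cv(measurements, n_boot=2000, rng=None):
    """Coefficient of variation of the bootstrap distribution of the mean."""
    rng = np.random.default_rng(rng)
    x = np.asarray(measurements, dtype=float)
    boot_means = np.array([
        rng.choice(x, size=x.size, replace=True).mean()  # resample with replacement
        for _ in range(n_boot)
    ])
    return boot_means.std(ddof=1) / boot_means.mean()
```

Applied to the same number of repeats, noisier white-matter signal yields a wider bootstrap distribution and hence a larger coefficient of variation, consistent with the 29% versus 6.7% contrast reported above.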
Abstract:
This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the central bank should adopt a loss function that differs from the social loss function. Carefully designing the central bank's loss function with consistent targets can harmonize optimal and consistent policy. This desirable result emerges from two observations. First, the social loss function reflects a normative process that does not necessarily prove consistent with the structure of the microeconomy. Thus, the social loss function cannot serve as a direct loss function for the central bank. Second, an optimal loss function for the central bank must depend on the structure of that microeconomy. In addition, this paper shows that control theory provides a benchmark for institution design in a game-theoretical framework.
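The inconsistent-targets argument can be illustrated with a textbook Barro-Gordon sketch; the quadratic loss, the weight lam, and the over-ambitious output target are generic textbook assumptions, not this paper's exact model:

```python
def discretionary_inflation(lam, y_target, y_natural):
    """Time-consistent (discretionary) equilibrium inflation.

    Loss = pi**2 + lam * (y - y_target)**2, with Phillips curve
    y = y_natural + (pi - pi_expected).  Minimizing over pi for given
    expectations and then imposing pi_expected = pi in equilibrium gives
    pi = lam * (y_target - y_natural): the inflation bias."""
    return lam * (y_target - y_natural)

# Society's over-ambitious output target (2.0 > natural rate 0.0)
# creates a positive inflation bias under discretion ...
social_bias = discretionary_inflation(0.5, 2.0, 0.0)
# ... while a delegated loss function whose target is consistent with
# the natural rate removes the bias entirely.
delegated_bias = discretionary_inflation(0.5, 0.0, 0.0)
```

Redesigning the delegated loss function so that its output target is consistent with the economy's structure makes the discretionary (consistent) outcome coincide with the optimal one, which mirrors the delegation argument made above.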
Abstract:
This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the social loss function cannot serve as a direct loss function for the central bank. Accordingly, we employ implementation theory to design a central bank loss function (mechanism design) with consistent targets, while the social loss function serves as a social welfare criterion. That is, with the correct mechanism design for the central bank loss function, optimal policy and consistent policy become identical. In other words, optimal policy proves implementable (consistent).
Abstract:
Kydland and Prescott (1977) develop a simple model of monetary policy making, where the central bank needs some commitment technique to achieve optimal monetary policy over time. Although not their main focus, they illustrate the difference between consistent and optimal policy in a sequential-decision one-period world. We employ the analytical method developed in Yuan and Miller (2005), whereby the government appoints a central bank with consistent targets or delegates consistent targets to the central bank. Thus, the central bank's welfare function differs from the social welfare function, which causes consistent policy to prove optimal.