997 results for Normalization Methods


Relevance:

20.00%

Publisher:

Abstract:

The function of DNA-binding proteins is controlled not only by their abundance, but mainly at the level of their activity, in terms of their interactions with DNA and protein targets. Moreover, the affinity of such transcription factors for their target sequences is often controlled by co-factors and/or modifications that are not easily assessed from biological samples. Here, we describe a scalable method for monitoring protein-DNA interactions on a microarray surface. This approach was designed to determine the DNA-binding activity of proteins in crude cell extracts, complementing conventional expression profiling arrays. Enzymatic labeling of the DNA enables direct normalization of protein binding to the microarray, allowing the estimation of relative binding affinities. Using DNA sequences covering a range of affinities, we show that the new microarray-based method yields binding strength estimates similar to those of low-throughput gel mobility-shift assays. The microarray is also highly sensitive, as it allows the detection of a rare DNA-binding protein from breast cancer cells, the human tumor suppressor AP-2. This approach thus provides precise and robust assessment of the activity of DNA-binding proteins and brings present DNA-binding assays to a high-throughput level.
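To make the normalization idea concrete, here is a minimal sketch of the kind of per-spot normalization the abstract describes: the protein-binding signal on each spot is divided by the signal from the enzymatically labeled DNA, and the result is expressed relative to a reference sequence. All names, values, and the choice of reference are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def relative_affinity(protein_signal, dna_signal, ref_index=0):
    """Per-spot normalization: divide the protein-binding signal by the
    labeled-DNA signal, then express each spot relative to a reference
    sequence (hypothetical choice: spot 0)."""
    protein_signal = np.asarray(protein_signal, dtype=float)
    dna_signal = np.asarray(dna_signal, dtype=float)
    normalized = protein_signal / dna_signal      # corrects for spotted DNA amount
    return normalized / normalized[ref_index]     # relative binding affinity

# Hypothetical two-channel intensities for four probe sequences
protein = [5200.0, 2100.0, 900.0, 140.0]
dna     = [1000.0,  950.0, 1100.0, 980.0]
print(relative_affinity(protein, dna))
```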

Relevance:

20.00%

Publisher:

Abstract:

"Most quantitative empirical analyses are motivated by the desire to estimate the causal effect of an independent variable on a dependent variable. Although the randomized experiment is the most powerful design for this task, in most social science research done outside of psychology, experimental designs are infeasible. (Winship & Morgan, 1999, p. 659)." This quote from earlier work by Winship and Morgan, which was instrumental in setting the groundwork for their book, captures the essence of our review of Morgan and Winship's book: It is about causality in nonexperimental settings.

Relevance:

20.00%

Publisher:

Abstract:

The centrifugal liquid membrane (CLM) cell has been utilized for chiroptical studies of liquid-liquid interfaces with a conventional circular dichroism (CD) spectropolarimeter. These studies required characterizing the optical properties of the rotating cylindrical CLM glass cell, which is used under high-speed rotation. In the present study, we measured the circular and linear dichroism (CD and LD) spectra and the circular and linear birefringence (CB and LB) spectra of the CLM cell itself, as well as those of porphyrin aggregates formed at the liquid-liquid interface in the CLM cell, applying the Mueller matrix measurement method. The results confirmed that the CLM-CD spectra of the interfacial porphyrin aggregates observed with a conventional CD spectropolarimeter are correct irrespective of the LD and LB signals in the CLM cell.
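As an illustration of the kind of Mueller matrix analysis involved, the sketch below extracts CD, LD, CB, and LB from a measured 4x4 Mueller matrix via the differential (logarithmic) decomposition. Element placement and signs follow one common convention for non-depolarizing media; conventions vary between references, so this is an assumption, not the authors' procedure.

```python
import numpy as np
from scipy.linalg import logm

def polarimetric_properties(M):
    """Extract dichroism and birefringence terms from a 4x4 Mueller matrix
    using its matrix logarithm (differential decomposition)."""
    L = np.real(logm(np.asarray(M, dtype=float)))
    G = 0.5 * (L + L.T)   # symmetric part: dichroism (absorption anisotropy)
    A = 0.5 * (L - L.T)   # antisymmetric part: birefringence (phase anisotropy)
    return {
        "LD":  G[0, 1],   # linear dichroism (0/90 deg)
        "LD'": G[0, 2],   # linear dichroism (+/-45 deg)
        "CD":  G[0, 3],   # circular dichroism
        "CB":  A[1, 2],   # circular birefringence
        "LB":  A[2, 3],   # linear birefringence (sign convention dependent)
    }

# Identity Mueller matrix (no optical anisotropy) -> all properties ~ 0
print(polarimetric_properties(np.eye(4)))
```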

Relevance:

20.00%

Publisher:

Abstract:

Background: MLPA is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample.

Results: Through simulation studies we show that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We illustrate the method using a controlled MLPA assay in which the targeted regions vary in copy number in individuals suffering from disorders such as Prader-Willi, DiGeorge, or autism, where it shows the best performance.

Conclusion: Using the proposed mixed model, we are able to determine thresholds for deciding whether a region is altered. These thresholds are specific to each individual and incorporate the experimental variability, resulting in improved sensitivity and specificity, as the examples with real data reveal.
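To illustrate the tolerance-interval idea (though not the mixed-model machinery of the paper), the following sketch builds a sample-specific two-sided normal tolerance interval from one sample's normalized probe ratios using Howe's approximation, and flags probes falling outside it. The data and the choice of reference probes are hypothetical.

```python
import numpy as np
from scipy import stats

def tolerance_threshold(ratios, coverage=0.99, confidence=0.95):
    """Two-sided normal tolerance interval (Howe's approximation) for one
    sample's normalized probe ratios -- a sketch of sample-specific
    thresholds, not the paper's mixed-model implementation."""
    x = np.asarray(ratios, dtype=float)
    n = x.size
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, n - 1)
    k = np.sqrt((n - 1) * (1 + 1 / n) * z**2 / chi2)
    m, s = x.mean(), x.std(ddof=1)
    return m - k * s, m + k * s

# Hypothetical normalized ratios; ~1.0 = two copies, 1.5 = duplication
sample = np.array([0.98, 1.02, 1.01, 0.97, 1.03, 0.99, 1.00, 1.50])
lo, hi = tolerance_threshold(sample[:-1])   # interval from reference probes
flagged = (sample < lo) | (sample > hi)
print(flagged)   # only the 1.50 probe falls outside the interval
```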

Relevance:

20.00%

Publisher:

Abstract:

With the use of supplementary cementing materials (SCMs) in concrete mixtures, salt scaling tests such as ASTM C672 have been found to be overly aggressive and do not correlate well with field scaling performance. The reasons for this are thought to be that, at high replacement levels, SCM mixtures can take longer to set and to develop their properties; neither of these factors is taken into account in the standard laboratory finishing and curing procedures. As a result, these variables were studied, as was a modified scaling test based on the Quebec BNQ scaling test that had shown promise in other research. The experimental research focused on the evaluation of three scaling resistance tests: the ASTM C672 test with normal curing, the same test with an accelerated curing regime used by VDOT for ASTM C1202 rapid chloride permeability tests and now included as an option in ASTM C1202, and several variations on the proposed draft ASTM WK9367 deicer scaling resistance test, based on the Quebec Ministry of Transportation BNQ test method, evaluated for concretes containing varying amounts of slag cement. A total of 16 concrete mixtures were studied using both high-alkali and low-alkali cement, and Grade 100 and Grade 120 slag at 0, 20, 35, and 50 percent slag replacement by mass of total cementing materials. Vinsol resin was used as the primary air entrainer, and Micro Air® was used in two replicate mixes for comparison. Based on the results of this study, a draft alternative test method to ASTM C672 is proposed.

Relevance:

20.00%

Publisher:

Abstract:

The instrumental variable method (referred to as Mendelian randomization when the instrument is a genetic variant) was initially developed to infer a causal effect of a risk factor on some outcome of interest in a linear model. Adapting this method to nonlinear models, however, is known to be problematic. In this paper, we consider the simple case where the genetic instrument, the risk factor, and the outcome are all binary. We compare via simulations the usual two-stage estimate of a causal odds ratio and its adjusted version with a recently proposed estimate developed in the context of a clinical trial with noncompliance. In contrast to the former two, we confirm that the latter is (under some conditions) a valid estimate of a causal odds ratio defined in the subpopulation of compliers, and we propose its use in the context of Mendelian randomization. By analogy with a clinical trial with noncompliance, compliers are those individuals for whom the presence/absence of the risk factor X is determined by the presence/absence of the genetic variant Z (i.e., for whom we would observe X = Z whatever the alleles randomly received at conception). We also recall and illustrate the huge variability of instrumental variable estimates when the instrument is weak (i.e., with a low percentage of compliers, as is typically the case with genetic instruments, for which this proportion is frequently smaller than 10%): the inter-quartile range of our simulated estimates was up to 18 times higher than with a conventional (e.g., intention-to-treat) approach. We thus conclude that the need to find stronger instruments is probably as important as the need to develop a methodology that allows consistent estimation of a causal odds ratio.
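The weak-instrument variability the abstract reports can be reproduced in a toy simulation. The sketch below uses the simpler Wald-type risk-difference estimator (intention-to-treat effect divided by the first-stage effect) rather than the causal odds ratio studied in the paper; all parameters, including the 10% complier proportion, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_iv(n=5000, p_complier=0.10, n_rep=200):
    """Toy simulation of binary instrument Z, exposure X, and outcome Y,
    showing how a weak instrument (few compliers) inflates the variability
    of a Wald-type IV estimate."""
    estimates = []
    for _ in range(n_rep):
        Z = rng.binomial(1, 0.5, n)              # genetic instrument
        complier = rng.random(n) < p_complier    # X follows Z only for compliers
        U = rng.binomial(1, 0.5, n)              # unmeasured confounder
        X = np.where(complier, Z, U)             # non-compliers: X set by confounder
        p_y = 0.2 + 0.2 * X + 0.2 * U            # outcome risk; X truly causal
        Y = rng.binomial(1, p_y)
        itt = Y[Z == 1].mean() - Y[Z == 0].mean()            # intention-to-treat
        first_stage = X[Z == 1].mean() - X[Z == 0].mean()    # instrument strength
        estimates.append(itt / first_stage)                  # Wald estimator
    return np.array(estimates)

est = simulate_iv()
print(np.percentile(est, [25, 75]))   # wide IQR with only ~10% compliers
```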

Relevance:

20.00%

Publisher:

Abstract:

The pursuit of high response rates to minimise the threat of nonresponse bias continues to dominate decisions about resource allocation in survey research. Yet a growing body of research has begun to question this practice. In this study, we use previously unavailable data from a new sampling frame based on population registers to assess the value of different methods designed to increase response rates on the European Social Survey in Switzerland. Using sampling-frame data provides information about both respondents and nonrespondents, making it possible to examine how changes in response rates resulting from different fieldwork methods relate to changes in the composition and representativeness of the responding sample. We compute an R-indicator to assess representativity with respect to the register variables on the sampling frame, and find little improvement in sample composition as response rates increase. We then examine the impact of response rate increases on the risk of nonresponse bias, based on the Maximal Absolute Bias (MAB) and on coefficients of variation between subgroup response rates, alongside the costs associated with different types of fieldwork effort. The results show that increases in response rate help to reduce the MAB, while only small, though important, improvements to sample representativity are gained by varying the type of effort. These findings lend further support to research that has called into question the value of extensive investment in procedures aimed at reaching response rate targets, and they underline the need for more tailored fieldwork strategies aimed both at reducing survey costs and at minimising the risk of bias.
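For readers unfamiliar with the representativity measures used here, the sketch below estimates response propensities from frame variables and computes the R-indicator, R = 1 - 2S(ρ), and the maximal absolute bias, MAB = S(ρ)/ρ̄, following Schouten et al.; the covariates and the logistic propensity model are hypothetical stand-ins for the Swiss register data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def r_indicator(frame_X, responded):
    """R-indicator and maximal absolute bias from estimated response
    propensities: R = 1 - 2*S(rho), MAB = S(rho)/mean(rho).
    frame_X: covariates known for the full sample (e.g. register variables)."""
    model = LogisticRegression().fit(frame_X, responded)
    rho = model.predict_proba(frame_X)[:, 1]   # estimated response propensities
    s = rho.std(ddof=1)
    return 1 - 2 * s, s / rho.mean()

# Hypothetical frame: two standardized register variables, 1000 sampled units
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 2))
responded = rng.binomial(1, 1 / (1 + np.exp(-(0.3 + 0.5 * X[:, 0]))))
R, mab = r_indicator(X, responded)
print(f"R-indicator: {R:.3f}, MAB: {mab:.3f}")
```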

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To compare different techniques for positive contrast imaging of susceptibility markers with MRI for three-dimensional visualization. As several different techniques have been reported, the choice of a suitable method depends on its properties with regard to the amount of positive contrast and the desired background suppression, as well as on other imaging constraints needed for a specific application. MATERIALS AND METHODS: Six positive contrast techniques were investigated for their ability to image a single susceptibility marker in vitro at 3 Tesla. The white marker method (WM), susceptibility gradient mapping (SGM), inversion recovery with on-resonant water suppression (IRON), frequency selective excitation (FSX), fast low flip-angle positive contrast SSFP (FLAPS), and iterative decomposition of water and fat with echo asymmetry and least-squares estimation (IDEAL) were implemented and investigated. RESULTS: The different methods were compared with respect to the volume of positive contrast, the product of volume and signal intensity, the imaging time, and the level of background suppression. Quantitative results are provided, and the strengths and weaknesses of the different approaches are discussed. CONCLUSION: The appropriate choice of positive contrast imaging technique depends on the desired level of background suppression, acquisition speed, and robustness against artifacts, for which in vitro comparative data are now available.
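Two of the comparison metrics, the volume of positive contrast and the product of volume and signal intensity, are straightforward to compute once a positive-contrast image is available. The sketch below shows one plausible implementation; the thresholding rule and all values are illustrative assumptions, not the paper's analysis pipeline.

```python
import numpy as np

def contrast_metrics(image, voxel_volume_mm3, background_thresh):
    """From a 3D positive-contrast image, compute the volume of voxels
    above a background threshold and the product of that volume with
    their mean signal intensity."""
    mask = image > background_thresh
    volume = mask.sum() * voxel_volume_mm3       # positive-contrast volume
    vol_signal = volume * image[mask].mean()     # volume x signal intensity
    return volume, vol_signal

# Hypothetical 3D volume with a bright marker region on a dark background
img = np.zeros((32, 32, 32))
img[14:18, 14:18, 14:18] = 100.0
print(contrast_metrics(img, voxel_volume_mm3=1.0, background_thresh=10.0))
```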