729 results for NORMALIZATION


Relevance: 20.00%

Abstract:

This chapter provides a theoretical overview of some of the ways in which the process of normalization functions as a form of hidden privilege. It outlines how privileged groups come to represent the dominant norm, whereby white, male, able-bodied, heterosexual, middle-class people in Western societies come to embody what it means to be normal. It also explores strategies for challenging the normalization of privilege by encouraging the development of responsibility not only for individual actions but also for the social practices that create them.

Relevance: 20.00%

Abstract:

Performance in strength and power sports is greatly affected by a variety of anthropometric factors. The goal of performance normalization is to factor out the effects of these confounding factors and compute a canonical (normalized) performance measure from the observed absolute performance. Performance normalization is applied in the ranking of elite athletes as well as in the early stages of youth talent selection, so it is crucial that the process is principled and fair. The significant corpus of previous work on this topic is uniform in its methodology: performance normalization is universally reduced to a regression task, in which the collected performance data are used to fit a regression function that is then used to scale future performances. The present article demonstrates that this approach is fundamentally flawed. It inherently creates a bias that unfairly penalizes athletes with certain allometric characteristics and, by virtue of its adoption in the ranking and selection of elite athletes, propagates and strengthens this bias over time. The main flaws are shown to originate in the criteria for selecting the data used for regression, as well as in the manner in which the regression model is applied in normalization. This analysis brings these methodological flaws to light and motivates further work on the development of principled methods, the foundations of which are also laid out here.
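
The regression pipeline described above is compact enough to sketch. Below is a minimal illustration in Python of the standard allometric approach the article critiques: fit log(performance) against log(body mass) by least squares, then score each athlete by the ratio of observed to model-predicted performance. All data and constants here are synthetic and purely illustrative, not taken from the article.

```python
import numpy as np

# Illustrative synthetic data: body mass (kg) and best lift (kg) for a
# cohort of athletes. Values are made up for this sketch.
rng = np.random.default_rng(0)
body_mass = rng.uniform(55.0, 110.0, size=200)
lift = 4.5 * body_mass**0.67 * rng.lognormal(0.0, 0.08, size=200)

# Standard allometric normalization: fit performance ~ a * mass^b by
# least squares in log-log space.
b, log_a = np.polyfit(np.log(body_mass), np.log(lift), deg=1)

def normalized_score(mass, performance):
    """Canonical score: observed performance divided by the value the
    fitted allometric model predicts for that body mass."""
    predicted = np.exp(log_a) * mass**b
    return performance / predicted

# Athletes are then ranked by this ratio; the article's point is that the
# fit (and hence the ranking) inherits bias from how the fitting data
# were selected.
print(normalized_score(82.0, 180.0))
```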

Relevance: 20.00%

Abstract:

It has been argued that pornography is the most prominent sex educator for young people today (Flood, M. (2010). Young men using pornography. In E. Boyle (Ed.), Everyday Pornography (pp. 164–178). Oxford: Routledge). Research indicates that first exposure to pornography can occur as young as 11 years of age. There is evidence that exposure to pornography is shaping young people's sexual expectations and practices (Häggström-Nordin et al. 2005). Many young people are learning what sex looks like from what they, or their partner or peers, observe in pornography. Significantly, pornography is normalizing sex acts that most women do not enjoy and may experience as degrading, painful, or violating. This raises serious implications for young people's capacity to develop a sexuality that incorporates mutual pleasure, respect, and negotiation of free and full consent. While the results are complex and nuanced, research into the effects of pornography consumption provides reliable evidence that exposure to pornography increases aggressive attitudes and behavior towards women for some viewers (Malamuth et al. Annual Review of Sex Research 11, 26–91, 2000). Pornography consumption has also been found to be associated with sexual health risk taking and can impact negatively on body image and sense of self (Dean, L. (2007). Young Men, Pornography and Sexual Health Promotion, MA Research, Brighton University, Brighton, in possession of the author), and as such is a serious health and well-being issue, particularly for young women. This chapter explores preservice teachers' reactions to pornography education using two examples from the teaching of an elective, Teaching Sexuality in the Middle Years, in 2011. These examples explore the complex emotions such teaching can generate and the challenges faced by preservice teachers when they are encouraged to confront the gendered and violent consequences of the normalization of pornography in a coeducational setting.

Relevance: 20.00%

Abstract:

BACKGROUND: Primary pulmonary choriocarcinoma (PPC) is rare and frequently leads to death. CASES: Two young patients presented with previous molar pregnancy and spontaneous serum human chorionic gonadotropin (hCG) normalization. Patient 1 was referred to our center after a partial response to chemotherapy. Pulmonary lobectomy was performed, and hCG rapidly declined. During further chemotherapy, liver metastasis was detected by positron emission tomography. Right hepatectomy was performed, and hCG declined for 28 days but increased again despite chemotherapy. This patient died of hepatic failure 3 years after diagnosis. Patient 2 presented with persistently high hCG, though the affected organ was not identified. Chemotherapy was unsuccessful. Patient reevaluation showed an isolated pulmonary mass. Pulmonary lobectomy was performed; 2 weeks later, hCG was normal, and consolidation with 2 cycles of chemotherapy was administered. The patient has been in remission for 24 months. PPC was confirmed by histopathology and immunohistochemistry in both cases. Gestational origin of the tumor was confirmed by molecular genetic analysis (polymorphic microsatellite markers). CONCLUSION: The possibility of choriocarcinoma cannot be overlooked in young women with an isolated pulmonary mass. Early diagnosis, prompt chemotherapy, and surgical resection in a specialized center improve the prognosis. (J Reprod Med 2010;55:311-316)

Relevance: 20.00%

Abstract:

The problem of a spacecraft orbiting the Neptune-Triton system is presented. The new ingredients in this restricted three-body problem are Neptune's oblateness and the highly inclined, retrograde motion of Triton. First we present some simulations showing the role played by the oblateness on a satellite of Neptune disturbed by Triton. We also give an extensive numerical exploration of the case in which the spacecraft orbits Triton, considering the Sun, Neptune, and Neptune's oblateness as disturbers. In the a × I plane (a = semi-major axis, I = inclination), we plot the stable regions where the massless body can survive for thousands of years. Both retrograde and direct orbits were considered and, as usual, the region of stability is much more significant for direct orbits of the spacecraft (Triton's orbit is retrograde). Next we explore the dynamics in the vicinity of the Lagrangian points. The Birkhoff normalization is constructed around L2, followed by its reduction to the center manifold. In this reduced dynamics, a convenient Poincaré section shows the interplay of the Lyapunov and halo periodic orbits, Lissajous and quasi-halo tori, and the stable and unstable manifolds of the planar Lyapunov orbit. To show the effect of the oblateness, the planar Lyapunov family emanating from the Lagrangian points and three-dimensional halo orbits are obtained by the numerical continuation method.
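
The Birkhoff normal form and center-manifold reduction are too long to reproduce here, but the step they build on, locating the collinear point L2, is compact. The sketch below solves for L2 in the classical circular restricted three-body problem without oblateness, using approximate literature values for the Neptune and Triton masses; it is a simplified stand-in for the model of the paper, not the paper's own computation.

```python
from scipy.optimize import brentq

# Approximate masses (kg); mu is the CR3BP mass parameter.
M_NEPTUNE = 1.024e26
M_TRITON = 2.14e22
mu = M_TRITON / (M_NEPTUNE + M_TRITON)

def dOmega_dx(x):
    """x-derivative of the effective potential on the x-axis (classical
    CR3BP, no oblateness), in rotating, nondimensional coordinates."""
    r1 = abs(x + mu)        # distance to Neptune at (-mu, 0)
    r2 = abs(x - 1 + mu)    # distance to Triton at (1 - mu, 0)
    return x - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3

# L2 lies just beyond Triton on the x-axis: bracket the root and solve.
x_L2 = brentq(dOmega_dx, 1.0 + 1e-4, 1.5)
print(f"mu = {mu:.3e}, L2 at x = {x_L2:.6f} (Triton at x = {1 - mu:.6f})")
```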

Relevance: 20.00%

Abstract:

Selection of reference genes is an essential consideration for increasing the precision and quality of relative expression analysis by quantitative RT-PCR. The stability of eight expressed sequence tags was evaluated to define potential reference genes for studying the differential expression of common bean target genes under biotic stress (the incompatible interaction between common bean and the fungus Colletotrichum lindemuthianum) and abiotic stresses (drought, salinity, and cold). The efficiency of the amplification curves and the quantification cycle (Cq) were determined using the LinRegPCR software. The stability of the candidate reference genes was assessed using the geNorm and NormFinder software, whereas the normalized differential expression of the target genes [the beta-1,3-glucanase 1 (BG1) gene for biotic stress and the dehydration responsive element binding (DREB) gene for abiotic stress] was computed with the REST software. High stability was obtained for the insulin degrading enzyme (IDE), actin-11 (Act11), unknown 1 (Ukn1) and unknown 2 (Ukn2) genes during biotic stress, and for the SKP1/ASK-interacting protein 16 (Skip16), Act11, tubulin beta-8 (beta-Tub8) and Ukn1 genes under abiotic stresses. However, IDE and Act11 were indicated as the best combination of reference genes for biotic stress analysis, whereas Skip16 and Act11 were the best combination for studying abiotic stress. These genes should be useful for the normalization of gene expression in RT-PCR analyses of common bean, the most important edible legume.
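
For context, REST-style normalization rests on efficiency-corrected expression ratios. The following is a minimal sketch of the underlying Pfaffl-type calculation, combining the two reference genes the study recommends for biotic stress (IDE and Act11) by geometric mean; the efficiencies and Cq values are made up for illustration.

```python
from statistics import geometric_mean

def expression_ratio(e_target, dcq_target, refs):
    """Efficiency-corrected relative expression (Pfaffl-style ratio).

    e_target   : amplification efficiency of the target gene (2.0 = 100%)
    dcq_target : Cq(control) - Cq(treated) for the target gene
    refs       : list of (efficiency, dCq) tuples for reference genes;
                 their ratios are combined by geometric mean, as in
                 multi-reference normalization.
    """
    ref_ratios = [e**dcq for e, dcq in refs]
    return e_target**dcq_target / geometric_mean(ref_ratios)

# Hypothetical Cq values: BG1 under fungal infection, normalized to
# IDE and Act11 (the combination recommended for biotic stress).
ratio = expression_ratio(
    e_target=1.95, dcq_target=3.2,      # BG1 (illustrative values)
    refs=[(1.98, 0.3), (1.92, 0.1)],    # IDE, Act11 (illustrative values)
)
print(f"BG1 relative expression: {ratio:.2f}-fold")
```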

Relevance: 20.00%

Abstract:

Background: With the development of DNA hybridization microarray technologies, it is now possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Due to technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses, so choosing a suitable normalization approach can be critical and deserves judicious consideration. Results: We compared three commonly used normalization approaches, namely Loess, Splines and Wavelets, with two non-parametric regression methods that had not yet been used for normalization, namely Kernel smoothing and Support Vector Regression. The results were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that Kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets. Conclusion: In light of these results, Support Vector Regression is favored for microarray normalization because of its robustness in estimating the normalization curve.
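
To make the Support Vector Regression approach concrete, here is a minimal sketch of regression-based normalization on a two-channel array: fit the intensity-dependent trend of the log-ratios (M) against the average log-intensity (A), then subtract the fitted trend. The data are synthetic, and sklearn's SVR stands in for whatever implementation the study evaluated.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Synthetic two-channel array with an intensity-dependent dye bias.
a_vals = rng.uniform(6.0, 14.0, size=2000)        # average log2 intensity (A)
bias = 0.4 * np.sin(a_vals / 2.0)                 # smooth systematic bias
m_obs = bias + rng.normal(0.0, 0.25, size=2000)   # observed log2 ratios (M)

# Fit the M-vs-A trend with epsilon-SVR and subtract it; the
# epsilon-insensitive loss is what makes SVR tolerant of outliers.
svr = SVR(kernel="rbf", C=1.0, epsilon=0.1)
svr.fit(a_vals.reshape(-1, 1), m_obs)
m_normalized = m_obs - svr.predict(a_vals.reshape(-1, 1))

print(f"mean |M| before: {np.abs(m_obs).mean():.3f}, "
      f"after: {np.abs(m_normalized).mean():.3f}")
```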

Relevance: 20.00%

Abstract:

Acute hypoxia (AH) reduces maximal O2 consumption (VO2 max), but after acclimatization, and despite increases in both hemoglobin concentration and arterial O2 saturation that can normalize arterial O2 concentration ([O2]), VO2 max remains low. To determine why, seven lowlanders were studied at VO2 max (cycle ergometry) at sea level (SL), after 9-10 wk at 5,260 m [chronic hypoxia (CH)], and 6 mo later at SL in AH (FiO2 = 0.105) equivalent to 5,260 m. Pulmonary and leg indexes of O2 transport were measured in each condition. Both cardiac output and leg blood flow were reduced by approximately 15% in both AH and CH (P < 0.05). At maximal exercise, arterial [O2] in AH was 31% lower than at SL (P < 0.05), whereas in CH it was the same as at SL due to both polycythemia and hyperventilation. O2 extraction by the legs, however, remained at SL values in both AH and CH. Although at both SL and in AH, 76% of the cardiac output perfused the legs, in CH the legs received only 67%. Pulmonary VO2 max (4.1 +/- 0.3 l/min at SL) fell to 2.2 +/- 0.1 l/min in AH (P < 0.05) and was only 2.4 +/- 0.2 l/min in CH (P < 0.05). These data suggest that the failure to recover VO2 max after acclimatization despite normalization of arterial [O2] is explained by two circulatory effects of altitude: 1) failure of cardiac output to normalize and 2) preferential redistribution of cardiac output to nonexercising tissues. Oxygen transport from blood to muscle mitochondria, on the other hand, appears unaffected by CH.

Relevance: 20.00%

Abstract:

Phosphatidylethanol (PEth) is a direct ethanol metabolite that has recently attracted attention as a biomarker of ethanol intake. The aims of the current study were: (1) to characterize the normalization time of PEth in a larger sample than previously studied; (2) to elucidate potential gender differences; and (3) to report the correlation of PEth with other biomarkers and with self-reported alcohol consumption. Fifty-seven alcohol-dependent patients (ICD-10 F10.25; 9 females, 48 males; mean age 43.5) entering medical detoxification at three study sites were enrolled. Mean gamma glutamyl transpeptidase (GGT) was 209.61 U/l, mean corpuscular volume (MCV) averaged 97.35 fl, mean carbohydrate-deficient transferrin (%CDT) was 8.68, and mean total ethanol intake in the last 7 days was 1653 g. PEth was measured in heparinized whole blood with a high-pressure liquid chromatography method, while GGT, MCV and %CDT were measured using routine methods. PEth levels at day 1 of detoxification ranged between 0.63 and 26.95 micromol/l (mean 6.22, median 4.70, SD 4.97). There were no false negatives at day 1. Sensitivities for the other biomarkers were 40.4% for MCV, 73.1% for GGT and 69.2% for %CDT. No gender differences were found for PEth levels at any time point. Our data suggest that PEth is (1) a suitable intermediate-term marker of ethanol intake in both sexes and (2) extraordinarily sensitive in alcohol-dependent patients. The results add further evidence that PEth has potential as a sensitive and specific biomarker reflecting longer-lasting intake of higher amounts of alcohol, with certain advantages over traditional biomarkers.

Relevance: 20.00%

Abstract:

In most microarray technologies, a number of critical steps are required to convert raw intensity measurements into the data relied upon by data analysts, biologists and clinicians. These data manipulations, referred to as preprocessing, can influence the quality of the ultimate measurements. In recent years, the high-throughput measurement of gene expression has been the most popular application of microarray technology, and for this application various groups have demonstrated that the use of modern statistical methodology can substantially improve the accuracy and precision of gene expression measurements, relative to ad hoc procedures introduced by designers and manufacturers of the technology. Other applications of microarrays are now becoming more and more popular. In this paper we describe a preprocessing methodology for a technology designed to identify DNA sequence variants in specific genes or regions of the human genome that are associated with phenotypes of interest, such as disease. In particular, we describe methodology for preprocessing Affymetrix SNP chips and for obtaining genotype calls from the preprocessed data. We demonstrate how our procedure improves on existing approaches using data from three relatively large studies, including one in which a large number of independent calls are available. Software implementing these ideas is available in the Bioconductor oligo package.
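
The software itself is an R/Bioconductor package, but the genotype-calling step that follows preprocessing can be illustrated in a language-neutral way: for each SNP, cluster the preprocessed allele log-ratios into three genotype groups. The toy sketch below uses plain k-means on simulated data; it is not the calling algorithm of the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Toy preprocessed data for one SNP: log2(intensity_A / intensity_B) per
# sample. AA, AB and BB samples cluster around positive, zero and
# negative log-ratios, respectively.
log_ratio = np.concatenate([
    rng.normal(1.5, 0.2, 300),    # AA
    rng.normal(0.0, 0.2, 400),    # AB
    rng.normal(-1.5, 0.2, 300),   # BB
])

# Cluster into three genotype groups, then order clusters by their
# centers so labels map consistently to AA / AB / BB.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(log_ratio.reshape(-1, 1))
order = np.argsort(-km.cluster_centers_.ravel())   # clusters, descending center
genotype = np.array(["AA", "AB", "BB"])[np.argsort(order)[km.labels_]]

print(dict(zip(*np.unique(genotype, return_counts=True))))
```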

Relevance: 20.00%

Abstract:

The ability to measure gene expression on a genome-wide scale is one of the most promising accomplishments in molecular biology. Microarrays, the technology that first permitted this, were riddled with problems due to unwanted sources of variability. Many of these problems are now mitigated, after a decade's worth of statistical methodology development. The recently developed RNA sequencing (RNA-seq) technology has generated much excitement, in part due to claims of reduced variability in comparison to microarrays. However, we show that RNA-seq data exhibit unwanted and obscuring variability similar to what was first observed in microarrays. In particular, we find that GC-content has a strong sample-specific effect on gene expression measurements that, if left uncorrected, leads to false positives in downstream results. We also report on commonly observed data distortions that demonstrate the need for data normalization. Here we describe statistical methodology that improves precision by 42% without loss of accuracy. The resulting conditional quantile normalization (CQN) algorithm combines robust generalized regression, to remove the systematic bias introduced by deterministic features such as GC-content, with quantile normalization, to correct for global distortions.
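
A minimal sketch of the two ingredients named above, assuming a genes-by-samples count matrix and per-gene GC content: a per-sample fit of log-expression on GC content (a simple quadratic here, where CQN itself uses robust splines) followed by full quantile normalization across samples. This is a simplified stand-in, not the published algorithm.

```python
import numpy as np

def cqn_sketch(counts, gc):
    """counts: (genes, samples) matrix of read counts; gc: per-gene GC
    fraction. Returns GC-corrected, quantile-normalized log2 expression."""
    log_expr = np.log2(counts + 1.0)

    # 1) Remove each sample's systematic GC effect. CQN fits robust
    #    regression splines; a quadratic polynomial stands in here.
    for j in range(log_expr.shape[1]):
        coef = np.polyfit(gc, log_expr[:, j], deg=2)
        trend = np.polyval(coef, gc)
        log_expr[:, j] -= trend - trend.mean()   # keep overall level

    # 2) Full quantile normalization: force every sample onto the same
    #    distribution (the mean of the sorted columns).
    ranks = np.argsort(np.argsort(log_expr, axis=0), axis=0)
    target = np.sort(log_expr, axis=0).mean(axis=1)
    return target[ranks]

# Toy data: 1000 genes, 4 samples, with a sample-specific GC bias.
rng = np.random.default_rng(3)
gc = rng.uniform(0.3, 0.7, 1000)
base = rng.poisson(50, size=(1000, 4)).astype(float)
counts = base * np.exp(gc[:, None] * rng.normal(0.0, 1.0, 4)[None, :])
print(cqn_sketch(counts, gc).std(axis=1).mean())
```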