987 results for Marker development
Abstract:
The term disorders of sex development (DSD) covers congenital conditions in which the development of chromosomal, gonadal or anatomical sex is atypical. Mutations in genes on the X, Y or autosomal chromosomes can cause abnormalities of testis determination or disorders of sex differentiation leading to 46,XY DSD. Detailed clinical phenotypes allow the identification of new factors that can alter the expression or function of mutated proteins, helping to elucidate previously undescribed biochemical pathways. In this review we present an update on the aetiology, diagnosis and treatment of 46,XY DSD based on an extensive review of the literature and on our three decades of experience with these patients.
Abstract:
Background and objectives: As well as being a marker of body iron stores, serum ferritin (sFerritin) has also been shown to be a marker of inflammation in hemodialysis (HD) patients. The aim of this study was to analyze whether sFerritin is a reliable marker of the iron stores present in the bone marrow of HD patients. Design: Histomorphometric analysis of stored transiliac bone biopsies was used to assess iron stores by determining the number of iron-stained cells per square millimeter of bone marrow. Results: In 96 patients, the laboratory parameters were hemoglobin = 11.3 +/- 1.6 g/dl, hematocrit = 34.3 +/- 5%, sFerritin = 609 +/- 305 ng/ml, transferrin saturation = 32.7 +/- 22.5%, and C-reactive protein (CRP) = 0.9 +/- 1.4 mg/dl. sFerritin correlated significantly with CRP, bone marrow iron, and time on HD treatment (P = 0.006, 0.001, and 0.048, respectively). The independent determinants of sFerritin were CRP (beta-coef = 0.26; 95% CI = 24.6 to 132.3) and bone marrow iron (beta-coef = 0.32; 95% CI = 0.54 to 2.09). Bone marrow iron was higher in patients with sFerritin >500 ng/ml than in those with sFerritin <=500 ng/ml. In the group of patients with sFerritin <=500 ng/ml, the independent determinant of sFerritin was bone marrow iron (beta-coef = 0.48; 95% CI = 0.48 to 1.78), but in the group of patients with sFerritin >500 ng/ml, no independent determinant of sFerritin was found. Conclusions: sFerritin adequately reflects iron stores in the bone marrow of HD patients.
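To illustrate the type of correlation and regression analysis summarized above, here is a minimal Python sketch; it is not the authors' code, and the variable names and data values are invented for illustration only.

```python
# Hypothetical sketch of the correlation/regression analysis described above;
# all values are invented and serve only to show the computation.
import numpy as np
from scipy import stats

sferritin = np.array([320.0, 450.0, 610.0, 780.0, 950.0, 500.0])    # ng/ml (assumed)
marrow_iron = np.array([120.0, 180.0, 260.0, 340.0, 420.0, 200.0])  # iron-stained cells/mm^2 (assumed)

# Pearson correlation between serum ferritin and bone marrow iron stores.
r, p_value = stats.pearsonr(sferritin, marrow_iron)
print(f"Pearson r = {r:.2f}, P = {p_value:.3f}")

# Univariate linear regression: sFerritin as a function of bone marrow iron.
fit = stats.linregress(marrow_iron, sferritin)
print(f"beta = {fit.slope:.2f} ng/ml per stained cell/mm^2, intercept = {fit.intercept:.1f} ng/ml")
```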
Abstract:
Background: Vascular calcification is common and constitutes a prognostic marker of mortality in the hemodialysis population. Derangements of mineral metabolism may influence its development. The aim of this study is to prospectively evaluate the association between bone remodeling disorders and progression of coronary artery calcification (CAC) in hemodialysis patients. Study Design: Cohort study nested within a randomized controlled trial. Setting & Participants: 64 stable hemodialysis patients. Predictor: Bone-related laboratory parameters and bone histomorphometric characteristics at baseline and after 1 year of follow-up. Outcomes: Progression of CAC assessed by means of coronary multislice tomography at baseline and after 1 year of follow-up. A baseline calcification score of 30 Agatston units or greater was defined as calcification. A change in calcification score of 15% or greater was defined as progression. Results: Of 64 patients, 38 (60%) had CAC at baseline and 26 (40%) did not. Participants without CAC at baseline were younger (P < 0.001), mainly men (P = 0.03) and nonwhite (P = 0.003), and had lower serum osteoprotegerin levels (P = 0.003) and higher trabecular bone volume (P = 0.001). Age (P = 0.003; beta coefficient = 1.107; 95% confidence interval [CI], 1.036 to 1.183) and trabecular bone volume (P = 0.006; beta coefficient = 0.828; 95% CI, 0.723 to 0.948) were predictors for CAC development. Of the 38 participants who had calcification at baseline, 26 (68%) had CAC progression in 1 year. Progressors had lower bone-specific alkaline phosphatase (P = 0.03) and deoxypyridinoline levels (P = 0.02) on follow-up, and low turnover was mainly diagnosed at the 12-month bone biopsy (P = 0.04). Low-turnover bone status at the 12-month bone biopsy was the only independent predictor for CAC progression (P = 0.04; beta coefficient = 4.5; 95% CI, 1.04 to 19.39). According to bone histological examination, nonprogressors with initially high turnover (n = 5) subsequently had decreased bone formation rate (P = 0.03), and those initially with low turnover (n = 7) subsequently had increased bone formation rate (P = 0.003) and osteoid volume (P = 0.001). Limitations: Relatively small population, absence of patients with severe hyperparathyroidism, short observational period. Conclusions: Lower trabecular bone volume was associated with CAC development, whereas improvement in bone turnover was associated with lower CAC progression in patients with high- and low-turnover bone disorders. Because CAC is implicated in cardiovascular mortality, bone derangements may constitute a modifiable mortality risk factor in hemodialysis patients.
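The two cut-offs used in this study can be made concrete with a short sketch; the helper below is purely illustrative, and the function and variable names are assumptions rather than anything taken from the study.

```python
def classify_cac(baseline_score: float, followup_score: float) -> dict:
    """Apply the study's cut-offs: >= 30 Agatston units at baseline defines
    calcification, and a >= 15% change defines progression (illustrative only)."""
    calcification = baseline_score >= 30
    progression = None
    if calcification:
        change_pct = 100.0 * (followup_score - baseline_score) / baseline_score
        progression = change_pct >= 15.0
    return {"calcification": calcification, "progression": progression}

# Example: baseline 120 and 150 Agatston units at 1 year -> 25% change, a progressor.
print(classify_cac(120, 150))
```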
Abstract:
Proteinuria was associated with cardiovascular events and mortality in community-based cohorts. The association of proteinuria with mortality and cardiovascular events in patients undergoing percutaneous coronary intervention (PCI) was unknown. The association of urinary dipstick proteinuria with mortality and cardiovascular events (composite of death, myocardial infarction, or nonhemorrhagic stroke) in 5,835 subjects of the EXCITE trial was evaluated. Dipstick urinalysis was performed before PCI, and proteinuria was defined as trace or greater. Subjects were followed up for 7 months (210 days) after enrollment for the occurrence of events. Multivariate Cox regression analysis evaluated the independent association of proteinuria with each outcome. Mean age was 59 years, 21% were women, 18% had diabetes mellitus, and mean estimated glomerular filtration rate was 90 ml/min/1.73 m(2). Proteinuria was present in 750 patients (13%). During follow-up, 22 subjects (2.9%) with proteinuria and 54 subjects (1.1%) without proteinuria died (adjusted hazard ratio 2.83, 95% confidence interval [CI] 1.65 to 4.84, p <0.001). The strength of the association with mortality after PCI increased with the severity of proteinuria (low-grade proteinuria, hazard ratio 2.67, 95% CI 1.50 to 4.75; high-grade proteinuria, hazard ratio 3.76, 95% CI 1.24 to 11.37). No significant association was present for cardiovascular events during the relatively short follow-up, but high-grade proteinuria tended toward an increased risk of cardiovascular events (hazard ratio 1.45, 95% CI 0.81 to 2.61). In conclusion, proteinuria was strongly and independently associated with mortality in patients undergoing PCI. These data suggest that a relatively simple and clinically easy-to-use tool such as the urinary dipstick may be useful to identify and treat patients at high risk of mortality at the time of PCI. (Am J Cardiol 2008;102:1151-1155)
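As a hedged illustration of the statistical approach mentioned above (multivariate Cox regression), the sketch below shows how such a model could be fitted with the Python lifelines library; the column names and the toy data frame are assumptions and have no relation to the actual trial data.

```python
# Illustrative Cox proportional-hazards fit; not the EXCITE trial analysis.
import pandas as pd
from lifelines import CoxPHFitter

# Assumed columns: follow-up time in days, death indicator (1 = died), covariates.
df = pd.DataFrame({
    "time_days":   [210, 180, 150, 95, 210, 60, 210, 120],
    "death":       [0,   0,   1,   1,  0,   1,  0,   0],
    "proteinuria": [0,   1,   0,   1,  0,   1,  0,   1],   # dipstick trace or greater
    "age":         [55,  63,  48,  71, 59,  66, 52,  60],
    "diabetes":    [0,   1,   0,   1,  0,   0,  0,   1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="death")
cph.print_summary()   # hazard ratio for each covariate = exp(coefficient)
```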
Abstract:
Lentil is a self-pollinating diploid (2n = 14 chromosomes) annual cool-season legume crop that is produced throughout the world and is highly valued as a high-protein food. Several abiotic stresses are important to lentil yields worldwide and include drought, heat, salt susceptibility and iron deficiency. The biotic stresses are numerous and include: susceptibility to Ascochyta blight, caused by Ascochyta lentis; anthracnose, caused by Colletotrichum truncatum; Fusarium wilt, caused by Fusarium oxysporum; Sclerotinia white mold, caused by Sclerotinia sclerotiorum; rust, caused by Uromyces fabae; and numerous aphid-transmitted viruses. Lentil is also highly susceptible to several species of Orobanche prevalent in the Mediterranean region, for which there does not appear to be much resistance in the germplasm. Plant breeders and geneticists have addressed these stresses by identifying resistant/tolerant germplasm, determining the genetics involved and the genetic map positions of the resistance genes. To this end, progress has been made in mapping the lentil genome, and several genetic maps are available that will eventually lead to the development of a consensus map for lentil. Marker density has been limited in the published genetic maps, and there is a distinct lack of co-dominant markers that would facilitate comparisons of the available genetic maps and efficient identification of markers closely linked to genes of interest. Molecular breeding of lentil for disease resistance genes using marker-assisted selection, particularly for resistance to Ascochyta blight and anthracnose, is underway in Australia and Canada, and promising results have been obtained. Comparative genomics and synteny analyses with closely related legumes promise to further advance knowledge of the lentil genome and provide lentil breeders with additional genes and selectable markers for use in marker-assisted selection. Genomic tools such as macro- and microarrays, reverse genetics and genetic transformation are emerging technologies that may eventually be available for use in lentil crop improvement.
Abstract:
The prepartum surge in fetal plasma cortisol is essential for the normal timing of parturition in sheep and may result from an increase in the ratio of ACTH to proopiomelanocortin (POMC) in the fetal circulation. In fetuses subjected to experimental induction of placental restriction, the prepartum surge in fetal cortisol is exaggerated, whereas pituitary POMC mRNA levels are decreased, and in vitro, unstimulated ACTH secretion is elevated in corticotrophs nonresponsive to CRH. We therefore investigated the changes in the relative proportions of cells expressing POMC, ACTH, and the CRH type 1 receptor (CRHR1) shortly before birth and during chronic placental insufficiency. Placental restriction (PR) was induced by removal of the majority of placental attachment sites in five ewes before mating. Pituitaries were collected from control and PR fetal sheep at 140 d (control, n = 4; PR, n = 4) and 144 d (control, n = 6; PR, n = 4). Pituitary sections were labeled with specific antisera raised against POMC, ACTH, and CRHR1. Three major subpopulations of corticotrophs were identified that expressed POMC + ACTH + CRHR1, ACTH + CRHR1, or POMC only. The proportion of pituitary corticotrophs expressing POMC + ACTH + CRHR1 decreased (P < 0.05) between 140 (control, 60 +/- 1%; PR, 66 +/- 4%) and 144 (control, 45 +/- 2%; PR, 56 +/- 6%) d. A significantly higher (P < 0.05) proportion of corticotrophs expressed POMC + ACTH + CRHR1 in the pituitary of the PR group compared with controls. This study is the first to demonstrate subpopulations of corticotrophs in the fetal sheep pituitary that differentially express POMC, ACTH, and CRHR1 and the separate effects of gestational age and placental restriction on these subpopulations of corticotrophs.
Abstract:
Context: A better means to accurately identify malignant thyroid nodules and to distinguish them from benign tumors is needed. We previously identified markers for detecting thyroid malignancy, with sensitivity estimated at or close to 100%. One lingering problem with these markers was that false positives occurred with Hurthle cell adenomas (HCA), which lowered test specificity. Methods: To locate accurate diagnostic markers, we profiled in depth the transcripts of an HCA and a Hurthle cell carcinoma (HCC). From 1146 differentially expressed genes, 18 transcripts specifically expressed in HCA were tested by quantitative PCR in a wide range of thyroid tumors (n = 76). Sensitivity and specificity were calculated using receiver operating characteristic (ROC) curves. Selected markers were further validated in an independent set of thyroid tumors (n = 82) by immunohistochemistry. To define the panel that would yield the best diagnostic accuracy, these markers were tested in combination with our previously identified markers. Results: Seventeen of the 18 genes showed statistical significance based on a mean relative level of expression (P < 0.05). KLK1 (sensitivity = 0.97) and PVALB (sensitivity = 0.94) were the best candidate markers. The combination of PVALB and C1orf24 increased specificity to > 97% and maintained sensitivity for detection of carcinoma. Conclusion: We identified tumor markers that can be used in combination for a more accurate preoperative diagnosis of thyroid nodules and for postoperative diagnosis of thyroid carcinoma in tumor sections. This improved test would help physicians rapidly focus treatment on true malignancies and avoid unnecessary treatment of benign tumors, simultaneously improving medical care and reducing costs. (J Clin Endocrinol Metab 96: E151-E160, 2011)
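For readers who want to see how sensitivity and specificity of candidate markers can be derived from ROC analysis, the sketch below is a generic Python illustration, not the authors' pipeline; the gene names are reused only as labels, and the expression values and combination rule are invented.

```python
# Generic ROC-based evaluation of two candidate markers (toy data only).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

y_true  = np.array([1, 1, 1, 0, 0, 0, 1, 0])                  # 1 = carcinoma, 0 = benign
pvalb   = np.array([0.2, 0.1, 0.3, 2.5, 3.1, 2.8, 0.4, 2.2])  # assumed lower in carcinoma
c1orf24 = np.array([3.0, 2.7, 2.9, 0.5, 0.8, 0.6, 2.5, 1.9])  # assumed higher in carcinoma

# Per-marker AUC (PVALB is negated so that a higher score means more likely malignant).
print("PVALB AUC:  ", roc_auc_score(y_true, -pvalb))
print("C1orf24 AUC:", roc_auc_score(y_true, c1orf24))

# A simple additive combination; the published test combined markers differently.
combined = c1orf24 - pvalb
fpr, tpr, thresholds = roc_curve(y_true, combined)
for f, t, th in zip(fpr, tpr, thresholds):
    print(f"threshold {th:+.2f}: sensitivity = {t:.2f}, specificity = {1 - f:.2f}")
```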
Abstract:
Background: The potential involvement of SRY in abnormal gonadal development in 45,X/46,X,der(Y) patients was proposed following the identification of SRY mutations in a few patients with Turner syndrome (TS). However, its exact etiological role in gonadal dysgenesis in patients with Y chromosome mosaicisms has not yet been clarified. Aims: The aim of this study was to screen for allelic variation in SRY in a large cohort of patients with disorders of sex development due to chromosomal abnormalities with a 45,X/46,X,der(Y) karyotype. Patients: Twenty-seven patients, 14 with TS and 13 with mixed gonadal dysgenesis (MGD), harboring 45,X/46,X,der(Y) karyotypes were selected. Methods: Genomic DNA was extracted from peripheral blood leukocytes of all patients and from gonadal tissue in 4 cases. The SRY coding region was PCR amplified and sequenced. Results: We identified only 1 polymorphism (c.561C>T) in a 45,X/46,XY MGD patient, which was detected in blood and in gonadal tissue. Conclusion: Our results indicate that mutations in SRY are rare findings in patients with Y chromosome mosaicisms. Therefore, a significant role of mutated SRY in the etiology of gonadal dysgenesis in patients harboring the 45,X/46,XY karyotype and variants seems very unlikely.
Abstract:
Nani FS, Torres MLA - Correlation between the Body Mass Index (BMI) of Pregnant Women and the Development of Hypotension after Spinal Anesthesia for Cesarean Section. Background and objectives: Very few publications correlate hypotension after spinal anesthesia for cesarean section with obesity, and especially morbid obesity, in pregnant women. The objective of the present study was to evaluate the incidence of hypotension according to the BMI. Methods: Forty-nine patients with pregestational BMI below 25 kg.m(-2) were included in the Eutrophic group, and 51 patients with BMI >= 25 kg.m(-2) were included in the Overweight group. After spinal anesthesia, blood pressure, volume of crystalloid infused, and dose of vasopressors used until delivery were recorded. A fall in systolic blood pressure (SBP) below 100 mmHg, or a 10% reduction from the initial SBP, was considered hypotension and was corrected by the administration of vasopressors. Results: Episodes of hypotension were fewer in the Eutrophic group (5.89 +/- 0.53 vs. 7.80 +/- 0.66, p = 0.027), as was the amount of crystalloid administered (1,298 +/- 413.6 mL vs. 1,539 +/- 460.0 mL; p = 0.007) and the use of vasopressors (5.87 +/- 3.45 boluses vs. 7.70 +/- 4.46 boluses; p = 0.023). As for associated diseases, we observed a higher incidence of diabetes among obese pregnant women (29.41% vs. 9.76%, RR 1.60, 95%CI: 1.15-2.22, p = 0.036); however, differences in the incidence of pregnancy-induced hypertension (PIH) were not observed between the two groups (overweight: 21.57%, normal weight: 12.20%, RR 1.30, 95%CI: 0.88-1.94, p = 0.28). Conclusions: In the study sample, pregestational BMI >= 25 kg.m(-2) was a risk factor for hypotension after spinal anesthesia in patients undergoing cesarean section. The same group of patients required higher doses of vasopressors. These results indicate that the anesthetic technique in these patients should be improved to reduce the consequences of post-spinal anesthesia hypotension in both pregnant women and fetuses.
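To make the hypotension criterion used above concrete, here is a minimal helper; it is an illustrative sketch only, and the function name and its usage are assumptions rather than part of the study protocol.

```python
def is_hypotension(sbp_now: float, sbp_initial: float) -> bool:
    """Criterion described above: SBP below 100 mmHg, or a fall of at least
    10% from the initial SBP (illustrative helper only)."""
    relative_fall = (sbp_initial - sbp_now) / sbp_initial
    return sbp_now < 100.0 or relative_fall >= 0.10

# Example: initial SBP 130 mmHg, current SBP 115 mmHg -> 11.5% fall, flagged.
print(is_hypotension(sbp_now=115, sbp_initial=130))  # True
```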
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for following vital signs and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can cause dangerous user desensitization. The objective of this study was to describe the development of specific software to integrate intraoperative monitoring parameters, generating "smart alerts" that can help decision making, besides indicating possible diagnoses and treatments. Methods: A system was designed that allowed flexibility in the definition of alerts, combining individual alarms of the monitored parameters to generate a more elaborate alert. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system was tested with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of the monitored parameters, verifying the occurrence of the smart alerts defined in the user interface. Results: With this system, a potential 92% reduction in alarms was observed. We observed that, in most situations that did not generate alerts, the individual alarms did not represent risk to the patient. Conclusions: Implementation of this software allows integration of the monitored data to generate information, such as possible diagnoses or interventions. A substantial potential reduction in the number of alarms during surgery was observed. The information displayed by the system can often be more useful than analysis of isolated parameters.
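The core idea of combining individual alarms into a composite "smart alert" can be sketched as a simple rule engine; the snippet below is a hypothetical illustration, and the thresholds, parameter names, and example rules are assumptions, not taken from the described software.

```python
# Hypothetical rule-based "smart alert" combining individual parameter alarms;
# thresholds and rules are illustrative only.
from dataclasses import dataclass

@dataclass
class VitalSigns:
    heart_rate: float    # beats/min
    systolic_bp: float   # mmHg
    spo2: float          # %

def smart_alerts(v: VitalSigns) -> list[str]:
    alerts = []
    tachycardia = v.heart_rate > 100      # each condition alone would be a noisy alarm
    hypotension = v.systolic_bp < 90
    desaturation = v.spo2 < 92
    # Composite rules that suggest a possible diagnosis or intervention.
    if tachycardia and hypotension:
        alerts.append("Possible hypovolemia: tachycardia with hypotension")
    if hypotension and desaturation:
        alerts.append("Possible cardiorespiratory compromise: hypotension with desaturation")
    return alerts

print(smart_alerts(VitalSigns(heart_rate=118, systolic_bp=82, spo2=95)))
```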
Abstract:
Background: This study was designed to evaluate serum potassium level variation in a porcine model of hemorrhagic shock (HS). Methods: Eight pigs were studied in a controlled hemorrhage model of HS. Blood withdrawal began at a 50 mL/min to 70 mL/min rate, adjusted to reach a mean arterial pressure (MAP) level of 60 mm Hg in 10 minutes. When MAP reached 60 mm Hg, the blood withdrawal rate was adjusted to maintain a MAP decrease rate of 10 mm Hg every 2 minutes to 4 minutes. Arterial and mixed venous blood samples were collected at MAP levels of 60 mm Hg, 50 mm Hg, 40 mm Hg, 30 mm Hg, 20 mm Hg, and 10 mm Hg and analyzed for oxygen saturation, PO(2), PCO(2), potassium, lactate, bicarbonate, hemoglobin, pH, and standard base excess. Results: Significant increase in serum potassium occurred early in all animals. The rate of rise in serum potassium and its levels accompanied the hemodynamic deterioration. Hyperkalemia (K > 5 mmol/L) incidence was 12.5% at 60 mm Hg and 50 mm Hg, 62.5% at 40 mm Hg, 87.5% at 30 mm Hg, and 100% at 20 mm Hg. Strong correlations were found between potassium levels and lactate (R = 0.82), SvO(2) (R = 0.87), Delta pH (R = 0.83), and Delta PCO(2) (R = 0.82). Conclusions: Serum potassium increase accompanies the onset of HS. The rise in serum potassium was directly related to the hemodynamic deterioration of HS and strongly correlated with markers of tissue hypoxia.
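The incidence figures quoted above follow directly from the K > 5 mmol/L definition; the sketch below shows the computation on invented potassium values, not the study data.

```python
# Illustrative computation of hyperkalemia incidence (K > 5 mmol/L) per MAP stage;
# the potassium values below are invented.
potassium_by_map = {
    60: [4.1, 4.3, 4.0, 4.4, 5.2, 4.2, 4.5, 4.3],
    40: [5.3, 4.8, 5.6, 5.1, 5.9, 4.9, 5.4, 5.2],
    20: [6.1, 5.8, 6.4, 5.9, 6.6, 5.7, 6.0, 6.2],
}

for map_mmhg, values in potassium_by_map.items():
    incidence = 100.0 * sum(k > 5.0 for k in values) / len(values)
    print(f"MAP {map_mmhg} mm Hg: hyperkalemia incidence = {incidence:.0f}%")
```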
Abstract:
Increased K+ concentration in seawater induces metamorphosis in the ascidian Herdmania momus. Larvae cultivated at 24 degrees C exhibit the highest rates of metamorphosis when treated with 40 mM KCl-elevated seawater at 21 degrees C. At 24 degrees C, H. momus larvae develop competence to respond to KCl-seawater and initiate metamorphosis approximately 3 h after hatching. Larval trunks and tails separated from the anterior papillae region, but maintained in a common tunic at a distance of greater than 60 mu m, do not undergo metamorphosis when treated with KCl-seawater; normal muscle degradation does not occur in separated tails, while ampullae develop from papillae-containing anterior fragments. Normal programmed degradation of myofibrils occurs when posterior fragments are fused to papillae-containing anterior fragments. These data indicate that H. momus settlement and metamorphosis occur only when larvae have attained competence, and suggest that an anterior signalling centre is stimulated to release a factor that induces metamorphosis.
Abstract:
Background: Dietary salt restriction has been reported to adversely modify the plasma lipoprotein profile in hypertensive and in normotensive subjects. We investigated the effects of low sodium intake (LSI) on the plasma lipoprotein profile and on inflammation and thrombosis biomarkers during the fasting and postprandial periods. Methods: Non-obese, non-treated hypertensive adults (n = 41) were fed strictly controlled diets. An initial week on a control diet (CD, Na = 160 mmol/day) was followed by 3 weeks on the LSI diet (Na = 60 mmol/day). At admission and on the last day of each period, the 24-h ambulatory blood pressure was monitored and blood was drawn after an overnight fasting period and after a fat-rich test meal. Results: Dietary adherence was confirmed by 24-h urinary sodium excretion. Fasting triglyceride (TG), chylomicron-cholesterol, high-sensitivity C-reactive protein (CRP), tumor necrosis factor-alpha (TNF-alpha), and interleukin-6 (IL-6) concentrations, renin activity, aldosterone, insulin, and homeostasis model assessment of insulin resistance (HOMA-IR) values were higher, but non-esterified fatty acids (NEFA) were lower, on LSI than on CD. On LSI, the areas under the curve (AUC) of TG, chylomicron-cholesterol, apoB and the cholesterol/apoB ratio were increased, whereas AUC-NEFA was lowered. LSI did not modify body weight, hematocrit, fasting plasma cholesterol, glucose, adiponectin, leptin, fibrinogen and factor VII (FVII), or the AUC of lipoprotein lipase and of lipoprotein remnants. Conclusion: LSI induced alterations in the plasma lipoproteins and in inflammatory markers that are common features of the metabolic syndrome.
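The postprandial areas under the curve (AUC) referred to above are typically computed with the trapezoidal rule; the sketch below is a generic illustration on invented triglyceride values, not the authors' computation.

```python
# Generic trapezoidal AUC of a postprandial triglyceride curve (toy data).
import numpy as np

time_h = np.array([0, 2, 4, 6, 8])                   # hours after the fat-rich test meal
tg = np.array([110.0, 180.0, 240.0, 190.0, 140.0])   # mg/dl, invented values

auc = np.trapz(tg, time_h)                           # units: mg/dl x h
print(f"Postprandial TG AUC = {auc:.0f} mg/dl*h")
```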