977 results for POPULATION ANALYSIS


Relevance: 40.00%

Abstract:

We report a high-quality draft sequence of the genome of the horse (Equus caballus). The genome is relatively repetitive but has little segmental duplication. Chromosomes appear to have undergone few historical rearrangements: 53% of equine chromosomes show conserved synteny to a single human chromosome. Equine chromosome 11 is shown to have an evolutionary new centromere devoid of centromeric satellite DNA, suggesting that centromeric function may arise before satellite repeat accumulation. Linkage disequilibrium, showing the influences of early domestication of large herds of female horses, is intermediate in length between dog and human, and there is long-range haplotype sharing among breeds.
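
As a hedged illustration of the linkage disequilibrium comparison mentioned above, the short Python sketch below computes the squared genotype correlation (r²) between two SNPs, a standard pairwise LD summary; the genotype vectors are randomly generated toy data, not values from the horse genome analysis.

```python
import numpy as np

def ld_r2(geno_a, geno_b):
    """Squared correlation of 0/1/2 allele counts at two SNPs: a common
    pairwise summary of linkage disequilibrium."""
    return float(np.corrcoef(geno_a, geno_b)[0, 1] ** 2)

rng = np.random.default_rng(0)
snp_a = rng.integers(0, 3, size=200)                    # toy genotypes for 200 animals
noise = rng.integers(0, 3, size=200)
snp_b = np.where(rng.random(200) < 0.8, snp_a, noise)   # partially linked second SNP
print(f"r^2 = {ld_r2(snp_a, snp_b):.3f}")
```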

Relevance: 40.00%

Abstract:

Equine insect bite hypersensitivity (IBH) is a seasonal IgE-mediated dermatosis caused by bites of insects of the genus Culicoides. A familial predisposition for the disease has been shown but, except for the MHC, the genes involved have not been identified so far. An immunogenomic analysis of IBH was performed in a model population of Old Kladruby horses, all living in the same environment. Clinical signs of IBH were used as the phenotypic manifestation of IBH. Furthermore, total serum IgE levels were determined in the sera of these horses and used as an independent phenotypic marker for the immunogenetic analysis. Single nucleotide polymorphisms (SNPs) in candidate immunity-related genes were used for association analyses. Genotypes composed of two to five genes encoding interferon gamma (IFNG), transforming growth factor beta 1 (TGFB1), Janus kinase 2 (JAK2), thymic stromal lymphopoietin (TSLP), and involucrin (IVL) were associated with IBH, indicating a role of these genes in the pathogenesis of IBH. These findings were supported by analysis of gene expression in skin biopsies of 15 affected and 15 unaffected horses. Two markers associated with IBH, IFNG and TGFB1, showed differences in mRNA expression in skin biopsies from IBH-affected and non-affected horses (p<0.05). Expression of the gene coding for the CD14 receptor molecule (CD14) differed in skin biopsies at p<0.06. When total IgE level was treated as a binary trait, genotypes of IGHE, ELA-DRA, and IL10/b were associated with this trait. When treated as a continuous trait, total IgE levels were associated with the genes IGHE, FCER1A, IL4, IL4R, IL10, IL1RA, and JAK2. This first report on non-MHC genes associated with IBH in horses is thus supported by differences in the expression of genes known to play a role in allergy and immunity.
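
The candidate-gene association testing described above can be illustrated with a minimal sketch: a chi-square test of genotype counts against case/control status for a single SNP. The counts below are invented for illustration and do not come from the Old Kladruby data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical genotype counts (AA / AB / BB) for IBH-affected vs unaffected horses
table = np.array([
    [10, 25, 15],   # affected
    [22, 20,  8],   # unaffected
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```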

Relevance: 40.00%

Abstract:

Objectives: Etravirine (ETV) is metabolized by cytochrome P450 (CYP) 3A, 2C9, and 2C19. Metabolites are glucuronidated by uridine diphosphate glucuronosyltransferases (UGT). To identify the potential impact of genetic and non-genetic factors involved in ETV metabolism, we carried out a two-step pharmacogenetics-based population pharmacokinetic study in HIV-1 infected individuals. Materials and methods: The study population included 144 individuals contributing 289 ETV plasma concentrations and four individuals contributing 23 ETV plasma concentrations collected in a rich sampling design. Genetic variants [n=125 single-nucleotide polymorphisms (SNPs)] in 34 genes with a predicted role in ETV metabolism were selected. A first-step population pharmacokinetic model included non-genetic and known genetic factors (seven SNPs in CYP2C, one SNP in CYP3A5) as covariates. Post-hoc individual ETV clearance (CL) was used in a second (discovery) step, in which the effect of the remaining 98 SNPs in CYP3A, cytochrome P450 oxidoreductase (POR), nuclear receptor genes, and UGTs was investigated. Results: A one-compartment model with zero-order absorption best characterized ETV pharmacokinetics. The average ETV CL was 41 l/h (CV 51.1%), the volume of distribution was 1325 l, and the mean absorption time was 1.2 h. The administration of darunavir/ritonavir or tenofovir was the only non-genetic covariate influencing ETV CL significantly, resulting in a 40% [95% confidence interval (CI): 13–69%] and a 42% (95% CI: 17–68%) increase in ETV CL, respectively. Carriers of rs4244285 (CYP2C19*2) had 23% (8–38%) lower ETV CL. Co-administered antiretroviral agents and genetic factors explained 16% of the variance in ETV concentrations. None of the SNPs in the discovery step influenced ETV CL. Conclusion: ETV concentrations are highly variable, and co-administered antiretroviral agents and genetic factors explained only a modest part of the interindividual variability in ETV elimination. Opposing effects of interacting drugs effectively abrogate genetic influences on ETV CL, and vice versa.
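
For readers unfamiliar with the structural model, the sketch below evaluates a one-compartment model with zero-order absorption using the typical values reported above (CL = 41 l/h, V = 1325 l, mean absorption time 1.2 h, hence an assumed zero-order input duration of 2.4 h). The dose is a placeholder; this is an illustrative sketch, not the study's population model.

```python
import numpy as np

def etv_concentration(t, dose_mg=200.0, cl=41.0, v=1325.0, tk0=2.4):
    """Plasma concentration (mg/L) under a one-compartment model with
    zero-order absorption of duration tk0 (h) and first-order elimination."""
    k = cl / v                                        # elimination rate constant (1/h)
    r0 = dose_mg / tk0                                # zero-order input rate (mg/h)
    t = np.asarray(t, dtype=float)
    c_end = (r0 / cl) * (1.0 - np.exp(-k * tk0))      # concentration when input stops
    during = (r0 / cl) * (1.0 - np.exp(-k * t))
    after = c_end * np.exp(-k * (t - tk0))
    return np.where(t <= tk0, during, after)

print(etv_concentration([1.0, 2.4, 12.0, 24.0]))
```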

Relevance: 40.00%

Abstract:

PURPOSE Patients with Alzheimer's disease (AD) have an increased risk of developing seizures or epilepsy. Little is known about the role of risk factors and about the risk of developing seizures/epilepsy in patients with vascular dementia (VD). The aim of this study was to assess incidence rates (IRs) of seizures/epilepsy in patients with AD, VD, or without dementia, and to identify potential risk factors of seizures or epilepsy. METHODS We conducted a follow-up study with a nested case-control analysis using the United Kingdom-based General Practice Research Database (GPRD). We identified patients aged ≥65 years with an incident diagnosis of AD or VD between 1998 and 2008 and a matched comparison group of dementia-free patients. Conditional logistic regression was used to estimate the odds ratio (OR) with a 95% confidence interval (CI) of developing seizures/epilepsy in patients with AD or VD, stratified by age at onset and duration of dementia as well as by use of antidementia drugs. KEY FINDINGS Among 7,086 cases with AD, 4,438 with VD, and 11,524 matched dementia-free patients, we identified 180 cases with an incident diagnosis of seizures/epilepsy. The IRs of epilepsy/seizures for patients with AD or VD were 5.6/1,000 person-years (py) (95% CI 4.6-6.9) and 7.5/1,000 py (95% CI 5.7-9.7), respectively, and 0.8/1,000 py (95% CI 0.6-1.1) in the dementia-free group. In the nested case-control analysis, patients with longer standing (≥3 years) AD had a slightly higher risk of developing seizures or epilepsy than those with a shorter disease duration, whereas in patients with VD the contrary was observed. SIGNIFICANCE Seizures or epilepsy were substantially more common in patients with AD and VD than in dementia-free patients. The role of disease duration as a risk factor for seizures/epilepsy seems to differ between AD and VD.
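
The incidence rates quoted above are events per 1,000 person-years. A minimal sketch of that calculation, with an approximate log-normal confidence interval, is shown below; the event and person-year counts are hypothetical, chosen only to give a rate of similar magnitude to the AD group rather than taken from the GPRD data.

```python
import math

def incidence_rate(events, person_years, per=1000.0):
    """Crude incidence rate per `per` person-years with an approximate 95% CI
    (log-normal approximation for a Poisson count)."""
    rate = events / person_years * per
    se_log = 1.0 / math.sqrt(events)
    return rate, rate * math.exp(-1.96 * se_log), rate * math.exp(1.96 * se_log)

rate, lo, hi = incidence_rate(events=100, person_years=17800)
print(f"IR = {rate:.1f}/1,000 py (95% CI {lo:.1f}-{hi:.1f})")
```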

Relevance: 40.00%

Abstract:

Purpose Recently, multiple clinical trials have demonstrated improved outcomes in patients with metastatic colorectal cancer. This study investigated whether the improved survival is race-dependent. Patients and Methods Overall and cancer-specific survival of 77,490 White and Black patients with metastatic colorectal cancer from the 1988–2008 Surveillance Epidemiology and End Results registry were compared using unadjusted and multivariable adjusted Cox proportional hazard regression as well as competing risk analyses. Results Median age was 69 years, 47.4% of patients were female and 86.0% were White. Median survival was 11 months overall, with an overall increase from 8 to 14 months between 1988 and 2008. Overall survival increased from 8 to 14 months for White patients, and from 6 to 13 months for Black patients. After multivariable adjustment, the following characteristics were associated with better survival: White race, female sex, younger age, better education, being married, higher income, living in an urban area, rectosigmoid junction or rectal cancer location, undergoing cancer-directed surgery, well/moderately differentiated tumors, and N0 tumors (p<0.05 for all covariates). Discrepancies in overall survival based on race did not change significantly over time; however, there was a significant decrease in cancer-specific survival discrepancies over time between White and Black patients, with a hazard ratio of 0.995 (95% confidence interval 0.991–1.000) per year (p=0.03). Conclusion A clinically relevant overall survival increase was found from 1988 to 2008 in this population-based analysis for both White and Black patients with metastatic colorectal cancer. Although both White and Black patients benefited from this improvement, a slight discrepancy between the two groups remained.
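
A minimal sketch of the kind of Cox proportional hazards comparison used above, written with the third-party lifelines package on a small synthetic data set (the variables and effect sizes are made up, not SEER data):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter   # pip install lifelines

rng = np.random.default_rng(0)
n = 500
black = rng.integers(0, 2, n)                            # 1 = Black, 0 = White (synthetic)
months = rng.exponential(scale=np.where(black == 1, 10.0, 12.0))
died = (rng.random(n) < 0.9).astype(int)
df = pd.DataFrame({"months": months, "died": died, "black": black})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
print(cph.summary[["coef", "exp(coef)", "p"]])           # exp(coef) is the hazard ratio
```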

Relevance: 40.00%

Abstract:

BACKGROUND Elevated resting heart rate is known to be detrimental to morbidity and mortality in cardiovascular disease, though its effect in patients with ischemic stroke is unclear. We analyzed the effect of baseline resting heart rate on myocardial infarction (MI) in patients with a recent noncardioembolic cerebral ischemic event participating in PERFORM. METHODS We compared fatal or nonfatal MI using adjusted Cox proportional hazards models for PERFORM patients with baseline heart rate <70 bpm (n=8178) or ≥70 bpm (n=10,802). In addition, heart rate was analyzed as a continuous variable. Other cerebrovascular and cardiovascular outcomes were also explored. RESULTS Heart rate ≥70 bpm was associated with increased relative risk for fatal or nonfatal MI (HR 1.32, 95% CI 1.03-1.69, P=0.029). For every 5-bpm increase in heart rate, there was an increase in relative risk for fatal or nonfatal MI (11.3%, P=0.0002). Heart rate ≥70 bpm was also associated with increased relative risk for a composite of fatal or nonfatal ischemic stroke, fatal or nonfatal MI, or other vascular death (excluding hemorrhagic death) (P<0.001); vascular death (P<0.001); all-cause mortality (P<0.001); and fatal or nonfatal stroke (P=0.04). For every 5-bpm increase in heart rate, there were increases in relative risk for fatal or nonfatal ischemic stroke, fatal or nonfatal MI, or other vascular death (4.7%, P<0.0001), vascular death (11.0%, P<0.0001), all-cause mortality (8.0%, P<0.0001), and fatal or nonfatal stroke (2.4%, P=0.057). CONCLUSION Elevated heart rate ≥70 bpm places patients with a noncardioembolic cerebral ischemic event at increased risk for MI.
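
The "per 5-bpm increase" figures above come from a continuous heart-rate term in the Cox model; the short sketch below shows how a per-5-bpm relative risk converts to the implied per-bpm log-hazard and to other increments. The 11.3% value is taken from the abstract; everything else is plain arithmetic.

```python
import math

hr_per_5bpm = 1.113                         # 11.3% higher risk of MI per 5 bpm
beta_per_bpm = math.log(hr_per_5bpm) / 5.0  # implied log-hazard per 1 bpm
print(f"log-hazard per bpm ≈ {beta_per_bpm:.4f}")
print(f"HR per 10 bpm ≈ {math.exp(10 * beta_per_bpm):.3f}")
```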

Relevance: 40.00%

Abstract:

As part of the global sheep HapMap project, 24 individuals from each of seven indigenous Swiss sheep breeds (Bündner Oberländer sheep (BOS), Engadine Red sheep (ERS), Swiss Black-Brown Mountain sheep (SBS), Swiss Mirror sheep (SMS), Swiss White Alpine (SWA) sheep, Valais Blacknose sheep (VBS) and Valais Red sheep (VRS)) were genotyped using Illumina’s Ovine SNP50 BeadChip. In total, 167 animals were subjected to a detailed analysis for genetic diversity using 45,193 informative single nucleotide polymorphisms. The results of the phylogenetic analyses supported the known proximity between populations such as VBS and VRS or SMS and SWA. Average genomic relatedness within a breed was found to be 12 percent (BOS), 5 percent (ERS), 9 percent (SBS), 10 percent (SMS), 9 percent (SWA), 12 percent (VBS) and 20 percent (VRS). Furthermore, genomic relationships between breeds were found for single individuals from SWA and SMS, VRS and VBS, as well as VRS and BOS. In addition, seven out of 40 indicated parent–offspring pairs could not be confirmed. These results were further supported by results from the genome-wide population cluster analysis. This study provides a better understanding of fine-scale population structures within and between Swiss sheep breeds. This information will support conservation activities for the local Swiss sheep breeds.
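
Genomic relatedness of the kind reported above is typically computed from the SNP genotype matrix. Below is a minimal sketch of a VanRaden-style genomic relationship matrix on toy 0/1/2 genotypes; it is illustrative only and is not the pipeline used in the study.

```python
import numpy as np

def genomic_relationship_matrix(geno):
    """VanRaden-style genomic relationship matrix from an
    (individuals x SNPs) matrix of 0/1/2 allele counts."""
    p = geno.mean(axis=0) / 2.0                  # allele frequency per SNP
    z = geno - 2.0 * p                           # centre by expected allele count
    return z @ z.T / (2.0 * np.sum(p * (1.0 - p)))

rng = np.random.default_rng(1)
toy = rng.integers(0, 3, size=(6, 1000)).astype(float)   # 6 animals, 1,000 toy SNPs
print(np.round(genomic_relationship_matrix(toy), 2))
```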

Relevance: 40.00%

Abstract:

The 220 abundantly equipped burials from the Late Iron Age cemetery of Münsingen (420–240 BC) marked a milestone for Iron Age research. The evident horizontal spread of the graves over the time of occupancy laid the foundation for the chronology system of the Late Iron Age. Today the skulls of 77 individuals and some postcranial bones are still preserved. The aim was to obtain information about the nutrition, social stratification and migration of the individuals from Münsingen. Stable isotope ratios of carbon, nitrogen and sulphur were analysed. The results for 63 individuals show that all consumed C3 plants as staple food, with significant differences between males and females in δ13C and δ15N values. The results indicate a gender-related restriction in access to animal protein. Stable isotope values of one male buried with weapons and meat as grave goods suggest a diet richer in animal protein than that of the other individuals. It is possible that he was privileged due to high status. Furthermore, the δ34S values indicate limited mobility. Assuming that the subadults represent the local δ34S signal, adults with enriched δ34S values most likely migrated to Münsingen at some point during their lives. This study presents stable isotope values from one of the most important Late Iron Age burial sites in Central Europe. The presented data provide new insight into the diet, migration and social stratification of the population from Münsingen.
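
For context, the δ values discussed above follow the standard per-mil delta notation; the sketch below simply encodes the definition. The reference ratio used here is an approximate VPDB 13C/12C value and is only a placeholder.

```python
def delta_per_mil(r_sample, r_standard):
    """Stable isotope delta value in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.011180                                 # approximate 13C/12C of the VPDB standard
print(round(delta_per_mil(0.010912, R_VPDB), 1))  # ≈ -24 per mil, typical of C3 plant diets
```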

Relevance: 40.00%

Abstract:

OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether or not an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery on metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazard regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with a significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted during the time period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
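
A minimal sketch of the stratified propensity score idea used above: model the probability of undergoing palliative primary tumor resection from baseline covariates, then compare groups within propensity score strata. All data and covariates below are synthetic and purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression   # scikit-learn

rng = np.random.default_rng(42)
n = 2000
age = rng.normal(68, 10, n)
grade = rng.integers(1, 4, n).astype(float)
x = np.column_stack([age, grade])
p_treat = 1.0 / (1.0 + np.exp(-0.03 * (70 - age)))     # toy treatment-assignment model
treated = (rng.random(n) < p_treat).astype(int)

ps = LogisticRegression(max_iter=1000).fit(x, treated).predict_proba(x)[:, 1]
strata = np.digitize(ps, np.quantile(ps, [0.2, 0.4, 0.6, 0.8]))  # quintile strata

for s in range(5):
    m = strata == s
    print(f"stratum {s}: n={m.sum():4d}, treated fraction={treated[m].mean():.2f}, "
          f"mean PS={ps[m].mean():.3f}")
```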

Relevance: 40.00%

Abstract:

Gender and racial/ethnic disparities in colorectal cancer (CRC) screening have been observed and associated with income status, education level, treatment and late diagnosis. According to the American Cancer Society, among both males and females, CRC is the third most frequently diagnosed type of cancer and accounts for 10% of cancer deaths in the United States. Differences in CRC test use have been documented and related to access to health care, demographics and health behaviors, but few studies have examined the correlates of CRC screening test use by gender. The present study examined the prevalence of CRC screening test use and assessed whether disparities are explained by gender and racial/ethnic differences. To assess these associations, the study utilized a cross-sectional design and examined the distribution of the covariates for gender and racial/ethnic group differences using the chi-square statistic. Logistic regression was used to estimate the prevalence odds ratio and to adjust for the confounding effects of the covariates.

Results indicated that there are disparities in CRC screening test use and that there was a statistically significant difference in the prevalence of both FOBT and endoscopy screening between genders (χ2, p≤0.003). Females had a lower prevalence of endoscopy colorectal cancer screening than males when adjusting for age and education (OR 0.88, 95% CI 0.82–0.95). However, no statistically significant difference was found between racial/ethnic groups (χ2, p≤0.179) after adjusting for age, education and gender. For both FOBT and endoscopy screening, Non-Hispanic Blacks and Hispanics had a lower prevalence of screening compared with Non-Hispanic Whites. In the multivariable regression model, the gender disparities could largely be explained by age, income status, education level, and marital status. Overall, individuals aged 70–79 years who were married, had some college education and had an income greater than $20,000 had a higher prevalence of colorectal cancer screening test use within gender and racial/ethnic groups.
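
A minimal sketch of the adjusted prevalence odds ratio estimation described above, using statsmodels on a synthetic data set (the covariates, coefficients and sample below are invented; only the modelling approach mirrors the abstract):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 3000
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "age": rng.integers(50, 80, n),
    "college": rng.integers(0, 2, n),
})
logit_p = -1.0 - 0.13 * df["female"] + 0.02 * (df["age"] - 65) + 0.3 * df["college"]
df["screened"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

x = sm.add_constant(df[["female", "age", "college"]].astype(float))
res = sm.Logit(df["screened"], x).fit(disp=False)
print(np.exp(res.params))      # adjusted prevalence odds ratios
print(np.exp(res.conf_int()))  # 95% confidence intervals
```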

Relevance: 40.00%

Abstract:

The relative influence of race, income, education, and Food Stamp Program participation/nonparticipation on the food and nutrient intake of 102 fecund women ages 18-45 years in a Florida urban clinic population was assessed using the technique of multiple regression analysis. Study subgroups were defined by race and Food Stamp Program participation status. Education was found to have the greatest influence on food and nutrient intake. Race was the next most influential factor, followed in order by Food Stamp Program participation and income. The combined effect of the four independent variables explained no more than 19 percent of the variance for any of the food and nutrient intake variables. This would indicate that a more complex model of influences is needed if variations in food and nutrient intake are to be fully explained.

A socioeconomic questionnaire was administered to investigate other factors of influence. The influence of the mother, frequency and type of restaurant dining, and perceptions of food intake and weight were found to be factors deserving further study.

Dietary data were collected using the 24-hour recall and food frequency checklist. Descriptive dietary findings indicated that iron and calcium were nutrients for which adequacy was of concern for all study subgroups. White Food Stamp Program participants had the greatest number of mean nutrient intake values falling below the 1980 Recommended Dietary Allowances (RDAs). When Food Stamp Program participants were contrasted with nonparticipants, mean intakes of six nutrients (kilocalories, calcium, iron, vitamin A, thiamin, and riboflavin) were below the 1980 RDA, compared to five mean nutrient intakes (kilocalories, calcium, iron, thiamin and riboflavin) for the nonparticipants. Use of the Index of Nutritional Quality (INQ), however, revealed that the quality of the diet of Food Stamp Program participants per 1000 kilocalories was adequate with the exception of calcium and iron. Intakes of these nutrients were also not adequate on a 1000 kilocalorie basis for the nonparticipant group. When mean nutrient intakes of the groups were compared using Student's t-test, oleic acid intake was the only significant difference found. Being a nonparticipant in the Food Stamp Program was found to be associated with more frequent consumption of cookies, sweet rolls, doughnuts, and honey. The findings of this study contradict the negative image of the Food Stamp Program participant and emphasize the importance of education.
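
The Index of Nutritional Quality used above compares the nutrient density of the observed diet with the density implied by the allowances. A minimal sketch of that ratio follows; the example intake and RDA numbers are illustrative, not values from the study.

```python
def inq(nutrient_intake, energy_intake_kcal, nutrient_rda, energy_rda_kcal):
    """Index of Nutritional Quality: nutrient amount per 1,000 kcal consumed,
    divided by the allowance per 1,000 kcal of the energy allowance.
    Values >= 1 indicate adequate nutrient density."""
    intake_density = nutrient_intake / (energy_intake_kcal / 1000.0)
    allowance_density = nutrient_rda / (energy_rda_kcal / 1000.0)
    return intake_density / allowance_density

# e.g. 9 mg iron on a 1,800 kcal intake against an 18 mg RDA and a 2,000 kcal allowance
print(round(inq(9, 1800, 18, 2000), 2))   # 0.56 -> iron density below the allowance
```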

Relevance: 40.00%

Abstract:

The purpose of this study was to analyze the implementation of national family planning policy in the United States, which was embedded in four separate statutes during the period of study, Fiscal Years 1976-81. The design of the study utilized a modification of the Sabatier and Mazmanian framework for policy analysis, which defined implementation as the carrying out of statutory policy. The study was divided into two phases. The first part of the study compared the implementation of family planning policy under each of the pertinent statutes. The second part of the study identified factors that were associated with implementation of federal family planning policy within the context of block grants.

Implementation was measured here by federal dollars spent for family planning, adjusted for the size of the respective state target populations. Expenditure data were collected from the Alan Guttmacher Institute and from each of the federal agencies having administrative authority for the four pertinent statutes. Data from the former were used for most of the analysis because they were more complete and more reliable.

The first phase of the study tested the hypothesis that the coherence of a statute is directly related to effective implementation. Equity in the distribution of funds to the states was used to operationalize effective implementation. To a large extent, the results of the analysis supported the hypothesis. In addition to their theoretical significance, these findings were also significant for policymakers insofar as they demonstrated the effectiveness of categorical legislation in implementing desired health policy.

Given the current and historically intermittent emphasis on more state and less federal decision-making in health and human services, the second phase of the study focused on state-level factors that were associated with expenditures of social service block grant funds for family planning. Using the Sabatier-Mazmanian implementation model as a framework, many factors were tested. Those factors showing the strongest conceptual and statistical relationship to the dependent variable were used to construct a statistical model. Using multivariable regression analysis, this model was applied cross-sectionally to each of the years of the study. The most striking finding here was that the dominant determinants of state spending varied for each year of the study (Fiscal Years 1976-1981). The significance of these results was that they provided empirical support for current implementation theory, showing that the dominant determinants of implementation vary greatly over time.

Relevance: 40.00%

Abstract:

Evaluation of the impact of a disease on life expectancy is an important part of public health. Potential gains in life expectancy (PGLE) that properly take into account competing risks are an effective indicator for measuring the impact of multiple causes of death. This study aimed to measure the PGLEs from reducing or eliminating the major causes of death in the USA from 2001 to 2008. To calculate the PGLEs due to the elimination of specific causes of death, the age-specific mortality rates for heart disease, malignant neoplasms, Alzheimer's disease, kidney diseases and HIV/AIDS, together with life-table construction data, were obtained from the National Center for Health Statistics, and multiple-decrement life tables were constructed. The PGLEs from elimination of heart disease, malignant neoplasms or HIV/AIDS continued decreasing from 2001 to 2008, but the PGLEs from elimination of Alzheimer's disease or kidney diseases showed increasing trends. The PGLEs (in years) for all races, males, females, Whites, White males, White females, Blacks, Black males and Black females at birth from complete elimination of heart disease in 2001–2008 were 0.336–0.299, 0.327–0.301, 0.344–0.295, 0.360–0.315, 0.349–0.317, 0.371–0.316, 0.278–0.251, 0.272–0.255, and 0.282–0.246, respectively. Similarly, the PGLEs (in years) for these groups at birth from complete elimination of malignant neoplasms, Alzheimer's disease, kidney diseases or HIV/AIDS in 2001–2008 were also calculated. Most diseases affect specific populations: HIV/AIDS tends to have a greater impact on people of working age, heart disease and malignant neoplasms have a greater impact on people over 65 years of age, and Alzheimer's disease and kidney diseases have a greater impact on people over 75 years of age. To measure the impact of these diseases on life expectancy in people of working age, partial multiple-decrement life tables were constructed and the PGLEs were computed by partial or complete elimination of the various causes of death during the working years. Thus, the results of the study outline a picture of how each single disease could affect life expectancy in age-, race-, or sex-specific populations in the USA. The findings will therefore not only help to evaluate current public health improvements, but also provide useful information for future research and disease control programs.
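
A deliberately simplified sketch of the cause-elimination life table behind a PGLE estimate is given below: all-cause death rates are scaled down by the share of deaths attributed to the cause (assuming independent competing risks), and life expectancy at birth is recomputed. The mortality schedule and cause-of-death shares are toy values, not NCHS data.

```python
import numpy as np

def life_expectancy(qx):
    """Life expectancy at birth from single-year death probabilities qx
    (simplified: deaths at mid-year, table closed at the last age)."""
    lx = np.concatenate([[1.0], np.cumprod(1.0 - qx)])
    person_years = (lx[:-1] + lx[1:]) / 2.0
    return person_years.sum()

def pgle(mx_all, cause_fraction):
    """Potential gain in life expectancy from eliminating one cause of death,
    assuming independence of competing risks."""
    qx_all = mx_all / (1.0 + 0.5 * mx_all)                # rate -> probability
    mx_deleted = mx_all * (1.0 - cause_fraction)
    qx_deleted = mx_deleted / (1.0 + 0.5 * mx_deleted)
    return life_expectancy(qx_deleted) - life_expectancy(qx_all)

ages = np.arange(0, 101)
mx = 0.0001 * np.exp(0.085 * ages)                        # toy Gompertz mortality schedule
share = np.where(ages >= 45, 0.25, 0.02)                  # toy share of deaths from the cause
print(f"PGLE from eliminating the cause: {pgle(mx, share):.2f} years")
```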