29 results for Danish wit and humor.
Abstract:
In this paper we compare the performance of two image classification paradigms (object- and pixel-based) for creating a land cover map of Asmara, the capital of Eritrea, and its surrounding areas using Landsat ETM+ imagery acquired in January 2000. The image classification methods used were maximum likelihood for the pixel-based approach and Bhattacharyya distance for the object-oriented approach, available in the ArcGIS and SPRING software packages, respectively. Advantages and limitations of both approaches are presented and discussed. Classification outputs were assessed using overall accuracy and Kappa indices. The pixel- and object-based classification methods resulted in overall accuracies of 78% and 85%, respectively. The Kappa coefficients for the pixel- and object-based approaches were 0.74 and 0.82, respectively. Although the pixel-based approach is the more commonly used method, accuracy assessment and visual interpretation of the results clearly reveal that the object-oriented approach has advantages for this specific case study.
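Both accuracy measures reported above can be computed directly from a classification's confusion (error) matrix. A minimal sketch, using an illustrative 3-class matrix rather than the paper's data:

```python
import numpy as np

def accuracy_metrics(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = classified classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n  # observed (overall) agreement
    # chance agreement from the row and column marginals
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    kappa = (po - pe) / (1 - pe)
    return po, kappa

# Illustrative 3-class matrix (not the paper's data)
cm = [[50, 5, 5],
      [4, 40, 6],
      [6, 4, 30]]
oa, k = accuracy_metrics(cm)
```

Kappa discounts the agreement expected by chance, which is why it is always lower than overall accuracy (0.74 vs 78% and 0.82 vs 85% in the study).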
Abstract:
Background Non-AIDS defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we describe the incidence of NADC from 2004 to 2010, together with subsequent mortality and its predictors. Methods Individuals were followed from 1st January 2004/enrolment in the study until the earliest of a new NADC, 1st February 2010, death, or six months after the patient's last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the patient's death, 1st February 2010 or 6 months after the patient's last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. Results Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1st February 2010. Time trends for lung cancer, anal cancer and Hodgkin's lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after diagnosis of NADC were lung cancer (compared to other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004-2010 in this large observational cohort. Conclusions The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time.
Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
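The crude incidence rate and confidence interval quoted above follow directly from the event count and person-years, here using a normal approximation on the event count. A sketch reproducing the abstract's figures:

```python
import math

def poisson_rate_ci(events, person_years, per=1000.0, z=1.96):
    """Crude incidence rate per `per` person-years, with a
    normal-approximation (Wald) 95% CI on the event count."""
    rate = events / person_years * per
    se = math.sqrt(events) / person_years * per  # SE of the rate
    return rate, rate - z * se, rate + z * se

# Figures from the abstract: 880 NADC over 176,775 PY
rate, lo, hi = poisson_rate_ci(880, 176775)  # ~4.98 [4.65, 5.31] per 1000 PY
```

With 880 events the normal approximation is close to an exact Poisson interval, which is why it matches the published CI to two decimals.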
Abstract:
Background Few studies have monitored late presentation (LP) of HIV infection over the European continent, including Eastern Europe. Study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm3 or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 were LP (53.8%). LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men, and male and female heterosexuals, but increased over time for female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and in male and female IDUs from Eastern Europe. 8,187 AIDS/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe, but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and IDUs in Eastern Europe. LP was associated with an increased rate of AIDS/deaths, particularly in the first year after HIV diagnosis, with significant variation across Europe. 
Earlier and more widespread testing, timely referrals after testing positive, and improved retention in care strategies are required to further reduce the incidence of LP.
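The COHERE late-presentation definition used above (CD4 count < 350/mm3 at HIV diagnosis, or an AIDS diagnosis within 6 months of HIV diagnosis) is straightforward to operationalize. A sketch under stated assumptions: the argument names and the 183-day cut-off for "six months" are illustrative choices, not taken from the study protocol.

```python
from datetime import date, timedelta

def is_late_presenter(cd4_at_diagnosis, hiv_diagnosis_date, aids_date=None):
    """COHERE definition of late presentation: CD4 < 350 cells/mm3 at
    HIV diagnosis, or an AIDS diagnosis within ~6 months (here 183 days)
    of HIV diagnosis. Argument names are illustrative."""
    if cd4_at_diagnosis is not None and cd4_at_diagnosis < 350:
        return True
    if aids_date is not None:
        return (aids_date - hiv_diagnosis_date) <= timedelta(days=183)
    return False

# Example: low CD4 at diagnosis qualifies regardless of AIDS status
lp = is_late_presenter(200, date(2005, 1, 1))
```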
Abstract:
Context: In virologically suppressed, antiretroviral-treated patients, the effect of switching to tenofovir (TDF) on bone biomarkers compared to patients remaining on stable antiretroviral therapy is unknown. Methods: We examined bone biomarkers (osteocalcin [OC], procollagen type 1 amino-terminal propeptide, and C-terminal cross-linking telopeptide of type 1 collagen) and bone mineral density (BMD) over 48 weeks in virologically suppressed patients (HIV RNA < 50 copies/ml) randomized to switch to TDF/emtricitabine (FTC) or remain on first-line zidovudine (AZT)/lamivudine (3TC). PTH was also measured. Between-group differences in bone biomarkers and associations between change in bone biomarkers and BMD measures were assessed by Student's t tests, Pearson correlation, and multivariable linear regression, respectively. All data are expressed as mean (SD), unless otherwise specified. Results: Of 53 subjects (mean age 46.0 years; 84.9% male; 75.5% Caucasian), 29 switched to TDF/FTC. There were reductions in total hip and lumbar spine BMD in those switching to TDF/FTC (total hip, TDF/FTC, −1.73 (2.76)% vs AZT/3TC, −0.39 (2.41)%; between-group P = .07; lumbar spine, TDF/FTC, −1.50 (3.49)% vs AZT/3TC, +0.25 (2.82)%; between-group P = .06), but these did not reach statistical significance. Greater declines in lumbar spine BMD correlated with greater increases in OC (r = −0.28; P = .05). The effect of TDF/FTC on bone biomarkers remained significant when adjusted for baseline biomarker levels, gender, and ethnicity. There was no difference in change in PTH levels over 48 weeks between treatment groups (between-group P = .23). All biomarkers increased significantly from weeks 0 to 48 in the switch group, with no significant change in those remaining on AZT/3TC (between-group, all biomarkers, P < .0001).
Conclusion: A switch to TDF/FTC compared to remaining on a stable regimen is associated with increases in bone turnover that correlate with reductions in BMD, suggesting that TDF exposure directly affects bone metabolism in vivo.
Abstract:
Background Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. Methods In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. Results A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9×10−4). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. Conclusions In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
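A genetic risk score of the kind described, built from a panel of CAD-associated SNPs, is typically a weighted sum of risk-allele dosages, with individuals then ranked into quartiles of the score distribution. A sketch under that assumption (the abstract does not specify the exact weighting scheme; all numbers below are hypothetical):

```python
import numpy as np

def genetic_risk_score(dosages, weights):
    """Weighted allele-count score: sum over SNPs of
    (risk-allele dosage 0/1/2) x (per-allele weight, e.g. the
    log-odds ratio from the reference GWAS)."""
    return np.asarray(dosages, dtype=float) @ np.asarray(weights, dtype=float)

def top_quartile_flags(scores):
    """Flag individuals in the top quartile of the score distribution
    (the 'unfavorable genetic background' group in the abstract)."""
    scores = np.asarray(scores, dtype=float)
    return scores >= np.percentile(scores, 75)

# Hypothetical individual: dosages at 3 SNPs, with hypothetical weights
score = genetic_risk_score([2, 1, 0], [0.1, 0.2, 0.3])
```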
Abstract:
Due to widespread development of anthelmintic resistance in equine parasites, recommendations for their control are currently undergoing marked changes, with a shift of emphasis toward more coprological surveillance and reduced treatment intensity. Denmark was the first nation to introduce prescription-only restrictions on anthelmintic drugs in 1999, but other European countries have implemented similar legislation over recent years. A questionnaire survey was performed in 2008 among Danish horse owners to provide a current status of practices and perceptions in relation to parasite control. Questions aimed at describing the current use of coprological surveillance and resulting anthelmintic treatment intensities, evaluating knowledge and perceptions about the importance of various attributes of parasite control, and assessing respondents' willingness to pay for advice and parasite surveillance services from their veterinarians. A total of 1060 respondents completed the questionnaire. A large majority of respondents (71.9%) were familiar with the concept of selective therapy. Results illustrated that the respondents' self-evaluation of their knowledge about parasites and their control was significantly associated with their level of interest in the topic and their type of education (P<0.0001). The large majority of respondents either dewormed their horses twice a year and/or performed two fecal egg counts per horse per year. This approach was almost equally pronounced in foals, horses aged 1-3 years old, and adult horses. The respondents rated prevention of parasitic disease and prevention of drug resistance as the most important attributes, while cost and frequent fecal testing were rated least important. Respondents' actual spending on parasite control per horse in the previous year correlated significantly with the amount they declared themselves willing to spend (P<0.0001). However, 44.4% declared themselves willing to pay more than what they were spending.
Altogether, results indicate that respondents were generally familiar with equine parasites and the concept of selective therapy, although there was some confusion over the terms small and large strongyles. They used a large degree of fecal surveillance in all age groups, with a majority of respondents sampling and/or treating around twice a year. Finally, respondents appeared willing to spend money on parasite control for their horses. It is of concern that the survey suggested that foals and young horses are treated in a manner very similar to adult horses, which is against current recommendations. Thus, the survey illustrates the importance of clear communication of guidelines for equine parasite control.
Abstract:
OBJECTIVES: The aim of this study was to determine whether the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI)- or Cockcroft-Gault (CG)-based estimated glomerular filtration rates (eGFRs) perform better in the cohort setting for predicting moderate/advanced chronic kidney disease (CKD) or end-stage renal disease (ESRD). METHODS: A total of 9521 persons in the EuroSIDA study contributed 133 873 eGFRs. Poisson regression was used to model the incidence of moderate and advanced CKD (confirmed eGFR < 60 and < 30 mL/min/1.73 m(2), respectively) or ESRD (fatal/nonfatal) using CG and CKD-EPI eGFRs. RESULTS: Of 133 873 eGFR values, the ratio of CG to CKD-EPI was ≥ 1.1 in 22 092 (16.5%) and the difference between them (CG minus CKD-EPI) was ≥ 10 mL/min/1.73 m(2) in 20 867 (15.6%). Differences between CKD-EPI and CG were much greater when CG was not standardized for body surface area (BSA). A total of 403 persons developed moderate CKD using CG [incidence 8.9/1000 person-years of follow-up (PYFU); 95% confidence interval (CI) 8.0-9.8] and 364 using CKD-EPI (incidence 7.3/1000 PYFU; 95% CI 6.5-8.0). CG-derived eGFRs performed as well as CKD-EPI-derived eGFRs in predicting ESRD (n = 36) and death (n = 565), as measured by the Akaike information criterion. CG-based moderate and advanced CKD were associated with ESRD [adjusted incidence rate ratio (aIRR) 7.17; 95% CI 2.65-19.36 and aIRR 23.46; 95% CI 8.54-64.48, respectively], as were CKD-EPI-based moderate and advanced CKD (aIRR 12.41; 95% CI 4.74-32.51 and aIRR 12.44; 95% CI 4.83-32.03, respectively). CONCLUSIONS: Differences between eGFRs using CG adjusted for BSA or CKD-EPI were modest. In the absence of a gold standard, the two formulae predicted clinical outcomes with equal precision and can be used to estimate GFR in HIV-positive persons.
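Both eGFR formulae compared above are standard and can be sketched directly, with CG standardized to 1.73 m^2 body surface area as the study does; the BSA formula used here (DuBois) is an assumption, and serum creatinine is taken in mg/dL:

```python
def ckd_epi_2009(scr_mg_dl, age, female, black=False):
    """CKD-EPI 2009 creatinine equation (mL/min/1.73 m^2)."""
    k = 0.7 if female else 0.9
    a = -0.329 if female else -0.411
    egfr = (141 * min(scr_mg_dl / k, 1) ** a
                * max(scr_mg_dl / k, 1) ** -1.209
                * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def cockcroft_gault_bsa(scr_mg_dl, age, weight_kg, height_cm, female):
    """Cockcroft-Gault creatinine clearance (mL/min), standardized to
    1.73 m^2 body surface area (DuBois formula assumed)."""
    crcl = (140 - age) * weight_kg / (72 * scr_mg_dl)
    if female:
        crcl *= 0.85
    bsa = 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725
    return crcl * 1.73 / bsa

# Hypothetical 40-year-old male, creatinine 0.9-1.0 mg/dL
e = ckd_epi_2009(0.9, 40, female=False)
c = cockcroft_gault_bsa(1.0, 40, weight_kg=70, height_cm=175, female=False)
```

Without the BSA standardization step, CG returns raw creatinine clearance in mL/min, which is the discrepancy the abstract flags.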
Abstract:
In natural hazard research, risk is defined as a function of (1) the probability of occurrence of a hazardous process, and (2) the assessment of the related extent of damage, defined by the value of elements at risk exposed and their physical vulnerability. Until now, various works have been undertaken to determine vulnerability values for objects exposed to geomorphic hazards such as mountain torrents. Yet, many studies only provide rough estimates for vulnerability values based on proxies for process intensities. However, the deduced vulnerability functions proposed in the literature show a wide range, in particular with respect to medium and high process magnitudes. In our study, we compare vulnerability functions for torrent processes derived from studies in test sites located in the Austrian Alps and in Taiwan. Based on this comparison we expose needs for future research in order to enhance mountain hazard risk management with a particular focus on the question of vulnerability on a catchment scale.
Abstract:
People with psychotic disorders have higher mortality rates compared to the general population. Most deaths are due to cardiovascular (CV) disease, reflecting high rates of CV risk factors such as obesity and diabetes. Treatment with antipsychotic drugs is associated with weight gain in clinical trials. However, there is little information about how these drugs affect children and young people, and how early in the course of treatment the elevation in CV risk factors begins. This information is essential in understanding the costs and benefits of these treatments in young people, and establishing preventive and early intervention services to address physical health comorbidities. This symposium reports both prospective and naturalistic data from children and adolescents treated with antipsychotic drugs. These studies demonstrate that adverse effects on cardiometabolic measures, notably BMI and insulin resistance, become apparent very soon after treatment is initiated. Further, children and adolescents appear to be even more sensitive to these effects than adults. Population-wide studies are also informative. Danish data showing that young people exposed to antipsychotics have a higher risk of diabetes, compared with young people who had a psychiatric diagnosis but were not exposed to antipsychotic drugs, will be presented. In addition, an Australian comparison between a large, nationally representative sample of people with psychosis and a general population sample shows that higher rates of obesity and other cardiometabolic abnormalities are already evident in people with psychosis by the age of 25 years. Young people living with psychosis are already disadvantaged by the demands of living with mental illness, stigma, and social factors such as unemployment and low income. The addition of obesity, diabetes and other comorbidities adds a further burden. 
The data presented highlight the need for careful selection of antipsychotic drugs, regular monitoring of physical health, and early intervention when weight gain, glucose dysregulation, or other cardiometabolic abnormalities are detected.
Abstract:
Denmark and Switzerland are small and successful countries with exceptionally content populations. However, they have very different political institutions and economic models. They have followed the general tendency in the West toward economic convergence, but both countries have managed to stay on top. They both have a strong liberal tradition, but otherwise their economic strategies are a welfare state model for Denmark and a safe haven model for Switzerland. The Danish welfare state is tax-based, while the expenditures for social welfare are insurance-based in Switzerland. The political institutions are a multiparty unicameral system in Denmark, and a permanent coalition system with many referenda and strong local government in Switzerland. Both approaches have managed to ensure smoothly working political power-sharing and economic systems that allocate resources in a fairly efficient way. To date, they have also managed to adapt the economies to changes in the external environment with a combination of stability and flexibility.
Abstract:
Ophthalmologists typically acquire different image modalities to diagnose eye pathologies, including fundus photography, Optical Coherence Tomography (OCT), Computed Tomography (CT) and Magnetic Resonance Imaging (MRI). These images are often complementary and express the same pathologies in different ways, and some pathologies are visible only in a particular modality. It is therefore beneficial for the ophthalmologist to have these modalities fused into a single patient-specific model. The goal of this article is the fusion of fundus photography with segmented MRI volumes, which adds information to the MRI that was not visible before, such as vessels and the macula. The article's contributions include automatic detection of the optic disc, the fovea and the optic axis, and an automatic segmentation of the vitreous humor of the eye.
Abstract:
BACKGROUND The association between combination antiretroviral therapy (cART) and cancer risk, especially regimens containing protease inhibitors (PIs) or nonnucleoside reverse transcriptase inhibitors (NNRTIs), is unclear. METHODS Participants were followed from the latest of D:A:D study entry or January 1, 2004, until the earliest of a first cancer diagnosis, February 1, 2012, death, or 6 months after the last visit. Multivariable Poisson regression models assessed associations between cumulative (per year) use of either any cART or PI/NNRTI, and the incidence of any cancer, non-AIDS-defining cancers (NADC), AIDS-defining cancers (ADC), and the most frequently occurring ADC (Kaposi sarcoma, non-Hodgkin lymphoma) and NADC (lung, invasive anal, head/neck cancers, and Hodgkin lymphoma). RESULTS A total of 41,762 persons contributed 241,556 person-years (PY). A total of 1832 cancers were diagnosed [incidence rate: 0.76/100 PY (95% confidence interval: 0.72 to 0.79)], 718 ADC [0.30/100 PY (0.28-0.32)], and 1114 NADC [0.46/100 PY (0.43-0.49)]. Longer exposure to cART was associated with a lower ADC risk [adjusted rate ratio: 0.88/year (0.85-0.92)] but a higher NADC risk [1.02/year (1.00-1.03)]. Both PI and NNRTI use were associated with a lower ADC risk [PI: 0.96/year (0.92-1.00); NNRTI: 0.86/year (0.81-0.91)]. PI use was associated with a higher NADC risk [1.03/year (1.01-1.05)]. Although this was largely driven by an association with anal cancer [1.08/year (1.04-1.13)], the association remained after excluding anal cancers from the end point [1.02/year (1.01-1.04)]. No association was seen between NNRTI use and NADC [1.00/year (0.98-1.02)]. CONCLUSIONS Cumulative use of PIs may be associated with a higher risk of anal cancer and possibly other NADC. Further investigation of biological mechanisms is warranted.
Abstract:
BACKGROUND Heritable forms of epidermolysis bullosa (EB) constitute a heterogeneous group of skin disorders of genetic aetiology that are characterised by skin and mucous membrane blistering and ulceration in response to even minor trauma. Here we report the occurrence of EB in three Danish Hereford cattle from one herd. RESULTS Two of the animals were necropsied and showed oral mucosal blistering, skin ulcerations and partial loss of horn on the claws. Lesions were histologically characterised by subepidermal blisters and ulcers. Analysis of the family tree indicated that inbreeding and the transmission of a single recessive mutation from a common ancestor could be causative. We performed whole genome sequencing of one affected calf and screened all coding DNA variants. Thereby, we detected a homozygous 2.4 kb deletion encompassing the first exon of the LAMC2 gene, which encodes the laminin gamma 2 protein. This loss-of-function mutation completely removes the start codon of the gene and is therefore predicted to be completely disruptive. The deletion co-segregates with the EB phenotype in the family and is absent in normal cattle of various breeds. Verifying the homozygous private variants present in candidate genes allowed us to quickly identify the causative mutation and contribute to the final diagnosis of junctional EB in Hereford cattle. CONCLUSIONS Our investigation confirms the known role of laminin gamma 2 in EB aetiology and shows the importance of whole genome sequencing in the analysis of rare diseases in livestock.
Abstract:
OBJECTIVES Pre-antiretroviral therapy (ART) inflammation and coagulation activation predict clinical outcomes in HIV-positive individuals. We assessed whether pre-ART inflammatory marker levels predicted the CD4 count response to ART. METHODS Analyses were based on data from the Strategic Management of Antiretroviral Therapy (SMART) trial, an international trial evaluating continuous vs. interrupted ART, and the Flexible Initial Retrovirus Suppressive Therapies (FIRST) trial, evaluating three first-line ART regimens with at least two drug classes. For this analysis, participants had to be ART-naïve or off ART at randomization and (re)starting ART, and to have C-reactive protein (CRP), interleukin-6 (IL-6) and D-dimer measured pre-ART. Using random effects linear models, we assessed the association between each of the biomarker levels, categorized as quartiles, and change in CD4 count from ART initiation to 24 months post-ART. Analyses adjusted for CD4 count at ART initiation (baseline), study arm, follow-up time and other known confounders. RESULTS Overall, 1084 individuals [659 from SMART (26% ART naïve) and 425 from FIRST] met the eligibility criteria, providing 8264 CD4 count measurements. Seventy-five per cent of individuals were male, with a mean age of 42 years. The median (interquartile range) baseline CD4 counts were 416 (350-530) and 100 (22-300) cells/μL in SMART and FIRST, respectively. All of the biomarkers were inversely associated with baseline CD4 count in FIRST but not in SMART. In adjusted models, there was no clear relationship between biomarker levels and mean change in CD4 count post-ART (P for trend: CRP, P = 0.97; IL-6, P = 0.25; and D-dimer, P = 0.29). CONCLUSIONS Pre-ART inflammation and coagulation activation do not predict the CD4 count response to ART and appear to influence the risk of clinical outcomes through mechanisms other than blunting long-term CD4 count gain.