Abstract:
CONTEXT: Recent data regarding the consequences of untreated human immunodeficiency virus (HIV) infection and the expansion of treatment choices for antiretroviral-naive and antiretroviral-experienced patients warrant an update of the International AIDS Society-USA guidelines for the use of antiretroviral therapy in adults with HIV infection. OBJECTIVES: To provide updated recommendations for management of HIV-infected adults, using antiretroviral drugs and laboratory monitoring tools available in the international, developed-world setting. This report provides guidelines for when to initiate antiretroviral therapy, selection of appropriate initial regimens, patient monitoring, when to change therapy, and what regimens to use when changing. DATA SOURCES AND STUDY SELECTION: A panel with expertise in HIV research and clinical care reviewed relevant data published or presented at selected scientific conferences since the last panel report through April 2010. Data were identified through a PubMed search, review of scientific conference abstracts, and requests to antiretroviral drug manufacturers for updated clinical trials and adverse event data. DATA EXTRACTION AND SYNTHESIS: New evidence was reviewed by the panel. Recommendations were drafted by section writing committees and reviewed and edited by the entire panel. The quality and strength of the evidence were rated and recommendations were made by full panel consensus. CONCLUSIONS: Patient readiness for treatment should be confirmed before initiation of antiretroviral treatment. Therapy is recommended for asymptomatic patients with a CD4 cell count ≤500/µL, for all symptomatic patients, and for those with specific conditions and comorbidities. Therapy should be considered for asymptomatic patients with a CD4 cell count >500/µL. Components of the initial and subsequent regimens must be individualized, particularly in the context of concurrent conditions.
Patients receiving antiretroviral treatment should be monitored regularly; treatment failure should be detected and managed early, with the goal of therapy, even in heavily pretreated patients, being HIV-1 RNA suppression below commercially available assay quantification limits.
Abstract:
This is one of the few studies that have explored the value of baseline symptoms and health-related quality of life (HRQOL) in predicting survival in brain cancer patients. Baseline HRQOL scores (from the EORTC QLQ-C30 and the Brain Cancer Module (BN20)) were examined in 490 newly diagnosed glioblastoma patients for their relationship with overall survival, using Cox proportional hazards regression models. Refined techniques such as the bootstrap resampling procedure and the computation of C-indexes and R²-coefficients were used to validate the model. Classical analysis, controlled for major clinical prognostic factors, selected cognitive functioning (P=0.0001), global health status (P=0.0055) and social functioning (P<0.0001) as statistically significant prognostic factors of survival. However, several issues question the validity of these findings. C-indexes and R²-coefficients, which are measures of the predictive ability of the models, did not exhibit major improvements when adding selected or all HRQOL scores to clinical factors. While classical techniques led to positive results, more refined analyses suggest that baseline HRQOL scores add relatively little to clinical factors in predicting survival. These results may have implications for the future use of HRQOL as a prognostic factor in cancer patients.
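The C-index mentioned above measures how well model-predicted risk scores rank patients by survival time. A minimal pure-Python sketch of Harrell's concordance index (the toy data and risk scores are illustrative, not from the study):

```python
from itertools import combinations

def concordance_index(times, events, risk_scores):
    """Harrell's C-index: among usable pairs, the fraction where the
    patient with the higher risk score experienced the event earlier.
    times: observed survival/censoring times
    events: 1 if the event was observed, 0 if censored
    risk_scores: model-predicted risk (higher = worse prognosis)
    """
    concordant, usable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        # Order the pair so that i has the shorter observed time.
        if times[i] > times[j]:
            i, j = j, i
        # Usable only if the shorter time is an observed event;
        # tied times are skipped here for simplicity.
        if times[i] == times[j] or not events[i]:
            continue
        usable += 1
        if risk_scores[i] > risk_scores[j]:
            concordant += 1.0
        elif risk_scores[i] == risk_scores[j]:
            concordant += 0.5  # ties in risk count as half-concordant
    return concordant / usable

# Toy data: higher risk score goes with shorter survival.
t = [5, 10, 12, 20, 30]
e = [1, 1, 0, 1, 0]
r = [0.9, 0.7, 0.6, 0.3, 0.1]
print(concordance_index(t, e, r))  # 1.0: the ranking is perfect
```

A C-index of 0.5 corresponds to random ranking, 1.0 to perfect discrimination, which is why "no major improvement" in the C-index when adding HRQOL scores indicates little added predictive value.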
Abstract:
This chapter reviews some basic concepts underlying ethical issues in adolescence and provides a step-by-step procedure for addressing ethical dilemmas involving minor adolescents, based on a deliberative approach. "Deliberation" with the patient, along with involving the opinion of relevant stakeholders where possible, allows for a careful, multidisciplinary examination of all options, their medical and psychosocial consequences, and the moral values stressed by each option. Although the final decision regarding which ethical option should be chosen usually belongs to the health care provider and his or her patient, the deliberative approach provides the ingredients for sound, unbiased decision-making.
Abstract:
Given the significant impact that glucocorticoid use can have on fracture risk independent of bone density, it has been incorporated as one of the clinical risk factors for calculating the 10-year fracture risk in the World Health Organization's Fracture Risk Assessment Tool (FRAX®). Like the other clinical risk factors, glucocorticoid use is included as a dichotomous variable, with use of steroids defined as past or present exposure to 3 months or more of a daily dose of 5 mg or more of prednisolone or equivalent. The purpose of this report is to give clinicians guidance on adjustments that should be made to the 10-year risk based on the dose, duration of use and mode of delivery of glucocorticoid preparations. A subcommittee of the joint International Society for Clinical Densitometry and International Osteoporosis Foundation Position Development Conference presented its findings to an expert panel, and the following recommendations were selected. 1) There is a dose relationship between glucocorticoid use of greater than 3 months and fracture risk. The average dose exposure captured within FRAX® is likely to be a prednisone dose of 2.5-7.5 mg/day or its equivalent. Fracture probability is under-estimated when the prednisone dose is greater than 7.5 mg/day and over-estimated when it is less than 2.5 mg/day. 2) Frequent intermittent use of higher doses of glucocorticoids increases fracture risk. Because of the variability in dose and dosing schedule, quantification of this risk is not possible. 3) High-dose inhaled glucocorticoids may be a risk factor for fracture. FRAX® may underestimate fracture probability in users of high-dose inhaled glucocorticoids. 4) Appropriate glucocorticoid replacement in individuals with adrenal insufficiency has not been found to increase fracture risk. In such patients, use of glucocorticoids should not be included in FRAX® calculations.
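The dose-dependent under- and over-estimation described in recommendation 1 amounts to scaling the FRAX® probability by a dose-band multiplier. The sketch below is purely illustrative: the multiplier values are placeholder assumptions, not the panel's published adjustment factors, and the dose thresholds are the 2.5 and 7.5 mg/day boundaries quoted above.

```python
def adjust_frax_probability(frax_prob_pct, prednisone_mg_per_day,
                            low_dose_factor=0.8, high_dose_factor=1.15):
    """Illustrative dose adjustment of a 10-year FRAX probability (%).

    FRAX treats glucocorticoid use as a yes/no risk factor calibrated to
    an average exposure of roughly 2.5-7.5 mg/day prednisone-equivalent,
    so this sketch scales the probability down below that range and up
    above it. The factors are placeholder assumptions, not published
    values.
    """
    if prednisone_mg_per_day < 2.5:
        # FRAX over-estimates risk at low dose: scale down.
        return frax_prob_pct * low_dose_factor
    if prednisone_mg_per_day > 7.5:
        # FRAX under-estimates risk at high dose: scale up.
        return frax_prob_pct * high_dose_factor
    # Medium dose: the FRAX output is used as-is.
    return frax_prob_pct

print(adjust_frax_probability(10.0, 10.0))
```

In practice a clinician would substitute the panel's recommended factors for the placeholders; the point of the sketch is only the band structure of the adjustment.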
Abstract:
In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) is a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. A further complication stems from the observation that, in some cases, certain numbers of contributors may be incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N (the number of contributors) and the actual value taken by N.
Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that relies on categorical assumptions about N.
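The qualitative, probabilistic procedure can be sketched under strongly simplifying assumptions: a single locus, the 2N alleles of N contributors drawn independently from population allele frequencies, and no dropout or peak-height information. The allele frequencies below are made up for illustration; the likelihood uses inclusion-exclusion over subsets of the observed allele set.

```python
from itertools import combinations

def prob_allele_set(observed, freqs, n_contributors):
    """P(the 2N alleles of N contributors show exactly the observed set),
    assuming independent draws from population allele frequencies — a
    simplification: no dropout, no peak data, no relatedness.
    Inclusion-exclusion over subsets of the observed alleles.
    """
    n_draws = 2 * n_contributors
    total = 0.0
    for k in range(len(observed) + 1):
        for subset in combinations(observed, k):
            p = sum(freqs[a] for a in subset)
            total += (-1) ** (len(observed) - k) * p ** n_draws
    return total

def posterior_n(observed, freqs, n_values, prior=None):
    """Posterior distribution over the number of contributors N,
    via Bayes' theorem with a flat prior unless one is supplied."""
    if prior is None:
        prior = {n: 1.0 / len(n_values) for n in n_values}
    unnorm = {n: prior[n] * prob_allele_set(observed, freqs, n)
              for n in n_values}
    z = sum(unnorm.values())
    return {n: v / z for n, v in unnorm.items()}

# One locus showing three alleles, with assumed population frequencies.
freqs = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}
post = posterior_n(["a", "b", "c"], freqs, n_values=[2, 3, 4])
print(post)
```

Note how the deterministic procedure would simply report the minimum (N=2 for three alleles), whereas the posterior spreads probability over the candidate values of N, which is what the scoring-rule comparison in the paper evaluates.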
Abstract:
BACKGROUND: International comparisons of social inequalities in alcohol use have not been extensively investigated. The purpose of this study was to examine the relationship of country-level characteristics and individual socio-economic status (SES) to individual alcohol consumption in 33 countries. METHODS: Data on 101,525 men and women collected by cross-sectional surveys in the 33 countries of the GENACIS study were used. Individual SES was measured by highest attained educational level. Alcohol use measures included drinking status and monthly risky single-occasion drinking (RSOD). The relationship between individuals' education and drinking indicators was examined by meta-analysis. In a second step, the individual-level data and country data were combined and tested in multilevel models. As country-level indicators we used the Purchasing Power Parity of the gross national income, the Gini coefficient and the Gender Gap Index. RESULTS: For both genders and all countries, higher individual SES was positively associated with drinking status. Higher country-level SES was also associated with higher proportions of drinkers. Lower SES was associated with RSOD among men. Women of higher SES in low-income countries were more often RSO drinkers than women of lower SES; the opposite was true in higher-income countries. CONCLUSION: For the most part, findings regarding SES and drinking in higher-income countries were as expected. However, women of higher SES in low- and middle-income countries appear at higher risk of engaging in RSOD. This finding should be kept in mind when developing new policy and prevention initiatives.
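The meta-analytic step — combining per-country associations between education and drinking into one summary — can be illustrated with a fixed-effect, inverse-variance pooling sketch. The odds ratios and confidence intervals below are hypothetical, not GENACIS results; the standard error of each log-OR is recovered from its 95% CI.

```python
import math

def pooled_odds_ratio(odds_ratios, ci_lowers, ci_uppers):
    """Fixed-effect (inverse-variance) meta-analytic pooling of
    per-country odds ratios. Each study's SE on the log-OR scale is
    back-calculated from its 95% confidence interval.
    Returns (pooled OR, lower 95% limit, upper 95% limit).
    """
    weights, weighted_sum = [], 0.0
    for or_, lo, hi in zip(odds_ratios, ci_lowers, ci_uppers):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log-OR
        w = 1.0 / se ** 2                                # inverse variance
        weights.append(w)
        weighted_sum += w * math.log(or_)
    pooled_log_or = weighted_sum / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled_log_or),
            math.exp(pooled_log_or - 1.96 * pooled_se),
            math.exp(pooled_log_or + 1.96 * pooled_se))

# Hypothetical per-country ORs for higher education vs drinking status.
or_, lo, hi = pooled_odds_ratio([1.4, 1.8, 1.1],
                                [1.1, 1.2, 0.8],
                                [1.8, 2.7, 1.5])
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The multilevel models described in the METHODS go further than this pooling by letting country-level indicators (income, Gini, Gender Gap Index) modify the individual-level association, but the inverse-variance weighting is the core of the first, meta-analytic step.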
Abstract:
Alcohol and tobacco consumption are well-recognized risk factors for head and neck cancer (HNC). Evidence suggests that genetic predisposition may also play a role. Only a few epidemiologic studies, however, have considered the relation between HNC risk and family history of HNC and other cancers. We pooled individual-level data across 12 case-control studies including 8,967 HNC cases and 13,627 controls. We obtained pooled odds ratios (OR) using fixed- and random-effect models and adjusting for potential confounding factors. All statistical tests were two-sided. A family history of HNC in first-degree relatives increased the risk of HNC (OR=1.7, 95% confidence interval, CI, 1.2-2.3). The risk was higher when the affected relative was a sibling (OR=2.2, 95% CI 1.6-3.1) rather than a parent (OR=1.5, 95% CI 1.1-1.8) and for more distal HNC anatomic sites (hypopharynx and larynx). The risk was also higher in, or limited to, subjects exposed to tobacco. The OR rose to 7.2 (95% CI 5.5-9.5) among subjects with a family history who were alcohol and tobacco users. A weak but significant association (OR=1.1, 95% CI 1.0-1.2) emerged for family history of other tobacco-related neoplasms, particularly laryngeal cancer (OR=1.3, 95% CI 1.1-1.5). No association was observed between family history of non-tobacco-related neoplasms and the risk of HNC (OR=1.0, 95% CI 0.9-1.1). Familial factors play a role in the etiology of HNC. In subjects both with and without a family history of HNC, avoidance of tobacco and alcohol exposure may be the best way to avoid HNC.
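Each OR reported above is built from a 2×2 table of exposure (family history) against case status. A minimal sketch of the crude odds ratio with Woolf's 95% confidence interval follows; the counts are hypothetical and chosen only for illustration, not taken from the pooled studies.

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls,
                  unexposed_cases, unexposed_controls):
    """Crude odds ratio for a case-control 2x2 table, with Woolf's
    95% CI (normal approximation on the log-OR scale).
    Returns (OR, lower 95% limit, upper 95% limit).
    """
    or_ = (exposed_cases * unexposed_controls) / \
          (exposed_controls * unexposed_cases)
    # SE of log(OR): sqrt of summed reciprocal cell counts.
    se = math.sqrt(1 / exposed_cases + 1 / exposed_controls +
                   1 / unexposed_cases + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: family history of HNC among cases vs controls.
or_, lo, hi = odds_ratio_ci(300, 250, 8667, 13377)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The study's published ORs are adjusted for confounders within fixed- and random-effect models, so they would differ from a crude calculation like this; the sketch only shows where the interval estimates come from.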