Abstract:
BACKGROUND Skull-base chondrosarcoma (ChSa) is a rare disease, and prognostication for this disease entity is ill-defined. METHODS We assessed the long-term local control (LC) results, overall survival (OS), and prognostic factors of skull-base ChSa patients treated with pencil beam scanning proton therapy (PBS PT). Seventy-seven patients (35 male; 46%) with histologically confirmed ChSa were treated at the Paul Scherrer Institute. Median age was 38.9 years (range, 10.2-70.0 y). Median delivered dose was 70.0 GyRBE (range, 64.0-76.0 GyRBE). LC, OS, and toxicity-free survival (TFS) rates were calculated using the Kaplan-Meier method. RESULTS After a mean follow-up of 69.2 months (range, 4.6-190.8 mo), 6 (7.8%) local failures were observed, 2 of which were late failures. Five (6.5%) patients died. The actuarial 8-year LC and OS were 89.7% and 93.5%, respectively. Tumor volume > 25 cm³ (P = .02), brainstem/optic apparatus compression at the time of PT (P = .04), and age > 30 years (P = .08) were associated with lower rates of LC. High-grade (≥3) radiation-induced toxicity was observed in 6 (7.8%) patients. The 8-year high-grade TFS was 90.8%. A higher rate of high-grade toxicity was observed for older patients (P = .073), those with larger tumor volume (P = .069), and those treated with 5 weekly fractions (P = .069). CONCLUSIONS This is the largest PT series reporting the outcome of patients with low-grade ChSa of the skull base treated with PBS only. Our data indicate that protons are both safe and effective. Tumor volume, brainstem/optic apparatus compression, and age were prognosticators of local failure.
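The LC, OS, and TFS figures above are Kaplan-Meier (product-limit) estimates. As a minimal sketch of how such a curve is computed (the follow-up times below are invented for illustration, not the study's data):

```python
# Minimal Kaplan-Meier (product-limit) estimator, the method named in the
# abstract. The follow-up times and event flags below are invented.

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = failure, 0 = censored.
    Returns (time, survival probability) pairs at each failure time."""
    curve, surv = [], 1.0
    at_risk = len(times)
    # at tied times, process failures before censorings (standard convention)
    for t, e in sorted(zip(times, events), key=lambda p: (p[0], -p[1])):
        if e == 1:
            surv *= (at_risk - 1) / at_risk  # step down at each failure
            curve.append((t, surv))
        at_risk -= 1  # failures and censorings both leave the risk set
    return curve

# 5 patients: failures at 12 and 40 months, the rest censored
curve = kaplan_meier([12, 25, 40, 60, 90], [1, 0, 1, 0, 0])
```

Sequentially multiplying (n-1)/n at tied failures reproduces the usual (n-d)/n step, so ties are handled correctly without special-casing.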
Abstract:
BACKGROUND Catheter ablation is now widely used for the treatment of persistent atrial fibrillation (AF), but there is a paucity of data about long-term outcomes. This study evaluates (1) 5-year single- and multiple-procedure success and (2) prognostic factors for arrhythmia recurrence after catheter ablation of persistent AF using the stepwise approach aiming at AF termination. METHODS AND RESULTS A total of 549 patients with persistent AF underwent de novo catheter ablation using the stepwise approach (2007-2009). A total of 493 patients were included (Holter ECGs ≥ every 6 months). Mean follow-up was 59 ± 16 months, with 2.1 ± 1.1 procedures per patient. Single- and multiple-procedure success rates were 20.1% and 55.9%, respectively (80% off antiarrhythmic drugs). Antiarrhythmic drug-free multiple-procedure success was 46%. Long-term recurrences (n=171) were paroxysmal AF in 48 patients (28%) and persistent AF/atrial tachycardia in 123 patients (72%). Multivariable recurrent-event analysis revealed the following factors favoring arrhythmia recurrence: failure to terminate AF during the index procedure (hazard ratio [HR], 1.279; 95% confidence interval [CI], 1.093-1.497; P = 0.002), number of procedures (HR, 1.154; 95% CI, 1.051-1.267; P = 0.003), female sex (HR, 1.263; 95% CI, 1.027-1.553; P = 0.027), and the presence of structural heart disease (HR, 1.236; 95% CI, 1.003-1.524; P = 0.047). AF termination was correlated with a higher rate of consecutive procedures because of atrial tachycardia recurrences (HR, 1.71; 95% CI, 1.20-2.43; P = 0.003). CONCLUSIONS Catheter ablation of persistent AF using the stepwise approach provides limited long-term freedom from arrhythmias and often requires multiple procedures. AF termination, the number of procedures, sex, and the presence of structural heart disease correlate with success. AF termination is associated with consecutive atrial tachycardia procedures.
Abstract:
Hypersensitivity of pain pathways is considered a relevant determinant of symptoms in chronic pain patients, but data on its prevalence are very limited. To our knowledge, no data on the prevalence of spinal nociceptive hypersensitivity are available. We studied the prevalence of pain hypersensitivity and spinal nociceptive hypersensitivity in 961 consecutive patients with various chronic pain conditions. The pain threshold and the nociceptive withdrawal reflex threshold to electrical stimulation were used to assess pain hypersensitivity and spinal nociceptive hypersensitivity, respectively. Using the 10th percentile cutoff of previously determined reference values, the prevalence of pain hypersensitivity and spinal nociceptive hypersensitivity (95% confidence interval) was 71.2% (68.3-74.0) and 80.0% (77.0-82.6), respectively. As a secondary aim, we analyzed demographic, psychosocial, and clinical characteristics as factors potentially associated with pain hypersensitivity and spinal nociceptive hypersensitivity using logistic regression models. Both hypersensitivity parameters were unaffected by most factors analyzed. Depression, catastrophizing, pain-related sleep interference, and average pain intensity were significantly associated with hypersensitivity. However, none of them was significant in both the unadjusted and adjusted analyses. Furthermore, the odds ratios were very low, indicating a modest quantitative impact. To our knowledge, this is the largest prevalence study on central hypersensitivity and the first on the prevalence of spinal nociceptive hypersensitivity in chronic pain patients. The results revealed an impressively high prevalence, supporting a high clinical relevance of this phenomenon. Electrical pain thresholds and the nociceptive withdrawal reflex explore aspects of pain processing that are mostly independent of sociodemographic, psychological, and clinical pain-related characteristics.
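A sketch of a binomial confidence interval of the kind reported above; the Wilson score method is an assumption (the abstract does not state which interval was used), applied to the pain-hypersensitivity figure of 71.2% of 961 patients (≈684 cases):

```python
# Wilson score interval sketch for a prevalence estimate. The method
# choice is an assumption; 684/961 approximates the reported 71.2%.
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

lo, hi = wilson_ci(684, 961)  # close to the reported 68.3-74.0% band
```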
Abstract:
BACKGROUND Physicians traditionally treat ulcerative colitis (UC) using a step-up approach. Given the paucity of data, we aimed to assess the cumulative probability of UC-related need for step-up therapy and to identify escalation-associated risk factors. METHODS Patients with UC enrolled into the Swiss IBD Cohort Study were analyzed. The following steps from the bottom to the top of the therapeutic pyramid were examined: (1) 5-aminosalicylic acid and/or rectal corticosteroids, (2) systemic corticosteroids, (3) immunomodulators (IM) (azathioprine, 6-mercaptopurine, methotrexate), (4) TNF antagonists, (5) calcineurin inhibitors, and (6) colectomy. RESULTS Data on 996 patients with UC with a median disease duration of 9 years were examined. The point estimates of cumulative use of different treatments at years 1, 5, 10, and 20 after UC diagnosis were 91%, 96%, 96%, and 97%, respectively, for 5-ASA and/or rectal corticosteroids, 63%, 69%, 72%, and 79%, respectively, for systemic corticosteroids, 43%, 57%, 59%, and 64%, respectively, for IM, 15%, 28%, and 35% (up to year 10 only), respectively, for TNF antagonists, 5%, 9%, 11%, and 12%, respectively, for calcineurin inhibitors, 1%, 5%, 9%, and 18%, respectively, for colectomy. The presence of extraintestinal manifestations and extended disease location (at least left-sided colitis) were identified as risk factors for step-up in therapy with systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and surgery. Cigarette smoking at diagnosis was protective against surgery. CONCLUSIONS The presence of extraintestinal manifestations, left-sided colitis, and extensive colitis/pancolitis at the time of diagnosis were associated with use of systemic corticosteroids, IM, TNF antagonists, calcineurin inhibitors, and colectomy during the disease course.
Abstract:
Although a trimodality regimen for patients with stage IIIA/pN2 non-small-cell lung cancer (NSCLC) has been variably used owing to limited evidence for its benefits, it remains unknown whether any patient subgroup actually receives benefit from such an approach. To explore this question, the published data were reviewed from 1990 to 2015 to identify the possible predictors and prognosticators in this setting. Overall survival was the endpoint of our study. Of 27 identified studies, none had studied the predictors of improved outcomes with trimodality treatment. Of the potential patient- and tumor-related prognosticators, age, gender, and histologic type were the most frequently formally explored. However, none of the 3 was found to influence overall survival. The most prominent finding of the present review was the substantial lack of data supporting a trimodality treatment approach in any patient subgroup. As demonstrated in completed prospective randomized studies, the use of surgery for stage IIIA NSCLC should be limited to well-defined clinical trials.
Abstract:
BACKGROUND Erosive tooth wear is the irreversible loss of dental hard tissue as a result of chemical processes. When the surface of a tooth is attacked by acids, the resulting loss of structural integrity leaves a softened layer on the tooth's surface, which renders it vulnerable to abrasive forces. The authors' objective was to estimate the prevalence of erosive tooth wear and to identify associated factors in a sample of 14- to 19-year-old adolescents in Mexico. METHODS The authors performed a cross-sectional study on a convenience sample (N = 417) of adolescents in a school in Mexico City, Mexico. The authors used a questionnaire and an oral examination performed according to the Lussi index. RESULTS The prevalence of erosive tooth wear was 31.7% (10.8% with exposed dentin). The final logistic regression model included age (P < .01; odds ratio [OR], 1.64; 95% confidence interval [CI], 1.26-2.13), high intake of sweet carbonated drinks (P = .03; OR, 1.81; 95% CI, 1.06-3.07), and xerostomia (P = .04; OR, 2.31; 95% CI, 1.05-5.09). CONCLUSIONS Erosive tooth wear, mainly on the mandibular first molars, was associated with age, high intake of sweet carbonated drinks, and xerostomia. PRACTICAL IMPLICATIONS Knowledge regarding erosive tooth wear in adolescents with relatively few years of exposure to causal factors will increase the focus on effective preventive measures, the identification of people at high risk, and early treatment.
Abstract:
Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29±15 days at purchase, and the fattening period lasted on average 120±28 days. The mean carcass weight was 125±12 kg. A mean of 58±33 calves were fattened per farm and year, and purchased calves were bought from a mean of 20±17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21±15 daily doses per calf and year. The mean mortality risk was 4.1%, calves died at a mean age of 94±50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastrointestinal disease (33%). Two multivariable models were constructed, for antimicrobial drug treatment incidence (53 farms) and mortality (91 farms). No quarantine, shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences >100 kg within a group were associated with increased mortality risk, while vaccination and beef breed were associated with decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine or with spiramycin were used for 54.9%, and amoxicillin for 43.7%, of the oral group treatments. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80-100 kg.
Individual treatments were mainly applied through injections (88.5%) and included administration of fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, 3rd- and 4th-generation cephalosporins in 4.7%, and florfenicol in 3.9% of the cases. The present study identified risk factors for increased antimicrobial drug treatment and mortality. This is an important basis for future studies aimed at reducing treatment incidence and mortality on veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent use of antibiotics.
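The treatment incidence above follows the defined daily dose methodology. A sketch of the general calculation (the drug amount, standard weight, and annual scaling below are illustrative assumptions, not the study's exact protocol):

```python
# Defined-daily-dose (DDD) treatment incidence sketch. All figures below
# are illustrative assumptions, not taken from the study's records.

def treatment_incidence(total_mg_used, ddd_mg_per_kg_day, std_weight_kg,
                        n_animals, days_at_risk):
    """Daily doses administered per animal, scaled to a full year."""
    daily_doses = total_mg_used / (ddd_mg_per_kg_day * std_weight_kg)
    return daily_doses / (n_animals * days_at_risk) * 365

# e.g. 1.2 kg of active substance with a DDD of 10 mg/kg/day, 58 calves
# at an assumed 90 kg standard weight, over a 120-day fattening period
ti = treatment_incidence(1_200_000, 10, 90, 58, 120)
```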
Abstract:
Giardia duodenalis is considered the most common protozoan infecting humans worldwide. Molecular characterization of G. duodenalis isolates has revealed the existence of eight groups (assemblages A to H), which differ in their host distribution. A cross-sectional study was conducted in 639 children from La Habana between January and December 2013. Two assemblage-specific PCRs were carried out for the molecular characterization. The overall prevalence of Giardia infection was 11.9%. DNA from 63 of 76 (82.9%) samples was successfully amplified by PCR-tpi, while 58 of 76 (76.3%) were detected by PCRE1-HF. Similar results by both PCRs were obtained for 54 of 76 samples (71%). According to these analyses, assemblage B and mixed assemblages A + B account for most of the Giardia infections in the cohort of children tested. Our study identified assemblage B as the predominant genotype in children infected with Giardia. Univariate analysis indicated that omission of hand washing before eating and keeping dogs at home were significant risk factors for Giardia infection. In the future, novel molecular tools for better discrimination of assemblages at the sub-assemblage level are needed to verify possible correlations between Giardia genotypes and the symptomatology of giardiasis.
Abstract:
We analyzed juvenile anadromous alewife migration at Bride Lake, a coastal lake in Connecticut, during summer 2006 and found that migration on 24-hour and seasonal timescales was influenced by conditions of the environment and characteristics of the individual. To identify environmental cues of juvenile migration, we continuously video-recorded fish at the lake outflow and employed information-theoretic model selection to identify the best predictors of daily migration rate. More than 80% of the approximately 320,000 juveniles that migrated from mid-June to mid-August departed in three pulses lasting one or two days. Pulses of migration were associated with precipitation events, transient decreases in water temperature, and transient increases in stream discharge. Diel timing of migration shifted over the summer. Early in the season most migration occurred around dawn; late in the season migration occurred at night. To identify individual characteristics associated with migratory behavior, we compared migrating juveniles that we collected as they were exiting Bride Lake to non-migrating juveniles that we collected from the center of the lake. Migrants were a non-random subset of the population; they were on average 1-12 mm larger, 2-14 d older, had grown more rapidly (11% greater length-at-age), and were in better condition (14% greater mass-at-length) than non-migrant fish. We infer that the amount of accumulated energy has a positive effect on the net benefit of migration at any time in the migratory season.
Direct and Indirect Measures of Capacity Utilization: A Nonparametric Analysis of U.S. Manufacturing
Abstract:
We measure the capacity output of a firm as the maximum amount producible by a firm given a specific quantity of the quasi-fixed input and an overall expenditure constraint for its choice of variable inputs. We compute this indirect capacity utilization measure for the total manufacturing sector in the US as well as for a number of disaggregated industries, for the period 1970-2001. We find considerable variation in capacity utilization rates both across industries and over years within industries. Our results suggest that the expenditure constraint was binding, especially in periods of high interest rates.
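A minimal sketch of the kind of linear program behind such an indirect (expenditure-constrained) capacity measure, in the nonparametric activity-analysis spirit of the paper; the two-firm data set, the budget figure, and the exact model specification here are assumptions, not the authors' formulation:

```python
# Indirect capacity-output LP sketch: maximize the output scaling factor
# theta for an evaluated firm, using reference activities (lambdas) built
# from observed firms, subject to its quasi-fixed input and a budget on
# variable-input spending. Data below are invented for illustration.
import numpy as np
from scipy.optimize import linprog

y = np.array([5.0, 10.0])    # observed outputs
k = np.array([4.0, 4.0])     # quasi-fixed input (e.g. capital)
vc = np.array([40.0, 60.0])  # variable-input expenditure of each firm

o, budget = 0, 60.0          # evaluate firm 0 under a 60-unit budget

# decision variables: [theta, lambda_0, lambda_1]; maximize theta
c = np.concatenate(([-1.0], np.zeros(len(y))))
A_ub = np.vstack([
    np.concatenate(([y[o]], -y)),  # theta*y_o <= sum(lambda*y)
    np.concatenate(([0.0], k)),    # reference capital <= firm's capital
    np.concatenate(([0.0], vc)),   # reference spending <= budget
])
b_ub = np.array([0.0, k[o], budget])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)

theta = res.x[0]                     # capacity output = theta * y_o
capacity_utilization = 1.0 / theta   # observed output / capacity output
```

With these numbers the evaluated firm could double its output within its capital and budget limits, so its indirect capacity utilization rate is 0.5.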
Abstract:
The purposes of this study were to determine the prevalence of food insecurity and factors associated with food insecurity among households with children enrolled in Head Start programs in Houston, Texas, and Birmingham, Alabama. This cross-sectional study utilized data gathered from 688 households recruited by convenience sample from two Head Start districts in each city. Interviewers collected data from primary caregivers on demographic characteristics, dietary intake, and the six-item USDA food security module. Chi-square and logistic regression analyses were used to determine the association of food security and demographic characteristics. Comparison of means was used to analyze the association between the child's fruit and vegetable intake and the household's food security status. The prevalence of food insecurity in the sample was 34.9% (95% CI: 31.3%, 38.5%). Characteristics associated with food insecurity were the caregiver's national origin (foreign-born (ref.) v. U.S.-born, adjusted OR = 0.36, 95% CI: 0.14, 0.94), gender of the child (male (ref.) v. female, adjusted OR = 1.44, 95% CI: 1.03, 2.01), and city of residence (Birmingham (ref.) v. Houston, adjusted OR = 0.20, 95% CI: 0.10, 0.39). Children in food-insecure households consumed more daily servings of fruits and vegetables on average (mean = 2.44) than children in food-secure households (mean = 2.16, p = 0.04).
Abstract:
Objective. To determine the prevalence of MRSA colonization in adult patients admitted to intensive care units at an urban tertiary care hospital in Houston, Texas, and to evaluate the risk factors associated with colonization during a three-month active-screening pilot project. Design. This study used secondary data from a small cross-sectional pilot project. Methods. All patients admitted to the seven specialty ICUs were screened for MRSA by nasal culture. Results were obtained using the BD GeneOhm™ IDI-MRSA in vitro diagnostic assay for rapid MRSA detection. Statistical analysis was performed using STATA 10, Epi Info, and JavaStat. Results. 1283/1531 (83.4%) adult ICU admissions were screened for nasal MRSA colonization. Of those screened, demographic and risk factor data were available for 1260/1283 (98.2%). Unresolved results were obtained for 73 patients. Therefore, a total of 1187/1531 (77.5%) of all ICU admissions during the three-month study period are described in this analysis. Risk factors associated with colonization included the following: hospitalization within the last six months (odds ratio 2.48 [95% CI, 1.70-3.63], p<0.001), hospitalization within the last 12 months (odds ratio 2.27 [95% CI, 1.57-3.80], p<0.001), and having diabetes mellitus (odds ratio 1.63 [95% CI, 1.14-2.32], p=0.007). Conclusion. Based on the literature, the prevalence of MRSA in this population is typical of other prevalence studies conducted in the United States and coincides with the continuing upward trend of MRSA colonization. Significant risk factors were similar to those found in previous studies. Overall, the active surveillance screening pilot project has provided valuable information on a population not widely addressed. These findings can aid in future interventions for the education, control, prevention, and treatment of MRSA.
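Odds ratios like those reported above are typically presented with a Wald (Woolf) confidence interval computed from a 2x2 table; a sketch with made-up counts (not the study's data):

```python
# Odds ratio with 95% Wald (Woolf) CI from a 2x2 table sketch.
# The counts in the example are invented, not the study's data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = colonized/not among exposed; c,d = colonized/not among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Example: prior hospitalization (exposed) vs. MRSA colonization
or_, lo, hi = odds_ratio_ci(60, 140, 40, 230)
```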
Abstract:
Research examining programs designed to retain patients in health care has focused on repeated interactions between outreach workers and patients (Bradford et al. 2007; Cheever 2007). The purpose of this study was to determine whether patients who were peer-mentored at their intake exam remained in care longer and attended more physician visits than those who were not mentored. Using patients' medical records and a previously created mentor database, the study determined how many patients attended their intake visit but subsequently failed to establish regular care. The cohort study examined risk factors for establishing care, determined whether patients lacking a peer mentor failed to establish care more often than peer-mentor-assisted patients, and subsequently whether peer-mentored patients had better health outcomes. The sample consists of 1639 patients who were entered into the Thomas Street Patient Mentor Database between May 2005 and June 2007. Assignment to the mentored group was conducted haphazardly, based on mentor availability. The data from the Mentor Database were then analyzed using descriptive statistical software (SPSS version 15; SPSS Inc., Chicago, Illinois, USA). Results indicated that patients who had a mentor at intake were more likely to return for primary care HIV visits at 90 and 180 days. Mentored patients also were more likely to be prescribed ART within 180 days from intake. Other risk factors that affected remaining in care included gender, previous care status, time from diagnosis to intake visit, and intravenous drug use. Clinical health outcomes did not differ significantly between groups. These findings support that mentoring improved retention in care. Continuing to use peer-mentoring programs for HIV care may help increase retention of patients in care and improve patients' health in a cost-effective manner. Future research on the effects of peer mentoring on mentors, and on the effects of concordance between mentor and patient demographics, may help to further improve peer-mentoring programs.
Abstract:
Background. There are 200,000 HIV/HCV co-infected people in the US, and IDUs are at highest risk of exposure. Between 52% and 92% of HIV-infected IDUs are chronically infected with HCV. African Americans and Hispanics bear the largest burden of co-infection. Furthermore, HIV/HCV co-infection is associated with high morbidity and mortality if not treated. The present study investigates the demographic, sexual, and drug-related risk factors for HIV/HCV co-infection among predominantly African American injecting and non-injecting drug users living in two inner-city neighborhoods in Houston, Texas. Methods. This secondary analysis used data collected between February 2004 and June 2005 from 1,889 drug users. Three case-comparison analyses were conducted to investigate the risk factors for HIV/HCV co-infection. HIV mono-infection, HCV mono-infection, and non-infection were each compared to HIV/HCV co-infection to build multivariate logistic regression models. Race/ethnicity and age were forced into each model regardless of significance in the univariate analysis. Results. The overall prevalence of HIV/HCV co-infection was 3.9%, while 39.8% of HIV-infected drug users were co-infected with HCV and 10.7% of HCV-infected drug users were co-infected with HIV. Among HIV-infected IDUs the prevalence of HCV was 71.7%, and among HIV-infected NIDUs the prevalence of HCV was 24%. In the multivariate analysis, HIV/HCV co-infection was associated with injecting drug use when compared to HIV mono-infection, with MSM when compared to HCV mono-infection, and with both injecting drug use and MSM when compared to non-infection. Conclusion. HIV/HCV co-infection was associated with a combination of sexual and risky injecting practices. More data on the prevalence and risk factors for co-infection among minority populations are urgently needed to support the development of targeted interventions and treatment options. Additionally, there should be a focus on promoting safer sex and injecting practices among drug users, as well as on the expansion of routine testing for HIV and HCV infections in this high-risk population.