822 results for Almost-sectional paths
Abstract:
OBJECTIVE The aim of this cross-sectional study was to estimate bone loss around implants with a platform-switching design and to analyze possible risk indicators after 5 years of loading in a multi-center private practice network. METHOD AND MATERIALS Peri-implant bone loss was measured radiographically as the distance from the implant shoulder to the mesial and distal alveolar crest, respectively. Risk factor analysis for marginal bone loss included the type of implant prosthetic treatment concept and the dental status of the opposing arch. RESULTS A total of 316 implants in 98 study patients were examined after 5 years of loading. The overall mean radiographic bone loss was 1.02 mm (SD ± 1.25 mm, 95% CI 0.90-1.14). Correlation analyses indicated a strong association between peri-implant bone loss > 2 mm and removable implant-retained prostheses, with an odds ratio of 53.8. CONCLUSION The 5-year results of the study show clinically acceptable values of mean bone loss after 5 years of loading. Implant-supported removable prostheses appear to be a strong co-factor for extensive bone level changes compared with fixed reconstructions. However, these results must be interpreted in the context of the specific cohort included, treated under private dental office conditions.
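The reported confidence interval can be loosely sanity-checked with the normal-approximation formula CI = mean ± 1.96·SD/√n. A minimal sketch, assuming n = 316 implants (the paper's slightly narrower published interval suggests a different effective n, e.g. per-site measurements):

```python
import math

def mean_ci95(mean, sd, n):
    """Normal-approximation 95% CI for a mean: mean +/- 1.96 * SD / sqrt(n)."""
    half = 1.96 * sd / math.sqrt(n)
    return (mean - half, mean + half)

# Values reported in the abstract: mean 1.02 mm, SD 1.25 mm, 316 implants.
lo, hi = mean_ci95(1.02, 1.25, 316)
print(f"95% CI: {lo:.2f}-{hi:.2f} mm")
```

This gives roughly 0.88-1.16 mm, close to the published 0.90-1.14 mm.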
Abstract:
BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws to either strengthen or deregulate the existing main federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction of firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data on 25 state firearm laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirement for firearms (0·16 [0·09-0·29]; p<0·0001).
Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people, background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. Implementation of universal background checks for the purchase of firearms or ammunition, and firearm identification nationally could substantially reduce firearm mortality in the USA. FUNDING None.
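As a rough illustration of how the projections relate to the reported IRRs, multiplying the baseline rate by each law's multivariable IRR approximates, but does not exactly reproduce, the published estimates, since those come from the fitted Poisson model with covariates. A simplified sketch:

```python
# Naive projection: baseline national rate scaled by each law's IRR.
# The paper's model-based projections (4.46, 1.99, and 1.81 per 100,000)
# differ somewhat because they come from the adjusted regression model.
baseline = 10.35  # firearm deaths per 100,000 people, USA, 2010

irrs = {
    "universal background checks (purchase)": 0.39,
    "ammunition background checks": 0.18,
    "firearm identification requirement": 0.16,
}

for law, irr in irrs.items():
    print(f"{law}: {baseline * irr:.2f} per 100,000")
```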
Abstract:
BACKGROUND Drug resistance is a major barrier to successful antiretroviral treatment (ART). Therefore, it is important to monitor time trends at a population level. METHODS We included 11,084 ART-experienced patients from the Swiss HIV Cohort Study (SHCS) between 1999 and 2013. The SHCS is highly representative and includes 72% of patients receiving ART in Switzerland. Drug resistance was defined as the presence of at least one major mutation in a genotypic resistance test. To estimate the prevalence of drug resistance, data for patients with no resistance test were imputed based on each patient's risk of harboring drug-resistant viruses. RESULTS The emergence of new drug resistance mutations declined dramatically, from 401 patients in 1999 to 23 in 2013. The upper estimated prevalence limit of drug resistance among ART-experienced patients decreased from 57.0% in 1999 to 37.1% in 2013. The prevalence of three-class resistance decreased from 9.0% to 4.4% and was always <0.4% for patients who initiated ART after 2006. Most patients actively participating in the SHCS in 2013 with drug-resistant viruses initiated ART before 1999 (59.8%). Nevertheless, in 2013, 94.5% of patients who initiated ART before 1999 had good remaining treatment options based on the Stanford algorithm. CONCLUSION HIV-1 drug resistance among ART-experienced patients in Switzerland is a well-controlled relic from the pre-combination ART era. The emergence of drug resistance can be virtually stopped with new potent therapies and close monitoring.
Abstract:
BACKGROUND: Cardiovascular diseases are the leading cause of death worldwide and in Switzerland. When applied, treatment guidelines for patients with acute ST-segment elevation myocardial infarction (STEMI) improve the clinical outcome and should eliminate treatment differences by sex and age for patients whose clinical situations are identical. In Switzerland, the rate at which STEMI patients receive revascularization may vary by patient and hospital characteristics. AIMS: To examine all hospitalizations in Switzerland from 2010-2011 to determine if patient or hospital characteristics affected the rate of revascularization (receiving either a percutaneous coronary intervention or a coronary artery bypass grafting) in acute STEMI patients. DATA AND METHODS: We used national data sets on hospital stays, and on hospital infrastructure and operating characteristics, for the years 2010 and 2011, to identify all emergency patients admitted with the main diagnosis of acute STEMI. We then calculated the proportion of patients who were treated with revascularization. We used multivariable multilevel Poisson regression to determine if receipt of revascularization varied by patient and hospital characteristics. RESULTS: Of the 9,696 cases we identified, 71.6% received revascularization. Patients were less likely to receive revascularization if they were female or 80 years or older. In the multivariable multilevel Poisson regression analysis, small-volume hospitals tended to perform fewer revascularizations, but this trend was not statistically significant, whereas being female (relative proportion = 0.91, 95% CI: 0.86 to 0.97) and being older than 80 years remained associated with less frequent revascularization. CONCLUSION: Female and older patients were less likely to receive revascularization. Further research needs to clarify whether this reflects differential application of treatment guidelines or limitations of this kind of routine data.
Abstract:
The Barchi-Kol terrain is a classic locality of ultrahigh-pressure (UHP) metamorphism within the Kokchetav metamorphic belt. We provide a detailed and systematic characterization of four metasedimentary samples using dominant mineral assemblages, mineral inclusions in zircon and monazite, garnet zonation with respect to major and trace elements, and Zr-in-rutile and Ti-in-zircon temperatures. A typical diamond-bearing gneiss records peak conditions of 49 ± 4 kbar and 950–1000 °C. Near-isothermal decompression of this rock resulted in the breakdown of phengite associated with pervasive recrystallization of the rock. The same terrain also contains mica schists that experienced peak conditions close to those of the diamond-bearing rocks, but they were exhumed along a cooler path where phengite remained stable. In these rocks, major and trace element zoning in garnet has been completely equilibrated. A layered gneiss was metamorphosed at UHP conditions in the coesite field, but did not reach diamond-facies conditions (peak conditions: 30 kbar and 800–900 °C). In this sample, garnet records retrograde zonation in major elements and also retains prograde zoning in trace elements. A garnet-kyanite mica schist that reached significantly lower pressures (24 ± 2 kbar, 710 ± 20 °C) contains garnet with major and trace element zoning. The diverse garnet zoning in samples that experienced different metamorphic conditions allows us to establish that diffusional equilibration of rare earth elements in garnet likely occurs at ~900–950 °C. The different metamorphic conditions of the four investigated samples are also documented in zircon trace element zonation and in mineral inclusions in zircon and monazite.
U-Pb geochronology of metamorphic zircon and monazite domains demonstrates that prograde (528–521 Ma), peak (528–522 Ma), and peak to retrograde metamorphism (503–532 Ma) occurred over a relatively short time interval that is indistinguishable from metamorphism of other UHP rocks within the Kokchetav metamorphic belt. Therefore, the assembly of rocks with contrasting P-T trajectories must have occurred in a single subduction-exhumation cycle, providing a snapshot of the thermal structure of a subducted continental margin prior to collision. The rocks were initially buried along a low geothermal gradient. At 20–25 kbar they underwent near isobaric heating of 200 °C, which was followed by continued burial along a low geothermal gradient. Such a step-wise geotherm is in good agreement with predictions from subduction zone thermal models.
Abstract:
Forty-nine percent of pregnancies in the United States are unintended, and significant numbers of pregnancies are unintended for women of all ages. One possible reason for the high rate is that while 85% of women at risk for an unintended pregnancy use contraception, negative attitudes about the method used make them inconsistent users of contraception. Negative attitudes may prevent the remaining 15% of women from using any method of birth control. This study examined adult women's attitudes toward contraception and its use to see if attitudes correlate with unintended pregnancy. To obtain a sample of women experiencing unintended pregnancies, women obtaining therapeutic abortions were surveyed, since almost all women obtaining therapeutic abortions are experiencing an unintended pregnancy. The study used a cross-sectional survey design and included 312 women obtaining abortions at the Planned Parenthood Surgical Services Clinic in Houston in the latter half of 1999. The responses revealed a lack of knowledge about the safety and effectiveness of contraception, particularly for methods other than oral contraceptives and condoms. Thirty-four percent of the participants were uncomfortable buying contraception. While 71% of the participants said their physician recommended their use of contraception, 17% were unsure and 35% did not talk to their physician about contraception on a regular basis. The attitudes of women using contraception were compared with those not using contraception, and many differences were seen. Women not using contraception responded with more ‘unsure’ answers and believed contraception was more difficult to use. They felt planning ahead for the use of contraception interfered with the enjoyment of sex (p-value = 0.06). They were less likely to use contraception if their partner disapproved (p-value = 0.01), and more of them believed their church disapproved of contraception (p-value = 0.02).
In comparison, women using contraception had negative attitudes about the safety of the pill (p-values = 0.01–0.08) and the effectiveness of the condom (p-value = 0.04). Therefore, the negative attitudes women using contraception had about contraception may interfere with their effective use of birth control. Those not using contraception were found to hold attitudes that may contribute to their non-use of contraception.
Abstract:
This paper examines how preference correlation and intercorrelation combine to influence the length of a decentralized matching market's path to stability. In simulated experiments, marriage markets with various preference specifications begin at an arbitrary matching of couples and proceed toward stability via the random mechanism proposed by Roth and Vande Vate (1990). The results of these experiments reveal that fundamental preference characteristics are critical in predicting how long the market will take to reach a stable matching. In particular, intercorrelation and correlation are shown to have an exponential impact on the number of blocking pairs that must be randomly satisfied before stability is attained. The magnitude of the impact is dramatically different, however, depending on whether preferences are positively or negatively intercorrelated.
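The mechanism described above can be sketched directly: starting from some matching, repeatedly pick a blocking pair at random and satisfy it until none remain, counting the steps. A minimal Python sketch under simplifying assumptions (a small complete-preference market, starting from everyone single rather than an arbitrary matching of couples; all names are illustrative):

```python
import random

def blocking_pairs(match_m, match_w, m_pref, w_pref):
    """All pairs (m, w) who strictly prefer each other to their current
    partners (being single is worse than any listed partner)."""
    pairs = []
    for m, prefs in m_pref.items():
        cur = match_m[m]
        cutoff = prefs.index(cur) if cur is not None else len(prefs)
        for w in prefs[:cutoff]:            # women m prefers to his partner
            cur_m = match_w[w]
            if cur_m is None or w_pref[w].index(m) < w_pref[w].index(cur_m):
                pairs.append((m, w))
    return pairs

def random_path_to_stability(m_pref, w_pref, seed=0, max_steps=100_000):
    """Satisfy a uniformly random blocking pair until the matching is stable;
    return the final matching and the number of blocking pairs satisfied."""
    rng = random.Random(seed)
    match_m = {m: None for m in m_pref}
    match_w = {w: None for w in w_pref}
    for steps in range(max_steps):
        pairs = blocking_pairs(match_m, match_w, m_pref, w_pref)
        if not pairs:
            return match_m, steps
        m, w = rng.choice(pairs)
        # Divorce current partners (they become single), then match m and w.
        if match_m[m] is not None:
            match_w[match_m[m]] = None
        if match_w[w] is not None:
            match_m[match_w[w]] = None
        match_m[m], match_w[w] = w, m
    raise RuntimeError("step limit reached before stability")

# Tiny illustrative market: three men, three women, complete preferences.
m_pref = {"m1": ["w1", "w2", "w3"],
          "m2": ["w2", "w1", "w3"],
          "m3": ["w1", "w3", "w2"]}
w_pref = {"w1": ["m2", "m1", "m3"],
          "w2": ["m1", "m2", "m3"],
          "w3": ["m1", "m3", "m2"]}
matching, n_steps = random_path_to_stability(m_pref, w_pref)
print(matching, n_steps)
```

The `steps` counter is the quantity the paper studies: how many blocking pairs must be randomly satisfied before stability is attained. Convergence is guaranteed with probability 1 by the Roth and Vande Vate (1990) result.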
Abstract:
Congressional leadership is a constantly changing phenomenon. New factors and actors are constantly affecting and altering which members ascend to positions of leadership and how that leadership is exercised. A critical change that has occurred in recent times is the inclusion of women in the congressional leadership for the first time. While there has been a great deal of theoretical work on gender and on congressional leadership, there have not been enough actual female leaders in Congress to perform a study until now. The present study examines the impact of gender, committee/legislative performance, ideology, and fundraising ability on leadership ascendancy. The variables are investigated through a comparative case study of Rep. Nancy Pelosi, Rep. Rosa DeLauro, Sen. Hillary Clinton and Sen. Harry Reid.
Abstract:
Reelection and self-interest are recurring themes in the study of our congressional leaders. To date, many studies have been done on the trends between elections, party affiliation, and voting behavior in Congress. However, because a plethora of data has been collected on both elections and congressional voting, drawing a connection between the two is a very reasonable prospect. This project analyzes whether voting shifts in congressional elections have an effect on congressional voting. Will a congressman become ideologically more polarized when his electoral margins increase? Essentially, this paper assumes that all congressmen are ideologically polarized, and that it is elections which serve to reel congressmen back toward the ideological middle. The election and ideological data for this study, which spans from the 56th to the 107th Congress, show statistically significant relationships between these two variables. In fact, congressmen pay attention to election returns when voting in Congress. When broken down by party, Democrats exhibit this phenomenon more strongly, which suggests that Democrats may be more likely to intrinsically follow the popular model of representation. Meanwhile, it can be hypothesized that the insignificant results for Republicans indicate that Republicans may follow a trustee model of representation.
Abstract:
Usual food choices during the past year, self-reported changes in consumption of three important food groups, and weight changes or stability were the questions addressed in this cross-sectional survey and retrospective review. The subjects were 141 patients with Hodgkin's disease or other B-cell types of lymphoma within their first three years following completion of initial treatments for lymphoma at the University of Texas M. D. Anderson Cancer Center in Houston, Texas. The previously validated Block-98 Food Frequency Questionnaire was used to estimate usual food choices during the past year. Supplementary questions asked about changes in breads and cereals (white or whole grain) and relative amounts of fruits and vegetables compared with before diagnosis and treatment. Over half of the subjects reported consuming more whole grains, fruits, and/or vegetables, and almost three-quarters of those not reporting such changes had been consuming whole grains before diagnosis and treatment. Various dietary patterns were defined in order to learn whether proportionately more patients who changed in healthy directions fulfilled recognized nutritional guidelines, such as 5-A-Day fruits and vegetables and the Dietary Reference Intakes (DRIs) for selected nutrients. The small sizes of the dietary pattern sub-groups limited the power of this study to detect differences in meeting recommended dietary guidelines. Nevertheless, insufficient and excessive intakes were detected among individuals with respect to fruits and vegetables, fats, calcium, selenium, iron, folate, and vitamin A. The prevalence of inadequate or excess intakes of foods or nutrients, even among those who perceived that they had increased or continued to eat whole grains and/or fruits and vegetables, is of concern because of recognized effects upon general health and potential cancer-related effects.
Over half of the subjects were overweight or obese (by BMI category) on their first visit to this cancer center, and that proportion increased to almost three-quarters by their last follow-up visits. Men were significantly heavier than women, but no other significant differences in BMI measures were found, even after accounting for prescribed steroids and dietary patterns.
Abstract:
Racial/ethnic disparities in diabetes mellitus (DM) and hypertension (HTN) have been observed and explained by socioeconomic status (education level, income level, etc.), screening, early diagnosis, treatment, prognostic factors, and adherence to treatment regimens. To the author's knowledge, there are no studies addressing disparities in hypertension and diabetes mellitus utilizing Hispanics as the reference racial/ethnic group and adjusting for sociodemographics and prognostic factors. The present study examined racial/ethnic disparities in HTN and DM and assessed whether this disparity is explained by sociodemographics. To assess these associations, the study utilized a cross-sectional design and examined the distribution of the covariates for racial/ethnic group differences, using the Pearson chi-square statistic. The study focused on Non-Hispanic Blacks, since this ethnic group is associated with the worst health outcomes. Logistic regression was used to estimate the prevalence odds ratio (POR) and to adjust for the confounding effects of the covariates. Results indicated that, except for insurance coverage, there were statistically significant differences between Non-Hispanic Blacks and Non-Hispanic Whites, as well as Hispanics, with respect to study covariates. In the unadjusted logistic regression model, there was a statistically significant increased prevalence of hypertension among Non-Hispanic Blacks compared to Hispanics, POR 1.36, 95% CI 1.02-1.80. Low income was statistically significantly associated with increased prevalence of hypertension, POR 0.38, 95% CI 0.32-0.46. Insurance coverage, though not statistically significant, was associated with an increase in the prevalence of hypertension, p>0.05. Concerning DM, Non-Hispanic Blacks were more likely to be diabetic, POR 1.10, 95% CI 0.85-1.47. High income was statistically significantly associated with decreased prevalence of DM, POR 0.47, 95% CI 0.39-0.57.
After adjustment for the relevant covariates, the racial disparity between Hispanics and Non-Hispanic Blacks in HTN was removed, adjusted prevalence odds ratio (APOR) 1.21, 95% CI 0.88-1.67. In this sample, there was racial/ethnic disparity in hypertension but not in diabetes mellitus between Hispanics and Non-Hispanic Blacks, with the disparity in hypertension associated with socioeconomic status (family income, education, marital status) and also with alcohol use, physical activity, and age. However, race, education, and BMI as class variables were statistically significantly associated with hypertension and diabetes mellitus, p<0.0001.
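For readers unfamiliar with the measure, the unadjusted computation behind a prevalence odds ratio can be illustrated from a 2x2 table, with a Woolf (log-scale) 95% CI. The counts below are hypothetical, chosen only for illustration, and adjusted PORs such as the APOR above come from multivariable logistic regression rather than this single-table formula:

```python
import math

def prevalence_odds_ratio(a, b, c, d):
    """POR with Woolf 95% CI from a 2x2 table:
                 outcome+   outcome-
      exposed        a          b
      unexposed      c          d
    """
    por = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(por) - 1.96 * se_log)
    hi = math.exp(math.log(por) + 1.96 * se_log)
    return por, (lo, hi)

# Hypothetical counts for illustration only (not taken from the study):
por, (lo, hi) = prevalence_odds_ratio(120, 380, 90, 410)
print(f"POR = {por:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```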
Abstract:
Gender and racial/ethnic disparities in colorectal cancer (CRC) screening have been observed and associated with income status, education level, treatment, and late diagnosis. According to the American Cancer Society, among both males and females, CRC is the third most frequently diagnosed type of cancer and accounts for 10% of cancer deaths in the United States. Differences in CRC test use have been documented and linked to access to health care, demographics, and health behaviors, but few studies have examined the correlates of CRC screening test use by gender. The present study examined the prevalence of CRC screening test use and assessed whether disparities are explained by gender and racial/ethnic differences. To assess these associations, the study utilized a cross-sectional design and examined the distribution of the covariates for gender and racial/ethnic group differences using the chi-square statistic. Logistic regression was used to estimate the prevalence odds ratio and to adjust for the confounding effects of the covariates. Results indicated that there are disparities in CRC screening test use, with statistically significant differences in the prevalence of both FOBT and endoscopy screening between genders, χ2, p≤0.003. Females had a lower prevalence of endoscopy colorectal cancer screening than males when adjusting for age and education (OR 0.88, 95% CI 0.82–0.95). However, no statistically significant difference was reported between racial/ethnic groups, χ2, p≤0.179, after adjusting for age, education, and gender. For both FOBT and endoscopy screening, Non-Hispanic Blacks and Hispanics had a lower prevalence of screening compared with Non-Hispanic Whites. In the multivariable regression model, the gender disparities could largely be explained by age, income status, education level, and marital status.
Overall, individuals aged 70–79 years who were married, had some college education, and had income greater than $20,000 had a higher prevalence of colorectal cancer screening test use within gender and racial/ethnic groups.
Abstract:
The scale-up of antiretrovirals (ARVs) to treat HIV/AIDS in Africa has been rapid over the last five years. Botswana was the first African nation to roll out a comprehensive ARV program, where ARVs are available to all citizens who qualify. Excellent adherence to these ARVs is necessary to maintain HIV suppression and the ongoing health of all individuals taking them. Children rely almost entirely on their caregivers for the administration of these medications, and very little research has been done to examine the factors which affect both adherence and disclosure to the child of their HIV status. Methods. This cross-sectional study used multiple methods to examine adherence, disclosure, and stigma across various dimensions of the child's and caregiver's lives, including 30 caregiver questionnaires, interviewer-administered 3-day adherence recalls, pharmacy pill counts, and chart reviews. Fifty in-depth interviews were conducted with caregivers, male caregivers, teenagers, and health care providers. Results. Perceived family stigma was found to be a predictor of excellent adherence. After controlling for age, children who live with their mothers were significantly less likely to know their HIV status than children living with any other relative (OR=0.403, p=0.014). Children who have a grandmother living in the household or taking care of them each day are significantly more likely to have optimal adherence than children who do not have grandmother involvement in their daily lives. Discussion. Visible illness plays an intermediary role between adherence and perceived family stigma: caregivers know that ARVs suppress physical manifestations of HIV, and in an effort to avoid unnecessary disclosure of the child's status to family members, they therefore ensure that the children have excellent adherence. Grandmothers play a vital role in supporting the care and treatment of children in Botswana.
Abstract:
Stress at the workplace exposes people to increased risk for poor physical and/or mental health. Recently, psychological and social disadvantages have been shown to place workers at risk for mental or physical health outcomes. The overall purpose of this study was to study full-time employed subjects and (1) describe the various psychosocial job characteristics in a population of low-income individuals, stratified by race/ethnicity, residing in Houston and Brownsville, Texas, and (2) examine the associations between psychosocial job characteristics and physical, mental, and self-rated health. It was observed that having a low level of education is associated with having very little or no control, security, and social support at the workplace. Being Mexican American was associated with having good job control, job security, and job social support, and with having a less demanding job. Furthermore, the psychosocial job characteristics were associated with mental health outcomes but not with physical or self-rated health.