736 results for Cross-sectional studies.
Abstract:
BACKGROUND: The proportion of births attended by skilled health personnel is one of two indicators used to measure progress towards Millennium Development Goal 5, which aims for a 75% reduction in global maternal mortality ratios by 2015. Rwanda has one of the highest maternal mortality ratios in the world, estimated at between 249 and 584 maternal deaths per 100,000 live births. The objectives of this study were to quantify secular trends in health facility delivery and to identify factors that affect the uptake of intrapartum healthcare services among women living in rural villages in Bugesera District, Eastern Province, Rwanda. METHODS: Using census data and probability proportional to size cluster sampling methodology, 30 villages were selected for community-based, cross-sectional surveys of women aged 18-50 who had given birth in the previous three years. Complete obstetric histories and detailed demographic data were elicited from respondents using iPad technology. Geospatial coordinates were used to calculate the path distances between each village and its designated health center and district hospital. Bivariate and multivariate logistic regressions were used to identify factors associated with delivery in health facilities. RESULTS: Analysis of 3106 lifetime deliveries from 859 respondents shows a sharp increase in the percentage of health facility deliveries in recent years. Delivering a penultimate baby at a health facility (OR = 4.681 [3.204 - 6.839]), possessing health insurance (OR = 3.812 [1.795 - 8.097]), managing household finances (OR = 1.897 [1.046 - 3.439]), attending more antenatal care visits (OR = 1.567 [1.163 - 2.112]), delivering more recently (OR = 1.438 [1.120 - 1.847] annually), and living closer to a health center (OR = 0.909 [0.846 - 0.976] per km) were independently associated with facility delivery.
CONCLUSIONS: The strongest correlates of facility-based delivery in Bugesera District include previous delivery at a health facility, possession of health insurance, greater financial autonomy, more recent interactions with the health system, and proximity to a health center. Recent structural interventions in Rwanda, including the rapid scale-up of community-financed health insurance, likely contributed to the dramatic improvement in the health facility delivery rate observed in our study.
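The odds ratios with 95% confidence intervals reported above can be reproduced from a 2×2 exposure–outcome table with the standard Wald method on the log scale. A minimal sketch, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, not taken from the study:
or_, lo, hi = odds_ratio_ci(40, 10, 20, 30)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A multivariate logistic regression, as used in the study, adjusts each such estimate for the other covariates; the 2×2 version corresponds to the bivariate analyses.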
Abstract:
Environmental conditions play an important role in the transmission of malaria; therefore, regulating these conditions can help to reduce disease burden. Environmental management practices for disease control can be implemented at the community level to complement other malaria control methods. This study assesses current knowledge and practices related to mosquito ecology and environmental management for malaria control in a rural, agricultural region of Tanzania. Household surveys were conducted with 408 randomly selected respondents from 10 villages and qualitative data were collected through focus group discussions and in-depth interviews. Results show that respondents are well aware of the links between mosquitoes, the environment, and malaria. Most respondents stated that cleaning the environment around the home, clearing vegetation around the home, or draining stagnant water can reduce mosquito populations, and 63% of respondents reported performing at least one of these techniques to protect themselves from malaria. It is clear that many respondents believe that these environmental management practices are effective malaria control methods, but the actual efficacy of these techniques for controlling populations of vectors or reducing malaria prevalence in the varying ecological habitats in Mvomero is unknown. Further research should be conducted to determine the effects of different environmental management practices on both mosquito populations and malaria transmission in this region, and increased participation in effective techniques should be promoted.
Abstract:
OBJECTIVE: This study examines the degree to which a married individual's health habits and use of preventive medical care are influenced by his or her spouse's behaviors. STUDY DESIGN: Using longitudinal data on individuals and their spouses, we examine changes over time in the health habits of each person as a function of changes in his or her spouse's health habits. Specifically, we analyze changes in smoking, drinking, exercising, cholesterol screening, and obtaining a flu shot. DATA SOURCE: This study uses data from the Health and Retirement Study (HRS), a nationally representative sample of individuals born between 1931 and 1941 and their spouses. Beginning in 1992, 12,652 persons (age-eligible individuals as well as their spouses) from 7,702 households were surveyed about many aspects of their life, including health behaviors, use of preventive services, and disease diagnosis. SAMPLE: The analytic sample includes 6,072 individuals who are married at the time of the initial HRS survey and who remain married and in the sample at the time of the 1996 and 2000 waves. PRINCIPAL FINDINGS: We consistently find that when one spouse improves his or her behavior, the other spouse is likely to do so as well. This is found across all the behaviors analyzed, and persists despite controlling for many other factors. CONCLUSIONS: Simultaneous changes occur in a number of health behaviors. This conclusion has prescriptive implications for developing interventions, treatments, and policies to improve health habits and for evaluating the impact of such measures.
Abstract:
OBJECTIVE: A study was undertaken to determine whether better cognitive functioning at midlife among more physically fit individuals reflects neuroprotection, by which fitness protects against age-related cognitive decline, or neuroselection, by which children with higher cognitive functioning select more active lifestyles. METHODS: Children in the Dunedin Longitudinal Study (N = 1,037) completed the Wechsler Intelligence Scales and the Trail Making, Rey Delayed Recall, and Grooved Pegboard tasks as children and again at midlife (age = 38 years). Adult cardiorespiratory fitness was assessed using a submaximal exercise test to estimate maximum oxygen consumption adjusted for body weight in milliliters/minute/kilogram. We tested whether more fit individuals had better cognitive functioning than their less fit counterparts (which could be consistent with neuroprotection), and whether better childhood cognitive functioning predisposed to better adult cardiorespiratory fitness (neuroselection). Finally, we examined possible mechanisms of neuroselection. RESULTS: Participants with better cardiorespiratory fitness had higher cognitive test scores at midlife. However, fitness-associated advantages in cognitive functioning were already present in childhood. After accounting for childhood baseline performance on the same cognitive tests, there was no association between cardiorespiratory fitness and midlife cognitive functioning. Socioeconomic and health advantages in childhood and healthier lifestyles during young adulthood explained most of the association between childhood cognitive functioning and adult cardiorespiratory fitness. INTERPRETATION: We found no evidence for a neuroprotective effect of cardiorespiratory fitness as of midlife. Instead, children with better cognitive functioning are selecting healthier lives. Fitness interventions may enhance cognitive functioning. However, observational and experimental studies testing neuroprotective effects of physical fitness should consider confounding by neuroselection.
Abstract:
BACKGROUND: In Tanzania, HIV-1 RNA testing is rarely available and not standard of care. Determining virologic failure is challenging and resistance mutations accumulate, thereby compromising second-line therapy. We evaluated durability of antiretroviral therapy (ART) and predictors of virologic failure among a pediatric cohort at four-year follow-up. METHODS: This was a prospective cross-sectional study with retrospective chart review evaluating a perinatally HIV-infected Tanzanian cohort enrolled in 2008-09 with repeat HIV-1 RNA in 2012-13. Demographic, clinical, and laboratory data were extracted from charts, resistance mutations from 2008-09 were analyzed, and prospective HIV RNA was obtained. RESULTS: 161 (78%) participants of the original cohort consented to repeat HIV RNA. The average age was 12.2 years (55% adolescents ≥12 years). Average time on ART was 6.4 years, with 41% receiving second-line (protease inhibitor-based) therapy. Among those originally suppressed on a first-line (non-nucleoside reverse transcriptase-based) regimen, 76% remained suppressed. Of those originally failing first-line, 88% were switched to second-line and 72% had suppressed virus. Increased level of viremia and duration of ART trended with an increased number of thymidine analogue mutations (TAMs). Increased TAMs increased the odds of virologic failure (p = 0.18), as did adolescent age (p < 0.01). CONCLUSIONS: After viral load testing in 2008-09, many participants switched to second-line therapy. The majority achieved virologic suppression despite multiple resistance mutations. Though virologic testing would likely hasten the switch to second-line among those failing, methods to improve adherence are critical to maximize durability of ART and improve virologic outcomes among youth in resource-limited settings.
Abstract:
Antiaging therapies show promise in model organism research. Translation to humans is needed to address the challenges of an aging global population. Interventions to slow human aging will need to be applied to still-young individuals. However, most human aging research examines older adults, many with chronic disease. As a result, little is known about aging in young humans. We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their "biological aging" (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older. Measured biological aging in young adults can be used to identify causes of aging and evaluate rejuvenation therapies.
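The longitudinal measure described above — a rate of change per biomarker per person, averaged across organ systems — can be sketched as follows. This is a simplified illustration of the general approach, not the Dunedin Study's exact algorithm, and the function name and data layout are assumptions:

```python
import numpy as np

def pace_of_aging(panel):
    """panel: array of shape (n_people, n_waves, n_biomarkers), with
    each biomarker coded so that higher values = worse function.
    Returns one pace score per person: the mean standardized rate of
    change across biomarkers over the study waves."""
    n, w, k = panel.shape
    t = np.arange(w, dtype=float)          # wave index used as time
    # z-score each biomarker within each wave so slopes are comparable
    z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
    # least-squares slope over waves for each person x biomarker
    tc = t - t.mean()
    slopes = (z * tc[None, :, None]).sum(axis=1) / (tc ** 2).sum()
    return slopes.mean(axis=1)             # average slope = pace score

# Two people, three waves, one biomarker: person 0 deteriorates fast,
# person 1 is stable, so person 0 gets the higher pace score.
panel = np.array([[[0.0], [2.0], [4.0]],
                  [[1.0], [1.0], [1.0]]])
print(pace_of_aging(panel))
```

In practice the published measure combines many biomarkers (pulmonary, cardiovascular, renal, hepatic, immune, and so on) and uses mixed-effects models rather than the per-person least-squares slopes shown here.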
Abstract:
BACKGROUND: In patients with myelomeningocele (MMC), a high number of fractures occur in the paralyzed extremities, affecting mobility and independence. The aims of this retrospective cross-sectional study are to determine the frequency of fractures in our patient cohort and to identify trends and risk factors relevant for such fractures. MATERIALS AND METHODS: Between March 1988 and June 2005, 862 patients with MMC were treated at our hospital. The medical records, surgery reports, and X-rays from these patients were evaluated. RESULTS: During the study period, 11% of the patients (n = 92) suffered one or more fractures. Risk analysis showed that patients with MMC and thoracic-level paralysis had a sixfold higher risk of fracture compared with those with sacral-level paralysis. Femoral-neck z-scores measured by dual-energy X-ray absorptiometry (DEXA) differed significantly according to the level of neurological impairment, with lower z-scores in children with a higher level of lesion. Furthermore, the rate of epiphyseal separation increased noticeably after cast immobilization. Mainly patients who could walk relatively well were affected. CONCLUSIONS: Patients with thoracic-level paralysis represent a group with high fracture risk. According to these results, fracture and epiphyseal injury in patients with MMC should be treated by plaster immobilization. The duration of immobilization should be kept to a minimum (<4 weeks) because of increased risk of secondary fractures. Alternatively, patients with refractures can be treated by surgery, when nonoperative treatment has failed.
Abstract:
PURPOSE: The readiness assurance process (RAP) of team-based learning (TBL) is an important element that ensures that students come prepared to learn. However, the RAP can use a significant amount of class time which could otherwise be used for application exercises. The authors administered the TBL-associated RAP in class or individual readiness assurance tests (iRATs) at home to compare medical student performance and learning preference for physiology content. METHODS: Using a cross-over study design, first-year medical student TBL teams were divided into two groups. One group was administered iRATs and group readiness assurance tests (gRATs) consisting of physiology questions during scheduled class time. The other group was administered the same iRAT questions at home, and did not complete a gRAT. To compare the effectiveness of the two administration methods, both groups completed the same 12-question physiology assessment during dedicated class time. Four weeks later, the entire process was repeated, with each group administered the RAP using the opposite method. RESULTS: Performance on the physiology assessment after at-home administration of the iRAT was equivalent to performance after traditional in-class administration of the RAP. In addition, a majority of students preferred the at-home method of administration and reported that the at-home method was more effective in helping them learn course content. CONCLUSION: The at-home administration of the iRAT proved effective. The at-home administration method is a promising alternative to conventional iRATs and gRATs, with the goal of preserving valuable in-class time for TBL application exercises.
Abstract:
BACKGROUND: This study examined whether objective measures of food, physical activity and built environment exposures, in home and non-home settings, contribute to children's body weight. Further, comparing GPS and GIS measures of environmental exposures along routes to and from school, we tested for evidence of selective daily mobility bias when using GPS data. METHODS: This study is a cross-sectional analysis, using objective assessments of body weight in relation to multiple environmental exposures. Data presented are from a sample of 94 school-aged children, aged 5-11 years. Children's heights and weights were measured by trained researchers, and used to calculate BMI z-scores. Participants wore a GPS device for one full week. Environmental exposures were estimated within home and school neighbourhoods, and along GIS (modelled) and GPS (actual) routes from home to school. We directly compared associations between BMI and GIS-modelled versus GPS-derived environmental exposures. The study was conducted in Mebane and Mount Airy, North Carolina, USA, in 2011. RESULTS: In adjusted regression models, greater school walkability was associated with significantly lower mean BMI. Greater home walkability was associated with increased BMI, as was greater school access to green space. Adjusted associations between BMI and route exposure characteristics were null. The use of GPS-actual route exposures did not appear to confound associations between environmental exposures and BMI in this sample. CONCLUSIONS: This study found few associations between environmental exposures in home, school and commuting domains and body weight in children. However, walkability of the school neighbourhood may be important. Of the other significant associations observed, some were in unexpected directions. Importantly, we found no evidence of selective daily mobility bias in this sample, although our study design is in need of replication in a free-living adult sample.
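For context, BMI z-scores of the kind used in the methods above are typically computed with the LMS method underlying the CDC and WHO growth references. A sketch with placeholder reference values — the real L (skewness), M (median), and S (coefficient of variation) come from published age- and sex-specific tables:

```python
import math

def bmi_z_lms(bmi, L, M, S):
    """BMI-for-age z-score via the LMS method: L, M, S are the
    reference skewness, median, and coefficient of variation for
    the child's age and sex."""
    if L == 0:
        return math.log(bmi / M) / S
    return ((bmi / M) ** L - 1) / (L * S)

# Hypothetical LMS values, not from an official table:
print(round(bmi_z_lms(bmi=19.0, L=-2.0, M=16.0, S=0.12), 2))
```

A child whose BMI equals the reference median gets a z-score of 0 by construction; positive scores indicate heavier-than-reference children.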
Abstract:
Copper (Cu) has been widely used in the under bump metallurgy of chip and substrate metallization for chip packaging. However, due to the rapid formation of Cu–Sn intermetallic compound (IMC) at the tin-based solder/Cu interface during solder reaction, the reliability of this type of solder joint is a serious concern. In this work, an electroless nickel–phosphorus (Ni–P) layer was deposited on the Cu pad of the flexible substrate as a diffusion barrier between Cu and the solder materials. The deposition was carried out in a commercial acidic sodium hypophosphite bath at 85 °C for different pH values. It was found that, for the same deposition time, a higher-pH (mildly acidic) bath composition yields a thicker Ni–P layer with a lower phosphorus content. Solder balls of composition 62%Sn–36%Pb–2%Ag were reflowed at 240 °C for 1 to 180 min on three types of electroless Ni–P layers deposited at pH values of 4, 4.8, and 6, respectively. Thermal stability of the electroless Ni–P barrier layer against the Sn–36%Pb–2%Ag solder reflowed for different time periods was examined by scanning electron microscopy equipped with energy dispersive X-ray analysis. A solder ball shear test was performed to find the relationship between the mechanical strength of the solder joints and the characteristics of the deposited electroless Ni–P layer. The layer deposited in the pH 4 bath was a weak barrier against reflow soldering, whereas the layer deposited in the pH 6 bath was a better barrier. The mechanical strength of the joints deteriorated quickly for the layer deposited in the pH 4 bath, which was found to be thin and to have a high phosphorus content. From the cross-sectional studies and fracture surface analyses, it was found that the appearance of a dark crystalline phosphorus-rich Ni layer weakened the interface and hence lowered the solder ball shear strength. The Ni–Sn IMC formed at the interfaces was found to be more stable on the low-phosphorus-content (∼14 at.%) layer. Electroless Ni–P deposited in a mildly acidic bath, giving a phosphorus content of around 14 at.%, is suggested as the best barrier layer for Sn–36%Pb–2%Ag solder.
Abstract:
Ball shear testing is the most common method used to assess the reliability of bond strength for ball grid array (BGA) packages. In this work, a combined experimental and numerical study was carried out to characterize BGA solder interface strength. Solder-mask-defined bond pads on the BGA substrate were used for BGA ball bonding. Different bond pad metallizations and solder alloys were used. Solid-state aging at 150 °C for up to 1000 h was carried out to change the interfacial microstructure. Cross-sectional studies of the solder-to-bond pad interfaces were conducted by scanning electron microscopy (SEM) equipped with an energy dispersive X-ray (EDX) analyzer to investigate the interfacial reaction phenomena. Ball shear tests were carried out to obtain the mechanical strength of the solder joints and to correlate shear behaviour with the interfacial reaction products. An attempt was made to rationalize the experimental findings by finite element analysis (FEA). It was found that intermetallic compound (IMC) formation at the solder interface plays an important role in BGA solder bond strength. By changing the morphology and the microchemistry of the IMCs, the fracture propagation path could be changed and hence reliability could be improved.
Abstract:
Although intergroup contact is one of the most prominent interventions to reduce prejudice, the generalization of contact effects is still a contentious issue. This research further examined the rarely studied secondary transfer effect (STE; Pettigrew, 2009), by which contact with a primary outgroup reduces prejudice toward secondary groups that are not directly involved in the contact. Across 3 cross-sectional studies conducted in Cyprus (N = 1,653), Northern Ireland (N = 1,973), and Texas (N = 275) and 1 longitudinal study conducted in Northern Ireland (N = 411), the present research sought to systematically rule out alternative accounts of the STE and to investigate 2 potential mediating mechanisms (ingroup reappraisal and attitude generalization). Results indicated that, consistent with the STE, contact with a primary outgroup predicts attitudes toward secondary outgroups, over and above contact with the secondary outgroup, socially desirable responding, and prior attitudes. Mediation analyses found strong evidence for attitude generalization but only limited evidence for ingroup reappraisal as an underlying process. Two out of 3 tests of a reverse model, where contact with the secondary outgroup predicts attitudes toward the primary outgroup, provide further evidence for an indirect effect through attitude generalization. Theoretical and practical implications of these results are discussed, and directions for future research are identified.
Abstract:
Objectives
To determine whether excessive and often inappropriate or dangerous psychotropic drug dispensing to older adults is unique to care homes or is a continuation of community treatment.
Design
Population-based data-linkage study using prescription drug information.
Setting
Northern Ireland's national prescribing database and care home information from the national inspectorate.
Participants
Two hundred fifty thousand six hundred seventeen individuals aged 65 and older.
Measurements
Prescription information was extracted for all psychotropic drugs included in the British National Formulary (BNF) categories 4.1.1, 4.1.2, and 4.2.2 (hypnotics, anxiolytics, and antipsychotics) dispensed over the study period. Repeated cross-sectional analysis was used to monitor changes in psychotropic drug dispensing over time.
Results
Psychotropic drug use was higher in care homes than the community; 20.3% of those in care homes were dispensed an antipsychotic in January 2009, compared with 1.1% of those in the community. People who entered care had higher use of psychotropic medications before entry than those who did not enter care, but this increased sharply in the month of admission and continued to rise. Antipsychotic drug dispensing increased from 8.2% before entry to 18.6% after entering care (risk ratio (RR) = 2.26, 95% confidence interval (CI)=1.96–2.59) and hypnotic drug dispensing from 14.8% to 26.3% (RR=1.78, 95% CI=1.61–1.96).
Conclusion
A continuation of high use before entry cannot wholly explain the higher dispensing of psychotropic drugs to individuals in care homes. Although drug dispensing is high in older people in the community, it increases dramatically on entry to care. Routine medicine reviews are necessary in older people and are especially important during transitions of care.
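The risk ratios quoted in the results above can be recovered directly from the reported proportions, and a log-normal confidence interval can be attached when the underlying counts are available. A sketch (the counts passed to `rr_ci` below are hypothetical, chosen only to illustrate the calculation):

```python
import math

def risk_ratio(p_exposed, p_unexposed):
    """Point estimate of the risk ratio from two proportions."""
    return p_exposed / p_unexposed

def rr_ci(a, n1, b, n2, z=1.96):
    """Risk ratio with a log-normal 95% CI from counts:
    a/n1 events among the exposed, b/n2 among the unexposed."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# Antipsychotic dispensing rose from 8.2% before entry to 18.6% after:
print(round(risk_ratio(0.186, 0.082), 2))  # close to the reported RR of 2.26
```

The small difference from the published 2.26 would come from rounding of the percentages; the published CI requires the study's actual denominators.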
Abstract:
OBJECTIVE Inflammation and endothelial dysfunction have been associated with the immunobiology of preeclampsia (PE), a significant cause of adverse pregnancy outcomes. The prevalence of PE is elevated several fold in the presence of maternal type 1 diabetes mellitus (T1DM). Although cross-sectional studies of pregnancies among women without diabetes have shown altered inflammatory markers in the presence of PE, longitudinal studies of diabetic women are lacking. In maternal serum samples, we examined the temporal associations of markers of inflammation with the subsequent development of PE in women with T1DM. RESEARCH DESIGN AND METHODS We conducted longitudinal analyses of serum C-reactive protein (CRP), adhesion molecules, and cytokines during the first (mean ± SD, 12.2 ± 1.9 weeks), second (21.6 ± 1.5 weeks), and third (31.5 ± 1.7 weeks) trimesters of pregnancy (visits 1-3, respectively). All study visits took place before the onset of PE. Covariates were BMI, HbA1c, age of onset, duration of diabetes, and mean arterial pressure. RESULTS In women with T1DM who developed PE versus those who remained normotensive, CRP tended to be higher at visits 1 (P = 0.07) and 2 (P = 0.06) and was significantly higher at visit 3 (P < 0.05); soluble E-selectin and interferon-γ-inducible protein-10 (IP-10) were significantly higher at visit 3; interleukin-1 receptor antagonist (IL-1ra) and eotaxin were higher and lower, respectively, at visit 2 (all P < 0.05). These associations persisted following adjustment for covariates. CONCLUSIONS In pregnant women with T1DM, elevated CRP, soluble E-selectin, IL-1ra, and IP-10 and lower eotaxin were associated with subsequent PE. The role of inflammatory factors as markers and potential mechanisms of the high prevalence of PE in T1DM merits further investigation.
Abstract:
Context: In nondiabetic pregnancy, cross-sectional studies have shown associations between maternal dyslipidemia and preeclampsia (PE). In type 1 diabetes mellitus (T1DM), the prevalence of PE is increased 4-fold, but prospective associations with plasma lipoproteins are unknown.
Objectives: The aim of this study was to define lipoprotein-related markers and potential mechanisms for PE in T1DM.
Design and Settings: We conducted a multicenter prospective study in T1DM pregnancy.
Patients: We studied 118 T1DM women (26 developed PE, 92 remained normotensive). Subjects were studied at three visits before PE onset [12.2 ± 1.9, 21.6 ± 1.5, and 31.5 ± 1.7 wk gestation (means ± SD)] and at term (37.6 ± 2.0 wk). Nondiabetic normotensive pregnant women (n = 21) were included for reference.
Main Outcome Measures: Conventional lipid profiles, lipoprotein subclasses [defined by size (nuclear magnetic resonance) and by apolipoprotein content], serum apolipoproteins (ApoAI, ApoB, and ApoCIII), and lipolysis (ApoCIII ratio) were measured in T1DM women with and without subsequent PE.
Results: In women with vs. without subsequent PE, at the first and/or second study visits: low-density lipoprotein (LDL)-cholesterol, particle concentrations of total LDL and large (but not small) LDL, serum ApoB, and the ApoB:ApoAI ratio were all increased (P < 0.05); peripheral lipoprotein lipolysis was decreased (P < 0.01). These early differences remained significant in covariate analysis (glycated hemoglobin, actual prandial status, gravidity, body mass index, and diabetes duration) but were not present at the third study visit. High-density lipoprotein and very low-density lipoprotein subclasses did not differ between groups before PE onset.
Conclusions: Early in pregnancy, increased cholesterol-rich lipoproteins and an index suggesting decreased peripheral lipolysis were associated with subsequent PE in T1DM women. Background maternal lipoprotein characteristics, perhaps masked by effects of late pregnancy, may influence PE risk.