872 results for in-class test
Abstract:
BACKGROUND: Effective lectures often incorporate activities that encourage learner participation. A challenge for educators is how to facilitate this in the large group lecture setting. This study investigates the individual student characteristics involved in encouraging (or dissuading) learners to interact, ask questions, and make comments in class. METHODS: Students enrolled in a Doctor of Veterinary Medicine program at Ross University School of Veterinary Medicine, St Kitts, were invited to complete a questionnaire canvassing their participation in the large group classroom. Data from the questionnaire were analyzed using Excel (Microsoft, Redmond, WA, USA) and the R software environment (http://www.r-project.org/). RESULTS: One hundred and ninety-two students completed the questionnaire (response rate, 85.7%). The results showed statistically significant differences between male and female students when asked to self-report their level of participation (P=0.011) and their confidence to participate (P<0.001) in class. No statistically significant difference was identified between different age groups of students (P=0.594). Student responses reflected that an "aversion to public speaking" acted as the main deterrent to participating during a lecture. Female participants were 3.56 times more likely to report a fear of public speaking than male participants (odds ratio 3.56, 95% confidence interval 1.28-12.33, P=0.01). Students also reported "smaller sizes of class and small group activities" and "other students participating" as factors that made it easier for them to participate during a lecture. CONCLUSION: In this study, sex likely played a role in learner participation in the large group veterinary classroom. Male students were more likely to participate in class and reported feeling more confident to participate than female students. Female students in this study commonly identified aversion to public speaking as a factor which held them back from participating in the large group lecture setting. These are important findings for veterinary and medical educators aiming to improve learner participation in the classroom. Potential ways of addressing this challenge include addition of small group activities and audience response systems during lectures, and inclusion of training interventions in public speaking at an early stage of veterinary and medical curricula.
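As a rough illustration of the odds-ratio result reported above, the short sketch below computes an odds ratio and a Wald-type 95% confidence interval from a 2×2 table. The counts are invented for illustration (chosen to give a similar OR) and are not the study's data; the study itself used Excel and R, whereas this sketch uses plain Python.

```python
# Hypothetical 2x2 table: fear of public speaking by sex.
# These counts are illustrative only; they are NOT the study's data.
import math

fear_female, no_fear_female = 40, 60   # female respondents
fear_male, no_fear_male = 12, 64       # male respondents

# Odds ratio: (a*d) / (b*c) for the 2x2 table
odds_ratio = (fear_female * no_fear_male) / (no_fear_female * fear_male)

# Wald-type 95% CI on the log-odds-ratio scale
se_log_or = math.sqrt(1 / fear_female + 1 / no_fear_female +
                      1 / fear_male + 1 / no_fear_male)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```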
Abstract:
Play has been proposed as a promising indicator of positive animal welfare. We aimed to study play in rats across contexts (conspecific/heterospecific) and types (social: pinning, being pinned; solitary: scampering), and we investigated its structure using behavioral sequence analysis. Group-housed (three per cage) adolescent male Lister Hooded rats (n = 21) were subjected to a Play-In-Pairs test: after a 3-hour isolation period, a pair of cage-mates was returned to the home cage and both social and solitary play were scored for 20 min. This procedure was repeated for each pair combination across three consecutive days, and individual play scores were calculated. Heterospecific play was measured using a Tickling test: rats were individually tickled by the experimenter through bouts of gentle, rapid finger movements on their underside, and the number of positive 50-kHz frequency-modulated vocalizations and experimenter-directed approach behaviors were recorded. Both of the above tests were compared with social play in the home cage. While conspecific play in the Play-In-Pairs test and in the home cage were correlated, both seemed to be unrelated to heterospecific play in the Tickling test. During the Play-In-Pairs test, although both solitary and social play types occurred, they were unrelated, and solitary locomotor play of one rat did not predict the subsequent play behavior of its cage-mate. Analysis of play structure revealed that social play occurred more often in bouts of repeated behaviors, while solitary play sequences did not follow a specific pattern. If play is to be used as an indicator of positive welfare in rats, differences in context, type and structure should be taken into account.
Abstract:
OBJECTIVES To evaluate possible differences in periodontal inflammatory, microbiological and clinical parameters between women with preterm premature rupture of membranes (PPROM) and controls with uncomplicated pregnancies. MATERIALS AND METHODS Fifty-six women (32 test (PPROM) and 24 controls (uncomplicated pregnancies)) were examined at three time-points (T1: gestational weeks 20-35, T2: within 48 h after parturition, T3: 4-6 weeks after parturition). The examinations included assessment of the Periodontal Screening Index, collection of gingival crevicular fluid (GCF) and subgingival as well as vaginal bacterial sampling. RESULTS Periodontal inflammation was found to be higher in the test compared with the control group (p < 0.05) and decreased over time in both groups (p < 0.05). Microbiological outcomes showed no intergroup differences (p > 0.05) in prevalence of bacteria, but a decrease in subgingival periodontopathogens from T1 to T2 in the test group (p < 0.05) was observed. Interleukin (IL)-1β levels in GCF at T2 were not different between groups (p > 0.05). In women with PPROM, GCF levels of IL-8 (p < 0.05) and C-reactive protein (p < 0.05) were lower and IL-10 levels higher (p < 0.05) compared with controls. CONCLUSIONS Periodontal inflammation is elevated during pregnancy and seems to be more pronounced in women with PPROM. CLINICAL RELEVANCE The findings of the present study revealed an association between periodontal inflammation and PPROM, thus emphasizing the importance of optimizing self-performed oral hygiene in pregnant women.
Abstract:
Bentonite and iron metals are common materials proposed for use in deep-seated geological repositories for radioactive waste. The inevitable corrosion of iron leads to interaction processes with the clay which may affect the sealing properties of the bentonite backfill. The objective of the present study was to improve our understanding of this process by studying the interface between iron and compacted bentonite in a geological repository-type setting. Samples of MX-80 bentonite which had been exposed to an iron source and elevated temperatures (up to 115°C) for 2.5 y in an in situ experiment (termed ABM1) at the Äspö Hard Rock Laboratory, Sweden, were investigated by microscopic means, including scanning electron microscopy, μ-Raman spectroscopy, spatially resolved X-ray diffraction, and X-ray fluorescence. The corrosion process led to the formation of a ~100 μm thick corrosion layer containing siderite, magnetite, some goethite, and lepidocrocite mixed with the montmorillonitic clay. Most of the corroded Fe occurred within a 10 mm-thick clay layer adjacent to the corrosion layer. An average corrosion depth of the steel of 22–35 μm and an average Fe2+ diffusivity of 1–26 × 10⁻¹³ m²/s were estimated based on the properties of the Fe-enriched clay layer. In that layer, the corrosion-derived Fe occurred predominantly in the clay matrix. The nature of this Fe could not be identified. No indications of clay transformation or newly formed clay phases were found. A slight enrichment of Mg close to the Fe–clay contact was observed. The formation of anhydrite and gypsum, and the dissolution of some SiO2, were also observed.
Abstract:
The present study explores teacher emotions, in particular how they are predicted by students’ behaviour and the interpersonal aspect of the teacher-student relationship (TSR). One hundred thirty-two secondary teachers participated in a quantitative study relying on self-report questionnaire data. Based on the model of teacher emotions by Frenzel (2014), teachers rated their experienced joy, anger and anxiety during classroom instruction (dependent variable). Students’ motivational behaviour (= engagement), socio-emotional behaviour (= discipline in class) and relational behaviour (= closeness; interpersonal TSR) were assessed as the independent variables. Teachers’ self-efficacy beliefs served as a control variable. Hierarchical regression analysis revealed that the interpersonal relationship formed between teachers and students was the strongest predictor for teachers’ joy (positive relation) and anxiety (negative relation), whereas lack of discipline in class best predicted teachers’ anger experiences. Students’ engagement also proved a significant predictor of teacher emotions. The results suggest that interpersonal TSR plays a particularly important role in teachers’ emotional experiences in class.
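A minimal sketch of the hierarchical (blockwise) regression described above is given below, with teacher self-efficacy entered first as a control and the student-behaviour predictors added in a second block. The input file and column names are hypothetical placeholders, not the study's variables or data; statsmodels is used purely for illustration.

```python
# Hedged sketch of a hierarchical (blockwise) OLS regression.
# File name and column names are hypothetical, not the study's dataset.
import pandas as pd
import statsmodels.api as sm

def fit_block(df, outcome, predictors):
    """Fit an OLS model for one block of predictors."""
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[outcome], X).fit()

df = pd.read_csv("teacher_emotions.csv")  # hypothetical data file

# Block 1: control variable only; Block 2: add student-behaviour predictors
step1 = fit_block(df, "joy", ["self_efficacy"])
step2 = fit_block(df, "joy", ["self_efficacy", "engagement",
                              "discipline", "interpersonal_tsr"])

print(f"R2 block 1: {step1.rsquared:.3f}")
print(f"R2 block 2: {step2.rsquared:.3f} (delta R2 = {step2.rsquared - step1.rsquared:.3f})")
print(step2.params)
```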
Abstract:
BACKGROUND Treatment of furcation defects is a core component of periodontal therapy. The goal of this consensus report is to critically appraise the evidence and to subsequently present interpretive conclusions regarding the effectiveness of regenerative therapy for the treatment of furcation defects and recommendations for future research in this area. METHODS A systematic review was conducted before the consensus meeting. The review aimed to evaluate and present the available evidence regarding the effectiveness of different regenerative approaches for the treatment of furcation defects in specific clinical scenarios compared with conventional surgical therapy. During the meeting, the outcomes of the systematic review, as well as other pertinent sources of evidence, were discussed by a committee of nine members. The consensus group members submitted additional material for consideration by the group in advance and at the time of the meeting. The group agreed on a comprehensive summary of the evidence and also formulated recommendations for the treatment of furcation defects via regenerative therapies and for the conduct of future studies. RESULTS Histologic proof of periodontal regeneration after the application of a combined regenerative therapy for the treatment of maxillary facial, mesial, distal, and mandibular facial or lingual Class II furcation defects has been demonstrated in several studies. Evidence of histologic periodontal regeneration in mandibular Class III defects is limited to one case report. Favorable outcomes after regenerative therapy for maxillary Class III furcation defects are limited to clinical case reports. In Class I furcation defects, regenerative therapy may be beneficial in certain clinical scenarios, although generally Class I furcation defects may be treated predictably with non-regenerative therapies. There is a paucity of data regarding quantifiable patient-reported outcomes after surgical treatment of furcation defects. CONCLUSIONS Based on the available evidence, it was concluded that regenerative therapy is a viable option to achieve predictable outcomes for the treatment of furcation defects in certain clinical scenarios. Future research should test the efficacy of novel regenerative approaches that have the potential to enhance the effectiveness of therapy in clinical scenarios associated historically with less predictable outcomes. Additionally, future studies should place emphasis on histologic demonstration of periodontal regeneration in humans and also include validated patient-reported outcomes. CLINICAL RECOMMENDATIONS Based on the prevailing evidence, the following clinical recommendations could be offered. 1) Periodontal regeneration has been established as a viable therapeutic option for the treatment of various furcation defects, among which Class II defects represent a highly predictable scenario. Hence, regenerative periodontal therapy should be considered before resective therapy or extraction; 2) The application of a combined therapeutic approach (i.e., barrier, bone replacement graft with or without biologics) appears to offer an advantage over monotherapeutic algorithms; 3) To achieve predictable regenerative outcomes in the treatment of furcation defects, adverse systemic and local factors should be evaluated and controlled when possible; 4) Stringent postoperative care and subsequent supportive periodontal therapy are essential to achieve sustainable long-term regenerative outcomes.
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training program dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons, 2012); a minimal coding of these blood-loss ranges is sketched at the end of this entry.
Parameter | Class I | Class II | Class III | Class IV
Blood loss (ml) | up to 750 | 750–1500 | 1500–2000 | >2000
Blood loss (% blood volume) | up to 15% | 15–30% | 30–40% | >40%
Pulse rate (bpm) | <100 | 100–120 | 120–140 | >140
Systolic blood pressure | normal | normal | decreased | decreased
Pulse pressure | normal or increased | decreased | decreased | decreased
Respiratory rate | 14–20 | 20–30 | 30–40 | >35
Urine output (ml/h) | >30 | 20–30 | 5–15 | negligible
CNS/mental status | slightly anxious | mildly anxious | anxious, confused | confused, lethargic
Initial fluid replacement | crystalloid | crystalloid | crystalloid and blood | crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries of nearly 165,000 adult trauma patients and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily run in parallel, as suggested by the ATLS shock classification [4,5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4,5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4,5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but on the rather soft criteria of "clinical estimation of blood loss" and requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, the classification was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].
Category | Normal (no haemorrhage) | Class I (mild) | Class II (moderate) | Class III (severe) | Class IV (moribund)
Vitals | normal | normal | HR >100 with SBP >90 mmHg | SBP <90 mmHg | SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml) | not applicable | yes, no further fluid required | yes, no further fluid required | requires repeated fluid boluses | declining SBP despite fluid boluses
Estimated blood loss (ml) | none | up to 750 | 750–1500 | 1500–2000 | >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first, and hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern but will also mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers; it explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage during the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially during the course of treatment, but they must be interpreted in the light of the clinical context. Conflict of interest: none declared. Member of the Swiss national ATLS core faculty.
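As a minimal illustration of the Table 1 ranges discussed above, the sketch below encodes only the estimated-blood-loss bands as a simple lookup. It is not a clinical tool, and it deliberately omits the vital-sign criteria whose validity the editorial questions.

```python
# Minimal sketch: map an estimated blood loss (ml) to the shock class
# suggested by ATLS Table 1. Illustrative only; not a clinical decision aid.
def atls_class_from_blood_loss(blood_loss_ml: float) -> str:
    if blood_loss_ml <= 750:
        return "Class I"
    if blood_loss_ml <= 1500:
        return "Class II"
    if blood_loss_ml <= 2000:
        return "Class III"
    return "Class IV"

for loss_ml in (500, 1200, 1800, 2500):
    print(f"{loss_ml} ml -> {atls_class_from_blood_loss(loss_ml)}")
```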
Abstract:
OBJECTIVE To evaluate the long-term effects of asymmetrical maxillary first molar (M1) extraction in Class II subdivision treatment. MATERIALS AND METHODS Records of 20 Class II subdivision whites (7 boys, 13 girls; mean age, 13.0 years; SD, 1.7 years) consecutively treated with the Begg technique and M1 extraction, and 15 untreated asymmetrical Class II adolescents (4 boys, 11 girls; mean age, 12.2 years; SD, 1.3 years) were examined in this study. Cephalometric analysis and PAR assessment were carried out before treatment (T1), after treatment (T2), and on average 2.5 years posttreatment (T3) for the treatment group, and at similar time points with an average follow-up of 1.8 years for the controls. RESULTS The adjusted analysis indicated that the maxillary incisors were 2.3 mm more retracted in relation to A-Pog between T1 and T3 (β = 2.31; 95% CI, 0.76–3.87), whereas the mandibular incisors were 1.3 mm more protracted (β = 1.34; 95% CI, 0.09–2.59) and 5.9° more proclined to the mandibular plane (β = 5.92; 95% CI, 1.43–10.41) compared with controls. The lower lip appeared 1.4 mm more protrusive relative to the subnasale–soft tissue Pog line throughout the observation period in the treated adolescents (β = 1.43; 95% CI, 0.18–2.67). There was a significant PAR score reduction over the entire follow-up period in the molar extraction group (β = -6.73; 95% CI, -10.7 to -2.7). At T2, 65% of the subjects had maxillary midlines perfectly aligned with the face. CONCLUSIONS Unilateral M1 extraction in asymmetrical Class II cases may lead to favorable occlusal outcomes in the long term without harming midline esthetics or the soft tissue profile.
Abstract:
Among all classes of nanomaterials, silver nanoparticles (AgNPs) have potentially an important ecotoxicological impact, especially in freshwater environments. Fish are particularly susceptible to the toxic effects of silver ions and, with knowledge gaps regarding the contribution of dissolution and unique particle effects to AgNP toxicity, they represent a group of vulnerable organisms. Using cell lines (RTL-W1, RTH-149, RTG-2) and primary hepatocytes of rainbow trout (Oncorhynchus mykiss) as in vitro test systems, we assessed the cytotoxicity of the representative AgNP, NM-300K, and AgNO3 as an Ag+ ion source. Lack of AgNP interference with the cytotoxicity assays (AlamarBlue, CFDA-AM, NRU assay) and their simultaneous application point to the compatibility and usefulness of such a battery of assays. The RTH-149 and RTL-W1 liver cell lines exhibited similar sensitivity as primary hepatocytes towards AgNP toxicity. Leibovitz's L-15 culture medium composition (high amino acid content) had an important influence on the behaviour and toxicity of AgNPs towards the RTL-W1 cell line. The obtained results demonstrate that, with careful consideration, such an in vitro approach can provide valuable toxicological data to be used in an integrated testing strategy for NM-300K risk assessment.
Abstract:
BACKGROUND The aim of this study was to compare the 5-year survival and success rates of 3.3 mm dental implants made either from titanium-zirconium (TiZr) alloy or from Grade IV titanium (Ti Grade IV) in mandibular implant-based removable overdentures. METHODS The core study had a follow-up period of 36 months and was designed as a randomized, controlled, double-blind, split-mouth multicenter clinical trial. Patients with edentulous mandibles received two Straumann Bone Level implants (diameter 3.3 mm, SLActive®), one of TiZr (test) and one of Ti Grade IV (control), in the interforaminal region. This follow-up study recruited patients from the core study and evaluated the plaque and sulcus bleeding indices, radiographic crestal bone level, and implant survival and success 60 months after implant placement. RESULTS Of the 91 patients who initially received implants, 75 completed the 36-month follow-up and 49 were available for the 60-month examination. Two patients were excluded, so a total of 47 patients with an average age of 72 ± 8 years were analysed. The characteristics and 36-month performance of the present study cohort did not differ from those of the non-included initial participants (p > 0.05). In the period since the 36-month follow-up examination, no implant was lost. The cumulative implant survival rate was 98.9% for the TiZr group and 97.8% for the Ti Grade IV group. Crestal bone level changes at 60 months did not differ between the test and control groups (TiZr -0.60 ± 0.69 mm and Ti Grade IV -0.61 ± 0.83 mm; p = 0.96). The cumulative implant success rate after 60 months was 95.8% and 92.6% for TiZr and Ti Grade IV, respectively. CONCLUSIONS After 60 months, the positive outcomes of the 36-month results for TiZr and Ti Grade IV implants were confirmed, with no significant differences with regard to crestal bone level change, clinical parameters and survival or success rates. TiZr implants performed equally well compared with conventional Ti Grade IV 3.3 mm diameter-reduced implants for mandibular removable overdentures. TRIAL REGISTRATION Registered on www.clinicaltrials.gov: NCT01878331.
Abstract:
Gender and racial/ethnic disparities in colorectal cancer (CRC) screening have been observed and associated with income status, education level, treatment and late diagnosis. According to the American Cancer Society, among both males and females, CRC is the third most frequently diagnosed type of cancer and accounts for 10% of cancer deaths in the United States. Documented differences in CRC test use have largely been limited to access to health care, demographics and health behaviors, and few studies have examined the correlates of CRC screening test use by gender. The present study examined the prevalence of CRC screening test use and assessed whether disparities are explained by gender and racial/ethnic differences. To assess these associations, the study utilized a cross-sectional design and examined the distribution of the covariates for gender and racial/ethnic group differences using the chi-square statistic. Logistic regression was used to estimate the prevalence odds ratio and to adjust for the confounding effects of the covariates. Results indicated that there are disparities in CRC screening test use, with a statistically significant difference in the prevalence of both FOBT and endoscopy screening between genders (χ², p ≤ 0.003). Females had a lower prevalence of endoscopy colorectal cancer screening than males when adjusting for age and education (OR 0.88, 95% CI 0.82–0.95). However, no statistically significant difference was reported between racial/ethnic groups (χ², p ≤ 0.179) after adjusting for age, education and gender. For both FOBT and endoscopy screening, Non-Hispanic Blacks and Hispanics had a lower prevalence of screening compared with Non-Hispanic Whites. In the multivariable regression model, the gender disparities could largely be explained by age, income status, education level, and marital status. Overall, individuals aged 70–79 years who were married, had some college education and an income greater than $20,000 had a higher prevalence of colorectal cancer screening test use within gender and racial/ethnic groups.
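The adjusted prevalence odds ratio described above can be sketched with a logistic regression along the following lines. The file name, variable names and coding are hypothetical placeholders rather than the study's dataset; statsmodels is used only to illustrate the modelling step.

```python
# Hedged sketch: adjusted prevalence odds ratio for endoscopy screening by
# gender. File and column names are hypothetical, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("crc_screening.csv")  # hypothetical survey extract

# Outcome coded 1 = received endoscopy screening, 0 = did not;
# gender, age group and education entered as categorical covariates.
model = smf.logit("endoscopy ~ C(gender) + C(age_group) + C(education)",
                  data=df).fit()

odds_ratios = np.exp(model.params)         # exponentiated coefficients
conf_intervals = np.exp(model.conf_int())  # 95% CIs on the OR scale
print(odds_ratios)
print(conf_intervals)
```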
Abstract:
Children and adults frequently skip breakfast, and rates of skipping are currently increasing. In addition, the food choices made for breakfast are not always healthy ones. Breakfast skipping, in conjunction with unhealthy breakfast choices, leads to impaired cognitive functioning, poor nutrient intake, and overweight. In response to these public health issues, Skip To Breakfast, a behaviorally based school and family program, was created using Intervention Mapping™ to increase consistent and healthful breakfast consumption among ethnically diverse fifth grade students and their families. Four classroom lessons and four parent newsletters were used to deliver the intervention. For this project, a healthy "3 Star Breakfast" was promoted, which included a serving each of a dairy product, a whole grain, and a fruit, with an emphasis on choices low in fat and sugar. The goal of this project was to evaluate the feasibility and acceptability of the intervention. A pilot test of the intervention was conducted in one classroom in a school in Houston during the Fall 2007 semester. A qualitative evaluation of the intervention was conducted, which included focus groups with students, phone interviews with parents, process evaluation data from the classroom teacher, and direct observation. Sixteen students and six parents participated in the study. Data were recorded and themes were identified. Initial results showed there is a need for such programs. Based on the initial feedback, edits were made to the intervention and program. Results showed high acceptability among the teacher, students, and parents. It became apparent that students were not reliably delivering the parent newsletters to their parents to read, so the protocol was changed: students will receive incentives for having parents read the newsletters and return signed forms, to increase parent participation. Other changes included small modifications to the curriculum, such as clarifying instructions, changing in-class assignments to homework assignments, and including background reading materials for the teacher. The main trial is planned to be carried out in Spring 2008 in two elementary schools, using four fifth grade classes from each, with one school acting as the control and one as the intervention school. Results from this study can be used as an adjunct to the Coordinated Approach To Child Health (CATCH) program.
Abstract:
Childhood obesity in the US has reached epidemic proportions. Minority children are affected the most by this epidemic. Although there is no clear relationship between obesity and fruit and vegetable consumption, studies suggest that eating fruits and vegetables could be helpful in preventing childhood obesity. A few school-based interventions targeting youth have been effective at increasing fruit and vegetable intake. In Austin, Texas, the Sustainable Food Center delivered the Sprouting Healthy Kids (SHK) program, which targeted low socio-economic status children in four intervention middle schools. The SHK program delivered six intervention components. This school-based intervention included a cafeteria component, in-class lessons, an after-school garden program, a field trip to a local farm, food tasting, and farmers' visits to schools. This study aimed to determine the effects of the SHK intervention on middle school students' preferences, motivation, knowledge, and self-efficacy towards fruit and vegetable intake, as well as their actual fruit and vegetable intake. The study also aimed to determine the effects of exposure to different doses of the SHK intervention on participants' fruit and vegetable intake. The SHK was delivered during Spring 2009. A total of 214 students completed the pre- and post-test surveys measuring self-reported fruit and vegetable intake as well as intrapersonal factors. The results showed that the school cafeteria, the food tasting, the after-school program, and the farmers' visits had a positive effect on the participants' motivation, knowledge, and self-efficacy towards fruit and vegetable intake. The farmers' visits and the food tasting components increased participants' fruit and vegetable intake. Exposure to two or more intervention components increased participants' fruit and vegetable intake. The statistically significant dose-response effect size was 0.352, which suggests that each additional intervention component increased participants' fruit and vegetable consumption by this amount. Certain intervention components were more effective than others. Food tasting and farmers' visits increased participants' fruit and vegetable intake; therefore, these components should be offered on an ongoing basis. This study suggests that exposure to multiple intervention components improved behaviors and attitudes towards fruit and vegetable consumption. The findings consistently indicate that SHK can influence the behaviors of middle school students.
Abstract:
A community development program operating in the mountains of North India was studied to assess its potential effects on mortality, fertility and migration patterns in the community which it served. The development program operated in Jaunpur Block, Tehri-Garhwal District, Uttar Pradesh State. Two comparable villages in the district were studied: the development program had been working in one for two years, and the other was completely untouched by the program. Since not enough time had elapsed since the beginning of the development program's work for any effects on demographic patterns to be visible in Jaunpur Block, this study looked to the attitudes of village residents as indicators of future demographic trends. Existing demographic patterns and their interrelationship with socio-religious customs were examined in the test village. A questionnaire was then administered to ascertain attitudinal differences between the residents of the test village and the control village. The primary work of the community development program was to train women as village health workers. The results of the attitudinal comparison of the residents of the two villages showed a marked difference in attitudes relating to the position of women in society. The data showed a higher esteem for women in the test village than in the control village, and it is argued that this difference may be attributable to the work of the development program. Predicting future demographic trends in Jaunpur Block on the basis of the observed difference in villagers' attitudes toward the status of women is speculative. Jaunpur Block appears to be in the demographic stage of pre-transition, maintaining relatively high rates of both mortality and fertility. Based on demographic transition theory, the next significant change in demographic patterns in Jaunpur is predicted to be a decline in mortality, and an increase in the status of women is unrelated to this prediction. The community development program which was studied terminated unexpectedly during the time of this study. A case study of the program's final months is presented, and speculation on the future course of demographic trends in Jaunpur Block is related to the possible alternatives for future development in the area.
Abstract:
Background: Poor communication among health care providers is cited as the most common cause of sentinel events involving patients. Sign-out of patient data at the change of clinician shifts is a component of communication that is especially vulnerable to errors. Sign-outs are particularly extensive and complex in intensive care units (ICUs). There is a paucity of validated tools to assess ICU sign-outs. Objective: To design a valid and reliable survey tool to assess the perceptions of Pediatric ICU (PICU) clinicians about sign-out. Design: Cross-sectional, web-based survey. Setting: Academic hospital, 31-bed PICU. Subjects: Attending faculty, fellows, nurse practitioners and physician assistants. Interventions: A survey was designed with input from a focus group and administered to PICU clinicians. Test-retest reliability, internal consistency and validity of the survey tool were assessed. Measurements and Main Results: Forty-eight PICU clinicians agreed to participate. We received 42 (88%) and 40 (83%) responses in the test and retest phases. The mean scores for the ten survey items ranged from 2.79 to 3.67 on a five-point Likert scale, with no significant test-retest difference and a Pearson correlation of 0.65 between pre and post answers. The survey item scores showed internal consistency, with a Cronbach's alpha of 0.85. Exploratory factor analysis revealed three constructs: efficacy of the sign-out process, recipient satisfaction and content applicability. Seventy-eight percent of clinicians affirmed the need for improvement of the sign-out process, and 83% confirmed the need for face-to-face verbal sign-out. A system-based sign-out format was favored by fellows and advanced-level practitioners, while attendings preferred a problem-based format (p = 0.003). Conclusions: We developed a valid and reliable survey to assess clinician perceptions about the ICU sign-out process. These results can be used to design a verbal template to improve and standardize the sign-out process.
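The two reliability figures reported above (test-retest correlation and Cronbach's alpha) can be reproduced on any comparable item-level dataset with a short script like the one below. The input files and item layout are hypothetical placeholders, not the study's responses.

```python
# Hedged sketch: internal consistency (Cronbach's alpha) and test-retest
# reliability (Pearson r). Input files are hypothetical placeholders.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one column per survey item, one row per respondent."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

test = pd.read_csv("signout_survey_test.csv")      # 10 item columns, Likert 1-5
retest = pd.read_csv("signout_survey_retest.csv")  # same respondents, retest phase

print(f"Cronbach's alpha (test phase): {cronbach_alpha(test):.2f}")

# Test-retest: correlate each respondent's mean score across the two phases
r = np.corrcoef(test.mean(axis=1), retest.mean(axis=1))[0, 1]
print(f"Test-retest Pearson r: {r:.2f}")
```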