25 results for assessment during practicum
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Background and Purpose: Becoming proficient in laparoscopic surgery depends on the acquisition of specialized skills that can only be obtained through specific training. This training can be achieved in various ways, using inanimate models, animal models, or live patient surgery, each with its own pros and cons. Currently, substantial data support the benefits of animal model training in the initial learning of laparoscopy. Nevertheless, whether these benefits extend to moderately experienced surgeons is uncertain. The purpose of this study was to determine whether training using a porcine model results in a quantifiable gain in laparoscopic skills for moderately experienced laparoscopic surgeons. Materials and Methods: Six urologists with some laparoscopic experience were asked to perform a radical nephrectomy weekly for 10 weeks in a porcine model. The procedures were recorded, and surgical performance was assessed by two experienced laparoscopic surgeons using a previously published surgical performance assessment tool. The obtained data were then submitted to statistical analysis. Results: With training, blood loss was reduced by approximately 45% when comparing the averages of the first and last surgical procedures (P = 0.006). Depth perception improved by close to 35% (P = 0.041), and dexterity by close to 25% (P = 0.011). Total operative time showed a trend toward improvement, although it was not significant (P = 0.158). Autonomy, efficiency, and tissue handling were the only aspects that did not show any noteworthy change (P = 0.202, P = 0.677, and P = 0.456, respectively). Conclusions: These findings suggest that there are quantifiable gains in laparoscopic skills obtained from training in an animal model.
Our results suggest that these benefits also extend to more advanced stages of the learning curve, but it is unclear how far along the learning curve training with animal models provides a clear benefit for the performance of laparoscopic procedures. Future studies are necessary to confirm these findings and better understand the impact of this learning tool on surgical practice.
Abstract:
Background: Exposure to fine particulate matter (PM2.5) is associated with increased hospital admissions and mortality from respiratory and cardiovascular disease in children and the elderly. This study aims to estimate the toxicological risk of PM2.5 from biomass burning in children and adolescents between the ages of 6 and 14 in Tangará da Serra, a municipality of the Subequatorial Brazilian Amazon. Methods: Risk assessment methodology was applied to estimate the risk quotient in two exposure scenarios according to local seasonality. The potential dose of PM2.5 was estimated using Monte Carlo simulation, stratifying the population by age, gender, asthma, and Body Mass Index (BMI). Results: Male asthmatic children under the age of 8 with normal body mass had the highest risk quotient among the subgroups. The general average potential dose of PM2.5 was 1.95 μg/kg/day (95% CI: 1.62-2.27) during the dry scenario and 0.32 μg/kg/day (95% CI: 0.29-0.34) in the rainy scenario. During the dry season, children and adolescents showed a toxicological risk for PM2.5 at 2.07 μg/kg/day (95% CI: 1.85-2.30). Conclusions: Children and adolescents living in the Subequatorial Brazilian Amazon region were exposed to high levels of PM2.5, resulting in toxicological risk from this pollutant mixture. The toxicological risk quotients of children in this region were comparable to or higher than those of children living in metropolitan regions with PM2.5 air pollution above the limits recommended for human health.
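The potential-dose estimation described above follows the general inhalation intake form (concentration × inhalation rate / body weight), propagated through a Monte Carlo simulation. A minimal sketch of that approach; the input distributions below are hypothetical stand-ins, not the study's actual parameters or stratification:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical input distributions (illustrative only)
conc = rng.lognormal(mean=np.log(25.0), sigma=0.4, size=N)  # PM2.5, ug/m3, dry season
inh_rate = rng.normal(10.0, 1.5, size=N).clip(min=5.0)      # inhalation rate, m3/day
body_weight = rng.normal(30.0, 6.0, size=N).clip(min=15.0)  # kg, children/adolescents

# Potential average daily dose, ug/kg/day
dose = conc * inh_rate / body_weight

mean_dose = dose.mean()
ci_low, ci_high = np.percentile(dose, [2.5, 97.5])
print(f"mean dose = {mean_dose:.2f} ug/kg/day (95% interval: {ci_low:.2f} - {ci_high:.2f})")
```

Stratifying by age, gender, asthma, and BMI, as in the study, would amount to running this simulation per subgroup with subgroup-specific distributions.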
Assessment of referrals to an OT consultation-liaison service: a retrospective and comparative study
Abstract:
The objective was to conduct a retrospective and comparative study of the requests for consultation-liaison (RCLs) sent, over a period of six years, to the Occupational Therapy (OT) team that acts as the Consultation-Liaison Service in Mental Health. During the study period, 709 RCLs were made and 633 patients received OT consultations. The comparison group comprised 1,129 consecutive referrals to the psychiatric CL service within the same period, which were also retrospectively reviewed. Regarding RCLs to the OT team, most of the subjects were women with incomplete elementary schooling, with a mean age of 39.2 years, who were self-employed or retired. Internal Medicine was responsible for most of the RCLs. The mean length of hospitalization was 51 days and the mean referral rate was 0.5%; the most frequent reason for the request was related to emotional aspects, and the most frequent psychiatric diagnosis was mood disorder. It is concluded that there is a clear demand for the development of consultation-liaison in OT, particularly with regard to the promotion of mental health in general hospitals.
Abstract:
The extent to which the hypothalamic-pituitary-adrenal axis is activated by short-term and long-term consequences of stress is still open to investigation. This study aimed to determine (i) the correlation between plasma corticosterone and exploratory behavior exhibited by rats subjected to the elevated plus maze (EPM) following different periods of social isolation, (ii) the effects of the corticosterone synthesis blocker, metyrapone, on the behavioral consequences of isolation, and (iii) whether corticosterone produces its effects through an action on the anterior cingulate cortex, area 1 (Cg1). Rats were subjected to 30-min, 2-h, 24-h, or 7-day isolation periods before EPM exposure and plasma corticosterone assessments. Isolation for longer periods of time produced greater anxiogenic-like effects on the EPM. However, stretched attend posture (SAP) and plasma corticosterone concentrations were increased significantly after 30 min of isolation. Among all of the behavioral categories measured in the EPM, only SAP positively correlated with plasma corticosterone. Metyrapone injected prior to the 24 h isolation period reversed the anxiogenic effects of isolation. Moreover, corticosterone injected into the Cg1 produced a selective increase in SAP. These findings indicate that risk assessment behavior induced by the action of corticosterone on Cg1 neurons initiates a cascade of defensive responses during exposure to stressors.
Abstract:
Objectives: The aim of the present study was to investigate the construct validity of the Assessment of Countertransference Scale (ACS) in the context of trauma care, through the identification of the latent constructs underlying the measured items and their homogeneity. Methods: The ACS assesses 23 countertransference (CT) feelings in three factors: closeness, rejection, and indifference. The ACS was applied to 50 psychiatry residents after the first appointment with 131 trauma victims consecutively selected over 4 years. The ACS was analyzed by exploratory (EFA) and confirmatory (CFA) factor analysis, internal consistency, and convergent-discriminant validity. Results: Although closeness items obtained the highest scores, the EFA showed that the factor rejection (24% of variance, alpha = 0.88) presented a more consistent intercorrelation of items, followed by closeness (15% of variance, alpha = 0.82) and a distinct factor, sadness (9% of variance, alpha = 0.72). Thus, a modified version was proposed. In the comparison between the original and the proposed version, CFA detected better goodness-of-fit indexes for the proposed version (GFI = 0.797, TLI = 0.867, CFI = 0.885 for the original vs. GFI = 0.824, TLI = 0.904, CFI = 0.918 for the proposed version). Conclusions: The ACS is a promising instrument for assessing CT feelings, making it valid to assess them during the care of trauma victims.
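The per-factor internal consistency figures reported above (alpha = 0.88, 0.82, 0.72) are Cronbach's alpha values. A minimal sketch of how that coefficient is computed, using made-up item ratings rather than the study's data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of items
    item_variances = scores.var(axis=0, ddof=1)      # per-item variance
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1.0 - item_variances.sum() / total_variance)

# Made-up 5-point ratings: 4 respondents, 3 items of one hypothetical factor
ratings = np.array([
    [1, 2, 1],
    [3, 3, 2],
    [4, 4, 5],
    [5, 4, 5],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```

Higher alpha indicates that the items of a factor vary together; perfectly correlated items give alpha = 1.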
Abstract:
The neurovascular bundle may be vulnerable during surgical procedures involving the mandible, especially when anatomical variations are present. Increased demand for implant surgery, wider availability of three-dimensional exams, and the lack of clear definitions in the literature indicate that the features of anatomical variations should be revisited. The objective of the study was to evaluate features of anatomical variations related to the mandibular canal (MC), such as bifid canals, the anterior loop of the mental nerve, and corticalization of the MC. Additionally, bone trabeculation at the submandibular gland fossa (SGF) region was assessed and related to visibility of the MC. Cone beam computed tomography exams from 100 patients (200 hemimandibles) were analyzed and the following parameters were registered: diameter and corticalization of the MC; trabeculation in the SGF region; presence of bifid MC, position of bifurcations, diameter, and direction of bifid canals; and measurement of anterior loops by two methods. Corticalization of the MC was observed in 59% of hemimandibles. In 23%, the MC could be identified despite the absence of corticalization. The diameter of the MC was between 2.1 and 4 mm for nearly three quarters of the sample. In 80% of the sample, trabeculation at the SGF was either decreased or not visible, and such cases showed correlation with absence of MC corticalization. Bifid MC affected 19% of the patients, mostly associated with additional mental foramina. A clinically significant anterior loop (> 2 mm of anterior extension) was observed in 22-28% of cases, depending on the method. Our findings, together with previously reported limitations of conventional exams, draw attention to the unpredictability related to anatomical variations in neurovascularization, showing the contribution of individual assessment through different views of three-dimensional imaging prior to surgical procedures in the mandible.
Abstract:
An environmental impact study was conducted to determine the water quality of the Piracicamirim creek and assess the influence of effluents from a sugar industry on this water body. Toxicity tests were performed with water samples taken upstream and downstream of the industry, using the microcrustaceans Daphnia magna, Ceriodaphnia dubia, and Ceriodaphnia silvestrii as test organisms, together with physical and chemical analyses of the water. Results showed that physical and chemical parameters did not change during the sampling period, except for dissolved oxygen. No toxicity to D. magna, and no effects on the reproduction of C. dubia and C. silvestrii, were observed at either sampling point. Thus, the industry was not negatively impacting the quality of this water body.
Abstract:
OBJECTIVE: Poor sleep quality is one of the factors that adversely affect patient quality of life after kidney transplantation, and sleep disorders represent a significant cardiovascular risk factor. The objective of this study was to investigate the prevalence of changes in sleep quality and their outcomes in kidney transplant recipients and to analyze the variables affecting sleep quality in the first years after renal transplantation. METHODS: Kidney transplant recipients were evaluated at two time points after a successful transplantation: between three and six months (Phase 1) and between 12 and 15 months (Phase 2). The following tools were used for assessment: the Pittsburgh Sleep Quality Index; the Short-Form-36 quality of life questionnaire; the Hospital Anxiety and Depression scale; the Karnofsky scale; and assessments of social and demographic data. RESULTS: The prevalence of poor sleep was 36.7% in Phase 1 and 38.3% in Phase 2 of the study. There were no significant differences between patients with and without changes in sleep quality between the two phases. We found no changes in sleep patterns throughout the study. Both the physical and mental health scores worsened from Phase 1 to Phase 2. CONCLUSION: Sleep quality in kidney transplant recipients did not change during the first year after a successful renal transplantation.
Abstract:
Background. The prevalence of early childhood caries (ECC) is high in developing countries; thus, sensitive methods for the early diagnosis of ECC are of prime importance to implement appropriate preventive measures. Aim. To investigate the effects of adding early caries lesions (ECL) to WHO-threshold caries detection methods on the prevalence of caries in primary teeth and on the epidemiological profile of the studied population. Design. In total, 351 3- to 4-year-old preschoolers participated in this cross-sectional study. Clinical exams were conducted by one calibrated examiner using the WHO and WHO + ECL criteria. During the exams, a mirror, a ball-ended probe, gauze, and an artificial light were used. The data were analysed by Wilcoxon and McNemar's tests (α = 0.05). Results. Good intra-examiner Kappa values at the tooth/surface level were obtained for the WHO and WHO + ECL criteria (0.93/0.87 and 0.75/0.78, respectively). The dmfs scores were significantly higher (P < 0.05) when the WHO + ECL criteria were used. ECLs were the predominant caries lesions in the majority of teeth. Conclusions. The results strongly suggest that the WHO + ECL diagnostic method could be used to identify ECL in young children under field conditions, increasing the detected prevalence and the classification of caries activity and providing valuable information for the early establishment of preventive measures.
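McNemar's test, used above to compare paired detections under the WHO and WHO + ECL criteria, depends only on the discordant pair counts (surfaces scored positive by one criterion but not the other). A minimal exact version of the test; the counts in the example are illustrative, not the study's data:

```python
from math import comb

def mcnemar_exact_p(b: int, c: int) -> float:
    """Exact two-sided McNemar p-value from discordant pair counts b and c
    (pairs positive under one criterion only)."""
    n = b + c
    k = min(b, c)
    # One binomial tail at p = 0.5 over the discordant pairs, then doubled
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Illustrative: 1 surface positive only under WHO, 9 only under WHO + ECL
print(f"p = {mcnemar_exact_p(1, 9):.4f}")
```

A heavily one-sided split of the discordant pairs, as in the example, yields a small p-value, i.e. the two criteria classify significantly differently.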
Abstract:
Liver transplantation has become a standard treatment for end-stage liver disease, and the number of recipients has grown rapidly in the last few years. Dental care during the pre-transplant workup is important to reduce potential sources of infection during the drug-induced immunosuppression phase of liver transplantation. Objectives: The objectives of this study were to document the prevalence of oral abnormalities in patients on a liver transplant waiting list presenting to an urban dental school clinic, to discuss the appropriate dental treatment according to their systemic conditions, and to compare their oral manifestations with those of healthy individuals. Material and Methods: A pilot study was conducted involving 16 individuals with end-stage liver disease (study group, SG) attending the Special Care Dentistry Center of the University of São Paulo and 16 control individuals (control group, CG) with no liver diseases receiving dental care at the Dental School of the University of São Paulo. These individuals were assessed for their dental status (presence of oral disease or abnormalities), coagulation status, and dental treatment indications. Results: The patients in the SG exhibited a greater incidence of oral manifestations compared with the CG (p = 0.0327) and were diagnosed with at least one oral disease or condition that required treatment. Coagulation abnormalities reflecting an increased risk of bleeding were found in 93.75% of the patients. However, no bleeding complications occurred after dental treatment. Conclusions: The patients with chronic liver diseases evaluated in this study exhibited a higher incidence of oral manifestations compared with the control group and had at least one oral disease or abnormality that required dental treatment prior to liver transplantation. Careful oral examination and evaluation of the patient, including laboratory tests, will ensure correct oral preparation and control of oral disease prior to liver transplantation.
Abstract:
During the dyeing process, approximately 10 to 15% of the dyes used in baths are lost and reach industrial effluents, polluting the environment. Studies have shown that some classes of dyes, mainly azo dyes and their by-products, exert adverse effects on humans and local biota, since wastewater treatment systems and water treatment plants have been found ineffective in removing the color and reducing the toxicity of some dyes. In the present study, the toxicity of the azo dyes Disperse Orange 1 (DO1), Disperse Red 1 (DR1), and Disperse Red 13 (DR13) was evaluated in HepG2 cells grown in monolayers or in three-dimensional (3D) culture. Hepatotoxicity of the dyes was measured using 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) and cell counting kit 8 (CCK-8) assays after 24, 48, and 72 h of incubation of cells with 3 different concentrations of the azo dyes. The dye DO1 reduced mitochondrial activity only in HepG2 cells grown in a monolayer and only after 72 h of incubation, while the dye DR1 showed this deleterious effect in both monolayer and 3D culture. In contrast, the dye DR13 decreased mitochondrial activity after 24, 48, and 72 h of exposure in both monolayer and 3D culture. With respect to dehydrogenase activity, only the dye DR13 diminished the activity of this enzyme, after 72 h of exposure, in both monolayer and 3D culture. Our results clearly demonstrate that exposure to the studied dyes induced cytotoxicity in HepG2 cells.
Abstract:
Introduction: The aim of this study was to assess the epidemiological and operational characteristics of the Leprosy Program before and after its integration into the Primary Healthcare Services of the municipality of Aracaju, Sergipe, Brazil. Methods: Data were drawn from the national database. The study periods were divided into preintegration (1996-2000) and postintegration (2001-2007). Annual epidemiological detection rates were calculated. Frequency data on clinico-epidemiological variables of cases detected and treated in the two periods were compared using the Chi-squared (χ2) test, adopting a 5% level of significance. Results: Detection rates overall, and in subjects younger than 15 years, were greater in the postintegration period and were higher than the rates recorded for Brazil as a whole during the same periods. A total of 780 and 1,469 cases were registered during the preintegration and postintegration periods, respectively. Observations for the postintegration period were as follows: I) a higher proportion of cases with disability grade assessed at diagnosis, with an increase from 60.9% to 78.8% (p < 0.001), and at the end of treatment, from 41.4% to 44.4% (p < 0.023); II) an increase in the proportion of cases detected by contact examination, from 2.1% to 4.1% (p < 0.001); and III) a lower level of treatment default, with a decrease from 5.64 to 3.35 (p < 0.008). Only 34% of cases registered from 2001 to 2007 were examined. Conclusions: The shift observed in detection rates overall, and in subjects younger than 15 years, during the postintegration period indicates an increased level of health care access. The fall in the number of patients abandoning treatment indicates greater adherence to treatment. However, previous shortcomings in key actions, pivotal to attaining the outcomes and impact envisaged for the program, persisted in the postintegration period.
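The Chi-squared comparisons above are over 2x2 tables of period versus outcome. A minimal Pearson chi-squared sketch; the counts below are reconstructed approximately from the reported totals (780 and 1,469 cases) and percentages (60.9% vs. 78.8% with disability grade assessed at diagnosis), so they are illustrative rather than the study's exact table:

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-squared statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Approximate counts: assessed vs. not assessed at diagnosis, by period
pre_assessed, pre_not = 475, 305      # ~60.9% of 780 (preintegration)
post_assessed, post_not = 1158, 311   # ~78.8% of 1,469 (postintegration)
stat = chi2_2x2(pre_assessed, pre_not, post_assessed, post_not)
print(f"chi2 = {stat:.1f}")  # far above 10.83, the 0.1% critical value at 1 df
```

The large statistic is consistent with the reported p < 0.001 for this comparison.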
Abstract:
Background: Chronic hemodialysis patients are at higher risk of acquiring hepatitis C virus (HCV). The prevalence varies among countries and hemodialysis centers. Although guidelines for a comprehensive infection control program exist, nosocomial transmission still accounts for new cases of infection. The aim of this study was to analyze the follow-up of newly acquired acute hepatitis C cases from January 2002 to May 2005 in a hemodialysis center located in the southwest region of Paraná State, Brazil, and to analyze the effectiveness of the measures taken to restrain the appearance of new cases of acute hepatitis C. Methods: Patients were analyzed monthly with anti-HCV tests and ALT measurements. Patients with ALT elevations were monitored for possible acute hepatitis C. Results: During this period, 32 new cases of acute hepatitis C virus infection were identified. Blood screening showed variable ALT levels preceding anti-HCV seroconversion. HCV RNA viremia by PCR analysis was intermittent and even negative in some cases. Ten of the 32 patients received pegylated interferon alfa-2b treatment at a dose of 1 mcg/kg for 24 weeks. All dialysis personnel were re-trained to strictly follow the regulations and recommendations regarding infection control, proper methods to clean and disinfect equipment were reviewed, and HCV-positive patients were isolated. Conclusion: Laboratory test results showed variable ALT preceding anti-HCV seroconversion and intermittent viremia. The applied recommendations contributed importantly to restraining the appearance of new cases of acute hepatitis C in this center; the last case was diagnosed in May 2004.