156 results for consecutive
in Scielo Saúde Pública - SP
Abstract:
The occurrence of different viruses in nasopharyngeal secretions from children less than 5 years old with acute respiratory infections (ARI) was investigated over a period of 4 years (1982-1985) in Rio de Janeiro. Of the viruses known to be associated with ARI, all but influenza C and parainfluenza types 1, 2 and 4 were found. Viruses were found more frequently in children attending emergency or pediatric wards than in outpatients. This was clearly related to the high incidence of respiratory syncytial virus (RSV) in the more severe cases of ARI. RSV-positive specimens appeared mainly during the fall over four consecutive years, showing a clear seasonal occurrence of this virus. Emergency wards provide the best source of data for RSV surveillance, showing a sharp increase in the number of positive cases coinciding with the increased incidence of ARI cases. Adenoviruses were the second most frequently isolated viruses, and among these, serotypes 1, 2 and 7 were predominant. Influenza viruses and parainfluenza virus type 3 were next in frequency. Influenza A viruses were isolated with equal frequency in outpatient departments, emergency and pediatric wards. Influenza B was more frequent among outpatients. Parainfluenza type 3 caused outbreaks in the shantytown population annually during the late winter or spring and was isolated mainly from outpatients. Herpesvirus, enterovirus and rhinovirus were found less frequently. Viruses other than RSV and parainfluenza type 3 did not show a clear seasonal incidence.
Abstract:
OBJECTIVE: To analyze surgical and pathological parameters, outcome, and prognostic factors of patients with non-small cell lung cancer (NSCLC) admitted to a single institution, and to correlate these findings with the current staging system. METHOD: Seven hundred and thirty-seven patients were diagnosed with NSCLC and admitted to Hospital do Cancer A. C. Camargo from 1990 to 2000. All patients were included in a continuous prospective database, and their data were analyzed. Following staging, a multidisciplinary team decided on the adequate management. Variables included in this analysis were age, gender, histology, Karnofsky index, weight loss, clinical stage, surgical stage, chemotherapy, radiotherapy, and survival rates. RESULTS: 75.5% of the patients were male. The distribution of histologic types was squamous cell carcinoma 51.8%, adenocarcinoma 43.1%, and undifferentiated large cell carcinoma 5.1%. Most patients (73%) presented significant weight loss and a Karnofsky index of 80%. Clinical staging was IA 3.8%, IB 9.2%, IIA 1.4%, IIB 8.1%, IIIA 20.9%, IIIB 22.4%, IV 30.9%. Complete tumor resection was performed in 24.6% of all patients. Surgical stage distribution was IA 25.3%, IB 1.4%, IIB 17.1%, IIIA 16.1%, IIIB 20.3%, IV 11.5%. Chemotherapy and radiotherapy were considered therapeutic options in 43% and 72% of cases, respectively. The overall 5-year survival rate of the NSCLC patients in our study was 28%, and median survival was 18.9 months. CONCLUSIONS: Patients with NSCLC admitted to our institution presented histopathologic and clinical characteristics similar to previously published series from cancer hospitals. The best prognosis was associated with complete tumor resection with lymph node dissection, which is only achievable in earlier clinical stages.
Abstract:
PURPOSE: The authors analyzed the 30-day and 6-month outcomes of 1,126 consecutive patients who underwent coronary stent implantation in 1996 and 1997. METHODS: The 30-day results and 6-month angiographic follow-up were analyzed in patients treated with coronary stents in 1996 and 1997. All patients underwent coronary stenting with high-pressure implantation (>12 atm) and an antiplatelet drug regimen (aspirin plus ticlopidine). RESULTS: During the study period, 1,390 coronary stents were implanted in 1,200 vessels of 1,126 patients; 477 patients were treated in 1996 and 649 in 1997. The proportion of percutaneous procedures performed using stents increased significantly in 1997 compared to 1996 (64% vs 48%, p=0.0001). The 30-day results were similar in both years; the success and stent thrombosis rates were equal (97% and 0.8%, respectively). The rates of new Q-wave MI (1.3% vs 1.1%, 1996 vs 1997, p=NS), emergency coronary bypass surgery (1% vs 0.6%, 1996 vs 1997, p=NS) and 30-day death (0.2% vs 0.5%, 1996 vs 1997, p=NS) were also similar. The 6-month restenosis rate was 25% in 1996 and 27% in 1997 (p=NS); the target vessel revascularization rate was 15% in 1996 and 16% in 1997 (p=NS). CONCLUSIONS: Intracoronary stenting showed a high success rate and a low incidence of new major coronary events at 30 days in both periods, despite the greater angiographic complexity of the patients treated in 1997. These adverse variables did not have a negative influence on the 6-month clinical and angiographic follow-up, with similar rates of restenosis and ischemia-driven target lesion revascularization.
Abstract:
OBJECTIVE: To evaluate the early and late outcomes of patients who underwent primary coronary angioplasty for acute myocardial infarction. METHODS: A prospective study of 135 patients with acute myocardial infarction who underwent primary percutaneous transluminal coronary angioplasty (PTCA). Success was defined as TIMI 3 flow and a residual lesion <50%. Statistical analyses were performed by univariate and multivariate methods, and survival was analyzed by the Kaplan-Meier method. RESULTS: The PTCA success rate was 78% and early mortality 18.5%. Killip classes III and IV were associated with higher mortality, odds ratio 22.9 (95% CI: 5.7 to 91.8), which was inversely related to age <75 years (OR = 0.93; 95% CI: 0.88 to 0.98). If success had been defined as TIMI 2 flow and patients in Killip classes III/IV had been excluded, the success rate would have been 86% and mortality 8%. The survival probability at the end of the study (follow-up time 142 ± 114 days) was 80%, and event-free survival was 35%. Greater survival was associated with stenting (OR = 0.09; 0.01 to 0.75) and single-vessel disease (OR = 0.21; 0.07 to 0.61). CONCLUSION: The success rate was lower and mortality higher than in randomized trials, but similar to those of nonrandomized studies. This demonstrates the efficacy of primary PTCA under our local conditions.
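Odds ratios with 95% confidence intervals, as reported in this and several of the following abstracts, can be computed from a 2x2 exposure-outcome table with the standard Wald formula. A minimal sketch in Python; the counts in the usage example are hypothetical, not taken from any of these studies:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    z = 1.96 gives a 95% CI."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: a balanced table gives OR = 1 and a CI spanning 1.
print(odds_ratio_ci(10, 10, 10, 10))
```

With equal cell counts the odds ratio is exactly 1 and the interval brackets 1, i.e. no association.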
Abstract:
Twenty-six human respiratory syncytial virus strains (subgroup A) isolated from three outbreaks in Havana City during the 1994/95, 1995/96 and 1996/97 periods were analyzed to determine their antigenic and genetic relationships. Analyses were performed with monoclonal antibodies and by restriction mapping of the N gene following amplification of the selected region of the virus genome by polymerase chain reaction. All isolated strains were classified as subgroup A by monoclonal antibodies, and they showed restriction pattern NP4, which belongs to subgroup A. Thus, the results obtained in this work showed complete (100%) agreement between the antigenic and genetic characterization of the strains isolated in our laboratory. These methods permit the examination of large numbers of isolates by molecular techniques, simplifying research into the molecular epidemiology of the virus.
Abstract:
In the process of phosphate rock acidulation, several impure P compounds may be formed along with the desirable Ca and NH4 phosphates. Such compounds normally reduce the content of water-soluble P and thus the agronomic effectiveness of commercial fertilizers. To study this problem, a greenhouse experiment consisting of three consecutive corn crops was conducted on samples of a Red-Yellow Latosol (Typic Hapludox) in a completely randomized design (6 x 2 x 2), with four replicates. Six commercial fertilizers were added to 2 kg of soil at a rate of 70 mg kg-1 P, based on the content of P soluble in neutral ammonium citrate plus water (NAC + H2O) of the fertilizers. Fertilizers were applied either in their original form or after leaching to remove the water-soluble fraction, and were mixed either with the whole soil in the pots or with only 1% of its volume. The corn plants were harvested 40 days after emergence to determine shoot dry matter and accumulated P. For the first crop and localized application, the elimination of water-soluble P from the original fertilizers resulted in less bioavailable P for the plants. For the second and third crops, the effects of P source, leaching and application method were not as evident as for the first, suggesting that the tested P sources may have similar efficiencies under successive cropping. It was concluded that the water-insoluble but NAC-soluble fractions of commercial P fertilizers are not necessarily inert, because they can provide P in the long run.
Abstract:
There are several regions of the world where soil N analysis and/or N budgets are not used to determine how much N to apply, resulting in higher-than-needed N inputs, especially when manure is used. One such region is the north-central "La Comarca Lagunera", one of the most important dairy production areas of Mexico. We conducted a controlled greenhouse study using 15N fertilizer and 15N isotopically labeled manure, labeled under local conditions, to monitor N cycling and recovery under high N inputs. The manure-N treatment was applied only once and was incorporated into the soil before planting the first forage crop, at rates equivalent to 30, 60 and 120 Mg ha-1 of dry manure. The 15N fertilizer treatments were equivalent to 120 and 240 kg ha-1 of (NH4)2SO4-N for each crop, for total fertilizer N rates of 360 and 720 kg ha-1 N. We found very low N recoveries: about 9% from the manure N inputs, lower than the 22 to 25% from the fertilizer N inputs. The manure N recovered belowground in soil and roots ranged from 82 to 88%. The low recovery of N aboveground and the low soil inorganic nitrate (NO3-N) and ammonium (NH4-N) contents after the third harvest suggested that most of the 15N recovered belowground was in soil organic form. Losses from manure N inputs ranged from 3 to 11%, lower than the 34 to 39% lost from fertilizer N sources. Our study shows that the excessive applications of manure or fertilizer N traditionally used in this region do not increase the rate of N uptake by the aboveground compartment but do increase the potential for N losses to the environment.
Abstract:
Tannery sludge contains high concentrations of inorganic elements, such as chromium (Cr), which may lead to environmental pollution and affect human health. The behavior of Cr in organic matter fractions and its effect on the growth of cowpea (Vigna unguiculata L.) were studied in a sandy soil after four consecutive annual applications of composted tannery sludge (CTS). Over a four-year period, CTS was applied to permanent plots (2 × 5 m) and incorporated into the soil (0-20 cm) at rates of 0, 2.5, 5.0, 10.0, and 20.0 Mg ha-1 (dry weight basis). The treatments were replicated four times in a randomized block design. In the fourth year, cowpea was planted and grown for 50 days, at which time we analyzed the Cr concentrations in the soil, in the fulvic acid, humic acid, and humin fractions, and in the leaves, pods, and grains of cowpea. Composted tannery sludge led to an increase in Cr concentration in the soil. Among the humic substances, the highest Cr concentration was found in humin. The application rates of CTS significantly increased Cr concentrations in leaves and grains.
Abstract:
Objective: to report the group's experience with a series of patients undergoing pancreatic resection with zero mortality. Methods: we prospectively studied 50 consecutive patients undergoing pancreatic resection for periampullary or pancreatic diseases. Main local complications were defined according to international criteria. In-hospital mortality was defined as death occurring within the first 90 postoperative days. Results: patients' ages ranged between 16 and 90 years (average: 53.3). We found anemia (Hb < 12 g/dl) and preoperative jaundice in 38% and 40% of cases, respectively. Most patients presented with periampullary tumors (66%). The most common surgical procedure was the Kausch-Whipple operation (70%). Six patients (12%) required resection of a segment of the mesenteric-portal axis. The mean operative time was 445.1 minutes. Twenty-two patients (44%) had no clinical complications and had a mean hospital stay of 10.3 days. The most frequent complications were pancreatic fistula (56%), delayed gastric emptying (17.1%) and bleeding (16%). Conclusion: over the last three decades, pancreatic resection has remained a challenge, especially outside large specialized centers. Nevertheless, even in our country (Brazil), teams experienced in this procedure can achieve low mortality rates.
Abstract:
The survival of hemodialysis patients is likely to be influenced not only by well-known risk factors such as age and comorbidity, but also by changes in dialysis technology and practices accumulated over time. We compared the survival curves, dialysis routines and some risk factors of two groups of patients admitted to a Brazilian maintenance hemodialysis program during two consecutive decades: March 1977 to December 1986 (group 1, N = 162) and January 1987 to June 1997 (group 2, N = 237). The median treatment time was 22 months (range 1-198). Survival curves were constructed using the Kaplan-Meier method and compared using the log-rank test. The Cox proportional hazards regression model was used to investigate the variables most strongly associated with outcome. The most important changes in dialysis routine and patient care during the total period of observation were the progressive increase in the dose of dialysis delivered, the prohibition of potassium-free dialysate, the use of bicarbonate as a buffer and the upgrading of the dialysis equipment. There were no significant differences between the survival curves of the two groups. Survival rates at 1, 5 and 10 years were 84, 53 and 29%, respectively, for group 1 and 77, 42 and 21% for group 2. Patients in group 1 were younger (45.5 ± 15.2 vs 55.2 ± 15.9 years, P<0.001) and had a lower prevalence of diabetes (11.1 vs 27.4%, P<0.001) and of cardiovascular disease (9.3 vs 20.7%, P<0.001). According to the Cox multivariate model, only age (hazard ratio (HR) 1.04, confidence interval (CI) 1.03-1.05, P<0.001) and diabetes (HR 2.55, CI 1.82-3.58, P<0.001) were independent predictors of mortality for the whole group. Patients in group 2 had a lower prevalence of sudden death (19.1 vs 9.7%, P<0.001). After adjusting for age, diabetes and other mortality risk factors, the risk of death was 17% lower in group 2, although this difference was not statistically significant.
We conclude that the negative effects of advanced age and the higher frequency of comorbidity on the survival of group 2 patients were probably offset by improvements in patient care and in the quality and dose of dialysis delivered, so that the survival curves did not change significantly over time.
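Several of these abstracts estimate survival curves with the Kaplan-Meier product-limit method. As a rough illustration of the estimator itself (not of any study's actual analysis), a minimal pure-Python sketch with made-up follow-up times:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.
    times:  follow-up time for each subject.
    events: 1 if the event (e.g. death) occurred, 0 if censored.
    Returns [(time, survival probability)] at each event time."""
    data = sorted(zip(times, events))
    n_risk = len(data)   # subjects still at risk
    s = 1.0              # running survival probability
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # gather all subjects whose follow-up ends at time t
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            # multiply by the conditional survival at this event time
            s *= 1.0 - deaths / n_risk
            curve.append((t, s))
        n_risk -= removed
    return curve

# Hypothetical cohort: 4 subjects, one censored at time 2.
print(kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1]))
```

Censored subjects leave the risk set without dropping the curve, which is what distinguishes this estimator from a naive fraction surviving.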
Abstract:
Rotaviruses are the main cause of infantile acute diarrhea, and a monovalent (G1P[8]) vaccine against the virus was introduced into the Brazilian National Immunization Program for all infants in March 2006. The objectives of this study were to determine the rate and genotype distribution of rotavirus causing infantile diarrhea in the Triângulo Mineiro region of Brazil during 2011-2012 and to assess the impact of local vaccination. Fecal specimens were analyzed for detection and characterization of rotavirus using polyacrylamide gel electrophoresis, reverse transcription followed by polymerase chain reaction (PCR), and PCR genotyping assays. Overall, rotavirus was diagnosed in 1.7% (6/348) of cases. Rotavirus positivity rates decreased 88% [95% confidence interval (CI): 15.2 to 98.3%; P=0.026] in 2011 and 78% (95% CI: 30.6 to 93.0%; P=0.007) in 2012 compared with available data for the baseline years (2005/2006) in Uberaba. In Uberlândia, reductions of 95.3% (95% CI: 66.0 to 99.4%; P=0.002) in 2011 and 94.2% (95% CI: 56.4 to 99.2%; P=0.004) in 2012 were also observed compared with data for 2008. The circulation of rotavirus G2P[4] strains decreased during the period under study, and strains related to the P[8] genotype reemerged in the region. This study showed a marked and sustained reduction of rotavirus-related cases, with rotavirus nearly absent in the 2011 and 2012 seasons, suggesting a positive impact of the vaccination program.
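The percentage reductions quoted above are one minus the rate ratio, multiplied by 100. A one-line sketch of that arithmetic; the rates in the usage example are hypothetical, not the study's figures:

```python
def percent_reduction(rate_baseline, rate_current):
    """Percent reduction in a positivity rate: 100 * (1 - rate ratio)."""
    return 100.0 * (1.0 - rate_current / rate_baseline)

# Hypothetical: baseline positivity 10%, current positivity 1.2%.
print(percent_reduction(0.10, 0.012))
```

Confidence intervals for such reductions are usually derived from the CI of the rate ratio itself, which this sketch does not attempt.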
Abstract:
The production of medicinal plants as raw material for industry must combine quality with biomass formation, and with this purpose the application of plant growth regulators has been studied in these crops. The objective of this study was to evaluate the effect of a biostimulant on growth, inflorescence production and flavonoid content in marigold. The experiment was conducted in a greenhouse, and the treatments consisted of increasing doses of the biostimulant (0, 3, 6, 9, 12 and 15 mL L-1) applied by foliar spraying in ten consecutive applications. The experiment was arranged in a completely randomized design, with six treatments and ten replicates. The numbers of leaves and flower heads and the root dry matter increased linearly with increasing doses of the growth promoter, with increases of 20%, 36.97% and 97.28%, respectively, compared with the control. Total dry mass and shoot dry mass reached maximum values at the highest dose tested, 15 mL L-1 (with increases of 40.09% and 46.30%, respectively). Plant height and flavonoid content reached their highest values at a dose of 6 mL L-1. The biostimulant promoted the development of marigold and positively influenced the synthesis of the secondary compound of medicinal interest. Among the tested doses, application rates between 6 and 9 mL L-1 of the biostimulant are recommended for more efficient large-scale production of marigold.
Abstract:
Survival analysis is applied when the time until the occurrence of an event is of interest. Such data are routinely collected in plant disease studies, although applications of the method are uncommon. The objective of this study was to use two studies on post-harvest diseases of peaches, considering the two harvests together and the existence of a random effect shared by fruits from the same tree, to describe the main techniques of survival analysis. The nonparametric Kaplan-Meier method, the log-rank test and the semiparametric Cox proportional hazards model were used to estimate the effect of cultivar and of the number of days after full bloom on survival until the appearance of brown rot symptoms, and on the instantaneous risk of expressing them, in two consecutive harvests. The joint analysis with a baseline effect varying between harvests, and the confirmation of the tree effect as a grouping factor with a random effect, were appropriate to interpret the phenomenon (disease) evaluated and can be important tools to replace or complement conventional analysis, respecting the nature of the variable and the phenomenon.
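The log-rank test mentioned above compares event rates between two groups across all observed event times. A bare-bones two-sample version is sketched below, illustrative only: the study's actual analysis also involved a Cox model and shared random effects per tree, which this sketch does not cover, and the input data here are invented.

```python
def logrank(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic (1 degree of freedom).
    times*: follow-up times; events*: 1 = event observed, 0 = censored."""
    pooled = [(t, e, 0) for t, e in zip(times1, events1)] + \
             [(t, e, 1) for t, e in zip(times2, events2)]
    event_times = sorted({t for t, e, _ in pooled if e})
    observed_minus_expected = 0.0
    variance = 0.0
    for t in event_times:
        # at-risk counts and event counts in each group at time t
        n1 = sum(1 for tt, _, g in pooled if tt >= t and g == 0)
        n2 = sum(1 for tt, _, g in pooled if tt >= t and g == 1)
        d1 = sum(1 for tt, e, g in pooled if tt == t and e and g == 0)
        d2 = sum(1 for tt, e, g in pooled if tt == t and e and g == 1)
        n, d = n1 + n2, d1 + d2
        observed_minus_expected += d1 - d * n1 / n
        if n > 1:
            # hypergeometric variance of d1 at this event time
            variance += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return observed_minus_expected ** 2 / variance

# Identical groups: statistic is 0 (no evidence of different survival).
print(logrank([1, 2, 3], [1, 1, 1], [1, 2, 3], [1, 1, 1]))
```

The statistic is compared against a chi-square distribution with one degree of freedom; values above about 3.84 correspond to p < 0.05.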
Abstract:
The aims of this study were a) to assess the ability of primary care doctors to make accurate ratings of psychiatric disturbance and b) to evaluate the use of a case-finding questionnaire in the detection of psychiatric morbidity. The study took place in three primary care clinics in the city of São Paulo, Brazil, during a six-month survey. A time sample of consecutive adult attenders was asked to complete a case-finding questionnaire for psychiatric disorders (the Self-Report Questionnaire - SRQ), and a subsample was selected for a semi-structured psychiatric interview (the Clinical Interview Schedule - CIS). At the end of the consultation, the primary care doctors were asked to assess, in a standardized way, the presence or absence of psychiatric disorder; these assessments were then compared with the ratings obtained in the psychiatric interview. A considerable proportion of minor psychiatric morbidity remained undetected by the three primary care doctors: the hidden morbidity ranged from 22% to 79%. The doctors' detection rates were consistently lower than those of the case-finding questionnaire, indicating that the use of such instruments can enhance the recognition of psychiatric disorders in primary care settings. Four strategies for adopting the questionnaire are described, and some of the clinical consequences of its use are discussed.
Abstract:
A previously calculated predictive model for health risk selects infants who suffer 4-5 times more morbidity than their unselected peers. Preliminary results suggested that this risk is related to maternal neurotic symptomatology. To evaluate this hypothesis, 52 consecutive mothers whose infants had a positive predictive score (Group 1) and 52 whose infants had a negative score (Group 2) were evaluated by means of Goldberg's General Health Questionnaire (GHQ-30). A total of 41.9% and 20.5% of the mothers in Groups 1 and 2, respectively, scored above 11 points on the GHQ-30, established as the cutoff point. It is concluded that, among poor urban families in Santiago, mothers of infants with a high risk of persistent diarrhoea have an increased frequency of detectable neurotic symptoms. New programs aimed at this type of infant should include psychological support for their mothers.