73 results for 275
Abstract:
We investigated the effect of age and sex on the serum activity of hexosaminidase (HEX) and β-glucuronidase (BGLU) in 275 normal term infants aged 12 h to 12 months. Up to six weeks of life, HEX was significantly higher in boys (P ≤ 0.023). During the age period of 1-26 weeks, BGLU was also higher in boys, but the differences were significant only at 2-6 and 7-15 weeks (P ≤ 0.016). The developmental pattern of HEX and BGLU was sex dependent. HEX activity increased in both sexes from 4-7 days of life, reaching a maximum of 1.4-fold the birth value at 2-6 weeks of age in boys (P < 0.001) and a maximum of 1.6-fold at 7-15 weeks in girls (P < 0.001). HEX activity gradually decreased thereafter, reaching significantly lower levels at 27-53 weeks than during the first three days of life in boys (P = 0.002) and returning to the level of the first three days in girls. BGLU increased in both sexes from 4-7 days of age, showing a maximum increase at 7-15 weeks (3.3-fold in boys and 2.9-fold in girls, both P < 0.001). BGLU then decreased in boys to a value similar to that observed at 4-7 days of age, whereas in girls it remained elevated until the end of the first year of life. These results indicate that HEX and BGLU activities vary during the first year of life and that sex influences their developmental pattern. This should be taken into account in the diagnosis of GM2 gangliosidosis and mucopolysaccharidosis type VII.
Abstract:
Previous studies have shown that saccadic eye responses, but not manual responses, are sensitive to the kind of warning signal used, with visual onsets producing longer saccadic latencies than visual offsets. The aim of the present study was to determine the effects of distinct warning signals on manual latencies and to test the premise that this onset interference does not, in fact, occur for manual responses. A second objective was to determine whether the magnitude of the warning effects could be modulated by contextual procedures. Three experimental conditions based on the kind of warning signal used (visual onset, visual offset and auditory warning) were run in two different contexts (blocked and non-blocked). Eighteen participants were asked to respond to an imperative stimulus that occurred 0, 250, 500 or 750 ms after the warning signal. The experiment consisted of three experimental sessions of 240 trials each, in which all variables were counterbalanced. The data showed that visual onsets produced longer manual latencies than visual offsets in the non-blocked context (275 vs 261 ms; P < 0.001). This interference was obtained, however, only for short intervals between the warning signal and the stimulus, and was abolished when the blocked context was used (256 vs 255 ms; P = 0.789). These results are discussed in terms of bottom-up and top-down interactions, mainly those related to the role of attentional processing in canceling out competitive interactions and suppressive influences of a distractor on the relevant stimulus.
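For illustration, the sketch below shows one way a counterbalanced trial list for a single session could be generated. The factor crossing (3 warning signals × 4 foreperiods, 20 repetitions per cell) and the shuffling used for the non-blocked context are assumptions made for the example, not details taken from the original methods.

```python
# A minimal sketch of a counterbalanced 240-trial list; the cell structure
# (3 warnings x 4 foreperiods x 20 repetitions) is assumed for illustration.
import itertools
import random

warnings = ["visual_onset", "visual_offset", "auditory"]
foreperiods_ms = [0, 250, 500, 750]

trials = [
    {"warning": w, "foreperiod_ms": fp}
    for w, fp in itertools.product(warnings, foreperiods_ms)
    for _ in range(20)                     # 12 cells x 20 = 240 trials
]
random.shuffle(trials)                     # non-blocked context: conditions interleaved
print(len(trials), trials[0])
```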
Abstract:
Mood disorders cause many social problems, often involving family relationships. Few studies in the literature compare patients with bipolar, unipolar, dysthymic, and double depressive disorders with respect to these aspects. In the present study, demographic and disease data were collected using a specifically prepared questionnaire. Social adjustment was assessed using the Disability Adjustment Scale and family relationships were evaluated using the Global Assessment of Relational Functioning Scale. One hundred patients under treatment for at least 6 months were evaluated at the Psychiatric Outpatient Clinic of the Botucatu School of Medicine, UNESP. Most patients were women (82%), older than 50 years (49%), with at least two years of follow-up, little schooling (62% had less than 4 years), and a low socioeconomic level. Logistic regression analysis showed that a diagnosis of unipolar disorder (P = 0.003, OR = 0.075, CI = 0.014-0.403) or dysthymia (P = 0.001, OR = 0.040, CI = 0.006-0.275), as well as family relationships (P = 0.002, OR = 0.953, CI = 0.914-0.992), played a significant role in social adjustment. Unipolar and dysthymic patients presented better social adjustment than bipolar and double depressive patients (P < 0.001), a result not explained by social class. These patients, treated at a teaching hospital, may represent the severest mood disorder cases. Evaluations were made with knowledge of the patients' diagnoses, which might also have influenced some of the results. Social disabilities among mood disorder patients are very frequent and intense.
Abstract:
The objective of the present study was to assess the incidence, risk factors and outcome of patients who develop acute renal failure (ARF) in intensive care units. In this prospective observational study, 221 patients with a minimum stay of 48 h, a minimum age of 18 years and no overt acute or chronic renal failure were included. Exclusion criteria were organ donors and renal transplantation patients. ARF was defined as a creatinine level above 1.5 mg/dL. Statistics were performed using Pearson's chi-square test, Student's t-test, and the Wilcoxon test. Multivariate analysis was run using all variables with P < 0.1 in the univariate analysis. ARF developed in 19.0% of the patients, and 76.19% of them died. The main risk factors (univariate analysis) were: higher intraoperative hydration and bleeding, higher death risk by APACHE II score, Logistic Organ Dysfunction System score on the first day, mechanical ventilation, shock due to systemic inflammatory response syndrome (SIRS)/sepsis, noradrenaline use, and plasma creatinine and urea levels on admission. Heart rate on admission (OR = 1.023 (1.002-1.044)), male gender (OR = 4.275 (1.340-13.642)), shock due to SIRS/sepsis (OR = 8.590 (2.710-27.229)), higher intraoperative hydration (OR = 1.002 (1.000-1.004)), and plasma urea on admission (OR = 1.012 (0.980-1.044)) remained significant in the multivariate analysis. The mortality risk factors (univariate analysis) were shock due to SIRS/sepsis, mechanical ventilation, bloodstream infection, and potassium and bicarbonate levels. Only potassium levels remained significant (P = 0.037). In conclusion, ARF has a high incidence, morbidity and mortality when it occurs in the intensive care unit, and it is very closely associated with hemodynamic status and multiple organ dysfunction.
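As a worked check of how the multivariate results above are read, the short sketch below converts a logistic-regression coefficient into an odds ratio with its 95% confidence interval (OR = exp(β), CI = exp(β ± 1.96·SE)). The coefficient and standard error are back-calculated from the reported OR for shock due to SIRS/sepsis, purely for illustration.

```python
# Minimal sketch: odds ratio and 95% CI from a logistic-regression coefficient.
# beta and se are back-calculated from the reported OR = 8.590 (2.710-27.229);
# they are illustrative, not values taken from the paper.
import math

beta, se = 2.1506, 0.5886
odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(f"OR = {odds_ratio:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f})")
# -> approximately OR = 8.590 (95% CI 2.710-27.228), matching the reported values
```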
Abstract:
To efficiently examine the association of glutamic acid decarboxylase antibody (GADA) positivity with the onset and progression of diabetes in middle-aged adults, we performed a case-cohort study representing the ~9-year experience of 10,275 Atherosclerosis Risk in Communities Study participants, initially aged 45-64 years. Antibodies to glutamic acid decarboxylase (GAD65) were measured by radioimmunoassay in 580 incident diabetes cases and 544 non-cases. The overall weighted prevalence of GADA positivity (≥1 U/mL) was 7.3%. Baseline risk factors, with the exception of smoking and interleukin-6 (P ≤ 0.02), were generally similar between GADA-positive and -negative individuals. GADA positivity did not predict incident diabetes in multiply adjusted proportional hazards analyses (HR = 1.04; 95%CI = 0.55, 1.96). However, a small non-significant adjusted risk (HR = 1.29; 95%CI = 0.58, 2.88) was seen for those in the highest tertile (≥2.38 U/mL) of positivity. GADA-positive and GADA-negative non-diabetic individuals had similar risk profiles for diabetes, with central obesity and elevated inflammation markers, aside from glucose, being the main predictors. Among diabetes cases at the study's end, progression to insulin treatment increased monotonically as a function of baseline GADA level. Overall, being GADA positive increased the risk of progression to insulin use almost 10-fold (HR = 9.9; 95%CI = 3.4, 28.5). In conclusion, in initially non-diabetic middle-aged adults, GADA positivity did not increase diabetes risk, and the overall baseline profile of risk factors was similar for positive and negative individuals. Among middle-aged adults, with the possible exception of those with the highest GADA levels, the autoimmune pathophysiology reflected by GADA may become clinically relevant only after diabetes onset.
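The hazard ratios above come from proportional hazards models, where HR = exp(coefficient). The sketch below shows, on hypothetical toy data, how such a model could be fit with the lifelines library; the data frame, column names and the unweighted fit are assumptions for illustration only, and the actual case-cohort analysis uses a weighted design that this sketch omits.

```python
# Minimal sketch of a Cox proportional hazards fit (HR = exp(coef)) on toy data.
# All values and column names are hypothetical; the real analysis used a
# weighted case-cohort design that is not reproduced here.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [2.1, 5.0, 9.0, 3.4, 9.0, 7.2],
    "diabetes":       [1,   1,   0,   1,   0,   0],
    "gada_positive":  [1,   0,   0,   1,   0,   1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="diabetes")
print(cph.summary)  # the exp(coef) column is the hazard ratio for gada_positive
```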
Abstract:
The prevalence of uncontrolled and controlled asthma and the factors associated with uncontrolled asthma were investigated in a cross-sectional study. Patients aged 11 years or older with a confirmed asthma diagnosis were recruited from the outpatient asthma clinic of Hospital de Clínicas de Porto Alegre, Brazil. Patients were excluded if they had another chronic pulmonary disease. They underwent an evaluation with a general questionnaire, an asthma control questionnaire (based on the 2006 Global Initiative for Asthma guidelines), assessment of inhaler device technique, and pulmonary function tests. Asthma was controlled in 48 of 275 patients (17.5%), partly controlled in 74 (26.9%) and uncontrolled in 153 (55.6%). In the univariate analysis, asthma severity was associated with asthma control (P < 0.001). Availability of asthma medications was also associated with asthma control (P = 0.01): most patients who could purchase medications had controlled asthma, while patients who depended on the public health system for access to medications had lower rates of controlled asthma. The use of inhaled corticosteroids was lower in the uncontrolled group (P < 0.001). Logistic regression analysis identified three factors associated with uncontrolled asthma: severity of asthma (OR = 5.33, P < 0.0001), access to medications (OR = 1.97, P = 0.025) and use of inhaled corticosteroids (OR = 0.17, P = 0.030). This study showed a high rate of uncontrolled asthma in patients attending an outpatient asthma clinic. Severity of asthma, access to medications and adequate use of inhaled corticosteroids were associated with the degree of asthma control.
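As a quick arithmetic check, the sketch below reproduces the control-level percentages reported above from the raw counts (48, 74 and 153 of 275 patients).

```python
# Verify the reported proportions of asthma control levels.
counts = {"controlled": 48, "partly controlled": 74, "uncontrolled": 153}
total = sum(counts.values())              # 275
for level, n in counts.items():
    print(f"{level}: {n}/{total} = {100 * n / total:.1f}%")
# -> 17.5%, 26.9% and 55.6%, as reported
```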
Abstract:
The cardiovascular electrophysiologic basis for the action of pyridostigmine, an acetylcholinesterase inhibitor, has not been investigated. The objective of the present study was to determine the cardiac electrophysiologic effects of a single dose of pyridostigmine bromide in an open-label, quasi-experimental protocol. Fifteen patients referred for diagnostic cardiac electrophysiologic study underwent two studies, just before and 90-120 min after the oral administration of pyridostigmine (45 mg). Pyridostigmine was well tolerated by all patients. The anterograde atrioventricular nodal Wenckebach point and the basic cycle length were not altered by pyridostigmine. Sinus node recovery time (ms) was shorter during 500-ms cycle stimulation (pre: 326 ± 45 vs post: 235 ± 47; P = 0.003) but not during 400-ms (pre: 275 ± 28 vs post: 248 ± 32; P = 0.490) or 600-ms (pre: 252 ± 42 vs post: 179 ± 26; P = 0.080) cycle stimulation. Pyridostigmine increased the ventricular refractory period (ms) during 400-ms cycle stimulation (pre: 238 ± 7 vs post: 245 ± 9; P = 0.028) but not during 500-ms (pre: 248 ± 7 vs post: 253 ± 9; P = 0.150) or 600-ms (pre: 254 ± 8 vs post: 259 ± 8; P = 0.255) cycle stimulation. We conclude that pyridostigmine did not produce conduction disturbances and, indeed, increased the ventricular refractory period at higher heart rates. While this effect may explain previous results showing an anti-arrhythmic action of pyridostigmine, its clinical impact on long-term outcomes requires further investigation.
Abstract:
Bone mass loss is a major complication of chronic cholestatic liver disease (CCD). However, the long-term impact of CCD on bone mass acquisition is unknown. We longitudinally assessed bone mineral density (BMD) and factors involved in bone remodeling in 9 children and adolescents with Child-Pugh A CCD (5 boys/4 girls) and in 13 controls (6 boys/7 girls). The groups were evaluated twice, at baseline (T0) and after 3 years (T1), when osteocalcin, deoxypyridinoline, 25-hydroxyvitamin D, parathyroid hormone, insulin-like growth factor-I (IGF-I), and BMD (L1-L4, proximal femur and total body) were determined. Serum levels of receptor activator of nuclear factor κB ligand (RANKL) and osteoprotegerin were measured only at T1. Lumbar spine BMD was reanalyzed in two ways: adjusted for bone age and corrected for height. Volumetric density was also estimated mathematically for L2-L4. The BMD of L1-L4 was lower in the CCD group (Z-score at T0: control = -1.2 ± 0.8 vs CCD = -2.2 ± 1.4, P < 0.05; at T1: control = -0.7 ± 0.8 vs CCD = -2.1 ± 1.1, P < 0.05). Osteocalcin and deoxypyridinoline were similar in the two groups. The CCD group presented lower IGF-I (Z-score at T1: control = 1.4 ± 2.8 vs CCD = -1.5 ± 1.0, P < 0.05) and RANKL (control = 0.465 ± 0.275 vs CCD = 0.195 ± 0.250 pM, P < 0.05) than controls. Children with compensated Child-Pugh A CCD showed early impairment of bone mass acquisition, with the impact being more severe in an initial phase and then attenuating slowly and progressively. Reduction in endocrine IGF-I plays a crucial role in this process.
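The BMD results above are expressed as Z-scores, i.e., the number of reference standard deviations by which a measurement differs from the age- and sex-matched reference mean. The sketch below illustrates the calculation with made-up reference values.

```python
# Minimal sketch of a BMD Z-score: (measured - reference mean) / reference SD.
# The measurement, reference mean and SD below are invented for illustration;
# real Z-scores use densitometer-specific, age- and sex-matched normative data.
def bmd_z_score(measured: float, ref_mean: float, ref_sd: float) -> float:
    return (measured - ref_mean) / ref_sd

print(round(bmd_z_score(measured=0.62, ref_mean=0.74, ref_sd=0.09), 1))  # -1.3
```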
Abstract:
The Caco-2 cell line has been used as a model to predict the in vitro permeability of the human intestinal barrier. The predictive potential of the assay relies on an appropriate in-house validation of the method. The objective of the present study was to develop a single HPLC-UV method for the identification and quantitation of marker drugs and to determine the suitability of the Caco-2 cell permeability assay. A simple chromatographic method was developed for the simultaneous determination of passively transported drugs (propranolol, carbamazepine, acyclovir, and hydrochlorothiazide) and actively transported drugs (vinblastine and verapamil). Separation was achieved on a C18 column with step-gradient elution (acetonitrile and aqueous ammonium acetate solution, pH 3.0) at a flow rate of 1.0 mL/min and UV detection at 275 nm over a total run time of 35 min. The method was validated and found to be specific, linear, precise, and accurate. This chromatographic system can be readily used on a routine basis and its use can be extended to other permeability models. The results obtained in the Caco-2 bi-directional transport experiments confirmed the validity of the assay, given that high and low permeability profiles were identified and P-glycoprotein functionality was established.
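Although the abstract does not spell them out, Caco-2 bi-directional transport results of this kind are normally summarized with the standard apparent-permeability coefficient, Papp = (dQ/dt)/(A·C0), and the efflux ratio Papp(B→A)/Papp(A→B) as the indicator of P-glycoprotein functionality. The sketch below applies these standard formulas to illustrative numbers; the insert area, fluxes and donor concentration are assumptions, not data from the study.

```python
# Minimal sketch of the standard Caco-2 permeability calculations:
# Papp = (dQ/dt) / (A * C0), efflux ratio = Papp(B->A) / Papp(A->B).
# All numerical values below are illustrative only.

def papp(dq_dt_ug_per_s: float, area_cm2: float, c0_ug_per_ml: float) -> float:
    """Apparent permeability in cm/s; dQ/dt in ug/s, C0 in ug/mL (= ug/cm^3)."""
    return dq_dt_ug_per_s / (area_cm2 * c0_ug_per_ml)

papp_ab = papp(dq_dt_ug_per_s=2.0e-4, area_cm2=1.12, c0_ug_per_ml=100.0)
papp_ba = papp(dq_dt_ug_per_s=8.0e-4, area_cm2=1.12, c0_ug_per_ml=100.0)
print(f"Papp A->B = {papp_ab:.2e} cm/s, efflux ratio = {papp_ba / papp_ab:.1f}")
```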
Abstract:
A preliminary GC-MS analysis, comparing the mass spectra of the compounds with the Wiley 275 L mass spectral database, was used to identify the fatty acids and, mainly, some of the volatile compounds responsible for the flavor of roasted coffee oil. The oil was obtained by mechanical expelling of Brazilian beans (Coffea arabica) roasted at 238°C for 10 minutes. Different sample preparation methodologies, such as headspace sampling, adsorbent suction trapping and esterification, were used. It was possible to identify pyrazines, pyridines, furan derivatives and other compounds not previously reported in the literature.
Abstract:
The quality of the food offered to the population has always been a concern of the federal government, as seen in the publication, on February 27, 1967, of Decreto Lei 209, which established the Código Brasileiro de Alimentos (Brazilian Food Code). Portaria 1.428 of November 26, 1993/ANVISA innovated in the list of factors contributing to food contamination by including, among the guidelines for establishing good practices in food production and food services, disinfestation, which comprises the sanitization plan used by the establishment. From that moment on, a new view of the determinants of food contamination emerged: the control of rodents and insects. This procedure came to be part of all the legal documents subsequently issued by the Ministries of Health and Agriculture concerning food control. Until the publication of RDC 275/2002-ANVISA, urban pest control could be carried out by trained in-house teams of establishments engaged in the production/industrialization, fractioning, storage and transport of industrialized foods, or in the handling, preparation, fractioning, storage, distribution, transport, display for sale and delivery of foods prepared for consumption, such as cafeterias, buffets, confectioneries, industrial kitchens, institutional kitchens, delicatessens, snack bars, bakeries, pastry shops, restaurants and the like. Since its publication, chemical control may only be performed by pest control companies that comply with RDC 18/2000-ANVISA. This, however, does not remove the legal responsibility of the food company, which must include pest control, whether physical and/or chemical, in its POP (Procedimento Operacional Padronizado, standardized operating procedure).
Abstract:
INTRODUCTION: There is evidence that soy protein could help slow the progression of renal disease by lowering serum cholesterol and proteinuria in patients with nephropathies. This study was designed to evaluate the effect of a soy-protein diet on proteinuria and dyslipidemia in patients with proteinuric glomerulopathies. PATIENTS AND METHODS: Patients were divided into three groups: the Control Group (n = 9) received a diet containing 0.8 g/kg/day of animal protein; Study Group 1 (n = 9) received a diet containing 0.8 g/kg/day of soy protein, and Group 2 (n = 9) received a diet containing 0.8 g/kg/day of soy protein plus fiber. The study period was eight weeks. During the baseline period and at the end of the study, patients underwent laboratory and anthropometric evaluation. RESULTS: No statistically significant differences were observed between the pre- and post-intervention periods in any of the groups, either in anthropometric parameters or body composition among the three groups, or in proteinuria levels (Control: 0.7 ± 0.6 vs 0.8 ± 0.6; Group 1: 2.0 ± 1.7 vs 1.9 ± 1.8; Group 2: 2.0 ± 1.4 vs 2.1 ± 2.0). However, a slight decrease was observed in triglycerides (244.8 ± 275.9 vs 200.5 ± 34.0), total cholesterol (234.0 ± 59.4 vs 181.2 ± 110.3) and LDL (136.0 ± 59.1 vs 104.1 ± 39.4) in Group 1, although without reaching statistical significance. CONCLUSION: No beneficial effects of replacing animal protein with soy protein were detected with respect to reducing proteinuria and hyperlipidemia; however, the soy-protein diet caused no deleterious changes in body composition and maintained an adequate nutritional state.
Abstract:
Introduction: Symptoms of hopelessness, suicidal ideation and depression affect the quality of life and life expectancy of patients with chronic kidney disease. Objective: To evaluate whether symptoms of hopelessness, suicidal ideation and depression differ between chronic kidney disease patients on hemodialysis and kidney transplant recipients. We also analyzed whether sociodemographic variables such as employment status, having dependents, sex and marital status influence these symptoms. Methods: Comparative, cross-sectional study in which 50 patients on chronic hemodialysis and 50 kidney transplant recipients, clinically stable, without psychopathologies and matched for sex and age, were randomly selected. Instruments: Beck Hopelessness Scale (BHS), Beck Scale for Suicide Ideation (BSI) and Beck Depression Inventory (BDI). Results: BHS: 2% of each group scored > 8 (P = 1.00). BSI: 4% of hemodialysis patients and 6% of transplant recipients scored > 1 (P = 1.00). BDI: 20% of hemodialysis patients and 12% of transplant recipients scored > 14 (P = 0.275). There was no relationship between the variables tested and symptoms of hopelessness or suicidal ideation. Not being employed was associated with more depressive symptoms (mean BDI score: 10.5 vs 7.3, P = 0.027). Recipients of deceased-donor transplants had more depressive symptoms than recipients of living-donor transplants (mean BDI score: 11.0 vs 6.7, P = 0.042). Conclusion: There was no difference in the intensity of symptoms of hopelessness, suicidal ideation and depression between stable hemodialysis patients and transplant recipients. Not being employed and receiving a deceased-donor transplant were associated with more depressive symptoms. The prevalence of suicidal ideation and depressive symptoms in both modalities deserves attention and indicates the need for monitoring and care of these patients.
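The abstract does not name the statistical test used for the group comparisons. As shown in the sketch below, a chi-square test without continuity correction on the BDI counts (10/50 vs 6/50 patients above the cutoff) reproduces the reported P = 0.275, and a Fisher exact test on the BHS counts (1/50 in each group) gives P = 1.00; this is a plausible reconstruction, not a statement of the authors' actual method.

```python
# Hedged reconstruction of the 2x2 comparisons reported above.
# Counts follow from the stated percentages of n = 50 per group.
from scipy.stats import chi2_contingency, fisher_exact

# BDI > 14: 20% of 50 hemodialysis patients (10) vs 12% of 50 recipients (6)
chi2, p_bdi, dof, _ = chi2_contingency([[10, 40], [6, 44]], correction=False)

# BHS > 8: 2% of each group (1 of 50 in each)
_, p_bhs = fisher_exact([[1, 49], [1, 49]])

print(f"BDI > 14: P = {p_bdi:.3f}; BHS > 8: P = {p_bhs:.2f}")
# -> BDI > 14: P = 0.275; BHS > 8: P = 1.00
```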