107 results for Global Processing Speed
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Previous work has suggested that decrements in both processing speed and working memory span play a role in the memory impairment observed in patients with schizophrenia. We undertook a study to examine the effect of these two factors simultaneously. A sample of 49 patients with schizophrenia and 43 healthy controls underwent a battery of verbal and visual memory tasks. Superficial and deep encoding memory measures were tallied. We conducted regression analyses on the various memory measures, using processing speed and working memory span as independent variables. In the patient group, processing speed was a significant predictor of superficial and deep memory measures in verbal and visual memory. Working memory span was an additional significant predictor of the deep memory measures only. Regression analyses involving all participants revealed that the effect of diagnosis on all the deep encoding memory measures was reduced to non-significance when processing speed was entered in the regression. Decreased processing speed is involved in the verbal and visual memory deficit in patients, whether the task requires superficial or deep encoding. Working memory is involved only insofar as the task requires a certain amount of effort. (JINS, 2011, 17, 485-493)
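The simultaneous-predictor regression described above can be sketched as follows; the variable names and the synthetic data are illustrative stand-ins, not the study's dataset, and ordinary least squares via NumPy replaces whatever statistics package the authors used.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 49  # illustrative sample size (the patient group)

# Hypothetical standardized scores for the two independent variables
processing_speed = rng.normal(size=n)
wm_span = rng.normal(size=n)

# Synthetic deep-encoding memory score depending on both predictors
memory_score = 0.6 * processing_speed + 0.3 * wm_span + rng.normal(scale=0.1, size=n)

# Enter both predictors simultaneously in an ordinary least squares fit
X = np.column_stack([np.ones(n), processing_speed, wm_span])
coef, *_ = np.linalg.lstsq(X, memory_score, rcond=None)
intercept, b_speed, b_span = coef
```

With both predictors entered together, each coefficient reflects the unique contribution of its variable, which is how a predictor such as working memory span can remain significant for the deep measures only.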
Abstract:
Objective: The purpose of the present study was to investigate the influence that education and depression have on the performance of elderly people in neuropsychological tests. Methods: The study was conducted at the Institute of Psychiatry, University of Sao Paulo School of Medicine, Hospital das Clinicas. All of the individuals evaluated were aged 60 or older. The study sample consisted of 59 outpatients with depressive disorders and 51 healthy controls. We stratified the sample by level of education: low = 1-4 years of schooling; high = 5 or more years of schooling. Evaluations consisted of psychiatric assessment, cognitive assessment, laboratory tests and cerebral magnetic resonance imaging. Results: We found that level of education influenced all the measures of cognitive domains investigated (intellectual efficiency, processing speed, attention, executive function and memory) except the Digit Span Forward and Fuld Object Memory Evaluation (immediate and delayed recall), whereas depressive symptoms influenced some measures of memory, attention, executive function and processing speed. Although the combination of a low level of education and depression had a significant negative influence on Stroop Test part B, Trail Making Test part B and Logical Memory (immediate recall), we found no other significant effects of the interaction between level of education and depression. Conclusion: The results of this study underscore the importance of considering the level of education in the analysis of cognitive performance in depressed elderly patients, as well as the relevance of developing new cognitive function tests in which level of education has a reduced impact on the results.
Abstract:
This paper presents a rational approach to the design of a catamaran's hydrofoil within a modern context of multidisciplinary optimization. The approach includes the use of response surfaces represented by neural networks and a distributed programming environment that increases the optimization speed. A rational approach to the problem simplifies the complex optimization model; when combined with the distributed dynamic training used for the response surfaces, this model increases the efficiency of the process. The results achieved using this approach have justified this publication.
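The response-surface idea can be illustrated with a minimal sketch: fit a cheap surrogate to a few expensive evaluations, then optimize the surrogate instead of the objective. Here a quadratic polynomial stands in for the neural-network surface and the "expensive" objective is a toy function; none of this is the authors' actual model.

```python
import numpy as np

def expensive_objective(x):
    # Toy stand-in for a costly hydrodynamic evaluation of a design variable x
    return (x - 1.3) ** 2 + 0.5

# Sample the design space sparsely, as a real optimization loop would
xs = np.linspace(0.0, 3.0, 7)
ys = np.array([expensive_objective(x) for x in xs])

# Fit a quadratic response surface y ~ a*x**2 + b*x + c to the samples
a, b, c = np.polyfit(xs, ys, deg=2)

# Optimize the cheap surrogate instead of the expensive objective:
# the minimum sits at the vertex of the fitted parabola
x_opt = -b / (2 * a)
```

In a distributed setting, the expensive evaluations can run in parallel and the surrogate can be retrained dynamically as new samples arrive, which is the efficiency gain the abstract describes.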
Abstract:
This paper compares the critical impeller speed results for 6 L Denver and Wemco bench-scale flotation cells with findings from a study by Van der Westhuizen and Deglon [Van der Westhuizen, A.P., Deglon, D.A., 2007. Evaluation of solids suspension in a pilot-scale mechanical flotation cell: the critical impeller speed. Minerals Engineering 20,233-240; Van der Westhuizen, A.P., Deglon, D.A., 2008. Solids suspension in a pilot scale mechanical flotation cell: a critical impeller speed correlation. Minerals Engineering 21, 621-629] conducted in a 125 L Batequip flotation cell. Understanding solids suspension has become increasingly important due to dramatic increases in flotation cell sizes. The critical impeller speed is commonly used to indicate the effectiveness of solids suspension. The minerals used in this study were apatite, quartz and hematite. The critical impeller speed was found to be strongly dependent on particle size, solids density and air flow rate, with solids concentration having a lesser influence. Liquid viscosity was found to have a negligible effect. The general Zwietering-type critical impeller speed correlation developed by Van der Westhuizen and Deglon [Van der Westhuizen, A.P., Deglon, D.A., 2008. Solids suspension in a pilot scale mechanical flotation cell: a critical impeller speed correlation. Minerals Engineering 21, 621-629] was found to be applicable to all three flotation machines. The exponents for particle size, solids concentration and liquid viscosity were equivalent for all three cells. The exponent for solids density was found to be less significant than that obtained by the previous authors, and to be consistent with values reported in the general literature for stirred tanks. Finally, a new dimensionless critical impeller speed correlation is proposed where the particle size is divided by the impeller diameter. 
This modified equation generally predicts the experimental measurements well, with most predictions within 10% of the experimental values. (C) 2009 Elsevier Ltd. All rights reserved.
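A Zwietering-type correlation expresses the critical impeller speed as a power law in the operating variables. The sketch below shows the general shape, with the particle size scaled by the impeller diameter as in the dimensionless correlation proposed here; the constant and all exponents are illustrative placeholders, not the values fitted by the authors.

```python
def critical_impeller_speed(d_p, X, nu, D, S=1.0, a=0.1, b=0.2, c=0.1):
    """Generic Zwietering-type power law for the critical impeller speed.

    d_p : particle size (m)
    X   : solids concentration (wt%)
    nu  : liquid kinematic viscosity (m^2/s)
    D   : impeller diameter (m)

    S and the exponents a, b, c are illustrative placeholders; the
    particle size enters as the dimensionless ratio d_p / D, as in the
    proposed correlation.
    """
    return S * (nu ** a) * ((d_p / D) ** b) * (X ** c)
```

With positive exponents, the predicted critical speed rises with particle size and solids concentration, matching the qualitative dependences reported above.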
Abstract:
The goal of this paper is to study the global existence of small data solutions to the Cauchy problem for the nonlinear wave equation u_{tt} - a(t)^2 Δu = u_t^2 - a(t)^2 |∇u|^2. In particular we are interested in statements for the 1D case. We will explain how the interplay between the increasing and oscillating behavior of the coefficient will influence global existence of small data solutions. Copyright © 2011 John Wiley & Sons, Ltd.
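In conventional notation, the Cauchy problem above reads as follows; the small-data initial conditions are the standard formulation of such problems, added here for clarity rather than quoted from the paper:

```latex
u_{tt} - a(t)^2 \Delta u = u_t^2 - a(t)^2 |\nabla u|^2,
\qquad u(0,x) = \varepsilon\,\varphi(x), \quad u_t(0,x) = \varepsilon\,\psi(x),
```

with $\varepsilon > 0$ small and the coefficient $a(t)$ combining increasing and oscillating behavior.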
Abstract:
The Down syndrome (DS) immune phenotype is characterized by thymus hypotrophy, higher propensity to organ-specific autoimmune disorders, and higher susceptibility to infections, among other features. Considering that AIRE (autoimmune regulator) is located on 21q22.3, we analyzed protein and gene expression in surgically removed thymuses from 14 DS patients with congenital heart defects, who were compared with 42 age-matched controls with heart anomaly as an isolated malformation. Immunohistochemistry revealed 70.48 +/- 49.59 AIRE-positive cells/mm(2) in DS versus 154.70 +/- 61.16 AIRE-positive cells/mm(2) in controls (p < 0.0001), and quantitative PCR as well as DNA microarray data confirmed those results. The number of FOXP3-positive cells/mm(2) was equivalent in both groups. Thymus transcriptome analysis showed 407 genes significantly hypoexpressed in DS, most of which were related, according to network transcriptional analysis (FunNet), to cell division and to immunity. Immune response-related genes included those involved in 1) Ag processing and presentation (HLA-DQB1, HLA-DRB3, CD1A, CD1B, CD1C, ERAP) and 2) thymic T cell differentiation (IL2RG, RAG2, CD3D, CD3E, PRDX2, CDK6) and selection (SH2D1A, CD74). It is noteworthy that relevant AIRE-partner genes, such as TOP2A, LAMNB1, and NUP93, were found hypoexpressed in DNA microarrays and quantitative real-time PCR analyses. These findings on global thymic hypofunction in DS revealed molecular mechanisms underlying DS immune phenotype and strongly suggest that DS immune abnormalities are present since early development, rather than being a consequence of precocious aging, as widely hypothesized. Thus, DS should be considered as a non-monogenic primary immunodeficiency. The Journal of Immunology, 2011, 187: 3422-3430.
Abstract:
We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared difference between the model and observed images. The model image is constructed by summing N(s) elliptical Gaussian sources characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, the background noise chosen to mimic that found in interferometric radio maps. Those images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with a similar accuracy to that obtained from the very traditional Astronomical Image Processing System Package task IMFIT when the image is relatively simple (e. g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to show quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique must be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower for a single processor, depending on the number of sources to be optimized). 
As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
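The performance function described above, a sum of squared differences between a model image built from elliptical Gaussian components and the observed image, can be sketched as follows. The grid size, component values, and the width parameterization are illustrative assumptions, not the authors' exact conventions.

```python
import numpy as np

def gaussian_source(xx, yy, x0, y0, peak, ecc, width, theta):
    """One elliptical Gaussian component with the six parameters named in
    the text: peak position (x0, y0), peak intensity, eccentricity,
    major-axis width (the 'amplitude') and orientation angle."""
    # Rotate pixel coordinates into the source's principal-axis frame
    xr = (xx - x0) * np.cos(theta) + (yy - y0) * np.sin(theta)
    yr = -(xx - x0) * np.sin(theta) + (yy - y0) * np.cos(theta)
    minor = width * np.sqrt(1.0 - ecc ** 2)  # minor-axis width from eccentricity
    return peak * np.exp(-0.5 * ((xr / width) ** 2 + (yr / minor) ** 2))

def model_image(shape, sources):
    """Sum of N_s elliptical Gaussian sources on a pixel grid."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    return sum(gaussian_source(xx, yy, *s) for s in sources)

def performance(observed, sources):
    """Performance function: sum of squared model-minus-observed residuals."""
    return np.sum((model_image(observed.shape, sources) - observed) ** 2)

# A noiseless two-component benchmark "jet"
true_sources = [(20.0, 16.0, 1.0, 0.5, 3.0, 0.3),
                (40.0, 16.0, 0.6, 0.7, 4.0, 1.0)]
observed = model_image((32, 64), true_sources)
```

The cross-entropy method would draw candidate parameter sets from a sampling distribution, score each with `performance`, and re-center the distribution on the best-scoring candidates until convergence.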
Abstract:
Effective incorporation of a probiotic into foods requires the culture to remain viable throughout processing and storage, without adverse alterations to sensory characteristics. The objective of this work was to develop a Minas-type fresh cheese with probiotic properties from buffalo milk. Four batches of Minas-type fresh cheese were prepared using buffalo milk: batch T1, to which neither culture nor lactic acid was added; batch T3, to which only lactic acid was added; and batches T2 and T4, both supplemented with Lactobacillus acidophilus LAC 4, with T4 also acidified. The resulting cheeses were evaluated for probiotic culture stability, texture profile, sensory acceptance, and changes in pH. The T4 probiotic cheese presented hardness, gumminess, and chewiness significantly lower than the other treatments. However, values for springiness and cohesiveness did not differ among the cheeses, and no sensory differences (p > 0.05) were found between treatments for texture, taste, and overall acceptance. The addition of the probiotic to the acidified cheese (T4) yielded the best aroma. The populations of L. acidophilus remained greater than 10^6 CFU g^-1 after 28 days of storage in all products. Minas-type fresh cheese from buffalo milk is thus a suitable food for the delivery of L. acidophilus, since the culture remained viable during the shelf life of the products and did not negatively affect the analysed parameters.
Abstract:
Osmotic dehydration is becoming more popular as a complementary treatment in the processing of dehydrated foods, since it presents some advantages such as minimising heat damage to colour and flavour, inhibiting enzymatic browning and thus dispensing with the addition of sulphite and, mainly, reducing energy costs. The objective of the present study was to evaluate the effect of using inverted sugar and sucrose syrups as osmotic agents in the dehydration of mango. The conditions used in the dehydration process were: syrup/fruit ratio of 3:1 (v/w), temperature of 45ºC, and constant stirring. The in natura and osmo-dehydrated fruits were evaluated in relation to pH, moisture content, water activity (a_w) and soluble solids (ºBrix). Solids incorporation and loss in mass after the dehydration process were also determined. The sensory acceptance of the in natura and osmo-dehydrated fruits was determined for the attributes of aroma, flavour, texture and overall acceptance using a hedonic scale. Osmotic dehydration resulted in a reduction in moisture content and water activity, an increase in Brix and maintenance of the pH. The treatment with inverted sugar syrup resulted in more significant alterations in moisture content, a_w, Brix, solids incorporation and loss in mass than the treatment with sucrose syrup. Mangoes osmo-dehydrated with inverted sugar syrup (55.3% inversion rate) achieved acceptance similar to that of in natura mangoes, and this treatment was therefore considered the most adequate for dehydration purposes.
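The quantities reported above, loss in mass and solids incorporation, follow from a simple mass balance. The sketch below uses the standard definitions (per unit initial mass) with made-up sample values, and may differ in detail from the authors' calculation.

```python
def osmotic_dehydration_balance(m0, w0, mf, wf):
    """Standard mass-balance indices for osmotic dehydration.

    m0, mf : initial and final sample mass (g)
    w0, wf : initial and final moisture mass fractions
    Returns (water_loss, solids_gain, mass_loss), each per unit initial mass.
    """
    water_loss = (m0 * w0 - mf * wf) / m0                # water removed from the fruit
    solids_gain = (mf * (1 - wf) - m0 * (1 - w0)) / m0   # osmotic solids incorporated
    mass_loss = (m0 - mf) / m0                           # overall loss in mass
    return water_loss, solids_gain, mass_loss
```

By construction the three indices are consistent: the loss in mass equals the water lost minus the solids gained.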
Abstract:
This study evaluates the performance of seasonal forecasts from the RegCM3 regional climate model, nested within the CPTEC/COLA global model. The RegCM3 forecasts used a horizontal resolution of 60 km over a domain covering a large part of South America. The RegCM3 and CPTEC/COLA forecasts were evaluated against rainfall and air temperature analyses from the Climate Prediction Center (CPC) and the National Centers for Environmental Prediction (NCEP), respectively. Between May 2005 and July 2007, 27 seasonal forecasts of rainfall and air temperature (except for CPTEC/COLA temperature, with 26 forecasts) were evaluated over three regions of Brazil: Northeast (NDE), Southeast (SDE) and South (SUL). The RegCM3 forecasts were also compared with the climatologies of the analyses. According to the statistical indices (bias, correlation coefficient, root mean square error and coefficient of efficiency), in all three regions (NDE, SDE and SUL) the seasonal rainfall predicted by RegCM3 is closer to the observations than that predicted by CPTEC/COLA. Moreover, RegCM3 is also a better predictor of seasonal rainfall than the mean of the observations in the three regions. For temperature, the RegCM3 forecasts are superior to those of CPTEC/COLA in the NDE and SUL regions, while CPTEC/COLA is superior in SDE. Finally, the RegCM3 rainfall and temperature forecasts are closer to the observations than the observed climatology. These results indicate the potential of RegCM3 for seasonal forecasting, which should be explored in the future through ensemble forecasting.
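The four statistical indices used in the evaluation can be sketched as follows; the coefficient of efficiency is taken here in the usual Nash-Sutcliffe form, which may differ in detail from the authors' exact definition.

```python
import numpy as np

def verification_indices(forecast, observed):
    """Bias, correlation coefficient, RMSE and coefficient of efficiency
    for a forecast series evaluated against an observed series."""
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    bias = np.mean(f - o)
    corr = np.corrcoef(f, o)[0, 1]
    rmse = np.sqrt(np.mean((f - o) ** 2))
    # Nash-Sutcliffe efficiency: 1 for a perfect forecast, 0 when the
    # forecast is no better than the observed climatological mean
    eff = 1.0 - np.sum((f - o) ** 2) / np.sum((o - np.mean(o)) ** 2)
    return bias, corr, rmse, eff
```

A positive efficiency is what "better than the mean of the observations" amounts to in this framework.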
Abstract:
The Low Level Jet (LLJ) observed in the Porto Alegre metropolitan region, Rio Grande do Sul State, Brazil, was analyzed using upper-air observations at 00:00 and 12:00 UTC from 1989 to 2003. The LLJ classification criteria proposed by Bonner (1968) and modified by Whiteman et al. (1997) were applied to determine LLJ occurrence. An LLJ event was then selected that was one of the most intense observed in summer during the study period (01/27/2002 at 12:00 UTC). The tools used in this study were atmospheric soundings, GOES-8 satellite images, and wind, temperature and specific humidity fields from the GLOBAL, ETA and BRAMS models. Based on the numerical analysis, it was possible to verify that the three models overestimated the specific humidity and potential temperature values at the time of LLJ occurrence. The wind speed was underestimated by the models. At 12:00 UTC (the hour at which the LLJ was detected in the Porto Alegre region), the three models showed warm, moist air from the north over the study region, generating conditions for Mesoscale Convective System (MCS) formation and intensification.
Abstract:
Due to the imprecise nature of biological experiments, biological data are often characterized by the presence of redundant and noisy values. This may be due to errors that occurred during data collection, such as contamination of laboratory samples. This is the case for gene expression data, where the equipment and tools currently used frequently produce noisy measurements. Machine Learning algorithms have been successfully used in gene expression data analysis. Although many Machine Learning algorithms can deal with noise, detecting and removing noisy instances from the training data set can help the induction of the target hypothesis. This paper evaluates the use of distance-based pre-processing techniques for noise detection in gene expression data classification problems. This evaluation analyzes the effectiveness of the investigated techniques in removing noisy data, measured by the accuracy obtained by different Machine Learning classifiers on the pre-processed data.
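One distance-based noise-detection scheme in the family evaluated here is the edited nearest-neighbour rule: an instance is flagged as noisy when the majority of its k nearest neighbours carry a different class label. The sketch below is a minimal illustration; the data and parameters are invented, not drawn from the paper.

```python
import numpy as np

def flag_noisy_instances(X, y, k=3):
    """Flag instances whose k nearest neighbours (Euclidean distance)
    mostly disagree with their own label, edited nearest-neighbour style."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    noisy = np.zeros(len(X), dtype=bool)
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf  # exclude the instance itself from its neighbourhood
        neighbours = np.argsort(d)[:k]
        disagree = np.sum(y[neighbours] != y[i])
        noisy[i] = disagree > k / 2
    return noisy

# Two well-separated "expression profiles" with one mislabeled sample
X = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
     [5.0, 5.0], [5.1, 5.0], [5.0, 5.1],
     [0.05, 0.05]]
y = [0, 0, 0, 1, 1, 1, 1]  # the last sample sits in class-0 territory
```

Training a classifier only on the unflagged instances is the pre-processing step whose effect on accuracy the paper measures.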
Abstract:
This study addressed the use of conventional and vegetable-origin polyurethane foams to extract C. I. Acid Orange 61 dye. The quantitative determination of the residual dye was carried out with a UV/Vis absorption spectrophotometer. The extraction of the dye was found to depend on various factors, such as the pH of the solution, the foam cell structure, the contact time, and dye-foam interactions. After 45 days, better results were obtained for the conventional foam than for the vegetable foam. Despite presenting a lower extraction percentage, the vegetable foam is advantageous as it is considered a polymer with biodegradable characteristics.
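Quantifying residual dye by UV/Vis spectrophotometry rests on absorbance being proportional to concentration (the Beer-Lambert law), so an extraction percentage can be computed from an absorbance ratio; a sketch with invented readings:

```python
def extraction_percent(abs_initial, abs_residual):
    """Percentage of dye extracted by the foam, assuming absorbance is
    proportional to dye concentration (Beer-Lambert law, same cell and
    wavelength), so the calibration constant cancels in the ratio."""
    return 100.0 * (abs_initial - abs_residual) / abs_initial
```

For example, an initial absorbance of 0.80 and a residual absorbance of 0.20 correspond to 75% of the dye extracted.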
Abstract:
The purpose of this study was to measure the prevalence of global and leisure-time physical activity and associated factors in the elderly. This was a population-based cross-sectional study covering a multiple-stage sample of 1,950 subjects 60 years or older living in areas of São Paulo State, Brazil. Prevalence of global physical activity (assessed through the short version of the International Physical Activity Questionnaire - IPAQ) was 73.9%, and prevalence of leisure-time physical activity was 28.4%. The results highlight the differences between factors associated with global and leisure-time physical activities. The social groups most prone to overall sedentary lifestyle and especially to lack of leisure-time physical activity should be the main targets of health policies aimed at promoting healthier lifestyles.
Abstract:
This paper describes a new food classification which assigns foodstuffs according to the extent and purpose of the industrial processing applied to them. Three main groups are defined: unprocessed or minimally processed foods (group 1), processed culinary and food industry ingredients (group 2), and ultra-processed food products (group 3). The use of this classification is illustrated by applying it to data collected in the Brazilian Household Budget Survey, which was conducted in 2002/2003 through a probabilistic sample of 48,470 Brazilian households. The average daily food availability was 1,792 kcal/person, with 42.5% from group 1 (mostly rice and beans and meat and milk), 37.5% from group 2 (mostly vegetable oils, sugar, and flours), and 20% from group 3 (mostly breads, biscuits, sweets, soft drinks, and sausages). The share of group 3 foods increased with income, representing almost one third of all calories in higher-income households. The impact of replacing group 1 foods and group 2 ingredients with group 3 products on the overall quality of the diet, eating patterns and health is discussed.
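The availability figures reported above can be reproduced with simple arithmetic; this sketch uses the shares given in the abstract.

```python
# Average daily food availability (kcal/person) split by processing group,
# using the shares reported in the abstract
total_kcal = 1792
shares = {
    "group 1 (unprocessed/minimally processed foods)": 0.425,
    "group 2 (processed culinary and industry ingredients)": 0.375,
    "group 3 (ultra-processed food products)": 0.200,
}

kcal_by_group = {group: total_kcal * share for group, share in shares.items()}
```

Group 3 thus accounts for roughly 358 kcal/person/day on average, with its share rising toward one third of calories in higher-income households.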