855 results for Theory Of Cognitive Fit
Abstract:
(from the journal abstract) Background: Despite the effectiveness of antipsychotic pharmacotherapy, residual hallucinations and delusions do not completely resolve in some medicated patients. Adjunctive cognitive behavioral therapy (CBT) seems to improve the management of positive symptoms. Despite promising results, the efficacy of CBT is still unclear. The present study addresses this issue, taking into account a number of newly published controlled studies. Method: Fourteen studies including 1484 patients, published between 1990 and 2004, were identified and a meta-analysis of their results performed. Results: Compared with other adjunctive measures, CBT showed a significant reduction in positive symptoms, and the benefit of CBT was greater for patients suffering an acute psychotic episode than for those with the chronic condition (effect sizes of 0.57 vs. 0.27). Discussion: CBT is a promising adjunctive treatment for positive symptoms in schizophrenia spectrum disorders. However, a number of potentially modifying variables have not yet been examined, such as therapeutic alliance and neuropsychological deficits. (PsycINFO Database Record (c) 2005 APA, all rights reserved)
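The abstract reports pooled effect sizes (0.57 vs. 0.27) but not the pooling procedure. A minimal sketch of inverse-variance (fixed-effect) pooling of per-study effect sizes, using hypothetical placeholder values rather than the fourteen studies actually analysed:

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling of effect sizes.
# The study values below are hypothetical placeholders, not the 14 studies
# summarised in the abstract.

effects   = [0.60, 0.45, 0.25, 0.70]   # per-study effect sizes (e.g., Cohen's d)
variances = [0.04, 0.06, 0.05, 0.09]   # per-study sampling variances

weights = [1.0 / v for v in variances]            # inverse-variance weights
pooled  = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se      = (1.0 / sum(weights)) ** 0.5             # standard error of the pooled estimate

print(f"pooled effect size = {pooled:.2f} (SE = {se:.2f})")
```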
Abstract:
A mathematical model that describes the behavior of low-resolution Fresnel lenses encoded in any low-resolution device (e.g., a spatial light modulator) is developed. The effects of low-resolution codification, such as the appearance of new secondary lenses, are studied for a general case. General expressions for the phase of these lenses are developed, showing that each lens behaves as if it were encoded through all pixels of the low-resolution device. Simple expressions for the light distribution in the focal plane and its dependence on the encoded focal length are developed and commented on in detail. For a given codification device, an optimum focal length is found for best lens performance. An optimization method for the codification of a single lens with a short focal length is proposed.
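As an illustration of the kind of object studied here, the sketch below samples the standard thin-lens Fresnel phase profile on a pixelated device. It is not the specific model derived in the paper; wavelength, focal length, and pixel pitch are arbitrary illustrative values.

```python
# Minimal sketch: phase of an ideal Fresnel lens sampled on a pixelated device
# (e.g., a spatial light modulator). Standard thin-lens phase profile only;
# the parameter values are illustrative assumptions.
import numpy as np

wavelength = 633e-9      # m (He-Ne line, illustrative)
focal_len  = 0.5         # m, encoded focal length
pitch      = 20e-6       # m, pixel pitch of the device
n_pixels   = 256         # pixels per side

# Pixel-centre coordinates of the device
coords = (np.arange(n_pixels) - n_pixels / 2 + 0.5) * pitch
x, y = np.meshgrid(coords, coords)
r2 = x**2 + y**2

# Ideal lens phase, wrapped to [0, 2*pi): phi(r) = -pi * r^2 / (lambda * f)
phase = np.mod(-np.pi * r2 / (wavelength * focal_len), 2 * np.pi)

# Piecewise-constant (pixelated) encoding: one phase value per pixel
encoded = np.exp(1j * phase)    # complex transmittance of the encoded lens
```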
Abstract:
A mathematical model describing the behavior of low-resolution Fresnel encoded lenses (LRFELs) encoded in any low-resolution device (e.g., a spatial light modulator) has recently been developed. From this model, an LRFEL with a short focal length was optimized by imposing maximum light intensity on the optical axis. Using this model, analytical expressions for the light-amplitude distribution, the diffraction efficiency, and the frequency response of the optimized LRFELs are derived.
Abstract:
In this contribution we show that a suitably defined nonequilibrium entropy of an N-body isolated system is not, in general, a constant of the motion, and that its variation is bounded, with the bounds determined by the thermodynamic entropy, i.e., the equilibrium entropy. We define the nonequilibrium entropy as a convex functional of the set of n-particle reduced distribution functions (n ≤ N), generalizing the Gibbs fine-grained entropy formula. Additionally, as a consequence of our microscopic analysis, we find that this nonequilibrium entropy behaves as a free entropic oscillator. In the approach to the equilibrium regime, we find relaxation equations of the Fokker-Planck type, in particular for the one-particle distribution function.
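For reference, the Gibbs fine-grained entropy that the proposed functional generalizes is shown below; the abstract does not give the exact convex combination over reduced distributions, so the second expression is only a schematic form consistent with the description.

```latex
% Gibbs fine-grained entropy of the full N-body distribution f_N:
S_G[f_N] = -k_B \int f_N(x_1,\dots,x_N,t)\,\ln f_N(x_1,\dots,x_N,t)\,
           \mathrm{d}x_1 \cdots \mathrm{d}x_N
% Schematic generalization as a convex functional of the n-particle reduced
% distributions f_n (n <= N), as described qualitatively in the abstract:
S[\{f_n\}] = -k_B \sum_{n=1}^{N} c_n \int f_n \ln f_n \,
             \mathrm{d}x_1 \cdots \mathrm{d}x_n , \qquad c_n \ge 0 .
```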
Abstract:
Background: Atypical antipsychotics provide better control of the negative and affective symptoms of schizophrenia than conventional neuroleptics; nevertheless, whether they are also better at improving cognitive dysfunction remains a matter of debate. This study aimed to examine the changes in cognition associated with long-term antipsychotic treatment and to evaluate the effect of the type of antipsychotic (conventional versus novel antipsychotic drugs) on cognitive performance over time. Methods: In this naturalistic study, we used a comprehensive neuropsychological test battery to assess a sample of schizophrenia patients taking either conventional (n = 13) or novel antipsychotics (n = 26) at baseline and two years later. Results: Continuous antipsychotic treatment, regardless of class, was associated with improvement in verbal fluency, executive functions, and visual and verbal memory. Patients taking atypical antipsychotics did not show greater cognitive enhancement over the two years than patients taking conventional antipsychotics. Conclusions: Although long-term antipsychotic treatment slightly improved cognitive function, the switch from conventional to atypical antipsychotic treatment should not be based exclusively on the presence of these cognitive deficits.
Abstract:
Exploratory and confirmatory factor analyses reported in the French technical manual of the WISC-IV provide evidence supporting a structure with four indices: Verbal Comprehension (VCI), Perceptual Reasoning (PRI), Working Memory (WMI), and Processing Speed (PSI). Although the WISC-IV is more attuned to contemporary theory, it is still not in full accordance with the dominant theory of cognitive ability, the Cattell-Horn-Carroll (CHC) theory. This study was designed to determine whether the French WISC-IV is better described by the four-factor solution or whether an alternative model based on CHC theory is more appropriate. The intercorrelation matrix reported in the French technical manual was submitted to confirmatory factor analysis. A comparison of competing models suggests that a model based on CHC theory fits the data better than the current WISC-IV structure. It appears that the French WISC-IV in fact measures six factors: crystallized intelligence (Gc), fluid intelligence (Gf), short-term memory (Gsm), processing speed (Gs), quantitative knowledge (Gq), and visual processing (Gv). We recommend that clinicians interpret the subtests of the French WISC-IV in relation to this CHC model in addition to the four indices.
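A rough sketch of this kind of model comparison, assuming the Python semopy package for confirmatory factor analysis. The subtest-to-factor assignments and the data frame `wisc_df` are placeholders, not the exact models tested by the authors.

```python
# Sketch of a CFA model comparison (four-index vs. CHC-based), assuming the
# semopy package. Factor assignments below are illustrative only.
import semopy

four_factor = """
VCI =~ Similarities + Vocabulary + Comprehension
PRI =~ BlockDesign + PictureConcepts + MatrixReasoning
WMI =~ DigitSpan + LetterNumberSequencing + Arithmetic
PSI =~ Coding + SymbolSearch
"""

chc_model = """
Gc  =~ Similarities + Vocabulary + Comprehension
Gf  =~ PictureConcepts + MatrixReasoning
Gv  =~ BlockDesign + PictureCompletion
Gsm =~ DigitSpan + LetterNumberSequencing
Gq  =~ Arithmetic
Gs  =~ Coding + SymbolSearch
"""
# Note: a single-indicator factor such as Gq above would need its loading or
# error variance fixed in a real analysis.

def fit_and_report(desc, data):
    model = semopy.Model(desc)
    model.fit(data)                    # data: pandas DataFrame of subtest scores
    return semopy.calc_stats(model)    # fit indices (CFI, RMSEA, AIC, BIC, ...)

# stats_4f  = fit_and_report(four_factor, wisc_df)
# stats_chc = fit_and_report(chc_model, wisc_df)
```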
Abstract:
In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose the Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine a critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are distinct frameworks that are often misunderstood and conflated, leading researchers to improper conclusions. Perhaps the most common misconception is to interpret the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the observed difference, or one more extreme, given that the null is true. Another concern is the risk that a substantial proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
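A minimal illustration of the two frameworks on simulated data, assuming scipy: the p value is reported as a continuous measure of evidence (Fisher), and a binary decision is then made at a pre-chosen Type I error level (Neyman-Pearson).

```python
# Minimal illustration of the two frameworks described above.
# Data are simulated purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(loc=0.0, scale=1.0, size=30)   # null-like group
group_b = rng.normal(loc=0.5, scale=1.0, size=30)   # group with a true effect

t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Fisher: report the p value, i.e., the probability under H0 of a difference
# at least as extreme as the one observed.
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Neyman-Pearson: fix alpha (Type I error) in advance and make a binary decision.
alpha = 0.05
decision = "reject H0" if p_value < alpha else "do not reject H0"
print(f"at alpha = {alpha}: {decision}")
```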
Abstract:
This paper evaluates the reception of Léon Walras' ideas in Russia before 1920. Despite an unfavourable institutional context, Walras was read by Russian economists. On the one hand, Bortkiewicz and Winiarski, who lived outside Russia and had the opportunity to meet and correspond with Walras, were first-class readers and very good ambassadors for Walras' ideas; on the other hand, the economists living in Russia were more selective in their reading. They restricted themselves to Walras' Elements of Pure Economics, in particular its theory of exchange, while ignoring its theory of production. We introduce a cultural argument to explain this selective reading. JEL classification numbers: B13, B19.
Abstract:
Although cross-sectional diffusion tensor imaging (DTI) studies have revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among the MCI subjects, 40 were stable (sMCI; 9 single-domain amnestic, 7 single-domain frontal, 24 multiple-domain) and 27 were progressive (pMCI; 7 single-domain amnestic, 4 single-domain frontal, 16 multiple-domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistical analyses included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC than in MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed individual classification with accuracies of up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When the MCI subgroups were considered separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5%, in the multiple-domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for the early individual detection of MCI subjects evolving towards dementia.
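A generic sketch of SVM-based individual classification of this kind, assuming scikit-learn. The abstract does not specify the kernel, feature selection, or validation scheme used by the authors; the arrays below are random placeholders standing in for skeletonised DTI features and group labels.

```python
# Generic sketch: cross-validated SVM classification of sMCI vs. pMCI from
# DTI-derived features. Placeholder data only; not the authors' pipeline.
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(67, 500))       # 67 subjects x 500 voxel/tract features (placeholder)
y = np.array([0] * 40 + [1] * 27)    # 40 sMCI, 27 pMCI (group sizes from the abstract)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")

print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```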
Abstract:
Interest in cognitive pretest methods for evaluating survey questionnaires has been increasing over the last three decades. However, analysing the features of the scientific output in the field can be difficult because much of it is produced in public and private institutes whose main mission is not scientific research. The aim of this research is to characterize the current state of scientific output in the field by means of two bibliometric studies covering the period from 1980 to 2007. Study 1 analysed documents obtained from the more commonly used bibliographic databases. Study 2 supplemented the body of documents from Study 1 with documents from non-indexed journals, conference papers, etc. Results show constant growth in the number of publications. The wide dispersion of publication sources, together with the prominent role of public and private institutions as centres of production, can also be identified as relevant characteristics of the scientific output in this field.
Abstract:
A new model for decision making under risk that considers subjective and objective information in the same formulation is presented. The uncertain probabilistic weighted average (UPWA) is also introduced. Its main advantage is that it unifies the probability and the weighted average in the same formulation while considering the degree of importance that each case has in the analysis. Moreover, it is able to deal with uncertain environments represented in the form of interval numbers. We study some of its main properties and particular cases. The applicability of the UPWA is also studied; it is very broad because all previous studies that use the probability or the weighted average can be revisited with this new approach. Focus is placed on a multi-person decision making problem regarding the selection of strategies using the theory of expertons.
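A small sketch in the spirit of such an operator: probabilistic weights and importance weights are mixed through a convex combination and applied to interval-valued arguments. The abstract does not give the exact UPWA definition, so the mixing parameter beta and the example values below are assumptions of this sketch.

```python
# Sketch of a probabilistic weighted average over interval arguments,
# illustrating the idea of unifying probability and weighted average.

def upwa(intervals, p, w, beta):
    """intervals: list of (lo, hi) pairs; p, w: non-negative weight vectors
    summing to 1; beta in [0, 1] balances probability against the weighted average."""
    combined = [beta * pi + (1 - beta) * wi for pi, wi in zip(p, w)]
    lo = sum(c * a for c, (a, _) in zip(combined, intervals))
    hi = sum(c * b for c, (_, b) in zip(combined, intervals))
    return (lo, hi)

# Hypothetical payoffs of one strategy under four states of nature,
# expressed as interval numbers:
payoffs = [(40, 60), (10, 30), (70, 90), (20, 40)]
p = [0.3, 0.3, 0.2, 0.2]       # objective probabilities of the states
w = [0.4, 0.1, 0.4, 0.1]       # subjective importance weights
print(upwa(payoffs, p, w, beta=0.6))   # aggregated interval value
```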
Abstract:
Connectivity among demes in a metapopulation depends on the properties of both the landscape and the focal organism (including its mobility and cognitive abilities). Using individual-based simulations, we contrast the consequences of three different cognitive strategies for several measures of metapopulation connectivity. Model animals search for suitable habitat patches while dispersing through a model landscape made of cells varying in size, shape, attractiveness and friction. In the blind strategy, the next cell is chosen randomly among the adjacent ones. In the near-sighted strategy, the choice depends on the relative attractiveness of these adjacent cells. In the far-sighted strategy, animals may additionally target suitable patches that appear within their perceptual range. Simulations show that the blind strategy provides the best overall connectivity and results in balanced dispersal. The near-sighted strategy traps animals in corridors that reduce the number of potential targets, thereby fragmenting metapopulations into several local clusters of demes and inducing source-sink dynamics. This sort of local trapping is somewhat prevented in the far-sighted strategy. The colonization success of the strategies depends strongly on initial energy reserves: blind does best when energy is high, near-sighted wins at intermediate levels, and far-sighted outcompetes its rivals at low energy reserves. We also expect strong effects on metapopulation genetics: the blind strategy generates a migrant-pool mode of dispersal that should erase local structure. By contrast, the near- and far-sighted strategies generate a propagule-pool mode of dispersal and source-sink behavior that should boost structure (high genetic variance among, and low variance within, local clusters of demes), particularly if metapopulation dynamics is also affected by extinction-colonization processes. Our results thus point to important effects of the cognitive ability of dispersers on the connectivity, dynamics and genetics of metapopulations.
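An illustrative reduction of the three next-step rules contrasted above, not the authors' implementation: `grid` is assumed to be a dict mapping (x, y) cells to a positive attractiveness value and a suitability flag, and the 4-cell neighbourhood and perceptual range are assumptions of the sketch.

```python
# Minimal sketch of the blind, near-sighted, and far-sighted step rules for a
# dispersing walker on a grid of cells. Illustrative only.
import random

def neighbours(pos, grid):
    x, y = pos
    cand = [(x + dx, y + dy) for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]]
    return [c for c in cand if c in grid]

def blind_step(pos, grid):
    # Blind: choose uniformly at random among adjacent cells.
    return random.choice(neighbours(pos, grid))

def near_sighted_step(pos, grid):
    # Near-sighted: choose adjacent cells with probability proportional to
    # their attractiveness (assumed positive).
    nbrs = neighbours(pos, grid)
    weights = [grid[c]["attractiveness"] for c in nbrs]
    return random.choices(nbrs, weights=weights, k=1)[0]

def far_sighted_step(pos, grid, perceptual_range=3):
    # Far-sighted: if a suitable patch lies within the perceptual range, move
    # one step towards the closest one; otherwise behave near-sightedly.
    x, y = pos
    visible = [c for c in grid
               if c != pos and grid[c]["suitable"]
               and abs(c[0] - x) + abs(c[1] - y) <= perceptual_range]
    if not visible:
        return near_sighted_step(pos, grid)
    tx, ty = min(visible, key=lambda c: abs(c[0] - x) + abs(c[1] - y))
    if tx != x:
        step = (x + (1 if tx > x else -1), y)
    else:
        step = (x, y + (1 if ty > y else -1))
    return step if step in grid else near_sighted_step(pos, grid)
```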