859 results for Collision theory model
Abstract:
Using the pure spinor formalism, a quantizable sigma model has been constructed for the superstring in an AdS5 x S5 background with manifest PSU(2,2|4) invariance. The PSU(2,2|4) metric g_{AB} has both vector components g_{ab} and spinor components g_{αβ̂}, and in the limit where the spinor components g_{αβ̂} are taken to infinity, the AdS5 x S5 sigma model reduces to the worldsheet action in a flat background. In this paper, we instead consider the limit where the vector components g_{ab} are taken to infinity. In this limit, the AdS5 x S5 sigma model simplifies to a topological A-model constructed from fermionic N=2 superfields whose bosonic components transform like twistor variables. Just as d=3 Chern-Simons theory can be described by the open string sector of a topological A-model, the open string sector of this topological A-model describes d=4 N=4 super-Yang-Mills. These results might be useful for constructing a worldsheet proof of the Maldacena conjecture analogous to the Gopakumar-Vafa-Ooguri worldsheet proof of Chern-Simons/conifold duality.
Abstract:
This paper presents the application of a hysteretic model for a Magnetorheological Damper (MRD) placed in the plunge degree of freedom of the aeroelastic model of a wing. This hysteretic MRD model was developed by researchers at the French Aerospace Lab (ONERA) and describes, with very good precision, the hysteretic behavior of the MRD. The aeroelastic model used in this paper has no structural nonlinearities; the only nonlinearities present are those in the unsteady flow equations, as proposed by Theodorsen and Wagner in their unsteady aerodynamics theory, and the nonlinearity introduced by the hysteretic model. The main objective of this paper is to present the mathematical modeling of the problem and the equations that describe its aeroelastic response, as well as the gain obtained by introducing this hysteretic model into the equations relative to models that do not exhibit this behavior, illustrated through time responses and phase diagrams. These plots are obtained using flow velocities below and above the flutter velocity. Finally, an open-loop control is applied to show the effect of the MRD on the aeroelastic behavior.
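The abstract does not reproduce the ONERA hysteretic MRD equations, so the following is only a minimal sketch of how a generic hysteretic damper force can be coupled into the plunge equation of a two-degree-of-freedom typical section. It uses a Bouc-Wen hysteresis law and quasi-steady aerodynamics in place of the Theodorsen/Wagner unsteady model; all parameter values and the Bouc-Wen form itself are illustrative assumptions, not the paper's model.

```python
# Illustrative sketch only: plunge-pitch typical section with a Bouc-Wen
# hysteretic damper force acting on plunge. Quasi-steady aerodynamics stands in
# for the Theodorsen/Wagner unsteady model used in the paper; every parameter
# value below is an assumption.
import numpy as np
from scipy.integrate import solve_ivp

m, S, I_a = 10.0, 0.5, 1.0            # mass, static imbalance, pitch inertia (per unit span)
k_h, k_a = 2000.0, 800.0              # plunge and pitch stiffness
rho, b, U = 1.225, 0.25, 10.0         # air density, semichord, flow speed
alpha_bw, beta_bw, gamma_bw, n_bw = 50.0, 2.0, 1.0, 1   # Bouc-Wen hysteresis parameters
c_mr = 30.0                           # viscous part of the MR damper force
Minv = np.linalg.inv(np.array([[m, S], [S, I_a]]))      # structural mass matrix inverse

def rhs(t, y):
    h, a, hdot, adot, z = y                              # plunge, pitch, their rates, hysteretic state
    q = 0.5 * rho * U**2                                 # dynamic pressure
    lift = q * (2 * b) * 2 * np.pi * (a + hdot / U)      # quasi-steady lift
    moment = 0.5 * b * lift                              # crude pitching moment about the elastic axis
    f_mr = c_mr * hdot + alpha_bw * z                    # hysteretic MR damper force on plunge
    zdot = hdot - beta_bw * abs(hdot) * abs(z)**(n_bw - 1) * z - gamma_bw * hdot * abs(z)**n_bw
    hddot, addot = Minv @ np.array([-k_h * h - f_mr - lift, -k_a * a + moment])
    return [hdot, adot, hddot, addot, zdot]

sol = solve_ivp(rhs, (0.0, 5.0), [0.05, 0.0, 0.0, 0.0, 0.0], max_step=1e-3)
print("final plunge displacement:", sol.y[0, -1])
```

Sweeping U below and above an estimated flutter speed in a sketch of this kind is how the time responses and phase diagrams mentioned in the abstract would be produced.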
Abstract:
The Global Workspace Theory (GWT) proposed by Bernard Baars (1988) and Daniel Dennett's (1991) Multiple Drafts Model (MDM) of consciousness are renowned cognitive theories of consciousness that bear both similarities and differences. Although Dennett displays sympathy for GWT, his own MDM does not seem to be fully compatible with it. This work discusses this compatibility by asking whether GWT suffers from Daniel Dennett's criticism of what he calls a “Cartesian Theater”. We identified in Dennett's work 10 requirements for avoiding the Cartesian Theater. We believe that some of these requirements, but not all, are violated by GWT; hence there is partial incompatibility with the MDM, and a simple yes-or-no answer to whether GWT is a Cartesian Theater is not meaningful. However, by asking this question we conclude that the issues around this discussion involve fuzzy claims about degrees of consciousness, and we show how the Neuro-Astroglial Interaction Model (NAIM) is suited to resolving such conceptual issues.
Abstract:
Molecular Dynamics (MD) simulation is one of the most important computational techniques, with broad applications in physics, chemistry, chemical engineering, materials design, and biological science. Traditional computational chemistry refers to quantum calculations based on solving the Schrödinger equation. Density Functional Theory (DFT), developed later and based on solving the Kohn-Sham equations, became the more popular ab initio technique, able to treat ~1000 atoms by explicitly considering electron interactions. In contrast, MD simulation, based on solving the classical mechanical equations of motion, is an entirely different technique within computational chemistry. Electron interactions are implicitly included in empirical atom-based potential functions, and the system size that can be investigated extends to ~10^6 atoms. The thermodynamic properties of model fluids are mainly determined by macroscopic quantities such as temperature, pressure, and density; quantum effects on thermodynamic properties such as the melting point and surface tension are not dominant. In this work, we mainly investigated the melting point and the surface tension (liquid-vapor and liquid-solid) of model fluids, including the Lennard-Jones model, the Stockmayer model, and two water models (TIP4P/Ew and TIP5P/Ew), by means of MD simulation. In addition, some new structures of water confined in carbon nanotubes were discovered, and the transport behaviors of water and ions through nanochannels were also revealed.
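As a concrete illustration of the classical-mechanics approach described above, here is a minimal Lennard-Jones MD sketch (velocity Verlet in reduced units with periodic boundaries). It is not the thesis' actual setup: melting-point and surface-tension calculations, the Stockmayer model, and the water models require far more elaborate simulations; the particle number, box size, and time step below are assumptions.

```python
# Minimal Lennard-Jones MD step: velocity Verlet, reduced units (epsilon = sigma = 1),
# periodic cubic box with minimum-image convention. Illustrative only.
import numpy as np

N, L, dt, steps = 125, 6.0, 0.002, 200            # particles, box length, time step, steps (assumed)
rng = np.random.default_rng(0)
g = np.arange(5) * (L / 5)                        # 5 x 5 x 5 cubic lattice of starting positions
pos = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T + 0.1
vel = rng.normal(0.0, 1.0, (N, 3))                # velocities drawn for reduced T ~ 1

def lj_forces(pos):
    """Pairwise Lennard-Jones forces on every particle."""
    dr = pos[:, None, :] - pos[None, :, :]
    dr -= L * np.round(dr / L)                    # minimum image
    r2 = np.sum(dr**2, axis=-1)
    np.fill_diagonal(r2, np.inf)                  # exclude self-interaction
    inv6 = 1.0 / r2**3
    fmag = 24.0 * (2.0 * inv6**2 - inv6) / r2     # |F|/r for each pair
    return np.sum(fmag[:, :, None] * dr, axis=1)

f = lj_forces(pos)
for _ in range(steps):                            # velocity Verlet integration
    vel += 0.5 * dt * f
    pos = (pos + dt * vel) % L
    f = lj_forces(pos)
    vel += 0.5 * dt * f

print("mean kinetic energy per particle:", 0.5 * np.mean(np.sum(vel**2, axis=1)))
```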
Abstract:
Evaluations of measurement invariance provide essential construct validity evidence. However, the quality of such evidence is partly dependent upon the validity of the resulting statistical conclusions. The presence of Type I or Type II errors can render measurement invariance conclusions meaningless. The purpose of this study was to determine the effects of categorization and censoring on the behavior of the chi-square/likelihood ratio test statistic and two alternative fit indices (CFI and RMSEA) under the context of evaluating measurement invariance. Monte Carlo simulation was used to examine Type I error and power rates for the (a) overall test statistic/fit indices, and (b) change in test statistic/fit indices. Data were generated according to a multiple-group single-factor CFA model across 40 conditions that varied by sample size, strength of item factor loadings, and categorization thresholds. Seven different combinations of model estimators (ML, Yuan-Bentler scaled ML, and WLSMV) and specified measurement scales (continuous, censored, and categorical) were used to analyze each of the simulation conditions. As hypothesized, non-normality increased Type I error rates for the continuous scale of measurement and did not affect error rates for the categorical scale of measurement. Maximum likelihood estimation combined with a categorical scale of measurement resulted in more correct statistical conclusions than the other analysis combinations. For the continuous and censored scales of measurement, the Yuan-Bentler scaled ML resulted in more correct conclusions than normal-theory ML. The censored measurement scale did not offer any advantages over the continuous measurement scale. Comparing across fit statistics and indices, the chi-square-based test statistics were preferred over the alternative fit indices, and ΔRMSEA was preferred over ΔCFI. Results from this study should be used to inform the modeling decisions of applied researchers. However, no single analysis combination can be recommended for all situations. Therefore, it is essential that researchers consider the context and purpose of their analyses.
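A minimal sketch of the data-generation step described above, assuming illustrative loadings, thresholds, and sample size: continuous indicators are generated from a single-factor model and then discretized at fixed thresholds to mimic categorization. Fitting the multiple-group CFA models with ML, Yuan-Bentler scaled ML, or WLSMV estimators is not reproduced here.

```python
# Sketch of the simulation's data-generation step: continuous indicators from a
# single-factor model, then cut at fixed thresholds to produce ordinal responses.
# Loadings, thresholds, and sample size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, loadings = 500, np.array([0.7, 0.7, 0.6, 0.8, 0.5])
thresholds = np.array([-1.0, 0.0, 1.0])             # cut points -> 4 ordered categories

eta = rng.normal(size=(n, 1))                       # latent factor scores
eps = rng.normal(size=(n, loadings.size)) * np.sqrt(1 - loadings**2)
y_cont = eta * loadings + eps                       # continuous indicators (unit variance)
y_cat = np.digitize(y_cont, thresholds)             # categorized (ordinal) responses

print("category proportions, item 1:",
      np.bincount(y_cat[:, 0], minlength=4) / n)
```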
Generalizing the dynamic field theory of spatial cognition across real and developmental time scales
Abstract:
Within cognitive neuroscience, computational models are designed to provide insights into the organization of behavior while adhering to neural principles. These models should provide sufficient specificity to generate novel predictions while maintaining the generality needed to capture behavior across tasks and/or time scales. This paper presents one such model, the Dynamic Field Theory (DFT) of spatial cognition, showing new simulations that provide a demonstration proof that the theory generalizes across developmental changes in performance in four tasks—the Piagetian A-not-B task, a sandbox version of the A-not-B task, a canonical spatial recall task, and a position discrimination task. Model simulations demonstrate that the DFT can accomplish both specificity—generating novel, testable predictions—and generality—spanning multiple tasks across development with a relatively simple developmental hypothesis. Critically, the DFT achieves generality across tasks and time scales with no modification to its basic structure and with a strong commitment to neural principles. The only change necessary to capture development in the model was an increase in the precision of the tuning of receptive fields as well as an increase in the precision of local excitatory interactions among neurons in the model. These small quantitative changes were sufficient to move the model through a set of quantitative and qualitative behavioral changes that span the age range from 8 months to 6 years and into adulthood. We conclude by considering how the DFT is positioned in the literature, the challenges on the horizon for our framework, and how a dynamic field approach can yield new insights into development from a computational cognitive neuroscience perspective.
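A minimal one-dimensional dynamic-field sketch in the spirit of the model described above: a field with local excitation, broader lateral inhibition, and a sigmoidal output nonlinearity, where narrowing the excitatory kernel mimics the developmental sharpening mentioned in the abstract. All kernel widths, the resting level, and the input parameters are assumptions, not the published model's values.

```python
# Minimal 1-D dynamic neural field sketch (Amari-type): local excitation,
# broader inhibition, sigmoidal output. Narrowing `sigma_exc` mimics the
# "sharpening with development" idea; all parameters are illustrative.
import numpy as np

x = np.linspace(-50, 50, 201)                       # spatial dimension (e.g., location)
dx, dt, tau, h = x[1] - x[0], 1.0, 10.0, -5.0       # grid step, time step, time constant, resting level

def kernel(sigma_exc=3.0, sigma_inh=10.0, c_exc=15.0, c_inh=5.0):
    d = x[:, None] - x[None, :]
    gauss = lambda s: np.exp(-d**2 / (2 * s**2)) / (np.sqrt(2 * np.pi) * s)
    return c_exc * gauss(sigma_exc) - c_inh * gauss(sigma_inh)

def simulate(w, steps=300):
    u = np.full_like(x, h)                          # field activation
    stim = 8.0 * np.exp(-(x - 10.0)**2 / (2 * 3.0**2))   # transient target input
    for t in range(steps):
        s = stim if t < 100 else 0.0                # input removed after 100 steps (delay period)
        f = 1.0 / (1.0 + np.exp(-u))                # sigmoidal output
        u = u + dt / tau * (-u + h + s + w @ f * dx)
    return u

u_broad = simulate(kernel(sigma_exc=6.0))           # broad tuning ("younger" field)
u_sharp = simulate(kernel(sigma_exc=3.0))           # sharper tuning ("older" field)
print("peak location (broad vs sharp tuning):",
      x[np.argmax(u_broad)], x[np.argmax(u_sharp)])
```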
Abstract:
Background & aims: The boundaries between the categories of body composition provided by vectorial analysis of bioimpedance are not well defined. In this paper, fuzzy set theory was used to model such uncertainty. Methods: An Italian database with 179 cases aged 18-70 years was divided randomly into a development sample (n = 20) and a testing sample (n = 159). Of the 159 registries in the testing sample, 99 contributed an unequivocal diagnosis. Resistance/height and reactance/height were the input variables of the model. The output variables were the seven categories of body composition of vectorial analysis. For each case, the linguistic model estimated the membership degree of each impedance category. Kappa statistics were used to compare these results with the previously established diagnoses. This required singling out one among the output set of seven categories of membership degrees. This procedure (defuzzification rule) established that the category with the highest membership degree should be taken as the most likely category for the case. Results: The fuzzy model showed a good fit to the development sample. Excellent agreement was achieved between the defuzzified impedance diagnoses and the clinical diagnoses in the testing sample (Kappa = 0.85, p < 0.001). Conclusions: The fuzzy linguistic model was found to be in good agreement with the clinical diagnoses. If the whole model output is considered, information on the extent to which each BIVA category is present better advises clinical practice, with an enlarged nosological framework and diverse therapeutic strategies. (C) 2012 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
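A minimal sketch of the defuzzification rule described above (assign the BIVA category with the highest membership degree), using invented triangular membership functions over resistance/height and reactance/height; the paper's actual membership functions and category definitions are not reproduced.

```python
# Sketch of the described defuzzification rule: compute a membership degree for
# each body-composition category and pick the category with the highest degree.
# The triangular membership functions and category prototypes below are invented
# for illustration and are NOT the paper's model.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical category prototypes in (resistance/height, reactance/height) space
categories = {
    "obese":      (300.0, 30.0),
    "athletic":   (350.0, 50.0),
    "lean":       (500.0, 45.0),
    "cachectic":  (550.0, 25.0),
    "dehydrated": (450.0, 55.0),
    "edematous":  (350.0, 20.0),
    "normal":     (420.0, 40.0),
}

def memberships(r_h, xc_h, spread_r=120.0, spread_x=15.0):
    return {name: min(tri(r_h, r0 - spread_r, r0, r0 + spread_r),
                      tri(xc_h, x0 - spread_x, x0, x0 + spread_x))
            for name, (r0, x0) in categories.items()}

mu = memberships(r_h=430.0, xc_h=38.0)
print(sorted(mu.items(), key=lambda kv: -kv[1])[:3])   # whole-model output (top degrees)
print("defuzzified category:", max(mu, key=mu.get))    # highest membership degree wins
```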
Abstract:
We consider general d-dimensional lattice ferromagnetic spin systems with nearest-neighbor interactions in the high-temperature region (β << 1). Each model is characterized by a single-site a priori spin distribution taken to be even. We also take the parameter α = ⟨S⁴⟩ − 3⟨S²⟩² > 0, i.e. we work in the region which we call Gaussian subjugation, where ⟨S^k⟩ denotes the kth moment of the a priori distribution. Associated with the model is a lattice quantum field theory known to contain a particle of asymptotic mass −ln β and a bound state below the two-particle threshold. We develop a β-analytic perturbation theory for the binding energy of this bound state. As a key ingredient in obtaining our result, we show that the Fourier transform of the two-point function is a meromorphic function, with a simple pole, in a suitable complex spectral parameter, and that the coefficients of its Laurent expansion are analytic in β.
Abstract:
This work addresses the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multivariable stable systems with multiple time delays is considered. The usual approach to this kind of problem is to include a non-linear cost constraint in the control problem; the control action is then obtained at each sampling time as the solution to a non-linear programming (NLP) problem, which for high-order systems can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality (LMI) problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with the conventional robust MPC and tested through the simulation of a reactor system from the process industry.
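The abstract does not state the LMI formulation itself, so the sketch below only illustrates the LMI machinery involved: a standard robust (quadratic) stability check for a polytopic uncertain discrete-time model, posed as a semidefinite feasibility problem in cvxpy. The vertex models are invented, and this is not the paper's zone-control MPC formulation.

```python
# Generic illustration of LMI-based robustness analysis (NOT the paper's
# zone-control MPC formulation): find a common Lyapunov matrix P for all
# vertices A_i of a polytopic uncertain discrete-time system, i.e.
#   P > 0  and  A_i' P A_i - P < 0  for every vertex.
import numpy as np
import cvxpy as cp

# Two vertex models of the uncertain plant (values are assumptions)
A1 = np.array([[0.9, 0.2], [0.0, 0.7]])
A2 = np.array([[0.8, 0.3], [-0.1, 0.6]])

n = 2
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n)]
for A in (A1, A2):
    constraints.append(A.T @ P @ A - P << -eps * np.eye(n))

prob = cp.Problem(cp.Minimize(0), constraints)      # pure feasibility problem
prob.solve(solver=cp.SCS)
print("status:", prob.status)
print("common Lyapunov matrix P:\n", P.value)
```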
Abstract:
We review the status of integrable models from the point of view of their dynamics and integrability conditions. A few integrable models are discussed in detail. We comment on the use made of them in string theory. We also discuss the SO(6) symmetric Hamiltonian with SO(6) boundary. This work was especially prepared for the 70th anniversaries of André Swieca (in memoriam) and Roland Koberle.
Abstract:
Reasoning and change over inconsistent knowledge bases (KBs) is of utmost relevance in areas like medicine and law. Argumentation may bring the possibility to cope with both problems. Firstly, by constructing an argumentation framework (AF) from the inconsistent KB, we can decide whether to accept or reject a certain claim through the interplay among arguments and counterarguments. Secondly, by handling the dynamics of arguments in the AF, we can deal with the dynamics of knowledge of the underlying inconsistent KB. The dynamics of arguments has recently attracted attention and, although some approaches have been proposed, a full axiomatization within the theory of belief revision was still missing. A revision arises when we want the argumentation semantics to accept an argument. Argument Theory Change (ATC) encloses the revision operators that modify the AF by analyzing dialectical trees (arguments as nodes and attacks as edges) as the adopted argumentation semantics. In this article, we present a simple approach to ATC based on propositional KBs. This allows change of inconsistent KBs to be managed by relying upon classical belief revision, although, contrary to it, consistency restoration of the KB is avoided. Subsequently, a set of rationality postulates adapted to argumentation is given, and finally the proposed model of change is related to the postulates through the corresponding representation theorem. Though we focus on propositional logic, the results can be easily extended to more expressive formalisms such as first-order logic and description logics, to handle evolution of ontologies.
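As a small illustration of the accept/reject interplay between arguments and counterarguments mentioned above, the sketch below computes the grounded extension of a tiny abstract argumentation framework by iterating the characteristic function; the arguments and attack relation are invented, and ATC's dialectical-tree machinery is not implemented.

```python
# Tiny illustration of acceptance in an abstract argumentation framework (AF):
# compute the grounded extension by iterating the characteristic function.
# The arguments and attacks below are invented; this does not implement ATC.
ARGS = {"a", "b", "c"}
ATTACKS = {("b", "a"), ("c", "b")}                   # (attacker, target)

def defends(s, arg):
    """True if set `s` attacks every attacker of `arg`."""
    attackers = {x for (x, y) in ATTACKS if y == arg}
    return all(any((z, x) in ATTACKS for z in s) for x in attackers)

def grounded_extension():
    s = set()
    while True:
        new = {a for a in ARGS if defends(s, a)}     # characteristic function F(s)
        if new == s:
            return s
        s = new

print("grounded extension:", sorted(grounded_extension()))
# 'c' is unattacked, so it is accepted; 'c' defeats 'b', which reinstates 'a'.
```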
Abstract:
Two versions of the threshold contact process, ordinary and conservative, are studied on a square lattice. In the first, particles are created on active sites, those having at least two nearest-neighbor sites occupied, and are annihilated spontaneously. In the conservative version, a particle jumps from its site to an active site. Mean-field analysis suggests the existence of a first-order phase transition, which is confirmed by Monte Carlo simulations. In the thermodynamic limit, the two versions are found to give the same results. (C) 2012 Elsevier B.V. All rights reserved.
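A minimal Monte Carlo sketch of the ordinary version as described above: an empty site with at least two occupied nearest neighbors is active and can receive a particle, while occupied sites are emptied spontaneously. Lattice size, rates, and the random-sequential update scheme are illustrative choices, not the paper's simulation parameters.

```python
# Minimal Monte Carlo sketch of the *ordinary* threshold contact process on a
# periodic square lattice. Rates, lattice size and update scheme are assumed.
import numpy as np

L_side, lam, sweeps = 32, 2.0, 200                  # lattice size, creation rate, MC sweeps
rng = np.random.default_rng(2)
s = rng.integers(0, 2, size=(L_side, L_side))       # 1 = occupied, 0 = empty

def occupied_neighbours(s, i, j):
    return (s[(i + 1) % L_side, j] + s[(i - 1) % L_side, j]
            + s[i, (j + 1) % L_side] + s[i, (j - 1) % L_side])

p_create, p_annihilate = lam / (1 + lam), 1 / (1 + lam)
for _ in range(sweeps * L_side * L_side):           # random sequential updates
    i, j = rng.integers(0, L_side, size=2)
    if s[i, j] == 1:
        if rng.random() < p_annihilate:
            s[i, j] = 0                             # spontaneous annihilation
    elif occupied_neighbours(s, i, j) >= 2:         # active site: >= 2 occupied neighbours
        if rng.random() < p_create:
            s[i, j] = 1                             # creation on an active site

print("stationary density estimate:", s.mean())
```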
Abstract:
The main goal of this article is to consider influence assessment in models with error-prone observations and with variances of the measurement errors that change across observations. The techniques enable potential influential elements to be identified and the effects of perturbations in these elements on results of interest to be quantified. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease.