22 results for Extended Hubbard model
Abstract:
The problem of strongly correlated electrons in one dimension has attracted the attention of condensed matter physicists since the early 1950s. After the seminal paper of Tomonaga [1], who suggested the first soluble model in 1950, there were essential achievements reflected in the papers by Luttinger [2] (1963) and Mattis and Lieb [3] (1963). A considerable contribution to the understanding of the generic properties of the 1D electron liquid was made by Dzyaloshinskii and Larkin [4] (1973) and Efetov and Larkin [5] (1976). Although the main features of the 1D electron liquid had been captured and described by the end of the 1970s, investigators remained dissatisfied with the rigour of the theoretical description. The most famous example is the paper by Haldane [6] (1981), in which the author developed the fundamentals of the modern bosonisation technique known as the operator approach. This paper became famous because the author rigorously showed how to construct the Fermi creation/annihilation operators out of the Bose ones. The most recent example of such dissatisfaction is the review by von Delft and Schoeller [7] (1998), who revised the approach to bosonisation and came up with what they called constructive bosonisation.
Abstract:
A generalized Drucker–Prager (GD–P) viscoplastic yield surface model was developed and validated for asphalt concrete. The GD–P model was formulated on fabric-tensor-modified stresses to account for the material's inherent anisotropy. A smooth and convex octahedral yield surface function was developed in the GD–P model to characterize the full range of internal friction angles from 0° to 90°, whereas the existing Extended Drucker–Prager (ED–P) model was shown to be applicable only to materials with an internal friction angle of less than 22°. Laboratory tests were performed to evaluate the anisotropic effect and to validate the GD–P model. Results indicated that (1) the yield stresses of an isotropic yield surface model are greater in compression and smaller in extension than those of an anisotropic model, which can result in an under-prediction of the viscoplastic deformation; and (2) the yield stresses predicted by the GD–P model matched the experimental results of the octahedral shear strength tests well at different normal and confining stresses. By contrast, the ED–P model over-predicted the octahedral yield stresses, which can lead to an under-prediction of the permanent deformation. In summary, the rutting depth of an asphalt pavement would be underestimated without considering the anisotropy and convexity of the yield surface for asphalt concrete. The proposed GD–P model was demonstrated to be capable of overcoming these limitations of the existing yield surface models for asphalt concrete.
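For orientation, the classical Drucker–Prager yield criterion that both the ED–P and GD–P formulations extend can be written in terms of stress invariants; the fabric-tensor modification and the octahedral shape function of the GD–P model are not given in the abstract, so the expression below is only the standard form, stated as background:

```latex
% Classical Drucker–Prager yield function (background only; not the GD–P form).
% I_1: first stress invariant; J_2: second deviatoric stress invariant;
% \alpha, \kappa: material constants related to friction angle and cohesion.
f(\boldsymbol{\sigma}) = \sqrt{J_2} - \alpha\, I_1 - \kappa,
\qquad f(\boldsymbol{\sigma}) \ge 0 \ \text{at yield}.
```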
Abstract:
We use the GN-model to assess the Nyquist-WDM 100/200 Gbit/s PM-QPSK/16QAM signal reach on low-loss, large-core-area fibre using extended-range, variable-gain hybrid Raman-EDFAs. 5000/1500 km transmission is possible over a wide range of amplifier spans. © OSA 2014.
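As a rough illustration of how a GN-model reach assessment of this kind proceeds, the sketch below balances amplifier ASE noise against the cubic nonlinear-interference term per span and counts the spans that still meet a required SNR; all parameter values (span loss, noise figure, NLI coefficient, required SNR) are illustrative assumptions, not the figures used in the paper:

```python
import math

# Rough GN-model reach sketch: per span, ASE noise adds linearly and the
# nonlinear interference (NLI) grows as the cube of the per-channel launch
# power. All values below are illustrative assumptions, not the paper's.
h = 6.626e-34            # Planck constant [J s]
nu = 193.4e12            # optical carrier frequency [Hz]
b_ref = 32e9             # reference/noise bandwidth [Hz], ~the symbol rate
span_loss_db = 16.0      # per-span loss [dB] (assumed)
nf_db = 5.0              # amplifier noise figure [dB] (assumed)
eta_nli = 1.5e3          # NLI coefficient per span [1/W^2] (assumed)
snr_req_db = 8.5         # SNR needed by the assumed format/FEC (assumed)

gain = 10 ** (span_loss_db / 10)          # amplifier gain offsets span loss
nf = 10 ** (nf_db / 10)
p_ase = (gain - 1) * h * nu * nf * b_ref  # ASE power per span in b_ref

# Optimum launch power: where d(SNR)/dP = 0, i.e. NLI power = ASE power / 2.
p_opt = (p_ase / (2 * eta_nli)) ** (1 / 3)

def snr_after(n_spans, p_ch):
    """SNR with incoherent (span-by-span) accumulation of ASE and NLI."""
    return p_ch / (n_spans * (p_ase + eta_nli * p_ch ** 3))

snr_req = 10 ** (snr_req_db / 10)
n_spans = 1
while snr_after(n_spans + 1, p_opt) >= snr_req:
    n_spans += 1
print(f"optimum launch power: {10 * math.log10(p_opt * 1e3):.1f} dBm")
print(f"estimated reach: {n_spans} spans")
```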
Abstract:
Previous work has demonstrated that planning behaviours may be more adaptive than avoidance strategies in driving self-regulation, but ways of encouraging planning have not been investigated. The efficacy of an extended theory of planned behaviour (TPB) plus implementation-intention-based intervention in promoting planning self-regulation in drivers across the lifespan was tested. An age-stratified group of participants (N=81, aged 18-83 years) was randomly assigned to an experimental or control condition. The intervention prompted specific goal setting with action planning and barrier identification. Goal setting was carried out using an agreed behavioural contract. Baseline and follow-up measures of TPB variables, self-reported driving self-regulation behaviours (avoidance and planning) and mobility goal achievements were collected using postal questionnaires. As in many previous efforts to change planned behaviour by changing its predictors using models such as the TPB, the intervention did not significantly change any of the model components. However, more than 90% of participants achieved their primary driving goal, and self-regulation planning as measured on a self-regulation inventory was marginally improved. The study demonstrates the role of pre-decisional, or motivational, components as contrasted with post-decisional goal enactment, and offers promise for the role of self-regulation planning and implementation intentions in assisting drivers to achieve their mobility goals and in promoting safer driving across the lifespan, even in the context of unchanging beliefs such as perceived risk or driver anxiety.
Abstract:
The cell:cell bond between an immune cell and an antigen-presenting cell is a necessary event in the activation of the adaptive immune response. At the junction between the cells, cell surface molecules on the opposing cells form non-covalent bonds, and a distinct patterning is observed that is termed the immunological synapse. An important binding molecule in the synapse is the T-cell receptor (TCR), which is responsible for antigen recognition through its binding with a major histocompatibility complex with bound peptide (pMHC). This bond leads to intracellular signalling events that culminate in the activation of the T-cell and ultimately to the expression of the immune effector function. The temporal analysis of the TCR bonds during the formation of the immunological synapse presents a problem to biologists, because the spatio-temporal scales (nanometers and picoseconds) are comparable to experimental uncertainty limits. In this study, a linear stochastic model, derived from a nonlinear model of the synapse, is used to analyse the temporal dynamics of the bond attachments for the TCR. Mathematical analysis and numerical methods are employed to analyse the qualitative dynamics of the nonequilibrium membrane dynamics, with the specific aim of calculating the average persistence time for the TCR:pMHC bond. A single-threshold method, which has previously been used successfully to calculate the TCR:pMHC contact path sizes in the synapse, is applied to produce results for the average contact times of the TCR:pMHC bonds. This method is extended through the development of a two-threshold method, which produces results suggesting that the average time persistence for the TCR:pMHC bond is on the order of 2-4 seconds, values that agree with experimental evidence for TCR signalling. The study reveals two distinct scaling regimes in the time-persistence survival probability density profile of these bonds, one dominated by thermal fluctuations and the other associated with TCR signalling. Analysis of the thermal fluctuation regime reveals a minimal contribution to the average time-persistence calculation, which has an important biological implication when comparing the probabilistic models to experimental evidence: in cases where only a few statistics can be gathered from experimental conditions, the results are unlikely to match the probabilistic predictions. The results also identify a rescaling relationship between the thermal noise and the bond length, suggesting that recalibrating the experimental conditions to adhere to this scaling relationship will enable biologists to identify the start of the signalling regime for previously unobserved receptor:ligand bonds. The regime associated with TCR signalling also exhibits a universal decay rate for the persistence probability that is independent of the bond length.
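To make the single-threshold idea concrete, the sketch below measures how long a simulated membrane-separation trace stays below a bond-length threshold and reports the resulting contact (persistence) times; the Ornstein-Uhlenbeck process is only a generic stand-in for the linearised membrane dynamics, and every parameter value is an illustrative assumption rather than one taken from the study:

```python
import numpy as np

# Single-threshold persistence sketch on a simulated membrane-separation
# trace. The Ornstein-Uhlenbeck process is a generic stand-in for the
# linearised membrane dynamics; all parameter values are assumptions.
rng = np.random.default_rng(0)
dt, n_steps = 1e-4, 200_000     # time step [s] and trace length (assumed)
tau, sigma = 0.5, 5.0           # relaxation time [s], fluctuation size [nm] (assumed)
mean_sep = 15.0                 # resting membrane separation [nm] (assumed)
bond_length = 13.0              # TCR:pMHC bond-length threshold [nm] (assumed)

# Simulate the separation trace (Euler-Maruyama update of an OU process).
sep = np.empty(n_steps)
sep[0] = mean_sep
kicks = rng.normal(size=n_steps - 1) * sigma * np.sqrt(2 * dt / tau)
for i in range(1, n_steps):
    sep[i] = sep[i - 1] - (sep[i - 1] - mean_sep) * dt / tau + kicks[i - 1]

# Persistence times: lengths of contiguous runs with sep below the threshold.
below = sep < bond_length
edges = np.diff(below.astype(int))
starts = np.where(edges == 1)[0] + 1
ends = np.where(edges == -1)[0] + 1
if below[0]:
    starts = np.r_[0, starts]
if below[-1]:
    ends = np.r_[ends, n_steps]
persistence_times = (ends - starts) * dt
print(f"{len(persistence_times)} contact events, "
      f"mean persistence {persistence_times.mean():.3f} s")
```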
Abstract:
The semantic model developed in this research was a response to the difficulty that a group of mathematics learners had with conventional mathematical language and with their interpretation of mathematical constructs. In order to develop the model, ideas from linguistics, psycholinguistics, cognitive psychology, formal languages and natural language processing were investigated. This investigation led to the identification of four main processes: the parsing process, syntactic processing, semantic processing and conceptual processing. The model showed the complex interdependency between these four processes and provided a theoretical framework in which the behaviour of the mathematics learner could be analysed. The model was then extended to include the use of technological artefacts in the learning process. To facilitate this aspect of the research, the theory of instrumentation was incorporated into the semantic model. The conclusion of this research was that, although the cognitive processes were interdependent, they could develop at different rates until mastery of a topic was achieved. It also found that the introduction of a technological artefact into the learning environment introduced another layer of complexity, both in terms of the learning process and in terms of the underlying relationship between the four cognitive processes.
Abstract:
Our goal here is a more complete understanding of how information about luminance contrast is encoded and used by the binocular visual system. In two-interval forced-choice experiments we assessed observers' ability to discriminate changes in contrast that could be an increase or decrease of contrast in one or both eyes, or an increase in one eye coupled with a decrease in the other (termed IncDec). The base or pedestal contrasts were either in-phase or out-of-phase in the two eyes. The opposed changes in the IncDec condition did not cancel each other out, implying that along with binocular summation, information is also available from mechanisms that do not sum the two eyes' inputs. These might be monocular mechanisms. With a binocular pedestal, monocular increments of contrast were much easier to see than monocular decrements. These findings suggest that there are separate binocular (B) and monocular (L,R) channels, but only the largest of the three responses, max(L,B,R), is available to perception and decision. Results from contrast discrimination and contrast matching tasks were described very accurately by this model. Stimuli, data, and model responses can all be visualized in a common binocular contrast space, allowing a more direct comparison between models and data. Some results with out-of-phase pedestals were not accounted for by the max model of contrast coding, but were well explained by an extended model in which gratings of opposite polarity create the sensation of lustre. Observers can discriminate changes in lustre alongside changes in contrast.
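A minimal sketch of the max-of-channels rule described above: the three channel responses are computed and only the largest one is passed on to the decision stage. The transducer shape and its parameters are assumptions (a generic compressive nonlinearity), not the fitted model from the study:

```python
# Max-of-channels sketch: a left-eye (L), binocular-summing (B) and right-eye
# (R) response are computed, and only the largest reaches perception/decision.
# The transducer form and its parameters are assumed, not taken from the study.
def transducer(c, p=2.4, q=2.0, z=5.0):
    """Generic compressive contrast transducer; c is contrast in percent."""
    return c ** p / (z + c ** q)

def channel_responses(c_left, c_right):
    """Responses of the two monocular channels and the binocular-summing channel."""
    return (transducer(c_left),             # L
            transducer(c_left + c_right),   # B: sums the two eyes' contrasts
            transducer(c_right))            # R

def perceived(c_left, c_right):
    """Only the largest of the three responses, max(L, B, R), is available."""
    return max(channel_responses(c_left, c_right))

# Example: a binocular 10% pedestal with a 2% increment in the left eye only.
print(channel_responses(12.0, 10.0))   # (L, B, R)
print(perceived(12.0, 10.0))           # the value driving the decision
```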