922 results for: model with default Vasicek model and CIR model for the short rate


Relevance: 100.00%

Abstract:

Proinsulin has been characterized as a neuroprotective molecule. In this work we assess the therapeutic potential of proinsulin on photoreceptor degeneration, synaptic connectivity, and functional activity of the retina in the transgenic P23H rat, an animal model of autosomal dominant retinitis pigmentosa (RP). P23H homozygous rats received an intramuscular injection of an adeno-associated viral vector serotype 1 (AAV1) expressing human proinsulin (hPi+) or AAV1-null vector (hPi−) at P20. Levels of hPi in serum were determined by enzyme-linked immunosorbent assay (ELISA), and visual function was evaluated by electroretinographic (ERG) recording at P30, P60, P90, and P120. Preservation of retinal structure was assessed by immunohistochemistry at P120. Human proinsulin was detected in serum from rats injected with hPi+ at all times tested, with average hPi levels ranging from 1.1 nM (P30) to 1.4 nM (P120). ERG recordings showed an amelioration of vision loss in hPi+ animals. The scotopic b-waves were significantly higher in hPi+ animals than in control rats at P90 and P120. This attenuation of visual deterioration correlated with a delay in photoreceptor degeneration and the preservation of retinal cytoarchitecture. hPi+ animals had 48.7% more photoreceptors than control animals. Presynaptic and postsynaptic elements, as well as the synaptic contacts between photoreceptors and bipolar or horizontal cells, were preserved in hPi+ P23H rats. Furthermore, in hPi+ rat retinas the number of rod bipolar cell bodies was greater than in control rats. Our data demonstrate that hPi expression preserves cone and rod structure and function, together with their contacts with postsynaptic neurons, in the P23H rat. These data strongly support the further development of proinsulin-based therapy to counteract retinitis pigmentosa.


In recent decades, academic research has paid considerable attention to the phenomenon of revitalizing indigenous cultures and, more precisely, to the use of traditional indigenous healing methods both to address individuals' mental health problems and to engage broader cultural issues. The re-evaluation of traditional indigenous healing practices as a mode of psychotherapeutic treatment has been perhaps one of the most interesting sociocultural processes of the postmodern era. In this regard, incorporating indigenous forms of healing into a contemporary framework of indigenous mental health treatment should be interpreted not simply as an alternative therapeutic response to the clinical context of Western psychiatry, but also as a political response on the part of ethno-cultural groups that have been stereotyped as socially inferior and culturally backward. As a result, a postmodern form of "traditional healing", drawing on various forms of knowledge, rites and the social uses of medicinal plants, has been set in motion on many Canadian indigenous reserves over the last two decades.


This paper introduces a new mathematical model for the simultaneous synthesis of heat exchanger networks (HENs), wherein the manipulation of process stream pressures is used to enhance heat integration. The proposed approach combines generalized disjunctive programming (GDP) and a mixed-integer nonlinear programming (MINLP) formulation to minimize the total annualized cost, comprising operational and capital expenses. A multi-stage superstructure is developed for the HEN synthesis, assuming constant heat capacity flow rates and isothermal mixing, and allowing for stream splits. In this model, the pressures and temperatures of the streams are treated as optimization variables, further increasing the complexity and difficulty of solving the problem. In addition, the model allows for the coupling of compressors and turbines to save energy. A case study is performed to verify the accuracy of the proposed model. In this example, the optimal integration between heat and work decreases the need for thermal utilities in the HEN design. As a result, the total annualized cost is also reduced, owing to the decrease in the operational expenses related to heating and cooling of the streams.


Abrupt climate changes from 18 to 15 thousand years before present (kyr BP) associated with Heinrich Event 1 (HE1) had a strong impact on vegetation patterns not only at high latitudes of the Northern Hemisphere, but also in the tropical regions around the Atlantic Ocean. To gain a better understanding of the linkage between high and low latitudes, we used the University of Victoria (UVic) Earth System-Climate Model (ESCM) with dynamical vegetation and land surface components to simulate four scenarios of climate-vegetation interaction: the pre-industrial era, the Last Glacial Maximum (LGM), and a Heinrich-like event with two different climate backgrounds (interglacial and glacial). We calculated mega-biomes from the plant-functional types (PFTs) generated by the model to allow for a direct comparison between model results and palynological vegetation reconstructions. Our calculated mega-biomes for the pre-industrial period and the LGM corresponded well with biome reconstructions of the modern and LGM time slices, respectively, except that our pre-industrial simulation predicted the dominance of grassland in southern Europe and our LGM simulation resulted in more forest cover in tropical and sub-tropical South America. The HE1-like simulation with a glacial climate background produced sea-surface temperature patterns and enhanced inter-hemispheric thermal gradients in accordance with the "bipolar seesaw" hypothesis. We found that the cooling of the Northern Hemisphere caused a southward shift of those PFTs that are indicative of increased desertification and a retreat of broadleaf forests in West Africa and northern South America. The mega-biomes from our HE1 simulation agreed well with paleovegetation data from tropical Africa and northern South America. Thus, according to our model-data comparison, the reconstructed vegetation changes for the tropical regions around the Atlantic Ocean were physically consistent with the remote effects of a Heinrich event under a glacial climate background.


We analyse the relation between the entanglement and spin-squeezing parameter in the two-atom Dicke model and identify the source of the discrepancy recently reported by Banerjee (2001 Preprint quant-ph/0110032) and Zhou et al (2002 J. Opt. B: Quantum Semiclass. Opt. 4 425), namely that one can observe entanglement without spin squeezing. Our calculations demonstrate that there are two criteria for entanglement, one associated with the two-photon coherences that create two-photon entangled states, and the other associated with populations of the collective states. We find that the spin-squeezing parameter correctly predicts entanglement in the two-atom Dicke system only if it is associated with two-photon entangled states, but fails to predict entanglement when it is associated with the entangled symmetric state. This explicitly identifies the source of the discrepancy and explains why the system can be entangled without spin squeezing. We illustrate these findings with three examples of the interaction of the system with thermal, classical squeezed vacuum, and quantum squeezed vacuum fields.
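For reference, the spin-squeezing parameter at issue is conventionally defined as follows (this is the general textbook form, not a formula quoted from the paper); the criterion ξ² < 1 is a sufficient but, as the abstract shows, not a necessary condition for entanglement:

```latex
% Spin-squeezing parameter for N two-level atoms (here N = 2):
% the minimal variance of a collective spin component perpendicular
% to the mean spin, normalized by the mean spin length.
\xi^2 = \frac{N\,\bigl(\Delta J_{\perp}\bigr)^2_{\min}}
             {\bigl|\langle \hat{\mathbf{J}} \rangle\bigr|^2}
```

States with ξ² < 1 are both spin squeezed and entangled; the abstract's point is precisely that the converse fails, since entangled states (here, those detected through populations of the collective states) can still have ξ² ≥ 1.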


Formal specifications can precisely and unambiguously define the required behavior of a software system or component. However, formal specifications are complex artifacts that need to be verified to ensure that they are consistent, complete, and validated against the requirements. Specification testing or animation tools exist to assist with this by allowing the specifier to interpret or execute the specification. However, currently little is known about how to do this effectively. This article presents a framework and tool support for the systematic testing of formal, model-based specifications. Several important generic properties that should be satisfied by model-based specifications are first identified. Following the idea of mutation analysis, we then use variants or mutants of the specification to check that these properties are satisfied. The framework also allows the specifier to test application-specific properties. All properties are tested for a range of states that are defined by the tester in the form of a testgraph, which is a directed graph that partially models the states and transitions of the specification being tested. Tool support is provided for the generation of the mutants, for automatically traversing the testgraph and executing the test cases, and for reporting any errors. The framework is demonstrated on a small specification and its application to three larger specifications is discussed. Experience indicates that the framework can be used effectively to test small to medium-sized specifications and that it can reveal a significant number of problems in these specifications.


The phenotypic and genetic factor structure of performance on five Multidimensional Aptitude Battery (MAB) subtests and one Wechsler Adult Intelligence Scale-Revised (WAIS-R) subtest was explored in 390 adolescent twin pairs (184 monozygotic [MZ]; 206 dizygotic [DZ]). The temporal stability of these measures was derived from a subsample of 49 twin pairs, with test-retest correlations ranging from .67 to .85. A phenotypic factor model, in which performance and verbal factors were correlated, provided a good fit to the data. Genetic modeling was based on the phenotypic factor structure, but also took into account the additive genetic (A), common environmental (C), and unique environmental (E) parameters derived from a fully saturated ACE model. The best fitting model was characterized by a genetic correlated two-factor structure with specific effects, a general common environmental factor, and overlapping unique environmental effects. Results are compared to multivariate genetic models reported in children and adults, with the most notable difference being the growing importance of common genes influencing diverse abilities in adolescence. (C) 2003 Elsevier Inc. All rights reserved.
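The ACE decomposition mentioned above can be illustrated with the classical Falconer formulas; the twin correlations below are invented for illustration and are not the study's estimates:

```python
# Back-of-envelope Falconer decomposition underlying ACE twin models.
# The correlations are illustrative, not the study's data.

def falconer_ace(r_mz, r_dz):
    """Estimate additive genetic (a2), common environmental (c2) and
    unique environmental (e2) variance components from twin correlations.
    MZ twins share all their genes and DZ twins on average half, while
    both share the common environment fully, giving:
        a2 = 2 * (r_mz - r_dz)
        c2 = 2 * r_dz - r_mz
        e2 = 1 - r_mz
    """
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return a2, c2, e2

a2, c2, e2 = falconer_ace(r_mz=0.80, r_dz=0.50)
print(round(a2, 2), round(c2, 2), round(e2, 2))  # 0.6 0.2 0.2
```

Full ACE modeling, as in the study, fits these components by maximum likelihood across multiple variables rather than from two correlations, but the identification logic is the same.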


The Accelerating Moment Release (AMR) preceding earthquakes with magnitude above 5 in Australia that occurred during the last 20 years was analyzed to test the Critical Point Hypothesis. Twelve earthquakes in the catalog were chosen based on a criterion for the number of nearby events. Results show that seven sequences with numerous events recorded leading up to the main earthquake exhibited accelerating moment release. Two occurred near in time and space to other earthquakes preceded by AMR. The remaining three sequences had very few events in the catalog, so the lack of AMR detected in the analysis may be related to catalog incompleteness. Spatio-temporal scanning of AMR parameters shows that 80% of the areas in which AMR occurred experienced large events. In areas of similar background seismicity with no large events, 10 out of 12 cases exhibit no AMR, and two others are false alarms where AMR was observed but no large event followed. The relationship between AMR and the Load-Unload Response Ratio (LURR) was studied. Both methods predict similar critical region sizes; however, the critical point time using AMR is slightly earlier than the time of the critical point LURR anomaly.
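The quantity underlying AMR analyses, the cumulative Benioff strain, can be sketched as follows (a minimal illustration with invented magnitudes; the energy-magnitude relation used is the standard Gutenberg-Richter one, not a value from this study):

```python
# Sketch of the cumulative Benioff strain: the running sum of the square
# roots of event energies, usually estimated from magnitudes. The
# magnitudes below are invented for illustration.
import math

def benioff_strain(magnitudes):
    """Cumulative Benioff strain, using the standard energy-magnitude
    relation log10(E) = 1.5*M + 4.8 (E in joules)."""
    total, curve = 0.0, []
    for m in magnitudes:
        energy = 10 ** (1.5 * m + 4.8)
        total += math.sqrt(energy)
        curve.append(total)
    return curve

curve = benioff_strain([3.0, 3.2, 3.1, 3.5, 4.0])
print(len(curve), curve[-1] > curve[0])
```

AMR studies then test whether this curve accelerates towards the main shock, typically by fitting a power law of the form A + B*(t_c - t)**m and comparing it to a straight-line fit.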


The Lattice Solid Model has been used successfully as a virtual laboratory to simulate fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that in order to make the next step towards more realistic experiments it will be necessary to use models containing a significantly larger number of particles than current models. Thus, those simulations will require a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several millions of particles have shown that the parallel implementation of the Lattice Solid Model can achieve a high parallel-efficiency of about 80% for large numbers of processors on different computer architectures.
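The quoted parallel efficiency of about 80% follows the usual definition, speedup divided by processor count; here is a minimal sketch with invented timings (the figures are not benchmark numbers from the paper):

```python
# Parallel efficiency in its standard form: E = S / N, where the
# speedup S = T_1 / T_N compares serial and N-processor run times.
# The timings below are invented for illustration.

def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Efficiency E = (T_1 / T_N) / N; E = 1.0 is ideal scaling."""
    speedup = t_serial / t_parallel
    return speedup / n_procs

# e.g. a run taking 1000 s serially and 19.5 s on 64 processors:
eff = parallel_efficiency(t_serial=1000.0, t_parallel=19.5, n_procs=64)
print(f"{eff:.0%}")  # 80%
```

Efficiency below 1.0 reflects communication and load-imbalance overheads, which typically grow with the number of processors; sustaining ~80% at large processor counts, as reported above, indicates that these overheads remain well controlled.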


Objective: Our aim was to determine if insomnia severity, dysfunctional beliefs about sleep, and depression predicted sleep-related safety behaviors. Method: Standard sleep-related measures (such as the Insomnia Severity Index; the Dysfunctional Beliefs About Sleep scale; the Depression, Anxiety, and Stress Scale; and the Sleep-Related Behaviors Questionnaire) were administered. Additionally, 14 days of sleep diary (Pittsburgh Sleep Diary) data and actual use of sleep-related behaviors were collected. Results: Regression analysis revealed that dysfunctional beliefs about sleep predicted sleep-related safety behaviors. Insomnia severity did not predict sleep-related safety behaviors. Depression accounted for the greatest amount of unique variance in the prediction of safety behaviors, followed by dysfunctional beliefs. Exploratory analysis revealed that participants with higher levels of depression used more sleep-related behaviors and reported greater dysfunctional beliefs about their sleep. Conclusion: The findings underscore the significant influence that dysfunctional beliefs have on individuals' behaviors. Moreover, the results suggest that depression may need to be considered as an explicit component of cognitive-behavioral models of insomnia. (c) 2006 Elsevier Inc. All rights reserved.


To foster ongoing international cooperation beyond ACES (APEC Cooperation for Earthquake Simulation) on the simulation of solid earth phenomena, agreement was reached to work towards the establishment of a frontier international research institute for simulating the solid earth: iSERVO = International Solid Earth Research Virtual Observatory institute (http://www.iservo.edu.au). This paper outlines a key Australian contribution towards the iSERVO institute seed project: the construction of (1) a typical intraplate fault system model using practical fault system data from South Australia (i.e., the SA interacting fault model), which includes data management and editing, geometrical modelling and mesh generation; and (2) a finite-element based software tool, built on our long-term and ongoing effort to develop the R-minimum strategy based finite-element computational algorithm and software tool for modelling three-dimensional nonlinear frictional contact behavior between multiple deformable bodies with the arbitrarily-shaped contact element strategy. A numerical simulation of the SA fault system is carried out using this software tool to demonstrate its capability and our efforts towards seeding the iSERVO Institute.


The size frequency distributions of diffuse, primitive and cored senile plaques (SP) were studied in single sections of the temporal lobe from 10 patients with Alzheimer's disease (AD). The size distribution curves were unimodal and positively skewed. The size distribution curve of the diffuse plaques was shifted towards larger plaques, while those of the neuritic and cored plaques were shifted towards smaller plaques. The neuritic/diffuse plaque ratio was maximal in the 11–30 micron size class and the cored/diffuse plaque ratio in the 21–30 micron size class. The size distribution curves of the three types of plaque deviated significantly from a log-normal distribution. Distributions expressed on a logarithmic scale were 'leptokurtic', i.e. with an excess of observations near the mean. These results suggest that SP in AD grow to within a more restricted size range than predicted from a log-normal model. In addition, there appear to be differences in the patterns of growth of diffuse, primitive and cored plaques. If neuritic and cored plaques develop from earlier diffuse plaques, then smaller diffuse plaques are more likely to be converted to mature plaques.
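The kind of distributional check described above can be sketched as follows, using simulated sizes rather than the study's data: a genuinely log-normal sample shows near-zero skewness and excess kurtosis after a log transform, whereas the leptokurtosis reported above appears as positive excess kurtosis:

```python
# Illustrative log-normality check on simulated plaque sizes (invented
# data): log-transform the sizes and inspect skewness and excess
# kurtosis, which should both be near zero for a log-normal sample.
import math
import random

def moments(xs):
    """Sample skewness and excess kurtosis (normal distribution -> 0, 0)."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    skew = sum(((x - mean) / sd) ** 3 for x in xs) / n
    kurt = sum(((x - mean) / sd) ** 4 for x in xs) / n - 3
    return skew, kurt

random.seed(0)
# Simulated log-normal sizes: exp of a normal with mean 3.0, sd 0.5.
sizes = [math.exp(random.gauss(3.0, 0.5)) for _ in range(2000)]
skew, kurt = moments([math.log(s) for s in sizes])
print(abs(skew) < 0.3, abs(kurt) < 0.7)  # near-normal after log transform
```

Leptokurtic data, as in the study, would instead yield a clearly positive excess kurtosis of the log-transformed sizes, signalling growth confined to a narrower range than the log-normal model predicts.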


The topic of my research is consumer brand equity (CBE). My thesis is that the success or otherwise of a brand is better viewed from the consumers' perspective. I specifically focus on consumers as a unique group of stakeholders whose involvement with brands is crucial to the overall success of branding strategy. To this end, this research examines the constellation of ideas on brand equity that have hitherto been offered by various scholars. Through a systematic integration of the concepts and practices identified by these scholars (concepts and practices such as competitiveness, consumer searching, consumer behaviour, brand image, brand relevance, consumer perceived value, etc.), this research identifies CBE as a construct that is shaped, directed and made valuable by the beliefs, attitudes and subjective preferences of consumers. This is done by examining the criteria on the basis of which consumers evaluate brands and make brand purchase decisions. Understanding the criteria by which consumers evaluate brands is crucial for several reasons. First, as the basis upon which consumers select brands changes with consumption norms and technology, understanding the consumer choice process will help in formulating branding strategy. Secondly, an understanding of these criteria will help in formulating a creative and innovative agenda for 'new brand' propositions. Thirdly, it will also influence firms' ability to stimulate and mould the plasticity of demand for existing brands. In examining these three issues, this thesis presents a comprehensive account of CBE. The first issue deals with the content of CBE. The second issue addresses the problem of how to develop a reliable and valid measuring instrument for CBE. The third issue examines the structural and statistical relationships between the factors of CBE and the consequences of CBE on consumer perceived value (CPV). Using LISREL-SIMPLIS 8.30, the study finds direct and significant links between consumer brand equity and consumer value perception.