24 results for MCDONALD EXTENDED EXPONENTIAL MODEL


Relevance:

30.00%

Publisher:

Abstract:

Constructing and executing distributed systems that can adapt to their operating context in order to sustain the services they provide, and the quality of those services, are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent, and sometimes tangled with other services. Furthermore, the exponential growth in the number of potential system configurations, derived from the variabilities of each service, needs to be handled. The current practice of writing low-level reconfiguration scripts as part of the system code to handle run-time adaptation is both error-prone and time-consuming, and makes adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of constructing and executing adaptive systems, and to handle the exponential growth in the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants, and limit the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
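A minimal sketch of the models-at-runtime idea described in the abstract: derive the adaptation logic by diffing the current configuration model against the target composed model, instead of hand-writing reconfiguration scripts. All names here (`diff_configurations`, the variant labels) are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch: generate reconfiguration actions from the difference
# between the current configuration model and the target composed model.

def diff_configurations(current: set[str], target: set[str]) -> list[str]:
    """Return the actions that turn `current` into `target`."""
    actions = [f"stop {variant}" for variant in sorted(current - target)]
    actions += [f"start {variant}" for variant in sorted(target - current)]
    return actions

current_model = {"cache:inmemory", "transport:http", "logging:verbose"}
target_model = {"cache:distributed", "transport:http", "logging:minimal"}

for action in diff_configurations(current_model, target_model):
    print(action)
# stop cache:inmemory, stop logging:verbose,
# start cache:distributed, start logging:minimal
```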

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to investigate enterprise resource planning (ERP) systems development and emerging practices in the management of enterprises (i.e. parts of companies working with parts of other companies to deliver a complex product and/or service) and to identify any apparent correlations. Suitable a priori contingency frameworks are then used and extended to explain the apparent correlations. Discussion is given to guide researchers and practitioners towards better strategic, structural and operational competitive advantage through this approach, coined here as the "enterprization of operations".

Design/methodology/approach: Theoretical induction uses a new empirical longitudinal case study from Zoomlion (a Chinese manufacturing company), built using an adapted form of template analysis, to produce a new contingency framework.

Findings: Three main types of enterprises and three main types of ERP systems are defined, and the correlations between them are explained. Two relevant a priori frameworks are used to induce a new contingency model to support the enterprization of operations, known as the dynamic enterprise reference grid for ERP (DERG-ERP).

Research limitations/implications: The findings are based on one longitudinal case study. Further case studies are currently being conducted in the UK and China.

Practical implications: The new contingency model, the DERG-ERP, serves as a guide for ERP vendors, information systems managers and operations managers hoping to grow and sustain their competitive advantage with respect to effective enterprise strategy, enterprise structure and ERP systems.

Originality/value: This research explains how ERP systems and the effective management of enterprises should develop in order to sustain competitive advantage with respect to enterprise strategy, enterprise structure and ERP systems use. © Emerald Group Publishing Limited.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the authors use an exponential generalized autoregressive conditional heteroscedastic (EGARCH) error-correction model (ECM), that is, an EGARCH-ECM, to estimate the pass-through effects of foreign exchange (FX) rates and producers' prices for 20 U.K. export sectors. The long-run adjustment of export prices to FX rates and producers' prices ranges from -1.02% (for the Textiles sector) to -17.22% (for the Meat sector). The contemporaneous pricing-to-market (PTM) coefficient ranges from -72.84% (for the Fuels sector) to -8.05% (for the Textiles sector). Short-run FX rate pass-through is not complete even after several months. Rolling EGARCH-ECMs show that the short- and long-run effects of FX rates and producers' prices fluctuate substantially, as do the asymmetry and volatility estimates, before equilibrium is achieved.
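A hedged sketch of a two-step EGARCH-ECM estimation of the kind described, using the Python `statsmodels` and `arch` packages. The paper's exact specification, lag orders, and data are not given in the abstract, so the column names, file name, and orders below are illustrative placeholders.

```python
# Illustrative two-step EGARCH-ECM estimation (not the authors' exact spec).
# Step 1: long-run (cointegrating) regression of export prices on FX rates
# and producers' prices. Step 2: short-run ECM with EGARCH(1,1) errors.
import pandas as pd
import statsmodels.api as sm
from arch import arch_model

# Hypothetical monthly log series for one export sector.
df = pd.read_csv("sector_prices.csv", parse_dates=["date"], index_col="date")
y = df["export_price"]
X = sm.add_constant(df[["fx_rate", "producer_price"]])

# Step 1: long-run adjustment coefficients; lagged residuals form the ECM term.
long_run = sm.OLS(y, X).fit()
ecm_term = long_run.resid.shift(1)

# Step 2: short-run dynamics with the lagged ECM term and EGARCH errors.
dy = y.diff()
Z = pd.concat([X.drop(columns="const").diff(), ecm_term.rename("ecm")], axis=1)
data = pd.concat([dy.rename("dy"), Z], axis=1).dropna()
model = arch_model(data["dy"], x=data[["fx_rate", "producer_price", "ecm"]],
                   mean="LS", vol="EGARCH", p=1, o=1, q=1)
result = model.fit(disp="off")
print(result.summary())
```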

Relevance:

30.00%

Publisher:

Abstract:

A generalized Drucker–Prager (GD–P) viscoplastic yield surface model was developed and validated for asphalt concrete. The GD–P model was formulated in terms of fabric-tensor-modified stresses to account for the material's inherent anisotropy. A smooth and convex octahedral yield surface function was developed in the GD–P model to characterize the full range of internal friction angles from 0° to 90°. In contrast, the existing extended Drucker–Prager (ED–P) model was shown to be applicable only to materials with an internal friction angle of less than 22°. Laboratory tests were performed to evaluate the anisotropic effect and to validate the GD–P model. The results indicated that (1) the yield stresses of an isotropic yield surface model are greater in compression and smaller in extension than those of an anisotropic model, which can result in an under-prediction of the viscoplastic deformation; and (2) the yield stresses predicted by the GD–P model matched the experimental results of the octahedral shear strength tests well at different normal and confining stresses. By contrast, the ED–P model over-predicted the octahedral yield stresses, which can lead to an under-prediction of the permanent deformation. In summary, the rutting depth of an asphalt pavement would be underestimated without considering the anisotropy and convexity of the yield surface for asphalt concrete. The proposed GD–P model was demonstrated to be capable of overcoming these limitations of the existing yield surface models for asphalt concrete.
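For orientation, the classical Drucker–Prager yield function, which both the ED–P and GD–P formulations generalize, can be written as below; the fabric-tensor-modified, smoothed-octahedral GD–P form itself is not reproduced here.

```latex
% Classical Drucker--Prager yield surface (the baseline that ED--P and GD--P
% generalize); I_1 and J_2 are stress invariants, while \alpha and \kappa are
% material constants tied to the internal friction angle and cohesion.
f(\boldsymbol{\sigma}) = \sqrt{J_2} - \alpha I_1 - \kappa = 0,
\qquad
I_1 = \operatorname{tr}(\boldsymbol{\sigma}),
\quad
J_2 = \tfrac{1}{2}\,\mathbf{s} : \mathbf{s},
\quad
\mathbf{s} = \boldsymbol{\sigma} - \tfrac{1}{3} I_1 \mathbf{I}.
```

In the GD–P model, as the abstract describes, the stress entering this function is replaced by a fabric-tensor-modified stress to capture inherent anisotropy, and the octahedral cross-section is replaced by a smooth, convex function valid for internal friction angles up to 90°.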

Relevance:

30.00%

Publisher:

Abstract:

We use the GN-model to assess Nyquist-WDM 100/200 Gbit/s PM-QPSK/16QAM signal reach on low-loss, large-core-area fibre using extended-range, variable-gain hybrid Raman-EDFAs. 5000/1500 km transmission is possible over a wide range of amplifier spans. © OSA 2014.
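A minimal sketch of a GN-model reach estimate of this kind: in the incoherent GN model, nonlinear interference accumulates per span as eta*P**3, the optimum launch power sets the nonlinear noise to half the ASE noise, and the reach follows from the required SNR. All parameter values below are placeholders, not those of the paper.

```python
# Minimal incoherent GN-model reach estimate (illustrative parameters only).
# Per span: ASE noise p_ase and nonlinear interference p_nli = eta * P**3.
# At the optimum launch power, p_nli = p_ase / 2, and SNR falls as 1/N_spans.

p_ase = 1.3e-6      # ASE power per span in the signal bandwidth, W (placeholder)
eta = 650.0         # NLI efficiency per span, 1/W^2 (placeholder)
snr_req_db = 10.0   # required SNR for the chosen format (placeholder)

p_opt = (p_ase / (2 * eta)) ** (1 / 3)            # optimum launch power
snr_per_span = p_opt / (p_ase + eta * p_opt**3)   # SNR after one span
snr_req = 10 ** (snr_req_db / 10)
max_spans = int(snr_per_span / snr_req)           # SNR degrades ~linearly in spans
print(f"optimum launch power: {p_opt * 1e3:.2f} mW, reach: {max_spans} spans")
```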

Relevance:

30.00%

Publisher:

Abstract:

Previous work has demonstrated that planning behaviours may be more adaptive than avoidance strategies in driving self-regulation, but ways of encouraging planning have not been investigated. The efficacy of an intervention based on an extended theory of planned behaviour (TPB) plus implementation intentions in promoting planning self-regulation in drivers across the lifespan was tested. An age-stratified group of participants (N=81, aged 18-83 years) was randomly assigned to an experimental or a control condition. The intervention prompted specific goal setting with action planning and barrier identification, and goal setting was carried out using an agreed behavioural contract. Baseline and follow-up measures of TPB variables, self-reported driving self-regulation behaviours (avoidance and planning) and mobility goal achievements were collected using postal questionnaires. As in many previous efforts to change behaviour by changing its predictors using models such as the TPB, the intervention did not significantly change any of the model components. However, more than 90% of participants achieved their primary driving goal, and self-regulation planning, as measured on a self-regulation inventory, was marginally improved. The study demonstrates the distinct roles of pre-decisional (motivational) components as contrasted with post-decisional goal enactment, and offers promise for the role of self-regulation planning and implementation intentions in helping drivers achieve their mobility goals and in promoting safer driving across the lifespan, even in the context of unchanging beliefs such as perceived risk or driver anxiety.

Relevance:

30.00%

Publisher:

Abstract:

The cell:cell bond between an immune cell and an antigen-presenting cell is a necessary event in the activation of the adaptive immune response. At the juncture between the cells, cell surface molecules on the opposing cells form non-covalent bonds and a distinct patterning, termed the immunological synapse, is observed. An important binding molecule in the synapse is the T-cell receptor (TCR), which is responsible for antigen recognition through its binding with a major histocompatibility complex with bound peptide (pMHC). This bond leads to intracellular signalling events that culminate in the activation of the T-cell, and ultimately leads to the expression of the immune effector function. The temporal analysis of the TCR bonds during the formation of the immunological synapse presents a problem to biologists, because the spatio-temporal scales involved (nanometres and picoseconds) are comparable with experimental uncertainty limits. In this study, a linear stochastic model, derived from a nonlinear model of the synapse, is used to analyse the temporal dynamics of the bond attachments for the TCR. Mathematical analysis and numerical methods are employed to analyse the qualitative dynamics of the nonequilibrium membrane dynamics, with the specific aim of calculating the average persistence time of the TCR:pMHC bond. A single-threshold method, which has previously been used successfully to calculate the TCR:pMHC contact path sizes in the synapse, is applied to produce results for the average contact times of the TCR:pMHC bonds. This method is extended through the development of a two-threshold method, which produces results suggesting that the average persistence time of the TCR:pMHC bond is of the order of 2-4 seconds, values that agree with experimental evidence for TCR signalling. The study reveals two distinct scaling regimes in the time-persistence survival probability density profile of these bonds: one dominated by thermal fluctuations and the other associated with TCR signalling. Analysis of the thermal fluctuation regime reveals a minimal contribution to the average persistence time calculation, which has an important biological implication when comparing the probabilistic models with experimental evidence: in cases where only a few statistics can be gathered under experimental conditions, the results are unlikely to match the probabilistic predictions. The results also identify a rescaling relationship between the thermal noise and the bond length, suggesting that a recalibration of the experimental conditions to adhere to this scaling relationship will enable biologists to identify the start of the signalling regime for previously unobserved receptor:ligand bonds. Finally, the regime associated with TCR signalling exhibits a universal decay rate for the persistence probability that is independent of the bond length.
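A hedged sketch of the single-threshold idea on a toy surrogate: simulate thermally driven membrane separation as an Ornstein-Uhlenbeck process and record how long each excursion stays below a bond-length threshold. The dynamics, parameters, and threshold below are illustrative stand-ins, not the study's actual linearized synapse model.

```python
# Illustrative single-threshold persistence-time measurement on an
# Ornstein-Uhlenbeck surrogate for the linearized membrane dynamics.
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 1e-3, 500_000   # time step (s) and trajectory length (placeholders)
tau, sigma = 0.5, 1.0         # relaxation time and noise strength (placeholders)
threshold = -0.5              # separation below which the bond is considered held

# Euler-Maruyama simulation of dx = -(x / tau) dt + sigma dW.
x = np.empty(n_steps)
x[0] = 0.0
noise = rng.normal(0.0, np.sqrt(dt), n_steps - 1)
for i in range(n_steps - 1):
    x[i + 1] = x[i] - (x[i] / tau) * dt + sigma * noise[i]

# Persistence times: lengths of maximal runs with x below the threshold.
below = x < threshold
edges = np.diff(below.astype(int))
starts = np.flatnonzero(edges == 1) + 1
ends = np.flatnonzero(edges == -1) + 1
if below[0]:
    starts = np.r_[0, starts]
if below[-1]:
    ends = np.r_[ends, n_steps]
durations = (ends - starts) * dt
print(f"mean persistence time below threshold: {durations.mean():.3f} s")
```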

Relevance:

30.00%

Publisher:

Abstract:

The semantic model developed in this research was a response to the difficulty a group of mathematics learners had with conventional mathematical language and their interpretation of mathematical constructs. In order to develop the model, ideas from linguistics, psycholinguistics, cognitive psychology, formal languages and natural language processing were investigated. This investigation led to the identification of four main processes: the parsing process, syntactic processing, semantic processing and conceptual processing. The model showed the complex interdependency between these four processes and provided a theoretical framework in which the behaviour of the mathematics learner could be analysed. The model was then extended to incorporate the use of technological artefacts in the learning process; to facilitate this aspect of the research, the theory of instrumentation was incorporated into the semantic model. The conclusion of this research was that, although the cognitive processes were interdependent, they could develop at different rates until mastery of a topic was achieved. It also found that the introduction of a technological artefact into the learning environment added another layer of complexity, both in terms of the learning process and in terms of the underlying relationship between the four cognitive processes.

Relevance:

30.00%

Publisher:

Abstract:

Our goal here is a more complete understanding of how information about luminance contrast is encoded and used by the binocular visual system. In two-interval forced-choice experiments we assessed observers' ability to discriminate changes in contrast that could be an increase or decrease of contrast in one or both eyes, or an increase in one eye coupled with a decrease in the other (termed IncDec). The base or pedestal contrasts were either in-phase or out-of-phase in the two eyes. The opposed changes in the IncDec condition did not cancel each other out, implying that along with binocular summation, information is also available from mechanisms that do not sum the two eyes' inputs. These might be monocular mechanisms. With a binocular pedestal, monocular increments of contrast were much easier to see than monocular decrements. These findings suggest that there are separate binocular (B) and monocular (L,R) channels, but only the largest of the three responses, max(L,B,R), is available to perception and decision. Results from contrast discrimination and contrast matching tasks were described very accurately by this model. Stimuli, data, and model responses can all be visualized in a common binocular contrast space, allowing a more direct comparison between models and data. Some results with out-of-phase pedestals were not accounted for by the max model of contrast coding, but were well explained by an extended model in which gratings of opposite polarity create the sensation of lustre. Observers can discriminate changes in lustre alongside changes in contrast.
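A minimal sketch of the max(L, B, R) decision rule described above, with a simple power-law transducer standing in for the full model's contrast transduction; the exponent and the binocular summation weight are placeholders, not fitted values from the paper.

```python
# Minimal sketch of the max(L, B, R) rule: perceived contrast is the largest
# of the left-monocular, binocular-sum, and right-monocular channel responses.
# The power-law transducer and its exponent are illustrative placeholders.

def channel_response(c: float, p: float = 2.4) -> float:
    """Accelerating transducer applied to a channel's input contrast."""
    return max(c, 0.0) ** p

def perceived(c_left: float, c_right: float) -> float:
    left = channel_response(c_left)
    right = channel_response(c_right)
    binoc = channel_response(0.5 * (c_left + c_right))  # binocular summation
    return max(left, binoc, right)

# With a binocular pedestal, a monocular increment raises the max response,
# while a matched monocular decrement barely changes it (the other eye's
# channel still dominates), qualitatively matching the asymmetry reported.
pedestal = 0.10
print(perceived(pedestal + 0.02, pedestal))   # increment in one eye: detectable
print(perceived(pedestal - 0.02, pedestal))   # decrement in one eye: masked
```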