16 results for Model Construction and Estimation

in Aston University Research Archive


Relevance: 100.00%

Abstract:

This study examines the forecasting accuracy of alternative vector autoregressive models, each in a seven-variable system comprising, in turn, daily, weekly and monthly foreign exchange (FX) spot rates. The vector autoregressions (VARs) are in non-stationary, stationary and error-correction forms and are estimated using OLS; imposing Bayesian priors on these estimations yields a further set of results. We find some tendency for the Bayesian estimation method to generate superior forecast measures relative to the OLS method, whether or not the data sets contain outliers. The best forecasts under the non-stationary specification also outperform those of the stationary and error-correction specifications, particularly at long forecast horizons, while the best forecasts under the stationary and error-correction specifications are generally similar. The findings for the OLS forecasts are consistent with recent simulation results. Overall, the predictive ability of the VARs is very weak.
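
To make the two estimation methods concrete, here is a minimal sketch (not the study's code) that fits a VAR(1) to simulated seven-variable data by OLS and by a ridge-type Bayesian estimator with a Minnesota-style prior shrinking toward a random walk; the simulated series, lag order and prior tightness are all assumptions.

```python
# Minimal sketch (not the study's code): OLS vs. Bayesian estimation of a
# VAR(1) on simulated seven-variable data; series, lag order and prior
# tightness are assumptions made for illustration.
import numpy as np

rng = np.random.default_rng(0)
T, k = 500, 7                                     # observations, variables
Y = np.cumsum(rng.normal(size=(T, k)), axis=0)    # simulated non-stationary "spot rates"

X, y = Y[:-1], Y[1:]                              # VAR(1): regress y_t on y_{t-1}
B_ols = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS coefficient matrix (k x k)

# Ridge-type Bayesian posterior mean with a Minnesota-style prior that shrinks
# the coefficients toward a random walk (prior mean B0 = I, tightness lam).
lam = 10.0
B0 = np.eye(k)
B_bayes = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y + lam * B0)

# One-step-ahead forecasts from the last observation under each estimator.
print("OLS forecast:  ", Y[-1] @ B_ols)
print("Bayes forecast:", Y[-1] @ B_bayes)
```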

Relevance: 100.00%

Abstract:

Strategic sourcing has increased in importance in recent years and now plays an important role in companies' planning. The current volatility in supply markets means companies face multiple challenges involving lock-in situations, supplier bankruptcies or supply security issues. In addition, their exposure can increase due to natural disasters, as witnessed recently in the form of bird flu, volcanic ash and tsunamis. The primary focus of this study is therefore risk management in the context of strategic sourcing. The study presents a literature review on sourcing covering the 15 years from 1998 to 2012 and considering 131 academic articles. The literature describes strategic sourcing as a strategic, holistic process for managing supplier relationships, with a long-term focus on adding value to the company and realising competitive advantage. Few studies have examined the real risk impact and the status of risk management in strategic sourcing, evaluation across countries and industries has been limited, and the construction sector in particular is under-researched. The methodology is founded on a qualitative study of twenty cases from the construction sector and electronics manufacturing industries across Germany and the United Kingdom. In considering risk management in the context of strategic sourcing, the thesis takes into account six dimensions covering trends in strategic sourcing, theoretical and practical sourcing models, risk management, supply and demand management, critical success factors and strategic supplier evaluation. The study contributes in several ways. First, recent trends are traced and future needs identified across the research dimensions of countries, industries and companies. Second, it evaluates critical success factors in contemporary strategic sourcing. Third, it explores the application of theoretical and practical sourcing models in terms of effectiveness and sustainability. Fourth, based on the case study findings, a risk-oriented strategic sourcing framework and a model for strategic sourcing are developed, grounded in a validation of contemporary requirements and a critical evaluation of the existing situation; together these lead to a structured process for managing risk in strategic sourcing. The risk-oriented framework considers areas such as trends, corporate and sourcing strategy, critical success factors, strategic supplier selection criteria, risk assessment, reporting and strategy alignment. The proposed model highlights the essential dimensions of strategic sourcing and leads to a new definition of strategic sourcing supported by this empirical study.

Relevance: 100.00%

Abstract:

This paper outlines a novel elevation linear Fresnel reflector (ELFR) and presents and validates theoretical models defining its thermal performance. To validate the models, a series of experiments was carried out for receiver temperatures in the range of 30-100 °C, measuring the heat loss coefficient, the gain in heat transfer fluid (HTF) temperature, the thermal efficiency and the stagnation temperature. The heat loss coefficient was underestimated because the model excludes collector end heat losses. The measured HTF temperature gains correlated well with the model predictions, differing by less than 5%. Measured thermal efficiency and stagnation temperature differed from the model predictions by -39% to +31% and 22-38%, respectively; these differences were attributed to the low-temperature region used for the experiments. It was concluded that the theoretical models are suitable for examining linear Fresnel reflector (LFR) systems and can be adopted by other researchers.
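
The measured quantities are linked by standard receiver energy-balance relations; the sketch below illustrates them with invented example values (the paper's ELFR model and fitted coefficients are not reproduced here).

```python
# Illustrative energy-balance sketch for a linear Fresnel receiver (generic
# textbook relations with invented values; not the paper's ELFR model).
def receiver_performance(G, A_ap, A_rec, U_L, eta_opt, T_rec, T_amb, m_dot, c_p):
    """All SI units; U_L is the heat loss coefficient [W/(m^2 K)]."""
    # Note: like the paper's model, this ignores collector end losses.
    q_loss = U_L * A_rec * (T_rec - T_amb)   # receiver heat loss [W]
    q_useful = eta_opt * G * A_ap - q_loss   # useful heat gain [W]
    dT_htf = q_useful / (m_dot * c_p)        # HTF temperature gain [K]
    eta_th = q_useful / (G * A_ap)           # thermal efficiency [-]
    return q_loss, dT_htf, eta_th

# Example: 800 W/m^2 irradiance, 10 m^2 aperture, receiver at 80 °C, ambient 25 °C.
q_loss, dT_htf, eta_th = receiver_performance(800, 10.0, 0.5, 8.0, 0.65, 80, 25, 0.05, 4186)
print(f"loss {q_loss:.0f} W, HTF gain {dT_htf:.1f} K, efficiency {eta_th:.2f}")
```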

Relevance: 100.00%

Abstract:

It has been recognised for some time that a full code of amino acid-based recognition of DNA sequences would be useful. Several approaches, which utilise small DNA-binding motifs called zinc fingers, are presently employed, but none successfully combines a combinatorial approach to the elucidation of a code with a single-stage high-throughput screening assay. The work outlined here describes the development of a model system for the study of DNA-protein interactions and of a high-throughput assay for detecting such interactions. A zinc finger protein was designed to bind with high affinity and specificity to a known DNA sequence; in future work, the region of the zinc finger responsible for binding specificity can be mutated to observe the effect on the DNA-protein interaction. The zinc finger protein was initially synthesised as a His-tagged product, but it was not possible to develop a high-throughput assay using this form. The gene encoding the zinc finger protein was therefore altered and the protein synthesised as a glutathione S-transferase (GST) fusion product. A successful assay was developed using the GST fusion protein and Scintillation Proximity Assay technology (Amersham Pharmacia Biotech). The scintillation proximity assay is a dynamic assay that allows DNA-protein interactions to be studied in "real time". It not only provides a high-throughput method of screening zinc finger proteins for potential ligands but also allows the effect of adding reagents or competitor ligands to be monitored.

Relevance: 100.00%

Abstract:

We discuss the aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading, but that not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and the characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis-testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, the characteristics of the data, and practical issues such as the availability of suitable patients; case series, multiple-case studies, single-case studies, statistical models and process models should be complementary methods when guided by theory development.
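
As a toy illustration of formal model selection over patient data, the following sketch compares an aggregated account (one accuracy parameter shared by all patients) with a non-aggregated one (a parameter per patient) by maximum likelihood and AIC; the data are invented.

```python
# Toy sketch (invented data, not the authors' analysis): quantifying evidence
# for aggregating vs. not aggregating patient data via maximum likelihood and AIC.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

correct = np.array([18, 22, 9, 30])    # hypothetical correct responses per patient
trials = np.array([40, 40, 20, 60])

def nll_shared(p):                      # Model A: one accuracy shared by all patients
    return -binom.logpmf(correct, trials, p).sum()

res = minimize_scalar(nll_shared, bounds=(1e-6, 1 - 1e-6), method="bounded")
aic_shared = 2 * 1 + 2 * res.fun        # one free parameter

# Model B: each patient has their own accuracy (MLE = observed proportion).
p_hat = correct / trials
aic_separate = 2 * len(correct) - 2 * binom.logpmf(correct, trials, p_hat).sum()

print(f"AIC shared: {aic_shared:.1f}   AIC separate: {aic_separate:.1f}")
```

The lower AIC indicates which account the data favour, giving a graded measure of evidence rather than a binary accept/reject decision.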

Relevance: 100.00%

Abstract:

This paper extends existing understandings of how actors' constructions of ambiguity shape the emergent process of strategic action. We theoretically elaborate the role of rhetoric in exploiting strategic ambiguity, based on analysis of a longitudinal case study of an internationalization strategy within a business school. Our data show that actors use rhetoric to construct three types of strategic ambiguity: protective ambiguity that appeals to common values in order to protect particular interests, invitational ambiguity that appeals to common values in order to invite participation in particular actions, and adaptive ambiguity that enables the temporary adoption of specific values in order to appeal to a particular audience at one point in time. These rhetorical constructions of ambiguity follow a processual pattern that shapes the emergent process of strategic action. Our findings show that (1) the strategic actions that emerge are shaped by the way actors construct and exploit ambiguity, (2) the ambiguity intrinsic to the action is analytically distinct from ambiguity that is constructed and exploited by actors, and (3) ambiguity construction shifts over time to accommodate the emerging pattern of actions.

Relevance: 100.00%

Abstract:

Many automated negotiation models have been developed to resolve conflicts in distributed computational systems. However, the problem of finding win-win outcomes in multiattribute negotiation has not been well addressed. To address this issue, this paper presents a negotiation model, based on an evolutionary method of multiobjective optimization, that can find win-win solutions over multiple attributes without requiring negotiating agents to reveal their private utility functions to their opponents or to a third-party mediator. We also equip our agents with a general class of utility functions over interdependent attributes, which captures human intuitions well, and we develop a novel time-dependent concession strategy that helps both sides reach a final agreement from among a set of win-win outcomes. Extensive experiments confirm that our negotiation model outperforms recently developed models, and show that it is stable and efficient in finding fair win-win outcomes, a problem seldom solved by existing models. © 2012 Wiley Periodicals, Inc.
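
The time-dependent concession idea can be illustrated with the classic polynomial concession function in the style of Faratin et al.; this is shown for orientation only, since the paper's own strategy is novel and is not reproduced here.

```python
# Hedged sketch: a classic time-dependent concession function (in the style of
# Faratin et al.), not the paper's specific strategy.
def concession_target(t, deadline, u_min, u_max, beta):
    """Target utility at time t, conceding from u_max toward u_min by the deadline.
    beta < 1 -> Boulware (concede late); beta > 1 -> Conceder (concede early)."""
    frac = min(t / deadline, 1.0) ** (1.0 / beta)
    return u_max - (u_max - u_min) * frac

for t in (0, 5, 9, 10):   # a Boulware agent holds its target until late
    print(t, round(concession_target(t, deadline=10, u_min=0.3, u_max=1.0, beta=0.5), 3))
```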

Relevance: 100.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance: 100.00%

Abstract:

Particle breakage due to fluid flow through various geometries can have a major influence on the performance of particle/fluid processes and on the product quality characteristics of particle/fluid products. In this study, whey protein precipitate dispersions were used as a case study to investigate the effect of flow intensity and exposure time on the breakage of these precipitate particles. Computational fluid dynamics (CFD) simulations were performed to evaluate the turbulent eddy dissipation rate (TED) and the associated exposure time along various flow geometries. The focus of this work is the predictive modelling of particle breakage in particle/fluid systems. A number of breakage models were developed to relate TED and exposure time to particle breakage, and their suitability was evaluated by their ability to predict the experimentally determined breakage of the whey protein precipitate particles. A "power-law threshold" breakage model provided a satisfactory capability for predicting this breakage: when the dispersions were propelled through a number of different geometries, such as bends, tees and elbows, the model accurately predicted the mean particle size attained after flow through each. © 2005 Elsevier Ltd. All rights reserved.
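
A generic "power-law threshold" relation of the kind named above can be sketched as follows; the functional form and all constants here are hypothetical illustrations, not the fitted model from the study.

```python
# Hypothetical "power-law threshold" breakage relation (generic form with
# invented constants; the study's fitted model is not reproduced here):
# no breakage below a critical dissipation rate, and above it the size
# reduction grows as a power law of the excess dissipation and exposure time.
import numpy as np

def mean_size_after_flow(d0, eps, eps_crit, k, n, t_exp):
    """Mean particle size after exposure to turbulent eddy dissipation rate eps."""
    excess = np.clip(np.asarray(eps, dtype=float) - eps_crit, 0.0, None)
    reduction = np.minimum(k * excess**n * t_exp, 0.9)   # cap the size reduction
    return d0 * (1.0 - reduction)

eps = [50, 200, 1000, 5000]   # dissipation rates along different geometries [W/kg]
print(mean_size_after_flow(d0=20e-6, eps=eps, eps_crit=100.0, k=0.5, n=0.5, t_exp=0.01))
```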

Relevance: 100.00%

Abstract:

PURPOSE: To show that the limited quality of surfaces produced by one model of excimer laser system can degrade visual performance, using a polymethylmethacrylate (PMMA) model. METHODS: A range of lenses of different powers was ablated in PMMA sheets using five DOS-based Nidek EC-5000 laser systems (Nidek Technologies, Gamagori, Japan) from different clinics. Surface quality was objectively assessed using profilometry. Contrast sensitivity and visual acuity were measured through the lenses with their powers neutralized by suitable spectacle trial lenses. RESULTS: Average surface roughness increased with lens power, with roughness values higher for negative lenses than for positive lenses. Losses in visual contrast sensitivity and acuity measured in two subjects followed a similar pattern. The findings are similar to those previously published for other excimer laser systems. CONCLUSIONS: The levels of surface roughness produced by some laser systems may be sufficient to degrade visual performance under some circumstances.

Relevance: 100.00%

Abstract:

Design verification in the digital domain, using model-based principles, is a key research objective addressing the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of designs and the associated production methods is currently fragmented, prolonged and sub-optimal, as it relies on digital and physical verification stages deployed sequentially across multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data from assemblies in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
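
As a loose illustration of this idea (not the paper's methodology), the sketch below runs a Monte Carlo variability analysis of a simple assembly gap twice: once with a design-stage tolerance assumption and once with the part distribution re-estimated from hypothetical measurement data, showing how measurements can reduce simulation uncertainty when checking a key assembly criterion. All names and numbers are invented.

```python
# Loose illustration (invented numbers, not the paper's methodology): a Monte
# Carlo variability analysis of an assembly gap, run with a design-stage
# tolerance assumption and again with the part distribution re-estimated from
# measurement data, for a key assembly criterion.
import numpy as np

rng = np.random.default_rng(1)

nominal, tol = 100.0, 0.3                               # part length, +/- 3-sigma tolerance
sim_parts = rng.normal(nominal, tol / 3, size=100_000)  # design-stage assumption

measured = np.array([100.05, 100.02, 99.98, 100.07, 100.01])  # hypothetical measurements
sim_updated = rng.normal(measured.mean(), measured.std(ddof=1), size=100_000)

housing = 200.4   # key criterion: gap = housing - two stacked parts must stay positive
for label, parts in [("design-only", sim_parts), ("measurement-informed", sim_updated)]:
    gap = housing - (parts + rng.permutation(parts))
    print(label, "P(gap < 0) =", (gap < 0).mean())
```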

Relevance: 100.00%

Abstract:

We present three jargonaphasic patients who made phonological errors in naming, repetition and reading. We analyse target/response overlap using statistical models to answer three questions: 1) Is there a single phonological source for errors or two sources, one for target-related errors and a separate source for abstruse errors? 2) Can correct responses be predicted by the same distribution used to predict errors or do they show a completion boost (CB)? 3) Is non-lexical and lexical information summed during reading and repetition? The answers were clear. 1) Abstruse errors did not require a separate distribution created by failure to access word forms. Abstruse and target-related errors were the endpoints of a single overlap distribution. 2) Correct responses required a special factor, e.g., a CB or lexical/phonological feedback, to preserve their integrity. 3) Reading and repetition required separate lexical and non-lexical contributions that were combined at output.
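
For question 1, the kind of comparison described can be pictured with a toy likelihood analysis: fit a single overlap distribution and a two-source mixture to hypothetical overlap counts and compare them by AIC. This is a schematic sketch with invented data, not the authors' models.

```python
# Schematic sketch (invented data, not the authors' models): does target/response
# overlap need one distribution or a mixture of two sources? Compare by AIC.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

overlap = np.array([5, 4, 6, 1, 0, 5, 3, 0, 6, 2])   # shared phonemes per response
n_phon = 6                                            # phonemes per target (toy value)

def nll_single(params):                               # one phonological source
    (p,) = params
    return -binom.logpmf(overlap, n_phon, p).sum()

def nll_mixture(params):                              # target-related + abstruse sources
    w, p1, p2 = params
    like = w * binom.pmf(overlap, n_phon, p1) + (1 - w) * binom.pmf(overlap, n_phon, p2)
    return -np.log(like).sum()

r1 = minimize(nll_single, [0.5], bounds=[(0.01, 0.99)])
r2 = minimize(nll_mixture, [0.5, 0.8, 0.1], bounds=[(0.01, 0.99)] * 3)
print("AIC single :", 2 * 1 + 2 * r1.fun)
print("AIC mixture:", 2 * 3 + 2 * r2.fun)
```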

Relevance: 100.00%

Abstract:

Identity influences the practice of English language teachers and supervisors, their professional development and their ability to incorporate innovation and change. Talk during post-observation feedback meetings provides participants with opportunities to articulate, construct, verify, contest and negotiate identities, processes which often engender issues of face. This study examines the construction and negotiation of identity and face in post-observation feedback meetings between in-service English language teachers and supervisors at a tertiary institution in the United Arab Emirates. Within a linguistic ethnography framework, the study combined linguistic microanalysis of audio-recorded feedback meetings with ethnographic data gathered from participant-researcher knowledge, pre-analysis interviews and post-analysis participant interpretation interviews. Through a detailed, empirical description of situated ‘real life’ institutional talk, the study shows that supervisors construct identities involving authority, power, expertise, knowledge and experience, while teachers index identities involving experience, knowledge and reflection. Alongside these positively valued identities, negative, disvalued identities are also constructed. Identities are shown to be discursively claimed, verified, contested and negotiated through linguistic actions. The study also shows a link between identity and face. Analysis demonstrates that identity claims verified by an interactional partner can lead to face maintenance or support, whereas a contested identity claim can lead to face threat, which is usually managed by facework. Face, like identity, is found to be interactionally achieved and endogenous to situated discourse. Teachers and supervisors frequently risk face threat to protect their own identities, to contest their interactional partner’s identities or to achieve the goal of the feedback meeting, i.e. improved teaching. Both identity and face are found to be consequential to feedback talk, and therefore influence teacher development, teacher/supervisor relationships and the acceptance of feedback. Analysis highlights the evaluative and conforming nature of feedback in this context, which may hinder opportunities for teacher development.

Relevance: 100.00%

Abstract:

Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safety and economics of wind energy utilization. In this paper, we investigate a combination of numeric and probabilistic models: one-day-ahead wind power forecasts were made with Gaussian Processes (GPs) applied to the outputs of a Numerical Weather Prediction (NWP) model. First, the wind speed data from the NWP model were corrected by a GP. Then, because the power generated by a wind turbine is limited by its control strategy, a Censored GP was used to model the relationship between the corrected wind speed and the power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and Artificial Neural Networks (ANNs); the proposed model achieves about an 11% improvement in forecasting accuracy (Mean Absolute Error) over the ANN model on one dataset, and nearly 5% on the other.
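
A rough sketch of this pipeline is given below, with two simplifications flagged loudly: the data are simulated, and the Censored GP stage is approximated by an explicit power curve clipped at rated power rather than a true censored likelihood. Constants such as the cut-in and rated speeds are assumptions.

```python
# Rough sketch of the assumed pipeline (simulated data; the Censored GP stage is
# approximated by a power curve clipped at rated power, not a censored likelihood).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

# Step 1: GP correction of NWP wind speed (hypothetical training pairs).
nwp_speed = rng.uniform(0, 25, 200)[:, None]
site_speed = 0.9 * nwp_speed.ravel() + 1.2 + rng.normal(0, 0.8, 200)
kernel = RBF(length_scale=5.0) + WhiteKernel(noise_level=0.5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(nwp_speed, site_speed)

# Step 2: corrected speed -> power, censored at the turbine's rated limit.
RATED_KW, CUT_IN, RATED_SPEED = 2000.0, 3.0, 12.0     # assumed turbine constants
def power_curve(v):
    frac = np.clip((v - CUT_IN) / (RATED_SPEED - CUT_IN), 0.0, 1.0)
    return RATED_KW * frac**3                          # cubic region, flat above rated

corrected = gp.predict(np.array([[8.0], [15.0]]))      # NWP forecasts for tomorrow
print(power_curve(corrected))
```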

Relevance: 100.00%

Abstract:

Constructing and executing distributed systems that can adapt to their operating context in order to sustain the services they provide, and the quality of those services, are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent and sometimes tangled with other services. Furthermore, the exponential growth in the number of potential system configurations, derived from the variabilities of each service, needs to be handled. The current practice of writing low-level reconfiguration scripts as part of the system code to handle run-time adaptation is both error-prone and time-consuming, and makes adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of constructing and executing adaptive systems, and to handle the exponential growth in the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants and limit the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
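
The models-at-runtime step can be pictured with a small sketch (illustrative only, not the authors' implementation): the adaptation logic is derived by diffing the current configuration model against the composed target model, instead of hand-writing reconfiguration scripts.

```python
# Illustrative sketch (not the authors' implementation): derive adaptation
# actions by diffing the current configuration model against the composed
# target model; all service names and variants are invented.
current_model = {"logging": "verbose", "cache": "local", "codec": "h264"}
target_model = {"logging": "minimal", "cache": "local", "codec": "vp9",
                "failover": "replica-2"}

def generate_adaptation(current, target):
    """Yield (action, service, variant) steps that move current toward target."""
    for service in current.keys() - target.keys():
        yield ("undeploy", service, current[service])
    for service, variant in target.items():
        if service not in current:
            yield ("deploy", service, variant)
        elif current[service] != variant:
            yield ("reconfigure", service, variant)

for step in generate_adaptation(current_model, target_model):
    print(step)
```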