11 results for Deterministic Trend.

in Aston University Research Archive


Relevance: 20.00%

Abstract:

We present in this paper ideas to tackle the problem of analysing and forecasting nonstationary time series within the financial domain. Accepting the stochastic nature of the underlying data generator, we assume that the evolution of the generator's parameters is restricted to a deterministic manifold. We therefore propose methods for determining the characteristics of the time-localised distribution. Starting with the assumption of a static normal distribution, we refine this hypothesis according to the empirical results obtained with these methods and conclude with an indication of dynamic non-Gaussian behaviour with varying dependency for the time series under consideration.
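A minimal sketch of the kind of time-localised analysis the abstract describes (illustrative only, not the authors' method): sliding-window moments can expose a drifting, non-Gaussian distribution in a nonstationary series.

```python
import numpy as np

def rolling_moments(x, window=50):
    """Estimate time-localised mean, std, and excess kurtosis over a
    sliding window -- a crude probe for drifting, non-Gaussian
    behaviour in a nonstationary series."""
    n = len(x) - window + 1
    out = np.empty((n, 3))
    for i in range(n):
        w = x[i:i + window]
        m, s = w.mean(), w.std()
        out[i] = (m, s, ((w - m) ** 4).mean() / s**4 - 3.0)
    return out  # columns: mean, std, excess kurtosis

# Synthetic nonstationary series: the variance drifts upward over time,
# so the rolling std should rise even though each window looks Gaussian.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
x = rng.normal(0.0, 1.0 + t, size=500)
stats = rolling_moments(x, window=50)
```

On this synthetic series the rolling standard deviation (second column) grows from roughly 1 toward 2, which is exactly the signature of a time-varying generator that a static-distribution hypothesis would miss.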

Relevance: 20.00%

Abstract:

Predicated on the assumption that employee careerist orientation, resulting from organizational actions to cut costs, constitutes a potential threat to organizations' long-term profitability and success, this study proposed and tested a social exchange model of careerist orientation in the People's Republic of China. Specifically, it was hypothesized that organizational justice and career growth opportunities would be related to careerist orientation, but that the relationship would be mediated by trust in the employer. Structural equation modeling results provided support for the model. Trust in the organization fully mediated the relationship between careerist orientation and its antecedents.

Relevance: 20.00%

Abstract:

For more than forty years, research has been ongoing in the use of the computer in the processing of natural language. During this period methods have evolved, with various parsing techniques and grammars coming to prominence. Problems still exist, not least in the field of Machine Translation. However, one of the successes in this field is the translation of sublanguage. The present work reports Deterministic Parsing, a relatively new parsing technique, and its application to the sublanguage of an aircraft maintenance manual for Machine Translation. The aim has been to investigate the practicability of using Deterministic Parsers in the analysis stage of a Machine Translation system. Machine Translation, sublanguage and parsing are described in general terms, with a review of Deterministic Parsing systems pertinent to this research presented in detail. The interaction between Machine Translation, sublanguage and parsing, including Deterministic Parsing, is also highlighted. Two types of Deterministic Parser have been investigated: a Marcus-type parser, based on the basic design of the original Deterministic Parser (Marcus, 1980), and an LR-type Deterministic Parser for natural language, based on the LR parsing algorithm. In total, four Deterministic Parsers have been built and are described in the thesis. Two of the Deterministic Parsers are prototypes from which the remaining two parsers, to be used on sublanguage, have been developed. This thesis reports the results of parsing by the prototypes, a Marcus-type parser and an LR-type parser, which have a similar grammatical and linguistic range to the original Marcus parser. The Marcus-type parser uses a grammar of production rules, whereas the LR-type parser employs a Definite Clause Grammar (DCG).
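The LR-type approach can be illustrated with a toy shift-reduce parser. The grammar and lexicon below are invented for illustration, modelled loosely on a maintenance-manual sublanguage, and are far simpler than the thesis parsers.

```python
# Toy shift-reduce (LR-style) parse for a tiny invented sublanguage:
#   S -> V NP ;  NP -> Det N | N
# Grammar and lexicon are hypothetical, for illustration only.
GRAMMAR = [("NP", ("Det", "N")), ("NP", ("N",)), ("S", ("V", "NP"))]
LEXICON = {"remove": "V", "check": "V", "the": "Det",
           "panel": "N", "bolt": "N"}

def parse(tokens):
    """Deterministic shift-reduce parse: shift each tagged token,
    then greedily reduce whenever the stack top matches a rule RHS."""
    stack = []
    for tok in tokens:
        stack.append((LEXICON[tok], tok))            # shift
        reduced = True
        while reduced:
            reduced = False
            for lhs, rhs in GRAMMAR:
                n = len(rhs)
                if tuple(cat for cat, _ in stack[-n:]) == rhs and len(stack) >= n:
                    children = stack[-n:]
                    del stack[-n:]
                    stack.append((lhs, children))    # reduce
                    reduced = True
                    break
    return stack

tree = parse("remove the panel".split())  # reduces to a single S node
```

A full LR parser would drive these shifts and reduces from a precomputed state table rather than by scanning the rule list, but the deterministic no-backtracking behaviour is the same.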

Relevance: 20.00%

Abstract:

The thesis investigates the properties of two trends or time series which formed a part of the co-citation bibliometric model "X-Ray Crystallography and Protein Determination in 1978, 1980 and 1982". This model was one of several created for the 1983 ABRC Science Policy Study, which aimed to test the utility of bibliometric models in a national science policy context. The outcome of the validation part of that study proved to be especially favourable concerning the utility of trend data, which purport to model the development of speciality areas in science over time. This assessment could have important implications for the use of such data in policy formulation. However, one possible problem with the Science Policy Study's conclusions was that insufficient time was available in the study for an in-depth analysis of the data. The thesis aims to continue the validation begun in the ABRC study by providing a detailed examination of the characteristics of the data contained in the trends numbered 11 and 44 in the model. A novel methodology for the analysis of the properties of the trends with respect to their literature content is presented. This is followed by an assessment, based on questionnaire and interview data, of the ability of Trend 44 to realistically model the historical development of the field of mobile genetic elements research over time, with respect to its scientific content and the activities of its community of researchers. The results of these various analyses are then used to evaluate the strengths and weaknesses of a trend or time series approach to the modelling of the activities of scientific fields. A critical evaluation of the origins of the discovered strengths and weaknesses in the assumptions underlying the techniques used to generate trends from co-citation data is provided. Possible improvements to the modelling techniques are discussed.

Relevance: 20.00%

Abstract:

Liberalisation has become an increasingly important policy trend, both in the private and public sectors of advanced industrial economies. This article eschews deterministic accounts of liberalisation by considering why government attempts to institute competition may be successful in some cases and not others. It considers the relative strength of explanations focusing on the institutional context, and on the volume and power of sectoral actors supporting liberalisation. These approaches are applied to two attempts to liberalise, one successful and one unsuccessful, within one sector in one nation – higher education in Britain. Each explanation is seen to have some explanatory power, but none is sufficient to explain why competition was generalised in the one case and not the other. The article counsels the need for scholars of liberalisation to be open to multiple explanations which may require the marshalling of multiple sources and types of evidence.

Relevance: 20.00%

Abstract:

Researchers express concern over a paucity of replications. In line with this, the editorial policies of some leading marketing journals now encourage more replications. This article reports on an extension of a 1994 study to see whether these efforts have had an effect on the number of replication studies published in leading marketing journals. Results show that the replication rate has fallen to 1.2%, half the earlier rate. As things now stand, practitioners should be skeptical about using the results published in marketing journals, as hardly any of them have been successfully replicated; teachers should ignore the findings until they receive support via replications; and researchers should put little stock in the outcomes of one-shot studies.

Relevance: 20.00%

Abstract:

As the number of firms using 3PL providers has increased rapidly in recent years, 3PL providers have come to play a major role in the logistics industry. Because customer demands are rising and changing, 3PL providers have been prompted to invest in IT systems that can meet customer requirements and create competitive advantage. The use of IT systems can help 3PL providers achieve supply chain visibility and enhance supply chain collaboration with business partners. This paper focuses on European and Far East 3PL providers in terms of their current and future IT systems, IT motivators and barriers, and the future supply chain demands addressed by IT systems. The IT systems commonly implemented in both regions are mainly used to collaborate and share information with supply chain partners. Common motivations and barriers exist in both regions and need to be understood by 3PL providers. Given the future demands of IT implementation and supply chain collaboration, IT systems such as RFID and integration systems are expected to be a strong focus in the future. Advanced integration systems such as business process management (BPM) could be the next key IT systems in the logistics industry. © 2012 AICIT.

Relevance: 20.00%

Abstract:

The body of work presented in this thesis is in three main parts: [1] the effect of ultrasound on freezing events in ionic systems, [2] the importance of formulation osmolality in freeze drying, and [3] a novel system for increasing the primary freeze drying rate. Chapter 4 briefly presents the work on method optimisation, which is still very much in its infancy.

Aspects of freezing such as nucleation and ice crystal growth are strongly related to ice crystal morphology; however, the ice nucleation process typically occurs in a random, non-deterministic and spontaneous manner. In view of this, ultrasound, an emerging application in pharmaceutical sciences, has been applied to accelerate nucleation and shorten the freezing process. The research presented in this thesis aimed to study the effect of sonication on nucleation events in ionic solutions and, more importantly, how sonication impacts the freezing process. This work confirmed that nucleation does occur in a random manner. It also showed that ultrasonication accelerates the ice nucleation process and increases the freezing rate of a solution.

Cryopreservation of animal sperm is an important aspect of breeding in animal science, especially for endangered species. For sperm cryopreservation to be successful, cryoprotectants as well as semen extenders are used. One of the factors allowing semen preservation media to be optimal is the osmolality of the semen extenders used. Although preservation of animal sperm has no relation to freeze drying of pharmaceuticals, it was used in this thesis to make a case for considering the osmolality of a formulation (prepared for freeze drying) as a factor conferring protein protection against the stresses of freeze drying. The osmolalities of some common solutes (mostly sugars) used in freeze drying were determined (molal concentrations from 0.1 m to 1.2 m). Preliminary investigations of the osmolality and osmotic coefficients of common solutes were carried out. It was observed that the osmotic coefficient trends for the sugars analysed could be grouped by sugar type. The trends observed show the need for further studies of osmolality and of how it may be of importance to protein or API protection during freeze drying processes.

Primary drying is usually the longest part of the freeze drying process, and primary drying times lasting days or even weeks are not uncommon; longer primary drying times lead to longer freeze drying cycles and, consequently, increased production costs. Much work has been done previously by others using different processes (such as annealing) to improve primary drying times; however, these do not come without drawbacks. A novel system involving the formation of a frozen vial system, which results in the creation of a void between the formulation and the inside wall of a vial, has been devised to increase the primary freeze drying rate of formulations without product damage. Although the work is far from complete, it has been shown that it is possible to improve and increase the primary drying rate of formulations without making any modifications to existing formulations, changing storage vials, or increasing the surface area of freeze dryer shelves.

Relevance: 20.00%

Abstract:

Purpose - This research note aims to present a summary of research concerning the economic lot scheduling problem (ELSP). Design/methodology/approach - The paper's approach is to review over 100 selected studies published in the last 15 years (1997-2012), which are then grouped under different research themes. Findings - Five research themes are identified and insights for future studies are reported at the end of this paper. Research limitations/implications - The motivation for preparing this research note is to summarize key research studies in this field since 1997, when the ELSP was verified as NP-hard. Originality/value - The ELSP is an important scheduling problem that has been studied since the 1950s. Because of the complexity of delivering a feasible analytical closed-form solution, many studies in the last two decades employed heuristic algorithms in order to come up with good and acceptable solutions. As a consequence, the solution approaches are quite diversified. The major contribution of this paper is to provide researchers who are interested in this area with a quick reference guide to the reviewed studies. © Emerald Group Publishing Limited.
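One classical simplification that recurs in this literature is the common cycle solution, in which every product is produced exactly once per cycle of length T shared by all products. A minimal sketch (the parameter values are illustrative, not from any reviewed study):

```python
import math

def common_cycle(products):
    """Common cycle heuristic for the ELSP.

    products: list of dicts with demand rate d, production rate p,
    setup cost A, and holding cost h (per unit per period).
    Returns the cost-minimising common cycle length T and the
    machine utilisation (which must stay below 1 for feasibility;
    setup times are omitted here for brevity)."""
    num = 2 * sum(prod["A"] for prod in products)
    den = sum(prod["h"] * prod["d"] * (1 - prod["d"] / prod["p"])
              for prod in products)
    T = math.sqrt(num / den)
    util = sum(prod["d"] / prod["p"] for prod in products)
    return T, util

# Two hypothetical products sharing one machine.
products = [
    {"d": 400, "p": 2000, "A": 50.0, "h": 2.0},
    {"d": 300, "p": 1500, "A": 80.0, "h": 1.5},
]
T, util = common_cycle(products)  # here util = 0.4 < 1, so feasible
```

The common cycle solution trades optimality for tractability, which is exactly why later work, once the problem was shown NP-hard, turned to heuristics such as the basic-period and time-varying-lot-size approaches.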

Relevance: 20.00%

Abstract:

For an erbium-doped fiber laser mode-locked by carbon nanotubes, we demonstrate experimentally and theoretically a new type of vector rogue wave emerging as a result of the chaotic evolution of trajectories between two orthogonal states of polarization on the Poincaré sphere. In terms of fluctuation-induced phenomena, by tuning the polarization controller for the pump wave and the in-cavity polarization controller, we are able to control the Kramers time, i.e. the residence time of the trajectory in the vicinity of each orthogonal state of polarization, and so can induce rare events that satisfy rogue wave criteria and take the form of transitions from the state with a long residence time to the state with a short residence time.
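The Kramers-time picture can be illustrated with a generic bistable system (a hedged sketch, not the laser model itself): an overdamped particle in a double-well potential hops between the two wells, and weaker noise yields longer residence times and rarer transitions, analogous to how the polarization controllers tune the residence times near each polarization state.

```python
import numpy as np

def residence_times(sigma, n_steps=200_000, dt=1e-2, seed=1):
    """Euler-Maruyama simulation of dx = (x - x^3) dt + sigma dW,
    a double-well potential V(x) = x^4/4 - x^2/2 with wells at
    x = +1 and x = -1.  Records the residence time in each well
    between successive hops (the empirical Kramers times)."""
    rng = np.random.default_rng(seed)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
    x, well, t_enter, times = 1.0, 1, 0.0, []
    for k in range(n_steps):
        x += (x - x**3) * dt + noise[k]
        if well * x < -0.5:          # crossed into the other well
            times.append(k * dt - t_enter)
            well, t_enter = -well, k * dt
    return np.array(times)

short = residence_times(sigma=0.8)   # strong noise: frequent hops
long_ = residence_times(sigma=0.4)   # weak noise: rare transitions
```

By Kramers' rate theory the mean residence time grows exponentially as the noise strength falls, so the weak-noise run produces far fewer, longer-lived episodes, mirroring the rare transitions from the long-residence state described in the abstract.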

Relevance: 20.00%

Abstract:

This paper demonstrates a mechanism whereby rules can be extracted from a feedforward neural network trained to characterize the inflation "pass-through" problem in American monetary policy, defined as the relationship between changes in the growth rate(s) of individual commodities and the economy-wide rate of growth of consumer prices. Monthly price data are encoded and used to train a group of candidate connectionist architectures. One candidate is selected for rule extraction, using a custom decompositional extraction algorithm that generates rules in human-readable and machine-executable form. Rule and network accuracy are compared, and comments are made on the relationships expressed within the discovered rules. The types of discovered relationships could be used to guide monetary policy decisions.
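A hedged sketch of decompositional rule extraction in general (the paper's custom algorithm is not specified here, and the weights and input names below are hypothetical): rules are read off each unit's incoming weights by finding the smallest set of strongest positive inputs whose combined weight clears the unit's bias.

```python
import numpy as np

def extract_rules(W, b, names, margin=0.0):
    """Toy decompositional extraction for threshold-like units.

    W: (hidden, inputs) weight matrix; b: hidden biases;
    names: human-readable names for the binary input features.
    For each unit, greedily collect the strongest positive weights
    until their sum exceeds -bias, then emit an IF-THEN rule."""
    rules = []
    for h in range(W.shape[0]):
        order = np.argsort(-W[h])          # strongest weights first
        total, antecedents = 0.0, []
        for i in order:
            if W[h, i] <= 0 or total > -b[h] + margin:
                break
            antecedents.append(names[i])
            total += W[h, i]
        if antecedents and total > -b[h] + margin:
            rules.append(f"IF {' AND '.join(antecedents)} THEN unit{h} fires")
    return rules

# Hypothetical trained weights: unit 0 needs both energy and food
# inflation to be high; unit 1 fires on housing inflation alone.
W = np.array([[1.2, 1.0, 0.1],
              [0.2, 0.1, 2.0]])
b = np.array([-1.5, -1.0])
rules = extract_rules(W, b, ["energy_high", "food_high", "housing_high"])
```

The appeal of decompositional methods, as the abstract notes, is that the extracted rules are both human-readable and machine-executable, so their accuracy can be compared directly against the network they were mined from.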