887 results for full-scale testing
Abstract:
This thesis presents research within empirical financial economics, focusing on liquidity and full-scale optimisation (FSO) of portfolios in the stock market. The discussion of liquidity centres on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors. Furthermore, a framework for treating the two topics in combination is provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed to apply TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity covariances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns, as compared with static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when using inventory cost as the liquidity measure and moving-window PCA as the systematic liquidity derivation technique. Systematic factors of this setting also have a strong ability to explain cross-sectional liquidity variation. Portfolio optimisation in the FSO framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution.
The studies show that relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by application of differential evolution.
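As a rough illustration of how a full-scale optimum can be located by differential evolution, the sketch below maximises expected power (CRRA) utility over an empirical scenario matrix. The return data, risk-aversion level, and long-only weight handling are assumptions made for the example, not details taken from the studies.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
# Synthetic daily return scenarios for 5 assets (illustrative only)
R = rng.normal(0.0005, 0.01, size=(1000, 5))

def neg_expected_utility(w, returns, gamma=5.0):
    w = w / w.sum()                          # normalise to fully invested weights
    wealth = 1.0 + returns @ w
    # Power (CRRA) utility, averaged over the empirical scenarios; negated
    # because differential_evolution minimises.
    return -np.mean(wealth ** (1 - gamma) / (1 - gamma))

result = differential_evolution(
    neg_expected_utility, bounds=[(0.01, 1.0)] * 5, args=(R,), seed=2
)
weights = result.x / result.x.sum()
print(weights.round(3))
```

Unlike mean-variance optimisation, nothing here restricts the utility function to quadratic form; any specification computable per scenario can be dropped into `neg_expected_utility`.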
Abstract:
This volume brings together French and British scholars of France to analyse one of the most intellectually compelling phenomena in French politics: the presidency of the Republic. It examines the strengths and weaknesses of that leadership as well as the way executive power has been established in the Fifth Republic, and how presidential power and the subsequent full-scale development of 'personality politics' emerged within an essentially party-driven, democratic and, most importantly, republican system. Hence the authors in this volume examine the phenomenon of a strong presidency within the French republican framework. The individual chapters focus on the presidency and on the individual presidents, and on the way in which they have addressed their own relation to the presidencies they presided over, alongside a range of other factors informing their terms of office. A conclusion sums up and appraises the contemporary role of the French presidency within the party system and the republic. The project has generated a great deal of interest in the French political studies community.
Abstract:
Previous research into formulaic language has focussed on specialised groups of people (e.g. L1 acquisition by infants and adult L2 acquisition) with ordinary adult native speakers of English receiving less attention. Additionally, whilst some features of formulaic language have been used as evidence of authorship (e.g. the Unabomber’s use of you can’t eat your cake and have it too) there has been no systematic investigation into this as a potential marker of authorship. This thesis reports the first full-scale study into the use of formulaic sequences by individual authors. The theory of formulaic language hypothesises that formulaic sequences contained in the mental lexicon are shaped by experience combined with what each individual has found to be communicatively effective. Each author’s repertoire of formulaic sequences should therefore differ. To test this assertion, three automated approaches to the identification of formulaic sequences are tested on a specially constructed corpus containing 100 short narratives. The first approach explores a limited subset of formulaic sequences using recurrence across a series of texts as the criterion for identification. The second approach focuses on a word which frequently occurs as part of formulaic sequences and also investigates alternative non-formulaic realisations of the same semantic content. Finally, a reference list approach is used. Whilst claiming authority for any reference list can be difficult, the proposed method utilises internet examples derived from lists prepared by others, a procedure which, it is argued, is akin to asking large groups of judges to reach consensus about what is formulaic. The empirical evidence supports the notion that formulaic sequences have potential as a marker of authorship since in some cases a Questioned Document was correctly attributed. 
Although this marker of authorship is not universally applicable, it does promise to become a viable new tool in the forensic linguist’s tool-kit.
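The recurrence criterion of the first approach can be sketched in a few lines: word n-grams are treated as candidate formulaic sequences when they recur across several texts by the same author, and two authors' repertoires can then be compared. The texts, n-gram length, and threshold below are toy assumptions, not the thesis corpus or its exact procedure.

```python
from collections import Counter

def ngrams(text, n=3):
    """Set of word n-grams in a text (a set, so we count document frequency)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def recurrent_sequences(texts, n=3, min_texts=2):
    """N-grams recurring across at least `min_texts` of one author's texts."""
    doc_freq = Counter()
    for t in texts:
        doc_freq.update(ngrams(t, n))
    return {g for g, c in doc_freq.items() if c >= min_texts}

# Toy narratives for two hypothetical authors (not the thesis corpus)
author_a = ["at the end of the day it was fine",
            "at the end of the day we went home"]
author_b = ["to cut a long story short it was fine",
            "to cut a long story short we left"]

rep_a = recurrent_sequences(author_a)
rep_b = recurrent_sequences(author_b)
print(sorted(rep_a))   # author A's recurrent (candidate formulaic) trigrams
```

On this toy data the two repertoires are disjoint, which is the intuition behind using formulaic sequences as a marker of authorship: attribution then amounts to asking which known repertoire a questioned document's sequences overlap with most.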
Abstract:
This article examines the development and impact of German citizenship policy over the past decade. As its point of departure, it takes the 2000 Citizenship Law, which sought to undertake a full-scale reform and liberalisation of access to German membership. The article discusses this law's content and subsequent amendments, focusing particularly on its quantitative impact and asking why the number of naturalisations has been lower than originally expected. The article outlines current challenges to the law's structure and operation and identifies potential trajectories for its future development.
A conceptual framework for supply chain collaboration: empirical evidence from the agri-food industry
Abstract:
Purpose - The purpose of this paper is to analyse the concept of supply chain collaboration and to provide an overall framework that can be used as a conceptual landmark for further empirical research. In addition, the concept is explored in the context of the agri-food industry and its particularities are identified. Finally, the paper submits empirical evidence from an exploratory case study in the agri-food industry, at the grower-processor interface, and presents information regarding the way the concept is actually applied in small and medium-sized enterprises (SMEs). Design/methodology/approach - The paper employed case study research, conducting in-depth interviews in two companies. Findings - The supply chain collaboration concept is of significant importance for the agri-food industry; however, some constraints arise due to the nature of the industry's products and the specific structure of the sector. Subsequently, collaboration in the supply chain is often limited to operational issues and logistics-related activities. Research limitations/implications - Research is limited to a single case study, and further qualitative testing of the conceptual model is needed in order to adjust the model before large-scale testing. Practical implications - Case study findings may be transferable to other similar dual relationships at the grower-processor interface. Weaker parties in asymmetric relationships have opportunities to improve their position, altering the dependence balance, by achieving product/process excellence. Originality/value - The paper provides evidence regarding the applicability of the supply chain collaboration concept in the agri-food industry. It takes into consideration not relationships between big multinational companies, but SMEs. © Emerald Group Publishing Limited.
Abstract:
Detection thresholds for two visual- and two auditory-processing tasks were obtained for 73 children and young adults who varied broadly in reading ability. A reading-disabled subgroup had significantly higher thresholds than a normal-reading subgroup for the auditory tasks only. When analyzed across the whole group, the auditory tasks and one of the visual tasks, coherent motion detection, were significantly related to word reading. These effects were largely independent of ADHD ratings; however, none of these measures accounted for significant variance in word reading after controlling for full-scale IQ. In contrast, phoneme awareness, rapid naming, and nonword repetition each explained substantial, significant word reading variance after controlling for IQ, suggesting more specific roles for these oral language skills in the development of word reading. © 2004 Elsevier Inc. All rights reserved.
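The analysis pattern reported here, i.e. testing whether a predictor explains word-reading variance after controlling for full-scale IQ, amounts to comparing R² between nested regressions. A minimal sketch with synthetic data follows; the variable names and effect sizes are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200

# Synthetic illustration: word reading driven by IQ and phoneme awareness,
# with motion detection related to reading only through IQ.
iq = rng.standard_normal(n)
phoneme = 0.5 * iq + rng.standard_normal(n)
motion = 0.6 * iq + rng.standard_normal(n)
reading = 0.5 * iq + 0.5 * phoneme + rng.standard_normal(n)

def r2(y, *predictors):
    """R-squared of an OLS fit of y on an intercept plus the predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - resid.var() / y.var()

# Incremental R^2: variance explained beyond full-scale IQ alone
inc_motion = r2(reading, iq, motion) - r2(reading, iq)
inc_phoneme = r2(reading, iq, phoneme) - r2(reading, iq)
print(f"motion beyond IQ:  {inc_motion:.3f}")
print(f"phoneme beyond IQ: {inc_phoneme:.3f}")
```

In this synthetic setup the phoneme-awareness increment is substantial while the motion increment is negligible, mirroring the qualitative pattern the abstract describes.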
Abstract:
The growing complexity and functional importance of integrated navigation systems (INS) lead to high losses when equipment fails. This paper is devoted to the development of an INS diagnosis system that makes it possible to identify the cause of a malfunction. The proposed solutions permit taking into account any changes in sensor dynamic and accuracy characteristics by means of the appropriate error-model coefficients. Under actual conditions of INS operation, the determination of current values of the sensor model and estimation filter parameters relies on identification procedures. The results of full-scale experiments are given, which corroborate the expediency of parametric identification of INS error models during bench testing.
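A minimal sketch of what parametric identification of an error model from bench-test data can look like, under an assumed bias-plus-scale-factor sensor error model (the model form, coefficients, and noise level are hypothetical, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical bench test: sensor error = bias + scale-factor error * true rate
# + noise.  The true coefficients are what identification must recover.
true_bias, true_scale = 0.02, 0.001
rate = np.linspace(-10, 10, 200)                     # commanded turntable rates
error = true_bias + true_scale * rate + 0.005 * rng.standard_normal(200)

# Parametric identification of the error-model coefficients by least squares
A = np.column_stack([np.ones_like(rate), rate])
(bias_hat, scale_hat), *_ = np.linalg.lstsq(A, error, rcond=None)
print(round(bias_hat, 3), round(scale_hat, 4))
```

The identified coefficients would then be fed into the estimation filter's error model, which is the role the abstract assigns to the identification procedures.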
Abstract:
Clogging is the main operational problem associated with horizontal subsurface flow constructed wetlands (HSSF CWs). The measurement of saturated hydraulic conductivity has proven to be a suitable technique for assessing clogging within HSSF CWs. The vertical and horizontal distribution of hydraulic conductivity was assessed in two full-scale HSSF CWs by using two different in situ permeameter methods (the falling head (FH) and constant head (CH) methods). Horizontal hydraulic conductivity profiles showed that both methods are correlated by a power function (FH = CH^0.7821, r^2 = 0.76) within the recorded range of hydraulic conductivities (0-70 m/day). However, the FH method provided lower values of hydraulic conductivity than the CH method (one to three times lower). Despite discrepancies between the magnitudes of reported readings, the relative distribution of clogging obtained via both methods was similar. Therefore, both methods are useful for exploring the general distribution of clogging and, especially, for assessing clogged areas originating from preferential flow paths within full-scale HSSF CWs. Discrepancies between the methods (in both magnitude and pattern) arose from the vertical hydraulic conductivity profiles under highly clogged conditions. It is believed this can be attributed to procedural differences between the methods, such as the method of permeameter insertion (twisting versus hammering). Results from both methods suggest that clogging develops along the shortest distance between water input and output. Results also show that the design and maintenance of inlet distributors and outlet collectors appear to have a great influence on the pattern of clogging, and hence on the asset lifetime of HSSF CWs. © Springer Science+Business Media B.V. 2011.
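The reported power function gives a direct way to convert a constant-head reading into its approximate falling-head equivalent within the fitted range. A small sketch (units in m/day; the function name and range check are ours, only the exponent comes from the abstract):

```python
# Empirical power-law relation between the two permeameter methods reported
# above (fitted for 0-70 m/day): FH = CH ** 0.7821, r^2 = 0.76.
def fh_from_ch(ch_m_per_day):
    """Approximate falling-head reading implied by a constant-head reading."""
    if not 0 <= ch_m_per_day <= 70:
        raise ValueError("relation fitted only for 0-70 m/day")
    return ch_m_per_day ** 0.7821

for ch in (5, 20, 60):
    print(f"CH = {ch:2d} m/day  ->  FH ~ {fh_from_ch(ch):.1f} m/day")
```

Because the exponent is below one, the implied FH value is always below the CH value for readings above 1 m/day, consistent with the abstract's observation that the FH method reads one to three times lower.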
Abstract:
Background: Qualitative research has suggested that spousal carers of someone with dementia differ in terms of whether they perceive their relationship with that person as continuous with the premorbid relationship or as radically different, and that a perception of continuity may be associated with more person-centered care and the experience of fewer of the negative emotions associated with caring. The aim of the study was to develop and evaluate a quantitative measure of the extent to which spousal carers perceive the relationship to be continuous. Methods: An initial pool of 42 questionnaire items was generated on the basis of the qualitative research about relationship continuity. These were completed by 51 spousal carers and item analysis was used to reduce the pool to 23 items. The retained items, comprising five subscales, were then administered to a second sample of 84 spousal carers, and the questionnaire's reliability, discriminative power, and validity were evaluated. Results: The questionnaire showed good reliability: Cronbach's α for the full scale was 0.947, and test-retest reliability was 0.932. Ferguson's δ was 0.987, indicating good discriminative power. Evidence of construct validity was provided by predicted patterns of subscale correlations with the Closeness and Conflict Scale and the Marwit-Meuser Caregiver Grief Inventory. Conclusion: Initial psychometric evaluation of the measure was encouraging. The measure provides a quantitative means of investigating ideas from qualitative research about the role of relationship continuity in influencing how spousal carers provide care and how they react emotionally to their caring role. © 2012 International Psychogeriatric Association.
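Cronbach's α, the reliability statistic reported for the full scale above, is straightforward to compute from an item-score matrix. A sketch with invented scores (not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()    # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Illustrative data: 6 respondents x 4 questionnaire items on a 1-5 scale
scores = np.array([
    [4, 4, 5, 4],
    [3, 3, 3, 2],
    [5, 4, 5, 5],
    [2, 2, 1, 2],
    [4, 5, 4, 4],
    [1, 2, 2, 1],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

When items move together, as in this toy matrix, the total-score variance dwarfs the sum of item variances and α approaches 1, which is what a value like the study's 0.947 indicates about the 23-item scale.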
Abstract:
A method of accurately controlling the position of a mobile robot using an external large volume metrology (LVM) instrument is presented in this article. By utilising an LVM instrument such as a laser tracker or indoor GPS (iGPS) in mobile robot navigation, many of the most difficult problems in mobile robot navigation can be simplified or avoided. Using the real-time position information from the laser tracker, a very simple navigation algorithm, and a low-cost robot, 5 mm repeatability was achieved over a volume of 30 m radius. A surface digitisation scan of a wind turbine blade section was also demonstrated, illustrating possible applications of the method for manufacturing processes. Further, iGPS guidance of a small KUKA omni-directional robot has been demonstrated, and a full-scale prototype system is being developed in cooperation with KUKA Robotics, UK. © 2011 Taylor & Francis.
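The navigation idea, closing a simple control loop on the external instrument's position measurement rather than on odometry, can be sketched as a proportional controller. The gain, noise level, and geometry below are assumptions for illustration, not the article's parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
target = np.array([2.0, 1.0])        # target position in metres
position = np.array([0.0, 0.0])      # true robot position (unknown to robot)
gain, dt = 0.8, 0.1                  # assumed controller gain and time step

for _ in range(200):
    # The robot only sees the external tracker's measurement (~1 mm noise)
    measured = position + rng.normal(0, 0.001, 2)
    velocity = gain * (target - measured)     # command from external feedback
    position = position + velocity * dt

error_mm = 1000 * np.linalg.norm(target - position)
print(f"final error: {error_mm:.2f} mm")
```

Because the feedback comes from the tracker, drift in the robot's own dead reckoning never enters the loop; the steady-state error is set mainly by the instrument's measurement noise.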
Abstract:
The tobacco industry's future depends on increasing tobacco use in low-income and middle-income countries (LMICs), which face a growing burden of tobacco-related disease yet have the potential to prevent full-scale escalation of this epidemic. To drive up sales, the industry markets its products heavily, deliberately targeting non-smokers, and keeps prices low until smoking and local economies are sufficiently established to drive prices and profits up. The industry systematically flouts existing tobacco control legislation and works aggressively to prevent future policies, using its resource advantage to present highly misleading economic arguments, to rebrand political activities as corporate social responsibility, and to establish and use third parties to make its arguments more palatable. Increasingly, it is using domestic litigation and international arbitration to bully LMICs out of implementing effective policies, and it is hijacking the problem of tobacco smuggling for policy gain, attempting to put itself in control of an illegal trade in which there is overwhelming historical evidence of its complicity. Progress will not be realised until tobacco industry interference is actively addressed, as outlined in Article 5.3 of the Framework Convention on Tobacco Control. Exemplar LMICs show that this can be achieved and indicate that exposing tobacco industry misconduct is an essential first step.
Abstract:
This article considers North Korea and the notion of crisis by linking historical developments on the Korean peninsula to the conflict resolution literature, and investigates why, despite a large number of destabilising events, a war involving Pyongyang has yet to erupt. The paper considers historical data and uses a framework developed by Aggarwal et al. to highlight patterns of interaction between states such as the United States, North Korea and South Korea, organisations such as the United Nations, and processes such as the Six-Party Talks and the Agreed Framework. The paper then develops a crisis framework based on the conflict resolution and negotiation literature and applies it to three North Korean administrations. Findings suggest that an elastic understanding of time (for all parties involved on the peninsula) makes it impossible to reach the threshold at which full-scale war would be triggered, thus leaving the parties in a stable state of crisis in which escalating moves and de-escalating techniques may become irrelevant.
Abstract:
The paper aims to give an insight into the relations between the economic and political systems of the Central Asian republics, using the theoretical framework of the "rentier economy" and "rentier state" approach. The main findings are that two of the five states examined (Kazakhstan and Turkmenistan) are commodity-export-dependent "full-scale" rentier states, and their political systems have a stable neo-patrimonial regime character, while the Kyrgyz Republic and Tajikistan, poor in natural resources but dependent on external rents, may be described as "semi-rentier" states or "rentier economies". The latter are politically less stable, but have an altogether authoritarian, oligarchical, "clan-based" character. Uzbekistan, with its closed economy showing tendencies towards economic autarchy, is also a potentially politically unstable clan-based regime. Thus, in the Central Asian context, the rentier state or rentier economy character affects the political stability of the actual regimes rather than having a direct impact on whether power is exercised in an autocratic or democratic way.
Abstract:
Hazardous radioactive liquid waste is the legacy of more than 50 years of plutonium production associated with the United States' nuclear weapons program. It is estimated that more than 245,000 tons of nitrate wastes are stored at facilities such as the single-shell tanks (SST) at the Hanford Site in the state of Washington, and the Melton Valley storage tanks at Oak Ridge National Laboratory (ORNL) in Tennessee. In order to develop an innovative new technology for the destruction and immobilization of nitrate-based radioactive liquid waste, the United States Department of Energy (DOE) initiated the research project which resulted in the technology known as the Nitrate to Ammonia and Ceramic (NAC) process. However, inasmuch as the nitrate anion is highly mobile and difficult to immobilize, especially in the relatively porous cement-based grout which has been used to date as a method for the immobilization of liquid waste, it presents a major obstacle to environmental clean-up initiatives. Thus, in an effort to contribute to the existing body of knowledge and enhance the efficacy of the NAC process, this research involved the experimental measurement of the rheological and heat transfer behaviors of the NAC product slurry and the determination of the optimal operating parameters for the continuous NAC chemical reaction process. Test results indicate that the NAC product slurry exhibits typical non-Newtonian flow behavior. Correlation equations for the slurry's rheological properties and heat transfer rate in pipe flow have been developed; these should prove valuable in the design of a full-scale NAC processing plant. The 20-percent slurry exhibited typical dilatant (shear-thickening) behavior and was in the turbulent flow regime due to its lower viscosity. The 40-percent slurry exhibited typical pseudoplastic (shear-thinning) behavior and remained in the laminar flow regime throughout its experimental range.
The reactions were found to be more efficient in the lower temperature range investigated. With respect to leachability, the experimental final NAC ceramic waste form is comparable to the final product of vitrification, the technology chosen by DOE to treat these wastes. As the NAC process has the potential to reduce the volume of nitrate-based radioactive liquid waste by as much as 70 percent, it not only promises to enhance environmental remediation efforts but also to effect substantial cost savings.
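The two flow behaviours can be illustrated with the standard power-law (Ostwald-de Waele) viscosity model, in which the flow-behaviour index n distinguishes shear thickening from shear thinning. The K and n values below are illustrative assumptions, not the measured correlations from this research.

```python
# Power-law model: apparent viscosity mu = K * (shear rate)^(n - 1).
# n > 1: dilatant (shear thickening), like the 20-percent slurry;
# n < 1: pseudoplastic (shear thinning), like the 40-percent slurry.
def apparent_viscosity(shear_rate, K, n):
    return K * shear_rate ** (n - 1)

shear_rates = (1, 10, 100)                # 1/s, illustrative sweep
dilatant = [apparent_viscosity(g, K=0.05, n=1.3) for g in shear_rates]
pseudoplastic = [apparent_viscosity(g, K=5.0, n=0.6) for g in shear_rates]
print(dilatant)        # viscosity rises with shear rate
print(pseudoplastic)   # viscosity falls with shear rate
```

The shear-thinning case also shows why the 40-percent slurry could stay laminar: its apparent viscosity remains high at the low shear rates of laminar pipe flow.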
Abstract:
In a previous issue, Dr. Alan J. Parker presented a case for the smart utilization of microcomputers in the hospitality industry. But what should today's hotel managers look for when adopting a full-scale hotel computer system? This article attempts to aid the hotelier by compiling a series of functions which management should expect from any system chosen.