70 results for Contracts of execution
Abstract:
Price movements in many commodity markets exhibit significant seasonal patterns. However, given an observed futures price, a deterministic seasonal component at the price level is not relevant for the pricing of commodity options. In contrast, this is not true for the seasonal pattern observed in the volatility of the commodity price. Analyzing an extensive sample of soybean, corn, heating oil and natural gas options, we find that seasonality in volatility is an important aspect to consider when valuing these contracts. The inclusion of an appropriate seasonality adjustment significantly reduces pricing errors in these markets and yields more improvement in valuation accuracy than increasing the number of stochastic factors.
Abstract:
Purpose – The purpose of this paper is to demonstrate how key strategic decisions are made in practice at successful FTSE 100 companies. Design/methodology/approach – The paper is based on a semi-structured interview with Ms Cynthia Carroll, Chief Executive of Anglo American plc. Findings – The interview outlines a number of important factors concerning the evolution of strategy within Anglo American, strategy execution, leadership at board and executive levels, and the capture of synergies within the company. Originality/value – The paper bridges the gap between theory and practice. It provides a practical view and demonstrates how corporate leaders think about key strategic issues.
Abstract:
Temporary work has expanded in the last three decades, with adverse implications for inequalities. Because temporary workers are a constituency that is unlikely to impose political costs, governments often choose to reduce temporary work regulations. While most European countries have indeed implemented such reforms, France went in the opposite direction, despite having both rigid labour markets and high unemployment. My argument to solve this puzzle is that where replaceability is high, workers in permanent and temporary contracts have overlapping interests, and governments choose to regulate temporary work to protect permanent workers. In turn, replaceability is higher where permanent workers' skills are general and wage coordination is low. Logistic regression analysis of the determinants of replaceability, and of how these affect governments' reforms of temporary work regulations, supports my argument. Process tracing of French reforms also confirms that the left has tightened temporary work regulations to compensate for high replaceability.
Abstract:
We discuss public policy towards vertical relations, comparing different types of contracts between a manufacturer and a maximum of two retailers. Together with (potential) price competition between the retailers, we study the role of a (sunk) differentiation cost paid by them in order to relax competition in the retail market and broaden the market potential of the distributed product. This non-price competition element in the downstream market is responsible for our conclusion that, unlike in standard policy guidelines and previous theoretical analysis, restrictions on intra-brand competition may deserve a permissive treatment even in the absence of inter-brand competition, if retailer differentiation is costly.
Abstract:
Mathematics in Defence 2011 Abstract. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers, encoded by floating-point bits, is doubled when all of the Not-a-Number (NaN) states in IEEE 754 arithmetic are replaced with real numbers. The task of programming such systems is simplified and made safer by discarding the unordered relational operator, leaving only the operators less-than, equal-to, and greater-than. The advantages of using a transarithmetic in a computation, or transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with on the order of one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
Abstract:
This is a fully revised edition of the UK’s leading textbook on the law governing construction contracts and the management and administration of those contracts. Although the legal principles involved are an aspect of general contract law, the practical and commercial complexities of the construction industry have increasingly made this a specialist area. This new edition has been brought up to date with recent cases and developments in the law as it stands at March 2007. The basic approach of the book has been retained. Rather than provide a commentary on standard-form contracts, our approach is to introduce the general principles that underlie contracts in construction, illustrating them by reference to the most important standard forms currently in use. Some of the common standard-form contracts have been revised since the previous edition, and the text has been revised to take account of these changes. Practitioners (consultants, builders, clients and lawyers) will find this an extremely useful source of reference, providing in-depth explanations for all of the features found in contemporary construction contracts, with reasons. A unique feature of this book is the way that it brings together the relevant principles of law with the practical issues arising in construction cases. It is a key text for construction undergraduates and postgraduates as well as for those taking the RIBA Part III and CIOB Part II examinations.
Abstract:
This paper presents and implements a number of tests for non-linear dependence and a test for chaos using transactions prices on three LIFFE futures contracts: the Short Sterling interest rate contract, the Long Gilt government bond contract, and the FTSE 100 stock index futures contract. While previous studies of high frequency futures market data use only those transactions which involve a price change, we use all of the transaction prices on these contracts whether they involve a price change or not. Our results provide irrefutable evidence of non-linearity in two of the three contracts, although we find no evidence of a chaotic process in any of the series. We are also able to provide some indications of the effect of the duration of the trading day on the degree of non-linearity of the underlying contract. The trading day for the Long Gilt contract was extended in August 1994, and prior to this date there is no evidence of any structure in the return series. However, after the extension of the trading day we do find evidence of a non-linear return structure.
Abstract:
Enterprise Architecture (EA) has been recognised as an important tool in modern business management for closing the gap between strategy and its execution. The current literature implies that for EA to be successful, it should have clearly defined goals. However, the goals of different stakeholders are found to be different, even contradictory. In our explorative research, we seek answers to the questions: What kinds of goals are set for EA implementation? How do the goals evolve over time? Are the goals different among stakeholders? How do they affect the success of EA? We analysed an EA pilot conducted among eleven Finnish Higher Education Institutions (HEIs) in 2011. The goals of the pilot were gathered at three different stages: before the pilot (from the project plan), during the pilot (from interviews), and after the pilot (from a questionnaire). The data were analysed using qualitative and quantitative methods. Eight distinct goals were recognised through coding: Adopt EA Method, Build Information Systems, Business Development, Improve Reporting, Process Improvement, Quality Assurance, Reduce Complexity, and Understand the Big Picture. The success of the pilot was analysed statistically on a 1-5 scale. Results revealed that the goals set before the pilot were very different from those mentioned during or after the pilot. Goals before the pilot were mostly related to its expected benefits, whereas the most important result was the adoption of the EA method. These results can be explained by the possibly different roles of the respondents, which in turn were most likely caused by poor communication. Interestingly, goals mentioned by different stakeholders were not limited to their traditional areas of responsibility. For example, in some cases Chief Information Officers' goals were Quality Assurance and Process Improvement, whereas managers' goals were Build Information Systems and Adopt EA Method.
This could result either from a good understanding of the meaning of EA, or from stakeholders not regarding EA as their concern at all. It is also interesting to note that, regardless of the different perceptions of goals among stakeholders, all HEIs felt the pilot was successful. Thus the research does not provide support for a link between clear goals and success.
Abstract:
This paper presents a software-based study of a hardware-based non-sorting median calculation method on a set of integer numbers. The method divides the binary representation of each integer element in the set into bit slices in order to find the element located in the middle position. The method exhibits a linear complexity order, and our analysis shows that the best performance in execution time is obtained when 4-bit slices are used for 8-bit and 16-bit integers, for almost any data set size. Results suggest that a software implementation of the bit-slice method for median calculation outperforms sorting-based methods, with the improvement increasing with data set size. For data set sizes of N > 5, our simulations show an improvement of at least 40%.
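The non-sorting, bit-slice selection described in this abstract can be sketched in software roughly as follows. This is a speculative reconstruction, not the paper's implementation: the function and parameter names, and the exact bucketing strategy, are our own assumptions; only the idea of scanning fixed-width bit slices from most- to least-significant comes from the abstract.

```python
def bitslice_median(values, total_bits=16, slice_bits=4):
    """Median (lower median for even-sized sets) of non-negative
    integers, found without sorting by scanning fixed-width bit
    slices from most- to least-significant."""
    k = (len(values) - 1) // 2        # position of the lower median
    candidates = list(values)
    shift = total_bits
    while shift > 0 and len(candidates) > 1:
        shift -= slice_bits
        mask = (1 << slice_bits) - 1
        # Histogram the current slice over the surviving candidates.
        counts = [0] * (1 << slice_bits)
        for v in candidates:
            counts[(v >> shift) & mask] += 1
        # Locate the slice value whose bucket contains position k.
        cum = 0
        for s, c in enumerate(counts):
            if cum + c > k:
                break
            cum += c
        k -= cum                      # rank of the target inside that bucket
        candidates = [v for v in candidates if (v >> shift) & mask == s]
    return candidates[0]
```

With `slice_bits=4` and 16-bit inputs the loop makes at most four passes over the data, consistent with the linear complexity the abstract reports; for an odd-sized set the result matches `sorted(values)[len(values) // 2]`.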
Abstract:
Purpose – The purpose of this paper is to examine the critical assumptions lying behind the Anglo-American model of corporate governance. Design/methodology/approach – A literature review examining the concept of a nexus of contracts underpinning agency theory, which, it is argued, acts as the platform for neo-liberal corporate governance focused on shareholder wealth creation. Findings – The paper highlights the unaddressed critical challenge of why eighteenth-century ownership structures are readily adopted in the twenty-first century. Social implications – A re-examination of wealth creation and wealth redistribution. Originality/value – The paper is highly original because few contributions have been made in the area of rethinking shareholder value.
Abstract:
Event-related desynchronization (ERD) of the electroencephalogram (EEG) from the motor cortex is associated with execution, observation, and mental imagery of motor tasks. Generation of ERD by motor imagery (MI) has been widely used for brain-computer interfaces (BCIs) linked to neuroprosthetics and other motor assistance devices. Control of MI-based BCIs can be acquired by neurofeedback training to reliably induce MI-associated ERD. To develop more effective training conditions, we investigated the effect of static and dynamic visual representations of target movements (a picture of forearms or a video clip of hand grasping movements) during BCI training. After 4 consecutive training days, the group that performed MI while viewing the video showed significant improvement in generating MI-associated ERD compared with the group that viewed the static image. This result suggests that passively observing the target movement during MI would improve the associated mental imagery and enhance MI-based BCI skills.
Abstract:
Background: Event-related desynchronization/synchronization (ERD/ERS) is a relative power decrease/increase of the electroencephalogram (EEG) in a specific frequency band during physical motor execution and mental motor imagery, and it is therefore widely used for brain-computer interface (BCI) purposes. However, what ERD really reflects, and its frequency-band-specific role, have not been agreed upon and are still under investigation. Understanding the underlying mechanism that causes a significant ERD would be crucial to improving the reliability of ERD-based BCIs. We systematically investigated the relationship between the conditions of actual repetitive hand movements and the resulting ERD. Methods: Eleven healthy young participants were asked to close/open their right hand repetitively at three different speeds (Hold, 1/3 Hz, and 1 Hz) and four distinct motor loads (0, 2, 10, and 15 kgf). In each condition, participants repeated 20 experimental trials, each consisting of rest (8–10 s), preparation (1 s) and task (6 s) periods. Under the Hold condition, participants were instructed to keep clenching their hand (i.e., isometric contraction) during the task period. Throughout the experiment, EEG signals were recorded from the left and right motor areas for offline data analysis. We obtained time courses of the EEG power spectrum to examine the modulation of mu- and beta-ERD/ERS by the task conditions. Results: We confirmed salient mu-ERD (8–13 Hz) and somewhat weaker beta-ERD (14–30 Hz) on both hemispheres during repetitive hand grasping movements. According to a 3 × 4 ANOVA (speed × motor load), both mu- and beta-ERD during the task period were significantly weakened under the Hold condition, whereas no significant effect of motor load and no interaction effect were observed. Conclusions: This study investigates the effect of changes in kinematics and kinetics on the ERD during repetitive hand grasping movements.
The experimental results suggest that the strength of the ERD may reflect the time differentiation of hand postures in the motor planning process, or the variation of proprioception resulting from hand movements, rather than the motor command generated downstream, which recruits a group of motor neurons.
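The ERD/ERS measure used in this line of work is the relative band-power change between a reference (rest) period and the task period, ERD% = 100 * (P_task - P_rest) / P_rest. A minimal sketch of that computation follows; the FFT-based band-power estimate and the sample-range parameters are illustrative choices of ours, not the paper's exact pipeline.

```python
import numpy as np

def erd_percent(eeg, fs, band, rest, task):
    """ERD% = 100 * (P_task - P_rest) / P_rest for one EEG channel.
    `band` is (low, high) in Hz; `rest` and `task` are (start, stop)
    sample indices. Negative values indicate desynchronization."""
    def band_power(segment):
        n = len(segment)
        freqs = np.fft.rfftfreq(n, 1.0 / fs)
        # Windowed periodogram, summed over the band of interest.
        spec = np.abs(np.fft.rfft(segment * np.hanning(n))) ** 2
        lo, hi = band
        return spec[(freqs >= lo) & (freqs <= hi)].sum()

    x = np.asarray(eeg, dtype=float)
    p_rest = band_power(x[slice(*rest)])
    p_task = band_power(x[slice(*task)])
    return 100.0 * (p_task - p_rest) / p_rest
```

Since power scales with the square of amplitude, halving the amplitude of a mu-band oscillation during the task period yields roughly -75%, i.e. a strong ERD.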
Abstract:
Various complex oscillatory processes are involved in the generation of the motor command. The temporal dynamics of these processes were studied for movement detection from single-trial electroencephalogram (EEG). Autocorrelation analysis was performed on the EEG signals to find robust markers of movement detection. The evolution of the autocorrelation function was characterised via its relaxation time, obtained by exponential curve fitting. It was observed that the decay constant of the exponential curve increased during movement, indicating that the autocorrelation function decays more slowly during motor execution. Significant differences were observed between movement and no-movement tasks. Additionally, a linear discriminant analysis (LDA) classifier was used to identify movement trials with a peak accuracy of 74%.
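The relaxation-time marker described above can be sketched as follows. The abstract does not specify the fitting procedure, so the log-linear fit of the normalised autocorrelation to A * exp(-lag / tau) is our own choice, and the function and parameter names are assumptions.

```python
import numpy as np

def relaxation_time(signal, max_lag=20):
    """Relaxation time tau of a 1-D signal's autocorrelation,
    estimated by fitting A * exp(-lag / tau) over short lags.
    Slower decay (as reported during movement) gives a larger tau."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    # Positive-lag autocorrelation, normalised to 1 at lag 0.
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf = acf[:max_lag] / acf[0]
    lags = np.arange(max_lag)
    pos = acf > 0                     # log-fit needs positive values
    slope, _ = np.polyfit(lags[pos], np.log(acf[pos]), 1)
    return -1.0 / slope
```

For an AR(1) process x[t] = phi * x[t-1] + noise, the autocorrelation is phi^lag, so the estimate should approach -1/ln(phi); a per-trial tau like this is the kind of scalar feature that could feed the LDA classifier mentioned in the abstract.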
Abstract:
Many human behaviours and pathologies have been attributed to the putative mirror neuron system, a neural system that is active during both the observation and execution of actions. While there are now a very large number of papers on the mirror neuron system, variations in the methods and analyses employed by researchers mean that the basic characteristics of the mirror response are not clear. This review focuses on three important aspects of the mirror response, as measured by modulations in corticospinal excitability: (1) muscle specificity, (2) direction, and (3) timing of modulation. We focus mainly on electromyographic (EMG) data gathered following single-pulse transcranial magnetic stimulation (TMS), because this method provides precise information regarding these three aspects of the response. Data from paired-pulse TMS paradigms and peripheral nerve stimulation (PNS) are also considered when we discuss the possible mechanisms underlying the mirror response. In this systematic review of the literature, we examine the findings of 85 TMS and PNS studies of the human mirror response, and consider the limitations and advantages of the different methodological approaches these have adopted in relation to discrepancies between their findings. We conclude by proposing a testable model of how action observation modulates corticospinal excitability in humans. Specifically, we propose that action observation elicits an early, non-specific facilitation of corticospinal excitability (at around 90 ms from action onset), followed by a later modulation of activity specific to the muscles involved in the observed action (from around 200 ms). Testing this model will greatly advance our understanding of the mirror mechanism and provide a more stable grounding on which to base inferences about its role in human behaviour.
Abstract:
Performance modelling is a useful tool in the lifecycle of high-performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented, using the shallow water model as an example.