960 results for conditional
Abstract:
The existence of sting jets as a potential source of damaging surface winds during the passage of extratropical cyclones has recently been recognized. However, there are still very few published studies on the subject. Furthermore, although it is known that other models are capable of reproducing sting jets, in the published literature only one numerical model [the Met Office Unified Model (MetUM)] has been used to numerically analyze these phenomena. This article aims to improve our understanding of the processes that contribute to the development of sting jets and to show that model differences affect the evolution of modeled sting jets. A sting jet event during the passage of a cyclone over the United Kingdom on 26 February 2002 has been simulated using two mesoscale models, namely the MetUM and the Consortium for Small Scale Modeling (COSMO) model, to compare their performance. Given the known critical importance of vertical resolution in the simulation of sting jets, the vertical resolution of both models has been enhanced with respect to their operational versions. Both simulations have been verified against surface measurements of maximum gusts, satellite imagery, and Met Office operational synoptic analyses, as well as operational analyses from the ECMWF. It is shown that both models are capable of reproducing sting jets with similar, though not identical, features. Through the comparison of the results from these two models, the relevance of physical mechanisms, such as evaporative cooling and the release of conditional symmetric instability, in the generation and evolution of sting jets is also discussed.
Abstract:
Numerous studies have documented the failure of the static and conditional capital asset pricing models to explain the difference in returns between value and growth stocks. This paper examines the post-1963 value premium by employing a model that captures the time-varying total risk of the value-minus-growth portfolios. Our results show that the time-series of value premia is strongly and positively correlated with its volatility. This conclusion is robust to the criterion used to sort stocks into value and growth portfolios and to the country under review (the US and the UK). Our paper is consistent with evidence on the possible role of idiosyncratic risk in explaining equity returns, and also with a separate strand of literature concerning the relative lack of reversibility of value firms' investment decisions.
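The positive premium-volatility link reported above can be illustrated with a rough two-step sketch (not the paper's actual specification): fit a GARCH(1,1) to a value-minus-growth return series with the open-source arch package, then regress the returns on the fitted conditional volatility. The file name and the hml column are assumptions.

```python
# Illustrative two-step check of the premium-volatility link (not the paper's exact model).
import pandas as pd
import statsmodels.api as sm
from arch import arch_model

# 'hml' is an assumed column of monthly value-minus-growth returns (in percent).
returns = pd.read_csv("value_minus_growth.csv", index_col=0, parse_dates=True)["hml"]

# Step 1: conditional volatility of the value-minus-growth portfolio from a GARCH(1,1).
fit = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
cond_vol = fit.conditional_volatility

# Step 2: does the premium rise with its own volatility? A positive, significant slope
# on cond_vol mirrors the positive premium-volatility correlation reported above.
ols = sm.OLS(returns, sm.add_constant(cond_vol.rename("cond_vol"))).fit()
print(ols.summary())
```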
Abstract:
We introduce the perspex machine, which unifies projective geometry and the Turing machine, resulting in a supra-Turing machine. Specifically, we show that a Universal Register Machine (URM) can be implemented as a conditional series of whole numbered projective transformations. This leads naturally to a suggestion that it might be possible to construct a perspex machine as a series of pin-holes and stops. A rough calculation shows that an ultraviolet perspex machine might operate up to the petahertz range of operations per second. Surprisingly, we find that perspex space is irreversible in time, which might make it a candidate for an anisotropic spacetime geometry in physical theories. We make a bold hypothesis that the apparent irreversibility of physical time is due to the random nature of quantum events, but suggest that a sum over histories might be achieved by sampling fluctuations in the direction of time flow. We propose an experiment, based on the Casimir apparatus, that should measure fluctuations of time flow with respect to time duration, if such fluctuations exist.
Abstract:
International Perspective: The development of GM technology continues to expand into increasing numbers of crops and conferred traits. Inevitably, the focus remains on the major field crops of soybean, maize, cotton, oilseed rape and potato with introduced genes conferring herbicide tolerance and/or pest resistance. Although there are comparatively few GM crops that have been commercialised to date, GM versions of 172 plant species have been grown in field trials in 31 countries. European Crops with Containment Issues: Of the 20 main crops in the EU there are four for which GM varieties are commercially available (cotton, maize for animal feed and forage, and oilseed rape). Fourteen have GM varieties in field trials (bread wheat, barley, durum wheat, sunflower, oats, potatoes, sugar beet, grapes, alfalfa, olives, field peas, clover, apples, rice) and two have GM varieties still in development (rye, triticale). Many of these crops have hybridisation potential with wild and weedy relatives in the European flora (bread wheat, barley, oilseed rape, durum wheat, oats, sugar beet and grapes), with escapes (sunflower); and all have the potential to cross-pollinate fields of non-GM crops. Several fodder crops, forestry trees, grasses and ornamentals have varieties in field trials and these too may hybridise with wild relatives in the European flora (alfalfa, clover, lupin, silver birch, sweet chestnut, Norway spruce, Scots pine, poplar, elm, Agrostis canina, A. stolonifera, Festuca arundinacea, Lolium perenne, L. multiflorum, statice and rose). All these crops will require containment strategies to be in place if it is deemed necessary to prevent transgene movement to wild relatives and non-GM crops. Current Containment Strategies: A wide variety of GM containment strategies are currently under development, with a particular focus on crops expressing pharmaceutical products. Physical containment in greenhouses and growth rooms is suitable for some crops (tomatoes, lettuce) and for research purposes. Aquatic bioreactors of some non-crop species (algae, moss, and duckweed) expressing pharmaceutical products have been adopted by some biotechnology companies. There are obvious limitations of the scale of physical containment strategies, addressed in part by the development of large underground facilities in the US and Canada. The additional resources required to grow plants underground incur high costs that in the long term may negate any advantage of GM for commercial production. Natural genetic containment has been adopted by some companies through the selection of either non-food/feed crops (algae, moss, duckweed) as bio-pharming platforms or organisms with no wild relatives present in the local flora (safflower in the Americas). The expression of pharmaceutical products in leafy crops (tobacco, alfalfa, lettuce, spinach) enables growth and harvesting prior to and in the absence of flowering. Transgenically controlled containment strategies range in their approach and degree of development. Plastid transformation is relatively well developed but is not suited to all traits or crops and does not offer complete containment. Male sterility is well developed across a range of plants but has limitations in its application for fruit/seed bearing crops. It has been adopted in some commercial lines of oilseed rape despite not preventing escape via seed.
Conditional lethality can be used to prevent flowering or seed development following the application of a chemical inducer, but requires 100% induction of the trait and sufficient application of the inducer to all plants. Similarly, inducible expression of the GM trait requires equally stringent application conditions. Such a method will contain the trait but will allow the escape of a non-functioning transgene. Seed lethality (‘terminator’ technology) is the only strategy at present that prevents transgene movement via seed, but due to public opinion against the concept it has never been trialled in the field and is no longer under commercial development. Methods to control flowering and fruit development such as apomixis and cleistogamy will prevent crop-to-wild and wild-to-crop pollination, but in nature both of these strategies are complex and leaky. None of the genes controlling these traits have as yet been identified or characterised and therefore have not been transgenically introduced into crop species. Neither of these strategies will prevent transgene escape via seed and any feral apomicts that form are arguably more likely to become invasives. Transgene mitigation reduces the fitness of initial hybrids and so prevents stable introgression of transgenes into wild populations. However, it does not prevent initial formation of hybrids or spread to non-GM crops. Such strategies could be detrimental to wild populations and have not yet been demonstrated in the field. Similarly, auxotrophy prevents persistence of escapes and hybrids containing the transgene in an uncontrolled environment, but does not prevent transgene movement from the crop. Recoverable block of function, intein trans-splicing and transgene excision all use recombinases to modify the transgene in planta either to induce expression or to prevent it. All require optimal conditions and 100% accuracy to function and none have been tested under field conditions as yet. All will contain the GM trait but all will allow some non-native DNA to escape to wild populations or to non-GM crops. There are particular issues with GM trees and grasses as both are largely undomesticated, wind pollinated and perennial, thus providing many opportunities for hybridisation. Some species of both trees and grass are also capable of vegetative propagation without sexual reproduction. There are additional concerns regarding the weedy nature of many grass species and the long-term stability of GM traits across the life span of trees. Transgene stability and conferred sterility are difficult to trial in trees as most field trials are only conducted during the juvenile phase of tree growth. Bio-pharming of pharmaceutical and industrial compounds in plants offers an attractive alternative to mammalian-based pharmaceutical and vaccine production. Several plant-based products are already on the market (Prodigene’s avidin, β-glucuronidase, trypsin generated in GM maize; Ventria’s lactoferrin generated in GM rice). Numerous products are in clinical trials (collagen, antibodies against tooth decay and non-Hodgkin’s lymphoma from tobacco; human gastric lipase, therapeutic enzymes, dietary supplements from maize; Hepatitis B and Norwalk virus vaccines from potato; rabies vaccines from spinach; dietary supplements from Arabidopsis).
The initial production platforms for plant-based pharmaceuticals were selected from conventional crops, largely because an established knowledge base already existed. Tobacco and other leafy crops such as alfalfa, lettuce and spinach are widely used as leaves can be harvested and no flowering is required. Many of these crops can be grown in contained greenhouses. Potato is also widely used and can also be grown in contained conditions. The introduction of morphological markers may aid in the recognition and traceability of crops expressing pharmaceutical products. Plant cells or plant parts may be transformed and maintained in culture to produce recombinant products in a contained environment. Plant cells in suspension or in vitro, roots, root cells and guttation fluid from leaves may be engineered to secrete proteins that may be harvested in a continuous, non-destructive manner. Most strategies in this category remain developmental and have not been commercially adopted at present. Transient expression produces GM compounds from non-GM plants via the utilisation of bacterial or viral vectors. These vectors introduce the trait into specific tissues of whole plants or plant parts, but do not insert them into the heritable genome. There are some limitations of scale and the field release of such crops will require the regulation of the vector. However, several companies have several transiently expressed products in clinical and pre-clinical trials from crops raised in physical containment.
Abstract:
This article examines the characteristics of key measures of volatility for different types of futures contracts to provide a better foundation for modeling volatility behavior and derivative values. Particular attention is focused on analyzing how different measures of volatility affect volatility persistence relationships. Intraday realized measures of volatility are found to be more persistent than daily measures, the type of GARCH procedure used for conditional volatility analysis is critical, and realized volatility persistence is not coherent with conditional volatility persistence. Specifically, although there is a good fit between the realized and conditional volatilities, no coherence exists between their degrees of persistence, a counterintuitive finding which shows that realized and conditional volatility measures are not substitutes for one another.
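As a rough illustration of the comparison made above, the sketch below builds a daily realized volatility measure from assumed 5-minute futures returns and fits a daily GARCH(1,1), so that persistence can be compared via an AR(1) coefficient on log realized variance versus alpha + beta from the GARCH fit. The file name, column name, and sampling frequency are assumptions, and this is only one of many possible GARCH procedures.

```python
# Sketch: compare persistence of realized vs. GARCH conditional volatility (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from arch import arch_model

# Assumed input: 5-minute futures returns (in percent) with a DatetimeIndex.
intraday = pd.read_csv("futures_5min_returns.csv", index_col=0, parse_dates=True)["ret"]

# Daily realized variance = sum of squared intraday returns; daily return = sum of intraday returns.
rv = intraday.pow(2).groupby(intraday.index.date).sum()
daily = intraday.groupby(intraday.index.date).sum()

# Persistence of realized volatility: AR(1) coefficient on log realized variance.
log_rv = np.log(rv)
ar1 = sm.OLS(log_rv.iloc[1:].to_numpy(), sm.add_constant(log_rv.iloc[:-1].to_numpy())).fit()
print("realized-volatility AR(1) persistence:", ar1.params[1])

# Persistence of conditional volatility: alpha + beta from a daily GARCH(1,1).
res = arch_model(daily, vol="GARCH", p=1, q=1).fit(disp="off")
print("GARCH persistence (alpha + beta):", res.params["alpha[1]"] + res.params["beta[1]"])
```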
Abstract:
Sting jets are transient coherent mesoscale strong wind features that can cause damaging surface wind gusts in extratropical cyclones. Currently, we have only limited knowledge of their climatological characteristics. To represent sting jets, numerical weather prediction models require enough resolution to resolve slantwise motions with horizontal scales of tens of kilometres and vertical scales of just a few hundred metres. Hence, the climatological characteristics of sting jets and the associated extratropical cyclones cannot be determined by searching for sting jets in low-resolution datasets such as reanalyses. A diagnostic is presented and evaluated for the detection, in low-resolution datasets, of atmospheric regions from which sting jets may originate. Previous studies have shown that conditional symmetric instability (CSI) is present in all storms studied with sting jets, while other rapidly developing storms of a similar character but without CSI do not develop sting jets. Therefore, we assume that the release of CSI is needed for sting jets to develop. While this instability will not be released in a physically realistic way in low-resolution models (and hence sting jets are unlikely to occur), it is hypothesized that the signature of this instability (combined with other criteria that restrict analysis to moist mid-tropospheric regions in the neighbourhood of a secondary cold front) can be used to identify cyclones in which sting jets occurred in reality. The diagnostic is evaluated, and appropriate parameter thresholds defined, by applying it to three case studies simulated using two resolutions (with CSI release resolved only in the higher-resolution simulation).
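For readers unfamiliar with the instability invoked here, a common way to flag potential CSI in gridded data is a negative saturated moist potential vorticity in a layer that is otherwise conditionally stable and close to saturation; the expression below is a standard textbook form and not necessarily the exact diagnostic or thresholds developed in this study.

```latex
% A standard gridded-data indicator of CSI (the paper's precise diagnostic and thresholds may differ):
% negative saturated moist potential vorticity in a conditionally stable, nearly saturated layer.
\mathrm{MPV}^{*} \;=\; \frac{1}{\rho}\,\boldsymbol{\eta}_{a}\cdot\nabla\theta_{es} \;<\; 0,
\qquad \text{where } \boldsymbol{\eta}_{a} \text{ is the absolute vorticity vector and }
\theta_{es} \text{ the saturated equivalent potential temperature.}
```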
Abstract:
Internal risk management models of the kind popularized by J. P. Morgan are now used widely by the world’s most sophisticated financial institutions as a means of measuring risk. Using the returns on three of the most popular futures contracts on the London International Financial Futures Exchange, in this paper we investigate the possibility of using multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models for the calculation of minimum capital risk requirements (MCRRs). We propose a method for the estimation of the value at risk of a portfolio based on a multivariate GARCH model. We find that the consideration of the correlation between the contracts can lead to more accurate, and therefore more appropriate, MCRRs compared with the values obtained from a univariate approach to the problem.
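A stripped-down version of the multivariate idea can be sketched by fitting univariate GARCH(1,1) models to each contract, coupling them through a constant correlation matrix of the standardized residuals, and reading off a one-day portfolio value at risk. The constant-correlation shortcut, equal portfolio weights, Gaussian quantile, and file name are assumptions for illustration; the paper's multivariate GARCH estimator and MCRR construction differ.

```python
# Sketch: portfolio VaR from univariate GARCH fits plus a constant correlation matrix.
import numpy as np
import pandas as pd
from scipy.stats import norm
from arch import arch_model

# Assumed input: daily percentage returns for three futures contracts.
rets = pd.read_csv("liffe_futures_returns.csv", index_col=0, parse_dates=True)

weights = np.full(rets.shape[1], 1.0 / rets.shape[1])   # equal weights (assumption)
vols, std_resid = [], []
for col in rets:
    res = arch_model(rets[col], vol="GARCH", p=1, q=1).fit(disp="off")
    vols.append(res.forecast(horizon=1).variance.iloc[-1, 0] ** 0.5)  # 1-day-ahead volatility
    std_resid.append(res.std_resid)

R = pd.concat(std_resid, axis=1).corr().to_numpy()       # constant correlation (CCC shortcut)
D = np.diag(vols)
cov = D @ R @ D                                          # 1-day-ahead covariance matrix
port_vol = float(np.sqrt(weights @ cov @ weights))
var_99 = norm.ppf(0.99) * port_vol                       # 99% one-day VaR, Gaussian quantile
print(f"1-day 99% portfolio VaR: {var_99:.2f}% of position value")
```

The correlation between contracts enters only through the matrix R; setting R to the identity recovers the univariate treatment that the abstract compares against.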
Abstract:
This paper investigates the frequency of extreme events for three LIFFE futures contracts for the calculation of minimum capital risk requirements (MCRRs). We propose a semiparametric approach where the tails are modelled by the Generalized Pareto Distribution and smaller risks are captured by the empirical distribution function. We compare the capital requirements from this approach with those calculated from the unconditional density and from a conditional density, a GARCH(1,1) model. Our primary finding is that, both in-sample and for a hold-out sample, our extreme value approach yields results superior to either of the other two models, which do not explicitly model the tails of the return distribution. Since the use of these internal models will be permitted under the EC-CAD II, they could be widely adopted in the near future for determining capital adequacies. Hence, close scrutiny of competing models is required to avoid a potentially costly misallocation of capital resources while at the same time ensuring the safety of the financial system.
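The semiparametric construction described above (empirical distribution function in the body, Generalized Pareto tails beyond a threshold) can be sketched with scipy; the 95th-percentile threshold, the loss sign convention, and the simulated fat-tailed returns are assumptions, and the paper's threshold choice and mapping into MCRRs may differ.

```python
# Sketch: peaks-over-threshold VaR with a GPD tail and an empirical body (illustrative).
import numpy as np
from scipy.stats import genpareto

def semiparametric_var(returns, p=0.99, threshold_q=0.95):
    """VaR at level p: empirical quantile below the threshold, GPD tail above it."""
    losses = -np.asarray(returns)                  # losses as positive numbers (assumption)
    u = np.quantile(losses, threshold_q)           # tail threshold (assumed at the 95th pct)
    exceed = losses[losses > u] - u
    if p <= threshold_q:                           # body: plain empirical quantile
        return np.quantile(losses, p)
    xi, _, beta = genpareto.fit(exceed, floc=0)    # fit GPD to exceedances over u
    n, n_u = len(losses), len(exceed)
    # Standard peaks-over-threshold quantile formula for the upper tail.
    return u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

rng = np.random.default_rng(0)
fake_returns = rng.standard_t(df=4, size=2500)     # fat-tailed stand-in for futures returns
print("99% VaR (semiparametric):", round(semiparametric_var(fake_returns, 0.99), 3))
```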
Abstract:
This paper considers the effect of GARCH errors on the tests proposed by Perron (1997) for a unit root in the presence of a structural break. We assess the impact of degeneracy and integratedness of the conditional variance individually and find that, apart from in the limit, the testing procedure is insensitive to the degree of degeneracy but does exhibit increasing over-sizing as the process becomes more integrated. When we consider the GARCH specifications that we are likely to encounter in empirical research, we find that the Perron tests are reasonably robust to the presence of GARCH and do not suffer from severe over- or under-rejection of a correct null hypothesis.
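The size question studied above can be explored in miniature by simulating a unit-root process whose innovations follow a GARCH(1,1) of increasing integratedness (alpha + beta approaching one) and recording how often a unit-root test rejects a true null. The sketch below uses a plain augmented Dickey-Fuller test from statsmodels as a stand-in, since the Perron (1997) break test is not available there, so the rejection rates are only indicative.

```python
# Sketch: empirical size of a unit-root test when errors are GARCH(1,1) (ADF as a stand-in).
import numpy as np
from statsmodels.tsa.stattools import adfuller

def simulate_unit_root_garch(n, alpha, beta, omega=0.05, rng=None):
    """Random walk whose shocks follow a GARCH(1,1); alpha+beta near 1 = near-integrated variance."""
    rng = rng or np.random.default_rng()
    e, h = np.zeros(n), np.full(n, omega / max(1e-6, 1 - alpha - beta))
    for t in range(1, n):
        h[t] = omega + alpha * e[t - 1] ** 2 + beta * h[t - 1]
        e[t] = np.sqrt(h[t]) * rng.standard_normal()
    return np.cumsum(e)                              # true null: a unit root

rng = np.random.default_rng(1)
for alpha, beta in [(0.05, 0.50), (0.10, 0.85), (0.15, 0.84)]:   # increasing integratedness
    rejections = sum(
        adfuller(simulate_unit_root_garch(500, alpha, beta, rng=rng))[1] < 0.05
        for _ in range(200)
    )
    print(f"alpha+beta={alpha + beta:.2f}: rejection rate {rejections / 200:.3f} (nominal 0.05)")
```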
Abstract:
Given a nonlinear model, a probabilistic forecast may be obtained by Monte Carlo simulations. At a given forecast horizon, Monte Carlo simulations yield sets of discrete forecasts, which can be converted to density forecasts. The resulting density forecasts will inevitably be degraded by model mis-specification. In order to enhance the quality of the density forecasts, one can mix them with the unconditional density. This paper examines the value of combining conditional density forecasts with the unconditional density. The findings have positive implications for issuing early warnings in different disciplines, including economics and meteorology; UK inflation forecasts are considered as an example.
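The combination examined above amounts to a simple mixture: the issued density is w times the model-based (conditional) density plus (1 - w) times the unconditional density, with the weight chosen, for example, to maximize the average log score over past outcomes. The sketch below illustrates this with Gaussian kernel density estimates and simulated stand-in data; the KDE choice and the grid search over w are assumptions, not the paper's procedure.

```python
# Sketch: mix a conditional density forecast with the unconditional density, weight by log score.
import numpy as np
from scipy.stats import gaussian_kde

def mixture_log_score(w, cond_draws, uncond_sample, outcomes):
    """Average log predictive density of the w * conditional + (1 - w) * unconditional mixture."""
    f_cond = gaussian_kde(cond_draws)          # density from Monte Carlo forecast draws
    f_unc = gaussian_kde(uncond_sample)        # density from the historical (unconditional) sample
    mix = w * f_cond(outcomes) + (1.0 - w) * f_unc(outcomes)
    return float(np.mean(np.log(mix)))

rng = np.random.default_rng(0)
history = rng.normal(2.0, 1.5, size=400)                   # stand-in for past inflation data
mc_forecast = rng.normal(2.6, 0.6, size=1000)              # stand-in Monte Carlo forecast draws
realized = rng.normal(2.2, 1.0, size=60)                   # stand-in realized outcomes

scores = {w: mixture_log_score(w, mc_forecast, history, realized) for w in np.linspace(0, 1, 11)}
best_w = max(scores, key=scores.get)
print(f"best mixture weight on the conditional density: {best_w:.1f}")
```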
Abstract:
With the increasing frequency and magnitude of warmer days during the summer in the UK, bedding plants, which were a traditional part of the urban green landscape, are perceived as unsustainable and water-demanding. During recent summers when bans on irrigation have been imposed, use and sales of bedding plants have dropped dramatically, having a negative financial impact on the nursery industry. Retaining bedding species as a feature in public and even private spaces in future may be conditional on them being managed in a manner that minimises their water use. Using Petunia x hybrida ‘Hurrah White’, we aimed to discover which irrigation approach was the most efficient for maintaining plants’ ornamental quality (flower numbers, size and longevity), shoot and root growth under water deficit and periods of complete water withdrawal. Plants were grown from plugs for 51 days in wooden rhizotrons (0.35 m (h) x 0.1 m (w) x 0.065 m (d)); the rhizotrons’ front comprised clear Perspex, which enabled us to monitor root growth closely. Irrigation treatments were: 1. watering with the amount which constitutes 50% of container capacity by conventional surface drip-irrigation (‘50% TOP’); 2. 50% as sub-irrigation at 10 cm depth (‘50% SUB’); 3. ‘split’ irrigation: 25% as surface drip- and 25% as sub-irrigation at 15 cm depth (‘25/25 SPLIT’); 4. 25% as conventional surface drip-irrigation (‘25% TOP’). Plants were irrigated daily at 18:00, apart from days 34-36 (inclusive) when water was withdrawn for all the treatments. Plants in ‘50% SUB’ had the most flowers and their size was comparable to that of ‘50% TOP’. Differences between treatments in other ‘quality’ parameters (height, shoot number) were biologically small. There was less root growth at deeper levels of the soil profile for ‘50% TOP’, which indicated that irrigation methods like ‘50% SUB’ and ‘25/25 SPLIT’ and stronger water deficits encouraged deeper root growth. It is suggested that sub-irrigation at 10 cm depth with water amounts of 50% of container capacity would result in the most root growth with the maximum flowering for Petunia. Leaf stomatal conductance appeared to be most sensitive to changes in substrate moisture content in the deepest part of the soil profile, where most roots were situated.
Abstract:
This study uses a bootstrap methodology to explicitly distinguish between skill and luck for 80 Real Estate Investment Trust Mutual Funds in the period January 1995 to May 2008. The methodology successfully captures non-normality in the idiosyncratic risk of the funds. Using unconditional, beta conditional and alpha-beta conditional estimation models, the results indicate that all but one fund demonstrates poor skill. Tests of robustness show that this finding is largely invariant to REIT market conditions and maturity.
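The skill-versus-luck logic summarized above is commonly implemented by estimating a fund's alpha, resampling its regression residuals under an imposed null of zero alpha, and asking whether the actual alpha t-statistic lies far enough out in the bootstrap distribution to look like skill rather than luck. The sketch below shows such a residual bootstrap for a single fund with simulated data; the factor set and single-fund framing are illustrative and do not reproduce the paper's unconditional, beta conditional, and alpha-beta conditional specifications.

```python
# Sketch: residual bootstrap of a fund's alpha under the null of zero skill (single fund).
import numpy as np
import statsmodels.api as sm

def bootstrap_alpha_pvalue(fund_excess, factors, n_boot=2000, seed=0):
    """Share of zero-alpha bootstrap samples with an alpha t-stat at least as large as the actual one."""
    X = sm.add_constant(factors)                       # alpha is the intercept
    fit = sm.OLS(fund_excess, X).fit()
    t_actual = fit.tvalues[0]
    resid, beta = fit.resid, fit.params
    null_fitted = X[:, 1:] @ beta[1:]                  # fitted values with alpha forced to zero
    rng = np.random.default_rng(seed)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(resid), len(resid))  # resample residuals with replacement
        y_star = null_fitted + resid[idx]
        t_boot[b] = sm.OLS(y_star, X).fit().tvalues[0]
    return float(np.mean(t_boot >= t_actual))          # small value = alpha unlikely to be luck

rng = np.random.default_rng(1)
factors = rng.normal(size=(160, 3))                    # stand-in REIT-market, size and value factors
fund = factors @ np.array([0.9, 0.2, -0.1]) + rng.normal(scale=2.0, size=160)  # true alpha = 0
print("bootstrap p-value for skill:", bootstrap_alpha_pvalue(fund, factors))
```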
Abstract:
The issue of whether Real Estate Investment Trusts should pursue a focused or diversified investment strategy remains an ongoing debate within both the academic and industry communities. This paper considers the relationship between REITs focused on different property sectors in a GARCH-DCC framework. The daily conditional correlations reveal that since 1990 there has been a marked upward trend in the coefficients between US REIT sub-sectors. The findings imply that REITs are behaving in a far more homogeneous manner than in the past. Furthermore, the argument that REITs should be focused so that investors themselves can make the diversification decision is weakened.
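For readers unfamiliar with the GARCH-DCC machinery referenced above, the daily conditional correlations come from a recursion on the standardized residuals of univariate GARCH fits. The sketch below shows that recursion with fixed illustrative DCC parameters (a = 0.03, b = 0.95) rather than estimated ones; the file name and the choice of two sub-sector return columns are assumptions.

```python
# Sketch: DCC conditional-correlation recursion on standardized GARCH(1,1) residuals.
import numpy as np
import pandas as pd
from arch import arch_model

def dcc_correlations(returns, a=0.03, b=0.95):
    """Time series of pairwise conditional correlations (fixed a, b for illustration only)."""
    z = pd.concat(
        [arch_model(returns[c], vol="GARCH", p=1, q=1).fit(disp="off").std_resid for c in returns],
        axis=1,
    ).to_numpy()
    q_bar = np.corrcoef(z, rowvar=False)              # unconditional correlation target
    q = q_bar.copy()
    corr = np.empty(len(z))
    for t in range(len(z)):
        d = np.diag(1.0 / np.sqrt(np.diag(q)))
        corr[t] = (d @ q @ d)[0, 1]                   # R_t = D_t^{-1} Q_t D_t^{-1}
        zt = z[t][:, None]
        q = (1 - a - b) * q_bar + a * (zt @ zt.T) + b * q
    return pd.Series(corr, index=returns.index, name="dcc_corr")

# Assumed input: two columns of daily REIT sub-sector returns.
rets = pd.read_csv("reit_subsector_returns.csv", index_col=0, parse_dates=True).iloc[:, :2]
print(dcc_correlations(rets).describe())
```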
Abstract:
This paper studies the effects of increasing formality via tax reduction and simplification schemes on micro-firm performance. It uses the 1997 Brazilian SIMPLES program. We develop a simple theoretical model to show that SIMPLES has an impact only on a segment of the micro-firm population, for which the effect of formality on firm performance can be identified and can be analyzed along the one-dimensional quantiles of conditional firm revenues. To estimate the effect of formality, we use an econometric approach that compares eligible and non-eligible firms born before and after SIMPLES, in a local interval around the introduction of SIMPLES. We use an estimator that combines quantile regression with the regression discontinuity identification strategy. The empirical results corroborate the positive effect of formality on micro-firms' performance and produce a clear characterization of who benefits from these programs.
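A minimal version of the estimator described above can be sketched with statsmodels: keep firms born in a window around the SIMPLES introduction, and run quantile regressions of log revenue on eligibility interacted with a post-introduction indicator, controlling for the running variable. The column names, the 180-day bandwidth, and the linear-in-running-variable form are assumptions for illustration; the paper combines quantile regression and regression discontinuity more carefully.

```python
# Sketch: quantile regression-discontinuity around the SIMPLES introduction (illustrative).
import pandas as pd
import statsmodels.formula.api as smf

# Assumed input columns: log_revenue, eligible (0/1), birth_days (days relative to SIMPLES start).
firms = pd.read_csv("brazil_microfirms.csv")

bandwidth = 180                                        # assumed local window: +/- 180 days
local = firms[firms["birth_days"].abs() <= bandwidth].copy()
local["post"] = (local["birth_days"] > 0).astype(int)  # born after the program started

# Local linear quantile regression; the post:eligible coefficient is the discontinuity
# in the tau-th conditional quantile of revenue attributable to the formality program.
for tau in (0.25, 0.50, 0.75):
    model = smf.quantreg("log_revenue ~ post * eligible + birth_days", local).fit(q=tau)
    print(f"tau={tau}: RD effect = {model.params['post:eligible']:.3f}")
```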
Abstract:
Recently, major processor manufacturers have announced a dramatic shift in their paradigm to increase computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift for the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, but also clusters of workstations and even large-scale distributed computing infrastructures, provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high-performance computing systems, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism: The first contribution addresses the classic problem of distributed association rule mining and focuses on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared-memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed-memory systems. This approach is based on a hierarchical communication topology to solve issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVM), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution focuses on very efficient feature selection: it describes a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.