33 results for unimodularity conjecture


Relevance: 10.00%

Abstract:

Almost all research fields in geosciences use numerical models and observations and combine these using data-assimilation techniques. With ever-increasing resolution and complexity, the numerical models tend to be highly nonlinear and also observations become more complicated and their relation to the models more nonlinear. Standard data-assimilation techniques like (ensemble) Kalman filters and variational methods like 4D-Var rely on linearizations and are likely to fail in one way or another. Nonlinear data-assimilation techniques are available, but are only efficient for small-dimensional problems, hampered by the so-called ‘curse of dimensionality’. Here we present a fully nonlinear particle filter that can be applied to higher dimensional problems by exploiting the freedom of the proposal density inherent in particle filtering. The method is illustrated for the three-dimensional Lorenz model using three particles and the much more complex 40-dimensional Lorenz model using 20 particles. By also applying the method to the 1000-dimensional Lorenz model, again using only 20 particles, we demonstrate the strong scale-invariance of the method, leading to the optimistic conjecture that the method is applicable to realistic geophysical problems. Copyright © 2010 Royal Meteorological Society
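The abstract only sketches the method, so the following is a minimal, hedged illustration in Python of the particle-filter machinery it builds on (prediction, likelihood weighting, resampling) applied to the 3-variable Lorenz-63 system. It uses a plain bootstrap-style proposal rather than the paper's specially designed proposal density, and every name and value below (lorenz63_step, particle_filter, obs_std, model_std) is an illustrative assumption, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def lorenz63_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz-63 equations for one state or a batch of states."""
    dx = np.empty_like(x)
    dx[..., 0] = sigma * (x[..., 1] - x[..., 0])
    dx[..., 1] = x[..., 0] * (rho - x[..., 2]) - x[..., 1]
    dx[..., 2] = x[..., 0] * x[..., 1] - beta * x[..., 2]
    return x + dt * dx

def particle_filter(y_obs, n_particles=100, obs_std=1.0, model_std=0.1):
    """Assimilate a sequence of noisy observations of the full state."""
    particles = rng.normal(0.0, 1.0, size=(n_particles, 3))
    estimates = []
    for y in y_obs:
        # Propagate each particle with the stochastic model: the (naive) proposal density.
        particles = lorenz63_step(particles) + rng.normal(0.0, model_std, particles.shape)
        # Weight by the Gaussian observation likelihood.
        log_w = -0.5 * np.sum((particles - y) ** 2, axis=1) / obs_std ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(w @ particles)          # posterior-mean estimate
        # Resample to combat weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(estimates)

# Usage: synthetic truth and noisy observations of it.
truth = np.array([1.0, 1.0, 1.0])
obs = []
for _ in range(50):
    truth = lorenz63_step(truth)
    obs.append(truth + rng.normal(0.0, 1.0, 3))
print(particle_filter(np.array(obs))[-1])

The paper's contribution lies precisely in replacing the naive proposal step above with a proposal that keeps the weights balanced in high-dimensional systems.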

Relevance: 10.00%

Abstract:

We consider the finite sample properties of model selection by information criteria in conditionally heteroscedastic models. Recent theoretical results show that certain popular criteria are consistent in that they will select the true model asymptotically with probability 1. To examine the empirical relevance of this property, Monte Carlo simulations are conducted for a set of non-nested data generating processes (DGPs), with the set of candidate models consisting of all types of model used as DGPs. In addition, not only is the best model considered but also those with similar values of the information criterion, called close competitors, thus forming a portfolio of eligible models. To supplement the simulations, the criteria are applied to a set of economic and financial series. In the simulations, the criteria are largely ineffective at identifying the correct model, either as best or as a close competitor, the parsimonious GARCH(1,1) model being preferred for most DGPs. In contrast, asymmetric models are generally selected to represent actual data. This leads to the conjecture that the properties of parameterizations of processes commonly used to model heteroscedastic data are more similar than may be imagined and that more attention needs to be paid to the behaviour of the standardized disturbances of such models, both in simulation exercises and in empirical modelling.
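As background to the selection exercise described above, here is a minimal sketch, assuming nothing about the paper's own code, of how candidate conditionally heteroscedastic models can be ranked by information criteria and how "close competitors" within a small margin of the best model can be flagged. The model names, log-likelihoods, parameter counts and the 2-point margin are hypothetical.

import math

def info_criteria(loglik, n_params, n_obs):
    """Return (AIC, BIC) for a fitted model."""
    aic = 2 * n_params - 2 * loglik
    bic = n_params * math.log(n_obs) - 2 * loglik
    return aic, bic

def select(candidates, n_obs, margin=2.0, criterion="bic"):
    """Pick the best model and any candidate within `margin` of it (a close competitor)."""
    idx = 0 if criterion == "aic" else 1
    scored = {name: info_criteria(ll, k, n_obs)[idx]
              for name, (ll, k) in candidates.items()}
    best = min(scored, key=scored.get)
    close = [m for m, s in scored.items() if m != best and s - scored[best] <= margin]
    return best, close, scored

# Hypothetical fitted log-likelihoods and parameter counts for three candidates.
candidates = {"GARCH(1,1)": (-1402.3, 4),
              "GJR-GARCH(1,1)": (-1400.9, 5),
              "EGARCH(1,1)": (-1401.8, 5)}
print(select(candidates, n_obs=1000))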

Relevance: 10.00%

Abstract:

In this paper we examine the order of integration of EuroSterling interest rates by employing techniques that can allow for a structural break under the null and/or alternative hypothesis of the unit-root tests. In light of these results, we investigate the cointegrating relationship implied by the single, linear expectations hypothesis of the term structure of interest rates employing two techniques, one of which allows for the possibility of a break in the mean of the cointegrating relationship. The aim of the paper is to investigate whether or not the interest rate series can be viewed as I(1) processes and furthermore, to consider whether there has been a structural break in the series. We also determine whether, if we allow for a break in the cointegration analysis, the results are consistent with those obtained when a break is not allowed for. The main results reported in this paper support the conjecture that the ‘short’ Euro-currency rates are characterised as I(1) series that exhibit a structural break on or near Black Wednesday, 16 September 1992, whereas the ‘long’ rates are I(1) series that do not support the presence of a structural break. The evidence from the cointegration analysis suggests that tests of the expectations hypothesis based on data sets that include the ERM crisis period, or a period that includes a structural break, might be problematic if the structural break is not explicitly taken into account in the testing framework.
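For readers wanting to reproduce the flavour of these tests, the sketch below uses simulated data and assumes that statsmodels' adfuller (standard unit-root test), zivot_andrews (unit root allowing one endogenous break) and coint (Engle-Granger) routines are acceptable stand-ins for the paper's exact procedures, which may differ.

import numpy as np
from statsmodels.tsa.stattools import adfuller, coint, zivot_andrews

rng = np.random.default_rng(1)
n = 500
short_rate = np.cumsum(rng.normal(0, 0.05, n))          # I(1) by construction
short_rate[n // 2:] += 0.75                              # level break mid-sample
long_rate = short_rate + rng.normal(0, 0.05, n)          # cointegrated with the short rate

adf_stat, adf_p, *_ = adfuller(short_rate)
za_stat, za_p, _, _, break_idx = zivot_andrews(short_rate, regression="c")
eg_stat, eg_p, _ = coint(long_rate, short_rate)

print(f"ADF p-value (no break allowed): {adf_p:.3f}")
print(f"Zivot-Andrews p-value, estimated break at observation {break_idx}: {za_p:.3f}")
print(f"Engle-Granger cointegration p-value: {eg_p:.3f}")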

Relevance: 10.00%

Abstract:

An EPSRC ‘Partnerships for Public Engagement’ scheme award, 2010 (FEC 122,545.56/UoR 10K). everything and nothing is a performance and workshop which engages the public creatively with mathematical concepts: the Poincaré conjecture, the shape of the universe, topology, and the nature of infinity are explored through an original, thought-provoking piece of music theatre. Jorge Luis Borges' short story 'The Library of Babel' and the aviator Amelia Earhart’s attempt to circumnavigate the globe combine to communicate to audiences the key mathematical concepts of Poincaré’s conjecture. The project builds on a 2008 EPSRC early development project (EP/G001650/1) and is led by an interdisciplinary team, the19thstep, consisting of composer Dorothy Ker, sculptor Kate Allen and mathematician Marcus du Sautoy. everything and nothing was devised by Dorothy Ker and Kate Allen and is performed by percussionist Chris Brannick, mezzo-soprano Lucy Stevens and sound designer Kelcey Swain. The UK tour targets arts-going audiences, from the Green Man Festival to the British Science Festival. Each performance is accompanied by a workshop led by topologist Katie Steckles. Alongside the performances and workshops is a website, http://www.everythingandnothingproject.com/. Public engagement evaluation and monitoring for the project are carried out by evaluator Bea Jefferson. The project is significant in its timely relation to contemporary mathematics and arts-science themes, delivering an extensive programme of public engagement.

Relevance: 10.00%

Abstract:

We analyse in a common framework the properties of the Voronoi tessellations resulting from regular 2D and 3D crystals and those of tessellations generated by Poisson distributions of points, thus bridging symmetry-breaking processes and the approach to uniform random distributions of seeds. We perturb crystalline structures in 2D and 3D with a spatial Gaussian noise whose adimensional strength is α and analyse the statistical properties of the cells of the resulting Voronoi tessellations using an ensemble approach. In 2D we consider triangular, square and hexagonal regular lattices, resulting in hexagonal, square and triangular tessellations, respectively. In 3D we consider the simple cubic (SC), body-centred cubic (BCC), and face-centred cubic (FCC) crystals, whose corresponding Voronoi cells are the cube, the truncated octahedron, and the rhombic dodecahedron, respectively. In 2D, for all values of α>0, hexagons constitute the most common class of cells. Noise destroys the triangular and square tessellations, which are structurally unstable, as their topological properties are discontinuous at α=0. By contrast, the honeycomb hexagonal tessellation is topologically stable and, experimentally, all Voronoi cells are hexagonal for small but finite noise with α<0.12. Essentially the same happens in the 3D case, where only the tessellation of the BCC crystal is topologically stable even against noise of small but finite intensity. In both the 2D and 3D cases, already for a moderate amount of Gaussian noise (α>0.5), memory of the specific initial unperturbed state is lost, because the statistical properties of the three perturbed regular tessellations become indistinguishable. When α>2, results converge to those of Poisson-Voronoi tessellations. In 2D, while the isoperimetric ratio increases with noise for the perturbed hexagonal tessellation, for the perturbed triangular and square tessellations it is optimised at a specific value of the noise intensity. The same applies in 3D, where noise degrades the isoperimetric ratio for perturbed FCC and BCC lattices, whereas the opposite holds for perturbed SC lattices. This allows us to formulate a weaker form of the Kelvin conjecture. By analysing jointly the statistical properties of the area and of the volume of the cells, we find that the shape of the cells also fluctuates strongly when noise is introduced into the system. In 2D, the geometrical properties of n-sided cells change with α until the Poisson-Voronoi limit is reached for α>2; in this limit the Desch law for perimeters is shown not to be valid and a square-root dependence on n is established, which agrees with exact asymptotic results. Anomalous scaling relations are observed between the perimeter and the area in 2D and between the areas and the volumes of the cells in 3D: except for the hexagonal (2D) and FCC (3D) structures, this applies also for infinitesimal noise. In the Poisson-Voronoi limit, the anomalous exponent is about 0.17 in both the 2D and 3D cases. A positive anomaly in the scaling indicates that large cells preferentially feature large isoperimetric quotients. As the number of faces is strongly correlated with the sphericity (cells with more faces are bulkier), in 3D the anomalous scaling is heavily reduced when we perform power-law fits separately on cells with a specific number of faces.
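The 2D part of this ensemble experiment can be mimicked in a few lines. The sketch below is an illustrative assumption about one reasonable setup (scipy.spatial.Voronoi on a Gaussian-perturbed triangular lattice, counting edges of bounded cells) rather than the authors' code, and the lattice size and α values are arbitrary.

import numpy as np
from scipy.spatial import Voronoi

def triangular_lattice(nx=30, ny=30):
    """Points of a unit-spacing triangular lattice (its Voronoi cells are hexagons)."""
    pts = []
    for j in range(ny):
        for i in range(nx):
            pts.append((i + 0.5 * (j % 2), j * np.sqrt(3) / 2))
    return np.array(pts)

def edge_count_histogram(alpha, rng):
    """Perturb the lattice with Gaussian noise of strength alpha and tally cell edge counts."""
    pts = triangular_lattice() + rng.normal(0.0, alpha, (30 * 30, 2))
    vor = Voronoi(pts)
    counts = {}
    for region_idx in vor.point_region:
        region = vor.regions[region_idx]
        if len(region) == 0 or -1 in region:        # skip unbounded boundary cells
            continue
        counts[len(region)] = counts.get(len(region), 0) + 1
    return counts

rng = np.random.default_rng(2)
for alpha in (0.05, 0.5, 2.0):
    print(alpha, edge_count_histogram(alpha, rng))

For small α almost every bounded cell should remain six-sided, while large α should reproduce the broad edge-count distribution of a Poisson-Voronoi tessellation, which is the qualitative behaviour the abstract describes.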

Relevance: 10.00%

Abstract:

We perturb the SC, BCC, and FCC crystal structures with a spatial Gaussian noise whose adimensional strength is controlled by the parameter α, and analyze the topological and metrical properties of the resulting Voronoi Tessellations (VT). The topological properties of the VT of the SC and FCC crystals are unstable with respect to the introduction of noise, because the corresponding polyhedra are geometrically degenerate, whereas the tessellation of the BCC crystal is topologically stable even against noise of small but finite intensity. For weak noise, the mean area of the cells of the perturbed BCC and FCC VTs increases quadratically with α. In the case of perturbed SC crystals, there is an optimal amount of noise that minimizes the mean area of the cells. Already for moderate noise (α>0.5), the properties of the three perturbed VTs are indistinguishable, and for intense noise (α>2), results converge to the Poisson-VT limit. Notably, two-parameter gamma distributions are an excellent model for the empirical distributions of all considered properties. The VTs of the perturbed BCC and FCC structures are local maxima for the isoperimetric quotient, which measures the degree of sphericity of the cells, among space-filling VTs. In the BCC case, this suggests a weaker form of the recently disproved Kelvin conjecture. Due to the fluctuations of the shape of the cells, anomalous scalings with exponents >3/2 are observed between the areas and the volumes of the cells and, except for the FCC case, this holds also for α→0. In the Poisson-VT limit, the exponent is about 1.67. As the number of faces is positively correlated with the sphericity of the cells, the anomalous scaling is heavily reduced when we perform power-law fits separately on cells with a specific number of faces.
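The gamma-fitting step mentioned above can be illustrated as follows, assuming scipy.stats is available and using synthetic stand-in "volumes" rather than output of a real Voronoi computation; the two-parameter fit pins the location at zero so that only the shape and scale are estimated.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
volumes = rng.gamma(shape=16.0, scale=1.0 / 16.0, size=5000)   # synthetic sample only

# floc=0 fixes the location, leaving shape k and scale theta free (the 2-parameter fit).
k, loc, theta = stats.gamma.fit(volumes, floc=0)
ks_stat, ks_p = stats.kstest(volumes, "gamma", args=(k, loc, theta))
print(f"fitted shape k = {k:.2f}, scale theta = {theta:.4f}, KS p-value = {ks_p:.2f}")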

Relevance: 10.00%

Abstract:

This study jointly examines herding, momentum trading and performance in real estate mutual funds (REMFs). We do this using trading and performance data for 159 REMFs across the period 1998–2008. In support of the view that Real Estate Investment Trust (REIT) stocks are relatively more transparent, we find that stock herding by REMFs is lower in REIT stocks than in other stocks. Herding behavior in our data reveals a tendency for managers to sell winners, reflective of the “disposition effect.” We find low overall levels of REMF momentum trading, but further evidence of the disposition effect when momentum trading is segregated into buy–sell dimensions. We test the robustness of our analysis using style analysis, and by reference to the level of fund dividend distribution. Our results for this are consistent with our conjecture about the role of transparency in herding, but they provide no new insights in relation to the momentum-trading dimensions of our analysis. Summarizing what are complex interrelationships, we find that neither herding nor momentum trading is a demonstrably superior investment strategy for REMFs.
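The abstract does not spell out its herding metric. Fund-herding studies of this kind commonly use the Lakonishok-Shleifer-Vishny (LSV) measure, so the sketch below assumes that choice purely for illustration; the buy/sell counts are hypothetical.

import numpy as np

def lsv_herding(buys, sells):
    """LSV measure per stock: |p_i - p_bar| minus its expectation under no herding."""
    buys, sells = np.asarray(buys, float), np.asarray(sells, float)
    n = buys + sells                               # funds active in each stock
    p = buys / n                                   # proportion of active funds buying
    p_bar = buys.sum() / n.sum()                   # market-wide buy proportion
    # Adjustment factor: expected |p_i - p_bar| if trades were independent binomial draws.
    adj = np.array([np.abs(np.random.binomial(int(ni), p_bar, 20000) / ni - p_bar).mean()
                    for ni in n])
    return np.abs(p - p_bar) - adj

# Hypothetical buy/sell counts for five stocks in one quarter.
print(lsv_herding(buys=[12, 30, 7, 22, 15], sells=[8, 5, 13, 20, 15]).round(3))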

Relevance: 10.00%

Abstract:

Purpose – The purpose of this paper is to examine individual level property returns to see whether there is evidence of persistence in performance, i.e. a greater than expected probability of well (badly) performing properties continuing to perform well (badly) in subsequent periods. Design/methodology/approach – The same methodology originally used in Young and Graff is applied, making the results directly comparable with those for the US and Australian markets. However, it uses a much larger database covering all UK commercial property data available in the Investment Property Databank (IPD) for the years 1981 to 2002 – as many as 216,758 individual property returns. Findings – While the results of this study mimic the US and Australian results of greater persistence in the extreme first and fourth quartiles, they also evidence persistence in the moderate second and third quartiles, a notable departure from previous studies. Likewise, patterns across property type, location, time, and holding period are remarkably similar. Research limitations/implications – The findings suggest that performance persistence is not a feature unique to particular markets, but instead may characterize most advanced real estate investment markets. Originality/value – As well as extending previous research geographically, the paper explores possible reasons for such persistence, consideration of which leads to the conjecture that behaviors in the practice of institutional-grade commercial real estate investment management may themselves be deeply rooted and persistent, and perhaps influenced for good or ill by agency effects.
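A minimal sketch of the quartile-persistence calculation that this line of research (following Young and Graff) relies on: rank individual returns into quartiles in consecutive periods and estimate the probability of staying in the same quartile. The simulated data and the persistent "quality" component are assumptions for illustration, not IPD data.

import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n_props = 2000
quality = rng.normal(0, 0.02, n_props)                    # persistent asset-level effect
returns = pd.DataFrame({
    "y1": 0.08 + quality + rng.normal(0, 0.05, n_props),
    "y2": 0.08 + quality + rng.normal(0, 0.05, n_props),
})

q1 = pd.qcut(returns["y1"], 4, labels=False)              # quartile rank in year 1
q2 = pd.qcut(returns["y2"], 4, labels=False)              # quartile rank in year 2

transition = pd.crosstab(q1, q2, normalize="index")       # row-stochastic transition matrix
print(transition.round(2))
print("P(stay in same quartile):", np.diag(transition).mean().round(3))
# Under no persistence each diagonal entry would be about 0.25.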

Relevance: 10.00%

Abstract:

Consistent conjectures are derived in an oligopoly model with homogeneous products and identical firms. The exercise uncovers two important findings. Absent entry, the monopolistic conjecture is the unique consistent conjecture. With endogenous entry, no consistent conjecture exists. These results provide foundations for deriving consistent comparative statics for the food industries.
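The abstract does not reproduce the model, but a standard conjectural-variations setup of the kind this literature builds on makes the two findings easier to read; the notation below is an assumption about the model class, not the paper's exact specification.

\[
\max_{q_i}\; \pi_i = P(Q)\,q_i - C(q_i), \qquad Q = \sum_j q_j,
\]
\[
\text{FOC:}\quad P(Q) + (1+v)\,q_i\,P'(Q) - C'(q_i) = 0,
\qquad v \equiv \frac{d}{dq_i}\sum_{j\neq i} q_j \ \ \text{(conjectured rival response)}.
\]

Consistency requires the conjectured \(v\) to coincide with the slope of the rivals' actual aggregate best response, \(\sum_{j\neq i} dq_j^{*}/dq_i\), evaluated at equilibrium. With \(v=-1\) the first-order condition collapses to \(P=C'\), the competitive outcome, while \(v=n-1\) (all rivals matching the firm's output change) reproduces joint-profit maximisation, i.e. the monopolistic conjecture referred to above.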

Relevance: 10.00%

Abstract:

Consistent conjectures applied to the food industries are investigated. This is a homogeneous-product, quantity-setting model with identical firms. When firm numbers are fixed, the consistent conjecture is monopolistic. When variable, consistency requires firm output to expand with exit and contract with entry, but no conjecture exists that is consistent with equilibrium.

Relevance: 10.00%

Abstract:

The maximum 'Depth to Mate' (DTM(k)) data for k-man chess, k = 3-7, is now available: log(maxDTM(k)) demonstrates quasi-linear behaviour. This note predicts maxDTM for 8- to 10-man chess and the two-sigma distributions around these figures. 'Haworth's Law' is the conjecture that maxDTM will continue to demonstrate this behaviour for some time to come. The supporting datafile is a pgn of maxDTM positions, each having a DTM-minimaxing line of play from it to 'mate'.
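For intuition about the extrapolation, the sketch below fits a straight line to log(maxDTM(k)) for k = 3..7 and projects it, with a rough two-sigma band from the fit residuals, to k = 8..10. The maxDTM numbers in the script are illustrative placeholders only, not the published endgame-table figures, and the band is a naive residual-based one rather than the note's own statistical treatment.

import numpy as np

k = np.array([3, 4, 5, 6, 7])
max_dtm = np.array([28, 43, 127, 262, 549])        # placeholders, NOT the published values
log_dtm = np.log(max_dtm)

slope, intercept = np.polyfit(k, log_dtm, 1)       # least-squares line in log space
resid = log_dtm - (slope * k + intercept)
sigma = resid.std(ddof=2)                          # two fitted parameters

for kk in (8, 9, 10):
    mid = np.exp(slope * kk + intercept)
    lo = np.exp(slope * kk + intercept - 2 * sigma)
    hi = np.exp(slope * kk + intercept + 2 * sigma)
    print(f"{kk}-man: predicted maxDTM ~ {mid:,.0f} (two-sigma band {lo:,.0f}-{hi:,.0f})")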

Relevance: 10.00%

Abstract:

Advances in our understanding of the large-scale electric and magnetic fields in the coupled magnetosphere-ionosphere system are reviewed. The literature appearing in the period January 1991–June 1993 is sorted into 8 general areas of study. The phenomenon of substorms receives the most attention in this literature, with the location of onset being the single most discussed issue. However, while the magnetic topology in substorm phases was widely debated, less attention was paid to the relationship of convection to the substorm cycle. A significantly new consensus view of substorm expansion and recovery phases emerged, which was termed the ‘Kiruna Conjecture’ after the conference at which it gained widespread acceptance. The second largest area of interest was dayside transient events, both near the magnetopause and in the ionosphere. It became apparent that these phenomena include at least two classes of events, probably due to transient reconnection bursts and sudden solar wind dynamic pressure changes. The contribution of both types of event to convection is controversial. The realisation that induction effects decouple electric fields in the magnetosphere and ionosphere, on time scales shorter than several substorm cycles, calls for a broadening of the range of measurement techniques in both the ionosphere and at the magnetopause. Several new techniques were introduced, including ionospheric observations which yield the reconnection rate as a function of time. The magnetospheric and ionospheric behaviour due to various quasi-steady interplanetary conditions was studied using magnetic cloud events. For northward IMF conditions, reverse convection in the polar cap was found to be predominantly a summer hemisphere phenomenon, and even for extremely rare prolonged southward IMF conditions, the magnetosphere was observed to oscillate through various substorm cycles rather than forming a steady-state convection bay.

Relevance: 10.00%

Abstract:

We discuss substorm observations made near 2100 magnetic local time (MLT) on March 7, 1991, in a collaborative study involving data from the European Incoherent Scatter radar, all-sky camera data, and magnetometer data from the Tromsø Auroral Observatory, the U.K. Sub-Auroral Magnetometer Network (SAMNET) and the IMAGE magnetometer chain. We conclude that for the substorm studied a plasmoid was not pinched off until at least 10 min after onset at the local time of the observations (2100 MLT) and that the main substorm electrojet expanded westward over this local time 14 min after onset. In the late growth phase/early expansion phase, we observed southward drifting arcs probably moving faster than the background plasma. Similar southward moving arcs in the recovery phase moved at a speed which does not appear to be significantly different from the measured plasma flow speed. We discuss these data in terms of the “Kiruna conjecture” and classical “near-Earth neutral line” paradigms, since the data show features of both models of substorm development. We suggest that longitudinal variation in behavior may reconcile the differences between the two models in the case of this substorm.

Relevance: 10.00%

Abstract:

Intuition is an important and under-researched concept in information systems. Prior exploratory research has shown that there is potential to characterize the use of intuition in academic information systems research. This paper extends this research to all of the available issues of two leading IS journals with the aim of reaching an approximation of theoretical saturation. Specifically, the entire text of MISQ and ISR was reviewed for the years 1990 through 2009 using searchable PDF versions of these publications. All references to intuition were coded on a basis consistent with Grounded Theory, interpreted as a gestalt and represented as a mind-map. In the period 1990-2009, 681 incidents of the use of "intuition" and related terms were found in the articles reviewed, representing a greater range of codes than prior research. In addition, codes were assigned to all issues of MIS Quarterly from commencement of publication to the end of the 2012 publication year to support the conjecture that coding saturation has been approximated. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a statement, research objective or a finding; this represented approximately 34 per cent of codes assigned. In research articles where mathematical analysis was presented, researchers not infrequently commented on the degree to which a mathematical formulation was "intuitive"; this was the second most common coding, representing approximately 16 per cent of the codes. Possibly the most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research aims to contribute to a greater theoretical understanding of the use of intuition in academic IS research publications. It offers potential benefits to practitioners by providing insight into the use of intuition in IS management, for example, emphasizing the emerging importance of "intuitive technology". Research directions include the creation of reflective and/or formative constructs for intuition in information systems research and the expansion of this novel research method to additional IS academic publications and topics.

Relevance: 10.00%

Abstract:

In this paper we employ a hypothetical discrete choice experiment (DCE) to examine how much consumers are willing to pay to use technology to customize their food shopping. We conjecture that customized information provision can aid in the composition of a healthier shop. Our results reveal that consumers are prepared to pay relatively more for individual-specific information as opposed to generic nutritional information that is typically provided on food labels. In arriving at these results we have examined various model specifications, including those that make use of ex-post de-briefing questions on attribute non-attendance and attribute ranking information and those that consider the time taken to complete the survey. Our main results are robust to the various model specifications we examine.
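As a final illustration of how such willingness-to-pay (WTP) figures are typically read off a discrete choice model, the sketch below takes the standard ratio of an attribute coefficient to the negative of the price coefficient from a conditional logit; the coefficients are hypothetical numbers, not estimates from this study.

def wtp(beta_attribute, beta_price):
    """Marginal WTP for a one-unit change in the attribute, from conditional logit coefficients."""
    return -beta_attribute / beta_price

# Hypothetical fitted coefficients: price enters negatively, as expected.
beta_price = -0.42                  # per unit of currency
beta_generic_info = 0.15            # generic nutrition label
beta_personalised_info = 0.55       # individual-specific information

print(f"WTP, generic information:      {wtp(beta_generic_info, beta_price):.2f}")
print(f"WTP, personalised information: {wtp(beta_personalised_info, beta_price):.2f}")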