11 results for the U.S.
in CentAUR: Central Archive University of Reading - UK
Abstract:
Investment risk models with infinite variance describe the distributions of individual property returns in the IPD database over the period 1981 to 2003 better than Normally distributed risk models do, mirroring results obtained with identical methodology in the U.S. and Australia. Real estate investment risk is heteroscedastic, but the Characteristic Exponent of the investment risk function is constant across time, though it may vary by property type. Asset diversification is far less effective at reducing the impact of non-systematic investment risk on real estate portfolios than it is for assets with Normally distributed investment risk. Multi-risk-factor portfolio allocation models based on measures of investment codependence from finite-variance statistics are ineffectual in the real estate context.
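The core comparison here (a Normal fit versus an infinite-variance stable fit, with the Characteristic Exponent as the key parameter) can be sketched in a few lines. This is a minimal illustration assuming SciPy's levy_stable distribution and simulated placeholder data; the IPD return series is not reproduced.

```python
# Minimal sketch: compare a Normal fit with an alpha-stable fit to a return
# series. Simulated placeholder data stand in for the IPD property returns.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder "returns" drawn from a heavy-tailed stable law (alpha < 2)
returns = stats.levy_stable.rvs(alpha=1.7, beta=0.0, size=500, random_state=rng)

# Finite-variance benchmark: Normal fit
mu, sigma = stats.norm.fit(returns)

# Infinite-variance alternative: alpha-stable fit (numerical MLE; can be slow).
# alpha is the Characteristic Exponent: alpha = 2 recovers the Normal,
# alpha < 2 implies infinite variance.
alpha, beta, loc, scale = stats.levy_stable.fit(returns)

print(f"Normal fit: mu={mu:.3f}, sigma={sigma:.3f}")
print(f"Stable fit: alpha={alpha:.2f}, beta={beta:.2f}")
```

An estimated alpha well below 2 that is stable across subperiods but differs between property types would correspond to the pattern the abstract describes.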
Abstract:
This paper examines the dynamics of the residential property market in the United States between 1960 and 2011. Given the cyclicality and apparent overvaluation of the market over this period, we determine whether deviations of real estate prices from their fundamentals were caused by two types of bubbles: intrinsic bubbles and rational speculative bubbles. We find evidence of an intrinsic bubble in the market pre-2000, implying that overreaction to changes in rents contributed to the overvaluation of real estate prices. However, using a regime-switching model, we find evidence of periodically collapsing rational bubbles in the post-2000 market.
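A regime-switching setup of the kind mentioned above can be sketched with statsmodels' MarkovRegression. The series and the two-regime specification below are hypothetical placeholders for illustration, not the paper's actual model of prices and rents.

```python
# Sketch: two-regime Markov-switching model, the style of tool used to detect
# periodically collapsing bubbles. All data here are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
# Placeholder series mixing a calm regime with an occasional high-variance one
calm = rng.normal(0.0, 1.0, 200)
wild = rng.normal(0.0, 3.0, 200)
series = np.where(rng.random(200) < 0.2, wild, calm)

mod = sm.tsa.MarkovRegression(
    series,
    k_regimes=2,              # e.g. "surviving" vs "collapsing" states
    trend="c",
    switching_variance=True,  # let the variance differ across regimes
)
res = mod.fit()
print(res.summary())
# Smoothed probability of being in regime 0 at each date
print(res.smoothed_marginal_probabilities[:, 0])
```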
Abstract:
We analyze the impact of political proximity to the United States on the occurrence and severity of terror. Employing panel data for 116 countries over the period 1975–2001, we find that countries voting in line with the U.S. are victims of more and deadlier attacks.
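A panel specification in the spirit of this analysis might look as follows. The variable names, the fixed-effects Poisson estimator, and the simulated data are all assumptions for illustration; the paper's actual estimator and controls are not reproduced here.

```python
# Sketch: attack counts regressed on UN voting alignment with the U.S.,
# with country and year fixed effects. All data here are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_countries, years = 116, np.arange(1975, 2002)  # matches the panel dimensions
df = pd.DataFrame({
    "country": np.repeat(np.arange(n_countries), len(years)),
    "year": np.tile(years, n_countries),
    "alignment": rng.uniform(0, 1, n_countries * len(years)),
})
# Simulated outcome: counts rising with alignment (placeholder effect size)
df["attacks"] = rng.poisson(np.exp(0.5 * df["alignment"]))

res = smf.poisson(
    "attacks ~ alignment + C(country) + C(year)", data=df
).fit(disp=0)
print(res.params["alignment"])  # positive => more attacks when voting with the U.S.
```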
Abstract:
While a quantitative climate theory of tropical cyclone formation remains elusive, considerable progress has been made recently in our ability to simulate tropical cyclone climatologies and understand the relationship between climate and tropical cyclone formation. Climate models are now able to simulate a realistic rate of global tropical cyclone formation, although simulation of the Atlantic tropical cyclone climatology remains challenging unless horizontal resolutions finer than 50 km are employed. This article summarizes published research from the idealized experiments of the Hurricane Working Group of U.S. CLIVAR (CLImate VARiability and predictability of the ocean-atmosphere system). This work, combined with results from other model simulations, has strengthened relationships between tropical cyclone formation rates and climate variables such as mid-tropospheric vertical velocity, with decreased climatological vertical velocities leading to decreased tropical cyclone formation. Systematic differences are shown between experiments in which only sea surface temperature is increased versus experiments where only atmospheric carbon dioxide is increased, with the carbon dioxide experiments more likely to demonstrate the decrease in tropical cyclone numbers previously shown to be a common response of climate models in a warmer climate. Experiments where the two effects are combined also show decreases in numbers, but these tend to be less for models that demonstrate a strong tropical cyclone response to increased sea surface temperatures. Further experiments are proposed that may improve our understanding of the relationship between climate and tropical cyclone formation, including experiments with two-way interaction between the ocean and the atmosphere and variations in atmospheric aerosols.
Abstract:
On 23 November 1981, a strong cold front swept across the U.K., producing tornadoes from the west to the east coasts. An extensive campaign to collect tornado reports by the Tornado and Storm Research Organisation (TORRO) resulted in 104 reports, the largest U.K. outbreak. The front was simulated with a convection-permitting numerical model down to 200-m horizontal grid spacing to better understand its evolution and meteorological environment. The event was typical of tornadoes in the U.K., with convective available potential energy (CAPE) less than 150 J kg⁻¹, 0-1-km wind shear of 10-20 m s⁻¹, and a narrow cold-frontal rainband forming precipitation cores and gaps. A line of cyclonic absolute vorticity existed along the front, with maxima as large as 0.04 s⁻¹. Some hook-shaped misovortices bore kinematic similarity to supercells. The narrow swath along which the line was tornadic was bounded on the equatorward side by weak vorticity along the line and on the poleward side by zero CAPE, enclosing a region where the environment was otherwise favorable for tornadogenesis. To determine whether the 104 tornado reports were plausible, possible duplicate reports were first eliminated, leaving between 58 and 90 tornadoes. Second, the number of possible parent misovortices that may have spawned tornadoes was estimated from model output. Between 22 and 44 of the plausible tornado reports fell within the 200-m grid-spacing domain, whereas the model simulation yielded an estimate of 30 possible parent misovortices within this domain. These results suggest that as many as 90 reports were plausible.
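The environmental parameters quoted above (CAPE below ~150 J kg⁻¹ and 0-1-km bulk shear of 10-20 m s⁻¹) are standard sounding diagnostics. A minimal sketch assuming the MetPy library and a hypothetical profile, not the 23 November 1981 data:

```python
# Sketch: compute surface-based CAPE and 0-1-km bulk shear from a sounding.
# The profile below is a made-up placeholder, not the 1981 event sounding.
import numpy as np
import metpy.calc as mpcalc
from metpy.units import units

pressure = np.array([1000, 950, 900, 850, 800, 700, 600, 500]) * units.hPa
temperature = np.array([8, 6, 4, 2, 0, -5, -12, -20]) * units.degC
dewpoint = np.array([6, 4, 2, 0, -3, -10, -20, -30]) * units.degC
height = np.array([0, 440, 900, 1380, 1900, 3010, 4200, 5570]) * units.meter
u = np.array([10, 14, 17, 19, 20, 22, 25, 28]) * units('m/s')
v = np.array([2, 3, 4, 5, 5, 6, 7, 8]) * units('m/s')

cape, cin = mpcalc.surface_based_cape_cin(pressure, temperature, dewpoint)
u_shr, v_shr = mpcalc.bulk_shear(pressure, u, v, height=height,
                                 depth=1 * units.km)
shear_mag = mpcalc.wind_speed(u_shr, v_shr)
print(f"SBCAPE = {cape:.0f}, 0-1-km shear = {shear_mag:.1f}")
```

Low CAPE with strong low-level shear of this sort is the high-shear, low-CAPE regime the abstract describes as typical of U.K. tornado environments.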
Abstract:
Causing civilian casualties during military operations has become a highly politicised topic in international relations since the Second World War. Since the last decade of the 20th century, scholars and political analysts have claimed that human life is valued more and more by the general international community. This argument has led many researchers to assume that democratic culture and traditions, along with modern ethical and moral concerns, have created a desire for a world without war or, at least, a demand that contemporary armed conflicts, if unavoidable, be far less lethal, forcing the military to seek new technologies that can minimise civilian casualties and collateral damage. Non-Lethal Weapons (NLW), weapons intended to minimise civilian casualties and collateral damage, are based on technology that, during the 1990s, was expected to revolutionise the conduct of warfare by making it significantly less deadly. The rapid rise of interest in NLW, ignited by the American military twenty-five years ago, sparked an entirely new military and academic discourse concerning their potential contribution to military success on 21st-century battlefields. It seems, however, that beyond this debate very little has been done within the military forces themselves. This research suggests that the roots of this situation run much deeper than simple professional misconduct by the military establishment, or the poor political behaviour of the political leaders who sent them to fight. Following the story of NLW in the U.S., Russia and Israel, this research focuses on the political and cultural factors that were supposed to force the military organisations of these countries to adopt new technologies and operational and organisational concepts regarding NLW, in an attempt to minimise enemy civilian casualties during their military operations. It finds that while the American, Russian and Israeli national characters are undoubtedly products of the unique historical experience of each of these nations, all three pay very little regard to foreigners' lives. Moreover, while it is generally argued that international political pressure is a crucial factor in significantly reducing civilian casualties and the destruction of civilian infrastructure, the findings of this research suggest that the American, Russian and Israeli governments are well prepared and politically equipped to fend off international criticism. As the analyses of the American, Russian and Israeli cases reveal, the political-military leaderships of these countries have very few external or domestic reasons to minimise enemy civilian casualties through a fundamental, revolutionary change in their conduct of war. In other words, this research finds that the employment of NLW has failed because the political leadership asks the military to reduce enemy civilian casualties to a politically acceptable level, rather than to the technologically possible minimum; in the socio-cultural-political context of each country, support for the former is significantly higher than for the latter.
Abstract:
This article describes the development and evaluation of the U.K.'s new High-Resolution Global Environmental Model (HiGEM), which is based on the latest climate configuration of the Met Office Unified Model, known as the Hadley Centre Global Environmental Model, version 1 (HadGEM1). In HiGEM, the horizontal resolution has been increased to 0.83° latitude × 1.25° longitude for the atmosphere, and 1/3° × 1/3° globally for the ocean. Multidecadal integrations of HiGEM, and the lower-resolution HadGEM, are used to explore the impact of resolution on the fidelity of climate simulations. Generally, SST errors are reduced in HiGEM. Cold SST errors associated with the path of the North Atlantic Drift improve, and warm SST errors are reduced in upwelling stratocumulus regions where the simulation of low-level cloud is better at higher resolution. The ocean model in HiGEM allows ocean eddies to be partially resolved, which dramatically improves the representation of sea surface height variability. In the Southern Ocean, most of the heat transport in HiGEM is achieved by resolved eddy motions, which replace the parameterized eddy heat transport of the lower-resolution model. HiGEM is also able to more realistically simulate small-scale features in the wind stress curl around islands and oceanic SST fronts, which may have implications for oceanic upwelling and ocean biology. Higher resolution in both the atmosphere and the ocean allows coupling to occur on small spatial scales. In particular, the small-scale interaction recently seen in satellite imagery between the atmosphere and tropical instability waves in the tropical Pacific Ocean is realistically captured in HiGEM. Tropical instability waves play a role in improving the simulation of the mean state of the tropical Pacific, which has important implications for climate variability. In particular, all aspects of the simulation of ENSO (spatial patterns, the time scales at which ENSO occurs, and global teleconnections) are much improved in HiGEM.
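For scale, the quoted grid spacings in degrees convert to kilometres as below. This is a small arithmetic sketch assuming a spherical Earth of radius 6371 km; the helper function and the 45° latitude chosen are illustrative, not part of HiGEM itself.

```python
# Convert a grid spacing in degrees to approximate kilometres at a latitude,
# assuming a spherical Earth. Illustrative helper, not HiGEM code.
import numpy as np

R_EARTH_KM = 6371.0

def grid_spacing_km(dlat_deg, dlon_deg, lat_deg):
    dy = np.radians(dlat_deg) * R_EARTH_KM                               # meridional
    dx = np.radians(dlon_deg) * R_EARTH_KM * np.cos(np.radians(lat_deg)) # zonal
    return dy, dx

dy, dx = grid_spacing_km(0.83, 1.25, 45.0)  # HiGEM atmosphere grid at 45° lat
print(f"atmosphere: ~{dy:.0f} km x ~{dx:.0f} km")
dy, dx = grid_spacing_km(1/3, 1/3, 45.0)    # HiGEM ocean grid at 45° lat
print(f"ocean: ~{dy:.0f} km x ~{dx:.0f} km")
```

At midlatitudes the 1/3° ocean grid works out to a few tens of kilometres, comparable to the ocean eddy scale, which is why eddies are only partially resolved.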
Abstract:
Immature and mature calcretes from an alluvial terrace sequence in the Sorbas basin, southeast Spain, were dated by the U-series isochron technique. The immature horizons consistently produced statistically reliable ages of high precision. The mature horizons typically produced statistically unreliable ages but, because of linear trends in the dataset and the low errors associated with each data point, it was still possible to place a best-fit isochron through the dataset and produce an age with low associated uncertainties. It is, however, only possible to prove that these statistically unreliable ages have geochronological significance if multiple isochron ages are produced for a single site, and if these multiple ages are stratigraphically consistent. The geochronological significance of such ages can be further proven if at least one of the multiple ages is statistically reliable. By using this technique to date calcretes that formed during terrace aggradation and at the terrace surface after terrace abandonment, it is possible not only to date the timing of terrace aggradation but also to constrain the age at which the river switched from aggradation to incision. This approach therefore constrains the timing of changes in fluvial processes more reliably than any currently used geochronological procedure and is appropriate for dating terrace sequences in dryland regions worldwide, wherever calcrete horizons are present.
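The "best-fit isochron" step can be illustrated numerically. This is a minimal sketch assuming simple unweighted least squares, made-up activity ratios, and the simplified ²³⁰Th age equation; real isochron work propagates correlated errors (e.g. with York-style fits) and accounts for ²³⁴U/²³⁸U disequilibrium.

```python
# Sketch: fit a best-fit isochron line through measured activity ratios and
# convert its slope to an age. Data points here are illustrative placeholders.
import numpy as np

# Hypothetical activity ratios for sub-samples of one calcrete horizon
x = np.array([1.2, 2.1, 3.4, 4.8, 6.0])      # (234U/232Th)
y = np.array([0.55, 0.98, 1.60, 2.24, 2.80])  # (230Th/232Th)

slope, intercept = np.polyfit(x, y, 1)  # unweighted least-squares isochron

# Simplified closed-system age equation: slope = 1 - exp(-lambda_230 * t)
lam_230 = 9.1577e-6  # 230Th decay constant in 1/yr (half-life ~75.7 kyr)
age_yr = -np.log(1.0 - slope) / lam_230
print(f"slope = {slope:.3f} -> isochron age ~ {age_yr / 1000:.0f} ka")
```

The intercept estimates the initial detrital ²³⁰Th/²³²Th, which is why the isochron approach can date impure carbonates such as calcretes at all.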