73 results for Models and modeling
Abstract:
Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both describe the data better than models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
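A minimal sketch of fitting a two-regime Markov switching model to a return series with statsmodels; the simulated data, regime count and settings are illustrative assumptions rather than the paper's exact specification.

```python
# Illustrative two-regime Markov switching model for a value-index return
# series; the data and settings are assumptions, not the paper's own setup.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Placeholder return series with a calm regime and a volatile regime.
returns = np.concatenate([
    rng.normal(0.005, 0.01, 150),   # calm regime
    rng.normal(-0.01, 0.05, 150),   # volatile regime
])

# Two regimes with switching mean and variance, relaxing the single
# constant-mean, constant-variance distribution assumption.
model = sm.tsa.MarkovRegression(
    returns, k_regimes=2, trend="c", switching_variance=True
)
result = model.fit()
print(result.summary())
# result.smoothed_marginal_probabilities gives the regime probabilities
# over time for inspection of the inferred states.
```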
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported, and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational, and ultimately reputational, risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk of its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk to the business of technology obsolescence is identified, enabling remedial action using qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. However, the generic technique could be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
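A hypothetical sketch of the kind of heatmap scoring described: each software element captured in the enterprise architecture model receives a risk score and a heatmap band. The attributes, weights and thresholds are illustrative assumptions, not the actual algorithm used in the consulting studies.

```python
# Hypothetical obsolescence heatmap scoring over enterprise-architecture
# elements; attributes, weights and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class SoftwareElement:
    name: str
    years_past_support: float   # time since vendor support ended
    business_criticality: int   # 1 (low) .. 5 (high), from the EA model

def risk_score(e: SoftwareElement) -> float:
    # Simple multiplicative score: unsupported age weighted by criticality.
    return e.years_past_support * e.business_criticality

def heatmap_band(score: float) -> str:
    # Map a score onto heatmap bands (thresholds are arbitrary examples).
    if score >= 10:
        return "red"
    if score >= 4:
        return "amber"
    return "green"

elements = [
    SoftwareElement("Case management system", 3.0, 5),
    SoftwareElement("Intranet CMS", 1.0, 2),
]
for e in elements:
    print(e.name, heatmap_band(risk_score(e)))
```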
Abstract:
The response of the Southern Ocean to a repeating seasonal cycle of ozone loss is studied in two coupled climate models and found to comprise both fast and slow processes. The fast response is similar to the inter-annual signature of the Southern Annular Mode (SAM) on Sea Surface Temperature (SST), on to which the ozone-hole forcing projects in the summer. It comprises enhanced northward Ekman drift inducing negative summertime SST anomalies around Antarctica, earlier sea ice freeze-up the following winter, and northward expansion of the sea ice edge year-round. The enhanced northward Ekman drift, however, results in upwelling of warm waters from below the mixed layer in the region of seasonal sea ice. With sustained bursts of westerly winds induced by ozone depletion, this warming from below eventually dominates over the cooling from anomalous Ekman drift. The resulting slow-timescale response (years to decades) leads to warming of SSTs around Antarctica and ultimately a reduction in sea-ice cover year-round. This two-timescale behavior - rapid cooling followed by slow but persistent warming - is found in the two coupled models analysed, one with an idealized geometry, the other a complex global climate model with realistic geometry. Processes that control the timescale of the transition from cooling to warming, and their uncertainties, are described. Finally, we discuss the implications of our results for rationalizing previous studies of the effect of the ozone hole on SST and sea-ice extent.
Abstract:
Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind “noise,” which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical “downscaling” of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme.
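A minimal sketch of the downscaling idea described above, assuming a placeholder 1-minute observation series: an 8 h smoothing stands in for solar wind model output, and small-scale structure is restored by resampling the smoothing residuals to form an ensemble.

```python
# Illustrative "downscaling" of a smoothed solar wind series: 8 h smoothing
# stands in for solar wind model output, and noise drawn from the observed
# residual distribution is added back. Simplified sketch only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Placeholder 1-minute observations of a solar wind parameter (e.g. Bz).
obs = pd.Series(rng.normal(0, 3, 8 * 60 * 24), name="bz")

# Proxy for solar wind model output: 8-hour (480-minute) rolling mean.
smoothed = obs.rolling(window=8 * 60, center=True, min_periods=1).mean()

# Small-scale structure removed by the smoothing.
residuals = obs - smoothed

# Ensemble of downscaled series: add noise resampled from the observed
# residual distribution (a very simple parameterization of the "noise").
ensemble = [
    smoothed + rng.choice(residuals.values, size=len(smoothed))
    for _ in range(10)
]
print(len(ensemble), "ensemble members of length", len(ensemble[0]))
```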
Abstract:
Second language acquisition researchers often face particular challenges when attempting to generalize study findings to the wider learner population. For example, language learners constitute a heterogeneous group, and it is not always clear how a study's findings may generalize to other individuals who may differ in terms of language background and proficiency, among many other factors. In this paper, we provide an overview of how mixed-effects models can be used to help overcome these and other issues in the field of second language acquisition. We outline the benefits of mixed-effects models and present a practical example of how a mixed-effects analysis can be conducted. Mixed-effects models provide second language researchers with a powerful statistical tool for the analysis of many different types of data.
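A minimal sketch of a mixed-effects analysis of the kind described, using statsmodels; the variables (accuracy, proficiency, participant) and the simulated data are hypothetical, not taken from the paper.

```python
# Hypothetical mixed-effects analysis for an L2 study: accuracy as a
# function of proficiency, with random intercepts for participants.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_participants, n_items = 30, 20
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n_participants), n_items),
    "proficiency": np.repeat(rng.normal(0, 1, n_participants), n_items),
})
df["accuracy"] = (
    0.5 * df["proficiency"]
    + rng.normal(0, 1, n_participants)[df["participant"]]  # participant effect
    + rng.normal(0, 1, len(df))                            # residual noise
)

# Fixed effect of proficiency, random intercept per participant.
model = smf.mixedlm("accuracy ~ proficiency", df, groups=df["participant"])
result = model.fit()
print(result.summary())
```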
Abstract:
This paper investigates the challenge of representing structural differences in river channel cross-section geometry for regional to global scale river hydraulic models and the effect this can have on simulations of wave dynamics. Classically, channel geometry is defined using data, yet at larger scales the necessary information and model structures do not exist to take this approach. We therefore propose a fundamentally different approach where the structural uncertainty in channel geometry is represented using a simple parameterization, which could then be estimated through calibration or data assimilation. This paper first outlines the development of a computationally efficient numerical scheme to represent generalised channel shapes using a single parameter, which is then validated using a simple straight channel test case and shown to predict wetted perimeter to within 2% for the channels tested. An application to the River Severn, UK is also presented, along with an analysis of model sensitivity to channel shape, depth and friction. The channel shape parameter was shown to improve model simulations of river level, particularly for more physically plausible channel roughness and depth parameter ranges. Calibrating channel Manning’s coefficient in a rectangular channel provided similar water level simulation accuracy in terms of Nash-Sutcliffe efficiency to a model where friction and shape or depth were calibrated. However, the calibrated Manning coefficient in the rectangular channel model was ~2/3 greater than the likely physically realistic value for this reach and this erroneously slowed wave propagation times through the reach by several hours. Therefore, for large scale models applied in data sparse areas, calibrating channel depth and/or shape may be preferable to assuming a rectangular geometry and calibrating friction alone.
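One plausible illustration of a single-parameter channel shape and its wetted perimeter, assuming a power-law cross-section whose exponent moves the shape from triangular towards rectangular; this is an assumed form for illustration, not necessarily the paper's actual parameterization.

```python
# Illustrative single-parameter channel geometry: bed elevation follows a
# power law across the channel, with exponent s controlling the shape
# (s=1 triangular, large s approaching rectangular). Assumed form only.
import numpy as np

def wetted_perimeter(width, depth, s, n=2000):
    """Numerically integrate the wetted perimeter of a power-law channel."""
    x = np.linspace(-width / 2, width / 2, n)
    bed = depth * (np.abs(x) / (width / 2)) ** s  # bed elevation profile
    dx = np.diff(x)
    dz = np.diff(bed)
    return np.sum(np.sqrt(dx**2 + dz**2))

# Bankfull channel, 50 m wide and 2 m deep, for three shape parameters.
for s in (1.0, 2.0, 10.0):
    print(f"s={s:4.1f}  wetted perimeter = {wetted_perimeter(50.0, 2.0, s):.1f} m")
```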
Abstract:
Key Performance Indicators (KPIs) are the main instruments of Business Performance Management. KPIs are measures that relate to both the strategy and the business processes of an organization. These measures are often designed for an industry sector under assumptions about the business processes in organizations. However, those assumptions can be too incomplete to guarantee the required properties of KPIs. This raises the need to validate the properties of KPIs prior to their application to performance measurement. This paper applies the method called EXecutable Requirements Engineering Management and Evolution (EXTREME) to the validation of KPI definitions. EXTREME semantically relates the goal modeling, conceptual modeling and protocol modeling techniques into one methodology. The synchronous composition built into protocol modeling enables traceability of goals in protocol models and constructive definitions of a KPI. The application of the method clarifies the meaning of KPI properties and the procedures for their assessment and validation.
Abstract:
Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies of conservation and of the exploitation of fisheries, and for assessing the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles together with the laws of conservation of matter and energy. To complete construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; the maximum rates of ingestion, growth and reproduction; and life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to manually compare model outputs with observations from real landscapes and so obtain parameters which produce acceptable fits of model to data. However, this procedure can be convoluted and lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
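A minimal sketch of ABC rejection sampling used to calibrate one parameter of a toy stochastic population model standing in for an IBM; the model, prior, summary statistic and tolerance are all illustrative assumptions.

```python
# Illustrative ABC rejection sampling: calibrate the growth rate of a toy
# stochastic population model against an "observed" summary statistic.
import numpy as np

rng = np.random.default_rng(3)

def simulate_population(growth_rate, n0=50, steps=30):
    """Toy stochastic model standing in for an individual-based model."""
    n = n0
    trajectory = []
    for _ in range(steps):
        n = rng.poisson(n * growth_rate)
        trajectory.append(n)
    return np.array(trajectory)

observed = simulate_population(1.05)          # pretend field data
obs_summary = observed.mean()                 # summary statistic

accepted = []
for _ in range(5000):
    candidate = rng.uniform(0.9, 1.2)         # draw from the prior
    sim_summary = simulate_population(candidate).mean()
    if abs(sim_summary - obs_summary) < 5.0:  # tolerance on the summary
        accepted.append(candidate)

print(f"posterior mean growth rate ~ {np.mean(accepted):.3f} "
      f"from {len(accepted)} accepted draws")
```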
Abstract:
Species distribution models (SDM) are increasingly used to understand the factors that regulate variation in biodiversity patterns and to help plan conservation strategies. However, these models are rarely validated with independently collected data, and it is unclear whether SDM performance is maintained across distinct habitats and for species with different functional traits. Highly mobile species, such as bees, can be particularly challenging to model. Here, we use independent sets of occurrence data collected systematically in several agricultural habitats to test how the predictive performance of SDMs for wild bee species depends on species traits, habitat type, and sampling technique. We used a species distribution modeling approach parametrized for the Netherlands, with presence records from 1990 to 2010 for 193 Dutch wild bees. For each species, we built a Maxent model based on 13 climate and landscape variables. We tested the predictive performance of the SDMs with independent datasets collected from orchards and arable fields across the Netherlands from 2010 to 2013, using transect surveys or pan traps. Model predictive performance depended on species traits and habitat type. Occurrence of bee species specialized in habitat and diet was better predicted than that of generalist bees. Predictions of habitat suitability were also more precise for habitats that are temporally more stable (orchards) than for habitats that undergo regular alterations (arable fields), particularly for small, solitary bees. As a conservation tool, SDMs are better suited to modeling rarer, specialist species than more generalist ones, and will work best in long-term stable habitats. The variability of complex, short-term habitats is difficult to capture in such models, and historical land-use data generally have low thematic resolution. To improve the usefulness of SDMs, models require explanatory variables and collection data that include detailed landscape characteristics, for example the variability of crops and flower availability. Additionally, testing SDMs with field surveys should involve multiple collection techniques.
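A minimal sketch of testing SDM habitat-suitability predictions against an independently collected survey dataset with an AUC score; the arrays are placeholders and the SDM itself (e.g. Maxent) is not reimplemented here.

```python
# Illustrative validation of SDM predictions against an independent survey
# dataset using AUC; arrays are placeholders for real model output and data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)

# Habitat suitability predicted by a fitted SDM at surveyed sites (0..1).
predicted_suitability = rng.uniform(0, 1, 200)

# Independently observed presence (1) / absence (0) at the same sites,
# loosely correlated with the predictions for illustration.
observed = (predicted_suitability + rng.normal(0, 0.4, 200) > 0.6).astype(int)

auc = roc_auc_score(observed, predicted_suitability)
print(f"AUC against independent survey data: {auc:.2f}")
```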
Abstract:
We review the effects of dynamical variability on clouds and radiation in observations and models and discuss their implications for cloud feedbacks. Jet shifts produce robust meridional dipoles in upper-level clouds and longwave cloud-radiative effect (CRE), but low-level clouds, which do not simply shift with the jet, dominate the shortwave CRE. Because the effect of jet variability on CRE is relatively small, future poleward jet shifts with global warming are only a second-order contribution to the total CRE changes around the midlatitudes, suggesting a dominant role for thermodynamic effects. This implies that constraining the dynamical response is unlikely to reduce the uncertainty in extratropical cloud feedback. However, we argue that uncertainty in the cloud-radiative response does affect the atmospheric circulation response to global warming, by modulating patterns of diabatic forcing. How cloud feedbacks can affect the dynamical response to global warming is an important topic of future research.
Abstract:
The level of agreement between climate model simulations and observed surface temperature change is a topic of scientific and policy concern. While the Earth system continues to accumulate energy due to anthropogenic and other radiative forcings, estimates of recent surface temperature evolution fall at the lower end of climate model projections. Global mean temperatures from climate model simulations are typically calculated using surface air temperatures, while the corresponding observations are based on a blend of air and sea surface temperatures. This work quantifies a systematic bias in model-observation comparisons arising from differential warming rates between sea surface temperatures and surface air temperatures over oceans. A further bias arises from the treatment of temperatures in regions where the sea ice boundary has changed. Applying the methodology of the HadCRUT4 record to climate model temperature fields accounts for 38% of the discrepancy in trend between models and observations over the period 1975–2014.
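A minimal sketch of the blending idea, assuming placeholder fields and fractions: surface air temperature anomalies are used over land and sea ice, SST anomalies over open ocean, and the two global means are compared.

```python
# Illustrative blending of surface air temperature (SAT) and sea surface
# temperature (SST) anomalies: SAT over land and sea ice, SST over open
# ocean. Grid, fields and fractions are placeholders.
import numpy as np

rng = np.random.default_rng(5)
nlat, nlon = 36, 72
lat = np.linspace(-87.5, 87.5, nlat)

sat = rng.normal(0.6, 0.3, (nlat, nlon))        # SAT anomaly field (K)
sst = sat - 0.1                                 # oceans warming slightly less
land_frac = rng.uniform(0, 1, (nlat, nlon))     # land fraction per cell
ice_frac = rng.uniform(0, 1, (nlat, nlon)) * (np.abs(lat)[:, None] > 60)

# Blend: SAT where there is land or sea ice, SST over open ocean.
sat_weight = np.clip(land_frac + (1 - land_frac) * ice_frac, 0, 1)
blended = sat_weight * sat + (1 - sat_weight) * sst

# Area weights (cosine of latitude) for the global mean.
w = np.cos(np.deg2rad(lat))[:, None] * np.ones((nlat, nlon))
print("global mean SAT    :", np.average(sat, weights=w).round(3))
print("global mean blended:", np.average(blended, weights=w).round(3))
```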
Abstract:
Population ecology is a discipline that studies changes in the number and composition (age, sex) of the individuals that form a population. Many of the mechanisms that generate these changes are associated with individual behavior, for example how individuals defend their territories, find mates or disperse. It is therefore important to model population dynamics while considering the potential influence of behavior on the modeled dynamics. This study illustrates the diversity of behaviors that influence population dynamics and describes several methods for integrating behavior into population models, which range from simpler models that only consider the number of individuals to complex individual-based models that capture great levels of detail. A series of examples shows the importance of explicitly considering behavior in population modeling to avoid reaching erroneous conclusions. This integration is particularly relevant for conservation, as incorrect predictions regarding the dynamics of populations of conservation interest can lead to inadequate assessment and management. Improved predictions can favor the effective protection of species and better use of limited financial and human conservation resources.
Abstract:
As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state, and performed a systematic comparison of the WTG and DGW methods across models, as well as of the behavior of the different models under each method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy than those produced by the WTG simulations. These large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
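A minimal sketch of the weak temperature gradient diagnostic, assuming placeholder profiles and timescale: the large-scale vertical velocity is chosen so that vertical advection of the reference potential temperature relaxes the temperature anomaly toward the reference profile.

```python
# Illustrative weak temperature gradient (WTG) diagnostic: the large-scale
# vertical velocity balances the relaxation of the potential temperature
# anomaly toward a reference profile. Profiles and timescale are placeholders.
import numpy as np

tau = 3 * 3600.0                      # relaxation timescale (s), assumed
z = np.linspace(1000.0, 15000.0, 50)  # free-tropospheric levels (m)

theta_ref = 300.0 + 4.0e-3 * z        # reference potential temperature (K)
# Placeholder warm anomaly peaking in the mid troposphere.
theta = theta_ref + 0.5 * np.sin(np.pi * (z - z[0]) / (z[-1] - z[0]))

dtheta_ref_dz = np.gradient(theta_ref, z)   # static stability (K/m)

# WTG balance: w * dtheta_ref/dz = (theta - theta_ref) / tau
w_wtg = (theta - theta_ref) / (tau * dtheta_ref_dz)
print("peak diagnosed large-scale w (m/s):", float(w_wtg.max()))
```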