929 results for Distributed lag model


Relevance:

30.00%

Publisher:

Abstract:

We present high time-resolution multiwavelength observations of X-ray bursts in the low-mass X-ray binary UY Vol. Strong reprocessed signals are present in the ultraviolet and optical, lagged and smeared with respect to the X-rays. The addition of far-ultraviolet coverage for one burst allows much tighter constraints on the temperature and geometry of the reprocessing region than previously possible. A blackbody reprocessing model for this burst suggests a rise in temperature during the burst from 18,000 to 35,000 K and an emitting area comparable to that expected for the disk and/or irradiated companion star. The lags are consistent with those expected. The single-zone blackbody model cannot reproduce the ratio of optical to ultraviolet flux during the burst, however. The discrepancy seems too large to explain with deviations from a local blackbody spectrum and more likely indicates that a range of reprocessing temperatures is required. Comparable results are derived from other bursts; in particular, the lag and smearing both appear shorter when the companion star is on the near side of the disk, as predicted. The burst observed by HST also yielded a spectrum of the reprocessed light. It is dominated by continuum, with a spectral shape consistent with the temperatures derived from light-curve modeling. Taken as a whole, our observations confirm the standard paradigm of prompt reprocessing distributed across the disk and companion star, with the response dominated by a thermalized continuum rather than by emission lines.

Relevance:

30.00%

Publisher:

Abstract:

Many urban surface energy balance models now exist. These vary in complexity from simple schemes that represent the city as a concrete slab, to those which incorporate detailed representations of momentum and energy fluxes distributed within the atmospheric boundary layer. While many of these schemes have been evaluated against observations, with some models even compared with the same data sets, such evaluations have not been undertaken in a controlled manner to enable direct comparison. For other types of climate model, for instance the Project for Intercomparison of Land-Surface Parameterization Schemes (PILPS) experiments (Henderson-Sellers et al., 1993), such controlled comparisons have been shown to provide important insights into both the mechanics of the models and the physics of the real world. This paper describes the progress that has been made to date on a systematic and controlled comparison of urban surface schemes. The models to be considered, and their key attributes, are described, along with the methodology to be used for the evaluation.

Relevance:

30.00%

Publisher:

Abstract:

Bayesian analysis is given of an instrumental variable model that allows for heteroscedasticity in both the structural equation and the instrument equation. Specifically, the approach for dealing with heteroscedastic errors in Geweke (1993) is extended to the Bayesian instrumental variable estimator outlined in Rossi et al. (2005). Heteroscedasticity is treated by modelling the variance for each error using a hierarchical prior that is Gamma distributed. The computation is carried out by using a Markov chain Monte Carlo sampling algorithm with an augmented draw for the heteroscedastic case. An example using real data illustrates the approach and shows that ignoring heteroscedasticity in the instrument equation when it exists may lead to biased estimates.
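Under Geweke's (1993) hierarchical treatment, each per-observation variance scale has a closed-form conditional posterior, which keeps the augmented Gibbs step cheap. A minimal sketch of that single draw (not the authors' full sampler; the function name and the simplification sigma^2 = 1 are ours):

```python
import numpy as np

def draw_lambda(resid, nu, rng):
    """One Gibbs draw of per-observation variance scales lambda_i.

    Following Geweke (1993): e_i ~ N(0, sigma^2 * lambda_i) with prior
    nu / lambda_i ~ chi^2(nu) gives the conditional posterior
    (nu + e_i^2 / sigma^2) / lambda_i ~ chi^2(nu + 1).  Here sigma^2 is
    taken as 1 for clarity (fold it into `resid` otherwise).
    """
    chi2 = rng.chisquare(nu + 1, size=resid.shape)
    return (nu + resid ** 2) / chi2

rng = np.random.default_rng(0)
e = rng.normal(size=500)            # stand-in for structural-equation residuals
lam = draw_lambda(e, nu=5.0, rng=rng)
```

In the full sampler this draw would alternate with the usual conditional draws of the regression and instrument-equation coefficients, each weighted by the current lambda values.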

Relevance:

30.00%

Publisher:

Abstract:

The semi-distributed, dynamic INCA-N model was used to simulate the behaviour of dissolved inorganic nitrogen (DIN) in two Finnish research catchments. Parameter sensitivity and model structural uncertainty were analysed using generalized sensitivity analysis. The Mustajoki catchment is a forested upstream catchment, while the Savijoki catchment represents intensively cultivated lowlands. In general, there were more influential parameters in Savijoki than in Mustajoki. Model results were sensitive to N-transformation rates, vegetation dynamics, and soil and river hydrology. Values of the sensitive parameters were based on long-term measurements covering both warm and cold years. The highest measured DIN concentrations fell between the minimum and maximum values estimated during the uncertainty analysis, but the lowest measured concentrations fell outside these bounds, suggesting that some retention processes may be missing from the current model structure. The lowest concentrations occurred mainly during low-flow periods, so the effects on total loads were small.
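Generalized sensitivity analysis of the kind used here splits Monte Carlo runs into behavioural and non-behavioural sets and compares each parameter's distribution between the two sets. A self-contained sketch with a one-line toy "model" standing in for INCA-N (all names, the threshold, and the two-parameter setup are illustrative assumptions):

```python
import numpy as np

def gsa_ks(samples, behavioural_mask):
    """Generalized (regional) sensitivity analysis: for each parameter,
    the Kolmogorov-Smirnov distance between the empirical CDFs of the
    behavioural and non-behavioural sub-samples.  A large distance flags
    a parameter the model output is sensitive to."""
    stats = []
    for j in range(samples.shape[1]):
        a = np.sort(samples[behavioural_mask, j])
        b = np.sort(samples[~behavioural_mask, j])
        grid = np.concatenate([a, b])
        cdf_a = np.searchsorted(a, grid, side="right") / a.size
        cdf_b = np.searchsorted(b, grid, side="right") / b.size
        stats.append(np.abs(cdf_a - cdf_b).max())
    return np.array(stats)

# Toy stand-in for a catchment model: output depends strongly on p0, not p1.
rng = np.random.default_rng(1)
params = rng.uniform(0.0, 1.0, size=(2000, 2))
output = params[:, 0] + 0.01 * rng.normal(size=2000)
behavioural = output > 0.5          # "acceptable" simulations
ks = gsa_ks(params, behavioural)    # large for p0, near zero for p1
```

The same split can be applied to any scalar performance measure of the DIN simulations; parameters whose KS distance stays small are the ones the calibration cannot constrain.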

Relevance:

30.00%

Publisher:

Abstract:

In the absence of market frictions, the cost-of-carry model of stock index futures pricing predicts that returns on the underlying stock index and the associated stock index futures contract will be perfectly contemporaneously correlated. Evidence suggests, however, that this prediction is violated, with clear evidence that the stock index futures market leads the stock market. It is argued that traditional tests, which assume that the underlying data-generating process is constant, might be prone to overstate the lead-lag relationship. Using a new test for lead-lag relationships based on cross correlations and cross bicorrelations, it is found that, contrary to results from the traditional methodology, periods where the futures market leads the cash market are few and far between, and when any lead-lag relationship is detected, it does not last long. Overall, the results are consistent with the prediction of the standard cost-of-carry model and with market efficiency.
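The second-order part of such a test, scanning corr(x_t, y_{t+k}) over leads and lags, can be sketched directly (the cross-bicorrelation extension adds a third-order term not shown here; the synthetic series and names are illustrative, not the paper's data):

```python
import numpy as np

def cross_correlations(x, y, max_lag):
    """Sample cross-correlation corr(x_t, y_{t+k}) for k = -max_lag..max_lag.
    A peak at k > 0 indicates that x leads y by k periods."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    out = {}
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            out[k] = float(np.mean(x[:n - k] * y[k:]))
        else:
            out[k] = float(np.mean(x[-k:] * y[:n + k]))
    return out

# Synthetic check: cash returns follow futures returns one period later.
rng = np.random.default_rng(2)
fut = rng.normal(size=5000)
cash = np.roll(fut, 1) + 0.1 * rng.normal(size=5000)   # cash_t ~ fut_{t-1}
cc = cross_correlations(fut, cash, max_lag=3)          # peaks at k = +1
```

Applied in rolling windows, the location and persistence of the peak is exactly what distinguishes a transient lead from a stable lead-lag relationship.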

Relevance:

30.00%

Publisher:

Abstract:

We have investigated mechanisms for Atlantic Meridional Overturning Circulation (AMOC) variability at 26.5° N (other than the Ekman component) that can be related to external forcings, in particular wind variability. Resolution dependence is studied using identical experiments with 1° and 1/4° NEMO model runs over 1960–2010. The analysis shows that much of the variability in the AMOC at 26° N can be related to the wind strength over the North Atlantic, through mechanisms acting at different lags. At a lag of ~ 1 year, the January–June difference of mean sea level pressure between high and mid-latitudes in the North Atlantic explains 35–50% of the interannual AMOC variability (with negative correlation between wind strength and AMOC). At longer lead times of ~ 4 years, strong (weak) winds over the northern North Atlantic (specifically linked to the NAO index) are followed by higher (lower) AMOC transport, but this mechanism only works in the 1/4° model. Analysis of the density correlations suggests an increase (decrease) in deep water formation in the North Atlantic subpolar gyre to be the cause. A further 30% of the AMOC variability at 26° N can therefore be related to density changes in the top 1000 m of the Labrador and Irminger seas occurring ~ 4 years earlier.

Relevance:

30.00%

Publisher:

Abstract:

Unorganized traffic is a generalized form of travel wherein vehicles do not adhere to any predefined lanes and can travel in between lanes. Such travel is visible in a number of countries, e.g. India, where it enables higher traffic bandwidth, more overtaking and more efficient travel. These advantages appear when the vehicles vary considerably in size and speed, in the absence of which predefined lanes are near-optimal. Motion planning for multiple autonomous vehicles in unorganized traffic deals with deciding the manner in which every vehicle travels, ensuring no collision either with other vehicles or with static obstacles. In this paper the notion of predefined lanes is generalized to model unorganized travel for the purpose of planning vehicle travel. A uniform cost search is used for finding the optimal motion strategy of a vehicle, amidst the known travel plans of the other vehicles. The aim is to maximize the separation between the vehicles and static obstacles. The search is responsible for defining an optimal lane distribution among vehicles in the planning scenario. Clothoid curves are used for maintaining a lane or changing lanes. Experiments are performed by simulation over a set of challenging scenarios with a complex grid of obstacles. Additionally, behaviours of overtaking, waiting for a vehicle to cross, and following another vehicle are exhibited.
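Uniform cost search expands states in order of accumulated path cost, so the first time the goal is popped the returned path is optimal. A self-contained sketch on a small grid with obstacles (the 4x4 grid, unit step costs and obstacle set are illustrative stand-ins for the paper's lane cells, not its actual cost function):

```python
import heapq

def uniform_cost_search(start, goal, neighbours):
    """Uniform cost search: expand nodes in order of accumulated path cost.
    `neighbours(n)` yields (next_node, step_cost) pairs."""
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if cost > best.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, step in neighbours(node):
            new = cost + step
            if new < best.get(nxt, float("inf")):
                best[nxt] = new
                heapq.heappush(frontier, (new, nxt, path + [nxt]))
    return None

# 4-connected 4x4 grid with an obstacle column, standing in for lane cells.
obstacles = {(1, 1), (1, 2)}
def grid_neighbours(p):
    x, y = p
    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if 0 <= nx <= 3 and 0 <= ny <= 3 and (nx, ny) not in obstacles:
            yield (nx, ny), 1.0

cost, path = uniform_cost_search((0, 0), (3, 3), grid_neighbours)
```

In the multi-vehicle setting, the step cost would penalize proximity to obstacles and the already-committed plans of other vehicles instead of being uniform.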

Relevance:

30.00%

Publisher:

Abstract:

The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [the posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various mechanisms they ensure initial conditions that are predominantly in linear balance, and therefore spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations, which means that the advantage of the traditional data assimilation schemes, in generating predominantly balanced initial conditions, is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for application in real high-dimensional geophysical applications.

Relevance:

30.00%

Publisher:

Abstract:

The cloud plays a very important role in wireless sensor networks, crowd sensing, and IoT data collection and processing. Current cloud solutions, however, lack features whose absence hampers the innovation of a number of new services. We propose a cloud solution that provides these missing features, such as multi-cloud support and device multi-tenancy, relying on an entirely different, fully distributed paradigm: the actor model.
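The actor model referred to above can be illustrated in a few lines: each actor owns its state and communicates only through asynchronous messages processed one at a time from a mailbox. A minimal single-process sketch (a thread stands in for the distributed runtime; the class and method names are ours, not the proposed system's API):

```python
import queue
import threading

class Actor:
    """Minimal actor: private state, a mailbox, and sequential message
    processing on a dedicated thread -- no shared mutable state."""
    def __init__(self, handler):
        self._mailbox = queue.Queue()
        self._handler = handler
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, msg):
        """Asynchronous, non-blocking message delivery."""
        self._mailbox.put(msg)

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:         # poison pill: stop the actor
                break
            self._handler(msg)

    def stop(self):
        self._mailbox.put(None)
        self._thread.join()

# A collecting actor standing in for one tenant's sensor-data sink.
received = []
sink = Actor(received.append)
for reading in (21.5, 21.7, 22.0):
    sink.send(reading)
sink.stop()
```

Because messages are serialized per mailbox, no locking is needed inside a handler; scaling to multi-cloud deployments is then a matter of routing messages between mailboxes on different hosts.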

Relevance:

30.00%

Publisher:

Abstract:

Time-lagged responses of biological variables to landscape modifications are widely recognized, but rarely considered in ecological studies. In order to test for the existence of time-lags in the response of trees, small mammals, birds and frogs to changes in fragment area and connectivity, we studied a fragmented and highly dynamic landscape in the Atlantic forest region. We also investigated the biological correlates associated with differential responses among taxonomic groups. Species richness and abundance for the four taxonomic groups were measured in 21 secondary forest fragments during the same period (2000-2002), following a standardized protocol. Data analyses were based on power regressions and model selection procedures. The model inputs included present (2000) and past (1962, 1981) fragment areas and connectivity, as well as observed changes in these parameters. Although past landscape structure was particularly relevant for trees, all taxonomic groups (except small mammals) were affected by landscape dynamics, exhibiting a time-lagged response. Furthermore, fragment area was more important for species groups with lower dispersal capacity, while species with higher dispersal ability had stronger responses to connectivity measures. Although these secondary forest fragments still maintain a large fraction of their original biodiversity, the delay in biological response, combined with high rates of deforestation and fast forest regeneration, implies a reduction in the average age of the forest. This also indicates that future species losses are likely, especially among strict forest dwellers. Conservation actions should be implemented to reduce species extinction, to maintain old-growth forests and to favour the regeneration process. Our results demonstrate that landscape history can strongly affect the present distribution pattern of species in fragmented landscapes, and should be considered in conservation planning.

Relevance:

30.00%

Publisher:

Abstract:

Several accounts put forth to explain the flash-lag effect (FLE) rely mainly on either spatial or temporal mechanisms. Here we investigated the relationship between these mechanisms by psychophysical and theoretical approaches. In a first experiment we assessed the magnitudes of the FLE and of temporal-order judgments performed under identical visual stimulation. The results were interpreted by means of simulations of an artificial neural network, which was also employed to make predictions concerning the FLE. The model predicted that a spatio-temporal mislocalisation would emerge from two moving stimuli, one continuous and one with an abrupt onset. Additionally, a straightforward prediction of the model revealed that the magnitude of this mislocalisation should be task-dependent, increasing when the use of the abrupt-onset moving stimulus switches from a temporal marker only to both a temporal and a spatial marker. Our findings confirmed the model's predictions and point to an indissoluble interplay between spatial facilitation and processing delays in the FLE.

Relevance:

30.00%

Publisher:

Abstract:

The evolution of commodity computing led to the possibility of efficient usage of interconnected machines to solve computationally intensive tasks, which were previously solvable only by using expensive supercomputers. This, however, required new methods for process scheduling and distribution that consider network latency, communication cost, heterogeneous environments and distributed computing constraints. An efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior are essential to perform effective scheduling. In this paper, we overview the evolution of scheduling approaches, focusing on distributed environments. We also evaluate the current approaches for process behavior extraction and prediction, aiming at selecting an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction, considering the chaotic properties of such behavior and the automatic detection of critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The obtained results demonstrate that prediction of process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online prediction due to its low computational cost and good precision.

Relevance:

30.00%

Publisher:

Abstract:

In this article, we present the EM algorithm for performing maximum likelihood estimation of an asymmetric linear calibration model under the assumption of skew-normally distributed errors. A simulation study is conducted to evaluate the performance of the calibration estimator in interpolation and extrapolation situations. As an application to a real data set, we fitted the model to a dimensional measurement method used for calculating testicular volume with a caliper, calibrated against ultrasonography as the standard method. By applying this methodology, we do not need to transform the variables to obtain symmetrical errors. Another interesting aspect of the approach is that the transformation developed to make the information matrix nonsingular, when the skewness parameter is near zero, leaves the parameter of interest unchanged. Model fitting is implemented, and the choice between the usual calibration model and the model proposed in this article is evaluated using the Akaike information criterion, Schwarz's Bayesian information criterion and the Hannan-Quinn criterion.

Relevance:

30.00%

Publisher:

Abstract:

We discuss the estimation of the expected value of quality-adjusted survival, based on multistate models. We generalize an earlier work by allowing the sojourn times in health states to be non-identically distributed, for a given vector of covariates. Approaches based on semiparametric and parametric (exponential and Weibull distributions) methodologies are considered. A simulation study is conducted to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to estimate the variance of the estimator. An application to a real data set is also included.
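The jackknife variance step mentioned above is mechanical: recompute the statistic on each leave-one-out sample and scale the spread of those replicates. A generic sketch (the survival estimator itself is replaced here by a plain sample mean, for which the jackknife variance equals s²/n exactly):

```python
import numpy as np

def jackknife_variance(data, statistic):
    """Jackknife variance estimate of `statistic`: recompute it on each
    leave-one-out sample and scale the spread by (n - 1) / n."""
    n = len(data)
    loo = np.array([statistic(np.delete(data, i)) for i in range(n)])
    return (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

x = np.array([2.0, 4.0, 4.0, 5.0, 7.0, 9.0])
v = jackknife_variance(x, np.mean)
# For the sample mean this reproduces s^2 / n exactly; for a
# quality-adjusted-survival estimator the same recipe applies unchanged.
```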

Relevance:

30.00%

Publisher:

Abstract:

In many epidemiological studies it is common to resort to regression models relating the incidence of a disease to its risk factors. The main goal of this paper is to consider inference on such models with error-prone observations and variances of the measurement errors changing across observations. We suppose that the observations follow a bivariate normal distribution and that the measurement errors are normally distributed. Aggregate data allow the estimation of the error variances. Maximum likelihood estimates are computed numerically via the EM algorithm. Consistent estimation of the asymptotic variance of the maximum likelihood estimators is also discussed. Test statistics are proposed for testing hypotheses of interest. Further, we implement a simple graphical device that enables an assessment of the model's goodness of fit. Results of simulations concerning the properties of the test statistics are reported. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease.
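When the error variances are known (here, estimable from aggregate data), the simplest error-in-covariate correction is the method-of-moments attenuation fix. It is not the paper's maximum likelihood approach, but it shows why ignoring the measurement error biases the slope; all names and the simulated numbers below are illustrative:

```python
import numpy as np

def attenuation_corrected_slope(y, w, sigma_u2):
    """Method-of-moments slope correction for classical additive
    measurement error W = X + U with known error variance sigma_u2:
    the naive slope is divided by the estimated reliability ratio."""
    naive = np.cov(w, y)[0, 1] / np.var(w, ddof=1)
    reliability = 1.0 - sigma_u2 / np.var(w, ddof=1)
    return naive / reliability

rng = np.random.default_rng(3)
x = rng.normal(0.0, 2.0, size=20000)        # true covariate
w = x + rng.normal(0.0, 1.0, size=20000)    # error-prone observation
y = 1.5 * x + rng.normal(size=20000)
beta = attenuation_corrected_slope(y, w, sigma_u2=1.0)   # ~ 1.5, not the
# attenuated ~ 1.2 that a naive regression of y on w would give
```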