123 results for Time equivalent approach


Relevance: 30.00%

Abstract:

Background: The information processing capacity of the human mind is limited, as is evidenced by the attentional blink (AB) - a deficit in identifying the second of two temporally close targets (T1 and T2) embedded in a rapid stream of distracters. Theories of the AB generally agree that it results from competition between stimuli for conscious representation, but they disagree on the specific mechanisms, in particular on how attentional processing of T1 determines the AB to T2. Methodology/Principal Findings: The present study used the high spatial resolution of functional magnetic resonance imaging (fMRI) to examine the neural mechanisms underlying the AB. Our research approach was to design T1 and T2 stimuli that activate distinguishable brain areas involved in visual categorization and representation. ROI and functional connectivity analyses were then used to examine how attentional processing of T1, as indexed by activity in the T1 representation area, affected T2 processing. Our main finding was that attentional processing of T1 at the level of the visual cortex predicted T2 detection rates: individuals who activated the T1 encoding area more strongly in blink versus no-blink trials generally detected T2 on a lower percentage of trials. The coupling of activity between the T1 and T2 representation areas did not vary as a function of conscious T2 perception. Conclusions/Significance: These data are consistent with the notion that the AB is related to the attentional demands of T1 selection, and indicate that these demands are reflected at the level of the visual cortex. They also highlight the importance of individual differences in attentional settings in explaining AB task performance.

Relevance: 30.00%

Abstract:

Bayesian Model Averaging (BMA) is used to test for multiple break points in univariate series using conjugate normal-gamma priors. This approach can test for the number of structural breaks and produce posterior probabilities for a break at each point in time. Results are averaged over specifications including stationary, stationary-around-trend, and unit root models, each containing different types and numbers of breaks and different lag lengths. The procedures are used to test for structural breaks in 14 annual macroeconomic series and 11 natural resource price series. The results indicate that there are structural breaks in all of the natural resource series and most of the macroeconomic series; many of the series had multiple breaks. Our findings regarding the existence of unit roots, having allowed for structural breaks in the data, are largely consistent with previous work.
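
To make the break-probability idea concrete, here is a minimal sketch that scores every candidate date of a single mean-shift break with a BIC-based approximation to the marginal likelihood, in place of the paper's conjugate normal-gamma priors and multi-break model averaging; `break_posterior` and all settings are illustrative assumptions, not the authors' code.

```python
import numpy as np

def break_posterior(y, min_seg=5):
    """Approximate posterior probability of a single mean-shift break at
    each candidate date, using BIC-based marginal-likelihood weights
    (a simplification of the conjugate normal-gamma setup)."""
    n = len(y)
    logml = np.full(n, -np.inf)
    for t in range(min_seg, n - min_seg):
        rss = ((y[:t] - y[:t].mean())**2).sum() + ((y[t:] - y[t:].mean())**2).sum()
        # Gaussian log-likelihood at the MLE, penalised for 3 parameters
        logml[t] = -0.5 * n * np.log(rss / n) - 0.5 * 3 * np.log(n)
    w = np.exp(logml - logml.max())
    return w / w.sum()

# synthetic series with a mean shift at t = 50
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(4.0, 1.0, 50)])
post = break_posterior(y)
print(int(np.argmax(post)))  # most probable break date
```

The posterior mass concentrates near the true break date; averaging such weights across model specifications is the step the paper adds on top.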

Relevance: 30.00%

Abstract:

A neural network enhanced self-tuning controller is presented, combining the attributes of neural network mapping with a generalised minimum variance self-tuning control (STC) strategy. In this way the controller can deal with nonlinear plants that exhibit features such as uncertainties, nonminimum phase behaviour, coupling effects and unmodelled dynamics, and whose nonlinearities are assumed to be globally bounded. The unknown nonlinear plant to be controlled is approximated by an equivalent model composed of a simple linear submodel plus a nonlinear submodel. A generalised recursive least squares algorithm is used to identify the linear submodel, and a layered neural network is used to learn the unknown nonlinear submodel, with the weights updated based on the error between the plant output and the output of the linear submodel. The controller design procedure is based on the equivalent model; the nonlinear submodel is therefore naturally accommodated within the control law. Two simulation studies are provided to demonstrate the effectiveness of the control algorithm.
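
The linear-submodel identification step can be sketched with a standard recursive least squares update; the first-order plant, noise level, and function name below are illustrative assumptions, and the neural network submodel is omitted from the sketch.

```python
import numpy as np

def rls(phi, y, lam=1.0, delta=1e3):
    """Recursive least squares for y[t] = phi[t] . theta + e[t]."""
    theta = np.zeros(phi.shape[1])
    P = delta * np.eye(phi.shape[1])
    for x, yt in zip(phi, y):
        k = P @ x / (lam + x @ P @ x)          # gain vector
        theta = theta + k * (yt - x @ theta)   # parameter update
        P = (P - np.outer(k, x @ P)) / lam     # covariance update
    return theta

# identify an illustrative first-order linear submodel
# y[t] = 0.8*y[t-1] + 0.5*u[t-1] + noise
rng = np.random.default_rng(1)
u = rng.normal(size=200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.8 * y[t-1] + 0.5 * u[t-1] + 0.01 * rng.normal()
theta = rls(np.column_stack([y[:-1], u[:-1]]), y[1:])
print(np.round(theta, 2))
```

In the paper's scheme, the residual between the plant output and this linear submodel's output is what drives the neural network weight updates.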

Relevance: 30.00%

Abstract:

This paper considers PID control in terms of its implementation by means of an ARMA plant model. Two controller actions are considered, namely pole placement and deadbeat, both applied via a PID structure for the adaptive real-time control of an industrial level system. As well as examining the two controller types separately, a comparison is made between the forms, and it is shown how, under certain circumstances, the two can be seen to be identical: the pole-placement PID form in fact realises an action equivalent to the deadbeat controller when all closed-loop poles are chosen to lie at the origin of the z-plane.
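
As a minimal illustration of the deadbeat case, placing the closed-loop pole of a first-order plant at the z-plane origin drives the output to the setpoint in a single step; the plant parameters and setpoint below are invented for the sketch and are not from the industrial level system in the paper.

```python
# Plant: y[t] = a*y[t-1] + b*u[t-1].  Choosing u = (r - a*y)/b cancels the
# plant dynamics and places the closed-loop pole at z = 0 (deadbeat action).
a, b, r = 0.9, 0.5, 1.0   # illustrative plant parameters and setpoint
y, out = 0.0, []
for _ in range(5):
    u = (r - a * y) / b
    y = a * y + b * u
    out.append(round(y, 6))
print(out)  # output reaches the setpoint after one step and stays there
```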

Relevance: 30.00%

Abstract:

The valuation of farmland is a perennial issue for agricultural policy, given its importance in the farm investment portfolio. Despite the significance of farmland values to farmer wealth, prediction remains a difficult task. This study develops a dynamic information measure to examine the informational content of farmland values and farm income in explaining the distribution of farmland values over time.

Relevance: 30.00%

Abstract:

A new structure of Radial Basis Function (RBF) neural network, called the Dual-orthogonal RBF Network (DRBF), is introduced for nonlinear time series prediction. The hidden nodes of a conventional RBF network compute the Euclidean distance between the network input vector and the centres, and the node responses are radially symmetrical. But in time series prediction, where the system input vectors are lagged system outputs and thus usually highly correlated, the Euclidean distance measure may not be appropriate. The DRBF network modifies the distance metric by introducing a classification function based on the estimation data set. Training the DRBF network consists of two stages: learning the classification-related basis functions and the important input nodes, followed by selecting the regressors and learning the weights of the hidden nodes. In both stages a forward Orthogonal Least Squares (OLS) selection procedure is applied, initially to select the important input nodes and then to select the important centres. Simulation results for single-step and multi-step-ahead predictions over a test data set are included to demonstrate the effectiveness of the new approach.
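
The forward OLS selection stage can be sketched as a greedy error-reduction-ratio search with Gram-Schmidt deflation of the remaining candidates; the synthetic regression below (function name, sizes, noise level) is an assumption for illustration, not the DRBF training code.

```python
import numpy as np

def forward_ols(X, y, n_select):
    """Forward OLS: greedily pick the regressors with the largest
    error-reduction ratio, orthogonalising the remaining candidate
    columns against each selected one."""
    Xw = X.astype(float).copy()
    yy = y @ y
    selected = []
    for _ in range(n_select):
        err = (Xw.T @ y) ** 2 / (np.einsum('ij,ij->j', Xw, Xw) * yy + 1e-12)
        err[selected] = -np.inf                  # mask already-chosen columns
        j = int(np.argmax(err))
        selected.append(j)
        w = Xw[:, j] / np.linalg.norm(Xw[:, j])
        Xw = Xw - np.outer(w, w @ Xw)            # deflate remaining candidates
    return selected

# y truly depends on columns 0 and 3 of X only
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 6))
y = 2 * X[:, 0] - 3 * X[:, 3] + 0.01 * rng.normal(size=100)
sel = forward_ols(X, y, 2)
print(sorted(sel))
```

The same greedy criterion serves both DRBF stages: first over candidate input nodes, then over candidate centres.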

Relevance: 30.00%

Abstract:

In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting these data to transform them into valuable and easily comprehensible information are of the utmost importance. One key topic in this area is the capability to deduce future system behaviour from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All of these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work tying together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.

Relevance: 30.00%

Abstract:

A quasi-optical de-embedding technique for characterizing waveguides is demonstrated using wideband time-resolved terahertz spectroscopy. A transfer function representation is adopted to describe the signal at the input and output ports of the waveguides. The time domain responses were discretised and the waveguide transfer function was obtained through a parametric approach in the z-domain, after describing the system with an ARX model as well as with a state space model. Prior to the identification procedure, filtering was performed in the wavelet domain to minimize signal distortion and the noise propagating into the ARX and subspace models. The model identification procedure requires isolation of the phase delay in the structure, so the time-domain signatures must first be aligned with respect to each other before they are compared. An initial estimate of the number of propagating modes was provided by comparing the measured phase delay in the structure with theoretical calculations based on the physical dimensions of the waveguide. Models derived from measurements of THz transients in a precision WR-8 waveguide adjustable short are presented.
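
The z-domain parametric step can be sketched as a batch least-squares ARX fit; the second-order example system and the function name are illustrative assumptions (the wavelet filtering and the state-space/subspace variant are omitted).

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Batch least-squares fit of an ARX model
    y[t] = sum_i a[i]*y[t-i] + sum_j b[j]*u[t-j],
    i.e. the z-domain transfer function B(z)/A(z)."""
    n = max(na, nb)
    rows = [np.r_[[y[t-i] for i in range(1, na+1)],
                  [u[t-j] for j in range(1, nb+1)]]
            for t in range(n, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[n:], rcond=None)
    return theta[:na], theta[na:]   # AR and exogenous coefficients

# illustrative stable system: y[t] = 1.2 y[t-1] - 0.5 y[t-2] + 0.3 u[t-1] + 0.1 u[t-2]
rng = np.random.default_rng(3)
u = rng.normal(size=500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 1.2*y[t-1] - 0.5*y[t-2] + 0.3*u[t-1] + 0.1*u[t-2]
a_est, b_est = fit_arx(u, y)
print(np.round(a_est, 2), np.round(b_est, 2))
```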

Relevance: 30.00%

Abstract:

In this paper, a discrete time dynamic integrated system optimisation and parameter estimation algorithm is applied to the solution of the nonlinear tracking optimal control problem. A version of the algorithm with a linear-quadratic model-based problem is developed and implemented in software. The algorithm implemented is tested with simulation examples.

Relevance: 30.00%

Abstract:

We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (Lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which is the most successful statistical method depends on the region considered, GCM data used and prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to regions identified as potentially predictable from variance explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
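
A minimal sketch of the constructed-analogue idea, run on a toy linear system rather than GCM output: the current state is expressed as a regularised least-squares combination of past library states, and the same weights are applied to the states observed one lead time later. The dimensions, ridge term, and dynamics below are illustrative assumptions.

```python
import numpy as np

def constructed_analogue(library, successors, state, ridge=1e-3):
    """Forecast by writing the current state as a ridge-regularised
    least-squares combination of library states, then applying the
    same weights to the library states one lead time later."""
    A = library.T                                    # (dims, n_analogues)
    G = A.T @ A + ridge * np.eye(A.shape[1])
    w = np.linalg.solve(G, A.T @ state)              # analogue weights
    return successors.T @ w

# toy "SST" field evolving under a fixed stable linear operator M
rng = np.random.default_rng(4)
M = 0.9 * np.linalg.qr(rng.normal(size=(5, 5)))[0]
library = rng.normal(size=(30, 5))
successors = library @ M.T                           # states one step later
state = rng.normal(size=5)
pred = constructed_analogue(library, successors, state)
print(np.allclose(pred, M @ state, atol=1e-2))
```

Because the toy dynamics are exactly linear, the analogue weights recover the operator's action on the current state; on GCM output the skill instead depends on how well low-frequency SST evolution is captured by the library.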

Relevance: 30.00%

Abstract:

An error polynomial is defined whose coefficients indicate the difference at any instant between a system and a lower-order model approximating that system. It is shown how Markov parameters and time moments of the model can be matched with those of the system by setting error polynomial coefficients to zero. Also discussed is the way in which the error between system and model can be considered as a filtered form of an error input function specified by means of model parameter selection.
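
A toy instance of the matching idea, assuming a first-order reduced model b/(z - a) with impulse response b, b·a, b·a², ...: equating the first two Markov parameters (impulse-response samples) of system and model zeroes the leading error-polynomial coefficients, and the mismatch only appears from the third term on. The example system is invented for illustration.

```python
import numpy as np

def impulse(a, b, n):
    """Impulse response of y[t] = sum_i a[i]*y[t-1-i] + sum_j b[j]*u[t-1-j]."""
    u = np.zeros(n); u[0] = 1.0
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = sum(ai * y[t-1-i] for i, ai in enumerate(a) if t-1-i >= 0) \
             + sum(bj * u[t-1-j] for j, bj in enumerate(b) if t-1-j >= 0)
    return y

h = impulse([0.8, -0.12], [1.0], 8)     # 2nd-order system, poles 0.6 and 0.2
b1, a1 = h[1], h[2] / h[1]              # match the first two Markov parameters
h_red = impulse([a1], [b1], 8)          # 1st-order model b1/(z - a1)
err = np.round(h - h_red, 4)            # error-polynomial coefficients
print(err[:5])                          # zero until the first unmatched term
```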

Relevance: 30.00%

Abstract:

The presence of mismatch between controller and system is considered. A novel discrete-time approach is used to investigate the migration of closed-loop poles when this mismatch occurs. Two forms of state estimator are employed giving rise to several interesting features regarding stability and performance.

Relevance: 30.00%

Abstract:

This paper investigates whether obtaining sustainable building certification entails a rental premium for commercial office buildings and tracks its development over time. To this aim, both a difference-in-differences and a fixed-effects model approach are applied to a large panel dataset of office buildings in the United States in the 2000–2010 period. The results indicate a significant rental premium for both ENERGY STAR and LEED certified buildings. Controlling for confounding factors, this premium is shown to have increased steadily from 2006 to 2008, followed by a moderate decline in the subsequent periods. The results also show a significant positive relationship between ENERGY STAR labeling and building occupancy rates.
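
The difference-in-differences logic can be sketched on synthetic data; all rents and the premium below are invented, and the study's actual specifications also include fixed effects and hedonic controls.

```python
import numpy as np

# Difference-in-differences on a toy rent panel: the certification effect is
# (treated after - treated before) - (control after - control before).
rng = np.random.default_rng(5)
n = 2000
cert = rng.integers(0, 2, n)       # 1 = certified building (invented)
post = rng.integers(0, 2, n)       # 1 = post-certification period
# baseline rent 20, building effect 2, period effect 1, 5% premium on 20
rent = 20 + 2*cert + 1*post + 0.05*20*cert*post + rng.normal(0, 0.1, n)

def cell(c, p):
    """Mean rent in one treatment-by-period cell."""
    return rent[(cert == c) & (post == p)].mean()

did = (cell(1, 1) - cell(1, 0)) - (cell(0, 1) - cell(0, 0))
print(round(did, 3))               # estimated premium (constructed as 1.0)
```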

Relevance: 30.00%

Abstract:

Construction planning plays a fundamental role in construction project management, requiring teamwork among planners from a diverse range of disciplines who often work in geographically dispersed situations. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork, owing to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution that enables a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In the light of the interactive definition method and its computer-supported collaborative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode, and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team in a genuinely geographically dispersed setting. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning, yielding a robust construction plan through collaborative teamwork. The realization of this solution prompts further consideration of its enhancement for wider groupware applications.

Relevance: 30.00%

Abstract:

Six land surface models and five global hydrological models participate in a model intercomparison project (WaterMIP), which for the first time compares simulation results from these different classes of models in a consistent way. In this paper the simulation setup is described and aspects of the multi-model global terrestrial water balance are presented. All models were run at 0.5 degree spatial resolution for the global land areas over a 15-year period (1985-1999) using a newly-developed global meteorological dataset. Simulated global terrestrial evapotranspiration, excluding Greenland and Antarctica, ranges from 415 to 586 mm year-1 (60,000 to 85,000 km3 year-1), and simulated runoff ranges from 290 to 457 mm year-1 (42,000 to 66,000 km3 year-1). Both the mean and median runoff fractions for the land surface models are lower than those of the global hydrological models, although the range is wider. Significant simulation differences between land surface and global hydrological models are found to be caused by the snow scheme employed: the physically-based energy balance approach used by land surface models generally results in lower snow water equivalent values than the conceptual degree-day approach used by global hydrological models. Some differences in simulated runoff and evapotranspiration are explained by model parameterizations, although the processes included and parameterizations used are not distinct to either land surface models or global hydrological models. The results show that differences between models are a major source of uncertainty. Climate change impact studies thus need to use not only multiple climate models but also some other measure of model uncertainty (e.g. multiple impact models).
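
The conceptual degree-day scheme can be sketched in a few lines; the degree-day factor, melt threshold, and temperature series below are illustrative values, not parameters from any of the participating models.

```python
# Conceptual degree-day snow scheme: daily melt is proportional to the
# excess of air temperature over a melt threshold, capped by the snowpack.
def degree_day_melt(swe, temp, ddf=3.0, t_melt=0.0):
    """One daily step: returns (new snow water equivalent, melt) in mm,
    with ddf the degree-day factor in mm per degree-day."""
    melt = min(swe, ddf * max(temp - t_melt, 0.0))
    return swe - melt, melt

swe, melts = 50.0, []
for temp in [-2.0, 1.0, 4.0, 8.0, 10.0]:   # illustrative daily temperatures
    swe, m = degree_day_melt(swe, temp)
    melts.append(m)
print(swe, melts)                           # pack is exhausted on the last day
```

An energy-balance scheme would instead budget radiation, and sensible and latent heat over the snowpack, which is one reason the two model classes diverge in simulated snow water equivalent.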