907 results for Time-invariant Wavelet Analysis


Relevance:

100.00%

Publisher:

Abstract:

1. Demographic models are assuming an important role in management decisions for endangered species. Elasticity analysis and scope for management analysis are two such applications. Elasticity analysis determines the vital rates that have the greatest impact on population growth. Scope for management analysis examines the effects that feasible management might have on vital rates and population growth. Both methods target management in an attempt to maximize population growth.

2. The Seychelles magpie robin Copsychus sechellarum is a critically endangered island endemic, the population of which underwent significant growth in the early 1990s following the implementation of a recovery programme. We examined how the formal use of elasticity and scope for management analyses might have shaped management in the recovery programme, and assessed their effectiveness by comparison with the actual population growth achieved.

3. The magpie robin population doubled from about 25 birds in 1990 to more than 50 by 1995. A simple two-stage demographic model showed that this growth was driven primarily by a significant increase in the annual survival probability of first-year birds and an increase in the birth rate. Neither the annual survival probability of adults nor the probability of a female breeding at age 1 changed significantly over time.

4. Elasticity analysis showed that the annual survival probability of adults had the greatest impact on population growth. There was some scope to use management to increase survival, but because survival rates were already high (> 0.9) this had a negligible effect on population growth. Scope for management analysis showed that significant population growth could have been achieved by targeting management measures at the birth rate and survival probability of first-year birds, although predicted growth rates were lower than those achieved by the recovery programme when all management measures were in place (i.e. 1992-95).

5. Synthesis and applications. We argue that scope for management analysis can provide a useful basis for management but will inevitably be limited to some extent by a lack of data, as our study shows. This means that identifying perceived ecological problems and designing management to alleviate them must be an important component of endangered species management. The corollary of this is that it will not be possible or wise to consider only management options for which there is a demonstrable ecological benefit. Given these constraints, we see little role for elasticity analysis because, when data are available, a scope for management analysis will always be of greater practical value and, when data are lacking, precautionary management demands that as many perceived ecological problems as possible are tackled.
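The elasticity calculation referred to in point 4 can be illustrated with a short numerical sketch: build a two-stage projection matrix, take its dominant eigenvalue as the population growth rate, and derive elasticities from the left and right eigenvectors using Caswell's standard formulas. The matrix entries below are hypothetical placeholders, not the vital rates estimated for the magpie robin.

```python
import numpy as np

# Hypothetical two-stage (first-year, adult) projection matrix.
# F1, F2 are stage-specific fecundities; P1, P2 are survival/transition probabilities.
F1, F2 = 0.45, 0.90
P1, P2 = 0.50, 0.90
A = np.array([[F1, F2],
              [P1, P2]])

# Population growth rate = dominant eigenvalue of the projection matrix.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
lam = vals[k].real
w = np.abs(vecs[:, k].real)              # right eigenvector: stable stage structure

vals_T, vecs_T = np.linalg.eig(A.T)
kT = np.argmax(vals_T.real)
v = np.abs(vecs_T[:, kT].real)           # left eigenvector: reproductive values

# Sensitivities and elasticities: e_ij = (a_ij / lambda) * v_i * w_j / <v, w>.
S = np.outer(v, w) / (v @ w)
E = (A / lam) * S
print(f"lambda = {lam:.3f}")
print("elasticity matrix:\n", np.round(E, 3))
```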

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a novel method of authentication of users in secure buildings. The main objective is to investigate whether user actions in the built environment can produce consistent behavioural signatures upon which a building intrusion detection system could be based. In the process three behavioural expressions were discovered: time-invariant, co-dependent and idiosyncratic.

Relevance:

100.00%

Publisher:

Abstract:

This paper introduces a new blind equalisation algorithm for pulse amplitude modulation (PAM) data transmitted through nonminimum phase (NMP) channels. The algorithm is based on a noncausal AR model of the communication channel and the second- and fourth-order cumulants of the received data series, where only the diagonal slices of the cumulants are used. The AR parameters are adjusted at each sample using a successive over-relaxation (SOR) scheme, a variant of the ordinary LMS scheme with a faster convergence rate and greater robustness to the selection of the ‘step size’ in the iterations. Computer simulations are carried out for both linear time-invariant (LTI) and linear time-variant (LTV) NMP channels, and the results show that the proposed algorithm converges quickly and has the potential to track LTV NMP channels.
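For reference, the ordinary LMS scheme that the SOR update is described as a variant of looks like the sketch below, written here as a trained (non-blind) equalizer for PAM symbols over a hypothetical NMP channel. The paper's actual algorithm is blind and driven by cumulant slices, which this sketch does not implement; the channel taps, step size and decision delay are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 4-PAM symbols through a hypothetical non-minimum-phase FIR channel.
symbols = rng.choice([-3.0, -1.0, 1.0, 3.0], size=5000)
channel = np.array([0.4, 1.0, -0.6])
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.05 * rng.standard_normal(len(received))

# Ordinary LMS equalizer with a training sequence (the baseline the SOR scheme modifies).
n_taps, mu, delay = 11, 1e-3, 5
w = np.zeros(n_taps)
for n in range(n_taps - 1, len(received)):
    x = received[n - n_taps + 1:n + 1][::-1]   # regressor, most recent sample first
    e = symbols[n - delay] - w @ x             # training error (non-blind)
    w += mu * e * x                            # LMS step
```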

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the problem of tracking line segments in on-line handwriting obtained through a digitizer tablet. The approach uses Kalman filtering to model linear portions of on-line handwriting, particularly handwritten numerals, and to detect abrupt changes in writing direction that indicate a model change. It employs a Kalman filter framework constrained by a normalized line equation, where quadratic terms are linearized through a first-order Taylor expansion. The modeling is then carried out under the assumption that the state is deterministic and time-invariant, while the detection relies on a double-thresholding mechanism that tests for violations of this assumption. The first threshold is based on the kinetics of the pen trajectory; the second takes into account the jump in angle between the previously observed writing direction and the current direction. The proposed method enables real-time processing. To illustrate the methodology, some results obtained on handwritten numerals are presented.
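A highly simplified sketch of the double-threshold idea follows, using a plain least-squares line fit in place of the constrained Kalman formulation: a new segment is started when either the perpendicular distance of the latest point to the current line (the innovation test) or the jump in direction angle exceeds a threshold. The function name and threshold values are illustrative, not taken from the paper.

```python
import numpy as np

def segment_strokes(points, dist_thresh=2.0, angle_thresh=np.deg2rad(30)):
    """Split a pen trajectory into near-linear segments (illustrative sketch)."""
    points = np.asarray(points, dtype=float)
    segments, start = [], 0
    for i in range(2, len(points)):
        seg = points[start:i]
        centroid = seg.mean(axis=0)
        # Direction of the current segment = leading singular vector of the centred points.
        _, _, vt = np.linalg.svd(seg - centroid)
        direction, normal = vt[0], vt[1]
        # Test 1: perpendicular distance of the new point to the fitted line.
        dist = abs((points[i] - centroid) @ normal)
        # Test 2: jump between the segment direction and the latest pen direction.
        step = points[i] - points[i - 1]
        step_dir = step / (np.linalg.norm(step) + 1e-12)
        angle = np.arccos(np.clip(abs(step_dir @ direction), -1.0, 1.0))
        if dist > dist_thresh or angle > angle_thresh:
            segments.append(seg)
            start = i - 1          # new segment begins at the corner point
    segments.append(points[start:])
    return segments
```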

Relevance:

100.00%

Publisher:

Abstract:

Uncertainty affects all aspects of the property market, but one area where its impact is particularly significant is feasibility analysis. Any development is affected by differences between market conditions at the conception of the project and the market realities at the time of completion. The feasibility study needs to address the possible outcomes based on an understanding of the current market. This requires the appraiser to forecast the most likely outcomes for the sale price of the completed development, the construction costs and the timing of both. It also requires the appraiser to understand the impact of finance on the project. All these issues are time sensitive, and analysis needs to be undertaken to show the impact of time on the viability of the project. The future is uncertain, and a full feasibility analysis should be able to model the upside and downside risk pertaining to a range of possible outcomes. Feasibility studies are extensively used in Italy to determine land value, but they tend to be single-point analyses based upon a single set of “likely” inputs. In this paper we look at the practical impact of uncertainty in the input variables using a simulation model (Crystal Ball ©), applied to an actual case study of an urban redevelopment plan for an Italian municipality. This allows the appraiser to address the uncertainties involved and thus provide the decision maker with a better understanding of the risk of the development. The technique is then refined using a “two-dimensional technique” to distinguish between “uncertainty” and “variability” and thus create a more robust model.
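A minimal Monte Carlo sketch of this kind of probabilistic feasibility analysis is shown below, using NumPy in place of Crystal Ball. The input distributions, figures and the simplified finance-cost formula are hypothetical, not those of the Italian case study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                         # Monte Carlo trials

# Hypothetical input distributions for a residual valuation.
sale_price  = rng.normal(12_000_000, 900_000, n)   # gross development value
build_cost  = rng.normal(7_000_000, 500_000, n)    # construction cost
months      = rng.triangular(18, 24, 36, n)        # development period (months)
annual_rate = 0.08                                 # assumed finance rate

# Rough finance cost: interest on roughly half the build cost over the period.
finance_cost = build_cost * ((1 + annual_rate) ** (months / 12) - 1) * 0.5
residual_land_value = sale_price - build_cost - finance_cost

print("mean residual value:", round(residual_land_value.mean()))
print("5th / 95th percentiles:", np.percentile(residual_land_value, [5, 95]).round())
print("probability of a negative residual:", (residual_land_value < 0).mean())
```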

Relevance:

100.00%

Publisher:

Abstract:

Previous work has demonstrated that observed and modeled climates show a near-time-invariant ratio of mean land to mean ocean surface temperature change under transient and equilibrium global warming. This study confirms this in a range of atmospheric models coupled to perturbed sea surface temperatures (SSTs), slab (thermodynamics only) oceans, and a fully coupled ocean. Away from equilibrium, it is found that the atmospheric processes that maintain the ratio cause a land-to-ocean heat transport anomaly that can be approximated using a two-box energy balance model. When climate is forced by increasing atmospheric CO2 concentration, the heat transport anomaly moves heat from land to ocean, constraining the land to warm in step with the ocean surface, despite the small heat capacity of the land. The heat transport anomaly is strongly related to the top-of-atmosphere radiative flux imbalance, and hence it tends to a small value as equilibrium is approached. In contrast, when climate is forced by prescribing changes in SSTs, the heat transport anomaly replaces ‘‘missing’’ radiative forcing over land by moving heat from ocean to land, warming the land surface. The heat transport anomaly remains substantial in steady state. These results are consistent with earlier studies that found that both land and ocean surface temperature changes may be approximated as local responses to global mean radiative forcing. The modeled heat transport anomaly has large impacts on surface heat fluxes but small impacts on precipitation, circulation, and cloud radiative forcing compared with the impacts of surface temperature change. No substantial nonlinearities are found in these atmospheric variables when the effects of forcing and surface temperature change are added.
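The two-box approximation mentioned above can be sketched as a toy energy balance model in which an atmospheric transport anomaly relaxes the land temperature towards a fixed multiple of the ocean temperature and decays as the ocean approaches equilibrium. All parameter values below are illustrative assumptions, not those of the study.

```python
# Toy two-box (land/ocean) energy balance model of the mechanism described above.
phi    = 1.5                  # land/ocean warming ratio maintained by the atmosphere
k      = 5.0                  # strength of the transport coupling (W m-2 K-1)
lam_O  = 1.3                  # ocean radiative feedback (W m-2 K-1)
lam_L  = lam_O / phi          # land feedback chosen so the equilibrium ratio equals phi
C_L, C_O = 1e7, 3e9           # effective heat capacities (J m-2 K-1)
frac_L = 0.3                  # land fraction of the surface
F      = 3.7                  # step forcing, roughly 2xCO2 (W m-2)

dt = 86400.0                  # one-day time step (s)
T_L = T_O = 0.0
for _ in range(200 * 365):    # integrate for 200 years
    H = k * (T_L - phi * T_O)                            # land-to-ocean transport anomaly
    T_L += (F - lam_L * T_L - H) * dt / C_L
    T_O += (F - lam_O * T_O + H * frac_L / (1 - frac_L)) * dt / C_O

print(f"land {T_L:.2f} K, ocean {T_O:.2f} K, ratio {T_L / T_O:.2f}, "
      f"transport anomaly {k * (T_L - phi * T_O):.2f} W m-2")
```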

Relevance:

100.00%

Publisher:

Abstract:

Some necessary and sufficient conditions for closed-loop eigenstructure assignment by output feedback in time-invariant linear multivariable control systems are presented. The basis of the paper is a simple condition on a square matrix that is necessary and sufficient for it to be the closed-loop plant matrix of a given system under some output feedback. Some known results on entire eigenstructure assignment are deduced from this. The concept of an inner inverse of a matrix is employed to obtain a condition concerning the assignment of an eigenstructure consisting of the eigenvalues and a mixture of left and right eigenvectors.
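The central condition, that a target closed-loop matrix be expressible as A + BKC for some output feedback K, can be checked numerically with the standard solvability test for the linear matrix equation BKC = M − A; the Moore–Penrose pseudoinverse used below is one choice of inner inverse. The matrices are hypothetical examples, not taken from the paper.

```python
import numpy as np

def output_feedback_for_target(A, B, C, M, tol=1e-9):
    """Check whether a target closed-loop matrix M is achievable as A + B K C.

    Uses the consistency condition for B K C = M - A: solvable iff
    B B^- (M - A) C^- C = M - A for an inner (generalized) inverse; the
    pseudoinverse is one such inverse. Returns a gain K if achievable, else None.
    """
    D = M - A
    Bp, Cp = np.linalg.pinv(B), np.linalg.pinv(C)
    if not np.allclose(B @ Bp @ D @ Cp @ C, D, atol=tol):
        return None                      # M is not reachable by output feedback
    return Bp @ D @ Cp                   # one feedback gain achieving M

# Hypothetical example data.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
M = np.array([[0.0, 1.0], [-5.0, -3.0]])   # target closed-loop plant matrix
K = output_feedback_for_target(A, B, C, M)
print("feedback gain:", K)
```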

Relevance:

100.00%

Publisher:

Abstract:

The problem of robust pole assignment by feedback in a linear, multivariable, time-invariant system which is subject to structured perturbations is investigated. A measure of robustness, or sensitivity, of the poles to a given class of perturbations is derived, and a reliable and efficient computational algorithm is presented for constructing a feedback which assigns the prescribed poles and optimizes the robustness measure.
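For a sense of what such an algorithm does in practice, SciPy's place_poles routine performs robust pole assignment by state feedback, optimizing the conditioning of the closed-loop eigenvector matrix; it is used here purely as an illustration of the problem class, not as the specific algorithm of the paper. The system matrices and pole locations are hypothetical.

```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical controllable multivariable system.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, -2.0, 3.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
poles = [-1.0, -2.0, -3.0]                     # prescribed closed-loop poles

res = place_poles(A, B, poles, method="YT")    # 'KNV0' is the other available method
K = res.gain_matrix                            # state feedback u = -K x
print("achieved poles:", np.sort(res.computed_poles))
print("gain matrix:\n", K)
```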

Relevance:

100.00%

Publisher:

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating observed meteorological quantities from unevenly distributed locations to a network of regularly spaced grid points. Because numerical weather prediction models must solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation.

From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the 12 hours applied most often, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
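As a minimal illustration of what objective analysis means in this context, the sketch below interpolates irregularly spaced observations onto a regular grid with a single-pass Cressman-style weighting; operational schemes additionally use a background (first-guess) field, several passes and quality control. All names and numbers are illustrative, not the scheme discussed in the paper.

```python
import numpy as np

def cressman_analysis(obs_xy, obs_val, grid_x, grid_y, radius=500.0):
    """Single-pass Cressman-style objective analysis onto a regular grid (illustrative)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    analysis = np.full(gx.shape, np.nan)
    for j in range(gx.shape[0]):
        for i in range(gx.shape[1]):
            r2 = (obs_xy[:, 0] - gx[j, i]) ** 2 + (obs_xy[:, 1] - gy[j, i]) ** 2
            w = np.where(r2 < radius ** 2,
                         (radius ** 2 - r2) / (radius ** 2 + r2), 0.0)
            if w.sum() > 0:
                analysis[j, i] = (w * obs_val).sum() / w.sum()
    return analysis

# Hypothetical scattered observations (x, y in km) interpolated to a 500 km mesh.
rng = np.random.default_rng(1)
obs_xy = rng.uniform(0, 2000, size=(40, 2))
obs_val = 280 + 0.01 * obs_xy[:, 0]            # synthetic temperature-like field
grid = np.arange(0, 2001, 500.0)
field = cressman_analysis(obs_xy, obs_val, grid, grid)
print(np.round(field, 1))
```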

Relevance:

100.00%

Publisher:

Abstract:

Using previously published data from the whisker barrel cortex of anesthetized rodents (Berwick et al 2008 J. Neurophysiol. 99 787–98), we investigated whether highly spatially localized stimulus-evoked cortical hemodynamic responses displayed a linear time-invariant (LTI) relationship with neural activity. Presentation of stimuli to individual whiskers for 2 s and 16 s durations produced hemodynamic and neural responses spatially localized to individual cortical columns. Two-dimensional optical imaging spectroscopy (2D-OIS) measured the hemoglobin responses, while multi-laminar electrophysiology recorded the neural activity. Hemoglobin responses to the 2 s stimuli were deconvolved with the underlying evoked neural activity to estimate impulse response functions, which were then convolved with the neural activity evoked by the 16 s stimuli to generate predictions of the hemodynamic responses. An LTI system described the temporal neuro-hemodynamic coupling relationship more adequately for these spatially localized sensory stimuli than in previous studies that activated the entire whisker cortex. An inability to predict the magnitude of an initial 'peak' in the total and oxyhemoglobin responses was alleviated when responses influenced by overlying arterial components were excluded. However, this did not improve estimation of the hemodynamic response's return to baseline after stimulus cessation.
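The deconvolve-then-convolve LTI test can be sketched on synthetic data as below; the sampling rate, stimulus timings and impulse-response shape are assumptions for illustration, not the study's data.

```python
import numpy as np
from scipy.linalg import toeplitz

def estimate_irf(neural, hemo, length):
    """Least-squares deconvolution: find h such that hemo ≈ conv(neural, h)."""
    col = np.r_[neural, np.zeros(length - 1)]
    X = toeplitz(col, np.r_[neural[0], np.zeros(length - 1)])[:len(hemo)]
    h, *_ = np.linalg.lstsq(X, hemo, rcond=None)
    return h

def predict(neural, h):
    """LTI prediction: convolve neural activity with the estimated impulse response."""
    return np.convolve(neural, h)[:len(neural)]

# Illustrative synthetic data standing in for the 2 s / 16 s conditions.
fs = 10.0                                      # samples per second (assumed)
t = np.arange(0, 30, 1 / fs)
neural_2s = ((t > 1) & (t < 3)).astype(float)
neural_16s = ((t > 1) & (t < 17)).astype(float)
true_h = t * np.exp(-t / 1.5)                  # toy impulse response
hemo_2s = np.convolve(neural_2s, true_h)[:len(t)]

h_hat = estimate_irf(neural_2s, hemo_2s, length=100)
hemo_16s_pred = predict(neural_16s, h_hat)     # LTI prediction for the 16 s stimulus
```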

Relevance:

100.00%

Publisher:

Abstract:

We examine mid- to late Holocene centennial-scale climate variability in Ireland using proxy data from peatlands, lakes and a speleothem. A high degree of between-record variability is apparent in the proxy data and significant chronological uncertainties are present. However, tephra layers provide a robust tool for correlation and improve the chronological precision of the records. Although we can find no statistically significant coherence in the dataset as a whole, a selection of high-quality peatland water table reconstructions co-vary more than would be expected by chance alone. A locally weighted regression model with bootstrapping can be used to construct a ‘best-estimate’ palaeoclimatic reconstruction from these datasets. Visual comparison and cross-wavelet analysis of peatland water table compilations from Ireland and Northern Britain show that there are some periods of coherence between these records. Some terrestrial palaeoclimatic changes in Ireland appear to coincide with changes in the North Atlantic thermohaline circulation and solar activity. However, these relationships are inconsistent and may be obscured by chronological uncertainties. We conclude by suggesting an agenda for future Holocene climate research in Ireland.
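The 'best-estimate' reconstruction step can be sketched as locally weighted regression (LOESS) with bootstrap resampling, here using statsmodels; the smoothing fraction, age grid and synthetic data are illustrative, not the published water-table records.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def bootstrap_loess(age, values, grid, frac=0.3, n_boot=200, seed=0):
    """Locally weighted regression with bootstrap resampling (illustrative).

    Returns the 2.5th, 50th and 97.5th percentile curves on a common age grid.
    """
    rng = np.random.default_rng(seed)
    curves = np.empty((n_boot, len(grid)))
    for b in range(n_boot):
        idx = rng.integers(0, len(age), len(age))        # resample data points
        fit = lowess(values[idx], age[idx], frac=frac)   # sorted (x, fitted) pairs
        curves[b] = np.interp(grid, fit[:, 0], fit[:, 1])
    return np.percentile(curves, [2.5, 50, 97.5], axis=0)

# Hypothetical stacked water-table data from several sites (age in cal. yr BP).
rng = np.random.default_rng(1)
age = rng.uniform(2000, 6000, 400)
values = np.sin(age / 500.0) + 0.5 * rng.standard_normal(400)
grid = np.linspace(2000, 6000, 200)
lo, best, hi = bootstrap_loess(age, values, grid)
```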

Relevance:

100.00%

Publisher:

Abstract:

The Advanced Along-Track Scanning Radiometer (AATSR) was launched on Envisat in March 2002. The AATSR instrument is designed to retrieve precise and accurate global sea surface temperature (SST) that, combined with the large data set collected from its predecessors, ATSR and ATSR-2, will provide a long-term record of SST data spanning more than 15 years. This record can be used for independent monitoring and detection of climate change. The AATSR validation programme has successfully completed its initial phase. The programme involves validation of the AATSR-derived SST values using in situ radiometers, in situ buoys and global SST fields from other data sets. The results of the initial programme presented here demonstrate that the AATSR instrument is currently close to meeting its scientific objective of determining global SST to an accuracy of 0.3 K (one sigma). For night-time data, the analysis gives warm biases of between +0.04 K (0.28 K) for buoys and +0.06 K (0.20 K) for radiometers, with slightly higher errors observed for daytime data, which shows warm biases of between +0.02 K (0.39 K) for buoys and +0.11 K (0.33 K) for radiometers. These results show that the ATSR series of instruments continues to be the world leader in delivering accurate space-based observations of SST, a key climate parameter.
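The validation statistics are quoted as a bias with a parenthesised figure that is read here as the standard deviation of the satellite-minus-in situ differences (an assumption); the sketch below computes both numbers for a hypothetical set of matchups.

```python
import numpy as np

def matchup_statistics(sst_satellite, sst_insitu):
    """Bias and scatter of satellite-minus-in situ SST matchups (illustrative)."""
    diff = np.asarray(sst_satellite) - np.asarray(sst_insitu)
    return diff.mean(), diff.std(ddof=1)

# Hypothetical night-time buoy matchups (kelvin).
rng = np.random.default_rng(0)
truth = rng.uniform(275.0, 303.0, 500)
satellite = truth + 0.04 + 0.28 * rng.standard_normal(500)
bias, sd = matchup_statistics(satellite, truth)
print(f"bias {bias:+.2f} K ({sd:.2f} K)")
```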

Relevance:

100.00%

Publisher:

Abstract:

This article examines the ability of several models to generate optimal hedge ratios. Statistical models employed include univariate and multivariate generalized autoregressive conditionally heteroscedastic (GARCH) models, and exponentially weighted and simple moving averages. The variances of the hedged portfolios derived using these hedge ratios are compared with those based on market expectations implied by the prices of traded options. One-month and three-month hedging horizons are considered for four currency pairs. Overall, it has been found that an exponentially weighted moving-average model leads to lower portfolio variances than any of the GARCH-based, implied or time-invariant approaches.
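A minimal sketch of one of the competing approaches, the exponentially weighted moving-average hedge ratio, is given below: the minimum-variance ratio is the EWMA covariance of spot and futures returns divided by the EWMA futures variance, and the variance of the resulting hedged portfolio can be compared with the unhedged one. The decay factor and the simulated returns are illustrative assumptions, not the currency data of the article.

```python
import numpy as np

def ewma_hedge_ratios(spot_ret, fut_ret, lam=0.94):
    """Time-varying minimum-variance hedge ratio from EWMA moments (illustrative).

    h_t = Cov_t(spot, futures) / Var_t(futures), with both moments updated as
    exponentially weighted moving averages (lam is an assumed decay factor).
    """
    cov = spot_ret[0] * fut_ret[0]
    var = fut_ret[0] ** 2
    h = np.empty(len(spot_ret))
    for t in range(len(spot_ret)):
        cov = lam * cov + (1 - lam) * spot_ret[t] * fut_ret[t]
        var = lam * var + (1 - lam) * fut_ret[t] ** 2
        h[t] = cov / var
    return h

# Hypothetical spot and futures returns for one currency pair.
rng = np.random.default_rng(0)
fut = 0.006 * rng.standard_normal(1000)
spot = 0.9 * fut + 0.002 * rng.standard_normal(1000)

h = ewma_hedge_ratios(spot, fut)
hedged = spot[1:] - h[:-1] * fut[1:]           # ratio set one step ahead
print("unhedged variance:", spot[1:].var())
print("hedged variance:  ", hedged.var())
```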

Relevance:

100.00%

Publisher:

Abstract:

Body Sensor Networks (BSNs) have been recently introduced for the remote monitoring of human activities in a broad range of application domains, such as health care, emergency management, fitness and behaviour surveillance. BSNs can be deployed in a community of people and can generate large amounts of contextual data that require a scalable approach for storage, processing and analysis. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of data streams generated in BSNs. This paper proposes BodyCloud, a SaaS approach for community BSNs that supports the development and deployment of Cloud-assisted BSN applications. BodyCloud is a multi-tier application-level architecture that integrates a Cloud computing platform and BSN data streams middleware. BodyCloud provides programming abstractions that allow the rapid development of community BSN applications. This work describes the general architecture of the proposed approach and presents a case study for the real-time monitoring and analysis of cardiac data streams of many individuals.
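A toy sketch of the kind of community-scale stream analysis described above: a record schema for cardiac samples and a cloud-side monitor that keeps a sliding window per user. The class and field names are purely illustrative and are not BodyCloud's actual programming abstractions or data model.

```python
from dataclasses import dataclass
from collections import defaultdict, deque
from statistics import mean

@dataclass
class CardiacSample:
    """One record in a community BSN data stream (hypothetical schema)."""
    user_id: str
    timestamp: float      # seconds since epoch
    heart_rate: float     # beats per minute reported by the body sensor

class CommunityMonitor:
    """Toy cloud-side analysis: sliding window of samples per user, with a
    simple alert when the windowed mean heart rate exceeds a threshold."""

    def __init__(self, window=60, threshold=120.0):
        self.window = window
        self.threshold = threshold
        self.buffers = defaultdict(lambda: deque(maxlen=self.window))

    def ingest(self, sample: CardiacSample):
        self.buffers[sample.user_id].append(sample.heart_rate)

    def alerts(self):
        return [uid for uid, buf in self.buffers.items()
                if buf and mean(buf) > self.threshold]
```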

Relevance:

100.00%

Publisher:

Abstract:

In the 1960s and early 1970s sea surface temperatures in the North Atlantic Ocean cooled rapidly. There is still considerable uncertainty about the causes of this event, although various mechanisms have been proposed. In this observational study it is demonstrated that the cooling proceeded in several distinct stages. Cool anomalies initially appeared in the mid-1960s in the Nordic Seas and Gulf Stream Extension, before spreading to cover most of the Subpolar Gyre. Subsequently, cool anomalies spread into the tropical North Atlantic before retreating, in the late 1970s, back to the Subpolar Gyre. There is strong evidence that changes in atmospheric circulation, linked to a southward shift of the Atlantic ITCZ, played an important role in the event, particularly in the period 1972-76. Theories for the cooling event must account for its distinctive space-time evolution. Our analysis suggests that the most likely drivers were: 1) The “Great Salinity Anomaly” of the late 1960s; 2) An earlier warming of the subpolar North Atlantic, which may have led to a slow-down in the Atlantic Meridional Overturning Circulation; 3) An increase in anthropogenic sulphur dioxide emissions. Determining the relative importance of these factors is a key area for future work.