69 results for Temporal Precision
Abstract:
SARAS is a correlation spectrometer purpose-designed for precision measurements of the cosmic radio background and of faint features in the sky spectrum at long wavelengths that arise from redshifted 21-cm emission from gas in the reionization epoch. SARAS operates in the octave band 87.5-175 MHz. We present herein the system design, arguing for a complex correlation spectrometer concept. The SARAS design concept provides a differential measurement between the antenna temperature and that of an internal reference termination, with measurements in switched system states allowing for cancellation of additive contaminants from a large part of the signal flow path, including the digital spectrometer. A switched noise-injection scheme provides absolute spectral calibration. Additionally, we argue for an electrically small, frequency-independent antenna over an absorber ground. Various critical design features that aid in the avoidance of systematics and in providing calibration products for the parametrization of other, unavoidable systematics are described and the rationale discussed. The signal flow and processing are analyzed, and the response to the noise temperatures of the antenna, reference termination and amplifiers is computed. Multi-path propagation arising from internal reflections is considered in the analysis, which includes a harmonic series of internal reflections. We argue that the SARAS design concept is advantageous for precision measurement of the absolute cosmic radio background spectrum; therefore, the design features and analysis methods presented here are expected to serve as a basis for implementations tailored to measurements of a multiplicity of features in the background sky at long wavelengths, which may arise from events in the dark ages and the subsequent reionization era.
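To make the switched differential measurement concrete, the following toy sketch (entirely hypothetical numbers, not the SARAS signal chain) shows how differencing spectra taken in two switch states cancels an additive contaminant injected after the switch while preserving the antenna-minus-reference spectrum.

```python
import numpy as np

# Minimal toy model (assumed, not the SARAS pipeline): additive contaminants
# injected after the switch appear with the same sign in both switch states,
# so differencing the two measured spectra cancels them, while the
# antenna-minus-reference signal flips sign between states and survives.

rng = np.random.default_rng(0)
nchan = 1024
T_ant = 300.0 + 5.0 * np.sin(np.linspace(0, 3, nchan))   # hypothetical antenna temperature spectrum
T_ref = 300.0 * np.ones(nchan)                            # internal reference termination
T_add = 50.0 + rng.normal(0, 0.1, nchan)                  # additive contaminant (e.g. spectrometer pickup)

state0 = (T_ant - T_ref) + T_add     # switch state 0
state1 = -(T_ant - T_ref) + T_add    # switch state 1 (signal paths swapped)

recovered = 0.5 * (state0 - state1)  # additive term cancels exactly
print(np.allclose(recovered, T_ant - T_ref))  # True
```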
Abstract:
One of the challenges in accurately estimating the Worst Case Execution Time (WCET) of executables is to accurately predict their cache behaviour. Various techniques have been developed to predict the cache contents at different program points to estimate the execution time of memory-accessing instructions. One of the most widely used techniques is Abstract Interpretation based Must Analysis, which determines the cache blocks guaranteed to be present in the cache, and hence provides safe estimation of cache hits and misses. However, Must Analysis is highly imprecise, and platforms using Must Analysis have been known to produce blown-up WCET estimates. In our work, we propose to use May Analysis to assist the Must Analysis cache update and make it more precise. We prove the safety of our approach as well as provide examples where our Improved Must Analysis provides better precision. Further, we also detect a serious flaw in the original Persistence Analysis, and use Must and May Analysis to assist the Persistence Analysis cache update, to make it safe and more precise than the known solutions to the problem.
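As a point of reference for the analysis being refined here, the sketch below is a minimal, assumed implementation of the classical abstract-interpretation Must update and join for a single fully associative LRU cache set; it does not include the May-assisted update proposed in the paper.

```python
# Minimal sketch (assumed simplification) of classical Must Analysis for one
# fully-associative LRU cache set: the abstract state maps each memory block to
# an upper bound on its LRU age; a block present in the state is a guaranteed hit.

ASSOC = 4  # cache associativity (assumed)

def must_update(state, block):
    """Access `block`; age bounds of possibly-younger blocks grow by one."""
    old_age = state.get(block, ASSOC)       # absent blocks behave as age >= ASSOC
    new = {}
    for b, age in state.items():
        if b == block:
            continue
        new_age = age + 1 if age < old_age else age
        if new_age < ASSOC:                 # blocks aged out are no longer guaranteed present
            new[b] = new_age
    new[block] = 0
    return new

def must_join(s1, s2):
    """At control-flow joins keep only blocks guaranteed on both paths, with the worse (larger) age bound."""
    return {b: max(s1[b], s2[b]) for b in s1.keys() & s2.keys()}

state = {}
for access in ["a", "b", "a", "c"]:
    print(access, "guaranteed hit" if access in state else "not guaranteed")
    state = must_update(state, access)
```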
Abstract:
We address the problem of temporal envelope modeling for transient audio signals. We propose the Gamma distribution function (GDF) as a suitable candidate for modeling the envelope, keeping in view some of its interesting properties such as asymmetry, causality, a near-optimal time-bandwidth product, and controllability of rise and decay. Finding the parameters of the GDF is a nonlinear regression problem. We overcome this hurdle by fitting the logarithm of the envelope, which reduces the problem to one of linear regression. The logarithmic transformation also has the benefit of dynamic range compression. Since the amplitudes of temporal envelopes of audio signals are not uniformly distributed, we investigate the importance of various loss functions for the regression used to compute the amplitude. Based on experiments with synthesized data, for which ground truth is available, and with real-world signals, we observe that the least-squares technique gives reasonably accurate amplitude estimates compared with other loss functions.
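The log-domain reduction to linear regression can be illustrated with a short sketch; the parametrization e(t) = A * t^(k-1) * exp(-t/theta) and the noise model below are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch: a Gamma-shaped envelope becomes linear in its parameters after
# a log transform, so ordinary least squares recovers them.
# log e(t) = log A + (k - 1) log t - t / theta

rng = np.random.default_rng(1)
t = np.linspace(0.01, 1.0, 500)
A_true, k_true, theta_true = 2.0, 3.0, 0.1
env = A_true * t**(k_true - 1) * np.exp(-t / theta_true)
env_noisy = env * np.exp(0.05 * rng.normal(size=t.size))   # assumed multiplicative noise

X = np.column_stack([np.ones_like(t), np.log(t), -t])      # design matrix for [log A, k-1, 1/theta]
coef, *_ = np.linalg.lstsq(X, np.log(env_noisy), rcond=None)
A_est, k_est, theta_est = np.exp(coef[0]), coef[1] + 1.0, 1.0 / coef[2]
print(A_est, k_est, theta_est)   # close to (2.0, 3.0, 0.1)
```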
Abstract:
We propose a novel space-time descriptor for region-based tracking which is very concise and efficient. Regions, represented by covariance matrices within a temporal fragment, are used to estimate this space-time descriptor, which we call the Eigenprofiles (EP). The EP so obtained are used in estimating the covariance matrix of features over spatio-temporal fragments. The second-order statistics of spatio-temporal fragments form our target model, which can be adapted to variations across the video. Being concise, the model also allows the use of multiple spatially overlapping fragments to represent the target. We demonstrate good tracking results on very challenging datasets shot under insufficient illumination conditions.
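For orientation, the sketch below computes a plain region covariance descriptor per frame and the second-order statistics of a short temporal fragment; the feature choice and the simple eigendecomposition at the end are assumptions and only a loose analogue of the Eigenprofiles construction.

```python
import numpy as np

# Minimal sketch (assumed features, not the paper's exact pipeline): build a
# per-frame region covariance descriptor from simple pixel features, then
# summarize a temporal fragment by its mean covariance and dominant eigenvectors.

def region_covariance(patch):
    """Covariance of per-pixel features [x, y, intensity, |dI/dx|, |dI/dy|]."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(patch.astype(float))
    feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(gx).ravel(), np.abs(gy).ravel()], axis=1)
    return np.cov(feats, rowvar=False)

rng = np.random.default_rng(2)
fragment = rng.random((8, 32, 32))                   # 8 frames of a 32x32 region (synthetic)
covs = np.array([region_covariance(f) for f in fragment])
mean_cov = covs.mean(axis=0)                         # second-order statistics of the fragment
eigvals, eigvecs = np.linalg.eigh(mean_cov)          # dominant directions summarizing the fragment
print(eigvals[::-1])
```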
Abstract:
The two-pion contribution from low energies to the muon magnetic moment anomaly, although small, has a large relative uncertainty, since in this region the experimental data on the cross sections are neither sufficient nor precise enough. It is therefore of interest to see whether the precision can be improved by means of additional theoretical information on the pion electromagnetic form factor, which controls the leading-order contribution. In the present paper, we address this problem by exploiting analyticity and unitarity of the form factor in a parametrization-free approach that uses as input the phase in the elastic region, known with high precision from the Fermi-Watson theorem and Roy equations for ππ elastic scattering. The formalism also includes experimental measurements of the modulus in the region 0.65-0.70 GeV, taken from the most recent e+e- → π+π- experiments, and recent measurements of the form factor on the spacelike axis. By combining the results obtained with inputs from CMD2, SND, BABAR, and KLOE, we make the predictions a_μ^{ππ,LO}[2m_π, 0.30 GeV] = (0.553 ± 0.004) × 10^-10 and a_μ^{ππ,LO}[0.30 GeV, 0.63 GeV] = (133.083 ± 0.837) × 10^-10. These are consistent with the other recent determinations and have slightly smaller errors.
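For context, the leading-order two-pion contribution quoted above is obtained from a dispersive integral over the form factor modulus; a schematic form, in one common convention (kernel normalizations differ between references, and the quoted windows refer to the c.m. energy √s), is:

```latex
a_\mu^{\pi\pi,\mathrm{LO}}[\sqrt{s_1},\sqrt{s_2}] \;=\; \frac{\alpha^2}{3\pi^2}
  \int_{s_1}^{s_2} \frac{\mathrm{d}s}{s}\, K(s)\, R_{\pi\pi}(s),
\qquad
R_{\pi\pi}(s) \;=\; \frac{1}{4}\Bigl(1-\frac{4 m_\pi^2}{s}\Bigr)^{3/2}
  \bigl|F_\pi^V(s)\bigr|^2 ,
```

where K(s) is the standard QED kernel function and F_π^V(s) is the pion vector (electromagnetic) form factor whose phase and modulus are constrained as described above.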
Abstract:
Land use (LU) and land cover (LC) information at a temporal scale illustrates the physical coverage of the Earth's terrestrial surface according to its use and provides the detailed information needed for effective planning and management activities. LULC changes are local and location specific, but collectively they act as drivers of global environmental change. Understanding and predicting the impact of LULC change processes requires long-term historical reconstructions of land cover change and projections into the future at regional to global scales. The present study aims at quantifying spatio-temporal landscape dynamics along a gradient of varying terrains present in the landscape using a multi-data approach (MDA). MDA incorporates multi-temporal satellite imagery with demographic data and other relevant data sets. The gradient covers three different types of topography (plains, hilly terrain and a coastal region) to account for the significant role of elevation in land cover change. Seasonality is another aspect to be considered in vegetation-dominated landscapes; seasonal variations are accounted for using multi-seasonal data. Spatial patterns of the various patches are identified and analysed using landscape metrics to understand forest fragmentation. Likely changes in 2020 are predicted through scenario analysis, considering present growth rates and the proposed developmental projects. This work summarizes recent estimates of changes in cropland, agricultural intensification, deforestation, pasture expansion, and urbanization as the causal factors of LULC change.
Abstract:
Long-term surveys of entire communities of species are needed to measure fluctuations in natural populations and elucidate the mechanisms driving population dynamics and community assembly. We analysed changes in abundance of over 4000 tree species in 12 forests across the world over periods of 6-28 years. Abundance fluctuations in all forests are large and consistent with population dynamics models in which temporal environmental variance plays a central role. At some sites we identify clear environmental drivers, such as fire and drought, that could underlie these patterns, but at other sites there is a need for further research to identify drivers. In addition, cross-site comparisons showed that abundance fluctuations were smaller at species-rich sites, consistent with the idea that stable environmental conditions promote higher diversity. Much community ecology theory emphasises demographic variance and niche stabilisation; we encourage the development of theory in which temporal environmental variance plays a central role.
Abstract:
This paper discusses an approach for river mapping and flood evaluation based on multi-temporal time-series analysis of satellite images, utilizing pixel spectral information for image classification and region-based segmentation to extract the water-covered region. The analysis of Moderate Resolution Imaging Spectroradiometer (MODIS) satellite images is applied in two stages: before the flood and during the flood. For these images, the extraction of the water region utilizes spectral information for image classification and spatial information for image segmentation. Multi-temporal MODIS images from "normal" (non-flood) and flood time periods are processed in two steps. In the first step, image classifiers such as artificial neural networks and gene expression programming are used to separate the image pixels into water and non-water groups based on their spectral features. In the second step, the classified image is segmented using spatial features of the water pixels to remove misclassified water regions. From the results obtained, we evaluate the performance of the method and conclude that the combination of image classification and region-based segmentation is accurate and reliable for the extraction of the water-covered region.
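A minimal two-step classify-then-segment sketch follows; the simple water-index threshold stands in for the trained classifiers (neural networks, gene expression programming) actually used, and the band names and size threshold are assumptions.

```python
import numpy as np
from scipy import ndimage

# Minimal sketch of the two-step idea: step 1 labels pixels as water/non-water
# from spectral features; step 2 removes small misclassified water patches using
# spatial connectivity.

def classify_water(green, nir, threshold=0.0):
    """Step 1: spectral classification via a simple NDWI-like water index (assumed)."""
    ndwi = (green - nir) / (green + nir + 1e-9)
    return ndwi > threshold

def clean_water_mask(mask, min_pixels=50):
    """Step 2: region-based cleanup; keep only connected water regions above a size (assumed)."""
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))

rng = np.random.default_rng(3)
green, nir = rng.random((100, 100)), rng.random((100, 100))   # synthetic stand-in bands
water = clean_water_mask(classify_water(green, nir))
print(water.sum(), "water pixels after cleanup")
```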
Abstract:
A new representation of spatio-temporal random processes is proposed in this work. In practical applications, such processes are used to model velocity fields, temperature distributions, and the response of vibrating systems, to name a few. Finding an efficient representation of a random process leads to encapsulation of information, which makes it more convenient for practical implementation, for instance in a computational mechanics problem. For a single-parameter process, such as a purely spatial or temporal process, the eigenvalue decomposition of the covariance matrix leads to the well-known Karhunen-Loeve (KL) decomposition. However, for multiparameter processes, such as a spatio-temporal process, the covariance function itself can be defined in multiple ways. Here the process is assumed to be measured at a finite set of spatial locations and a finite number of time instants. The spatial covariance matrices at the different time instants are then used to define the covariance of the process. This set of square, symmetric, positive semi-definite matrices is represented as a third-order tensor. A suitable decomposition of this tensor can identify the dominant components of the process, and these components are then used to define a closed-form representation of the process. The procedure is analogous to the KL decomposition for a single-parameter process; however, the decompositions and interpretations differ significantly. The tensor decompositions are successfully applied to (i) a heat conduction problem, (ii) a vibration problem, and (iii) a covariance function taken from the literature that was fitted to measured wind velocity data. It is observed that the proposed representation provides an efficient approximation for some processes. Furthermore, a comparison with the KL decomposition shows that the proposed method is computationally cheaper than KL, in terms of both computer memory and execution time.
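The construction of the covariance tensor can be sketched as follows; the mode-unfolding SVD used here is a generic stand-in for the specific tensor decomposition studied in the paper, and the synthetic data and truncation threshold are assumptions.

```python
import numpy as np

# Minimal sketch: compute the spatial covariance matrix at each time instant,
# stack the matrices into a third-order tensor, and take the dominant left
# singular vectors of a mode unfolding as spatial components of the process.

rng = np.random.default_rng(4)
n_space, n_time, n_samples = 20, 50, 200
# synthetic realizations of a spatio-temporal process: shape (samples, time, space)
data = rng.normal(size=(n_samples, n_time, n_space)).cumsum(axis=1)

# spatial covariance at each time instant -> tensor of shape (time, space, space)
cov_tensor = np.stack([np.cov(data[:, t, :], rowvar=False) for t in range(n_time)])

# mode unfolding along the spatial index, then an ordinary SVD
unfold = cov_tensor.transpose(1, 0, 2).reshape(n_space, -1)
U, s, _ = np.linalg.svd(unfold, full_matrices=False)
n_keep = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.99) + 1  # keep 99% energy (assumed)
print("dominant spatial components kept:", n_keep)
```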
Abstract:
This paper proposes an automatic acoustic-phonetic method for estimating voice-onset time of stops. This method requires neither transcription of the utterance nor training of a classifier. It makes use of the plosion index for the automatic detection of burst onsets of stops. Having detected the burst onset, the onset of the voicing following the burst is detected using the epochal information and a temporal measure named the maximum weighted inner product. For validation, several experiments are carried out on the entire TIMIT database and two of the CMU Arctic corpora. The performance of the proposed method compares well with three state-of-the-art techniques. (C) 2014 Acoustical Society of America
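As a rough illustration of the burst-onset cue, the sketch below implements one common description of the plosion index (the window lengths and exact normalization are assumptions); the full method additionally uses epochal information and a maximum weighted inner product for the voicing onset, which are not reproduced here.

```python
import numpy as np

# Minimal sketch (assumed definition): the plosion index compares a sample's
# magnitude to the average magnitude over a preceding window, so an abrupt burst
# onset yields a large value.

def plosion_index(x, n, m1, m2):
    """|x[n]| divided by the mean |x| over the window [n-m1-m2, n-m1)."""
    start, stop = n - m1 - m2, n - m1
    if start < 0:
        return 0.0
    return np.abs(x[n]) / (np.mean(np.abs(x[start:stop])) + 1e-12)

fs = 16000
rng = np.random.default_rng(5)
x = np.concatenate([0.01 * rng.normal(size=fs // 10),   # low-level closure noise
                    0.5 * np.ones(10),                   # hypothetical burst
                    0.05 * rng.normal(size=fs // 10)])
pi = np.array([plosion_index(x, n, m1=160, m2=160) for n in range(len(x))])
print("candidate burst onset at sample", int(np.argmax(pi)))
```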
Abstract:
A novel algorithm for Virtual View Synthesis based on Non-Local Means Filtering is presented in this paper. Apart from using the video frames from nearby cameras and the corresponding per-pixel depth maps, the algorithm also makes use of the previously synthesized frame. Simple and efficient, the algorithm can synthesize video at any given virtual viewpoint at a faster rate, without compromising the quality of the synthesized frames. Experimental results support this claim: the subjective and objective quality of the synthesized frames is comparable to that of existing algorithms.
Abstract:
High wind poses hazards in a number of areas, such as structural safety, aviation, wind energy (where low wind speed is also a concern), and pollutant transport, to name a few. A good prediction tool for wind speed is therefore needed in these areas. Like many other natural processes, the behavior of wind is associated with considerable uncertainties stemming from different sources. To develop a reliable prediction tool for wind speed, these uncertainties should therefore be taken into account. In this work, we propose a probabilistic framework for the prediction of wind speed from measured spatio-temporal data. The framework is based on decompositions of the spatio-temporal covariance and simulation using these decompositions. A novel simulation method based on a tensor decomposition is used in this context. The proposed framework is composed of a set of four modules, and the modules are flexible enough to accommodate further modifications. The framework is applied to measured wind speed data from Ireland. Both short- and long-term predictions are addressed.
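For reference, the sketch below shows a plain Karhunen-Loeve style simulation from an estimated covariance, using synthetic stand-in data; the actual framework uses a tensor-based spatio-temporal decomposition and a four-module pipeline that are not reproduced here.

```python
import numpy as np

# Minimal sketch of covariance-decomposition-based simulation: estimate a
# covariance from measured records, keep the dominant eigenpairs, and draw new
# realizations by combining them with independent random coefficients.

rng = np.random.default_rng(7)
n_samples, n_time = 500, 100
records = rng.normal(size=(n_samples, n_time)).cumsum(axis=1)   # stand-in for measured wind records

mean = records.mean(axis=0)
cov = np.cov(records, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95) + 1   # keep 95% variance (assumed)
xi = rng.normal(size=k)                                             # independent standard normal coefficients
simulated = mean + eigvecs[:, :k] @ (np.sqrt(eigvals[:k]) * xi)     # one simulated record
print(k, simulated.shape)
```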
Abstract:
The problem addressed in this paper is sound, scalable, demand-driven null-dereference verification for Java programs. Our approach consists conceptually of a base analysis, plus two major extensions for enhanced precision. The base analysis is a dataflow analysis wherein we propagate formulas in the backward direction from a given dereference, and compute a necessary condition at the entry of the program for the dereference to be potentially unsafe. The extensions are motivated by the presence of certain "difficult" constructs in real programs, e.g., virtual calls with too many candidate targets, and library method calls, which happen to need excessive analysis time to be analyzed fully. The base analysis is hence configured to skip such a difficult construct when it is encountered by dropping all information that has been tracked so far that could potentially be affected by the construct. Our extensions are essentially more precise ways to account for the effect of these constructs on information that is being tracked, without requiring full analysis of these constructs. The first extension is a novel scheme to transmit formulas along certain kinds of def-use edges, while the second extension is based on using manually constructed backward-direction summary functions of library methods. We have implemented our approach, and applied it on a set of real-life benchmarks. The base analysis is on average able to declare about 84% of dereferences in each benchmark as safe, while the two extensions push this number up to 91%. (C) 2014 Elsevier B.V. All rights reserved.
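As a toy illustration of propagating a necessary condition backwards, the sketch below handles only straight-line copy and allocation statements (the names and the three-statement example are hypothetical); the difficult constructs discussed above, such as virtual calls and library methods, are precisely what this toy does not handle.

```python
# Minimal toy sketch (not the paper's analysis): walk a straight-line sequence of
# assignments backwards from a dereference of `deref_var`, computing a necessary
# condition at entry for the dereference to be potentially unsafe.

def backward_null_condition(stmts, deref_var):
    cond = {deref_var}               # variables whose null-ness at this point implies the bug
    for lhs, rhs in reversed(stmts):
        if lhs not in cond:
            continue
        cond.discard(lhs)
        if rhs == "new":             # allocation: lhs cannot be null, condition becomes infeasible
            return None              # dereference proven safe on this path
        if rhs == "null":            # lhs is definitely null: unsafe regardless of entry state
            return set()
        cond.add(rhs)                # copy: null-ness transfers to the source variable
    return cond                      # necessary condition at entry: these may be null

# p = q; r = new; x = p; then x is dereferenced
stmts = [("p", "q"), ("r", "new"), ("x", "p")]
print(backward_null_condition(stmts, "x"))   # {'q'}: unsafe only if q may be null at entry
```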
Abstract:
Local heterogeneity is ubiquitous in natural aqueous systems. It can be caused locally by external biomolecular subsystems like proteins, DNA, micelles and reverse micelles, nanoscopic materials, etc., but can also be intrinsic to the thermodynamic nature of the aqueous solution itself (as in binary mixtures or at the gas-liquid interface). The altered dynamics of water in the presence of such diverse surfaces has attracted considerable attention in recent years. As these interfaces are quite narrow, only a few molecular layers thick, they are hard to study by conventional methods. The recent development of two-dimensional infrared (2D-IR) spectroscopy allows us to estimate the length and time scales of such dynamics fairly accurately. In this work, we present a series of studies employing 2D-IR spectroscopy to investigate (i) the heterogeneous dynamics of water inside reverse micelles of varying sizes, (ii) supercritical water near the Widom line, which is known to exhibit pronounced density fluctuations, and (iii) the collective and local polarization fluctuations of water molecules in the presence of several different proteins. The spatio-temporal correlation of confined water molecules inside reverse micelles of varying sizes is well captured through the spectral diffusion of the corresponding 2D-IR spectra. In the case of supercritical water, too, we observe a strong signature of dynamic heterogeneity in the elongated shape of the 2D-IR spectra; in this case the relaxation is ultrafast. We find remarkable agreement between the different tools employed to study the relaxation of density heterogeneity. For aqueous protein solutions, we find that the calculated dielectric constant of the respective systems consistently shows a noticeable increase compared to that of neat water. However, the 'effective' dielectric constant for successive layers shows significant variation, with the layer adjacent to the protein having a much lower value. Relaxation is also slowest at the surface. We find that the dielectric constant reaches the bulk value at distances of more than 3 nm from the protein surface.
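Spectral diffusion in 2D-IR is commonly summarized by the frequency-fluctuation correlation function; the sketch below computes it for a synthetic frequency trajectory (an Ornstein-Uhlenbeck stand-in with an assumed correlation time), not for any of the water systems studied in the work above.

```python
import numpy as np

# Minimal sketch: slower decay of C(t) = <dw(0) dw(t)> signals longer-lived
# (more heterogeneous) local environments around the probed vibration.

rng = np.random.default_rng(8)
n_steps, dt, tau = 20000, 0.01, 1.0              # time step and correlation time in ps (assumed)
omega = np.empty(n_steps)
omega[0] = 0.0
for i in range(1, n_steps):                      # Ornstein-Uhlenbeck frequency trajectory
    omega[i] = omega[i - 1] * (1 - dt / tau) + np.sqrt(2 * dt / tau) * rng.normal()

domega = omega - omega.mean()
max_lag = 500
ffcf = np.array([np.mean(domega[:n_steps - k] * domega[k:]) for k in range(max_lag)])
print("C(0) =", ffcf[0], "  C(1 ps)/C(0) =", ffcf[int(tau / dt)] / ffcf[0])
```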