876 results for bigdata, data stream processing, dsp, apache storm, cyber security
Abstract:
In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting this data to transform it into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work that has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modeling, empirical data modeling, knowledge discovery, data mining, and data fusion.
Abstract:
The differential phase (ΦDP) measured by polarimetric radars is recognized to be a very good indicator of the attenuation integrated along the propagation path by rain. Moreover, if a linear relationship is assumed between the specific differential phase (KDP) and the specific attenuation (AH) and specific differential attenuation (ADP), then attenuation can easily be corrected. The coefficients of proportionality, γH and γDP, are, however, known to depend in rain upon drop temperature, drop shapes, drop size distribution, and the presence of large drops causing Mie scattering. In this paper, the authors extensively apply a physically based method, often referred to as the “Smyth and Illingworth constraint,” which uses the constraint that the value of the differential reflectivity ZDR on the far side of the storm should be low to retrieve the γDP coefficient. More than 30 convective episodes observed by the French operational C-band polarimetric Trappes radar during two summers (2005 and 2006) are used to document the variability of γDP with respect to the intrinsic three-dimensional characteristics of the attenuating cells. The Smyth and Illingworth constraint could be applied to only 20% of all attenuated rays of the 2-yr dataset, so it cannot be considered the unique solution for attenuation correction in an operational setting, but it is useful for characterizing the properties of the strongly attenuating cells. The range of variation of γDP is shown to be extremely large, with minimal, maximal, and mean values equal, respectively, to 0.01, 0.11, and 0.025 dB °−1. Coefficient γDP appears to be almost linearly correlated with the horizontal reflectivity (ZH), differential reflectivity (ZDR), specific differential phase (KDP), and correlation coefficient (ρHV) of the attenuating cells. The temperature effect is negligible with respect to that of the microphysical properties of the attenuating cells. Unusually large values of γDP, above 0.06 dB °−1, often referred to as “hot spots,” are reported for a nonnegligible 15% of the rays presenting a significant total differential phase shift (ΔΦDP > 30°). The corresponding strongly attenuating cells are shown to have extremely high ZDR (above 4 dB) and ZH (above 55 dBZ), very low ρHV (below 0.94), and high KDP (above 4° km−1). Analysis of 4 yr of observed raindrop spectra does not reproduce such low values of ρHV, suggesting that (wet) ice is likely to be present in the precipitation medium and responsible for the attenuation and high phase shifts. Furthermore, if melting ice is responsible for the high phase shifts, this suggests that KDP may not be uniquely related to rainfall rate but can result from the presence of wet ice. This hypothesis is supported by the analysis of the vertical profiles of horizontal reflectivity and the values of conventional probability-of-hail indexes.
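The linear relation assumed in this abstract makes the correction a one-line operation per ray: since ΦDP = 2∫KDP dr and AH = γH KDP, the two-way path-integrated attenuation reduces to γH ΔΦDP. A minimal Python sketch of that step (illustrative only, not the authors' processing chain; the γH value and array names are assumptions):

    import numpy as np

    def correct_attenuation(z_h, phi_dp, gamma_h=0.08):
        # With A_H = gamma_h * K_DP and Phi_DP = 2 * integral(K_DP dr),
        # the two-way path-integrated attenuation at each range gate
        # is simply gamma_h times the accumulated differential phase.
        delta_phi = phi_dp - phi_dp[0]        # accumulated phase along the ray (deg)
        pia = gamma_h * delta_phi             # path-integrated attenuation (dB)
        return z_h + pia                      # attenuation-corrected reflectivity (dBZ)

    z_h = np.array([45.0, 48.0, 50.0, 46.0])      # measured reflectivity (dBZ)
    phi_dp = np.array([0.0, 10.0, 20.0, 30.0])    # differential phase (deg)
    print(correct_attenuation(z_h, phi_dp))       # last gate gains 0.08 * 30 = 2.4 dB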
Abstract:
We are developing computational tools supporting the detailed analysis of the dependence of neural electrophysiological responses on dendritic morphology. We approach this problem by combining simulations of faithful models of neurons (experimental real-life morphological data with known models of channel kinetics) with algorithmic extraction of morphological and physiological parameters and statistical analysis. In this paper, we present a novel method for the automatic recognition of spike trains in voltage traces, which eliminates the need for human intervention. This enables classification of waveforms with consistent criteria across all analyzed traces and so amounts to a reduction of the noise in the data. The method allows automatic extraction of the physiological parameters needed for further statistical analysis. To illustrate the usefulness of this procedure for analyzing voltage traces, we characterized the influence of the somatic current injection level on several electrophysiological parameters in a set of modeled neurons. This application suggests that such algorithmic processing of physiological data extracts parameters in a form suitable for further investigation of the structure-activity relationship in single neurons.
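The abstract does not spell out the recognition algorithm, but a common baseline for this task is threshold-crossing detection followed by extraction of per-spike parameters. A hedged Python sketch (all names, thresholds, and the toy trace are assumptions, not the authors' method):

    import numpy as np

    def detect_spikes(v, dt, threshold=-20.0):
        # Detect spikes as upward threshold crossings, then climb to the
        # local peak to read off spike time and amplitude.
        above = v > threshold
        onsets = np.flatnonzero(~above[:-1] & above[1:]) + 1
        spike_times, amplitudes = [], []
        for i in onsets:
            j = i
            while j + 1 < len(v) and v[j + 1] >= v[j]:   # ascend to the peak
                j += 1
            spike_times.append(j * dt)                   # spike time (ms)
            amplitudes.append(v[j])                      # peak voltage (mV)
        return np.array(spike_times), np.array(amplitudes)

    # Toy trace: baseline -65 mV with periodic spike-like bumps
    t = np.arange(0, 100, 0.1)
    v = -65 + 90 * np.exp(-(((t % 25) - 12.5) ** 2) / 2)
    times, amps = detect_spikes(v, dt=0.1)
    isi = np.diff(times)      # inter-spike intervals, one derived parameter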
Abstract:
This paper introduces a new blind equalisation algorithm for pulse amplitude modulation (PAM) data transmitted through nonminimum phase (NMP) channels. The algorithm is based on a noncausal AR model of the communication channel and on the second- and fourth-order cumulants of the received data series, where only the diagonal slices of the cumulants are used. The AR parameters are adjusted at each sample by a successive over-relaxation (SOR) scheme, a variant of the ordinary LMS scheme but with a faster convergence rate and greater robustness to the selection of the ‘step size’ in iterations. Computer simulations are implemented for both linear time-invariant (LTI) and linear time-variant (LTV) NMP channels, and the results show that the proposed algorithm has a fast convergence rate and a potential capability to track LTV NMP channels.
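For readers unfamiliar with SOR in this role: it solves the normal equations built from the cumulant statistics by sweeping the unknowns one at a time with an over-relaxation factor ω that plays the role of the LMS 'step size'. A generic sketch (the toy system and names are assumptions; the paper's per-sample cumulant update is not reproduced here):

    import numpy as np

    def sor_step(A, b, x, omega=1.2):
        # One successive over-relaxation sweep for A x = b.
        # omega = 1 is Gauss-Seidel; 1 < omega < 2 typically accelerates
        # convergence, which is the 'step size' robustness the paper cites.
        n = len(b)
        for i in range(n):
            sigma = A[i, :].dot(x) - A[i, i] * x[i]   # off-diagonal contribution
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
        return x

    # Toy system standing in for the cumulant-based normal equations
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = np.zeros(2)
    for _ in range(25):
        x = sor_step(A, b, x)
    print(x)   # converges towards the solution of A x = b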
Abstract:
Previous studies have shown that sea ice in the Sea of Okhotsk can be affected by local storms; in turn, the resultant sea-ice changes can affect the downstream development of storm tracks in the Pacific and possibly dampen a pre-existing North Atlantic Oscillation (NAO) signal in late winter. In this paper, a storm-tracking algorithm was applied to the six-hourly horizontal winds from the National Centers for Environmental Prediction (NCEP) reanalysis data from 1978(9) to 2007 and to output from the atmospheric general circulation model (AGCM) ECHAM5 forced by sea-ice anomalies in the Sea of Okhotsk. The life cycle response of storms to sea-ice anomalies is investigated using various aspects of storm activity: cyclone genesis, lysis, intensity, and track density. Results show that, for enhanced positive sea-ice concentrations in the Sea of Okhotsk, there is a decrease in secondary cyclogenesis, a westward shift in cyclolysis, and changes in the subtropical jet in the North Pacific. In the Atlantic, a pattern resembling the negative phase of the NAO is observed. This pattern is confirmed by the AGCM ECHAM5 experiments driven with above-normal sea-ice anomalies in the Sea of Okhotsk.
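The paper does not detail its tracker, but the core of most storm-tracking algorithms is the same two-step loop: identify cyclone centres in each six-hourly field, then link centres across frames by nearest-neighbour association within a maximum displacement. A hedged sketch of the linking step only (the distance threshold and all names are assumptions):

    import numpy as np

    def link_tracks(frames, max_step_km=720.0):
        # frames: list of (n_i, 2) arrays of cyclone centres (x, y) in km,
        # one array per 6-hourly field. A new centre starts a track
        # (genesis); a track with no nearby centre terminates (lysis).
        tracks, active = [], []
        for centres in frames:
            unused = list(range(len(centres)))
            next_active = []
            for track in active:
                last = track[-1]
                if unused:
                    d = [np.hypot(*(centres[j] - last)) for j in unused]
                    k = int(np.argmin(d))
                    if d[k] <= max_step_km:
                        track.append(centres[unused.pop(k)])
                        next_active.append(track)
                        continue
                tracks.append(track)           # lysis
            for j in unused:                   # genesis
                next_active.append([centres[j]])
            active = next_active
        return tracks + active

    frames = [np.array([[0.0, 0.0]]), np.array([[300.0, 100.0]]),
              np.array([[600.0, 200.0]])]
    print(len(link_tracks(frames)))   # one cyclone followed across three frames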
Abstract:
Increased tidal levels and storm surges related to climate change are projected to have extremely adverse effects on coastal regions. Predictions of such extreme and small-scale events, however, are exceedingly challenging, even for relatively short time horizons. Here we use data from observations, ERA-40 reanalysis, climate scenario simulations, and a simple feature model to find that the frequency of extreme storm surge events affecting Venice is projected to decrease by about 30% by the end of the twenty-first century. In addition, a trend assessment based on tidal observations revealed a reduction in extreme tidal levels. When the current +17 cm/century sea-level trend is extrapolated, our results suggest that the frequency of extreme tides in Venice might remain largely unaltered under the projected twenty-first-century climate simulations.
Abstract:
The potential of visible-near infrared spectra, obtained using a light backscatter sensor in conjunction with chemometrics, to predict curd moisture and whey fat content in a cheese vat was examined. A three-factor (renneting temperature, calcium chloride, cutting time) central composite design was carried out in triplicate. Spectra (300–1,100 nm) of the product in the cheese vat were captured during syneresis using a prototype light backscatter sensor. Stirring followed the cutting of the gel, and samples of curd and whey were removed at 10-min intervals and analyzed for curd moisture and whey fat content. The spectral data were used to develop models for predicting curd moisture and whey fat contents using partial least squares regression. Subjecting the spectral data set to jackknifing improved the accuracy of the models. The whey fat models (R = 0.91, 0.95) and curd moisture models (R = 0.86, 0.89) provided good and approximate predictions, respectively. Visible-near infrared spectroscopy was found to have potential for the prediction of important syneresis indices in stirred cheese vats.
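Partial least squares calibration of spectra against a wet-chemistry response is straightforward to prototype. A hedged sketch using scikit-learn (the synthetic matrices stand in for the spectral and compositional data, which are not public; the component count and names are assumptions):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))     # stand-in for 300-1100 nm spectra
    y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=60)   # stand-in for whey fat

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()   # cross-validated predictions
    r = np.corrcoef(y, y_cv)[0, 1]     # correlation R, the statistic reported above
    print(f"R = {r:.2f}")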
Abstract:
The impact of North Atlantic SST patterns on the storm track is investigated using a hierarchy of GCM simulations with idealized (aquaplanet) and “semirealistic” boundary conditions in the atmospheric component (HadAM3) of the third climate configuration of the Met Office Unified Model (HadCM3). This framework enables the mechanisms determining the tropospheric response to North Atlantic SST patterns to be examined, both in isolation and in combination with continental-scale landmasses and orography. In isolation, a “Gulf Stream” SST pattern acts to strengthen the downstream storm track, while a “North Atlantic Drift” SST pattern weakens it. These changes are consistent with changes in the extratropical SST gradient and near-surface baroclinicity, and each storm-track response is associated with a consistent change in the tropospheric jet structure. Locally enhanced near-surface horizontal wind convergence is found over the warm side of strengthened SST gradients, associated with ascending air and increased precipitation, consistent with previous studies. When the combined SST pattern is introduced into the semirealistic framework (including the “North American” continent and the “Rocky Mountains”), the results suggest that the topographically generated southwest–northeast tilt in the North Atlantic storm track is enhanced. In particular, the Gulf Stream shifts the storm track south in the western Atlantic, whereas the strong high-latitude SST gradient in the northeastern Atlantic enhances the storm track there.
Abstract:
This paper examines two hydrochemical time series derived from stream samples taken in the Upper Hafren catchment, Plynlimon, Wales. One time series comprises data collected at 7-hour intervals over 22 months (Neal et al., submitted, this issue), while the other is based on weekly sampling over 20 years. A subset of determinands (aluminium, calcium, chloride, conductivity, dissolved organic carbon, iron, nitrate, pH, silicon, and sulphate) is examined within a framework of non-stationary time-series analysis to identify determinand trends, seasonality, and short-term dynamics. The results demonstrate that both long-term and high-frequency monitoring provide valuable and unique insights into the hydrochemistry of a catchment. The long-term data allowed analysis of long-term trends, demonstrating continued increases in DOC concentrations accompanied by declining SO4 concentrations within the stream, and provided new insights into the changing amplitude and phase of the seasonality of determinands such as DOC and Al. Additionally, these data proved invaluable for placing the short-term variability seen in the high-frequency data in context. The 7-hour data highlighted complex diurnal cycles for NO3, Ca, and Fe, with cycles displaying changes in phase and amplitude on a seasonal basis. The high-frequency data also demonstrated the need to consider the impact that the time of sample collection can have on the summary statistics of the data, and showed that sampling during the hours of darkness provides additional hydrochemical information for determinands that exhibit pronounced diurnal variability. Moving forward, this research demonstrates the need for both long-term and high-frequency monitoring to facilitate a full and accurate understanding of catchment hydrochemical dynamics.
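Non-stationary trend and seasonality extraction of the kind described is commonly done with STL (seasonal-trend decomposition by loess), which lets both the amplitude and phase of the seasonal component evolve over time. A hedged sketch on a synthetic weekly series (the toy data and all names are assumptions, not the Plynlimon record):

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.seasonal import STL

    # Toy weekly DOC-like series standing in for the 20-year record:
    # a slow rise plus an annual cycle plus noise
    idx = pd.date_range("1990-01-01", periods=20 * 52, freq="W")
    t = np.arange(len(idx))
    doc = 4 + 0.002 * t + np.sin(2 * np.pi * t / 52) + 0.3 * np.random.randn(len(idx))
    series = pd.Series(doc, index=idx)

    res = STL(series, period=52, robust=True).fit()
    trend, seasonal, resid = res.trend, res.seasonal, res.resid
    print(trend.iloc[-1] - trend.iloc[0])   # long-term rise over the record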
Abstract:
In basic network transactions, datagrams travelling from source to destination are routed through numerous routers and paths, depending on which paths are free and uncongested; this can make the transmission route excessively long, incurring greater delay, jitter, and congestion and reducing throughput. One of the major problems of packet-switched networks is cell delay variation, or jitter, which arises from queuing delay under the applied loading conditions. The accumulation of delay and jitter across the nodes along a transmission route, together with dropped packets, adds further complexity for multimedia traffic, because there is no guarantee that each traffic stream will be delivered within its own jitter constraints; the effects of jitter therefore need to be analyzed. IP routing uses a single path for the transmission of all packets. Multi-Protocol Label Switching (MPLS), on the other hand, separates packet forwarding from routing, enabling packets to use appropriate routes and allowing the behavior of transmission paths to be optimized and controlled, thereby correcting some of the shortfalls of IP routing. MPLS is therefore used in this analysis for effective transmission across the various networks. This paper analyzes the effects of delay, congestion, interference, jitter, and packet loss in the transmission of signals from source to destination, along with the impact of link failures and repair paths in the bus, star, mesh, and hybrid physical topologies, all under standard network conditions.
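Jitter in such an analysis is usually quantified with the standard interarrival-jitter estimator of RFC 3550, which smooths the packet-to-packet delay variation with a 1/16 gain. A minimal sketch (the timestamps are illustrative):

    def interarrival_jitter(send_times, recv_times):
        # D_i = (recv_i - recv_{i-1}) - (send_i - send_{i-1}) is the delay
        # variation between consecutive packets; J is its smoothed magnitude,
        # updated as J += (|D| - J)/16 per RFC 3550.
        jitter = 0.0
        for i in range(1, len(send_times)):
            d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
            jitter += (abs(d) - jitter) / 16.0
        return jitter

    # Packets sent every 20 ms, received with variable queuing delay
    send = [0, 20, 40, 60, 80]
    recv = [5, 27, 49, 66, 91]
    print(interarrival_jitter(send, recv))   # running jitter estimate in ms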
Abstract:
Sirens used by police, fire, and paramedic vehicles generate noise that propagates inside the vehicle cab and subsequently corrupts the intelligibility of voice communications from the emergency vehicle to the control room. It is even common for the siren to be turned off so that the control room can hear what is being said. Both fixed-filter and adaptive-filter systems have previously been developed to help cancel the transmission of siren noise over the radio. Previous cancellation systems have concentrated only on the traditional two-tone, wail, and yelp sirens. This paper discusses an improvement to a previous adaptive filter system and presents cancellation results for three new types of siren: chirp, pulsar, and localiser. A siren noise filter system has the capability to improve the response time of an emergency vehicle and thus help save lives. To date, this system has been tested using live recordings taken from a non-emergency situation, with good results.
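The classic structure behind such systems is the adaptive noise canceller: a reference sensor near the siren drives an adaptive filter whose output is subtracted from the speech microphone, and the residual is what goes over the radio. A hedged LMS sketch (filter length, step size, and names are assumptions, not the paper's design):

    import numpy as np

    def lms_cancel(primary, reference, n_taps=32, mu=0.01):
        # primary: speech + siren picked up by the radio microphone.
        # reference: siren-only signal from a sensor near the siren.
        # Returns the error signal, i.e. speech with the siren suppressed.
        w = np.zeros(n_taps)
        out = np.zeros(len(primary))
        for n in range(n_taps, len(primary)):
            x = reference[n - n_taps:n][::-1]   # most recent reference samples
            y = w @ x                           # filter's estimate of the siren
            e = primary[n] - y                  # residual: mostly speech
            w += mu * e * x                     # LMS weight update
            out[n] = e
        return out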
Abstract:
Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
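The contrast drawn here can be made concrete on a generic linear inverse problem y = Hx + noise: Tikhonov regularization penalizes ||x||² and has a closed-form solution, while the L1 penalty is handled iteratively, for example by soft-thresholding (ISTA). A hedged sketch (generic, not the paper's assimilation system):

    import numpy as np

    def tikhonov(H, y, lam):
        # L2-regularized estimate: argmin ||H x - y||^2 + lam ||x||^2
        n = H.shape[1]
        return np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

    def ista(H, y, lam, n_iter=500):
        # L1-regularized estimate via iterative soft-thresholding (ISTA)
        step = 1.0 / np.linalg.norm(H, 2) ** 2   # 1/L, L = squared spectral norm
        x = np.zeros(H.shape[1])
        for _ in range(n_iter):
            g = x - step * H.T @ (H @ x - y)     # gradient step on the data term
            x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # shrink
        return x

For states with sharp fronts or shocks, the L1 penalty preserves discontinuities that the quadratic penalty tends to smear, which is the behaviour the paper reports.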
Abstract:
The technique of constructing a transformation, or regrading, of a discrete data set such that the histogram of the transformed data matches a given reference histogram is commonly known as histogram modification. The technique is widely used for image enhancement and normalization. A method which has been previously derived for producing such a regrading is shown to be “best” in the sense that it minimizes the error between the cumulative histogram of the transformed data and that of the given reference function, over all single-valued, monotone, discrete transformations of the data. Techniques for smoothed regrading, which provide a means of balancing the error in matching a given reference histogram against the information lost with respect to a linear transformation, are also examined. The smoothed regradings are shown to optimize certain cost functionals. Numerical algorithms for generating the smoothed regradings, which are simple and efficient to implement, are described, and practical applications to the processing of LANDSAT image data are discussed.
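The optimal single-valued monotone regrading described here is what is now usually called histogram matching: map each level through the data's cumulative histogram and then through the inverse of the reference's. A hedged sketch (the level count and names are assumptions):

    import numpy as np

    def match_histogram(data, reference, n_levels=256):
        # Build the cumulative histograms of data and reference, then map
        # each grey level to the reference level with the nearest CDF value.
        # The resulting lookup table is single-valued and monotone.
        bins = np.arange(n_levels + 1)
        data_cdf = np.cumsum(np.histogram(data, bins=bins)[0]) / data.size
        ref_cdf = np.cumsum(np.histogram(reference, bins=bins)[0]) / reference.size
        lut = np.searchsorted(ref_cdf, data_cdf).clip(0, n_levels - 1)
        return lut[data.astype(int)]

    rng = np.random.default_rng(2)
    data = rng.integers(0, 64, size=(128, 128))        # dark LANDSAT-like band
    reference = rng.integers(0, 256, size=(128, 128))  # desired grey-level spread
    print(match_histogram(data, reference).max())      # output now spans the range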
Abstract:
Methods for producing nonuniform transformations, or regradings, of discrete data are discussed. The transformations are useful in image processing, principally for enhancement and normalization of scenes. Regradings which “equidistribute” the histogram of the data, that is, which transform it into a constant function, are determined. Techniques for smoothing the regrading, dependent upon a continuously variable parameter, are presented. Generalized methods for constructing regradings such that the histogram of the data is transformed into any prescribed function are also discussed. Numerical algorithms for implementing the procedures and applications to specific examples are described.
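Equidistribution is the special case where the prescribed histogram is constant, i.e. histogram equalization; a continuously variable smoothing parameter can then blend the equidistributing map with the identity (linear) regrading, in the spirit described above. A hedged sketch (the blend parameter and names are assumptions):

    import numpy as np

    def smoothed_equalize(data, alpha=1.0, n_levels=256):
        # alpha = 1 fully equalizes the histogram; alpha = 0 leaves the
        # (linear) identity regrading; intermediate values balance the two.
        levels = np.arange(n_levels)
        cdf = np.cumsum(np.histogram(data, bins=np.arange(n_levels + 1))[0]) / data.size
        equalized = (n_levels - 1) * cdf                  # equidistributing map
        lut = (1 - alpha) * levels + alpha * equalized    # blend with identity
        return np.rint(lut[data.astype(int)]).astype(int)

    img = np.clip(np.random.default_rng(3).normal(128, 20, (64, 64)), 0, 255)
    out = smoothed_equalize(img, alpha=0.5)   # halfway between linear and equalized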