899 results for the SIMPLE algorithm


Relevance: 90.00%

Abstract:

In this article a simple and effective algorithm is introduced for the system identification of the Wiener system using observational input/output data. The nonlinear static function in the Wiener system is modelled using a B-spline neural network. The Gauss–Newton algorithm is combined with De Boor's algorithm (for evaluating both the B-spline curve and its first-order derivatives) for the parameter estimation of the Wiener model, together with a parameter initialisation scheme. Numerical examples are utilised to demonstrate the efficacy of the proposed approach.
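
As a rough illustration of the machinery involved, the sketch below evaluates a B-spline via De Boor's recursion in Python. The knot vector, coefficients and function name are hypothetical, and the Gauss–Newton estimation loop of the Wiener model is not shown; this is a sketch of the building block only, not the authors' implementation.

```python
import numpy as np

def de_boor(x, t, c, p):
    """Evaluate a B-spline at x via De Boor's recursion.

    t : knot vector, c : control coefficients, p : spline degree.
    Illustrative sketch only; not the implementation described above.
    """
    # Find the knot span k such that t[k] <= x < t[k + 1], clamped to a valid range.
    k = int(np.searchsorted(t, x, side="right")) - 1
    k = min(max(k, p), len(t) - p - 2)

    d = [c[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]

# Hypothetical static nonlinearity of a Wiener model, modelled as a cubic B-spline.
knots = np.array([0, 0, 0, 0, 1, 2, 3, 4, 4, 4, 4], dtype=float)
coeffs = np.array([0.0, 0.5, 1.8, 2.0, 1.2, 0.3, 0.0])
print(de_boor(2.5, knots, coeffs, p=3))
```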

Relevance: 90.00%

Abstract:

The estimation of prediction quality is important because without quality measures, it is difficult to determine the usefulness of a prediction. Currently, methods for ligand binding site residue predictions are assessed in the function prediction category of the biennial Critical Assessment of Techniques for Protein Structure Prediction (CASP) experiment, utilizing the Matthews Correlation Coefficient (MCC) and Binding-site Distance Test (BDT) metrics. However, the assessment of ligand binding site predictions using such metrics requires the availability of solved structures with bound ligands. Thus, we have developed a ligand binding site quality assessment tool, FunFOLDQA, which utilizes protein feature analysis to predict ligand binding site quality prior to the experimental solution of the protein structures and their ligand interactions. The FunFOLDQA feature scores were combined using simple linear combinations, multiple linear regression and a neural network. The neural network produced significantly better results for correlations to both the MCC and BDT scores, according to Kendall’s τ, Spearman’s ρ and Pearson’s r correlation coefficients, when tested on both the CASP8 and CASP9 datasets. The neural network also produced the largest Area Under the Curve (AUC) score when Receiver Operating Characteristic (ROC) analysis was undertaken for the CASP8 dataset. Furthermore, the FunFOLDQA algorithm incorporating the neural network is shown to add value to FunFOLD when both methods are employed in combination. This results in a statistically significant improvement over all of the best server methods, the FunFOLD method (6.43%), and one of the top manual groups (FN293) tested on the CASP8 dataset. The FunFOLDQA method was also found to be competitive with the top server methods when tested on the CASP9 dataset. To the best of our knowledge, FunFOLDQA is the first attempt to develop a method that can be used to assess ligand binding site prediction quality in the absence of experimental data.
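
A minimal sketch of the feature-combination and assessment logic, using simulated stand-in scores rather than FunFOLDQA features or CASP data: combine the features by multiple linear regression and report Kendall's τ, Spearman's ρ and Pearson's r between predicted and observed quality.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated stand-ins for per-target feature scores and an observed quality score (e.g. MCC).
features = rng.random((200, 4))
observed = features @ np.array([0.4, 0.3, 0.2, 0.1]) + 0.05 * rng.standard_normal(200)

# Multiple linear regression (ordinary least squares with an intercept term).
X = np.column_stack([np.ones(len(features)), features])
coef, *_ = np.linalg.lstsq(X, observed, rcond=None)
predicted = X @ coef

# Rank and linear correlations between predicted and observed quality.
print("Kendall tau :", stats.kendalltau(predicted, observed)[0])
print("Spearman rho:", stats.spearmanr(predicted, observed)[0])
print("Pearson r   :", stats.pearsonr(predicted, observed)[0])
```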

Relevance: 90.00%

Abstract:

Accurate observations of cloud microphysical properties are needed for evaluating and improving the representation of cloud processes in climate models and for better estimating the Earth's radiative budget. However, large differences are found in current cloud products retrieved from ground-based remote sensing measurements using various retrieval algorithms. Understanding the differences is an important step to address uncertainties in the cloud retrievals. In this study, an in-depth analysis of nine existing ground-based cloud retrievals using ARM remote sensing measurements is carried out. We place emphasis on boundary layer overcast clouds and high-level ice clouds, which are the focus of many current retrieval development efforts due to their radiative importance and relatively simple structure. Large systematic discrepancies in cloud microphysical properties are found in these two types of clouds among the nine cloud retrieval products, particularly for the cloud liquid and ice particle effective radius. Notably, the differences among some retrieval products are even larger than the uncertainties reported by the retrieval algorithm developers. It is shown that most of these large differences have their roots in the retrieval theoretical bases and assumptions, as well as in the input and constraint parameters. This study suggests the need to further validate current retrieval theories and assumptions, and even to develop new retrieval algorithms, using more observations under different cloud regimes.

Relevance: 90.00%

Abstract:

The collection of wind speed time series by means of digital data loggers occurs in many domains, including civil engineering, environmental sciences and wind turbine technology. Since averaging intervals are often significantly larger than typical system time scales, the information lost has to be recovered in order to reconstruct the true dynamics of the system. In the present work we introduce a simple algorithm capable of generating a real-time wind speed time series from data logger records containing the average, maximum, and minimum values of the wind speed in a fixed interval, as well as the standard deviation. The signal is generated from a generalized random Fourier series. The spectrum can be matched to any desired theoretical or measured frequency distribution. Extreme values are specified through a postprocessing step based on the concept of constrained simulation. Applications of the algorithm to 10-min wind speed records logged at a test site at 60 m height above the ground show that the recorded 10-min values can be reproduced by the simulated time series to a high degree of accuracy.
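
A minimal sketch of the reconstruction idea, under assumptions of our own (a power-law target spectrum, 1 Hz sampling within a 10-min interval): build a random-phase Fourier series with the chosen spectral shape and rescale it to the logged mean and standard deviation. The constrained-simulation step that enforces the logged maximum and minimum is omitted.

```python
import numpy as np

def synthesize_interval(mean, std, n=600, dt=1.0, slope=-5.0 / 3.0, seed=None):
    """Generate one logging interval of wind speed from a random-phase Fourier series.

    The spectral shape is an assumed power law (exponent `slope`); the series is
    rescaled to reproduce the logged mean and standard deviation. The min/max
    constraining step described above is not sketched here.
    """
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=dt)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (slope / 2.0)               # amplitude ~ sqrt(PSD)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
    signal = np.fft.irfft(amp * np.exp(1j * phases), n=n)
    signal = (signal - signal.mean()) / signal.std()   # zero mean, unit variance
    return mean + std * signal

# Example: reconstruct a 10-min interval logged as mean 7.2 m/s, std 1.1 m/s.
series = synthesize_interval(7.2, 1.1, seed=42)
print(round(series.mean(), 3), round(series.std(), 3))
```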

Relevance: 90.00%

Abstract:

Reading comprehension is an area of difficulty for many individuals with autism spectrum disorders (ASD). According to the Simple View of Reading, word recognition and oral language are both important determinants of reading comprehension ability. We provide a novel test of this model in 100 adolescents with ASD of varying intellectual ability. Further, we explore whether reading comprehension is additionally influenced by individual differences in social behaviour and social cognition in ASD. Adolescents with ASD aged 14-16 years completed assessments indexing word recognition, oral language, reading comprehension, social behaviour and social cognition. Regression analyses show that both word recognition and oral language explain unique variance in reading comprehension. Further, measures of social behaviour and social cognition predict reading comprehension after controlling for the variance explained by word recognition and oral language. This indicates that word recognition, oral language and social impairments may constrain reading comprehension in ASD.
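
The hierarchical-regression logic can be illustrated with synthetic stand-in scores: fit reading comprehension on word recognition and oral language first, then check how much additional variance a social measure explains. The data and effect sizes below are simulated, not the study's.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return 1.0 - (y - X1 @ beta).var() / y.var()

rng = np.random.default_rng(1)
n = 100  # sample size as above, but with simulated scores
word_rec = rng.standard_normal(n)
oral_lang = 0.5 * word_rec + rng.standard_normal(n)
social = rng.standard_normal(n)
reading_comp = 0.4 * word_rec + 0.4 * oral_lang + 0.2 * social + rng.standard_normal(n)

base = r_squared(np.column_stack([word_rec, oral_lang]), reading_comp)
full = r_squared(np.column_stack([word_rec, oral_lang, social]), reading_comp)
print(f"R^2 step 1: {base:.3f}  R^2 step 2: {full:.3f}  increment: {full - base:.3f}")
```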

Relevance: 90.00%

Abstract:

Three years of meteorological data collected at the WLEF-TV tower were used to drive a revised version of the Simple Biosphere (SiB 2.5) Model. Physiological properties and vegetation phenology were specified from satellite imagery. Simulated fluxes of heat, moisture, and carbon were compared to eddy covariance measurements taken onsite as a means of evaluating model performance on diurnal, synoptic, seasonal, and interannual time scales. The model was very successful in simulating variations of latent heat flux when compared to observations, slightly less so in the simulation of sensible heat flux. The model overestimated peak values of sensible heat flux on both monthly and diurnal scales. There was evidence that the differences between observed and simulated fluxes might be linked to wetlands near the WLEF tower, which were not present in the SiB simulation. The model overestimated the magnitude of the net ecosystem exchange of CO2 in both summer and winter. Mid-day maximum assimilation was well represented by the model, but late afternoon simulations showed excessive carbon uptake due to misrepresentation of within-canopy shading in the model. Interannual variability was not well simulated because only a single year of satellite imagery was used to parameterize the model.

Relevance: 90.00%

Abstract:

Evolutionary meta-algorithms for pulse shaping of broadband femtosecond duration laser pulses are proposed. The genetic algorithm searching the evolutionary landscape for desired pulse shapes consists of a population of waveforms (genes), each made from two concatenated vectors, specifying phases and magnitudes, respectively, over a range of frequencies. Frequency-domain operators such as mutation, two-point crossover, average crossover, polynomial phase mutation, creep and three-point smoothing, as well as a time-domain crossover, are combined to produce fitter offspring at each iteration step. The algorithm applies roulette wheel selection, elitism and linear fitness scaling to the gene population. A differential evolution (DE) operator that provides a source of directed mutation and new wavelet operators are proposed. Using properly tuned parameters for DE, the meta-algorithm is used to solve a waveform matching problem. Tuning allows either a greedy directed search near the best known solution or a robust search across the entire parameter space.
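
As a toy illustration of a few of the ingredients named above (genes made of concatenated phase and magnitude vectors, roulette-wheel selection, elitism, two-point crossover and random mutation), the sketch below evolves a population towards a simplified waveform-matching target. It is not the authors' meta-algorithm: the DE and wavelet operators, fitness scaling and the other crossover and mutation variants are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
N_FREQ = 32                                       # spectral components per gene
TARGET = np.concatenate([rng.uniform(0, 2 * np.pi, N_FREQ),   # target phases
                         rng.uniform(0, 1, N_FREQ)])          # target magnitudes

def fitness(gene):
    """Waveform-matching fitness: higher is better (smaller mismatch to TARGET)."""
    return 1.0 / (1.0 + np.sum((gene - TARGET) ** 2))

def two_point_crossover(a, b):
    """Copy a segment of parent b into a copy of parent a."""
    i, j = sorted(rng.choice(len(a), size=2, replace=False))
    child = a.copy()
    child[i:j] = b[i:j]
    return child

pop = rng.uniform(0, 1, size=(40, 2 * N_FREQ))
pop[:, :N_FREQ] *= 2 * np.pi                      # phase half of each gene

for generation in range(200):
    fit = np.array([fitness(g) for g in pop])
    elite = pop[np.argmax(fit)].copy()            # elitism: keep the best gene
    probs = fit / fit.sum()                       # roulette-wheel selection
    parents = pop[rng.choice(len(pop), size=len(pop), p=probs)]
    children = np.array([two_point_crossover(parents[k], parents[(k + 1) % len(pop)])
                         for k in range(len(pop))])
    mutate = rng.random(children.shape) < 0.1     # mutate roughly 10% of the entries
    children += mutate * rng.normal(0.0, 0.05, children.shape)
    children[0] = elite
    pop = children

best = pop[np.argmax([fitness(g) for g in pop])]
print("final mismatch:", float(np.sum((best - TARGET) ** 2)))
```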

Relevance: 90.00%

Abstract:

This paper analyses and studies a pervasive computing system, based on RFID (radio frequency identification) technology, for tracking people in a mining environment. First, we explain the RFID fundamentals and the LANDMARC (location identification based on dynamic active RFID calibration) algorithm; then we present the proposed algorithm, which combines LANDMARC with a trilateration technique to obtain the coordinates of people inside the mine; next we outline a general pervasive computing system that can be implemented in mining; and finally we present the results and conclusions.
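
As an illustration of the LANDMARC weighting step only (the combination with trilateration is not shown), the sketch below estimates a tag position as a weighted average of the k nearest reference tags in signal-strength space. The readers, reference tags and RSSI values are hypothetical.

```python
import numpy as np

def landmarc_locate(tag_rss, ref_rss, ref_coords, k=3):
    """Estimate a tag position from RSSI using the LANDMARC weighting scheme.

    tag_rss    : RSSI of the tracked tag at each reader, shape (n_readers,)
    ref_rss    : RSSI of each reference tag at each reader, shape (n_refs, n_readers)
    ref_coords : known (x, y) positions of the reference tags, shape (n_refs, 2)
    """
    # Euclidean distance in signal-strength space between tracked and reference tags.
    e = np.linalg.norm(ref_rss - tag_rss, axis=1)
    nearest = np.argsort(e)[:k]                   # k nearest reference tags
    w = 1.0 / (e[nearest] ** 2 + 1e-9)            # LANDMARC weights ~ 1 / E^2
    return (w / w.sum()) @ ref_coords[nearest]    # weighted average of coordinates

# Hypothetical gallery with 4 readers and 5 reference tags at known positions (metres).
ref_coords = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], dtype=float)
ref_rss = np.array([[-40, -60, -60, -75], [-60, -40, -75, -60],
                    [-60, -75, -40, -60], [-75, -60, -60, -40],
                    [-55, -55, -55, -55]], dtype=float)
tag_rss = np.array([-50, -58, -57, -65], dtype=float)
print(landmarc_locate(tag_rss, ref_rss, ref_coords))
```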

Relevance: 90.00%

Abstract:

Most of the operational Sea Surface Temperature (SST) products derived from satellite infrared radiometry use multi-spectral algorithms. They show, in general, reasonable performances with root mean square (RMS) residuals around 0.5 K when validated against buoy measurements, but have limitations, particularly a component of the retrieval error that relates to such algorithms' limited ability to cope with the full variability of atmospheric absorption and emission. We propose to use forecast atmospheric profiles and a radiative transfer model to simulate the algorithmic errors of multi-spectral algorithms. In the practical case of SST derived from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard Meteosat Second Generation (MSG), we demonstrate that simulated algorithmic errors do explain a significant component of the actual errors observed for the non-linear (NL) split-window algorithm in operational use at the Centre de Météorologie Spatiale (CMS). The simulated errors, used as correction terms, significantly reduce the regional biases of the NL algorithm as well as the standard deviation of the differences with drifting buoy measurements. The availability of atmospheric profiles associated with observed satellite-buoy differences allows us to analyze the origins of the main algorithmic errors observed in the SEVIRI field of view: a negative bias in the inter-tropical zone, and a mid-latitude positive bias. We demonstrate how these errors are explained by the sensitivity of observed brightness temperatures to the vertical distribution of water vapour, propagated through the SST retrieval algorithm.
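
For orientation, the sketch below shows the generic form of a non-linear (NL) split-window retrieval and how a simulated algorithmic error could be applied as a correction term. The coefficients and numbers are placeholders, not the operational CMS values, and the radiative transfer simulation itself is not shown.

```python
def nlsst(t11, t12, sst_guess, sec_theta, a=1.0, b=0.08, c=0.7, d=-270.0):
    """Non-linear (NL) split-window SST of the generic NLSST form.

    t11, t12  : 10.8 and 12.0 um brightness temperatures (K)
    sst_guess : first-guess SST (deg C)
    sec_theta : secant of the satellite zenith angle
    a..d are placeholder coefficients, not the operational CMS values.
    """
    return (a * t11 + b * sst_guess * (t11 - t12)
            + c * (sec_theta - 1.0) * (t11 - t12) + d)

t11, t12 = 290.1, 288.6              # observed brightness temperatures (K)
sst_retrieved = nlsst(t11, t12, sst_guess=19.0, sec_theta=1.2)
simulated_error = -0.3               # K, stand-in for an RTM-simulated algorithmic error
print("corrected SST (deg C):", round(sst_retrieved - simulated_error, 2))
```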

Relevance: 90.00%

Abstract:

This paper describes the techniques used to obtain sea surface temperature (SST) retrievals from the Geostationary Operational Environmental Satellite 12 (GOES-12) at the National Oceanic and Atmospheric Administration’s Office of Satellite Data Processing and Distribution. Previous SST retrieval techniques relying on channels at 11 and 12 μm are not applicable because GOES-12 lacks the latter channel. Cloud detection is performed using a Bayesian method exploiting fast-forward modeling of prior clear-sky radiances using numerical weather predictions. The basic retrieval algorithm used at nighttime is based on a linear combination of brightness temperatures at 3.9 and 11 μm. In comparison with traditional split-window SSTs (using 11- and 12-μm channels), simulations show that this combination has maximum scatter when observing drier colder scenes, with a comparable overall performance. For daytime retrieval, the same algorithm is applied after estimating and removing the contribution to brightness temperature in the 3.9-μm channel from solar irradiance. The correction is based on radiative transfer simulations and comprises a parameterization for atmospheric scattering and a calculation of ocean surface reflected radiance. Potential use of the 13-μm channel for SST is shown in a simulation study: in conjunction with the 3.9-μm channel, it can reduce the retrieval error by 30%. Some validation results are shown here, while a companion paper by Maturi et al. presents a detailed analysis of the validation results for the operational algorithms described in the present article.
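
A minimal sketch of the retrieval structure described above, with placeholder coefficients rather than the operational ones: a linear combination of the 3.9 and 11 μm brightness temperatures at night, and the same combination applied by day after removing an estimated solar contribution from the 3.9 μm channel.

```python
def goes12_night_sst(t39, t11, a0=-262.0, a1=0.4, a2=0.6):
    """Nighttime SST (deg C) from a linear combination of the 3.9 and 11 um
    brightness temperatures (K). Coefficients are placeholders for illustration."""
    return a0 + a1 * t39 + a2 * t11

def goes12_day_sst(t39, t11, solar_component):
    """Daytime retrieval: remove the estimated solar contribution to the 3.9 um
    brightness temperature, then apply the nighttime algorithm."""
    return goes12_night_sst(t39 - solar_component, t11)

print(goes12_night_sst(288.5, 290.2))
print(goes12_day_sst(291.0, 290.2, solar_component=2.5))
```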

Relevance: 90.00%

Abstract:

The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of ‘range sidelobes’ which lead to spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. By cross-correlating the echo time series from pairs of gates, therefore, we can identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
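
The flagging step can be sketched as follows, under simplifying assumptions of our own (real-valued echoes, comparison of neighbouring gates only, a fixed correlation threshold); the step that corrects the reflectivity profile using the leaked-power fractions is not shown.

```python
import numpy as np

def flag_sidelobe_contamination(echoes, threshold=0.3):
    """Correlate each gate's echo time series with the gate below it.

    echoes : echo time series per gate, shape (n_gates, n_pulses).
    Returns the correlation magnitudes and a boolean flag per gate; values above
    `threshold` suggest that power has leaked between the two gates.
    """
    n_gates = echoes.shape[0]
    corr = np.zeros(n_gates)
    for g in range(1, n_gates):
        a = echoes[g - 1] - echoes[g - 1].mean()
        b = echoes[g] - echoes[g].mean()
        corr[g] = abs(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))
    return corr, corr > threshold

# Toy example: gate 3 is partly a copy of gate 2 (simulated sidelobe leakage).
rng = np.random.default_rng(0)
echoes = rng.standard_normal((6, 512))
echoes[3] = 0.6 * echoes[2] + 0.8 * echoes[3]
corr, flags = flag_sidelobe_contamination(echoes)
print(np.round(corr, 2), flags)
```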

Relevance: 90.00%

Abstract:

In this article, we investigate how the choice of the attenuation factor in an extended version of Katz centrality influences the centrality of the nodes in evolving communication networks. For given snapshots of a network, observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners (receivers) in the network. Here we explore the attenuation factor constraint, in relation to the spectral radius (the largest eigenvalue) of the network at any point in time, and its computation in the case of large networks. We compare three different communicability measures: standard, exponential, and relaxed (where the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised, in order to maintain the convergence of the measure). Furthermore, using a vitality-based measure of both standard and relaxed communicability indices, we look at ways of establishing the most important individuals for broadcasting and receiving of messages related to community bridging roles. We compare those measures with the scores produced by an iterative version of the PageRank algorithm and illustrate our findings with three examples of real-life evolving networks: the MIT reality mining data set, consisting of daily communications between 106 individuals over the period of one year; a UK Twitter mentions network, constructed from the direct tweets between 12.4k individuals during one week; and a subset of the Enron email data set.
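
As a pointer to the attenuation-factor constraint discussed above, the sketch below computes ordinary (static) Katz centrality with the attenuation factor set safely below 1/ρ(A), the reciprocal of the spectral radius. The evolving-network communicability products, the relaxed variant and the PageRank comparison are not shown.

```python
import numpy as np

def katz_centrality(A, alpha_fraction=0.85):
    """Katz centrality with the attenuation factor set to a fraction of 1/rho(A),
    where rho(A) is the spectral radius; the factor must stay below this bound
    for the underlying series sum_k alpha^k A^k to converge."""
    n = A.shape[0]
    rho = max(abs(np.linalg.eigvals(A)))
    alpha = alpha_fraction / rho
    x = np.linalg.solve(np.eye(n) - alpha * A, np.ones(n))   # (I - alpha A)^{-1} 1
    return x / np.linalg.norm(x)

# Small undirected example network (adjacency matrix).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(katz_centrality(A))
```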

Relevance: 90.00%

Abstract:

This technique paper describes a novel method for quantitatively and routinely identifying auroral breakup following substorm onset using the Time History of Events and Macroscale Interactions During Substorms (THEMIS) all-sky imagers (ASIs). Substorm onset is characterised by a brightening of the aurora that is followed by auroral poleward expansion and auroral breakup. This breakup can be identified by a sharp increase in the auroral intensity i(t) and the time derivative of auroral intensity i'(t). Utilising both i(t) and i'(t) we have developed an algorithm for identifying the time interval and spatial location of auroral breakup during the substorm expansion phase within the field of view of ASI data based solely on quantifiable characteristics of the optical auroral emissions. We compare the time interval determined by the algorithm to independently identified auroral onset times from three previously published studies. In each case the time interval determined by the algorithm is within error of the onset independently identified by the prior studies. We further show the utility of the algorithm by comparing the breakup intervals determined using the automated algorithm to an independent list of substorm onset times. We demonstrate that up to 50% of the breakup intervals characterised by the algorithm are within the uncertainty of the times identified in the independent list. The quantitative description and routine identification of an interval of auroral brightening during the substorm expansion phase provides a foundation for unbiased statistical analysis of the aurora, and a new scientific tool for probing the physics of the auroral substorm and for aiding the identification of the processes leading to auroral substorm onset.
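
A schematic version of the detection idea, with a quiet-time baseline window and n-sigma thresholds chosen for illustration rather than the criteria actually used by the algorithm: flag times at which both i(t) and i'(t) rise well above their quiet-time levels.

```python
import numpy as np

def detect_breakup(intensity, dt=3.0, n_sigma=3.0, baseline_pts=100):
    """Flag candidate breakup times where both the auroral intensity i(t) and its
    time derivative i'(t) exceed thresholds set from a quiet-time baseline.

    The baseline window and n-sigma thresholds are illustrative assumptions,
    not the criteria of the algorithm described above.
    """
    didt = np.gradient(intensity, dt)
    quiet_i, quiet_d = intensity[:baseline_pts], didt[:baseline_pts]
    i_thresh = quiet_i.mean() + n_sigma * quiet_i.std()
    d_thresh = quiet_d.mean() + n_sigma * quiet_d.std()
    return np.where((intensity > i_thresh) & (didt > d_thresh))[0]

# Toy all-sky-imager intensity series: quiet background plus a sharp brightening.
rng = np.random.default_rng(0)
t = np.arange(0, 600, 3.0)                        # 3 s cadence, 10 minutes
intensity = 100 + 5 * rng.standard_normal(t.size)
intensity[120:] += 300 * (1 - np.exp(-(t[120:] - t[120]) / 20.0))
print("candidate breakup indices:", detect_breakup(intensity))
```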

Relevance: 90.00%

Abstract:

Mineral dust aerosols in the atmosphere have the potential to affect the global climate by influencing the radiative balance of the atmosphere and the supply of micronutrients to the ocean. Ice and marine sediment cores indicate that dust deposition from the atmosphere was at some locations 2–20 times greater during glacial periods, raising the possibility that mineral aerosols might have contributed to climate change on glacial-interglacial time scales. To address this question, we have used linked terrestrial biosphere, dust source, and atmospheric transport models to simulate the dust cycle in the atmosphere for current and last glacial maximum (LGM) climates. We obtain a 2.5-fold higher dust loading in the entire atmosphere and a twenty-fold higher loading in high latitudes, in LGM relative to present. Comparisons to a compilation of atmospheric dust deposition flux estimates for LGM and present in marine sediment and ice cores show that the simulated flux ratios are broadly in agreement with observations; differences suggest where further improvements in the simple dust model could be made. The simulated increase in high-latitude dustiness depends on the expansion of unvegetated areas, especially in the high latitudes and in central Asia, caused by a combination of increased aridity and low atmospheric [CO2]. The existence of these dust source areas at the LGM is supported by pollen data and loess distribution in the northern continents. These results point to a role for vegetation feedbacks, including climate effects and physiological effects of low [CO2], in modulating the atmospheric distribution of dust.

Relevance: 90.00%

Abstract:

We investigate the relationship between the interdiurnal variation geomagnetic activity indices, IDV and IDV(1d), the corrected sunspot number, R_C, and the group sunspot number, R_G. R_C uses corrections for both the “Waldmeier discontinuity”, as derived in Paper 1 [Lockwood et al., 2014c], and the “Wolf discontinuity” revealed by Leussu et al. [2013]. We show that the simple correlation of the geomagnetic indices with R_C^n or R_G^n masks a considerable solar cycle variation. Using IDV(1d) or IDV to predict or evaluate the sunspot numbers, the errors are almost halved by allowing for the fact that the relationship varies over the solar cycle. The results indicate that differences between R_C and R_G have a variety of causes and are highly unlikely to be attributable to errors in R_G alone, as has recently been assumed. Because it is not known if R_C or R_G is a better predictor of open flux emergence before 1874, a simple sunspot number composite is suggested which, like R_G, enables modelling of the open solar flux for 1610 onwards in Paper 3, but maintains the characteristics of R_C.