849 results for RM extended algorithm


Relevance: 20.00%

Abstract:

In this paper we propose an efficient two-level model identification method for a large class of linear-in-the-parameters models from observational data. A new elastic net orthogonal forward regression (ENOFR) algorithm is employed at the lower level to carry out simultaneous model selection and elastic net parameter estimation. The two regularization parameters in the elastic net are optimized at the upper level by a particle swarm optimization (PSO) algorithm that minimizes the leave-one-out (LOO) mean square error (LOOMSE). Illustrative examples are included to demonstrate the effectiveness of the new approach.
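The two-level structure lends itself to a compact sketch. The following is a minimal illustration, not the authors' ENOFR implementation: scikit-learn's ElasticNet stands in for the lower-level estimator (its alpha/l1_ratio pair playing the role of the two regularization parameters), and a basic PSO searches those parameters by minimizing the LOO mean square error; all swarm settings are illustrative assumptions.

```python
# Upper level: PSO over (log-alpha, l1_ratio); lower level: elastic net fit,
# scored by leave-one-out mean square error (LOOMSE).
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import LeaveOneOut, cross_val_score

def loomse(params, X, y):
    """LOOMSE for one particle position (log-alpha, l1_ratio)."""
    alpha, l1_ratio = np.exp(params[0]), np.clip(params[1], 0.0, 1.0)
    model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, max_iter=5000)
    scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    return -scores.mean()

def pso_elastic_net(X, y, n_particles=10, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform([-6.0, 0.0], [1.0, 1.0], size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([loomse(p, X, y) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        f = np.array([loomse(p, X, y) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return np.exp(gbest[0]), np.clip(gbest[1], 0.0, 1.0)  # (alpha, l1_ratio)
```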

Relevance: 20.00%

Abstract:

Pardo, Patie, and Savov derived, under mild conditions, a Wiener-Hopf type factorization for the exponential functional of proper Lévy processes. In this paper, we extend this factorization by relaxing a finite moment assumption and by considering the exponential functional of killed Lévy processes. As a by-product, we derive some interesting fine distributional properties enjoyed by this random variable for a large class of Lévy processes, such as the absolute continuity of its distribution and the smoothness, boundedness or complete monotonicity of its density. This type of result is then used to derive similar properties for the laws of the maxima and first passage times of some stable Lévy processes. Thus, for example, we show that for any stable process with $\rho\in(0,\frac{1}{\alpha}-1]$, where $\rho\in[0,1]$ is the positivity parameter and $\alpha$ is the stable index, the first passage time has a bounded and non-increasing density on $\mathbb{R}_+$. We also generate many instances of integral or power series representations for the law of the exponential functional of Lévy processes with one- or two-sided jumps. The proof of our main results requires devices different from those developed by Pardo, Patie, and Savov. It relies in particular on a generalization of a transform recently introduced by Chazal et al., together with some extensions of Wiener-Hopf techniques to killed Lévy processes. The factorizations developed here also allow for further applications, which we only indicate here.
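For reference, since the abstract presupposes it: the exponential functional of a Lévy process $\xi=(\xi_t)_{t\ge 0}$ with lifetime $\zeta$ (with $\zeta=\infty$ for a proper, i.e. unkilled, process) is standardly defined as

```latex
I_{\xi} \;=\; \int_0^{\zeta} e^{-\xi_t}\,\mathrm{d}t .
```

The distributional properties discussed above (absolute continuity, smoothness, boundedness, complete monotonicity of the density) all refer to the law of this random variable.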

Relevance: 20.00%

Abstract:

Advances in hardware and software over the past decade have made it possible to capture, record and process fast data streams at large scale. The research area of data stream mining has emerged as a consequence of these advances, in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time as soon as it is captured, for example if the data stream is infinite, fast changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drift. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision-tree-based classifiers in that it tends to leave data instances unclassified rather than force a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
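The adaptive loop described above can be sketched schematically. The following is not the authors' eRules code but a minimal illustration of the stated behaviour: abstain when no rule covers an instance, prune rules whose accuracy drifts below a threshold, and induce new rules from a buffer of unclassified instances. The class names, thresholds and the pluggable batch rule inducer are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    conditions: dict      # feature -> (low, high) interval
    label: str
    correct: int = 0
    seen: int = 0

    def covers(self, x):
        return all(lo <= x[f] <= hi for f, (lo, hi) in self.conditions.items())

    def accuracy(self):
        return self.correct / self.seen if self.seen else 1.0

class ERulesLike:
    def __init__(self, induce_rules, min_accuracy=0.7, buffer_size=50):
        self.rules = []
        self.induce_rules = induce_rules   # batch rule inducer (e.g. PRISM-style)
        self.min_accuracy = min_accuracy
        self.buffer_size = buffer_size
        self.unclassified = []

    def predict(self, x):
        for rule in self.rules:
            if rule.covers(x):
                return rule.label
        return None                        # abstain rather than guess

    def update(self, x, y):
        covered = False
        for rule in self.rules:
            if rule.covers(x):
                covered = True
                rule.seen += 1
                rule.correct += (rule.label == y)
        # prune rules whose accuracy has drifted below the threshold
        self.rules = [r for r in self.rules
                      if r.seen < 10 or r.accuracy() >= self.min_accuracy]
        if not covered:
            self.unclassified.append((x, y))
            if len(self.unclassified) >= self.buffer_size:
                self.rules.extend(self.induce_rules(self.unclassified))
                self.unclassified.clear()
```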

Relevance: 20.00%

Abstract:

This contribution introduces a new digital predistorter to compensate for the serious distortions caused by high power amplifiers (HPAs) with memory which exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modeling approach. The nonlinear HPA with memory is first identified based on the B-spline neural network model using the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both the B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design a Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inverse can be achieved very efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
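The amplitude-inversion step can be sketched briefly. The following is a minimal illustration rather than the authors' implementation: SciPy's BSpline (which evaluates via De Boor's recursion) stands in for the hand-coded curve and first-derivative recursions, and Newton-Raphson solves Psi(r) = target for the predistorter's inverse amplitude mapping. The knot vector and coefficients are an illustrative saturating AM/AM shape, not an identified HPA model.

```python
import numpy as np
from scipy.interpolate import BSpline

degree = 3
knots = np.concatenate(([0.0] * degree, np.linspace(0.0, 1.2, 8), [1.2] * degree))
coeffs = np.array([0.0, 0.15, 0.35, 0.55, 0.75, 0.9, 0.98, 1.0, 1.0, 1.0])
psi = BSpline(knots, coeffs, degree)    # static amplitude nonlinearity Psi(r)
dpsi = psi.derivative()                 # first-derivative spline

def invert_amplitude(target, r0=0.5, iters=20, tol=1e-10):
    """Solve Psi(r) = target for r by Newton-Raphson."""
    r = r0
    for _ in range(iters):
        step = (psi(r) - target) / dpsi(r)
        r -= step
        if abs(step) < tol:
            break
    return float(r)

# The predistorter maps each desired output amplitude back through Psi^{-1}:
r_pre = invert_amplitude(0.8)
print(r_pre, float(psi(r_pre)))   # psi(r_pre) ~= 0.8
```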

Relevance: 20.00%

Abstract:

The novel bis(azidophenyl)phosphole sulfide building block 8 has been developed to give access to a plethora of phosphole-containing π-conjugated systems in a simple synthetic step. This was explored for the reaction of the two azido moieties with phenyl-, pyridyl- and thienylacetylenes, to give bis(aryltriazolyl)-extended π-systems having either the phosphole sulfide (9) or the phosphole (10) group as the central ring. These conjugated frameworks exhibit intriguing photophysical and electrochemical properties that vary with the nature of the aromatic end-group. The λ3-phospholes 10 display blue fluorescence (λem = 460–469 nm) with high quantum yield (ΦF = 0.134–0.309). The radical anion of the pyridyl-substituted phosphole sulfide 9b was observed by UV/Vis spectroscopy. TDDFT calculations on the extended π-systems showed some variation in the shape of the HOMOs, which was found to have an effect on the extent of charge transfer, depending on the aromatic end-group. Some fine-tuning of the emission maxima was observed, albeit subtle, showing a decrease in conjugation in the order thienyl > phenyl > pyridyl. These results show that variations at the distal ends of such π-systems have a subtle but significant effect on the photophysical properties.

Relevance: 20.00%

Abstract:

An in vitro colon extended physiologically based extraction test (CE-PBET), which incorporates human gastrointestinal tract (GIT) parameters (including pH and chemistry, solid-to-fluid ratio, and mixing and emptying rates), was applied for the first time to study the bioaccessibility of brominated flame retardants (BFRs) from the 3 main GIT compartments (stomach, small intestine and colon) following ingestion of indoor dust. Results revealed that the bioaccessibility of γ-HBCD (72%) was less than that of the α- and β-isomers (92% and 80% respectively), which may be attributed to the lower aqueous solubility of the γ-isomer (2 μg L−1) compared to the α- and β-isomers (45 and 15 μg L−1 respectively). No significant change in the enantiomeric fractions of HBCDs was observed in any of the studied samples. However, this does not completely exclude the possibility of in vivo enantioselective absorption of HBCDs, as the GIT cell lining and bacterial flora, which may act enantioselectively, are not included in the current CE-PBET model. While TBBP-A was almost completely (94%) bioaccessible, BDE-209 was the least (14%) bioaccessible of the studied BFRs. The bioaccessibility of tri- to hepta-BDEs ranged from 32 to 58%. No decrease in bioaccessibility with increasing level of bromination was observed among the studied PBDEs.

Relevance: 20.00%

Abstract:

Evolutionary meta-algorithms for pulse shaping of broadband femtosecond-duration laser pulses are proposed. The genetic algorithm searching the evolutionary landscape for desired pulse shapes operates on a population of waveforms (genes), each made from two concatenated vectors specifying phases and magnitudes, respectively, over a range of frequencies. Frequency-domain operators such as mutation, two-point crossover, average crossover, polynomial phase mutation, creep and three-point smoothing, as well as a time-domain crossover, are combined to produce fitter offspring at each iteration step. The algorithm applies roulette wheel selection, elitism and linear fitness scaling to the gene population. A differential evolution (DE) operator that provides a source of directed mutation, as well as new wavelet operators, are proposed. Using properly tuned parameters for DE, the meta-algorithm is used to solve a waveform matching problem. Tuning allows either a greedy directed search near the best known solution or a robust search across the entire parameter space.
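Among the operators listed, the DE operator is the easiest to make concrete. Below is a minimal sketch, not the authors' code, of a DE/rand/1/bin-style directed mutation applied to genes that concatenate a phase vector and a magnitude vector over N frequency bins; F, CR and the gene layout are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                                  # frequency bins
pop_size = 20
# gene = [phi_1..phi_N, m_1..m_N]: concatenated phases and magnitudes
population = np.concatenate(
    [rng.uniform(-np.pi, np.pi, (pop_size, N)),   # phases
     rng.uniform(0.0, 1.0, (pop_size, N))],       # magnitudes
    axis=1)

def de_trial(population, i, F=0.8, CR=0.9):
    """Build a DE/rand/1/bin trial gene for population member i."""
    pop_size, dim = population.shape
    a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
    mutant = population[a] + F * (population[b] - population[c])  # directed mutation
    cross = rng.random(dim) < CR
    cross[rng.integers(dim)] = True     # guarantee at least one mutated component
    trial = np.where(cross, mutant, population[i])
    trial[:N] = (trial[:N] + np.pi) % (2 * np.pi) - np.pi  # wrap phases
    trial[N:] = np.clip(trial[N:], 0.0, 1.0)               # bound magnitudes
    return trial

trial = de_trial(population, 0)  # replaces member 0 only if fitter (selection not shown)
```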

Relevance: 20.00%

Abstract:

At the end of the 20th century, we can look back on a spectacular development of numerical weather prediction, which has continued practically uninterrupted since the middle of the century. High-resolution predictions for more than a week ahead for any part of the globe are now routinely produced, and anyone with an Internet connection can access many of these forecasts for anywhere in the world. Extended predictions for several seasons ahead are also being made; the successful prediction of the latest El Niño event in 1997/1998 is an example. This great achievement is due to a number of factors, including progress in computational technology and the establishment of global observing systems, combined with a systematic research program with an overall strategy towards building comprehensive prediction systems for climate and weather. In this article, I discuss the different evolutionary steps in this development and the way new scientific ideas have contributed to the efficient exploitation of computing power and the use of observations from new types of observing systems. Weather prediction is not an exact science, owing to unavoidable errors in initial data and in the models. Quantifying the reliability of a forecast is therefore essential, probably more so the longer the forecast range. Ensemble prediction is thus a new and important concept in weather and climate prediction, which I believe will become a routine aspect of weather prediction in the future. The boundary between weather and climate prediction is becoming more and more diffuse, and in the final part of this article I outline how I think development may proceed in the future.

Relevance: 20.00%

Abstract:

In cooperative communication networks, owing to the nodes' arbitrary geographical locations and individual oscillators, the system is fundamentally asynchronous. This damages some of the key properties of space-time codes and can lead to substantial performance degradation. In this paper, we study the design of linear dispersion codes (LDCs) for such asynchronous cooperative communication networks. First, the concept of conventional LDCs is extended to a delay-tolerant version and new design criteria are discussed. We then propose a new design method that yields delay-tolerant LDCs reaching the optimal Jensen's upper bound on ergodic capacity as well as the minimum average pairwise error probability. The proposed design employs a stochastic gradient algorithm to approach a local optimum. Moreover, it is improved by using simulated-annealing-type optimization to increase the likelihood of reaching the global optimum. The proposed method allows for a flexible number of nodes, receive antennas and modulated symbols, and a flexible codeword length. Simulation results confirm the performance of the newly proposed delay-tolerant LDCs.
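The hybrid search strategy can be sketched generically. The following illustrates the stated idea rather than the authors' design procedure: stochastic gradient steps drive the code parameters toward a local optimum, while a Metropolis acceptance rule with a cooling temperature occasionally keeps uphill moves to improve the odds of escaping it. The objective J (for example, the negated Jensen capacity bound combined with the pairwise error criterion) is left abstract, and theta is assumed to parameterize the dispersion matrices as a real vector.

```python
import numpy as np

rng = np.random.default_rng(0)

def sa_sgd(J, grad_J, theta0, steps=2000, lr=1e-2,
           T0=1.0, cooling=0.995, noise=0.05):
    """Stochastic gradient descent with simulated-annealing acceptance."""
    theta = best = theta0.copy()
    f = f_best = J(theta)
    T = T0
    for _ in range(steps):
        # gradient step plus an annealing perturbation scaled by temperature
        candidate = (theta - lr * grad_J(theta)
                     + noise * T * rng.standard_normal(theta.shape))
        f_cand = J(candidate)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if f_cand < f or rng.random() < np.exp(-(f_cand - f) / max(T, 1e-12)):
            theta, f = candidate, f_cand
            if f < f_best:
                best, f_best = theta.copy(), f
        T *= cooling                     # cool down toward a greedy search
    return best, f_best
```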

Relevance: 20.00%

Abstract:

The extended Canadian Middle Atmosphere Model is used to investigate the large-scale dynamics of the mesosphere and lower thermosphere (MLT). It is shown that the 4-day wave is substantially amplified in southern polar winter in the presence of instabilities arising from strong vertical shears in the MLT zonal mean zonal winds brought about by parameterized nonorographic gravity wave drag. A weaker 4-day wave in northern polar winter is attributed to the weaker wind shears that result from weaker parameterized wave drag. The 2-day wave also exhibits a strong dependence on zonal wind shears, in agreement with previous modeling studies. In the equatorial upper mesosphere, the migrating diurnal tide provides most of the resolved westward wave forcing, which varies semiannually in conjunction with the tide itself; resolved forcing by eastward traveling disturbances is dominated by smaller scales. Nonmigrating tides and other planetary-scale waves play only a minor role in the zonal mean zonal momentum budget in the tropics at these heights. Resolved waves are shown to play a significant role in the zonal mean meridional momentum budget in the MLT, impacting significantly on gradient wind balance. Balance fails at low latitudes as a result of a strong Reynolds stress associated with the migrating diurnal tide, an effect which is most pronounced at equinox when the tide is strongest. Resolved and parameterized waves account for most of the imbalance at higher latitudes in summer. This results in the gradient wind underestimating the actual eastward wind reversal by up to 40%.

Relevance: 20.00%

Abstract:

The recovery of the Arctic polar vortex following stratospheric sudden warmings is found to take upward of 3 months in a particular subset of cases, termed here polar-night jet oscillation (PJO) events. The anomalous zonal-mean circulation above the pole during this recovery is characterized by a persistently warm lower stratosphere and, above this, a cold midstratosphere and an anomalously high stratopause, which descends as the event unfolds. Composites of these events in the Canadian Middle Atmosphere Model show that the persistence of the lower-stratospheric anomaly is a result of strongly suppressed wave driving and weak radiative cooling at these heights. The upper-stratospheric and lower-mesospheric anomalies are driven immediately following the warming by anomalous planetary-scale eddies, after which anomalous parameterized nonorographic and orographic gravity waves play an important role. These features are found to be robust for PJO events (as opposed to sudden warmings in general) in that many details of individual PJO events match the composite mean. A zonal-mean quasigeostrophic model on the sphere is shown to reproduce the response to the thermal and mechanical forcings produced during a PJO event. The former is well approximated by Newtonian cooling. The response can thus be considered as a transient approach to the steady-state, downward control limit. In this context, the time scale of the lower-stratospheric anomaly is determined by the transient radiative response to the extended absence of wave driving. The extent to which the dynamics of the wave-driven descent of the stratopause can be considered analogous to the descending phases of the quasi-biennial oscillation (QBO) is also discussed.

Relevance: 20.00%

Abstract:

This paper analyses and studies a pervasive computing system for tracking people in a mining environment based on RFID (radio frequency identification) technology. We first explain RFID fundamentals and the LANDMARC (location identification based on dynamic active RFID calibration) algorithm. We then present the proposed algorithm, which combines LANDMARC with a trilateration technique to obtain the coordinates of people inside the mine. Next, we generalize a pervasive computing system that can be implemented in mining, and finally we present the results and conclusions.
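The trilateration step can be made concrete with a short sketch, which is illustrative and not the paper's exact formulation: given range estimates from a tag to n >= 3 anchors at known positions (for example, readers or LANDMARC reference tags), subtracting the first range equation from the others linearizes the system, which least squares then solves. The reader layout and range values below are hypothetical.

```python
import numpy as np

def trilaterate(positions, distances):
    """positions: (n, 2) known anchor coordinates; distances: (n,) ranges."""
    p = np.asarray(positions, float)
    d = np.asarray(distances, float)
    # ||x - p_i||^2 - ||x - p_0||^2 = d_i^2 - d_0^2  is linear in x
    A = 2.0 * (p[1:] - p[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

readers = [(0.0, 0.0), (30.0, 0.0), (0.0, 20.0), (30.0, 20.0)]
ranges = [18.0, 19.5, 15.2, 17.0]    # e.g. from an RSSI-to-distance model
print(trilaterate(readers, ranges))  # estimated (x, y) inside the mine gallery
```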

Relevance: 20.00%

Abstract:

This paper describes the energetics and zonal-mean state of the upward extension of the Canadian Middle Atmosphere Model, which reaches from the ground to ~210 km. The model includes realistic parameterizations of the major physical processes from the ground up to the lower thermosphere and exhibits a broad spectrum of geophysical variability. The rationale for the extended model is to examine the nature of the physical and dynamical processes in the mesosphere/lower thermosphere (MLT) region without the artificial effects of an imposed sponge layer, which can modify the circulation in an unrealistic manner. The zonal-mean distributions of temperature and zonal wind are found to be in reasonable agreement with observations in most parts of the model domain below ~150 km. Analysis of the global-average energy and momentum budgets reveals a balance between solar extreme ultraviolet heating and molecular diffusion, and a thermally direct viscous meridional circulation above 130 km, with the viscosity coming from molecular diffusion and ion drag. Below 70 km, radiative equilibrium prevails in the global mean. In the MLT region between ~70 and 120 km, many processes contribute to the global energy budget. At solstice, there is a thermally indirect meridional circulation driven mainly by parameterized nonorographic gravity-wave drag. This circulation provides a net global cooling of up to 25 K d⁻¹.

Relevance: 20.00%

Abstract:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms, in particular in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature of the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
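To make the bottleneck concrete, here is a sketch of one iteration of the straightforward parallel formulation mentioned above, in which every iteration ends with a global reduction over centroid sums and counts; this Allreduce is exactly the step the proposed non-uniform formulation removes. The sketch assumes mpi4py and an illustrative uniform random partition per rank.

```python
import numpy as np
from mpi4py import MPI

def kmeans_step(local_data, centroids, comm):
    k, dim = centroids.shape
    # local assignment: nearest centroid for each local point
    dists = np.linalg.norm(local_data[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # local partial sums and counts
    sums = np.zeros((k, dim))
    counts = np.zeros(k)
    for j in range(k):
        mask = labels == j
        sums[j] = local_data[mask].sum(axis=0)
        counts[j] = mask.sum()
    # the global reduction that limits scalability at extreme scale
    global_sums = np.empty_like(sums)
    global_counts = np.empty_like(counts)
    comm.Allreduce(sums, global_sums, op=MPI.SUM)
    comm.Allreduce(counts, global_counts, op=MPI.SUM)
    # empty clusters keep their previous centroid
    return np.where(global_counts[:, None] > 0,
                    global_sums / np.maximum(global_counts, 1)[:, None],
                    centroids)

comm = MPI.COMM_WORLD
rng = np.random.default_rng(comm.Get_rank())
local_data = rng.random((1000, 2))                    # this rank's partition
centroids = np.array([[0.25, 0.25], [0.75, 0.75], [0.25, 0.75]])
for _ in range(10):
    centroids = kmeans_step(local_data, centroids, comm)
```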

Relevance: 20.00%

Abstract:

For Northern Hemisphere extra-tropical cyclone activity, the dependency of a potential anthropogenic climate change signal on the identification method applied is analysed. This study investigates the impact of the algorithm used on the change signal, not the robustness of the climate change signal itself. Using a single transient AOGCM simulation as standard input for eleven state-of-the-art identification methods, the patterns of the model-simulated present-day climatologies are found to be close to those computed from re-analysis, independent of the method applied. Although differences exist in the total number of cyclones identified, the climate change signals (IPCC SRES A1B) in the model run considered are largely similar across methods for all cyclones. Taking into account all tracks, decreasing numbers are found in the Mediterranean, in the Arctic over the Barents and Greenland Seas, in the mid-latitude Pacific and over North America. The patterns of change are even more similar if only the most severe systems are considered: the methods reveal a coherent, statistically significant increase in frequency over the eastern North Atlantic and North Pacific. We find that the differences between the methods considered are largely due to the different role of weaker systems in the specific methods.