819 results for relay filtering
Abstract:
We study a two-way relay network (TWRN), where distributed space-time codes are constructed across multiple relay terminals in an amplify-and-forward mode. Each relay transmits a scaled linear combination of its received symbols and their conjugates, with the scaling factor chosen based on automatic gain control. We consider equal power allocation (EPA) across the relays, as well as the optimal power allocation (OPA) strategy given access to instantaneous channel state information (CSI). For EPA, we derive an upper bound on the pairwise error probability (PEP), from which we prove that full diversity is achieved in TWRNs. This result is in contrast to one-way relay networks, where only a maximum diversity order of unity can be obtained. When instantaneous CSI is available at the relays, we show that the OPA which minimizes the conditional PEP of the worse link can be cast as a generalized linear fractional program, which can be solved efficiently using a Dinkelbach-type procedure. We also prove that, if the sum-power of the relay terminals is constrained, then the OPA will activate at most two relays.
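The Dinkelbach procedure referenced above has a simple generic form: maximizing a ratio f(x)/g(x) reduces to a sequence of parametric subproblems. A minimal sketch on a toy ratio and a finite candidate grid (the objective here is an illustrative stand-in, not the paper's PEP expression):

```python
import numpy as np

def dinkelbach(f, g, candidates, tol=1e-9, max_iter=100):
    """Maximize f(x)/g(x) (with g > 0) over a finite candidate set.
    Each step solves the parametric subproblem max_x f(x) - lam*g(x),
    then updates lam <- f(x*)/g(x*); stop when the subproblem value ~ 0."""
    lam = 0.0
    x_star = candidates[0]
    for _ in range(max_iter):
        vals = f(candidates) - lam * g(candidates)
        x_star = candidates[np.argmax(vals)]
        if vals.max() < tol:
            break                      # lam now equals the optimal ratio
        lam = f(x_star) / g(x_star)
    return x_star, lam

# toy ratio: maximize (x + 1) / (x^2 + 1) on a grid of candidates
xs = np.linspace(-3.0, 3.0, 60001)
x_opt, ratio = dinkelbach(lambda x: x + 1.0, lambda x: x ** 2 + 1.0, xs)
```

Because each subproblem is linear in the numerator and denominator values, the sequence of `lam` values increases monotonically to the optimal ratio, which is what makes the procedure efficient for generalized linear fractional programs.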
Abstract:
Researchers in the field of personalized recommendation currently give little consideration to differences in users' interests across resource attributes, although resource attributes are usually among the most important factors in determining user preferences. To address this problem, this paper builds an evaluation model of user interest based on multiple resource attributes, proposes a modified Pearson-Compatibility multi-attribute group decision-making algorithm, and introduces an algorithm to solve the recommendation problem among the k most similar neighbor users. Considering the characteristics of collaborative filtering recommendation, the paper addresses the issues of preference differences among similar users, incomplete values, and premature convergence of the algorithm, thereby realizing multi-attribute collaborative filtering. Finally, the effectiveness of the algorithm is demonstrated by an experiment on collaborative recommendation among multiple users in a virtual environment. The experimental results show that the algorithm predicts target users' attribute preferences with high accuracy and is robust to deviating and incomplete values.
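The backbone of such an approach is neighbor-based prediction of missing attribute scores, weighted by a Pearson-type similarity computed only over the attributes both users have rated. A minimal sketch, with plain Pearson correlation standing in for the paper's modified Pearson-Compatibility measure and `None` marking incomplete values:

```python
import math

def pearson(u, v):
    """Pearson correlation over the attributes both users rated (None = missing)."""
    common = [i for i in range(len(u)) if u[i] is not None and v[i] is not None]
    if len(common) < 2:
        return 0.0
    mu = sum(u[i] for i in common) / len(common)
    mv = sum(v[i] for i in common) / len(common)
    num = sum((u[i] - mu) * (v[i] - mv) for i in common)
    den = math.sqrt(sum((u[i] - mu) ** 2 for i in common) *
                    sum((v[i] - mv) ** 2 for i in common))
    return num / den if den else 0.0

def predict_attribute(target, others, attr, k=2):
    """Predict the target's missing score for `attr` from the k most
    Pearson-similar users who did rate that attribute."""
    scored = [(pearson(target, o), o[attr]) for o in others if o[attr] is not None]
    top = sorted(scored, key=lambda t: -t[0])[:k]
    wsum = sum(abs(s) for s, _ in top)
    return sum(s * r for s, r in top) / wsum if wsum else None

# example: four attributes, target has not scored attribute 2
target = [5, 4, None, 3]
others = [[5, 4, 4, 3], [1, 2, 1, 3], [4, 3, 5, 2]]
pred = predict_attribute(target, others, attr=2)
```

Restricting each correlation to co-rated attributes is what lets the scheme tolerate incomplete rating vectors without imputation.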
Abstract:
Climatic and land use changes have significant consequences for the distribution of tree species, both through natural dispersal processes and following management prescriptions. Responses to these changes will be expressed most strongly in seedlings near current species range boundaries. In northern temperate forest ecosystems, where changes are already being observed, ectomycorrhizal fungi contribute significantly to successful tree establishment. We hypothesised that communities of fungal symbionts might therefore play a role in facilitating, or limiting, host seedling range expansion. To test this hypothesis, ectomycorrhizal communities of interior Douglas-fir and interior lodgepole pine seedlings were analysed in a common greenhouse environment following growth in five soils collected along an ecosystem gradient. Currently, Douglas-fir’s natural distribution encompasses three of the five soils, whereas lodgepole pine’s extends much further north. Host filtering was evident amongst the 29 fungal species encountered: 7 were shared, 9 exclusive to Douglas-fir and 13 exclusive to lodgepole pine. Seedlings of both host species formed symbioses with each soil fungal community, thus Douglas-fir did so even where those soils came from outside its current distribution. However, these latter communities displayed significant taxonomic and functional differences to those found within the host distribution, indicative of habitat filtering. In contrast, lodgepole pine fungal communities displayed high functional similarity across the soil gradient. Taxonomic and/or functional shifts in Douglas-fir fungal communities may prove ecologically significant during the predicted northward migration of this species; especially in combination with changes in climate and management operations, such as seed transfer across geographical regions for forestry purposes.
Abstract:
Nonlinear data assimilation is high on the agenda in all fields of the geosciences: with ever-increasing model resolution, the inclusion of more physical (biological, etc.) processes, and more complex observation operators, the data-assimilation problem becomes more and more nonlinear. The suitability of particle filters to solve the nonlinear data-assimilation problem in high-dimensional geophysical problems will be discussed. Several existing and new schemes will be presented, and it is shown that at least one of them, the Equivalent-Weights Particle Filter, does indeed beat the curse of dimensionality and provides a way forward to solve the problem of nonlinear data assimilation in high-dimensional systems.
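The baseline against which such schemes are judged is the bootstrap particle filter: propagate an ensemble through the nonlinear model, weight by the observation likelihood, and resample. A minimal sketch on an assumed 1-D toy model (the Equivalent-Weights proposal step itself is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(ys, n_particles=500, q=1.0, r=1.0):
    """Bootstrap particle filter for the toy model
    x_t = 0.5*x_{t-1} + sin(x_{t-1}) + N(0, q),  y_t = x_t + N(0, r):
    propagate, weight by the Gaussian likelihood, then resample."""
    x = rng.normal(0.0, 1.0, n_particles)            # initial particle cloud
    means = []
    for y in ys:
        x = 0.5 * x + np.sin(x) + rng.normal(0.0, np.sqrt(q), n_particles)
        logw = -0.5 * (y - x) ** 2 / r               # Gaussian log-likelihood
        w = np.exp(logw - logw.max())                # stabilized weights
        w /= w.sum()
        means.append(np.sum(w * x))                  # weighted posterior mean
        x = rng.choice(x, size=n_particles, p=w)     # multinomial resampling
    return np.array(means)

# synthetic truth and noisy observations from the same model
T = 50
truth = np.zeros(T)
for t in range(1, T):
    truth[t] = 0.5 * truth[t - 1] + np.sin(truth[t - 1]) + rng.normal(0.0, 1.0)
ys = truth + rng.normal(0.0, 1.0, T)
est = bootstrap_pf(ys)
```

In one dimension this works well; the curse of dimensionality discussed above is precisely that the weights degenerate as the state and observation dimensions grow, which is what equivalent-weights constructions are designed to avoid.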
Abstract:
Only a small fraction of the spectra acquired in LC-MS/MS runs matches peptides from target proteins upon database searching. The remaining spectra, operationally termed background spectra, originate from a variety of poorly controlled sources and affect the throughput and confidence of database searches. Here, we report an algorithm, and its software implementation, that rapidly removes background spectra regardless of their precise origin. The method estimates the dissimilarity distance between screened MS/MS spectra and unannotated spectra from a partially redundant background library compiled from several control and blank runs. Filtering MS/MS queries enhanced the protein identification capacity when searches lacked spectrum-to-sequence matching specificity. In sequence-similarity searches it reduced by, on average, 30-fold the number of orphan hits, which were not explicitly related to background protein contaminants and required manual validation. Removing high-quality background MS/MS spectra, while preserving in the data set the genuine spectra from target proteins, decreased the false-positive rate of stringent database searches and improved the identification of low-abundance proteins.
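The core operation is a dissimilarity screen of query spectra against the background library. A minimal sketch, assuming spectra are binned into `{m/z bin: intensity}` dictionaries and using cosine distance as a stand-in for the paper's exact dissimilarity measure:

```python
import math

def cosine_dist(a, b):
    """1 - cosine similarity between spectra given as {mz_bin: intensity} dicts."""
    dot = sum(inten * b.get(mz, 0.0) for mz, inten in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return 1.0 - dot / (na * nb) if na and nb else 1.0

def filter_background(queries, library, threshold=0.3):
    """Keep only query spectra whose distance to every library spectrum
    exceeds the threshold, i.e. drop probable background spectra."""
    return [q for q in queries
            if all(cosine_dist(q, b) > threshold for b in library)]

library = [{100: 1.0, 200: 1.0}]                 # background spectrum
queries = [{100: 1.0, 200: 1.1},                 # near-duplicate of background
           {300: 1.0, 400: 2.0}]                 # unrelated, genuine spectrum
kept = filter_background(queries, library)
```

Because the library need only be partially redundant, the screen removes background spectra without requiring them to be annotated or assigned to any particular contaminant.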
Abstract:
The name Claude Elwood Shannon is not entirely unfamiliar to Communication researchers. However, part of his importance to the history of communication in the twentieth century remains little known. His master's thesis and the article derived from it (A Symbolic Analysis of Relay and Switching Circuits) were essential for the computer to become a communication machine and, consequently, to penetrate our society in the way it does today. This article reviews Shannon's first major work and clarifies its role in the current context of communication.
Abstract:
The number of research papers available today is growing at a staggering rate, generating a huge amount of information that people cannot keep up with. According to a tendency indicated by the United States' National Science Foundation, more than 10 million new papers will be published in the next 20 years. Because most of these papers will be available on the Web, this research focuses on exploring issues in recommending research papers to users, in order to lead users directly to papers of their interest. Recommender systems are used to recommend items to users from among a huge stream of available items, according to users' interests. This research focuses on the two most prevalent techniques to date, namely Content-Based Filtering and Collaborative Filtering. The first explores the text of the paper itself, recommending items similar in content to the ones the user has rated in the past. The second explores the citation web existing among papers. As these two techniques have complementary advantages, we explored hybrid approaches to recommending research papers. We created standalone and hybrid versions of the algorithms and evaluated them through both offline experiments on a database of 102,295 papers and an online experiment with 110 users. Our results show that the two techniques can be successfully combined to recommend papers, and that coverage rises to 100% with the hybrid algorithms. In addition, we found that different algorithms are more suitable for recommending different kinds of papers. Finally, we verified that users' research experience influences the way they perceive recommendations. We also found no significant differences in recommending papers to users from different countries. However, our results showed that users interacting with a research-paper recommender system are much happier when the interface is presented in their native language, regardless of the language in which the papers are written. Therefore, the interface should be tailored to the user's mother tongue.
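The hybrid idea can be sketched as a weighted blend of the two signals: a content score from term overlap with the user profile, and a citation-web score from co-citation with papers the user liked. All names and the Jaccard/co-citation scoring below are illustrative simplifications, not the study's actual algorithms:

```python
def content_score(paper_terms, profile_terms):
    """Content-based part: Jaccard overlap of index terms."""
    union = paper_terms | profile_terms
    return len(paper_terms & profile_terms) / len(union) if union else 0.0

def citation_score(paper, liked, citations):
    """Collaborative part: fraction of liked papers sharing a citation."""
    if not liked:
        return 0.0
    shared = sum(1 for p in liked if citations[p] & citations[paper])
    return shared / len(liked)

def hybrid_rank(candidates, profile_terms, liked, terms, citations, alpha=0.5):
    """Blend the two scores; alpha weights the content-based part."""
    score = {c: alpha * content_score(terms[c], profile_terms)
                + (1 - alpha) * citation_score(c, liked, citations)
             for c in candidates}
    return sorted(candidates, key=lambda c: score[c], reverse=True)

terms = {"A": {"ir", "cf"}, "B": {"db"}, "L": {"ir"}}
citations = {"A": {"X"}, "B": set(), "L": {"X"}}
ranking = hybrid_rank(["B", "A"], profile_terms={"ir"}, liked=["L"],
                      terms=terms, citations=citations)
```

The blend illustrates why hybrids raise coverage: a paper with no citation links can still be ranked by its content score, and vice versa.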
Abstract:
In this article we use factor models to describe a certain class of covariance structure for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions to the models presented here in the prediction area. This discussion leads to open issues on real-time implementation and natural model comparisons.
Abstract:
The past decade has witnessed a series of (well accepted and defined) financial crisis periods in the world economy. Most of these events are country-specific and eventually spread across neighboring countries, with the concept of vicinity extrapolating from geographic maps to contagion maps. Unfortunately, what contagion represents and how to measure it are still unanswered questions. In this article we measure the transmission of shocks by cross-market correlation coefficients, following Forbes and Rigobon's (2000) notion of shift-contagion. Our main contribution relies upon the use of traditional factor model techniques combined with stochastic volatility models to study the dependence among Latin American stock price indexes and the North American index. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. From a theoretical perspective, we improve currently available methodology by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in the series' weights over time. By doing this, we believe that changes and interventions experienced by those five countries are well accommodated by our models, which learn and adapt reasonably fast to those economic and idiosyncratic shocks. We empirically show that the time-varying covariance structure can be modeled by one or two common factors and that some sort of contagion is present in most of the series' covariances during periods of economic instability, or crisis. Open issues on real-time implementation and natural model comparisons are thoroughly discussed.
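The mechanism behind the shift-contagion measurement can be illustrated by simulating a one-factor model whose factor variance follows a log-AR(1) stochastic-volatility process: cross-market correlations rise mechanically when common-factor volatility is high. The parameters and the single-factor setup below are illustrative, not estimates from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_factor_sv(T=2000, betas=(1.0, 0.8, 1.2), phi=0.97, sigma_eta=0.2):
    """One common factor with stochastic volatility:
    h_t = phi*h_{t-1} + eta_t (log-variance), f_t ~ N(0, exp(h_t)),
    r_it = beta_i * f_t + idiosyncratic N(0, 1) noise."""
    h = np.zeros(T)
    for t in range(1, T):
        h[t] = phi * h[t - 1] + rng.normal(0.0, sigma_eta)
    f = rng.normal(0.0, 1.0, T) * np.exp(h / 2.0)
    eps = rng.normal(0.0, 1.0, (T, len(betas)))
    return np.array(betas) * f[:, None] + eps, h

returns, h = simulate_factor_sv()
# split the sample into calm and crisis-like regimes by factor log-variance
calm, crisis = h < np.median(h), h >= np.median(h)
corr_calm = np.corrcoef(returns[calm].T)[0, 1]       # low factor volatility
corr_crisis = np.corrcoef(returns[crisis].T)[0, 1]   # high factor volatility
```

This is exactly the confound Forbes and Rigobon's shift-contagion notion addresses: raw correlations rise with common-factor variance even when the loadings, i.e. the transmission mechanism, stay fixed.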
Abstract:
In the above paper (ibid., vol. 71, no. 2, pp. 275-276, Feb. 1983), the authors present an adaptive scheme for continuous-time model reference adaptive systems (MRAS), in which relays replace the usual multipliers of existing MRAS. The commenter points out an error in the analysis of the hyperstability of the scheme, such that the validity of this configuration becomes an open question.
Abstract:
An algorithm for adaptive IIR filtering that uses a prefiltering structure in direct form is presented. This structure yields an estimation error that is a linear function of the coefficients, a property that greatly simplifies the derivation of gradient-based algorithms. Computer simulations show that the proposed structure improves convergence speed.
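One standard way to obtain an estimation error that is linear in the coefficients is the equation-error formulation, in which past desired outputs replace past filter outputs in the regressor, so a plain LMS gradient step applies. This sketch shows that idea on a made-up system-identification example; it is not necessarily the paper's exact prefiltered structure:

```python
import numpy as np

def equation_error_iir(x, d, n_b=2, n_a=1, mu=0.05):
    """Equation-error adaptive IIR filtering: regress d[k] on recent inputs
    and past *desired* outputs, so the error is linear in the coefficients."""
    w = np.zeros(n_b + n_a)
    for k in range(max(n_b, n_a + 1), len(x)):
        phi = np.concatenate([x[k - n_b + 1:k + 1][::-1],   # x[k], x[k-1], ...
                              d[k - n_a:k][::-1]])          # d[k-1], d[k-2], ...
        e = d[k] - w @ phi                                  # linear in w
        w += mu * e * phi                                   # LMS gradient step
    return w

# identify an assumed unknown system d[k] = 0.5*x[k] + 0.2*x[k-1] + 0.3*d[k-1]
rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 5000)
d = np.zeros_like(x)
for k in range(1, len(x)):
    d[k] = 0.5 * x[k] + 0.2 * x[k - 1] + 0.3 * d[k - 1]
w = equation_error_iir(x, d)
```

Because the error surface is quadratic in `w`, there are no local minima and no gradient recursion through the filter's own feedback, which is what simplifies the derivation and speeds convergence.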
Abstract:
Traditional mathematical tools, like Fourier analysis, have proven to be efficient when analyzing steady-state distortions; however, the growing utilization of electronically controlled loads and the new dynamics they introduce into industrial-environment signals have suggested the need for a more powerful tool to analyze non-stationary distortions, overcoming the limitations of frequency-domain techniques. Wavelet theory provides a new approach to harmonic analysis, focusing on the decomposition of a signal into non-sinusoidal components that are translated and scaled in time, generating a time-frequency basis. The correct choice of the waveshape to be used in the decomposition is very important and is discussed in this work. A brief theoretical introduction to the wavelet transform is presented and some cases (practical and simulated) are discussed. Distortions commonly found in industrial environments, such as the current waveform of a switched-mode power supply and the input phase-voltage waveform of a motor fed by an inverter, are analyzed using wavelet theory. Applications such as extracting the fundamental frequency of a non-sinusoidal current signal, or using the ability of compact representation to detect non-repetitive disturbances, are presented.
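The disturbance-detection idea can be shown with the simplest wavelet of all: a single-level Haar decomposition, where a non-repetitive spike that barely moves the trend appears sharply in the detail coefficients. The waveform and spike position below are made up for the example; real power-quality work would choose the mother wavelet carefully, as the abstract stresses:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar DWT: orthonormal approximation (trend) and
    detail (local change) coefficients from adjacent sample pairs."""
    s = np.asarray(signal, dtype=float)
    even, odd = s[0::2], s[1::2]
    approx = (even + odd) / np.sqrt(2.0)   # smooth trend
    detail = (even - odd) / np.sqrt(2.0)   # localized change
    return approx, detail

# steady tone plus a single-sample disturbance at sample 33
t = np.arange(64)
x = np.sin(2 * np.pi * t / 16)
x[33] += 5.0                      # non-repetitive spike
approx, detail = haar_dwt(x)
# the spike lands in pair (32, 33), i.e. detail coefficient 16
```

Unlike a Fourier spectrum, the detail sequence localizes the disturbance in time, which is the property the abstract highlights for non-stationary distortions.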
Abstract:
Commissioning studies of the CMS hadron calorimeter have identified sporadic uncharacteristic noise and a small number of malfunctioning calorimeter channels. Algorithms have been developed to identify and address these problems in the data. The methods have been tested on cosmic ray muon data, calorimeter noise data, and single beam data collected with CMS in 2008. The noise rejection algorithms can be applied to LHC collision data at the trigger level or in the offline analysis. The application of the algorithms at the trigger level is shown to remove 90% of noise events with fake missing transverse energy above 100 GeV, which is sufficient for the CMS physics trigger operation. © 2010 IOP Publishing Ltd and SISSA.