156 results for Generalized Functions
Outperformance in exchange-traded fund pricing deviations: Generalized control of data snooping bias
Abstract:
An investigation into exchange-traded fund (ETF) outperformance during the period 2008-2012 is undertaken using a data set of 288 U.S.-traded securities. ETFs are tested for net asset value (NAV) premium and for outperformance of their underlying index and a market benchmark, with Sharpe, Treynor, and Sortino ratios employed as risk-adjusted performance measures. A key contribution is the application of an innovative generalized stepdown procedure to control for data snooping bias. We find that a large proportion of optimized-replication and debt asset class ETFs display risk-adjusted premiums, with energy- and precious-metals-focused funds outperforming the S&P 500 market benchmark.
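The three risk-adjusted measures named in this abstract have standard textbook definitions: excess return divided by total volatility (Sharpe), by market beta (Treynor), or by downside deviation (Sortino). A minimal sketch, assuming simple periodic returns and a scalar risk-free rate (the function names and inputs are illustrative, not taken from the paper):

```python
import numpy as np

def sharpe(returns, rf=0.0):
    # Sharpe ratio: mean excess return over total volatility.
    excess = returns - rf
    return excess.mean() / excess.std(ddof=1)

def treynor(returns, market, rf=0.0):
    # Treynor ratio: mean excess return over market beta.
    excess = returns - rf
    beta = np.cov(returns, market)[0, 1] / np.var(market, ddof=1)
    return excess.mean() / beta

def sortino(returns, rf=0.0):
    # Sortino ratio: mean excess return over downside deviation
    # (root mean square of negative excess returns only).
    excess = returns - rf
    downside = excess[excess < 0]
    return excess.mean() / np.sqrt((downside ** 2).mean())
```

Because Sortino penalizes only downside volatility, a fund with the same mean and the same total variance can rank differently under Sharpe and Sortino, which is why the paper reports all three measures.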
Abstract:
Quantum-dot cellular automata (QCA) is potentially a very attractive alternative to CMOS for future digital designs. Circuit designs in QCA have been extensively studied. However, how to properly evaluate QCA circuits has not been carefully considered. To date, metrics and area-delay cost functions directly mapped from CMOS technology have been used to compare QCA designs, which is inappropriate given the differences between the two technologies. In this paper, several cost metrics specifically aimed at QCA circuits are studied. It is found that delay, the number of QCA logic gates, and the number and type of crossovers are important metrics that should be considered when comparing QCA designs. A family of new cost functions for QCA circuits is proposed. As fundamental components in QCA computing arithmetic, QCA adders are reviewed and evaluated with the proposed cost functions. Once the new cost metrics are taken into account, previously best adders become unattractive, and different optimization goals are shown to lead to different "best" adders.
Abstract:
Necessary and sufficient conditions for choice functions to be rational have been intensively studied in the past. However, in these attempts, a choice function is completely specified. That is, given any subset of options, called an issue, the best option over that issue is always known, whereas in real-world scenarios it is very often the case that only a few choices are known rather than all of them. In this paper, we study partial choice functions and investigate necessary and sufficient rationality conditions for situations where only a few choices are known. We prove that our necessary and sufficient condition for partial choice functions reduces to the necessary and sufficient conditions for complete choice functions proposed in the literature. Choice functions have been instrumental in belief revision theory. That is, in most approaches to belief revision, the problem studied can simply be described as the choice of possible worlds compatible with the input information, given an agent's prior belief state. The main effort has been to devise strategies to infer the agent's revised belief state. Our study considers the converse problem: given a collection of input information items and their corresponding revision results (as provided by an agent), does there exist a rational revision operation used by the agent and a consistent belief state that may explain the observed results?
Abstract:
Multicarrier Index Keying (MCIK) is a recently developed technique that modulates not only subcarriers but also the indices of the subcarriers. In this paper, a novel low-complexity detection scheme for subcarrier indices is proposed for an MCIK system, offering a substantial reduction in complexity over optimal maximum likelihood (ML) detection. For the performance evaluation, a closed-form expression for the pairwise error probability (PEP) of an active subcarrier index and a tight closed-form approximation of the average PEP of multiple subcarrier indices are derived. The theoretical outcomes are validated using simulations, with a difference of less than 0.1 dB. Compared to optimal ML detection, the proposed detection achieves a substantial reduction in complexity with a small loss in error performance (≤ 0.6 dB).
Abstract:
Cellular signal transduction in response to environmental signals involves a relay of precisely regulated signal amplifying and damping events. A prototypical signaling relay involves ligands binding to cell surface receptors and triggering the activation of downstream enzymes to ultimately affect the subcellular distribution and activity of DNA-binding proteins that regulate gene expression. These so-called signal transduction cascades have dominated our view of signaling for decades. More recently, evidence has accumulated that components of these cascades can be multifunctional, in effect playing a conventional role, for example as a cell surface receptor for a ligand, whilst also having alternative functions, for example as transcriptional regulators in the nucleus. This raises new challenges for researchers. What are the cues/triggers that determine which role such proteins play? What are the trafficking pathways that regulate the spatial distribution of such proteins so that they can perform nuclear functions, and under what circumstances are these alternative functions most relevant?
Abstract:
A subset of proteins predominantly associated with early endosomes or implicated in clathrin-mediated endocytosis can shuttle between the cytoplasm and the nucleus. Although the endocytic functions of these proteins have been extensively studied, much less effort has been expended in exploring their nuclear roles. Membrane trafficking proteins can affect signalling and proliferation and this can be achieved either at a nuclear or endocytic level. Furthermore, some proteins, such as Huntingtin interacting protein 1, are known as cancer biomarkers. This review will highlight the limits of our understanding of their nuclear functions and the relevance of this to signalling and oncogenesis.
Abstract:
This paper presents a new variant of broadband Doherty power amplifier that employs a novel output combiner. A new parameter α is introduced to permit a generalized analysis of the recently reported Parallel Doherty power amplifier (PDPA), and hence offer design flexibility. The circuit prototype of the new DPA, fabricated using GaN devices, exhibits a maximum drain efficiency of 85% at 43-dBm peak power and 63% at 6-dB backoff power (BOP). Measured drain efficiencies of >60% at peak power across a 500-MHz frequency range and >50% at 6-dB BOP across a 480-MHz frequency range were achieved, confirming the theoretical wideband characteristics of the new DPA.
Abstract:
A geostatistical version of the classical Fisher rule (linear discriminant analysis) is presented. This method is applicable when a large dataset of multivariate observations is available within a domain split into several known subdomains, and it assumes that the variograms (or covariance functions) are comparable between subdomains, which differ only in the mean values of the available variables. The method consists of finding the eigen-decomposition of the matrix W⁻¹B, where W is the matrix of sills of all direct and cross-variograms, and B is the covariance matrix of the vectors of weighted means within each subdomain, obtained by generalized least squares. The method is used to map peat blanket occurrence in Northern Ireland, with data from the Tellus survey, which requires a minimal change to the general recipe: using compositionally compliant variogram tools and models, and working with log-ratio transformed data.
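The core eigen step described above can be sketched briefly. This is a minimal illustration, assuming W (the matrix of variogram sills) and B (the between-subdomain covariance of the generalized-least-squares means) have already been estimated; the function name is hypothetical, not from the paper:

```python
import numpy as np

def fisher_directions(W, B):
    # Discriminant directions: eigenvectors of W^{-1} B, ordered by
    # decreasing eigenvalue (ratio of between- to within-subdomain
    # variation). Solving W x = B x avoids explicitly inverting W.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(W, B))
    order = np.argsort(eigvals.real)[::-1]
    return eigvals.real[order], eigvecs.real[:, order]
```

Projecting the (log-ratio transformed) observations onto the leading eigenvectors then yields the canonical scores used for classifying locations into subdomains, exactly as in the classical Fisher rule but with W built from variogram sills rather than a pooled sample covariance.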