118 results for Miller functions
Abstract:
Quantum-dot cellular automata (QCA) is potentially a very attractive alternative to CMOS for future digital designs. Circuit designs in QCA have been extensively studied. However, how to properly evaluate QCA circuits has not been carefully considered. To date, metrics and area-delay cost functions mapped directly from CMOS technology have been used to compare QCA designs, which is inappropriate given the differences between the two technologies. In this paper, several cost metrics aimed specifically at QCA circuits are studied. It is found that delay, the number of QCA logic gates, and the number and type of crossovers are important metrics that should be considered when comparing QCA designs. A family of new cost functions for QCA circuits is proposed. As fundamental components of QCA arithmetic, QCA adders are reviewed and evaluated with the proposed cost functions. Once the new cost metrics are taken into account, the previously best adders become unattractive, and it is shown that different optimization goals lead to different “best” adders.
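As a hedged illustration of how such a cost function might combine these metrics, the sketch below uses a hypothetical weighted form; the gate counts, exponents, and the way they are combined are assumptions for illustration, not the paper's actual formulation.

```python
# Hypothetical QCA cost-function sketch (not the paper's exact formula).
# M: majority gates, I: inverters, C: crossovers, T: delay in clock zones.
# Exponents k, l, p weight the relative importance of each metric.

def qca_cost(majority_gates, inverters, crossovers, delay, k=2, l=2, p=1):
    """Weighted area-delay style cost for a QCA circuit (illustrative)."""
    return (majority_gates**k + inverters + crossovers**l) * delay**p

# Comparing two hypothetical adder designs under one choice of exponents:
ripple = qca_cost(majority_gates=3, inverters=2, crossovers=1, delay=4)
lookahead = qca_cost(majority_gates=8, inverters=4, crossovers=6, delay=2)
print(ripple, lookahead)  # different exponent choices can reverse the ranking
```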
Abstract:
Necessary and sufficient conditions for choice functions to be rational have been intensively studied in the past. However, these attempts assume a completely specified choice function: given any subset of options, called an issue, the best option over that issue is always known. In real-world scenarios, it is often the case that only a few choices are known rather than all of them. In this paper, we study partial choice functions and investigate necessary and sufficient rationality conditions for situations where only a few choices are known. We prove that our necessary and sufficient condition for partial choice functions reduces to the necessary and sufficient conditions for complete choice functions proposed in the literature. Choice functions have been instrumental in belief revision theory: in most approaches to belief revision, the problem studied can be described as the choice of possible worlds compatible with the input information, given an agent’s prior belief state. The main effort has been to devise strategies to infer the agent’s revised belief state. Our study considers the converse problem: given a collection of input information items and their corresponding revision results (as provided by an agent), does there exist a rational revision operation used by the agent, and a consistent belief state, that may explain the observed results?
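As an illustrative sketch only (the paper's own condition for partial choice functions is not reproduced here), the snippet below checks a classic rationality condition, the Weak Axiom of Revealed Preference, over a partially specified, single-valued choice function; the options and issues are hypothetical.

```python
# WARP check for a partial, single-valued choice function (illustrative).
# choices maps each observed issue (a frozenset of options) to its choice.

def violates_warp(choices):
    """Return True if two observed choices reveal contradictory preferences."""
    for issue_a, chosen_a in choices.items():
        for issue_b, chosen_b in choices.items():
            # chosen_a beat chosen_b in issue_a, yet chosen_b beat chosen_a
            # in issue_b: a WARP violation.
            if chosen_b in issue_a and chosen_a in issue_b and chosen_a != chosen_b:
                return True
    return False

# A partial specification over options {x, y, z}: only two issues observed.
partial = {frozenset({"x", "y"}): "x", frozenset({"x", "y", "z"}): "y"}
print(violates_warp(partial))  # True: y beats x only after x already beat y
```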
Abstract:
Cellular signal transduction in response to environmental signals involves a relay of precisely regulated signal-amplifying and signal-damping events. A prototypical signaling relay involves ligands binding to cell surface receptors and triggering the activation of downstream enzymes, ultimately affecting the subcellular distribution and activity of DNA-binding proteins that regulate gene expression. These so-called signal transduction cascades have dominated our view of signaling for decades. More recently, evidence has accumulated that components of these cascades can be multifunctional, in effect playing a conventional role, for example as a cell surface receptor for a ligand, whilst also having alternative functions, for example as transcriptional regulators in the nucleus. This raises new challenges for researchers. What are the cues and triggers that determine which role such proteins play? What are the trafficking pathways that regulate the spatial distribution of such proteins so that they can perform nuclear functions, and under what circumstances are these alternative functions most relevant?
Abstract:
A subset of proteins predominantly associated with early endosomes or implicated in clathrin-mediated endocytosis can shuttle between the cytoplasm and the nucleus. Although the endocytic functions of these proteins have been extensively studied, much less effort has been expended in exploring their nuclear roles. Membrane trafficking proteins can affect signalling and proliferation, and they can do so at either a nuclear or an endocytic level. Furthermore, some of these proteins, such as Huntingtin interacting protein 1, are known cancer biomarkers. This review highlights the limits of our understanding of their nuclear functions and the relevance of this to signalling and oncogenesis.
Abstract:
This paper presents a new framework for multi-subject event inference in surveillance video, where the measurements produced by low-level vision analytics are usually noisy, incomplete or incorrect. Our goal is to infer the composite events undertaken by each subject from noisy observations. To achieve this, we consider the temporal characteristics of event relations and propose a method to correctly associate the detected events with individual subjects. The Dempster–Shafer (DS) theory of belief functions is used to infer events of interest from the results of our vision analytics and to measure the conflicts that occur during event association. Our system is evaluated against a number of videos that present passenger behaviours on a public transport platform, namely buses, at different levels of complexity. The experimental results demonstrate that, by reasoning with spatio-temporal correlations, the proposed method achieves satisfactory performance when associating atomic events and recognising composite events involving multiple subjects in dynamic environments.
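For context, a minimal sketch of Dempster's rule of combination, the standard fusion operation in DS theory, is given below; the mass values and event labels are hypothetical, and the paper's specific event-association machinery is not reproduced.

```python
# Dempster's rule of combination for two mass functions (minimal sketch).
# A mass function maps frozenset focal elements to masses summing to 1.

def dempster_combine(m1, m2):
    """Combine two mass functions; returns (fused masses, conflict K)."""
    raw, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                raw[inter] = raw.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # mass falling on contradictory pairs
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    fused = {a: w / (1.0 - conflict) for a, w in raw.items()}
    return fused, conflict

# Two detectors reporting on atomic events {board, alight} (hypothetical):
m_a = {frozenset({"board"}): 0.7, frozenset({"board", "alight"}): 0.3}
m_b = {frozenset({"alight"}): 0.4, frozenset({"board", "alight"}): 0.6}
fused, k = dempster_combine(m_a, m_b)
print(fused, k)  # a high K flags a bad event-to-subject association
```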
Abstract:
CCTV (Closed-Circuit TeleVision) systems are broadly deployed in the present world. To ensure timely reaction in intelligent surveillance, determining the gender of people of interest is a fundamental task for real-world applications. However, typical video algorithms for gender profiling (usually face profiling) have three drawbacks. First, the profiling result is always uncertain. Second, the profiling result is not stable: the degree of certainty usually varies over time, sometimes even to the extent that a male is classified as a female, and vice versa. Third, for a robust profiling result in cases where a person’s face is not visible, other features, such as body shape, are required. These algorithms may provide different recognition results; at the very least, they will provide different degrees of certainty. To overcome these problems, in this paper we introduce a Dempster-Shafer (DS) evidential approach that makes use of profiling results from multiple algorithms over a period of time; in particular, Denoeux’s cautious rule is applied to fuse mass functions along the time line. Experiments show that this approach provides better results than single profiling results and classic fusion results. Furthermore, it is found that if severe mis-classification occurs at the beginning of the time line, the combination can yield undesirable results. To remedy this weakness, we further propose three extensions to the evidential approach, incorporating notions of time-window, time-attenuation, and time-discounting, respectively. These extensions also apply Denoeux’s rule along the time line and take the DS approach as a special case. Experiments show that all three extensions provide better results than their predecessor when mis-classifications occur.
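The sketch below illustrates only the time-attenuation idea, using classic Shafer discounting to push older evidence toward ignorance before fusion; Denoeux's cautious rule itself (which operates on canonical weight decompositions) is not reproduced, and the frame, rate, and mass values are assumptions.

```python
# Time-attenuated evidence via classic Shafer discounting (illustrative).
# Older mass functions are discounted toward ignorance before fusion.

FRAME = frozenset({"male", "female"})  # frame of discernment (assumed)

def discount(mass, alpha):
    """Discount a mass function by factor alpha in [0, 1]."""
    out = {a: (1.0 - alpha) * w for a, w in mass.items() if a != FRAME}
    out[FRAME] = (1.0 - alpha) * mass.get(FRAME, 0.0) + alpha
    return out

def attenuate_by_age(observations, rate=0.1):
    """observations: list of (age, mass); older readings get more discount."""
    return [discount(m, min(1.0, rate * age)) for age, m in observations]

obs = [(0, {frozenset({"male"}): 0.9, FRAME: 0.1}),
       (5, {frozenset({"female"}): 0.8, FRAME: 0.2})]  # an early outlier
print(attenuate_by_age(obs))  # the older reading now contributes far less
```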
Abstract:
Understanding the seismic vulnerability of building structures is important for seismic engineers, building owners, risk insurers and governments. Seismic vulnerability defines a building’s predisposition to be damaged as a result of an earthquake of a given severity. There are two components to seismic risk: the seismic hazard and the exposure of the structural inventory to any given earthquake event. This paper demonstrates the development of fragility curves at different damage states using a detailed mechanical model of a moment-resisting reinforced concrete structure typical of Southern Europe. The mechanical model consists of a complex three-dimensional finite element model of the reinforced concrete moment-resisting frame structure and is used to define the damage states through pushover analysis. Fragility curves are also derived using the HAZUS macroseismic methodology and the Risk-UE macroseismic methodology. Comparison of the mechanically modelled and HAZUS fragility curves shows good agreement, while the Risk-UE methodology shows comparatively poor agreement.
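HAZUS-style fragility curves take a lognormal form, P(damage ≥ ds | demand) = Φ(ln(demand / median_ds) / β_ds). The sketch below evaluates this standard form; the damage-state medians and dispersions are placeholders, not the paper's values.

```python
# Lognormal fragility curve, as used in HAZUS-style methodologies.
import math

def fragility(demand, median, beta):
    """P(reaching or exceeding a damage state) at a given seismic demand."""
    z = math.log(demand / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# Hypothetical medians (spectral displacement, cm) and dispersions per state:
states = {"slight": (1.0, 0.7), "moderate": (2.5, 0.7),
          "extensive": (6.0, 0.8), "complete": (15.0, 0.9)}
for name, (median, beta) in states.items():
    print(name, round(fragility(demand=4.0, median=median, beta=beta), 3))
```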