130 results for average complexity
Abstract:
In this article, we explore the extent to which a consideration of welfare regime and socioeconomic differences in poverty levels and patterns can assist us in making an informed assessment of alternative poverty indicators. Poverty in the EU is normally defined in terms of income thresholds at the level of each member state. However, with the enlargement of the EU, such measures have come in for increasing criticism. One set of reservations relates to the limitations imposed by an entirely national frame of reference. An alternative critique focuses on the fact that low income is an unreliable indicator of poverty. In this article, we seek to explore the strength of both arguments by comparing the outcomes associated with ‘at risk of poverty’ and consistent poverty at both national and EU levels. Developing an appropriate assessment of poverty levels in the enlarged EU, particularly in periods of rapid change, is likely to require that we make use of a number of indicators, none of which captures the full complexity of cross-national poverty outcomes. However, our analysis suggests that if a choice is to be made between the available indicators, the ‘mixed consistent poverty’ indicator developed in this study is best suited to achieving the stated EU objective of assessing the scale of exclusion from minimally acceptable standards of living in individual countries while also measuring the extent to which the whole population of Europe is sharing in the benefits of high average prosperity.
Abstract:
We define a multi-modal version of Computation Tree Logic (CTL) by extending the language with path quantifiers E_d and A_d, where d denotes one of finitely many dimensions, interpreted over Kripke structures with one total relation for each dimension. As expected, the logic is axiomatised by taking a copy of a CTL axiomatisation for each dimension. Completeness is proved by employing the completeness result for CTL to obtain a model along each dimension in turn. We also show that the logic is decidable and that its satisfiability problem is no harder than the corresponding problem for CTL. We then demonstrate how Normative Systems can be conceived as a natural interpretation of such a multi-dimensional CTL logic. © 2009 Springer Science+Business Media B.V.
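The abstract does not spell out the grammar; as a rough sketch, assuming the standard CTL connectives are simply duplicated per dimension (as the axiomatisation described above suggests), the language and models would look like:

```latex
% Syntax sketch: one copy of the CTL path quantifiers for each dimension d in {1,...,k}.
\varphi ::= p \mid \neg\varphi \mid \varphi \wedge \varphi
        \mid \mathsf{E}_d\mathsf{X}\,\varphi \mid \mathsf{A}_d\mathsf{X}\,\varphi
        \mid \mathsf{E}_d[\varphi\,\mathsf{U}\,\varphi] \mid \mathsf{A}_d[\varphi\,\mathsf{U}\,\varphi]
% Models: M = (S, R_1, \dots, R_k, L), with each R_d \subseteq S \times S total;
% \mathsf{E}_d / \mathsf{A}_d quantify over the R_d-paths leaving the current state.
```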
Abstract:
Flow-responsive passive samplers offer considerable potential for nutrient monitoring in catchments, bridging the gap between the intermittency of grab sampling and the high cost of automated monitoring systems. A commercially available passive sampler was evaluated in a number of river systems encapsulating a gradient in storm response, combinations of diffuse and point-source pressures, and levels of phosphorus and nitrogen concentrations. Phosphorus and nitrogen are sequestered to a resin matrix in a permeable cartridge positioned in line with streamflow. A salt tracer dissolves in proportion to advective flow through the cartridge. Multiple deployments of different cartridge types were undertaken, and the recovery of P and N was compared with the flow-weighted mean concentration (FWMC) from high-resolution bank-side analysers at each site. Results from the passive samplers were variable and largely underestimated the FWMC derived from the bank-side analysers. Laboratory tests using ambient river samples indicated good replication of advective throughflow using pumped water, although this appeared not to be a good analogue of river conditions where flow divergence was possible. Laboratory tests also showed good nutrient retention but poor elution, and these issues appeared to combine to limit the samplers' utility in ambient river systems at the small catchment scale.
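For reference, the flow-weighted mean concentration against which the samplers were judged is the usual discharge-weighted average; assuming paired concentration and discharge observations c_i and q_i over time steps Δt_i from the bank-side analysers, it is:

```latex
\mathrm{FWMC} \;=\; \frac{\sum_i c_i \, q_i \, \Delta t_i}{\sum_i q_i \, \Delta t_i}
```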
Abstract:
An experimental artificial reef was constructed in Strangford Lough, Northern Ireland as part of trials to regenerate damaged biogenic reefs formed by the horse mussel Modiolus modiolus. Experimental reef plots were constructed using Pecten maximus shell as cultch. Clumps of live adult M. modiolus were translocated from nearby natural reefs into cultch with a high profile (elevated cultch), cultch with a low profile (flattened cultch), as well as directly into the seafloor. The aim of the study was to test the hypothesis that translocated mussel clumps would increase habitat complexity, thus accelerating community succession and enhancing natural recruitment of M. modiolus spat. These effects were predicted to be greater on elevated cultch due to greater protection from predators and increased accessibility to food resources. Within the artificial reef array, the translocated clumps had a significant positive effect on recruitment compared to cultch without mussels, with average densities of spat settled on the translocated M. modiolus clumps ranging from 100 to 200 individuals m⁻², compared to 4 to 52 spat m⁻² on cultch without mussels. Recruitment of M. modiolus spat was also significantly higher on translocated horse mussels when compared to natural reefs, where densities of 8–36 spat m⁻² were recorded.
Reef elevation appeared to provide some degree of protection from predators, but differences in translocated M. modiolus survival on the different elevation treatments were not significant. In total, 223 taxa were recorded 12 months after reef construction. The presence of translocated clumps of M. modiolus was the main driver of the increases in faunal diversity and species abundance. Application of objective criteria to assess the performance of artificial reefs suggested that translocation of M. modiolus clumps alone achieved most of the restoration objectives. Consequently, this pilot study demonstrates a straightforward and realistic intervention technique that could be used to kick-start the regeneration and expansion of impacted mussel and similar biogenic reefs elsewhere.
Abstract:
The regulation of the small GTPases leading to their membrane localization has long been attributed to processing of their C-terminal CAAX box. As deregulation of many of these GTPases has been implicated in cancer and other disorders, prenylation and methylation of this CAAX box have been studied in depth as a possibility for drug targeting, but unfortunately, to date no drug has proved clinically beneficial. However, these GTPases also undergo other modifications that may be important for their regulation. Ubiquitination has long been demonstrated to regulate the fate of numerous cellular proteins, and recently it has become apparent that many GTPases, along with their GAPs, GEFs and GDIs, undergo ubiquitination leading to a variety of fates such as re-localization or degradation. In this review we focus on the recent literature demonstrating that the regulation of small GTPases by ubiquitination, either directly or indirectly, plays a considerable role in controlling their function and that targeting these modifications could be important for disease treatment.
Abstract:
To examine the prevalence and pattern of specific areas of learning disability (LD) in neurologically normal children with extremely low birth weight (ELBW) (≤ 800 g) who have broadly average intelligence, compared with full-term children with normal birth weight and comparable sociodemographic background, and to explore concurrent cognitive correlates of the specific LDs.
Abstract:
In this paper we employ the recently introduced improved moving average methodology of Papailias and Thomakos (2011) and apply it to two energy ETFs. We compare it to the standard moving average methodology and the buy-and-hold strategy. Investors who are interested in energy-related sectors and trade using moving averages could benefit from forming their strategies based on this improved moving average methodology, as it returns higher profits accompanied by decreased risk (measured in terms of drawdown).
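The improved rule of Papailias and Thomakos (2011) is not described in the abstract, so the sketch below only illustrates the baseline it is compared against: a plain price-versus-moving-average rule evaluated against buy-and-hold on total return and maximum drawdown. The 50-day window, the synthetic price series, and the function names are illustrative assumptions, not the authors' method.

```python
import numpy as np
import pandas as pd

def sma_signal(prices: pd.Series, window: int = 50) -> pd.Series:
    """Long (1.0) when yesterday's price is above yesterday's SMA, flat (0.0) otherwise."""
    sma = prices.rolling(window).mean()
    # Compare lagged values to avoid look-ahead bias.
    return (prices.shift(1) > sma.shift(1)).astype(float)

def max_drawdown(equity: pd.Series) -> float:
    """Largest peak-to-trough decline of an equity curve."""
    running_max = equity.cummax()
    return float((equity / running_max - 1.0).min())

def evaluate(prices: pd.Series, window: int = 50) -> dict:
    rets = prices.pct_change().fillna(0.0)
    strat_equity = (1.0 + sma_signal(prices, window) * rets).cumprod()
    hold_equity = (1.0 + rets).cumprod()
    return {
        "strategy_return": float(strat_equity.iloc[-1] - 1.0),
        "buy_and_hold_return": float(hold_equity.iloc[-1] - 1.0),
        "strategy_drawdown": max_drawdown(strat_equity),
        "buy_and_hold_drawdown": max_drawdown(hold_equity),
    }

# Synthetic price path standing in for an energy ETF series.
prices = pd.Series(100 * np.exp(np.cumsum(np.random.normal(0.0003, 0.01, 1000))))
print(evaluate(prices, window=50))
```

Substituting an improved moving-average rule for `sma_signal` would reproduce the kind of comparison the paper reports.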
Abstract:
Practical demonstration of the operational advantages gained through the use of a co-operating retrodirective array (RDA) basestation and Van Atta node arrangements is discussed. The system exploits a number of inherent RDA features to provide analogue real-time multifunctional operation at low physical complexity. An active dual-conversion four-element RDA is used as the power distribution source (basestation) while simultaneously achieving a receive sensitivity level of −109 dBm and a 3 dB automatic beam steering angle of ±45°. When mobile units are each equipped with a semi-passive four-element Van Atta array, it is shown that mobile device orientation issues are mitigated and optimal energy transfer can occur because of automatic beam formation resulting from retrodirective self-pointing action. We show that operation in multipath-rich environments, with or without line of sight, acts to reduce average power density limits in the operating volume, with high energy density occurring at mobile node sites only. The system described can be used as a full-duplex ASK communications link, or as a means for remote node charging by wireless means, thereby enhancing deployment opportunities between unstabilised moving platforms.
Abstract:
Electing a leader is a fundamental task in distributed computing. In its implicit version, only the leader must know that it is the elected leader. This paper focuses on studying the message and time complexity of randomized implicit leader election in synchronous distributed networks. Surprisingly, the most "obvious" complexity bounds have not been proven for randomized algorithms. The "obvious" lower bounds of Ω(m) messages (m is the number of edges in the network) and Ω(D) time (D is the network diameter) are non-trivial to show for randomized (Monte Carlo) algorithms. (Recent results showing that even Ω(n) (n is the number of nodes in the network) is not a lower bound on the messages in complete networks make the above bounds somewhat less obvious.) To the best of our knowledge, these basic lower bounds have not been established even for deterministic algorithms, except for the limited case of comparison algorithms, where it was also required that some nodes may not wake up spontaneously and that D and n were not known.
We establish these fundamental lower bounds in this paper for the general case, even for randomized Monte Carlo algorithms. Our lower bounds are universal in the sense that they hold for all universal algorithms (such algorithms should work for all graphs), apply to every D, m, and n, and hold even if D, m, and n are known, all the nodes wake up simultaneously, and the algorithms can make any use of the nodes' identities. To show that these bounds are tight, we present an O(m)-message algorithm; an O(D)-time algorithm is known. A slight adaptation of our lower-bound technique gives rise to an Ω(m) message lower bound for randomized broadcast algorithms.
An interesting fundamental problem is whether both upper bounds (messages and time) can be reached simultaneously in the randomized setting for all graphs. (The answer is known to be negative in the deterministic setting.) We answer this problem partially by presenting a randomized algorithm that matches both complexities in some cases. This already separates (for some cases) randomized algorithms from deterministic ones. As first steps towards the general case, we present several universal leader election algorithms with bounds that trade off messages versus time. We view our results as a step towards understanding the complexity of universal leader election in distributed networks.
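As a point of reference only, here is a naive synchronous "flood the maximum random rank" election on an adjacency-list graph. It makes the notions of implicit election, m, and D concrete, but it uses O(mD) messages, so it is emphatically not the O(m)-message algorithm of the paper. A connected graph and distinct random ranks (which hold with high probability) are assumed.

```python
import random

def implicit_leader_election(adj, id_bits=32, seed=0):
    """Synchronous flood-the-maximum election on an undirected, connected graph.

    Each node draws a random rank and repeatedly forwards the largest rank seen;
    after at most diameter-many useful rounds, exactly the node holding the global
    maximum can conclude it is the leader (implicit election: the others only
    learn that they are not the leader).
    """
    rng = random.Random(seed)
    nodes = list(adj)
    rank = {v: rng.getrandbits(id_bits) for v in nodes}  # random ranks, distinct w.h.p.
    best = dict(rank)                                    # best rank seen so far
    messages = 0

    changed = True
    while changed:                                       # one extra quiet round to detect termination
        changed = False
        outbox = {v: best[v] for v in nodes}             # what each node broadcasts this round
        for v in nodes:
            for u in adj[v]:
                messages += 1                            # one message per directed edge per round
                if outbox[u] > best[v]:
                    best[v] = outbox[u]
                    changed = True

    leaders = [v for v in nodes if rank[v] == best[v]]   # unique if ranks are distinct
    return leaders, messages

# Tiny example: a 4-cycle. Each node ends up knowing only whether it is the leader.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(implicit_leader_election(adj))
```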
Abstract:
Focusing on the uplink, where mobile users (each with a single transmit antenna) communicate with a base station with multiple antennas, we treat multiple users as antennas to enable spatial multiplexing across users. Introducing distributed closed-loop spatial multiplexing with threshold-based user selection, we propose two uplink channel-assignment strategies with limited feedback. We prove that the proposed system also outperforms the standard greedy scheme with respect to the degree of fairness, measured by the variance of the time-averaged throughput. For uplink multi-antenna systems, we show that the proposed scheduling is a better choice than the greedy scheme in terms of the average BER, feedback complexity, and fairness. The numerical results corroborate our findings.
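The fairness criterion can be made concrete with a toy Monte Carlo comparison; the Rayleigh-fading SNR model, the unit-rate slots, and the specific threshold below are illustrative assumptions rather than the scheme analysed in the paper.

```python
import numpy as np

def simulate_fairness(n_users=4, n_slots=20000, threshold_db=5.0, seed=1):
    """Compare greedy scheduling (serve only the best-SNR user) with threshold-based
    selection (spatially multiplex every user whose SNR clears the threshold).

    Fairness is summarised, as in the abstract, by the variance across users of
    the time-averaged throughput (unit-rate slots assumed for simplicity).
    """
    rng = np.random.default_rng(seed)
    mean_snr = 10 ** (np.array([2.0, 4.0, 6.0, 8.0])[:n_users] / 10)  # asymmetric users
    snr = rng.exponential(1.0, size=(n_slots, n_users)) * mean_snr     # Rayleigh-fading power
    thr = 10 ** (threshold_db / 10)

    greedy = np.zeros(n_users)
    np.add.at(greedy, snr.argmax(axis=1), 1.0)           # one slot per round to the best user

    threshold = (snr > thr).sum(axis=0).astype(float)    # every qualifying user transmits

    return {"greedy_variance": float((greedy / n_slots).var()),
            "threshold_variance": float((threshold / n_slots).var())}

print(simulate_fairness())
```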
Abstract:
We propose a low-complexity closed-loop spatial multiplexing method with limited feedback over multiple-input multiple-output (MIMO) fading channels. The transmit adaptation is simply performed by selecting transmit antennas (or substreams) by comparing their signal-to-noise ratios to a given threshold with a fixed nonadaptive constellation and fixed transmit power per substream. We analyze the performance of the proposed system by deriving closed-form expressions for spectral efficiency, average transmit power, and bit error rate (BER). Depending on practical system design constraints, the threshold is chosen to maximize the spectral efficiency (or minimize the average BER) subject to average transmit power and average BER (or spectral efficiency) constraints, respectively. We present numerical and Monte Carlo simulation results that validate our analysis. Compared to open-loop spatial multiplexing and other approaches that select the best antenna subset in spatial multiplexing, the numerical results illustrate that the proposed technique obtains significant power gains for the same BER and spectral efficiency. We also provide numerical results that show improvement over rate-adaptive orthogonal space-time block coding, which requires highly complex constellation adaptation. We analyze the impact of feedback delay using analytical and Monte Carlo approaches. The proposed approach is arguably the simplest possible adaptive spatial multiplexing system from an implementation point of view. However, our approach and analysis can be extended to other systems using multiple constellations and power levels.
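The paper derives closed-form expressions; the sketch below is only a Monte Carlo illustration of the selection rule itself, activating each substream whose SNR clears a threshold and reporting the resulting average spectral efficiency. The i.i.d. Rayleigh-fading channel, four antennas, QPSK (2 bits per symbol), and the threshold sweep are assumptions for the example, and the BER and power constraints of the actual design procedure are omitted.

```python
import numpy as np

def select_substreams(snr_linear, threshold_linear):
    """Threshold-based selection: activate exactly those substreams whose SNR
    exceeds the threshold; each active substream uses the same fixed constellation."""
    return snr_linear > threshold_linear

def sweep_threshold(n_tx=4, bits_per_symbol=2, snr_mean_db=10.0,
                    thresholds_db=np.arange(0, 20, 0.5), n_trials=200000, seed=2):
    """Monte Carlo estimate of average spectral efficiency versus threshold for
    i.i.d. Rayleigh-fading substreams (an illustrative channel assumption)."""
    rng = np.random.default_rng(seed)
    snr = rng.exponential(10 ** (snr_mean_db / 10), size=(n_trials, n_tx))
    results = []
    for thr_db in thresholds_db:
        active = select_substreams(snr, 10 ** (thr_db / 10))
        spectral_eff = bits_per_symbol * active.sum(axis=1).mean()  # bits/s/Hz on average
        results.append((float(thr_db), float(spectral_eff)))
    return results

for thr_db, se in sweep_threshold()[::8]:
    print(f"threshold = {thr_db:4.1f} dB  ->  average spectral efficiency = {se:.2f} bit/s/Hz")
```

In the paper's framework, the operating threshold would then be picked from such a curve subject to the average-power and average-BER constraints.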
Abstract:
High-dimensional gene expression data provide a rich source of information because they capture the expression level of genes in dynamic states that reflect the biological functioning of a cell. For this reason, such data are suitable for revealing systems-related properties inside a cell, e.g., in order to elucidate molecular mechanisms of complex diseases like breast or prostate cancer. However, this is not only strongly dependent on the sample size and the correlation structure of a data set, but also on the statistical hypotheses tested. Many different approaches have been developed over the years to analyze gene expression data to (I) identify changes in single genes, (II) identify changes in gene sets or pathways, and (III) identify changes in the correlation structure in pathways. In this paper, we review statistical methods for all three types of approaches, including subtypes, in the context of cancer data, provide links to software implementations and tools, and also address the general problem of multiple hypothesis testing. Further, we provide recommendations for the selection of such analysis methods.
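As one concrete instance of the multiple-testing problem the review addresses, the Benjamini-Hochberg step-up procedure controls the false discovery rate across gene-wise tests; the sketch below is a generic implementation, not tied to any specific method surveyed in the review.

```python
import numpy as np

def benjamini_hochberg(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: return a boolean mask of the
    hypotheses rejected at false discovery rate alpha."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k/m) * alpha, then reject hypotheses 1..k.
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[: k + 1]] = True
    return reject

# Toy example: per-gene p-values, e.g., from gene-wise differential-expression tests.
p_values = [0.001, 0.008, 0.012, 0.041, 0.20, 0.51, 0.74, 0.95]
print(benjamini_hochberg(p_values, alpha=0.05))
```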
Abstract:
The Marine Strategy Framework Directive (MSFD) requires that European Union Member States achieve "Good Environmental Status" (GES) in respect of 11 Descriptors of the marine environment by 2020. Of those, Descriptor 4, which focuses on marine food webs, is perhaps the most challenging to implement, since the identification of simple indicators able to assess the health of highly dynamic and complex interactions is difficult. Here, we present the proposed food web criteria/indicators and analyse their theoretical background and applicability in order to highlight both the current knowledge gaps and the difficulties associated with the assessment of GES. We conclude that the existing suite of indicators gives variable focus to the three important food web properties of structure, functioning and dynamics, and that more emphasis should be given to the latter two and to the general principles that relate these three properties. The development of food web indicators should be directed towards more integrative and process-based indicators, with an emphasis on their responsiveness to multiple anthropogenic pressures. © 2013 Elsevier Ltd. All rights reserved.