33 results for System complexity
Abstract:
Biodiversity may be seen as a scientific measure of the complexity of a biological system, implying an information basis. Complexity cannot be directly valued, so economists have tried to define the services it provides, though often just valuing the services of 'key' species. Here we provide a new definition of biodiversity as a measure of functional information, arguing that complexity embodies meaningful information as Gregory Bateson defined it. We argue that functional information content (FIC) is the potentially valuable component of total (algorithmic) information content (AIC), as it alone determines biological fitness and supports ecosystem services. Inspired by recent extensions to the Noah's Ark problem, we show how FIC/AIC can be calculated to measure the degree of substitutability within an ecological community. Establishing substitutability is an essential foundation for valuation. From it, we derive a way to rank whole communities by Indirect Use Value, through quantifying the relation between system complexity and the production rate of ecosystem services. Understanding biodiversity as information evidently serves as a practical interface between economics and ecological science.
Abstract:
Biodiversity is not a commodity, nor a service (ecosystem or otherwise); it is a scientific measure of the complexity of a biological system. Rather than directly valuing biodiversity, economists have tended to value its services, most often the services of 'key' species. This is understandable given the confusion of definitions and measures of biodiversity, but weakly justified if biodiversity is not substitutable. We provide a quantitative and comprehensive definition of biodiversity and propose a framework for examining its substitutability as the first step towards valuation. We define biodiversity as a measure of semiotic information. It is equated with biocomplexity and measured by Algorithmic Information Content (AIC). We argue that the potentially valuable component of this is functional information content (FIC), which determines biological fitness and supports ecosystem services. Inspired by recent extensions to the Noah's Ark problem, we show how FIC/AIC can be calculated to measure the degree of substitutability within an ecological community. From this, we derive a way to rank whole communities by Indirect Use Value, through quantifying the relation between system complexity and the production rate of ecosystem services. Understanding biodiversity as information evidently serves as a practical interface between economics and ecological science.
Abstract:
In this paper, we study a two-phase underlay cognitive relay network in which an eavesdropper can overhear the message. The secure data transmission from the secondary source to the secondary destination is assisted by two decode-and-forward (DF) relays. Although the traditional opportunistic relaying technique can choose one relay to provide the best secure performance, it requires continuous channel state information (CSI) for both relays and may result in a high relay switching rate. To overcome these limitations, a secure switch-and-stay combining (SSSC) protocol is proposed in which only one of the two relays is activated to assist the secure data transmission, and relay switching occurs only when the active relay can no longer support secure communication. This switching is guided by either instantaneous or statistical eavesdropping CSI. For these two cases, we study the secrecy performance of the SSSC protocol by deriving the analytical secrecy outage probability as well as an asymptotic expression for the high main-to-eavesdropper ratio (MER) region. We show that SSSC can substantially reduce system complexity while achieving or approaching the full diversity order of opportunistic relaying under instantaneous or statistical eavesdropping CSI, respectively.
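A minimal Monte Carlo sketch of the switch-and-stay idea described above (not the paper's analysis): assuming Rayleigh fading on the main and eavesdropper links, the active relay is retained while its instantaneous secrecy rate meets a target and the system switches to the other relay otherwise; outage and switching rates are then estimated empirically. All parameter names and values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def secrecy_rate(snr_main, snr_eve):
    """Instantaneous secrecy rate of one relay link (bits/s/Hz)."""
    return max(0.0, np.log2(1 + snr_main) - np.log2(1 + snr_eve))

# Illustrative parameters (assumptions, not from the paper)
avg_snr_main = 10.0      # average main-channel SNR per relay
avg_snr_eve = 1.0        # average eavesdropper-channel SNR (MER = 10)
target_rate = 0.5        # target secrecy rate
n_slots = 100_000

active = 0               # index of the currently active relay
outages = switches = 0

for _ in range(n_slots):
    # Rayleigh fading -> exponentially distributed instantaneous SNRs
    snr_main = rng.exponential(avg_snr_main, size=2)
    snr_eve = rng.exponential(avg_snr_eve, size=2)

    if secrecy_rate(snr_main[active], snr_eve[active]) < target_rate:
        # switch-and-stay: move to the other relay only when the current
        # one can no longer support secure transmission
        active = 1 - active
        switches += 1

    if secrecy_rate(snr_main[active], snr_eve[active]) < target_rate:
        outages += 1

print(f"secrecy outage prob ~ {outages / n_slots:.4f}")
print(f"relay switching rate ~ {switches / n_slots:.4f}")
```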
Abstract:
Purpose: The aim of this paper is to explore the issues involved in developing and applying performance management approaches within a large UK public sector department using a multiple stakeholder perspective and an accompanying theoretical framework.
Design/methodology/approach: An initial short questionnaire was used to determine perceptions about the implementation and effectiveness of the new performance management system across the organisation. In total, 700 questionnaires were distributed. Running concurrently with an ethnographic approach, and informed by the questionnaire responses, was a series of semi-structured interviews and focus groups.
Findings: Staff at all levels had an understanding of the new system and perceived it as being beneficial. However, there were concerns that the approach was not continuously managed throughout the year and was in danger of becoming an annual event, rather than an ongoing process. Furthermore, the change process seemed to have advanced without corresponding changes to appraisal and reward and recognition systems. Thus, the business objectives were not aligned with motivating factors within the organisation.
Research limitations/implications: Additional research to test the validity and usefulness of the theoretical model, as discussed in this paper, would be beneficial.
Practical implications: The strategic integration of the stakeholder performance measures and scorecards was found to be essential to producing an overall stakeholder-driven strategy within the case study organisation.
Originality/value: This paper discusses in detail the approach adopted and the progress made by one large UK public sector organisation, as it attempts to develop better relationships with all of its stakeholders and hence improve its performance. The paper provides a concerted attempt to link theory with practice.
Abstract:
This study finds evidence that attempts to reduce costs and error rates in the Inland Revenue through the use of e-commerce technology are flawed. While it is technically possible to write software that will record tax data and then transmit it to the Inland Revenue, there is little demand for this service. The key finding is that the tax system is so complex that many people are unable to complete their own tax returns, and this complexity cannot be overcome by well-designed software. The recommendation is therefore to encourage the use of agents to assist taxpayers, or to simplify the tax system. The Inland Revenue is interested in reducing administrative costs and errors by encouraging electronic submission of tax returns; given the data, it seems clear that the focus for achieving these objectives should be on facilitating the work of agents.
Abstract:
The introduction of the Quality Protects initiative in England and the focus on performance management has challenged social services departments to examine the systems, processes and outcomes for children who have their name on a child protection register. Research indicates that approximately one-quarter of the situations in which children are registered could be described as chronic—that is, they remain on the child protection register for significant periods of time, experience more than one period of registration or suffer a further incident of significant harm whilst subject to a child protection plan. In this article, the findings from a research study conducted into this group of vulnerable children are reported, focusing on the characteristics of the children and their families, and their careers in the child protection system. The paper concludes with observations about the weak conceptualization of performance management and the need to recognize the complexity of the factors that influence children’s careers in the child protection system.
Abstract:
Exam timetabling is one of the most important administrative activities that takes place in academic institutions. In this paper we present a critical discussion of the research on exam timetabling in the last decade or so. These last ten years have seen an increased level of attention on this important topic, and there has been a range of significant contributions to the scientific literature in terms of both theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that have been generated by this body of work.
We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss some issues on decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community in the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a re-naming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.
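As a concrete illustration of the early graph-heuristic family referred to above, the following sketch (illustrative only, not taken from any surveyed paper) applies a largest-degree-first greedy colouring to an exam conflict graph, assigning each exam to the first timeslot free of conflicts.

```python
from collections import defaultdict

def greedy_exam_timetable(exams, conflicts):
    """Largest-degree-first greedy colouring of the exam conflict graph.

    exams     -- iterable of exam identifiers
    conflicts -- set of frozensets {e1, e2}: pairs of exams sharing a student
    Returns a dict mapping exam -> timeslot index.
    """
    adj = defaultdict(set)
    for a, b in (tuple(c) for c in conflicts):
        adj[a].add(b)
        adj[b].add(a)

    timetable = {}
    # order exams by decreasing conflict degree (ties broken by name)
    for exam in sorted(exams, key=lambda e: (-len(adj[e]), str(e))):
        used = {timetable[n] for n in adj[exam] if n in timetable}
        slot = 0
        while slot in used:          # first timeslot with no conflict
            slot += 1
        timetable[exam] = slot
    return timetable

if __name__ == "__main__":
    exams = ["maths", "physics", "history", "art"]
    conflicts = {frozenset({"maths", "physics"}),
                 frozenset({"maths", "history"})}
    print(greedy_exam_timetable(exams, conflicts))
```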
Abstract:
Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires estimates of the costs of possible queries in order to select the one with the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and the complexity of the query. In this paper, we study the effects of both network congestion and server contention state on the cost of a query; we refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas, for unary and join queries respectively, using the multiple regression process. When a new query is submitted, its system contention state is estimated first using either the time slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas. The estimated cost of the query is further adjusted to improve its accuracy. Our experiments show that our methods can produce quite accurate cost estimates for queries submitted to remote data sources over the Internet.
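A minimal sketch of the general scheme described above: contention states are found by clustering the observed costs of a probing (sample) query, and a separate regression cost formula is fitted per state. The feature choice, synthetic data, and scikit-learn calls are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Observed costs of a probing query issued repeatedly over time; slow
# observations correspond to congested network / heavily loaded server.
sample_costs = np.concatenate([rng.normal(0.2, 0.02, 200),   # light load
                               rng.normal(1.5, 0.10, 200)])  # heavy load

# 1. Determine system contention states by clustering the sample costs.
states = KMeans(n_clusters=2, n_init=10, random_state=0)
state_labels = states.fit_predict(sample_costs.reshape(-1, 1))

# 2. Per state, fit a cost formula cost ~ f(result cardinality) by regression.
#    (A real system would use richer features: operand sizes, selectivities, ...)
cardinality = rng.uniform(1e3, 1e5, size=sample_costs.size)
observed_cost = sample_costs + 1e-5 * cardinality + rng.normal(0, 0.01, sample_costs.size)

models = {}
for s in np.unique(state_labels):
    mask = state_labels == s
    models[s] = LinearRegression().fit(cardinality[mask, None], observed_cost[mask])

# 3. For a new query: estimate the current contention state from a fresh
#    probe cost, then apply that state's cost formula.
probe = np.array([[1.4]])
current_state = states.predict(probe)[0]
estimated = models[current_state].predict([[5e4]])[0]
print(f"state {current_state}, estimated cost ~ {estimated:.2f}")
```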
Abstract:
We report the discovery of WASP-8b, a transiting planet of 2.25 ± 0.08 MJup on a strongly inclined eccentric 8.15-day orbit, moving in a retrograde direction to the rotation of its late-G host star. Evidence is found that the star is in a multiple stellar system with two other companions. The dynamical complexity of the system indicates that it may have experienced secular interactions such as the Kozai mechanism or a formation that differs from the “classical” disc-migration theory.
Abstract:
This article will examine the thesis that Northern Ireland experiences a relatively low level of crime. It will explore the possible reasons why crime in the North has not witnessed a dramatic increase. In light of this, the article will highlight the difficulties surrounding the current prison system and illustrate that once again Northern Ireland is experiencing a very different criminal justice system in comparison to Great Britain. Although the prisons are now being used predominantly to deal with 'ordinary' crime, they are still part of the political process.
Abstract:
The development of high-performance, low-computational-complexity detection algorithms is a key challenge for real-time Multiple-Input Multiple-Output (MIMO) communication system design. The Fixed-Complexity Sphere Decoder (FSD) algorithm is one of the most promising approaches, enabling quasi-ML decoding accuracy and high-performance implementation due to its deterministic, highly parallel structure. However, it suffers from exponential growth in computational complexity as the number of MIMO transmit antennas increases, critically limiting its scalability to larger MIMO system topologies. In this paper, we present a solution to this problem by applying a novel cutting protocol to the decoding tree of a real-valued FSD algorithm. The resulting Real-valued Fixed-Complexity Sphere Decoder (RFSD) algorithm achieves quasi-ML decoding performance similar to FSD, but with an average 70% reduction in computational complexity, as we demonstrate from both theoretical and implementation perspectives for Quadrature Amplitude Modulation (QAM) MIMO systems.
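For readers unfamiliar with fixed-complexity detection, the toy sketch below shows why the search cost is deterministic: the decoding tree is fully expanded only at the top layer and a single slicing decision is made at every other layer. It uses a generic real-valued channel model and is not the RFSD algorithm of the paper; constellation, dimensions, and noise level are illustrative.

```python
import numpy as np

def fsd_detect(H, y, constellation):
    """Toy fixed-complexity sphere-decoder-style search (real-valued model).

    Full expansion over the constellation at the top layer of the QR-based
    tree, single nearest-point (slicing) decision at every other layer; the
    candidate path with the smallest Euclidean metric is returned.
    """
    n = H.shape[1]
    Q, R = np.linalg.qr(H)
    z = Q.T @ y
    best_x, best_metric = None, np.inf

    for s in constellation:                      # full expansion, layer n-1
        x = np.zeros(n)
        x[n - 1] = s
        metric = (z[n - 1] - R[n - 1, n - 1] * s) ** 2
        for i in range(n - 2, -1, -1):           # single expansion below
            est = (z[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
            x[i] = constellation[np.argmin(np.abs(constellation - est))]
            metric += (z[i] - R[i, i:] @ x[i:]) ** 2
        if metric < best_metric:
            best_x, best_metric = x.copy(), metric
    return best_x

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pam = np.array([-3.0, -1.0, 1.0, 3.0])       # real components of 16-QAM
    H = rng.normal(size=(4, 4))
    x_true = rng.choice(pam, size=4)
    y = H @ x_true + 0.05 * rng.normal(size=4)
    print("sent   :", x_true)
    print("decoded:", fsd_detect(H, y, pam))
```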
Abstract:
How best to predict the effects of perturbations to ecological communities has been a long-standing question for both applied and basic ecology. This quest has recently been revived by new empirical data, new analysis methods, and increased computing speed, with the promise that ecologically important insights may be obtainable from a limited knowledge of community interactions. We use empirically based and simulated networks of varying size and connectance to assess two limitations to predicting perturbation responses in multispecies communities: (1) the inaccuracy by which species interaction strengths are empirically quantified and (2) the indeterminacy of species responses due to indirect effects associated with network size and structure. We find that even modest levels of species richness and connectance (~25 pairwise interactions) impose high requirements for interaction strength estimates because system indeterminacy rapidly overwhelms predictive insights. Nevertheless, even poorly estimated interaction strengths provide greater average predictive certainty than an approach that uses only the sign of each interaction. Our simulations provide guidance in dealing with the trade-offs involved in maximizing the utility of network approaches for predicting dynamics in multispecies communities.
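A minimal sketch of the kind of calculation this analysis rests on, assuming press-perturbation responses are obtained from the negative inverse of the community (interaction) matrix: estimated interaction strengths are perturbed with multiplicative error and the fraction of trials preserving the sign of each predicted response is recorded, mirroring the trade-off the abstract describes. The community matrix values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative 4-species community matrix A (per-capita interaction strengths);
# diagonal terms are self-limitation. Values are made up for this sketch.
A = np.array([[-1.0,  0.3, -0.2,  0.0],
              [-0.4, -1.0,  0.5, -0.1],
              [ 0.2, -0.3, -1.0,  0.4],
              [ 0.0,  0.1, -0.5, -1.0]])

# Press-perturbation responses: change in equilibrium abundance per unit
# sustained change in a species' growth rate is given by -A^{-1}.
true_response = -np.linalg.inv(A)

# Emulate imprecisely estimated interaction strengths and ask how often the
# sign of each predicted response matches the "true" one.
rel_error = 0.5          # 50% multiplicative error on each estimate
n_trials = 2000
sign_match = np.zeros_like(A)

for _ in range(n_trials):
    A_est = A * (1 + rel_error * rng.normal(size=A.shape))
    est_response = -np.linalg.inv(A_est)
    sign_match += np.sign(est_response) == np.sign(true_response)

print("fraction of trials with correct response sign:")
print(np.round(sign_match / n_trials, 2))
```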
Abstract:
Studies of trait-mediated indirect interactions (TMIIs) typically focus on effects higher predators have on per capita consumption by intermediate consumers of a third, basal prey resource. TMIIs are usually evidenced by changes in feeding rates of intermediate consumers and/or differences in densities of this third species. However, understanding and predicting effects of TMIIs on population stability of such basal species requires examination of the type and magnitude of the functional responses exhibited towards them. Here, in a marine intertidal system consisting of a higher-order fish predator, the shanny Lipophrys pholis, an intermediate predator, the amphipod Echinogammarus marinus, and a basal prey resource, the isopod Jaera nordmanni, we detected TMIIs, demonstrating the importance of habitat complexity in such interactions, by deriving functional responses and exploring consequences for prey population stability. Echinogammarus marinus reacted to fish predator diet cues by reducing activity, a typical anti-predator response, but did not alter habitat use. Basal prey, Jaera nordmanni, did not respond to fish diet cues with respect to activity, distribution or aggregation behaviour. Echinogammarus marinus exhibited type II functional responses towards J. nordmanni in simple habitat, but type III functional responses in complex habitat. However, while predator cue decreased the magnitude of the type II functional response in simple habitat, it increased the magnitude of the type III functional response in complex habitat. These findings indicate that, in simple habitats, TMIIs may drive down consumption rates within type II responses; however, this interaction may remain destabilising for prey populations. Conversely, in complex habitats, TMIIs may strengthen regulatory influences of intermediate consumers on prey populations, whilst potentially maintaining prey population stability. We thus highlight that TMIIs can have unexpected and complex ramifications throughout communities, but can be unravelled by considering effects on intermediate predator functional response types and magnitudes.
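For reference, the two response forms contrasted in the study are commonly written in Holling's form; the sketch below evaluates both with illustrative (not fitted) parameter values, showing the decelerating type II curve versus the sigmoidal type III curve that can stabilise prey populations at low density.

```python
import numpy as np

def holling_type_ii(N, a, h):
    """Type II: intake = aN / (1 + a h N); proportional risk highest at low N."""
    return a * N / (1 + a * h * N)

def holling_type_iii(N, b, h):
    """Type III: intake = bN^2 / (1 + b h N^2); risk accelerates with density."""
    return b * N**2 / (1 + b * h * N**2)

# Illustrative parameters (attack rates a, b and handling time h; not from the study)
prey_density = np.linspace(0, 40, 9)
a, b, h = 0.8, 0.05, 0.1

for N, f2, f3 in zip(prey_density,
                     holling_type_ii(prey_density, a, h),
                     holling_type_iii(prey_density, b, h)):
    print(f"N={N:5.1f}  typeII={f2:5.2f}  typeIII={f3:5.2f}")
```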
Abstract:
Measures of icon designs rely heavily on surveys of the perceptions of population samples. Thus, measuring the extent to which changes in the structure of an icon will alter its perceived complexity can be costly and slow. An automated system capable of producing reliable estimates of perceived complexity could reduce development costs and time. Measures of icon complexity developed by Garcia, Badre, and Stasko (1994) and McDougall, Curry, and de Bruijn (1999) were correlated with six icon properties measured using Matlab (MathWorks, 2001) software, which uses image-processing techniques to measure icon properties. The six icon properties measured were icon foreground, the number of objects in an icon, the number of holes in those objects, and two calculations of icon edges and homogeneity in icon structure. The strongest correlates with human judgments of perceived icon complexity (McDougall et al., 1999) were structural variability (r(s) = .65) and edge information (r(s) = .64).
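The image-derived properties described above can be approximated with standard image-processing operations; the sketch below uses Python with NumPy/SciPy rather than the Matlab pipeline of the study and computes foreground size, object count, hole count, and a simple edge measure for a binary icon. The exact definitions used in the paper may differ.

```python
import numpy as np
from scipy import ndimage

def icon_properties(icon):
    """Rough analogues of image-derived icon complexity measures.

    icon -- 2-D boolean array, True = foreground pixels.
    """
    foreground = int(icon.sum())                       # foreground size

    _, n_objects = ndimage.label(icon)                 # connected objects

    # Holes: background components not connected to the image border.
    bg_labels, n_bg = ndimage.label(~icon)
    border = np.unique(np.concatenate([bg_labels[0, :], bg_labels[-1, :],
                                       bg_labels[:, 0], bg_labels[:, -1]]))
    n_holes = n_bg - np.count_nonzero(border)

    # Edge information: count of foreground/background transitions.
    edges = np.abs(np.diff(icon.astype(int), axis=0)).sum() + \
            np.abs(np.diff(icon.astype(int), axis=1)).sum()

    return {"foreground": foreground, "objects": n_objects,
            "holes": n_holes, "edges": int(edges)}

if __name__ == "__main__":
    icon = np.zeros((9, 9), dtype=bool)
    icon[2:7, 2:7] = True          # a filled square ...
    icon[4, 4] = False             # ... with one hole
    print(icon_properties(icon))
```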
Abstract:
A bit-level systolic array system for performing a binary tree vector quantization (VQ) codebook search is described. This is based on a highly regular VLSI building block circuit. The system in question exhibits a very high data rate suitable for a range of real-time applications. A technique is described which reduces the storage requirements of such a system by 50%, with a corresponding decrease in hardware complexity.
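A software sketch of the binary tree codebook search that the systolic array implements in hardware (the array architecture itself is not reproduced here): at each level the input vector is compared with two node centroids and the search descends towards the nearer one, so a codebook of 2^L codewords needs only L vector comparisons. The tree layout and example codebook are illustrative.

```python
import numpy as np

def tree_vq_search(x, node_centroids, leaf_codebook):
    """Binary tree VQ codebook search.

    x              -- input vector to quantise
    node_centroids -- node_centroids[level][node] = (left_centroid, right_centroid)
    leaf_codebook  -- codewords indexed by the leaf reached
    Returns (leaf index, codeword).
    """
    index = 0
    for level in node_centroids:                  # one comparison per tree level
        left, right = level[index]
        go_right = np.sum((x - right) ** 2) < np.sum((x - left) ** 2)
        index = 2 * index + int(go_right)
    return index, leaf_codebook[index]

if __name__ == "__main__":
    # Tiny illustrative codebook of four 2-D codewords.
    leaves = np.array([[0., 0.], [0., 4.], [4., 0.], [4., 4.]])
    # Level 0: one node splitting the codebook in half; level 1: two nodes.
    level0 = [(leaves[:2].mean(axis=0), leaves[2:].mean(axis=0))]
    level1 = [(leaves[0], leaves[1]), (leaves[2], leaves[3])]
    idx, code = tree_vq_search(np.array([3.5, 0.3]), [level0, level1], leaves)
    print(idx, code)
```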