130 results for average complexity
Abstract:
The rejoining kinetics of double-stranded DNA fragments, along with measurements of residual damage after postirradiation incubation, are often used as indicators of the biological relevance of the damage induced by ionizing radiation of different qualities. Although it is widely accepted that high-LET radiation-induced double-strand breaks (DSBs) tend to rejoin with kinetics slower than low-LET radiation-induced DSBs, possibly due to the complexity of the DSB itself, the nature of a slowly rejoining DSB-containing DNA lesion remains unknown. Using an approach that combines pulsed-field gel electrophoresis (PFGE) of fragmented DNA from human skin fibroblasts and a recently developed Monte Carlo simulation of radiation-induced DNA breakage and rejoining kinetics, we have tested the role of DSB-containing DNA lesions in the 8-kbp to 5.7-Mbp fragment size range in determining the DSB rejoining kinetics. It is found that with low-LET X rays or high-LET alpha particles, DSB rejoining kinetics data obtained with PFGE can be computer-simulated assuming that DSB rejoining kinetics does not depend on the spacing of breaks along the chromosomes. After analysis of DNA fragmentation profiles, the rejoining kinetics of X-ray-induced DSBs could be fitted by two components: a fast component with a half-life of 0.9 +/- 0.5 h and a slow component with a half-life of 16 +/- 9 h. For alpha particles, a fast component with a half-life of 0.7 +/- 0.4 h and a slow component with a half-life of 12 +/- 5 h along with a residual fraction of unrepaired breaks accounting for 8% of the initial damage were observed. In summary, it is shown that genomic proximity of breaks along a chromosome does not determine the rejoining kinetics, so the slowly rejoining breaks induced with higher frequencies after exposure to high-LET radiation (0.37 +/- 0.12) relative to low-LET radiation (0.22 +/- 0.07) can be explained on the basis of lesion complexity at the nanometer scale, known as locally multiply damaged sites.
(c) 2005 by Radiation Research Society.
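The two-component rejoining kinetics reported above can be sketched numerically. This is a minimal illustrative model, not the authors' Monte Carlo simulation: the half-lives, the slow-component fraction (0.37) and the 8% residual are taken from the abstract, while the function name and the fast-component weight (chosen so the fractions sum to 1) are assumptions.

```python
import math

def fraction_remaining(t, components, residual=0.0):
    """Fraction of initial DSBs still unrejoined at time t (hours).

    components: list of (weight, half_life_h) pairs; the weights plus
    the residual fraction are assumed to sum to 1.
    """
    repairable = sum(w * math.exp(-math.log(2) * t / t_half)
                     for w, t_half in components)
    return residual + repairable

# Illustrative alpha-particle parameters from the abstract:
# fast t1/2 = 0.7 h, slow t1/2 = 12 h (weight 0.37), 8% residual.
alpha = [(1 - 0.37 - 0.08, 0.7), (0.37, 12.0)]
print(fraction_remaining(2.0, alpha, residual=0.08))
```

At long times the curve flattens at the residual fraction, reproducing the unrepaired-break plateau the abstract describes for high-LET exposure.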
Abstract:
This paper offers a new insight into how organizations engage with external complexity. It applies a political action perspective that draws attention to the hitherto neglected question of how the relative power organizational leaders enjoy within their environments is significant for the actions they can take on behalf of their organizations when faced with external complexity. It identifies cognitive and relational complexity as two dimensions of the environment with which organizations have to engage. It proposes three modes whereby organizations may engage with environmental complexity that are conditioned by an organization's power within its environment. It also considers the intention associated with each mode, as well as the implications of these modes of engagement for how an organization can learn about its environment and for the use of rationality and intuition in its strategic decision-making. The closing discussion considers how this analysis integrates complexity and political action perspectives in a way that contributes to theoretical development and provides the basis for a dynamic political co-evolutionary approach. © The Author(s) 2011.
Abstract:
The departure point for this investigation is to highlight the centrality of regulation theory as a praxis in planning enforcement. The value of the conceptual framework is demonstrated by application in the problematic arena of conservation regulatory compliance, where there is currently a dearth of investigation. It is evidenced that this thematic approach provides a lens to scrutinise problematic areas of control and provides a deeper understanding of the difficulties faced by planning enforcement operational practice generally and heritage regimes specifically. The utility of the proposed mechanism is that it remedies the current well documented pitfalls of disjointed, piecemeal strategies by providing a framework for robust, coherent decision making not only in planning but in the wider regulatory arena.
Abstract:
The influence of predation in structuring ecological communities can be informed by examining the shape and magnitude of the functional response of predators towards prey. We derived functional responses of the ubiquitous intertidal amphipod Echinogammarus marinus towards one of its preferred prey species, the isopod Jaera nordmanni. First, we examined the form of the functional response where prey were replaced following consumption, as compared to the usual experimental design where prey density in each replicate is allowed to deplete. E. marinus exhibited Type II functional responses, i.e. inversely density-dependent predation of J. nordmanni that increased linearly with prey availability at low densities, but decreased with further prey supply. In both prey replacement and non-replacement experiments, handling times and maximum feeding rates were similar. The non-replacement design underestimated attack rates compared to when prey were replaced. We then compared the use of Holling's disc equation (assuming constant prey density) with the more appropriate Rogers' random predator equation (accounting for prey depletion) using the prey non-replacement data. Rogers' equation returned significantly greater attack rates but lower maximum feeding rates, indicating that model choice has significant implications for parameter estimates. We then manipulated habitat complexity and found significantly reduced predation by the amphipod in complex as opposed to simple habitat structure. Further, the functional response changed from a Type II in simple habitats to a sigmoidal, density-dependent Type III response in complex habitats, which may impart stability on the predator–prey interaction. Enhanced habitat complexity returned significantly lower attack rates, higher handling times and lower maximum feeding rates. These findings illustrate the sensitivity of the functional response to variations in prey supply, model selection and habitat complexity and, further, that E. marinus could potentially determine the local exclusion and persistence of prey through habitat-mediated changes in its predatory functional responses.
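The two models compared above can be sketched as follows. This is an illustrative stdlib-only sketch, not the authors' fitting procedure: the parameter values are invented, and Rogers' implicit equation is solved here by fixed-point iteration rather than the Lambert W function commonly used.

```python
import math

def holling_type2(N0, a, h, T=1.0):
    """Holling's disc equation: prey eaten over time T, assuming
    constant prey density N0 (attack rate a, handling time h)."""
    return a * N0 * T / (1 + a * h * N0)

def rogers_random_predator(N0, a, h, T=1.0, iters=200):
    """Rogers' random predator equation, which accounts for prey
    depletion during the trial. Solves the implicit form
        Ne = N0 * (1 - exp(a * (h * Ne - T)))
    by fixed-point iteration."""
    Ne = 0.0
    for _ in range(iters):
        Ne = N0 * (1.0 - math.exp(a * (h * Ne - T)))
    return Ne

# With identical parameters, Rogers' equation predicts fewer prey
# eaten than the disc equation, because the prey pool shrinks as
# the trial proceeds.
print(holling_type2(10.0, 0.5, 0.1), rogers_random_predator(10.0, 0.5, 0.1))
```

This difference in predicted consumption at the same parameter values is exactly why, when fitted to the same depletion data, the two models return different attack-rate and maximum-feeding-rate estimates.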
Abstract:
The characterization and the definition of the complexity of objects is an important but very difficult problem that attracted much interest in many different fields. In this paper we introduce a new measure, called network diversity score (NDS), which allows us to quantify structural properties of networks. We demonstrate numerically that our diversity score is capable of distinguishing ordered, random and complex networks from each other and, hence, allowing us to categorize networks with respect to their structural complexity. We study 16 additional network complexity measures and find that none of these measures has similar good categorization capabilities. In contrast to many other measures suggested so far aiming for a characterization of the structural complexity of networks, our score is different for a variety of reasons. First, our score is multiplicatively composed of four individual scores, each assessing different structural properties of a network. That means our composite score reflects the structural diversity of a network. Second, our score is defined for a population of networks instead of individual networks. We will show that this removes an unwanted ambiguity, inherently present in measures that are based on single networks. In order to apply our measure practically, we provide a statistical estimator for the diversity score, which is based on a finite number of samples.
Abstract:
To enable reliable data transfer in next generation Multiple-Input Multiple-Output (MIMO) communication systems, terminals must be able to react to fluctuating channel conditions by having flexible modulation schemes and antenna configurations. This creates a challenging real-time implementation problem: to provide the high performance required of cutting-edge MIMO standards, such as 802.11n, with the flexibility for this behavioural variability. FPGA softcore processors offer a solution to this problem, and in this paper we show how heterogeneous SISD/SIMD/MIMD architectures can enable programmable multicore architectures on FPGA with performance and cost similar to those of traditional dedicated circuit-based architectures. When applied to a 4×4 16-QAM Fixed-Complexity Sphere Decoder (FSD) detector, we present the first soft-processor-based solution for real-time 802.11n MIMO.
Abstract:
Acidity peaks in Greenland ice cores have been used as critical reference horizons for synchronizing ice core records, aiding the construction of a single Greenland Ice Core Chronology (GICC05) for the Holocene. Guided by GICC05, we examined sub-sections of three Greenland cores in the search for tephra from specific eruptions that might facilitate the linkage of ice core records, the dating of prehistoric tephras and the understanding of the eruptions. Here we report the identification of 14 horizons with tephra particles, including 11 that have not previously been reported from the North Atlantic region and that have the potential to be valuable isochrons. The positions of tephras whose major element data are consistent with ash from the Katmai AD 1912 and Öraefajökull AD 1362 eruptions confirm the annually resolved ice core chronology for the last 700 years. We provide a more refined date for the so-called “AD860B” tephra, a widespread isochron found across NW Europe, and present new evidence relating to the 17th century BC Thera/Aniakchak debate that shows N. American eruptions likely contributed to the acid signals at this time. Our results emphasize the variable spatial and temporal distributions of volcanic products in Greenland ice that call for a more cautious approach in the attribution of acid signals to specific eruptive events.
Abstract:
We investigate the computational complexity of testing dominance and consistency in CP-nets. Previously, the complexity of dominance has been determined for restricted classes in which the dependency graph of the CP-net is acyclic. However, there are preferences of interest that define cyclic dependency graphs; these are modeled with general CP-nets. In our main results, we show here that both dominance and consistency for general CP-nets are PSPACE-complete. We then consider the concept of strong dominance, dominance equivalence and dominance incomparability, and several notions of optimality, and identify the complexity of the corresponding decision problems. The reductions used in the proofs are from STRIPS planning, and thus reinforce the earlier established connections between both areas.
Abstract:
Biodiversity may be seen as a scientific measure of the complexity of a biological system, implying an information basis. Complexity cannot be directly valued, so economists have tried to define the services it provides, though often just valuing the services of 'key' species. Here we provide a new definition of biodiversity as a measure of functional information, arguing that complexity embodies meaningful information as Gregory Bateson defined it. We argue that functional information content (FIC) is the potentially valuable component of total (algorithmic) information content (AIC), as it alone determines biological fitness and supports ecosystem services. Inspired by recent extensions to the Noah's Ark problem, we show how FIC/AIC can be calculated to measure the degree of substitutability within an ecological community. Establishing substitutability is an essential foundation for valuation. From it, we derive a way to rank whole communities by Indirect Use Value, through quantifying the relation between system complexity and the production rate of ecosystem services. Understanding biodiversity as information evidently serves as a practical interface between economics and ecological science. © 2012 Elsevier B.V.
Abstract:
Measures of icon designs rely heavily on surveys of the perceptions of population samples. Thus, measuring the extent to which changes in the structure of an icon will alter its perceived complexity can be costly and slow. An automated system capable of producing reliable estimates of perceived complexity could reduce development costs and time. Measures of icon complexity developed by Garcia, Badre, and Stasko (1994) and McDougall, Curry, and de Bruijn (1999) were correlated with six icon properties measured using Matlab (MathWorks, 2001) software, which uses image-processing techniques to measure icon properties. The six icon properties measured were icon foreground, the number of objects in an icon, the number of holes in those objects, and two calculations of icon edges and homogeneity in icon structure. The strongest correlates with human judgments of perceived icon complexity (McDougall et al., 1999) were structural variability (r(s) = .65) and edge information (r(s) = .64).
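Two of the icon properties described above can be approximated on a small binary bitmap. This is an illustrative stdlib-only sketch under assumed definitions, not the Matlab image-processing pipeline used in the study: the function names and the transition-counting proxy for edge information are assumptions.

```python
def foreground_fraction(icon):
    """Share of pixels belonging to the icon foreground (value 1)."""
    total = sum(len(row) for row in icon)
    return sum(map(sum, icon)) / total

def edge_count(icon):
    """Count horizontal and vertical 0/1 transitions between
    neighbouring pixels: a crude proxy for edge information."""
    edges = 0
    rows, cols = len(icon), len(icon[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols and icon[r][c] != icon[r][c + 1]:
                edges += 1
            if r + 1 < rows and icon[r][c] != icon[r + 1][c]:
                edges += 1
    return edges

# A 4x4 icon with a 2x2 filled square in one corner.
icon = [[1, 1, 0, 0],
        [1, 1, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
print(foreground_fraction(icon), edge_count(icon))  # -> 0.25 4
```

An automated pipeline of this kind is what would let such structural measures stand in for slow, costly human complexity ratings.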