19 results for 2-adic complexity


Relevance:

90.00%

Publisher:

Abstract:

Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems means that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. However, there is little consensus on exactly what constitutes a reduced complexity model and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) The system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.

Relevance:

30.00%

Publisher:

Abstract:

Potential explanatory variables often co-vary in studies of species richness. Where topography varies within a survey it is difficult to separate area and habitat-diversity effects. Topographically complex surfaces may contain more species due to increased habitat diversity or as a result of increased area per se. Fractal geometry can be used to adjust species richness estimates to control for increases in area on complex surfaces. Application of fractal techniques to a survey of rocky shores demonstrated an unambiguous area-independent effect of topography on species richness in the Isle of Man. In contrast, variation in species richness in south-west England reflected surface availability alone. Multivariate tests and variation in limpet abundances also demonstrated regional variation in the area-independent effects of topography. Community composition did not vary with increasing surface complexity in south-west England. These results suggest large-scale gradients in the effects of heterogeneity on community processes or demography.
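
As a rough illustration of the kind of area correction mentioned above, the sketch below estimates a fractal dimension for a synthetic shore profile using the divider (ruler) method; the profile, the step sizes and the NumPy implementation are illustrative assumptions, not the survey's field protocol.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic shore-profile heights sampled at unit spacing (random walk as a rough surface).
heights = np.cumsum(rng.normal(0, 0.5, 1024))

def profile_length(heights, step):
    """Apparent profile length when measured with a ruler spanning `step` samples."""
    pts = heights[::step]
    return np.sum(np.hypot(step, np.diff(pts)))

steps = np.array([1, 2, 4, 8, 16, 32])
lengths = np.array([profile_length(heights, s) for s in steps])

# For a fractal profile L(s) ~ s**(1 - D), so D follows from the log-log slope.
slope, _ = np.polyfit(np.log(steps), np.log(lengths), 1)
print(f"estimated fractal dimension of the profile: {1.0 - slope:.2f}")
```

Surfaces with a higher estimated dimension expose more area per unit of planar extent, which is the basis for rescaling species richness before testing for area-independent habitat effects.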

Relevance:

30.00%

Publisher:

Abstract:

The present report investigates the role of formate species as potential reaction intermediates for the WGS reaction (CO + H2O -> CO2 + H2) over a Pt-CeO2 catalyst. A combination of operando techniques, i.e., in situ diffuse reflectance FT-IR (DRIFT) spectroscopy and mass spectrometry (MS) during steady-state isotopic transient kinetic analysis (SSITKA), was used to relate the exchange of the reaction product CO2 to that of surface formate species. The data presented here suggest that a switchover from a non-formate to a formate-based mechanism could take place over a very narrow temperature range (as low as 60 K) over our Pt-CeO2 catalyst. This observation clearly stresses the need to avoid extrapolating conclusions to results obtained under even slightly different experimental conditions. The occurrence of a low-temperature mechanism, possibly redox or Mars-van Krevelen-like, that deactivates above 473 K because of ceria over-reduction is suggested as a possible explanation for the switchover, similarly to the case of the CO-NO reaction over Cu, Pd and Rh-CeZrOx (see Kaspar and co-workers [1-3]). (c) 2006 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

We introduce a novel graph class we call universal hierarchical graphs (UHG), whose topology appears in numerous problems representing, e.g., temporal, spatial or general process structures of systems. For this graph class we show that we can naturally assign two probability distributions, for nodes and for edges, which lead directly to the definition of the entropy and joint entropy and, hence, mutual information, establishing an information theory for this graph class. Furthermore, we provide some results on the conditions under which these constrained probability distributions maximize the corresponding entropy. We also demonstrate that these entropic measures can be computed efficiently, which is a prerequisite for any large-scale practical application, and show some numerical examples. (c) 2007 Elsevier Inc. All rights reserved.
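
As a loose illustration of assigning a probability distribution to a graph and measuring its entropy, the sketch below uses a degree-based node distribution on a small hierarchical toy graph; the choice of distribution and the example graph are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def node_entropy(adjacency):
    """Shannon entropy (bits) of a degree-based node distribution (illustrative only)."""
    degrees = adjacency.sum(axis=1)
    p = degrees / degrees.sum()          # probability assigned to each node
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Toy hierarchical graph: a root connected to two children, each with two leaves.
A = np.zeros((7, 7))
edges = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]
for i, j in edges:
    A[i, j] = A[j, i] = 1
print(node_entropy(A))
```

An analogous distribution over edges, combined with the node distribution, would give the joint entropy and mutual information discussed in the abstract.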

Relevance:

30.00%

Publisher:

Abstract:

How best to predict the effects of perturbations to ecological communities has been a long-standing goal for both applied and basic ecology. This quest has recently been revived by new empirical data, new analysis methods, and increased computing speed, with the promise that ecologically important insights may be obtainable from a limited knowledge of community interactions. We use empirically based and simulated networks of varying size and connectance to assess two limitations to predicting perturbation responses in multispecies communities: (1) the inaccuracy by which species interaction strengths are empirically quantified and (2) the indeterminacy of species responses due to indirect effects associated with network size and structure. We find that even modest levels of species richness and connectance (~25 pairwise interactions) impose high requirements for interaction strength estimates because system indeterminacy rapidly overwhelms predictive insights. Nevertheless, even poorly estimated interaction strengths provide greater average predictive certainty than an approach that uses only the sign of each interaction. Our simulations provide guidance in dealing with the trade-offs involved in maximizing the utility of network approaches for predicting dynamics in multispecies communities.
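
A minimal sketch of the underlying prediction problem, assuming the classic press-perturbation result that long-term responses are given by the negative inverse of the community (interaction) matrix: interaction strengths are corrupted with measurement error and the fraction of correctly predicted response signs is recorded. The matrix size and error levels are illustrative, not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
A = rng.normal(0, 0.5, (n, n))           # "true" per-capita interaction strengths
np.fill_diagonal(A, -1.0)                 # self-limitation keeps the system stable

true_response = -np.linalg.inv(A)         # press-perturbation responses (classic result)

# Re-estimate interactions with measurement error and compare predicted response signs.
A_est = A + rng.normal(0, 0.2, (n, n))
est_response = -np.linalg.inv(A_est)

sign_agreement = np.mean(np.sign(true_response) == np.sign(est_response))
print(f"fraction of response signs predicted correctly: {sign_agreement:.2f}")
```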

Relevance:

30.00%

Publisher:

Abstract:

Modern Multiple-Input Multiple-Output (MIMO) communication systems place huge demands on embedded processing resources in terms of throughput, latency and resource utilization. State-of-the-art MIMO detector algorithms, such as Fixed-Complexity Sphere Decoding (FSD), rely on efficient channel preprocessing involving numerous calculations of the pseudo-inverse of the channel matrix by QR Decomposition (QRD) and ordering. These highly complicated operations can quickly become the critical prerequisite for real-time MIMO detection, a problem that is exacerbated as the number of antennas in a MIMO detector increases. This paper describes a sorted QR decomposition (SQRD) algorithm extended for FSD, which significantly reduces the complexity and latency of this preprocessing step and increases the throughput of MIMO detection. It merges the calculations of the QRD and ordering operations to avoid multiple iterations of QRD. Specifically, it shows that SQRD reduces the computational complexity by 60-70% when compared to conventional MIMO preprocessing algorithms. In 4x4 to 7x7 MIMO cases, the approach suffers a mere 0.16-0.2 dB reduction in Bit Error Rate (BER) performance.
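
A minimal sketch of a sorted QR decomposition (modified Gram-Schmidt with minimum-norm column pivoting), which is the general SQRD idea referred to above; the specific ordering rule used for FSD in the paper may differ.

```python
import numpy as np

def sorted_qrd(H):
    """Sorted QR decomposition: modified Gram-Schmidt with min-norm column pivoting."""
    H = H.astype(complex)
    m, n = H.shape
    Q = H.copy()
    R = np.zeros((n, n), dtype=complex)
    perm = np.arange(n)
    for i in range(n):
        # Bring the remaining column with the smallest residual norm to position i.
        k = i + np.argmin(np.linalg.norm(Q[:, i:], axis=0))
        Q[:, [i, k]] = Q[:, [k, i]]
        R[:, [i, k]] = R[:, [k, i]]
        perm[[i, k]] = perm[[k, i]]
        R[i, i] = np.linalg.norm(Q[:, i])
        Q[:, i] /= R[i, i]
        for j in range(i + 1, n):
            R[i, j] = Q[:, i].conj() @ Q[:, j]
            Q[:, j] -= R[i, j] * Q[:, i]
    return Q, R, perm

H = (np.random.randn(4, 4) + 1j * np.random.randn(4, 4)) / np.sqrt(2)
Q, R, perm = sorted_qrd(H)
print(np.allclose(Q @ R, H[:, perm]))    # reconstruction check on the permuted channel
```

Because the ordering is decided inside the single factorisation pass, no additional QR decompositions are needed for candidate orderings, which is the source of the complexity saving described above.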

Relevance:

30.00%

Publisher:

Abstract:

Measures of icon designs rely heavily on surveys of the perceptions of population samples. Thus, measuring the extent to which changes in the structure of an icon will alter its perceived complexity can be costly and slow. An automated system capable of producing reliable estimates of perceived complexity could reduce development costs and time. Measures of icon complexity developed by Garcia, Badre, and Stasko (1994) and McDougall, Curry, and de Bruijn (1999) were correlated with six icon properties measured using Matlab (MathWorks, 2001) software, which uses image-processing techniques to measure icon properties. The six icon properties measured were icon foreground, the number of objects in an icon, the number of holes in those objects, two calculations of icon edges, and homogeneity in icon structure. The strongest correlates with human judgments of perceived icon complexity (McDougall et al., 1999) were structural variability (r(s) = .65) and edge information (r(s) = .64).
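
An illustrative Python re-implementation of three of the listed properties (foreground proportion, object count and a crude edge measure) for a binarised icon; the original study used Matlab routines, and the threshold and toy icon below are assumptions for illustration, so the values are not directly comparable with the published metrics.

```python
import numpy as np
from scipy import ndimage

def icon_properties(gray_icon, threshold=0.5):
    """Foreground proportion, object count and a simple edge measure for a grayscale icon."""
    binary = gray_icon < threshold                   # dark pixels treated as foreground
    foreground = binary.mean()                       # proportion of foreground pixels
    _, n_objects = ndimage.label(binary)             # connected foreground components
    edges = np.abs(np.diff(binary.astype(int), axis=0)).sum() + \
            np.abs(np.diff(binary.astype(int), axis=1)).sum()
    edge_density = edges / binary.size               # crude edge-information measure
    return foreground, n_objects, edge_density

icon = np.ones((32, 32))
icon[8:24, 8:24] = 0.0                               # a single dark square "icon"
print(icon_properties(icon))
```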

Relevance:

30.00%

Publisher:

Abstract:

We define a multi-modal version of Computation Tree Logic (ctl) by extending the language with path quantifiers E_d and A_d, where d denotes one of finitely many dimensions, interpreted over Kripke structures with one total relation for each dimension. As expected, the logic is axiomatised by taking a copy of a ctl axiomatisation for each dimension. Completeness is proved by employing the completeness result for ctl to obtain a model along each dimension in turn. We also show that the logic is decidable and that its satisfiability problem is no harder than the corresponding problem for ctl. We then demonstrate how Normative Systems can be conceived as a natural interpretation of such a multi-dimensional ctl logic. © 2009 Springer Science+Business Media B.V.
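
A small sketch of the intended semantics, assuming the usual reading of the indexed operator: E_d X phi holds at a state if some successor along dimension d satisfies phi. The toy Kripke structure below is an assumption for illustration, not taken from the paper.

```python
# Kripke structure with one total successor relation per dimension.
relations = {
    "time":  {0: {1}, 1: {2}, 2: {2}},
    "space": {0: {0, 1}, 1: {0}, 2: {1}},
}
labels = {0: {"p"}, 1: {"q"}, 2: {"p", "q"}}

def sat_EX(dim, phi_states):
    """States satisfying E_dim X phi: some dim-successor lies in phi_states."""
    return {s for s, succs in relations[dim].items() if succs & phi_states}

p_states = {s for s, props in labels.items() if "p" in props}
print(sat_EX("time", p_states))    # states with a time-successor satisfying p
```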

Relevance:

30.00%

Publisher:

Abstract:

The regulation of the small GTPases leading to their membrane localization has long been attributed to processing of their C-terminal CAAX box. As deregulation of many of these GTPases has been implicated in cancer and other disorders, prenylation and methylation of this CAAX box have been studied in depth as possible drug targets, but to date no such drug has proved clinically beneficial. However, these GTPases also undergo other modifications that may be important for their regulation. Ubiquitination has long been known to regulate the fate of numerous cellular proteins, and it has recently become apparent that many GTPases, along with their GAPs, GEFs and GDIs, undergo ubiquitination leading to a variety of fates such as re-localization or degradation. In this review we focus on the recent literature demonstrating that the regulation of small GTPases by ubiquitination, either directly or indirectly, plays a considerable role in controlling their function and that targeting these modifications could be important for disease treatment.

Relevance:

30.00%

Publisher:

Abstract:

Voltage-gated sodium channels (VGSCs) play a crucial role in epilepsy. The expression of different VGSC subtypes varies across animal models of epilepsy, which may reflect their multiple phenotypes or the complexity of the mechanisms of epilepsy. In a previous study, we reported that NaV1.1 and NaV1.3 were up-regulated in the hippocampus of the spontaneously epileptic rat (SER). In this study, we further analyzed both the expression and distribution of the typical VGSC subtypes NaV1.1, NaV1.2, NaV1.3 and NaV1.6 in the hippocampus and the temporal lobe cortex of two genetic epileptic animal models: the SER and the tremor rat (TRM). The expression of calmodulin (CaM) and calmodulin-dependent protein kinase II (CaMKII) was also analyzed to assess the role of the CaM/CaMKII pathway in these two models of epilepsy. Increased expression of the four VGSC subtypes and CaM, accompanied by a decrease in CaMKII, was observed in the hippocampus of both the SERs and the TRM rats. In the temporal lobe cortex, by contrast, expression of the VGSC subtypes and CaM was decreased while CaMKII was elevated. Double-labeled immunofluorescence data suggested that in SERs and TRM rats the four VGSC subtype proteins were present throughout the CA1, CA3 and dentate gyrus regions of the hippocampus and the temporal lobe cortex, and were co-localized in neurons with CaM. These data represent the first evidence of abnormal changes in the expression of four VGSC subtypes (NaV1.1, NaV1.2, NaV1.3 and NaV1.6) and of CaM/CaMKII in the hippocampus and temporal lobe cortex of SERs and TRM rats. These changes may be involved in the generation of epileptiform activity and underlie the observed seizure phenotype in these rat models of genetic epilepsy.

Relevance:

30.00%

Publisher:

Abstract:

The motivation for this study was to reduce the physics workload relating to patient-specific quality assurance (QA). VMAT plan delivery accuracy was determined from analysis of pre- and on-treatment trajectory log files and phantom-based ionization chamber array measurements. The correlation in this combination of measurements for patient-specific QA was investigated. The relationship between delivery errors and plan complexity was investigated as a potential method to further reduce patient-specific QA workload. Thirty VMAT plans from three treatment sites - prostate only, prostate and pelvic node (PPN), and head and neck (H&N) - were retrospectively analyzed in this work. The 2D fluence delivery reconstructed from pretreatment and on-treatment trajectory log files was compared with the planned fluence using gamma analysis. Pretreatment dose delivery verification was also carried out using gamma analysis of ionization chamber array measurements compared with calculated doses. Pearson correlations were used to explore any relationship between trajectory log file (pretreatment and on-treatment) and ionization chamber array gamma results (pretreatment). Plan complexity was assessed using the MU/arc and the modulation complexity score (MCS), with Pearson correlations used to examine any relationships between complexity metrics and plan delivery accuracy. Trajectory log files were also used to further explore the accuracy of MLC and gantry positions. Pretreatment 1%/1 mm gamma passing rates for trajectory log file analysis were 99.1% (98.7%-99.2%), 99.3% (99.1%-99.5%), and 98.4% (97.3%-98.8%) (median (IQR)) for prostate, PPN, and H&N, respectively, and were significantly correlated to on-treatment trajectory log file gamma results (R = 0.989, p < 0.001). Pretreatment ionization chamber array (2%/2 mm) gamma results were also significantly correlated with on-treatment trajectory log file gamma results (R = 0.623, p < 0.001). Furthermore, all gamma results displayed a significant correlation with MCS (R > 0.57, p < 0.001), but not with MU/arc. Average MLC position and gantry angle errors were 0.001 ± 0.002 mm and 0.025° ± 0.008° over all treatment sites and were not found to affect delivery accuracy. However, variability in MLC speed was found to be directly related to MLC position accuracy. The accuracy of VMAT plan delivery assessed using pretreatment trajectory log file fluence delivery and ionization chamber array measurements was strongly correlated with on-treatment trajectory log file fluence delivery. The strong correlation between trajectory log file and phantom-based gamma results demonstrates potential to reduce our current patient-specific QA. Additionally, insight into MLC and gantry position accuracy through trajectory log file analysis and the strong correlation between gamma analysis results and the MCS could also provide further methodologies to optimize both the VMAT planning and QA processes.
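
A simplified sketch of the global gamma passing-rate calculation referred to above (dose difference combined with distance-to-agreement, brute-force search over the evaluated grid); clinical gamma analysis relies on dedicated software, interpolation and local/global options that this illustration omits, and the grids and criteria below are assumptions.

```python
import numpy as np

def gamma_pass_rate(reference, evaluated, spacing_mm, dose_crit, dist_mm, threshold=0.1):
    """Global gamma passing rate (%) between two dose/fluence maps; brute-force search."""
    ny, nx = reference.shape
    ys, xs = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    coords = np.stack([ys.ravel(), xs.ravel()], axis=1) * spacing_mm
    ref_flat, eval_flat = reference.ravel(), evaluated.ravel()
    max_dose = reference.max()
    gammas = []
    for i in np.flatnonzero(ref_flat > threshold * max_dose):   # ignore the low-dose region
        dist2 = ((coords - coords[i]) ** 2).sum(axis=1) / dist_mm ** 2
        dose2 = ((eval_flat - ref_flat[i]) / (dose_crit * max_dose)) ** 2
        gammas.append(np.sqrt((dist2 + dose2).min()))
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

planned = np.random.rand(40, 40)
delivered = planned + np.random.normal(0, 0.01, planned.shape)
# 1%/1 mm global criterion, analogous to the trajectory log file analysis described above.
print(gamma_pass_rate(planned, delivered, spacing_mm=1.0, dose_crit=0.01, dist_mm=1.0))
```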

Relevance:

30.00%

Publisher:

Abstract:

The current study reports original vapour-liquid equilibrium (VLE) data for the system {CO2 (1) + 1-chloropropane (2)}. The measurements have been performed over the entire pressure-composition range for the (303.15, 313.15 and 328.15) K isotherms. The values obtained have been used for comparison of four predictive approaches, namely the equation of state (EoS) of Peng and Robinson (PR), the Soave modification of the Benedict-Webb-Rubin (SBWR) EoS, the Critical Point-based Revised Perturbed-Chain Statistical Associating Fluid Theory (CP-PC-SAFT) EoS, and the Conductor-like Screening Model for Real Solvents (COSMO-RS). It has been demonstrated that the three EoS under consideration yield similar and qualitatively accurate predictions of VLE, which is not the case for the COSMO-RS model examined. Although CP-PC-SAFT EoS exhibits only minor superiority over PR and SBWR EoS in predicting VLE in the system under consideration, its relative complexity can be justified when taking into account the entire thermodynamic phase space and, in particular, the liquid densities and sound velocities over a wider pressure-volume-temperature range.
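
A minimal sketch of the Peng-Robinson EoS for a pure component (CO2), computing pressure from temperature and molar volume; the study itself treats the binary mixture, so mixing rules and binary interaction parameters are omitted here, and the critical constants below are textbook values used only for illustration.

```python
import numpy as np

R = 8.314  # J mol^-1 K^-1

def pr_pressure(T, v, Tc, Pc, omega):
    """Peng-Robinson pressure (Pa) of a pure fluid at temperature T (K) and molar volume v (m3/mol)."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
    return R * T / (v - b) - a * alpha / (v * (v + b) + b * (v - b))

# Pure CO2 (Tc = 304.13 K, Pc = 7.377 MPa, omega = 0.224) at 313.15 K, one of the isotherms above.
print(pr_pressure(313.15, 1.0e-3, 304.13, 7.377e6, 0.224) / 1e5, "bar")
```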

Relevance:

30.00%

Publisher:

Abstract:

Multicarrier Index Keying (MCIK) is a recently developed technique that modulates not only subcarriers but also the indices of the subcarriers. In this paper a novel low-complexity detection scheme for subcarrier indices is proposed for an MCIK system, offering a substantial reduction in complexity over optimal maximum likelihood (ML) detection. For the performance evaluation, a closed-form expression for the pairwise error probability (PEP) of an active subcarrier index and a tight approximation of the average PEP of multiple subcarrier indices are derived. The theoretical outcomes are validated using simulations, with a difference of less than 0.1 dB. Compared to the optimal ML, the proposed detection achieves a substantial reduction in complexity with a small loss in error performance (<= 0.6 dB).
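
A hedged sketch of one common low-complexity index detector for index modulation: pick the active subcarriers with the largest channel-compensated energy instead of searching all index combinations as ML does. The block size, symbol alphabet and detection metric below are illustrative assumptions, not necessarily the detector proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

n_sub, n_active = 8, 2                       # subcarriers per block, active indices
snr_db = 15
noise_var = 10 ** (-snr_db / 10)

active = rng.choice(n_sub, n_active, replace=False)      # transmitted index pattern
x = np.zeros(n_sub, dtype=complex)
x[active] = 1.0                                           # unit-energy symbols on active tones

h = (rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub)) / np.sqrt(2)   # Rayleigh channel
noise = (rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub)) * np.sqrt(noise_var / 2)
y = h * x + noise

# Low-complexity index detection: keep the n_active subcarriers with the largest
# channel-compensated energy, avoiding the combinatorial search performed by ML.
metric = np.abs(y * np.conj(h)) ** 2
detected = np.sort(np.argsort(metric)[-n_active:])
print(sorted(active), list(detected))
```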