753 results for Capacity limits
Abstract:
Capacity dimensioning is one of the key problems in wireless network planning. Analytical and simulation methods are typically used to achieve accurate capacity dimensioning of wireless networks. In this paper, an analytical capacity dimensioning method for WCDMA with high-speed wireless links is proposed, based on an analysis of the relations between system performance and high-speed wireless transmission technologies such as H-ARQ, AMC and fast scheduling. It evaluates system capacity through closed-form expressions at both the link level and the system level. Numerical results show that the proposed method can calculate link-level and system-level capacity for a WCDMA system with HSDPA and HSUPA.
Abstract:
Many in vitro systems used to examine multipotential neural progenitor cells (NPCs) rely on mitogens including fibroblast growth factor 2 (FGF2) for their continued expansion. However, FGF2 has also been shown to alter the expression of transcription factors (TFs) that determine cell fate. Here, we report that NPCs from the embryonic telencephalon grown without FGF2 retain many of their in vivo characteristics, making them a good model for investigating molecular mechanisms involved in cell fate specification and differentiation. However, exposure of cortical NPCs to FGF2 results in a profound change in the types of neurons generated, switching them from a glutamatergic to a GABAergic phenotype. This change closely correlates with the dramatic upregulation of TFs more characteristic of ventral telencephalic NPCs. In addition, exposure of cortical NPCs to FGF2 maintains their neurogenic potential in vitro, and NPCs spontaneously undergo differentiation following FGF2 withdrawal. These results highlight the importance of TFs in determining the types of neurons generated by NPCs in vitro. In addition, they show that FGF2, as well as acting as a mitogen, changes the developmental capabilities of NPCs. These findings have implications for the cell fate specification of in vitro-expanded NPCs and their ability to generate specific cell types for therapeutic applications. Disclosure of potential conflicts of interest is found at the end of this article.
Abstract:
Over the last decade issues related to the financial viability of development have become increasingly important to the English planning system. As part of a wider shift towards the compartmentalisation of planning tasks, expert consultants are required to quantify, in an attempt to rationalise, planning decisions in terms of economic ‘viability’. Often with a particular focus on planning obligations, the results of development viability modelling have emerged as a key part of the evidence base used in site-specific negotiations and in planning policy formation. Focussing on the role of clients and other stakeholders, this paper investigates how development viability is tested in practice. It draws together literature on the role of calculative practices in policy formation, client feedback and influence in real estate appraisals and stakeholder engagement and consultation in the planning literature to critically evaluate the role of clients and other interest groups in influencing the production and use of development viability appraisal models. The paper draws upon semi-structured interviews with the main producers of development viability appraisals to conclude that, whilst appraisals have the potential to be biased by client and stakeholder interests, there are important controlling influences on potential opportunistic behaviour. One such control is local authorities’ weak understanding of development viability appraisal techniques which limits their capacity to question the outputs of appraisal models. However, this also is of concern given that viability is now a central feature of the town planning system.
Abstract:
Demand Side Response (DSR) has been slow to emerge in European electricity markets. This paper aims both to examine the reasons for low levels of DSR in Europe and to reflect on factors that might affect the participation of DSR in capacity mechanisms. It relies on available evidence from the literature, secondary data on existing DSR programmes and energy aggregators' data from industries participating in DSR. Findings show that changes to the duration of contracted loads under existing or new programmes might increase the penetration of DSR. The introduction of capacity mechanisms may increase DSR from demand turn-down if longer response times were available.
Abstract:
Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations; yet the method has two problems. An individual may guess the correct answer at any concentration step, and may detect correctly at low concentrations but become adapted or fatigued at higher ones. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration. The technique reduces the probability that consecutive correct answers arise by chance. Adjusted sequences are submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold study and a taste threshold study. It resulted in group thresholds similar to those from ASTM or logarithmic regression procedures. Significant differences in taste thresholds between younger and older adults were determined. The approach provides a more robust technique than previous estimation methods.
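The sequence-to-survival-curve idea above can be sketched as follows. This is an illustrative toy, not the authors' exact adjustment procedure: the function name, the "first sustained correct step" event definition, and the example data are all assumptions.

```python
# Illustrative sketch: estimate a group detection threshold from ascending
# forced-choice data by treating each panellist's first sustained-correct
# concentration step as an "event" and reading the threshold off the
# empirical survival curve (fraction of panellists not yet detecting).

def group_threshold(first_correct_steps, concentrations):
    """first_correct_steps: for each panellist, the index of the step at
    which a sustained run of correct answers begins.
    Returns the first concentration at which the surviving (not-yet-
    detecting) fraction drops to 0.5 or below, or None if it never does."""
    n = len(first_correct_steps)
    for i, conc in enumerate(concentrations):
        detected = sum(1 for s in first_correct_steps if s <= i)
        surviving = (n - detected) / n  # fraction not yet detecting
        if surviving <= 0.5:
            return conc
    return None

# Example: 6 panellists, 5 ascending concentration steps
steps = [1, 1, 2, 2, 3, 4]
concs = [0.1, 0.2, 0.4, 0.8, 1.6]
print(group_threshold(steps, concs))  # threshold at the third step
```

A fuller treatment would use a proper Kaplan-Meier estimator with censoring for panellists who never detect.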
Abstract:
Simple predator–prey models with a prey-dependent functional response predict that enrichment (increased carrying capacity) destabilizes community dynamics: this is the ‘paradox of enrichment’. However, the energy value of prey is very important in this context. The intraspecific chemical composition of a prey species determines its energy value as food for the potential predator. Theoretical and experimental studies establish that variable chemical composition of prey affects the predator–prey dynamics. Recently, experimental and theoretical approaches have been made to incorporate explicitly the stoichiometric heterogeneity of simple predator–prey systems. Following these experimental and theoretical advances, in this article we propose a simple phenomenological formulation of the variation of energy value at increased levels of carrying capacity. Results of our study demonstrate that coupling the parameters representing the phenomenological energy value and the carrying capacity in a realistic way may avoid destabilization of community dynamics following enrichment. Additionally, under such coupling the producer–grazer system persists only for an intermediate zone of production, a result consistent with recent studies. We suggest that, while addressing the issue of enrichment in a general predator–prey model, the phenomenological relationship that we propose here might be applicable to avoid Rosenzweig’s paradox.
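One minimal way to write such a phenomenological coupling, assuming a Rosenzweig–MacArthur base model; the symbols and the saturating form of $e(K)$ below are illustrative assumptions, not the authors' formulation:

```latex
% Rosenzweig--MacArthur producer--grazer model in which the grazer's
% conversion efficiency (energy value) e declines with carrying capacity K.
\begin{align}
\frac{dx}{dt} &= r x \left(1 - \frac{x}{K}\right) - \frac{a x y}{1 + a h x}, \\
\frac{dy}{dt} &= e(K)\,\frac{a x y}{1 + a h x} - d y,
\qquad e(K) = \frac{e_{\max}}{1 + cK}.
\end{align}
```

Under such a coupling, enrichment (larger $K$) raises production but lowers the per-capita energy the grazer gains from each prey item, which can damp the limit cycles that drive the paradox of enrichment.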
Abstract:
The IEEE 754 standard for floating-point arithmetic is widely used in computing. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. The IEEE infinities are said to have the behaviour of limits. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. We elucidate the transreal tangent and extend real limits to transreal limits. Arguing from this firm foundation, we maintain that there are three category errors in the IEEE 754 standard. Firstly, the claim that IEEE infinities are limits of real arithmetic confuses limiting processes with arithmetic. Secondly, a defence of IEEE negative zero confuses the limit of a function with the value of a function. Thirdly, the definition of IEEE NaNs confuses undefined with unordered. Furthermore we prove that the tangent function, with the infinities given by geometrical construction, has a period of an entire rotation, not half a rotation as is commonly understood. This illustrates a category error, confusing the limit with the value of a function, in an important area of applied mathematics: trigonometry. We briefly consider the wider implications of this category error. Another paper proposes transreal arithmetic as a basis for floating-point arithmetic; here we take the profound step of proposing transreal arithmetic as a replacement for real arithmetic to remove the possibility of certain category errors in mathematics. Thus we propose both theoretical and practical advantages of transmathematics. In particular we argue that implementing transreal analysis in trans-floating-point arithmetic would extend the coverage, accuracy and reliability of almost all computer programs that exploit real analysis: essentially all programs in science and engineering and many in finance, medicine and other socially beneficial applications.
Abstract:
Background: Research in aphasia has focused on acquired dyslexias at the single word level, with a paucity of assessment techniques and rehabilitation approaches for individuals with difficulty at the text level. A rich literature from research with paediatric populations and healthy non-brain-damaged, skilled adult readers allows the component processes that are important for text reading to be defined and more appropriate assessments to be devised. Aims: To assess the component processes of text reading in a small group of individuals with aphasia who report difficulties reading at the text level. Do assessments of component processes in reading comprehension reveal distinct profiles of text comprehension? To what extent are text comprehension difficulties caused by underlying linguistic and/or cognitive deficits? Methods & Procedures: Four individuals with mild aphasia who reported difficulties reading at the text level took part in a case-series study. Published assessments were used to confirm the presence of text comprehension impairment. Participants completed a range of assessments to provide a profile of their linguistic and cognitive skills, focusing on processes known to be important for text comprehension. We identified the following areas for assessment: reading speed, language skills (single word and sentence), inferencing, working memory and metacognitive skills (monitoring and strategy use). Outcomes & Results: Performance was compared against age-matched adult control data. One participant presented with a trend for impaired abilities in inferencing, with all other assessed skills being within normal limits. The other three had identified linguistic and working memory difficulties. One presented with a residual deficit in accessing single word meaning that affected text comprehension. The other two showed no clear link between sentence processing difficulties and text comprehension impairments. Across these three, data suggested a link between verbal working memory capacity and specific inferencing skills. Conclusions: Successful text reading relies on a number of component processes. In this paper we have made a start in defining those component processes and devising tasks suitable to assess them. From our results, assessment of verbal working memory and inferencing appears to be critical for understanding text comprehension impairments in aphasia. It is possible that rehabilitation input can capitalize on key metacognitive skills (monitoring, strategy use) to support functional reading in the face of existing linguistic, text comprehension and memory impairments.
Abstract:
Body size affects nearly all aspects of organismal biology, so it is important to understand the constraints and dynamics of body size evolution. Despite empirical work on the macroevolution and macroecology of minimum and maximum size, there is little general quantitative theory on rates and limits of body size evolution. We present a general theory that integrates individual productivity, the lifestyle component of the slow–fast life-history continuum, and the allometric scaling of generation time to predict a clade's evolutionary rate and asymptotic maximum body size, and the shape of macroevolutionary trajectories during diversifying phases of size evolution. We evaluate this theory using data on the evolution of clade maximum body sizes in mammals during the Cenozoic. As predicted, clade evolutionary rates and asymptotic maximum sizes are larger in more productive clades (e.g. baleen whales), which represent the fast end of the slow–fast lifestyle continuum, and smaller in less productive clades (e.g. primates). The allometric scaling exponent for generation time fundamentally alters the shape of evolutionary trajectories, so allometric effects should be accounted for in models of phenotypic evolution and interpretations of macroevolutionary body size patterns. This work highlights the intimate interplay between the macroecological and macroevolutionary dynamics underlying the generation and maintenance of morphological diversity.
Abstract:
The canopy interception capacity is a small but key part of the surface hydrology, which affects the amount of water intercepted by vegetation and therefore the partitioning of evaporation and transpiration. However, little research with climate models has been done to understand the effects of the range of possible canopy interception capacity parameter values, in part because of the assumption that it does not significantly affect climate. Near-global evapotranspiration products now make evaluation of canopy interception capacity parameterisations possible. We use a range of canopy water interception capacity values from the literature to investigate the effect on climate within the climate model HadCM3. We find that surface temperature changes by up to -0.64 K in the global mean and -1.9 K regionally. These temperature impacts are predominantly due to changes in the evaporative fraction and top-of-atmosphere albedo. In the tropics, the variations in evapotranspiration affect precipitation, significantly enhancing rainfall. Comparing the model output to measurements, we find that the default canopy interception capacity parameterisation overestimates canopy interception loss (i.e. canopy evaporation) and underestimates transpiration. Overall, decreasing canopy interception capacity improves the evapotranspiration partitioning in HadCM3, though the measurement literature more strongly supports an increase. The high sensitivity of climate to the parameterisation of canopy interception capacity is partially due to the high number of light rain-days in the climate model, which means that interception is overestimated. This work highlights the hitherto underestimated importance of canopy interception capacity in climate model hydroclimatology and the need to acknowledge the role of precipitation representation limitations in determining parameterisations.
Abstract:
We extend all elementary functions from the real to the transreal domain so that they are defined on division by zero. Our method applies to a much wider class of functions, so it may be of general interest.
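A toy sketch of what "defined on division by zero" means in the transreal setting, using the conventions described in the transreal literature (1/0 is positive infinity, -1/0 is negative infinity, 0/0 is nullity). Representing nullity as a string is purely an illustrative assumption; a real implementation would use a dedicated type.

```python
# Minimal sketch of total (transreal) division over real inputs:
# every pair of real operands yields a result, including zero denominators.

INF, NINF, NULLITY = float("inf"), float("-inf"), "Phi"

def trans_div(a, b):
    """Division made total on real inputs: a zero denominator is handled
    instead of raising ZeroDivisionError."""
    if b != 0:
        return a / b
    if a > 0:
        return INF      # positive / zero -> positive infinity
    if a < 0:
        return NINF     # negative / zero -> negative infinity
    return NULLITY      # 0/0 -> nullity, unordered with every number

print(trans_div(1, 0), trans_div(-1, 0), trans_div(0, 0), trans_div(6, 3))
```

The same pattern extends to elementary functions: instead of leaving, say, tan at its poles undefined, each previously undefined input is mapped to an infinity or to nullity.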
Abstract:
Cyber warfare is an increasingly important emerging phenomenon in international relations. The focus of this edited volume is on this notion of cyber warfare, meaning interstate cyber aggression, as distinct from cyber-terrorism or cyber-crime. Waging warfare in cyberspace has the capacity to be as devastating as any conventional means of conducting armed conflict. However, while there is a growing amount of literature on the subject within disciplines, there has been very little work done on cyber warfare across disciplines, which necessarily limits our understanding of it. This book is a major multidisciplinary analysis of cyber warfare, featuring contributions by world-leading experts from a mixture of academic and professional backgrounds.
Abstract:
This article uses discourse analysis to study the continuities in British foreign policy thinking within the Labour party from the 1960s to the present day. Using representative extracts from speeches by Hugh Gaitskell, Harold Wilson, Tony Blair and Gordon Brown, it identifies the ideational consistencies in the leaders’ attitudes to: Empire; federalism in the EEC/EU; and laying down conditions that have to be met before any constructive engagement with ‘Europe’ can be countenanced. We argue that these consistencies, spanning a 50-year period, exemplify a certain stagnation both within Labour’s European discourses and within British foreign policy thinking more widely. We develop the idea that Labour party thinking has been crucially framed by both small ‘c’ conservative and upper-case Conservative ideology, popularised by Winston Churchill in his ‘three circles’ model of British foreign policy.