900 results for Borrowing constraint


Relevance:

10.00%

Publisher:

Abstract:

1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed.
2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If energy intake is sufficient, an animal allocates the energy obtained in the order maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold, below which all energy is allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically.
3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate.
4. If parameter values cannot be obtained directly, they may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations.
5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shellfisheries, to assess environmental impacts of building proposals including wind farms and highways, and to assess the effects on non-target organisms of chemicals used to control agricultural pests.
Keywords: bioenergetics; energy budget; individual-based models; population dynamics.
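The priority-ordered allocation rule described in point 2 can be sketched in code. This is a minimal illustration under our own assumptions; the function name, parameters and units are ours, not the paper's:

```python
def allocate_energy(intake, maintenance, growth, reproduction,
                    reserves, optimal_reserves, critical_reserves):
    """Allocate energy intake in priority order: maintenance, growth,
    reproduction, then storage up to an optimal reserve level.
    Below a critical reserve threshold, everything goes to maintenance.
    All quantities are in the same (arbitrary) energy units."""
    if reserves <= critical_reserves:
        # Emergency: all available energy goes to maintenance only.
        return {"maintenance": min(intake, maintenance),
                "growth": 0.0, "reproduction": 0.0, "storage": 0.0}
    allocation = {}
    remaining = intake
    for process, demand in [("maintenance", maintenance),
                            ("growth", growth),
                            ("reproduction", reproduction)]:
        allocation[process] = min(remaining, demand)
        remaining -= allocation[process]
    # Any surplus is stored, but reserves never exceed the optimum.
    allocation["storage"] = min(remaining, max(0.0, optimal_reserves - reserves))
    return allocation
```

For example, with intake 10 and demands (4, 3, 2), a surplus of 1 is stored as long as reserves are below their optimum.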

From 2001, the construction of flats and high-density developments increased in England while the building of houses declined. Does this indicate a change in taste, or is it a result of government planning policies? This paper analyses the long-term effects of the policy of constraint that has existed for the past 50 years, but identifies the increase in density as occurring primarily after new, revised planning guidance discouraging low-density development was issued in England in 2000. To substantiate this, it is pointed out that the change which occurred in England did not occur in Scotland, where guidance was not changed to encourage high-density residential development. The conclusion that the change is the result of planning policies and not of a change in taste is confirmed by surveys of the occupants of new high-rise developments in Leeds. The new flat-dwellers were predominantly young and childless and expressed the intention of moving out of the city centre and into houses in the near future, when they could. Given recent changes in guidance by the new coalition government, the construction of flats in England is expected to fall back to earlier levels over the next few years.

We present a new technique for correcting errors in radar estimates of rainfall due to attenuation which is based on the fact that any attenuating target will itself emit, and that this emission can be detected by the increased noise level in the radar receiver. The technique is being installed on the UK operational network, and for the first time, allows radome attenuation to be monitored using the increased noise at the higher beam elevations. This attenuation has a large azimuthal dependence but for an old radome can be up to 4 dB for rainfall rates of just 2–4 mm/h. This effect has been neglected in the past, but may be responsible for significant errors in rainfall estimates and in radar calibrations using gauges. The extra noise at low radar elevations provides an estimate of the total path integrated attenuation of nearby storms; this total attenuation can then be used as a constraint for gate-by-gate or polarimetric correction algorithms.
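As an illustration of how a noise-derived total path-integrated attenuation (PIA) might constrain a gate-by-gate scheme, here is a sketch under our own simplifying assumptions (it is not the operational algorithm): per-gate attenuation estimates are rescaled so that they sum to the emission-derived total, and the cumulative attenuation is then restored to each gate.

```python
import numpy as np

def constrained_gate_correction(z_measured_dbz, pia_per_gate_db, total_pia_db):
    """Rescale gate-by-gate attenuation estimates so their sum matches the
    total path-integrated attenuation inferred from receiver noise (emission),
    then add the accumulated attenuation back to each range gate (in dB)."""
    pia = np.asarray(pia_per_gate_db, dtype=float)
    scale = total_pia_db / pia.sum() if pia.sum() > 0 else 0.0
    cumulative = np.cumsum(pia * scale)  # attenuation accumulated up to each gate
    return np.asarray(z_measured_dbz, dtype=float) + cumulative
```

The emission-derived total thus acts as the integral constraint the abstract describes, while the relative gate-to-gate weighting still comes from the reflectivity (or polarimetric) estimates.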

We explored the impact of a degraded semantic system on lexical, morphological and syntactic complexity in language production. We analysed transcripts from connected speech samples from eight patients with semantic dementia (SD) and eight age-matched healthy speakers. The frequency distributions of nouns and verbs were compared for hand-scored data and data extracted using text-analysis software. Lexical measures showed the predicted pattern for nouns and verbs in hand-scored data, and for nouns in software-extracted data, with fewer low frequency items in the speech of the patients relative to controls. The distribution of complex morpho-syntactic forms for the SD group showed a reduced range, with fewer constructions that required multiple auxiliaries and inflections. Finally, the distribution of syntactic constructions also differed between groups, with a pattern that reflects the patients’ characteristic anomia and constraints on morpho-syntactic complexity. The data are in line with previous findings of an absence of gross syntactic errors or violations in SD speech. Alterations in the distributions of morphology and syntax, however, support constraint satisfaction models of speech production in which there is no hard boundary between lexical retrieval and grammatical encoding.
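The lexical-frequency comparison can be illustrated with a small sketch; the function, the frequency norms and the threshold are our own illustration, not the authors' scoring scheme:

```python
def low_frequency_share(tokens, frequency_norms, threshold):
    """Share of tokens whose corpus frequency falls below `threshold`;
    a lower share corresponds to the reported SD pattern of fewer
    low-frequency items in patient speech relative to controls."""
    known = [t for t in tokens if t in frequency_norms]
    if not known:
        return 0.0
    return sum(1 for t in known if frequency_norms[t] < threshold) / len(known)
```

Applied to noun tokens from each transcript, this yields one value per speaker that can then be compared between groups.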

This paper explores the role of trust as both an enabler and a constraint between buyers and suppliers engaged in long-term relationships. According to the relational view, cooperative strategies require trust-based mutual commitments to co-create value. However, a complete picture of the positive and negative outcomes of trust development has yet to emerge. In particular, trust as an originator of path-dependent constraints resulting from over-embeddedness is yet to be integrated into the relational view. We use a case-based methodology to explore whether trust is an optimizing phenomenon in key supplier relationships. Two cases reveal a paradox of trust development: similar trust-building behaviors cultivate different outcomes, in each case constraining value co-creation.

The development of effective environmental management plans and policies requires a sound understanding of the driving forces involved in shaping and altering the structure and function of ecosystems. However, driving forces, especially anthropogenic ones, are defined and operate at multiple administrative levels, which do not always match ecological scales. This paper presents an innovative methodology of analysing drivers of change by developing a typology of scale sensitivity of drivers that classifies and describes the way they operate across multiple administrative levels. Scale sensitivity varies considerably among drivers, which can be classified into five broad categories depending on the response of ‘evenness’ and ‘intensity change’ when moving across administrative levels. Indirect drivers tend to show low scale sensitivity, whereas direct drivers show high scale sensitivity, as they operate in a non-linear way across the administrative scale. Thus policies addressing direct drivers of change, in particular, need to take scale into consideration during their formulation. Moreover, such policies must have a strong spatial focus, which can be achieved either by encouraging local–regional policy making or by introducing high flexibility in (inter)national policies to accommodate increased differentiation at lower administrative levels. High-quality data are available for several drivers; however, the availability of consistent data at all levels for non-anthropogenic drivers is a major constraint to mapping and assessing their scale sensitivity. This lack of data may hinder effective policy making for environmental management, since it restricts the ability to fully account for scale sensitivity of natural drivers in policy design.

The very first numerical models, developed more than 20 years ago, were drastic simplifications of the real atmosphere, mostly restricted to describing adiabatic processes. For predictions of a day or two of the mid-tropospheric flow these models often gave reasonable results, but the results deteriorated quickly when the prediction was extended further in time. The prediction of the surface flow was unsatisfactory even for short ranges. It was evident that both the energy-generating and the dissipative processes have to be included in numerical models in order to predict the weather patterns in the lower part of the atmosphere, and to predict the atmosphere in general beyond a day or two. Present-day computers make it possible to attack the weather forecasting problem in a more comprehensive and complete way, and substantial efforts have been made, during the last decade in particular, to incorporate the non-adiabatic processes in numerical prediction models. The physics of radiative transfer, condensation of moisture, turbulent transfer of heat, momentum and moisture, and the dissipation of kinetic energy are the most important processes associated with the formation of energy sources and sinks in the atmosphere, and these have to be incorporated in numerical prediction models extending over more than a few days. The mechanisms of these processes are mainly related to disturbances on small scales in space and time, or even to molecular processes. It is therefore one of the basic characteristics of numerical models that these small-scale disturbances cannot be included in an explicit way. The reason for this is the discretization of the model's atmosphere by a finite-difference grid, or the use of a Galerkin or spectral function representation.
The second reason why these processes cannot be introduced explicitly into a numerical model is that some of the physics needed to describe them (such as local buoyancy) is eliminated a priori by the constraint of hydrostatic adjustment. Even if this physical constraint can be relaxed by making the models non-hydrostatic, the scale problem is virtually impossible to solve, and for the foreseeable future we have to try to incorporate the ensemble, or gross, effect of these physical processes on the large-scale synoptic flow. The formulation of this ensemble effect in terms of grid-scale variables (the parameters of the large-scale flow) is called 'parameterization'. For short-range prediction of the synoptic flow at middle and high latitudes, very simple parameterizations have proven rather successful.

We explore the influence of the choice of attenuation factor on Katz centrality indices for evolving communication networks. For given snapshots of a network observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners in the network. In this article, we look into the sensitivity of communicability indices to the attenuation factor constraint, in relation to the spectral radius (the largest eigenvalue) of the network at any point in time and its computation in the case of large networks. We propose relaxed communicability measures in which the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised in order to maintain the convergence of the measure. Using a vitality-based measure of both standard and relaxed communicability indices, we look at ways of establishing the most important individuals for the broadcasting and receiving of messages related to community bridging roles. We illustrate our findings with two examples of real-life networks: the MIT Reality Mining data set of daily communications between 106 individuals during one year, and a UK Twitter mentions network of direct messages between 12.4k individuals during one week.
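The standard and relaxed measures can be sketched as follows, using the common Katz formulation x = (I − αA)⁻¹·1, which converges only for attenuation factors α below 1/ρ(A), where ρ(A) is the spectral radius. The "relaxed" variant below is our reading of the normalisation idea (divide the adjacency matrix by its spectral radius so any α < 1 converges), not the article's exact definition:

```python
import numpy as np

def katz_centrality(adj, alpha):
    """Katz centrality x = (I - alpha*A)^(-1) 1; the geometric series behind
    it converges only when alpha < 1 / spectral_radius(A)."""
    a = np.asarray(adj, dtype=float)
    n = a.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * a, np.ones(n))

def relaxed_katz(adj, alpha):
    """Relaxed variant: normalise A by its spectral radius, so the
    convergence condition becomes simply alpha < 1."""
    a = np.asarray(adj, dtype=float)
    rho = max(abs(np.linalg.eigvals(a)))
    return katz_centrality(a / rho, alpha)
```

On a three-node path graph, for instance, both measures rank the middle node highest, as expected for the best-connected broadcaster.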

Climate is one of the main factors controlling winegrape production. Bioclimatic indices describing the suitability of a particular region for wine production are a widely used zoning tool. Seven suitable bioclimatic indices characterize regions in Europe with different viticultural suitability, and their possible geographical shifts under future climate conditions are addressed using regional climate model simulations. The indices are calculated from climatic variables (daily values of temperature and precipitation) obtained from transient ensemble simulations with the regional model COSMO-CLM. Index maps for recent decades (1960–2000) and for the 21st century (following the IPCC-SRES B1 and A1B scenarios) are compared. Results show that climate change is projected to have a significant effect on European viticultural geography. Detrimental impacts on winegrowing are predicted in southern Europe, mainly due to increased dryness and cumulative thermal effects during the growing season. These changes represent an important constraint to grapevine growth and development, making adaptation strategies crucial, such as changing varieties or introducing water supply by irrigation. Conversely, in western and central Europe, projected future changes will benefit not only wine quality, but might also demarcate new potential areas for viticulture, despite some likely threats associated with diseases. Regardless of the inherent uncertainties, this approach provides valuable information for implementing proper and diverse adaptation measures in different European regions.
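As an example of the kind of index involved, the Huglin heliothermal index, one of the widely used viticultural zoning indices, can be computed from daily temperature series roughly as follows (a sketch: the growing-season window, the day-length coefficient and the exact formulation used in the paper may differ):

```python
def huglin_index(t_mean, t_max, k=1.04):
    """Huglin heliothermal index: sum of daily heliothermal contributions
    ((Tmean - 10) + (Tmax - 10)) / 2 over the growing season (Apr 1 to
    Sep 30 in the northern hemisphere), scaled by a day-length coefficient
    k (roughly 1.02-1.06, increasing with latitude). `t_mean` and `t_max`
    are daily series in degC covering the season; negative daily
    contributions are clamped to zero here."""
    return k * sum(max(0.0, ((tm - 10.0) + (tx - 10.0)) / 2.0)
                   for tm, tx in zip(t_mean, t_max))
```

Computing such indices for recent and projected daily series, grid cell by grid cell, is what produces the index maps compared in the study.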

We present a detailed case study of the characteristics of auroral forms that constitute the first ionospheric signatures of substorm expansion phase onset. Analysis of the optical frequency and along-arc (azimuthal) wave number spectra provides the strongest constraint to date on the potential mechanisms and instabilities in the near-Earth magnetosphere that accompany auroral onset and which precede poleward arc expansion and auroral breakup. We evaluate the frequency and growth rates of the auroral forms as a function of azimuthal wave number to determine whether these wave characteristics are consistent with current models of the substorm onset mechanism. We find that the frequency, spatial scales, and growth rates of the auroral forms are most consistent with the cross-field current instability or a ballooning instability, most likely triggered close to the inner edge of the ion plasma sheet. This result is supportive of a near-Earth plasma sheet initiation of the substorm expansion phase. We also present evidence that the frequency and phase characteristics of the auroral undulations may be generated via resonant processes operating along the geomagnetic field. Our observations provide the most powerful constraint to date on the ionospheric manifestation of the physical processes operating during the first few minutes around auroral substorm onset.
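An along-arc wave-number spectrum of the kind used in this analysis can be sketched with a 1-D FFT; this is a generic illustration of the method, not the authors' processing pipeline:

```python
import numpy as np

def azimuthal_power_spectrum(intensity_along_arc):
    """Along-arc (azimuthal) wave-number power spectrum of auroral
    intensity, from the FFT of a 1-D intensity profile sampled along
    the arc. Returns (wavenumbers, power), with wave number in cycles
    per arc length."""
    profile = np.asarray(intensity_along_arc, dtype=float)
    profile = profile - profile.mean()      # remove the mean (wave number 0)
    spectrum = np.fft.rfft(profile)
    power = np.abs(spectrum) ** 2 / len(profile)
    wavenumbers = np.arange(len(power))
    return wavenumbers, power
```

Repeating this for successive optical frames gives the frequency and growth rate of each azimuthal wave number, which is the quantity compared against instability models.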

Recent evidence suggests that immobilization of the upper limb for 2–3 weeks induces changes in cortical thickness as well as in motor performance. In constraint-induced (CI) therapy, one of the most effective interventions for hemiplegia, the non-paretic arm is constrained to enforce use of the paretic arm in the home setting. With the present study we aimed to explore whether non-paretic arm immobilization in CI therapy induces structural changes in the non-lesioned hemisphere, and how these changes relate to treatment benefit. Thirty-one patients with chronic hemiparesis participated in CI therapy with (N = 14) or without (N = 17) constraint. Motor ability scores were acquired before and after treatment. Diffusion tensor imaging (DTI) data were obtained prior to treatment. Cortical thickness was measured with the FreeSurfer software. In both groups, cortical thickness in the contralesional primary somatosensory cortex increased and motor function improved with the intervention. However, the cortical thickness change was not associated with the magnitude of motor function improvement. Moreover, the treatment effect and the cortical thickness change did not differ significantly between the constraint and non-constraint groups. There was no correlation between fractional anisotropy changes in the non-lesioned hemisphere and treatment outcome. CI therapy induced cortical thickness changes in contralesional sensorimotor regions, but this effect does not appear to be driven by immobilization of the non-paretic arm, as indicated by the absence of differences between the constraint and non-constraint groups. Our data do not suggest that the arm immobilization used in CI therapy is associated with noticeable cortical thinning.

This study investigates transfer at the third-language (L3) initial state, testing between the following possibilities: (1) the first language (L1) transfer hypothesis (an L1 effect for all adult acquisition), (2) the second language (L2) transfer hypothesis, where the L2 blocks L1 transfer (often referred to in the recent literature as the ‘L2 status factor’; Williams and Hammarberg, 1998), and (3) the Cumulative Enhancement Model (Flynn et al., 2004), which proposes selective transfer from all previous linguistic knowledge. We provide data from successful English-speaking learners of L2 Spanish at the initial state of acquiring L3 French and L3 Italian relating to properties of the Null-Subject Parameter (e.g. Chomsky, 1981; Rizzi, 1982). We compare these groups to each other, as well as to groups of English learners of L2 French and L2 Italian at the initial state, and conclude that the data are consistent with the predictions of the ‘L2 status factor’. However, we discuss an alternative possible interpretation based on (psycho)typologically-motivated transfer (borrowing from Kellerman, 1983), providing a methodology for future research in this domain to meaningfully tease apart the ‘L2 status factor’ from this alternative account.

Contemporary acquisition theorizing has placed a considerable amount of attention on interfaces, points at which different linguistic modules interact. The claim is that vulnerable interfaces cause particular difficulties in L1, bilingual and adult L2 acquisition (e.g. Platzack, 2001; Montrul, 2004; Müller and Hulk, 2001; Sorace, 2000, 2003, 2004, 2005). Accordingly, it is possible that deficits at the syntax–pragmatics interface cause what appears to be particular non-target-like syntactic behavior in L2 performance. This syntax-before-discourse hypothesis is examined in the present study by analyzing null vs. overt subject pronoun distribution in the L2 Spanish of English L1 learners. As ultimately determined by L2 knowledge of the Overt Pronoun Constraint (OPC) (Montalbetti, 1984), the data indicate that L2 learners at the intermediate and advanced levels reset the Null Subject Parameter (NSP), but only advanced learners have acquired a more or less target-like null/overt subject distribution. Against the predictions of Sorace (2004) and in line with Montrul and Rodríguez-Louro (2006), the data indicate an overuse of both overt and null subject pronouns. As a result, this behavior cannot stem from L1 interference alone, suggesting that interface-conditioned properties are simply more complex and, therefore, harder to acquire. Furthermore, the data from the advanced learners demonstrate that the syntax–pragmatics interface is not a predetermined locus for fossilization (contra, e.g., Valenzuela, 2006).

Both the EU’s Renewable Energy Directive (RED) and Article 7a of its Fuel Quality Directive (FQD) seek to reduce greenhouse gas (GHG) emissions from transport fuels. The RED mandates a 10% share of renewable energy in transport fuels by 2020, whilst the FQD requires a 6% reduction in GHG emissions (from a 2010 base) by the same date. In practice, it will mainly be biofuels that economic operators will use to meet these requirements, but the different approaches can lead to either the RED, or the FQD, acting as the binding constraint. A common set of environmental sustainability criteria apply to biofuels under both the RED and the FQD. In particular, biofuels have to demonstrate a 35% (later increasing to 50/60%) saving in life-cycle GHG emissions. This could be problematic in the World Trade Organization (WTO), as a non-compliant biofuel with a 34% emissions saving would probably be judged to be ‘like’ a compliant biofuel. A more economically rational way to reduce GHG emissions, and one that might attract greater public support, would be for the RED to reward emission reductions along the lines of the FQD. Moreover, this modification would probably make the provisions more acceptable in the WTO, as there would be a clearer link between policy measures and the objective of reductions in GHG emissions; and the combination of the revised RED and the FQD would lessen the commercial incentive to import biofuels with modest GHG emission savings, and thus reduce the risk of trade tension.
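The interplay between the two targets can be illustrated with a first-order sketch. This is our own simplification (it ignores double counting, non-biofuel renewables and refinery-side measures): the GHG-intensity reduction a blend delivers is roughly the biofuel share times the per-unit GHG saving.

```python
def binding_directive(biofuel_share, ghg_saving):
    """Which directive binds, to first order? The RED requires a 10%
    renewable share of transport fuel; the FQD requires a 6% reduction
    in GHG intensity. A blend's reduction is approximated here as
    share * per-unit saving (fractions, e.g. 0.10 and 0.50)."""
    red_met = biofuel_share >= 0.10
    fqd_met = biofuel_share * ghg_saving >= 0.06
    if red_met and fqd_met:
        return "both satisfied"
    if red_met:
        return "FQD binds"
    if fqd_met:
        return "RED binds"
    return "neither satisfied"
```

For instance, a 10% blend of biofuel saving 50% per unit meets the RED but delivers only a 5% intensity reduction, so the FQD becomes the binding constraint; with a 60% saving, both are met at once.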