41 results for Threshold cryptographic schemes and algorithms
Abstract:
Energetic constraints on precipitation are useful for understanding the response of the hydrological cycle to ongoing climate change, its response to possible geoengineering schemes, and the limits on precipitation in very warm climates of the past. Much recent progress has been made in quantifying the different forcings and feedbacks on precipitation and in understanding how the transient responses of precipitation and temperature might differ qualitatively. Here, we introduce the basic ideas and review recent progress. We also examine the extent to which energetic constraints on precipitation may be viewed as radiative constraints and the extent to which they are confirmed by available observations. Challenges remain, including the need to better demonstrate the link between energetics and precipitation in observations and to better understand energetic constraints on precipitation at sub-global length scales.
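The global-mean energetic constraint sketched above can be illustrated with a back-of-envelope calculation. The sketch below is not from this paper; the flux magnitudes are typical textbook values, used only to show the bookkeeping: latent heating from precipitation approximately balances net atmospheric radiative cooling minus the surface sensible heat flux.

```python
LV = 2.5e6  # latent heat of vaporisation of water, J/kg

def energetically_constrained_precip(atm_radiative_cooling, sensible_heat_flux):
    """Schematic global-mean atmospheric energy balance:
    L_v * P ~ Q_cool - SH, so precipitation P (kg m^-2 s^-1) is set by
    the net radiative cooling of the atmosphere minus sensible heating."""
    return (atm_radiative_cooling - sensible_heat_flux) / LV

# illustrative global-mean magnitudes (W m^-2), not values from the paper
p_flux = energetically_constrained_precip(100.0, 20.0)
p_mm_per_day = p_flux * 86400.0   # 1 kg m^-2 of water = 1 mm depth
```

With these round numbers the constraint yields roughly 2.8 mm per day, close to the observed global-mean precipitation rate, which is why the energetic view is a useful first-order check.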
Abstract:
Gossip (or Epidemic) protocols have emerged as a communication and computation paradigm for large-scale networked systems. These protocols are based on randomised communication, which provides probabilistic guarantees on convergence speed and accuracy. They also provide robustness, scalability, computational and communication efficiency, and high stability under disruption. This work presents a novel Gossip protocol named the Symmetric Push-Sum Protocol for the computation of global aggregates (e.g., the average) in decentralised and asynchronous systems. The proposed approach combines the simplicity of the push-based approach with the efficiency of the push-pull schemes. The push-pull schemes cannot be directly employed in asynchronous systems, as they require synchronous paired communication operations to guarantee their accuracy. Although push schemes guarantee accuracy even with asynchronous communication, they suffer from slower and less stable convergence. The Symmetric Push-Sum Protocol does not require synchronous communication and achieves a convergence speed similar to that of the push-pull schemes, while retaining the accuracy stability of the push schemes. In the experimental analysis, we focus on computing the global average as an important class of node aggregation problems. The results confirm that the proposed method inherits the advantages of both other schemes and outperforms well-known state-of-the-art protocols for decentralised Gossip-based aggregation.
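The Symmetric Push-Sum Protocol itself is not reproduced here, but the classic push-sum averaging idea it builds on can be sketched in a few lines. In this synchronous simulation (an assumption for simplicity; the paper's point is precisely about asynchrony), each node splits its (sum, weight) pair, keeps half, and pushes half to a random peer; every node's ratio sum/weight converges to the global average because total mass is conserved.

```python
import random

def push_sum_average(values, rounds=50, seed=0):
    """Classic push-sum gossip: each round, every node keeps half of its
    (sum, weight) pair and pushes the other half to a uniformly random
    peer. Each node's estimate sum/weight converges to the global
    average, since the totals of sums and weights never change."""
    rng = random.Random(seed)
    n = len(values)
    s = list(map(float, values))   # running sums
    w = [1.0] * n                  # running weights
    for _ in range(rounds):
        inbox_s = [0.0] * n
        inbox_w = [0.0] * n
        for i in range(n):
            target = rng.randrange(n)
            # keep half locally, push half to the chosen peer
            inbox_s[i] += s[i] / 2
            inbox_w[i] += w[i] / 2
            inbox_s[target] += s[i] / 2
            inbox_w[target] += w[i] / 2
        s, w = inbox_s, inbox_w
    return [si / wi for si, wi in zip(s, w)]

estimates = push_sum_average([10.0, 20.0, 30.0, 40.0])
```

After a few dozen rounds every node's estimate is close to the true average (25 here); the protocol in the paper modifies the message exchange so this accuracy survives asynchronous, unpaired communication.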
Abstract:
Wireless local area networks (WLANs) based on the IEEE 802.11 standard are now widespread. Most are used to provide access for mobile devices to a conventional wired infrastructure, and some are used where wires are not possible, forming an ad hoc network of their own. There are several varieties at the physical or radio layer (802.11, 802.11a, 802.11b, 802.11g), each featuring different data rates, modulation schemes and transmission frequencies. However, all of them share a common medium access control (MAC) layer. As this is largely based on a contention approach, it does not allow prioritisation of traffic or stations, so it cannot easily provide the quality of service (QoS) required by time-sensitive applications, such as voice or video transmission. In order to address this shortfall of the technology, the IEEE set up a task group aiming to enhance the MAC layer protocol so that it can provide QoS. The latest draft at the time of writing is Draft 11, dated October 2004. The article describes the yet-to-be-ratified 802.11e standard and is based on that draft.
Abstract:
Until recently, pollution control in rural drainage basins of the UK consisted solely of water treatment at the point of abstraction. However, prevention of agricultural pollution at source is now a realistic option given the possibility of financing the necessary changes in land use through modification of the Common Agricultural Policy. This paper uses a nutrient export coefficient model to examine the cost of land-use change in relation to improvement of water quality. Catchment-wide schemes and local protection measures are considered. Modelling results underline the need for integrated management of entire drainage basins. A wide range of benefits may accrue from land-use change, including enhanced habitats for wildlife as well as better drinking water.
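The arithmetic of a nutrient export coefficient model is simple enough to sketch: the predicted load on a water body is the sum, over land-use types, of an empirically derived export coefficient multiplied by the area under that use. The coefficients and catchment below are hypothetical placeholders, not calibrated values from this paper.

```python
def nutrient_export(land_uses):
    """Export coefficient model: total annual nutrient load is the sum
    over land uses of (export coefficient, kg/ha/yr) x (area, ha).
    Changing land use changes the areas, and hence the predicted load."""
    return sum(coeff * area for coeff, area in land_uses.values())

# hypothetical catchment: (coefficient kg N/ha/yr, area ha) per land use
catchment = {
    "arable":   (30.0, 500.0),
    "pasture":  (10.0, 800.0),
    "woodland": ( 2.0, 300.0),
}
load = nutrient_export(catchment)   # kg N per year
```

A land-use change scenario is then just a second dictionary with shifted areas, and the cost-effectiveness question in the paper is the cost of that shift per unit reduction in load.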
Abstract:
This article investigates the relation between stimulus-evoked neural activity and cerebral hemodynamics. Specifically, the hypothesis is tested that hemodynamic responses can be modeled as a linear convolution of experimentally obtained measures of neural activity with a suitable hemodynamic impulse response function. To obtain a range of neural and hemodynamic responses, rat whisker pad was stimulated using brief (≤2 seconds) electrical stimuli consisting of single pulses (0.3 millisecond, 1.2 mA) combined both at different frequencies and in a paired-pulse design. Hemodynamic responses were measured using concurrent optical imaging spectroscopy and laser Doppler flowmetry, whereas neural responses were assessed through current source density analysis of multielectrode recordings from a single barrel. General linear modeling was used to deconvolve the hemodynamic impulse response to a single "neural event" from the hemodynamic and neural responses to stimulation. The model provided an excellent fit to the empirical data. The implications of these results for modeling schemes and for physiologic systems coupling neural and hemodynamic activity are discussed.
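The linear-convolution hypothesis and its deconvolution by general linear modeling can be illustrated on synthetic data (the kernel shape and event train below are invented for the demonstration, not taken from the article): lagged copies of the neural time series form the columns of a design matrix, and least squares recovers the impulse response.

```python
import numpy as np

def deconvolve_hrf(neural, hemo, hrf_len):
    """Linear deconvolution of a hemodynamic impulse response: build a
    design matrix whose columns are lagged copies of the neural signal,
    then solve the least-squares problem hemo ~ X @ hrf (a general
    linear model in which the HRF samples are the unknown weights)."""
    n = len(neural)
    X = np.zeros((n, hrf_len))
    for lag in range(hrf_len):
        X[lag:, lag] = neural[:n - lag]
    hrf, *_ = np.linalg.lstsq(X, hemo, rcond=None)
    return hrf

# synthetic check: convolve a known kernel with a sparse event train,
# then recover the kernel by deconvolution (noise-free, so exact)
true_hrf = np.array([0.0, 0.4, 1.0, 0.7, 0.3, 0.1])
events = np.zeros(200)
events[[20, 60, 61, 120]] = 1.0        # single and paired "neural events"
signal = np.convolve(events, true_hrf)[:200]
est = deconvolve_hrf(events, signal, len(true_hrf))
```

With real recordings the fit is of course not exact; the article's test is how well this single fitted kernel, convolved with the measured neural responses, predicts the measured hemodynamics across stimulus conditions.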
Abstract:
The parameterization of surface heat-flux variability in urban areas relies on adequate representation of surface characteristics. Given the horizontal resolutions (e.g. ≈0.1–1 km) currently used in numerical weather prediction (NWP) models, properties of the urban surface (e.g. vegetated/built surfaces, street-canyon geometries) often have large spatial variability. Here, a new approach based on Urban Zones to characterize Energy partitioning (UZE) is tested within a NWP model (Weather Research and Forecasting model; WRF v3.2.1) for Greater London. The urban land-surface scheme is the Noah/Single-Layer Urban Canopy Model (SLUCM). Detailed surface information (horizontal resolution 1 km) in central London shows that the UZE offers better characterization of surface properties and their variability compared to default WRF-SLUCM input parameters. In situ observations of the surface energy fluxes and near-surface meteorological variables are used to select the radiation and turbulence parameterization schemes and to evaluate the land-surface scheme.
Abstract:
Traditional resource management has had as its main objective the optimisation of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid Markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The SORMA project aims to allow resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA’s motivation is to achieve efficient resource utilisation by maximising revenue for resource providers, and minimising the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that desired Quality of Service levels meet the expectations of market participants. This paper explains the proposed use of an Economically Enhanced Resource Manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximisation across multiple Service Level Agreements.
Abstract:
Purpose – The creation of a target market strategy is integral to developing an effective business strategy. The concept of market segmentation is often cited as pivotal to establishing a target market strategy, yet all too often business-to-business marketers utilise little more than trade sectors or product groups as the basis for their groupings of customers, rather than customers' characteristics and buying behaviour. The purpose of this paper is to offer a solution for managers, focusing on customer purchasing behaviour, which evolves from the organisation's existing criteria used for grouping its customers.
Design/methodology/approach – One of the underlying reasons managers fail to embrace best practice market segmentation is their inability to manage the transition from how target markets in an organisation are currently described to how they might look when based on customer characteristics, needs, purchasing behaviour and decision-making. Any attempt to develop market segments should reflect the inability of organisations to ignore their existing customer group classification schemes and associated customer-facing operational practices, such as distribution channels and sales force allocations.
Findings – A straightforward process has been derived and applied, enabling organisations to practice market segmentation in an evolutionary manner, facilitating the transition to customer-led target market segments. This process also ensures commitment from the managers responsible for implementing the eventual segmentation scheme. This paper outlines the six stages of this process and presents an illustrative example from the agrichemicals sector, supported by other cases.
Research implications – The process presented in this paper for embarking on market segmentation focuses on customer purchasing behaviour rather than business sectors or product group classifications - which is true to the concept of market segmentation - but in a manner that participating managers find non-threatening. The resulting market segments have their basis in the organisation's existing customer classification schemes and are an iteration to which most managers readily buy-in.
Originality/value – Despite the size of the market segmentation literature, very few papers offer step-by-step guidance for developing customer-focused market segments in business-to-business marketing. The analytical tool for assessing customer purchasing deployed in this paper originally was created to assist in marketing planning programmes, but has since proved its worth as the foundation for creating segmentation schemes in business marketing, as described in this paper.
Abstract:
Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been specifically developed as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes, and a slab ocean, to model the effects of climate change mechanisms on the atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM run at T-21 resolution with 22 levels is compared to European Centre for Medium-Range Weather Forecasts reanalysis data. The responses of the model to changes in carbon dioxide and solar output are examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing only applied at low latitudes. It is found that, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30%, for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
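The radiative forcing concept being tested here reduces to a one-line relation: the equilibrium surface temperature change is the climate sensitivity parameter times the forcing, ΔT = λ·F. The numbers below are illustrative round values (a commonly quoted magnitude for doubled CO2), not results from this paper; the paper's question is how constant λ really is across forcing agents and geographies.

```python
def climate_sensitivity(delta_t, forcing):
    """Climate sensitivity parameter lambda = dT / F, in K per W m^-2:
    the equilibrium surface temperature change per unit radiative
    forcing. The forcing concept assumes lambda is (roughly) the same
    whatever the forcing agent or its spatial distribution."""
    return delta_t / forcing

# illustrative: a doubled-CO2 forcing of ~3.7 W m^-2 producing 3.0 K of
# warming gives lambda ~ 0.81 K per W m^-2
lam = climate_sensitivity(3.0, 3.7)
predicted_dt = lam * 1.0   # predicted warming for a 1 W m^-2 solar forcing
```

The abstract's findings are departures from this idealisation: λ for the CO2 experiments is about 17% higher than for the solar ones, and high-latitude forcing gives about 40% higher sensitivity than low-latitude forcing, yet predictions still land within roughly 30%.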
Abstract:
Over the last decade the English planning system has placed greater emphasis on the financial viability of development. ‘Calculative’ practices have been used to quantify and capture land value uplifts. Development viability appraisal (DVA) has become a key part of the evidence base used in planning decision-making and informs both ‘site-specific’ negotiations about the level of land value capture for individual schemes and ‘area-wide’ planning policy formation. This paper investigates how implementation of DVA is governed in planning policy formation. It is argued that the increased use of DVA raises important questions about how planning decisions are made and operationalised, not least because DVA is often poorly understood by some key stakeholders. The paper uses the concept of governance to thematically analyse semi-structured interviews conducted with the producers of DVAs and considers key procedural issues including (in)consistencies in appraisal practices, levels of stakeholder consultation and the potential for client and producer bias. Whilst stakeholder consultation is shown to be integral to the appraisal process in order to improve the quality of the appraisals and to legitimise the outputs, participation is restricted to industry experts and excludes some interest groups, including local communities. It is concluded that, largely because of its recent adoption and knowledge asymmetries between local planning authorities and appraisers, DVA is a weakly governed process characterised by emerging and contested guidance and is therefore ‘up for grabs’.