877 results for System level policy
Abstract:
Regional approaches to EU energy policy have been termed the ‘Schengenisation’ of energy, in reference to the Schengen Convention eliminating intra-European border controls. They aim to enhance the effectiveness of EU energy policy through closer policy coordination at the regional scale. Typically, this involves energy market integration while accounting for member states’ continued use of national-level policy instruments regarding the appropriate energy mix and the security of energy supply, as foreseen in the EU Treaty. This report explores the potential of such regional approaches. It assesses lessons from existing regional energy arrangements, such as the Danube Energy Forum, the Mediterranean Energy Forum, the Pentalateral Energy Forum, the North Seas Countries’ Offshore Grid Initiative and the Nordic Co-operation partnership, to determine whether regional energy initiatives are an efficient, effective and politically acceptable route toward the three EU energy policy objectives of competitiveness, supply security and sustainability. Regional approaches could play an important role in governing EU renewables policy, which the European Commission has identified as an important element of governance in the 2030 climate and energy framework.
Abstract:
Climate change is an increasing challenge for agriculture and is driving the development of suitable crops to ensure supply for both human nutrition and animal feed. In this context, it is increasingly important to understand the biochemical responses of cells to environmental cues at the whole-system level, an aim that is being brought closer by advances in high-throughput, cost-efficient plant metabolomics. To support molecular breeding activities, we have assessed the economic, technical and statistical feasibility of using direct mass spectrometry methods to evaluate the physiological state of maize (Zea mays L.) plants grown under different stress conditions.
Abstract:
Systems biology is based on computational modelling and simulation of large networks of interacting components. Models may be intended to capture processes, mechanisms, components and interactions at different levels of fidelity. Input data are often large and geographically dispersed, and may require the computation to be moved to the data rather than vice versa. In addition, complex system-level problems require collaboration across institutions and disciplines. Grid computing can offer robust, scalable solutions for distributed data, compute and expertise. We illustrate some of the range of computational and data requirements in systems biology with three case studies: one requiring large computation but small data (orthologue mapping in comparative genomics), a second involving complex terabyte data (the Visible Cell project) and a third that is both computationally and data-intensive (simulations at multiple temporal and spatial scales). Authentication, authorisation and audit systems do not currently scale well and may present bottlenecks for distributed collaboration, particularly where outcomes may be commercialised. Challenges remain in providing lightweight standards to facilitate the penetration of robust, scalable grid-type computing into diverse user communities to meet the evolving demands of systems biology.
Abstract:
The behaviour of control functions in safety-critical software systems is typically bounded to prevent the occurrence of known system-level hazards. These bounds are typically derived through safety analyses and can be implemented through necessary design features. However, the unpredictability of real-world problems can result in changes in the operating context that may invalidate the behavioural bounds themselves, for example unexpected hazardous operating contexts arising from failures or degradation. For highly complex problems it may be infeasible to determine, prior to deployment, the precise desired behavioural bounds of a function that address or minimise risk in hazardous operating cases. This paper presents an overview of the safety challenges associated with such problems and how they might be addressed. A self-management framework is proposed that performs on-line risk management. The features of the framework are shown in the context of employing intelligent adaptive controllers operating within complex and highly dynamic problem domains such as Gas-Turbine Aero Engine control. The safety assurance arguments enabled by the framework and necessary for certification are also outlined.
Abstract:
The connectivity of the Internet at the Autonomous System level is shaped by the policies that network operators implement. These policies impose a direction on the propagation of address advertisements and, consequently, on the paths that can be used to reach the advertised destinations. We propose using directed graphs to properly represent how destinations propagate through the Internet, and the number of arc-disjoint paths to quantify this network's path diversity. Moreover, in order to understand the effects that policies have on the connectivity of the Internet, numerical analyses of the resulting directed graphs were conducted. Results demonstrate that, even after policies have been applied, there is still path diversity that the Border Gateway Protocol cannot currently exploit.
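A minimal sketch of the arc-disjoint-path measure described above, assuming a toy AS-level directed graph; the node labels and edge set are illustrative and not taken from the paper's data, and NetworkX is used only as a convenient max-flow-based implementation.

# Illustrative sketch: quantify path diversity on a directed AS-level graph
# by counting arc-disjoint (edge-disjoint) directed paths between AS pairs.
# The graph below is a toy example, not data from the paper.
import networkx as nx
from networkx.algorithms.connectivity import edge_disjoint_paths

g = nx.DiGraph()
# Arcs point in the direction a destination advertisement propagates.
g.add_edges_from([
    ("AS1", "AS2"), ("AS1", "AS3"),
    ("AS2", "AS3"), ("AS2", "AS4"),
    ("AS3", "AS4"), ("AS3", "AS5"), ("AS4", "AS5"),
])

def path_diversity(graph, src, dst):
    """Number of arc-disjoint directed paths from src to dst (0 if unreachable)."""
    try:
        return len(list(edge_disjoint_paths(graph, src, dst)))
    except nx.NetworkXNoPath:
        return 0

for dst in ("AS4", "AS5"):
    print("AS1 ->", dst, ":", path_diversity(g, "AS1", dst), "arc-disjoint paths")

Running this over every source-destination pair of a policy-annotated graph gives the kind of numerical path-diversity profile the abstract refers to.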
Abstract:
Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of the hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm, for segmenting non-textured regions; and the Granlund method, for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between different implementations. The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications, and on issues related to the engineering of concurrent image processing applications.
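A minimal sketch of the Row-Column method named above, assuming NumPy as the FFT library; the thesis distributes the row and column passes across an array of Transputers programmed in Occam, which this sequential single-process illustration omits.

# Row-Column 2-D FFT sketch: the 2-D transform is computed as 1-D FFTs over
# every row, followed by 1-D FFTs over every column of the intermediate result.
# Sequential NumPy illustration of the method only.
import numpy as np

def row_column_fft2(image):
    rows_done = np.fft.fft(image, axis=1)   # 1-D FFT of each row
    return np.fft.fft(rows_done, axis=0)    # 1-D FFT of each column

rng = np.random.default_rng(0)
img = rng.random((8, 8))
assert np.allclose(row_column_fft2(img), np.fft.fft2(img))  # agrees with the library 2-D FFT

The row pass and the column pass are each embarrassingly parallel, which is why an array architecture maps onto this method so naturally.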
Abstract:
Congestion control is critical for provisioning quality of service (QoS) over dedicated short range communications (DSRC) vehicle networks for road safety applications. In this paper we propose a congestion control method for DSRC vehicle networks at road intersections, with the aim of providing high-availability, low-latency channels for high-priority emergency safety applications while maximizing channel utilization for low-priority routine safety applications. In this method an offline, simulation-based approach is used to find the best possible configurations of message rate and MAC layer backoff exponent (BE) for a given number of vehicles equipped with DSRC radios. The identified best configurations are then used online by a roadside access point (AP) during system operation. Simulation results demonstrate that this adaptive method significantly outperforms the fixed control method under varying numbers of vehicles. The impact of errors in estimating the number of vehicles in the network on system-level performance is also investigated.
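A minimal sketch of the online lookup step described above; the (message rate, backoff exponent) values in the table are placeholders, since the real entries would come from the offline simulations rather than from this illustration.

# Sketch of the online step: the roadside AP estimates the number of DSRC-equipped
# vehicles and looks up the best (message rate, backoff exponent) pair found offline.
# The table values below are hypothetical placeholders.
BEST_CONFIG = {
    # vehicles: (message_rate_hz, backoff_exponent)
    10:  (10, 3),
    25:  (8, 4),
    50:  (5, 5),
    100: (2, 6),
}

def select_config(estimated_vehicles):
    """Pick the configuration for the closest vehicle count simulated offline."""
    nearest = min(BEST_CONFIG, key=lambda n: abs(n - estimated_vehicles))
    return BEST_CONFIG[nearest]

rate, be = select_config(estimated_vehicles=37)
print(f"message rate = {rate} Hz, MAC backoff exponent = {be}")

Because the search over configurations happens offline, the online decision reduces to a cheap table lookup driven by the AP's current vehicle-count estimate.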
Abstract:
The importance of interorganizational networks in supporting or hindering the achievement of organizational objectives is now widely acknowledged. Network research is directed at understanding network processes and structures, and their impact upon performance. A key process is learning. The concepts of individual, group and organizational learning are long established. This article argues that learning might also usefully be regarded as occurring at a fourth system level, the interorganizational network. The concept of network learning - learning by a group of organizations as a group - is presented, and differentiated from other types of learning, notably interorganizational learning (learning in interorganizational contexts). Four cases of network learning are identified and analysed to provide insights into network learning processes and outcomes. It is proposed that 'network learning episode' offers a suitable unit of analysis for the empirical research needed to develop our understanding of this potentially important concept.
Abstract:
The simulation of a power system such as the More Electric Aircraft is a complex problem. There are conflicting requirements of the simulation: for example, in order to reduce simulation run-times, power ratings that need to be established over long periods of the flight can be calculated using a fairly coarse model, whereas power quality is established over relatively short periods with a detailed model. An important issue is to establish the requirements of the simulation work at an early stage. This paper describes the modelling and simulation strategy adopted for the UK TIMES project, which is looking into the optimisation of the More Electric Aircraft at the system level. Essentially four main requirements of the simulation work have been identified, resulting in four different types of simulation. Each of the simulations is described along with preliminary models and results.
Abstract:
With its low-power operation and flexible networking capabilities, IEEE 802.15.4 has been widely regarded as a strong candidate communication technology for wireless sensor networks (WSNs). With an increasing number of deployments of 802.15.4-based WSNs, it is expected that multiple WSNs could coexist with full or partial overlap in residential or enterprise areas. As WSNs are usually deployed without coordination, communication can degrade significantly under the 802.15.4 channel access scheme, which has a large impact on system performance. In this thesis we investigate the effectiveness of 802.15.4 networks supporting WSN applications in various environments, especially when hidden terminals are present due to the uncoordinated coexistence problem. Both analytical models and system-level simulators are developed to analyse the performance of the random access scheme specified by the IEEE 802.15.4 medium access control (MAC) standard for several network scenarios. The first part of the thesis investigates the effectiveness of a single 802.15.4 network supporting WSN applications. A Markov chain based analytic model is applied to model the MAC behaviour of the IEEE 802.15.4 standard, and a discrete event simulator is also developed to analyse the performance and verify the proposed analytical model. It is observed that 802.15.4 networks can sufficiently support most WSN applications with their various functionalities. Following the investigation of a single network, the uncoordinated coexistence problem of multiple 802.15.4 networks deployed with fully or partially overlapping communication ranges is investigated in the next part of the thesis. Both non-sleep and sleep modes are investigated under different channel conditions by analytic and simulation methods to obtain a comprehensive performance evaluation. It is found that the uncoordinated coexistence problem can significantly degrade the performance of 802.15.4 networks, which is then unlikely to satisfy the QoS requirements of many WSN applications. The proposed analytic model is validated by simulation and can be used to obtain optimal parameter settings before WSN deployment in order to mitigate interference risks.
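A highly simplified Monte Carlo sketch of the kind of contention effect discussed above, assuming each node independently attempts transmission in a slot with a fixed probability; the thesis uses a full Markov chain model of the 802.15.4 slotted CSMA/CA MAC and a discrete event simulator, neither of which this toy illustration reproduces.

# Toy Monte Carlo estimate of per-slot collision probability when n uncoordinated
# nodes each transmit in a slot with probability tau. Only an illustration of
# contention worsening with node count, not the 802.15.4 CSMA/CA model itself.
import random

def collision_probability(n_nodes, tau, n_slots=100_000, seed=1):
    rng = random.Random(seed)
    collisions = attempts = 0
    for _ in range(n_slots):
        transmitters = sum(rng.random() < tau for _ in range(n_nodes))
        if transmitters >= 1:
            attempts += 1
            if transmitters >= 2:
                collisions += 1          # two or more simultaneous transmissions collide
    return collisions / attempts if attempts else 0.0

for n in (5, 10, 20, 40):
    print(n, "nodes:", round(collision_probability(n, tau=0.05), 3))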
Abstract:
Basic construction concepts are developed for many-valued intelligent systems suited to the primary problems of human activity. The hybrid tools used are two-valued, but they simulate neural processes of spatial summation that differ in their level of action, in the inertial and threshold properties of neuron membranes, and in the frequency modulation of successive transmitted messages. All of the enumerated properties and functions are in essence not only discrete in time but also many-valued.
Abstract:
Using examples from physics and mathematical modelling, the paper shows that increasing the operating efficiency of systems can lead to instability. It then examines whether the development of information and telecommunication technology can cause system-level instability in the economy, and which economic policy tools are available to maintain stability.
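To make the mechanism concrete, here is a generic, textbook-style illustration (not one of the paper's own physical or economic models, whose details the abstract does not give): a delayed feedback loop whose correction gain is raised in the name of efficiency eventually overshoots and stops converging.

# Illustrative only: a simple delayed-feedback adjustment rule. A larger gain g
# corrects faster and looks more "efficient", but because the controller sees a
# delayed state, beyond a threshold the loop overshoots and the trajectory diverges.
def simulate(gain, steps=60, delay=1, target=1.0):
    x = [0.0] * (delay + 1)                  # initial history
    for _ in range(steps):
        error = target - x[-1 - delay]       # controller reacts to a delayed observation
        x.append(x[-1] + gain * error)
    return x

for g in (0.3, 0.9, 1.6, 2.2):
    tail = simulate(g)[-5:]
    print(f"gain {g}: last values {[round(v, 2) for v in tail]}")

With delay 1 this recursion is stable only for gains below 1, so the runs with gain 1.6 and 2.2 oscillate with growing amplitude instead of settling at the target.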
Abstract:
Water management has altered both the natural timing and volume of freshwater delivered to Everglades National Park. This is especially true for Taylor Slough and the C-111 basin, as hypersaline events in Florida Bay have been linked to reduced freshwater flow in this area. In light of recent efforts to restore historical flows to the eastern Everglades, an understanding of the impact of this hydrologic shift is needed in order to predict the trajectory of restoration. I conducted a study to assess the importance of season, water chemistry, and hydrologic conditions on the exchange of nutrients in dwarf and fringe mangrove wetlands along Taylor Slough. I also performed mangrove leaf decomposition studies to determine the contribution of biotic and abiotic processes to mass loss, the effect of salinity and season on degradation rates, and the importance of this litter component as a rapid source of nutrients. Dwarf mangrove wetlands consistently imported total nutrients (C, N, and P) and released NO2− + NO3−, with enhanced release during the dry season. Ammonium flux shifted from uptake to release over the study period. Dissolved phosphate activity was difficult to discern in either wetland, as concentrations were often below detection limits. Fluxes of dissolved inorganic nitrogen in the fringe wetland were positively related to DIN concentrations. The opposite was found for total nitrogen in the fringe wetland. A dynamic budget revealed a net annual export of TN to Florida Bay that was highest during the wet season. Simulated increases and decreases in freshwater flow yielded reduced exports of TN to Florida Bay as a result of changes in subsystem and water flux characteristics. Finally, abiotic processes yielded substantial nutrient and mass losses from senesced leaves with little influence of salinity. Dwarf mangrove leaf litter appeared to be a considerable source of nutrients to the water column of this highly oligotrophic wetland. To summarize, nutrient dynamics at the subsystem level were sensitive to short-term changes in hydrologic and seasonal conditions. These findings suggest that increased freshwater flow has the potential to lead to long-term, system-level changes that may reach as far as eastern Florida Bay.
Abstract:
Disk drives are the bottleneck in the processing of the large amounts of data used in almost all common applications. File systems attempt to reduce this by storing data sequentially on the disk drives, thereby reducing access latencies. Although this strategy is useful when data is retrieved sequentially, the access patterns in real-world workloads are not necessarily sequential, and this mismatch results in storage I/O performance degradation. This thesis demonstrates that one way to improve storage performance is to reorganize data on disk drives in the same way in which it is mostly accessed. We identify two classes of accesses: static, where access patterns do not change over the lifetime of the data, and dynamic, where access patterns frequently change over short durations of time, and we propose, implement and evaluate layout strategies for each of these. Our strategies are implemented in a way that allows them to be seamlessly integrated into or removed from the system as desired. We evaluate our layout strategies for static policies using tree-structured XML data, where accesses to the storage device are mostly of two kinds: parent-to-child or child-to-sibling. Our results show that for a specific class of deep-focused queries, the existing file system layout policy performs better by 5–54X. For the non-deep-focused queries, our native layout mechanism shows an improvement of 3–127X. To improve the performance of dynamic access patterns, we implement a self-optimizing storage system that rearranges popular blocks on a dedicated partition based on the observed workload characteristics. Our evaluation shows an improvement of over 80% in disk busy times over a range of workloads. These results show that applying knowledge of data access patterns to allocation decisions can substantially improve I/O performance.
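A minimal sketch of the dynamic-layout idea described above, assuming an in-memory remapping table and made-up block numbers; the thesis implements this inside the storage stack with a dedicated partition, which this user-level illustration only imitates.

# Toy sketch of popularity-driven block reorganisation: count accesses per logical
# block over a window, then remap the hottest blocks to consecutive slots in a
# dedicated "hot" region so they can be served with mostly sequential I/O.
from collections import Counter

def build_hot_remap(access_trace, hot_region_slots):
    """Map the most frequently accessed blocks to contiguous hot-region slots."""
    counts = Counter(access_trace)
    hot_blocks = [blk for blk, _ in counts.most_common(hot_region_slots)]
    return {blk: slot for slot, blk in enumerate(hot_blocks)}

def resolve(block, remap):
    """Return (region, location) for a logical block after remapping."""
    if block in remap:
        return ("hot-partition", remap[block])
    return ("original-layout", block)

trace = [7, 3, 7, 9, 7, 3, 12, 7, 3, 42]       # hypothetical observed workload
remap = build_hot_remap(trace, hot_region_slots=2)
for blk in (7, 3, 42):
    print(blk, "->", resolve(blk, remap))

Rebuilding the remap table periodically from the observed trace is what lets the layout track workloads whose access patterns change over short durations.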