862 results for large-scale network


Relevance:

100.00%

Abstract:

The European Union continues to exert a large influence on the direction of member states' energy policy. The 2020 targets for renewable energy integration have had a significant impact on the operation of current power systems, forcing a rapid change from fossil-fuel-dominated systems to those with high levels of renewable power. Additionally, the overarching aim of an internal energy market throughout Europe has placed, and will continue to place, importance on multi-jurisdictional co-operation regarding energy supply. Combining these renewable energy and multi-jurisdictional supply goals results in a complicated multi-vector energy system, in which understanding the interactions between fossil fuels, renewable energy, interconnection and economic power system operation is increasingly important. This paper provides a novel and systematic methodology for fully understanding the changing dynamics of interconnected energy systems from a gas and power perspective. A fully realistic unit commitment and economic dispatch model of the 2030 power systems of Great Britain and Ireland is developed, combined with a representative gas transmission energy flow model. The importance of multi-jurisdictional integrated energy system operation in one of the most strategically important renewable energy regions is demonstrated.
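
As a hedged illustration of the economic dispatch component of such a model, the sketch below solves a toy single-period dispatch as a linear program. The unit data, costs and solver choice are illustrative assumptions, not details taken from the Great Britain/Ireland model.

```python
# Toy single-period economic dispatch: minimize total fuel cost subject to
# meeting demand, with each unit bounded by its min/max output.
# All unit data below is assumed for illustration.
from scipy.optimize import linprog

costs = [20.0, 35.0, 90.0]      # EUR/MWh for three generic units (assumed)
p_min = [100.0, 50.0, 0.0]      # MW minimum stable generation
p_max = [500.0, 300.0, 150.0]   # MW capacity
demand = 700.0                  # MW system demand in this period

# Single equality constraint: total output equals demand.
res = linprog(
    c=costs,
    A_eq=[[1.0, 1.0, 1.0]],
    b_eq=[demand],
    bounds=list(zip(p_min, p_max)),
    method="highs",
)

print("dispatch (MW):", res.x)   # cheapest units loaded first (merit order)
print("total cost (EUR):", res.fun)
```

A full unit commitment model adds binary on/off decisions, start-up costs and ramping limits on top of this dispatch core, turning the linear program into a mixed-integer program.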

Relevance:

100.00%

Abstract:

Large-scale multiple-input multiple-output (MIMO) communication systems can bring substantial improvements in spectral efficiency and/or energy efficiency, due to the excess degrees of freedom and large array gain. However, large-scale MIMO is expected to be deployed with lower-cost radio frequency (RF) components, which are particularly prone to hardware impairments. Unfortunately, compensation schemes cannot remove the impact of hardware impairments completely, so a certain amount of residual impairment always exists. In this paper, we investigate the impact of residual transmit RF impairments (RTRI) on the spectral and energy efficiency of training-based point-to-point large-scale MIMO systems, and seek to determine the optimal training length and number of antennas that maximize the energy efficiency. We derive deterministic equivalents of the signal-to-interference-plus-noise ratio (SINR) with zero-forcing (ZF) receivers, as well as the corresponding spectral and energy efficiency, which are shown to be accurate even for a small number of antennas. Through an iterative sequential optimization, we find that the optimal training length of systems with RTRI can be smaller than that of ideal-hardware systems in the moderate-SNR regime, and larger in the high-SNR regime. Moreover, we observe that RTRI can significantly decrease the optimal number of transmit and receive antennas.
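
As a hedged, much-simplified sketch of the trade-off being optimized (not the paper's deterministic equivalents), the code below sweeps the training length in a toy model where residual impairments add distortion noise at a fixed relative level, and reports the energy-efficiency-optimal training length. Every constant is an assumption for illustration.

```python
import numpy as np

# Toy training-based point-to-point system: a coherence block of T symbols,
# tau of which are spent on training. Channel-estimation quality follows the
# usual MMSE scaling tau*rho/(1 + tau*rho); residual transmit RF impairments
# are modeled as distortion noise at level delta2 (an EVM-squared style
# constant). All values are assumptions.
T = 200        # coherence block length (symbols)
M = 64         # antennas per side (enters as a simple array gain here)
rho = 10.0     # transmit SNR (linear)
delta2 = 0.01  # residual impairment level (assumed)
P_fix = 1.0    # fixed circuit power, normalized (assumed)

def energy_efficiency(tau):
    quality = tau * rho / (1.0 + tau * rho)              # in [0, 1)
    sinr = M * rho * quality / (1.0 + M * rho * delta2)  # distortion-limited
    se = (1.0 - tau / T) * np.log2(1.0 + sinr)           # training overhead
    return se / (rho + P_fix)                            # normalized bits/Joule

taus = np.arange(1, T)
ee = np.array([energy_efficiency(t) for t in taus])
print("EE-optimal training length:", taus[ee.argmax()])
```

Note how the distortion term caps the achievable SINR: past a point, extra training buys little, which is why impaired systems can prefer shorter training at moderate SNR.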

Relevance:

100.00%

Abstract:

Two-dimensional (2D) materials have generated great interest in the last few years as a new toolbox for electronics. This family of materials includes, among others, metallic graphene, semiconducting transition metal dichalcogenides (such as MoS2) and insulating boron nitride. These materials and their heterostructures offer excellent mechanical flexibility, optical transparency and favorable transport properties for realizing electronic, sensing and optical systems on arbitrary surfaces. In this work, we develop several etch-stop-layer technologies that allow the fabrication of complex 2D devices and present, for the first time, the large-scale integration of graphene with molybdenum disulfide (MoS2), both grown using the fully scalable CVD technique. Transistor devices and logic circuits with an MoS2 channel and graphene as contacts and interconnects are constructed and show high performance. In addition, the graphene/MoS2 heterojunction contact has been systematically compared with MoS2-metal junctions experimentally and studied using density functional theory. The tunability of the graphene work function significantly improves the ohmic contact to MoS2. These high-performance large-scale devices and circuits based on 2D heterostructures pave the way for practical flexible transparent electronics. The authors acknowledge financial support from the Office of Naval Research (ONR) Young Investigator Program, the ONR GATE MURI program, and the Army Research Laboratory. This research has made use of the MI.

Relevance:

100.00%

Abstract:

Otto-von-Guericke-Universität Magdeburg, Faculty of Mathematics, doctoral dissertation, 2016

Relevance:

100.00%

Abstract:

Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which in many projects can reach the order of thousands of variation points, along with the dependency relationships that exist among them. These issues have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped in understanding the major limitations of existing tools. Based on the findings, a novel approach to managing variability was created that employs two main principles to support scalability. First, the separation-of-concerns principle was applied by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees were used to visualize models, rather than the Euclidean-space trees traditionally used. The result was an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on established usability assessment best practices and standards, and was then used with several case studies to benchmark the performance of this work against other existing tools.
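
As a hedged sketch of the hyperbolic visualization idea (not the tool's actual algorithm), the code below computes a simple radial layout of a tree in the Poincaré disk: a node at hyperbolic distance d from the root gets Euclidean radius tanh(d/2), which compresses deep, wide models toward the rim while keeping the focus region readable. The tree structure and step size are illustrative assumptions.

```python
import math

# Minimal radial layout in the Poincare disk. Each node is placed at a
# hyperbolic distance proportional to its depth; the Euclidean radius of a
# point at hyperbolic distance d from the origin is tanh(d / 2), which is
# what squeezes thousands of nodes into the unit disk.
STEP = 1.2  # hyperbolic distance between tree levels (assumed)

def layout(node, children, angle_lo=0.0, angle_hi=2 * math.pi,
           depth=0, pos=None):
    """Assign (x, y) in the unit disk; each subtree gets an angular wedge."""
    if pos is None:
        pos = {}
    theta = (angle_lo + angle_hi) / 2.0
    r = math.tanh(depth * STEP / 2.0)          # Poincare radius for this depth
    pos[node] = (r * math.cos(theta), r * math.sin(theta))
    kids = children.get(node, [])
    if kids:
        span = (angle_hi - angle_lo) / len(kids)
        for i, kid in enumerate(kids):
            layout(kid, children,
                   angle_lo + i * span, angle_lo + (i + 1) * span,
                   depth + 1, pos)
    return pos

# Tiny feature-model-like tree (hypothetical names).
tree = {"root": ["ui", "db"], "ui": ["themeA", "themeB"], "db": ["sql"]}
print(layout("root", tree))
```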

Relevance:

100.00%

Abstract:

It is increasingly recognized that ecological restoration demands conservation action beyond the borders of existing protected areas. This requires the coordination of land uses and management over a larger area, usually with a range of partners, which presents novel institutional challenges for conservation planners. Interviews were undertaken with managers of a purposive sample of large-scale conservation areas in the UK. Interviews were open-ended and analyzed using standard qualitative methods. Results show that a wide variety of organizations are involved in large-scale conservation projects, and that partnerships take time to create and demand resilience in the face of differing organizational practices, staff turnover, and short-term funding. Successful partnerships with local communities depend on the establishment of trust and the availability of external funds to support conservation land uses. We conclude that there is no single institutional model for large-scale conservation: success depends on finding institutional strategies that secure long-term conservation outcomes and ensure that conservation gains are not reversed when funding runs out, private owners change priorities, or land changes hands.

Relevance:

100.00%

Abstract:

This paper presents various techniques relating to large-scale systems. First, large-scale systems are explained and their differences from traditional systems are outlined. Next, possible specifications and requirements for hardware and software are listed. Finally, examples of large-scale systems are presented.

Relevance:

100.00%

Abstract:

We present Dithen, a novel computation-as-a-service (CaaS) cloud platform specifically tailored to the parallel execution of large-scale multimedia tasks. Dithen handles the upload/download of both multimedia data and executable items, the assignment of compute units to multimedia workloads, and the reactive control of the available compute units to minimize the cloud infrastructure cost under deadline-abiding execution. Dithen combines three key properties: (i) the reactive assignment of individual multimedia tasks to available computing units according to availability and predetermined time-to-completion constraints; (ii) optimal resource estimation based on Kalman-filter estimates; (iii) the use of additive-increase/multiplicative-decrease (AIMD) algorithms (well known as the congestion-control mechanism of the Transmission Control Protocol) to control the number of units servicing workloads. The deployment of Dithen over Amazon EC2 spot instances is shown to be capable of processing more than 80,000 video transcoding, face detection and image processing tasks (equivalent to the processing of more than 116 GB of compressed data) for less than $1 in billing cost from EC2. Moreover, the proposed AIMD-based control mechanism, in conjunction with the Kalman estimates, is shown to provide more than a 27% reduction in EC2 spot instance cost against methods based on reactive resource estimation. Finally, Dithen is shown to offer a 38% to 500% reduction in billing cost against the current state of the art in CaaS platforms on Amazon EC2 (Amazon Lambda and Amazon Autoscale). A baseline version of Dithen is currently available at dithen.com.
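
As a hedged sketch of the AIMD idea (mirroring TCP's congestion control, not Dithen's actual controller), the snippet below additively grows the pool of compute units while deadlines are at risk and multiplicatively shrinks it when capacity sits idle. The thresholds and step sizes are invented for illustration.

```python
# Toy AIMD controller for a pool of compute units. "load" is the fraction of
# current capacity needed to meet deadlines (estimated upstream, e.g. by a
# Kalman filter in Dithen's case). All constants are assumptions.
ALPHA = 2          # additive increase: units added when overloaded
BETA = 0.5         # multiplicative decrease factor when underutilized
HIGH, LOW = 0.9, 0.5

def aimd_step(units, load):
    if load > HIGH:                       # deadlines at risk: grow linearly
        return units + ALPHA
    if load < LOW:                        # idle capacity: shrink geometrically
        return max(1, int(units * BETA))
    return units                          # in the comfort band: hold steady

units = 10
for load in [0.95, 0.97, 0.8, 0.4, 0.3, 0.92]:
    units = aimd_step(units, load)
    print(f"load={load:.2f} -> units={units}")
```

The asymmetry is the point: cautious linear growth keeps cost under control, while aggressive geometric shrinking quickly releases (and stops paying for) idle spot instances.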

Relevance:

100.00%

Abstract:

Compaction control using lightweight deflectometers (LWD) is currently being evaluated in several states and countries, and has been fully implemented for pavement construction quality assurance (QA) by a few. Broader implementation has been hampered by the lack of a widely recognized standard for interpreting the load and deflection data obtained during construction QA testing. More specifically, reliable and practical procedures are required for relating these measurements to modulus, the fundamental material property used in pavement design. This study presents a unique set of data and analyses for three different LWDs in a large-scale controlled-condition experiment. Three 4.5 m x 4.5 m test pits were designed and constructed at target moisture and density conditions simulating acceptable and unacceptable construction quality. LWD testing was performed on the constructed layers, along with static plate loading testing, conventional nuclear gauge moisture-density testing, and non-nuclear gravimetric and volumetric water content measurements. Additional material was collected for routine and exploratory tests in the laboratory. These included grain size distributions, soil classification, moisture-density relations, resilient modulus testing at optimum and field conditions, and an advanced experiment of LWD testing on top of the Proctor compaction mold. This unique large-scale controlled-condition experiment provides a high-quality resource of data that future researchers can use to find a rigorous, theoretically sound, and straightforward technique for standardizing LWD determination of modulus and construction QA for unbound pavement materials.
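
For context, here is a hedged sketch of the standard elastic half-space (Boussinesq) reduction that turns one LWD load-deflection reading into a surface modulus. The plate radius, Poisson's ratio, and shape factor below are typical assumed values; a standardized procedure of the kind the study calls for would have to fix them explicitly.

```python
import math

# Boussinesq surface modulus from one LWD drop:
#   E = f * (1 - nu^2) * sigma0 * a / w0
# where sigma0 is the peak contact stress, a the plate radius, w0 the peak
# deflection, and f a stress-distribution shape factor (2 for a uniform
# flexible plate, pi/2 for a rigid plate). Values below are assumptions.
def lwd_modulus(peak_force_kN, plate_radius_m=0.15, deflection_mm=0.5,
                poisson=0.35, shape_factor=2.0):
    area = math.pi * plate_radius_m ** 2             # plate contact area (m^2)
    sigma0 = peak_force_kN * 1e3 / area              # contact stress (Pa)
    w0 = deflection_mm / 1e3                         # deflection (m)
    E = shape_factor * (1 - poisson ** 2) * sigma0 * plate_radius_m / w0
    return E / 1e6                                   # modulus in MPa

print(f"E = {lwd_modulus(6.0):.1f} MPa")  # e.g. a 6 kN drop, 0.5 mm deflection
```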

Relevance:

100.00%

Abstract:

What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization.
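
To make the "self-enforcing rules" point concrete, here is a hedged toy payoff calculation (not from the paper): in a plain N-player public goods game defection pays, but once an institutional fine larger than the private cost of contributing is added, following the rule becomes individually advantageous. All parameter values are illustrative assumptions.

```python
# Toy N-player public goods game with an institutional sanction.
# Each contributor pays c; contributions are multiplied by m and shared
# equally. Without sanctions, defecting strictly dominates whenever m/N < 1.
N, c, m = 50, 1.0, 3.0     # group size, contribution cost, multiplier (assumed)
fine = 1.5                 # institutional fine on detected rule breakers

def payoff(contribute, others_contributing, sanctions_active):
    pot = (others_contributing + (1 if contribute else 0)) * c * m
    pay = pot / N - (c if contribute else 0.0)
    if not contribute and sanctions_active:
        pay -= fine        # the institution punishes free riders
    return pay

others = 40  # how many of the other players contribute (assumed)
print("no institution: cooperate", payoff(True, others, False),
      " defect", payoff(False, others, False))   # defecting pays more
print("with sanction:  cooperate", payoff(True, others, True),
      " defect", payoff(False, others, True))    # cooperating now pays more
```

The fine flips the individual incentive once it exceeds c - c*m/N (here 0.94), which is the game-theoretic sense in which an institutional rule can be self-enforcing.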