929 results for Large-scale Distribution


Relevance:

100.00%

Publisher:

Abstract:

The European Union continues to exert a large influence on the direction of member states' energy policy. The 2020 targets for renewable energy integration have had a significant impact on the operation of current power systems, forcing a rapid change from fossil-fuel-dominated systems to those with high levels of renewable power. Additionally, the overarching aim of an internal energy market throughout Europe has placed, and will continue to place, importance on multi-jurisdictional co-operation regarding energy supply. Combining these renewable energy and multi-jurisdictional supply goals results in a complicated multi-vector energy system, in which understanding the interactions between fossil fuels, renewable energy, interconnection and economic power system operation is increasingly important. This paper provides a novel and systematic methodology for fully understanding the changing dynamics of interconnected energy systems from a gas and power perspective. A fully realistic unit commitment and economic dispatch model of the 2030 power systems in Great Britain and Ireland, combined with a representative gas transmission energy flow model, is developed. The importance of multi-jurisdictional integrated energy system operation in one of the most strategically important renewable energy regions is demonstrated.
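To make the modelling building block concrete, below is a minimal merit-order economic dispatch in Python. It is a deliberately simplified stand-in for the paper's unit commitment and economic dispatch model, and every unit name, capacity and cost is invented for illustration.

```python
# Minimal merit-order economic dispatch. Unit data are hypothetical,
# not taken from the paper's GB/Ireland model.
units = [
    {"name": "wind",   "capacity": 800.0, "cost": 0.0},
    {"name": "ccgt_1", "capacity": 400.0, "cost": 55.0},
    {"name": "ccgt_2", "capacity": 400.0, "cost": 60.0},
    {"name": "coal",   "capacity": 500.0, "cost": 70.0},
    {"name": "peaker", "capacity": 150.0, "cost": 120.0},
]

def merit_order_dispatch(units, demand_mw):
    """Serve demand with the cheapest available units first."""
    dispatch, remaining = {}, demand_mw
    for unit in sorted(units, key=lambda u: u["cost"]):
        output = min(unit["capacity"], remaining)
        dispatch[unit["name"]] = output
        remaining -= output
    if remaining > 1e-9:
        raise ValueError(f"demand exceeds total capacity by {remaining:.1f} MW")
    return dispatch

# Dispatch 1500 MW: wind and ccgt_1 run at full output, ccgt_2 covers the rest.
print(merit_order_dispatch(units, demand_mw=1500.0))
```

A full unit commitment formulation would add binary on/off decisions, start-up costs and ramping constraints on top of this cost-ordering logic; the paper further couples such a model to a gas transmission flow model.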

Relevance:

100.00%

Publisher:

Abstract:

Large-scale multiple-input multiple-output (MIMO) communication systems can bring substantial improvements in spectral efficiency and/or energy efficiency, owing to their excess degrees of freedom and huge array gain. However, large-scale MIMO is expected to be deployed with lower-cost radio frequency (RF) components, which are particularly prone to hardware impairments. Unfortunately, compensation schemes cannot remove the impact of hardware impairments completely, so a certain amount of residual impairment always remains. In this paper, we investigate the impact of residual transmit RF impairments (RTRI) on the spectral and energy efficiency of training-based point-to-point large-scale MIMO systems, and seek to determine the optimal training length and number of antennas that maximize the energy efficiency. We derive deterministic equivalents of the signal-to-interference-and-noise ratio (SINR) with zero-forcing (ZF) receivers, as well as the corresponding spectral and energy efficiency, which are shown to be accurate even for a small number of antennas. Through an iterative sequential optimization, we find that the optimal training length of systems with RTRI can be smaller than that of ideal-hardware systems in the moderate SNR regime, and larger in the high SNR regime. Moreover, we observe that RTRI can significantly decrease the optimal number of transmit and receive antennas.
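Analyses of this kind typically start from an EVM-style impairment model. The NumPy sketch below estimates the ergodic per-stream ZF rate when residual transmit distortion is modelled as additive Gaussian noise with power delta^2 times the signal power; the antenna counts, SNR and impairment level are assumed values, and the paper's deterministic equivalents are not reproduced here.

```python
import numpy as np

# Monte Carlo sketch: per-stream rate of a zero-forcing (ZF) receiver under
# residual transmit RF impairments, modeled as additive Gaussian distortion
# of power delta^2 * p per transmit stream (EVM-style model).
# With y = H(x + d) + n and ZF equalization x_hat = pinv(H) y, stream k sees
# SINR_k = p / (delta^2 * p + sigma^2 * [(H^H H)^{-1}]_kk).

rng = np.random.default_rng(0)
M, K = 64, 8                    # receive / transmit antennas (illustrative)
snr_db, delta = 10.0, 0.08      # SNR and impairment level (~8% EVM)
p = 1.0
sigma2 = p / 10 ** (snr_db / 10)

rates = []
for _ in range(2000):
    H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
    G = np.linalg.inv(H.conj().T @ H)    # (H^H H)^{-1}
    noise_gain = np.real(np.diag(G))     # ZF noise amplification per stream
    sinr = p / (delta ** 2 * p + sigma2 * noise_gain)
    rates.append(np.log2(1 + sinr).mean())

print(f"ergodic per-stream rate: {np.mean(rates):.2f} bit/s/Hz")
```

Note how the distortion term passes through the ZF equalizer unattenuated, which is why RTRI caps the achievable SINR at 1/delta^2 even as the SNR grows.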

Relevance:

100.00%

Publisher:

Abstract:

Two-dimensional (2D) materials have generated great interest in the last few years as a new toolbox for electronics. This family of materials includes, among others, metallic graphene, semiconducting transition metal dichalcogenides (such as MoS2) and insulating boron nitride. These materials and their heterostructures offer excellent mechanical flexibility, optical transparency and favorable transport properties for realizing electronic, sensing and optical systems on arbitrary surfaces. In this work, we develop several etch-stop-layer technologies that allow the fabrication of complex 2D devices, and present for the first time the large-scale integration of graphene with molybdenum disulfide (MoS2), both grown using the fully scalable CVD technique. Transistor devices and logic circuits with MoS2 channels and graphene contacts and interconnects are constructed and show high performance. In addition, the graphene/MoS2 heterojunction contact has been systematically compared with MoS2-metal junctions experimentally and studied using density functional theory. The tunability of the graphene work function significantly improves the ohmic contact to MoS2. These high-performance large-scale devices and circuits based on 2D heterostructures pave the way for practical flexible transparent electronics. The authors acknowledge financial support from the Office of Naval Research (ONR) Young Investigator Program, the ONR GATE MURI program, and the Army Research Laboratory. This research has made use of the MI.

Relevance:

100.00%

Publisher:

Abstract:

Otto-von-Guericke-Universität Magdeburg, Faculty of Mathematics, doctoral dissertation, 2016

Relevance:

100.00%

Publisher:

Abstract:

Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which, in many projects, can reach the order of thousands of variability points, along with the dependency relationships that exist among them. These issues have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped in understanding the major limitations of existing tools. Based on the findings, a novel approach to managing variability was created, employing two main principles to support scalability. First, the separation-of-concerns principle was applied by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees were used to visualise models (in contrast to the Euclidean-space trees traditionally used). The result was an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on established usability assessment best practices and standards. The framework was then used with several case studies to benchmark the performance of this work against other existing tools.
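As a rough sketch of the hyperbolic-tree idea, the following Python snippet lays out a toy feature model in the Poincaré disk: depth maps to a radius via tanh, so breadth that grows exponentially with depth still fits inside the unit circle. The feature names and the layout heuristic are invented for illustration and are not the thesis's tool.

```python
import math

# Toy hyperbolic-tree layout in the Poincare disk. Depth maps to a radius
# via tanh, keeping large feature models inside the unit circle; each
# node's angular wedge is split among children by leaf count.

def count_leaves(node):
    children = node.get("children", [])
    return 1 if not children else sum(count_leaves(c) for c in children)

def layout(node, depth=0, lo=0.0, hi=2 * math.pi, step=1.2, pos=None):
    """Assign (x, y) coordinates in the unit disk to every node."""
    if pos is None:
        pos = {}
    r = math.tanh(depth * step / 2)          # hyperbolic radius in [0, 1)
    theta = (lo + hi) / 2
    pos[node["name"]] = (r * math.cos(theta), r * math.sin(theta))
    children = node.get("children", [])
    total = sum(count_leaves(c) for c in children)
    start = lo
    for child in children:
        span = (hi - lo) * count_leaves(child) / total
        layout(child, depth + 1, start, start + span, step, pos)
        start += span
    return pos

feature_model = {"name": "product_line", "children": [
    {"name": "ui", "children": [{"name": "themes"}, {"name": "layouts"}]},
    {"name": "storage", "children": [{"name": "sql"}, {"name": "nosql"}]},
]}
for name, (x, y) in layout(feature_model).items():
    print(f"{name:12s} ({x:+.2f}, {y:+.2f})")
```

The payoff of the hyperbolic mapping is that a subtree brought into focus near the disk centre occupies most of the screen area while the rest of the model stays visible, compressed toward the rim.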

Relevance:

100.00%

Publisher:

Abstract:

It is increasingly recognized that ecological restoration demands conservation action beyond the borders of existing protected areas. This requires the coordination of land uses and management over a larger area, usually with a range of partners, which presents novel institutional challenges for conservation planners. Interviews were undertaken with managers of a purposive sample of large-scale conservation areas in the UK. Interviews were open-ended and analyzed using standard qualitative methods. Results show a wide variety of organizations are involved in large-scale conservation projects, and that partnerships take time to create and demand resilience in the face of different organizational practices, staff turnover, and short-term funding. Successful partnerships with local communities depend on the establishment of trust and the availability of external funds to support conservation land uses. We conclude that there is no single institutional model for large-scale conservation: success depends on finding institutional strategies that secure long-term conservation outcomes, and ensure that conservation gains are not reversed when funding runs out, private owners change priorities, or land changes hands.

Relevance:

100.00%

Publisher:

Abstract:

Background and Aims: True Colours is an online prospective mood-monitoring system developed at the University of Oxford to assist local patients and clinicians with monitoring the course of illness in bipolar disorder. We report our initial experiences of using True Colours for research purposes in the Bipolar Disorder Research Network (BDRN; www.bdrn.org), a large research network of individuals with mood disorders throughout the UK. Methods: After initial piloting to ensure the practicality and acceptability of using True Colours within BDRN, we invited all BDRN participants (n = 7000) to take part in weekly True Colours ratings via three postal invitations sent over an 8-month period. Results: Following the three postal invitations, 915 individuals have so far expressed an interest in joining True Colours and, of these, 662 (72.3%) have registered. Thirty-two of those who registered (5%) have so far asked to leave the system. Positive feedback from participants has focused on the ease of use and convenience of True Colours and the potential clinical utility of the graphical representation of weekly mood scores. Conclusions: We have demonstrated that large-scale prospective mood monitoring for research purposes using a contemporary online approach is feasible. Challenges have included: (i) variation in participants' technological ability; (ii) management of requests for clinical advice based on mood scores within a research setting; and (iii) the resources required to provide access and ongoing support for participants using True Colours. We continue to expand recruitment to True Colours within BDRN, and plan to trial email invitations in the next phase of recruitment.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents various techniques relating to large-scale systems. First, large-scale systems are explained and their differences from traditional systems described. Next, possible hardware and software specifications and requirements are listed. Finally, examples of large-scale systems are presented.

Relevance:

100.00%

Publisher:

Abstract:

We present Dithen, a novel computation-as-a-service (CaaS) cloud platform specifically tailored to the parallel execution of large-scale multimedia tasks. Dithen handles the upload/download of both multimedia data and executable items, the assignment of compute units to multimedia workloads, and the reactive control of the available compute units to minimize the cloud infrastructure cost under deadline-abiding execution. Dithen combines three key properties: (i) the reactive assignment of individual multimedia tasks to available computing units according to availability and predetermined time-to-completion constraints; (ii) optimal resource estimation based on Kalman-filter estimates; and (iii) the use of additive-increase multiplicative-decrease (AIMD) algorithms (well known as the resource-management mechanism of the Transmission Control Protocol) to control the number of units servicing workloads. The deployment of Dithen over Amazon EC2 spot instances is shown to be capable of processing more than 80,000 video transcoding, face detection and image processing tasks (equivalent to the processing of more than 116 GB of compressed data) for less than $1 in billing cost from EC2. Moreover, the proposed AIMD-based control mechanism, in conjunction with the Kalman estimates, is shown to provide more than a 27% reduction in EC2 spot instance cost compared with methods based on reactive resource estimation. Finally, Dithen is shown to offer a 38% to 500% reduction in billing cost against the current state of the art in CaaS platforms on Amazon EC2 (Amazon Lambda and Amazon Autoscale). A baseline version of Dithen is currently available at dithen.com.
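The abstract names two generic control techniques, Kalman-filter estimation and AIMD. The minimal Python sketch below shows how the two can interact; the parameters, class and function names are illustrative assumptions, not Dithen's implementation.

```python
# Sketch of the two control ideas named in the abstract: a scalar Kalman
# filter tracking mean per-task processing time, and AIMD control of the
# number of compute units. All parameters are illustrative, not Dithen's.

class ScalarKalman:
    """1-D Kalman filter with a random-walk state model."""
    def __init__(self, x0=1.0, p0=1.0, q=0.01, r=0.25):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                    # predict: the state may drift
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)          # correct with measurement z
        self.p *= 1 - k
        return self.x

def aimd_units(units, backlog, task_time, deadline,
               add=2, beta=0.5, u_min=1, u_max=100):
    """Additive increase when the deadline is at risk,
    multiplicative decrease when there is slack."""
    projected = backlog * task_time / units  # naive time-to-completion
    if projected > deadline:
        units += add                         # additive increase
    else:
        units = int(units * beta)            # multiplicative decrease
    return max(u_min, min(u_max, units))

kf, units = ScalarKalman(), 4
for measured in [1.2, 1.4, 1.1, 0.9, 1.3]:   # observed task times (s)
    est = kf.update(measured)
    units = aimd_units(units, backlog=500, task_time=est, deadline=120.0)
    print(f"estimated task time {est:.2f} s -> {units} units")
```

The same fairness and stability arguments that justify AIMD in TCP motivate its use here: aggressive back-off keeps idle (billed) units to a minimum, while steady additive growth recovers capacity when the deadline is threatened.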

Relevance:

100.00%

Publisher:

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all information relevant to their research. In response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature, extracting and aggregating the information found within while automatically normalizing the variability of natural language statements. Among these tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in the biomedical literature, and their representation as a set of recursive event structures. The 2009–2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on machine-learning approaches and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are spotted by end-users. This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes/gene-products. We cast the hypothesis generation problem as supervised network topology prediction, i.e., predicting new edges in the network, as well as types and directions for these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
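The hypothesis-generation step can be pictured as standard supervised link prediction. The sketch below shows the general recipe with networkx and scikit-learn; the toy graph, feature set and gene names are invented for illustration, and the thesis's actual features over large event networks are far richer.

```python
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch: hypothesis generation cast as supervised link prediction on a
# small toy gene/gene-product network. Graph, features and examples are
# invented for illustration only.

G = nx.Graph()
G.add_edges_from([("p53", "MDM2"), ("MDM2", "AKT1"), ("AKT1", "mTOR"),
                  ("p53", "BAX"), ("BAX", "BCL2"), ("mTOR", "S6K1")])

def pair_features(g, u, v):
    common = len(list(nx.common_neighbors(g, u, v)))
    jaccard = next(nx.jaccard_coefficient(g, [(u, v)]))[2]
    return [common, jaccard, g.degree(u) + g.degree(v)]

# Positives: known edges. Negatives: sampled non-edges. (A real setup would
# hold positives out of the graph before featurizing, to avoid leakage.)
positives = list(G.edges())
negatives = [("p53", "S6K1"), ("BAX", "mTOR"), ("BCL2", "AKT1")]
X = np.array([pair_features(G, u, v) for u, v in positives + negatives])
y = np.array([1] * len(positives) + [0] * len(negatives))

clf = LogisticRegression().fit(X, y)
candidate = ("p53", "AKT1")  # an unobserved pair to score
prob = clf.predict_proba([pair_features(G, *candidate)])[0, 1]
print(f"predicted interaction probability for {candidate}: {prob:.2f}")
```

Predicting edge *types* and *directions*, as the thesis does, turns the binary classifier into a multi-class one over directed candidate pairs, but the overall supervised recipe is the same.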

Relevance:

100.00%

Publisher:

Abstract:

Compaction control using lightweight deflectometers (LWD) is currently being evaluated in several states and countries, and has been fully implemented for pavement construction quality assurance (QA) by a few. Broader implementation has been hampered by the lack of a widely recognized standard for interpreting the load and deflection data obtained during construction QA testing. More specifically, reliable and practical procedures are required for relating these measurements to the fundamental material property, modulus, used in pavement design. This study presents a unique set of data and analyses for three different LWDs in a large-scale controlled-condition experiment. Three 4.5 m × 4.5 m test pits were designed and constructed at target moisture and density conditions simulating acceptable and unacceptable construction quality. LWD testing was performed on the constructed layers along with static plate loading testing, conventional nuclear gauge moisture-density testing, and non-nuclear gravimetric and volumetric water content measurements. Additional material was collected for routine and exploratory tests in the laboratory. These included grain size distributions, soil classification, moisture-density relations, resilient modulus testing at optimum and field conditions, and an advanced experiment of LWD testing on top of the Proctor compaction mold. This unique large-scale controlled-condition experiment provides a high-quality resource of data that future researchers can use to find a rigorous, theoretically sound, and straightforward technique for standardizing LWD determination of modulus and construction QA for unbound pavement materials.
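For context, LWD load-deflection data are commonly reduced to a surface modulus with the Boussinesq-based relation E = f (1 − ν²) σ₀ a / d₀, where σ₀ is the peak contact stress, a the plate radius, d₀ the peak deflection and f a stress-distribution (shape) factor. The sketch below implements that standard relation with illustrative input values; it is not the study's analysis procedure.

```python
import math

# Boussinesq-based surface modulus commonly used to reduce LWD
# load-deflection data: E = f * (1 - nu^2) * sigma0 * a / d0.
# Input values below are illustrative only.

def lwd_modulus(peak_force_kn, plate_radius_m, deflection_mm,
                poisson=0.35, shape_factor=2.0):
    """Surface modulus in MPa. shape_factor: 2 for a flexible plate with
    uniform contact stress, pi/2 for a rigid plate."""
    area = math.pi * plate_radius_m ** 2       # plate contact area (m^2)
    sigma0 = peak_force_kn * 1e3 / area        # peak contact stress (Pa)
    d0 = deflection_mm / 1e3                   # peak deflection (m)
    e_pa = shape_factor * (1 - poisson ** 2) * sigma0 * plate_radius_m / d0
    return e_pa / 1e6                          # Pa -> MPa

# e.g. a 7 kN drop on a 150 mm radius plate producing 0.4 mm of deflection
print(f"E = {lwd_modulus(7.0, 0.150, 0.4):.0f} MPa")
```

Much of the standardization difficulty the study addresses lies in the inputs this formula takes for granted: the choice of shape factor, the assumed Poisson's ratio, and whether the measured deflection reflects only the layer being accepted.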

Relevance:

100.00%

Publisher:

Abstract:

Permeability of a rock is a dynamic property that varies spatially and temporally. Fractures provide the most efficient channels for fluid flow and thus directly contribute to the permeability of the system. Fractures usually form as a result of a combination of tectonic stresses, gravity (i.e. lithostatic pressure) and fluid pressures. High pressure gradients alone can cause fracturing, a process termed hydrofracturing, which can determine caprock (seal) stability or reservoir integrity. Fluids also transport mass and heat, and are responsible for the formation of veins by precipitating minerals within open fractures. Veining (healing) thus directly influences the rock's permeability. Upon deformation, these closed fractures (veins) can refracture and the cycle starts again. This fracturing-healing-refracturing cycle is a fundamental part of studying the deformation dynamics and permeability evolution of rock systems. Such study is generally accompanied by fracture network characterization focusing on the network topology that determines network connectivity. Fracture characterization allows quantitative and qualitative data on fractures to be acquired and forms an important part of reservoir modeling. This thesis highlights the importance of fracture healing and of veins' mechanical properties for deformation dynamics. It shows that permeability varies spatially and temporally, and that healed systems (veined rocks) should not be treated as fractured systems (rocks without veins). Field observations also demonstrate the influence of contrasting mechanical properties, in addition to the complexities of vein microstructures that can form in low-porosity, low-permeability layered sequences. The thesis also presents graph theory as a characterization method for obtaining statistical measures of evolving network connectivity, and proposes the measures a good reservoir should exhibit to combine potentially large permeability with robustness against healing. The results presented in the thesis have applications in hydrocarbon and geothermal reservoir exploration, the mining industry, underground waste disposal, CO2 injection and groundwater modeling.
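As a small illustration of the graph-theory characterization the thesis proposes, the sketch below treats fracture intersections as nodes and fracture segments as edges, and computes a few standard connectivity measures with networkx; the network itself is invented, not field data.

```python
import networkx as nx

# Toy fracture network: intersections as nodes, fracture segments as edges.
# One connected cluster containing a loop, plus an isolated cluster.
G = nx.Graph()
G.add_edges_from([(1, 2), (2, 3), (3, 4), (2, 5), (5, 6), (6, 3),
                  (7, 8), (8, 9)])

components = list(nx.connected_components(G))
largest = max(components, key=len)
# Cyclomatic number: independent loops, a proxy for redundant flow paths.
cycles = G.number_of_edges() - G.number_of_nodes() + len(components)

print(f"clusters: {len(components)}, largest: {len(largest)} nodes")
print(f"independent loops (cyclomatic number): {cycles}")
print(f"mean degree: {2 * G.number_of_edges() / G.number_of_nodes():.2f}")

# 'Healing' a vein amounts to removing an edge; robustness can be probed
# by checking whether key flow paths survive edge removal.
G.remove_edge(2, 3)
print("flow path 1 -> 4 survives healing:", nx.has_path(G, 1, 4))
```

In this picture, a "good" reservoir in the thesis's sense is a network whose connectivity measures degrade slowly as edges are removed, i.e. one with redundant loops rather than a single critical backbone.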