910 results for Operational analytics


Relevance:

20.00%

Publisher:

Abstract:

The value of integrating heat storage into a geothermal district heating system has been investigated. The behaviour of the system under a novel operational strategy has been simulated, focusing on the energetic, economic and environmental effects of incorporating the heat storage within the system. A typical geothermal district heating system consists of several production wells, a system of pipelines for the transportation of hot water to end-users, one or more re-injection wells and peak-up devices (usually fossil-fuel boilers). Traditionally in these systems, the production wells change their production rate throughout the day according to heat demand, and if their maximum capacity is exceeded the peak-up devices are used to meet the balance of the heat demand. In this study, it is proposed to maintain a constant geothermal production and add heat storage to the network. Hot water will then be stored when heat demand is lower than the production, and the stored hot water will be released into the system to cover the peak demands (or part of them). The intention is not to phase out the peak-up devices entirely, but to decrease their use, as these will often be installed anyway for back-up purposes. Both the integration of heat storage into such a system and the novel operational strategy are the main novelties of this thesis. A robust algorithm for the sizing of these systems has been developed. The main inputs are the geothermal production data, the heat demand data over one year or more, and the topology of the installation. The outputs are the sizing of the whole system, including the necessary number of production wells, the size of the heat storage and the dimensions of the pipelines, among others. The results provide several useful insights into the initial design considerations for these systems, particularly emphasizing the importance of heat losses.
Simulations are carried out for three different sizing cases of the installation (small, medium and large) to examine the influence of system scale. In the second phase of the work, two algorithms are developed which study in detail the operation of the installation over an arbitrary day and a whole year, respectively. The first algorithm can be a potentially powerful tool for the operators of the installation, who can know a priori how to operate it on any given day, given the heat demand. The second algorithm is used to obtain the amount of electricity used by the pumps as well as the amount of fuel used by the peak-up boilers over a whole year. These comprise the main operational costs of the installation and are among the main inputs of the third part of the study. In the third part, an integrated energetic, economic and environmental analysis of the studied installation is carried out, together with a comparison with the traditional case. The results show that by implementing heat storage under the novel operational strategy, heat is generated more cheaply: all the financial indices improve, more geothermal energy is utilised and less fuel is used in the peak-up boilers, with consequent environmental benefits, compared to the traditional case. Furthermore, it is shown that the most attractive sizing case is the large one, although the addition of the heat storage has the greatest impact on the medium case. In other words, the geothermal component of the installation should be sized as large as possible. This analysis indicates that the proposed solution is beneficial from energetic, economic and environmental perspectives; the aim of this study is therefore fully achieved.
Furthermore, the new models for the sizing, operation and economic/energetic/environmental analyses of these kinds of systems can be used with few adaptations for real cases, making the practical applicability of this study evident. Taking this study as a starting point, further work could include the integration of these systems with end-user demands, further analysis of component parts of the installation (such as the heat exchangers) and the integration of a heat pump to maximise the utilisation of geothermal energy.
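The constant-production strategy the abstract describes (charge the store when demand is below the fixed geothermal output, discharge at peaks, and let the peak-up boilers cover only the remainder) can be sketched as a simple dispatch rule. This is a minimal illustration under assumed units and capacities, not the thesis's actual model:

```python
# Hypothetical sketch of the dispatch rule: geothermal output is constant,
# surplus heat charges the store, and peaks are met first from storage,
# then from the peak-up boilers. All names and numbers are illustrative.

def dispatch(demand_series, geo_output, store_capacity):
    """Return (storage level, boiler use) per time step, in the same heat units."""
    level = 0.0                      # current storage level
    levels, boiler = [], []
    for demand in demand_series:
        surplus = geo_output - demand
        if surplus >= 0:
            # demand below constant production: charge the store (up to capacity)
            level = min(store_capacity, level + surplus)
            boiler.append(0.0)
        else:
            deficit = -surplus
            from_store = min(level, deficit)
            level -= from_store
            # peak-up boilers cover whatever the store cannot
            boiler.append(deficit - from_store)
        levels.append(level)
    return levels, boiler

levels, boiler = dispatch([5, 8, 12, 15, 9], geo_output=10, store_capacity=6)
# The boiler fires only when the 15-unit peak exhausts the store.
```

The same loop, run over a year of demand data, yields the boiler fuel and pumping electricity figures that feed the economic analysis.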

Relevance:

20.00%

Publisher:

Abstract:

As usage metrics attain an increasingly central role in library system assessment and analysis, librarians tasked with system selection, implementation, and support are driven to identify metric approaches that simultaneously require less technical complexity and offer greater data granularity. Such approaches allow systems librarians to present evidence-based claims about platform usage behaviors while reducing the resources necessary to collect such information, representing a novel approach to real-time user analysis as well as a dual benefit in active and preventative cost reduction. As part of the DSpace implementation for the MD SOAR initiative, the Consortial Library Application Support (CLAS) division has begun test implementation of the Google Tag Manager analytics system in an attempt to collect custom analytical dimensions that track author- and university-specific download behaviors. Building on the work of Conrad, CLAS seeks to demonstrate that the GTM approach to custom analytics provides granular, metadata-based usage statistics in an approach that will prove extensible for additional statistical gathering in the future. This poster will discuss the methodology used to develop these custom tag approaches, the benefits of the GTM model, and the risks and benefits associated with further implementation.

Relevance:

20.00%

Publisher:

Abstract:

Wind energy is one of the most promising and fastest-growing sectors of energy production. Wind is an ecologically friendly and relatively cheap energy resource, available for development in practically all corners of the world (wherever the wind blows). Today, wind power is broadly developed in the Scandinavian countries. Three important challenges concerning sustainable development, i.e. energy security, climate change and energy access, make a compelling case for large-scale utilization of wind energy. In Finland, according to the climate and energy strategy adopted in 2008, electricity generated by wind farms should reach 6-7% of the country's total consumption by 2020 [1]. The main challenges associated with wind energy production are the harsh operational conditions that often accompany turbine operation in northern climates, and poor accessibility for maintenance and service. One major problem requiring a solution is the icing of turbine structures. Icing reduces the performance of wind turbines which, during a long cold period, can significantly affect the reliability of power supply. In order to predict and control power performance, the process of ice accretion has to be carefully tracked. There are two ways to detect icing: directly or indirectly. The first relies on dedicated ice-detection instruments; the second uses indirect characteristics of turbine performance. One such indirect method for ice detection and power loss estimation has been proposed and used in this paper, and its results were compared to those obtained directly from the ice sensors. The data used were measured at the Muukko wind farm, southeast Finland, during the project 'Wind power in cold climate and complex terrain'. The project was carried out in 9/2013 - 8/2015 with the partners Lappeenranta University of Technology, Alstom Renovables España S.L., TuuliMuukko, and TuuliSaimaa.
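Indirect icing detection of the kind described here typically compares measured turbine power against the power expected from a reference power curve at the observed wind speed, and flags a sustained shortfall as probable icing. The sketch below illustrates that idea; the reference curve, threshold and window are assumptions, not the paper's calibrated values:

```python
# Illustrative indirect icing indicator: flag time steps where measured power
# stays well below the power-curve expectation for several consecutive samples.
# Curve, 20% shortfall threshold and 3-sample window are assumed for the example.

def expected_power(wind_speed, curve):
    """Linear interpolation on a (speed, power) reference power curve."""
    for (s0, p0), (s1, p1) in zip(curve, curve[1:]):
        if s0 <= wind_speed <= s1:
            return p0 + (p1 - p0) * (wind_speed - s0) / (s1 - s0)
    return 0.0  # outside the operating range

def icing_flags(samples, curve, shortfall=0.2, window=3):
    """One boolean per (wind_speed, measured_power) sample: probable icing."""
    below = []
    for speed, measured in samples:
        exp = expected_power(speed, curve)
        below.append(exp > 0 and measured < (1 - shortfall) * exp)
    # require `window` consecutive under-performing samples before flagging
    return [i >= window - 1 and all(below[i - window + 1: i + 1])
            for i in range(len(below))]

curve = [(3, 0), (10, 1000), (15, 2000)]          # (m/s, kW), assumed
samples = [(8, 700), (8, 500), (8, 450), (8, 400), (8, 680)]
flags = icing_flags(samples, curve)
```

In practice the shortfall would be converted into an estimated power loss, which is what the paper compares against the readings of the dedicated ice sensors.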

Relevance:

20.00%

Publisher:

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved. To further uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize a purely statistical or purely visual approach to comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in their findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery.
Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts. 
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
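When many metrics are tested between two cohorts at once, as in the high-volume hypothesis testing (HVHT) approach above, the raw p-values cannot be read individually; a multiple-comparison screen such as the Benjamini-Hochberg false-discovery-rate procedure is the standard remedy. The dissertation's own correction method may differ; this sketch with illustrative p-values shows only the general idea:

```python
# Benjamini-Hochberg FDR screen over a batch of p-values, one per metric
# tested between the two cohorts. The p-values below are made up for
# illustration; they are not results from the dissertation.

def benjamini_hochberg(p_values, alpha=0.05):
    """Return a boolean 'significant' flag per p-value, controlling FDR at alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    # find the largest rank k with p_(k) <= (k / m) * alpha
    cutoff = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * alpha:
            cutoff = rank
    significant = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= cutoff:
            significant[i] = True
    return significant

flags = benjamini_hochberg([0.001, 0.20, 0.012, 0.03, 0.04], alpha=0.05)
```

A visual front end can then surface only the flagged metrics, which is the division of labour between automated statistics and user-driven exploration that the dissertation argues for.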

Relevance:

20.00%

Publisher:

Abstract:

Field lab: Entrepreneurial and innovative ventures

Relevance:

20.00%

Publisher:

Abstract:

A comprehensive database of temperature, salinity and bio-chemical parameters in the Mediterranean and Black Sea has been constructed through extensive co-operation among the bordering countries. Statistical climatologies have been computed from all assembled and quality-controlled data. The database, designed to initialise and validate prediction models, also represents a system for quality-checking new incoming data produced by ocean observing systems.
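A climatology-based quality check of the kind alluded to above usually amounts to comparing incoming values against climatological bounds and flagging outliers. The sketch below is a hedged illustration; the bounds, flag convention and data are assumptions, not the project's actual limits:

```python
# Minimal climatological range check: flag new observations that fall
# outside assumed bounds. Flag values follow a common oceanographic
# convention (1 = good, 4 = bad), but that choice is an assumption here.

def range_check(values, low, high):
    """Return a QC flag per value: 1 = within climatological range, 4 = outside."""
    return [1 if low <= v <= high else 4 for v in values]

# e.g. surface temperatures (deg C) screened against assumed bounds
flags = range_check([14.2, 15.1, 38.0, 16.4], low=10.0, high=30.0)
```

Real systems refine this per region, depth and season using the computed climatologies, but the flag-per-observation structure is the same.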

Relevance:

20.00%

Publisher:

Abstract:

Copernicus is a European system for monitoring the Earth. COPERNICUS-CMEMS products and services are meant to serve all marine applications: Marine Resources, Maritime Safety, Coastal and Marine Environment, and Seasonal Forecast & Climate. The service is ambitious, as the ocean is complex and many processes are involved, from physical oceanography, biology and geology to ocean-atmosphere fluxes, solar radiation, moon-induced tides and anthropic activity. A multi-platform approach is therefore essential, taking into account sea-level stations, coastal buoys, HF radars, river flows, drifting buoys, sea mammals or fish fitted with sensors, vessels, gliders, and floats.

Relevance:

20.00%

Publisher:

Abstract:

Operational approaches have been increasingly developed and used to provide marine data and information services for different socio-economic sectors of Blue Growth and to advance knowledge about the marine environment. The objective of operational oceanographic research is to develop and improve the efficiency, timeliness, robustness and product quality of this approach. This white paper aims to address key scientific challenges and research priorities for the development of operational oceanography in Europe over the next 5-10 years. Knowledge gaps and deficiencies are identified in relation to common scientific challenges in four EuroGOOS knowledge areas: European Ocean Observations, Modelling and Forecasting Technology, Coastal Operational Oceanography and Operational Ecology. The areas "European Ocean Observations" and "Modelling and Forecasting Technology" focus on the further advancement of the basic instruments and capacities for European operational oceanography, while "Coastal Operational Oceanography" and "Operational Ecology" aim at developing new operational approaches for the corresponding knowledge areas.

Relevance:

20.00%

Publisher:

Abstract:

Employees are the human capital which, to a great extent, contributes to the success and development of high-performance, sustainable organizations. In a work environment, there is a need for a tool for tracking and following up on each employee's professional progress while staying aligned with the organization's strategic and operational goals and objectives. The research work within this Thesis aims to contribute to improving employees' self-awareness and self-regulation; two predominant research areas are also studied and analyzed: Visual Analytics and Gamification. Visual Analytics enables the specification of personalized dashboard interfaces with alerts and indicators to keep employees aware of their skills and to continuously monitor how to improve their expertise, simultaneously promoting behavioral change and the adoption of good practices. The study of Gamification techniques with Talent Management features enabled the design of new processes to engage, motivate, and retain highly productive employees, and to foster a competitive working environment where employees are encouraged to be involved in new and rewarding activities, and where knowledge and experience are recognized as relevant assets. Design Science Research was selected as the research methodology; the creation of new knowledge is therefore based on an iterative cycle addressing concepts such as design, analysis, reflection, and abstraction. Through collaboration in an international project (Active@Work), funded by the Active and Assisted Living Programme, the results followed a design-thinking approach to the specification of the structure and behavior of the Skills Development Module, namely the identification of requirements and the design of an innovative info-structure of metadata to support the user experience. A set of mockups was designed based on user roles and main concerns.
This approach enabled the conceptualization of a solution that proactively assists the management and assessment of skills in a personalized and dynamic way. The outcomes of this Thesis aim to demonstrate the articulation between emerging research areas such as Visual Analytics and Gamification, and are expected to represent conceptual gains in these two research fields.

Relevance:

20.00%

Publisher:

Abstract:

Big data analytics for traffic accidents is a hot topic with significant value for smart, safe urban traffic. Based on massive traffic accident data from October 2014 to March 2015 in Xiamen, China, we propose a novel accident occurrence analytics method in both spatial and temporal dimensions to predict when and where an accident of a specific crash type will occur, and by whom. First, we analyze and visualize accident occurrences from both temporal and spatial views. Second, we illustrate spatio-temporal visualization results through two case studies: multiple road segments, and the impact of weather on crash types. These findings of accident occurrence analysis and visualization would not only help the traffic police department make instant personnel assignments among simultaneous accidents, but also inform individual drivers about accident-prone sections and the time spans that require their greatest attention.
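The temporal half of such a spatio-temporal analysis typically starts by binning accident records by hour of day and by road segment, so that peak periods and accident-prone sections stand out. The records and field names below are assumptions made for illustration, not the Xiamen data:

```python
# Illustrative binning of accident records by hour and by road segment.
# Timestamps and segment IDs are invented for the example.
from collections import Counter
from datetime import datetime

records = [
    {"time": "2014-10-03 08:15", "segment": "S1"},
    {"time": "2014-10-03 08:40", "segment": "S1"},
    {"time": "2014-10-03 17:05", "segment": "S2"},
    {"time": "2014-10-04 08:30", "segment": "S1"},
]

by_hour = Counter(datetime.strptime(r["time"], "%Y-%m-%d %H:%M").hour
                  for r in records)
by_segment = Counter(r["segment"] for r in records)

peak_hour = by_hour.most_common(1)[0]   # busiest hour and its accident count
```

Cross-tabulating the same counts against crash type and weather gives the kind of case-study views the abstract describes.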

Relevance:

20.00%

Publisher:

Abstract:

The Centre of Clinical Research Excellence (CCRE) in Aboriginal and Torres Strait Islander Health was established in late 2003 through a major National Health and Medical Research Council (NHMRC) grant involving collaboration between the Aboriginal Health Council of South Australia (AHCSA), Flinders University, and Aboriginal Health Services. Our foundation research communities are the Aboriginal communities served by these Aboriginal Health Services in the Spencer Gulf / Eyre Peninsula region. In recent years a number of collaborative research programs involving chronic illness management, self-management and coordinated care have been implemented in these communities, and this work is the basis of the initial CCRE activities. Key objectives of the CCRE are to improve the health status of Indigenous people by conducting relevant and meaningful Aboriginal-controlled health research, providing formal training for Indigenous health researchers, and developing innovative approaches to health care that can be readily translated and applied to support communities. The inclusion, empowerment and engagement of Indigenous people in the process of managing community health represent tangible strategies for achieving more equitable health outcomes for Aboriginal people. This paper outlines the CCRE operational rationale and presents early activities and outcomes across the three strategic areas of CCRE operations: research, education and training, and translation. Some critical reflections are offered on the progress and experience of the CCRE thus far. A common obstacle this CCRE has encountered is that the limited resources (especially staffing) available to the Aboriginal Health Services with which we are collaborating make it difficult for them to engage with and progress the projects we are pursuing.