872 results for Resilience based management system


Relevance:

100.00%

Publisher:

Abstract:

This paper presents a study on the implementation of Real-Time Pricing (RTP) based Demand Side Management (DSM) of water pumping at a clean water pumping station in Northern Ireland, with the intention of minimising electricity costs and maximising the usage of electricity from wind generation. A Genetic Algorithm (GA) was used to create pumping schedules based on system constraints and electricity tariff scenarios. Implementation of this method would allow the water network operator to make significant savings on electricity costs while also helping to mitigate the variability of wind generation.
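The abstract does not give the GA's encoding or the tariff data, so the following is a minimal sketch of how RTP-based pump scheduling could be set up: a 24-bit hourly on/off chromosome, a fitness that adds the hourly electricity cost to a penalty for missing the daily pumped volume, and a simple elitist GA. All numbers (tariff profile, pump rating, flow rate, daily demand) are invented for illustration.

```python
import random

TARIFF = [0.05] * 7 + [0.15] * 10 + [0.25] * 4 + [0.10] * 3  # illustrative price/kWh by hour
PUMP_KW = 90.0         # pump electrical draw when on (assumed)
FLOW_M3H = 120.0       # pumped flow when on (assumed)
DAILY_DEMAND = 1500.0  # m3 that must be pumped per day (assumed)

def cost(schedule):
    """Electricity cost of a 24-bit on/off schedule, plus a heavy
    penalty for failing to meet the daily pumped-volume requirement."""
    energy_cost = sum(PUMP_KW * TARIFF[h] for h, on in enumerate(schedule) if on)
    shortfall = max(0.0, DAILY_DEMAND - FLOW_M3H * sum(schedule))
    return energy_cost + 10.0 * shortfall

def evolve(pop_size=60, generations=200, seed=1):
    """Elitist GA: keep the cheapest half, refill with one-point
    crossover children carrying a single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(24)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, 23)
            child = a[:cut] + b[cut:]
            child[rng.randrange(24)] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

The penalty makes any infeasible schedule far costlier than a feasible one, so the GA is steered toward schedules that pump enough water while concentrating pumping in cheap-tariff hours.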

Relevance:

100.00%

Publisher:

Abstract:

The resilience of a social-ecological system is measured by its ability to retain core functionality when subjected to perturbation. Resilience is contextually dependent on the state of system components, the complex interactions among these components, and the timing, location, and magnitude of perturbations. The stability landscape concept provides a useful framework for considering resilience within the specified context of a particular social-ecological system but has proven difficult to operationalize. This difficulty stems largely from the complex, multidimensional nature of the systems of interest and uncertainty in system response. Agent-based models are an effective methodology for understanding how cross-scale processes within and across social and ecological domains contribute to overall system resilience. We present the results of a stylized model of agricultural land use in a small watershed that is typical of the Midwestern United States. The spatially explicit model couples land use, biophysical models, and economic drivers with an agent-based model to explore the effects of perturbations and policy adaptations on system outcomes. By applying the coupled modeling approach within the resilience and stability landscape frameworks, we (1) estimate the sensitivity of the system to context-specific perturbations, (2) determine potential outcomes of those perturbations, (3) identify possible alternative states within state space, (4) evaluate the resilience of system states, and (5) characterize changes in system-scale resilience brought on by changes in individual land use decisions.
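The stability landscape concept the authors operationalize can be illustrated with a toy one-dimensional system (this is not their watershed model): a state resting in one basin of a double-well potential absorbs small perturbations, while a large perturbation pushes it over the ridge into the alternative basin, a regime shift. The potential, step size, and perturbation sizes are invented.

```python
def settle(x0, steps=2000, lr=0.01):
    """Follow the gradient of the double-well potential V(x) = (x^2 - 1)^2
    downhill; the system comes to rest at one of the basins, x = -1 or x = +1."""
    x = x0
    for _ in range(steps):
        x -= lr * 4 * x * (x * x - 1)  # dV/dx = 4x(x^2 - 1)
    return x

# A small perturbation of the state at x = -1 is absorbed (resilience) ...
recovered = settle(-1.0 + 0.5)
# ... while a large one crosses the ridge at x = 0 and triggers a regime shift.
shifted = settle(-1.0 + 2.2)
```

In this picture, resilience is the width and depth of the occupied basin, which is exactly what perturbation experiments on the coupled model probe in many dimensions.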

Relevance:

100.00%

Publisher:

Abstract:

The SD card (Secure Digital memory card) is a widely used portable storage medium. Current research on SD cards focuses mainly on FPGA-based (Field Programmable Gate Array) SD card controllers, most relying on an API (Application Programming Interface), the AHB (Advanced High-performance Bus), and similar interfaces, and dedicated to achieving ultra-high-speed communication between the SD card and upper-layer systems. SD card controllers play a vital role in high-speed cameras and other specialized fields. The FPGA-based file system and SD 2.0 IP (Intellectual Property core) presented here not only achieves a good transmission rate but also provides systematic management of files, while retaining strong portability and practicality. The design covers three main innovation points. First, the combination and integration of the file system and the SD card controller make the overall system highly integrated and practical. The popular SD 2.0 protocol is implemented for the communication channel, and a pure digital logic design in VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) integrates the SD card controller in the hardware layer with the FAT32 file system for the entire system. Second, the file management mechanism makes document processing more convenient, especially for batch processing of small files: it relieves the upper system of frequent access and processing, thereby enhancing overall efficiency. Finally, the digital design ensures superior performance. For transmission security, a CRC (Cyclic Redundancy Check) algorithm protects data transfers. Each module is designed independently of vendor macro cells, preserving portability, and custom integrated instructions and interfaces make the IP easy to use.
The design was tested on multiple platforms, the Xilinx and Altera FPGA development environments, with timing simulation and debugging of each module. Test results show that the FPGA-based file system IP supports SD, TF, and Micro SD cards using the 2.0 protocol in SD bus mode, and successfully implements systematic management of stored files. Read and write rates on a Kingston Class 10 card are approximately 24.27 MB/s and 16.94 MB/s, respectively.
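The abstract names CRC protection without detail. As a concrete illustration, SD command frames are protected by a 7-bit CRC with polynomial x^7 + x^3 + 1; the sketch below implements it bitwise and checks it against the well-known CMD0 frame, whose final byte 0x95 is the CRC shifted left with an end bit of 1. (This illustrates the command CRC7; the data-block CRC in the SD protocol is a separate 16-bit CRC.)

```python
def crc7(data: bytes) -> int:
    """Bitwise CRC-7 (polynomial x^7 + x^3 + 1, i.e. 0x09), MSB first,
    as used to protect SD card command frames."""
    crc = 0
    for byte in data:
        for i in range(7, -1, -1):
            bit = (byte >> i) & 1
            top = (crc >> 6) & 1
            crc = (crc << 1) & 0x7F
            if top ^ bit:
                crc ^= 0x09
    return crc

# CMD0 (GO_IDLE_STATE) with a zero argument: command index byte 0x40
# followed by four argument bytes; the transmitted trailer is (CRC7 << 1) | 1.
frame = bytes([0x40, 0x00, 0x00, 0x00, 0x00])
assert (crc7(frame) << 1) | 1 == 0x95
```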

Relevance:

100.00%

Publisher:

Abstract:

Fisheries play a significant and important part in the economy of the country, contributing to foreign exchange, food security and employment creation. Lake Victoria contributes over 50% of the total annual fish catch. The purpose of fisheries management is to ensure conservation, protection, proper use, economic efficiency and equitable distribution of the fisheries resources for both present and future generations through sustainable utilization. The earliest fisheries were mainly at the subsistence level. Fishing gear consisted of locally made basket traps, hooks and seine nets of papyrus. Fishing effort began to increase with the introduction of more efficient flax gillnets in 1905. Fisheries management in Uganda started in 1914. Before then, the fishery was under a form of traditional management based on dos and don'ts. History shows that the Baganda had strong spiritual beliefs in respect of "god Mukasa" (god of the lake), and these indirectly contributed to sustainable management of the lake. If a fisherman neglected to comply with any of the ceremonies related to fishing, he was expected to encounter a bad omen (Rev. Roscoe, 1965). However, with the introduction of nylon gillnets, which could catch more fish, the traditional management regime broke down. By 1955 indigenous fish species such as Oreochromis variabilis and Oreochromis esculentus had greatly declined in catches. The decline in catches led to the introduction of poor fishing methods because of competition for fish. In an attempt to regulate the fishing industry, the government enacted the first Fisheries Ordinance in 1951 and recruited Fisheries Officers to enforce it. The government put in place minimum net mesh sizes, and Fisheries Officers arrested fishermen without explaining the reason, which led to continued poor fishing practices. The development of government-centred management systems led to increased alienation of resource users and to wilful disregard of specific regulations.
The realisation of the problems faced by the central management system led to the recognition that user groups need to be actively involved in fisheries management if the system is to be consistent with sustainable fisheries and be legitimate. Community participation in fisheries management under the co-management approach has been adopted in Lake Victoria and other water bodies.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation investigates customer behavior modeling in service outsourcing and revenue management in the service sector (i.e., airline and hotel industries). In particular, it focuses on a common theme of improving firms’ strategic decisions through the understanding of customer preferences. Decisions concerning degrees of outsourcing, such as firms’ capacity choices, are important to performance outcomes. These choices are especially important in high-customer-contact services (e.g., airline industry) because of the characteristics of services: simultaneity of consumption and production, and intangibility and perishability of the offering. Essay 1 estimates how outsourcing affects customer choices and market share in the airline industry, and consequently the revenue implications from outsourcing. However, outsourcing decisions are typically endogenous. A firm may choose whether to outsource or not based on what a firm expects to be the best outcome. Essay 2 contributes to the literature by proposing a structural model which could capture a firm’s profit-maximizing decision-making behavior in a market. This makes possible the prediction of consequences (i.e., performance outcomes) of future strategic moves. Another emerging area in service operations management is revenue management. Choice-based revenue systems incorporate discrete choice models into traditional revenue management algorithms. To successfully implement a choice-based revenue system, it is necessary to estimate customer preferences as a valid input to optimization algorithms. The third essay investigates how to estimate customer preferences when part of the market is consistently unobserved. This issue is especially prominent in choice-based revenue management systems. Normally a firm only has its own observed purchases, while those customers who purchase from competitors or do not make purchases are unobserved. 
Most current estimation procedures depend on unrealistic assumptions about customer arrivals. This study proposes a new estimation methodology that requires no prior knowledge of the customer arrival process and allows for arbitrary demand distributions. Compared with previous methods, this model performs better when true demand is highly variable.
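To make the unobserved-customer issue concrete, here is a small sketch (not the essay's estimator) using a multinomial logit model with a no-purchase option: a firm that sees only its own bookings observes shares conditional on purchase, which overstate the true purchase probabilities. The utility values are invented.

```python
import math

def mnl_probs(utilities):
    """Multinomial-logit choice probabilities, with an outside
    (no-purchase) option whose utility is normalised to 0."""
    expu = [math.exp(v) for v in utilities]
    denom = 1.0 + sum(expu)  # the 1.0 is the no-purchase option
    return [e / denom for e in expu]

# Illustrative utilities for two fare classes (assumed values).
p = mnl_probs([0.5, -0.2])
p_no_purchase = 1.0 - sum(p)

# Shares conditional on purchase, which is all a firm observing only its
# own bookings can compute; each exceeds the true choice probability.
observed_shares = [pi / sum(p) for pi in p]
```

Estimating preferences from `observed_shares` alone without correcting for `p_no_purchase` is exactly the bias the third essay's methodology addresses.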

Relevance:

100.00%

Publisher:

Abstract:

Background: Complex chronic diseases are a challenge for the current configuration of health services. Case management is a service frequently provided for people with chronic conditions and, despite its effectiveness in many outcomes, such as mortality or readmissions, uncertainty remains about the most effective form of team organization, structures, and the nature of the interventions. Many processes and outcomes of case management for people with complex chronic conditions cannot be addressed with the information provided by electronic clinical records, and registries are frequently used to deal with this weakness. The aim of this study was to generate a registry-based information system of patients receiving case management to identify their clinical characteristics, their context of care, events identified during their follow-up, interventions developed by case managers, and services used. Methods and design: The study was divided into three phases, covering the detection of information needs, the design of the registry, and its implementation in the healthcare system, using literature review and expert consensus methods to select the variables to be included. Objective: To describe the essential characteristics of the provision of care to people who receive case management (structure, process and outcomes), with special emphasis on those with complex chronic diseases. Study population: Patients from any Primary Care District who initiate the use of case management services, so as to avoid the information bias that may occur when including subjects who have already been receiving the service and whose outcomes and characteristics could not be properly collected. Results: A total of 102 variables representing the structure, processes and outcomes of case management were selected for inclusion in the registry after the consensus phase. The total sample comprised 427 patients, of whom 211 (49.4%) were women and 216 (50.6%) were men.
The average functional level (Barthel Index) was 36.18 (SD 29.02), cognitive function (Pfeiffer) showed an average of 4.37 (SD 6.57), the Charlson Comorbidity Index obtained a mean of 3.03 (SD 2.7), and social support (Duke Index) was 34.2% (SD 17.57). More than half of the patients included in the registry correspond to immobilized patients or transitional care for patients discharged from hospital (66.5%). The patients' educational level was low or very low (50.4%). Caregiver overstrain (Caregiver Strain Index) obtained an average value of 6.09 (SD 3.53). Only 1.2% of patients had declared their advance directives, 58.6% had not defined tutelage, and the vast majority (98.8%) lived at home. Regarding the major events recorded in the RANGE Registry, 25.8% of the selected patients died in the first three months; 8.2% were admitted to hospital at least once, 2.3% twice, and 1.2% three times; 7.5% suffered a fall, 8.7% had a pressure ulcer, 4.7% had problems with medication, and 3.3% were institutionalized. Stroke is the most prevalent health problem recorded (25.1%), followed by hypertension (11.1%) and COPD (11.1%). Patients registered by NCMs had as main processes diabetes (16.8%) and dementia (11.3%). The most frequent nursing diagnoses referred to self-care deficits in various activities of daily living. Regarding nursing interventions, described by the Nursing Interventions Classification (NIC), dementia management is the most used intervention, followed by mutual goal setting, and caregiver and emotional support. Conclusions: The typical patient receiving case management services is a complex chronic patient with severe dependence, cognitive impairment, normal social support, a low educational level, health problems such as stroke, hypertension, COPD, diabetes or dementia, and an informal caregiver. At the first follow-up, mortality was 19.2%, with discrete rates of readmissions and falls.

Relevance:

100.00%

Publisher:

Abstract:

Two key solutions for reducing greenhouse gas emissions and increasing overall energy efficiency are to maximize the utilization of renewable energy resources (RERs) for load consumption and to shift transportation to low- or zero-emission plug-in electric vehicles (PEVs). The present aging and overburdened U.S. power grid infrastructure is under tremendous pressure to handle the issues involved in the penetration of RERs and PEVs. The future power grid should be designed for the effective utilization of distributed RERs and distributed generation, intelligently responding to varying customer demand, including PEVs, with a high level of security, stability and reliability. This dissertation develops and verifies such a hybrid AC-DC power system. The system operates in a distributed manner, incorporating multiple AC and DC components, and works in both grid-connected and islanding modes. The verification was performed on a laboratory-based hybrid AC-DC power system testbed serving as a hardware/software platform. In this system, RER emulators, together with their maximum power point tracking technology and power electronics converters, were designed to test different energy harvesting algorithms. Energy storage devices, including lithium-ion batteries and ultra-capacitors, were used to optimize the performance of the hybrid power system. A lithium-ion battery smart energy management system with thermal and state-of-charge self-balancing was proposed to protect the energy storage system. A grid-connected DC PEV parking garage emulator with five lithium-ion batteries was also designed with smart charging functions that can emulate future vehicle-to-grid (V2G), vehicle-to-vehicle (V2V) and vehicle-to-house (V2H) services, including grid voltage and frequency regulation, spinning reserves, microgrid islanding detection and energy resource support.
The results show successful integration of the developed techniques for control and energy management of future hybrid AC-DC power systems with high penetration of RERs and PEVs.
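The dissertation's state-of-charge self-balancing controller is not specified in the abstract; the sketch below shows the basic idea with an illustrative proportional law that nudges every cell toward the pack-average SOC (charge shuttled from high to low cells), conserving total charge while the spread decays geometrically. The gain and SOC values are invented.

```python
def balance_step(socs, k=0.1):
    """One step of an illustrative active-balancing law: each cell is
    moved a fraction k of the way toward the pack-average SOC, so total
    charge is conserved and the cell-to-cell spread shrinks by (1 - k)."""
    mean = sum(socs) / len(socs)
    return [s + k * (mean - s) for s in socs]

socs = [0.50, 0.60, 0.70, 0.80, 0.90]  # initial per-cell SOCs (assumed)
for _ in range(50):
    socs = balance_step(socs)
spread = max(socs) - min(socs)
```

After 50 steps the spread is roughly 0.4 * 0.9**50, i.e. about 0.2% SOC, while the pack average stays at 0.70.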

Relevance:

100.00%

Publisher:

Abstract:

Today, modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications are identified using profiling tools; hardware acceleration yields significant performance improvements for highly mathematical calculations or frequently repeated functions. The performance of an SoC system can then be improved by accelerating the element that incurs the performance overhead. The concepts presented in this study can be applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop counts. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture.
(3) System specifications, such as performance, energy consumption and resource costs, are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) A system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques applied for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
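The payoff from accelerating only a profiled hotspot is bounded by Amdahl's law, a useful yardstick when interpreting whole-application speedups such as those above. A minimal sketch; the 70% hotspot share and 10x accelerator factor are illustrative, not figures from the study.

```python
def amdahl(accelerated_fraction, speedup_factor):
    """Overall speedup when only a fraction of the runtime is sped up
    by speedup_factor (Amdahl's law): 1 / ((1 - f) + f / s)."""
    return 1.0 / ((1.0 - accelerated_fraction)
                  + accelerated_fraction / speedup_factor)

# If profiling shows a hotspot taking 70% of runtime and an FPGA
# accelerator runs it 10x faster, the whole application speeds up
# to 1 / (0.3 + 0.07), roughly 2.7x.
overall = amdahl(0.70, 10.0)
```

The formula also explains why the two designs differ so much: moving more of the runtime into hardware (a larger accelerated fraction) matters more than making the accelerator itself marginally faster.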

Relevance:

100.00%

Publisher:

Abstract:

The paper describes a forest management system to be applied on smallholder farms, particularly on settlement projects in the Brazilian Amazon. The proposed forest management system was designed to generate a new source of family income and to maintain forest structure and biodiversity. The system is novel in three main characteristics: the use of short cycles in the management of tropical forests, the low harvesting intensity and environmental impact, and the direct involvement of the local population in all forest management activities. It is based on a minimum felling cycle of ten years and an annual timber harvest of 5–10 m³ ha⁻¹.

Relevance:

100.00%

Publisher:

Abstract:

The present work proposes different approaches to extend the mathematical methods of supervisory energy management used in terrestrial applications to the maritime sector, which differs in its constraints, variables and disturbances. The aim is to find an optimal real-time solution that includes the minimization of a defined track time while maintaining the classical energetic approach. Starting from the analysis and modelling of the powertrain and boat dynamics, the energy-economy problem is formulated following the mathematical principles of optimal control theory. An adaptation aimed at finding a winning strategy for the Monaco Energy Boat Challenge endurance trial is then performed via the ECMS and A-ECMS control strategies, leading to a more accurate knowledge of the energy sources and the boat's behaviour. The simulations show that the algorithm accomplishes the fuel-economy and time-optimization targets, but the latter adds considerable tuning and computational complexity. To assess a practical implementation on real hardware, the knowledge gained from the previous approaches was translated into a rule-based algorithm that can run on an embedded CPU. Finally, the algorithm was tuned and tested in a real-world race scenario, showing promising results.
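The core of ECMS can be sketched as an instantaneous minimization: at each moment, choose the battery power that minimizes fuel power plus an equivalence factor s times battery power, so that battery energy is priced in fuel terms. The convex fuel map, power limits and s values below are invented for illustration, not taken from the boat.

```python
def fuel_power(p_engine_kw):
    """Illustrative engine fuel-energy rate (kW of fuel): idle losses
    plus increasing marginal consumption (assumed convex map)."""
    if p_engine_kw <= 0.0:
        return 0.0
    return 2.0 + 2.2 * p_engine_kw + 0.01 * p_engine_kw ** 2

def ecms_split(p_demand_kw, s=2.5, p_batt_max=20.0, step=0.5):
    """Grid search for the battery power (positive = discharging) that
    minimises the equivalent consumption H = fuel_power(engine) + s * p_batt."""
    best = None
    p = -p_batt_max
    while p <= p_batt_max:
        p_engine = p_demand_kw - p
        if p_engine >= 0.0:  # engine cannot absorb power here
            h = fuel_power(p_engine) + s * p
            if best is None or h < best[0]:
                best = (h, p)
        p += step
    return best[1]

p_batt = ecms_split(30.0)  # battery share of a 30 kW demand at s = 2.5
```

Raising s makes battery energy "expensive" and pushes the optimum toward charging, which is how the adaptive variant (A-ECMS) steers the state of charge over a race.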

Relevance:

100.00%

Publisher:

Abstract:

This thesis focuses on the use and implementation of advanced models for measuring the resilience of water distribution networks. In particular, the functions implemented in GRA Tool, a software package developed by the University of Exeter (UK), and the functions of the EPANET 2.2 Toolkit were investigated. The study of resilience and failure, carried out with GRA Tool and with a methodology based on the combined use of EPANET 2.2 and MATLAB, was first tested on a small water distribution network from the literature, so that the variability of the results could be perceived clearly and immediately, and then on a more complex network, that of Modena. Specifically, a time-deferred failure mode proposed by GRA Tool, namely pipe failure, was recreated in order to compare the two methodologies. The analysis of hydraulic efficiency was conducted using a synthetic, global network performance index, the Resilience Index, introduced by Todini and developed over the years 2000-2016. This index, being one of the parameters by which the overall state of "hydraulic well-being" of a network can be evaluated, has the advantage of serving as a criterion for selecting improvements to the network itself. The analytical development that the Resilience Index formula has undergone over time is also presented. The final aim of this work was to understand how to improve the resilience of the system in question: the pipe-rupture scenario was designed to identify the most problematic branches, i.e., those whose failure would cause the greatest damage to the network, including lowering the Resilience Index.
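The Resilience Index mentioned above can be stated concretely. In Todini's formulation for a network fed by reservoirs (ignoring pump terms for simplicity), it is the surplus power actually delivered to users divided by the maximum surplus power the sources could deliver. The sketch below uses invented numbers for a toy two-node network, not the Modena network.

```python
def todini_resilience(demands, heads, req_heads, q_res, h_res):
    """Todini's Resilience Index, reservoir-only form (no pump term):
        I_r = sum_i q_i (h_i - h_i*) / (sum_k Q_k H_k - sum_i q_i h_i*)
    where q_i, h_i, h_i* are the demand, actual head and required head
    at node i, and Q_k, H_k are reservoir outflow and head."""
    surplus = sum(q * (h - hr) for q, h, hr in zip(demands, heads, req_heads))
    max_surplus = (sum(q * h for q, h in zip(q_res, h_res))
                   - sum(q * hr for q, hr in zip(demands, req_heads)))
    return surplus / max_surplus

# Toy two-node network (illustrative values).
ir = todini_resilience(demands=[10.0, 20.0], heads=[45.0, 40.0],
                       req_heads=[30.0, 30.0], q_res=[30.0], h_res=[60.0])
```

A value near 1 means large head surpluses are available to absorb failures; a pipe break that drops nodal heads shows up directly as a drop in the index, which is how the thesis ranks problematic branches.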

Relevance:

100.00%

Publisher:

Abstract:

The use of fertilization in forest stands results in yield gains, yet little attention has been directed to its potential effects on the quality of the wood produced. Information is scarce about the effect of fertilization on the anatomical structures of older Eucalyptus wood. This work aims to study the effect of fertilization on tissue cell size of wood from a Eucalyptus grandis stand at age 21 years, whose management system is based on selective thinning and fertilizer application at the start of the thinning season. The factors considered were: presence or absence of fertilizers, two log positions, and five radial (pith-to-bark) positions. Results led to the conclusion that fertilization significantly influenced only vessel frequency. Vessel element length was influenced by tree height. Fiber length, fiber diameter, fiber wall thickness, vessel element length, vessel diameter and vessel frequency were influenced by the radial position of the sample in relation to the log. A positive correlation was observed between fiber length, fiber diameter, fiber wall thickness, vessel element length, vessel diameter, ray width and radial position, while a negative correlation was observed between ray frequency and radial position.

Relevance:

100.00%

Publisher:

Abstract:

In geophysics and seismology, raw data need to be processed to generate useful information that can be turned into knowledge by researchers. The number of sensors that are acquiring raw data is increasing rapidly. Without good data management systems, more time can be spent in querying and preparing datasets for analyses than in acquiring raw data. Also, a lot of good quality data acquired at great effort can be lost forever if they are not correctly stored. Local and international cooperation will probably be reduced, and a lot of data will never become scientific knowledge. For this reason, the Seismological Laboratory of the Institute of Astronomy, Geophysics and Atmospheric Sciences at the University of Sao Paulo (IAG-USP) has concentrated fully on its data management system. This report describes the efforts of the IAG-USP to set up a seismology data management system to facilitate local and international cooperation.

Relevance:

100.00%

Publisher:

Abstract:

Ecological systems are vulnerable to irreversible change when key system properties are pushed over thresholds, resulting in the loss of resilience and the precipitation of a regime shift. Perhaps the most important of such properties in human-modified landscapes is the total amount of remnant native vegetation. In a seminal study Andren proposed the existence of a fragmentation threshold in the total amount of remnant vegetation, below which landscape-scale connectivity is eroded and local species richness and abundance become dependent on patch size. Despite the fact that species patch-area effects have been a mainstay of conservation science there has yet to be a robust empirical evaluation of this hypothesis. Here we present and test a new conceptual model describing the mechanisms and consequences of biodiversity change in fragmented landscapes, identifying the fragmentation threshold as a first step in a positive feedback mechanism that has the capacity to impair ecological resilience, and drive a regime shift in biodiversity. The model considers that local extinction risk is defined by patch size, and immigration rates by landscape vegetation cover, and that the recovery from local species losses depends upon the landscape species pool. Using a unique dataset on the distribution of non-volant small mammals across replicate landscapes in the Atlantic forest of Brazil, we found strong evidence for our model predictions - that patch-area effects are evident only at intermediate levels of total forest cover, where landscape diversity is still high and opportunities for enhancing biodiversity through local management are greatest. Furthermore, high levels of forest loss can push native biota through an extinction filter, and result in the abrupt, landscape-wide loss of forest-specialist taxa, ecological resilience and management effectiveness. 
The proposed model links hitherto distinct theoretical approaches within a single framework, providing a powerful tool for analysing the potential effectiveness of management interventions.