878 results for "Management capacity indicator"


Relevance:

30.00%

Publisher:

Abstract:

Persistent daily congestion has increased in recent years, particularly along major corridors during peak periods in the mornings and evenings. Certain segments of these roadways are often at or near capacity. However, a conventional predefined control strategy cannot accommodate demand that changes over time, which motivates the dynamic lane management strategies discussed in this thesis: hard shoulder running, reversible HOV lanes, dynamic tolls, and variable speed limits. A mesoscopic agent-based DTA model is used to simulate the different strategies and scenarios. The analyses show that all strategies mitigate congestion in terms of average speed and average density. Hard shoulder running and reversible HOV lanes yield the largest improvements, while the other two strategies produce more stable traffic. In terms of average speed and travel time, hard shoulder running is the most effective strategy for relieving traffic pressure on the congested I-270 corridor.
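Of these strategies, the variable speed limit is the simplest to sketch in code: the posted limit steps down as measured density approaches capacity. The thresholds and speeds below are illustrative assumptions, not values from the thesis.

```python
# Hypothetical density-triggered variable speed limit (VSL) rule.
# Thresholds (veh/km/lane) and posted limits (km/h) are illustrative only.

def vsl_speed_limit(density_veh_per_km: float) -> int:
    """Return a posted speed limit that steps down as density rises."""
    if density_veh_per_km < 20:
        return 100   # free flow: keep the default limit
    elif density_veh_per_km < 35:
        return 80    # nearing capacity: slow traffic to stabilise the flow
    else:
        return 60    # congested: harmonise speeds to damp stop-and-go waves

print(vsl_speed_limit(15))   # free-flow density -> 100
print(vsl_speed_limit(40))   # congested density -> 60
```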

Relevance:

30.00%

Publisher:

Abstract:

Doctorate in Agronomic Engineering - Instituto Superior de Agronomia - UL

Relevance:

30.00%

Publisher:

Abstract:

Background: Management of hyperbilirubinemia remains a challenge for neonatal medicine because of the risk of neurological complications related to the toxicity of severe hyperbilirubinemia. Objectives: The purpose of this study was to examine the validity of the cord blood alkaline phosphatase level for predicting neonatal hyperbilirubinemia. Patients and Methods: Between October and December 2013, a total of 102 healthy term infants born to healthy mothers were studied. Cord blood samples were collected for measurement of alkaline phosphatase levels immediately after birth. Neonates were followed up for the emergence of jaundice. Newborns with clinical jaundice were recalled and their serum bilirubin levels measured. Appropriate treatment based on the serum bilirubin level was given. Alkaline phosphatase levels were compared between the non-jaundiced neonates and the jaundiced neonates who required treatment. Results: The incidence of severe jaundice requiring treatment among the followed-up neonates was 9.8%. The mean alkaline phosphatase level was 309.09 ± 82.51 IU/L in the non-jaundiced group and 367.80 ± 73.82 IU/L in the severely jaundiced group (P = 0.040). A cutoff value of 314 IU/L was associated with a sensitivity of 80% and a specificity of 63% for predicting neonatal hyperbilirubinemia requiring treatment. Conclusions: The cord blood alkaline phosphatase level can be used as a predictor of severe neonatal jaundice.
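The reported cutoff behaviour can be reproduced with a few lines of code. The sketch below computes sensitivity and specificity for a biomarker threshold; the ALP values and outcomes in it are fabricated for illustration and are not the study's data.

```python
# Sensitivity and specificity of a "value >= cutoff" test against true outcomes.
# The sample data are invented; only the 314 IU/L cutoff comes from the study.

def sens_spec(values, labels, cutoff):
    """Treat value >= cutoff as test-positive; labels are 1 if truly jaundiced."""
    tp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y)
    fn = sum(1 for v, y in zip(values, labels) if v < cutoff and y)
    tn = sum(1 for v, y in zip(values, labels) if v < cutoff and not y)
    fp = sum(1 for v, y in zip(values, labels) if v >= cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

alp = [250, 300, 320, 360, 400, 280, 330, 310]   # hypothetical cord ALP (IU/L)
treated = [0, 0, 0, 1, 1, 0, 1, 0]               # 1 = jaundice needing treatment
sensitivity, specificity = sens_spec(alp, treated, cutoff=314)
```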

Relevance:

30.00%

Publisher:

Abstract:

The concept of social carrying capacity, though open to debate and critique, is a valuable tool for managing recreational use in protected natural areas. In this study, conducted in the Sierra de las Nieves natural park (Spain), we first categorised the hikers using the park and then, from the profiles obtained, analysed their perception of crowding on the trails. This perception was then used to evaluate levels of user satisfaction and thus to determine the psychosocial carrying capacity of the park. The results can be extrapolated to most Spanish natural parks in Mediterranean mountain areas, given their comparable visitor numbers and the prevalence of recreational hiking. The results suggest that management efforts should be directed toward relocating trails outside the core areas, so that user preferences are satisfied while reducing the impact on the areas of highest environmental value.

Relevance:

30.00%

Publisher:

Abstract:

Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management using system dynamics simulation, as well as environmental systems analysis techniques, to provide insights for water quality management in the Great Lakes basin. The traditional linear thinking paradigm lacks the mental and organizational framework for sustainable development trajectories, and may lead to quick-fix solutions that fail to address key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and its application to water resources problems by offering useful qualitative tools such as causal loop diagrams (CLD), stock-and-flow diagrams (SFD), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulations and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and the root causes of problems, and thus promote sustainable water resources decision making.
Within the system dynamics framework, a growth and underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and assess a number of proposed total maximum daily load (TMDL) reduction policies, including phosphorus load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process, one that accounts for the dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale BMP implementation in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's TP concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
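The stock-and-flow view described above can be made concrete with a minimal simulation: one stock (lake TP mass), an inflow (the external load), and a first-order outflow. All parameter values here are illustrative assumptions, not calibrated Lake Allegan numbers.

```python
# Minimal stock-and-flow sketch: dTP/dt = load - k * TP, integrated by Euler.
# Parameters are hypothetical; a real model would be calibrated to lake data.

def simulate_tp(load_kg_per_yr, k_per_yr=0.5, tp0_kg=40_000.0,
                years=30.0, dt=0.1):
    """Return the TP stock (kg) after `years`, stepping the flows with Euler."""
    tp = tp0_kg
    for _ in range(int(years / dt)):
        tp += dt * (load_kg_per_yr - k_per_yr * tp)   # inflow minus outflow
    return tp

baseline = simulate_tp(load_kg_per_yr=30_000)  # settles near load/k = 60,000 kg
reduced = simulate_tp(load_kg_per_yr=15_000)   # a TMDL-style load cut halves it
```

Halving the load halves the equilibrium stock because the outflow is first-order; the feedback structure, not the particular numbers, is the point of such a sketch.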

Relevance:

30.00%

Publisher:

Abstract:

By employing interpretive policy analysis, this thesis aims to assess, measure, and explain policy capacity for government and non-government organizations involved in reclaiming Alberta's oil sands. Using this type of analysis to assess policy capacity is a novel approach to understanding reclamation policy, and this research therefore provides a unique contribution to the literature on reclamation policy. The oil sands region in northeast Alberta, Canada is an area of interest for several reasons, primarily the vast reserves of bitumen and the environmental cost associated with developing this resource. An increase in global oil demand has created an incentive for industry to seek out and develop new reserves. Alberta's oil sands are one of the largest remaining reserves in the world, and there is significant interest in increasing production in this region. Furthermore, tensions in several oil-exporting nations in the Middle East remain unresolved, which has garnered additional support for a supply-side solution to North American oil demand, a solution that relies on the development of reserves in both the United States and Canada. These compounding factors have contributed to increased development in the oil sands of northeastern Alberta. In essence, a rapid expansion of oil sands operations is ongoing and is the source of significant disturbance across the region. This disturbance, and the promise of reclamation, is a source of contentious debate amongst stakeholders and remains highly visible in the media. If oil sands operations are to retain their social license to operate, it is critical that reclamation efforts be effective. One concern expressed by non-governmental organizations (NGOs) is the current state of monitoring and enforcement of regulatory programs in the oil sands. Alberta's NGOs have suggested that the data made available to them originate from industrial sources and are generally unchecked by government.
To discern the overall status of reclamation in the oil sands, this study explores several factors essential to policy capacity: work environment, training, employee attitudes, perceived capacity, policy tools, evidence-based work, and networking. Data were collected through key informant interviews with senior policy professionals in government and non-government agencies in Alberta. The agencies of interest in this research are: the Canadian Association of Petroleum Producers (CAPP); Alberta Environment and Sustainable Resource Development (AESRD); the Alberta Energy Regulator (AER); the Cumulative Environmental Management Association (CEMA); the Alberta Environment Monitoring, Evaluation, and Reporting Agency (AEMERA); and the Wood Buffalo Environmental Association (WBEA). The aim of this research is to explain how and why reclamation policy is conducted in Alberta's oil sands, illuminating government capacity, NGO capacity, and the interaction between these two agency types. Beyond answering the research questions, a further goal of this project is to show that interpretive analysis of policy capacity can be used to measure and predict policy effectiveness. The oil sands of Alberta are the focus of this project; however, future projects could examine any government policy scenario that uses evidence-based approaches.

Relevance:

30.00%

Publisher:

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are multifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they actually experience; on the other hand, administrators will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment.
Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Network and Support Vector Machine, to accurately model the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these modeling tools. Third, we presented an approach to optimal VM sizing by employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
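The link between a performance model and VM sizing can be illustrated with a toy example. The inverse-linear latency model below is a fabricated stand-in for the ANN/SVM models the thesis trains; the sizing routine simply searches for the smallest allocation whose predicted performance meets the SLA.

```python
# SLA-driven VM sizing against a (hypothetical) fitted performance model.

def predicted_latency_ms(cpu_share: float) -> float:
    """Stand-in model: latency falls as the CPU share (% of a core) grows."""
    return 20.0 + 300.0 / cpu_share

def size_vm(sla_latency_ms: float, step: float = 5.0, max_share: float = 400.0):
    """Return the smallest CPU share whose predicted latency meets the SLA."""
    share = step
    while share <= max_share:
        if predicted_latency_ms(share) <= sla_latency_ms:
            return share
        share += step
    return None   # the SLA cannot be met within the allocation cap

print(size_vm(50.0))   # -> 10.0: the smallest share with 20 + 300/share <= 50
```

A real deployment would replace `predicted_latency_ms` with the trained regressor; the search logic over allocations stays the same.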

Relevance:

30.00%

Publisher:

Abstract:

The frequency, time, and place of charging and discharging have a critical impact on the Quality of Experience (QoE) of using Electric Vehicles (EVs). EV charging and discharging scheduling schemes should consider both the QoE of using an EV and the load capacity of the power grid. In this paper, we design a traveling plan-aware scheduling scheme for EV charging in the driving pattern and a cooperative EV charging and discharging scheme in the parking pattern to improve the QoE of using EVs and enhance the reliability of the power grid. For traveling plan-aware scheduling, the assignment of EVs to Charging Stations (CSs) is modeled as a many-to-one matching game, and the Stable Matching Algorithm (SMA) is proposed. For cooperative EV charging and discharging in the parking pattern, the electricity exchange between charging EVs and discharging EVs in the same parking lot is formulated as a many-to-many matching model with ties, and we develop the Pareto Optimal Matching Algorithm (POMA). Simulation results indicate that the SMA significantly improves the average system utility for EV charging in the driving pattern, and that the POMA increases the amount of electricity offloaded from the grid, which helps enhance the reliability of the power grid.
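The many-to-one assignment of EVs to charging stations can be solved with deferred acceptance, the standard route to a stable matching. The sketch below uses toy preference lists and capacities; the paper's SMA derives its preferences from travel plans and grid constraints.

```python
# EV-proposing deferred acceptance for a many-to-one matching of EVs to
# charging stations (CSs). Preferences and capacities are toy values.

def stable_match(ev_prefs, cs_prefs, capacity):
    """Return {cs: [evs]} stable w.r.t. the given preference lists."""
    matched = {cs: [] for cs in cs_prefs}
    next_choice = {ev: 0 for ev in ev_prefs}   # index of each EV's next proposal
    rank = {cs: {ev: i for i, ev in enumerate(prefs)}
            for cs, prefs in cs_prefs.items()}
    free = list(ev_prefs)
    while free:
        ev = free.pop()
        if next_choice[ev] >= len(ev_prefs[ev]):
            continue                            # EV exhausted its list: unmatched
        cs = ev_prefs[ev][next_choice[ev]]
        next_choice[ev] += 1
        matched[cs].append(ev)
        matched[cs].sort(key=lambda e: rank[cs][e])
        if len(matched[cs]) > capacity[cs]:
            free.append(matched[cs].pop())      # reject the least-preferred EV
    return matched

ev_prefs = {"ev1": ["csA", "csB"], "ev2": ["csA", "csB"], "ev3": ["csA"]}
cs_prefs = {"csA": ["ev3", "ev1", "ev2"], "csB": ["ev1", "ev2"]}
result = stable_match(ev_prefs, cs_prefs, {"csA": 2, "csB": 1})
```

A matching is stable when no EV and CS would both rather be paired with each other than with their current assignment.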

Relevance:

30.00%

Publisher:

Abstract:

We quantified the ecosystem effects of small-scale gears operating in southern European waters (Portugal, Spain, Greece), based on a widely accepted ecosystem measure and indicator, the trophic level (TL). We used data from experimental fishing trials during 1997 to 2000. We studied a wide range of gear types and sizes: (1) gill nets of 8 mesh sizes, ranging from 44 to 80 mm; (2) trammel nets of 9 inner panel mesh sizes, ranging from 40 to 140 mm; and (3) longlines of 8 hook sizes, ranging from Nos. 15 (small) to 5 (large). We used the number of species caught per TL class for constructing trophic signatures (i.e. cumulative TL distributions), and estimated the TL at 25, 50 and 75% cumulative frequency (TL25, TL50, TL75) and the slopes using the logistic function. We also estimated the mean weighted TL of the catches (TLW). Our analyses showed that the TL characteristics of longlines varied much more than those of gill and trammel nets. The longlines of large hooks (Nos. 10, 9, 7, 5) were very TL selective, and their trophic signatures had very steep slopes, the highest mean TL50 values, very narrow mean TL25 to TL75 ranges and mean TLW > 4. In addition, the mean number of TL classes exploited was smaller and the mean TL50 and TLW were larger for the longlines of small hooks (Nos. 15, 13, 12, 11) in Greek than in Portuguese waters. Trammel and gill nets caught more TL classes, and the mean slopes of their trophic signatures were significantly smaller than those of longlines as a group. In addition, the mean number of TL classes exploited, the mean TL50 and the TLW of gill nets were significantly smaller than those of trammel nets. We attribute the differences between longlines of small hooks to bait type, and the differences between all gear types to their characteristic species and size-selectivity patterns. 
Finally, we showed how the slope and the TL50 of the trophic signatures can be used to characterise different gears along the ecologically 'unsustainable-sustainable' continuum.
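The trophic-signature summaries (TL50 from the cumulative distribution, and the catch-weighted mean TLW) are straightforward to compute. The sketch below interpolates TL50 between TL-class midpoints; the catch data are invented for illustration.

```python
# TL50 and catch-weighted mean trophic level (TLW) from per-class catch data.
# The class midpoints, species counts and weights below are fabricated.

tl_classes = [2.5, 3.0, 3.5, 4.0, 4.5]        # trophic-level class midpoints
species_counts = [2, 5, 8, 4, 1]              # species caught per TL class
catch_weight = [10.0, 30.0, 25.0, 20.0, 5.0]  # catch weight (kg) per TL class

def tl_quantile(classes, counts, q):
    """TL at cumulative fraction q (q=0.5 gives TL50), linearly interpolated."""
    total = sum(counts)
    cum, prev_tl, prev_cum = 0.0, classes[0], 0.0
    for tl, c in zip(classes, counts):
        cum += c / total
        if cum >= q:
            return prev_tl + (tl - prev_tl) * (q - prev_cum) / (cum - prev_cum)
        prev_tl, prev_cum = tl, cum
    return classes[-1]

def weighted_tl(classes, weights):
    """Catch-weighted mean trophic level (TLW)."""
    return sum(t * w for t, w in zip(classes, weights)) / sum(weights)

tl50 = tl_quantile(tl_classes, species_counts, 0.5)   # 3.1875 for these data
tlw = weighted_tl(tl_classes, catch_weight)
```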

Relevance:

30.00%

Publisher:

Abstract:

Background and aims: Advances in modern medicine have led to improved outcomes after stroke, yet an increased treatment burden has been placed on patients. Treatment burden is the workload of health care for people with chronic illness and the impact that this has on functioning and well-being. Those with comorbidities are likely to be particularly burdened. Excessive treatment burden can negatively affect outcomes. Individuals are likely to differ in their ability to manage health problems and follow treatments, defined as patient capacity. The aim of this thesis was to explore the experience of treatment burden for people who have had a stroke and the factors that influence patient capacity. Methods: There were four phases of research. 1) A systematic review of the qualitative literature that explored the experience of treatment burden for those with stroke. Data were analysed using framework synthesis, underpinned by Normalisation Process Theory (NPT). 2) A cross-sectional study of 1,424,378 participants >18 years, demographically representative of the Scottish population. Binary logistic regression was used to analyse the relationship between stroke and the presence of comorbidities and prescribed medications. 3) Interviews with twenty-nine individuals with stroke, fifteen analysed by framework analysis underpinned by NPT and fourteen by thematic analysis. The experience of treatment burden was explored in depth along with factors that influence patient capacity. 4) Integration of findings in order to create a conceptual model of treatment burden and patient capacity in stroke. Results: Phase 1) A taxonomy of treatment burden in stroke was created. The following broad areas of treatment burden were identified: making sense of stroke management and planning care; interacting with others including health professionals, family and other stroke patients; enacting management strategies; and reflecting on management. 
Phase 2) 35,690 people (2.5%) had a diagnosis of stroke, and of the 39 comorbidities examined, 35 were significantly more common in those with stroke. The proportion of those with stroke who had >1 additional morbidity (94.2%) was almost twice that of controls (48%) (odds ratio (OR) adjusted for age, gender and socioeconomic deprivation: 5.18; 95% confidence interval (CI): 4.95-5.43), and 34.5% had 4-6 comorbidities compared to 7.2% of controls (OR: 8.59; CI: 8.17-9.04). In the stroke group, 12.6% of people had a record of >11 repeat prescriptions compared to only 1.5% of the control group (OR adjusted for age, gender, deprivation and morbidity count: 15.84; CI: 14.86-16.88). Phase 3) The taxonomy of treatment burden from Phase 1 was verified and expanded. Additionally, treatment burdens were identified as arising either from the workload of healthcare or from the endurance of care deficiencies. A taxonomy of patient capacity was created. Six factors were identified that influence patient capacity: personal attributes and skills; physical and cognitive abilities; support network; financial status; life workload; and environment. A conceptual model of treatment burden was created. Healthcare workload and the presence of care deficiencies can influence and be influenced by patient capacity. The quality and configuration of health and social care services influence healthcare workload, care deficiencies, and patient capacity. Conclusions: This thesis provides important insights into the considerable treatment burden experienced by people who have had a stroke and the factors that affect their capacity to manage their health. Multimorbidity and polypharmacy are common in those with stroke, and levels of both are high. The findings have important implications for the design of clinical guidelines and healthcare delivery: for example, co-ordination of care should be improved, shared decision-making enhanced, and patients better supported following discharge from hospital.
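The thesis's odds ratios are adjusted via binary logistic regression; as a simpler illustration of the measure itself, the sketch below computes a crude (unadjusted) odds ratio and its 95% confidence interval from a 2x2 table. The counts are fabricated, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Crude OR and 95% CI for the 2x2 table:
    exposed: a with outcome, b without; unexposed: c with outcome, d without.
    Uses the normal approximation on the log odds ratio."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: a comorbidity in a stroke group vs a control group.
or_, lo, hi = odds_ratio_ci(a=340, b=160, c=120, d=380)
```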

Relevance:

30.00%

Publisher:

Abstract:

Montado decline has been reported since the end of the nineteenth century in southern Portugal and increased markedly during the 1980s. There is consensus in the literature that this decline is due to a number of factors, such as environmental constraints, forest diseases, inappropriate management, and socioeconomic issues. An assessment of the pattern of montado distribution was conducted to reveal how land management, environmental variables, and spatial factors contributed to montado area loss in southern Portugal from 1990 to 2006. A total of 14 independent variables, presumably related to montado loss, were grouped into three sets: environmental variables, land management variables, and spatial variables. From 1990 to 2006, approximately 90,054 ha of montado area disappeared, with an estimated annual regression rate of 0.14% year-1. Variation partitioning showed that the land management model accounted for the highest percentage of explained variance (51.8%), followed by spatial factors (44.6%) and environmental factors (35.5%). These results indicate that most of the variance in the large-scale distribution of recent montado loss is due to land management, either alone or in combination with environmental and spatial factors. The full GAM model showed that livestock grazing is one of the most important variables affecting montado loss, suggesting that the optimum carrying capacity for livestock grazing in montado should be reduced to 0.18-0.60 LU ha-1 under current ecological conditions in southern Portugal.

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: The study of labile carbon fractions (LCF) provides an understanding of the behavior of soil organic matter (SOM) under different soil management systems and cover crops. The aim of this study was to assess the effect of different soil management systems, with respect to tillage, cover crop, and phosphate fertilization, on the amount of the LCF of SOM. Treatments consisted of conventional tillage (CT) and no-tillage (NT) with millet as the cover crop, and a no-tillage system with velvet bean, at two phosphorus dosages. Soil samples were collected and analyzed for organic carbon (OC), C oxidizable by KMnO4 (C-KMnO4), particulate OC (POC), microbial biomass carbon, and light SOM in the 0.0-0.05, 0.05-0.10 and 0.10-0.20 m soil layers. The Carbon Management Index (CMI) was calculated to evaluate the impact of the soil management treatments on the quality of the SOM. The different LCFs are sensitive to the different soil management systems, and there are significant correlations between them. C-KMnO4 is considered the best indicator of OC lability. In the soil surface layers, CT reduced the carbon content in all of the labile fractions of SOM. The use of phosphorus led to the accumulation of OC and carbon in the different soil fractions regardless of the tillage system or cover crop. The application of phosphate fertilizer improved the ability of the NT system to promote soil quality, as assessed by the CMI.
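The CMI is commonly computed following Blair et al. (1995), comparing each treatment with a reference (e.g. native) soil: CMI = CPI x LI x 100, where CPI is the ratio of total carbon and LI the ratio of labilities (labile C / non-labile C). The abstract does not give raw values, so the numbers in the sketch are hypothetical.

```python
# Carbon Management Index (CMI) after Blair et al. (1995); inputs in g/kg.
# All values below are hypothetical, chosen only to show the calculation.

def cmi(total_c, labile_c, ref_total_c, ref_labile_c):
    """CMI = carbon pool index (CPI) * lability index (LI) * 100."""
    cpi = total_c / ref_total_c                          # carbon pool index
    lability = labile_c / (total_c - labile_c)           # treatment lability
    ref_lability = ref_labile_c / (ref_total_c - ref_labile_c)
    return cpi * (lability / ref_lability) * 100

# A no-tillage plot (18 g/kg total C, 3 g/kg C-KMnO4) vs a native reference.
print(round(cmi(18.0, 3.0, 20.0, 3.5), 1))   # -> 84.9
```

A CMI of 100 means the treatment maintains the reference soil's carbon status; values below 100 indicate degradation of quantity, lability, or both.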

Relevance:

20.00%

Publisher: