893 results for Policy-based network management
Abstract:
Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation’s highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida, was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of 90% to 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector.
In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights on the design of AID for arterial street applications.
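The processing chain this abstract describes (DWT plus normalization applied to detector data, an ANN classifier, and DR/FAR scoring) can be sketched roughly as follows. This is a minimal illustration only: the Haar wavelet, the sigmoid network shape, the 0.5 alarm threshold, and the speed bounds are assumptions, not the dissertation's actual design.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform (assumes an
    even-length window); returns (approximation, detail) coefficients."""
    s2 = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    return approx, detail

def normalize(values, lo, hi):
    """Min-max normalization to [0, 1] given assumed sensor bounds."""
    return [(v - lo) / (hi - lo) for v in values]

def mlp_forward(x, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a tiny one-hidden-layer ANN with sigmoid units;
    the output is interpreted as the probability of an incident."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    h = [sig(sum(wi * xi for wi, xi in zip(w, x)) + b)
         for w, b in zip(w_hidden, b_hidden)]
    return sig(sum(wi * hi for wi, hi in zip(w_out, h)) + b_out)

def detection_stats(predictions, labels, threshold=0.5):
    """DR = detected incident intervals / actual incident intervals;
    FAR = false alarms / non-incident intervals."""
    flags = [p >= threshold for p in predictions]
    incidents = sum(labels)
    detected = sum(1 for f, y in zip(flags, labels) if f and y)
    false_alarms = sum(1 for f, y in zip(flags, labels) if f and not y)
    dr = detected / incidents if incidents else 0.0
    far = false_alarms / (len(labels) - incidents) if len(labels) > incidents else 0.0
    return dr, far

# Illustrative use: an 8-sample window of mid-block detector speeds (mph)
# around a hypothetical incident; the approximation coefficients become
# part of the ANN's input vector after normalization.
speeds = [42.0, 40.0, 38.0, 22.0, 12.0, 10.0, 11.0, 9.0]
approx, detail = haar_dwt(speeds)
features = normalize(approx, 0.0, 60.0 * math.sqrt(2))
```

A trained model would supply the weights passed to `mlp_forward`; here they are left as inputs since the abstract does not specify the network topology.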
Abstract:
This book brings together experts in the fields of spatial planning, land-use and infrastructure management to explore the emerging agenda of spatially-oriented integrated evaluation. It weaves together the latest theories, case studies, methods, policy and practice to examine and assess the values, impacts, benefits and overall success of integrated land-use management. In doing so, the book clarifies the nature and roles of evaluation and puts forward guidance for future policy and practice.
Abstract:
Based on an original and comprehensive database of all fiction feature films produced in Mercosur between 2004 and 2012, the paper analyses whether the Mercosur film industry has evolved towards an integrated and culturally more diverse market. It provides a summary of policy opportunities in terms of integration and diversity, emphasizing the limited role played by regional policies. It then shows that although the Mercosur film industry remains rather disintegrated, it tends to become more integrated and culturally more diverse. From a methodological point of view, the combination of Social Network Analysis and the Stirling Model opens up interesting research tracks to analyse creative industries in terms of their market integration and their cultural diversity.
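The Stirling Model referred to above scores diversity from three properties: variety (how many categories), balance (their proportions), and disparity (how different the categories are). A minimal sketch of the heuristic in its commonly cited form (pairwise disparity weighted by the two proportions, both exponents set to 1); the genre shares below are invented for illustration, not taken from the paper's database.

```python
def stirling_diversity(proportions, disparity):
    """Stirling's diversity heuristic: sum over ordered pairs of distinct
    categories of disparity(i, j) * p_i * p_j. Scores rise with more
    categories, more even shares, and more dissimilar categories."""
    total = 0.0
    for i in proportions:
        for j in proportions:
            if i != j:
                total += disparity[(i, j)] * proportions[i] * proportions[j]
    return total

# Hypothetical example: two film categories with equal market shares and
# maximal pairwise disparity.
shares = {"drama": 0.5, "documentary": 0.5}
dist = {("drama", "documentary"): 1.0, ("documentary", "drama"): 1.0}
```

A market concentrated in a single category scores zero regardless of disparity, which is why the measure captures balance as well as variety.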
Abstract:
Over the last thirty years, there has been an increased demand for better management of public sector organisations (PSOs). This requires that they are answerable not only for the inputs that they are given but also for what they achieve with these inputs (Hood 1991; Hood 1995). It is suggested that this will improve the management of the organisation through better planning and control, and the achievement of greater accountability (Smith 1995). However, such a rational approach, with clear goals and the means to measure achievement, can cause difficulties for many PSOs. These difficulties include the distinctive nature of the public sector, due to the political environment within which the public sector manager operates (Stewart and Walsh 1992), and the fact that PSOs have many stakeholders, each of whom will have their own specific objectives based on their own perspective (Boyle 1995). This can result in goal ambiguity, which means that there is leeway in interpreting the results of the PSO. The National Asset Management Agency (NAMA) was set up to bring stability to the financial system by buying loans from the banks (in most cases, non-performing loans). The intention was to cleanse the banks of these loans so that they could return to their normal business of taking deposits and making loans. However, the legislation also gave NAMA a wide range of other responsibilities, including responsibility for facilitating credit in the economy and protecting the interests of taxpayers. More recently, NAMA has been given responsibility for building social housing. This wide range of activities is a clear example of a PSO being given multiple goals which may conflict and is therefore likely to lead to goal ambiguity. This makes it very difficult to evaluate NAMA’s performance, as it is attempting to meet numerous goals at the same time, and also highlights the complexity of policy making in the public sector. The purpose of this paper is to examine how NAMA dealt with goal ambiguity, through a thematic analysis of its annual reports over the last five years. The paper will contribute to the ongoing debate about the evaluation of PSOs and the complex environment within which they operate, which makes evaluation difficult as they are answerable to multiple stakeholders who have different objectives and different criteria for measuring success.
Abstract:
The International Long-Term Ecological Research (ILTER) network comprises > 600 scientific groups conducting site-based research within 40 countries. Its mission includes improving the understanding of global ecosystems and informing solutions to current and future environmental problems at the global scale. The ILTER network covers a wide range of social-ecological conditions and is aligned with the Programme on Ecosystem Change and Society (PECS) goals and approach. Our aim is to examine and develop the conceptual basis for proposed collaboration between ILTER and PECS. We describe how a coordinated effort of several contrasting LTER site-based research groups contributes to the understanding of how policies and technologies drive either toward or away from the sustainable delivery of ecosystem services. This effort is based on three tenets: transdisciplinary research; cross-scale interactions and subsequent dynamics; and an ecological stewardship orientation. The overarching goal is to design management practices taking into account trade-offs between using and conserving ecosystems toward more sustainable solutions. To that end, we propose a conceptual approach linking ecosystem integrity, ecosystem services, and stakeholder well-being as a way to analyze trade-offs among ecosystem services inherent in diverse management options. We also outline our methodological approach that includes: (i) monitoring and synthesis activities following spatial and temporal trends and changes at each site and documenting cross-scale interactions; (ii) developing analytical tools for integration; (iii) promoting trans-site comparison; and (iv) developing conceptual tools to design adequate policies and management interventions to deal with trade-offs. Finally, we highlight the heterogeneity in the social-ecological settings encountered in a subset of 15 ILTER sites.
These case studies are diverse enough to provide a broad cross-section of contrasting ecosystems with different policy and management drivers of ecosystem conversion; distinct trends of biodiversity change; different stakeholders’ preferences for ecosystem services; and diverse components of well-being issues.
Abstract:
This chapter establishes a framework for the governance of intermodal terminals throughout their life cycle, based on the product life cycle. The framework covers the initial planning by the public sector, the public/private split in funding and ownership, the selection of an operator, ensuring fair access to all users, and finally reconcessioning the terminal to a new operator, managing the handover and maintaining the terminal throughout its life cycle. This last point is especially important as industry conditions change and the terminal's role in the transport network comes under threat, either by a lack of demand or by increased demand requiring expansion, redesign and reinvestment. Each stage of the life cycle framework is operationalised based on empirical examples drawn from research by the authors on intermodal terminal planning and funding, the tender process and concession and operation contracts. In future the framework can be applied in additional international contexts to form a basis for transport cost analysis, logistics planning and government policy.
Abstract:
This PhD thesis is an empirical research project in the field of modern Polish history. The thesis focuses on Solidarity, the Network and the idea of workers’ self-management. In addition, the thesis is based on an in-depth analysis of Solidarity archival material. The Solidarity trade union was born in August 1980 after talks between the communist government and strike leaders at the Gdansk Lenin Shipyards. In 1981 a group called the Network rose up, due to cooperation between Poland’s great industrial factory plants. The Network grew out of Solidarity; it was made up of Solidarity activists, and the group acted as an economic partner to the union. The Network was the base of a grass-roots, nationwide workers’ self-management movement. Solidarity and the self-management movement were crushed by the imposition of Martial Law in December 1981. Solidarity revived itself immediately, and the union created an underground society. The Network also revived in the underground, and it continued to promote self-management activity where this was possible. When Solidarity regained its legal status in April 1989, workers’ self-management no longer had the same importance in the union. Solidarity’s new politico-economic strategy focused on free markets, foreign investment and privatization. This research project ends in July 1990, when the new Solidarity-backed government enacted a privatization law. The government decided to transform the property ownership structure through a centralized privatization process, which was a blow for supporters of workers’ self-management. This PhD thesis provides new insight into the evolution of the Solidarity union from 1980-1990 by analyzing the fate of workers’ self-management. This project also examines the role of the Network throughout the 1980s. There is analysis of the important link between workers’ self-management and the core ideas of Solidarity. 
In addition, the link between political and economic reform is an important theme in this research project. The Network was aware that authentic workers’ self-management required reforms to the authoritarian political system. Workers’ self-management competed against other politico-economic ideas during the 1980s in Poland. The outcome of this competition between different reform concepts has shaped modern-day Polish politics, economics and society.
Abstract:
Part 15: Performance Management Frameworks
Abstract:
Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management using system dynamics simulation, as well as environmental systems analysis techniques, to provide insights for water quality management in the Great Lakes basin. The traditional linear thinking paradigm lacks the mental and organizational framework for sustainable development trajectories, and may lead to quick-fix solutions that fail to address the key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and its application to water resources problems by offering useful qualitative tools such as causal loop diagrams (CLD), stock-and-flow diagrams (SFD), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulations and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and the root causes of problems, and thus promote sustainable water resources decision making.
Within the system dynamics framework, a growth and underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and assess a number of proposed total maximum daily load (TMDL) reduction policies, including phosphorus load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process, which considers the functionality of dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale BMP implementation in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's TP concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
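The stock-and-flow reasoning described above can be caricatured with a one-stock system dynamics model: lake phosphorus concentration as the stock, PS and NPS loads as inflows reduced by policy fractions, and flushing as the outflow. Every number below is invented for illustration; the dissertation's model is far richer, with feedbacks to socio-economic growth and land use.

```python
def simulate_lake_tp(years, ps_load, nps_load, flushing_rate,
                     volume, ps_cut=0.0, nps_cut=0.0, p0=0.1, dt=0.1):
    """Euler integration of a one-stock phosphorus model:
        dP/dt = (reduced inflow load) / volume - flushing_rate * P
    ps_cut / nps_cut are the TMDL-style load-reduction fractions.
    Returns the TP concentration trajectory, sampled once per year."""
    inflow = (ps_load * (1.0 - ps_cut) + nps_load * (1.0 - nps_cut)) / volume
    p = p0
    trajectory = [p]
    steps_per_year = int(round(1.0 / dt))
    for _ in range(years):
        for _ in range(steps_per_year):
            p += dt * (inflow - flushing_rate * p)  # stock update
        trajectory.append(p)
    return trajectory
```

The stock settles at inflow/flushing_rate, so cutting NPS loads lowers the equilibrium concentration proportionally; that is the qualitative point a CLD or SFD would make graphically.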
Abstract:
Universidade Estadual de Campinas . Faculdade de Educação Física
Abstract:
This paper presents SMarty, a variability management approach for UML-based software product lines (PL). SMarty is supported by a UML profile, the SMartyProfile, and a process for managing variabilities, the SMartyProcess. SMartyProfile aims at representing variabilities, variation points, and variants in UML models by applying a set of stereotypes. SMartyProcess consists of a set of activities that is systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using SEI's Arcade Game Maker PL. An evaluation of SMarty and related work are discussed.
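SMarty itself operates on UML models through stereotypes, so plain code can only loosely mirror it; still, the core bookkeeping of a variation point (its variants and how many of them a product may select) can be sketched as below. The class names and the game-control example are hypothetical, merely echoing SEI's Arcade Game Maker domain, and are not SMarty's actual notation.

```python
from dataclasses import dataclass

@dataclass
class VariationPoint:
    """A place in the product line where products differ, with the
    variants that may fill it and the allowed selection range."""
    name: str
    variants: list
    mandatory: bool = True
    min_select: int = 1
    max_select: int = 1

def resolve_configuration(points, choices):
    """Check a product configuration against each variation point's
    selection constraints; returns the resolved product as a dict
    mapping variation-point name to the chosen variants."""
    product = {}
    for vp in points:
        picked = choices.get(vp.name, [])
        unknown = [v for v in picked if v not in vp.variants]
        if unknown:
            raise ValueError(f"unknown variants {unknown} at {vp.name}")
        if vp.mandatory and not (vp.min_select <= len(picked) <= vp.max_select):
            raise ValueError(
                f"{vp.name}: expected {vp.min_select}..{vp.max_select} variants")
        product[vp.name] = picked

    return product

# Hypothetical usage: a game may offer one or two input methods.
vp = VariationPoint("player_control", ["keyboard", "mouse", "joystick"],
                    min_select=1, max_select=2)
config = resolve_configuration([vp], {"player_control": ["keyboard"]})
```

In SMarty the analogous checks are driven by stereotype tagged values on the UML model rather than by code, but the constraint logic a configuration must satisfy is the same in spirit.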
Abstract:
Chagas disease is still a major public health problem in Latin America. Its causative agent, Trypanosoma cruzi, can be typed into three major groups, T. cruzi I, T. cruzi II and hybrids. These groups each have specific genetic characteristics and epidemiological distributions. Several highly virulent strains are found in the hybrid group; their origin is still a matter of debate. The null hypothesis is that the hybrids are of polyphyletic origin, evolving independently from various hybridization events. The alternative hypothesis is that all extant hybrid strains originated from a single hybridization event. We sequenced both alleles of genes encoding EF-1 alpha, actin and SSU rDNA of 26 T. cruzi strains and DHFR-TS and TR of 12 strains. This information was used for network genealogy analysis and Bayesian phylogenies. We found T. cruzi I and T. cruzi II to be monophyletic and that all hybrids had different combinations of T. cruzi I and T. cruzi II haplotypes plus hybrid-specific haplotypes. Bootstrap values (networks) and posterior probabilities (Bayesian phylogenies) of clades supporting the monophyly of hybrids were far below the 95% confidence interval, indicating that the hybrid group is polyphyletic. We hypothesize that T. cruzi I and T. cruzi II are two different species and that the hybrids are extant representatives of independent events of genome hybridization, which sporadically have sufficient fitness to impact on the epidemiology of Chagas disease.
Abstract:
The Brazilian Amazon is one of the most rapidly developing agricultural areas in the world and represents a potentially large future source of greenhouse gases from land clearing and subsequent agricultural management. In an integrated approach, we estimate the greenhouse gas dynamics of natural ecosystems and agricultural ecosystems after clearing in the context of a future climate. We examine scenarios of deforestation and postclearing land use to estimate the future (2006-2050) impacts on carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emissions from the agricultural frontier state of Mato Grosso, using a process-based biogeochemistry model, the Terrestrial Ecosystems Model (TEM). We estimate a net emission of greenhouse gases from Mato Grosso, ranging from 2.8 to 15.9 Pg CO2-equivalents (CO2-e) from 2006 to 2050. Deforestation is the largest source of greenhouse gas emissions over this period, but land uses following clearing account for a substantial portion (24-49%) of the net greenhouse gas budget. Due to land-cover and land-use change, there is a small foregone carbon sequestration of 0.2-0.4 Pg CO2-e by natural forests and cerrado between 2006 and 2050. Both deforestation and future land-use management play important roles in the net greenhouse gas emissions of this frontier, suggesting that both should be considered in emissions policies. We find that avoided deforestation remains the best strategy for minimizing future greenhouse gas emissions from Mato Grosso.