919 results for Policy network
Abstract:
This thesis studies an integrated approach to scheduling and service network design for rail freight transportation. Rail transportation is built around a two-level consolidation structure in which the assignment of cars to blocks, and of blocks to services, are decisions that greatly complicate operations management. In this thesis, the two consolidation processes and the operating schedule are studied simultaneously. Solving this problem yields a profitable operating plan comprising blocking policies, train routing and scheduling, makeup policies, and traffic assignment. To describe the various rail activities at the tactical level, we extend the physical network and build a three-layer space-time network structure in which the time dimension captures the temporal impact on operations, while train, block and car operations are described by distinct layers. Based on this network structure, we model the rail planning problem as a service network design problem, formulated as a mixed-integer mathematical program. The model proves very difficult to solve because of the large size of the instances treated and its intrinsic complexity. Three versions are studied: a simplified model (with nonstop services only), a complete model (with nonstop and multi-stop services), and a very-large-scale complete model. Several heuristics are developed to obtain good solutions within reasonable computing times. First, a special case with nonstop services is analyzed. Exploiting a specific characteristic of the nonstop service network design problem, we develop a new tabu search algorithm. A cycle-based neighbourhood is used for this purpose; it redistributes the flow circulating on the blocks along cycles drawn from the residual network. A slope-scaling algorithm is developed for the complete model, and we propose a new method, called ellipsoidal search, to further improve solution quality. Ellipsoidal search combines the good feasible solutions generated by the slope-scaling algorithm and groups the characteristics of good solutions to create an elite problem, which is solved exactly with commercial software. The heuristic thus takes advantage of the convergence speed of slope scaling and of the solution quality of ellipsoidal search. Numerical tests illustrate the efficiency of the proposed heuristic. Moreover, the algorithm is an attractive alternative for solving the simplified problem. Finally, we study the very-large-scale complete model. A hybrid heuristic is developed by integrating the ideas of the previous algorithm with column generation. We propose a new slope-scaling procedure in which, compared with the former one, only the approximation of service-related costs is considered. The new slope-scaling approach thus separates block decisions from service decisions, providing a natural decomposition of the problem. Numerical results show that the algorithm can identify good-quality solutions in a context aimed at solving real-life instances.
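The slope-scaling idea described above can be illustrated on a toy fixed-charge problem. This is a minimal sketch, not the thesis algorithm; all arc names and costs are invented:

```python
# Illustrative slope-scaling sketch for a fixed-charge design problem (not the
# thesis algorithm itself; all numbers are invented). One demand of 10 units
# must be routed over one of two parallel arcs; using arc a for x > 0 units
# costs f[a] + c[a] * x. Slope scaling replaces this with a linear rate
# rho[a] * x, solves the linear subproblem, then resets rho on used arcs so
# the linear cost reproduces the true cost at the current flow.

demand = 10
f = {"a1": 50.0, "a2": 5.0}   # fixed (design) costs
c = {"a1": 1.0, "a2": 6.0}    # per-unit flow costs

rho = dict(c)                 # one common initialization: start from c

for _ in range(20):
    # Linear subproblem: with parallel arcs, all flow takes the cheapest rate.
    best = min(rho, key=rho.get)
    new_rho = dict(rho)
    # Slope update: make rho * flow equal the true cost on the used arc.
    new_rho[best] = c[best] + f[best] / demand
    if new_rho == rho:        # fixed point: linear and true costs agree
        break
    rho = new_rho

true_cost = f[best] + c[best] * demand
print(best, true_cost)
```

At the fixed point the linearized cost matches the true fixed-plus-variable cost on the arcs actually used, which is what makes the approximation self-consistent.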
Abstract:
One objective of artificial intelligence is to model the behavior of an intelligent agent interacting with its environment. The environment's transformations can be modeled as a Markov chain, whose state is partially observable to the agent and affected by its actions; such processes are known as partially observable Markov decision processes (POMDPs). While the environment's dynamics are assumed to obey certain rules, the agent does not know them and must learn. In this dissertation we focus on the agent's adaptation as captured by the reinforcement learning framework. This means learning a policy---a mapping of observations into actions---based on feedback from the environment. The learning can be viewed as browsing a set of policies while evaluating them by trial through interaction with the environment. The set of policies is constrained by the architecture of the agent's controller. POMDPs require a controller to have a memory. We investigate controllers with memory, including controllers with external memory, finite state controllers and distributed controllers for multi-agent systems. For these various controllers we work out the details of the algorithms which learn by ascending the gradient of expected cumulative reinforcement. Building on statistical learning theory and experiment design theory, a policy evaluation algorithm is developed for the case of experience re-use. We address the question of sufficient experience for uniform convergence of policy evaluation and obtain sample complexity bounds for various estimators. Finally, we demonstrate the performance of the proposed algorithms on several domains, the most complex of which is simulated adaptive packet routing in a telecommunication network.
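The gradient-ascent idea in this abstract can be sketched on a two-armed bandit, a stateless stand-in for the POMDP controllers it discusses. Everything here (arm payoffs, learning rate, batch size) is an invented toy, not the dissertation's algorithm:

```python
import math
import random

random.seed(0)

# Minimal REINFORCE-style sketch on a two-armed bandit (all numbers invented).
# The stochastic policy has one parameter theta, with P(arm 1) = sigmoid(theta);
# we ascend a sampled estimate of the gradient of expected reinforcement.

reward = [0.2, 1.0]            # expected payoff of arms 0 and 1 (hypothetical)
theta, alpha = 0.0, 0.1        # policy parameter and learning rate

def p1(t):
    """Probability of pulling arm 1 under the current policy."""
    return 1.0 / (1.0 + math.exp(-t))

for _ in range(200):
    grad = 0.0
    for _ in range(50):        # a small batch of sampled pulls
        a = 1 if random.random() < p1(theta) else 0
        # grad of log P(a) w.r.t. theta for this Bernoulli policy is (a - p1)
        grad += reward[a] * (a - p1(theta))
    theta += alpha * grad / 50  # gradient ascent on expected reward

print(round(p1(theta), 2))     # the learned policy favours the better arm
```

The controllers studied in the dissertation add memory to the policy; the gradient-of-log-probability trick shown here stays the same.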
Abstract:
Abstract 1: Social networks such as Twitter are often used for disseminating and collecting information during natural disasters, and their potential for use in disaster management has been acknowledged. However, a more nuanced understanding of the communications that take place on social networks is required to integrate this information more effectively into disaster management processes. The type and value of information shared should be assessed, determining the benefits and issues, with credibility and reliability as known concerns. Mapping tweets onto the modelled stages of a disaster can be a useful evaluation for determining the benefits and drawbacks of using data from social networks, such as Twitter, in disaster management. A thematic analysis of tweets' content, language and tone during the UK Storms and Floods 2013/14 was conducted. Manual scripting was used to determine the official sequence of events and to classify the stages of the disaster into the phases of the Disaster Management Lifecycle, producing a timeline. Twenty-five topics discussed on Twitter emerged, and three key types of tweets, based on language and tone, were identified. The timeline represents the events of the disaster, according to the Met Office reports, classed into B. Faulkner's Disaster Management Lifecycle framework. Context is provided when observing the analysed tweets against the timeline, illustrating a potential basis and benefit for mapping tweets into the Disaster Management Lifecycle phases. Comparing the number of tweets submitted in each month with the timeline suggests that users tweet more as an event heightens and persists; furthermore, users generally express greater emotion and urgency in their tweets. This paper concludes that the thematic analysis of content on social networks, such as Twitter, can be useful in gaining additional perspectives for disaster management.
It demonstrates that mapping tweets into the phases of a Disaster Management Lifecycle model can have benefits in the recovery phase, not just the response phase, and can potentially improve future policies and activities. Abstract 2: The current execution of privacy policies, as a mode of communicating information to users, is unsatisfactory. Social networking sites (SNS) exemplify this issue, attracting growing concerns regarding their use of personal data and its effect on user privacy, which demonstrates the need for more informative policies. However, SNS lack the incentives required to improve policies, a problem exacerbated by the difficulty of creating a policy that is both concise and compliant. Standardization addresses many of these issues, providing benefits for users and SNS, although it is only possible if policies share attributes which can be standardized. This investigation used thematic analysis and cross-document structure theory to assess the similarity of attributes between the privacy policies (as available in August 2014) of the six most frequently visited SNS globally. Using the Jaccard similarity coefficient, two types of attribute were measured: the clauses used by SNS, and the coverage of forty recommendations made by the UK Information Commissioner's Office. Analysis showed that while similarity in the clauses used was low, similarity in the recommendations covered was high, indicating that SNS use different clauses to convey similar information. The analysis also showed that the low similarity in clauses was largely due to differences in semantics, elaboration and functionality between SNS. This paper therefore proposes that the policies of SNS already share attributes, indicating the feasibility of standardization, and makes five recommendations to begin facilitating it, based on the findings of the investigation.
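The Jaccard similarity coefficient used in the second abstract measures set overlap as |A ∩ B| / |A ∪ B|. A minimal illustration on invented clause sets (the real study compared the privacy policies of the six most visited SNS):

```python
# Jaccard similarity of two sets: |A ∩ B| / |A ∪ B|.
# The clause names below are invented purely for illustration.

def jaccard(a, b):
    """Jaccard similarity coefficient of two iterables treated as sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

clauses_sns1 = {"data collection", "cookies", "third parties", "retention"}
clauses_sns2 = {"data collection", "cookies", "advertising"}

print(jaccard(clauses_sns1, clauses_sns2))  # 2 shared of 5 distinct -> 0.4
```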
Abstract:
We propose and estimate a financial distress model that explicitly accounts for interactions or spill-over effects between financial institutions, through a spatial contiguity matrix built from financial network data on interbank transactions. This setup of the financial distress model allows for empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of this specification is that it simultaneously incorporates micro-prudential factors (Basel II) as well as macro-prudential and systemic factors (Basel III) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter that measures the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
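The network term in such a model is typically a spatial lag: each bank's distress is partly explained by a weighted average of its neighbours' distress, W @ y, with W a row-normalized contiguity matrix built from interbank links. A minimal sketch with invented numbers:

```python
# Spatial-lag sketch for a network distress model (all numbers invented).

A = [[0, 1, 1],   # bank 0 transacts with banks 1 and 2
     [1, 0, 0],
     [1, 0, 0]]

# Row-normalize so each row sums to 1 (standard spatial-econometrics practice).
W = [[x / sum(row) if sum(row) else 0.0 for x in row] for row in A]

y = [0.9, 0.1, 0.3]  # hypothetical distress scores

# Spatial lag W @ y: neighbours' average distress, one value per bank.
Wy = [sum(w * yi for w, yi in zip(row, y)) for row in W]
print(Wy)  # bank 0 inherits the average of banks 1 and 2
```

In the estimated model this lag enters as a regressor alongside institution-specific and macroeconomic covariates, and its coefficient measures the network externality.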
Abstract:
The article examines the structure of the collaboration networks of research groups where Slovenian and Spanish PhD students are pursuing their doctorates. The units of analysis are student-supervisor dyads. We use duocentred networks, a novel network structure appropriate for networks centred around a dyad. A cluster analysis reveals three typical clusters of research groups: those which are large and belong to several institutions are labelled as bridging social capital; those which are small and centred in a single institution but have high cohesion are labelled as bonding social capital; and those which are small and have low cohesion are called weak social capital groups. The academic performance of both PhD students and supervisors is highest in bridging groups and lowest in weak groups. Other variables are also found to differ according to the type of research group. Finally, some recommendations regarding academic and research policy are drawn.
Abstract:
The Sustainably Managing Environmental Health Risk in Ecuador project was launched in 2004 as a partnership linking a large Canadian university with leading Cuban and Mexican institutes to strengthen the capacities of four Ecuadorian universities for leading community-based learning and research in areas as diverse as pesticide poisoning, dengue control, water and sanitation, and disaster preparedness. By 2009, the project's train-the-trainer initiation had involved 27 participatory action research Master's theses in 15 communities, where 1200 community learners participated in the implementation of associated interventions. This led to the establishment of innovative Ecuadorian-led master's and doctoral programs, and of a Population Health Observatory on Collective Health, Environment and Society for the Andean region based at the Universidad Andina Simon Bolivar. Building on this network, numerous initiatives were begun, such as an internationally funded research project to strengthen dengue control in the coastal community of Machala, and the establishment of a local community eco-health centre focusing on determinants of health near Cuenca. Alliances of academic and non-academic partners from the South and North provide a promising orientation for learning together about ways of addressing negative trends of development. Assessing the impacts and sustainability of such processes, however, requires longer-term monitoring of results and related challenges.
Abstract:
This Policy Contribution assesses the broad obstacles hampering ICT-led growth in Europe and identifies the main areas in which policy could unlock the greatest value. We review estimates of the value that could be generated through take-up of various technologies and carry out a broad matching with policy areas. According to the literature survey and the collected estimates, the areas in which the right policies could unlock the greatest ICT-led growth are product and labour market regulations and the European Single Market. These areas should be reformed to make European markets more flexible and competitive. This would promote wider adoption of modern data-driven organisational and management practices thereby helping to close the productivity gap between the United States and the European Union. Gains could also be made in the areas of privacy, data security, intellectual property and liability pertaining to the digital economy, especially cloud computing, and next generation network infrastructure investment. Standardisation and spectrum allocation issues are found to be important, though to a lesser degree. Strong complementarities between the analysed technologies suggest, however, that policymakers need to deal with all of the identified obstacles in order to fully realise the potential of ICT to spur long-term growth beyond the partial gains that we report.
Abstract:
The proliferation of designated areas following the implementation of Natura 2000 in Greece has initiated changes in protected area design and conservation policy making, aiming to deliver action for biodiversity and integrative planning on a wider landscape. Following the sustainability concept, an integrative approach cannot realistically take place simply by extending protected areas and designations. The paper addresses public involvement and inter-sectoral coordination as major procedural elements of integrative management and evaluates the nature and strength of their negative or positive influences on the fulfilment of an integrative vision of nature conservation. A review of the history of protected areas and administrative developments in Greece provides useful input to the research. The analysis has shown that the selected network of Natura 2000 sites has been superimposed upon the existing system, resulting in duplication of administrative effort and related legislation. As a result, the overall picture of protected areas in the country appears complex, confusing and fragmented. Major failures of the integrated conservation perspective can be traced to structural causes rooted in the politico-economic power structures of mainstream policy and in a rather limited political commitment to conservation. It is concluded that greater realisation of integrated conservation in Greece necessitates policy reforms, related mainly to sectoral legal frameworks, to promote environmentalism, as well as an increased effort by the managing authorities to facilitate a broader framework of public dialogue and to give local communities incentives to benefit sustainably from protected areas. (C) 2006 Elsevier Ltd. All rights reserved.
Abstract:
We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure and the linkage between natural vegetation and the horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, in a 2 x 2 x 2 x 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection.
(C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The deployment of Quality of Service (QoS) techniques involves careful analysis of areas including business requirements, corporate strategy and the technical implementation process, which can lead to conflict or contradiction between the goals of the various user groups involved in policy definition. In addition, long-term change management presents a challenge, as these implementations typically require a high skill set and experience level, exposing organisations to effects such as "hyperthymestria" [1] and "The Seven Sins of Memory", defined by Schacter and discussed further within this paper. It is proposed that, given the information embedded within the packets of IP traffic, an opportunity exists to augment traffic management with a machine-learning, agent-based mechanism. This paper describes the process by which current policies are defined and the research required to support the development of an application that enables adaptive, intelligent Quality of Service controls to augment or replace the policy-based mechanisms currently in use.
Abstract:
Agri-environment schemes (AESs) have been implemented across EU member states in an attempt to reconcile agricultural production methods with protection of the environment and maintenance of the countryside. To determine the extent to which such policy objectives are being fulfilled, participating countries are obliged to monitor and evaluate the environmental, agricultural and socio-economic impacts of their AESs. However, few evaluations measure precise environmental outcomes and critically, there are no agreed methodologies to evaluate the benefits of particular agri-environmental measures, or to track the environmental consequences of changing agricultural practices. In response to these issues, the Agri-Environmental Footprint project developed a common methodology for assessing the environmental impact of European AES. The Agri-Environmental Footprint Index (AFI) is a farm-level, adaptable methodology that aggregates measurements of agri-environmental indicators based on Multi-Criteria Analysis (MCA) techniques. The method was developed specifically to allow assessment of differences in the environmental performance of farms according to participation in agri-environment schemes. The AFI methodology is constructed so that high values represent good environmental performance. This paper explores the use of the AFI methodology in combination with Farm Business Survey data collected in England for the Farm Accountancy Data Network (FADN), to test whether its use could be extended for the routine surveillance of environmental performance of farming systems using established data sources. Overall, the aim was to measure the environmental impact of three different types of agriculture (arable, lowland livestock and upland livestock) in England and to identify differences in AFI due to participation in agri-environment schemes. 
However, because farm size, farmer age, level of education and region are also likely to influence the environmental performance of a holding, these factors were also considered. Application of the methodology revealed that only arable holdings participating in agri-environment schemes had a greater environmental performance, although responses differed between regions. Of the other explanatory variables explored, the key factors determining the environmental performance for lowland livestock holdings were farm size, farmer age and level of education. In contrast, the AFI value of upland livestock holdings differed only between regions. The paper demonstrates that the AFI methodology can be used readily with English FADN data and therefore has the potential to be applied more widely to similar data sources routinely collected across the EU-27 in a standardised manner.
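The aggregation step behind an index like the AFI can be sketched as a weighted sum of indicator scores; the indicator names, weights and scores below are invented for illustration, since the real AFI weights come from stakeholder-based Multi-Criteria Analysis:

```python
# Weighted-sum sketch of a farm-level environmental index (invented data).
# Higher values indicate better environmental performance, as with the AFI.

weights = {"soil": 0.4, "water": 0.35, "biodiversity": 0.25}
scores = {"soil": 7.0, "water": 5.0, "biodiversity": 9.0}  # 0-10 scale

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights form a convex sum

afi = sum(weights[k] * scores[k] for k in weights)
print(round(afi, 2))
```

Comparing such index values between scheme participants and non-participants, while controlling for farm size, farmer age, education and region, is the kind of analysis the paper carries out with FADN data.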
Abstract:
This paper examines how innovation-related capabilities for production, design and marketing develop at the subsidiary level within multinational enterprises (MNEs). We focus on how subsidiary autonomy and changing opportunities to access internal (MNE) and external (host country) sources of capability contribute in a combined way to the accumulation of specialist capabilities in five Taiwan-based MNE subsidiaries in the semiconductor industry. Longitudinal analysis shows how the accumulation process is subject to discontinuities, as functional divisions are (re)opened and closed during the lifetime of the subsidiary. A composite set of innovation output measures also shows significant variations in within-function levels of capability across our sample. We conclude that subsidiary specialisation and unique subsidiary-specific advantages have evolved in a way that is strongly influenced by the above factors.
Abstract:
Dominant paradigms of causal explanation for why and how Western liberal-democracies go to war in the post-Cold War era remain versions of the 'liberal peace' or 'democratic peace' thesis. Yet such explanations have been shown to rest upon deeply problematic epistemological and methodological assumptions. Of equal importance, however, is the failure of these dominant paradigms to account for the 'neoliberal revolution' that has gripped Western liberal-democracies since the 1970s. The transition from liberalism to neoliberalism remains neglected in analyses of the contemporary Western security constellation. Arguing that neoliberalism can be understood simultaneously through the Marxian concept of ideology and the Foucauldian concept of governmentality – that is, as a complementary set of 'ways of seeing' and 'ways of being' – the thesis goes on to analyse British security in policy and practice, considering it as an instantiation of a wider neoliberal way of war. In so doing, the thesis draws upon, but also challenges and develops, established critical discourse analytic methods, incorporating within its purview not only the textual data that is usually considered by discourse analysts, but also material practices of security. This analysis finds that contemporary British security policy is predicated on a neoliberal social ontology, morphology and morality – an ideology or 'way of seeing' – focused on the notion of a globalised 'network-market', and is aimed at rendering circulations through this network-market amenable to neoliberal techniques of government. It is further argued that security practices shaped by this ideology imperfectly and unevenly achieve the realisation of neoliberal 'ways of being' – especially modes of governing self and other or the 'conduct of conduct' – and the re-articulation of subjectivities in line with neoliberal principles of individualism, risk, responsibility and flexibility. 
The policy and practice of contemporary British 'security' is thus recontextualised as a component of a broader 'neoliberal way of war'.
Abstract:
The Department of Public Policy Analysis of the Getulio Vargas Foundation (DAPP-FGV) is a research boutique concerned with the innovation of state structures and the relations between these and civil society, based on an interdisciplinary approach to applied social science together with Information and Communication Technologies.