Abstract:
Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid for control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preferences and technical constraints on the system. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of changes in system parameters, of abnormal conditions, and of information on system stability and security. These developments provide valuable measurements for power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide-area applications. Experiments and procedures were developed on the system in order to detect abnormal conditions and apply the proper remedies to heal the system.
A design for a DC microgrid was developed to integrate it into the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, used to study how such an architecture can help remedy abnormal system conditions during operation. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices are measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage power sharing between the AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating the impact of heavy loads on system stability and operational security.
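The phasor measurement techniques mentioned above are not detailed in this abstract; as an illustrative sketch only (not the dissertation's implementation), the core of most phasor estimators is a single-bin DFT over one cycle of sampled waveform data, assuming a known nominal frequency and an integer number of samples per cycle:

```python
import numpy as np

def estimate_phasor(samples: np.ndarray, samples_per_cycle: int) -> complex:
    """Estimate the fundamental-frequency phasor (RMS magnitude, phase angle)
    from one cycle of waveform samples via a single-bin DFT."""
    n = samples_per_cycle
    k = np.arange(n)
    # Correlate one full cycle with a complex exponential at the fundamental.
    dft_bin = np.sum(samples[:n] * np.exp(-2j * np.pi * k / n))
    # 2/n recovers the peak amplitude; divide by sqrt(2) for the RMS phasor.
    return (2.0 / n) * dft_bin / np.sqrt(2.0)

# Example: a cosine of amplitude 10 with a +30 degree phase, 64 samples/cycle.
n = 64
t = np.arange(n) / n
wave = 10.0 * np.cos(2.0 * np.pi * t + np.pi / 6.0)
ph = estimate_phasor(wave, n)
print(abs(ph), np.degrees(np.angle(ph)))  # ≈ 7.071 (RMS) and ≈ 30.0 (degrees)
```

Streaming this estimate over a sliding window, with timestamps from a common clock, is what lets geographically separated measurements be compared for the wide-area monitoring applications described above.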
Abstract:
County jurisdictions in America are increasingly exercising self-government in the provision of public community services in the context of second-order federalism. In states exercising this form of contemporary governance, county governments with “reformed” policy-making structures and professional management practices have begun to rival or surpass municipalities in the delivery of local services with regional implications, such as environmental protection (Benton, 2002, 2003; Marando and Reeves, 1993). The voter referendum, a form of direct democracy, is an important component of county land preservation and environmental protection policies. The recent growth and success of land preservation voter referendums nationwide reflect an increase in citizens' participation in government and their desire to protect vacant land and its natural environment from the threats of over-development, urbanization and sprawl, loss of open space and farmland, deterioration of ecosystems, and inadequate park and recreational amenities. The study employs a sequential mixed-methods design. First, a quantitative approach employs the Heckman two-step model, fitted with variables for the non-random sample of 227 voter-referendum counties and all non-voter-referendum counties in the U.S. from 1988 to 2009. Second, the qualitative data collected from an in-depth investigation of three South Florida county case studies, with twelve public administrator interviews, are transformed for integration with the quantitative findings. The purpose of the qualitative method is to complement, explain and enrich the statistical analysis of county demographic, socio-economic, terrain, regional, governance and government, political preference, environmentalism, and referendum-specific factors.
The research finds that government factors are significant for the success of land preservation voter referendums: more specifically, the presence of self-government authority (a home rule charter), a reformed structure (county administrator/manager or elected executive), and environmental interest groups. In addition, this study concludes that successful counties are often located on the coast, exhibit population and housing growth, and have older and more educated citizens who vote Democratic in presidential elections. The analysis of case-study documents and public administrator interviews finds that pragmatic considerations of timing, local politics and the networking of regional stakeholders are also important features of success. Further research is suggested utilizing additional public participation, local government and public administration factors.
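The Heckman two-step model used in the quantitative stage corrects for the non-random selection of referendum counties by adding the inverse Mills ratio from a first-stage probit as an extra regressor in the second stage. A minimal sketch of that mechanism, with illustrative variable names not taken from the study:

```python
import numpy as np
from scipy.stats import norm

def inverse_mills_ratio(z: np.ndarray) -> np.ndarray:
    """lambda(z) = phi(z) / Phi(z), the selection-correction term built from
    the first-stage probit index z (e.g. fitted selection propensities)."""
    return norm.pdf(z) / norm.cdf(z)

def heckman_second_stage(y: np.ndarray, x: np.ndarray, z: np.ndarray) -> np.ndarray:
    """Second-stage OLS of y on [1, x, lambda(z)] over the selected sample.
    A significant coefficient on lambda(z) indicates selection bias."""
    design = np.column_stack([np.ones_like(y), x, inverse_mills_ratio(z)])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta

# lambda(0) = phi(0) / Phi(0) = 0.3989... / 0.5
print(inverse_mills_ratio(np.array([0.0]))[0])  # ≈ 0.7979
```

The ratio shrinks as the selection index grows: counties almost certain to hold a referendum contribute little correction, while marginal counties contribute a large one.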
Abstract:
Contingent protection has grown to become an important trade-restricting device. In the European Union, protection instruments such as antidumping are used extensively. This paper analyses whether macroeconomic pressures may help explain the variations in the intensity of antidumping protectionism in the EU. The empirical analysis uses count data models, applying various specification tests to derive the most appropriate specification. Our results suggest that filing activity is inversely related to macroeconomic conditions. Moreover, they confirm existing evidence for the US suggesting that domestic macroeconomic pressures are a more important determinant of contingent protection policy than external pressures.
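The count data models referred to above (Poisson and its generalizations) relate a non-negative count, such as annual antidumping filings, to covariates through a log link. A self-contained sketch with simulated, purely illustrative data (the covariate name and coefficients are not from the paper); the fitter is plain Newton-Raphson, equivalent to IRLS for the canonical log link:

```python
import numpy as np

def poisson_fit(X: np.ndarray, y: np.ndarray, iters: int = 50) -> np.ndarray:
    """Fit a Poisson regression log E[y|x] = X @ beta by Newton-Raphson.
    X must have a leading constant column; y holds non-negative counts."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1e-9)        # start at the intercept-only fit
    for _ in range(iters):
        mu = np.exp(X @ beta)                # conditional mean of the counts
        grad = X.T @ (y - mu)                # score vector
        hess = X.T @ (X * mu[:, None])       # Fisher information matrix
        beta = beta + np.linalg.solve(hess, grad)
    return beta

# Illustrative data, not the paper's: filings fall as macro conditions improve.
rng = np.random.default_rng(0)
growth = rng.normal(0.0, 1.0, 2000)
filings = rng.poisson(np.exp(1.0 - 0.5 * growth))
X = np.column_stack([np.ones_like(growth), growth])
b0, b1 = poisson_fit(X, filings)
print(b0, b1)  # close to the true values (1.0, -0.5)
```

A negative slope on the macro covariate is exactly the "filing activity is inversely related to macroeconomic conditions" pattern; specification tests would then decide between Poisson and, say, negative binomial for overdispersed counts.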
Abstract:
Traditional knowledge associated with genetic resources (TKaGRs) is acknowledged as a valuable resource. Its value draws from economic, social, cultural, and innovative uses, which places TK at the heart of competing interests between the indigenous peoples who hold it and depend on it for their survival, and the profitable industries which seek to exploit it in the global market space. The latter group seeks, inter alia, to advance and maintain its global competitiveness by exploiting TKaGRs leads in research and development activities connected with modern innovation. Biopiracy remains an issue of central concern to the developing world and has emerged in this context as a label for the inequity arising from the misappropriation of TKaGRs located in the South by commercial interests usually located in the North. Significant attention and resources are being channeled into global efforts to design and implement effective protection mechanisms for TKaGRs against the incidence of biopiracy. The emergence and recent entry into force of the Nagoya Protocol offers the latest example of a concluded multilateral effort in this regard. The Nagoya Protocol, adopted on the platform of the Convention on Biological Diversity (CBD), establishes an open-ended international access and benefit sharing (ABS) regime which comprises the Protocol as well as several complementary instruments. By focusing on the trans-regime nature of biopiracy, this thesis argues that the intellectual property (IP) system forms a central part of the problem of biopiracy, and likewise of the very efforts to implement solutions, including through the Nagoya Protocol.
The ongoing related work within the World Intellectual Property Organization (WIPO), aimed at developing an international instrument (or a series of instruments) to address the effective protection of TK, constitutes an essential complementary process to the Nagoya Protocol and, as such, forms a fundamental element within the Nagoya Protocol's evolving ABS regime-complex. By adopting a third world approach to international law, this thesis draws central significance from its reconceptualization of biopiracy as a trans-regime concept. By construing the instrument(s) being negotiated within WIPO as a central component of the Nagoya Protocol, this dissertation's analysis highlights the importance of third world efforts to secure an IP-based reinforcement to the Protocol for the effective eradication of biopiracy.
Abstract:
In recent years, the 380V DC and 48V DC distribution systems have been extensively studied for the latest data centers. It is widely believed that the 380V DC system is a very promising candidate because of its lower cable cost compared to the 48V DC system. However, previous studies have not adequately addressed the low reliability of 380V DC systems caused by the large number of series-connected batteries. In this thesis, a quantitative comparison of the two systems is presented in terms of efficiency, reliability and cost. A new multi-port DC UPS with both a high-voltage output and a low-voltage output is proposed. When utility AC is available, it delivers power to the load through its high-voltage output and charges the battery through its low-voltage output. When utility AC is off, it boosts the low battery voltage and delivers power to the load from the battery. Thus, the advantages of both systems are combined and their disadvantages avoided. High efficiency is also achieved, as only one converter is working in either situation. Details of the design and analysis of the new UPS are presented. For the main AC-DC part of the new UPS, a novel bridgeless three-level single-stage AC-DC converter is proposed. It eliminates the auxiliary circuit for balancing the capacitor voltages and the two bridge rectifier diodes of the previous topology. Zero-voltage switching, high power factor, and low component stresses are achieved with this topology. Compared to previous topologies, the proposed converter has lower cost, higher reliability, and higher efficiency. The steady-state operation of the converter is analyzed and a decoupled model is proposed for it. For the battery-side converter, as part of the new UPS, a ZVS bidirectional DC-DC converter based on self-sustained oscillation control is proposed.
Frequency control is used to ensure ZVS operation of all four switches, and phase-shift control is employed to regulate the converter output power. A detailed analysis of the steady-state operation and design of the converter is presented. Theoretical, simulation, and experimental results are presented to verify the effectiveness of the proposed concepts.
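The UPS described above must boost the low battery voltage to the load bus when utility AC fails. As a back-of-the-envelope illustration only (not the thesis's converter, which is a ZVS bidirectional topology under frequency and phase-shift control), the textbook volt-second balance of an ideal boost stage shows why bridging the two bus levels in a single stage is demanding:

```python
def boost_duty_cycle(v_in: float, v_out: float) -> float:
    """Ideal continuous-conduction boost converter: volt-second balance over
    one switching period gives v_out = v_in / (1 - D), so D = 1 - v_in/v_out."""
    if not 0.0 < v_in < v_out:
        raise ValueError("a boost stage requires 0 < v_in < v_out")
    return 1.0 - v_in / v_out

# Bridging a 48 V battery string to a 380 V bus in one ideal boost stage:
print(boost_duty_cycle(48.0, 380.0))  # ≈ 0.874
```

A duty cycle near 0.87 leaves little margin for regulation and implies high component stress, which is one practical reason dedicated high-gain or resonant topologies, such as the one proposed, are preferred over a plain boost at this conversion ratio.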
Abstract:
Each year, global music piracy causes several billion dollars in economic losses, job losses and lost worker earnings, as well as the loss of millions of dollars in tax revenue. Most music piracy is due to the rapid growth and ease of current technologies for copying, sharing, manipulating and distributing musical data [Domingo, 2015], [Siwek, 2007]. Audio watermarking has been proposed to protect authors' rights and to locate the instants at which an audio signal has been tampered with. In this thesis, we propose using the bio-inspired sparse spike-graph representation (the spikegram) to design a new method for locating tampering in audio signals, as well as a new copyright protection method and, finally, a new perceptual attack, based on the spikegram, against audio watermarking systems. We first propose a technique for locating tampering in audio signals. To do so, we combine a modified spread spectrum (MSS) method with a sparse representation. We use an adapted perceptual matching pursuit technique (PMP [Hossein Najaf-Zadeh, 2008]) to generate a sparse representation (spikegram) of the input audio signal that is invariant to time shifts [E. C. Smith, 2006] and that takes into account masking phenomena as observed in hearing. An authentication code is embedded in the coefficients of the spikegram representation, which are then combined with the masking thresholds. The watermarked signal is resynthesized from the modified coefficients, and the resulting signal is transmitted to the decoder.
At the decoder, to identify a tampered segment of the audio signal, the authentication codes of all intact segments are analysed. If the codes cannot be detected correctly, the segment is known to have been tampered with. We propose watermarking according to the spread-spectrum principle (MSS) in order to obtain a large capacity in terms of embedded watermark bits. In situations where the encoder and decoder are desynchronized, our method can still detect tampered pieces. Compared with the state of the art, our approach has the lowest error rate for detecting tampered pieces. We used the mean opinion score (MOS) test to measure the quality of the watermarked signals. We evaluate the semi-fragile watermarking method by the error rate (the number of erroneous bits divided by the total number of submitted bits) after several attacks. The results confirm the superiority of our approach for locating tampered pieces in audio signals while preserving signal quality. We then propose a new technique for protecting audio signals, based on the spikegram representation and using two dictionaries (TDA, Two-Dictionary Approach). The spikegram is used to encode the host signal using a dictionary of gammatone filters. For watermarking, we use two different dictionaries, selected according to the input bit to be embedded and the content of the signal. Our approach finds the appropriate gammatones (called watermark kernels) based on the value of the bit to be embedded, and embeds the watermark bits in the phase of the watermark gammatones. Moreover, it is shown that the TDA is error-free when no attack occurs.
It is shown that decorrelating the watermark kernels enables the design of a highly robust audio watermarking method. Experiments showed the best robustness for the proposed method, compared with several recent techniques, when the watermarked signal is corrupted by MP3 compression at 32 kbps with a payload of 56.5 bps. We also studied the robustness of the watermark when the new USAC (Unified Speech and Audio Coding) codecs at 24 kbps are used; the payload is then between 5 and 15 bps. Finally, we use spikegrams to propose three new attack methods, which we compare with recent attacks such as 32 kbps MP3 and 24 kbps USAC compression. These attacks comprise the PMP attack, the inaudible-noise attack and the sparse replacement attack. In the PMP attack, the watermarked signal is represented and resynthesized with a spikegram. In the inaudible-noise attack, inaudible noise is generated and added to the spikegram coefficients. In the sparse replacement attack, in each segment of the signal, the spectro-temporal features of the signal (the time spikes) are found using the spikegram, and similar time spikes are replaced by others. To compare the effectiveness of the proposed attacks, we apply them against the spread-spectrum watermark decoder. It is shown that the sparse replacement attack reduces the normalized correlation of the spread-spectrum decoder by a larger factor than when the decoder is attacked by MP3 (32 kbps) or 24 kbps USAC compression.
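The spread-spectrum embedding and correlation-based detection discussed above can be illustrated in a generic form. This sketch embeds one bit in a coefficient vector using a key-derived pseudo-random sequence and recovers it by correlation; it stands in for, and does not reproduce, the thesis's spikegram-domain MSS method (a perceptual system would additionally scale the embedding strength by masking thresholds):

```python
import numpy as np

def ss_embed(coeffs: np.ndarray, bit: int, key: int, alpha: float = 0.1) -> np.ndarray:
    """Additive spread-spectrum embedding: add +/- alpha times a key-derived
    pseudo-random +/-1 sequence to the host coefficients."""
    rng = np.random.default_rng(key)
    pn = rng.choice([-1.0, 1.0], size=coeffs.size)
    return coeffs + (1.0 if bit else -1.0) * alpha * pn

def ss_detect(coeffs: np.ndarray, key: int) -> int:
    """Recover the bit from the sign of the correlation with the same
    key-derived pseudo-random sequence."""
    rng = np.random.default_rng(key)
    pn = rng.choice([-1.0, 1.0], size=coeffs.size)
    return int(float(coeffs @ pn) > 0.0)

host = np.random.default_rng(1).normal(0.0, 1.0, 4096)  # stand-in coefficients
marked = ss_embed(host, bit=1, key=42)
print(ss_detect(marked, key=42))
```

With 4096 coefficients the host's own correlation with the pseudo-random sequence is far below the embedding strength, so the bit is recovered reliably; a tampered segment breaks this detection, which is the basis of the tampering-localization idea, and an attack succeeds precisely when it drives this correlation toward zero.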
Abstract:
This study focuses on the learning and teaching of Reading in English as a Foreign Language (REFL) in Libya. The study draws on an action research process in which I sought to look critically at students and teachers of English as a Foreign Language (EFL) in Libya as they learned and taught REFL at four Libyan research sites. The Libyan EFL educational system is influenced by two main factors: the method of teaching the Holy Quran and the long-standing ban on teaching EFL under the former Libyan regime of Muammar Gaddafi. Both of these factors have affected the learning and teaching of REFL, and I outline these contextual factors in the first chapter of the thesis. This investigation, and the exploration of the challenges that Libyan university students encounter in their REFL, is supported by attention to reading models. These models helped to provide an analytical framework and starting point for understanding the many processes involved in reading for meaning and in reading to satisfy teacher instructions. The theoretical framework I adopted was based, mainly and initially, on top-down, bottom-up, interactive and compensatory interactive models. I drew on these models with a view to understanding whether and how the processes of reading described in the models could be applied to the reading of EFL students, and whether these models could help me better understand what was going on in REFL. The diagnosis stage of the study provided initial data collected from four Libyan research sites with research tools including video-recorded classroom observations, semi-structured interviews with teachers before and after lesson observation, and think-aloud protocols (TAPs) with 24 students (six from each university) in which I examined their REFL reading behaviours and strategies.
This stage indicated that the majority of students shared behaviours such as reading aloud, reading each word in the text, articulating the phonemes and syllables of words, or skipping words if they could not pronounce them. Overall, this first stage indicated that alternative methods of teaching REFL were needed in order to encourage ‘reading for meaning’, possibly based on strategies related to eventual interactive reading models adapted for REFL. The second phase of this research project was an intervention phase involving two team-teaching sessions in one of the four stage-one universities. In each session, I worked with the teacher of one group to introduce an alternative method of REFL, based on teaching different reading strategies to encourage the students to work towards an eventual interactive way of reading for meaning. A focus group discussion and TAPs followed the lessons with six students in order to discuss the 'new' method. Next were two video-recorded classroom observations, followed by an audio-recorded discussion with the teacher about these methods. Finally, I conducted a Skype interview with the class teacher at the end of the semester to discuss any changes he had made in his teaching, or had observed in his students' reading, with respect to reading behaviour strategies and the reactions and performance of the students as he continued to use the 'new' method. The results of the intervention stage indicate that the teacher, perhaps not surprisingly, can play an important role in adding to students' knowledge and confidence and in improving their REFL strategies. For example, after the intervention stage, students began to think about the title and to use their own background knowledge to comprehend the text. The students also employed linguistic strategies such as decoding and, above all, abandoned the behaviour of reading for pronunciation in favour of reading for meaning.
Despite the apparent efficacy of the alternative method, there are, inevitably, limitations related to the small-scale nature of the study and the time I had available to conduct the research. There are challenges, too, related to the students' first language, the idiosyncrasies of the English language, teacher training and the continuing professional development of teachers, and the continuing political instability of Libya. The students' lack of vocabulary and their difficulties with grammatical forms such as phrasal and prepositional verbs, which do not exist in Arabic, mean that REFL will always be challenging. Given such constraints, the ‘new’ methods I trialled and propose for adoption can only go so far in addressing students' difficulties in REFL. Overall, the study indicates that the Libyan educational system is underdeveloped and under-resourced with respect to REFL. My data indicate that the teacher participants have received little to no professional development that could help them improve their REFL teaching or their EFL teaching skills. These circumstances, along with the perennial problem of large but varying class sizes; student, teacher and assessment expectations; and limited and often poor-quality resources, affect the way EFL students learn to read in English. Against this background, the thesis concludes by offering tentative conclusions, reflections on the study (including a discussion of its limitations), and possible recommendations designed to improve REFL learning and teaching in Libyan universities.
Abstract:
This case study seeks to evaluate the scope and limitations of social mobilization in achieving institutional transformation, based on the study of social mobilization in Egypt during the period 2010-2013. It analyses and explains how slow-moving institutions, such as power structures and mental structures, frustrated the events in Egypt known as the Arab Spring. Following the institutional perspective of Gérard Roland and Alejandro Portes, the research concludes that slow-moving institutions encompass structural aspects of a society such as power and culture. For this reason, they cannot be changed easily, since they rest on solid foundations built through historical processes grounded in ideologies and values.
Abstract:
With the end of unipolarity, not only were global governance mechanisms such as international regimes strengthened, but non-state actors were strengthened as well. Despite the importance these two elements have acquired, there is still no theory that exhaustively explains the relationship between them. For this reason, this research seeks to answer how the role of transnational advocacy networks has influenced the evolution of the human-trafficking regime in the Mekong Region. It also aims to understand the relationship between the regime and transnational advocacy networks through a case study based on qualitative methodologies, specifically constructivist theoretical analysis and content analysis of documents produced by state and non-state actors.
Abstract:
Data sharing between organizations through interoperability initiatives involving multiple information systems is fundamental to promoting the collaboration and integration of services. However, the considerable increase in the exposure of data to additional risks requires special attention to issues related to the privacy of these data. For the Portuguese healthcare sector, where the sharing of health data is nowadays a reality at the national level, data privacy is a central issue that needs solutions matched to the agreed level of interoperability between organizations. This context led the authors to study the factors influencing data privacy in a context of interoperability, through qualitative and interpretative research based on the case study method. This article presents the final results of the research, which identifies 10 subdomains of factors influencing data privacy; these should form the basis for the development of a joint protection program targeted at issues associated with data privacy.
Abstract:
This dissertation looks at three widely accepted assumptions about how the patent system works: that patent documents disclose inventions, that this disclosure happens quickly, and that patent owners are able to enforce patents. The first chapter estimates the effect of stronger trade secret protection on the number of patented innovations. When firms find it easier to protect business information, there is less need for patent protection, and accordingly less need for the disclosure of technical information that is required by patent law. The novel finding is that when it is easier to keep innovations secret, there is not only a reduction in the number of patents but also a sizeable reduction in disclosed knowledge per patent. The chapter then shows how this endogeneity of the amount of knowledge per patent can affect the measurement of innovation using patent data. The second chapter develops a game-theoretic model to study how the introduction of fee-shifting in US patent litigation would influence firms' patenting propensities. When the defeated party to a lawsuit has to bear not only its own costs but also the legal expenditure of the winning party, manufacturing firms in the model unambiguously reduce patenting, with small firms affected the most. For fee-shifting to have the same effect as in Europe, the US legal system would require the shifting of a much smaller share of fees. Lessons from European patent litigation may, therefore, have only limited applicability to the US case. The third chapter contains a theoretical analysis of the influence of delayed disclosure of patent applications by the patent office. Such a delay is a feature of most patent systems around the world but has so far not attracted analytical scrutiny. The delay may give firms various kinds of strategic (non-)disclosure incentives when they are competing for more than a single innovation.
Abstract:
In digital markets, personal information is pervasively collected by firms. In the first chapter I study data ownership and product customization when there is exclusive access to non-rival but excludable data about consumer preferences. I show that an incumbent firm does not have an incentive to sell an exclusively held dataset to a rival firm, but instead has an incentive to trade a customizing technology with the other firm. In the second chapter I investigate the effects of consumer information on the intensity of competition. In a two-dimensional model of product differentiation, firms use information on preferences to practice price discrimination. I contrast a full-privacy and a no-privacy benchmark with a regime in which firms are able to target consumers only partially. When data are partially informative, firms are always better off with price discrimination, and exclusive access to user data is not necessarily a competition policy concern. From a consumer protection perspective, the policy recommendation is that the regulator should promote either no privacy or full privacy. In the third chapter I introduce a data broker that observes either one or both dimensions of consumer information and sells these data to competing firms for price discrimination purposes. When the seller exogenously holds a partially informative dataset, an exclusive allocation arises. Instead, when the dataset held is fully informative, the data broker trades information non-exclusively, but each competitor acquires consumer data on a different dimension. When data collection is made endogenous, non-exclusivity is robust if collection costs are not too high. The competition policy suggestion is that exclusivity should not be banned per se; rather, it is data differentiation in equilibrium that raises market power in competitive markets. Upstream competition is sufficient to ensure that both firms get access to consumer information.
Abstract:
Big data are reshaping the way we interact with technology, fostering new applications that increase the safety assessment of foods. Extraordinary amounts of information are analysed using machine learning approaches aimed at detecting the existence, or predicting the likelihood, of future risks. Food business operators have to share the results of these analyses when applying to place regulated products on the market, while agri-food safety agencies (including the European Food Safety Authority) are exploring new avenues to increase the accuracy of their evaluations by processing Big data. Such an informational endowment brings with it opportunities and risks correlated with the extraction of meaningful inferences from data. However, conflicting interests and tensions among the entities involved - the industry, food safety agencies, and consumers - hinder the finding of shared methods to steer the processing of Big data in a sound, transparent and trustworthy way. A recent reform of the EU sectoral legislation, the lack of trust, and the presence of a considerable number of stakeholders highlight the need for ethical contributions aimed at steering the development and deployment of Big data applications. Moreover, the Artificial Intelligence guidelines and charters published by European Union institutions and Member States have to be discussed in light of applied contexts, including the one at stake here. This thesis aims to contribute to these goals by discussing which principles should be put forward when processing Big data in the context of agri-food safety risk assessment. The research focuses on two intertwined topics - data ownership and data governance - evaluating how the regulatory framework addresses the challenges raised by Big data analysis in these domains. The outcome of the project is a tentative Roadmap that identifies the principles to be observed when processing Big data in this domain and their possible implementations.
Abstract:
This work addresses gender-based violence, in particular femicide, stalking and domestic violence, given the connection between them in the escalation of violence. Femicides are often preceded by stalking or by repeated episodes of physical violence, mostly perpetrated in the domestic sphere and within previous or current intimate relationships. The first part of the work describes the scientific and legal framework, international and national, of gender-based violence, the scale of the phenomenon, and the evolution of legislation protecting victims, as required by the Istanbul Convention. The second part deals with the medico-legal aspects of gender-based violence (forensic pathology, forensic genetics and forensic toxicology in femicides, medico-legal assistance to victims of maltreatment and sexual violence, and medico-legal assessment of personal injury caused by stalking). The third part presents a study of autopsy cases of femicide from 1950-2019 and of police chief (Questore) warnings for stalking and domestic violence from 2009-2020 in the province of Bologna. The results show that femicide is a long-standing phenomenon, set in a legal and cultural framework that "tolerated" violence against female victims. The trend has remained constant to the present day, with changes in the causes and means of death. The increased average age of victims points to the growing phenomenon of elder abuse. In the warnings for stalking and domestic violence analysed, female victims and male perpetrators prevail, in intimate and family contexts. In-depth study of the characteristics of perpetrators and victims of femicide, stalking and domestic violence makes it possible to identify risk indicators for implementing targeted prevention strategies. The privileged medico-legal point of view can play a central role alongside the other figures involved in preventing, prosecuting and combating gender-based violence.
Only a rigorous multidisciplinary methodological approach can help in prevention. Research in this field is the strength of the multidisciplinary management of the victim.
Abstract:
The Myanmar “period of transition” (2011-2021) has often been described as a puzzle. Various scholars have begun to engage with the Myanmar context in an effort to grasp the essence of the transition it underwent during President Thein Sein's USDP and Aung San Suu Kyi's NLD governments. My work focuses on a specific policy sector, higher education, with a view to contributing to this scholarly debate about what was actually happening inside this complex country “transition”, especially in terms of collective participation in the process of political and social change. Reviewing existing scholarly literature on the politics of higher education, my study employs a triangle of analysis in which higher education reform is framed as the interplay of action on the part of “state authority”, “student politics” and “international actors”. What does this interplay lens reveal when applied to Myanmar's “period of transition”? I argue that it shows the ambiguity and contradiction of tangible pushes for progressive social change that coexisted with authoritarian currents and the reinforcement of the societal position of dominant elites. At the policy level, ultimately, a convergence of interests between international actors and state authority served as the force driving the new higher education reform towards a neo-liberal model of governance and autonomy. This work unpacks the higher education reform process through qualitative data gathered via extensive participant observation, in-depth interviewing and critical discourse analysis, shedding light on the rich narratives of those involved in the politics of higher education in Myanmar.