819 results for Emerging Challenges in offshoring
Abstract:
Wireless sensor networks (WSNs) will be among the most important users of wireless communication technologies in the coming years, and several challenges in this area must be addressed before their complete development. Energy consumption and spectrum availability are two of the most severe constraints on WSNs due to their intrinsic nature. Cognitive capabilities were introduced into these networks to face the issue of spectrum scarcity, but the new range of communication possibilities they provide can be used to face energy challenges as well. In this paper a new strategy based on game theory for cognitive WSNs is discussed. The presented strategy improves energy consumption by taking advantage of the new capability to change the communication channel: based on game theory, it decides when to change the transmission channel depending on the behavior of the rest of the network nodes. The strategy is lightweight yet achieves higher energy saving rates than noncognitive networks and even than other strategies based on scheduled spectrum sensing. Simulations are presented for several scenarios and demonstrate energy saving rates of around 65% compared with WSNs that do not use cognitive techniques.
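To illustrate the kind of decision rule such a strategy embodies, here is a minimal sketch in Python. The payoff model is hypothetical (the energy and contention parameters are invented, not taken from the paper): a node switches channel only when the expected contention cost of staying exceeds the energy cost of retuning the radio.

```python
# Hypothetical payoff model for a channel-change decision in a
# cognitive WSN node. All parameter values are illustrative.

def switch_payoff(neighbors_on_channel: int,
                  tx_energy: float = 1.0,
                  switch_cost: float = 0.5,
                  contention_factor: float = 0.3) -> float:
    """Payoff difference: (cost of staying) - (cost of switching).

    Staying costs one transmission plus retransmissions driven by
    contention with neighbors on the same channel; switching costs one
    transmission plus the fixed energy spent retuning the radio.
    A positive payoff means switching saves energy.
    """
    stay_cost = tx_energy * (1.0 + contention_factor * neighbors_on_channel)
    switch_cost_total = tx_energy + switch_cost
    return stay_cost - switch_cost_total

def should_switch(neighbors_on_channel: int) -> bool:
    return switch_payoff(neighbors_on_channel) > 0.0

print(should_switch(1), should_switch(2))  # prints: False True
```

With these illustrative defaults, switching pays off once more than one neighbor shares the channel, since the contention term (0.3 per neighbor) then exceeds the 0.5 retuning cost.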
Abstract:
Due to significant population growth and people's natural desire to improve their standard of living, the use of energy extracted from world commodities, especially in the form of electricity, has increased sharply over the last decades. This raises a challenge with no easy solution: how to guarantee that there will be enough energy to satisfy the demand of the world's population. Among the possible solutions that can be adopted to mitigate this problem, one is almost mandatory: rationalizing energy use, so that waste is minimized and the available energy can be leveraged over a longer period of time. One way to achieve this is to improve the power distribution grid so that it can react more efficiently to common issues, such as energy demand peaks or inaccurate electricity consumption forecasts. Implementing this improvement, however, requires technologies from the ICT (Information and Communication Technologies) sphere that often present challenges in some key areas: advanced metering infrastructure integration, interoperability and interconnectivity of devices, interfaces offered to applications, design of security measures, etc. All these challenges may slow the adoption of the smart grid as a system for prolonging the lifespan and utilization of the available energy. This Master's Thesis puts forward a proposal for an intermediation architecture that makes it possible to solve these challenges. An implementation is also included, together with the tests carried out to measure the performance of the presented concepts, showing that the challenges posed by the smart grid can be resolved.
Abstract:
Computed tomography (CT) is the reference imaging modality for the study of lung diseases and the pulmonary vasculature. Lung vessel segmentation has been widely explored by the biomedical image processing community; however, differentiating arterial from venous irrigations is still an open problem. Indeed, automatic separation of the arterial and venous trees has been considered in recent years one of the main future challenges in the field. Artery-vein (AV) segmentation would be useful in different medical scenarios and in multiple pulmonary diseases or pathological states, allowing the arterial and venous irrigations to be studied separately. Features such as the density, geometry, topology, and size of vessels could be analyzed in diseases that involve vasculature remodeling, even making possible the discovery of new specific biomarkers that remain hidden today. Differentiating between arteries and veins could also improve methods that process pulmonary structures. Nevertheless, AV segmentation has so far been unfeasible in clinical routine despite its undoubted usefulness. The huge complexity of the pulmonary vascular trees makes a manual separation of both structures unfeasible in a realistic time, further encouraging the design of automatic or semiautomatic tools to perform the task. However, the lack of properly labeled cases seriously limits the development of AV segmentation systems, where reference standards are necessary in both the training and validation stages of the algorithms.
For that reason, the design of synthetic CT images of the lung could overcome these difficulties by providing a database of pseudorealistic cases in a constrained and controlled scenario where each part of the image (including arteries and veins) is unequivocally differentiated. In this Ph.D. Thesis we address both of these interrelated problems. First, we describe the design of a complete framework to automatically generate computational CT phantoms of the human lung. Starting from biological and image-based knowledge about the topology of and relationships between pulmonary structures, the system is able to generate synthetic pulmonary arteries, veins, and airways using iterative growth methods, which are then merged into a final simulated lung with realistic features. These synthetic cases, together with labeled real CT datasets, have been used as references for the development of a fully automatic pulmonary AV segmentation/separation method. The approach comprises a vessel extraction stage using scale-space particles, followed by artery-vein classification of those particles using Graph-Cuts (GC) based on arterial/venous similarity scores, obtained with a Machine Learning (ML) pre-classification step, and on particle connectivity information. Validation of the pulmonary phantoms, through visual examination and quantitative measurements of intensity distributions, dispersion of structures, and relationships between the pulmonary air and blood flow systems, shows good correspondence between real and synthetic lungs. The evaluation of the AV segmentation algorithm, based on different strategies for assessing the accuracy of vessel particle classification, reveals accurate differentiation between arteries and veins in both real and synthetic cases, opening a wide range of possibilities for the clinical study of cardiopulmonary diseases and the development of methodological approaches for the analysis of pulmonary images.
Abstract:
Emotion is generally held to influence the behavior of living systems, largely in terms of flexibility and adaptivity. The way living systems act in response to particular situations in the environment has revealed the decisive importance of this feature in the success of behaviors, and this source of inspiration has influenced the way artificial systems are conceived. During the last decades, artificial systems have evolved to the point that more of them are integrated into our daily life every day. They have grown in complexity, and the consequence is an increased demand for systems that ensure resilience, robustness, availability, security, and safety, among other properties; all of these raise quite fundamental challenges in control design. This thesis has been developed within the framework of the Autonomous Systems project (also known as the ASys-Project). Its short-term objectives of immediate application focus on designing improved systems and bringing intelligence into control strategies; its long-term objectives concentrate on higher-order capabilities such as cognition, awareness, and autonomy. The thesis is placed within the general fields of engineering and emotion science, and provides a theoretical foundation for engineering and designing computational emotion for artificial systems. The starting question grounding this thesis addresses the problem of emotion-based autonomy, and how to feed systems back with valuable meaning forms the general objective. Both have underlain the study of emotion: its influence on system behavior, the key foundations that justify this feature in living systems, how emotion is integrated into normal operation, and how this entire problem of emotion can be recast for artificial systems.
Assuming essential differences in structure, purpose, and operation between living and artificial systems, the essential motivation has been to explore what emotion solves in nature and then to analyze analogies for man-made systems. This work provides a reference model in which a collection of entities, relationships, models, functions, and informational artifacts interact to provide the system with non-explicit knowledge in the form of emotion-like relevances. The aim is a reference model under which to design solutions for emotional operation that are related to the real needs of artificial systems. The proposal consists of a multi-purpose architecture that implements two broad modules in order to attend to (a) the range of processes related to how the environment affects the system, and (b) the range of processes related to emotion-like perception and the higher levels of reasoning. This has required an intense and critical analysis, beyond the state of the art, of the most relevant theories of emotion and of technical systems, in order to obtain the required support for the foundations that sustain each model. The problem has been interpreted and described on the basis of AGSys, an agent assumed to have the minimum rationality needed to perform emotional assessment. AGSys is a conceptualization of a model-based cognitive agent that embodies an inner agent, ESys, responsible for performing the emotional operation inside AGSys. The solution consists of multiple computational modules working in federation to form a mutual feedback loop between AGSys and ESys. Throughout this solution, the environment and the effects that might influence the system are described as different problems: while AGSys operates as a common system within the external environment, ESys is designed to operate within a conceptualized inner environment.
This inner environment is built from the relevances that might occur inside AGSys in its interaction with the external environment. This allows high-quality, separate reasoning about the mission goals defined in AGSys and the emotional goals defined in ESys, providing a possible path for high-level reasoning under the influence of goal congruence. The high-level reasoning model uses knowledge about the stability of the emotional goals, opening new directions in which mission goals might be assessed under the situational state of that stability. This high-level reasoning is grounded in MEP, a model of emotion perception conceived as an analogy to a well-known theory in emotion science. The model operates through a recursive process labeled the R-Loop, together with a system of emotional goals that are treated as individual agents. In this way, AGSys integrates knowledge concerning the relation between a perceived object and the effect that this perception induces on the situational state of the emotional goals. This knowledge enables a higher-order system of information that sustains high-level reasoning; the extent to which such reasoning might be pursued is only delineated here and left as future work. The thesis has drawn on a long range of fields of knowledge, structured around two main objectives: (a) psychology, cognitive science, neurology, and the biological sciences, to obtain an understanding of the emotional phenomenon; and (b) a large number of computer science branches, such as Autonomic Computing (AC), self-adaptive software, self-X systems, Model Integrated Computing (MIC), and the paradigm of models@runtime, among others, to obtain knowledge about tools for designing each part of the solution.
The final approach builds on all of the acquired knowledge and is described within the fields of Artificial Intelligence and Model-Based Systems (MBS), with additional mathematical formalizations where precise understanding was required. It describes a reference model for feeding systems back with valuable meaning, allowing reasoning about (a) the relationship between the environment and the relevance of its effects on the system, and (b) dynamic evaluations of the inner situational state of the system as a result of those effects. This reasoning yields a framework of distinguishable states of AGSys, derived from its own circumstances, that can be regarded as artificial emotion.
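The AGSys/ESys mutual feedback loop can be caricatured in a few lines of Python. This toy sketch is an assumption for illustration only: the names AGSys and ESys come from the thesis, but the update rule, the `stability` scalar, and its coupling to `mission_priority` are invented here, not the thesis's actual models.

```python
class ESys:
    """Inner agent: appraises relevances reported by AGSys."""
    def __init__(self):
        self.stability = 1.0      # situational stability of emotional goals

    def appraise(self, relevance):
        # highly relevant events destabilize the emotional goals
        self.stability = max(0.0, self.stability - 0.1 * relevance)
        return self.stability

class AGSys:
    """Outer agent: embeds ESys and closes the mutual feedback loop."""
    def __init__(self):
        self.esys = ESys()
        self.mission_priority = 1.0

    def perceive(self, event_relevance):
        # relevance flows in to ESys; ESys's appraisal flows back and
        # modulates how strongly AGSys commits to its mission goals
        self.mission_priority = self.esys.appraise(event_relevance)

agent = AGSys()
agent.perceive(3.0)                      # a highly relevant disturbance
print(round(agent.mission_priority, 2))  # 0.7
```

The point of the sketch is the two-way coupling: external relevances reshape the inner environment, and the resulting appraisal state feeds back into how the outer agent weighs its mission goals.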
Abstract:
An emerging topic in plant biology is whether plants display analogous elements of mammalian programmed cell death during development and defense against pathogen attack. In many plant–pathogen interactions, plant cell death occurs in both susceptible and resistant host responses. For example, specific recognition responses in plants trigger formation of the hypersensitive response and activation of host defense mechanisms, resulting in restriction of pathogen growth and disease development. Several studies indicate that cell death during hypersensitive response involves activation of a plant-encoded pathway for cell death. Many susceptible interactions also result in host cell death, although it is not clear how or if the host participates in this response. We have generated transgenic tobacco plants to express animal genes that negatively regulate apoptosis. Plants expressing human Bcl-2 and Bcl-xl, nematode CED-9, or baculovirus Op-IAP transgenes conferred heritable resistance to several necrotrophic fungal pathogens, suggesting that disease development required host–cell death pathways. In addition, the transgenic tobacco plants displayed resistance to a necrogenic virus. Transgenic tobacco harboring Bcl-xl with a loss-of-function mutation did not protect against pathogen challenge. We also show that discrete DNA fragmentation (laddering) occurred in susceptible tobacco during fungal infection, but does not occur in transgenic-resistant plants. Our data indicate that in compatible plant–pathogen interactions apoptosis-like programmed cell death occurs. Further, these animal antiapoptotic genes function in plants and should be useful to delineate resistance pathways. These genes also have the potential to generate effective disease resistance in economically important crops.
Abstract:
A murine model for antigen-induced bronchial hyperreactivity (BHR) and airway eosinophilia, two hallmarks of asthma, was developed using ovalbumin-immunized mice, which produce large amounts of IgE (named BP2, "Bons Producteurs 2," for High Line of Selection 2). A single intranasal ovalbumin challenge failed to modify the bronchial responses, despite the intense eosinophil recruitment into the bronchoalveolar lavage fluid and airways. When mice were challenged twice a day for 2 days or once a day for 10 days, BHR in response to i.v. 5-hydroxytryptamine or to inhaled methacholine was induced in BP2 mice but not in BALB/c mice. Histological examination showed that eosinophils reached the respiratory epithelium after multiple ovalbumin challenges in BP2 mice but remained in the bronchial submucosa in BALB/c mice. Total IgE titers in serum were augmented significantly with immunization in both strains, but much more so in BP2 mice. Interleukin 5 (IL-5) titers in serum and bronchoalveolar lavage fluid of BP2 mice were augmented by the antigenic provocation, and a specific anti-IL5 neutralizing antibody suppressed altogether airway eosinophilia and BHR, indicating a participation of IL-5 in its development. Our results indicate that the recruitment of eosinophils to the airways alone does not induce BHR in mice and that the selective effect on BP2 mice is related to their increased IgE titers associated with antigen-driven eosinophil migration to the epithelium, following formation and secretion of IL-5.
Abstract:
Advances in digital speech processing are now supporting application and deployment of a variety of speech technologies for human/machine communication. In fact, new businesses are rapidly forming about these technologies. But these capabilities are of little use unless society can afford them. Happily, explosive advances in microelectronics over the past two decades have assured affordable access to this sophistication as well as to the underlying computing technology. The research challenges in speech processing remain in the traditionally identified areas of recognition, synthesis, and coding. These three areas have typically been addressed individually, often with significant isolation among the efforts. But they are all facets of the same fundamental issue--how to represent and quantify the information in the speech signal. This implies deeper understanding of the physics of speech production, the constraints that the conventions of language impose, and the mechanism for information processing in the auditory system. In ongoing research, therefore, we seek more accurate models of speech generation, better computational formulations of language, and realistic perceptual guides for speech processing--along with ways to coalesce the fundamental issues of recognition, synthesis, and coding. Successful solution will yield the long-sought dictation machine, high-quality synthesis from text, and the ultimate in low bit-rate transmission of speech. It will also open the door to language-translating telephony, where the synthetic foreign translation can be in the voice of the originating talker.
Abstract:
This thesis analyzes the crime of human trafficking in Europe, with particular attention to the phenomenon of sexual exploitation. The research was conducted partly within the project "FIDUCIA. New European crimes and trust-based policy" (www.fiduciaproject.eu). The thesis consists of five chapters. The first chapter introduces the crime of trafficking in human beings, first at the global level and then specifically in Europe. It presents the determining factors and origins of the phenomenon, and provides its definitions and main characteristics, in line with the most important international documents on the subject. The chapter closes with a statistical overview, which also addresses the difficulties of collecting crime data. The second chapter analyzes the approach currently adopted at the domestic and European levels against trafficking. The measures are presented first from a theoretical point of view; concrete examples are then given, such as international conventions and directives, but also research projects and international collaborations between authorities and NGOs. The third chapter focuses on trafficking for the purpose of sexual exploitation. It analyzes the potential link with prostitution and the European approach. This is followed by a closer examination of the legal models implemented in Europe and a comparative study of five member states representative of the various models of prostitution regulation (Italy, Belgium, Poland, Germany, and Sweden). The fourth chapter collects interviews conducted with various experts engaged in countering trafficking: Italian and foreign NGOs; the national anti-trafficking rapporteurs of Italy, Belgium, and Germany; FRONTEX; and members of the European Parliament. The conclusions first offer an overall assessment of the current framework and then several recommendations for national governments and supranational bodies.
In particular, given the goal of a homogeneous and coordinated response to trafficking (specifically for sexual exploitation) at the European level, it is argued that a uniform regulatory model of prostitution across member states could help improve the uniformity and effectiveness of the European approach to trafficking.
Abstract:
Novice therapists training in Acceptance and Commitment Therapy (ACT) may encounter challenges in therapy in which their own personal history functions as a barrier to flexible modes of therapeutic engagement with the therapist. From the ACT perspective, counter-therapeutic interpersonal responses may be examined relative to six behavioral sub-processes. It is suggested that the most vulnerable moments for the therapist will involve those in which certain contextual features of therapy pull historical awareness of a painful personal past into relation with the psychological present. This paper hypothesizes that utilizing approaches based in ACT will assist therapists in overcoming these challenges and will illustrate how to approach case formulation and intervention with therapists in training from a functional contextualistic perspective. To begin, the philosophical and theoretical underpinnings of ACT will be outlined in sufficient depth to intellectually ground the model and its therapeutic project. This conceptual foundation will then be brought to applied focus using hypothetical case material, followed by ACT interventions designed to increase clinical flexibility in the given therapeutic scenario. Future research that systematically examines the effectiveness of such methods among therapists is encouraged.
Abstract:
This paper analyzes the learning experiences and opinions of a group of undergraduate students interacting with several online multimedia resources included in a free online course on Computer Networks. The educational resources employed are based on the Web 2.0 approach, such as blogs, videos, and virtual labs, and have been added to a website for distance self-learning.
Abstract:
Customizing shoe manufacturing is one of the great challenges in the footwear industry. It is a change in the production model in which design plays not only the main role but also forms the main bottleneck, so it is necessary to accelerate this process by improving the accuracy of current methods. Rapid prototyping techniques are based on reusing manufactured footwear lasts so that they can be modified with CAD systems, leading rapidly to new shoe models. In this work, we present a fast shoe last reconstruction method that fits current design and manufacturing processes. The method scans the shoe last to obtain sections and establishes a fixed number of landmarks on those sections to reconstruct the 3D surface of the shoe last. Automated landmark extraction is accomplished with a self-organizing network, the growing neural gas (GNG), which is able to topographically map the low dimensionality of the network to the high dimensionality of the contour manifold without requiring a priori knowledge of the structure of the input space. Moreover, our GNG landmark method is tolerant to noise and eliminates outliers. Our method accelerates the surface reconstruction and filtering processes used by current shoe last design software by up to 12 times, and offers higher accuracy than methods of similar efficiency, such as voxel grid.
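As a rough illustration of the landmark-extraction step, the following is a minimal growing neural gas in pure Python. It is a simplified version of Fritzke's standard algorithm (node removal and stopping criteria are omitted), not the paper's implementation; the circular "section" sampler is a hypothetical stand-in for real scanned shoe-last contours, and all parameter values are illustrative.

```python
import math
import random

def gng(sample, steps=5000, max_nodes=30, eb=0.2, en=0.006,
        max_age=50, insert_every=100, alpha=0.5, d=0.995):
    """Simplified growing neural gas: returns node positions and edges."""
    nodes = [list(sample()), list(sample())]    # node positions (2-D)
    error = [0.0, 0.0]                          # accumulated error per node
    edges = {}                                  # (i, j), i < j  ->  age

    def dist2(w, x):
        return (w[0] - x[0]) ** 2 + (w[1] - x[1]) ** 2

    for step in range(1, steps + 1):
        x = sample()
        # find the two nearest nodes
        s1, s2 = sorted(range(len(nodes)), key=lambda i: dist2(nodes[i], x))[:2]
        error[s1] += dist2(nodes[s1], x)
        # move the winner (and, more weakly, its topological neighbours) toward x
        for k in (0, 1):
            nodes[s1][k] += eb * (x[k] - nodes[s1][k])
        for (i, j) in list(edges):
            if s1 in (i, j):
                n = j if i == s1 else i
                for k in (0, 1):
                    nodes[n][k] += en * (x[k] - nodes[n][k])
                edges[(i, j)] += 1              # age edges incident to s1
        edges[tuple(sorted((s1, s2)))] = 0      # refresh the winner-pair edge
        for e in [e for e, age in edges.items() if age > max_age]:
            del edges[e]                        # drop stale edges
        # periodically insert a node where accumulated error is largest
        if step % insert_every == 0 and len(nodes) < max_nodes:
            q = max(range(len(nodes)), key=lambda i: error[i])
            nbrs = [j if i == q else i for (i, j) in edges if q in (i, j)]
            if nbrs:
                f = max(nbrs, key=lambda i: error[i])
                nodes.append([(nodes[q][k] + nodes[f][k]) / 2 for k in (0, 1)])
                r = len(nodes) - 1
                edges.pop((min(q, f), max(q, f)), None)
                edges[(min(q, r), max(q, r))] = 0
                edges[(min(f, r), max(f, r))] = 0
                error[q] *= alpha
                error[f] *= alpha
                error.append(error[q])
        error = [e * d for e in error]          # decay all errors
    return nodes, edges

# Hypothetical stand-in for one scanned shoe-last section: a noisy circle.
random.seed(0)
def section():
    t = random.uniform(0.0, 2.0 * math.pi)
    return (math.cos(t) + random.gauss(0, 0.05),
            math.sin(t) + random.gauss(0, 0.05))

landmarks, topology = gng(section)
```

The returned node positions play the role of landmarks on a section, and the surviving edges describe the topology the network has learned from the contour samples.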
Abstract:
One of the main challenges in biological conservation has been to understand species distribution across space and time. Over the last decades, many diversity and conservation surveys have been conducted that have revealed that habitat heterogeneity acts as a major factor determining saproxylic assemblages. However, temporal dynamics have been poorly studied, especially in Mediterranean forests. We analyzed saproxylic beetle distribution at inter- and intra-annual scales in a “dehesa” ecosystem, a traditional Iberian agrosilvopastoral ecosystem characterized by the presence of old and scattered trees that dominate the landscape. Significant differences in effective numbers of families/species and species richness were found at the inter-annual scale, but this was not the case for composition. Temperature and relative humidity did not explain these changes, which were mainly due to the presence of rare species. At the intra-annual scale, significant differences in the effective numbers of families/species, species richness and composition between seasons were found, and diversity partitioning revealed that season contributed significantly to gamma-diversity. Saproxylic beetle assemblages exhibited a marked seasonality in richness but not in abundance, with two peaks of activity, the highest between May and June, and the second between September and October. This pattern is mainly driven by the seasonality of the climate in the Mediterranean region, which influences ecosystem dynamics and imposes a marked seasonality on insect assemblages. An extended sampling period over different seasons allowed an overview of saproxylic dynamics, and revealed which families/species were restricted to particular seasons. Recognizing that season acts as a driver shaping saproxylic beetle assemblages might be a valuable tool in monitoring and for conservation strategies in Mediterranean forests.
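The "effective numbers of families/species" and the partitioning of gamma-diversity into seasonal components can be made concrete with Hill numbers. The sketch below uses the order-1 (Shannon) formulation and Jost-style multiplicative partitioning with equal season weights; the beetle counts are invented for illustration and are not the paper's data.

```python
import math

def shannon(counts):
    """Shannon entropy (natural log) of an abundance list."""
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c > 0)

def effective_species(counts):
    """Hill number of order 1: exp(Shannon entropy)."""
    return math.exp(shannon(counts))

# Hypothetical beetle counts per species, one list per season
# (invented numbers, equal sampling effort in each season).
spring = [30, 10, 5, 5]
autumn = [2, 8, 20, 20]

# Multiplicative partition: alpha from the mean entropy,
# gamma from the pooled counts, beta = gamma / alpha.
alpha = math.exp((shannon(spring) + shannon(autumn)) / 2)
gamma = effective_species([a + b for a, b in zip(spring, autumn)])
beta = gamma / alpha   # 1 = identical seasons, 2 = fully distinct
print(round(beta, 3))  # 1.277
```

A beta value well above 1, as in this invented example, is the kind of signal the abstract describes: the seasons hold partly distinct assemblages, so season contributes to gamma-diversity.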
Abstract:
The end of 2015 was the deadline that 189 countries gave themselves to achieve the United Nations Millennium Development Goals (MDGs), a list of eight goals agreed upon and approved by the UN after the Millennium Summit in the year 2000. Despite some legitimate criticism, the MDGs proved to be an important tool towards building a more equitable and sustainable world. Yet our planet still faces many challenges. In September 2015, the UN approved a new set of 17 goals, the Sustainable Development Goals (SDGs), aiming to develop and implement strategies to create “The Future We Want”: strategies that 192 countries agreed to pursue together towards a more sustainable planet.
Abstract:
The global financial crisis, which started in the summer of 2007 and deepened in the aftermath of the Lehman failure in September 2008, has led to a virtual collapse in economic activity and increased financial volatility worldwide. For the developing countries, the main channel of transmission has been a drop in external transactions, such as trade, financial and capital flows, and remittances. The emerging economies in the southern and eastern Mediterranean have also faced declining economic activity, although there seems to be considerable variation in the relative magnitude and timing. Most of these economies have shown a delayed but more lasting response to the crisis, driven mostly by their close trade and investment ties with the EU and the Gulf Cooperation Council (GCC) countries. This book explores the fiscal, monetary and financial effects of the crisis in the region and provides an in-depth analysis of the fiscal, monetary and banking policies in the post-crisis era, the viability of their exit strategies and the future of reforms in the region. These analyses not only provide a comprehensive comparison between the countries but also provide a solid basis for assessing future economic and financial developments and reforms in the region.
Abstract:
Ukraine’s parliamentary elections on 26 October 2014 seem set to be the most important and most challenging the country has ever held. The presidential election of Petro Poroshenko in May gave many Ukrainians new hope. His victory seemed to unite the country: for the first time in Ukraine’s history, a president had won in most of the regions, despite the ongoing conflict in the East. However, with many corrupt elites still in power, reforms have become hostage to vested interests and in-fighting, which has raised fears of ‘business as usual’. The election campaign is therefore set against a backdrop of serious challenges that will dominate the agenda for the foreseeable future. In this policy brief, Amanda Paul and Svitlana Kobzar explore the status of the reform agenda needed for a stable and democratic Ukraine, as well as the challenges in the run-up to the election, including corruption, energy and EU relations.