905 results for interviewer effects, multi-level, random interviewer assignment, panel survey, political opinion
Abstract:
This study examines process-outcome relationships of a cognitive-behavioral group therapy for diabetes and depression within the DAD study.
Because suitable instruments for valid, economical, and complementary session ratings by group patients and therapists were lacking, two therapist session questionnaires were developed, modeled on an existing patient questionnaire (GTS-P): the GTS-T for rating the group as a whole and the GTS-TP for rating individual patients. Psychometric evaluation shows overall good item parameters and reliabilities for the GTS questionnaires. The two-factor model identified in exploratory factor analyses of the GTS-P (1. perceived confidence regarding the group therapy, 2. perceived personal involvement) is confirmed in confirmatory factor analyses. For this purpose, GTS-P data from a study of patients with somatoform disorders (Schulte, 2001) were included. Following the results of the item and factor analyses, two items of the GTS-P and two further items of the GTS-T were excluded from the instruments. The GTS-T shows a one-factor structure, while the GTS-TP shows a two-factor structure parallel to the GTS-P.
In the multi-level analyses predicting treatment outcome (post-treatment depressive symptoms), the GTS-P Confidence scale at the start of therapy (sessions 1-4), controlled for the Involvement scale and for pre-treatment symptoms, emerges as a valid predictor. Item 5 ("suggestions"; Confidence scale) and item 2 ("active participation"; Involvement scale) contribute most strongly to this effect, since this combination of items can also validly predict treatment outcome. Prediction is already possible from the ratings of the first group sessions in the remoralization phase (Howard et al., 1993) and does not improve when all 10 group sessions are taken into account. The therapist questionnaires show no predictive validity. Meaningful associations between patient and therapist ratings are found only for the GTS-P and GTS-TP. Further predictors, such as diabetes type, diabetes complications, and adherence, did not improve prediction. Among the secondary criteria, prediction succeeded only for one additional measure of depressive symptoms and for the patients' overall rating of the group therapy at the end of treatment. Descriptive examination of the process quality of the DAD group therapies shows positive rating trajectories that increase over the course of the group and can be differentiated by therapy phase.
The results of the study support the relevance of non-specific therapeutic factors for the outcome of cognitive-behavioral group therapies. The confidence and involvement perceived by group patients, as signs of responsiveness to therapy, should be monitored by group therapists using session questionnaires such as the GTS forms in order to optimize the therapeutic process and to prevent dropouts and treatment failures.
Abstract:
The regulation and governance of higher education institutions has been subject to considerable reform dynamics at least since the liberalization of the German Framework Act for Higher Education (Hochschulrahmengesetz) in 1998. University autonomy, New Public Management, profile building, excellence, and competition are central keywords of the political reforms and programs that have been carried out.
The politically intended expansion of organizational self-governance of universities confronts higher education institutions with considerable challenges and can be regarded as a paradigm shift in higher education governance. In the scholarly debate, this change is also described as a strengthening of "managerial governance" (e.g., de Boer et al. 2007) or as a transformation of universities into "more complete organizations" (Brunsson/Sahlin-Andersson 2000) or "organisational actors" (Krücken/Meier 2006).
At the same time, knowledge remains rather fragmentary about how this changed regulatory context is taken up by governance actors in German universities, i.e., whether an expansion of organizational self-governance actually takes place at the organizational level, which governance initiatives and instruments prove effective, and why this is the case. The present study pursues these questions in a comparative case study of six universities.
The empirical investigation centers on 60 qualitative social-science interviews with leadership actors at the university and faculty level. These data are complemented by extensive document analyses, in particular of annual reports, basic regulations, strategy and planning documents, as well as data from official higher education statistics. The research design also allows a comparison of large and small universities, and of institutions with a technical and natural-science orientation versus those with a focus on the humanities and social sciences. The study shows that at five of the six universities examined, a partly pronounced expansion of organizational self-governance can be observed, although the specific organizational character of universities, i.e., largely loose coupling with autonomous professionals, is essentially retained. The synthesis of the changes yields an ideal-typical model of change in the strategy, structure, and culture of higher education institutions. On the basis of the empirical results, central external and internal factors influencing this specific organizational change are analyzed. Finally, costs and benefits as well as risks and opportunities of the governance reforms in higher education are weighed against each other.
Abstract:
In this dissertation, quantum-chemical investigations of electronic energy transfer were carried out. On the one hand, theoretical models for treating temperature-dependent electron-phonon coupling in vibronic spectra were developed and subjected to numerical tests. On the other hand, molecular properties of bichromophoric systems were determined using established computational methods. The focus lies on the interplay of electronic coupling and static disorder, on energy transfer times, and on the influence of molecular bridges in dimers on the coupling. Since electronic energy transfer can be detected spectroscopically, temperature-dependent simulations of the line shape of vibronic transitions coupled to a heat bath were performed. The response function required to determine the spectral line shape can be derived from a cumulant expansion and, alternatively, from multi-level Redfield theory. Instead of the approximate vibrational structure of the Brownian oscillator model, an explicitly computed density of states was used as the starting point. Both pure electron-phonon and vibration-phonon coupling are discussed for different spectral densities of the bath modes. As part of a collaborative project, we investigated the electronic coupling in a homologous series of rylene dimers with different bridge lengths. For this purpose, results from low-temperature single-molecule measurements and from quantum-chemical calculations based on the vibronic coupling model were used and evaluated. The dimers studied show a transition from the strong-coupling to the weak-coupling limit, and the mean energy transfer times could be calculated in good agreement with experimental values. Since a molecular bridge between donor and acceptor units modifies the electronic coupling, it can interfere with experimental measurements. We therefore investigated whether the interchromophoric coupling behavior is governed primarily by the polarizability of the bridging element or by bond-mediated interactions, and which bridge types are consequently suitable for experimental studies. All bridge elements studied increased the electronic coupling, and the coupling strength was determined predominantly by bridge-mediated interactions.
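The cumulant-expansion route to a temperature-dependent line shape mentioned above can be illustrated with a short numerical sketch. The Python snippet below is not taken from the dissertation: the Ohmic spectral density, temperature, and all parameter values are illustrative assumptions, and prefactor conventions for J(w) and g(t) differ between texts.

```python
import numpy as np

kB = 0.695  # Boltzmann constant in cm^-1 per K

def lineshape_g(t, omega, J, T):
    """Second-order cumulant line-shape function g(t) for a harmonic bath.

    g(t) = (1/pi) * int dw J(w)/w^2 [coth(w/2kBT)(1 - cos wt) + i(sin wt - wt)]
    (one common convention; prefactors vary between texts).
    """
    wt = np.outer(t, omega)                       # shape (n_t, n_w)
    coth = 1.0 / np.tanh(omega / (2.0 * kB * T))
    integrand = J / omega**2 * (coth * (1.0 - np.cos(wt))
                                + 1j * (np.sin(wt) - wt))
    return np.trapz(integrand, omega, axis=1) / np.pi

# Ohmic spectral density with exponential cutoff (illustrative parameters)
omega = np.linspace(0.1, 2000.0, 2000)            # bath frequencies / cm^-1
lam, omega_c = 100.0, 200.0                       # reorganization energy, cutoff / cm^-1
J = np.pi * lam / omega_c * omega * np.exp(-omega / omega_c)

t = np.linspace(0.0, 3.0, 2000)                   # time in 1/cm^-1 (schematic units)
g = lineshape_g(t, omega, J, T=300.0)

# Absorption line shape: half-sided Fourier transform of exp(-g(t)) about the gap
detuning = np.linspace(-1500.0, 2500.0, 400)      # (omega - omega_eg) / cm^-1
spectrum = np.array([np.trapz(np.real(np.exp(1j * dw * t - g)), t)
                     for dw in detuning])
```

Raising the temperature T broadens the resulting line shape, which is the qualitative effect the temperature-dependent electron-phonon coupling models aim to capture.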
Abstract:
In this research work I examined Transition Theory and, more specifically, the development of a possible sustainability Living Lab in the university context. First, I analyzed the current situation regarding sustainable development in a general context. I also had to analyze the indices we use to define human well-being and on which we base our entire economy, such as GDP. Second, I defined Transition Theory in general terms, listing its various application tools and methods. Finally, I sought to apply Transition Theory to sustainability in the university context, using the Transition projects carried out through the Living Labs "Terracini in Transizione" at the University of Bologna and the "GOU Living Lab" at Utrecht University. From the results obtained, I identified the limits and the potential of these Living Lab projects by means of a SWOT analysis, which highlighted the need to establish a group within the University of Bologna responsible for managing green Transition projects, similar to the situation I encountered at Utrecht University with its Green Office.
Abstract:
According to the European Environment Agency, one of the main threats to the freshwater resources of Italian coastal areas is saltwater intrusion. The objective of this master's thesis is the hydrogeological characterization of a portion of the coastal phreatic aquifer located in two different dune bodies. The five-month investigation highlighted differences between an area under strong anthropogenic pressure (Marina Romea) and an area showing relatively natural development of the beach-dune system (Porto Corsini). The sampling technique used is the minifilter system (multi-level samplers), an innovative methodology that allows rapid monitoring and precise, point-wise multi-level sampling. The monitoring campaign included water-table measurements, electrical conductivity, and chemical analyses of the waters, which led to their geochemical classification. The results show that the aquifer is strongly salinized, the freshwater layers are confined to shallow lenses, and the water types present are dominated by sodium and chloride ions. Of the two sites, Marina Romea proves to be the more vulnerable for several reasons: coastal erosion that thins the dune belt responsible for freshwater recharge, a smaller spatial extent of the dune compared to Porto Corsini, the presence of tourist infrastructure that has fragmented the dune, the proximity of the drainage canal that causes deep saline waters to rise, and the presence of water-demanding tree species that draw on and thus thin the freshwater lenses. It is proposed to improve groundwater quality through better management of the drainage canal, by replacing some pines with shrub species typical of dune environments, and finally by imposing water-saving measures during the tourist season.
Abstract:
PURPOSE OF REVIEW: Mechanical ventilation is a cornerstone of ICU treatment. Because of its interaction with blood flow and intra-abdominal pressure, mechanical ventilation has the potential to alter hepato-splanchnic perfusion, abdominal organ function and thereby outcome of the most critically ill patients. RECENT FINDINGS: Mechanical ventilation can alter hepato-splanchnic perfusion, but the effects are minimal (with moderate inspiratory pressures, tidal volumes, and positive end-expiratory pressure levels) or variable (with high ones). Routine nursing procedures may cause repeated episodes of inadequate hepato-splanchnic perfusion in critically ill patients, but an association between perfusion and multiple organ dysfunction cannot yet be determined. Clinical research continues to be challenging as a result of difficulties in measuring hepato-splanchnic blood flow at the bedside. SUMMARY: Mechanical ventilation and attempts to improve oxygenation, such as intratracheal suctioning and recruitment maneuvers, may have harmful consequences in patients with already limited cardiovascular reserves or deteriorated intestinal perfusion. Due to difficulties in assessing hepato-splanchnic perfusion, such effects are often not detected.
Abstract:
Pathological complete response (pCR) to neoadjuvant treatment correlates with outcome in breast cancer. We determined whether characteristics of neoadjuvant therapy are associated with pCR. We used multi-level models, which accounted for heterogeneity in pCR across trials and trial arms, to analyze individual patient data from 3332 women included in 7 German neoadjuvant trials with uniform protocols. pCR was associated with an increase in the number of chemotherapy cycles (odds ratio [OR] 1.2 for every two additional cycles; P = 0.009), with higher cumulative anthracycline doses (OR 1.6; P = 0.002), with higher cumulative taxane doses (OR 1.6; P = 0.009), and with capecitabine-containing regimens (OR 1.62; P = 0.022). The association of pCR with the number of cycles appeared more pronounced in hormone receptor (HR)-positive tumors (OR 1.35) than in HR-negative tumors (OR 1.04; P for interaction = 0.046). The effect of anthracycline dose was particularly pronounced in HER2-negative tumors (OR 1.61) compared to HER2-positive tumors (OR 0.83; P for interaction = 0.14). Simultaneous trastuzumab treatment in HER2-positive tumors increased the odds of pCR 3.2-fold (P < 0.001). No association between pCR and the number of trastuzumab cycles was found (OR 1.20, P = 0.39). Dosing characteristics appear important for successful treatment of breast cancer. Longer treatment, higher cumulative doses of anthracyclines and taxanes, and the addition of capecitabine and trastuzumab are associated with better response. Tailoring according to breast cancer phenotype might be possible: longer treatment in HR-positive tumors, higher cumulative anthracycline doses for HER2-negative tumors, shorter treatment at higher cumulative doses for triple-negative tumors, and a limited number of preoperative trastuzumab cycles in HER2-positive tumors.
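As a rough illustration of the kind of multi-level model described here, the sketch below fits a logistic model with a trial-level random intercept to simulated data. The data, covariate names (cycles, hr_positive), and effect sizes are invented for the example, and statsmodels' BinomialBayesMixedGLM is simply one accessible way to fit such a model in Python; it is not the method used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n_trials, n_per_trial = 7, 400
trial = np.repeat(np.arange(n_trials), n_per_trial)
trial_effect = rng.normal(0.0, 0.4, n_trials)[trial]   # heterogeneity across trials

cycles = rng.choice([6, 8, 10], size=trial.size)        # chemotherapy cycles
hr_positive = rng.integers(0, 2, size=trial.size)       # hormone receptor status

# True model: more cycles -> higher log-odds of pCR (illustrative effect sizes)
logit = -1.5 + 0.09 * (cycles - 6) - 0.4 * hr_positive + trial_effect
pcr = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame({"pcr": pcr, "cycles": cycles,
                   "hr_positive": hr_positive, "trial": trial})

model = BinomialBayesMixedGLM.from_formula(
    "pcr ~ cycles + hr_positive",
    {"trial": "0 + C(trial)"},       # random intercept per trial
    df)
result = model.fit_vb()
print(result.summary())              # fixed-effect posteriors ~ log odds ratios
```

Exponentiating the fixed-effect coefficients gives odds ratios comparable in form (not in value) to those reported in the abstract.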
Abstract:
We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment and socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems.
Abstract:
This study examined the impact of the Nursing Home Reform Act of 1987 on resident- and facility-level risk factors for physical restraint use in nursing homes. Data on the 1990 and 1993 cohorts were obtained from 268 facilities in 10 states, and data on a 1996 cohort were obtained from the Medical Expenditure Panel Survey, which sampled more than 800 nursing homes nationwide. Multivariate logistic regression models were generated for each cohort to identify the impact of resident- and facility-level risk factors for restraint use. The results indicate that the use of physical restraints continues to decline. Thirty-six percent of the 1990 cohort, 26 percent of the 1993 cohort, and 17 percent of the 1996 cohort were physically restrained. Although there was a reduced rate of restraint use from 1990 to 1996, similar resident-level factors but different facility-level factors were associated with restraint use at different points in time.
Abstract:
Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays, to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
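A minimal sketch of fitting the proportional hazards model mentioned above, using the third-party lifelines package on simulated data; the covariates, effect sizes, and censoring scheme are illustrative assumptions, not anything from DRC's work.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
age = rng.normal(60, 10, n)
treated = rng.integers(0, 2, n)

# Simulate exponential event times whose hazard depends on the covariates.
hazard = 0.01 * np.exp(0.03 * (age - 60) - 0.5 * treated)
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.exponential(80.0, n)

df = pd.DataFrame({
    "time": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "age": age,
    "treated": treated,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratios for age and treatment
```

The partial likelihood underlying this fit is a classic instance of inference that depends as little as possible on a nuisance quantity, here the baseline hazard.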
Abstract:
Genome-wide association studies (GWAS) are used to discover genes underlying complex, heritable disorders for which less powerful study designs have failed in the past. The number of GWAS has skyrocketed recently, with findings reported in top journals and the mainstream media. Microarrays are the genotype-calling technology of choice in GWAS, as they permit exploration of more than a million single nucleotide polymorphisms (SNPs) simultaneously. The starting point for the statistical analyses used by GWAS to determine association between loci and disease are genotype calls (AA, AB, or BB). However, the raw data, microarray probe intensities, are heavily processed before arriving at these calls. Various sophisticated statistical procedures have been proposed for transforming raw data into genotype calls. We find that variability in microarray output quality across different SNPs, different arrays, and different sample batches has substantial influence on the accuracy of genotype calls made by existing algorithms. By failing to account for these sources of variability, GWAS run the risk of adversely affecting the quality of reported findings. In this paper we present solutions based on a multi-level mixed model. A software implementation of the method described in this paper is available as free and open-source code in the crlmm R/BioConductor package.
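As a toy illustration of genotype calling from processed intensities (not a reimplementation of crlmm's multi-level mixed model), the sketch below clusters simulated log-ratio intensities for a single SNP into AA/AB/BB calls with a three-component Gaussian mixture; all values and cluster centers are invented for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Simulate log2(allele A / allele B) intensities for one SNP across 300 samples:
# three clusters centered roughly at +2 (AA), 0 (AB), and -2 (BB).
true_calls = rng.choice([0, 1, 2], size=300, p=[0.25, 0.5, 0.25])
centers = np.array([2.0, 0.0, -2.0])
log_ratio = rng.normal(centers[true_calls], 0.35).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(log_ratio)
labels = gmm.predict(log_ratio)

# Map mixture components to genotype names by the ordering of their means.
comp_to_geno = np.empty(3, dtype=object)
comp_to_geno[np.argsort(-gmm.means_.ravel())] = ["AA", "AB", "BB"]
genotype = comp_to_geno[labels]
print(genotype[:10])
```

Batch- or array-level shifts in the simulated intensities would blur these clusters, which is exactly the kind of variability the paper's multi-level model is meant to absorb.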
Abstract:
A patient-specific surface model of the proximal femur plays an important role in planning and supporting various computer-assisted surgical procedures, including total hip replacement, hip resurfacing, and osteotomy of the proximal femur. The common approach to deriving 3D models of the proximal femur is to use imaging techniques such as computed tomography (CT) or magnetic resonance imaging (MRI). However, the high logistical effort, the extra radiation exposure (CT imaging), and the large quantity of data to be acquired and processed make them less practical. In this paper, we present an integrated approach using a multi-level point distribution model (ML-PDM) to reconstruct a patient-specific model of the proximal femur from sparse data available intra-operatively. Results of experiments performed on dry cadaveric bones using dozens of 3D points are presented, as well as experiments using a limited number of 2D X-ray images, which demonstrate promising accuracy of the present approach.
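The sketch below illustrates the basic point-distribution-model idea under simplifying assumptions: a PCA shape model is learned from simulated training point sets, and a full shape is recovered from a sparse subset of points by least squares on the mode coefficients. The multi-level refinement and the 2D X-ray fitting described in the paper are not reproduced here, and the "training shapes" are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
n_shapes, n_points = 40, 100

# Toy training set: noisy variations of a 3D point cloud (illustrative only).
mean_true = rng.normal(size=(n_points, 3))
modes_true = rng.normal(size=(2, n_points, 3))
weights = rng.normal(size=(n_shapes, 2))
training = (mean_true + np.einsum("sk,kpd->spd", weights, modes_true)
            + 0.02 * rng.normal(size=(n_shapes, n_points, 3)))

# Build the PDM: mean shape + principal modes of variation.
X = training.reshape(n_shapes, -1)
mean_shape = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
n_modes = 2
P = Vt[:n_modes].T                      # (3*n_points, n_modes) mode matrix

# Sparse reconstruction: only a few digitized points are available intra-operatively.
observed_idx = rng.choice(n_points, size=12, replace=False)
target = training[0]                    # stand-in for the patient's true shape
obs = target[observed_idx]

# Solve for the mode coefficients b from the observed coordinates (least squares).
rows = (observed_idx[:, None] * 3 + np.arange(3)).ravel()
b, *_ = np.linalg.lstsq(P[rows], obs.ravel() - mean_shape[rows], rcond=None)
reconstruction = (mean_shape + P @ b).reshape(n_points, 3)

print("mean reconstruction error:",
      np.linalg.norm(reconstruction - target, axis=1).mean())
```

In practice the observed points must first be registered rigidly to the model frame; that step is omitted here for brevity.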
Abstract:
Transformers are very important elements of any power system. Unfortunately, they are subjected to through-faults and abnormal operating conditions which can affect not only the transformer itself but also other equipment connected to the transformer. Thus, it is essential to provide sufficient protection for transformers as well as the best possible selectivity and sensitivity of the protection. Nowadays, microprocessor-based relays are widely used to protect power equipment. Current differential and voltage protection strategies are used in transformer protection applications and provide fast and sensitive multi-level protection and monitoring. The elements responsible for detecting turn-to-turn and turn-to-ground faults are the negative-sequence percentage differential element and the restricted earth-fault (REF) element, respectively. During severe internal faults, current transformers can saturate and slow down relay operation, which affects the degree of equipment damage. The scope of this work is to develop a modeling methodology to perform simulations and laboratory tests for internal faults such as turn-to-turn and turn-to-ground faults for two step-down power transformers with capacity ratings of 11.2 MVA and 290 MVA. The simulated current waveforms are injected into a microprocessor relay to check its sensitivity to these internal faults. Saturation of current transformers is also studied in this work. All simulations are performed with the Alternative Transients Program (ATP) utilizing the internal fault model for three-phase two-winding transformers. The tested microprocessor relay is the SEL-487E current differential and voltage protection relay. The results showed that the ATP internal fault model can be used for testing microprocessor relays for any percentage of turns involved in an internal fault. An interesting observation from the experiments was that the SEL-487E relay is more sensitive to turn-to-turn faults than advertised for the transformers studied. The sensitivity of the restricted earth-fault element was confirmed. CT saturation cases showed that low-accuracy CTs can saturate for turn-to-turn faults involving a high percentage of turns, where the CT burden affects the extent of saturation. Recommendations for future work include more accurate simulation of internal faults, transformer energization inrush, and other scenarios involving core saturation, using the newest version of the internal fault model. The SEL-487E relay or other microprocessor relays should again be tested for performance. Also, application of a grounding bank to the delta-connected side of a transformer will increase the zone of protection, and relay performance can be tested for internal ground faults on both sides of a transformer.
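For readers unfamiliar with differential protection, the sketch below shows the generic percentage (biased) differential criterion on which such elements are built; the pickup and slope settings and the per-unit test currents are illustrative assumptions and do not represent the SEL-487E's actual characteristics or algorithm.

```python
import numpy as np

def percentage_differential_trip(i_w1, i_w2, pickup=0.3, slope=0.25):
    """Return True if the percentage differential element operates.

    i_w1, i_w2 : per-unit phasors of the winding-1 and winding-2 currents,
                 already ratio- and phase-compensated so that through current
                 ideally cancels (i_w1 + i_w2 ~ 0 for load and external faults).
    """
    i_op = abs(i_w1 + i_w2)                  # operate (differential) current
    i_rt = 0.5 * (abs(i_w1) + abs(i_w2))     # restraint current
    return i_op > pickup and i_op > slope * i_rt

# Through-load condition: currents cancel, element stays secure.
print(percentage_differential_trip(1.0 + 0j, -(1.0 + 0j)))            # False

# Internal fault: current flows into the zone from both sides, element operates.
print(percentage_differential_trip(1.0 + 0j, 0.8 * np.exp(1j * 0.1)))  # True
```

The slope term keeps the element secure when CT errors (including saturation) produce spurious differential current during heavy through-faults, which is why CT saturation is studied alongside the relay's sensitivity.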
Abstract:
High concentrations of fluoride naturally occurring in the ground water in the Arusha region of Tanzania cause dental, skeletal and non-skeletal fluorosis in up to 90% of the region's population [1]. Symptoms of this incurable but completely preventable disease include brittle, discolored teeth, malformed bones and stiff and swollen joints. The consumption of high-fluoride water has also been proven to cause headaches and insomnia [2] and to adversely affect the development of children's intelligence [3, 4]. Despite the fact that this array of symptoms may significantly impact a society's development and the citizens' ability to perform work and enjoy a reasonable quality of life, little is offered in the Arusha region in the form of solutions for the poor, those hardest hit by the problem. Multiple defluoridation technologies do exist, yet none are successfully reaching the Tanzanian public. This report takes a closer look at the efforts of one local organization, the Defluoridation Technology Project (DTP), to address the region's fluorosis problem through the production and dissemination of bone char defluoridation filters, an appropriate technology solution that is proven to work. The goal of this research is to improve the sustainability of DTP's operations and help them reach a wider range of clients so that they may reduce the occurrence of fluorosis more effectively. This was done first through laboratory testing of current products. Results of this testing show a wide range in uptake capacity across batches of bone char, emphasizing the need to modify kiln design in order to produce a more consistent and high-quality product. The issue of filter dissemination was addressed through the development of a multi-level, customer-funded business model promoting the availability of filters to Tanzanians of all socioeconomic levels. Central to this model is the recommendation to focus on community-managed, institution-sized filters in order to make fluoride-free water available to lower-income clients and to increase Tanzanian involvement at the management level.
Abstract:
Approaching Switzerland as a "laboratory" for democracy, this Handbook contributes to a refined understanding of the res publica. Over the years, the Handbook of Swiss Politics has established itself as a classic work. This new and extended second edition of the Handbook comprises 32 chapters, all by leading Swiss political scientists. The contributors write about fundamentals, institutions, interest groups, political parties, new social movements, the cantons and municipalities, elections, popular votes, policy processes and public policies. They address several important issues in the current international debates, such as the internationalization of domestic politics, multi-level governance, and the role of metropolitan agglomerations. Nine new chapters enrich this second, completely updated edition. The section on public policies has been significantly extended and covers a dozen policy domains. Grounded in the latest scientific knowledge, this volume also serves as an indispensable reference for a non-academic audience of decision-makers, diplomats, senior officials and journalists.