42 results for "Coffee certification schemes"


Relevance: 10.00%

Abstract:

BACKGROUND: Several studies have explored physicians' attitudes towards prevention and barriers to the delivery of preventive health interventions. However, the relative importance of these previously identified barriers, both in general terms and in the context of a number of specific preventive interventions, has not been identified. Certain barriers may only pertain to a subset of preventive interventions. OBJECTIVES: We aimed to determine the relative importance of identified barriers to preventive interventions and to explore the association between physicians' characteristics and their attitudes towards prevention. METHODS: We conducted a cross-sectional survey of 496 of the 686 (72.3% response rate) generalist physicians from three Swiss cantons through a questionnaire asking physicians to rate the general importance of eight preventive health strategies and the relative importance of seven commonly cited barriers in relation to each specific preventive health strategy. RESULTS: The proportion of physicians rating each preventive intervention as being important varied from 76% for colorectal cancer screening to 100% for blood pressure control. Lack of time and lack of patient interest were generally considered to be important barriers by 41% and 44% of physicians, respectively, but the importance of these two barriers tended to be specifically higher for counselling-based interventions. Lack of training was most notably a barrier to counselling about alcohol and nutrition. Four characteristics of physicians predicted negative attitudes toward alcohol and smoking counselling: consumption of more than three alcoholic drinks per day [odds ratio (OR) = 8.4], sedentary lifestyle (OR = 3.4), lack of national certification (OR = 2.2) and lack of awareness of their own blood pressure (OR = 2.0). CONCLUSIONS: The relative importance of specific barriers varies across preventive interventions. 
This points to a need for tailored practice interventions targeting the specific barriers that impede a given preventive service. The negative influence of physicians' own health behaviours indicates a need for associated population-based interventions that reduce the prevalence of high-risk behaviours in the population as a whole.
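The odds ratios above come from the study's multivariate model, which cannot be reproduced from the abstract alone. As a hedged illustration of the underlying arithmetic, the sketch below computes an odds ratio from a 2x2 table; all counts are invented, chosen only so the result lands near the reported OR of 8.4 for heavy drinking.

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio from a 2x2 table: odds of the outcome among the
    exposed divided by the odds among the unexposed."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts (NOT the study's data): exposure = >3 alcoholic
# drinks/day, outcome = negative attitude toward alcohol counselling.
or_drinking = odds_ratio(21, 9, 100, 360)   # ≈ 8.4
```

The study's reported ORs additionally adjust for covariates, which a raw 2x2 table cannot do.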

Relevance: 10.00%

Abstract:

Introduction: Recommendations for statin use for primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy using three scoring algorithms currently used in Switzerland. Methods: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Lausanne, Switzerland. We compared the 10-year CHD risk using three scoring schemes, i.e., the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology (ESC) guidelines. With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD >20%; with SCORE, as a 10-year risk of fatal CVD >=5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increasing statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. Results: The proportion of participants classified as high-risk (both genders) was 5.8% according to FRS and 3.0% according to PROCAM, whereas the European risk SCORE classified 12.5% as high-risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because the ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III, but moderate with ESC (Figure).
From a population perspective, full compliance with the ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, versus 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). Conclusion: Full compliance with guidelines for statin therapy would result in substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring system and corresponding guidelines used to estimate CHD risk in Switzerland.
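Once a scoring algorithm has produced a risk estimate, the high-risk classification quoted above is mechanical. The sketch below applies those cut-offs and reproduces the deaths-averted arithmetic from the abstract's own figures; the function name and inputs are illustrative assumptions, and the scoring algorithms themselves (FRS, PROCAM, SCORE) are not reproduced.

```python
def is_high_risk(scheme, risk):
    """Apply the high-risk cut-offs quoted in the abstract to a
    pre-computed 10-year risk estimate given as a fraction (e.g. 0.21)."""
    if scheme in ("FRS", "PROCAM"):
        return risk > 0.20        # >20% 10-year fatal or non-fatal CHD risk
    if scheme == "SCORE":
        return risk >= 0.05       # >=5% 10-year fatal CVD risk
    raise ValueError(f"unknown scheme: {scheme}")

# Deaths averted under full ATP III compliance, using the abstract's figures:
deaths_expected = 24310                          # CHD deaths over 10 years, Switzerland
averted_atp3 = round(0.179 * deaths_expected)    # 17.9% reduction ≈ 4351 deaths
```

Note the asymmetry baked into the guidelines: FRS/PROCAM use a strict inequality on a combined CHD endpoint, while SCORE uses an inclusive threshold on fatal CVD only.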

Relevance: 10.00%

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. Knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: the potential toxic and safety hazards of nanomaterials throughout their lifecycles; the fate and persistence of nanoparticles in humans, animals and the environment; the risks associated with nanoparticle exposure; participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; the development of best practice guidelines; voluntary schemes on responsibility; and databases of materials, research topics and themes. The findings show that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Consequently, NIN will encourage stakeholders to be active members. These survey findings will be used to improve NIN's communication tools and to further build interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance: 10.00%

Abstract:

Introduction: Lesion detection in multiple sclerosis (MS) is an essential part of its clinical diagnosis. In addition, radiological characterisation of MS lesions is an important research field that aims at distinguishing different MS types, monitoring drug response and prognosis. To date, various MR protocols have been proposed to obtain optimal lesion contrast for early and comprehensive diagnosis of MS. In this study, we compare the sensitivity of five different MR contrasts for lesion detection: (i) the DIR sequence (Double Inversion Recovery, [4]), (ii) the Dark-fluid SPACE acquisition scheme, a 3D variant of the 2D FLAIR sequence [1], (iii) the MP2RAGE [2], an MP-RAGE variant that provides homogeneous T1 contrast and quantitative T1 values, and the sequences currently used for clinical MS diagnosis (2D FLAIR, MP-RAGE). Furthermore, we investigate the T1 relaxation times of cortical and sub-cortical regions in the brain hemispheres and the cerebellum at 3T. Methods: 10 early-stage female MS patients (age: 31.6 ± 4.7 y; disease duration: 3.8 ± 1.9 y; disability score, EDSS: 1.8 ± 0.4) and 10 healthy controls (age- and gender-matched: 31.2 ± 5.8 y) were included in the study after obtaining informed written consent according to the local ethics protocol. All experiments were performed at 3T (Magnetom Trio, a Tim System, Siemens, Germany) using a 32-channel head coil [5]. The imaging protocol included the following sequences (all except the axial 2D FLAIR with 1x1x1.2 mm3 voxels and a 256x256x160 matrix): DIR (TI1/TI2/TR XX/3652/10000 ms, iPAT=2, TA 12:02 min); MP-RAGE (TI/TR 900/2300 ms, iPAT=3, TA 3:47 min); MP2RAGE (TI1/TI2/TR 700/2500/5000 ms, iPAT=3, TA 8:22 min, cf. [2]); 3D FLAIR SPACE (only for patients 4-6, TI/TR 1800/5000 ms, iPAT=2, TA 5:52 min, cf. [1]); axial FLAIR (0.9x0.9x2.5 mm3, 256x256x44 matrix, TI/TR 2500/9000 ms, iPAT=2, TA 4:05 min).
Lesions were identified by two experienced raters (a neurologist and a radiologist), manually contoured and assigned to regional locations (see Table 1). Regional lesion masks (RLM) from each contrast were compared for number and volume of lesions. In addition, RLM were merged into a single "master" mask, which represented the sum of the lesions of all contrasts. T1 values were derived for each location from this mask for patients 5-10 (the 3D FLAIR contrast was missing for patients 1-4). Results & Discussion: The DIR sequence appears to be the most sensitive for total lesion count, followed by the MP2RAGE (Table 1). The 3D FLAIR SPACE sequence turns out to be more sensitive than the 2D FLAIR, presumably due to reduced partial volume effects. For sub-cortical hemispheric lesions, the DIR contrast appears to be equally sensitive to the MP2RAGE and SPACE, but it is the most sensitive for cerebellar MS plaques. The DIR sequence is also the one that reveals cortical hemispheric lesions best. T1 relaxation times at 3T in the WM and GM of the hemispheres and the cerebellum, as obtained with the MP2RAGE sequence, are shown in Table 2. Extending previous studies, we confirm overall longer T1 values and higher standard deviations in lesion tissue compared to non-lesion tissue and to control tissue in healthy controls. We hypothesize a biological origin (different degrees of axonal loss and demyelination) rather than a technical one. Conclusion: In this study, we applied five MR contrasts, including two novel sequences, to investigate which contrast has the highest sensitivity for early MS diagnosis. In addition, we characterized for the first time the T1 relaxation time in cortical and sub-cortical regions of the hemispheres and the cerebellum. The results are in agreement with previous publications and allow a meaningful biological interpretation of the data.

Relevance: 10.00%

Abstract:

Based on the SYMPLICITY studies and CE (Conformité Européenne) certification, renal denervation is currently applied as a novel treatment of resistant hypertension in Europe. However, information on the proportion of patients with resistant hypertension qualifying for renal denervation after a thorough work-up and treatment adjustment remains scarce. The aim of this study was to investigate the proportion of patients eligible for renal denervation and the reasons for noneligibility at 11 expert centers participating in the European Network COordinating Research on renal Denervation in treatment-resistant hypertension (ENCOReD). The analysis included 731 patients. Age averaged 61.6 years, office blood pressure at screening was 177/96 mm Hg, and the number of blood pressure-lowering drugs taken was 4.1. Specialists referred 75.6% of patients. The proportion of patients eligible for renal denervation according to the SYMPLICITY HTN-2 criteria and each center's criteria was 42.5% (95% confidence interval, 38.0%-47.0%) and 39.7% (36.2%-43.2%), respectively. The main reasons for noneligibility were normalization of blood pressure after treatment adjustment (46.9%), unsuitable renal arterial anatomy (17.0%), and previously undetected secondary causes of hypertension (11.1%). In conclusion, after careful screening and treatment adjustment at hypertension expert centers, only ≈40% of patients referred for renal denervation, mostly by specialists, were eligible for the procedure. The most frequent cause of ineligibility (approximately half of cases) was blood pressure normalization after treatment adjustment by a hypertension specialist. Our findings highlight that hypertension centers with a record of clinical experience and research should remain the gatekeepers before renal denervation is considered.

Relevance: 10.00%

Abstract:

This article examines the extent and limits of non-state forms of authority in international relations. It analyses how the information and communication technology (ICT) infrastructure for the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. Companies and associations provide training and certification programmes as part of a growing market for educational services, setting their own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasised that the consent of actors subject to informal rules, and explicit or implicit state recognition, remains crucial for the effectiveness of these new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues and fail to fully explore the differentiated space in which non-state authority is emerging. This paper examines the form of authority underpinning the global knowledge-based economy within the broader perspective of the issues likely to be standardised by technical ICT specifications, the wide range of actors involved, and the highly differentiated space where standards become authoritative. The empirical findings highlight the role of different private actors in establishing international educational norms in this field. They also pinpoint the limits of profit-oriented standard-setting, notably with regard to generic norms.

Relevance: 10.00%

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. Knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:

• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.

These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, a workshop was organised by NIN focused on building a sustainable multi-stakeholder dialogue. Specific questions were asked of different stakeholder groups to encourage discussion and open communication.

1. What information do stakeholders need from researchers, and why? The discussions of this question confirmed the needs identified in the targeted phone calls.

2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics is needed for a well-informed treatment of this communication issue.

3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced, so there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling of products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since the issue of nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, and this may have an impact on nanotechnology as a whole.

4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.
NIN will continue with an active stakeholder dialogue to further build on interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance: 10.00%

Abstract:

Research on individual social policy preferences has highlighted a number of socio-structural cleavages as determinants. Studies investigating public opinion on the various redistributive schemes that make up today's welfare states have shown the relevance of class-related factors such as income or education as key explanatory variables (Ferrera 1993; Taylor-Gooby 1995, 1998; Svallfors 1997). More recent studies, however, have suggested that other factors are also likely to play a role. Among these, the most important are age, gender, and individual values (Armingeon 2006; Deitch 2004; Roller 2000, 2002). The scenario that emerges from the existing literature is one of multiple intersecting cleavages, but it remains unclear what the relative weight and specific impact of each of these cleavages is today.

Relevance: 10.00%

Abstract:

The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous (elliptic or parabolic) problems; it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned for the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at any iteration (so that conservative fluxes can be obtained).
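The MsFV operators themselves are beyond the scope of an abstract, but the iterative structure described — a coarse-scale correction used as a preconditioner inside GMRES — can be sketched generically. The sketch below uses a plain two-level preconditioner on a 1D Poisson problem as an assumed stand-in for the MsFV coarse operator; it illustrates the preconditioned-GMRES idea, not the MsFV method itself.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, gmres

# Fine-scale 1D Poisson problem (a stand-in for the pressure equation).
n = 63
A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Linear-interpolation prolongation P from a coarse grid of nc points.
nc = 31
P = np.zeros((n, nc))
for j in range(nc):
    P[2 * j + 1, j] = 1.0   # coarse point
    P[2 * j, j] = 0.5       # left fine neighbour
    P[2 * j + 2, j] = 0.5   # right fine neighbour
Ac = P.T @ (A @ P)          # Galerkin coarse-scale operator

def two_level(r):
    """Coarse-grid correction plus damped fine-scale smoothing."""
    xc = np.linalg.solve(Ac, P.T @ r)   # solve the small coarse problem
    return P @ xc + 0.5 * r             # additive combination

M = LinearOperator((n, n), matvec=two_level)
x, info = gmres(A, b, M=M)              # GMRES with two-level preconditioner
```

The MsFV schemes additionally guarantee a zero coarse-grid residual at every iteration, a property this generic two-level sketch does not reproduce.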

Relevance: 10.00%

Abstract:

OBJECTIVE: To evaluate the effect of vouchers for maternity care in public health-care facilities on the utilization of maternal health-care services in Cambodia. METHODS: The study involved data from the 2010 Cambodian Demographic and Health Survey, which covered births between 2005 and 2010. The effect of voucher schemes, first implemented in 2007, on the utilization of maternal health-care services was quantified using a difference-in-differences method that compared changes in utilization in districts with voucher schemes with changes in districts without them. FINDINGS: Overall, voucher schemes were associated with an increase of 10.1 percentage points (pp) in the probability of delivery in a public health-care facility; among women from the poorest 40% of households, the increase was 15.6 pp. Vouchers were responsible for about one fifth of the increase observed in institutional deliveries in districts with schemes. Universal voucher schemes had a larger effect on the probability of delivery in a public facility than schemes targeting the poorest women. Both types of schemes increased the probability of receiving postnatal care, but the increase was significant only for non-poor women. Universal, but not targeted, voucher schemes significantly increased the probability of receiving antenatal care. CONCLUSION: Voucher schemes increased deliveries in health centres and, to a lesser extent, improved antenatal and postnatal care. However, schemes that targeted poorer women did not appear to be efficient since these women were more likely than less poor women to be encouraged to give birth in a public health-care facility, even with universal voucher schemes.
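The difference-in-differences logic used above is simple to state: the change in voucher districts minus the contemporaneous change in non-voucher districts. A hedged sketch with invented cell means (not the survey's data, which would also require covariates and survey weights):

```python
# Facility-delivery rates by district group and period (illustrative numbers).
rates = {
    ("voucher",    "before"): 0.30,
    ("voucher",    "after"):  0.55,
    ("no_voucher", "before"): 0.32,
    ("no_voucher", "after"):  0.42,
}

def did(rates):
    """DiD estimate: change in treated districts minus change in controls,
    so that a common secular trend is differenced away."""
    treated_change = rates[("voucher", "after")] - rates[("voucher", "before")]
    control_change = rates[("no_voucher", "after")] - rates[("no_voucher", "before")]
    return treated_change - control_change

effect = did(rates)   # 0.25 - 0.10 = 0.15, i.e. 15 percentage points
```

The key identifying assumption is parallel trends: absent the vouchers, delivery rates in both groups of districts would have moved in parallel.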

Relevance: 10.00%

Abstract:

Purpose of the study: Basic life support (BLS) and automated external defibrillation (AED) represent important skills to be acquired during pregraduate medical training. Three years ago, our medical school introduced a BLS-AED course (with certification) for all second-year medical students. Few reports about the quality and persistence over time of BLS-AED learning are available to date in the medical literature. A comprehensive evaluation of students' acquired skills was performed at the end of the 2008 academic year, 6 months after certification. Materials and methods: The students (N = 142) were evaluated during a 9-minute «objective structured clinical examination» (OSCE) station. Based on a standardized scenario, they had to recognize a cardiac arrest situation and start a resuscitation process. Their performance was recorded on a PC using an Ambu Man(TM) mannequin and the Ambu CPR software kit(TM) during a minimum of 8 cycles (30 compressions : 2 ventilations each). BLS parameters were systematically checked. No student-rater interactions were allowed during the whole evaluation. Results: Response of the victim was checked by 99% of the students (N = 140); 96% (N = 136) called for an ambulance and/or an AED. Opening the airway and checking breathing were done by 96% (N = 137); 92% (N = 132) gave 2 rescue breaths. Pulse was checked by 95% (N = 135); 100% (N = 142) began chest compressions, 96% (N = 136) within 1 minute. Chest compression rate was 101 ± 18 per minute (mean ± SD), compression depth 43 ± 8 mm, and 97% (N = 138) respected a compression:ventilation ratio of 30:2. Conclusions: The quality of BLS skills acquisition is maintained over a 6-month period after BLS-AED certification. The main targets of the 2005 AHA guidelines were well respected. This analysis represents one of the largest evaluations of specific BLS teaching efficiency reported. Further follow-up is needed to assess the persistence of these skills over a longer time period, notably at the end of the pregraduate medical curriculum.

Relevance: 10.00%

Abstract:

BACKGROUND: From the most recent available data, we projected cancer mortality statistics for 2014 for the European Union (EU) and its six more populous countries. Specific attention was given to pancreatic cancer, the only major neoplasm showing unfavorable trends in both sexes. PATIENTS AND METHODS: Population and death certification data for stomach, colorectal, pancreatic, lung, breast, uterine and prostate cancers, leukemias and total cancers were obtained from the World Health Organisation database and Eurostat. Figures were derived for the EU, France, Germany, Italy, Poland, Spain and the UK. Projected numbers of deaths by age group for 2014 were obtained by linear regression on the estimated numbers of deaths over the most recent time period identified by a joinpoint regression model. RESULTS: In the EU in 2014, 1,323,600 deaths from cancer are predicted (742,500 men and 581,100 women), corresponding to standardized death rates of 138.1/100,000 men and 84.7/100,000 women, falling by 7% and 5%, respectively, since 2009. In men, predicted rates for the three major cancers (lung, colorectal and prostate cancer) are lower than in 2009, falling by 8%, 4% and 10%, respectively. In women, breast and colorectal cancers had favorable trends (-9% and -7%), but female lung cancer rates are predicted to rise 8%. Pancreatic cancer is the only neoplasm with a negative outlook in both sexes. Only in the young (25-49 years) do EU trends become more favorable in men, while women keep registering slight predicted rises. CONCLUSIONS: Cancer mortality predictions for 2014 confirm the overall favorable cancer mortality trend in the EU, translating to an overall 26% fall in men since its peak in 1988, and 20% in women, and the avoidance of over 250,000 deaths in 2014 compared with the peak rate. Notable exceptions are female lung cancer and pancreatic cancer in both sexes.
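The projection step described — linear regression of death counts over the most recent trend segment — can be sketched as follows. The year/count numbers are invented placeholders (not WHO data), and the joinpoint step that selects the segment, as well as the age-group stratification, is omitted.

```python
# Hypothetical annual death counts over a recent trend segment.
years = [2005, 2006, 2007, 2008, 2009]
deaths = [1400, 1390, 1375, 1368, 1355]

# Ordinary least-squares fit of deaths on calendar year.
n = len(years)
xbar = sum(years) / n
ybar = sum(deaths) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(years, deaths))
         / sum((x - xbar) ** 2 for x in years))
intercept = ybar - slope * xbar

# Extrapolate the fitted line to the target year.
projected_2014 = slope * 2014 + intercept
```

In the study this is done separately per age group, and the projections are then combined into age-standardized rates.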

Relevance: 10.00%

Abstract:

Fine particulate matter from traffic increases mortality and morbidity. An important source of traffic particles is brake wear. American studies reported cars to emit brake wear particles at a rate of about 11 mg/km to 20 mg/km of driven distance. A German study estimated that brake wear contributes about 12.5% to 21% of total traffic particle emissions. The goal of this study was to build a system that allows the study of brake wear particle emissions during different braking behaviours of different car and brake types. The particles were to be characterized in terms of size, number, metal content, and elemental and organic carbon composition. In addition, the influence of different deceleration schemes on particle composition and size distribution was to be studied. Finally, the system should allow exposing human cell cultures to these particles. An exposure box (0.25 m3 volume) was built that can be mounted around a car's braking system. This allows exposing cells to fresh brake wear particles. Concentrations of particle numbers, mass and surface, metals, and carbon compounds were quantified. Tests were conducted with A549 lung epithelial cells. Five different cars and two typical braking behaviours (full stop and normal deceleration) were tested. Particle number and size distribution were analysed for the first six minutes, during which two braking events occurred. Full stop produced significantly higher particle concentrations than normal deceleration (average of 23,000 vs. 10,400 particles/cm3, p = 0.016). The particle number distribution was bi-modal, with one peak at 60 to 100 nm (depending on the tested car and braking behaviour) and a second peak at 200 to 400 nm. Metal concentrations varied depending on the tested car type. Iron (range of 163 to 15,600 μg/m3) and manganese (range of 0.9 to 135 μg/m3) were present in all samples, while copper was absent in some samples (<6 to 1220 μg/m3). The overall "fleet" metal ratio was Fe:Cu:Mn = 128:14:1.
Temperature and humidity varied little. A549 cells were successfully exposed in the various experimental settings and retained their viability. Culture supernatant was stored and cell culture samples were fixed to test for inflammatory response. Analysis of these samples is ongoing. The established system allowed testing brake wear particle emissions from real-world cars. The large variability in chemical composition and emitted amounts of brake wear particles between car models seems to be related to differences in brake pad composition between producers. Initial results suggest that the conditions inside the exposure box allow exposing human lung epithelial cells to freshly produced brake wear particles.
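The abstract does not state which statistical test produced the reported p-value for the full-stop vs. normal-deceleration comparison; a nonparametric two-sample comparison such as the Mann-Whitney U test is one plausible sketch, shown below on invented concentration readings, not the study's measurements.

```python
from scipy.stats import mannwhitneyu

# Hypothetical particle number concentrations (particles/cm3) per braking mode.
full_stop = [21000, 25000, 23000, 24000, 22000]
normal_deceleration = [10000, 11000, 10500, 9800, 10700]

# Two-sided Mann-Whitney U test: are the two samples drawn from
# distributions with the same location?
stat, p_value = mannwhitneyu(full_stop, normal_deceleration,
                             alternative="two-sided")
full_stop_higher = (sum(full_stop) / len(full_stop)
                    > sum(normal_deceleration) / len(normal_deceleration))
```

A rank-based test is a natural choice here because particle concentrations are typically right-skewed across repeated braking events.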

Relevance: 10.00%

Abstract:

Natural genetic variation can have a pronounced influence on human taste perception, which in turn may influence food preference and dietary choice. Genome-wide association studies (GWAS) represent a powerful tool to understand this influence. To help optimize the design of future GWAS on human taste perception, we used the well-known TAS2R38-PROP association as a tool to determine the relative power and efficiency of different phenotyping and data-analysis strategies. The results show that the choice of both data-collection and data-processing schemes can have a very substantial impact on the power to detect genotypic variation that affects chemosensory perception. Based on these results we provide practical guidelines for the design of future GWAS on chemosensory phenotypes. Moreover, in addition to the TAS2R38 gene, past studies have implicated a number of other genetic loci in taste sensitivity to PROP and the related bitter compound PTC. None of these other loci showed genome-wide significant associations in our study. To facilitate further, target-gene-driven studies on PROP taste perception, we provide the genome-wide list of p-values for all SNPs genotyped in the current study.
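The association testing behind such a study can be sketched with a simple genotype-by-phenotype contingency test, checked against the conventional genome-wide significance threshold of 5e-8. All counts below are hypothetical, and real GWAS typically use regression models with covariates rather than a bare chi-square test.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts for a TAS2R38-like SNP:
# rows = genotype (PAV/PAV, PAV/AVI, AVI/AVI), cols = (taster, non-taster).
table = [
    [90, 10],
    [80, 40],
    [15, 65],
]
chi2, p, dof, expected = chi2_contingency(table)

GENOME_WIDE = 5e-8   # conventional genome-wide significance threshold
significant = p < GENOME_WIDE
```

The 5e-8 threshold is roughly a Bonferroni correction for one million independent tests, which is why a single strong candidate such as TAS2R38 can reach it while weaker secondary loci, as in this study, may not.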

Relevance: 10.00%

Abstract:

The specificities of multinational corporations (MNCs) have to date not been a focus area of IS research. Extant literature mostly proposes IS configurations for specific types of MNCs, following a static and prescriptive approach. Our research seeks to explain the dynamics of global IS design. It suggests a new theoretical lens for studying global IS design by applying the structural adjustment paradigm from organizational change theories. Relying on archetype theory, we conduct a longitudinal case study to theorize the dynamics of IS adaptation. We find that global IS design emerges as an organizational adaptation process to balance interpretative schemes (i.e. the organization's values and beliefs) and structural arrangements (i.e. strategic, organizational, and IS configurations). The resulting insights can be used as a basis to further explore alternative global IS designs and movements between them.