982 results for Data and Information


Relevance:

100.00%

Publisher:

Abstract:

The atmosphere is a global influence on the movement of heat and humidity between the continents, and thus significantly affects climate variability. Information about atmospheric circulation is therefore of major importance for understanding different climatic conditions. Dust deposits from maar lakes and dry maars of the Eifel Volcanic Field (Germany) are used here as proxy data for the reconstruction of past aeolian dynamics.

In this thesis, two sediment cores from the Eifel region are examined: core SM3 from Lake Schalkenmehren and core DE3 from the Dehner dry maar. Both cores contain the tephra of the Laacher See eruption, which is dated to 12,900 years before present. Taken together, the cores cover the last 60,000 years: SM3 the Holocene, and DE3 the marine isotope stages MIS-3 and MIS-2, respectively. The frequencies of glacial dust storm events and their paleo wind directions are detected by high-resolution grain-size and provenance analysis of the lake sediments. Two different methods are applied: geochemical measurement of the sediment by µXRF scanning, and the particle analysis method RADIUS (rapid particle analysis of digital images by ultra-high-resolution scanning of thin sections).

It is shown that single dust layers in the lake sediment are characterized by an increased content of aeolian-transported carbonate particles. The limestone-bearing Eifel North-South zone is the most likely source of the carbonate-rich aeolian dust in the lake sediments of the Dehner dry maar. Since the dry maar is located on the western side of the Eifel North-South zone, carbonate-rich aeolian sediment was most likely transported towards it by easterly winds. A methodology is developed, the RADIUS-carbonate module, which limits detection to the aeolian-transported carbonate particles in the sediment.

In summary, during marine isotope stage MIS-3 both the storm frequency and the east-wind frequency are increased in comparison to MIS-2. These results suggest that atmospheric circulation was more turbulent during MIS-3 than during the more stable full glacial conditions of MIS-2.

The results of the dust-record investigations are finally evaluated in relation to a study of atmospheric general circulation models for a comprehensive interpretation. AGCM experiments (ECHAM3 and ECHAM4) with different prescribed SST patterns are used to develop a synoptic interpretation of long-persisting east-wind conditions and of east-wind storm events, which are suggested to lead to enhanced accumulation of sediment transported by easterly winds to the proxy site of the Dehner dry maar.

The basic observations made on the proxy record are also reflected in the 10 m wind vectors of the different model experiments under glacial conditions with different prescribed sea surface temperature patterns. Furthermore, analysis of long-persisting east-wind conditions in the AGCM data shows a stronger seasonality under glacial conditions: all experiments are characterized by an increase in the relative importance of the LEWIC during spring and summer. The different glacial experiments consistently show a shift of a long-lasting high from over the Baltic Sea towards the north-west, directly above the Scandinavian Ice Sheet, together with a contemporaneously enhanced westerly circulation over the North Atlantic.

This thesis is a comprehensive analysis of atmospheric circulation patterns during the last glacial period, and it has been possible to reconstruct important elements of the glacial paleoclimate in Central Europe. While the proxy data from the sediment cores yield only a binary signal of wind-direction changes (east versus west wind), a synoptic interpretation using atmospheric circulation models succeeds in showing a possible distribution of high- and low-pressure areas, and thus the direction and strength of the wind fields capable of transporting dust. In conclusion, combining numerical models, which enhance our understanding of processes in the climate system, with proxy data from the environmental record is the key to a comprehensive approach to paleoclimatic reconstruction.

Relevance:

100.00%

Publisher:

Abstract:

SMARTDIAB is a platform designed to support the monitoring, management, and treatment of patients with type 1 diabetes mellitus (T1DM) by combining state-of-the-art approaches in the fields of database (DB) technologies, communications, simulation algorithms, and data mining. SMARTDIAB consists mainly of two units: 1) the patient unit (PU); and 2) the patient management unit (PMU), which communicate with each other for data exchange. The PMU can be accessed by the PU through the internet using devices such as PCs/laptops with direct internet access, or mobile phones via a Wi-Fi/General Packet Radio Service access network. The PU consists of an insulin pump for subcutaneous insulin infusion and a continuous glucose measurement system. These devices, running a user-friendly application, gather patient-related information and transmit it to the PMU. The PMU consists of a diabetes data management system (DDMS), a decision support system (DSS) that provides risk assessment for long-term diabetes complications, and an insulin infusion advisory system (IIAS), all of which reside on a Web server. The DDMS can be accessed by both medical personnel and patients, with appropriate security access rights and front-end interfaces. Apart from data storage and retrieval, the DDMS also provides advanced tools for intelligent processing of the patient's data, supporting the physician in decision making regarding the patient's treatment. The IIAS closes the loop between the insulin pump and the continuous glucose monitoring system by providing the pump with the appropriate insulin infusion rate to keep the patient's glucose levels within predefined limits. The pilot version of SMARTDIAB has already been implemented, and the platform's evaluation in a clinical environment is in progress.
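The closed-loop idea behind the IIAS can be illustrated with a deliberately simplified dosing rule. The abstract does not disclose the actual algorithm; the proportional-derivative form, the gains, targets, and limits below are hypothetical illustrations only, not the SMARTDIAB method and not medical guidance.

```python
# Hypothetical sketch of a closed-loop insulin dosing rule in the spirit of
# an IIAS: raise the infusion rate when glucose is above target and rising,
# lower it when glucose is below target and falling. All constants are
# invented for illustration; this is not a clinical algorithm.

def infusion_rate(glucose_mgdl, prev_glucose_mgdl, basal_u_per_h=1.0,
                  target=110.0, kp=0.02, kd=0.05, max_rate=5.0):
    """Return an insulin infusion rate (U/h) from two consecutive CGM readings."""
    error = glucose_mgdl - target              # positive when above target
    trend = glucose_mgdl - prev_glucose_mgdl   # change between samples
    rate = basal_u_per_h + kp * error + kd * trend
    return min(max(rate, 0.0), max_rate)       # clamp to a safe range

print(infusion_rate(180, 170))  # above target and rising: more than basal
print(infusion_rate(80, 90))    # below target and falling: clamped at zero
```

In a real system such a rule would be one component among safety checks, meal announcements, and insulin-on-board tracking; the sketch only shows the feedback principle of "measure, compare to limits, adjust the pump".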

Relevance:

100.00%

Publisher:

Abstract:

During the past 20 years or so, more has become known about the properties of khat and its pharmacological, physiological, and psychological effects on humans. At the same time, however, its reputation for social and recreational use in traditional contexts has hindered the dissemination of knowledge about its detrimental effects in terms of mortality. This paper focuses on this particular deficit and adds to the knowledge base by reviewing the scant literature that does exist on mortality associated with the trade and use of khat. We sought all peer-reviewed papers relating to deaths associated with khat. From an initial list of 111, we identified 15 items meeting our selection criteria; examination of these revealed 61 further relevant items. These were supplemented with published reports, newspaper articles, and other media reports. A conceptual framework was then developed for classifying mortality associated with each stage of the plant's journey, from its cultivation, through transportation and consumption, to its effects on the human body. The model is demonstrated with concrete examples drawn from the above sources. These highlight a number of issues for which more substantive statistical data are needed, including population-based studies of the physiological and psychological determinants of khat-related fatalities. Khat-consuming communities, and the health professionals charged with their care, should be made more aware of the physiological and psychological effects of khat, together with the risks of morbidity and mortality associated with its use. There is also a need for information to be collected at international and national levels on other causes of death associated with khat cultivation, transportation, and trade. Both of these dimensions need to be understood.

Relevance:

100.00%

Publisher:

Abstract:

This paper focuses on the integration of state-of-the-art technologies in the fields of telecommunications, simulation algorithms, and data mining in order to develop a semi- to fully-automated monitoring and management system for patients with Type 1 diabetes. The main components of the system are a glucose measurement device, an insulin delivery system (insulin injections or insulin pumps), a mobile phone for the GPRS network, and a PDA or laptop for the Internet. In the medical environment, appropriate infrastructure for the storage, analysis, and visualization of patients' data has been implemented to facilitate treatment design by health care experts.

Relevance:

100.00%

Publisher:

Abstract:

Earth observations (EO) represent a growing and valuable resource for many scientific, research, and practical applications carried out by users around the world. For some applications or activities, such as climate change research or emergency response, access to EO data is indispensable for success. However, EO data, or products derived from them, are often (or are claimed to be) subject to intellectual property law protection and are licensed under specific conditions regarding access and use. Restrictive conditions on data use can be prohibitive for further work with the data. The Global Earth Observation System of Systems (GEOSS) is an initiative led by the Group on Earth Observations (GEO) with the aim of providing coordinated, comprehensive, and sustained EO and information for making informed decisions in various areas beneficial to societies, their functioning, and their development. It seeks to share data with users world-wide with the fewest possible restrictions on their use, by implementing the GEOSS Data Sharing Principles adopted by GEO. The Principles proclaim full and open exchange of data shared within GEOSS, while recognising relevant international instruments and national policies and legislation through which restrictions on the use of data may be imposed.

The paper focuses on the legal interoperability of data that are shared with varying restrictions on use, with the aim of exploring the options for making data interoperable. The main question it addresses is whether the public domain or its equivalents represent the best mechanism to ensure legal interoperability of data. To this end, the paper analyses the legal protection regimes and their norms applicable to EO data. Based on the findings, it highlights the existing public-law statutory, regulatory, and policy approaches, as well as private-law instruments, such as waivers, licenses, and contracts, that may be used to place datasets in the public domain, or otherwise make them publicly available for use and re-use without restrictions. It uses GEOSS, and its particular characteristics as a system, to identify ways to reconcile the vast possibilities it provides through sharing of data from various sources and jurisdictions, on the one hand, with the restrictions on the use of the shared resources, on the other. On a more general level, the paper seeks to draw attention to the obstacles, and potential regulatory solutions, to sharing factual or research data for purposes that go beyond research and education.

Relevance:

100.00%

Publisher:

Abstract:

Ancient Lake Ohrid is a steep-sided, oligotrophic, karst lake that was most likely formed tectonically in the Pliocene and is often referred to as a hotspot of endemic biodiversity. This study aims at tracing significant lake-level fluctuations at Lake Ohrid using high-resolution acoustic data in combination with lithological, geochemical, and chronological information from two sediment cores recovered from sub-aquatic terrace levels at ca. 32 and 60 m water depth. According to our data, significant lake-level fluctuations with prominent lowstands of ca. 60 and 35 m below the present water level occurred during Marine Isotope Stage (MIS) 6 and MIS 5, respectively. The effect of these lowstands on biodiversity in most coastal parts of the lake was negligible, owing to only small changes in lake surface area, coastline, and habitat. In contrast, biodiversity in shallower areas was more severely affected, owing to the disconnection of today's sublacustrine springs from the main water body. Multichannel seismic data from deeper parts of the lake clearly image several clinoform structures stacked on top of each other. These stacked clinoforms indicate significantly lower lake levels prior to MIS 6 and a stepwise rise of the water level, with intermittent stillstands, since the lake's existence as a water-filled body, which might have caused enhanced expansion of endemic species within Lake Ohrid.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. However, reuse of individual health-related data faces several problems: either a unique personal identifier, such as a social security number, is not available, or non-unique person-identifiable information, such as names, is privacy protected and cannot be accessed. One solution to protect privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, the encrypted hash codes of two names differ completely even if the plain names differ only by a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method.

METHODS: In the P3RL method we apply a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption, and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of the variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. data structure) needed to create these templates without ever accessing plain person-identifiable information, we introduce a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center then performs probabilistic record linkage with the encrypted person-identifiable information and the plain non-sensitive variables.

RESULTS: In this paper we describe step by step how to link existing health-related data using encryption methods that preserve the privacy of the persons in the study.

CONCLUSION: Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. The method is suitable not only for epidemiological research but for any setting with similar challenges.
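The Bloom-filter step can be sketched as follows: each name is decomposed into character bigrams, each bigram sets a few positions in a fixed-size bit array, and pairs of filters are scored with the Dice coefficient, so a small spelling difference still yields a high similarity. The filter length, the number of hash functions, and the use of SHA-256 are illustrative assumptions, not the actual P3RL parameters.

```python
import hashlib

# Illustrative Bloom-filter encoding for privacy-preserving similarity:
# names never leave the site in plain form, yet the linkage center can
# still compute a fuzzy match score. Parameters are invented for the sketch.

FILTER_BITS = 100   # size of the bit array (assumption)
NUM_HASHES = 2      # hash functions per bigram (assumption)

def bigrams(name):
    padded = f"_{name.lower()}_"   # pad so first/last letters form bigrams too
    return {padded[i:i + 2] for i in range(len(padded) - 1)}

def bloom_encode(name):
    """Return the set of bit positions set by the name's bigrams."""
    bits = set()
    for gram in bigrams(name):
        for seed in range(NUM_HASHES):
            digest = hashlib.sha256(f"{seed}:{gram}".encode()).hexdigest()
            bits.add(int(digest, 16) % FILTER_BITS)
    return bits

def dice(bits_a, bits_b):
    """Dice coefficient of two Bloom filters: 2|A & B| / (|A| + |B|)."""
    return 2 * len(bits_a & bits_b) / (len(bits_a) + len(bits_b))

# A one-character difference ("miller" vs "muller") still scores high,
# whereas hashing the whole name would make the codes differ completely.
print(dice(bloom_encode("miller"), bloom_encode("muller")))
print(dice(bloom_encode("miller"), bloom_encode("smith")))
```

This is exactly the property the abstract points out: standard hashes of near-identical names share nothing, while bigram-level Bloom filters preserve enough overlap for probabilistic linkage.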

Relevance:

100.00%

Publisher:

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters, and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades, or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and it cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that, even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power, which work against the formation of a global agreement.

This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent, and insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it is shown that repeated interaction coupled with time costs facilitates information transmission: by making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement, or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role; the simple fact that a worker states that he is of high quality may be informative.

In Chapter 3, the focus is on a different source of inefficiency: agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure the voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It is shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good; whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which substantially reduces efficiency. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information, and have explained these findings by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich, in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements, and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.

Relevance:

100.00%

Publisher:

Abstract:

Information on how species distributions and ecosystem services are impacted by anthropogenic climate change is important for adaptation planning. Palaeo data suggest that Abies alba formed forests under significantly warmer-than-present conditions in Europe and might be a native substitute for widespread drought-sensitive temperate and boreal tree species, such as beech (Fagus sylvatica) and spruce (Picea abies), under future global warming conditions. Here, we combine pollen and macrofossil data, modern observations, and results from transient simulations with the LPX-Bern dynamic global vegetation model to assess past and future distributions of A. alba in Europe. LPX-Bern is forced with climate anomalies from a run over the past 21,000 years with the Community Earth System Model, with modern climatology, and with 21st-century multi-model ensemble results for the high-emission RCP8.5 and the stringent mitigation RCP2.6 pathways. The simulated distribution for the present climate encompasses the modern range of A. alba, with the model exceeding the present distribution in north-western and southern Europe. Mid-Holocene pollen data and model results agree for southern Europe, suggesting that at present human impacts suppress the distribution there. Pollen and model results both show range expansion starting during the Bølling-Allerød warm period, interrupted by the Younger Dryas cold phase, and resuming during the Holocene. The distribution of A. alba expands to the north-east in all future scenarios, whereas the potential (currently unrealized) range would be substantially reduced in southern Europe under RCP8.5. A. alba maintains its current range in central Europe despite competition from other thermophilous tree species. Our combined palaeoecological and model evidence suggests that A. alba may ensure important ecosystem services, including stand and slope stability, infrastructure protection, and carbon sequestration, under significantly warmer-than-present conditions in central Europe.
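Forcing a vegetation model with "climate anomalies" added to a modern climatology, as described for LPX-Bern above, is commonly called the delta (anomaly) method: the simulated change signal is applied to observations so that the model's mean bias cancels out. A minimal sketch under that assumption, with invented monthly temperature values:

```python
# Sketch of the anomaly ("delta") method: past or future climate input =
# observed modern climatology + (model simulation - model modern baseline).
# All numbers below are invented monthly mean temperatures (deg C).

def apply_anomaly(observed_climatology, model_period, model_baseline):
    """Combine an observed climatology with a model's simulated anomaly."""
    return [obs + (sim - base)
            for obs, sim, base in zip(observed_climatology,
                                      model_period, model_baseline)]

observed = [1.0, 6.0, 15.0]      # modern observations
model_now = [3.0, 8.0, 16.0]     # model under modern forcing (biased warm)
model_lgm = [-9.0, -3.0, 7.0]    # model under glacial forcing

print(apply_anomaly(observed, model_lgm, model_now))  # -> [-11.0, -5.0, 6.0]
```

The warm bias of the hypothetical model (2 degrees in the first month) drops out of the result, which is the point of driving the vegetation model with anomalies rather than raw model output.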

Relevance:

100.00%

Publisher:

Abstract:

Background. Among Hispanics, the HPV vaccine has the potential to eliminate disparities in cervical cancer incidence and mortality, but only if optimal rates of vaccination are achieved. The media can be an important information source for increasing HPV knowledge and awareness of the vaccine. Very little is known about how media use among Hispanics affects their HPV knowledge and vaccine awareness, and even less is known about differences in media use and information processing between English- and Spanish-speaking Hispanics.

Aims. To examine the relationships between three health communication variables (media exposure, and HPV-specific information scanning and seeking) and three HPV outcomes (knowledge, vaccine awareness, and initiation) among English- and Spanish-speaking Hispanics.

Methods. Cross-sectional data from a survey administered to Hispanic mothers in Dallas, Texas were used for univariate and multivariate logistic regression analyses. The analysis sample included 288 mothers of females aged 8-22, recruited from clinics and community events. The dependent variables of interest were HPV knowledge, HPV vaccine awareness, and initiation; the independent variables were media exposure and HPV-specific information scanning and seeking. Language was tested as an effect modifier of the relationship between the health communication variables and the HPV outcomes.

Results. English-speaking mothers reported more media exposure and more HPV-specific information scanning and seeking than Spanish speakers. Scanning for HPV information was associated with more HPV knowledge (OR = 4.26, 95% CI = 2.41-7.51), vaccine awareness (OR = 10.01, 95% CI = 5.43-18.47), and vaccine initiation (OR = 2.54, 95% CI = 1.09-5.91). Seeking HPV-specific information was associated with more knowledge (OR = 2.27, 95% CI = 1.23-4.16), awareness (OR = 6.60, 95% CI = 2.74-15.91), and initiation (OR = 4.93, 95% CI = 2.64-9.20). Language moderated the effect of information scanning and seeking on vaccine awareness.

Discussion. Differences in information scanning and seeking behaviors among Hispanic subgroups have the potential to lead to disparities in vaccine awareness.

Conclusion. Findings from this study underscore health communication differences among Hispanics and emphasize the need to target Spanish-language media, as well as English-language media aimed at Hispanics, to improve knowledge and awareness.
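For readers less familiar with the reported statistics, an odds ratio and its 95% confidence interval of the kind listed above can be computed from a 2x2 table (exposure by outcome) as sketched below. The counts are invented for illustration and are unrelated to the study's data.

```python
import math

# Odds ratio and Wald 95% CI from a 2x2 table. Example counts are invented:
#                outcome yes   outcome no
#   exposed          a=40         b=20
#   unexposed        c=30         d=60

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Return (OR, CI lower bound, CI upper bound) for a 2x2 table."""
    or_ = (a * d) / (b * c)                  # cross-product odds ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)    # standard error of ln(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

or_, lower, upper = odds_ratio_ci(40, 20, 30, 60)  # hypothetical counts
print(round(or_, 2), round(lower, 2), round(upper, 2))
```

An OR above 1 with a CI excluding 1, as in every association reported in the abstract, indicates higher odds of the outcome among the exposed group.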

Relevance:

100.00%

Publisher:

Abstract:

Southern China, especially Yunnan, has undergone high tectonic activity caused by the uplift of the Himalayas during the Neogene, which led to a rapidly changing palaeogeography. Previous studies show that Southern China has been influenced by the Asian monsoon since at least the Early Miocene. However, it is not yet well understood how intense the Miocene monsoon system was. In the present study, 63 fossil floras from 16 localities in Southern China are compiled and evaluated for the available information concerning floristic composition, stratigraphic age, sedimentology, etc. Based on this information, selected mega- and micro-floras have been analysed with the coexistence approach to obtain quantitative palaeoclimate data. Visualization of the climate results in maps shows a distinct spatial differentiation within Southern China during the Miocene: higher seasonality of temperature occurs in the northern part, and higher seasonality of precipitation in the southern part. During the Miocene, most regions of both Southern China and Europe were warm and humid, while Central Eurasia was likely an arid centre that gradually spread westward and eastward. Our data provide information about Miocene climate patterns in Southern China and about the evolution of these patterns throughout the Miocene, and are also crucial for unravelling and understanding the climatic signals of global cooling and tectonic uplift.

Relevance:

100.00%

Publisher:

Abstract:

Kelp forests represent a major habitat type in coastal waters worldwide, and their structure and distribution are predicted to change due to global warming. Despite their ecological and economic importance, there is still a lack of reliable spatial information on their abundance and distribution. In recent years, various hydroacoustic techniques for mapping sublittoral environments have evolved. However, in turbid coastal waters, such as off the island of Helgoland (Germany, North Sea), the kelp vegetation occurs at shallow water depths normally excluded from hydroacoustic surveys. In this study, single-beam survey data consisting of the two seafloor parameters roughness and hardness were obtained with RoxAnn from water depths between 2 and 18 m. Our primary aim was to reliably detect the kelp forest habitat at different densities and to distinguish it from other vegetated zones. Five habitat classes were identified using underwater video and were applied to the classification of the acoustic signatures. Subsequently, spatial prediction maps were produced via two classification approaches: linear discriminant analysis (LDA) and a manual classification routine (MC). LDA was able to distinguish dense kelp forest from the other habitats (i.e. mixed seaweed vegetation, sand, and barren bedrock), but not variations in kelp density. In contrast, MC also provided information on the distribution of medium-dense kelp, which is characterized by the intermediate roughness and hardness values produced by reduced kelp abundance. The prediction maps reach accordance levels of 62% (LDA) and 68% (MC). The presence of vegetation (kelp and mixed seaweed vegetation) was determined with higher prediction accuracies of 75% (LDA) and 76% (MC). Since the acoustic signatures of the different habitat classes strongly overlap, the manual classification method was more appropriate for separating different kelp forest densities and low-lying vegetation. It became evident that the occurrence of kelp in this area is not simply linked to water depth. Moreover, this study shows that the two seafloor parameters collected with RoxAnn are suitable indicators for discriminating differently vegetated seafloor habitats in shallow environments.
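A manual classification routine of the kind described can be thought of as hand-tuned thresholds on the two RoxAnn parameters, with an intermediate band capturing medium-dense kelp. The threshold values and class boundaries below are invented for illustration; they are not the study's calibration.

```python
# Hypothetical manual classification (MC) of single-beam echo parameters.
# Each (roughness, hardness) pair, assumed normalized to [0, 1], is mapped
# to one of the five habitat classes named in the abstract. Thresholds are
# illustrative, not the values used in the Helgoland survey.

def classify(roughness, hardness):
    if roughness > 0.7 and hardness > 0.6:
        return "dense kelp forest"
    if roughness > 0.4 and hardness > 0.4:
        return "medium dense kelp"       # intermediate values, per the abstract
    if roughness > 0.4:
        return "mixed seaweed vegetation"
    if hardness > 0.6:
        return "barren bedrock"
    return "sand"

samples = [(0.8, 0.7), (0.5, 0.5), (0.2, 0.8), (0.1, 0.2)]
for r, h in samples:
    print((r, h), "->", classify(r, h))
```

Because rules like these can carve out overlapping intermediate bands explicitly, a manual routine can separate density classes that a single linear discriminant boundary merges, which mirrors the LDA-versus-MC finding above.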

Relevance:

100.00%

Publisher:

Abstract:

The Millennium Development Goals point out the necessity of actively promoting maternal-child health care, especially in underserved areas. This article details the development actions carried out between 2008 and 2011 in rural communities of Nicaragua with the aim of providing a low-cost tele-health communication service. The service is managed by the health care center of Cusmapa, which leads the program and maintains a communication link between its health staff and the health brigades of 26 distant communities. Local agents can use the system to report urgent maternal-child health episodes, which are assessed over WiMAX-WiFi voice and data communications attended by two physicians and six nurses located at the health care center. The health and nutritional status of the maternal-child population can thus be monitored to prevent disease, undernutrition, and deaths. The approach follows the fundamentals of appropriate technology and looks for community-based, sustainable, replicable, and scalable solutions to ensure future deployments in line with the strategies of the United Nations.

Relevance:

100.00%

Publisher:

Abstract:

A new proposal for secure communications is reported. It is based on the use of synchronized digital chaotic systems: the information signal is transmitted added to an initial chaotic signal. The received signal is analyzed by another chaos generator located at the receiver and, by applying a Boolean logic function to the chaotic and received signals, the original information is recovered. One of the most important features of this system is that the bandwidth needed remains the same with and without chaos.
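The scheme above can be sketched with a logistic map standing in for the "digital chaotic system" and XOR as the Boolean recovery function. The map, its parameters, and the bit-extraction rule are illustrative assumptions; the abstract does not specify which chaos generator or Boolean function is used.

```python
# Sketch of chaos-masked digital communication: transmitter and receiver run
# the same deterministic chaotic generator from a shared initial condition,
# so the receiver can regenerate the masking bits and undo the XOR. The
# logistic map and its parameters are an illustrative choice only.

def chaos_bits(x0, n, r=3.99):
    """Derive n masking bits from the logistic map x -> r*x*(1-x)."""
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)   # threshold the orbit to get bits
    return bits

def xor_mask(bits, key_bits):
    """Boolean combination used for both masking and recovery (XOR)."""
    return [b ^ k for b, k in zip(bits, key_bits)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
key = chaos_bits(x0=0.123456, n=len(message))      # shared initial condition
ciphertext = xor_mask(message, key)                # transmitted signal
recovered = xor_mask(ciphertext, chaos_bits(0.123456, len(message)))
print(recovered == message)  # True: the synchronized receiver recovers the bits
```

Since XOR is its own inverse and both ends generate identical chaotic bits, the masked stream occupies the same number of bits as the plain message, consistent with the bandwidth claim in the abstract.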

Relevance:

100.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to present a simulation-based evaluation method for comparing different organizational forms and software support levels in the field of supply chain management (SCM).

Design/methodology/approach – Apart from widely known logistic performance indicators, the discrete-event simulation model explicitly considers the coordination cost stemming from iterative administration procedures.

Findings – The method is applied to an exemplary supply chain configuration under various parameter settings. Notably, additional coordination cost does not always result in improved logistic performance, and variations of the influence factors lead to different organizational recommendations. The results confirm the high importance of these hitherto disregarded dimensions when evaluating SCM concepts and IT tools.

Research limitations/implications – The model is based on simplified product and network structures. Future research shall include more complex, real-world configurations.

Practical implications – The developed method is designed to identify improvement potential when SCM software is employed. Coordination schemes based only on ERP systems are valid alternatives in industrial practice, because significant IT investment can be avoided. The evaluation of these coordination procedures, in particular the cost due to iterations, is therefore of high managerial interest, and the method provides a comprehensive tool for strategic IT decision making.

Originality/value – The reviewed literature is mostly focused on the benefits of SCM software implementations. However, ERP-system-based supply chain coordination is still widespread industrial practice, and the associated coordination cost has not been addressed by researchers.
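The trade-off the simulation examines can be caricatured in a few lines: ERP-only coordination needs several iterative administration rounds per planning cycle, while dedicated SCM software converges in one round but adds a fixed cost per cycle. All cost figures below are invented and merely stand in for the paper's discrete-event model.

```python
# Toy cost comparison of two coordination schemes over a planning horizon.
# The iteration counts and cost figures are invented for illustration and
# do not come from the paper's simulation experiments.

def total_cost(cycles, iterations_per_cycle, cost_per_iteration,
               fixed_cost_per_cycle):
    """Coordination cost from iterations plus fixed per-cycle tool cost."""
    coordination = cycles * iterations_per_cycle * cost_per_iteration
    return coordination + cycles * fixed_cost_per_cycle

# 52 weekly planning cycles; ERP-only needs 4 coordination iterations each,
# SCM software one, but carries a hypothetical per-cycle operating cost.
erp_only = total_cost(cycles=52, iterations_per_cycle=4,
                      cost_per_iteration=80, fixed_cost_per_cycle=0)
scm_tool = total_cost(cycles=52, iterations_per_cycle=1,
                      cost_per_iteration=80, fixed_cost_per_cycle=300)

print(erp_only, scm_tool)
```

With these particular numbers the ERP-only scheme is cheaper, and with others the SCM tool wins, which is the sense in which parameter variations lead to different organizational recommendations.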