903 results for Tests for Continuous Lifetime Data


Relevance: 30.00%

Abstract:

BACKGROUND: Worldwide data for cancer survival are scarce. We aimed to initiate worldwide surveillance of cancer survival by central analysis of population-based registry data, as a metric of the effectiveness of health systems, and to inform global policy on cancer control.

METHODS: Individual tumour records were submitted by 279 population-based cancer registries in 67 countries for 25·7 million adults (age 15-99 years) and 75 000 children (age 0-14 years) diagnosed with cancer during 1995-2009 and followed up to Dec 31, 2009, or later. We looked at cancers of the stomach, colon, rectum, liver, lung, breast (women), cervix, ovary, and prostate in adults, and adult and childhood leukaemia. Standardised quality control procedures were applied; errors were corrected by the registry concerned. We estimated 5-year net survival, adjusted for background mortality in every country or region by age (single year), sex, and calendar year, and by race or ethnic origin in some countries. Estimates were age-standardised with the International Cancer Survival Standard weights.

FINDINGS: 5-year survival from colon, rectal, and breast cancers has increased steadily in most developed countries. For patients diagnosed during 2005-09, survival for colon and rectal cancer reached 60% or more in 22 countries around the world; for breast cancer, 5-year survival rose to 85% or higher in 17 countries worldwide. Liver and lung cancer remain lethal in all nations: for both cancers, 5-year survival is below 20% everywhere in Europe, in the range 15-19% in North America, and as low as 7-9% in Mongolia and Thailand. Striking rises in 5-year survival from prostate cancer have occurred in many countries: survival rose by 10-20% between 1995-99 and 2005-09 in 22 countries in South America, Asia, and Europe, but survival still varies widely around the world, from less than 60% in Bulgaria and Thailand to 95% or more in Brazil, Puerto Rico, and the USA. For cervical cancer, national estimates of 5-year survival range from less than 50% to more than 70%; regional variations are much wider, and improvements between 1995-99 and 2005-09 have generally been slight. For women diagnosed with ovarian cancer in 2005-09, 5-year survival was 40% or higher only in Ecuador, the USA, and 17 countries in Asia and Europe. 5-year survival for stomach cancer in 2005-09 was high (54-58%) in Japan and South Korea, compared with less than 40% in other countries. By contrast, 5-year survival from adult leukaemia in Japan and South Korea (18-23%) is lower than in most other countries. 5-year survival from childhood acute lymphoblastic leukaemia is less than 60% in several countries, but as high as 90% in Canada and four European countries, which suggests major deficiencies in the management of a largely curable disease.

INTERPRETATION: International comparison of survival trends reveals very wide differences that are likely to be attributable to differences in access to early diagnosis and optimum treatment. Continuous worldwide surveillance of cancer survival should become an indispensable source of information for cancer patients and researchers and a stimulus for politicians to improve health policy and health-care systems.

FUNDING: Canadian Partnership Against Cancer (Toronto, Canada), Cancer Focus Northern Ireland (Belfast, UK), Cancer Institute New South Wales (Sydney, Australia), Cancer Research UK (London, UK), Centers for Disease Control and Prevention (Atlanta, GA, USA), Swiss Re (London, UK), Swiss Cancer Research foundation (Bern, Switzerland), Swiss Cancer League (Bern, Switzerland), and University of Kentucky (Lexington, KY, USA).
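The age-standardisation step mentioned in METHODS is, in generic notation (not reproduced from the paper), a weighted average of the age-specific net survival estimates using the International Cancer Survival Standard weights:

```latex
% Age-standardised 5-year net survival, with ICSS weights w_i summing to 1
% and NS_i the net survival estimate in age group i
\[
  NS_{\mathrm{AS}} \;=\; \sum_{i} w_i \, NS_i , \qquad \sum_{i} w_i = 1
\]
```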

Relevance: 30.00%

Abstract:

Introduction. One of the most widely used paradigms in the study of attention is the Continuous Performance Test (CPT). The identical pairs version (CPT-IP) has been used extensively to assess attention deficits in neurodevelopmental, neurological, and psychiatric disorders. However, the localization of brain activation within attentional networks varies significantly according to the functional magnetic resonance imaging (fMRI) design used. Aim. To design a task for assessing sustained attention and working memory with fMRI, in order to provide research data on the localization and role of these functions. Subjects and methods. The study involved 40 students, all of them right-handed (50% women; range: 18-25 years). The CPT-IP task was designed as a block task, in which CPT-IP periods alternated with rest periods. Results. The CPT-IP task used activates a network formed by frontal, parietal, and occipital regions, which are related to executive and attentional functions. Conclusions. The CPT-IP task used in our work provides normative data in healthy adults for studying the neural substrate of sustained attention and working memory. These data could be useful for evaluating disorders that involve deficits in working memory and sustained attention.

Relevance: 30.00%

Abstract:

Substances emitted into the atmosphere by human activities in urban and industrial areas cause environmental problems such as air quality degradation, respiratory diseases, climate change, global warming, and stratospheric ozone depletion. Volatile organic compounds (VOCs) are major air pollutants, emitted largely by industry, transportation and households. Many VOCs are toxic, and some are considered to be carcinogenic, mutagenic, or teratogenic. A wide spectrum of VOCs is readily oxidized photocatalytically. Photocatalytic oxidation (PCO) over titanium dioxide may present a potential alternative to air treatment strategies currently in use, such as adsorption and thermal treatment, owing to its activity under ambient conditions, although higher but still mild temperatures may also be applied. The objective of the present research was to disclose the routes of the chemical reactions and to estimate the kinetics and the sensitivity of gas-phase PCO to reaction conditions for air pollutants containing heteroatoms in their molecules. Deactivation of the photocatalyst and restoration of its activity were also taken into consideration, to assess the practical feasibility of applying PCO to the treatment of air polluted with VOCs. UV-irradiated titanium dioxide was selected as the photocatalyst for its chemical inertness, non-toxic character and low cost. In the present work the Degussa P25 TiO2 photocatalyst was mostly used; in transient studies platinized TiO2 was also examined. Experimental research into the PCO of the following VOCs was undertaken: methyl tert-butyl ether (MTBE), the basic oxygenated motor fuel additive and thus a major non-biodegradable pollutant of groundwater; tert-butyl alcohol (TBA), the primary product of MTBE hydrolysis and PCO; ethyl mercaptan (ethanethiol), one of the reduced-sulphur pungent air pollutants of the pulp-and-paper industry; and methylamine (MA) and dimethylamine (DMA), amino compounds often emitted by various industries. The PCO of the VOCs was studied in continuous-flow mode. The PCO of MTBE and TBA was also studied in transient mode, in which carbon dioxide, water, and acetone were identified as the main gas-phase products. The volatile products of thermal catalytic oxidation (TCO) of MTBE included 2-methyl-1-propene (2-MP), carbon monoxide, carbon dioxide and water; TBA decomposed to 2-MP and water. Continuous PCO of TBA proceeded faster in humid air than in dry air; MTBE oxidation, however, was less sensitive to humidity. The TiO2 catalyst was stable during continuous PCO of MTBE and TBA above 373 K, but gradually lost activity below 373 K; the catalyst could be regenerated by UV irradiation in the absence of gas-phase VOCs. Sulphur dioxide, carbon monoxide, carbon dioxide and water were identified as the ultimate products of PCO of ethanethiol, with acetic acid identified as an oxidation by-product. The limits of ethanethiol concentration and temperature at which the reactor performance remained stable indefinitely were established. The apparent reaction kinetics appeared to be independent of the reaction temperature within the studied limits, 373 to 453 K. The catalyst was completely and irreversibly deactivated by TCO of ethanethiol. Volatile PCO products of MA included ammonia, nitrogen dioxide, nitrous oxide, carbon dioxide and water. Formamide was observed among the DMA PCO products, together with products similar to those of MA.
TCO of both substances resulted in the formation of ammonia, hydrogen cyanide, carbon monoxide, carbon dioxide and water. No deactivation of the photocatalyst was observed during the multiple long-run experiments at the concentrations and temperatures used in the study. PCO of MA was also studied in the aqueous phase. Maximum efficiency was achieved in alkaline media, where MA exhibits high volatility. Two mechanisms of aqueous PCO (decomposition to formate and ammonia, and oxidation of organic nitrogen directly to nitrite) lead ultimately to carbon dioxide, water, ammonia and nitrate; formate and nitrite were observed as intermediates. Part of the ammonia formed in the reaction was oxidized to nitrite and nitrate. This finding helped in better understanding the gas-phase PCO pathways. The PCO kinetic data for the VOCs fitted the monomolecular Langmuir-Hinshelwood (L-H) model well, whereas the TCO kinetic behaviour matched a first-order process for the volatile amines and the L-H model for the others. It should be noted that both the L-H and the first-order equations were only fits to the data, not true descriptions of the reaction kinetics. The dependence of the kinetic constants on temperature was established in the form of an Arrhenius equation.
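For reference, the rate expressions named above take their standard textbook forms; the notation here is generic, not the thesis's own symbols:

```latex
% Monomolecular Langmuir-Hinshelwood rate law and Arrhenius temperature
% dependence (C: VOC concentration, K: adsorption equilibrium constant,
% k: rate constant, A: pre-exponential factor, E_a: activation energy)
\begin{align}
  r &= \frac{k \, K \, C}{1 + K C} \\
  k &= A \, \exp\!\left(-\frac{E_a}{R T}\right)
\end{align}
```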

Relevance: 30.00%

Abstract:

The article discusses the development of WEBDATANET, established in 2011 to create a multidisciplinary network of web-based data collection experts in Europe. Topics include the presence of 190 experts in 30 European countries and abroad, the establishment of web-based teaching and discussion platforms, and the working groups and task forces. The scope of the research carried out by WEBDATANET is also discussed. In light of the growing importance of web-based data in the social and behavioral sciences, WEBDATANET was established in 2011 as a COST Action (IS 1004) to create a multidisciplinary network of web-based data collection experts: (web) survey methodologists, psychologists, sociologists, linguists, economists, Internet scientists, and media and public opinion researchers. The aim was to accumulate and synthesize knowledge regarding methodological issues of web-based data collection (surveys, experiments, tests, non-reactive data, and mobile Internet research) and to foster its scientific use in a broader community.

Relevance: 30.00%

Abstract:

The purpose of this research was to carry out a repeated cross-sectional study of class teachers: fourth-year students and 2000-2004 graduates of the class teacher education master's degree programme at the Faculty of Education, University of Turku. Specifically, seven research questions were addressed: How do senior students and graduates of the class teacher education master's degree programme rate the importance, effectiveness, and quality of the training they have received at the Faculty of Education? Are there significant differences in the overall ratings of importance, effectiveness, and quality of training by year of graduation, sex, and age (for graduates), and by sex and age (for senior students)? Is there a significant relationship between respondents' overall ratings of importance and effectiveness and their overall ratings of the quality of the training and preparation they have received? Are there significant differences between graduates and senior students regarding the importance, effectiveness, and quality of the teacher education programs? And what do teachers (graduates) believe about how increasing work experience has changed their opinions of their preservice training? Moreover, the following concepts related to instructional activities were studied: critical thinking skills, communication skills, attention to ethics, curriculum and instruction (planning), the role of the teacher and teaching knowledge, assessment skills, attention to continuous professional development, subject matter knowledge, knowledge of the learning environment, and the use of educational technology. The researcher also examined the influence of moderator variables, e.g. year of graduation, sex, and age, on the dependent and independent variables. The study used two questionnaires (a structured Likert-scale questionnaire and an open-ended questionnaire). The population in study 1 consisted of all senior students and all 2000-2004 graduates of the class teacher education master's degree programme in the Department of Teacher Education, Faculty of Education, University of Turku. Of the 1020 students and graduates, current addresses were found for 675; of the 675 contacted, 439 (66.2 percent) responded to the survey. The population in study 2 consisted of all class teachers who graduated from the University of Turku and now work in basic schools (59 schools) in South-West Finland; 257 teachers answered the open-ended web-based questions. SPSS was used to produce standard deviations, analysis of variance, Pearson product-moment correlations (r), t-tests, ANOVA with Bonferroni post-hoc tests, and polynomial contrast tests to analyse linear trends. An alpha level of .05 was used to determine statistical significance. The results showed that a majority of the respondents (graduates and students) rated the overall importance, effectiveness, and quality of the teacher education programs as important, effective, and good. Generally speaking, there were only a few significant differences between the cohorts and groups related to the background variables (gender, age). The different cohorts rated the quality of the programs very similarly, but some differences between the cohorts were found in the importance and effectiveness ratings. Graduates of 2001 and 2002 rated the importance of the program significantly higher than 2000 graduates, and the effectiveness of the programs was rated significantly higher by 2001 and 2003 graduates than by the other groups.
In spite of these individual differences between cohorts, there were no linear trends across the year cohorts in any measure. In respondents' ratings of the effectiveness of the teacher education programs there was a significant difference between males and females: females rated it higher than males. There were no significant differences between males' and females' ratings of the importance and quality of the programs. There was only one difference between age groups: older graduates (35 years or older) rated the importance of the teacher training significantly higher than 25-35-year-old graduates. In the graduates' ratings there were positive but relatively low correlations among all variables related to the importance, effectiveness, and quality of the teacher education programs. Generally speaking, the students' ratings of the importance, effectiveness, and quality of the teacher education program were very positive, with only one significant difference related to the background variables: females rated the effectiveness of the program higher. The comparison of students' and graduates' perceptions of the importance, effectiveness, and quality of the teacher education programs showed no significant differences in the overall ratings, although there were differences in some individual variables. Students gave higher ratings to the importance of "Continuous Professional Development", the effectiveness of "Critical Thinking Skills" and "Using Educational Technology", and the quality of "Advice received from the advisor". Graduates gave higher ratings to the importance of "Knowledge of the Learning Environment" and the effectiveness of "Continuous Professional Development". According to the qualitative data of study 2, some graduates said that their perceptions of the importance, effectiveness, and quality of the training they received during their studies had not changed. They pointed out that the teacher education programs had provided them with basic theoretical/formal knowledge and some training in practical routines. However, a majority of the teachers seem to hold somewhat critical opinions about the teacher education. These teachers were not satisfied with the teacher education programs because, they argued, the programs failed to meet their practical demands in everyday classroom situations, e.g. coping with students' learning difficulties, multiprofessional communication with parents and other professional groups (psychologists and social workers), and classroom management problems. Participants also called for more practice-oriented knowledge of subject matter, evaluation methods, and teachers' rights and responsibilities. Therefore, they (54.1% of participants) suggested that teacher education departments should provide more practice-based courses and programs, as well as closer collaboration between regular schools and teacher education departments, in order to bridge the gap between theory and practice.

Relevance: 30.00%

Abstract:

Chromogenic immunohistochemistry (IHC) is omnipresent in cancer diagnosis, but has also been criticized for its technical limitations in quantifying the level of protein expression on tissue sections, thus potentially masking clinically relevant data. Shifting from qualitative to quantitative, immunofluorescence (IF) has recently gained attention, yet the question of how precisely IF can quantify antigen expression remains unanswered, regarding in particular its technical limitations and applicability to multiple markers. Here we introduce microfluidic precision IF, which accurately quantifies the target expression level on a continuous scale based on microfluidic IF staining of standard tissue sections and low-complexity automated image analysis. We show that the level of HER2 protein expression, as continuously quantified using microfluidic precision IF in 25 breast cancer cases, including several cases with equivocal IHC results, can predict the number of HER2 gene copies as assessed by fluorescence in situ hybridization (FISH). Finally, we demonstrate that the working principle of this technology is not restricted to HER2 but can be extended to other biomarkers. We anticipate that our method has the potential to provide automated, fast and high-quality quantitative in situ biomarker data using low-cost immunofluorescence assays, as increasingly required in the era of individually tailored cancer therapy.

Relevance: 30.00%

Abstract:

Objective: To evaluate BI-RADS as a predictive factor of suspicion for malignancy in breast lesions by correlating radiological with histological results and calculating the positive predictive value for categories 3, 4 and 5 in a breast cancer reference center in the city of São Paulo. Materials and Methods: Retrospective, analytical and cross-sectional study including 725 patients with mammographic and/or sonographic findings classified as BI-RADS categories 3, 4 and 5 who were referred to the authors' institution to undergo percutaneous biopsy. The test results were reviewed and the positive predictive value was calculated by means of a specific mathematical equation. Results: Positive predictive values found for categories 3, 4 and 5 were, respectively, 0.74%, 33.08% and 92.95% for cases submitted to ultrasound-guided biopsy, and 0.00%, 14.90% and 100% for cases submitted to stereotactic biopsy. Conclusion: The present study demonstrated high suspicion for malignancy in lesions classified as category 5 and low risk for category 3. As regards category 4, the need for systematic biopsies was observed.
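The abstract does not reproduce the equation used; for reference, the standard definition of the positive predictive value within each BI-RADS category is (generic notation, not taken from the paper):

```latex
% Positive predictive value per BI-RADS category:
% malignant results divided by all biopsied lesions in that category
\[
  \mathrm{PPV} \;=\; \frac{TP}{TP + FP}
  \;=\; \frac{\text{lesions confirmed malignant}}{\text{all lesions biopsied in the category}}
\]
```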

Relevance: 30.00%

Abstract:

The most suitable method for the estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches to the estimation of the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions were used to generate simulated samples using parameters estimated from real samples. The nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which has additional advantages: the same size diversity value is obtained when using original sizes or log-transformed data, and size measurements with different dimensionality (lengths, areas, volumes or biomasses) may be compared immediately with the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation, after data standardization by division by the sample geometric mean, emerges as the most reliable and generalizable method of size diversity evaluation.
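A minimal sketch of the procedure favoured above (geometric-mean standardization followed by kernel estimation of the pdf and numerical integration of the Shannon integral), assuming a Gaussian kernel and a trapezoidal integration grid; the function and variable names are illustrative, not from the paper:

```python
import numpy as np
from scipy.stats import gaussian_kde

def size_diversity(sizes, n_grid=2048):
    """Shannon diversity of a continuous size variable,
    H = -integral p(x) ln p(x) dx, with p estimated by a Gaussian kernel.
    Sizes are first standardized by the sample geometric mean, as proposed
    in the abstract; for 2- or 3-dimensional measures (areas, volumes),
    ln 2 or ln 3 would be added to the result."""
    sizes = np.asarray(sizes, dtype=float)
    x = sizes / np.exp(np.mean(np.log(sizes)))      # divide by geometric mean
    kde = gaussian_kde(x)                           # nonparametric pdf estimate
    pad = 3.0 * x.std()                             # extend grid beyond the data range
    grid = np.linspace(x.min() - pad, x.max() + pad, n_grid)
    p = np.clip(kde(grid), 1e-300, None)            # avoid log(0)
    return -np.trapz(p * np.log(p), grid)

# Example with simulated log-normal sizes
rng = np.random.default_rng(0)
print(size_diversity(rng.lognormal(mean=1.0, sigma=0.5, size=500)))
```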

Relevance: 30.00%

Abstract:

The purpose of the work was to realize a high-speed digital data transfer system for the RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Frontend Boards (FEB) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between these two is about 80 metres, and the speed required of the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, as indeed they did. By choosing a high speed it was possible to multiplex the data from some of the chambers onto the same fibres and so reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, so that a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required some radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made to be as reconfigurable as possible. The reconfiguration needs to be done remotely, as the electronics is not accessible except during some short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is used extensively, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques were needed there too, to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and we are now waiting to see it in action when the LHC starts running in autumn 2008.
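Zero suppression, mentioned above as one of the link-reduction techniques, works by transmitting only the hit channels rather than the full fixed-length frame. A generic illustration follows; it is not the actual CMS link format, and the names are hypothetical:

```python
def zero_suppress(frame):
    """Generic zero-suppression sketch: keep only (channel, value) pairs
    for non-zero channels instead of sending the whole frame."""
    return [(ch, v) for ch, v in enumerate(frame) if v != 0]

# A mostly empty 96-channel frame shrinks to a handful of pairs.
frame = [0] * 96
frame[17] = 1
frame[42] = 1
print(zero_suppress(frame))   # [(17, 1), (42, 1)]
```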

Relevance: 30.00%

Abstract:

Defining environmental chemistry is not an easy task because it encompasses many different topics. According to Stanley E. Manahan, author of a classic textbook on Environmental Chemistry, this branch can be defined as the study of the sources, transport, effects and fates of chemical species in the water, soil, and air environments, as well as the influence of human activity upon these processes. More recently, new knowledge emerging from Environmental Toxicology has allowed an even deeper understanding of the 'effects' and 'fates' of a continuously growing number of organic and inorganic species released into water bodies, soils and the atmosphere. Toxicity tests have become an important tool for evaluating the environmental impact of such species on a great number of organisms, thus making it possible to set quality criteria for drinking water, sediments and biota. The state of the art shows that environmental chemistry is a multi- and interdisciplinary science by nature; therefore, it needs more than a limited, single-approach, unoriented set of data to understand natural processes. Taking all these aspects into consideration, one can say that Environmental Chemistry in Brazil is now a well-established area of research within the classical areas of Chemistry, with a large number of emerging groups as well as research groups with worldwide recognition.

Relevance: 30.00%

Abstract:

During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions from before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth the data and obtain continuous hazard rate functions. We then fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to the US survival functions, detailed by age and stage, to obtain the Catalan estimates. Results: We first estimated the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975-79, Catalonia 1980-89) and after (USA and Catalonia 1990-2001). Survival in Catalonia in the 1980-89 period was worse than in the USA during 1975-79, but the differences disappeared in 1990-2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. On the other hand, we obtained detailed breast cancer survival functions that will be used for modelling the effect of screening and adjuvant treatments in Catalonia.
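The rescaling step described in the Methods rests on the standard hazard-to-survival relation; in generic notation (the symbols are not taken from the paper), multiplying the US hazard by a hazard ratio HR and integrating gives the adapted survival function:

```latex
% Catalan hazard and survival derived from the US functions under a
% proportional-hazards assumption
\begin{align}
  h_{\mathrm{CAT}}(t) &= \mathrm{HR}\cdot h_{\mathrm{US}}(t) \\
  S_{\mathrm{CAT}}(t) &= \exp\!\Big(-\int_0^t h_{\mathrm{CAT}}(u)\,du\Big)
                       = \big(S_{\mathrm{US}}(t)\big)^{\mathrm{HR}}
\end{align}
```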

Relevance: 30.00%

Abstract:

Today's organizations must have the ability to react to rapid changes in the market. These rapid changes create pressure to continuously find new, efficient ways to organize work practices. Increased competition requires businesses to become more effective, to pay attention to the quality of management, and to make people understand their work's impact on the final result. The fundamentals of continuous improvement are the systematic and agile tackling of identified individual process constraints, and the fact that nothing ultimately improves without change. Successful continuous improvement requires management commitment, education, implementation, measurement, recognition and regeneration. These ingredients form the foundation both for breakthrough projects and for small-step ongoing improvement activities. One part of the organization's management system is the set of quality tools, which provide systematic methodologies for identifying problems, defining their root causes, finding solutions, gathering and sorting data, supporting decision making, implementing changes, and many other management tasks. Organizational change management includes processes and tools for managing the people side of organizational change. These tools include a structured approach that can be used for the effective transition of organizations through change. When combined with an understanding of how individuals experience change, these tools provide a framework for managing people in change.

Relevance: 30.00%

Abstract:

In a networked business environment the visibility requirements towards supply operations and the customer interface have become tighter. In order to meet those requirements, the master data of the case company is seen as an enabler; however, the current state of the master data and its quality are not considered good enough to meet them. In this thesis the aim of the research was to develop a process for managing master data quality as a continuous process, and to find solutions for cleansing the current customer and supplier data so that it meets the quality requirements defined in that process. Based on the theory of Master Data Management and data cleansing, a small amount of master data was analyzed and cleansed using one commercial data cleansing solution available on the market. This was conducted in cooperation with the vendor as a proof of concept, in which the cleansing solution's applicability for improving the quality of the current master data was demonstrated. Based on those findings and the theory of data management, recommendations and proposals for improving the quality of the data were given. The results also showed that the biggest reasons for poor data quality are the lack of data governance in the company and the restrictions of the current master data solutions.

Relevance: 30.00%

Abstract:

We generalize to arbitrary waiting-time distributions some results which were previously derived for discrete distributions. We show that for any two waiting-time distributions with the same mean delay time, the one with higher dispersion will lead to a faster front. Experimental data on the speed of virus infections in a plaque are correctly explained by the theoretical predictions using a Gaussian delay-time distribution, which is more realistic for this system than the Dirac delta distribution considered previously [J. Fort and V. Méndez, Phys. Rev. Lett. 89, 178101 (2002)].

Relevance: 30.00%

Abstract:

Especially in global enterprises, key data is fragmented across multiple Enterprise Resource Planning (ERP) systems, and is therefore inconsistent, fragmented and redundant across the various systems. Master Data Management (MDM) is a concept that creates cross-references between customers, suppliers and business units, and enables corporate hierarchies and structures. The overall goal of MDM is the ability to create an enterprise-wide consistent data model that enables analyzing and reporting customer and supplier data. The goal of the study was to define the properties and success factors of a master data system. The theoretical background was based on the literature, and the case consisted of enterprise-specific needs and demands. The theoretical part presents the concept, background, and principles of MDM and then the phases of a system planning and implementation project. The case part consists of the background, a definition of the as-is situation, the definition of the project, and the evaluation criteria, and concludes with the key results of the thesis. The final chapter, Conclusions, combines common principles with the results of the case. The case part ended up dividing the important factors of the system into success factors, technical requirements and business benefits. To clarify the project and find funding for it, the business benefits have to be defined and their realization has to be monitored. The thesis identified six success factors for the MDM system: a well-defined business case; data management and monitoring; data models and structures defined and maintained; customer and supplier data governance, delivery and quality; commitment; and continuous communication with the business. Technical requirements emerged several times during the thesis and therefore cannot be ignored in the project. The Conclusions chapter goes through these factors on a general level. The success factors and technical requirements are related to the essentials of MDM: governance, action and quality. This chapter could be used as guidance in a master data management project.