914 results for Using Lean tools


Relevance: 30.00%

Abstract:

Case-based reasoning (CBR) is an approach to problem solving and learning that has received considerable attention in recent years. In this work, the CBR methodology is used to reduce the time and amount of resources spent on carrying out experiments to determine the viscosity of a new slurry. The aims of this work are: to develop a CBR system to support the decision-making process concerning the type of slurry behavior, to collect a sufficient volume of qualitative data for the case base, and to calculate the viscosity of Newtonian slurries. First, a literature review of the types of fluid flow and of Newtonian and non-Newtonian slurries is presented. Some physical properties of the suspensions are also considered. The second part of the literature review provides an overview of the case-based reasoning field. Different models and stages of the CBR cycle, and the benefits and disadvantages of the methodology, are then considered. A brief review of CBR tools is also given in this work. Finally, some results of the work and opportunities for system modernization are presented. To develop the decision support system for slurry viscosity determination, MS Office Excel was used. The designed system consists of three parts: the workspace, the case base, and a section for calculating the viscosity of Newtonian slurries. The first and second sections are intended for working with Newtonian and Bingham fluids. In the last section, the apparent viscosity can be calculated for Newtonian slurries.
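
As a minimal illustration of the quantities the calculation section works with, the standard constitutive relations for Newtonian and Bingham fluids can be written out as follows (an illustrative sketch, not the thesis' Excel tool; the function names are mine):

    def newtonian_apparent_viscosity(tau, gamma_dot):
        """Apparent viscosity [Pa*s] of a Newtonian slurry: tau = mu * gamma_dot."""
        return tau / gamma_dot

    def bingham_shear_stress(tau_0, mu_p, gamma_dot):
        """Shear stress [Pa] of a Bingham fluid: yield stress plus plastic-viscosity term."""
        return tau_0 + mu_p * gamma_dot

    # Example: a slurry showing 12 Pa of shear stress at a shear rate of 100 1/s
    print(newtonian_apparent_viscosity(12.0, 100.0))  # 0.12 Pa*s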

Relevance: 30.00%

Abstract:

Hydrological models are important tools that have been used in water resource planning and management. Thus, the aim of this work was to calibrate and validate, at a daily time scale, the SWAT model (Soil and Water Assessment Tool) for the watershed of the Galo creek, located in Espírito Santo State. To conduct the study, georeferenced maps of relief, soil type and land use were used, in addition to historical daily time series of basin climate and flow. Time series corresponding to the periods Jan 1, 1995 to Dec 31, 2000 and Jan 1, 2001 to Dec 20, 2003 were used for calibration and validation, respectively. Model performance was evaluated using the Nash-Sutcliffe coefficient (ENS) and the percentage of bias (PBIAS). SWAT was also evaluated in the simulation of the following hydrological variables: maximum and minimum annual daily flows and the minimum reference flows Q90 and Q95, based on the mean absolute error. ENS and PBIAS were 0.65 and 7.2% for calibration and 0.70 and 14.1% for validation, respectively, indicating a satisfactory performance of the model. SWAT adequately simulated the minimum annual daily flow and the reference flows Q90 and Q95; it was not suitable for the simulation of maximum annual daily flows.
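
For reference, the two performance statistics cited above can be computed with their standard definitions; a minimal sketch assuming the usual formulations of ENS and PBIAS (note that the sign convention for PBIAS varies between authors), with made-up flow values:

    def nash_sutcliffe(obs, sim):
        """ENS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
        mean_obs = sum(obs) / len(obs)
        num = sum((o - s) ** 2 for o, s in zip(obs, sim))
        den = sum((o - mean_obs) ** 2 for o in obs)
        return 1.0 - num / den

    def pbias(obs, sim):
        """Percentage of bias; positive values indicate underestimation in this convention."""
        return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

    obs = [10.0, 12.0, 8.0, 15.0, 11.0]   # observed daily flows (m3/s), illustrative only
    sim = [9.5, 12.5, 7.0, 14.0, 10.5]    # simulated daily flows
    print(nash_sutcliffe(obs, sim), pbias(obs, sim))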

Relevance: 30.00%

Abstract:

Prostate-specific antigen (PSA) is a marker that is commonly used in estimating prostate cancer risk. Prostate cancer is usually a slowly progressing disease, which might not cause any symptoms whatsoever. Nevertheless, some cases of cancer are aggressive and need to be treated before they become life-threatening. However, the blood PSA concentration may also rise in benign prostate diseases, and using a single total PSA (tPSA) measurement to guide the decision on further examinations leads to many unnecessary biopsies, over-detection, and overtreatment of indolent cancers which would not require treatment. Therefore, there is a need for markers that would better separate cancer from benign disorders and would also predict cancer aggressiveness. The aim of this study was to evaluate whether intact and nicked forms of free PSA (fPSA-I and fPSA-N) or human kallikrein-related peptidase 2 (hK2) could serve as new tools in estimating prostate cancer risk. First, the immunoassays for fPSA-I and free and total hK2 were optimized so that they would be less prone to interference caused by factors present in some blood samples. The optimized assays were shown to work well and were used to study the marker concentrations in the clinical sample panels. The marker levels were measured from preoperative blood samples of prostate cancer patients scheduled for radical prostatectomy. The association of the markers with the cancer stage and grade was studied. It was found that, among all tested markers and their combinations, especially the ratio of fPSA-N to tPSA and the ratio of free PSA (fPSA) to tPSA were associated with both cancer stage and grade. They might be useful in predicting cancer aggressiveness, but further follow-up studies are necessary to fully evaluate the significance of the markers in this clinical setting. The markers tPSA, fPSA, fPSA-I and hK2 were combined in a statistical model which was previously shown to be able to reduce unnecessary biopsies when applied to large screening cohorts of men with elevated tPSA. The discriminative accuracy of this model was compared to models based on established clinical predictors in reference to biopsy outcome. The kallikrein model and the calculated fPSA-N concentrations (fPSA minus fPSA-I) correlated with the prostate volume, and the model, when compared to the clinical models, predicted prostate cancer in biopsy equally well. Hence, the measurement of kallikreins in a blood sample could be used to replace the volume measurement, which is time-consuming, requires instrumentation and skilled personnel, and is an uncomfortable procedure. Overall, the model could simplify the estimation of prostate cancer risk. Finally, as fPSA-N seems to be an interesting new marker, a direct immunoassay for measuring fPSA-N concentrations was developed. The analytical performance was acceptable, but the rather complicated assay protocol needs to be improved before it can be used for measuring large sample panels.
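
The derived quantities mentioned above are simple arithmetic on the measured concentrations; a hypothetical sketch (names and figures mine, not the statistical model evaluated in the thesis):

    def fpsa_n(fpsa, fpsa_i):
        """Nicked free PSA estimated as free PSA minus intact free PSA (same units, e.g. ng/mL)."""
        return fpsa - fpsa_i

    def marker_ratio(marker, tpsa):
        """Ratio of a marker concentration to total PSA."""
        return marker / tpsa

    # Example: fPSA = 1.2 ng/mL, fPSA-I = 0.7 ng/mL, tPSA = 6.0 ng/mL
    print(marker_ratio(fpsa_n(1.2, 0.7), 6.0))  # fPSA-N / tPSA, about 0.083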

Relevance: 30.00%

Abstract:

The main objective of this study is to create a development plan for Lindab Oy's Kyyjärvi production plant, building on the Lean philosophy, traditional performance measurement and its sub-areas. From the perspective of Lean theory, the study identifies the most important development targets and the development actions with the greatest impact on them. The beginning of the work examines different production modes and their competitiveness, and also justifies why the Lean philosophy is chosen as the basis for the development work. Over the course of the report, Lean thinking is opened up to the reader in more detail and its contents are described. Elements of the Lean philosophy are selected that can later be used to form a development model for the operating environment under development. After the review of the Lean philosophy, the operating environment is studied and an understanding is formed of the performance areas that are to be developed. At this point, suitable Lean metrics are also identified and the cause-and-effect relationships with the selected performance areas are verified. The Lean metrics also underlie the self-assessment questions used to survey the performance of the operating environment. The results of the performance measurement thus serve as the final drivers of the development task.

Relevance: 30.00%

Abstract:

Background: Approximately two percent of Finns have sequelae after traumatic brain injury (TBI), and many TBI patients are young or middle-aged. The high rate of unemployment after TBI has major economic consequences for society, and traumatic brain injury often has considerable personal consequences as well. Structural imaging is often needed to support the clinical TBI diagnosis. Accurate early diagnosis is essential for successful rehabilitation and, thus, may also influence the patient's outcome. Traumatic axonal injury and cortical contusions constitute the majority of traumatic brain lesions. Several studies have shown magnetic resonance imaging (MRI) to be superior to computed tomography (CT) in the detection of these lesions. However, traumatic brain injury often leads to persistent symptoms even in cases with few or no findings on conventional MRI. Aims and methods: The aim of this prospective study was to clarify the role of conventional MRI in the imaging of traumatic brain injury, and to investigate how the radiologic diagnostics of TBI could be improved by using more modern diffusion-weighted imaging (DWI) and diffusion tensor imaging (DTI) techniques. We estimated, in a longitudinal study, the visibility of contusions and other intraparenchymal lesions on conventional MRI at one week and one year after TBI. We used DWI-based measurements to look for changes in the diffusivity of the normal-appearing brain in a case-control study. DTI-based tractography was used in a case-control study to evaluate changes in the volume, diffusivity, and anisotropy of the long association tracts in symptomatic TBI patients with no visible signs of intracranial or intraparenchymal abnormalities on routine MRI. We further studied the reproducibility of different tools for identifying and measuring white-matter tracts by using a DTI sequence suitable for clinical protocols. Results: Both the number and the extent of visible traumatic lesions on conventional MRI diminished significantly with time. Slightly increased diffusion in the normal-appearing brain was a common finding at one week after TBI, but it was not significantly associated with injury severity. Fractional anisotropy values, which represent the integrity of the white-matter tracts, were significantly diminished in several tracts in TBI patients compared to the control subjects. Compared to the cross-sectional ROI method, the tract-based analyses showed better reproducibility in identifying and measuring the white-matter tracts of interest by means of DTI tractography. Conclusions: As conventional MRI is still applied in clinical practice, it should be carried out soon after the injury, at least in symptomatic patients with a negative CT scan. DWI-based brain diffusivity measurements may be used to improve the documentation of TBI. DTI tractography can be used to improve radiologic diagnostics in the symptomatic TBI sub-population with no findings on conventional MRI. The reproducibility of different tools for quantifying fibre tracts varies considerably, which should be taken into consideration in clinical DTI applications.
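
The fractional anisotropy metric referred to above has a standard closed-form definition based on the diffusion tensor eigenvalues; a minimal sketch (illustrative only, not code from the study):

    import math

    def fractional_anisotropy(l1, l2, l3):
        """FA from the three eigenvalues of the diffusion tensor (standard definition)."""
        md = (l1 + l2 + l3) / 3.0                       # mean diffusivity
        num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
        den = l1 ** 2 + l2 ** 2 + l3 ** 2
        return math.sqrt(1.5 * num / den) if den > 0 else 0.0

    # Coherent white matter vs. nearly isotropic diffusion (eigenvalues in mm^2/s)
    print(fractional_anisotropy(1.7e-3, 0.3e-3, 0.3e-3))  # high FA, about 0.80
    print(fractional_anisotropy(0.8e-3, 0.7e-3, 0.7e-3))  # low FA, about 0.08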

Relevance: 30.00%

Abstract:

The objective of this work is to create a model or tool for managing the costs of precious metal loss at Ecocat Oy at the product or product family level. A further sub-objective is to highlight observations and practices to support loss management. The theoretical part of the work reviews process-related literature, focusing on process description, definition and development as well as the fundamentals of reducing variation. The theoretical part also contains a short description of Lean fundamentals and tools, intended to provide a basis for the loss minimization work. The final theory chapter focuses on precious metals as raw materials: their procurement, market characteristics and cost effects. The introduction to the empirical part of the work is the presentation, in Chapter 4, of catalytic converters and the precious metals used in them. Chapters 5-7 contain the empirical part of the work, in which the cost management tool defined in the objectives is created. As a result of the work, an Excel calculation tool was created on the basis of the five products selected for the study; with it, the share of precious metal loss during production can be calculated for products of given specifications. In addition, in line with the sub-objective, the results examine development targets related to, among other things, measurement processes and work execution.
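
A minimal sketch of the kind of calculation such a loss tool performs, under my own simplifying assumption that the loss is the difference between the precious metal charged into production and the amount ending up in the finished product (names and figures hypothetical):

    def pm_loss_share(mass_charged_g, mass_in_product_g):
        """Share of the charged precious metal lost during production."""
        return (mass_charged_g - mass_in_product_g) / mass_charged_g

    def pm_loss_cost(mass_charged_g, mass_in_product_g, price_per_g):
        """Cost of the precious metal loss for one production batch or piece."""
        return (mass_charged_g - mass_in_product_g) * price_per_g

    # Example: 2.00 g of platinum charged, 1.94 g ends up in the product, 30 EUR/g
    print(pm_loss_share(2.00, 1.94))        # about 0.03 (3 % loss)
    print(pm_loss_cost(2.00, 1.94, 30.0))   # about 1.80 EUR per piece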

Relevance: 30.00%

Abstract:

The requirements set by the market for electrical machines are becoming increasingly demanding, requiring more sophisticated technological solutions. Companies producing electrical machines are challenged to develop machines that provide a competitive edge for the customer, for example through increased efficiency, reliability or some customer-specific special requirement. The objective of this thesis is to derive a proposal for the first steps in transforming the electrical machine product development process of a manufacturing company towards lean product development. The current product development process in the company is presented together with the processes of four other companies interviewed for the thesis. On the basis of the current processes of the electrical machine industry and the related literature, a generalized electrical machine product development process is derived. The management philosophies ('isms') and tools utilized by the companies are analyzed. Lean Pull-Event reviews, Oobeya management and knowledge-based product development are suggested as the initial steps for implementing the lean product development paradigm in the manufacturing company. Proposals for refining the current product development process and increasing stakeholder involvement in the development projects are made. Lean product development is finding its way into the Finnish electrical machine industry, but the results will be available only after the methods have been implemented and adopted by the companies. There is some enthusiasm about the benefits of the lean approach, and if executed successfully it will provide a competitive edge for the Finnish electrical machine industry.

Relevance: 30.00%

Abstract:

This master's thesis examines the quality management of a welding network and the various problem areas that arise in it. In addition, the thesis examines the application of three quality tools, Lean, Six Sigma and Total Welding Management, to the quality management of a welding network. The theoretical part deals with quality and quality management both in general and with respect to welding, as well as with the above-mentioned quality tools. For the empirical part, information on welding networks was collected from a total of three different networks. Based on this collected information, the suitability of the selected quality tools for network use was examined. Networked operation poses many new challenges for companies' quality management compared to individual welding companies. The biggest problem areas identified in the study are various shortcomings in the cooperation between design and manufacturing, issues related to the quality level and its assurance, and the so-called tacit knowledge that accumulates within the network and its loss. Based on the analyses of the study, it was found that all three quality tools selected for the study can also be applied in networked operation, but this requires a considerably larger work effort than application in a single company. Simultaneous use of all three tools is also possible. Selecting exactly the right tool for each welding network requires careful familiarization with the network and its situation.

Relevance: 30.00%

Abstract:

Products developed in industry, institutes and research centers are expected to have a high level of quality and performance with minimum waste, which requires efficient and robust tools to numerically simulate stringent project conditions with great reliability. In this context, Computational Fluid Dynamics (CFD) plays an important role, and the present work shows two numerical algorithms that are used in the CFD community to solve the Euler and Navier-Stokes equations applied to typical aerospace and aeronautical problems. In particular, unstructured discretization of the spatial domain has gained special attention from the international community due to its ease of discretizing complex spatial domains. The main objective of this work is to illustrate some advantages and disadvantages of numerical algorithms using structured and unstructured spatial discretization of the flow governing equations. The numerical methods are based on a finite volume formulation, and the Euler and Navier-Stokes equations are applied to solve a transonic nozzle problem, a low supersonic airfoil problem and a hypersonic inlet problem. In a structured context, these problems are solved using MacCormack's implicit algorithm with Steger and Warming's flux vector splitting technique, while, in an unstructured context, Jameson and Mavriplis' explicit algorithm is used. Convergence acceleration is obtained using a spatially variable time stepping procedure.
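
One of the named techniques, the spatially variable time stepping used for convergence acceleration, can be sketched as follows (an illustrative sketch under the usual CFL-based formulation, not the authors' code):

    def local_time_steps(dx, u, a, cfl=0.8):
        """Per-cell time steps limited by a CFL condition on the local wave speed |u| + a."""
        return [cfl * dxi / (abs(ui) + ai) for dxi, ui, ai in zip(dx, u, a)]

    # Example: three cells of different size in a transonic nozzle-like flow
    # (cell size dx in m, velocity u and speed of sound a in m/s)
    print(local_time_steps([0.010, 0.005, 0.020], [50.0, 300.0, 600.0], [340.0, 330.0, 310.0]))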

Relevance: 30.00%

Abstract:

Few people see both the opportunities and the threats coming from IT legacy in the current world. On one hand, effective legacy management can bring substantial hard savings and a smooth transition to the desired future state. On the other hand, its mismanagement contributes to serious operational business risks, as old systems are not as reliable as required by the business users. This thesis offers one perspective on dealing with IT legacy: through effective contract management, as a component of achieving Procurement Excellence in IT, thus bridging IT delivery departments, IT procurement, business units, and suppliers. It developed a model for assessing the impact of improvements on the contract management process, together with a set of tools and advice regarding analysis and improvement actions. The thesis conducted a case study to present and justify the implementation of Lean Six Sigma in an IT legacy contract management environment. Lean Six Sigma proved to be successful, and this thesis presents and discusses all the steps necessary, and the pitfalls to avoid, to achieve breakthrough improvement in IT contract management process performance. For the IT legacy contract management process, two improvements require special attention and can be easily copied to any organization. The first is the issue of diluted contract ownership, which stops all improvements, as people do not know who is responsible for performing the actions. The second is the contract management performance evaluation tool, which can be used for monitoring and for identifying outlying contracts and opportunities for improvement in the process. The study resulted in valuable insight into the benefits of applying Lean Six Sigma to improve IT legacy contract management, as well as into how Lean Six Sigma can be applied in an IT environment. Managerial implications are discussed. It is concluded that the use of the data-driven Lean Six Sigma methodology for improving the existing IT contract management processes is a significant addition to the existing best practices in contract management.
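
As a purely hypothetical illustration of the second improvement, outlying contracts could be flagged from a performance score (for example SLA compliance) with a simple z-score rule; this is my sketch, not the evaluation tool developed in the thesis:

    from statistics import mean, stdev

    def flag_outlying_contracts(scores, threshold=2.0):
        """Return contract ids whose score deviates more than `threshold` standard deviations from the mean."""
        values = list(scores.values())
        mu, sigma = mean(values), stdev(values)
        return [cid for cid, s in scores.items() if sigma > 0 and abs(s - mu) / sigma > threshold]

    scores = {"C-001": 0.95, "C-002": 0.97, "C-003": 0.40, "C-004": 0.93, "C-005": 0.91, "C-006": 0.96}
    print(flag_outlying_contracts(scores))  # ['C-003']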

Relevance: 30.00%

Abstract:

Enabling Change in Universities: Enhancing Education for Sustainable Development with Tools for Quality Assurance
This thesis deals with enabling change in universities, more explicitly with enhancing education for sustainable development using tools for quality assurance. Change management is a discipline within management that was developed in the 1980s because business changed from being predictable to unpredictable. The PEST mnemonic is a method for categorizing factors enabling change, such as political, economic, socio-cultural and technological factors, all of which affect higher education. Classifying a change as either hard or soft can help in understanding the type of change that an organization is facing. Hard changes apply to problems that have clear objectives and indicators and a known cause. Soft changes apply to larger problems that affect the entire organization or extend beyond it. The basic definition of sustainable development is that future generations should have opportunities similar to those of previous generations. The UN set as a global goal the integration of education for sustainable development (ESD) at all levels of education during 2005-2014. The goal also applies to universities, whose graduates are future leaders in all labor markets. The objective of ESD in higher education is that graduates obtain the competence to take economic, social and environmental costs and benefits into account when making decisions. Knowledge outcomes should aim for systematic and holistic thinking, which requires cross-disciplinary education. So far, the development of ESD has not achieved its goals. The UN has identified a need for more transdisciplinary research in ESD. A joint global requirement for universities is quality assurance, the aim of which is to secure and improve teaching and learning. Quality, environmental and integrated management systems are used by some universities to fulfil the quality assurance requirements. The goal of this thesis is to open up new ways of enhancing ESD in universities, beyond the forerunners, by exploring how management systems could be used as tools for promoting ESD. The thesis is based on five studies. In the first study, I focus on whether and how tools for quality assurance could be drawn on for promoting ESD. It is written from a new perspective, the memetic, in order to reach a diversity of faculty. A meme is an idea that diffuses from brain to brain; memetics applies Darwin's evolutionary theory to cultural evolution in the social sciences. In the second Paper, I present the results from the development of the pilot process model for enhancing ESD with management systems. The development of the model is based on a study that includes earlier studies, a survey in academia and an analysis of the practice in 11 universities in the Nordic countries. In the third study, I explore whether the change depends on national culture or whether it is global. It is a comparative study at both the policy and the implementation level, between the Nordic countries and China. The fourth study is a single case study based on change management. In this study, I identify what to consider in order to enable the change: enhancing ESD with tools for quality assurance in universities. In the fifth Paper, I present the results of the process model for enhancing ESD with management systems. The model was compared with identified drivers and barriers for enhancing ESD and for implementing management systems.
Finally, the process model was piloted and applied for identifying sustainability aspects in curricula. Action research was chosen as the methodology because there are no already implemented approaches that use quality management for promoting ESD, so the only way to study this is to make it happen. Another reason for choosing action research is that it is essential to involve students and faculty in enhancing ESD. Action-based research consists of the following phases: a) diagnosing, b) planning action, c) taking action and d) evaluating action. This research was made possible by a project called Education for Sustainable Development in Academia in the Nordic countries (ESDAN), in which activities were divided into these four phases. Each phase ended with an open seminar, where the results of the study were presented. The objective of the research project was to develop a process for including knowledge of sustainable development in curricula, which could be used in quality assurance work. Eleven universities from the Nordic countries cooperated in the project. The aim was, by applying the process, to identify and publish examples of relevant sustainability aspects in different degree programs in universities in the Nordic countries. The project was financed partly by the Nordic Council of Ministers and partly by the participating pilot universities. Based on the results of my studies, I consider that quality, environmental and integrated management systems can be used for promoting ESD in universities. Relevant sustainability aspects have been identified in different fields of study by applying the final process model. The final process model was compared with drivers and barriers for enhancing ESD and for implementing management systems in universities, and with factors for succeeding with management systems in industry. It corresponds with these, meaning that drivers are taken into account and barriers tackled. Both ESD and management systems in universities could be considered successful memes, which may reflect an effective way of communicating among individuals. I have identified that management systems could be used as tools for hard changes and to support the soft change of enhancing ESD in universities. Based on the change management study, I have summarized recommendations on what to consider in order to enable the studied change. The main practical implication of the results is that the process model, when applied, can be used for assessment, benchmarking and communication of ESD connected to quality assurance. This is possible because the information can be assembled in one picture, which facilitates comparison. The memetic approach can be applied for structuring. It is worthwhile to make comparative studies between cultures in order to gain insight into the special characteristics of one's own culture. Action-based research is suitable for involving faculty. Change management can be applied to planning a change, and both enhancing ESD and developing management systems are identified as such changes.

Relevance: 30.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 30.00%

Abstract:

The main objective of this master's thesis is to examine whether Weibull analysis is a suitable method for warranty forecasting in the Case Company. The Case Company has used Reliasoft's Weibull++ software, which is based on the Weibull method, but the Company has noticed that the analysis has not given correct results. This study was conducted by making Weibull simulations in different profit centers of the Case Company and then comparing actual and forecasted costs. Simulations were made using different time frames and two methods for determining future deliveries. The first sub-objective is to examine which simulation parameters give the best result for each profit center. The second sub-objective of this study is to create a simple control model for following forecasted costs and actual realized costs. The third sub-objective is to document all Qlikview parameters of the profit centers. This study is constructive research, and solutions to the company's problems are worked out in this master's thesis. The theory part introduces quality issues, for example what quality is, quality costing and the cost of poor quality. Quality is one of the major aspects in the Case Company, so understanding the link between quality and warranty forecasting is important. Warranty management and other tools for warranty forecasting were also introduced, as were the Weibull method, its mathematical properties and reliability engineering. The main result of this master's thesis is that the Weibull analysis forecasted too high costs when calculating the provision. Although some forecasted values of profit centers were lower than the actual values, the method works better for planning purposes. One of the reasons is that quality improvement, or alternatively quality deterioration, does not show in the results of the analysis in the short run. The other reason for the too high values is that the products of the Case Company are complex and the analyses were made at the profit center level. The Weibull method was developed for standard products, but the products of the Case Company consist of many complex components. According to the theory, the method was developed for homogeneous data, so the most important observation is that the analysis should be made at the product level, not the profit center level, where the data are more homogeneous.
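
As background for the forecasting approach described above, the expected number of warranty claims under a two-parameter Weibull model can be sketched as follows (an illustrative sketch with hypothetical parameters, not the Weibull++ implementation):

    import math

    def weibull_cdf(t, beta, eta):
        """Probability that a unit fails by age t: F(t) = 1 - exp(-(t/eta)^beta)."""
        return 1.0 - math.exp(-((t / eta) ** beta))

    def expected_warranty_claims(units, warranty_months, beta, eta):
        """Expected number of claims within the warranty period for a delivered population."""
        return units * weibull_cdf(warranty_months, beta, eta)

    # Example: 500 delivered units, 24-month warranty, shape beta = 1.2, scale eta = 180 months
    print(expected_warranty_claims(500, 24.0, 1.2, 180.0))  # about 43 expected claims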

Relevance: 30.00%

Abstract:

Macroalgae are the main primary producers of temperate rocky shores, providing a three-dimensional habitat, food and nursery grounds for many other species. During the past decades, the state of the coastal waters has deteriorated due to increasing human pressures, resulting in dramatic changes in coastal ecosystems, including macroalgal communities. To reverse the deterioration of the European seas, the EU has adopted the Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD), aiming at an improved status of the coastal waters and the marine environment. Further, the Habitats Directive (HD) calls for the protection of important habitats and species (many of which are marine), and the Maritime Spatial Planning Directive calls for sustainability in the use of resources and human activities at sea and by the coasts. To efficiently protect important marine habitats and communities, we need knowledge of their spatial distribution. Ecological knowledge is also needed to assess the status of marine areas using biological indicators, as required by the WFD and the MSFD; knowledge of how biota changes with human-induced pressures is essential, but to reliably assess change, we also need to know how biotic communities vary over natural environmental gradients. This is especially important in sea areas such as the Baltic Sea, where the natural environmental gradients create substantial differences in biota between areas. In this thesis, I studied the variation occurring in macroalgal communities across the environmental gradients of the northern Baltic Sea, including eutrophication-induced changes. The aim was to produce knowledge to support the reliable use of macroalgae as indicators of the ecological status of marine areas and to test practical metrics that could potentially be used in status assessments. Further, the aim was to develop a methodology for mapping the HD Annex I habitat reefs, using the best available data on geology and bathymetry. The results showed that the large-scale variation in the macroalgal community composition of the northern Baltic Sea is largely driven by salinity and exposure. Exposure is important also on smaller spatial scales, affecting species occurrence, community structure and the depth penetration of algae. Consequently, the natural variability complicates the use of macroalgae as indicators of human-induced changes. Of the studied indicators, the number of perennial algal species, the perennial cover, the fraction of annual algae, and the lower limit of occurrence of red and brown perennial algae showed potential as usable indicators of ecological status. However, the cumulated cover of algae, commonly used as an indicator in fully marine environments, responded only weakly to eutrophication in the area. Although the mere occurrence of perennial algae did not show clear indicator potential, a distinct discrepancy in the occurrence of bladderwrack, Fucus vesiculosus, was found between two areas with differing eutrophication histories, the Bothnian Sea and the Archipelago Sea. The absence of Fucus from many potential sites in the outer Archipelago Sea is likely due to its inability to recover from its disappearance from the area 30-40 years ago, highlighting the importance of past events in macroalgal occurrence.
The methodology presented for mapping the potential distribution and the ecological value of reefs showed that relatively high mapping accuracy can be achieved by combining existing available data, and the maps produced serve as valuable background information for more detailed surveys. Taken together, the results of the thesis contribute significantly to the knowledge on the macroalgal communities of the northern Baltic Sea and can be directly applied in various management contexts.

Relevance: 30.00%

Abstract:

Cardiac troponin (cTn) I and T are the recommended biomarkers for the diagnosis and risk stratification of patients with suspected acute coronary syndrome (ACS), a major cause of cardiovascular death and disability worldwide. It has recently been demonstrated that cTn-specific autoantibodies (cTnAAb) can negatively interfere with cTnI detection by immunoassays to the extent that cTnAAb-positive patients may be falsely designated as cTnI-negative. The aim of this thesis was to develop and optimize immunoassays for the detection of both cTnI and cTnAAb, which would eventually enable exploring the clinical impact of these autoantibodies on cTnI testing and subsequent patient management. The extent of cTnAAb interference in different cTnI assay configurations and the molecular characteristics of cTnAAbs were investigated in publications I and II, respectively. The findings showed that the cTnI midfragment-targeting immunoassays used predominantly in clinical practice are affected by cTnAAb interference, which can be circumvented by using a novel 3+1-type assay design with three capture antibodies against the N-terminus, midfragment and C-terminus and one tracer antibody against the C-terminus. The use of this assay configuration was further supported by the epitope specificity study, which showed that although the midfragment is most commonly targeted by cTnAAbs, the interference essentially encompasses the whole molecule, and there may be considerable individual variation in the affected sites. In publications III and IV, all the data obtained in the previous studies were utilized to develop an improved version of an existing cTnAAb assay and a sensitive cTnI assay free of this specific analytical interference. The results of the thesis showed that approximately one in ten patients with suspected ACS have detectable amounts of cTnAAbs in their circulation and that cTnAAbs can inhibit cTnI determination when targeted against the binding sites of the assay antibodies used in its immunological detection. In the light of these observations, the risk of clinical misclassification caused by the presence of cTnAAbs remains a valid and reasonable concern. Because the titers, affinities and epitope specificities of cTnAAbs and the concentration of endogenous cTnI determine the final effect of circulating cTnAAbs, appropriately sized studies on their clinical significance are warranted. The new cTnI and cTnAAb assays could serve as analytical tools for establishing the impact of cTnAAbs on cTnI testing and also for unraveling the etiology of cTn-related autoimmune responses.