44 results for hyperbolic double-complex Laplace operator

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Publisher:

Abstract:

All positive-strand RNA viruses utilize cellular membranes for the assembly of their replication complexes, which results in extensive membrane modification in infected host cells. These alterations act as structural and functional scaffolds for RNA replication, providing protection for the viral double-stranded RNA against host defences. It is known that different positive-strand RNA viruses alter different cellular membranes. However, the origin of the targeted membranes, the mechanisms that direct replication proteins to specific membranes and the steps in the formation of the membrane-bound replication complex are not completely understood. Alphaviruses (including Semliki Forest virus, SFV), members of the family Togaviridae, replicate their RNA in association with membranes derived from the endosomal and lysosomal compartment, inducing membrane invaginations called spherules. Spherule structures have been shown to be the specific sites of RNA synthesis. Four replication proteins, nsP1-nsP4, are translated as a polyprotein (P1234), which is processed autocatalytically and gives rise to a membrane-bound replication complex. Membrane binding is mediated via nsP1, which possesses an amphipathic α-helix (the binding peptide) in the central region of the protein. The aim of this thesis was to characterize the association of the SFV replication complex with cellular membranes and the modification of the membranes during virus infection. Therefore, it was necessary to set up a system for determining which viral components are needed for inducing the spherules. In addition, the targeting of the replication complex, the formation site of the spherules and their intracellular trafficking were studied in detail. The results of the current work demonstrate that mutations in the binding peptide region of nsP1 are lethal for virus replication and change the localization of the polyprotein precursor P123. The replication complex is first targeted to the plasma membrane, where the membrane invaginations, spherules, are induced. Through a specific regulated endocytosis event, the spherules are internalized from the plasma membrane in neutral carrier vesicles and transported in an actin- and microtubule-dependent manner to the pericentriolar area. Homotypic fusions and fusions with pre-existing acidic organelles lead to the maturation of the previously described cytopathic vacuoles with hundreds of spherules on their limiting membranes. This work provides new insights into the membrane binding mechanism of the SFV replication complex and its role in the virus life cycle. The plasmid-driven system developed in this thesis for studying the formation of the replication complex allows various applications addressing different steps in the SFV life cycle and virus-host interactions in the future, and this trans-replication system could be applied to many different viruses. In addition, the current work brings up new aspects of the membranes and cellular components involved in SFV replication, leading to a further understanding of the formation and dynamics of the membrane-associated replication complex.

Relevance:

30.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by a French scientist, P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly; this was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at a meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, still prevailing, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. The essential idea is to draw samples repeatedly from the same population, with the assumption that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
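Laplace's central question, the sample size needed to reach a desired accuracy in estimation, is still answered today with a normal-approximation formula that descends from his Central Limit Theorem argument. The following is a minimal illustrative sketch (not taken from the thesis), assuming simple random sampling and a binomial model for the estimated proportion:

```python
from math import ceil
from statistics import NormalDist

def sample_size(p_guess: float, margin: float, confidence: float = 0.95) -> int:
    """Sample size needed so that an estimated proportion falls within
    +/- margin of the truth with the given confidence, using the
    normal (CLT) approximation to the binomial distribution."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # two-sided critical value
    return ceil(z ** 2 * p_guess * (1 - p_guess) / margin ** 2)

# Worst case p = 0.5, accuracy of one percentage point at 95% confidence:
n = sample_size(0.5, 0.01)  # 9604
```

The worst-case guess p = 0.5 maximizes the variance p(1 - p), which is why it is the conventional default when nothing is known about the population.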

Relevance:

20.00%

Publisher:

Abstract:

The feasibility of different modern analytical techniques for the mass spectrometric detection of anabolic androgenic steroids (AAS) in human urine was examined in order to enhance the prevalent analytics and to find reasonable strategies for effective sports drug testing. A comparative study of the sensitivity and specificity of gas chromatography (GC) combined with low (LRMS) and high resolution mass spectrometry (HRMS) in the screening of AAS was carried out with four metabolites of methandienone. Measurements were done in selected ion monitoring mode, with HRMS using a mass resolution of 5000. With HRMS the detection limits were considerably lower than with LRMS, enabling detection of steroids at levels as low as 0.2-0.5 ng/ml. However, even with HRMS, the biological background hampered the detection of some steroids. The applicability of liquid-phase microextraction (LPME) was studied with metabolites of fluoxymesterone, 4-chlorodehydromethyltestosterone, stanozolol and danazol. Factors affecting the extraction process were studied, and a novel LPME method with in-fiber silylation was developed and validated for GC/MS analysis of the danazol metabolite. The method allowed precise, selective and sensitive analysis of the metabolite and enabled simultaneous filtration, extraction, enrichment and derivatization of the analyte from urine without any other sample preparation steps. Liquid chromatographic/tandem mass spectrometric (LC/MS/MS) methods utilizing electrospray ionization (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) were developed and applied for the detection of oxandrolone and metabolites of stanozolol and 4-chlorodehydromethyltestosterone in urine. All methods exhibited high sensitivity and specificity. ESI showed the best applicability, however, and an LC/ESI-MS/MS method for routine screening of nine 17-alkyl-substituted AAS was thus developed, enabling fast and precise measurement of all analytes with detection limits below 2 ng/ml. The potential of chemometrics to resolve complex GC/MS data was demonstrated with samples prepared for AAS screening. Acquired full-scan spectral data (m/z 40-700) were processed with the OSCAR algorithm (Optimization by Stepwise Constraints of Alternating Regression). The deconvolution process was able to extract from a GC/MS run more than double the number of components compared with the number of visible chromatographic peaks. Severely overlapping components, as well as components hidden in the chromatographic background, could be isolated successfully. All the studied techniques proved to be useful analytical tools for improving the detection of AAS in urine. The superiority of any one procedure is, however, compound-dependent, and the different techniques complement each other.
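The alternating-regression idea underlying curve-resolution methods such as OSCAR can be sketched in a minimal form. The function below is a hypothetical illustration, not the actual OSCAR algorithm: it fits the bilinear model X ≈ C Sᵀ (elution profiles times spectra) by alternating least squares, enforcing nonnegativity by clipping at each step.

```python
import numpy as np

def als_resolve(X, n_components, n_iter=300, seed=0):
    """Resolve a data matrix X (rows: scan times, cols: m/z channels)
    under the bilinear model X ~ C @ S.T by alternating least squares,
    with nonnegativity of profiles and spectra enforced by clipping."""
    rng = np.random.default_rng(seed)
    C = rng.random((X.shape[0], n_components))
    for _ in range(n_iter):
        # Best spectra given profiles, then best profiles given spectra.
        S = np.clip(np.linalg.lstsq(C, X, rcond=None)[0].T, 0.0, None)
        C = np.clip(np.linalg.lstsq(S, X.T, rcond=None)[0].T, 0.0, None)
    return C, S  # elution profiles, component spectra
```

On synthetic data built from two overlapping Gaussian elution peaks with distinct spectra, such a loop recovers the underlying components even where the chromatographic peaks visually merge, which is the effect the abstract describes.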

Relevance:

20.00%

Publisher:

Abstract:

This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detracts from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of a lack of talent. Because it is multiply based on repetition, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and original, thereby undermining the received notion of individual unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works. The pastiches written by Marcel Proust demonstrate how pastiche can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer, and the duo of Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A. S. Byatt's Possession and D. M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.

Relevance:

20.00%

Publisher:

Abstract:

Since 1997 the Finnish Jabal Haroun Project (FJHP) has studied the ruins of the monastery and pilgrimage complex (Gr. oikos) of Aaron, located on a plateau of the Mountain of Prophet Aaron, Jabal an-Nabi Harûn, ca. 5 km south-west of the UNESCO World Heritage site of Petra in Jordan. This M.A. thesis studies the state of conservation and the damaging processes affecting the stone structures of the site. The chapel was chosen as an example, as it represents the phasing and building materials of the entire site. The aim of this work is to act as a preliminary study with regard to the planning of long-term conservation at the site. The research is empirical in nature. The condition of the stones in the chapel walls was mapped using the Illustrated Glossary on Stone Deterioration by the ICOMOS International Scientific Committee for Stone; this glossary combines several standards and systems of damage mapping used in the field. Climatic conditions (temperature and RH %) were monitored for one year (9/2005-8/2006) using a HOBO Microstation datalogger, and the measurements were compared with contemporary measurements from the nearest weather station in Wadi Musa. Salts in the stones were studied by taking samples from the stone surfaces by scraping and with the "Paper Pulp" method (a poultice of wet cellulose fiber, Arbocel BC1000), and by analyzing the main types of salts found in the samples. The climatic conditions on the mountain were expected to change rapidly and to differ clearly from conditions in the neighboring areas. The rapid changes were confirmed, but the values did not differ as much as expected from those nearby: the 12 months monitored had average temperatures and were somewhat drier than average. Earlier research in the area has shown that the geological properties of the stone material influence its deterioration. The damage mapping showed clearly that salts are also a major cause of stone weathering. The salt samples contained several salt combinations, whose behavior in the extremely unstable climatic conditions is difficult to predict. Detailed mapping and regular monitoring, especially of the structures that are going to remain exposed, are recommended in this work.

Relevance:

20.00%

Publisher:

Abstract:

The point of departure in this dissertation was the practical safety problem of unanticipated, unfamiliar events and unexpected changes in the environment: the demanding situations which operators should take care of in complex socio-technical systems. The aim of this thesis was to increase the understanding of demanding situations and of the resources for coping with them by presenting a new construct, a conceptual model called Expert Identity (ExId), as a way to open up new solutions to the problem of demanding situations, and by testing the model in empirical studies on operator work. The premises of the Core-Task Analysis (CTA) framework were adopted as a starting point: core-task oriented working practices promote system efficiency (including safety, productivity and well-being targets) and should be supported. The negative effects of stress were summarised, and possible countermeasures related to the operators' personal resources, such as experience, expertise, sense of control, and conceptions of work and self, were considered. ExId was proposed as a way to bring emotional-energetic depth into work analysis, to supplement CTA-based practical methods for discovering development challenges, and to contribute to the development of complex socio-technical systems. The potential of ExId to promote understanding of operator work was demonstrated in the context of six empirical studies on operator work, each of which had its own practical objectives within its own quite broad focus. The concluding research questions were: 1) Are the assumptions made in ExId on the basis of the different theories and previous studies supported by the empirical findings? 2) Does the ExId construct promote understanding of operator work in empirical studies? 3) What are the strengths and weaknesses of the ExId construct? The layers of ExId and the assumptions about the development of expert identity appeared to be supported by the evidence. The new conceptual model worked as a part of the analysis of different kinds of data, as a part of different methods used for different purposes, and in different work contexts. The results showed that the operators had problems in taking care of the core task, resulting from the discrepancy between demands and resources (either personal or external). The changes of work, the difficulties in reaching the real content of work in the organisation, and the limits of the practical means of support had complicated the problem and limited the possibilities for development actions within the case organisations. Personal resources seemed to be sensitive to the changes; adaptation was taking place, but not deeply or quickly enough. Furthermore, the results showed several characteristics of the studied contexts that complicated the operators' possibilities to grow into or with the demands and to develop practices, expertise and expert identity matching the core task: discontinuity of the work demands; discrepancies between the conceptions of work held in other parts of the organisation, the visions, and the reality faced by the operators; and an emphasis on individual efforts and situational solutions. The discussion considered the potential of ExId to open up new paths to solving the problem of demanding situations and its ability to enable studies on practices in the field. The results were interpreted as promising enough to encourage further studies on ExId. This dissertation especially proposes a contribution to supporting workers in recognising changing demands and their possibilities for growing with them, when aiming to support human performance in complex socio-technical systems, both in designing the systems and in solving existing problems.

Relevance:

20.00%

Publisher:

Abstract:

Failures in industrial organizations dealing with hazardous technologies can have widespread consequences for the safety of the workers and the general population. Psychology can have a major role in contributing to the safe and reliable operation of these technologies. Most current models of safety management in complex sociotechnical systems such as nuclear power plant maintenance are either non-contextual or based on an overly rational image of an organization. Thus, they fail to grasp either the actual requirements of the work or the socially constructed nature of the work in question. The general aim of the present study is to develop and test a methodology for the contextual assessment of organizational culture in complex sociotechnical systems. This is done by demonstrating the findings that the application of the emerging methodology produces in the domain of maintenance of a nuclear power plant (NPP). The concepts of organizational culture and organizational core task (OCT) are operationalized and tested in the case studies. We argue that as the complexity of the work, technology and social environment increases, the significance of the most implicit features of organizational culture as a means of coordinating the work and achieving safety and effectiveness of the activities also increases. For this reason a cultural perspective could provide additional insight into the problem of safety management. The present study aims to determine: (1) the elements of organizational culture in complex sociotechnical systems; (2) the demands the maintenance task sets for the organizational culture; (3) how the current organizational culture at the case organizations supports the perception and fulfilment of the demands of the maintenance work; (4) the similarities and differences between the maintenance cultures at the case organizations; and (5) what the assessment of organizational culture in complex sociotechnical systems requires. Three in-depth case studies were carried out at the maintenance units of three Nordic NPPs. The case studies employed an iterative and multimethod research strategy, using the following methods: interviews, a CULTURE survey, seminars, document analysis and group work. Both cultural analysis and task modelling were carried out. The results indicate that organizational culture in complex sociotechnical systems can be characterised in terms of three qualitatively different elements: structure, internal integration and conceptions. All three of these elements, as well as their interrelations, have to be considered in organizational assessments, or important aspects of the organizational dynamics will be overlooked. On the basis of OCT modelling, the maintenance core task was defined as balancing between three critical demands: anticipating the condition of the plant and conducting preventive maintenance accordingly; reacting to unexpected technical faults; and monitoring and reflecting on the effects of maintenance actions and the condition of the plant. The results indicate that safety was highly valued at all three plants, and in that sense they all had strong safety cultures. In other respects the cultural features were quite different, and thus the culturally accepted means of maintaining high safety also differed. The handicraft nature of maintenance work was emphasised as a source of identity at the NPPs. Overall, the importance of safety was taken for granted, but the cultural norms concerning the appropriate means of guaranteeing it were little reflected upon. A sense of control, personal responsibility and organizational changes emerged as challenging issues at all the plants. The study shows that in complex sociotechnical systems it is both necessary and possible to analyse the safety and effectiveness of the organizational culture. Safety in complex sociotechnical systems cannot be understood or managed without understanding the demands of the organizational core task and managing the dynamics between the three elements of the organizational culture.

Relevance:

20.00%

Publisher:

Abstract:

It has been suggested that semantic information processing is modularized according to the input form (e.g., visual, verbal, non-verbal sound). A great deal of research has concentrated on detecting a separate verbal module. It has also traditionally been assumed in linguistics that the meaning of a single clause is computed before integration into a wider context. Recent research has called these views into question. The present study explored whether it is reasonable to assume separate verbal and nonverbal semantic systems in the light of evidence from event-related potentials (ERPs). The study also provided information on whether the context influences the processing of a single clause before its local meaning is computed. The focus was on an ERP called the N400, whose amplitude is assumed to reflect the effort required to integrate an item into the preceding context. For instance, if a word is anomalous in its context, it will elicit a larger N400. The N400 has been observed in experiments using both verbal and nonverbal stimuli. The contents of a single sentence alone were not hypothesized to influence the N400 amplitude; only the combined contents of the sentence and the picture were. The subjects (n = 17) viewed pictures on a computer screen while hearing sentences through headphones. Their task was to judge the congruency of the picture and the sentence. There were four conditions: 1) the picture and the sentence were congruent and sensible; 2) the sentence and the picture were congruent, but the sentence ended anomalously; 3) the picture and the sentence were incongruent but sensible; 4) the picture and the sentence were incongruent and anomalous. Stimuli from the four conditions were presented in a semi-randomized sequence while the subjects' electroencephalogram (EEG) was recorded, and ERPs were computed for the four conditions. The amplitude of the N400 effect was largest for the incongruent sentence-picture pairs. The anomalously ending sentences did not elicit a larger N400 than the sensible sentences. The results suggest that there is no separate verbal semantic system, and that the meaning of a single clause is not processed independently of the context.
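The ERP computation described above (averaging single-trial epochs per condition and quantifying the N400 as a mean amplitude in a latency window) can be sketched as follows. The function names and the 300-500 ms window are illustrative assumptions, not details from the study:

```python
import numpy as np

def erp_average(epochs, labels):
    """Average single-trial EEG epochs (trials x samples) separately
    for each condition label, yielding one ERP waveform per condition."""
    return {c: epochs[labels == c].mean(axis=0) for c in np.unique(labels)}

def mean_amplitude(erp, times, t_min=0.300, t_max=0.500):
    """Mean amplitude of an ERP in a latency window (in seconds),
    e.g. roughly 300-500 ms post-stimulus for the N400."""
    window = (times >= t_min) & (times <= t_max)
    return erp[window].mean()
```

A larger (more negative) value in the window for one condition than another is then the "larger N400" the abstract refers to; averaging across trials is what lets the small N400 deflection emerge from the background EEG.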

Relevance:

20.00%

Publisher:

Abstract:

The kidney filtration barrier consists of a fenestrated endothelial cell layer, the glomerular basement membrane and the slit diaphragm (SD), the specialized junction between glomerular visceral epithelial cells (podocytes). Podocyte injury is associated with the development of proteinuria and, if not reversed, leads to permanent deterioration of the glomerular filter. The early events are characterized by disruption of the integrity of the SD, but the molecular pathways involved are not fully understood. Congenital nephrotic syndrome of the Finnish type (CNF) is caused by mutations in NPHS1, the gene encoding the SD protein nephrin. Lack of nephrin results in loss of the SD and massive proteinuria beginning before birth. Furthermore, nephrin expression is decreased in acquired human kidney diseases, including diabetic nephropathy. This highlights the importance of nephrin, and consequently of the SD, in regulating the kidney filtration function. However, the precise molecular mechanism by which nephrin is involved in the formation of the SD is unknown. This thesis work aimed at clarifying the role of nephrin and its interaction partners in the formation of the SD. The purpose was to identify novel proteins that associate with nephrin in order to define the essential molecular complex required for the establishment of the SD, and to decipher the role of novel nephrin-interacting proteins in podocytes. Nephrin binds to the nephrin-like proteins Neph1 and Neph2, and to the adherens junction protein P-cadherin; these interactions have been suggested to play a role in the formation of the SD. In this thesis work, we identified densin as a novel interaction partner for nephrin. Densin was localized to the SD, was shown to bind to the adherens junction protein beta-catenin, and behaved in a similar fashion to adherens junction proteins in cell-cell contacts. These results indicate that densin may play a role in cell adhesion and may therefore contribute to the formation of the SD together with nephrin and adherens junction proteins. Nephrin was also shown to bind to Neph3, which has previously been localized to the SD. Neph3 and Neph1 were shown to induce cell adhesion alone, whereas nephrin needed to trans-interact with Neph1 or Neph3 on the opposite cell surface in order to make cell-cell contacts; this was associated with decreased tyrosine phosphorylation of nephrin. These data extend the current knowledge of the molecular composition of the nephrin protein complex at the SD and provide novel insights into how the SD may be formed. This thesis work also showed that densin was up-regulated in the podocytes of CNF patients, and that Neph3 was up-regulated in nephrin-deficient mouse kidneys, which show podocyte alterations and lack of the SD similar to those observed in the podocytes of CNF patients. These data suggest that densin and Neph3 may have a role in the formation of the morphological alterations detected in the podocytes of CNF patients. Furthermore, this thesis work showed that deletion of beta-catenin specifically from adult mouse podocytes protected the mice from the development of adriamycin-induced podocyte injury and proteinuria compared to wild-type mice, showing that beta-catenin plays a role in adriamycin-induced podocyte injury. Podocyte injury is a hallmark of many kidney diseases, and the changes observed in the podocytes of CNF patients share characteristics with the injured podocytes observed in chronic kidney diseases. Therefore, the results obtained in this thesis work suggest that densin, Neph3 and beta-catenin participate in the molecular pathways that result in the morphological alterations commonly detected in injured podocytes in kidney diseases.

Relevance:

20.00%

Publisher:

Abstract:

Various suggestions exist for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic suggestion, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. This thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique: the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, the other, more complicated fusion subalgebras were not considered. Studying their applicability to quantum computation could be a topic of further research.
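A non-trivial fusion algebra of the kind described can be made concrete with the simplest standard example, the Fibonacci anyon model (used here purely for illustration; the thesis works with the quantum double of S_3, whose particle spectrum is larger). The rule τ × τ = 1 + τ shows the non-unique "addition of quantum numbers", and the consistency condition that (a × b) × c and a × (b × c) yield the same outcomes can be checked directly from the fusion multiplicities:

```python
from itertools import product

# Fusion multiplicities N(a, b; c) for the Fibonacci anyon model.
# Labels: 0 = vacuum, 1 = tau, with the non-trivial rule tau x tau = 1 + tau.
FUSION = {
    (0, 0): {0: 1},
    (0, 1): {1: 1},
    (1, 0): {1: 1},
    (1, 1): {0: 1, 1: 1},
}

def fuse(a, b):
    """Possible fusion outcomes of a x b with their multiplicities."""
    return FUSION[(a, b)]

def associative(labels):
    """Check sum_e N(a,b;e) N(e,c;d) == sum_f N(b,c;f) N(a,f;d) for all
    labels, i.e. that (a x b) x c and a x (b x c) give the same result."""
    for a, b, c, d in product(labels, repeat=4):
        lhs = sum(fuse(a, b).get(e, 0) * fuse(e, c).get(d, 0) for e in labels)
        rhs = sum(fuse(b, c).get(f, 0) * fuse(a, f).get(d, 0) for f in labels)
        if lhs != rhs:
            return False
    return True
```

The multiplicity of fusion outcomes is what provides the degenerate, nonlocally stored state space in which protected quantum information can be encoded.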

Relevance:

20.00%

Publisher:

Abstract:

This thesis studies the intermolecular interactions in (i) boron-nitrogen based systems for hydrogen splitting and storage, (ii) endohedral complexes, A@C60, and (iii) aurophilic dimers. We first present an introduction to intermolecular interactions; the theoretical background is then described, and the research results are summarized in the following sections. In the boron-nitrogen systems, the electrostatic interaction is found to be the leading contribution, as 'Coulomb Pays for Heitler and London' (CHL). For the endohedral complex, the intermolecular interaction is formulated by a one-center expansion of the Coulomb operator 1/r_ab. For the aurophilic attraction between two C2v monomers, a London-type formula was derived, fully accounting for the anisotropy and point-group symmetry of the monomers.
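For orientation, the standard isotropic London dispersion formula, of which a derivation resolving the anisotropy and point-group symmetry of the monomers is a generalization, has the familiar textbook form (this is the generic result, not the formula derived in the thesis):

```latex
% Isotropic London dispersion energy between monomers A and B:
E_{\mathrm{disp}} \approx -\frac{3}{2}\,
  \frac{I_A I_B}{I_A + I_B}\,
  \frac{\alpha_A \alpha_B}{r^6}
```

where I_A and I_B are the first ionization energies, α_A and α_B the static dipole polarizabilities, and r the intermonomer distance; an anisotropic treatment additionally makes the polarizabilities orientation-dependent.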

Relevance:

20.00%

Publisher:

Abstract:

Aminopolycarboxylates, such as ethylenediaminetetraacetic acid (EDTA), have been used for several decades as chelating agents in numerous applications, both in analytical chemistry and in many branches of industry, because of their excellent ability to bind metal ions. The poor biodegradability of these compounds has, however, raised concern recently, as they have been found to be very persistent in nature. This work is part of a larger research project whose goal is to find replacement chelating agents for EDTA. The subject of the study is a survey of the metal-ion binding capability of six chelating agents. Being more readily biodegradable than EDTA, these are environmentally friendly candidate replacement chelating agents for numerous applications. Their complex formation with several metal ions was studied by potentiometric titration. The selection of metal ions varied somewhat depending on the chelating agent, covering magnesium, calcium, manganese, iron, copper, zinc, cadmium, mercury, lead and lanthanum ions. The metals studied were chosen on the basis of the intended applications, problems encountered in synthesis, or environmental considerations. The results show that the metal-binding capability of these compounds is somewhat weaker than that of EDTA, but nevertheless sufficient for numerous applications, such as the pulp bleaching process. For the toxic heavy metals cadmium, mercury and lead, binding weaker than EDTA's is actually an advantage, since, combined with better biodegradability, it may reduce the ability of the studied compounds to mobilize these metals from sediments. From an environmental point of view, most of the studied compounds also have the advantage of a lower nitrogen content than EDTA.