Abstract:
Recent epidemiological studies have shown a consistent association of the mass concentration of urban air thoracic (PM10) and fine (PM2.5) particles with mortality and morbidity among cardiorespiratory patients. However, the chemical characteristics of different particulate size ranges and the biological mechanisms responsible for these adverse health effects are not well known. The principal aims of this thesis were to validate a high volume cascade impactor (HVCI) for the collection of particulate matter for physicochemical and toxicological studies, and to make an in-depth chemical and source characterisation of samples collected during different pollution situations. The particulate samples were collected with the HVCI, virtual impactors and a Berner low pressure impactor in six European cities: Helsinki, Duisburg, Prague, Amsterdam, Barcelona and Athens. The samples were analysed for particle mass, common ions, total and water-soluble elements as well as elemental and organic carbon. Laboratory calibration and field comparisons indicated that the HVCI can provide unique large-capacity, high-efficiency sampling of size-segregated aerosol particles. The cutoff sizes of the recommended HVCI configuration were 2.4, 0.9 and 0.2 μm. The HVCI mass concentrations were in good agreement with the reference methods, but the chemical composition, especially of the fine particulate samples, showed some differences. This implies that the chemical characterisation of the exposure variable in toxicological studies needs to be done from the same HVCI samples as used in the cell and animal studies. The data from parallel, low volume reference samplers provide valuable additional information for chemical mass closure and source assessment.
The major components of PM2.5 in the virtual impactor samples were carbonaceous compounds, secondary inorganic ions and sea salt, whereas those of coarse particles (PM2.5-10) were soil-derived compounds, carbonaceous compounds, sea salt and nitrate. The major and minor components together accounted for 77-106% and 77-96% of the gravimetrically measured masses of fine and coarse particles, respectively. Relatively large differences between sampling campaigns were observed in the organic carbon content of the PM2.5 samples as well as in the mineral composition of the PM2.5-10 samples. A source assessment based on chemical tracers suggested clear differences in the dominant sources (e.g. traffic, residential heating with solid fuels, metal industry plants, regional or long-range transport) between the sampling campaigns. In summary, the field campaigns exhibited different profiles with regard to particulate sources, size distribution and chemical composition, thus providing a highly useful setup for toxicological studies on the size-segregated HVCI samples.
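As an illustration of the chemical mass closure mentioned above, the analysed component masses can be summed and compared with the gravimetrically determined particle mass. The sketch below is a minimal example; the component names follow common aerosol practice, and all concentration values are invented, not data from this study.

```python
# Hypothetical illustration of chemical mass closure for a PM2.5 sample:
# the analysed component masses are summed and compared with the
# gravimetrically determined particle mass. All values are invented.

def mass_closure(components_ug_m3, gravimetric_ug_m3):
    """Return (identified mass, closure as a percentage of gravimetric mass)."""
    identified = sum(components_ug_m3.values())
    return identified, 100.0 * identified / gravimetric_ug_m3

sample = {
    "organic matter": 6.2,      # OC scaled to organic matter
    "elemental carbon": 1.1,
    "sulphate": 3.4,
    "nitrate": 1.8,
    "ammonium": 1.5,
    "sea salt": 0.6,
    "soil-derived": 0.9,
}

identified, closure = mass_closure(sample, gravimetric_ug_m3=16.0)
print(f"identified {identified:.1f} ug/m3, closure {closure:.0f}%")
# -> identified 15.5 ug/m3, closure 97%
```

A closure in the 77-106% range reported above indicates that the analysed species account for most of the weighed mass.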
Abstract:
Polymer-protected gold nanoparticles have been successfully synthesized by both "grafting-from" and "grafting-to" techniques. The synthesis methods of the gold particles were systematically studied. Two chemically different homopolymers were used to protect the gold particles: thermo-responsive poly(N-isopropylacrylamide), PNIPAM, and polystyrene, PS. Both polymers were synthesized using a controlled/living radical polymerization process, reversible addition-fragmentation chain transfer (RAFT) polymerization, to obtain monodisperse polymers of various molar masses carrying dithiobenzoate end groups. Hence, particles protected either with PNIPAM (PNIPAM-AuNPs) or with a mixture of the two polymers (PNIPAM/PS-AuNPs, i.e., amphiphilic gold nanoparticles) were prepared. The particles carry monodisperse polymer shells, though the cores are somewhat polydisperse. Aqueous PNIPAM-AuNPs prepared using the "grafting-from" technique show thermo-responsive properties derived from the tethered PNIPAM chains. For PNIPAM-AuNPs prepared using the "grafting-to" technique, two phase transitions of PNIPAM were observed in microcalorimetric studies of the aqueous solutions. The first transition, with a sharp and narrow endothermic peak, occurs at a lower temperature; the second, with a broader peak, at a higher temperature. In the first transition the PNIPAM segments show much higher cooperativity than in the second one. The observations are tentatively rationalized by assuming that the PNIPAM brush can be subdivided into two zones, an inner and an outer one. In the inner zone, the PNIPAM segments are close to the gold surface, densely packed, less hydrated, and undergo the first transition. In the outer zone, on the other hand, the PNIPAM segments are looser and more hydrated, adopt a restricted random coil conformation, and show a phase transition that depends on both the particle concentration and the chemical nature of the end groups of the PNIPAM chains.
Monolayers of the amphiphilic gold nanoparticles at the air-water interface show several characteristic regions upon compression in a Langmuir trough at room temperature. These can be attributed to polymer conformational transitions from a pancake to a brush. The compression isotherms also show temperature dependence due to the thermo-responsive properties of the tethered PNIPAM chains. The films were successfully deposited on substrates by the Langmuir-Blodgett technique. Sessile drop contact angle measurements conducted on both sides of a monolayer deposited at room temperature reveal two slightly different contact angles, which may indicate phase separation between the tethered PNIPAM and PS chains on the gold core. The optical properties of the amphiphilic gold nanoparticles were studied both in situ at the air-water interface and on the deposited films. The in situ SPR band of the monolayer shows a blue shift upon compression, while a red shift with the deposition cycle occurs in the deposited films. The blue shift is compression-induced and closely related to the conformational change of the tethered PNIPAM chains, which may decrease the polarity of the local environment of the gold cores. The red shift in the deposited films is due to weak interparticle coupling between adjacent particles. Temperature effects on the SPR band were also investigated in both cases. In the in situ case, at a constant surface pressure, an increase in temperature leads to a red shift in the SPR, likely due to the shrinking of the tethered PNIPAM chains as well as to a slight decrease of the distance between adjacent particles, resulting in an increase in the interparticle coupling. In the case of the deposited films, however, the SPR band red-shifts with the deposition cycles more at a high temperature than at a low temperature.
This is because the compressibility of the polymer coated gold nanoparticles at a high temperature leads to a smaller interparticle distance, resulting in an increase of the interparticle coupling in the deposited multilayers.
Abstract:
Pressurised hot water extraction (PHWE) exploits the unique temperature-dependent solvent properties of water, minimising the use of harmful organic solvents. Water is an environmentally friendly, cheap and easily available extraction medium. The effects of temperature, pressure and extraction time in PHWE have often been studied, but here the emphasis was on other parameters important for the extraction, most notably the dimensions of the extraction vessel and the stability and solubility of the analytes to be extracted. Non-linear data analysis and self-organising maps were employed in the data analysis to obtain correlations between the parameters studied, recoveries and relative errors. First, PHWE was combined on-line with liquid chromatography-gas chromatography (LC-GC), and the system was applied to the extraction and analysis of polycyclic aromatic hydrocarbons (PAHs) in sediment. The method offers superior sensitivity compared with the traditional methods, and only a small 10 mg sample was required for analysis. The commercial extraction vessels were replaced by laboratory-made stainless steel vessels because of problems that arose with them. The performance of the laboratory-made vessels was comparable to that of the commercial ones. In an investigation of the effect of thermal desorption in PHWE, it was found that at lower temperatures (200°C and 250°C) the effect of thermal desorption is smaller than the effect of the solvating property of hot water. At 300°C, however, thermal desorption is the main mechanism. The effect of the geometry of the extraction vessel on recoveries was studied with five specially constructed extraction vessels. In addition to the extraction vessel geometry, the sediment packing style and the direction of water flow through the vessel were investigated.
The geometry of the vessel was found to have only a minor effect on the recoveries, and the same was true of the sediment packing style and the direction of water flow through the vessel. These are good results, because these parameters do not have to be carefully optimised before the start of extractions. Liquid-liquid extraction (LLE) and solid-phase extraction (SPE) were compared as trapping techniques for PHWE. LLE was more robust than SPE and provided better recoveries and repeatabilities. Problems related to blocking of the Tenax trap and unrepeatable trapping of the analytes were encountered in SPE. Thus, although LLE is more labour intensive, it can be recommended over SPE. The stabilities of the PAHs in aqueous solutions were measured using a batch-type reaction vessel. Degradation was observed at 300°C even with the shortest heating time. Ketones, quinones and other oxidation products were observed. Although the conditions of the stability studies differed considerably from the extraction conditions in PHWE, the results indicate that the risk of analyte degradation must be taken into account in PHWE. The aqueous solubilities of acenaphthene, anthracene and pyrene were measured, first below and then above the melting point of the analytes. Measurements below the melting point were made to check that the equipment was working, and the results were compared with those obtained earlier. Good agreement was found between the measured and literature values. A new saturation cell was constructed for the solubility measurements above the melting point of the analytes, because the flow-through saturation cell could not be used above the melting point. An exponential relationship was found between the solubilities measured for pyrene and anthracene and temperature.
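The exponential solubility-temperature relationship reported above can be recovered from measurements by a linear least-squares fit on the logarithm of the solubility. The sketch below shows the idea; the data points are invented for illustration and are not measured values from this work.

```python
import math

# Sketch of fitting an exponential solubility-temperature relationship
# S = A * exp(b*T) by linear least squares on ln(S) versus T.
# The data points below are invented, not measured values.

def fit_exponential(temps_c, solubilities):
    """Least-squares fit of ln(S) = ln(A) + b*T; returns (A, b)."""
    n = len(temps_c)
    ys = [math.log(s) for s in solubilities]
    mean_t = sum(temps_c) / n
    mean_y = sum(ys) / n
    b = sum((t - mean_t) * (y - mean_y) for t, y in zip(temps_c, ys)) \
        / sum((t - mean_t) ** 2 for t in temps_c)
    a = math.exp(mean_y - b * mean_t)
    return a, b

temps = [150.0, 200.0, 250.0, 300.0]   # degrees Celsius
sols = [0.8, 4.1, 20.5, 103.0]         # hypothetical solubilities, mg/L
a, b = fit_exponential(temps, sols)
print(f"S(T) ~= {a:.3g} * exp({b:.4f} * T)")
```

Fitting in log space turns the exponential model into a straight line, which is why a simple linear regression suffices.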
Abstract:
Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples. Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator, in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by rotating the nozzle spraying the carbon dioxide with a motor. The fastest rotation and highest modulation frequency were achieved with a permanent magnet motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With the modulators developed in this study, the narrowest peaks were 75 ms at base. Three data analysis programs were developed, providing basic, comparison and identification operations. The basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry.
In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal phase liquid chromatography with ultraviolet detection worked well in the removal of aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes. However, GC×GC with time-of-flight mass spectrometry was needed in the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual searches and analyst knowledge are invaluable as well. Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, a simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC, so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of the separation of PAHs is not significantly disturbed by the amount of matrix, and that quantitative performance suffers only slightly in the presence of matrix and when the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily outweigh the minor drawbacks of the technique. The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
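The core data operation behind the two-dimensional plots mentioned above is folding the raw, modulated one-dimensional detector trace into a matrix, one row per modulation period. A minimal sketch, with an invented trace and modulation period:

```python
# Illustrative sketch of the basic GCxGC data operation: the raw detector
# trace is cut into slices of one modulation period and stacked into a
# matrix, giving the two-dimensional retention plane that a visualisation
# program can display. The signal and period below are invented.

def fold_to_2d(signal, points_per_modulation):
    """Fold a 1D modulated trace into rows of one modulation period each.

    The row index maps to first-dimension retention time, the column
    index to second-dimension retention time. Trailing points that do
    not fill a complete row are dropped.
    """
    n_rows = len(signal) // points_per_modulation
    return [signal[i * points_per_modulation:(i + 1) * points_per_modulation]
            for i in range(n_rows)]

trace = [0, 1, 5, 1, 0, 0, 0, 2, 9, 2, 0, 0]   # hypothetical detector trace
plane = fold_to_2d(trace, points_per_modulation=6)
for row in plane:
    print(row)
```

Peak volumes can then be obtained by summing intensities over the rows belonging to one modulated peak, which is the basis of the summed-area calibration mentioned in the abstract.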
Abstract:
The cossid moth (Coryphodema tristis) has a broad range of native tree hosts in South Africa. The moth recently moved into non-native Eucalyptus plantations in South Africa, on which it now causes significant damage. Here we investigate the chemicals involved in pheromone communication between the sexes of this moth in order to better understand its ecology, and with a view to developing management tools for it. In particular, we characterize female gland extracts and headspace samples through coupled gas chromatography-electroantennographic detection (GC-EAD) and two-dimensional gas chromatography-mass spectrometry (GCxGC-MS). Tentative identities of the potential pheromone compounds were confirmed by comparing both retention times and mass spectra with authentic standards. Two electrophysiologically active pheromone compounds, tetradecyl acetate (14:OAc) and Z9-tetradecenyl acetate (Z9-14:OAc), were identified from pheromone gland extracts, and an additional compound (Z9-14:OH) from headspace samples. We further determined dose-response curves for the identified compounds and six other structurally similar compounds that are common to the order Cossidae. Male antennae showed superior sensitivity toward Z9-14:OAc, Z7-tetradecenyl acetate (Z7-14:OAc), E9-tetradecenyl acetate (E9-14:OAc), Z9-tetradecenol (Z9-14:OH) and Z9-tetradecenal (Z9-14:Ald) when compared to female antennae. While we could show electrophysiological responses to single pheromone compounds, behavioral attraction of males was dependent on the synergistic effect of at least two of these compounds; signal specificity is thus gained through pheromone blends. A field trial showed that a significant number of males were caught only in traps baited with a combination of Z9-14:OAc (circa 95% of the blend) and Z9-14:OH. Addition of 14:OAc to this mixture also increased the number of males caught, although not significantly.
This study represents a major step towards developing a useful attractant to be used in management tools for C. tristis and contributes to the understanding of chemical communication and biology of this group of insects.
Abstract:
Frictions are factors that hinder the trading of securities in financial markets. Typical frictions include limited market depth, transaction costs, lack of infinite divisibility of securities, and taxes. Conventional models used in mathematical finance often gloss over these issues, which affect almost all financial markets, by arguing that the impact of frictions is negligible and that, consequently, the frictionless models are valid approximations. This dissertation consists of three research papers related to the study of the validity of such approximations in two distinct modeling problems. Models of price dynamics that are based on diffusion processes, i.e., continuous strong Markov processes, are widely used in the frictionless scenario. The first paper establishes that diffusion models can indeed be understood as approximations of price dynamics in markets with frictions. This is achieved by introducing an agent-based model of a financial market where finitely many agents trade a financial security, the price of which evolves according to the price impacts generated by trades. It is shown that if the number of agents is large, then under certain assumptions the price process of the security, which is a pure-jump process, can be approximated by a one-dimensional diffusion process. In a slightly extended model, in which agents may exhibit herd behavior, the approximating diffusion model turns out to be a stochastic volatility model. Finally, it is shown that when the agents' tendency to herd is strong, logarithmic returns in the approximating stochastic volatility model are heavy-tailed. The remaining papers are related to no-arbitrage criteria and superhedging in continuous-time option pricing models under small-transaction-cost asymptotics.
Guasoni, Rásonyi, and Schachermayer have recently shown that, in such a setting, any financial security admits no arbitrage opportunities and there exist no feasible superhedging strategies for European call and put options written on it, as long as its price process is continuous and has the so-called conditional full support (CFS) property. Motivated by this result, CFS is established in these two papers for certain stochastic integrals and for a subclass of Brownian semistationary processes. As a consequence, a wide range of possibly non-Markovian local and stochastic volatility models have the CFS property.
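The diffusion approximation in the first paper can be illustrated with a toy simulation: a pure-jump price driven by individual trades looks increasingly like a diffusion as the number of agents grows. The sketch below is only a caricature of the idea; the impact rule and all parameters are invented, not those of the model in the paper.

```python
import random

# Toy version of an agent-based price model: each step one agent submits
# a buy or sell order, and each order moves the price by a fixed impact.
# With many agents (small, frequent jumps) the pure-jump path becomes
# diffusion-like. All parameters here are illustrative choices only.

def simulate_price(n_agents, n_steps, impact=None, p0=100.0, seed=0):
    """Pure-jump price path: each step one random agent buys or sells."""
    rng = random.Random(seed)
    # Scale the per-trade impact down as the crowd grows (invented rule).
    impact = impact if impact is not None else 1.0 / n_agents ** 0.5
    price = p0
    path = [price]
    for _ in range(n_steps):
        price += impact if rng.random() < 0.5 else -impact
        path.append(price)
    return path

path = simulate_price(n_agents=10_000, n_steps=5_000)
print(len(path), round(path[-1], 2))
```

Herding could be caricatured by making the buy probability depend on recent moves, which fattens the tails of the returns, in the spirit of the stochastic volatility extension described above.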
Abstract:
In this thesis we study a few games related to non-wellfounded and stationary sets. Games have turned out to be an important tool in mathematical logic, ranging from semantic games defining the truth of a sentence in a given logic to, for example, games on real numbers whose determinacy has important consequences for the consistency of certain large cardinal assumptions. The equality of non-wellfounded sets can be determined by a so-called bisimulation game, already used to identify processes in theoretical computer science and possible-world models for modal logic. Here we present a game to classify non-wellfounded sets according to their branching structure. We also describe a way to approximate non-wellfounded sets with hereditarily finite wellfounded sets, using domain theory as the framework. Moving back to classical wellfounded set theory, we also study games on stationary sets. In the Banach-Mazur game, also called the ideal game, the players play a descending sequence of stationary sets, and the second player tries to keep their intersection stationary; the game is connected to the precipitousness of the corresponding ideal. In the pressing-down game, the first player plays regressive functions defined on stationary sets, and the second player responds with a stationary set on which the function is constant, trying to keep the intersection stationary. This game has applications in model theory to the determinacy of the Ehrenfeucht-Fraïssé game. We show that it is consistent that these games are not equivalent.
Abstract:
Increased activation of c-src seen in colorectal cancer is an indicator of a poor clinical prognosis, suggesting that identification of downstream effectors of c-src may lead to new avenues of therapy. Guanylyl cyclase C (GC-C) is a receptor for the gastrointestinal hormones guanylin and uroguanylin and the bacterial heat-stable enterotoxin. Though activation of GC-C by its ligands elevates intracellular cyclic GMP (cGMP) levels and inhibits cell proliferation, its persistent expression in colorectal carcinomas and occult metastases makes it a marker for malignancy. We show here that GC-C is a substrate for inhibitory phosphorylation by c-src, resulting in reduced ligand-mediated cGMP production. Consequently, active c-src in colonic cells can overcome GC-C-mediated control of the cell cycle. Furthermore, docking of the c-src SH2 domain to phosphorylated GC-C results in colocalization and further activation of c-src. We therefore propose a novel feed-forward mechanism of activation of c-src that is induced by cross talk between a receptor GC and a tyrosine kinase. Our findings have important implications in understanding the molecular mechanisms involved in the progression and treatment of colorectal cancer.
Abstract:
In this thesis we study a series of multi-user resource-sharing problems for the Internet, which involve distribution of a common resource among the participants of multi-user systems (servers or networks). We study concurrently accessible resources, which may be either exclusively or non-exclusively accessible to end users. For each kind we suggest a separate algorithm or a modification of a common reputation scheme. Every algorithm or method is studied from different perspectives: optimality of the protocol, selfishness of end users, and fairness of the protocol for end users. On the one hand, this multifaceted analysis allows us to select the most suitable protocols from a set of available ones based on trade-offs between optimality criteria. On the other hand, predictions about the future Internet dictate new rules for the optimality we should take into account, and new properties of the networks that can no longer be neglected. In this thesis we have studied new protocols for such resource-sharing problems as the backoff protocol, defense mechanisms against denial-of-service attacks, and fairness and confidentiality for users in overlay networks. For the backoff protocol, we present an analysis of a general backoff scheme, where an optimization is applied to a backoff function of general form. It leads to an optimality condition for backoff protocols in both slotted-time and continuous-time models. Additionally, we present an extension of the backoff scheme to achieve fairness for the participants in an unfair environment, such as one with unequal wireless signal strengths. Finally, for the backoff algorithm we suggest a reputation scheme that deals with misbehaving nodes. For the next problem, denial-of-service attacks, we suggest two schemes that deal with malicious behavior under two conditions: forged identities and unspoofed identities.
For the first condition we suggest a novel most-knocked-first-served algorithm, while for the latter we apply a reputation mechanism in order to restrict resource access for misbehaving nodes. Finally, we study the reputation scheme for overlay and peer-to-peer networks, where the resource is not placed on a common station but spread across the network. The theoretical analysis suggests what behavior an end station will select under such a reputation mechanism.
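The general backoff scheme analysed above can be made concrete with its best-known special case. The sketch below implements truncated binary exponential backoff, where the contention window given by the backoff function doubles with each consecutive collision; it is meant only as an illustration of the family of protocols studied, not the optimised scheme derived in the thesis.

```python
import random

# Minimal sketch of a backoff scheme: after the k-th consecutive
# collision a station waits a random number of slots drawn from a
# window given by a backoff function f(k). The truncated binary
# exponential f(k) = 2**k used here is the classic special case.

def backoff_slots(collisions, rng, max_exponent=10):
    """Slots to wait after `collisions` consecutive collisions."""
    window = 2 ** min(collisions, max_exponent)   # f(k) = 2**k, truncated
    return rng.randrange(window)                  # uniform in [0, f(k))

rng = random.Random(42)
waits = [backoff_slots(k, rng) for k in range(1, 6)]
print(waits)   # the window doubles with every further collision
```

The optimization described in the abstract replaces this fixed doubling rule with a backoff function of general form and derives the condition under which it is optimal.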
Abstract:
Ubiquitous computing is about making computers and computerized artefacts a pervasive part of our everyday lives, bringing more and more activities into the realm of information. This computerization of everyday activities increases not only our reach, efficiency and capabilities but also the amount and kinds of data gathered about us and our activities. In this thesis, I explore how information systems can be constructed so that they handle this personal data in a reasonable manner. The thesis provides two kinds of results: on the one hand, tools and methods for both the construction and the evaluation of ubiquitous and mobile systems; on the other hand, an evaluation of the privacy aspects of a ubiquitous social awareness system. The work emphasises real-world experiments as the most important way to study privacy. Additionally, the state of current information systems as regards data protection is studied. The tools and methods in this thesis consist of three distinct contributions. An algorithm for positioning in cellular networks is proposed that does not require the location information to be revealed beyond the user's terminal. A prototyping platform for the creation of context-aware ubiquitous applications, called ContextPhone, is described and released as open source. Finally, a set of methodological findings on the use of smartphones in social-scientific field research is reported. A central contribution of this thesis is the set of pragmatic tools that allow other researchers to carry out experiments. The evaluation of the ubiquitous social awareness application ContextContacts covers both the usage of the system in general and an analysis of its privacy implications. Based on several long-term field studies, the usage of the system is analyzed in the light of how users make inferences about others from real-time contextual cues mediated by the system.
The analysis of privacy implications draws together the social-psychological theory of self-presentation and research on privacy in ubiquitous computing, deriving a set of design guidelines for such systems. The main findings from these studies can be summarized as follows. The fact that ubiquitous computing systems gather more data about users can be used not only to study the use of such systems in an effort to create better systems, but also, more generally, to study previously unstudied phenomena, such as the dynamic change of social networks. Systems that let people create new ways of presenting themselves to others can be fun for their users, but such self-presentation requires several thoughtful design decisions that allow the manipulation of the image mediated by the system. Finally, the growing amount of computational resources available to users can be used to allow them to use the data themselves, rather than remaining passive subjects of data gathering.
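The privacy property of the cellular positioning algorithm, that location information never leaves the user's terminal, can be sketched as a purely local lookup. The code below is a hypothetical illustration only: the cell identifiers, coordinates and averaging rule are invented, not the algorithm of the thesis.

```python
# Hypothetical sketch of privacy-preserving cellular positioning: the
# terminal keeps its own table of observed base-station identifiers and
# estimates its position locally, so the location never leaves the
# device. All identifiers and coordinates below are invented.

local_cell_db = {                      # learned and stored on the terminal
    "244-91-10021": (60.170, 24.941),  # (latitude, longitude)
    "244-91-10022": (60.172, 24.950),
    "244-91-10345": (60.205, 24.655),
}

def locate_on_device(visible_cells, cell_db):
    """Average the known coordinates of the currently visible cells."""
    known = [cell_db[c] for c in visible_cells if c in cell_db]
    if not known:
        return None    # position stays unknown rather than queried remotely
    lat = sum(p[0] for p in known) / len(known)
    lon = sum(p[1] for p in known) / len(known)
    return lat, lon

print(locate_on_device(["244-91-10021", "244-91-10022"], local_cell_db))
```

The design choice illustrated is that the network only ever sees which cells the phone talks to anyway; the mapping from cells to coordinates, and hence the position estimate, stays on the device.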
Abstract:
The authors derive the Korteweg-de Vries equation in a multicomponent plasma that includes any number of positive and negative ions. The solitary wave solutions are also found explicitly for the case of isothermal and non-isothermal electrons.
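For reference, the standard form of the Korteweg-de Vries equation and its solitary wave (one-soliton) solution are, in textbook notation, as follows; the multicomponent derivation in the paper yields an equation of this type, with coefficients determined by the plasma composition:

```latex
% Standard Korteweg-de Vries equation and its one-soliton solution,
% quoted in textbook form (not the paper's plasma-specific coefficients).
\[
  \frac{\partial u}{\partial t}
  + 6\,u\,\frac{\partial u}{\partial x}
  + \frac{\partial^{3} u}{\partial x^{3}} = 0,
  \qquad
  u(x,t) = \frac{c}{2}\,
  \operatorname{sech}^{2}\!\left(\frac{\sqrt{c}}{2}\,(x - c\,t)\right).
\]
```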
Abstract:
Consumer electronics, mobile phone technology and information technology have converged in recent years. Traditionally, the networks, terminal devices and content corresponding to these technologies have been separate. Today, Internet technology and the digitalization of communications are unifying networks. Service providers increasingly produce content in digital form. Terminal devices are converging as well: for example, a laptop computer can serve as a mobile phone or a television receiver when equipped with the necessary hardware and software. The convergence of content, networks and terminal devices has led more and more consumers to build home networks. The aim of this thesis is to present and compare the devices and software available on the market with which a working home network can be built. In this thesis, a home network means the network inside the home, connected in a functional way to networks outside the home. The thesis first examines the most common local area and short-range network technologies used in building home networks. It then considers some middleware solutions with which the interaction and security of home network devices can be managed. Finally, a concrete home network is designed. Usage scenarios are used to illustrate the cooperation of the network technologies and the middleware in managing communications.
Abstract:
A real-time data warehouse is a centralized database system for soft real-time business intelligence applications. The basic requirement of these applications is the continuous availability of fresh data. The thesis discusses the design of a real-time data warehouse, the different phases of its continuous maintenance, and the methods suitable for these phases. The aim is to bring out the compromises that inevitably must be made between the query performance, the latency and the continuous availability of the data warehouse. As a conclusion, the smaller the targeted latency, the greater the recommended caution. The real-time capability of business intelligence applications is a feature that users usually want more than they need; in some cases it is outright harmful. But if real-time data is indispensable, there are many concurrent users, and the required data must be combined from several source systems, then there is no viable alternative to real-time data warehousing. Even then, it suffices to continuously maintain only a small part of the whole data warehouse.
Abstract:
Requirements engineering is an important phase in software development, in which the customer's needs and expectations are transformed into a software requirements specification. The requirements specification can be considered an agreement between the customer and the developer in which both parties agree on the expected system features and behaviour. However, requirements engineers must deal with a variety of issues that complicate the requirements process. The communication gap between the customer and the developers is among the typical reasons for unsatisfactory requirements. In this thesis we study how the use case technique can be used in requirements engineering to bridge the communication gap between the customer and the development team. We also discuss how use cases can be used as a basis for acceptance test cases.
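As a hypothetical illustration of deriving acceptance tests from a use case, consider a use case "Withdraw cash" whose main success scenario debits the account and whose extension rejects an overdraft; each scenario maps onto one test case. The Account class and the scenarios are invented for this example and are not taken from the thesis.

```python
import unittest

# Hypothetical example: a use case "Withdraw cash" with a main success
# scenario (the account is debited) and an extension (a withdrawal
# exceeding the balance is rejected), each mapped onto an acceptance
# test case. The Account class is invented for the illustration.

class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")  # use-case extension
        self.balance -= amount                      # main success scenario

class WithdrawCashAcceptanceTest(unittest.TestCase):
    def test_main_success_scenario(self):
        account = Account(balance=100)
        account.withdraw(40)
        self.assertEqual(account.balance, 60)

    def test_extension_rejects_overdraft(self):
        account = Account(balance=100)
        with self.assertRaises(ValueError):
            account.withdraw(150)
```

The test class can be executed with `python -m unittest`; the point is that each numbered step or extension of the use case description gives the acceptance criterion for one test.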
Abstract:
This thesis in the field of international law examines the justification and legality of humanitarian intervention. The research question is to what extent military action, or the threat of it, justified on humanitarian grounds can be considered acceptable under international law, and what weight should be given as a precedent to the intervention carried out by the NATO countries in Kosovo. The thesis examines the critique of certain background assumptions of human rights thinking. Particular attention is paid to positions according to which human rights cannot be considered universal in nature, and to the related claims that states in a so-called hegemonic position exploit human rights arguments to justify their use of force. The critique of universality can largely be considered justified, but the present international community nevertheless needs certain universal norms in order to function effectively. The critique cannot therefore be considered to apply to the human rights norms central to humanitarian intervention, such as the prohibition of genocide, since these obligations protect the functioning and credibility of the international community. Humanitarian arguments involve other problems, however: for example, they have from time to time been used to justify military actions in which human rights considerations have not necessarily been paramount. There is no reason to grant human rights the status of universal "super-arguments" in international law that could not be questioned. From the perspective of the UN Charter and customary international law, a use of force such as humanitarian intervention requires the authorization of the Security Council, which was not obtained for the Kosovo operation. In this respect the intervention can be considered unambiguously illegal, as the legal arguments presented in its support are not convincing.
The human rights considerations involved in the case are nevertheless so significant that a strictly legalistic approach to the problem is not justified. Accepting the operation on the basis of moral arguments could, however, lead to the marginalization of the current restrictions on the use of force, which would be problematic in light of the critique discussed above. The thesis recommends an approach in which the Kosovo case is understood as a single unlawful but at the same time extra-legal exception. In this way the Charter's regulation of the use of force remains as it is, without humanitarian considerations being completely disregarded. This solution does not rule out the possibility of taking a positive view of the "political norm" possibly created by the Kosovo operation: large-scale human rights violations in Europe will not remain without consequences. Because of the practical and international-law risks involved in humanitarian interventions carried out without the authorization of the Security Council, they should nevertheless be approached with great caution.