948 results for extraction and separation techniques


Relevance: 100.00%

Abstract:

The amounts of macronutrients (P, K, Ca, and Mg) and micronutrients (Cu and Zn) extracted from representative soil types of Rio Grande do Sul state (Brazil) with the Mehlich-1 (M1) solution, with 1.0 mol L-1 KCl, and with 0.1 mol L-1 HCl were compared with those extracted with the Mehlich-1 solution and determined by inductively coupled plasma optical emission spectroscopy (ICP). The amounts of nutrients extracted by the different methods showed high correlation coefficients. On average, the Mehlich-1 solution extracted similar amounts of P, whether determined by the colorimetric or the ICP method, and of K, whether determined by flame emission or ICP. The amounts of Ca and Mg extracted with the Mehlich-1 solution and determined by ICP were similar to those extracted with the KCl solution and determined by atomic absorption spectrophotometry. The amounts of Cu and Zn extracted with the Mehlich-1 solution and determined by ICP were higher than those extracted with 0.1 mol L-1 HCl and determined by atomic absorption spectrophotometry. The results indicate that the Mehlich-1 solution and ICP can be used for simultaneous multielement extraction and determination in southern Brazilian soils; however, a conversion factor is needed for interpretation of the values. Using such a conversion factor to determine the K availability index in soils is adequate and does not affect K recommendations for crops in southern Brazilian soils.
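
As an illustration of how such a conversion factor can be derived, the sketch below fits a simple proportionality between paired determinations of the same Mehlich-1 extracts by two instruments. The soil K values are hypothetical placeholders; the authors' actual regression model is not reproduced here.

```python
# Illustrative sketch (not the authors' calibration): deriving a conversion
# factor between two determinations of the same extract, e.g. K measured by
# ICP versus flame emission, from paired soil samples. Values are hypothetical.
import numpy as np

k_icp = np.array([45.0, 88.0, 120.0, 160.0, 210.0, 260.0])       # mg dm-3, ICP
k_emission = np.array([42.0, 83.0, 115.0, 151.0, 198.0, 247.0])  # mg dm-3, flame emission

# Least-squares slope through the origin gives a single multiplicative factor;
# a full linear fit (slope + intercept) is shown for comparison.
factor = np.sum(k_icp * k_emission) / np.sum(k_icp ** 2)
slope, intercept = np.polyfit(k_icp, k_emission, 1)
r = np.corrcoef(k_icp, k_emission)[0, 1]

print(f"conversion factor (through origin): {factor:.3f}")
print(f"linear fit: emission = {slope:.3f} * ICP + {intercept:.1f}, r = {r:.3f}")
```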

Relevance: 100.00%

Abstract:

A sensitive and selective ultra-high performance liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was developed for the fast quantification of ten psychotropic drugs and metabolites (amisulpride, asenapine, desmethyl-mirtazapine, iloperidone, mirtazapine, norquetiapine, olanzapine, paliperidone, quetiapine, and risperidone) in human plasma, to meet the needs of our laboratory. Stable isotope-labeled internal standards were used for all analytes to compensate for the global method variability, including extraction and ionization variations. Sample preparation was performed by generic protein precipitation with acetonitrile. Chromatographic separation was achieved in less than 3.0 min on an Acquity UPLC BEH Shield RP18 column (2.1 mm × 50 mm; 1.7 μm), using gradient elution with 10 mM ammonium formate buffer (pH 3.0) and acetonitrile at a flow rate of 0.4 mL/min. The compounds were quantified on a tandem quadrupole mass spectrometer operating in positive electrospray ionization mode, using multiple reaction monitoring. The method was fully validated according to the latest recommendations of international guidelines. Eight-point calibration curves were used to cover a large concentration range: 0.5-200 ng/mL for asenapine, desmethyl-mirtazapine, iloperidone, mirtazapine, olanzapine, paliperidone, and risperidone, and 1-1500 ng/mL for amisulpride, norquetiapine, and quetiapine. Good quantitative performance was achieved in terms of trueness (93.1-111.2%), repeatability (1.3-8.6%), and intermediate precision (1.8-11.5%). Internal standard-normalized matrix effects ranged between 95 and 105%, with a variability never exceeding 6%. The accuracy profiles (total error) were included in the acceptance limits of ±30% for biological samples. This method is therefore suitable for both therapeutic drug monitoring and pharmacokinetic studies.
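
As a sketch of the quantification step behind figures like these, the snippet below fits an analyte/internal-standard area-ratio calibration and back-calculates an unknown. The concentrations and ratios are hypothetical, and the regression weighting actually used in the validated method is not specified here.

```python
# Sketch of isotope-dilution quantification from an eight-point calibration:
# fit analyte/IS peak-area ratio vs. nominal concentration, then back-calculate
# an unknown. Numbers are hypothetical; the authors' regression model and
# weighting are not reproduced.
import numpy as np

conc = np.array([0.5, 1, 2, 5, 20, 50, 100, 200], dtype=float)          # ng/mL, nominal
ratio = np.array([0.012, 0.024, 0.049, 0.12, 0.50, 1.24, 2.48, 4.95])   # area(analyte)/area(IS)

slope, intercept = np.polyfit(conc, ratio, 1)   # unweighted fit, for simplicity

def back_calc(sample_ratio: float) -> float:
    """Concentration (ng/mL) from an analyte/IS area ratio."""
    return (sample_ratio - intercept) / slope

print(f"calibration: ratio = {slope:.4f} * C + {intercept:.4f}")
print(f"unknown with ratio 0.80 -> {back_calc(0.80):.1f} ng/mL")
```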

Relevance: 100.00%

Abstract:

Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. The first, xenon CT, is an equilibrium technique based on a freely diffusible tracer. The first pass of iodinated contrast injected intravenously is a second, more widely available method. Both methods have proven robust and quantitative, thanks to the linear relationship between contrast concentration and X-ray attenuation. For the CT methods, concerns regarding the X-ray dose delivered to the patient need to be addressed. MR is also able to assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the non-linear relationship between contrast concentration and MR signal changes. Arterial spin labelling is another MR method for assessing brain perfusion without injection of contrast. In this case, the blood flowing in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these CT and MR techniques has advantages and limits that will be illustrated and summarised.
Learning objectives:
1. To understand and compare the different techniques for brain perfusion imaging.
2. To learn about the methods of acquisition and post-processing of brain perfusion by first pass of contrast agent for CT and MR.
3. To learn about non-contrast MR methods (arterial spin labelling).
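
The linear and non-linear relationships mentioned above can be written compactly as follows; these are the standard textbook forms, not equations taken from the presentation itself.

```latex
% CT: attenuation change is linear in iodine concentration
\[ \Delta \mathrm{HU}(t) \propto c_{\mathrm{iodine}}(t) \]
% DSC-MR: the signal is non-linear in gadolinium concentration, so the
% relaxation-rate change is used as a (semi-quantitative) surrogate
\[ \Delta R_2^{*}(t) = -\frac{1}{TE}\,\ln\!\frac{S(t)}{S_0},
   \qquad c_{\mathrm{Gd}}(t) \approx k\,\Delta R_2^{*}(t) \]
```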

Relevance: 100.00%

Abstract:

The objective of this work was to evaluate the influence of rootstocks and pruning times on yield and on nutrient content and extraction by pruned branches and harvested bunches of 'Niagara Rosada' grapevine in a subtropical climate. The rootstocks 'IAC 766', 'IAC 572', 'IAC 313', 'IAC 571-6', and '106-8 Mgt' were evaluated. Treatments consisted of combinations of the five rootstocks and three pruning times. At pruning, the fresh and dry matter mass of branches was evaluated to estimate biomass accumulation. At harvest, yield was estimated by weighing the bunches per plant. Branches and bunches were sampled at pruning and at harvest, respectively, for nutrient content analysis. Nutrient content and dry matter mass of branches and bunches were used to estimate total nutrient extraction. 'Niagara Rosada' grapevine grafted onto the 'IAC 572' rootstock had the highest yield and dry matter mass of bunches, which differed significantly from those observed in 'Niagara Rosada'/'IAC 313'. 'Niagara Rosada' grafted onto the 'IAC 572' rootstock extracted the largest quantities of K, P, Mg, S, Cu, and Fe, differing from 'IAC 313' and 'IAC 766' in K and P extraction, and from '106-8 Mgt' in Mg and S extraction. Winter pruning results in higher yield, dry matter accumulation by branches, and total nutrient content and extraction.
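
The estimation of total nutrient extraction mentioned above amounts to the following bookkeeping; this is my reading of the abstract, with the unit convention stated in the comments.

```latex
% Total extraction per plant: nutrient content of each tissue times its dry
% matter mass, summed over pruned branches and harvested bunches
\[ E = c_{\mathrm{branches}}\,m_{\mathrm{branches}} + c_{\mathrm{bunches}}\,m_{\mathrm{bunches}} \]
% with c in g kg^{-1} of dry matter, m in kg of dry matter per plant, and E in g per plant
```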

Relevance: 100.00%

Abstract:

This thesis is composed of three main parts. The first consists of a state of the art of the different notions that are significant for understanding the elements surrounding art authentication in general, and signatures in particular, and that the author deemed necessary to fully grasp the microcosm that makes up this particular market. Individuals with a solid knowledge of the art and expertise area who are particularly interested in the present study are advised to advance directly to Chapter 4. The expertise of the signature, its reliability, and the factors impacting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature and given in light of all the issues exposed. These guidelines are specifically formulated for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination.

The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and establishing their relevance, in order to determine the separation capacity between groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and are used to attribute their authorship to given artists.

An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints of its application by signature and handwriting experts in forensic science. The outlines presented below summarize the aims and main themes addressed in each chapter.

Part I - Theory

Chapter 1 presents the legal aspects surrounding the authentication of works of art by art experts: the definition of what is legally authentic, the quality and types of experts who can express an opinion concerning the authorship of a specific painting, and standard deontological rules. The practices applied in Switzerland are specifically dealt with.

Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories, so an understanding of their role in the art authentication process is vital. The added value that a signature expertise can have in comparison to other scientific techniques is also addressed.

Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century on and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 show the reader the rich sources of information that can be used to describe a painting, and how the signature is one of these sources.

Chapter 4 focuses on the different hypotheses the forensic handwriting examiner (FHE) must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may influence painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure of signatures in forensic science in general, and of painted signatures in particular, is exposed. The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously.

Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence, and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of the elements surrounding handwriting examinations, in order to better communicate results and conclusions to an audience, is also undertaken.

Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state-of-the-art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community and discusses the increasing number of claims of the unscientific nature of signature authentication. The necessity of aiming for more scientific, comprehensive, and transparent authentication methods is discussed. The theoretical part of this thesis is concluded by a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues it exposed, and can also be applied to the traditional examination of signatures (on paper).

Part II - Experimental part

Chapter 7 describes and defines the sampling, extraction, and analysis phases of the research. The sampling of artists' signatures and of their respective simulations is presented, followed by the steps undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study. Finally, the procedure for analysing these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented.

Chapter 8 outlines the results concerning both the artist and simulation corpuses after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood ratio evaluation are the main themes addressed. The discrimination power between the two corpuses is illustrated through multivariate analysis.

Part III - Discussion

Chapter 9 discusses the materials, the methods, and the obtained results of the research. The opportunities, but also the constraints and limits, of the developed method are exposed. Future work that can be carried out subsequent to the results of the study is also presented.

Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the previous chapters into traditional signature expertise procedures. The strength of this expertise is thus discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners in forensic science. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners.

In conclusion, the research highlights the interdisciplinary aspect of the examination of signatures on paintings. The current state of knowledge of the judicial quality of art experts, along with the scientific and historical analysis of paintings and signatures, is reviewed to give the reader a feel for the different factors that have an impact on this particular subject. The uneven acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of a better recognition of signature expertise by courts of law. This general acceptance, however, can only be achieved by producing high-quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits of this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as its second key contribution, this work proposes a procedure to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step towards providing the forensic, judicial, and art communities with a sound reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).
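
To make the likelihood-ratio evaluation mentioned in Chapter 8 concrete, the sketch below computes an LR for a single hypothetical signature feature under Gaussian models fitted to an authentic and a simulated corpus. The thesis itself uses a multivariate, feature-selected model, so this is only a toy illustration with placeholder data.

```python
# Minimal sketch of a likelihood-ratio evaluation for one measurable signature
# feature (e.g. a normalized stroke length), assuming Gaussian distributions
# fitted to the authentic and simulated corpora. Data are hypothetical.
import numpy as np
from scipy.stats import norm

authentic = np.array([0.92, 0.88, 0.95, 0.90, 0.93, 0.91])   # feature values, artist's corpus
simulated = np.array([0.70, 0.78, 0.65, 0.74, 0.81, 0.69])   # feature values, forger corpus

mu_a, sd_a = authentic.mean(), authentic.std(ddof=1)
mu_s, sd_s = simulated.mean(), simulated.std(ddof=1)

def likelihood_ratio(x: float) -> float:
    """LR = p(x | authentic) / p(x | simulated) for the questioned signature."""
    return norm.pdf(x, mu_a, sd_a) / norm.pdf(x, mu_s, sd_s)

questioned = 0.89
print(f"LR for questioned feature value {questioned}: {likelihood_ratio(questioned):.1f}")
```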

Relevance: 100.00%

Abstract:

To date, no effective chiral capillary electrophoresis-mass spectrometry (CE-MS) method has been reported for the simultaneous enantioseparation of the antidepressant drug venlafaxine (VX) and its structurally similar major metabolite, O-desmethylvenlafaxine (O-DVX). This is mainly due to the difficulty of identifying an MS-compatible chiral selector that provides both high enantioselectivity and sensitive MS detection. In this work, poly-sodium N-undecenoyl-L,L-leucylalaninate (poly-L,L-SULA) was employed as the chiral selector after screening several dipeptide polymeric chiral surfactants. Baseline separation of both O-DVX and VX enantiomers was achieved in 15 min after optimizing the buffer pH, poly-L,L-SULA concentration, nebulizer pressure, and separation voltage. Calibration curves in spiked plasma (recoveries higher than 80%) were linear over the concentration range 150-5000 ng/mL for both VX and O-DVX. The limit of detection (LOD) was found to be as low as 30 ng/mL and 21 ng/mL for O-DVX and VX, respectively. The method was successfully applied to measure plasma concentrations in human volunteers receiving VX or O-DVX orally, with and without co-administered indinavir therapy. The results suggest that micellar electrokinetic chromatography electrospray ionization-tandem mass spectrometry (MEKC-ESI-MS/MS) is an effective, low-cost alternative technique for pharmacokinetic and pharmacodynamic studies of both O-DVX and VX enantiomers. The technique has the potential to identify drug-drug interactions involving VX and O-DVX enantiomers during indinavir therapy.
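
As a hedged illustration of two routine figures of merit reported above (spiked-plasma recovery and calibration linearity), the sketch below uses hypothetical numbers; the abstract does not state which LOD convention was applied, so none is assumed.

```python
# Sketch of two routine checks behind the reported figures: spiked-plasma
# recovery and calibration linearity. Values are hypothetical placeholders.
import numpy as np

spiked = 500.0                                    # ng/mL added to blank plasma
measured = 412.0                                  # ng/mL found after extraction
recovery = 100.0 * measured / spiked
print(f"extraction recovery: {recovery:.0f}%")    # should exceed ~80% per the study

conc = np.array([150, 300, 600, 1250, 2500, 5000], dtype=float)   # ng/mL
response = np.array([0.9, 1.8, 3.7, 7.6, 15.1, 30.4])             # corrected peak area
slope, intercept = np.polyfit(conc, response, 1)
r2 = np.corrcoef(conc, response)[0, 1] ** 2
print(f"linearity over 150-5000 ng/mL: r^2 = {r2:.4f}")
```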

Relevance: 100.00%

Abstract:

Post-testicular sperm maturation occurs in the epididymis. The ion concentrations and proteins secreted into the epididymal lumen, together with testicular factors, are believed to be responsible for the maturation of spermatozoa. Disrupting the maturation of spermatozoa in the epididymis provides a promising strategy for generating a male contraceptive. However, little is known about the proteins involved. For drug development, it is also essential to have tools to study the function of these proteins in vitro. One approach for screening novel targets is to study the secretory products of the epididymis or the G protein-coupled receptors (GPCRs) that are involved in the maturation process of the spermatozoa. The modified Ca2+ imaging technique used to monitor release from PC12 pheochromocytoma cells can also be applied to monitor secretory products involved in the maturational processes of spermatozoa. PC12 pheochromocytoma cells were chosen for evaluation of this technique as they release catecholamines from their cell body, thus behaving like endocrine secretory cells. The results of the study demonstrate that depolarisation of nerve growth factor-differentiated PC12 cells releases factors which activate nearby, randomly distributed HEL erythroleukemia cells. Thus, during the release process, the ligands reach concentrations high enough to activate receptors even in cells some distance from the release site. This suggests that communication between randomly dispersed cells is possible even if the actual quantities of transmitter released are extremely small. The development of a novel method to analyse GPCR-dependent Ca2+ signalling in living slices of mouse caput epididymis provides an additional tool for screening for drug targets. With this technique it was possible to analyse functional GPCRs in the epithelial cells of the ductus epididymis. The results revealed that both P2X- and P2Y-type purinergic receptors are responsible for the rapid and transient Ca2+ signal detected in the epithelial cells of caput epididymides. Immunohistochemical and reverse transcriptase-polymerase chain reaction (RT-PCR) analyses showed the expression of at least the P2X1, P2X2, P2X4 and P2X7, and P2Y1 and P2Y2 receptors in the epididymis. Searching for epididymis-specific promoters for transgene delivery into the epididymis is of key importance for the development of specific models for drug development. We used EGFP as the reporter gene to identify suitable promoters to deliver transgenes into the epithelial cells of the mouse epididymis in vivo. Our results revealed that the 5.0 kb murine Glutathione peroxidase 5 (GPX5) promoter can be used to target transgene expression to the epididymis, while the 3.8 kb Cysteine-rich secretory protein-1 (CRISP-1) promoter can be used to target transgene expression to the testis. Although the visualisation of EGFP in living cells in culture usually poses few problems, the detection of EGFP in tissue sections can be more difficult, because soluble EGFP molecules can be lost if the cell membrane is damaged by freezing, sectioning, or permeabilisation. Furthermore, the fluorescence of EGFP is dependent on its conformation; therefore, fixation protocols that immobilise EGFP may also destroy its usefulness as a fluorescent reporter. We therefore developed novel tissue preparation and preservation techniques for EGFP.

In addition, fluorescence spectrophotometry with epididymal epithelial cells in suspension revealed the expression of functional purinergic, adrenergic, cholinergic, and bradykinin receptors in these cell lines (mE-Cap27 and mE-Cap28). In conclusion, we developed new tools for studying the role of the epididymis in sperm maturation. We developed a new technique to analyse GPCR-dependent Ca2+ signalling in living slices of mouse caput epididymis. In addition, we improved the method of detecting reporter gene expression. Furthermore, we characterised two epididymis-specific gene promoters, analysed the expression of GPCRs in epididymal epithelial cells, and developed a novel technique for the measurement of secretion from cells.

Relevance: 100.00%

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces provide web users with online access to myriad databases on the Web. In order to obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from a database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases.

Characterizing the deep Web: Though the term deep Web was coined in 2000, a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web carried out so far are predominantly based on studies of deep web sites in English. One can then expect that findings from these surveys may be biased, especially owing to a steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web.

Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches assume that the search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions rarely hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web proposed so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.

Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user. This is all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, but such queries are essential for many web searches, especially in the area of e-commerce. In this way, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts, and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. Besides, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
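
As a toy illustration of the form-processing step described above (extracting field names and label text from a search interface), the sketch below parses a plain HTML form with Python's standard library. The I-Crawler itself goes much further, handling JavaScript-rich and non-HTML forms; nothing here reproduces its actual architecture.

```python
# Toy illustration of one step in querying the deep Web: pulling input names
# and <label> text out of a search form in plain HTML.
from html.parser import HTMLParser

class FormFieldParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_form = False
        self.in_label = False
        self.fields = []      # input/select/textarea names found inside <form>
        self.labels = []      # visible label texts

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self.in_form = True
        elif self.in_form and tag in ("input", "select", "textarea"):
            if attrs.get("name"):
                self.fields.append(attrs["name"])
        elif self.in_form and tag == "label":
            self.in_label = True

    def handle_endtag(self, tag):
        if tag == "form":
            self.in_form = False
        elif tag == "label":
            self.in_label = False

    def handle_data(self, data):
        if self.in_label and data.strip():
            self.labels.append(data.strip())

html = '<form action="/search"><label>Title</label><input name="q"><select name="lang"></select></form>'
parser = FormFieldParser()
parser.feed(html)
print(parser.fields, parser.labels)    # ['q', 'lang'] ['Title']
```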

Relevance: 100.00%

Abstract:

Objectives: The purpose of this study was to determine the incidence and clinical symptoms associated with sharp mandibular bone irregularities (SMBI) after lower third molar extraction and to identify possible risk factors for this complication. Study Design: A mixed study design was used: a retrospective cohort study of 1432 lower third molar extractions was done to determine the incidence of SMBI, and a retrospective case-control study was done to determine potential demographic and etiologic factors by comparing patients with postoperative SMBI with controls. Results: Twelve SMBI were found (0.84%). Age was the most important risk factor for this complication. The operated side and the presence of an associated radiolucent image were also significantly related to the development of mandibular bone irregularities. The depth of impaction of the tooth might also be an important factor, since erupted or nearly erupted third molars were more frequent in the SMBI group. Conclusions: SMBI are a rare postoperative complication of lower third molar removal. Older patients having left-side lower third molars removed are more likely to develop this problem. The treatment should be removal of the irregularity when the patient is symptomatic.

Relevance: 100.00%

Abstract:

This study concerns certain problems inherent in the determination of fat-soluble vitamins in food, from extraction methods to identification and quantification. The discussion covers the main official and unofficial extraction methods coupled with spectrophotometric and HPLC techniques, in which vitamin samples are obtained through liquid-liquid-solid and liquid-liquid-solid-solid extraction, indispensable for the analytical separation of the different chemical compounds with vitamin functions. A saponification stage, possibly coupled with supercritical fluid extraction, appears to be mandatory in the determination of vitamins A and E in their alcoholic forms. Alternative identification and quantification procedures are outlined: biological and chemical assays, analytical separations by HPLC (normal- and reversed-phase), UV detection (all fat-soluble vitamins), and fluorescence detection (retinoids and tocopherols). Automation from the sample preparation to the quantification stages increases the data acquisition rate.

Relevance: 100.00%

Abstract:

An efficient flotation method, based on the combination of flame atomic absorption spectrometry (FAAS) with a separation and preconcentration step, for the determination of Cr3+, Cu2+, Co2+, Ni2+, Zn2+, Cd2+, Fe3+, and Pb2+ ions in various real samples was studied, using bis(2-hydroxyacetophenone)-1,4-butanediimine (BHABDI) as a new collector. The variables affecting the efficiency of the extraction system were evaluated: pH, amount of BHABDI as collector, sample matrix, type and amount of eluting agent, type and amount of surfactant as floating agent, ionic strength, and air flow rate. It was ascertained that metal ions such as iron can be separated simultaneously from the matrix in the presence of 0.012 mM ligand and 0.025% (w/v) CTAB in a 750 mL test sample at pH 6.5. These ions can be eluted quantitatively with 6 mL of 1.0 mol L-1 HNO3 in methanol, which leads to an enrichment factor of 125. The detection limits for the analyte ions were in the range of 1.3-2.4 ng mL-1. The method has been successfully applied to the determination of trace amounts of these ions in various real samples.
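
For reference, the reported enrichment factor of 125 is consistent with the ratio of the sample volume to the eluent volume, noting that enrichment factors are sometimes instead defined from the ratio of calibration slopes.

```latex
% Preconcentration (enrichment) factor consistent with the reported volumes:
\[ EF = \frac{V_{\mathrm{sample}}}{V_{\mathrm{eluent}}} = \frac{750\ \mathrm{mL}}{6\ \mathrm{mL}} = 125 \]
```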

Relevance: 100.00%

Abstract:

The importance of medicinal plants and their use in industrial applications is increasing worldwide, especially in Brazil. Phyllanthus species, popularly known as "quebra-pedras" in Brazil, are used in folk medicine for treating urinary infections and renal calculi. This paper reports an authenticity study of herbal drugs from Phyllanthus species, involving commercial and authentic samples analyzed by spectroscopic techniques (FT-IR, ¹H HR-MAS NMR, and ¹H NMR in solution) combined with chemometric analysis. The spectroscopic techniques evaluated, coupled with chemometric methods, have great potential for the investigation of complex matrices. Furthermore, several metabolites were identified by the NMR techniques.
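
As a sketch of the kind of chemometric step typically combined with such spectroscopic fingerprints, the snippet below runs a principal component projection on mean-centred spectra. The data are random placeholders, and the exact preprocessing and chemometric method used in the paper are not reproduced here.

```python
# Sketch of a typical chemometric step for spectral authenticity screening:
# mean-centre the spectra and project them onto their first principal components.
import numpy as np

rng = np.random.default_rng(0)
spectra = rng.normal(size=(20, 500))        # 20 samples x 500 spectral points (placeholder)

X = spectra - spectra.mean(axis=0)          # mean-centre each variable
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :2] * S[:2]                   # sample coordinates on PC1 and PC2
explained = (S ** 2) / np.sum(S ** 2)

print("variance explained by PC1, PC2:", explained[:2].round(3))
# Authentic and commercial samples would then be compared in the PC1-PC2 score plot.
```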

Relevance: 100.00%

Abstract:

This paper describes a three-week mini-project for an Experimental Organic Chemistry course. The activities include the N-C cross-coupling synthesis of N-(4-methoxyphenyl)benzamide in an adapted microwave oven using a copper catalyst (CuI). Skills and concepts normally present in practical organic chemistry courses are covered: use of balances and volumetric glassware, separation of mixtures (liquid-liquid extraction and filtration), chromatographic techniques, melting point determination, and stoichiometric calculations.
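
As an example of the stoichiometric calculations the mini-project practises, the sketch below computes a limiting reagent and theoretical yield for N-(4-methoxyphenyl)benzamide. The reagent pair (benzamide and 4-iodoanisole) and the masses are illustrative assumptions, not the published procedure.

```python
# Example stoichiometric calculation: theoretical yield of
# N-(4-methoxyphenyl)benzamide (C14H13NO2, 227.26 g/mol).
# The reagent pair and masses below are assumptions for illustration only.
M_benzamide = 121.14        # g/mol
M_iodoanisole = 234.03      # g/mol (assumed coupling partner)
M_product = 227.26          # g/mol

m_benzamide = 0.61          # g (assumed)
m_iodoanisole = 1.17        # g (assumed)

n_benzamide = m_benzamide / M_benzamide
n_iodoanisole = m_iodoanisole / M_iodoanisole
n_limiting = min(n_benzamide, n_iodoanisole)        # 1:1 coupling stoichiometry

theoretical_yield = n_limiting * M_product
print(f"limiting reagent: {'benzamide' if n_benzamide < n_iodoanisole else '4-iodoanisole'}")
print(f"theoretical yield: {theoretical_yield:.2f} g")
# Percent yield = 100 * isolated mass / theoretical_yield
```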

Relevance: 100.00%

Abstract:

In this study, a procedure was developed for the cloud point extraction of Pd(II) and Rh(III) ions from aqueous solution using Span 80 (a non-ionic surfactant) prior to their determination by flame atomic absorption spectrometry. The method is based on the extraction of Pd(II) and Rh(III) ions at pH 10 using Span 80 with no chelating agent. We investigated the effect of various parameters on the recovery of the analyte ions, including pH, equilibration temperature and time, concentration of Span 80, and ionic strength. Under the best experimental conditions, the limits of detection based on 3Sb for Pd(II) and Rh(III) ions were 1.3 and 1.2 ng mL-1, respectively. Seven replicate determinations of a mixture of 0.5 µg mL-1 palladium and rhodium ions gave mean absorbances of 0.058 and 0.053 with relative standard deviations of 1.8 and 1.6%, respectively. The developed method was successfully applied to the extraction and determination of palladium and rhodium ions in road dust and standard samples, and satisfactory results were obtained.
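
The two figures of merit quoted above follow the usual definitions; the abstract itself states the 3Sb convention for the detection limit.

```latex
% LOD from the 3Sb convention, and the RSD of the replicate determinations:
\[ \mathrm{LOD} = \frac{3\,S_b}{m}, \qquad \mathrm{RSD}\,(\%) = 100\,\frac{s}{\bar{A}} \]
% S_b: standard deviation of the blank signal; m: calibration slope;
% s, \bar{A}: standard deviation and mean of the seven replicate absorbances
```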

Relevance: 100.00%

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have gained the majority of attention in the field. In this thesis we focus on another type of learning problem, that of learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we recover the bipartite ranking problem, which corresponds to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction, and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, which is one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
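
To make the pairwise least-squares idea behind RankRLS concrete, the sketch below solves the naive linear version of the objective by explicitly forming all training pairs. The thesis's contribution is precisely the set of matrix-algebra shortcuts that avoid this O(n²) construction, so this is only an illustrative baseline on placeholder data.

```python
# Naive linear sketch of the regularized pairwise least-squares objective:
# fit w so that score differences match label differences over all training
# pairs, plus a ridge penalty. Not the efficient algorithm from the thesis.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))                              # 50 examples, 10 features (placeholder)
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=50)    # placeholder relevance scores
lam = 1.0                                                  # regularization parameter

# Build all pairwise differences (i, j), i < j.
i, j = np.triu_indices(len(y), k=1)
Xd = X[i] - X[j]
yd = y[i] - y[j]

# Closed-form ridge solution on the difference representation.
d = X.shape[1]
w = np.linalg.solve(Xd.T @ Xd + lam * np.eye(d), Xd.T @ yd)

# Fraction of pairs ranked in the correct order (pairwise accuracy, which
# coincides with AUC in the bipartite case).
scores = X @ w
correct = np.mean(np.sign(scores[i] - scores[j]) == np.sign(yd))
print(f"training pairwise accuracy: {correct:.3f}")
```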