912 results for Appearance-based methods


Relevância: 40.00%

Resumo:

Two concentration methods for the fast and routine determination of caffeine (using HPLC-UV detection) in surface water and wastewater are evaluated. Both methods are based on solid-phase extraction (SPE) concentration with octadecyl silica sorbents. A common “offline” SPE procedure shows that quantitative recovery of caffeine is obtained with 2 mL of a methanol-water elution mixture containing at least 60% methanol. The method detection limit is 0.1 μg L−1 when percolating 1 L samples through the cartridge. The development of an “online” SPE method based on a mini-SPE column, containing 100 mg of the same sorbent, directly connected to the HPLC system allows the method detection limit to be decreased to 10 ng L−1 with a sample volume of 100 mL. The “offline” SPE method is applied to the analysis of caffeine in wastewater samples, whereas the “online” method is used for analysis in natural waters from streams receiving significant water intakes from local wastewater treatment plants.
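As a rough back-of-the-envelope check (using only the volumes quoted in the abstract, and assuming quantitative recovery), the ideal preconcentration factor of an SPE step is simply the ratio of sample volume to eluate volume:

```python
# Sketch only: ideal enrichment from SPE preconcentration, assuming
# quantitative (100%) recovery of the analyte in the eluate.

def enrichment_factor(sample_volume_ml: float, eluate_volume_ml: float) -> float:
    """Ideal preconcentration factor for an SPE step."""
    return sample_volume_ml / eluate_volume_ml

# "Offline" SPE: a 1 L sample eluted in 2 mL gives a 500-fold enrichment.
print(enrichment_factor(1000, 2))  # 500.0
```

In practice the achievable detection limit also depends on recovery and injection volume, which is why the "online" method reaches 10 ng L−1 with only a 100 mL sample.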

Relevância: 40.00%

Resumo:

AIMS: To investigate the relationships between gestational diabetes mellitus (GDM) and the metabolic syndrome (MS), as it has been suggested that insulin resistance is the hallmark of both conditions, and to analyse post-partum screening in order to identify risk factors for the subsequent development of type 2 diabetes mellitus (DM). METHODS: A retrospective analysis of all singleton pregnancies diagnosed with GDM at the Lausanne University Hospital over 3 consecutive years. Pre-pregnancy obesity, hypertension and dyslipidaemia were recorded as constituents of the MS. RESULTS: Among 5788 deliveries, 159 women (2.7%) with GDM were identified. Constituents of the MS were present before the GDM pregnancy in 26% (n = 37/144): 84% (n = 31/37) were obese, 38% (n = 14/37) had hypertension and 22% (n = 8/37) had dyslipidaemia. Gestational hypertension was associated with obesity (OR = 3.2, P = 0.02) and dyslipidaemia (OR = 5.4, P = 0.002). Seventy-four women (47%) returned for a post-partum OGTT, which was abnormal in 20 women (27%): 11% (n = 8) had type 2 diabetes and 16% (n = 12) had impaired glucose tolerance. Independent predictors of abnormal glucose tolerance in the post-partum period were having > 2 abnormal values on the diagnostic OGTT during pregnancy and presenting MS constituents (OR = 5.2, CI 1.8-23.2 and OR = 5.3, CI 1.3-22.2). CONCLUSIONS: In one quarter of GDM pregnancies, metabolic abnormalities precede the appearance of glucose intolerance. These women have a high risk of developing the MS and type 2 diabetes in later years. Where GDM screening is not universal, practitioners should be aware of these metabolic risks in every pregnant woman presenting with obesity, hypertension or dyslipidaemia, in order to achieve better diagnosis and, especially, better post-partum follow-up and treatment.
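The odds ratios reported above come from standard 2x2-table analysis. A minimal sketch, using hypothetical counts rather than the study's data, of how an OR and an approximate 95% confidence interval (Woolf's log-OR method) are computed:

```python
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Approximate 95% CI via the standard error of the log OR
    (Woolf's method): SE = sqrt(1/a + 1/b + 1/c + 1/d)."""
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_or = log(odds_ratio(a, b, c, d))
    return exp(log_or - 1.96 * se), exp(log_or + 1.96 * se)

# Hypothetical counts, for illustration only:
print(round(odds_ratio(20, 10, 15, 30), 2))  # 4.0
```

The wide intervals quoted in the abstract (e.g. 1.8-23.2) are typical of log-scale CIs computed from small cell counts.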

Relevância: 40.00%

Resumo:

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to the improvement of the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two human dimensions have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low cost and free of side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
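One commonly used non-linear feature of the kind mentioned above is Katz's fractal dimension of a waveform; the sketch below is illustrative and not necessarily the estimator used in the study:

```python
import math

def katz_fd(signal):
    """Katz fractal dimension of a 1-D signal, treating the sample index
    as the x-coordinate: FD = log10(n) / (log10(n) + log10(d / L)),
    where L is the total curve length, d the maximum distance from the
    first point, and n the number of steps."""
    n = len(signal) - 1
    L = sum(math.hypot(1.0, signal[i + 1] - signal[i]) for i in range(n))
    d = max(math.hypot(i, signal[i] - signal[0]) for i in range(1, n + 1))
    return math.log10(n) / (math.log10(n) + math.log10(d / L))

# A straight line has FD = 1; irregular (e.g. pathological speech)
# signals score higher.
line = [0.5 * i for i in range(100)]
print(round(katz_fd(line), 3))  # 1.0
```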

Relevância: 40.00%

Resumo:

A statewide study was performed to develop regional regression equations for estimating selected annual exceedance-probability statistics for ungaged stream sites in Iowa. The study area comprises streamgages located within Iowa and 50 miles beyond the State’s borders. Annual exceedance-probability estimates were computed for 518 streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data through 2010. The estimation of the selected statistics included a Bayesian weighted least-squares/generalized least-squares regression analysis to update regional skew coefficients for the 518 streamgages. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low flows. Also, geographic information system software was used to measure 59 selected basin characteristics for each streamgage. Regional regression analysis, using generalized least-squares regression, was used to develop a set of equations for each flood region in Iowa for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. A total of 394 streamgages were included in the development of regional regression equations for three flood regions (regions 1, 2, and 3) that were defined for Iowa based on landform regions and soil regions. Average standard errors of prediction range from 31.8 to 45.2 percent for flood region 1, 19.4 to 46.8 percent for flood region 2, and 26.5 to 43.1 percent for flood region 3.
The pseudo coefficients of determination for the generalized least-squares equations range from 90.8 to 96.2 percent for flood region 1, 91.5 to 97.9 percent for flood region 2, and 92.4 to 96.0 percent for flood region 3. The regression equations are applicable only to stream sites in Iowa with flows not significantly affected by regulation, diversion, channelization, backwater, or urbanization and with basin characteristics within the range of those used to develop the equations. These regression equations will be implemented within the U.S. Geological Survey StreamStats Web-based geographic information system tool. StreamStats allows users to click on any ungaged site on a river and compute estimates of the eight selected statistics; in addition, the Web-based tool also provides 90-percent prediction intervals and the measured basin characteristics for the ungaged sites. StreamStats also allows users to click on any streamgage in Iowa to obtain estimates of the same eight statistics computed for that streamgage.
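The stated equivalence between annual exceedance probabilities and flood-frequency recurrence intervals is the simple reciprocal relationship T = 1/p, which can be checked directly against the values listed above:

```python
# Recurrence interval T (years) from an annual exceedance probability
# p given in percent: T = 1 / (p / 100) = 100 / p.

def recurrence_interval(aep_percent: float) -> float:
    """Flood-frequency recurrence interval in years for a given
    annual exceedance probability in percent."""
    return 100.0 / aep_percent

aeps = [50, 20, 10, 4, 2, 1, 0.5, 0.2]
print([round(recurrence_interval(p)) for p in aeps])
# [2, 5, 10, 25, 50, 100, 200, 500]
```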

Relevância: 40.00%

Resumo:

PURPOSE: Pharmacologic modulation of wound healing after glaucoma filtering surgery remains a major clinical challenge in ophthalmology. Poly(ortho ester) (POE) is a bioerodible and biocompatible viscous polymer potentially useful as a sustained drug delivery system that allows the frequency of intraocular injections to be reduced. The purpose of this study was to determine the efficacy of POE containing a precise amount of 5-fluorouracil (5-FU) in an experimental model of filtering surgery in the rabbit. METHODS: Trabeculectomy was performed in pigmented rabbit eyes. An ointment-like formulation of POE containing 1% wt/wt 5-FU was injected subconjunctivally at the site of surgery, during the procedure. Intraocular pressure (IOP), bleb persistence, and ocular inflammatory reaction were monitored until postoperative day 30. Quantitative analysis of 5-FU was performed in the anterior chamber. Histologic analysis was used to assess the appearance of the filtering fistula and the polymer's biocompatibility. RESULTS: The decrease in IOP from baseline and the persistence of the filtering bleb were significantly more marked in the 5-FU-treated eyes during postoperative days 9 through 28. Corneal toxicity triggered by 5-FU was significantly lower in the group that received 5-FU in POE compared with a 5-FU tamponade. Histopathologic evaluation showed that POE was well tolerated, and no fibrosis occurred in eyes treated with POE containing 5-FU. CONCLUSIONS: In this rabbit model of trabeculectomy, the formulation based on POE and containing a precise amount of 5-FU reduced IOP and prolonged bleb persistence in a way similar to the conventional method of a 5-FU tamponade, while significantly reducing 5-FU toxicity.

Relevância: 40.00%

Resumo:

This paper presents the evaluation results of the methods submitted to Challenge US: Biometric Measurements from Fetal Ultrasound Images, a segmentation challenge held at the IEEE International Symposium on Biomedical Imaging 2012. The challenge was set up to compare and evaluate current fetal ultrasound image segmentation methods. It consisted of automatically segmenting fetal anatomical structures to measure standard obstetric biometric parameters from 2D fetal ultrasound images taken of fetuses at different gestational ages (21 weeks, 28 weeks, and 33 weeks) and with varying image quality, to reflect data encountered in real clinical environments. Four independent sub-challenges were proposed, according to the objects of interest measured in clinical practice: abdomen, head, femur, and whole fetus. Five teams participated in the head sub-challenge and two teams in the femur sub-challenge, including one team that tackled both. No teams attempted the abdomen and whole-fetus sub-challenges. The challenge goals were twofold: participants were asked to submit both the segmentation results and the measurements derived from the segmented objects. Extensive quantitative (region-based, distance-based, and Bland-Altman measurements) and qualitative evaluation was performed to compare the results from a representative selection of current methods submitted to the challenge. Several experts (three for the head sub-challenge and two for the femur sub-challenge), with different degrees of expertise, manually delineated the objects of interest to define the ground truth used within the evaluation framework. For the head sub-challenge, several groups produced results that could potentially be used in clinical settings, with performance comparable to manual delineations. The femur sub-challenge showed inferior performance to the head sub-challenge because it is a harder segmentation problem and because the techniques presented relied more on the femur's appearance.
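Region-based evaluation of segmentations of this kind typically relies on overlap measures such as the Dice similarity coefficient; here is a minimal sketch on toy masks (the challenge's exact evaluation metrics may differ):

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks, each given
    as a set of pixel coordinates: 2|A ∩ B| / (|A| + |B|)."""
    intersection = len(mask_a & mask_b)
    return 2.0 * intersection / (len(mask_a) + len(mask_b))

# Toy example: a 2x2 ground-truth region vs. a slightly shifted prediction.
truth = {(0, 0), (0, 1), (1, 0), (1, 1)}
prediction = {(0, 1), (1, 1), (2, 1)}
print(round(dice(truth, prediction), 3))  # 0.571
```

A Dice value of 1 means perfect overlap; values near those of inter-expert agreement are what make a method "potentially usable in clinical settings".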

Relevância: 40.00%

Resumo:

BACKGROUND: Recent neuroimaging studies suggest that value-based decision-making may rely on mechanisms of evidence accumulation. However, no studies have explicitly investigated the time at which single decisions are taken based on such an accumulation process. NEW METHOD: Here, we outline a novel electroencephalography (EEG) decoding technique which is based on accumulating the probability of appearance of prototypical voltage topographies and can be used for predicting subjects' decisions. We use this approach for studying the time course of single decisions, during a task where subjects were asked to compare reward vs. loss points for accepting or rejecting offers. RESULTS: We show that, based on this new method, we can accurately decode decisions for the majority of the subjects. The typical time period for accurate decoding was modulated by task difficulty on a trial-by-trial basis. Typical latencies of when decisions are made were detected at ∼500 ms for 'easy' vs. ∼700 ms for 'hard' decisions, well before subjects' response (∼340 ms). Importantly, this decision time correlated with the drift rates of a diffusion model, evaluated independently at the behavioral level. COMPARISON WITH EXISTING METHOD(S): We compare the performance of our algorithm with logistic regression and support vector machines and show that we obtain significant results for a higher number of subjects than with these two approaches. We also carry out analyses at the average event-related potential level, for comparison with previous studies on decision-making. CONCLUSIONS: We present a novel approach for studying the timing of value-based decision-making, by accumulating patterns of topographic EEG activity at the single-trial level.
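The accumulation idea can be caricatured as a bounded evidence integrator: per-frame evidence for one option is summed until a decision bound is crossed, and the crossing time serves as the estimated decision time. The following is a toy sketch with made-up numbers, not the paper's actual topography-probability decoder:

```python
def accumulate_decision(log_odds_per_frame, threshold=3.0):
    """Toy single-trial decoder: accumulate per-frame evidence (here,
    the log-odds that the observed topography matches the 'accept'
    prototype rather than the 'reject' one) until a bound is crossed.
    Returns (decision, frame_index), or (None, None) if no bound is hit."""
    total = 0.0
    for t, evidence in enumerate(log_odds_per_frame):
        total += evidence
        if total >= threshold:
            return "accept", t
        if total <= -threshold:
            return "reject", t
    return None, None

# Hypothetical evidence stream (illustrative values, not EEG data):
evidence_stream = [0.4, 0.2, 0.9, 0.6, 1.1, 0.3]
print(accumulate_decision(evidence_stream))  # ('accept', 4)
```

Weaker per-frame evidence ('hard' trials) crosses the bound later, which mirrors the ∼500 ms vs. ∼700 ms latencies and the link to diffusion-model drift rates.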

Relevância: 40.00%

Resumo:

WAP will play an important role in the future when a suitable data transfer protocol is sought for new mobile services. Although WAP in some respects failed in its first coming, its popularity will certainly grow in the future. WAP's weak popularity was not due to the data transfer properties of the protocol but to the immaturity of WAP services. Future services, however, will be more advanced, and the popularity of WAP will grow. The most recently introduced service using WAP is MMS. As new WAP-based services become common, this also sets new requirements for the WAP gateway. This thesis examines different possibilities for measuring the quality of service of WAP services in a mobile setting. A mobile measurement component for WAP services is also implemented as part of a larger software system. The aim is to implement a measurement component that emulates a real end user as closely as possible.

Relevância: 40.00%

Resumo:

Image quality is among the most studied and most widely applied topics. This thesis examines colour quality and spectral images. An overview is given of existing quality-assessment methods for compressed and individual images, with emphasis on applying these methods to spectral images. A spectral colour appearance model for the quality assessment of colour images is introduced. The model is applied to colour images reproduced from spectral images. It is based both on a statistical spectral image model, which links the parameters of spectral images and photographs, and on the overall appearance of the image. The relationship between the statistical spectral parameters and the physical parameters of colour images has been verified by computer-based image modelling. Based on the properties of the model, an experimental method for the quality assessment of colour images has been developed. An expert-based questionnaire method and a fuzzy inference system for colour-image quality assessment have been developed. The study shows that the spectral-colour relationship and the fuzzy inference system are effective for the quality assessment of colour images.

Relevância: 40.00%

Resumo:

Extension of shelf life and preservation of products are both very important for the food industry. However, just as with other processes, speed and higher manufacturing performance are also beneficial. Although microwave heating is utilized in a number of industrial processes, there are many unanswered questions about its effects on foods. Here we analyze whether the effects of continuous-flow microwave heating are equivalent to those of traditional heat-transfer methods. In our study, the effects of heating liquid foods by conventional and continuous-flow microwave heating were studied. Among other properties, we compared the stability of the liquid foods between the two heat treatments. Our goal was to determine whether continuous-flow microwave heating and conventional heating methods have the same effects on liquid foods and, therefore, whether microwave heat treatment can effectively replace conventional heat treatments. We compared the colour and phase-separation phenomena of the samples treated by the different methods. For milk, we also monitored the total viable cell count; for orange juice, the vitamin C content, in addition to the taste of the product assessed by sensory analysis. The majority of the results indicate that the circulating-coil microwave method used here is equivalent to the conventional heating method based on thermal conduction and convection. However, some results in the analysis of the milk samples show clear differences between the heat-transfer methods. According to our results, the colour parameters (lightness, red-green and blue-yellow values) of the microwave-treated samples differed not only from the untreated control, but also from the traditionally heat-treated samples. The differences are visually undetectable; however, they become evident through analytical measurement with a spectrophotometer. This finding suggests that besides thermal effects, microwave-based food treatment can alter product properties in other ways as well.
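Colour differences of the kind reported, measurable with a spectrophotometer yet visually undetectable, are conventionally quantified in CIELAB. A minimal CIE76 sketch with hypothetical instrument readings (not the study's measurements):

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two CIELAB triplets (L*, a*, b*):
    the Euclidean distance in Lab space. Differences below roughly 2-3
    units are generally hard to perceive visually."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical readings for a control and a microwave-treated milk sample:
control = (82.1, -1.8, 9.5)
treated = (81.4, -1.6, 10.1)
print(round(delta_e_ab(control, treated), 2))  # 0.94
```

A ΔE*ab well under 1, as in this made-up example, would be instrumentally evident but visually undetectable, which is consistent with the finding described above.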

Relevância: 40.00%

Resumo:

Connectivity analysis on diffusion MRI data of the whole brain suffers from distortions caused by the standard echo-planar imaging acquisition strategies. These images show characteristic geometrical deformations and signal destruction that are an important drawback limiting the success of tractography algorithms. Several retrospective correction techniques are readily available. In this work, we use a digital phantom designed for the evaluation of connectivity pipelines. We subject the phantom to a “theoretically correct” and plausible deformation that resembles the artifact under investigation. We correct the data back with three standard methodologies (namely fieldmap-based, reversed encoding-based, and registration-based). Finally, we rank the methods based on their geometrical accuracy, the dropout compensation, and their impact on the resulting connectivity matrices.

Relevância: 40.00%

Resumo:

Because of technical principles, samples to be observed with electron microscopy need to be fixed in a chemical process and exposed to vacuum conditions that can produce some changes in the morphology of the specimen. The aim of this work was to obtain high-resolution images of the fresh articular cartilage surface with an environmental scanning electron microscope (ESEM), an instrument that permits examination of biological specimens without fixation methods at a chamber pressure of 10 Torr, thus minimizing the risk of creating artifacts in the structure. Samples from weight-bearing areas of femoral condyles of New Zealand white rabbits were collected and photographed using an ESEM. Images were analyzed using a categorization based on the Jurvelin classification system as modified by Hong and Henderson. Elevations and depressions as described in the classification were observed, but no fractures or splits of the cartilage surface, which are thought to be artifacts, were detected. The ESEM is a useful tool for obtaining images of the fresh articular cartilage surface without either employing fixation methods or exposing the specimen to extreme vacuum conditions, reducing the risk of introducing artifacts within the specimen. For all these reasons it could become a useful tool for quality control of the preservation process of osteochondral allografting in a bank of musculoskeletal tissues.

Relevância: 40.00%

Resumo:

The drug discovery process is facing new challenges in the evaluation of lead compounds as the number of new compounds synthesized is increasing. The potency of test compounds is most frequently assayed through the binding of the test compound to the target molecule or receptor, or by measuring functional secondary effects caused by the test compound in target model cells, tissues or organisms. Modern homogeneous high-throughput screening (HTS) assays for purified estrogen receptors (ER) utilize various luminescence-based detection methods. Fluorescence polarization (FP) is a standard method for ER ligand-binding assays. It was used to demonstrate the performance of two-photon excitation of fluorescence (TPFE) vs. the conventional one-photon excitation method. As a result, the TPFE method showed improved dynamics, was found to be comparable with the conventional method, and also held potential for efficient miniaturization. Other luminescence-based ER assays utilize energy transfer from a long-lifetime luminescent label, e.g. lanthanide chelates (Eu, Tb), to a prompt luminescent label, the signal being read in a time-resolved mode. As an alternative to this method, a new single-label (Eu) time-resolved detection method was developed, based on the quenching of the label by a soluble quencher molecule when displaced from the receptor to the solution phase by an unlabeled competing ligand. The new method was paralleled with the standard FP method. It was shown to yield results comparable with the FP method and was found to hold a significantly higher signal-to-background ratio than FP. Cell-based functional assays for determining the extent of cell-surface adhesion molecule (CAM) expression, combined with microscopy analysis of the target molecules, would provide improved information content compared to an expression-level assay alone.
In this work, immune response was simulated by exposing endothelial cells to cytokine stimulation, and the resulting increase in the level of adhesion molecule expression was analyzed on fixed cells by means of immunocytochemistry, utilizing specific long-lifetime luminophore-labeled antibodies against chosen adhesion molecules. The results showed that the method was capable of use in a multi-parametric assay for the protein expression levels of several CAMs simultaneously, combined with analysis of the cellular localization of the chosen adhesion molecules through time-resolved luminescence microscopy inspection.

Relevância: 40.00%

Resumo:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have gained the majority of attention in the field. In this thesis we focus on another type of learning problem, that of learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we can recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods, based on this approach, has in the past proven to be challenging. Moreover, it is not clear what techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, and how the techniques can be implemented efficiently. The contributions of this thesis are as follows.
First, we develop RankRLS, a computationally efficient kernel method for learning to rank, that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, which is one of the most well established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using the approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts. Part I provides the background for the research work and summarizes the most central results, Part II consists of the five original research articles that are the main contribution of this thesis.
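The bipartite special case mentioned above makes the pairwise objective concrete: AUC equals the probability that a randomly drawn positive example is scored above a randomly drawn negative one (counting ties as half). A minimal sketch of this pairwise view of AUC:

```python
def pairwise_auc(scores_pos, scores_neg):
    """AUC computed as the fraction of positive-negative pairs ranked
    correctly (ties count as half a win). Minimizing the pairwise
    ranking loss is equivalent to maximizing this quantity."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy model scores for positive and negative examples:
pos = [0.9, 0.8, 0.4]
neg = [0.7, 0.3, 0.2]
print(round(pairwise_auc(pos, neg), 3))  # 0.889
```

This brute-force O(n_pos * n_neg) form is only for illustration; the computational shortcuts developed in the thesis are precisely about avoiding such explicit pair enumeration during training and cross-validation.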