875 results for Parameter extraction
Abstract:
The main objective of this study is to determine the effectiveness of the Electrochemical Chloride Extraction (ECE) technique on a bridge deck with very high concentrations of chloride. The ECE technique was used during the summer of 2003 to reverse the effects of corrosion that had occurred in the reinforcing steel embedded in the pedestrian bridge deck over Highway 6, along Iowa Avenue, in Iowa City, Iowa, USA. First, the half-cell potential was measured to determine the existing corrosion level in the field. The half-cell potential values were in the indecisive range of corrosion (between -200 mV and -350 mV). The ECE technique was then applied to remove the chloride from the bridge deck. The chloride content in the deck was significantly reduced, from 25 lb/cy to 4.96 lb/cy in 8 weeks. Concrete cores obtained from the deck were tested for compressive strength, and no reduction in strength due to the ECE technique was found. Laboratory tests were also performed to demonstrate the effectiveness of the ECE process. To simulate the corrosion in the bridge deck, two reinforced slabs and 12 reinforced beams were prepared. First, the half-cell potentials measured from the test specimens all ranged below -200 mV. Upon introduction of a 3% salt solution, the potential reached up to -500 mV. This potential was maintained while salt solution continued to be added for six months. The ECE technique was then applied to the test specimens to remove the chloride from them. Half-cell potentials were measured afterwards to determine whether the ECE technique can effectively reduce the level of corrosion.
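The -200 mV and -350 mV bands used above follow the common half-cell potential interpretation guideline (ASTM C876). As a minimal sketch, a single reading could be classified like this; the label wording is my own, not the study's:

```python
def corrosion_risk(potential_mv):
    """Classify corrosion probability from a half-cell potential
    reading (mV vs. Cu/CuSO4), using the -200/-350 mV guideline
    bands cited in the abstract."""
    if potential_mv > -200:
        return "low (<10% probability of active corrosion)"
    elif potential_mv >= -350:
        return "uncertain (indecisive range)"
    else:
        return "high (>90% probability of active corrosion)"

# Deck readings between -200 and -350 mV fall in the indecisive band:
print(corrosion_risk(-280))  # uncertain (indecisive range)
```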
Abstract:
Aim: Avoiding 'mini-laparotomy' to extract a colectomy specimen may decrease wound complications and further improve recovery after laparoscopic surgery. The aim of this study was to develop a new technique for transrectal specimen extraction (TRSE) and to compare it with conventional laparoscopy (CL) for left sided colectomy. Method: Eleven patients with benign disease requiring either sigmoid or left colon resection underwent TRSE. The unfired circular stapler was inserted transanally and used as a guide to suture-close the recto-sigmoid junction laparoscopically and as a handle to pull the sutured sigmoid through the opened rectum inside a laparoscopic camera bag. The anvil was inserted into the lumen of the intussuscepted sigmoid and pushed to the level of the anastomosis. The anastomosis was fashioned end-to-end in the first patients and side-to-end in the following patients to improve safety. Intra-operative and postoperative outcomes of patients undergoing TRSE were compared with those of a group of 20 patients undergoing CL, who were matched for type of resection, body mass index and age. Results: The procedure was successful in all but the first patient, who was converted to conventional laparoscopic colectomy without any additional morbidity. Two patients in the end-to-end anastomosis group, but none in the side-to-end group, developed peri-anastomotic sepsis. Compared with CL, patients undergoing TRSE did not show any significant differences in operative time, recovery or morbidity. Conclusion: Transrectal specimen extraction after left colectomy using the circular stapler technique is feasible. A side-to-end anastomosis appears safer than an end-to-end anastomosis. Further studies are needed to explore the potential advantages of this procedure over CL.
Abstract:
A comment on the article “Local sensitivity analysis for compositional data with application to soil texture in hydrologic modelling” written by L. Loosvelt and co-authors. The present comment centers on three specific points. The first relates to the fact that the authors avoid the use of ilr coordinates. The second refers to a generalization of sensitivity analysis when the input parameters are compositional. The third aims to show that the role of the Dirichlet distribution in the sensitivity analysis is irrelevant.
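For readers unfamiliar with the ilr coordinates mentioned above, here is a minimal sketch of the pivot ilr (isometric log-ratio) transform for a composition such as soil texture (sand, silt, clay); the example values are hypothetical:

```python
import numpy as np

def ilr(x):
    """Pivot ilr coordinates of a D-part composition x: coordinate i
    contrasts part i against the geometric mean of the remaining
    parts, with the usual sqrt((D-i-1)/(D-i)) scaling."""
    x = np.asarray(x, dtype=float)
    D = len(x)
    z = []
    for i in range(D - 1):
        gm = np.exp(np.mean(np.log(x[i + 1:])))  # geometric mean of the tail
        z.append(np.sqrt((D - i - 1) / (D - i)) * np.log(x[i] / gm))
    return np.array(z)

# Hypothetical soil texture (sand, silt, clay) composition:
print(ilr([0.4, 0.4, 0.2]))
```

A composition with all parts equal maps to the origin, as expected for a log-ratio transform.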
Abstract:
A rapid biological method for determining the bioavailability of naphthalene was developed, and its value as an alternative to extraction-based chemical approaches was demonstrated. Genetically engineered whole-cell biosensors were used to determine bioavailable naphthalene, and their responses were compared with results from Tenax extraction and chemical analysis. Results show a 1:1 correlation between biosensor results and chemical analyses for naphthalene-contaminated model materials and sediments, but the biosensor assay is much faster. This work demonstrates that biosensor technology can perform as well as standard chemical methods, with additional advantages including the inherent biological relevance of the response, rapid response time, and potential for field deployment. A survey of results from this work and the literature shows that bioavailability under non-equilibrium conditions nonetheless correlates well with K(oc) or K(d). A rationale is provided in which chemical resistance is speculated to be the operative factor.
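A 1:1 correlation claim like the one above can be checked with an ordinary least-squares fit, looking for a slope near 1 and a high Pearson r. A sketch with made-up illustrative numbers, not the study's measurements:

```python
import numpy as np

def one_to_one_fit(chemical, biosensor):
    """Least-squares slope/intercept and Pearson r between two assays;
    slope ~1, intercept ~0 and r ~1 support a 1:1 correlation."""
    slope, intercept = np.polyfit(chemical, biosensor, 1)
    r = np.corrcoef(chemical, biosensor)[0, 1]
    return slope, intercept, r

chem = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # hypothetical mg/kg, chemical assay
bio = np.array([0.55, 0.9, 2.1, 3.9, 8.2])   # hypothetical mg/kg, biosensor assay
print(one_to_one_fit(chem, bio))
```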
Abstract:
Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1 = 1/T1), effective proton density (PD(*)), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2(*) = 1/T2(*)). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias and the inter-site and intra-site coefficient of variation (CoV) for typical morphometric measures [i.e., gray matter (GM) probability maps used in voxel-based morphometry] and the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2(*) (<20%). The GM probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived GM probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and the detailed insights into brain microstructure provided by MPM make it an efficient tool for multi-center imaging studies.
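The inter-site bias and CoV summaries above can be sketched as below. This is a plausible reading of those statistics (per-site deviation from the grand mean; across-site variability relative to each subject's mean), not the authors' exact estimator, and the R1 values are invented for illustration:

```python
import numpy as np

def inter_site_stats(measurements):
    """Per-site bias (%) and mean inter-site coefficient of variation
    (%) for one quantitative parameter. Rows = sites, columns =
    subjects scanned at every site."""
    m = np.asarray(measurements, dtype=float)
    grand_mean = m.mean()
    bias_pct = 100 * (m.mean(axis=1) - grand_mean) / grand_mean
    # std across sites for each subject, relative to that subject's mean:
    cov_pct = 100 * (m.std(axis=0, ddof=1) / m.mean(axis=0)).mean()
    return bias_pct, cov_pct

# Hypothetical R1 values (s^-1), 3 sites x 5 subjects:
r1 = np.array([[0.98, 1.02, 1.00, 0.99, 1.01],
               [1.00, 1.04, 1.01, 1.00, 1.03],
               [0.97, 1.01, 0.99, 0.98, 1.00]])
bias, cov = inter_site_stats(r1)
```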
Abstract:
The World Wide Web, the world's largest resource for information, has evolved from organizing information using controlled, top-down taxonomies to a bottom-up approach that emphasizes assigning meaning to data via mechanisms such as the Social Web (Web 2.0). Tagging adds meta-data (weak semantics) to the content available on the web. This research investigates the potential for repurposing this layer of meta-data. We propose a multi-phase approach that exploits user-defined tags to identify and extract domain-level concepts. We operationalize this approach and assess its feasibility by applying it to a publicly available tag repository. The paper describes insights gained from implementing and applying the heuristics contained in the approach, as well as challenges and implications of repurposing tags for the extraction of domain-level concepts.
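A toy heuristic in the spirit of the approach described, normalising user tags, dropping self-referential "stop tags", and keeping tags frequent enough to act as domain-level concept candidates, might look like this. The threshold and the stop-tag list are my assumptions, not the paper's actual multi-phase method:

```python
from collections import Counter

def extract_concepts(tagged_items, min_count=2,
                     stop_tags=frozenset({"toread", "cool"})):
    """Count how many items each normalised tag appears on and keep
    tags used at least min_count times as concept candidates."""
    counts = Counter()
    for tags in tagged_items:
        # set() so a tag repeated on one item counts once:
        for t in set(tag.strip().lower() for tag in tags):
            if t and t not in stop_tags:
                counts[t] += 1
    return [t for t, c in counts.most_common() if c >= min_count]

items = [["Python", "scripting", "toread"],
         ["python", "tutorial"],
         ["scripting", "shell"]]
print(extract_concepts(items))  # ['python', 'scripting']
```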
Abstract:
This work provides a general framework for the design of second-order blind estimators without adopting any approximation about the observation statistics or the a priori distribution of the parameters. The proposed solution is obtained by minimizing the estimator variance subject to some constraints on the estimator bias. The resulting optimal estimator is found to depend on the observation fourth-order moments, which can be calculated analytically from the known signal model. Unfortunately, in most cases, the performance of this estimator is severely limited by the residual bias inherent to nonlinear estimation problems. To overcome this limitation, the second-order minimum variance unbiased estimator is deduced from the general solution by assuming accurate prior information on the vector of parameters. This small-error approximation is adopted to design iterative estimators or trackers. It is shown that the associated variance constitutes the lower bound for the variance of any unbiased estimator based on the sample covariance matrix. The paper formulation is then applied to track the angle-of-arrival (AoA) of multiple digitally-modulated sources by means of a uniform linear array. The optimal second-order tracker is compared with the classical maximum likelihood (ML) blind methods, which are shown to be quadratic in the observed data as well. Simulations have confirmed that the discrete nature of the transmitted symbols can be exploited to improve considerably the discrimination of near sources in medium-to-high SNR scenarios.
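As a concrete instance of the class of second-order (sample-covariance-based) AoA methods the paper analyses, here is a conventional Bartlett beamforming scan for a uniform linear array; this is a baseline illustration, not the optimal tracker derived in the paper:

```python
import numpy as np

def bartlett_aoa(snapshots, d=0.5):
    """Scan angles with a conventional (Bartlett) beamformer built on
    the sample covariance matrix of a ULA with element spacing d
    (in wavelengths); return the strongest angle in degrees."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # sample covariance
    M = R.shape[0]
    best_theta, best_p = 0.0, -np.inf
    for theta in np.linspace(-90.0, 90.0, 361):
        a = np.exp(-2j * np.pi * d * np.arange(M) * np.sin(np.radians(theta)))
        p = np.real(a.conj() @ R @ a)   # beamformer output power
        if p > best_p:
            best_theta, best_p = theta, p
    return best_theta

# One source at +20 degrees, 8-element half-wavelength ULA, 200 snapshots:
rng = np.random.default_rng(0)
M, N, theta0 = 8, 200, 20.0
steer = np.exp(-2j * np.pi * 0.5 * np.arange(M) * np.sin(np.radians(theta0)))
symbols = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
x = np.outer(steer, symbols) + noise
est = bartlett_aoa(x)  # close to 20.0
```

The estimator is quadratic in the data through R, which is exactly the "second-order" structure whose variance the paper lower-bounds.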
Abstract:
The objective of this work was to evaluate the influence of rootstocks and pruning times on yield and on nutrient content and extraction by pruned branches and harvested bunches of 'Niagara Rosada' grapevine in a subtropical climate. The rootstocks 'IAC 766', 'IAC 572', 'IAC 313', 'IAC 571-6', and '106-8 Mgt' were evaluated. Treatments consisted of combinations of the five rootstocks and three pruning times. At pruning, the fresh and dry matter mass of branches was evaluated to estimate biomass accumulation. At harvest, yield was estimated by weighing the bunches per plant. Branches and bunches were sampled at pruning and at harvest, respectively, for nutrient content analysis. Nutrient content and dry matter mass of branches and bunches were used to estimate total nutrient extraction. 'Niagara Rosada' grapevine grafted onto the 'IAC 572' rootstock had the highest yield and dry matter mass of bunches, significantly different from those observed for 'Niagara Rosada'/'IAC 313'. 'Niagara Rosada' grafted onto the 'IAC 572' rootstock extracted the largest quantities of K, P, Mg, S, Cu, and Fe, differing from 'IAC 313' and 'IAC 766' in K and P extraction, and from '106-8 Mgt' in Mg and S extraction. Winter pruning results in higher yield, dry matter accumulation by branches, and total nutrient content and extraction.
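The total-extraction estimate described above (nutrient content times dry matter mass) amounts to a simple product; the units and the example numbers below are illustrative assumptions, not the study's data:

```python
def nutrient_extraction(dry_mass_kg_ha, content_g_kg):
    """Total nutrient extraction as dry-matter mass (kg/ha) times
    nutrient content (g/kg), returned in kg/ha of the nutrient."""
    return dry_mass_kg_ha * content_g_kg / 1000.0

# Hypothetical: 1500 kg/ha of pruned branches at 12 g/kg potassium:
print(nutrient_extraction(1500, 12))  # 18.0 kg/ha of K
```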
Abstract:
One of the most central tasks in the statistical analysis of mathematical models is the estimation of the models' unknown parameters. This thesis is concerned with the distributions of unknown parameters and with numerical methods suitable for constructing them, especially in cases where the model is nonlinear with respect to the parameters. Among the various numerical methods, the main emphasis is on Markov chain Monte Carlo (MCMC) methods. These computationally intensive methods have recently grown in popularity, mainly because of increased computing power. The theory of both Markov chains and Monte Carlo simulation is presented to the extent needed to justify the validity of the methods. Of recently developed methods, adaptive MCMC methods in particular are examined. The approach of the thesis is practical, and various issues related to the implementation of MCMC methods are emphasized. In the empirical part of the thesis, the distributions of the unknown parameters of five example models are examined using the methods presented in the theoretical part. The models describe chemical reactions and are expressed as systems of ordinary differential equations. The models were collected from chemists at Lappeenranta University of Technology and Åbo Akademi University in Turku.
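The random-walk Metropolis algorithm at the core of the MCMC methods discussed can be sketched in a few lines. The Gaussian target below stands in for a chemical-kinetics posterior, which in the thesis would come from an ODE model:

```python
import numpy as np

def metropolis(log_post, theta0, n_steps=5000, step=0.5, seed=0):
    """Random-walk Metropolis sampler: propose a Gaussian step and
    accept it with probability min(1, posterior ratio)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.shape)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis acceptance test
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

# Toy target: Gaussian posterior with mean 2 and unit variance.
chain = metropolis(lambda t: -0.5 * ((t - 2.0) ** 2).sum(), np.zeros(1))
post_mean = chain[1000:].mean()   # close to 2 after burn-in
```

Adaptive MCMC variants tune the proposal scale (here the fixed `step`) from the chain history, which is the refinement the thesis focuses on.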
Abstract:
This thesis studies techniques for embedding a watermark into a spectral image, and methods for identifying and detecting watermarks in spectral images. The spectral dimensionality of the original images was reduced using the PCA (Principal Component Analysis) algorithm. The watermark was embedded into the spectral image in the transform space. Following the proposed model, a transform-space component was replaced by a linear combination of the watermark and another transform-space component. The set of parameters used in the embedding was studied. The quality of the watermarked images was measured and analyzed, and recommendations for watermark embedding were given. Several methods were used for watermark identification, and the identification results were analyzed. The robustness of the watermarks against various attacks was tested. A set of detection experiments was carried out, taking into account the parameters used in the watermark embedding. ICA (Independent Component Analysis) is considered one possible alternative for watermark detection.
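The embedding scheme described, replacing one PCA transform-space component with a linear combination of the watermark and another component, might look like the sketch below. The mixing weight `alpha` and the component indices are assumptions, not the thesis's recommended parameters:

```python
import numpy as np

def embed_watermark(cube, watermark, k=1, j=0, alpha=0.1):
    """Project the pixels of a (h, w, bands) spectral cube onto PCA
    components, replace component k by a linear combination of the
    watermark and component j, and reconstruct the cube."""
    h, w, bands = cube.shape
    X = cube.reshape(-1, bands)
    mu = X.mean(axis=0)
    # PCA basis of the spectral covariance via SVD of centered data:
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    Y = (X - mu) @ Vt.T                    # transform-space components
    Y[:, k] = (1 - alpha) * Y[:, j] + alpha * watermark.ravel()
    return (Y @ Vt + mu).reshape(h, w, bands)

rng = np.random.default_rng(1)
cube = rng.random((8, 8, 5))               # hypothetical spectral image
wm = rng.integers(0, 2, size=64).astype(float)  # one watermark value per pixel
marked = embed_watermark(cube, wm)
```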
Abstract:
The main objective of this study was to perform a statistical analysis of ecological type from optical satellite data, using Tipping's sparse Bayesian algorithm. This thesis applies the Relevance Vector Machine (RVM) algorithm to ecological classification between forestland and wetland. This bi-classification technique was further used to classify several other tree species, producing a hierarchical classification of all the subclasses of a given target class. We also attempted to use an airborne image of the same forest area: combining it with image analysis and various image processing operations, we extracted suitable features and then used them to classify forestland and wetland.
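The hierarchical bi-classification scheme (stage i separates one class from everything below it in the hierarchy) can be illustrated with a stand-in binary classifier; note that the RVM itself is not implemented here, and the data are synthetic:

```python
import numpy as np

class NearestCentroid2:
    """Tiny nearest-centroid binary classifier used purely as a
    stand-in for the RVM, to show the hierarchical wiring."""
    def fit(self, X, y):
        self.c = np.stack([X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.c[None], axis=2)
        return d.argmin(axis=1)

def classify_hierarchical(x, stages):
    """Stage i separates class label i (output 0) from all later
    classes (output 1); the first stage that claims the sample wins."""
    for label, clf in enumerate(stages):
        if clf.predict(np.atleast_2d(x))[0] == 0:
            return label
    return len(stages)

# Three synthetic 'species' along one feature axis (labels 0, 1, 2):
A, B, C = np.full((5, 1), 0.0), np.full((5, 1), 5.0), np.full((5, 1), 10.0)
s0 = NearestCentroid2().fit(np.vstack([A, B, C]),
                            np.array([0] * 5 + [1] * 10))   # class 0 vs rest
s1 = NearestCentroid2().fit(np.vstack([B, C]),
                            np.array([0] * 5 + [1] * 5))    # class 1 vs class 2
preds = [classify_hierarchical(np.array([v]), [s0, s1]) for v in (0.2, 5.1, 9.8)]
print(preds)  # [0, 1, 2]
```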
Abstract:
Perceiving the world visually is a basic act for humans, but for computers it is still an unsolved problem. The variability present in natural environments is an obstacle for effective computer vision. The goal of invariant object recognition is to recognise objects in a digital image despite variations in, for example, pose, lighting or occlusion. In this study, invariant object recognition is considered from the viewpoint of feature extraction. The differences between local and global features are studied, with emphasis on Hough transform and Gabor filtering based feature extraction. The methods are examined with respect to four capabilities: generality, invariance, stability, and efficiency. Invariant features are presented using both the Hough transform and Gabor filtering. A modified Hough transform technique is also presented, where the distortion tolerance is increased by incorporating local information. In addition, methods for decreasing the computational costs of the Hough transform employing parallel processing and local information are introduced.
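A minimal line-detecting Hough transform, the global feature extraction method examined above, can be written directly: each edge point votes for every (rho, theta) line through it, and peaks in the accumulator correspond to lines in the image:

```python
import numpy as np

def hough_lines(points, shape, n_theta=180):
    """Return the (rho, theta-in-degrees) of the strongest line among
    edge points given as (y, x) pairs, using the normal-form
    parameterisation rho = x*cos(theta) + y*sin(theta)."""
    h, w = shape
    diag = int(np.ceil(np.hypot(h, w)))          # max possible |rho|
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for y, x in points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1  # one vote per theta
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r - diag, np.rad2deg(thetas[t])

# Points on the horizontal line y = 10 vote for rho = 10, theta = 90:
pts = [(10, x) for x in range(50)]
print(hough_lines(pts, (64, 64)))  # (10, 90.0)
```

The accumulator scan over all points and angles is exactly the computational cost that the parallel and local-information variants mentioned above try to reduce.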
Abstract:
The present study aims to compare the yield and quality of pequi pulp oil obtained by two distinct processes: in the first, pulp drying in a tray dryer at 60°C was combined with enzymatic treatment and pressing for oil extraction; in the second, a simpler process combined sun-drying of the pulp with pressing. In this study, raw pequi fruits were collected in Mato Grosso State, Brazil. The fruits were autoclaved at 121°C and stored under refrigeration. An enzymatic extract with pectinase and CMCase activities was used for hydrolysis of the pequi pulp prior to oil extraction. The oil extractions were carried out by hydraulic pressing, with or without enzymatic incubation. The oil content of the pequi pulp (45% w/w) and the physicochemical characteristics of the oil were determined according to standard analytical methods. Free fatty acids, peroxide value, and iodine and saponification indices were 1.46 mg KOH/g, 2.98 meq/kg, 49.13, and 189.40, respectively. The acidity and peroxide values were lower than those obtained for commercial oil samples (2.48 mg KOH/g and 5.22 meq/kg, respectively). Aqueous extraction showed lower efficiency and higher oxidation of unsaturated fatty acids. On the other hand, pressing the pequi pulp at room temperature produced better-quality oil, although its efficiency was still lower than that of the combined enzymatic treatment and pressing process. This combined process promotes cell-wall hydrolysis and reduces pulp viscosity, contributing an oil yield increase of at least 20% on pressing.