913 results for Video interaction analysis: methods and methodology


Relevance: 100.00%

Abstract:

Practical guidelines for monitoring and measuring compounds such as jasmonates, ketols, ketodi(tri)enes and hydroxy-fatty acids, as well as for detecting the presence of novel oxylipins, are presented. Additionally, a protocol for the analysis of non-enzymatic lipid oxidation is described. Each of the methods, which employ gas chromatography/mass spectrometry, can be applied without specialist knowledge or recourse to the latest analytical instrumentation. Additional information on oxylipin quantification and novel protocols for preparing oxygen isotope-labelled internal standards are provided. Four developing areas of research are identified: (i) profiling of the unbound cellular pools of oxylipins; (ii) profiling of esterified oxylipins and/or monitoring of their release from parent lipids; (iii) monitoring of non-enzymatic lipid oxidation; (iv) analysis of unstable and reactive oxylipins. The methods and protocols presented herein are designed to give technical insights into the first three areas and to provide a platform from which to enter the fourth area.
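
The quantification mentioned above relies on isotope-labelled internal standards. As a hedged illustration only (this is standard GC/MS practice, not quoted from these protocols), single-point internal-standard quantification computes the analyte concentration from peak-area ratios:

```latex
% Generic internal-standard quantification (illustrative, standard practice):
% c_a = analyte concentration, c_is = internal standard concentration,
% A_a, A_is = integrated peak areas, RRF = relative response factor
c_a = \frac{A_a}{A_{is}} \cdot \frac{c_{is}}{\mathrm{RRF}}
```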

Relevance: 100.00%

Abstract:

The proportion of the population living in or around cities is higher than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems such as air pollution, wasted land and noise, together with health problems, are the result of this still-continuing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to gain a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming geographic space into a feature space, and the distance-circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scaling laws are used to characterise urban clusters. In a final section, population evolution is modelled using a model close to the well-established gravity model. The work covers a wide range of methods useful in urban geography; these methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
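
Self-organising maps are central to the analysis described above. As a minimal sketch of the general technique (assuming numpy; this is not the thesis's own implementation), a SOM projects high-dimensional socio-economic data onto a 2-D grid of prototypes:

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organising map: rows of `data` are mapped onto a 2-D
    grid of prototype vectors (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    n, d = data.shape
    weights = rng.random((h, w, d))
    yy, xx = np.mgrid[0:h, 0:w]  # grid coordinates for the neighbourhood
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1 - frac)              # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5  # shrinking neighbourhood radius
        for x in data[rng.permutation(n)]:
            # Best-matching unit: the grid cell whose prototype is closest.
            dist = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(dist.argmin(), dist.shape)
            # A Gaussian neighbourhood pulls nearby prototypes towards x.
            g = np.exp(-((yy - by) ** 2 + (xx - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

# Usage: map 500 samples of 8-dimensional socio-economic indicators.
prototypes = train_som(np.random.rand(500, 8))
```

After training, each data point can be assigned to its best-matching unit and the grid inspected for clusters, which is what makes SOMs useful for exploring high-dimensional census-style data.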

Relevance: 100.00%

Abstract:

This work is concerned with the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and is applied to forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering, which is trained by stochastic gradient descent. Results indicate that such a technique can easily scale to huge databases, can avoid the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as very large problems.
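
The second method trains a functional model for clustering by stochastic gradient descent. The author's actual model is a neural network, but the flavour of gradient-based, out-of-sample-capable clustering can be conveyed with a much simpler online k-means sketch (assumed illustration, numpy only):

```python
import numpy as np

def online_kmeans(data, k=5, epochs=5, lr0=0.1, seed=0):
    """Stochastic-gradient k-means: each sample nudges its nearest centroid.
    Illustrative stand-in for the thesis's neural clustering model."""
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), k, replace=False)].copy()
    step = 0
    for _ in range(epochs):
        for x in data[rng.permutation(len(data))]:
            step += 1
            lr = lr0 / np.sqrt(step)  # decaying step size
            j = np.argmin(np.linalg.norm(centroids - x, axis=1))
            centroids[j] += lr * (x - centroids[j])  # SGD step on ||x - c_j||^2
    return centroids

centroids = online_kmeans(np.random.rand(10000, 20))
# New samples are assigned with the same nearest-centroid rule, which is
# what sidesteps the out-of-sample problem that spectral methods face.
```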

Relevance: 100.00%

Abstract:

In anticipation of regulations imposing numeric turbidity limits at highway construction sites, research was conducted into the most appropriate, affordable methods for surface water monitoring. Sediment concentration in streams may be measured in a number of ways. As part of a project funded by the Iowa Department of Transportation, several testing methods were explored for data collection both in the field and in the lab. The primary purpose of the research was to determine whether the acrylic transparency tube can be used interchangeably with the turbidimeter for water clarity analysis.
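
Interchangeability of the two instruments would typically be checked by regressing paired readings against each other; the report's actual statistical treatment is not reproduced here, so the following scipy sketch with made-up readings is only an assumed illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical paired readings on the same samples: transparency-tube
# depth (cm) and turbidimeter output (NTU).
tube_cm = np.array([60.0, 45.0, 30.0, 22.0, 15.0, 10.0])
ntu = np.array([12.0, 25.0, 48.0, 80.0, 140.0, 260.0])

# Tube depth varies roughly inversely with turbidity, so a log-log fit is
# a common first model when testing whether one instrument can stand in
# for the other.
res = stats.linregress(np.log(tube_cm), np.log(ntu))
print(f"slope = {res.slope:.2f}, r^2 = {res.rvalue ** 2:.3f}")
```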

Relevance: 100.00%

Abstract:

Biocide suppliers usually manage biocide dosing levels in the paper and board industry. Most often, dosing control is based on indirect methods, such as measuring bacterial growth. The actual concentration or amount of biocide active substances in process waters or in the final product is not usually measured. In this Master's thesis, analytical methods were developed for three biocide active substances commonly used in the paper industry: glutaraldehyde, 2,2-dibromo-3-nitrilopropionamide (DBNPA) and 5-chloro-2-methyl-4-isothiazolin-3-one (CMI). The developed methods were used to monitor the stability of the active substances in aqueous solution at different pH values and temperatures. In addition, extraction experiments were carried out on board samples in an attempt to develop extraction methods for determining residual concentrations of the biocide active substances in the final product. Extraction methods were successfully developed for glutaraldehyde and CMI, enabling the residual concentrations of these active substances to be determined from board; the results obtained appear realistic. Stability tests were performed on glutaraldehyde and DBNPA, and the results are similar to those obtained by other researchers.

Relevance: 100.00%

Abstract:

The User-Centered Design (UCD) game is a tool for human-computer interaction practitioners to demonstrate the key user-centered design methods, and how they interrelate in the design process, in an interactive and participatory manner. The target audiences are departments and institutions that are unfamiliar with UCD but whose work is related to the definition, creation, and updating of a product or service.

Relevance: 100.00%

Abstract:

The objective of this study was to evaluate the relationships between spectra in the Vis-NIR range and the soil P concentrations obtained from the PM and Prem extraction methods, as well as the effects of these relationships on the construction of models predicting P concentration in Oxisols. Spectra of the soil samples and of their PM and Prem extraction solutions were recorded in the Vis-NIR region between 400 and 2500 nm. Mineralogy and/or organic matter content act as primary attributes allowing correlation of these soil phosphorus fractions with the spectra, mainly at wavelengths between 450-550 nm, 900-1100 nm, near 1400 nm, and between 2200-2300 nm. However, the regression models generated were not suitable for quantitative phosphate analysis. Solubilization of organic matter and reactions during the PM extraction process hindered correlations between the spectra and these P soil fractions. For Prem, the presence of Ca in the extractant and preferential adsorption by gibbsite and iron oxides, particularly goethite, obscured correlations with the spectra.
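
Chemometric models linking Vis-NIR spectra to soil properties are most often built with partial least squares regression; the abstract does not name the algorithm used, so the scikit-learn sketch below (with stand-in data) is an assumption about the general workflow rather than a reproduction of it:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Stand-in data: 120 soil samples, reflectance at 2101 wavelengths
# (400-2500 nm in 1 nm steps), and an extracted-P value per sample.
X = np.random.rand(120, 2101)
y = np.random.rand(120) * 50

pls = PLSRegression(n_components=10)
# A cross-validated R^2 near zero would match the abstract's finding that
# the spectra do not support quantitative phosphate prediction.
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
print(f"mean cross-validated R^2 = {r2:.2f}")
```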

Relevance: 100.00%

Abstract:

This book is dedicated to celebrating the 60th birthday of Professor Rainer Huopalahti. Professor Rainer "Repe" Huopalahti has had, and is in fact still enjoying, a distinguished career in the analysis of food and food-related flavor compounds. One will find it hard to make any progress in this particular field without a valid and innovative sample handling technique, and this is a field to which Professor Huopalahti has made great contributions. The title and front cover of this book honor Professor Huopalahti's early steps in science. His PhD thesis, published in 1985, is entitled "Composition and content of aroma compounds in the dill herb, Anethum graveolens L., affected by different factors". At the time, the thesis introduced new technology for the sample handling and analysis of the flavoring compounds of dill. Sample handling is an essential task in just about every analysis. If one is working with minor compounds in a sample or trying to detect trace levels of analytes, one aim of sample handling may be to increase the sensitivity of the analytical method. On the other hand, if one is working with a challenging matrix such as the kind found in biological samples, one aim is to increase the selectivity. Quite often, however, the aim is to increase both the selectivity and the sensitivity. This book provides good and representative examples of the necessity of valid sample handling and of the role of sample handling in the analytical method. The contributors to the book are leading Finnish scientists in the field of organic instrumental analytical chemistry. Some of them are also Repe's personal friends and former students from the University of Turku, Department of Biochemistry and Food Chemistry. Importantly, the authors all know Repe in one way or another and are well aware of his achievements in the field of analytical chemistry. The editorial team had a great time during the planning phase and during the "hard work" editorial phase of the book. For example, we came up with many ideas on how to publish the book. After many long discussions, we decided to have a limited edition as an "old school" hardcover book, and to acknowledge more modern ways of disseminating knowledge by publishing an internet version of the book on the webpages of the University of Turku. Downloading the book from the webpage for personal use is free of charge. We believe and hope that the book will be read with great interest by scientists working in the fascinating field of organic instrumental analytical chemistry. We decided to publish our book in English for two main reasons. First, we believe that in the near future more and more teaching in Finnish universities will be delivered in English; to facilitate this process and encourage students to develop good language skills, we decided to publish the book in English. Secondly, we believe that the book will also interest scientists outside Finland, particularly in the other member states of the European Union. The editorial team thanks all the authors for their willingness to contribute to this book and to adhere to the very strict schedule. We also want to thank the various individuals and enterprises who financially supported the book project. Without that support, it would not have been possible to publish the hardcover book.

Relevance: 100.00%

Abstract:

The purpose of this thesis is twofold. The first and major part is devoted to the sensitivity analysis of various discrete optimization problems, while the second part addresses methods for calculating measures of solution stability and for solving multicriteria discrete optimization problems. Despite numerous approaches to the stability analysis of discrete optimization problems, two major directions can be singled out: quantitative and qualitative. Qualitative sensitivity analysis is conducted for multicriteria discrete optimization problems with minisum, minimax and minimin partial criteria. The main results obtained here are necessary and sufficient conditions for different stability types of optimal solutions (or of a set of optimal solutions) of the considered problems. Within the quantitative direction, various measures of solution stability are investigated. A formula for a quantitative characteristic called the stability radius is obtained for the generalized equilibrium situation invariant to changes of game parameters in the case of the Hölder metric. The quality of a problem solution can also be described in terms of robustness analysis. In this work, the concepts of accuracy and robustness tolerances are presented for a strategic game with a finite number of players, where the initial coefficients (costs) of the linear payoff functions are subject to perturbations. The investigation of the stability radius also aims to devise methods for its calculation. A new metaheuristic approach is derived for calculating the stability radius of an optimal solution to the shortest path problem. The main advantage of the developed method is that it is potentially applicable to calculating the stability radii of NP-hard problems. The last chapter of the thesis focuses on deriving innovative methods, based on an interactive optimization approach, for solving multicriteria combinatorial optimization problems. The key idea of the proposed approach is to use a parameterized achievement scalarizing function for solution calculation and to direct the interactive procedure by changing the weighting coefficients of this function. To illustrate the introduced ideas, a decision-making process is simulated for a three-objective median location problem. The concepts, models, and ideas collected and analyzed in this thesis provide relevant grounds for developing more complicated and integrated models of postoptimal analysis and for solving the most computationally challenging problems related to it.
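
The interactive procedure is driven by a parameterized achievement scalarizing function. The thesis's exact form is not given in the abstract; a standard Wierzbicki-type ASF over criteria f_i, a reference point z^ref and weights w_i (an assumed illustration) reads:

```latex
% Standard augmented achievement scalarizing function (assumed form):
% z^{ref} = reference point, w_i > 0 = weighting coefficients,
% rho > 0 = small augmentation term ensuring properly Pareto optimal solutions.
s\bigl(f(x)\bigr) = \max_{i} \; w_i \left( f_i(x) - z_i^{ref} \right)
  + \rho \sum_{i} w_i \left( f_i(x) - z_i^{ref} \right)
```

Changing the weights w_i redirects the search towards different parts of the Pareto front, which is exactly the lever the interactive procedure described above manipulates.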

Relevance: 100.00%

Abstract:

This thesis models the squeezing of the tube and computes the fluid motion in a peristaltic pump. The simulations were conducted using the COMSOL Multiphysics FSI module. The model is set up as axisymmetric, with several simulation cases run to give a clear understanding of the results. The model captures the total displacement of the tube, the velocity magnitude, and the average pressure fluctuation of the fluid motion. Many of the underlying mathematical and physical concepts are also reviewed and discussed, together with their real-world applications. In order to solve the problems and work around the resource constraints, the mass balance and momentum equations, finite element concepts, the arbitrary Lagrangian-Eulerian method, the one-way and two-way coupling methods, and the COMSOL Multiphysics simulation setup are studied and briefly described.
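
The arbitrary Lagrangian-Eulerian (ALE) method mentioned above rewrites the flow equations on a moving mesh. As a standard textbook statement (not quoted from the thesis), the incompressible momentum and mass balances in ALE form, with mesh velocity w, are:

```latex
% Incompressible Navier-Stokes in ALE form (standard statement):
% u = fluid velocity, w = mesh velocity, p = pressure,
% rho = density, mu = dynamic viscosity, f = body force.
\rho \left( \frac{\partial \mathbf{u}}{\partial t}
  + \bigl( (\mathbf{u} - \mathbf{w}) \cdot \nabla \bigr) \mathbf{u} \right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u} + \mathbf{f},
\qquad \nabla \cdot \mathbf{u} = 0
```

When the mesh follows the squeezed tube wall, w captures the wall motion while u remains the physical fluid velocity, which is what makes the formulation natural for FSI problems such as this one.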

Relevance: 100.00%

Abstract:

The experiences of several healthcare organizations were examined to identify the most frequently used lean tools, the success and failure factors, and the obstacles that may appear while implementing lean. As a result, two approaches to "go lean" were defined and analyzed from the perspective of their applicability to healthcare processes. The industrialization of healthcare was studied, and the most promising digital technology tools for improving healthcare processes were highlighted. Finally, an analysis of healthcare challenges and feasible ways to address them was conducted and is presented as the main result of this work. Possible ways of implementing the findings, together with the limitations, are described in the conclusion.

Relevance: 100.00%

Abstract:

Estrogen has multiple effects on lipid and lipoprotein metabolism. We investigated the association between four common single nucleotide polymorphisms in the estrogen receptor 1 (ESR1) gene locus, -1989T>G, +261G>C, IVS1-397T>C and IVS1-351A>G, and lipid and lipoprotein levels in southern Brazilians. The sample consisted of 150 men and 187 premenopausal women. The women were considered premenopausal if they had regular menstrual bleeding within the previous 3 months and were 18-50 years of age. Exclusion criteria were pregnancy, secondary hyperlipidemia due to renal, hepatic or thyroid disease, and diabetes. Smoking status was self-reported; subjects were classified as never-smokers or current smokers. DNA was amplified by PCR and subsequently digested with the appropriate restriction enzymes. Statistical analysis was carried out for men and women separately. In the study population, the major allele frequencies were -1989*T (0.83), +261*G (0.96), IVS1-397*T (0.58), and IVS1-351*A (0.65). Multiple linear regression analyses indicated that an interaction between the +261G>C polymorphism and smoking was a significant factor affecting high-density lipoprotein cholesterol (HDL-C) levels (P = 0.028) in women. Nonsmoking women with the G/C genotype of the +261G>C polymorphism had higher mean HDL-C levels than those with the G/G genotype (1.40 ± 0.33 vs 1.22 ± 0.26 mmol/L; P = 0.033). No significant associations with lipid and lipoprotein levels were detected for the other polymorphisms in either women or men. In conclusion, the +261G>C polymorphism might influence lipoprotein and lipid levels in premenopausal women, but these effects seem to be modulated by smoking, whereas in men ESR1 polymorphisms were not associated with lipid and lipoprotein levels.
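
The reported genotype-by-smoking effect on HDL-C is the kind of term a linear model with an interaction captures. A hedged sketch with hypothetical column names (the paper's actual model specification is not reproduced here), using statsmodels:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per woman, with HDL-C (mmol/L),
# +261G>C genotype ('GG' or 'GC') and smoking status.
df = pd.DataFrame({
    "hdl":      [1.21, 1.45, 1.30, 1.10, 1.52, 1.18],
    "genotype": ["GG", "GC", "GC", "GG", "GC", "GG"],
    "smoking":  ["never", "never", "current", "current", "never", "current"],
})

# 'genotype * smoking' expands to both main effects plus their interaction,
# mirroring the genotype-by-smoking term reported for HDL-C.
model = smf.ols("hdl ~ C(genotype) * C(smoking)", data=df).fit()
print(model.summary())
```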

Relevance: 100.00%

Abstract:

The combined effects of tumbling marination method (vacuum continuous tumbling marination, CT; vacuum intermittent tumbling marination, IT) and effective tumbling time (4, 6, 8 and 10 h) on the quality characteristics of prepared boneless pork chops were investigated. The results showed that, regardless of tumbling time, the CT method significantly increased pH, product yield, cohesiveness, resilience, sensory tenderness and overall flavor (p<0.05) compared with the IT method, and significantly decreased pressing loss, cooking loss, shear force value (SFV), hardness and chewiness (p<0.05). As the effective tumbling time increased from 4 h to 10 h, the product yield and sensory attributes of the prepared pork chops first increased and then decreased, whereas pressing loss, cooking loss, SFV, hardness and chewiness first decreased and then increased. Additionally, an interaction between CT method and effective tumbling time was observed. These results suggest that the CT method with an effective tumbling time of 8 h gives the best quality characteristics for prepared pork chops and should be adopted.

Relevance: 100.00%

Abstract:

Several automated reversed-phase HPLC methods have been developed to determine trace concentrations of carbamate pesticides (which are of concern in Ontario environmental samples) in water, utilizing two solid sorbent extraction techniques. One of the methods is known as 'on-line pre-concentration'. This technique involves passing 100 milliliters of sample water through a 3 cm pre-column, packed with 5 micron ODS sorbent, at flow rates varying from 5-10 mL/min. By the use of a valve apparatus, the HPLC system is then switched to a gradient mobile phase program consisting of acetonitrile and water. The analytes, Propoxur, Carbofuran, Carbaryl, Propham, Captan, Chlorpropham, Barban, and Butylate, which are pre-concentrated on the pre-column, are eluted and separated on a 25 cm C-8 analytical column and determined by UV absorption at 220 nm. The total analytical time is 60 minutes, and the pre-column can be used repeatedly for the analysis of as many as thirty samples. The method is highly sensitive, as 100 percent of the analytes present in the sample can be injected into the HPLC. No breakthrough of any of the analytes was observed, and the minimum detectable concentrations range from 10 to 480 ng/L. The developed method is fully automated for the analysis of one sample. When the above mobile phase is modified with a buffer solution, Aminocarb, Benomyl, and its degradation product, MBC, can also be detected along with the above pesticides, with baseline resolution for all of the analytes. The method can also be easily modified to determine Benomyl and MBC both as solutes and as particulate matter. By using a commercially available solid phase extraction cartridge, in lieu of a pre-column, for the extraction and concentration of analytes, a completely automated method has been developed with the aid of the Waters Millilab Workstation. Sample water is loaded at 10 mL/min through a cartridge and the concentrated analytes are eluted from the sorbent with acetonitrile. The resulting eluate is blown down under nitrogen, made up to volume with water, and injected into the HPLC. The total analytical time is 90 minutes. Fifty percent of the analytes present in the sample can be injected into the HPLC, and recoveries for the above eight pesticides ranged from 84 to 93 percent. The minimum detectable concentrations range from 20 to 960 ng/L. The developed method is fully automated for the analysis of up to thirty consecutive samples. The method has proven to be applicable both to purer water samples and to untreated lake water samples.

Relevance: 100.00%

Abstract:

High-temperature superconductors were discovered in 1986, but despite considerable experimental and theoretical research efforts, these materials remain poorly understood. Because their electronic structure is both inhomogeneous and highly correlated, a full understanding will require knowledge of quasiparticle properties in both real space and momentum space. In this thesis, we present a theoretical analysis of scanning tunneling microscopy (STM) data in BSCCO. We introduce the Bogoliubov-de Gennes Hamiltonian and solve it numerically on a two-dimensional 20 x 20 lattice under a magnetic field perpendicular to the surface. We consider a vortex at the center of our model and introduce a Zn impurity in the lattice as a microscopic probe of the physical properties of BSCCO. By direct numerical diagonalization of the lattice Bogoliubov-de Gennes Hamiltonian for different positions of the impurity, we can calculate the interaction between the vortex and the impurity in a d-wave superconductor.
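
For reference, the lattice Bogoliubov-de Gennes mean-field Hamiltonian for a d-wave superconductor is commonly written as below; this is the standard form, and the thesis's exact conventions (Peierls phases, impurity potential) may differ:

```latex
% Lattice Bogoliubov-de Gennes Hamiltonian (standard mean-field form):
% t_{ij} = hopping (acquiring Peierls phases in a magnetic field),
% mu = chemical potential, Delta_{ij} = d-wave bond pairing amplitude.
H = -\sum_{i,j,\sigma} t_{ij} \, c_{i\sigma}^{\dagger} c_{j\sigma}
    - \mu \sum_{i,\sigma} c_{i\sigma}^{\dagger} c_{i\sigma}
    + \sum_{\langle i,j \rangle} \left( \Delta_{ij} \,
      c_{i\uparrow}^{\dagger} c_{j\downarrow}^{\dagger} + \mathrm{h.c.} \right)
```

Diagonalizing H on the 20 x 20 lattice yields the quasiparticle energies and amplitudes from which the local density of states, and hence simulated STM spectra, can be computed.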