968 results for Scale invariant feature transform (SIFT)
Abstract:
Biofilters degrade only a small fraction of the natural organic matter (NOM) contained in seawater, which is the leading cause of biofouling in downstream processes. This work studies the effects of chemical additions on NOM biodegradation by biofilters. Biofiltration of seawater with an empty bed contact time (EBCT) of 6 min and a hydraulic loading rate of 10 m h-1 reduces the biological oxygen demand (BOD7) by 8%, the dissolved organic carbon (DOC) by 6% and the UV absorbance at 254 nm (A254) by 7%. Different amounts of ammonium chloride are added to the seawater (up to twice the total dissolved nitrogen in untreated seawater) to study its possible effect on the removal of NOM by a pilot-scale biofilter. For the same purpose, the seawater is amended with different amounts of easily biodegradable dissolved organic carbon (BDOC) supplied as sodium acetate (up to twice the DOC). The results reveal that the ammonium chloride additions do not significantly affect NOM removal and that the sodium acetate is completely consumed by the biofiltration process. For both types of chemical additions, the BOD7, DOC and A254 in the outlet stream of the biofilter are similar to the values for the untreated control. These results indicate that the biofilter easily removes the BDOC from the seawater at EBCTs of 6 min or less, and that nitrogen does not limit NOM biodegradation in seawater under these experimental conditions.
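The two quoted operating parameters also fix the filter-bed geometry, since EBCT = bed volume / flow rate and hydraulic loading rate = flow rate / bed cross-section. A minimal sketch of that arithmetic (only the EBCT and loading rate come from the abstract):

    # EBCT = V_bed / Q and loading = Q / A together imply
    # bed depth = loading * EBCT.
    ebct_h = 6.0 / 60.0        # empty bed contact time: 6 min, in hours
    loading_m_per_h = 10.0     # hydraulic loading rate from the abstract
    bed_depth_m = loading_m_per_h * ebct_h
    print(f"implied bed depth: {bed_depth_m:.1f} m")   # 1.0 m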
Abstract:
The objective of this thesis is to study wavelets and their role in turbulence applications. Under scrutiny is the intermittency that turbulence models produce; wavelets are used as a mathematical tool to study these intermittent events. The first section introduces wavelets and wavelet transforms as a mathematical tool. The basic properties of turbulence are also discussed, and classical methods for modelling turbulent flows are explained. Wavelets are applied both to model turbulence and to analyse turbulent signals. The model studied here is the GOY (Gledzer 1973, Ohkitani & Yamada 1989) shell model of turbulence, a popular model for explaining intermittency based on the cascade of kinetic energy. The goal is to introduce a better method for quantifying the intermittency produced by a shell model. Because wavelets are localized in both space (or time) and scale, they are suitable candidates for studying the singular bursts that interrupt the calm periods of the energy flow through the various scales. The study addresses two questions: the frequency of occurrence and the intensity of the singular bursts at various Reynolds numbers. The results showed that singularities become more local as the Reynolds number increases, and also when the shell number is increased at a fixed Reynolds number. The singular bursts are more frequent at Re ~ 10^7 than in the cases with lower Re. The intermittency of bursts was similar for Re ~ 10^6 and Re ~ 10^5, but at Re ~ 10^4 bursts occurred after long waiting times in a qualitatively different fashion, so that this case could not be scaled to the higher-Re cases.
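Since the analysis rests on the time-scale localization of wavelets, a continuous wavelet transform is the natural burst detector. A minimal sketch, assuming the PyWavelets package and a synthetic test signal (not GOY shell-model output):

    import numpy as np
    import pywt

    t = np.linspace(0.0, 1.0, 2048)
    signal = np.sin(2 * np.pi * 5 * t)               # calm background
    signal[1000:1010] += 5.0 * np.random.default_rng(0).standard_normal(10)

    scales = np.arange(1, 64)
    coef, freqs = pywt.cwt(signal, scales, 'morl')   # Morlet wavelet

    # The burst appears as a column of large |coefficients| that is
    # localized in time and persists across scales.
    burst_idx = int(np.argmax(np.abs(coef[5])))      # one small scale
    print(f"burst detected near t = {t[burst_idx]:.2f}")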
Abstract:
This study investigated fingermark residues using Fourier transform infrared microscopy (μ-FTIR) in order to obtain fundamental information about the marks' initial composition and aging kinetics. This knowledge would be an asset for fundamental research on fingermarks, such as for dating purposes. Attenuated total reflection (ATR) and single-point reflection modes were tested on fresh fingermarks. ATR proved to be better suited and was subsequently selected for the aging studies. Eccrine and sebaceous material was found in fresh and aged fingermarks, and the spectral regions 1000-1850 cm-1 and 2700-3600 cm-1 were identified as the most informative. The impact of substrates (aluminium and glass slides) and storage conditions (storage in the light and in the dark) on fingermark aging was also studied. Chemometric analyses showed that fingermarks could be grouped according to their age regardless of the substrate when they were stored in an open box in an air-conditioned laboratory at around 20°C next to a window. By contrast, when fingermarks were stored in the dark, only specimens deposited on the same substrate could be grouped by age; the substrate thus appeared to influence aging of fingermarks in the dark. Furthermore, PLS regression analyses were conducted to study the possibility of modelling fingermark aging for potential dating applications. The resulting models showed an overall precision of ±3 days and clearly demonstrated their capability to differentiate older fingermarks (20 and 34 days old) from newer ones (1, 3, 7 and 9 days old) regardless of the substrate and lighting conditions. These results are promising from a fingermark dating perspective. Further research is required to fully validate such models and assess their robustness and limitations in uncontrolled casework conditions.
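The PLS step maps spectra to age in days. A minimal sketch with scikit-learn, using synthetic stand-in spectra rather than real fingermark data:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_marks, n_wavenumbers = 60, 400      # e.g. the 1000-1850 cm-1 region
    ages = rng.choice([1, 3, 7, 9, 20, 34], size=n_marks).astype(float)
    # Synthetic spectra whose band intensities decay with age, plus noise.
    base = rng.random(n_wavenumbers)
    X = (np.outer(np.exp(-ages / 15.0), base)
         + 0.01 * rng.standard_normal((n_marks, n_wavenumbers)))

    pls = PLSRegression(n_components=5)
    pls.fit(X, ages)
    pred = pls.predict(X[:5]).ravel()
    print(np.round(pred, 1), ages[:5])    # predicted vs. true ages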
Abstract:
The University of Barcelona is developing a pilot-scale hot wire chemical vapor deposition (HW-CVD) setup for the deposition of nano-crystalline silicon (nc-Si:H) on 10 cm × 10 cm glass substrates at high deposition rate. The system uses 12 thin wires of 0.15-0.2 mm diameter in a very dense configuration. This permits depositing very uniform films, with inhomogeneities lower than 2.5%, at high deposition rates (1.5-3 nm/s), while keeping the substrate temperature relatively low (250 °C). The wire configuration design, based on a simulation of radical diffusion, is presented, and the predicted homogeneity is validated with optical transmission scanning measurements of the deposited samples. Different deposition series were carried out by varying the substrate temperature, the silane-to-hydrogen dilution and the deposition pressure. By means of Fourier transform infrared spectroscopy (FTIR), the evolution in time of the nc-Si:H vibrational modes was monitored. Particular attention has been given to the stability of the material against post-deposition oxidation.
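A quoted inhomogeneity figure implies some non-uniformity metric over the thickness map extracted from the transmission scan. A minimal sketch, assuming the common (max - min) / (2 * mean) convention; the abstract does not state which definition was used, and the map below is hypothetical:

    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical thickness map of a 10 cm x 10 cm film on a 21 x 21 grid.
    d = 1000.0 + 20.0 * rng.standard_normal((21, 21))   # nm

    nonuniformity = (d.max() - d.min()) / (2.0 * d.mean()) * 100.0
    print(f"non-uniformity: {nonuniformity:.1f} %")     # compare to < 2.5 %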
Abstract:
Randomized, controlled trials have demonstrated the efficacy of second-generation antipsychotics in the treatment of acute mania in bipolar disorder. Although depression is considered the hallmark of bipolar disorder, there are no published systematic reviews or meta-analyses evaluating the efficacy of modern atypical antipsychotics in bipolar depression. We systematically reviewed published or registered randomized, double-blind, placebo-controlled trials (RCTs) of modern antipsychotics in adult bipolar I and/or II depressive patients (DSM-IV criteria). Efficacy outcomes were assessed based on changes in the Montgomery-Asberg Depression Rating Scale (MADRS) over an 8-wk period. Data were combined through meta-analysis using the risk ratio as effect size, with a 95% confidence interval (95% CI) and a 5% level of statistical significance (p<0.05). We identified five RCTs; four involved antipsychotic monotherapy and one addressed both monotherapy and combination with an antidepressant. The two quetiapine trials analysed the safety and efficacy of two doses, 300 and 600 mg/d. The only olanzapine trial assessed olanzapine monotherapy within a range of 5-20 mg/d and an olanzapine-fluoxetine combination within ranges of 5-20 mg/d and 6-12 mg/d, respectively. The two aripiprazole placebo-controlled trials assessed doses of 5-30 mg/d. The quetiapine and olanzapine trials (3/5, 60%) demonstrated superiority over placebo (p<0.001); the remaining 2/5 (40%, both aripiprazole trials) failed on the primary efficacy measure after the first 6 wk. Some modern antipsychotics (quetiapine and olanzapine) have demonstrated efficacy in bipolar depressive patients from week 1 onwards. Rapid onset of action seems to be a common feature of atypical antipsychotics in bipolar depression. Comment in: Efficacy of modern antipsychotics in placebo-controlled trials in bipolar depression: a meta-analysis: results to be interpreted with caution.
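The pooled effect size behind such a meta-analysis is typically an inverse-variance combination of per-trial log risk ratios. A minimal sketch of a fixed-effect pooling; the event counts are illustrative, not the actual trial data:

    import numpy as np
    from scipy import stats

    # (events_drug, n_drug, events_placebo, n_placebo) per trial
    trials = [(120, 300, 80, 295), (130, 310, 85, 300), (60, 180, 55, 175)]

    log_rr, weights = [], []
    for a, n1, c, n2 in trials:
        rr = (a / n1) / (c / n2)
        var = 1/a - 1/n1 + 1/c - 1/n2      # variance of log(RR)
        log_rr.append(np.log(rr))
        weights.append(1.0 / var)

    pooled = np.average(log_rr, weights=weights)
    se = 1.0 / np.sqrt(np.sum(weights))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    p = 2 * (1 - stats.norm.cdf(abs(pooled / se)))
    print(f"pooled RR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), p = {p:.4f}")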
Abstract:
Subsurface fluid flow plays a significant role in many geologic processes and is increasingly being studied at the scale of sedimentary basins and from a geologic time perspective. Many economic resources such as petroleum and mineral deposits are products of basin-scale fluid flow operating over long periods of time. Such ancient flow systems can be studied through analysis of diagenetic alterations and fluid inclusions to constrain the physical and chemical conditions of fluids and rocks during their paleohydrogeologic evolution. Basin simulation models are useful to complement the paleohydrogeologic record preserved in the rocks and to derive conceptual models of hydraulic basin evolution and the generation of economic resources. Different types of fluid flow regimes may evolve during basin evolution. The most important with respect to flow rates and capacity for transport of solutes and thermal energy is gravitational fluid flow driven by the topographic configuration of a basin. Such flow systems require the basin to be elevated above sea level. Consolidational fluid flow is the principal fluid migration process in basins below sea level, caused by loading of compressible rocks. Flow rates of such systems are several orders of magnitude below topography-driven flow. However, consolidation may create significant fluid overpressure. Episodic dewatering of overpressured compartments may cause sudden fluid release with elevated flow velocities and may cause transient local thermal and chemical disequilibrium between fluid and rock. This paper gives an overview of subsurface fluid flow processes at basin scale and presents examples related to the Penedès basin in the central Catalan continental margin, including the offshore Barcelona half-graben and the compressive South-Pyrenean basin.
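The magnitude of topography-driven flow can be made concrete with Darcy's law, q = K * grad(h) with K = k * rho * g / mu. A minimal sketch with illustrative parameter values (none are from the paper):

    # Darcy specific discharge for a topography-driven flow system.
    k = 1e-13            # intrinsic permeability, m^2 (sandstone-like)
    mu = 1e-3            # water viscosity, Pa*s
    rho, g = 1000.0, 9.81
    K = k * rho * g / mu               # hydraulic conductivity, m/s
    grad_h = 1e-3                      # head gradient: 1 m drop per km
    q = K * grad_h                     # specific discharge, m/s
    print(f"q = {q:.1e} m/s (~{q * 3.15e7 * 100:.1f} cm/yr)")

Even this modest gradient yields centimetres per year of specific discharge; compaction-driven flow, as the abstract notes, runs orders of magnitude slower.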
Abstract:
Immersive virtual reality (IVR) typically generates the illusion in participants that they are in the displayed virtual scene, where they can experience and interact with events as if they were really happening. Teleoperator (TO) systems place people at a remote physical destination, embodied as a robotic device, where participants typically have the sensation of being at the destination, with the ability to interact with entities there. In this paper, we show how to combine IVR and TO to allow a new class of application. The participant in the IVR is represented at the destination by a physical robot (TO) and, simultaneously, the remote place and the entities within it are represented to the participant in the IVR. Hence, the IVR participant has a normal virtual reality experience, but one where his or her actions and behaviour control the remote robot and can therefore have physical consequences. Here, we show how such a system can be deployed to allow a human and a rat to operate together, with the human interacting with the rat on a human scale and the rat interacting with the human on the rat scale. The human is represented in a rat arena by a small robot that is slaved to the human's movements, whereas the tracked rat is represented to the human in the virtual reality by a humanoid avatar. We describe the system and also a study that was designed to test whether humans can successfully play a game with the rat. The results show that the system functioned well and that the humans were able to interact with the rat to fulfil the tasks of the game. This system opens up the possibility of new applications in the life sciences involving participant observation of, and interaction with, animals but at human scale.
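At the core of such a setup is a pair of scale mappings between the two spaces. A minimal sketch; the arena and room dimensions are hypothetical assumptions, not figures from the paper:

    # Map the tracked rat into the human-scale virtual room, and the
    # tracked human onto the small robot in the rat arena.
    ARENA = 1.2   # rat arena side length, m (assumed)
    ROOM = 6.0    # virtual room side length, m (assumed)

    def rat_to_avatar(x, y):
        """Arena coordinates -> humanoid avatar position in the room."""
        s = ROOM / ARENA
        return x * s, y * s

    def human_to_robot(x, y):
        """Room coordinates -> robot position in the arena."""
        s = ARENA / ROOM
        return x * s, y * s

    print(rat_to_avatar(0.6, 0.3))   # -> (3.0, 1.5)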
Psychometric Properties of the Spanish Human System Audit Short-Scale of Transformational Leadership
Abstract:
The aim of this research is to examine the psychometric properties of a Spanish version of the Human System Audit transformational leadership short-scale (HSA-TFL-ES), based on the concept of transformational leadership developed by Bass in 1985. The HSA-TFL is part of the wider Human System Audit framework. We analyzed the HSA-TFL-ES in five different samples comprising a total of 1,718 workers across five sectors. Exploratory factor analysis corroborated a single factor in all samples that accounted for 66% to 73% of the variance. The internal consistency in all samples was good (α = .92-.95). Evidence was found for the convergent validity of the HSA-TFL-ES with the Multifactor Leadership Questionnaire. These results suggest that the HSA-TFL short-scale is a psychometrically sound measure of this construct and can be used for a first overall measurement.
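The internal-consistency figures quoted above are Cronbach's alpha values. A minimal sketch of the computation on synthetic single-factor item scores (not the HSA-TFL-ES data):

    import numpy as np

    rng = np.random.default_rng(0)
    latent = rng.standard_normal(200)                  # 200 respondents
    items = latent[:, None] + 0.5 * rng.standard_normal((200, 8))  # 8 items

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_var / total_var)
    print(f"alpha = {alpha:.2f}")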
Abstract:
In this paper we propose the use of independent component analysis (ICA) [1] to improve the classification rate of decision trees and multilayer perceptrons [2], [3]. Using ICA as a preprocessing stage makes the structure of both classifiers simpler and therefore improves their generalization properties. The hypothesis behind the proposed preprocessing is that ICA transforms the feature space into a space where the components are independent and aligned to the axes, and therefore better adapted to the way a decision tree is constructed. The inference of the weights of a multilayer perceptron also becomes easier, because the gradient search in the weight space follows independent trajectories. The result is that the classifiers are less complex and, on some databases, the error rate is lower. This idea is also applicable to regression.
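A minimal sketch of the proposed pipeline with scikit-learn; the dataset is a stand-in, not one of the databases used in the paper:

    from sklearn.datasets import load_breast_cancer
    from sklearn.decomposition import FastICA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    plain = DecisionTreeClassifier(random_state=0)
    with_ica = make_pipeline(FastICA(n_components=10, random_state=0),
                             DecisionTreeClassifier(random_state=0))

    # Compare cross-validated accuracy with and without ICA preprocessing.
    print("tree alone:", cross_val_score(plain, X, y, cv=5).mean())
    print("ICA + tree:", cross_val_score(with_ica, X, y, cv=5).mean())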
Abstract:
Diagnosis of community-acquired legionella pneumonia (LCAP) is currently performed by means of laboratory techniques, which may delay diagnosis by several hours. To determine whether an artificial neural network (ANN) can categorize LCAP and non-legionella community-acquired pneumonia (NLCAP) and be a standard tool for clinicians, we prospectively studied 203 patients with community-acquired pneumonia (CAP) diagnosed by laboratory tests. Twenty-one clinical and analytical variables were recorded to train a neural net on the two classes (LCAP or NLCAP). In this paper we deal with the problems of diagnosis, feature selection, ranking of the features as a function of their classification importance, and the design of a classifier under the criterion of maximizing the area under the ROC (receiver operating characteristic) curve, which gives a good trade-off between true positives and false negatives. To guarantee the validity of the statistics, the train-validation-test databases were rotated by the jackknife technique, and a multistart procedure was used to make the system insensitive to local maxima.
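A minimal sketch of the evaluation idea: rotate train/test splits and score a small neural network by ROC area. It uses scikit-learn with synthetic data (21 features, echoing the 21 recorded variables), not the CAP patient data:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import StratifiedKFold
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=203, n_features=21, random_state=0)

    aucs = []
    folds = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for train, test in folds.split(X, y):
        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                            random_state=0)
        clf.fit(X[train], y[train])
        aucs.append(roc_auc_score(y[test], clf.predict_proba(X[test])[:, 1]))

    print(f"ROC area: {np.mean(aucs):.2f} +/- {np.std(aucs):.2f}")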
Abstract:
We present a detailed evaluation of the seasonal performance of the Community Multiscale Air Quality (CMAQ) modelling system and the PSU/NCAR meteorological model coupled to a new Numerical Emission Model for Air Quality (MNEQA). The combined system simulates air quality at a fine resolution (3 km horizontal, 1 h temporal) in north-eastern Spain, where ozone pollution problems are frequent. An extensive database compiled over two periods, May to September of 2009 and 2010, is used to evaluate the meteorological simulations and chemical outputs. Our results indicate that the model accurately reproduces the hourly and the 1-h and 8-h maximum ozone surface concentrations measured at the air quality stations, as the statistical values fall within the EPA and EU recommendations. However, to further improve forecast accuracy, three simple bias-adjustment techniques, namely mean subtraction (MS), ratio adjustment (RA) and hybrid forecast (HF), based on 10 days of available comparisons, are applied. The results show that MS performed better than RA or HF, although all three techniques significantly reduce the systematic errors in the ozone forecasts.
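A minimal sketch of the bias-adjustment ideas over a 10-day window of paired forecasts F and observations O. MS and RA follow their standard definitions; the HF stand-in here (previous-day ratio) is an assumption, since the abstract does not spell out the hybrid-forecast formula:

    import numpy as np

    def mean_subtraction(f_today, F, O):
        """Subtract the mean forecast bias of the last 10 days."""
        return f_today - (F - O).mean()

    def ratio_adjustment(f_today, F, O):
        """Scale by the mean observed-to-forecast ratio."""
        return f_today * (O.mean() / F.mean())

    def hybrid_forecast(f_today, F, O):
        """Assumed stand-in: apply only the previous day's ratio."""
        return f_today * (O[-1] / F[-1])

    F = np.array([80, 85, 90, 88, 92, 95, 91, 87, 89, 93], float)  # ug/m3
    O = F - 8 + 2 * np.random.default_rng(0).standard_normal(10)   # biased

    f_raw = 90.0
    for fn in (mean_subtraction, ratio_adjustment, hybrid_forecast):
        print(fn.__name__, round(fn(f_raw, F, O), 1))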