Abstract:
The recently reported Monte Carlo Random Path Sampling method (RPS) is here improved and its application is expanded to the study of the 2D and 3D Ising and discrete Heisenberg models. The methodology was implemented to allow use in both CPU-based high-performance computing infrastructures (C/MPI) and GPU-based (CUDA) parallel computation, with significant computational performance gains. Convergence is discussed, both in terms of free energy and magnetization dependence on field/temperature. From the calculated magnetization-energy joint density of states, fast calculations of field and temperature dependent thermodynamic properties are performed, including the effects of anisotropy on coercivity, and the magnetocaloric effect. The emergence of first-order magneto-volume transitions in the compressible Ising model is interpreted using the Landau theory of phase transitions. Using metallic Gadolinium as a real-world example, the possibility of using RPS as a tool for computational magnetic materials design is discussed. Experimental magnetic and structural properties of a Gadolinium single crystal are compared to RPS-based calculations using microscopic parameters obtained from Density Functional Theory.
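The final step described above — obtaining field- and temperature-dependent properties from the magnetization-energy joint density of states — amounts to Boltzmann reweighting of the sampled histogram. A minimal sketch (the function name, grids and toy inputs are illustrative, not the authors' code):

```python
import numpy as np

def thermo_from_jdos(E, M, g, T, H, kB=1.0):
    """<M>(T, H) and relative free energy from a joint density of
    states g[i, j] tabulated on energy grid E and magnetization grid M."""
    Egrid, Mgrid = np.meshgrid(E, M, indexing="ij")
    # Boltzmann exponent, with the Zeeman term -H*M coupling the field
    # to the magnetization; bins with g == 0 contribute nothing.
    lnw = np.where(g > 0,
                   np.log(np.where(g > 0, g, 1.0)) - (Egrid - H * Mgrid) / (kB * T),
                   -np.inf)
    shift = lnw.max()            # subtract the max exponent to avoid overflow
    w = np.exp(lnw - shift)
    Z = w.sum()
    m_avg = float((Mgrid * w).sum() / Z)
    free_energy = -kB * T * (np.log(Z) + shift)
    return m_avg, free_energy
```

For a toy two-state system (one energy level, M = ±1) this reproduces the exact paramagnetic result ⟨M⟩ = tanh(H/kT), which makes it easy to sanity-check before feeding in a sampled density of states.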
Abstract:
One of the most significant research topics in computer vision is object detection. Most reported object detection results localise the detected object within a bounding box but do not explicitly label the edge contours of the object. Since object contours provide a fundamental diagnostic of object shape, some researchers have initiated work on linear contour feature representations for object detection and localisation. However, linear contour feature-based localisation depends heavily on the performance of linear contour detection in natural images, which can be perturbed significantly by a cluttered background. In addition, the conventional approach to achieving rotation-invariant features is to rotate the feature receptive field to align with the local dominant orientation before computing the feature representation. Grid resampling after rotation adds extra computational cost and increases the total time needed to compute the feature descriptor. Although this is not an expensive process on current computers, it is still desirable for each step of the implementation to be fast, especially when the number of local features grows and the application runs in real time on resource-limited "smart devices" such as mobile phones. Motivated by these issues, a 2D object localisation system is proposed in this thesis that matches features of edge contour points, an alternative method that exploits shape information for object localisation. This is inspired by the observation that edge contour points are the basic components of shape contours. In addition, edge point detection is usually simpler to achieve than linear edge contour detection; therefore, the proposed localisation system avoids the need for linear contour detection and reduces the pathological disruption from the image background. Moreover, since natural images usually contain many more edge contour points than interest points (i.e. corner points), we also propose new methods to generate rotation-invariant local feature descriptors without pre-rotating the feature receptive field, improving the computational efficiency of the whole system. In detail, the 2D object localisation system matches edge contour point features within a constrained search area based on an initial pose estimate produced by a prior object detection process. The local feature descriptor obtains rotation invariance by exploiting the rotational symmetry of the hexagonal structure; accordingly, a set of local feature descriptors is proposed based on a hierarchical hexagonal grouping structure. Ultimately, the 2D object localisation system achieves very promising performance by matching the proposed edge contour point features, with a mean correct labelling rate of 0.8654 and a mean false labelling rate of 0.0314 on data from the Amsterdam Library of Object Images (ALOI). Furthermore, the proposed descriptors are evaluated against state-of-the-art descriptors and achieve competitive performance in pose estimation, with around half a pixel of pose error.
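The symmetry argument behind this design — rotation invariance obtained from the rotational symmetry of a hexagonal sampling structure rather than by pre-rotating the receptive field — can be illustrated with a toy single-ring version (a sketch of the idea only, not the thesis's hierarchical hexagonal descriptor):

```python
import numpy as np

def hex_ring_descriptor(img, cx, cy, r):
    """Sample six points on a hexagonal ring around (cx, cy) and keep
    the magnitude of the ring's 1-D DFT.  Rotating the patch by a
    multiple of 60 degrees circularly shifts the six samples, and the
    DFT magnitude is invariant to circular shifts -- so no grid
    resampling or receptive-field pre-rotation is required."""
    angles = np.arange(6) * np.pi / 3.0
    xs = np.clip(np.rint(cx + r * np.cos(angles)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip(np.rint(cy + r * np.sin(angles)).astype(int), 0, img.shape[0] - 1)
    return np.abs(np.fft.fft(img[ys, xs].astype(float)))
```

The invariance holds only for the hexagon's own symmetry angles; finer angular robustness is exactly what richer, hierarchical groupings of such rings are meant to provide.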
Abstract:
One challenge in data assimilation (DA) methods is how the error covariance for the model state is computed. Ensemble methods have been proposed for producing error covariance estimates, as the error is propagated in time using the non-linear model. Variational methods, on the other hand, use concepts from control theory, whereby the state estimate is optimized from both the background and the measurements; numerical optimization schemes are applied which avoid the memory storage and huge matrix inversions needed by classical Kalman filter methods. The Variational Ensemble Kalman Filter (VEnKF), a method inspired by the Variational Kalman Filter (VKF), enjoys the benefits of both ensemble and variational methods. It avoids filter inbreeding problems, which emerge when the ensemble spread underestimates the true error covariance; in VEnKF this is tackled by resampling the ensemble every time measurements are available. One advantage of VEnKF over VKF is that it needs neither tangent linear code nor adjoint code. In this thesis, VEnKF has been applied to a two-dimensional shallow water model simulating a dam-break experiment. The model is a public code, with water height measurements recorded at seven stations along the mid-line of the 21.2 m long, 1.4 m wide flume. Because the data were too sparse to assimilate into the 30 171-element model state vector, we chose to interpolate the data both in time and in space. The results of the assimilation were compared with those of a pure simulation. We found that the results produced by VEnKF were more realistic, without the numerical artifacts present in the pure simulation. Creating wrapper code for a model and a DA scheme can be challenging, especially when the two were designed independently or are poorly documented. In this thesis we present a non-intrusive approach to coupling the model and a DA scheme: an external program is used to exchange information between the model and the DA procedure using files.
The advantage of this method is that the model code changes needed are minimal: only a few lines to facilitate input and output. Apart from being simple to couple, the approach can be employed even if the two were written in different programming languages, because the communication is not through code. The non-intrusive approach accommodates parallel computing by simply telling the control program to wait until all processes have ended before the DA procedure is invoked. It is worth mentioning the overhead the approach introduces, as at every assimilation cycle both the model and the DA procedure have to be initialized. Nonetheless, the method can be an ideal approach for a benchmark platform for testing DA methods. The non-intrusive VEnKF has been applied to the multi-purpose hydrodynamic model COHERENS to assimilate Total Suspended Matter (TSM) in Lake Säkylän Pyhäjärvi. The lake has an area of 154 km² and an average depth of 5.4 m. Turbidity and chlorophyll-a concentrations from MERIS satellite images were available for 7 days between May 16 and July 6, 2009; the effect of organic matter was computationally eliminated to obtain TSM data. Because of the computational demands of both COHERENS and VEnKF, we chose to use a 1 km grid resolution. The VEnKF results were compared with measurements recorded at an automatic station located in the north-western part of the lake; however, due to the sparsity of the TSM data in both time and space, a close match could not be obtained. The use of multiple automatic stations with real-time data is important to avoid the time-sparsity problem, and with DA this will, for instance, help in better understanding environmental hazard variables. We found that using a very large ensemble does not necessarily improve the results, because there is a limit beyond which additional ensemble members add very little to the performance.
The successful implementation of the non-intrusive VEnKF, together with the ensemble-size limit on performance, leads to the emerging area of Reduced Order Modelling (ROM). To save computational resources, ROM avoids running the full-blown model. When ROM is combined with the non-intrusive DA approach, it may yield a cheaper algorithm that relaxes the computational challenges existing in the fields of modelling and DA.
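For reference, the ensemble analysis step that VEnKF builds on can be sketched as a standard stochastic ensemble Kalman update (this textbook form is for illustration only; VEnKF itself forms the analysis variationally and resamples the ensemble, as described above):

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic EnKF analysis step: X is an n_state x n_ens forecast
    ensemble, y the observation vector, H a linear observation
    operator, R the observation-error covariance."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    Pf = A @ A.T / (n_ens - 1)                       # forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # Perturbed observations, one independent draw per member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(y.size), R, n_ens).T
    return X + K @ (Y - H @ X)
```

With near-perfect observations (R → 0) the analysis mean collapses onto the measurement, which is a convenient correctness check; the filter-inbreeding problem mentioned above appears when the anomaly spread in A is too small relative to the true error.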
Abstract:
Agricultural crops can be damaged by fungi, insects, worms and other organisms that cause diseases and decrease the yield of production. The effect of these damaging agents can be reduced using pesticides. Among them, triazole compounds are effective substances against fungi, for example Oidium. Nevertheless, it has been found that residues of these fungicides in foods, as well as in derived products, can affect the health of consumers. Therefore, the European Union has established several regulations fixing maximum pesticide residue levels in a wide range of foods to assure consumer safety. Hence, it is very important to develop adequate methods to determine these pesticide compounds. In most cases, gas or liquid chromatographic (GC, LC) separations are used in the analysis of the samples, but first it is necessary to apply proper sample treatments in order to preconcentrate and isolate the target analytes. To this end, microextraction techniques are very effective tools, because they allow both preconcentration and extraction of the analytes in one simple step, which considerably reduces the sources of error. With these objectives, two remarkable techniques have been widely used in recent years: solid phase microextraction (SPME) and liquid phase microextraction (LPME) in its different variants. Both techniques avoid the use of, or reduce the amount of, toxic solvents, and are conveniently coupled to chromatographic equipment, providing good quantitative results for a wide number of matrices and compounds. In this work, simple and reliable methods have been developed using SPME and ultrasound-assisted emulsification microextraction (USAEME) coupled to GC or LC for the determination of triazole fungicides. The proposed methods allow triazole concentrations on the order of μg L⁻¹ to be determined confidently in different fruit samples. Chemometric tools have been used to accomplish successful determinations.
First, they were used in the selection and optimization of the variables involved in the microextraction processes; and second, to overcome problems related to overlapping peaks. Different fractional factorial designs were used for the screening of the experimental variables, and central composite designs were carried out to obtain the best experimental conditions. To resolve the overlapping-peak problems, multivariate calibration methods were used: Parallel Factor Analysis 2 (PARAFAC2), Multivariate Curve Resolution (MCR) and Parallel Factor Analysis with Linear Dependencies (PARALIND) have been proposed, the appropriate algorithms have been applied according to the data characteristics, and the results have been compared. Grape and apple samples were selected because of their occurrence in the Basque Country and their relevance to the production of cider and the regional txakoli wine. These crops are often treated with triazole compounds to combat the problems caused by fungi. The peel and pulp of grape and apple, their juices and some commercial products such as musts, juice and cider have been analysed, showing the adequacy of the developed methods for triazole determination in this kind of fruit sample.
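The optimization strategy mentioned above can be made concrete: a central composite design in coded units is simply the 2^k factorial points, 2k axial (star) points and centre replicates. A generic sketch, not tied to the specific experimental variables of this work:

```python
import itertools
import numpy as np

def central_composite(k, alpha=None, n_center=1):
    """Central composite design in coded units: 2**k factorial points,
    2*k axial points at +/-alpha, and n_center centre points.
    alpha defaults to the rotatable value (2**k)**0.25."""
    if alpha is None:
        alpha = (2.0 ** k) ** 0.25
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha       # low star point on factor i
        axial[2 * i + 1, i] = alpha    # high star point on factor i
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])
```

For two factors this yields nine runs with axial points at ±√2, the rotatable design commonly used to fit the quadratic response surfaces that locate the best extraction conditions.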
Abstract:
Noise mapping has been used as an instrument for the assessment of environmental noise, helping to support decision-making in urban planning. In Brazil, urban noise is not yet recognized as a major environmental problem by the government; moreover, cities that have the databases needed to drive acoustic simulations with advanced noise mapping systems are rare. This study sought an alternative method of noise mapping through the use of geoprocessing, one feasible for the Brazilian reality and for other developing countries. The area chosen for the study was the central zone of the city of Sorocaba, located in São Paulo State, Brazil. The proposed method was effective in the spatial evaluation of the equivalent sound pressure level. The results showed an urban area with high noise levels that exceed the legal standard, posing a threat to the welfare of the population.
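The equivalent sound pressure level mapped in studies like this one is an energy average rather than an arithmetic one. A minimal sketch of that standard definition (illustrative only; the study's geoprocessing workflow is not reproduced here):

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level L_eq: convert each sampled
    level to relative energy, average the energies, convert back to dB."""
    energies = [10.0 ** (L / 10.0) for L in levels_db]
    return 10.0 * math.log10(sum(energies) / len(energies))
```

Because the averaging is done on energies, a single loud interval dominates: leq([60, 70]) is roughly 67.4 dB, not 65 dB, which is why short noisy events weigh heavily on urban noise maps.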
Abstract:
Liquid chromatography coupled with mass spectrometry is one of the most powerful tools in the toxicologist's arsenal for detecting a wide variety of compounds in many different matrices. However, the huge number of potentially abused substances, and of new substances specifically designed as intoxicants, poses a problem in a forensic toxicology setting. Most methods are targeted, designed to cover a very specific drug or group of drugs, while many other substances remain undetected. High resolution mass spectrometry, more specifically time-of-flight mass spectrometry, represents an extremely powerful tool for analysing a multitude of compounds not only simultaneously but also retroactively. The data obtained with a time-of-flight instrument contain all compounds made available by sample extraction and chromatography, and can be reprocessed at a later time with an improved library to detect previously unrecognised compounds without having to analyse the respective sample again. The aim of this project was to determine the utility and limitations of time-of-flight mass spectrometry as a general and easily expandable screening method. The resolution of time-of-flight mass spectrometry allows the separation of compounds with the same nominal mass but distinct exact masses without the need to separate them chromatographically. To simulate the wide variety of drugs potentially encountered by such a general screening method, seven drugs (morphine, cocaine, zolpidem, diazepam, amphetamine, MDEA and THC) were chosen to represent this variety in terms of mass, properties and functional groups. Consequently, several liquid-liquid and solid phase extractions were applied to urine samples to determine the most generally applicable, unspecific extraction. Chromatography was optimised by investigating the mobile-phase parameters pH, concentration, organic solvent and gradient to improve the data obtained by the time-of-flight instrument.
The resulting method was validated as a qualitative confirmation/identification method. Data processing was automated using the software TargetAnalysis, which provides excellent analyte recognition based on retention time, exact mass and isotope pattern. The recognition of isotope patterns allows excellent identification of analytes even in interference-rich mass spectra and proved to be a good positive indicator. Finally, the validated method was applied to samples received from the A&E Department of Glasgow Royal Infirmary in suspected drug abuse cases, and to samples from the Scottish Prison Service's own prevalence study targeting drugs of abuse in the prison population. The data obtained were processed with a library established in the course of this work.
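The claim above about separating same-nominal-mass compounds without chromatographic separation reduces to the instrument's resolving power, R = m/Δm. A small sketch with hypothetical masses (the values below are illustrative, not compounds from this work):

```python
def resolving_power(m1, m2):
    """Resolving power R = m / delta-m needed to distinguish two exact
    masses (in Da) that share the same nominal mass."""
    mean_mass = (m1 + m2) / 2.0
    return mean_mass / abs(m1 - m2)
```

Two hypothetical ions at 300.10 and 300.20 Da would need R of only about 3000, which is typically well within the capability of modern time-of-flight instruments; as the exact masses approach each other, the required R grows and chromatographic separation regains importance.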
Abstract:
Biological tissues are subjected to complex loading states in vivo, and in order to define constitutive equations that effectively simulate their mechanical behaviour under these loads, it is necessary to obtain data on the tissue's response to multiaxial loading. Single-axis and shear testing of biological tissues is often carried out, but biaxial testing is less common. We sought to design and commission a biaxial compression testing device capable of obtaining repeatable data for biological samples. The apparatus comprised a sealed stainless steel pressure vessel specifically designed such that a state of hydrostatic compression could be created on the test specimen while simultaneously unloading the sample along one axis with an equilibrating tensile pressure; thus a state of equibiaxial compression was created perpendicular to the long axis of a rectangular sample. For the purpose of calibration and commissioning of the vessel, rectangular samples of closed-cell ethylene vinyl acetate (EVA) foam were tested. Each sample was subjected to repeated loading, and nine separate biaxial experiments were carried out to a maximum pressure of 204 kPa (30 psi), with a relaxation time of two hours between them. Calibration testing demonstrated that the force applied to the samples had a maximum error of 0.026 N (0.423% of the maximum applied force). Under repeated loading, the foam sample demonstrated lower stiffness during the first load cycle; following this cycle, a stiffer, repeatable response was observed with successive loading. While the experimental protocol was developed for EVA foam, preliminary results on this material suggest that the device may be capable of providing test data for biological tissue samples. The load response of the foam was characteristic of closed-cell foams, with consolidation during the early loading cycles, then a repeatable load-displacement response upon repeated loading.
The repeatability of the test results demonstrated the ability of the device to provide reproducible data, and the low experimental error in the force demonstrated the reliability of the test data.
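As a quick consistency check on the calibration figures quoted above, the maximum applied force implied by an absolute error of 0.026 N equal to 0.423% of that force works out as follows:

```python
# Back out the maximum applied force from the quoted calibration error:
# 0.026 N is stated to be 0.423% of the maximum applied force.
max_error_n = 0.026
relative_error = 0.423 / 100.0
f_max = max_error_n / relative_error   # maximum applied force, ~6.15 N
```

This simple division is only a cross-check of the reported percentages, not part of the calibration procedure itself.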
Abstract:
This paper provides an overview of the current QUT Spatial Science undergraduate program based in Brisbane, Queensland, Australia. It discusses the development and implementation of a broad-based educational model for courses in the Faculty of Built Environment and Engineering, and specifically the course structure of the new Bachelor of Urban Development (Spatial Science) study major. A brief historical background of surveying courses is given before the three distinct and complementary learning themes of the new course structure are detailed with a graphical course matrix. Curriculum mapping of the spatial science major has been undertaken as the course approaches formal review in late 2010. Work-integrated learning opportunities have been embedded in the curriculum, and a brief outline is presented. Some issues relevant to the tertiary surveying/spatial sector are highlighted in the context of changing higher education environments in Australia.
Abstract:
Despite the increasing popularity of social networking websites (SNWs), very little is known about the psychosocial variables that predict people's use of these websites. The present study used an extended model of the theory of planned behaviour (TPB), including the additional variables of self-identity and belongingness, to predict high-level SNW use intentions and behaviour in a sample of young people aged between 17 and 24 years. Additional analyses examined the impact of self-identity and belongingness on young people's addictive tendencies towards SNWs. University students (N = 233) completed measures of the standard TPB constructs (attitude, subjective norm and perceived behavioural control), the additional predictor variables (self-identity and belongingness), demographic variables (age, gender and past behaviour) and addictive tendencies. One week later, they reported their engagement in high-level SNW use during the previous week. Regression analyses partially supported the TPB: attitude and subjective norm significantly predicted intentions to engage in high-level SNW use, with intention significantly predicting behaviour. Self-identity, but not belongingness, significantly contributed to the prediction of intention and, unexpectedly, of behaviour. Past behaviour also significantly predicted intention and behaviour. Self-identity and belongingness significantly predicted addictive tendencies toward SNWs. Overall, the present study revealed that high-level SNW use is influenced by attitudinal, normative and self-identity factors, findings which can be used to inform strategies that aim to modify young people's high levels of use of, or addictive tendencies towards, SNWs.