997 results for Particle Image Velocimetry


Relevance:

20.00%

Publisher:

Abstract:

Robotic platforms have advanced greatly in terms of their remote sensing capabilities, including obtaining optical information using cameras. Alongside these advances, visual mapping has become a very active research area, which facilitates the mapping of areas inaccessible to humans. This requires efficient data processing to increase both the final mosaic quality and the computational efficiency. In this paper, we propose an efficient image mosaicing algorithm for large-area visual mapping in underwater environments using multiple underwater robots. Our method identifies overlapping image pairs across the trajectories carried out by the different robots during the topology estimation process, which is a cornerstone for efficiently mapping large areas of the seafloor. We present comparative results based on challenging real underwater datasets that simulate multi-robot mapping.
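As a rough illustration of how overlapping image pairs can be detected, the sketch below matches OpenCV ORB features with a ratio test and declares overlap when the match count exceeds a threshold; the function names, the brute-force pairwise loop, and the `min_matches` value are illustrative assumptions, not the paper's actual topology estimation procedure.

```python
import cv2

def count_feature_matches(img_a, img_b, ratio=0.75):
    """Count ORB feature matches between two grayscale images using a ratio test."""
    orb = cv2.ORB_create(nfeatures=2000)
    _, des_a = orb.detectAndCompute(img_a, None)
    _, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_a, des_b, k=2)
    good = [m for m in matches if len(m) == 2 and m[0].distance < ratio * m[1].distance]
    return len(good)

def find_overlapping_pairs(images, min_matches=30):
    """Return index pairs of images that likely overlap (candidate mosaic links)."""
    pairs = []
    for i in range(len(images)):
        for j in range(i + 1, len(images)):
            if count_feature_matches(images[i], images[j]) >= min_matches:
                pairs.append((i, j))
    return pairs
```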

Relevance:

20.00%

Publisher:

Abstract:

Quick removal of biosolids in aquaculture facilities, and especially in recirculating aquaculture systems (RAS), is one of the most important steps in waste management. The sedimentation dynamics of biosolids in an aquaculture tank will determine their accumulation at the bottom of the tank.

Relevance:

20.00%

Publisher:

Abstract:

Late on 2011 November 3, STEREO-A, STEREO-B, MESSENGER, and near-Earth spacecraft observed an energetic particle flux enhancement. Based on the analysis of in situ plasma and particle observations, their correlation with remote sensing observations, and an interplanetary transport model, we conclude that the particle increases observed at multiple locations had a common single source active region and that the energetic particles filled a very broad region around the Sun. The active region was located on the solar backside (as seen from Earth) and was the source of a large flare, a fast and wide coronal mass ejection, and an EIT wave, accompanied by type II and type III radio emission. In contrast to previous solar energetic particle events showing broad longitudinal spread, this event showed clear particle anisotropies at three widely separated observation points at 1 AU, suggesting direct particle injection close to the magnetic footpoint of each spacecraft, lasting for several hours. We discuss these observations and the possible scenarios explaining the extremely broad particle spread for this event.

Relevance:

20.00%

Publisher:

Abstract:

The ongoing development of digital media has brought a new set of challenges with it. As images containing more than three wavelength bands, often called spectral images, are becoming a more integral part of everyday life, problems in the quality of the RGB reproduction from spectral images have turned into an important area of research. The notion of image quality is often thought to comprise two distinct areas, image quality itself and image fidelity, which deal with similar questions: image quality is the degree of excellence of the image, while image fidelity measures how closely the image under study matches the original. In this thesis, both image fidelity and image quality are considered, with an emphasis on the influence of color and spectral image features on both. Very few works are dedicated to the quality and fidelity of spectral images. Several novel image fidelity measures were developed in this study, including kernel similarity measures and 3D-SSIM (structural similarity index). The kernel measures incorporate the polynomial, Gaussian radial basis function (RBF) and sigmoid kernels. The 3D-SSIM is an extension of the traditional gray-scale SSIM measure, developed to incorporate spectral data. The novel image quality model presented in this study is based on the assumption that the statistical parameters of the spectra of an image influence its overall appearance. The spectral image quality model comprises three parameters of quality: colorfulness, vividness and naturalness. The quality prediction is done by modeling the preference function expressed in JNDs (just noticeable differences). Both the image fidelity measures and the image quality model have proven to be effective in the respective experiments.
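As a rough illustration of a kernel similarity between two spectral images, the sketch below evaluates a Gaussian RBF kernel on each pair of co-located pixel spectra and averages the result over the image; the (height, width, bands) array layout and the `gamma` parameter are illustrative assumptions, not the exact measures developed in the thesis.

```python
import numpy as np

def rbf_kernel_similarity(img_a, img_b, gamma=1.0):
    """
    Average Gaussian RBF kernel similarity between two spectral images of shape
    (height, width, bands). Each pixel spectrum of img_a is compared with the
    spectrum at the same position in img_b; 1.0 means identical images.
    """
    a = img_a.reshape(-1, img_a.shape[-1]).astype(float)
    b = img_b.reshape(-1, img_b.shape[-1]).astype(float)
    sq_dist = np.sum((a - b) ** 2, axis=1)            # per-pixel squared spectral distance
    return float(np.mean(np.exp(-gamma * sq_dist)))
```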

Relevance:

20.00%

Publisher:

Abstract:

In the last two decades of studying the Solar Energetic Particle (SEP) phenomenon, intensive emphasis has been put on how, when, and where these SEPs are injected into interplanetary space. It is well known that SEPs are related to solar flares and CMEs. However, the role of each in the acceleration of SEPs has been under debate ever since the major role was gradually reattributed from flares to CMEs after the Skylab mission, which started the era of space-borne CME observations. Since then, the shock wave generated by powerful CMEs at between 2 and 5 solar radii has been considered the major accelerator. The current paradigm interprets the prolonged proton intensity-time profile in gradual SEP events as a direct effect of SEPs accelerated by the shock wave propagating in the interplanetary medium. Thus the powerful CME is thought of as the initiator of the acceleration and its shock wave as a continuing accelerator that produces such an intensity-time profile. Generally, a single powerful CME, which may or may not be associated with a flare, is believed to be the cause of such gradual events.

In this work we use the Energetic and Relativistic Nuclei and Electron (ERNE) instrument on board the Solar and Heliospheric Observatory (SOHO) to present an empirical study showing the possibility of multiple accelerations in SEP events. We first found 18 double-peaked SEP events by examining 88 SEP events. The peaks in the intensity-time profile were separated by 3-24 hours. We divided the SEP events into four groups according to possible multiple acceleration, and in one of these groups we find evidence for multiple acceleration in the velocity dispersion and in a change of the abundance ratio at the transition to the second peak. We then explored the intensity-time profiles of all SEP events during solar cycle 23 and found that most SEP events are associated with multiple eruptions at the Sun; we call these Multi-Eruption Solar Energetic Particle (MESEP) events. We use data from the Large Angle and Spectrometric Coronagraph (LASCO) on board SOHO to determine the CME associated with such events, and Yohkoh and GOES satellite data to determine the associated flare. We found four types of MESEP according to the appearance of the peaks in the intensity-time profile over a wide range of energy levels. We found that it is not possible to determine whether the peaks are related to an eruption at the Sun by examining only the flux anisotropy, the He/p ratio and the velocity dispersion. We then chose a rare event in which there is evidence of SEP acceleration from behind a previous CME. This work resulted in a conclusion which is inconsistent with the current SEP paradigm. By examining another MESEP event, we then discovered that energetic particles accelerated by a second CME can penetrate a previous decelerating CME-driven shock. Finally, we report the previous two MESEP events together with two new events and find a common basis for SEPs from a second CME penetrating previous decelerating shocks. This phenomenon is reported for the first time and is expected to have a significant impact on the modification of the current paradigm of solar energetic particle events.
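The velocity dispersion analysis mentioned above is commonly reduced to a linear fit: if particles of speed v are released simultaneously at the Sun and travel the same path length L, their onset times obey t_onset = t_inj + L/v, so a least-squares fit of onset time against 1/v yields the injection time and the apparent path length. The sketch below only illustrates that generic procedure (the function name, units, and inputs are assumptions), not the analysis code used in the thesis.

```python
import numpy as np

AU_KM = 1.496e8          # astronomical unit in km
C_KM_S = 2.998e5         # speed of light in km/s

def velocity_dispersion_fit(onset_times_h, beta):
    """
    Fit t_onset = t_inj + L / v over a set of energy channels.
    onset_times_h : observed onset times in hours since a reference time
    beta          : particle speeds v/c for the same channels
    Returns (injection time in hours, apparent path length in AU).
    """
    inv_v = 1.0 / (np.asarray(beta) * C_KM_S)                        # s/km
    slope, intercept = np.polyfit(inv_v, np.asarray(onset_times_h) * 3600.0, 1)
    return intercept / 3600.0, slope / AU_KM
```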

Relevance:

20.00%

Publisher:

Abstract:

Cooling crystallization is one of the most important purification and separation techniques in the chemical and pharmaceutical industries. The product of the cooling crystallization process is always a suspension that contains both the mother liquor and the product crystals, and therefore the first process step following crystallization is usually solid-liquid separation. The properties of the produced crystals, such as their size and shape, can be affected by modifying the conditions during the crystallization process. The filtration characteristics of solid-liquid suspensions, on the other hand, are strongly influenced by the particle properties as well as the properties of the liquid phase. It is thus obvious that the effect of changes made to the crystallization parameters can also be seen in the course of the filtration process. Although the relationship between crystallization and filtration is widely recognized, the number of publications where these unit operations have been considered in the same context seems to be surprisingly small. This thesis explores the influence of different crystallization parameters in an unseeded batch cooling crystallization process on the external appearance of the product crystals and on the pressure filtration characteristics of the obtained product suspensions. Crystallization experiments are performed by crystallizing sulphathiazole (C9H9N3O2S2), a well-known antibiotic agent, from different mixtures of water and n-propanol in an unseeded batch crystallizer. The crystallization parameters studied are the composition of the solvent, the cooling rate in experiments carried out with a constant cooling rate throughout the batch, the cooling profile, and the mixing intensity during the batch. The obtained crystals are characterized using an automated image analyzer, and the crystals are separated from the solvent through constant-pressure batch filtration experiments. The separation characteristics of the suspensions are described by means of the average specific cake resistance and the average filter cake porosity, and the compressibilities of the cakes are also determined. The results show that fairly large differences can be observed in the size and shape of the crystals, and it is also shown experimentally that changes in crystal size and shape have a direct impact on the pressure filtration characteristics of the crystal suspensions. The experimental results are utilized to create a procedure that can be used for estimating the filtration characteristics of solid-liquid suspensions from the particle size and shape data obtained by image analysis. Multilinear partial least squares regression (N-PLS) models are created between the filtration parameters and the particle size and shape data, and the results presented in this thesis show that relatively clear correlations can be detected with the obtained models.
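For context, the average specific cake resistance mentioned above is conventionally obtained from constant-pressure filtration data via the linearised Ruth equation, t/V = [mu*alpha*c / (2*A^2*dP)] * V + mu*Rm / (A*dP), so alpha follows from the slope of a t/V versus V plot. The sketch below is a generic illustration of that calculation (variable names and units are assumptions, not the thesis' own code).

```python
import numpy as np

def cake_resistance(t, V, area, dP, mu, c):
    """
    Estimate average specific cake resistance (alpha, m/kg) and filter medium
    resistance (Rm, 1/m) from constant-pressure filtration data using the
    linearised Ruth equation t/V = K1*V + K2.
    t    : filtration times, s
    V    : cumulative filtrate volumes, m^3
    area : filter area, m^2
    dP   : pressure difference, Pa
    mu   : filtrate viscosity, Pa*s
    c    : dry cake mass deposited per unit filtrate volume, kg/m^3
    """
    t = np.asarray(t, float)
    V = np.asarray(V, float)
    K1, K2 = np.polyfit(V, t / V, 1)          # slope and intercept of t/V vs V
    alpha = 2.0 * area**2 * dP * K1 / (mu * c)
    Rm = area * dP * K2 / mu
    return alpha, Rm
```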

Relevance:

20.00%

Publisher:

Abstract:

Aerosol size distributions from 6 to 700 nm were measured simultaneously at an urban background site and a roadside station in Oporto. The particle number concentration was higher at the traffic-exposed site, where up to 90% of the size spectrum was dominated by the nucleation mode. Larger aerosol mode diameters were observed at the urban background site, possibly due to coagulation processes or the uptake of gases during transport. Factor analysis has shown that road traffic and neighbouring stationary sources located upwind affect the urban area through intra-regional pollutant transport.

Relevance:

20.00%

Publisher:

Abstract:

This study evaluates the application of an intelligent hybrid system for time-series forecasting of atmospheric pollutant concentration levels. The proposed method consists of an artificial neural network combined with a particle swarm optimization algorithm. The method not only searches for the relevant time lags needed to characterize the time series correctly, but also determines the best neural network architecture. An experimental analysis is performed using four real time series, and the results are reported in terms of six performance measures. The experimental results demonstrate that the proposed methodology achieves a fair prediction of the presented pollutant time series using compact networks.
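A minimal sketch of the underlying idea, combining a plain global-best PSO with scikit-learn's MLPRegressor: each particle encodes a binary mask over candidate time lags plus a hidden-layer size, and its fitness is the validation MSE of the resulting network. The encoding, the fitness function, the parameter ranges, and all function names are illustrative assumptions, not the paper's exact hybrid system.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def make_dataset(series, lags):
    """Build a regression dataset from a 1-D series using the selected lags."""
    series = np.asarray(series, float)
    max_lag = max(lags)
    X = np.column_stack([series[max_lag - l:len(series) - l] for l in lags])
    return X, series[max_lag:]

def fitness(position, series, max_lag=12):
    """Decode a particle into (lag mask, hidden units) and return validation MSE."""
    mask = position[:max_lag] > 0.5
    lags = [l + 1 for l in range(max_lag) if mask[l]]
    if not lags:
        return np.inf
    hidden = int(2 + round(position[max_lag] * 18))        # 2..20 hidden neurons
    X, y = make_dataset(series, lags)
    split = int(0.8 * len(y))
    model = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=500, random_state=0)
    model.fit(X[:split], y[:split])
    return mean_squared_error(y[split:], model.predict(X[split:]))

def pso(series, n_particles=10, n_iter=20, max_lag=12, seed=0):
    """Plain global-best PSO over [0,1]^(max_lag+1): lag-mask bits plus network size."""
    rng = np.random.default_rng(seed)
    dim = max_lag + 1
    x = rng.random((n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([fitness(p, series, max_lag) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 1.0)
        f = np.array([fitness(p, series, max_lag) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()
```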

Relevance:

20.00%

Publisher:

Abstract:

Dirt counting and dirt particle characterisation of pulp samples are an important part of quality control in pulp and paper production, and an automatic image analysis system that can characterise dirt particles in various pulp samples is therefore highly desirable. However, existing image analysis systems use a single threshold to segment the dirt particles in different pulp samples, which limits their precision. An automatic image analysis system that overcomes this deficiency would thus be very useful. In this study, a developed variant of the Niblack thresholding method is proposed; it defines the threshold based on the number of segmented particles. In addition, Kittler thresholding is utilised. Both thresholding methods determine the dirt count of the different pulp samples accurately when compared with visual inspection and the Digital Optical Measuring and Analysis System (DOMAS). In addition, the minimum resolution needed for acquiring a scanner image is defined. Among the dirt particle features considered, curl differs sufficiently to discriminate bark from fibre bundles in different pulp samples. Three classifiers, k-Nearest Neighbour, Linear Discriminant Analysis and Multi-layer Perceptron, are utilised to categorise the dirt particles. Linear Discriminant Analysis and Multi-layer Perceptron are the most accurate in classifying the dirt particles segmented by Kittler thresholding with morphological processing. The results show that the dirt particles are successfully categorised as bark and as fibre bundles.
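For reference, the classic Niblack rule computes a local threshold T(x, y) = m(x, y) + k * s(x, y) from the mean and standard deviation inside a sliding window. The sketch below implements that generic baseline using scipy's uniform_filter (the window size and k value are illustrative), not the particle-count-driven variant developed in this study.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def niblack_threshold(image, window=25, k=-0.2):
    """
    Classic Niblack local thresholding: T = local_mean + k * local_std.
    image  : 2-D grayscale array (float or uint8)
    window : side length of the sliding window in pixels
    k      : weight of the local standard deviation (negative for dark particles)
    Returns a boolean mask of pixels darker than the local threshold.
    """
    img = image.astype(float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))
    return img < mean + k * std   # dark dirt particles against a lighter pulp background
```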

Relevance:

20.00%

Publisher:

Abstract:

This work is devoted to the development of a numerical method for convection-dominated convection-diffusion problems with a reaction term, covering both non-stiff and stiff chemical reactions. The technique is based on unifying Eulerian-Lagrangian schemes (a particle transport method) within the framework of operator splitting. In the computational domain, a particle set is assigned to solve the convection-reaction subproblem along the characteristic curves created by the convective velocity. At each time step, the convection, diffusion and reaction terms are solved separately, assuming that each phenomenon occurs in a sequential fashion. Moreover, adaptivity and projection techniques are used, respectively, to add particles in regions of high gradients (steep fronts) and discontinuities and to transfer the solution from the particle set onto the grid points. The numerical results show that the particle transport method improves the solutions of CDR problems. Nevertheless, the method is more time-consuming than other classical techniques, e.g., the method of lines. Despite this drawback, the particle transport method can be used to simulate problems that involve moving steep or smooth fronts, such as the separation of two or more elements in a system.
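A minimal 1-D sketch of the operator-splitting idea described above: within each time step, advection is treated in a Lagrangian fashion along characteristics, diffusion with an explicit finite-difference step on the grid, and the reaction with a local update. The periodic grid, the interpolation-based projection, and the linear-decay reaction are illustrative assumptions, not the scheme developed in this work.

```python
import numpy as np

def cdr_splitting_step(u, x, dt, dx, velocity, diffusion, rate):
    """
    One operator-splitting step for u_t + a*u_x = D*u_xx - k*u on a periodic 1-D grid.
    Advection: move values along characteristics and project back to the grid.
    Diffusion: explicit central finite differences (needs D*dt/dx**2 <= 0.5).
    Reaction : linear decay du/dt = -k*u solved exactly over dt.
    """
    n = len(u)
    # 1) Advection along characteristics (semi-Lagrangian, periodic domain)
    departure = (x - velocity * dt) % (n * dx)
    u_adv = np.interp(departure, x, u, period=n * dx)
    # 2) Diffusion (explicit Euler on the grid)
    lap = (np.roll(u_adv, -1) - 2.0 * u_adv + np.roll(u_adv, 1)) / dx**2
    u_diff = u_adv + dt * diffusion * lap
    # 3) Reaction (exact update for linear decay)
    return u_diff * np.exp(-rate * dt)
```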

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this study was to explore a certain organization's product line rebranding process and its impact on the product line's perceived image. The case company is a global paper, packaging and forest products company; the business segment studied is paperboard. The audience explored is one of the company's major customers, a merchant in Germany. The research was performed as a descriptive case study with the purpose of providing longitudinal insight into the product line image and its possible alteration as a result of the case company's rebranding process. Mainly qualitative methods were used for conducting the research. The data for the empirical part was collected with a web-based survey at two points in time: before the rebranded products entered the market and after they had been available for approximately six months. The results of this study reveal that the case company has performed well in its attempt to improve the product line's brand image through rebranding. It was found that between the two brand image measurements the product brand image seems to have improved in all of the areas which, according to the theoretical framework of this study, contribute to the formation of brand image: brand associations, marketing communications and interpersonal relationships, as well as the original platform that initiated the change, namely technical quality modifications. In other words, it may be concluded that as the technical quality was brought to a new level, assessments of the brand image improved accordingly.

Relevance:

20.00%

Publisher:

Abstract:

Diabetes is a rapidly increasing worldwide problem which is characterised by defective metabolism of glucose that causes long-term dysfunction and failure of various organs. The most common complication of diabetes is diabetic retinopathy (DR), which is one of the primary causes of blindness and visual impairment in adults. The rapid increase of diabetes pushes the limits of current DR screening capabilities, for which digital imaging of the eye fundus (retinal imaging) and automatic or semi-automatic image analysis algorithms provide a potential solution. In this work, the use of colour in the detection of diabetic retinopathy is statistically studied using a supervised algorithm based on one-class classification and Gaussian mixture model estimation. The presented algorithm distinguishes a certain diabetic lesion type from all other possible objects in eye fundus images by estimating only the probability density function of that lesion type. For the training and ground truth estimation, the algorithm combines manual annotations of several experts, for which the best practices were experimentally selected. By assessing the algorithm's performance in experiments with colour space selection, illuminance and colour correction, and background class information, the use of colour in the detection of diabetic retinopathy was quantitatively evaluated. Another contribution of this work is the benchmarking framework for eye fundus image analysis algorithms needed for the development of automatic DR detection algorithms. The benchmarking framework provides guidelines on how to construct a benchmarking database that comprises true patient images, ground truth, and an evaluation protocol. The evaluation is based on standard receiver operating characteristic analysis and follows medical decision-making practice, providing protocols for image- and pixel-based evaluations. During the work, two public medical image databases with ground truth were published: DIARETDB0 and DIARETDB1. The framework, the DR databases and the final algorithm are made public on the web to set baseline results for the automatic detection of diabetic retinopathy. Although deviating from the general context of the thesis, a simple and effective optic disc localisation method is also presented. The optic disc localisation is discussed since normal eye fundus structures are fundamental in the characterisation of DR.
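A minimal sketch of the one-class idea described above: a Gaussian mixture model is fitted to colour samples taken from a single lesion type, and new pixels are flagged when their log-likelihood under that density exceeds a threshold. The RGB feature choice, the number of components, and the threshold value are illustrative assumptions (using scikit-learn's GaussianMixture), not the thesis' tuned configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_lesion_model(lesion_pixels, n_components=5, seed=0):
    """Fit a GMM to colour samples (N x 3 array) taken from annotated lesions."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                          random_state=seed)
    gmm.fit(lesion_pixels)
    return gmm

def detect_lesions(gmm, image, log_likelihood_threshold=-10.0):
    """Score every pixel of an (H, W, 3) fundus image and return a candidate mask."""
    h, w, _ = image.shape
    scores = gmm.score_samples(image.reshape(-1, 3).astype(float))
    return (scores >= log_likelihood_threshold).reshape(h, w)
```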

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises the extended character of representation. The human mind is not a passive receiver of external information, but actively constructs intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible in the Cartesian subject-object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which classically understood is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way to express knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God.
The reaction against epistemological Cartesianism, metaphysics of substance and deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Relevance:

20.00%

Publisher:

Abstract:

New luminometric particle-based methods were developed to quantify protein and to count cells. The developed methods rely on the interaction of the sample with nano- or microparticles and on different principles of detection. In the fluorescence quenching, time-resolved luminescence resonance energy transfer (TR-LRET), and two-photon excitation fluorescence (TPX) methods, the sample prevents the adsorption of labeled protein to the particles. Depending on the system, the addition of the analyte increases or decreases the luminescence. In the dissociation method, the adsorbed protein protects the Eu(III) chelate on the surface of the particles from dissociation at low pH. The experimental setups are user-friendly and rapid and do not require hazardous test compounds or elevated temperatures. The sensitivity of the protein quantification (from 40 to 500 pg of bovine serum albumin in a sample) was 20-500-fold better than that of the most sensitive commercial methods. The quenching method exhibited low protein-to-protein variability, and the dissociation method was insensitive to the assay contaminants commonly found in biological samples. Fewer than ten eukaryotic cells were detected and quantified with all the developed methods under optimized assay conditions. Furthermore, two applications, a method for detecting protein aggregation and a cell viability test, were developed utilizing the TR-LRET method. Protein aggregation could be detected at a more than 10,000 times lower concentration, 30 μg/L, than with the known methods of UV240 absorbance and dynamic light scattering. The TR-LRET method was combined with a nucleic acid assay using a cell-impermeable dye to measure the percentage of dead cells in a single-tube test with cell counts below 1000 cells/tube.