940 results for Lagrangian particle tracking method
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action-selection problem of an autonomous robot in a cable-tracking task. The learning system is characterized by its use of a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robots. To speed up the process, the learning phase was carried out in a simulated environment and, in a second step, the policy was transferred to and tested successfully on a real robot. Future work will continue the learning process online on the real robot while it performs the task. We demonstrate the feasibility of the approach with real experiments on the underwater robot ICTINEU AUV.
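As a rough, hypothetical sketch of the direct policy search idea (the paper does not publish code, and the environment interface here is an assumption), a REINFORCE-style gradient ascent on a Gaussian policy could look like this:

```python
import numpy as np

def reinforce(env, theta, episodes=500, alpha=0.01, gamma=0.99, sigma=0.1):
    """Direct policy search via a REINFORCE-style gradient estimate.

    theta parameterizes a Gaussian policy a ~ N(theta @ s, sigma^2);
    env is assumed to expose reset() -> s and step(a) -> (s, r, done).
    """
    for _ in range(episodes):
        s, grads, rewards, done = env.reset(), [], [], False
        while not done:
            mean = theta @ s
            a = np.random.normal(mean, sigma)
            # gradient of log N(a; theta @ s, sigma^2) w.r.t. theta
            grads.append(np.outer((a - mean) / sigma**2, s))
            s, r, done = env.step(a)
            rewards.append(r)
        G, returns = 0.0, []
        for r in reversed(rewards):          # discounted return per step
            G = r + gamma * G
            returns.append(G)
        returns.reverse()
        for g, G in zip(grads, returns):     # policy-gradient ascent step
            theta += alpha * g * G
    return theta
```

Training theta in a simulator and then copying it to the robot mirrors the two-step sim-to-real scheme the abstract describes.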
Abstract:
Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
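A minimal sketch of the per-point entropy measure described above (the paper's exact normalization may differ):

```python
import numpy as np

def tensor_entropy(eigvals):
    """Shannon entropy of the normalized diffusion-tensor eigenvalues,
    scaled to [0, 1]. Low entropy indicates strongly anisotropic
    diffusion (a coherent fibre direction); high entropy indicates
    near-isotropic diffusion, where tracking should be more cautious."""
    lam = np.clip(np.asarray(eigvals, dtype=float), 1e-12, None)
    p = lam / lam.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(p)))

# Example with typical white-matter vs. isotropic eigenvalues (mm^2/s)
print(tensor_entropy([1.7e-3, 3e-4, 2e-4]))   # low: coherent fibre
print(tensor_entropy([7e-4, 7e-4, 7e-4]))     # 1.0: fully isotropic
```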
Abstract:
Estimating the magnitude of Agulhas leakage, the volume flux of water from the Indian to the Atlantic Ocean, is difficult because of the presence of other circulation systems in the Agulhas region. Indian Ocean water in the Atlantic Ocean is vigorously mixed and diluted in the Cape Basin. Eulerian integration methods, where the velocity field perpendicular to a section is integrated to yield a flux, have to be calibrated so that only the flux by Agulhas leakage is sampled. Two Eulerian methods for estimating the magnitude of Agulhas leakage are tested within a high-resolution two-way nested model with the goal of devising a mooring-based measurement strategy. At the GoodHope line, a section halfway through the Cape Basin, the integrated velocity perpendicular to that line is compared to the magnitude of Agulhas leakage as determined from the transport carried by numerical Lagrangian floats. In the first method, integration is limited to the flux of water warmer and more saline than specific threshold values. These threshold values are determined by maximizing the correlation with the float-determined time series. By using the threshold values, approximately half of the leakage can be measured directly. The total amount of Agulhas leakage can be estimated using a linear regression, within a 90% confidence band of 12 Sv. In the second method, a subregion of the GoodHope line is sought so that integration over that subregion yields an Eulerian flux as close to the float-determined leakage as possible. It appears that when integration is limited within the model to the upper 300 m of the water column within 900 km of the African coast, the time series have the smallest root-mean-square difference. This method yields a root-mean-square error of only 5.2 Sv, but the 90% confidence band of the estimate is 20 Sv. It is concluded that the optimum thermohaline threshold method leads to more accurate estimates even though the directly measured transport is a factor of two lower than the actual magnitude of Agulhas leakage in this model.
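A schematic of the first (thermohaline threshold) method, assuming gridded section data (variable names and the regression step are illustrative, not the paper's code):

```python
import numpy as np

def threshold_transport(v, temp, salt, dA, t_min, s_min):
    """Eulerian transport across a section, counting only water warmer
    and more saline than the thresholds. v is the velocity normal to
    the section (m/s) and dA the matching cell areas (m^2)."""
    mask = (temp > t_min) & (salt > s_min)
    return np.sum(v * mask * dA) / 1e6   # Sverdrups (1 Sv = 1e6 m^3/s)

# The thresholds are tuned to maximize correlation with the
# float-derived leakage series; a linear regression then scales the
# partial (roughly half) measured flux up to a total-leakage estimate.
```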
Abstract:
The skill of numerical Lagrangian drifter trajectories in three numerical models is assessed by comparing these numerically obtained paths to the trajectories of drifting buoys in the real ocean. The skill assessment is performed using the two-sample Kolmogorov-Smirnov statistical test. To demonstrate the assessment procedure, it is applied to three different models of the Agulhas region. The test can either be performed using crossing positions of one-dimensional sections, in order to test model performance at specific locations, or using the total two-dimensional data set of trajectories. The test yields four quantities: a binary decision of model skill, a confidence level which can be used as a measure of goodness-of-fit of the model, a test statistic which can be used to determine the sensitivity of the confidence level, and cumulative distribution functions that aid in the qualitative analysis. The ordering of models by their confidence levels is the same as the ordering based on the qualitative analysis, which suggests that the method is suited for model validation. Only one of the three models, a 1/10° two-way nested regional ocean model, might have skill in the Agulhas region. The other two models, a 1/2° global model and a 1/8° assimilative model, might have skill only on some sections in the region.
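The core of such a skill test can be reproduced with a standard two-sample Kolmogorov-Smirnov routine; the arrays below are placeholders for the model and drifter crossing positions on one section:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
model_crossings = rng.uniform(10.0, 30.0, size=400)    # model floats (deg E)
drifter_crossings = rng.uniform(12.0, 28.0, size=150)  # real buoys (deg E)

stat, p_value = ks_2samp(model_crossings, drifter_crossings)
# Binary skill decision: fail to reject "same distribution" at the 5% level
print(f"D = {stat:.3f}, p = {p_value:.3f}, skill: {p_value > 0.05}")
```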
Abstract:
Four multiparous cows with cannulas in the rumen and proximal duodenum were used in early lactation in a 4 x 4 Latin square experiment to investigate the effect of method of application of a fibrolytic enzyme product on digestive processes and milk production. The cows were given ad libitum a total mixed ration (TMR) composed of 57% (dry matter basis) forage (3:1 corn silage:grass silage) and 43% concentrates. The TMR contained (g/kg dry matter): 274 neutral detergent fiber, 295 starch, 180 crude protein. Treatments were TMR alone or TMR with the enzyme product added (2 kg/1000 kg TMR dry matter) either sprayed on the TMR 1 h before the morning feed (TMR-E), sprayed only on the concentrate the day before feeding (Concs-E), or infused into the rumen for 14 h/d (Rumen-E). There was no significant effect on either feed intake or milk yield, but both were highest on TMR-E. Rumen digestibility of dry matter, organic matter, and starch was unaffected by the enzyme. Digestibility of NDF was lowest on TMR-E in the rumen but highest postruminally. Total tract digestibility was highest on TMR-E for dry matter, organic matter, and starch, but treatment differences were nonsignificant for neutral detergent fiber. Corn silage stover retention time in the rumen was reduced by all enzyme treatments, but postruminal transit time was increased, so the decline in total tract retention time with enzymes was not significant. It is suggested that the tendency for enzymes to reduce particle retention time in the rumen may, by reducing the time available for fibrolysis to occur, at least partly explain the variability in the reported responses to enzyme treatment.
Abstract:
In this work we describe the synthesis of a variety of MCM-41 type hexagonal and SBA-1 type cubic mesostructures and mesoporous siliceous materials employing a novel synthesis concept based on polyacrylic acid (Pac)-C(n)TAB complexes as backbones of the developing structures. The ordered porosity of the solids was established by XRD and TEM techniques. The synthesis concept makes use of Pac-C(n)TAB nanoassemblies as a preformed scaffold, formed by the gradual increase of pH. On this starting matrix the inorganic precursor species SiO2 precipitate via hydrolysis of TEOS under the influence of increasing pH. The molecular weight (MW) of Pac, as well as the length of the carbon chain in C(n)TAB, determines the physical and structural characteristics of the obtained materials. Longer-chain surfactants (C(16)TAB) lead to the formation of the hexagonal phase, while shorter-chain surfactants (C(14)TAB, C(12)TAB) favor the SBA-1 phase. Lower MW of Pac (approximately 2000) leads to better-organized structures compared to higher MW (450,000), which leads to worm-like mesostructures. Cell parameters and pore size increase with increasing polyelectrolyte and/or surfactant chain length, while at the same time SEM photography reveals that the particle size decreases. Conductivity experiments provide some insight into the proposed self-assembling pathway.
Abstract:
Templated sol-gel encapsulation of surfactant-stabilised micelles containing metal precursor(s) within an ultra-thin porous silica coating allows solvent extraction of the organic stabiliser from the composites in the colloidal state. On this basis, a new method of preparing supported alloy catalysts from inorganic silica-stabilised, nano-sized, homogeneously mixed silver-platinum (Ag-Pt) colloidal particles is reported.
Abstract:
Our eyes are input sensors which provide our brains with streams of visual data. They have evolved to be extremely efficient, and they constantly dart to and fro to rapidly build up a picture of the salient entities in a viewed scene. These actions are almost subconscious. However, they can provide telling signs of how the brain is decoding the visuals and can indicate emotional responses before the viewer becomes aware of them. In this paper we discuss a method of tracking a user's eye movements, and use these to calculate their gaze within an immersive virtual environment. We investigate how these gaze patterns can be captured and used to identify viewed virtual objects, and discuss how this can be used as a natural method of interacting with the virtual environment. We describe a flexible tool that has been developed to achieve this, and detail initial validating applications that prove the concept.
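A toy version of mapping a gaze ray to a viewed virtual object (a bounding-sphere pick; all names are hypothetical, and a production system would use the scene graph's own ray casting):

```python
import numpy as np

def gazed_object(origin, direction, objects):
    """Return the nearest object whose bounding sphere the gaze ray hits.
    objects: iterable of (name, centre, radius)."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    best_name, best_t = None, np.inf
    for name, centre, radius in objects:
        oc = np.asarray(centre, float) - origin
        t = oc @ d                    # distance along ray to closest approach
        if t < 0:
            continue                  # object is behind the viewer
        miss2 = oc @ oc - t * t       # squared ray-to-centre distance
        if miss2 <= radius**2 and t < best_t:
            best_name, best_t = name, t
    return best_name

print(gazed_object(np.zeros(3), [0, 0, -1],
                   [("teapot", [0.1, 0.0, -3.0], 0.5)]))  # -> "teapot"
```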
Abstract:
Most active-contour methods are based either on maximizing the image contrast under the contour or on minimizing the sum of squared distances between contour and image 'features'. The Marginalized Likelihood Ratio (MLR) contour model uses a contrast-based measure of goodness-of-fit for the contour and thus falls into the first class. The point of departure from previous models lies in marginalizing this contrast measure over unmodelled shape variations. The MLR model naturally leads to the EM Contour algorithm, in which pose optimization is carried out by iterated least squares, as in feature-based contour methods. The difference with respect to other feature-based algorithms is that the EM Contour algorithm minimizes squared distances from Bayes least-squares (marginalized) estimates of contour locations, rather than from the 'strongest features' in the neighborhood of the contour. Within the framework of the MLR model, alternatives to the EM algorithm can also be derived: one of these alternatives is the empirical-information method. Tracking experiments demonstrate the robustness of pose estimates given by the MLR model, and support the theoretical expectation that the EM Contour algorithm is more robust than either feature-based methods or the empirical-information method.
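The distinctive step is easy to isolate: along each measurement normal, the contour location is estimated by a posterior mean rather than by the strongest feature. A one-dimensional sketch (the prior width and likelihood values are placeholders, not the paper's):

```python
import numpy as np

def bayes_contour_point(offsets, likelihood, prior_sigma=3.0):
    """Bayes least-squares (posterior-mean) estimate of the contour
    location along one search normal, marginalizing over candidate
    feature positions instead of picking the strongest response."""
    prior = np.exp(-0.5 * (offsets / prior_sigma) ** 2)
    post = likelihood * prior
    post /= post.sum()
    return float((post * offsets).sum())

offsets = np.arange(-5, 6, dtype=float)   # pixels along the normal
likelihood = np.array([.1, .1, .2, .5, .9, 1., .9, .8, .2, .1, .1])
print(bayes_contour_point(offsets, likelihood))  # weighted, not argmax, pick
```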
Abstract:
We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners which operate on discriminative image-intensity features that are defined on small patches and are fast to compute. A database designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line-search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, which requires a ribbon-like search region and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking, and compare its performance with other existing methods for line-search boundary detection.
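A compact sketch of the line-search step given a classifier-derived probability image (the array layout and window size are assumptions):

```python
import numpy as np

def line_search_boundary(prob_map, p0, normal, half_width=10):
    """Slide along the normal to an initial boundary point p0 (row, col)
    and return the pixel with the highest texture-transition
    probability inside a +/- half_width window."""
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)
    best, best_p = np.asarray(p0, float), -np.inf
    for t in range(-half_width, half_width + 1):
        r, c = np.round(np.asarray(p0) + t * n).astype(int)
        if 0 <= r < prob_map.shape[0] and 0 <= c < prob_map.shape[1]:
            if prob_map[r, c] > best_p:
                best_p, best = prob_map[r, c], np.array([r, c])
    return best
```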
Abstract:
Recent literature has described a transition zone between the average top of deep convection in the Tropics and the stratosphere. Here transport across this zone is investigated using an offline trajectory model. Particles were advected by the resolved winds from the European Centre for Medium-Range Weather Forecasts reanalyses. For each boreal winter, clusters of particles were released in the upper troposphere over the four main regions of tropical deep convection (Indonesia, central Pacific, South America, and Africa). Most particles remain in the troposphere, descending on average for every cluster. The horizontal components of 5-day trajectories are strongly influenced by the El Niño–Southern Oscillation (ENSO), but the Lagrangian average descent does not have a clear ENSO signature. Tropopause crossing locations are first identified by recording events when trajectories from the same release regions cross the World Meteorological Organization lapse-rate tropopause. Most crossing events occur 5–15 days after release, and 30-day trajectories are sufficiently long to estimate crossing number densities. In a further two experiments, slight excursions across the lapse-rate tropopause are differentiated from the drift deeper into the stratosphere by defining the tropopause zone as a layer bounded by the average potential temperature of the lapse-rate tropopause and the profile temperature minimum. Transport upward across this zone is studied using forward trajectories released from the lower bound and back trajectories arriving at the upper bound. Histograms of particle potential temperature (θ) show marked differences between the transition zone, where there is a slow spread in θ values about a peak that shifts slowly upward, and the troposphere below 350 K. There, forward trajectories experience slow radiative cooling interspersed with bursts of convective heating, resulting in a well-mixed distribution. In contrast, histograms for back trajectories arriving in the stratosphere have two distinct peaks just above 300 and 350 K, indicating the sharp change from rapid convective heating in the well-mixed troposphere to slow ascent in the transition zone. Although trajectories slowly cross the tropopause zone throughout the Tropics, all three experiments show that most trajectories reaching the stratosphere from the lower troposphere within 30 days do so over the west Pacific warm pool. This preferred location moves about 30°–50° farther east in an El Niño year (1982/83) and about 30° farther west in a La Niña year (1988/89). These results could have important implications for upper-troposphere–lower-stratosphere pollution and chemistry studies.
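For reference, the potential temperature used for these histograms follows the standard dry-air definition (the example parcel values are illustrative):

```python
def potential_temperature(T, p, p0=1000.0, kappa=0.2854):
    """theta = T * (p0 / p)**kappa with T in kelvin and p in hPa;
    kappa = R_d / c_p for dry air."""
    return T * (p0 / p) ** kappa

# A parcel near the tropical tropopause (100 hPa, 195 K) sits near
# theta = 376 K, above the 350 K level that separates the well-mixed
# troposphere from the transition zone in the histograms.
print(potential_temperature(195.0, 100.0))
```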
Abstract:
Radial basis functions can be combined into a network structure that has several advantages over conventional neural network solutions. However, to operate effectively the number and positions of the basis function centres must be carefully selected. Although no rigorous algorithm exists for this purpose, several heuristic methods have been suggested. In this paper a new method is proposed in which radial basis function centres are selected by the mean-tracking clustering algorithm. The mean-tracking algorithm is compared with k-means clustering, and it is shown that it achieves significantly better results in terms of radial basis function performance. As well as being computationally simpler, the mean-tracking algorithm in general selects better centre positions, thus providing the radial basis functions with better modelling accuracy.
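The mean-tracking algorithm itself is specific to the paper, but the way any clustering stage feeds the network is generic; a minimal sketch with Gaussian basis functions and least-squares output weights:

```python
import numpy as np

def rbf_design_matrix(X, centres, width):
    """Gaussian RBF activations of inputs X (n, d) at the given centres."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf_weights(X, y, centres, width):
    """Least-squares output weights once the centres are fixed; the fit
    quality hinges on centre placement, which is exactly what the
    clustering stage (mean-tracking vs. k-means) decides."""
    Phi = rbf_design_matrix(X, centres, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w
```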
Abstract:
A distributed Lagrangian moving-mesh finite element method is applied to problems involving changes of phase. The algorithm uses a distributed conservation principle to determine nodal mesh velocities, which are then used to move the nodes. The nodal values are obtained from an ALE (Arbitrary Lagrangian-Eulerian) equation, which represents a generalization of the original algorithm presented in Applied Numerical Mathematics, 54:450--469 (2005). After the details of the generalized algorithm are described, it is validated on two test cases from the original paper and then applied to one-phase and, for the first time, two-phase Stefan problems in one and two space dimensions, paying particular attention to the implementation of the interface boundary conditions. Results are presented to demonstrate the accuracy and effectiveness of the method, including comparisons against analytical solutions where available.
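The paper derives nodal velocities from the conservation principle and updates nodal values through an ALE equation; as a static one-dimensional illustration of the conservation idea only (assuming a positive solution u), nodes can be relocated so that each keeps its original fraction of the total integral of u:

```python
import numpy as np

def conservative_node_positions(x, u, fractions):
    """Place nodes so that node i holds the stored fraction of the total
    'mass' integral of u (trapezoidal rule, u > 0 assumed)."""
    cell = 0.5 * (u[1:] + u[:-1]) * np.diff(x)   # per-cell mass
    cum = np.concatenate([[0.0], np.cumsum(cell)])
    cum /= cum[-1]                               # cumulative fraction at nodes
    return np.interp(fractions, cum, x)          # invert the mass CDF
```

In the actual method the same principle is applied in differential form, yielding mesh velocities rather than direct relocation.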
Abstract:
This paper presents an enhanced hypothesis verification strategy for 3D object recognition. A new learning methodology is presented which integrates the traditional dichotomic object-centred and appearance-based representations in computer vision giving improved hypothesis verification under iconic matching. The "appearance" of a 3D object is learnt using an eigenspace representation obtained as it is tracked through a scene. The feature representation implicitly models the background and the objects observed enabling the segmentation of the objects from the background. The method is shown to enhance model-based tracking, particularly in the presence of clutter and occlusion, and to provide a basis for identification. The unified approach is discussed in the context of the traffic surveillance domain. The approach is demonstrated on real-world image sequences and compared to previous (edge-based) iconic evaluation techniques.
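A hypothetical sketch of the eigenspace ("appearance") stage, learned from patches gathered while the object is tracked (PCA via SVD; names are illustrative):

```python
import numpy as np

def build_eigenspace(patches, k=10):
    """Learn a k-dimensional appearance eigenspace from vectorized
    image patches collected along the track (one patch per row)."""
    mean = patches.mean(axis=0)
    _, _, Vt = np.linalg.svd(patches - mean, full_matrices=False)
    return mean, Vt[:k]

def appearance_error(patch, mean, basis):
    """Reconstruction error of a candidate patch in the eigenspace; a
    low error supports the hypothesis that the patch shows the learned
    object rather than background or clutter."""
    coeffs = basis @ (patch - mean)
    recon = mean + basis.T @ coeffs
    return float(np.linalg.norm(patch - recon))
```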
Abstract:
Accurate estimates of the fall speed of natural hydrometeors are vital if their evolution in clouds is to be understood quantitatively. In this study, laboratory measurements of the terminal velocity vt for a variety of ice particle models settling in viscous fluids, along with wind-tunnel and field measurements of ice particles settling in air, have been analyzed and compared to common methods of computing vt from the literature. It is observed that while these methods work well for a number of particle types, they fail for particles with open geometries, specifically those particles for which the area ratio Ar is small (Ar is defined as the area of the particle projected normal to the flow divided by the area of a circumscribing disc). In particular, the fall speeds of stellar and dendritic crystals, needles, open bullet rosettes, and low-density aggregates are all overestimated. These particle types are important in many cloud types: aggregates in particular often dominate snow precipitation at the ground and vertically pointing Doppler radar measurements. Based on the laboratory data, a simple area-ratio-based modification to previous computational methods is proposed. This new method collapses the available drag data onto an approximately universal curve, and the resulting errors in the computed fall speeds relative to the tank data are less than 25% in all cases. Comparison with the (much more scattered) measurements of ice particles falling in air shows strong support for this new method, with the area ratio bias apparently eliminated.
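The area ratio at the heart of the correction is straightforward to compute from the abstract's definition; the exact modification to the drag relation follows the paper itself, and the example values here are hypothetical:

```python
import numpy as np

def area_ratio(projected_area, max_dimension):
    """Ar: particle area projected normal to the flow divided by the
    area of the circumscribing disc of diameter max_dimension."""
    return projected_area / (np.pi * (max_dimension / 2.0) ** 2)

# A 3 mm dendrite with 1.5 mm^2 projected area has Ar ~ 0.21, i.e. an
# 'open' geometry for which unmodified fall-speed methods overestimate vt.
print(area_ratio(1.5e-6, 3.0e-3))
```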