52 results for PARTIAL-FILLING TECHNIQUE


Relevance:

80.00%

Abstract:

We extend the partial resummation technique of Fokker-Planck terms for multivariable stochastic differential equations with colored noise. As an example, a model system of a Brownian particle with colored noise is studied. We prove that the asymmetric behavior found in analog simulations is due to higher-order terms that are left out of that technique. By contrast, the systematic ¿-expansion approach can explain the analog results.
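The setting above can be illustrated with a minimal numerical sketch. This is not the paper's model: it simulates a generic overdamped linear particle driven by an Ornstein-Uhlenbeck process, the standard representation of exponentially correlated (colored) noise; the function and parameter names are our own.

```python
import numpy as np

def simulate(tau=0.1, D=1.0, dt=1e-3, n_steps=200_000, seed=0):
    """Euler-Maruyama integration of a linear particle driven by
    Ornstein-Uhlenbeck (colored) noise:
        dx/dt  = -x + eta
        d(eta) = -(eta / tau) dt + (sqrt(2 D) / tau) dW
    The noise eta has correlation time tau and stationary variance D / tau."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    eta = np.empty(n_steps)
    x[0] = eta[0] = 0.0
    amp = np.sqrt(2.0 * D) / tau * np.sqrt(dt)  # noise increment amplitude
    for i in range(1, n_steps):
        eta[i] = eta[i - 1] - (eta[i - 1] / tau) * dt + amp * rng.standard_normal()
        x[i] = x[i - 1] + (-x[i - 1] + eta[i - 1]) * dt
    return x, eta
```

In the white-noise limit tau → 0 (D fixed) eta approaches delta-correlated noise; keeping tau finite is what produces the colored-noise corrections the abstract discusses.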

Relevance:

20.00%

Abstract:

The filling length of an edge-circuit η in the Cayley 2-complex of a finite presentation of a group is the minimal integer L such that there is a combinatorial null-homotopy of η down to a base point through loops of length at most L. We introduce similar notions in which the null-homotopy is not required to fix a base point, and in which the contracting loop is allowed to bifurcate. We exhibit a group in which the resulting filling invariants display dramatically different behaviour to the standard notion of filling length. We also define the corresponding filling invariants for Riemannian manifolds and translate our results to this setting.

Relevance:

20.00%

Abstract:

Lean meat percentage (LMP) is an important carcass quality parameter. The aim of this work is to obtain a calibration equation for Computed Tomography (CT) scans using the Partial Least Squares (PLS) regression technique in order to predict the LMP of the carcass and its different cuts, and to study and compare two methodologies for selecting the variables (Variable Importance for Projection, VIP, and stepwise) to be included in the prediction equation. The root mean squared error of prediction with cross-validation (RMSEPCV) of the LMP obtained with PLS and VIP-based selection was 0.82%, and with stepwise selection it was 0.83%. Predicting the LMP by scanning only the ham gave an RMSEPCV of 0.97%; if both the ham and the loin were scanned, the RMSEPCV was 0.90%. The results indicate that both VIP and stepwise selection are good methods for CT data. Moreover, scanning only the ham allowed us to obtain a good prediction of the LMP of the whole carcass.
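The PLS-plus-VIP workflow described above can be sketched in a few lines. The code below is a generic illustration, not the authors' calibration: it fits a single-response PLS model with the NIPALS algorithm on synthetic data and computes the standard VIP scores. The function name and the data are our own assumptions; the VIP formula used is the common one in which the squared scores average to 1.

```python
import numpy as np

def pls1_vip(X, y, n_comp=2):
    """Single-response PLS (NIPALS) plus Variable Importance in Projection.
    VIP_j = sqrt(p * sum_a SS_a w_ja^2 / sum_a SS_a), where SS_a is the
    y-variance captured by component a; the squared VIPs average to 1."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    n, p = X.shape
    W = np.zeros((p, n_comp))
    ss = np.zeros(n_comp)
    for a in range(n_comp):
        w = X.T @ y
        w /= np.linalg.norm(w)          # unit-norm weight vector
        t = X @ w                       # scores
        tt = t @ t
        p_load = (X.T @ t) / tt         # X loadings
        q = (y @ t) / tt                # y loading (scalar)
        X = X - np.outer(t, p_load)     # deflate X
        y = y - q * t                   # deflate y
        W[:, a] = w
        ss[a] = q * q * tt              # y-variance explained by component a
    return np.sqrt(p * (W ** 2 @ ss) / ss.sum())
```

Variables with VIP above 1 are conventionally retained, which is the kind of selection compared against stepwise regression in the abstract.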

Relevance:

20.00%

Abstract:

This paper explores two major issues from biophysical and historical viewpoints. We examine land management, which we define as the long-term maintenance of land fertility in relation to agriculture, fishery and forestry. We also explore humans' positive role as agents aiming to reinforce harmonious material circulation within the land. Liebig's view of nature, agriculture and land emphasizes the maintenance of long-term land fertility, based on his agronomical thought that the circulation of matter in agricultural fields must be maintained with manure as much as possible. The thoughts of several classical economists on nature, agriculture and land are reassessed from Liebig's viewpoint. The land management problem is then discussed at a more fundamental level in order to understand the necessary conditions for life in relation to land management. This point is analyzed in terms of two mechanisms: entropy disposal on the earth, and material circulation against the gravitational field. Finally, the historical example of the metropolis of Edo shows that there is yet another necessary condition for the sustainable management of land, based on the creation of harmonious material cycles among cities, farmland, forests and surrounding sea areas, in which humans play a vital role as agents.

Relevance:

20.00%

Abstract:

Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding if the pebbling number is at most k is Π₂P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than the previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
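The supply-and-demand formulation above can be made concrete with a tiny brute-force checker. This is not the paper's Weight Function Lemma or its linear programs; it is a naive state-space search, written by us for illustration, that is only feasible for very small graphs.

```python
from itertools import combinations_with_replacement

def can_reach(dist, root, edges):
    """Return True if pebbling moves starting from `dist` (a tuple of
    per-vertex pebble counts) can place a pebble on `root`.  A move removes
    two pebbles from one endpoint of an edge and adds one at the other."""
    seen = set()
    stack = [dist]
    while stack:
        d = stack.pop()
        if d[root] >= 1:
            return True
        if d in seen:
            continue
        seen.add(d)
        for u, v in edges:
            for a, b in ((u, v), (v, u)):
                if d[a] >= 2:
                    nd = list(d)
                    nd[a] -= 2
                    nd[b] += 1
                    stack.append(tuple(nd))
    return False

def pebbling_number(n, edges):
    """Smallest t such that every distribution of t pebbles on n vertices
    can deliver a pebble to every root (exhaustive; tiny graphs only)."""
    t = 1
    while True:
        if all(
            can_reach(tuple(comb.count(v) for v in range(n)), root, edges)
            for comb in combinations_with_replacement(range(n), t)
            for root in range(n)
        ):
            return t
        t += 1
```

For instance, the 3-vertex path has pebbling number 4 (three pebbles stacked on one end cannot reach the far end), while the triangle has pebbling number 3.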

Relevance:

20.00%

Abstract:

This paper presents the implementation details of a coded structured light system for rapid shape acquisition of unknown surfaces. Such techniques are based on projecting patterns onto a measuring surface and grabbing images of every projection with a camera. By analyzing the pattern deformations that appear in the images, 3D information about the surface can be calculated. The implemented technique projects a single pattern, so it can be used to measure moving surfaces. The pattern is a grid in which the colors of the slits are selected using a De Bruijn sequence. Moreover, since both axes of the pattern are coded, the cross points of the grid have two codewords (which permits reconstructing them very precisely), while pixels belonging to horizontal and vertical slits also have a codeword. Different sets of colors are used for horizontal and vertical slits, so the resulting pattern is invariant to rotation. Therefore, the alignment constraint between camera and projector assumed by many authors is not necessary.
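The slit coding relies on the defining property of a De Bruijn sequence: over k colors, every window of n consecutive symbols appears exactly once, so observing any n adjacent slits identifies their position in the pattern. Below is the standard Lyndon-word (FKM) construction, a generic sketch with colors represented as integers; mapping them to actual projector colors is outside its scope.

```python
def de_bruijn(k, n):
    """Cyclic De Bruijn sequence B(k, n) of length k**n: every length-n
    string over {0..k-1} appears exactly once as a cyclic window.
    Standard FKM (Lyndon-word concatenation) construction."""
    a = [0] * k * n
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:          # emit each Lyndon word whose length divides n
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq
```

For example, de_bruijn(3, 2) yields a cyclic sequence of length 3² = 9 in which each ordered pair of colors occurs exactly once.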

Relevance:

20.00%

Abstract:

In this paper, different recovery methods applied at different network layers and time scales are used to enhance network reliability. Each layer deploys its own fault management methods; however, current recovery methods are applied to only a specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid protection duplication in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared risk link group identification. A complete set of experiments proves the efficiency of the proposed methods relative to previous ones in terms of the resources used to protect the network, the failure recovery time and the request rejection ratio.
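The abstract's partial disjoint path algorithm is not specified here, but the fully disjoint baseline it relaxes is easy to sketch. The code below, our own illustration, computes a working path with Dijkstra and then a link-disjoint backup by banning the working path's links and searching again; this two-step heuristic is not Suurballe's optimal algorithm, and the adjacency-dict graph format is an assumption.

```python
import heapq

def shortest_path(adj, src, dst, banned=frozenset()):
    """Dijkstra over an adjacency dict {u: {v: cost}}, skipping links in
    `banned` (a set of (u, v) pairs, treated as undirected). Returns the
    path as a vertex list, or None if dst is unreachable."""
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:                      # reconstruct and return the path
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v, c in adj[u].items():
            if (u, v) in banned or (v, u) in banned:
                continue
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    return None

def primary_and_backup(adj, src, dst):
    """Working path plus a link-disjoint backup: ban the primary's links
    and re-run Dijkstra."""
    primary = shortest_path(adj, src, dst)
    if primary is None:
        return None, None
    banned = {(a, b) for a, b in zip(primary, primary[1:])}
    return primary, shortest_path(adj, src, dst, banned)
```

A shared risk link group can be handled the same way: when any link of a group appears in the primary path, add every link of that group to the banned set before computing the backup.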

Relevance:

20.00%

Abstract:

In this report we present the growth process of the cobalt oxide system using reactive electron beam deposition. In this technique, a target of metallic cobalt is evaporated and its atoms are oxidized in flight in an oxygen-rich reactive atmosphere before reaching the surface of the substrate. Through a trial-and-error procedure, the deposition parameters were optimized to obtain the correct stoichiometry and crystalline phase. The evaporation conditions to achieve the correct cobalt oxide rock salt structure, when evaporating over amorphous silicon nitride, are: a substrate temperature of 525 K, an oxygen partial pressure of 2.5·10⁻⁴ mbar and an evaporation rate of 1 Å/s. Once the parameters were optimized, a set of ultrathin films ranging from 1 nm to 20 nm of nominal thickness, as well as bulk samples, were grown. To characterize the samples and study their microstructure and morphology, X-ray diffraction, transmission electron microscopy, electron diffraction, energy dispersive X-ray spectroscopy and quasi-adiabatic nanocalorimetry were used. The final results show a size-dependent effect on the antiferromagnetic transition: the Néel temperature becomes depressed as the size of the grains forming the layer decreases.

Relevance:

20.00%

Abstract:

In the present work, microstructure improvement using FSP (Friction Stir Processing) is studied. In the first part of the work, the microstructure improvement of as-cast A356 is demonstrated. Tensile tests were applied to check the increase in ductility; however, the expected results could not be achieved. In the second part, the microstructure improvement of a fusion weld in 1050 aluminium alloy is presented. Hardness tests were carried out to prove the mechanical property improvements. In the third and last part, the microstructure improvement of 1050 aluminium alloy is achieved, and the mechanical property improvements induced by FSP are discussed. The influence of tool traverse speed on microstructure and mechanical properties is also discussed; hardness tests and recrystallization theory enabled us to determine this influence.

Relevance:

20.00%

Abstract:

Customer satisfaction and retention are key issues for organizations in today's competitive marketplace. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the instigation of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation modelling (SEM) because it does not rely on strict assumptions about the data. However, this choice was based upon some misconceptions about the use of SEM and does not take into consideration more recent advances, including estimation methods that are robust to non-normality and missing data. In this paper, the SEM and PLS approaches were compared by evaluating perceptions of the Isle of Man Post Office products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.

Relevance:

20.00%

Abstract:

In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs, which shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene similar, thus avoiding the occlusions and illumination changes that are the main disadvantages of the commonly accepted large-baseline configuration. Two crucial technological challenges remain: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first is solved here by a piecewise affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact, this theory allows us to reduce the number of parameters to be adjusted and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition, we propose an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of the variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
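The piecewise affine representation means each segmented region of the DEM is modeled by a plane z = a·x + b·y + c. The segmentation itself (the perceptual-grouping step) is the paper's contribution and is not reproduced here; the snippet below, with names and data of our own choosing, only shows the per-region least-squares fit.

```python
import numpy as np

def fit_affine(xy, z):
    """Least-squares affine elevation model z ≈ a*x + b*y + c for one
    DEM region.  xy: (m, 2) array of pixel coordinates; z: (m,) elevations.
    Returns the coefficients (a, b, c)."""
    A = np.column_stack([xy, np.ones(len(xy))])   # design matrix [x, y, 1]
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef
```

Choosing between this affine model and a correlation-based fallback for, e.g., vegetation-dominated regions is exactly the model-selection decision the Gestalt framework is said to automate.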

Relevance:

20.00%

Abstract:

We show that if performance measures in a stochastic scheduling problem satisfy a set of so-called partial conservation laws (PCL), which extend previously studied generalized conservation laws (GCL), then the problem is solved optimally by a priority-index policy for an appropriate range of linear performance objectives, where the optimal indices are computed by a one-pass adaptive-greedy algorithm based on Klimov's. We further apply this framework to investigate the indexability property of restless bandits introduced by Whittle, obtaining the following results: (1) we identify a class of restless bandits (PCL-indexable) which are indexable; membership in this class is tested through a single run of the adaptive-greedy algorithm, which also computes the Whittle indices when the test is positive; this provides a tractable sufficient condition for indexability; (2) we further identify the class of GCL-indexable bandits, which includes classical bandits, having the property that they are indexable under any linear reward objective. The analysis is based on the so-called achievable region method, as the results follow from new linear programming formulations for the problems investigated.

Relevance:

20.00%

Abstract:

Weather radar observations are currently the most reliable method for remote sensing of precipitation. However, a number of factors affect the quality of radar observations and may seriously limit automated quantitative applications of radar precipitation estimates, such as those required in Numerical Weather Prediction (NWP) data assimilation or in hydrological models. In this paper, a technique to correct two different problems typically present in radar data is presented and evaluated. The aspects dealt with are non-precipitating echoes (caused either by permanent ground clutter or by anomalous propagation of the radar beam, known as anaprop echoes) and topographical beam blockage. The correction technique is based on the computation of realistic beam propagation trajectories from recent radiosonde observations instead of assuming standard radio propagation conditions. The correction consists of three steps: 1) calculation of a Dynamic Elevation Map, which provides the minimum clutter-free antenna elevation for each pixel within the radar coverage; 2) correction of residual anaprop by checking the vertical reflectivity gradients within the radar volume; and 3) topographical beam blockage estimation and correction using a geometric optics approach. The technique is evaluated with four case studies in the region of the Po Valley (N Italy) using a C-band Doppler radar and a network of raingauges providing hourly precipitation measurements. The case studies cover different seasons, different radio propagation conditions, and both stratiform and convective precipitation events. After applying the proposed correction, a comparison of the radar precipitation estimates with raingauges indicates a general reduction in both the root mean squared error and the fractional error variance, indicating the efficiency and robustness of the procedure.
Moreover, the technique is not computationally expensive, so it seems well suited to implementation in an operational environment.
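For context, the "standard radio propagation conditions" that the proposed technique replaces are usually the effective-earth-radius model with k = 4/3 (the textbook Doviak and Zrnić formulation). The sketch below implements only that baseline; the paper's radiosonde-derived trajectories are not reproduced, and the parameter names are our own.

```python
import math

def beam_height(r_km, elev_deg, h0_km=0.0, Re_km=6371.0, k=4/3):
    """Height (km) of the radar beam centre at slant range r_km for antenna
    elevation elev_deg and antenna altitude h0_km, under the
    effective-earth-radius model (k = 4/3 for standard refraction):
        h = sqrt(r^2 + R'^2 + 2 r R' sin(theta)) - R' + h0,   R' = k * Re
    """
    Rp = k * Re_km
    th = math.radians(elev_deg)
    return math.sqrt(r_km ** 2 + Rp ** 2 + 2 * r_km * Rp * math.sin(th)) - Rp + h0_km
```

An anaprop check then amounts to comparing these idealized beam heights with trajectories integrated from the actual radiosonde refractivity profile, which is the step the abstract's technique performs.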

Relevance:

20.00%

Abstract:

An experimental method for studying shifts between concentration-versus-depth profiles of vacancy- and interstitial-type defects in ion-implanted silicon is demonstrated. The concept is based on deep level transient spectroscopy measurements utilizing the filling pulse variation technique. The vacancy profile, represented by the vacancy-oxygen center, and the interstitial profile, represented by the interstitial carbon-substitutional carbon pair, are obtained at the same sample temperature by varying the duration of the filling pulse. The effect of capture in the Debye tail has been extensively studied and taken into account. Thus, the two profiles can be recorded with a high relative depth resolution. Using low doses, point defects have been introduced in lightly doped float-zone n-type silicon by implantation with 6.8 MeV boron ions and with 680 keV and 1.3 MeV protons at room temperature. The effect of the angle of ion incidence has also been investigated. For all implantation conditions the peak of the interstitial profile is displaced towards larger depths compared to that of the vacancy profile. The amplitude of this displacement increases as the width of the initial point defect distribution increases. This behavior is explained by a simple model in which the preferential forward momentum of recoiling silicon atoms and the highly efficient direct recombination of primary point defects are taken into account.
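The filling pulse variation technique exploits the standard exponential capture kinetics of a trap during the pulse. The relation below is the textbook DLTS expression, stated with generic symbols that are not necessarily the authors' notation:

```latex
n_T(t_p) = N_T \left[ 1 - \exp(-c_n t_p) \right],
\qquad
c_n = \sigma_n \,\langle v_{\mathrm{th}} \rangle\, n
```

where n_T is the filled-trap concentration after a pulse of duration t_p, N_T the total trap concentration, σ_n the electron capture cross-section, ⟨v_th⟩ the thermal velocity and n the local free-electron concentration. Because different defects have different capture rates c_n, varying t_p at a fixed temperature separates their responses, which is what allows the vacancy and interstitial profiles to be recorded in the same measurement run.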