1000 results for Saline flotation method
Abstract:
In this paper, an extension of the multi-scale finite-volume (MSFV) method is devised, which allows one to simulate flow and transport in reservoirs with complex well configurations. The new framework fits nicely into the data structure of the original MSFV method, and has the important property that large patches covering the whole well are not required. For each well, an additional degree of freedom is introduced. While the treatment of pressure-constrained wells is trivial (the well-bore reference pressure is explicitly specified), additional equations have to be solved to obtain the unknown well-bore pressure of rate-constrained wells. Numerical simulations of test cases with multiple complex wells demonstrate the ability of the new algorithm to capture the interference between the various wells and the reservoir accurately. (c) 2008 Elsevier Inc. All rights reserved.
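To make the extra well degree of freedom concrete, the minimal single-phase sketch below (plain Python/NumPy, not the MSFV algorithm itself) adds one unknown, the well-bore pressure, to a small grid system and one extra equation forcing the sum of the perforation rates of a rate-constrained well to equal the prescribed rate. All names and values (n, trans, well_cells, WI, q_target) are illustrative assumptions.

import numpy as np

# Minimal single-phase sketch: one rate-constrained well coupled to a 1-D grid
# via a Peaceman-type well index. This is NOT the MSFV algorithm; it only
# illustrates the extra well degree of freedom mentioned in the abstract.
n = 10                      # grid cells
trans = np.ones(n - 1)      # inter-cell transmissibilities
well_cells = [3, 4, 5]      # cells perforated by the well (assumption)
WI = 0.5                    # well index, same for every perforation here
q_target = 1.0              # prescribed total well rate (rate constraint)

A = np.zeros((n + 1, n + 1))   # n cell pressures + 1 well-bore pressure
b = np.zeros(n + 1)

# reservoir flow equations (two-point flux approximation)
for i in range(n - 1):
    A[i, i] += trans[i];         A[i, i + 1] -= trans[i]
    A[i + 1, i + 1] += trans[i]; A[i + 1, i] -= trans[i]

# fix the pressure in the first cell so the system is non-singular
A[0, :] = 0.0; A[0, 0] = 1.0; b[0] = 0.0

# well-cell coupling: perforation rate q = WI * (p_well - p_cell)
w = n                            # index of the well degree of freedom
for c in well_cells:
    A[c, c] += WI
    A[c, w] -= WI
    A[w, c] -= WI
    A[w, w] += WI
b[w] = q_target                  # extra equation: sum of perforation rates

p = np.linalg.solve(A, b)
print("well-bore pressure:", p[w])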
Abstract:
We continue the development of a method for the selection of a bandwidth or a number of design parameters in density estimation. We provide explicit non-asymptotic density-free inequalities that relate the $L_1$ error of the selected estimate with that of the best possible estimate, and study in particular the connection between the richness of the class of density estimates and the performance bound. For example, our method allows one to pick the bandwidth and kernel order in the kernel estimate simultaneously and still assure that for {\it all densities}, the $L_1$ error of the corresponding kernel estimate is not larger than about three times the error of the estimate with the optimal smoothing factor and kernel plus a constant times $\sqrt{\log n/n}$, where $n$ is the sample size, and the constant only depends on the complexity of the family of kernels used in the estimate. Further applications include multivariate kernel estimates, transformed kernel estimates, and variable kernel estimates.
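For readers who want to see the $L_1$ criterion that the bound refers to in computable form, the sketch below numerically evaluates the $L_1$ error of Gaussian kernel estimates for several bandwidths against a known density. It is an illustration only, not the selection procedure of the paper; the sample size and bandwidth grid are arbitrary choices.

import numpy as np

# Illustration only: L1 error of Gaussian kernel density estimates for a few
# bandwidths, measured against a known (standard normal) density on a grid.
rng = np.random.default_rng(0)
n = 500
data = rng.standard_normal(n)
x = np.linspace(-5, 5, 2001)
dx = x[1] - x[0]
true_density = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def kde(points, grid, h):
    """Gaussian kernel density estimate with bandwidth h."""
    u = (grid[:, None] - points[None, :]) / h
    return np.exp(-u**2 / 2).sum(axis=1) / (len(points) * h * np.sqrt(2 * np.pi))

for h in [0.05, 0.1, 0.2, 0.4, 0.8]:
    l1 = np.sum(np.abs(kde(data, x, h) - true_density)) * dx
    print(f"bandwidth {h:4.2f}  L1 error {l1:.3f}")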
Abstract:
A new method for rearing Spodoptera frugiperda in the laboratory shows that larval cannibalism is not obligatory. Here we show, for the first time, that larvae of the fall armyworm (FAW), Spodoptera frugiperda (Lepidoptera, Noctuidae), can be successfully reared in a cohort-based manner with virtually no cannibalism. FAW larvae were reared from the second instar to pupation in rectangular plastic containers holding 40 individuals, with a surprisingly high larval survivorship of ca. 90%. Adult females from the cohort-based method showed fecundity similar to that already reported in the literature for larvae reared individually, and fertility higher than 99%, with the advantage of combining economy of time, space and material resources. These findings suggest that the factors affecting cannibalism of FAW larvae in laboratory rearings need to be re-evaluated, while the new technique also shows potential to increase the efficiency of both small-scale and mass FAW rearings.
Abstract:
A significant postoperative problem in patients undergoing excision of intramedullary tumors is painful dysesthesiae, attributed to various causes, including edema, arachnoid scarring and cord tethering. The authors describe a technique of welding the pia and arachnoid after the excision of intramedullary spinal cord tumors, used in seven cases. Using fine bipolar forceps and a low current, the pial edges of the myelotomy were brought together and welded under saline irrigation. A similar method was used for closing the arachnoid, while the dura was closed with a running 5-0 Vicryl suture. Closing the pia and arachnoid restores normal cord anatomy after tumor excision and may reduce the incidence of postoperative painful dysesthesiae.
Abstract:
This paper describes an optimized model to support QoS by means of congestion minimization on LSPs (Label Switched Paths). To build this model, we start from a CFA (Capacity and Flow Allocation) model. As that model does not consider the buffer size when calculating the capacity cost, our model, named BCA (Buffer Capacity Allocation), takes this issue into account and improves on the CFA performance. To test our proposal, we performed several simulations; the results show that the BCA model minimizes LSP congestion and distributes flows uniformly over the network.
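As a hedged illustration of why buffer size matters for congestion on a link, the classical M/M/1/K blocking probability can be evaluated for different system capacities. This standard queueing formula is only a stand-in to make the abstract's point concrete; it is not the CFA or BCA model itself.

# Blocking probability of an M/M/1/K queue (K = buffer plus server) as a
# function of K, for a fixed utilisation. Illustrative only, not the BCA model.
def mm1k_blocking(rho, k):
    """Blocking probability of an M/M/1/K queue with utilisation rho (!= 1)."""
    return (1.0 - rho) * rho**k / (1.0 - rho**(k + 1))

for k in [2, 5, 10, 20]:
    print(f"capacity K={k:2d}  blocking at rho=0.8: {mm1k_blocking(0.8, k):.4f}")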
Abstract:
Endogenous glucose production rate (EGPR) remains constant when lactate is infused in healthy humans. A decrease of glycogenolysis or of gluconeogenesis from endogenous precursors, or a stimulation of glycogen synthesis, may all be involved. This autoregulation does not depend on changes in glucoregulatory hormones; it may be speculated that alterations in basal sympathetic tone are involved. To gain insight into the mechanisms responsible for the autoregulation of EGPR, glycogenolysis and gluconeogenesis were measured with a novel method (based on prelabelling of endogenous glycogen with 13C glucose and determination of hepatic 13C glycogen enrichment from breath 13CO2 and respiratory gas exchanges) in healthy humans infused with lactate or saline. These measurements were performed with or without beta-adrenergic receptor blockade (propranolol). Infusion of lactate increased energy expenditure but did not increase EGPR; the relative contributions of gluconeogenesis and glycogenolysis to EGPR were also unaltered. This indicates that autoregulation is achieved, at least in part, by inhibition of gluconeogenesis from endogenous precursors. Beta-adrenergic receptor blockade alone (with propranolol) did not alter EGPR, glycogenolysis or gluconeogenesis. During infusion of lactate, propranolol decreased the thermic effect of lactate, but EGPR remained constant. This indicates that alteration of beta-adrenergic activity is not required for the autoregulation of EGPR.
Abstract:
Under the influence of intelligence-led policing models, crime analysis methods have undergone important developments in recent years. Applications have been proposed in several fields of forensic science to exploit and manage various types of material evidence in a systematic and more efficient way. However, nothing has been suggested so far in the field of false identity documents. This study seeks to fill this gap by proposing a simple and general method for profiling false identity documents, which aims to establish links based on their visual forensic characteristics. A sample of more than 200 false identity documents, including French stolen blank passports, counterfeited driving licenses from Iraq and falsified Bulgarian driving licenses, was gathered from nine Swiss police departments and integrated into a database developed ad hoc, called ProfID. Links detected automatically and systematically through this database were exploited and analyzed to produce strategic and tactical intelligence useful to the fight against identity document fraud. The profiling and intelligence process established for these three types of false identity documents has confirmed its efficiency, with more than 30% of documents being linked. Identity document fraud appears to be a structured and interregional form of criminality, against which the material and forensic links detected between false identity documents can serve as a tool for investigation.
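A hedged sketch of profile-based linking of the kind described above: each seized document is represented by a set of coded visual characteristics, and two documents are linked when their profiles are sufficiently similar. The feature codes, the Jaccard score and the threshold below are illustrative assumptions, not the ProfID implementation.

from itertools import combinations

# Illustrative document profiles: sets of coded visual characteristics.
profiles = {
    "doc_A": {"offset_print", "misaligned_mrz", "font_X"},
    "doc_B": {"offset_print", "misaligned_mrz", "font_Y"},
    "doc_C": {"inkjet_print", "hologram_copy"},
}

def jaccard(a, b):
    """Similarity between two sets of coded characteristics."""
    return len(a & b) / len(a | b)

THRESHOLD = 0.5   # hypothetical decision threshold
links = [(d1, d2, round(jaccard(p1, p2), 2))
         for (d1, p1), (d2, p2) in combinations(profiles.items(), 2)
         if jaccard(p1, p2) >= THRESHOLD]
print(links)      # links doc_A and doc_B, which share two of four characteristics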
Abstract:
Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ∼5%–7%, and that the method is most accurate in regions of pervasive apparent background deformation, which are commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck’09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ∼10°C–15°C effluent reach ∼5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
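The second, velocity-estimation stage of DFV can be illustrated with a toy cross-correlation between two consecutive fields. The synthetic data, the single correlation window and the integer-pixel shift estimate below are simplifying assumptions; the paper works on full PIV vector fields and recovers complete velocity maps.

import numpy as np

# Estimate the shift between two consecutive (synthetic) deformation fields by
# FFT-based cross-correlation. Illustrative only, not the paper's implementation.
rng = np.random.default_rng(1)
field1 = rng.random((64, 64))
field2 = np.roll(field1, shift=(3, -2), axis=(0, 1))   # known displacement

f1 = np.fft.fft2(field1 - field1.mean())
f2 = np.fft.fft2(field2 - field2.mean())
corr = np.fft.ifft2(f1.conj() * f2).real

dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
# wrap shifts larger than half the window to negative values
dy = dy - 64 if dy > 32 else dy
dx = dx - 64 if dx > 32 else dx
print("estimated displacement (rows, cols):", dy, dx)   # expect (3, -2)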
Abstract:
A new statistical parallax method using the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, kinematic characteristics and spatial distribution of a given sample. This method has been developed for the exploitation of the Hipparcos data and presents several improvements with respect to previous ones: the effects of sample selection, observational errors, galactic rotation and interstellar absorption are taken into account as an intrinsic part of the formulation (as opposed to external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors relies extensively on numerical methods, thus avoiding the need to simplify the equations and the bias such simplifications could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.
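The Maximum Likelihood principle at the core of the method can be illustrated with a deliberately stripped-down sketch that fits the mean and dispersion of a Gaussian luminosity calibration to simulated absolute magnitudes. The paper's full formulation (sample selection, observational errors, galactic rotation, interstellar absorption and mixtures of groups) is not reproduced here.

import numpy as np
from scipy.optimize import minimize

# Fit a Gaussian luminosity calibration (mean, sigma) by maximum likelihood on
# simulated absolute magnitudes. Illustrative only; all numbers are made up.
rng = np.random.default_rng(42)
abs_mag = rng.normal(loc=0.6, scale=0.3, size=400)

def neg_log_likelihood(params):
    mean, sigma = params
    if sigma <= 0:
        return np.inf
    return 0.5 * np.sum(((abs_mag - mean) / sigma) ** 2) + len(abs_mag) * np.log(sigma)

result = minimize(neg_log_likelihood, x0=[0.0, 1.0], method="Nelder-Mead")
print("ML estimates (mean, sigma):", result.x)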
Abstract:
Segmenting ultrasound images is a challenging problem where standard unsupervised segmentation methods such as the well-known Chan-Vese method fail. We propose in this paper an efficient segmentation method for this class of images. Our proposed algorithm is based on a semi-supervised approach (user labels) and the use of image patches as data features. We also consider the Pearson distance between patches, which has been shown to be robust w.r.t. the speckle noise present in ultrasound images. Our results on phantom and clinical data show a very high similarity agreement with the ground truth provided by a medical expert.
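As an illustration of the patch feature mentioned above, the sketch below computes a Pearson distance between two image patches, taken here as one minus their Pearson correlation coefficient. The patch size and exact normalisation are assumptions, and the semi-supervised segmentation pipeline itself is not shown.

import numpy as np

# Pearson distance between two patches: 0 for identical patterns (up to gain
# and offset), around 1 for uncorrelated patches. Illustrative sketch only.
def pearson_distance(patch_a, patch_b):
    a = patch_a.ravel().astype(float)
    b = patch_b.ravel().astype(float)
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return 1.0 - np.mean(a * b)

rng = np.random.default_rng(0)
p1 = rng.random((9, 9))
p2 = p1 * 3.0 + 5.0                    # same pattern, different gain/offset
p3 = rng.random((9, 9))                # unrelated patch
print(pearson_distance(p1, p2))        # ~0: insensitive to intensity changes
print(pearson_distance(p1, p3))        # ~1: uncorrelated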
Abstract:
Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in the quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields needed to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying the parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for improving the model profiles in the lowest levels of the troposphere.
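The refractivity calculation implied above can be sketched with the widely used approximation for radio refractivity from pressure, temperature and water-vapour pressure, together with the modified refractivity used to diagnose ducting layers. The sample profile values are made up, and the hybrid parabolic-equation propagation code is not reproduced.

import numpy as np

# Radio refractivity N and modified refractivity M from NWP-style profiles.
# Illustrative profile values only; not the paper's propagation model.
def refractivity(p_hpa, t_kelvin, e_hpa):
    """Radio refractivity N (in N-units), standard two-term approximation."""
    return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin**2

def modified_refractivity(n_units, height_m):
    """Modified refractivity M; a negative vertical gradient indicates a duct."""
    return n_units + 0.157 * height_m

heights = np.array([0.0, 100.0, 200.0, 300.0])        # m above ground
pressure = np.array([1013.0, 1001.0, 989.0, 977.0])   # hPa
temperature = np.array([288.0, 290.0, 291.0, 290.5])  # K (inversion aloft)
vapour = np.array([12.0, 8.0, 5.0, 4.0])              # hPa

N = refractivity(pressure, temperature, vapour)
M = modified_refractivity(N, heights)
print(np.diff(M) / np.diff(heights))   # negative gradients flag possible ducts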
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance, given the potential hazards of thunderstorms, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, in which different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. On the contrary, the duration of the maturity phase is much more variable and related to thunderstorm intensity, defined here in terms of lightning flash rate. Most of the IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase somewhat more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to the total thunderstorm duration and to the maximum value of the variables considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
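A hedged sketch of the object hierarchy described above, with four observation-level object types feeding a higher-level thunderstorm object; class and attribute names are illustrative assumptions, not the authors' code.

from dataclasses import dataclass, field
from typing import List

# Illustrative object hierarchy: four observation-level object types and a
# composite Thunderstorm object. Names and attributes are assumptions.
@dataclass
class RadarCAPPIObject:
    time: str
    max_reflectivity_dbz: float

@dataclass
class RadarVolumeObject:
    time: str
    echo_top_km: float

@dataclass
class CGLightningObject:          # cloud-to-ground flashes in a time step
    time: str
    flash_count: int

@dataclass
class ICLightningObject:          # intra-cloud flashes in a time step
    time: str
    flash_count: int

@dataclass
class Thunderstorm:
    cappi: List[RadarCAPPIObject] = field(default_factory=list)
    volumes: List[RadarVolumeObject] = field(default_factory=list)
    cg_flashes: List[CGLightningObject] = field(default_factory=list)
    ic_flashes: List[ICLightningObject] = field(default_factory=list)

    def total_flash_count(self) -> int:
        """Total (IC + CG) flash count, used here as a simple intensity proxy."""
        return sum(o.flash_count for o in self.cg_flashes + self.ic_flashes)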
Abstract:
Spatial resolution is a key parameter of all remote sensing satellites and platforms. The nominal spatial resolution of a satellite is a well-known characteristic because it is directly related to the area on the ground represented by a pixel in the detector. Nevertheless, in practice, the actual resolution of a specific image obtained from a satellite is difficult to know precisely because it depends on many other factors, such as atmospheric conditions. However, if one has two or more images of the same region, it is possible to compare their relative resolutions. In this paper, a wavelet-decomposition-based method for determining the relative resolution between two remotely sensed images of the same area is proposed. The method can be applied to panchromatic, multispectral, and mixed (one panchromatic and one multispectral) images. As an example, the method was applied to compute the relative resolution between SPOT-3, Landsat-5, and Landsat-7 panchromatic and multispectral images taken under similar as well as under very different conditions. Furthermore, if the true absolute resolution of one of the images of the pair is known, the resolution of the other can be computed. Thus, in the last part of this paper, a spatial calibrator designed and constructed to help compute the absolute resolution of a single remotely sensed image is described, and an example of its use is presented.
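The idea of comparing fine-scale wavelet detail energy between two images of the same scene can be sketched as follows; a one-level Haar transform and a crude synthetic degradation are used for simplicity, and the paper's full multi-level procedure and calibration are not reproduced.

import numpy as np

# Compare the energy in the one-level Haar detail sub-bands of a "sharp" image
# and a degraded version of the same scene. Illustrative only.
def haar_detail_energy(img):
    """Energy of the three one-level Haar detail sub-bands of a 2-D array."""
    img = img[: img.shape[0] // 2 * 2, : img.shape[1] // 2 * 2].astype(float)
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    d1 = (a - b + c - d) / 2.0
    d2 = (a + b - c - d) / 2.0
    d3 = (a - b - c + d) / 2.0
    return float(np.sum(d1**2 + d2**2 + d3**2))

rng = np.random.default_rng(0)
sharp = rng.random((128, 128))
# crude stand-in for a coarser sensor: a 2x2 moving average of the same scene
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(sharp, (1, 1), (0, 1))) / 4.0

ratio = haar_detail_energy(sharp) / haar_detail_energy(blurred)
print("fine-scale detail energy ratio (sharp / degraded):", round(ratio, 2))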
Abstract:
The presence of the Etruscan shrew Suncus etruscus is hard to prove where its predator, the barn owl Tyto alba, is absent, because most live traps are not triggered by it. I therefore developed a new trapping method involving a feeding period of 1 week followed by one night of trapping using modified Trip Trap traps. I show here in detail how I caught four Etruscan shrews in 2010 with 24 traps in the Valley of Dora Baltea (Piemonte, Italy). In 2011, another 11 Etruscan shrews were caught in Piemonte and Lombardia, Italy, and Ticino, Switzerland. The proposed new method is useful for establishing the presence of the species.