989 results for Fission Track Method
Abstract:
We continue the development of a method for the selection of a bandwidth or a number of design parameters in density estimation. We provide explicit non-asymptotic density-free inequalities that relate the $L_1$ error of the selected estimate with that of the best possible estimate, and study in particular the connection between the richness of the class of density estimates and the performance bound. For example, our method allows one to pick the bandwidth and kernel order in the kernel estimate simultaneously and still assure that, for {\it all densities}, the $L_1$ error of the corresponding kernel estimate is not larger than about three times the error of the estimate with the optimal smoothing factor and kernel, plus a constant times $\sqrt{\log n/n}$, where $n$ is the sample size and the constant depends only on the complexity of the family of kernels used in the estimate. Further applications include multivariate kernel estimates, transformed kernel estimates, and variable kernel estimates.
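The setting of this abstract, picking a kernel bandwidth by comparing the $L_1$ error of candidate estimates, can be sketched in a few lines. This is only an illustrative toy, not the paper's actual selection procedure: it uses a Gaussian kernel and a hypothetical split-sample reference estimate as a stand-in for the paper's density-free criterion, and the `select_bandwidth` helper and its pilot-bandwidth rule are assumptions made for the example.

```python
import numpy as np

def kde(x, data, h):
    # Gaussian kernel density estimate at grid points x with bandwidth h.
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def select_bandwidth(data, bandwidths, grid):
    # Toy split-sample proxy for an L1 criterion: fit candidates on one half
    # of the sample, compare to a reference estimate built from the other
    # half, and keep the bandwidth with the smallest empirical L1 distance.
    rng = np.random.default_rng(0)
    idx = rng.permutation(len(data))
    fit, ref_half = data[idx[::2]], data[idx[1::2]]
    h_ref = np.std(ref_half) * len(ref_half) ** (-0.2)  # crude pilot bandwidth
    ref = kde(grid, ref_half, h_ref)
    dx = grid[1] - grid[0]
    errs = [np.abs(kde(grid, fit, h) - ref).sum() * dx for h in bandwidths]
    return bandwidths[int(np.argmin(errs))]
```

The paper's point is that such a data-driven choice can be made over a rich family (bandwidth and kernel order jointly) while keeping the selected estimate's $L_1$ error within a constant factor of the best member's, for every density.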
Abstract:
A new method for rearing Spodoptera frugiperda in the laboratory shows that larval cannibalism is not obligatory. Here we show, for the first time, that larvae of the fall armyworm (FAW), Spodoptera frugiperda (Lepidoptera, Noctuidae), can be successfully reared in a cohort-based manner with virtually no cannibalism. FAW larvae were reared from the second instar to pupation in rectangular plastic containers holding 40 individuals, with a surprising ca. 90% larval survivorship. Adult females from the cohort-based method showed fecundity similar to that reported in the literature for larvae reared individually, and fertility higher than 99%, with the advantage of combining economy of time, space and material resources. These findings suggest that the factors affecting cannibalism of FAW larvae in laboratory rearing need to be reevaluated, while the new technique also shows potential to increase the efficiency of both small-scale and mass FAW rearing.
Abstract:
Digital holographic microscopy (DHM) is an optical technique that provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
Abstract:
This paper describes an optimized model to support QoS by means of congestion minimization on LSPs (Label Switched Paths). To build this model, we start from a CFA (Capacity and Flow Allocation) model. As that model does not consider the buffer size when calculating the capacity cost, our model, named BCA (Buffer Capacity Allocation), takes this issue into account and improves on the CFA performance. To test our proposal, we performed several simulations; the results show that the BCA model minimizes LSP congestion and distributes flows uniformly across the network.
Abstract:
We perform direct numerical simulations of drainage by solving the Navier-Stokes equations in the pore space and employing the Volume of Fluid (VOF) method to track the evolution of the fluid-fluid interface. After demonstrating that the method is able to deal with large viscosity contrasts and to model the transition from stable flow to viscous fingering, we focus on the definition of macroscopic capillary pressure. When the fluids are at rest, the difference between inlet and outlet pressures and the difference between the intrinsic phase-average pressures coincide with the capillary pressure. However, when the fluids are in motion these quantities are dominated by viscous forces. In this case, only a definition based on the variation of the interfacial energy provides an accurate measure of the macroscopic capillary pressure and allows separating the viscous from the capillary pressure components.
Abstract:
Over the last century, numerous techniques have been developed to analyze the movement of humans while walking and running. The combined use of kinematic and kinetic methods, mainly based on high-speed video analysis and force plates, has permitted a comprehensive description of the locomotion process in terms of energetics and biomechanics. While the different phases of a single gait cycle are well understood, there is increasing interest in knowing how the neuro-motor system controls gait from stride to stride. Indeed, it has been observed that neurodegenerative diseases and aging can impact gait stability and the steadiness of gait parameters. From both clinical and fundamental research perspectives, there is therefore a need to develop techniques to accurately track gait parameters stride by stride over a long period with minimal constraints on patients. In this context, high-accuracy satellite positioning can provide an alternative tool to monitor outdoor walking. Indeed, high-end GPS receivers provide centimeter-accuracy positioning at a 5-20 Hz sampling rate: this allows the stride-by-stride assessment of a number of basic gait parameters (such as walking speed, step length and step frequency) that can be tracked over several thousand consecutive strides in free-living conditions. Furthermore, long-range correlations and fractal-like patterns have been observed in these time series. Compared with other classical methods, GPS seems a promising technology in the field of gait variability analysis. However, its relatively high complexity and cost, combined with usability that requires further improvement, remain obstacles to the full development of GPS technology in human applications.
Abstract:
Under the influence of intelligence-led policing models, crime analysis methods have undergone important developments in recent years. Applications have been proposed in several fields of forensic science to exploit and manage various types of material evidence in a systematic and more efficient way. However, nothing has been suggested so far in the field of false identity documents. This study seeks to fill this gap by proposing a simple and general method for profiling false identity documents, which aims to establish links based on their visual forensic characteristics. A sample of more than 200 false identity documents, including French stolen blank passports, counterfeited driving licenses from Iraq and falsified Bulgarian driving licenses, was gathered from nine Swiss police departments and integrated into an ad hoc database called ProfID. Links detected automatically and systematically through this database were exploited and analyzed to produce strategic and tactical intelligence useful to the fight against identity document fraud. The profiling and intelligence process established for these three types of false identity documents has confirmed its efficiency, with more than 30% of documents being linked. Identity document fraud appears to be a structured and interregional form of criminality, against which material and forensic links detected between false identity documents might serve as a tool for investigation.
Abstract:
CONTEXT: A passive knee-extension test has been shown to be a reliable method of assessing hamstring tightness, but this method does not take into account the potential effect of gravity on the tested leg. OBJECTIVE: To compare an original passive knee-extension test with 2 adapted methods including gravity's effect on the lower leg. DESIGN: Repeated measures. SETTING: Laboratory. PARTICIPANTS: 20 young track and field athletes (16.6 ± 1.6 y, 177.6 ± 9.2 cm, 75.9 ± 24.8 kg). INTERVENTION: Each subject was tested in a randomized order with 3 different methods: In the original one (M1), passive knee angle was measured with a standard force of 68.7 N (7 kg) applied proximal to the lateral malleolus. The second (M2) and third (M3) methods took into account the relative lower-leg weight (measured respectively by handheld dynamometer and anthropometrical table) to individualize the force applied to assess passive knee angle. MAIN OUTCOME MEASURES: Passive knee angles measured with video-analysis software. RESULTS: No difference in mean individualized applied force was found between M2 and M3, so the authors assessed passive knee angle only with M2. The mean knee angle was different between M1 and M2 (68.8 ± 12.4 vs 73.1 ± 10.6, P < .001). Knee angles in M1 and M2 were correlated (r = .93, P < .001). CONCLUSIONS: Differences in knee angle were found between the original passive knee-extension test and a method with gravity correction. M2 is an improved version of the original method (M1) since it minimizes the effect of gravity. Therefore, we recommend using it rather than M1.
Abstract:
The exocyst complex is essential for many exocytic events, by tethering vesicles at the plasma membrane for fusion. In fission yeast, polarized exocytosis for growth relies on the combined action of the exocyst at cell poles and myosin-driven transport along actin cables. We report here the identification of fission yeast Schizosaccharomyces pombe Sec3 protein, which we identified through sequence homology of its PH-like domain. Like other exocyst subunits, sec3 is required for secretion and cell division. Cells deleted for sec3 are only conditionally lethal and can proliferate when osmotically stabilized. Sec3 is redundant with Exo70 for viability and for the localization of other exocyst subunits, suggesting these components act as exocyst tethers at the plasma membrane. Consistently, Sec3 localizes to zones of growth independently of other exocyst subunits but depends on PIP(2) and functional Cdc42. FRAP analysis shows that Sec3, like all other exocyst subunits, localizes to cell poles largely independently of the actin cytoskeleton. However, we show that Sec3, Exo70 and Sec5 are transported by the myosin V Myo52 along actin cables. These data suggest that the exocyst holocomplex, including Sec3 and Exo70, is present on exocytic vesicles, which can reach cell poles by either myosin-driven transport or random walk.
Abstract:
Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ∼5%-7% and are most accurate in regions of pervasive apparent background deformation, which is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck’09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ∼10°C-15°C effluent reach ∼5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
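The stage-one building block of the PIV step described in this abstract, cross-correlating pixel intensities between consecutive frames to find apparent displacements, can be sketched as below. This is a generic FFT-based cross-correlation illustration, not the authors' implementation; DFV applies the same idea twice, first to image pairs and then to the resulting deformation fields.

```python
import numpy as np

def shift_by_xcorr(a, b):
    # FFT-based cross correlation of pixel intensities between two frames:
    # the location of the correlation peak gives the apparent displacement
    # of frame b relative to frame a (standard PIV practice).
    A = np.fft.fft2(a - a.mean())
    B = np.fft.fft2(b - b.mean())
    corr = np.fft.ifft2(np.conj(A) * B).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap the circular indices to signed displacements.
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)
```

In practice PIV works on small interrogation windows with sub-pixel peak interpolation; this sketch returns only the integer-pixel peak for a whole frame.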
Abstract:
A new statistical parallax method using the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, kinematic characteristics and spatial distribution of a given sample. This method has been developed for the exploitation of the Hipparcos data and presents several improvements with respect to the previous ones: the effects of the selection of the sample, the observational errors, the galactic rotation and the interstellar absorption are taken into account as an intrinsic part of the formulation (as opposed to external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors is based on the extensive use of numerical methods, so avoiding the need for simplification of the equations and thus the bias they could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.
Abstract:
Homologous recombination is important for the repair of double-strand breaks during meiosis. Eukaryotic cells require two homologs of Escherichia coli RecA protein, Rad51 and Dmc1, for meiotic recombination. To date, it is not clear, at the biochemical level, why two homologs of RecA are necessary during meiosis. To gain insight into this, we purified Schizosaccharomyces pombe Rad51 and Dmc1 to homogeneity. Purified Rad51 and Dmc1 form homo-oligomers, bind single-stranded DNA preferentially, and exhibit DNA-stimulated ATPase activity. Both Rad51 and Dmc1 promote the renaturation of complementary single-stranded DNA. Importantly, Rad51 and Dmc1 proteins catalyze ATP-dependent strand exchange reactions with homologous duplex DNA. Electron microscopy reveals that both S. pombe Rad51 and Dmc1 form nucleoprotein filaments. Rad51 formed helical nucleoprotein filaments on single-stranded DNA, whereas Dmc1 was found in two forms, as helical filaments and also as stacked rings. These results demonstrate that Rad51 and Dmc1 are both efficient recombinases in lower eukaryotes and reveal closer functional and structural similarities between the meiotic recombinase Dmc1 and Rad51. The DNA strand exchange activity of both Rad51 and Dmc1 is most likely critical for proper meiotic DNA double-strand break repair in lower eukaryotes.
Abstract:
Segmenting ultrasound images is a challenging problem where standard unsupervised segmentation methods such as the well-known Chan-Vese method fail. We propose in this paper an efficient segmentation method for this class of images. Our proposed algorithm is based on a semi-supervised approach (user labels) and the use of image patches as data features. We also consider the Pearson distance between patches, which has been shown to be robust w.r.t. speckle noise present in ultrasound images. Our results on phantom and clinical data show a very high similarity agreement with the ground truth provided by a medical expert.
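The Pearson distance between patches mentioned in this abstract can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the nearest-labeled-patch rule in `label_patch` is a hypothetical stand-in for their semi-supervised classifier, kept only to show why the Pearson distance is attractive (it ignores the affine intensity changes that multiplicative speckle and gain variations induce).

```python
import numpy as np

def pearson_distance(p, q):
    # Pearson distance between two image patches: one minus the correlation
    # coefficient of their flattened intensities. Invariant to affine
    # intensity changes q -> a*q + b, hence robust to gain/speckle effects.
    p = p.ravel() - p.mean()
    q = q.ravel() - q.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(q)
    return 1.0 - (p @ q) / denom if denom else 1.0

def label_patch(patch, labeled_patches):
    # Hypothetical semi-supervised step: assign the label of the closest
    # user-labeled reference patch under the Pearson distance.
    labels, dists = zip(*((lab, pearson_distance(patch, ref))
                          for lab, ref in labeled_patches))
    return labels[int(np.argmin(dists))]
```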
Abstract:
Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields and calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere, but, although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.