983 results for Scanline sampling technique


Relevance:

30.00%

Publisher:

Abstract:

This work describes the foraging techniques, body positions, and behavior of free-ranging Ingram's squirrels, Guerlinguetus ingrami Thomas, 1901, in a region of Araucaria moist forest within the Atlantic Forest of southern Brazil. The animals were observed using the all-occurrence sampling method with the aid of binoculars and a digital camcorder. All behaviors were described in diagrams and an ethogram. We recorded five basic body positions, 24 behaviors, two food choices, and three feeding strategies used to open fruits of Syagrus romanzoffiana (Cham.), the main food source of Ingram's squirrel. We also observed variation in the animals' stance, possibly influenced by predation risk, and we discuss the causes of some of these behaviors.

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

In seed production systems, genetic purity is a fundamental requirement for commercialization. The present work aimed to determine the sample size for genetic purity evaluation, so as to protect both the seed consumer and the producer, and to evaluate the sensitivity of the microsatellite technique for discriminating hybrids from their respective parental lines and for detecting mixtures present in small amounts in the samples. For the sequential sampling, hybrid seeds were marked and mixed into the seed lots, simulating contamination levels of 0.25, 0.5, 1.0, 2.0, 4.0, and 6.0%. Groups of 40 seeds were then taken in sequence, up to a maximum of 400 seeds, in order to determine the quantity of seeds necessary to detect the mixture percentages mentioned above. The sensitivity of the microsatellite technique was evaluated by mixing different proportions of DNA from the hybrids and from their respective parental lines. When the mixture level was higher than 1:8 (1P1:8P2; 8P1:1P2), the sensitivity of the marker in detecting the different mixture proportions varied according to the primer used. For the sequential sampling, it was verified that, in order to detect mixture levels higher than 1% within the seed lot with a risk level of 0.05 for both the producer and the consumer, the necessary sample was smaller than that required by a fixed sample size. This also reduced costs, making it possible to use microsatellites to certify the genetic purity of corn seed lots.
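
For illustration, the fixed-sample-size side of this comparison can be sketched under a simple binomial detection model in which a lot is rejected as soon as one off-type seed is found; the 0.05 risk level comes from the abstract, while the function and the rejection rule below are assumptions, not the authors' sequential plan.

```python
import math

def fixed_sample_size(contamination, beta=0.05):
    """Seeds needed so that, with probability >= 1 - beta, at least one
    off-type seed appears when the true contamination rate is `contamination`.
    Binomial model: P(no off-type in n seeds) = (1 - p)**n <= beta."""
    return math.ceil(math.log(beta) / math.log(1.0 - contamination))

for p in (0.0025, 0.005, 0.01, 0.02, 0.04, 0.06):
    print(f"{p:.2%} contamination -> at least {fixed_sample_size(p)} seeds")
```

Under this model about 299 seeds are needed at the 1% level, which illustrates why a sequential plan that stops as soon as sufficient evidence accumulates can be cheaper than a fixed-size sample.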

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in the analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described, making them accessible to practicing ecologists.
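
As a companion to the conventional distance sampling engine described above, the following is a minimal sketch of its core calculation: a half-normal detection function fitted by maximum likelihood to perpendicular distances, then density from the effective strip half-width. It is only an illustration on simulated data, not the Distance software or its interface, and the function names and example values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import halfnorm

def estimate_density(distances, total_line_length, truncation):
    """Conventional distance sampling with a half-normal detection function.
    distances         : perpendicular distances of detected objects
    total_line_length : summed transect length L (same length units)
    truncation        : right-truncation distance w"""
    d = np.asarray(distances, dtype=float)
    d = d[d <= truncation]
    n = d.size

    # Negative log-likelihood of distances under g(x) = exp(-x^2 / (2 sigma^2)),
    # right-truncated at w.
    def nll(log_sigma):
        sigma = np.exp(log_sigma)
        return -(np.sum(halfnorm.logpdf(d, scale=sigma))
                 - n * halfnorm.logcdf(truncation, scale=sigma))

    sigma = np.exp(minimize_scalar(nll).x)

    # Effective strip half-width mu = integral of g(x) from 0 to w,
    # then D = n / (2 * mu * L) for line transects.
    mu = sigma * np.sqrt(np.pi / 2) * halfnorm.cdf(truncation, scale=sigma)
    return n / (2.0 * mu * total_line_length), sigma

# Simulated detections around line transects (true sigma = 10 distance units)
rng = np.random.default_rng(1)
dists = np.abs(rng.normal(0.0, 10.0, size=200))
print(estimate_density(dists, total_line_length=5000.0, truncation=30.0))
```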

Relevance:

30.00%

Publisher:

Abstract:

The diffusive gradients in thin films (DGT) technique has shown enormous potential for labile metal monitoring in fresh water owing to its preconcentration, time integration, matrix interference removal, and speciation analytical features. In this work, the coupling of energy dispersive X-ray fluorescence (EDXRF) with paper-based DGT devices was evaluated for the direct determination of Mn, Co, Ni, Cu, Zn and Pb in fresh water. The DGT samplers were assembled with cellulose (Whatman 3 MM chromatography paper) as the diffusion layer and a cellulose phosphate ion exchange membrane (Whatman P81 paper) as the binding agent. The diffusion coefficients of the analytes in the 3 MM chromatography paper were determined by deploying the DGT samplers in synthetic solutions containing 500 µg L⁻¹ of Mn, Co, Ni, Cu, Zn and Pb (4 L at pH 5.5 and an ionic strength of 0.05 mol L⁻¹). After retrieval, the DGT units were disassembled and the P81 papers were dried and analysed directly by EDXRF. The diffusion coefficients of the analytes in the 3 MM chromatography paper ranged from 1.67 × 10⁻⁶ to 1.87 × 10⁻⁶ cm² s⁻¹. The homogeneity of metal retention and of the phosphate groups on the P81 membrane was studied by spot analysis with a 1 mm diameter. The proposed approach (DGT-EDXRF coupling) was applied to determine the analytes at five sampling sites (48 h in situ deployment) in the Piracicaba river basin, and the results (labile fraction) were compared with the 0.45 µm dissolved fractions determined by synchrotron radiation-excited total reflection X-ray fluorescence (SR-TXRF). The limits of detection of the DGT-EDXRF coupling for the analytes (7.5 to 26 µg L⁻¹) were similar to those obtained with the sensitive SR-TXRF technique (3.8 to 9.1 µg L⁻¹).
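
The labile concentration follows from the standard DGT equation, C_DGT = M Δg / (D A t). Below is a minimal sketch of that calculation; the accumulated mass, diffusion layer thickness, and exposure area in the example call are hypothetical, with only the order of magnitude of the diffusion coefficient taken from the range reported above.

```python
def dgt_concentration(mass_ng, delta_g_cm, diff_coeff_cm2_s, area_cm2, time_s):
    """Labile concentration from a DGT deployment, C = M * delta_g / (D * A * t),
    returned in ng mL^-1 (numerically equal to ug L^-1).
    mass_ng          : analyte mass accumulated on the binding layer (ng)
    delta_g_cm       : diffusion layer thickness (cm)
    diff_coeff_cm2_s : diffusion coefficient in the diffusion layer (cm^2 s^-1)
    area_cm2         : exposed sampler area (cm^2)
    time_s           : deployment time (s)"""
    return mass_ng * delta_g_cm / (diff_coeff_cm2_s * area_cm2 * time_s)

# Hypothetical 48 h deployment
print(dgt_concentration(mass_ng=250.0, delta_g_cm=0.034,
                        diff_coeff_cm2_s=1.8e-6, area_cm2=3.14,
                        time_s=48 * 3600))   # ~8.7 ug L^-1
```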

Relevance:

30.00%

Publisher:

Abstract:

Recently, research has shown that the performance of metaheuristics can be affected by population initialization. Opposition-based Differential Evolution (ODE), Quasi-Oppositional Differential Evolution (QODE), and Uniform-Quasi-Opposition Differential Evolution (UQODE) are three state-of-the-art methods that improve the performance of the Differential Evolution algorithm through population initialization and different search strategies. In a different approach to achieving similar results, this paper presents a technique to discover promising regions in the continuous search space of an optimization problem. Using machine-learning techniques, the algorithm, named Smart Sampling (SS), finds regions with a high probability of containing a global optimum. A metaheuristic can then be initialized inside each region to find that optimum. SS and DE were combined (yielding the SSDE algorithm) to evaluate our approach, and experiments were conducted on the same set of benchmark functions used by the ODE, QODE and UQODE authors. The results show that the total number of function evaluations required by DE to reach the global optimum can be significantly reduced, and that the success rate improves, if SS is employed first. These results are also in consonance with the literature, confirming the importance of an adequate starting population. Moreover, SS is more effective at finding initial populations of superior quality than the other three algorithms that employ oppositional learning. Finally, and most importantly, the performance of SS in finding promising regions is independent of the metaheuristic with which it is combined, making SS suitable for improving the performance of a large variety of optimization techniques.
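
A rough sketch of the general idea (not the authors' machine-learning-based Smart Sampling) is shown below: uniformly sample the search space, keep the best fraction of points, tighten the bounds to the box enclosing them, and only then run Differential Evolution inside that region. The test function, sample sizes, and quantile filter are illustrative assumptions; an overly aggressive filter could exclude the global optimum, so this sketch carries none of the guarantees studied in the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def promising_region(func, bounds, n_samples=2000, keep_frac=0.05, seed=None):
    """Crude stand-in for Smart Sampling: uniformly sample the search space,
    keep the best `keep_frac` of the points, and return the bounding box of
    those points as a tightened region for the metaheuristic."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pts = rng.uniform(lo, hi, size=(n_samples, len(bounds)))
    vals = np.apply_along_axis(func, 1, pts)
    best = pts[np.argsort(vals)[: int(keep_frac * n_samples)]]
    return list(zip(best.min(axis=0), best.max(axis=0)))

bounds = [(-5.12, 5.12)] * 5
region = promising_region(rastrigin, bounds, seed=0)
result = differential_evolution(rastrigin, region, seed=0, tol=1e-8)
print(region)
print(result.x, result.fun)
```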

Relevance:

30.00%

Publisher:

Abstract:

Proper functioning of ion channels is a prerequisite for normal cell physiology, and disorders involving ion channels, or channelopathies, underlie many human diseases. Long QT syndromes (LQTS), for example, may arise from malfunctioning of the hERG channel, caused either by the binding of drugs or by mutations in the HERG gene. In the first part of this thesis I present a framework to investigate the mechanism of ion conduction through the hERG channel. The free energy profile governing the elementary steps of ion translocation in the pore was computed by means of umbrella sampling simulations. Compared with previous studies, we detected a different dynamic behavior: according to our data, hERG is more likely to mediate a conduction mechanism that has been referred to as “single-vacancy-like” by Roux and coworkers (2001), rather than a “knock-on” mechanism. The same protocol was applied to a model of hERG carrying the Gly628Ser mutation, found to be a cause of congenital LQTS. The results provided interesting insights into the reasons for the malfunctioning of the mutant channel. Since they have critical functions in the viral life cycle, viral ion channels, such as the M2 proton channel, are considered attractive targets for antiviral therapy. A deep knowledge of the mechanisms that the virus employs to survive in the host cell is of primary importance for the identification of new antiviral strategies. In the second part of this thesis I shed light on the role that M2 plays in controlling the electrical potential inside the virus, charge equilibration being a condition required to allow proton influx. Ion conduction through M2 was simulated using the metadynamics technique. Based on our results, we suggest that both a potential anion-mediated cation–proton exchange and a direct anion–proton exchange could contribute to explaining the activity of the M2 channel.
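
To illustrate the umbrella sampling idea in its simplest form, here is a toy sketch: biased Metropolis Monte Carlo sampling of a one-dimensional double-well potential in harmonic windows, recombined with the WHAM equations (a common choice, not necessarily the one used in the thesis) into an unbiased free-energy profile. Every parameter (potential, force constant, window spacing, temperature) is an arbitrary assumption for the toy example; the actual study applies umbrella sampling and metadynamics to atomistic channel models, not to a toy potential.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0                        # 1 / kT in reduced units
U = lambda x: (x**2 - 1.0)**2     # toy double-well potential along a 1D coordinate

# Umbrella sampling: biased Metropolis Monte Carlo in each harmonic window
k_umb = 20.0                                   # bias force constant
centers = np.linspace(-1.6, 1.6, 17)           # window centers
n_steps, step = 20000, 0.2
samples = []
for c in centers:
    bias = lambda x, c=c: 0.5 * k_umb * (x - c) ** 2
    x, traj = c, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        dE = U(x_new) + bias(x_new) - U(x) - bias(x)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            x = x_new
        traj.append(x)
    samples.append(np.array(traj))

# WHAM: recombine the biased histograms into an unbiased free-energy profile
edges = np.linspace(-2.0, 2.0, 101)
mid = 0.5 * (edges[:-1] + edges[1:])
hist = np.array([np.histogram(s, bins=edges)[0] for s in samples])  # counts n_i(b)
N = hist.sum(axis=1)                                                # samples per window
w = 0.5 * k_umb * (mid[None, :] - centers[:, None]) ** 2            # bias of window i at bin b
f = np.zeros(len(centers))                                          # window free energies
for _ in range(500):                                                # fixed-point iterations
    denom = np.sum(N[:, None] * np.exp(-beta * (w - f[:, None])), axis=0)
    P = hist.sum(axis=0) / denom                                    # unbiased bin probabilities
    f = -np.log(np.sum(P[None, :] * np.exp(-beta * w), axis=1)) / beta
    f -= f[0]

mask = P > 0
pmf = -np.log(P[mask] / P[mask].max()) / beta                       # free energy in kT
print(np.column_stack([mid[mask], pmf])[::10])
```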

Relevance:

30.00%

Publisher:

Abstract:

We investigated the feasibility of postmortem percutaneous needle biopsy (PNB) for obtaining pulmonary samples adequate for the study of pulmonary fat embolism (PFE). Samples of both lungs were obtained from 26 cadavers via two different methods: (i) PNB and (ii) the double-edged knife technique, the gold standard at our institute. After water storage and Sudan III staining, six forensic pathologists independently examined all samples for the presence and severity of PFE. The results were compared and analyzed in each case regarding the vitality of the PFE and its relationship to the cause of death. The results showed that PFE was almost identically diagnosed and graded on the samples obtained via both methods. The discrepancies between the two techniques did not affect the diagnoses of vitality or cause of death related to PFE. This study demonstrates the feasibility of the PNB sampling method for the diagnosis and interpretation of PFE in the postmortem setting.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Diagnosing arrhythmias with conventional Holter ECG can be cumbersome because of artifacts, skin irritation, and poor P-wave visibility. In contrast, esophageal electrocardiography (eECG) is promising owing to the anatomic relationship of the esophagus to the atria and its favorable bioelectric properties. Methods: In an ambulatory setting, we recorded eECGs from 10 volunteers with a novel, highly miniaturized eECG recorder that is worn discreetly behind the ear (1.5 × 1.8 × 5 cm, 22 grams). The device continuously records two eECG leads for 3 days at a 500 Hz sampling frequency and 24-bit resolution. Results: Mean ± SD recording time was 21.7 ± 19.6 hours (max. 60 hours). Test persons were not limited in daily activities (e.g. eating, speaking) and complained only of mild discomfort during probe insertion, which subsided later on. For 99.8% of the recording time, the recorder acquired signals appropriate for further analysis. In unfiltered data, QRS complexes and P-waves were identifiable for more than 98% of the time. P-waves had higher amplitudes than in the surface ECG (0.71 ± 0.42 mV vs. 0.16 ± 0.03 mV, p = 0.004). No complications occurred. Conclusion: Ambulatory eECG recording is safe, well tolerated, and promising because of its excellent P-wave detection, overcoming some limitations of conventional Holter ECG.
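
A back-of-the-envelope estimate of the raw data volume implied by these specifications (two leads, 500 Hz, 24-bit samples, a full 3-day recording) is sketched below; it assumes uncompressed storage with no framing overhead, which is not stated in the abstract.

```python
leads, fs_hz, bits = 2, 500, 24          # device specifications stated above
seconds = 3 * 24 * 3600                  # a full 3-day recording
raw_bytes = leads * fs_hz * (bits / 8) * seconds
print(f"raw eECG data volume: {raw_bytes / 1e9:.2f} GB")   # about 0.78 GB
```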

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Periprosthetic joint infection (PJI) is the most severe complication following joint arthroplasty. Identification of the causal microbial agent is of paramount importance for successful treatment. PURPOSE The aim of this study was to compare sonication fluid cultures derived from explanted prosthetic components with the respective periprosthetic tissue cultures. METHODS Prosthesis components explanted for suspected infection were placed into a tank containing sterile Ringer's solution and sonicated for 1 minute at 40 kHz. Sonication fluid cultures were examined for 10 days, and the number and identity of any colony morphology were recorded. In addition, periprosthetic tissue specimens (>5) were collected and cultured according to standard practice. The duration of the antimicrobial interruption interval before culture sampling was recorded. RESULTS Thirty-four patients composed the study group. Sonication fluid cultures were positive in 24 patients (70.5%). Sixteen of thirty-four periprosthetic tissue cultures (47.1%) were considered positive, all revealing the same microbial species as the respective sonication fluid cultures; 3 tissue samples showed polymicrobial infection. Every positive tissue culture was matched by a positive sonication fluid culture. CONCLUSIONS Sonication fluid culture is a cheap, easy, accurate, and sensitive diagnostic modality demonstrating increased sensitivity compared with periprosthetic tissue culture (70.5% versus 47.1%).

Relevance:

30.00%

Publisher:

Abstract:

Establishing precise age-depth relationships for high-alpine ice cores is essential in order to deduce conclusive paleoclimatic information from these archives. Radiocarbon dating of carbonaceous aerosol particles incorporated in such glaciers is a promising tool for obtaining absolute ages, especially from the deepest parts, where conventional methods are commonly inapplicable. In this study, we present a new validation of a published C-14 dating method for ice cores. Previously C-14-dated horizons of organic material from the Juvfonne ice patch in central southern Norway (61.676 degrees N, 8.354 degrees E) were used as reference dates for adjacent ice layers, which were C-14 dated based on their particulate organic carbon (POC) fraction. Multiple measurements were carried out at 3 sampling locations within the ice patch, which features modern to multimillennial ice. The ages obtained from the analyzed samples were in agreement with the given age estimates. In addition to previous validation work, this independent verification gives further confidence that the investigated method provides the actual age of the ice.
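
For reference, converting a measured fraction of modern C-14 into a conventional (uncalibrated) radiocarbon age uses the Libby mean life of 8033 years; a minimal sketch follows. The example value of 0.70 fraction modern is hypothetical and not taken from the study, and calibration to calendar ages is a separate step not shown here.

```python
import math

def conventional_c14_age(fraction_modern):
    """Conventional (uncalibrated) radiocarbon age in 14C yr BP,
    t = -8033 * ln(F14C), using the Libby mean life of 8033 yr."""
    return -8033.0 * math.log(fraction_modern)

# Hypothetical POC sample retaining 70% of the modern 14C activity
print(f"{conventional_c14_age(0.70):.0f} 14C yr BP")   # about 2865 yr BP
```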

Relevance:

30.00%

Publisher:

Abstract:

We present a technique for online compression of ECG signals using the Golomb-Rice encoding algorithm. This is facilitated by a novel time-encoding asynchronous analog-to-digital converter targeted at low-power, implantable, long-term biomedical sensing applications. In contrast to capturing the actual signal (voltage) values, the asynchronous time encoder captures and encodes the times at which predefined changes occur in the signal, thereby minimizing the sensor's energy use and the number of bits stored to represent the information, since unnecessary samples are not captured. The time encoder transforms the ECG signal into pure time information with a geometric distribution, so that the Golomb-Rice encoding algorithm can be used to further compress the data. An overall online compression rate of about 6 times is achievable without the usual computations associated with most compression methods.
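
Since Golomb-Rice coding is the named compression step, here is a minimal sketch of the encoder for non-negative integers (unary quotient plus k fixed remainder bits). The interval values, the choice of k = 4, and the 12-bit fixed-width comparison are illustrative assumptions; the actual system's parameter selection and bitstream framing are not described in the abstract.

```python
def golomb_rice_encode(value, k):
    """Golomb-Rice code for a non-negative integer: the quotient value >> k
    written in unary (q ones then a terminating zero), followed by the
    k low-order bits of the remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def encode_intervals(intervals, k=4):
    """Concatenate Golomb-Rice codes for a sequence of inter-event time
    intervals (in clock ticks), as a time-encoding converter would emit."""
    return "".join(golomb_rice_encode(v, k) for v in intervals)

# Hypothetical inter-event intervals from a time encoder
intervals = [3, 7, 2, 15, 4, 1, 9, 6]
bits = encode_intervals(intervals)
print(bits)
print(f"{len(bits)} bits vs {len(intervals) * 12} bits at a fixed 12-bit width")
```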