887 results for path sampling
Abstract:
1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods makes them accessible to practicing ecologists.
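The conventional distance sampling engine described in point 4 rests on a standard estimator that is easy to sketch. Below is a minimal illustration, assuming a half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)) fitted by maximum likelihood to perpendicular distances; all data and names are hypothetical, and this is not Distance's own interface:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Hypothetical perpendicular detection distances x_i (metres) from line transects.
x = np.array([1.2, 3.5, 0.4, 7.8, 2.1, 5.0, 0.9, 4.4, 6.2, 1.7])
w = 10.0      # truncation distance (metres)
L = 2000.0    # total transect length (metres)

def effective_half_width(sigma):
    """mu(sigma) = integral_0^w exp(-x^2 / (2 sigma^2)) dx."""
    return sigma * np.sqrt(2 * np.pi) * (norm.cdf(w / sigma) - 0.5)

def neg_log_lik(sigma):
    """Negative log-likelihood of the truncated half-normal detection function."""
    g = np.exp(-x**2 / (2 * sigma**2))
    return -np.sum(np.log(g / effective_half_width(sigma)))

sigma_hat = minimize_scalar(neg_log_lik, bounds=(0.1, w), method="bounded").x
mu_hat = effective_half_width(sigma_hat)

# Conventional line-transect density estimator: D = n / (2 L mu).
D_hat = len(x) / (2 * L * mu_hat)
print(f"sigma = {sigma_hat:.2f} m, effective half-width = {mu_hat:.2f} m, "
      f"density = {D_hat:.6f} animals per square metre")
```

In practice the precision of D_hat would then be obtained analytically or by bootstrapping transects, as point 5 of the abstract notes.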
Abstract:
We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set. A simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation, and the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
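The thinned-point-process formulation can be written compactly. A hedged sketch in our own notation (not necessarily the paper's): with log-linear intensity λ(s; β) and a detection function p(d(s); θ) of the distance d(s) from location s to the nearest transect, the likelihood over the surveyed region A is

```latex
L(\beta, \theta)
  = \exp\!\Big( -\int_{A} \lambda(s;\beta)\, p\big(d(s);\theta\big)\, ds \Big)
    \prod_{i=1}^{n} \lambda(s_i;\beta)\, p\big(d(s_i);\theta\big),
\qquad
\lambda(s;\beta) = \exp\big(z(s)^{\top}\beta\big),
```

so the detection parameters θ and the covariate effects β are estimated simultaneously, which is what permits inference on abundance-environment relationships.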
Abstract:
"How large a sample is needed to survey the bird damage to corn in a county in Ohio or New Jersey or South Dakota?" Like those in the Bureau of Sport Fisheries and Wildlife and the U.S.D.A. who have been faced with a question of this sort we found only meager information on which to base an answer, whether the problem related to a county in Ohio or to one in New Jersey, or elsewhere. Many sampling methods and rates of sampling did yield reliable estimates but the judgment was often intuitive or based on the reasonableness of the resulting data. Later, when planning the next study or survey, little additional information was available on whether 40 samples of 5 ears each or 5 samples of 200 ears should be examined, i.e., examination of a large number of small samples or a small number of large samples. What information is needed to make a reliable decision? Those of us involved with the Agricultural Experiment Station regional project concerned with the problems of bird damage to crops, known as NE-49, thought we might supply an ans¬wer if we had a corn field in which all the damage was measured. If all the damage were known, we could then sample this field in various ways and see how the estimates from these samplings compared to the actual damage and pin-point the best and most accurate sampling procedure. Eventually the investigators in four states became involved in this work1 and instead of one field we were able to broaden the geographical base by examining all the corn ears in 2 half-acre sections of fields in each state, 8 sections in all. When the corn had matured well past the dough stage, damage on each corn ear was assessed, without removing the ear from the stalk, by visually estimating the percent of the kernel surface which had been destroyed and rating it in one of 5 damage categories. Measurements (by row-centimeters) of the rows of kernels pecked by birds also were made on selected ears representing all categories and all parts of each field section. These measurements provided conversion factors that, when fed into a computer, were applied to the more than 72,000 visually assessed ears. The machine now had in its memory and could supply on demand a map showing each ear, its location and the intensity of the damage.
Abstract:
Contamination by butyltin compounds (BTs) has been reported in estuarine environments worldwide, with serious impacts on the biota of these areas. Considering that BTs can be degraded by varying environmental conditions such as incident light and salinity, short-term variations in such factors may lead to inaccurate estimates of BT concentrations in nature. Therefore, the present study aimed to evaluate the possibility that measurements of BTs in estuarine sediments are influenced by different sampling conditions, including period of the day (day or night), tidal zone (intertidal or subtidal), and tide (high or low). The study area is located on the Brazilian southeastern coast, São Vicente Estuary, at Pescadores Beach, where BT contamination was previously detected. Three replicate samples of surface sediment were collected randomly in each combination of period of the day, tidal zone, and tide condition, from three subareas along the beach, totaling 72 samples. BTs were analyzed by GC-PFPD using a tin filter and a VF-5 column, by means of a validated method. The concentrations of tributyltin (TBT), dibutyltin (DBT), and monobutyltin (MBT) ranged from undetectable to 161 ng Sn g^-1 (dry weight). In most samples (71%), only MBT was quantifiable, whereas TBT was measured in only 14, suggesting either old contamination or rapid degradation processes. DBT was found in 27 samples, but could be quantified in only one. MBT concentrations did not differ significantly with period of the day, tidal zone, or tide condition. DBT and TBT could not be compared under all these environmental conditions, because only a few samples were above the quantification limit. Pooled samples of TBT did not reveal any difference between day and night. These results indicated that, in assessing contamination by butyltin compounds, surface-sediment samples can be collected under any environmental conditions. However, the wide variation of BT concentrations in the study area, i.e., over a very small geographic scale, illustrates the need for representative hierarchical and composite sampling designs that are compatible with the multiscalar temporal and spatial variability common to most marine systems. The use of such sampling designs will be necessary for future attempts to quantitatively evaluate and monitor the occurrence and impact of these compounds in nature.
Abstract:
In this exploratory and descriptive research, we identified the meaning of religion and spirituality in the experience of patients at a public health service for treatment of HIV/AIDS in a Brazilian upcountry town. Eight participants were selected through theoretical sampling. Data were collected through semistructured interviews and analyzed by means of qualitative content analysis. The emerging themes were "religion: a path to support" and "God is everything". Religion, as a path that leads patients to different sources of support, included exploration of different churches, acknowledgment of guilt, finding strength to cope with the disease, rationalization of the disease process, meeting other churchgoers, and finding God and faith. God, an important source of support, was present in prayers, in the belief in healing through faith, and in the feeling of comfort and relief. Because spirituality and religion were seen as important sources of support, we recommend that health professionals include these aspects in care planning.
Abstract:
Recently, research has shown that the performance of metaheuristics can be affected by population initialization. Opposition-based Differential Evolution (ODE), Quasi-Oppositional Differential Evolution (QODE), and Uniform-Quasi-Opposition Differential Evolution (UQODE) are three state-of-the-art methods that improve the performance of the Differential Evolution algorithm based on population initialization and different search strategies. In a different approach to achieve similar results, this paper presents a technique to discover promising regions in a continuous search space of an optimization problem. Using machine-learning techniques, the algorithm named Smart Sampling (SS) finds regions with a high possibility of containing a global optimum. Next, a metaheuristic can be initialized inside each region to find that optimum. SS and DE were combined (originating the SSDE algorithm) to evaluate our approach, and experiments were conducted on the same set of benchmark functions used by the ODE, QODE and UQODE authors. Results have shown that the total number of function evaluations required by DE to reach the global optimum can be significantly reduced and that the success rate improves if SS is employed first. Such results are also consistent with the literature on the importance of an adequate starting population. Moreover, SS presents better efficacy in finding initial populations of superior quality when compared to the other three algorithms that employ oppositional learning. Finally, and most importantly, the SS performance in finding promising regions is independent of the metaheuristic with which SS is combined, making SS suitable to improve the performance of a large variety of optimization techniques.
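The two-stage idea (learn promising regions first, then run the metaheuristic inside them) can be sketched generically. This is not the authors' SS algorithm; it is a hedged illustration that clusters the best-scoring random samples with k-means and then runs SciPy's differential evolution inside a box around each centroid:

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)

def rastrigin(x):
    """Multimodal benchmark with global optimum 0 at the origin."""
    x = np.asarray(x)
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

dim, lo, hi = 5, -5.12, 5.12

# Step 1: coarse random sampling of the search space.
X = rng.uniform(lo, hi, size=(2000, dim))
f = np.apply_along_axis(rastrigin, 1, X)

# Step 2: keep the best 10% of samples and cluster them into candidate regions.
elite = X[np.argsort(f)[:200]]
centroids, _ = kmeans2(elite, k=3, seed=0, minit="points")

# Step 3: run DE inside a tight box around each promising centroid.
best = None
for c in centroids:
    bounds = [(max(lo, ci - 1.0), min(hi, ci + 1.0)) for ci in c]
    res = differential_evolution(rastrigin, bounds, seed=0, maxiter=200)
    if best is None or res.fun < best.fun:
        best = res

print(f"best value found: {best.fun:.4f} at {np.round(best.x, 3)}")
```

The design point the abstract makes carries over directly: step 3 works with any metaheuristic, since the region-discovery stage only constrains where the search starts.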
Abstract:
In this article we propose an efficient and accurate method for fault location in underground distribution systems by means of an Optimum-Path Forest (OPF) classifier. We applied the time-domain reflectometry method for signal acquisition, and the signals were further analyzed by OPF and several other well-known pattern recognition techniques. The results indicated that OPF and support vector machines outperformed artificial neural networks and a Bayesian classifier, but OPF was much more efficient than all the other classifiers for training, and the second fastest for classification.
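As a hedged illustration of the evaluation pipeline described (signal features in, fault class out, with accuracy and training time compared per classifier): OPF itself is not available in mainstream Python libraries, so scikit-learn's SVM and MLP stand in here, and the features are synthetic stand-ins for TDR reflectogram descriptors:

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for features extracted from TDR reflectograms,
# with three fault-location classes.
X, y = make_classification(n_samples=1000, n_features=20, n_classes=3,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Compare each classifier on accuracy and training time, as in the study.
for name, clf in [("SVM", SVC()),
                  ("MLP", MLPClassifier(max_iter=1000, random_state=0))]:
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)
    train_s = time.perf_counter() - t0
    acc = accuracy_score(y_te, clf.predict(X_te))
    print(f"{name}: accuracy = {acc:.3f}, training time = {train_s:.2f} s")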
Abstract:
Within-site variability in species detectability is a problem common to many biodiversity assessments and can strongly bias the results. Such variability can be caused by many factors, including simple counting inaccuracies, which can be solved by increasing sample size, or by temporal changes in species behavior, meaning that the way the temporal sampling protocol is designed is also very important. Here we use the example of mist-netted tropical birds to determine how design decisions in the temporal sampling protocol can alter the data collected and how these changes might affect the detection of ecological patterns, such as the species-area relationship (SAR). Using data from almost 3400 birds captured over 21,000 net-hours at 31 sites in the Brazilian Atlantic Forest, we found that the magnitude of ecological trends remained fairly stable, but the probability of detecting statistically significant ecological patterns varied depending on sampling effort, time of day and season in which sampling was conducted. For example, more species were detected in the wet season, but the SAR was strongest in the dry season. We found that the temporal distribution of sampling effort was more important than its total amount, discovering that similar ecological results could have been obtained with one-third of the total effort, as long as each site had been equally sampled over 2 yr. Projects with the same sampling effort and spatial design, but with different temporal sampling protocols, are therefore likely to report different ecological patterns, which may ultimately lead to inappropriate conservation strategies.
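The species-area relationship mentioned here is conventionally the power law S = cA^z, fitted on log-log axes. A minimal sketch with hypothetical site values (not the study's data):

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical forest-site areas (ha) and observed species richness.
area = np.array([1, 5, 10, 50, 100, 500, 1000])
richness = np.array([12, 18, 22, 35, 40, 58, 65])

# Power-law SAR: S = c * A^z  =>  log S = log c + z * log A.
fit = linregress(np.log(area), np.log(richness))
print(f"z = {fit.slope:.3f}, c = {np.exp(fit.intercept):.2f}, "
      f"R^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4f}")
```

The study's point is that the strength and significance of exactly this kind of fit can shift with the season and effort behind the richness counts, even when z itself stays fairly stable.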
Abstract:
Background: Air pollution in São Paulo is constantly being measured by the State of São Paulo Environmental Agency; however, there is no information on the variation between places with different traffic densities. This study was intended to identify a gradient of exposure to traffic-related air pollution within different areas in São Paulo, to provide information for future epidemiological studies. Methods: We measured NO2 using Palmes' diffusion tubes at 36 sites on streets chosen to be representative of different road types and traffic densities in São Paulo, in two one-week periods (July and August 2000). In each study period, two tubes were installed at each site, and two additional tubes were installed at 10 control sites. Results: Average NO2 concentrations were related to traffic density observed on the spot, to the number of vehicles counted, and to traffic density strata defined by the city Traffic Engineering Company (CET). Average NO2 concentrations were 63 μg/m3 and 49 μg/m3 in the first and second periods, respectively. Dividing the sites by observed traffic density, we found: heavy traffic (n = 17): 64 μg/m3 (95% CI: 59–68 μg/m3); local traffic (n = 16): 48 μg/m3 (95% CI: 44–52 μg/m3) (p < 0.001). Conclusion: The differences in NO2 levels between heavy- and local-traffic sites are large enough to suggest the use of a more refined classification of exposure in epidemiological studies in the city. The number of vehicles counted, the traffic density observed on the spot, and the traffic density strata defined by the CET might be used as proxies for traffic exposure in São Paulo when more accurate measurements are not available.
Abstract:
This thesis covers sampling and analytical procedures for isocyanates (R-NCO) and amines (R-NH2), two kinds of chemicals frequently used in association with the polymeric material polyurethane (PUR). Exposure to isocyanates may result in respiratory disorders and dermal sensitisation, and they are one of the main causes of occupational asthma. Several of the aromatic diamines associated with PUR production are classified as suspected carcinogens. Hence, the presence of these chemicals in different exposure situations must be monitored. In the context of determining isocyanates in air, the methodologies included derivatisation with the reagent di-n-butylamine (DBA) upon collection and subsequent determination using liquid chromatography (LC) and mass spectrometric detection (MS). A user-friendly solvent-free sampler for collection of airborne isocyanates was developed as an alternative to a more cumbersome impinger-filter sampling technique. The combination of the DBA reagent together with MS detection techniques revealed several new exposure situations for isocyanates, such as isocyanic acid during thermal degradation of PUR and urea-based resins. Further, a method for characterising isocyanates in technical products used in the production of PUR was developed. This enabled determination of isocyanates in air for which pure analytical standards are missing. Tandem MS (MS/MS) determination of isocyanates in air below 10^-6 of the threshold limit values was achieved. As for the determination of amines, the analytical methods included derivatisation into pentafluoropropionic amide or ethyl carbamate ester derivatives and subsequent MS analysis. Several amines in biological fluids, as markers of exposure for either the amines themselves or the corresponding isocyanates, were determined by LC-MS/MS at amol level. In aqueous extraction solutions of flexible PUR foam products, toluene diamine and related compounds were found. In conclusion, this thesis demonstrates the usefulness of well characterised analytical procedures and techniques for determination of hazardous compounds. Without reliable and robust methodologies there is a risk that exposure levels will be underestimated or, even worse, that relevant compounds will be completely missed.
Abstract:
Path planning is a discipline of robotics that deals with the search for feasible or optimal paths. For most vehicles and environments it is not a trivial problem, and consequently there is a wide variety of algorithms to solve it, not only in robotics and artificial intelligence but also in the optimization literature, with numerical methods and bio-inspired algorithms such as genetic algorithms and ant colony optimization. The particular case of variable-cost scenarios is considerably difficult to address because the environment in which the vehicle moves changes over time. This thesis studies this problem and proposes several practical solutions for underwater robotics applications.
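As a generic illustration of the variable-cost setting (not the specific algorithms this thesis proposes), a shortest path can be computed with Dijkstra over (cell, time) states, so that the traversal cost may change while the vehicle moves:

```python
import heapq

def plan(grid_cost, start, goal):
    """Dijkstra over (cell, time) states: grid_cost(cell, t) returns the
    cost of entering `cell` at time step t, so costs may vary with time."""
    frontier = [(0.0, start, 0)]           # (accumulated cost, cell, time)
    best = {(start, 0): 0.0}
    while frontier:
        cost, cell, t = heapq.heappop(frontier)
        if cell == goal:
            return cost
        x, y = cell
        for nxt in [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]:
            step = grid_cost(nxt, t + 1)
            if step is None:               # blocked or out-of-bounds cell
                continue
            state = (nxt, t + 1)
            new_cost = cost + step
            if new_cost < best.get(state, float("inf")):
                best[state] = new_cost
                heapq.heappush(frontier, (new_cost, nxt, t + 1))
    return None

# Toy 10x10 world: a high-cost column (e.g. an adverse current) drifts over time.
def cost(cell, t):
    x, y = cell
    if not (0 <= x < 10 and 0 <= y < 10):
        return None
    return 10.0 if x == (t % 10) else 1.0

print(plan(cost, start=(0, 0), goal=(9, 9)))
```

Expanding the state with time is the standard trick here: it keeps the graph static even though the costs the vehicle experiences are not.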
Abstract:
This thesis deals with visual servoing and its closely connected disciplines: projective geometry, image processing, robotics and non-linear control. More specifically, the work addresses the problem of controlling a robotic manipulator through one of the most widely used visual servoing techniques, Image Based Visual Servoing (IBVS). In IBVS the robot is driven by an on-line feedback control loop that is closed directly in the 2D space of the camera sensor. The work considers the case of a monocular system with a single camera mounted on the robot end effector (eye-in-hand configuration). Through IBVS the system can be positioned with respect to a fixed 3D target by minimizing the differences between its initial view and its goal view, corresponding respectively to the initial and goal system configurations: the robot's Cartesian motion is thus generated only by means of visual information. However, executing a positioning control task by IBVS is not straightforward, because singularity problems may occur and local minima may be reached in which the current image is very close to the target one but the 3D positioning task is far from fulfilled: this happens in particular for large camera displacements, when the initial and goal target views are noticeably different. To overcome the singularity and local-minima drawbacks while maintaining the good robustness properties of IBVS with respect to modeling and camera calibration errors, suitable image path planning can be exploited. This work deals with the problem of generating suitable image-plane trajectories for the tracked points of the servoing control scheme (a trajectory consists of a path plus a time law). The generated image-plane paths must be feasible, i.e., compliant with the rigid-body motion of the camera with respect to the object, so as to avoid image-Jacobian singularities and local-minima problems. In addition, the planned image trajectories must generate camera velocity screws that are smooth and within the allowed bounds of the robot. We show that a scaled 3D motion-planning algorithm can be devised to generate feasible image-plane trajectories. Since the paths in the image are generated off-line, it is also possible to tune the planning parameters so as to keep the target inside the camera field of view even if, in some unfortunate cases, the target feature points would otherwise leave the camera image due to 3D robot motions. To test the validity of the proposed approach, both experimental and simulation results are reported, also taking into account the influence of noise on the path-planning strategy. The experiments were carried out with a 6-DOF anthropomorphic manipulator with a FireWire camera installed on its end effector: the results demonstrate the good performance and feasibility of the proposed approach.
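For context, the classical IBVS control law that underlies schemes like this one (the standard textbook form, not the thesis's planner) computes the camera velocity screw as v = -λ L⁺ e, where L stacks the interaction matrices of the tracked point features and e is the feature error. A minimal sketch with hypothetical feature coordinates:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of a normalized image point
    (x, y) with depth Z, relating the camera twist to the feature velocity."""
    return np.array([
        [-1 / Z,      0, x / Z,      x * y, -(1 + x**2),  y],
        [     0, -1 / Z, y / Z, 1 + y**2,       -x * y, -x],
    ])

def ibvs_velocity(features, goals, depths, lam=0.5):
    """Classical IBVS law: v = -lambda * pinv(L) * e, one 2x6 block per point."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features) - np.asarray(goals)).ravel()
    return -lam * np.linalg.pinv(L) @ e

# Four tracked points, slightly displaced from their goal positions.
feats = [(0.1, 0.1), (-0.1, 0.1), (-0.1, -0.1), (0.1, -0.1)]
goals = [(0.12, 0.1), (-0.1, 0.12), (-0.12, -0.1), (0.1, -0.12)]
v = ibvs_velocity(feats, goals, depths=[1.0] * 4)
print("camera velocity screw (vx, vy, vz, wx, wy, wz):", np.round(v, 4))
```

The singularity and local-minima issues the abstract describes arise precisely when the stacked L loses rank or pinv(L) maps a nonzero image error to a (near-)zero camera velocity, which is what the planned image trajectories are designed to avoid.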
Abstract:
A path integral simulation algorithm which includes a higher-order Trotter approximation (HOA) is analyzed and compared to an approach which includes the correct quantum mechanical pair interaction (effective propagator (EPr)). It is found that the HOA algorithm converges to the quantum limit with increasing Trotter number P as P^-4, while the EPr algorithm converges as P^-2. The convergence rate of the HOA algorithm is analyzed for various physical systems such as a harmonic chain, a particle in a double-well potential, gaseous argon, gaseous helium and crystalline argon. A new expression for the estimator for the pair correlation function in the HOA algorithm is derived. A new path integral algorithm, the hybrid algorithm, is developed. It combines an exact treatment of the quadratic part of the Hamiltonian and the higher-order Trotter expansion techniques. For the discrete quantum sine-Gordon chain (DQSGC), it is shown that this algorithm works more efficiently than all other improved path integral algorithms discussed in this work. The new simulation techniques developed in this work allow the analysis of the DQSGC and disordered model systems in the highly quantum mechanical regime using path integral molecular dynamics (PIMD) and adiabatic centroid path integral molecular dynamics (ACPIMD). The ground state phonon dispersion relation is calculated for the DQSGC by the ACPIMD method. It is found that the excitation gap at zero wave vector is reduced by quantum fluctuations. Two different phases exist: one phase with a finite excitation gap at zero wave vector, and a gapless phase where the excitation gap vanishes. The reaction of the DQSGC to an external driving force is analyzed at T=0. In the gapless phase the system creeps if a small force is applied, and in the phase with a gap the system is pinned. At a critical force, the systems undergo a depinning transition in both phases and flow is induced. The analysis of the DQSGC is extended to models with disordered substrate potentials. Three different cases are analyzed: disordered substrate potentials with roughness exponent H=0, H=1/2, and a model with disordered bond length. For all models, the ground state phonon dispersion relation is calculated.
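For context, the two factorizations behind those convergence rates can be written in their standard forms (our notation; the HOA variant analyzed in the thesis may differ in detail): the primitive Trotter break-up, with O(P^-2) error, and the Takahashi-Imada effective potential, with O(P^-4) error:

```latex
e^{-\beta H} \approx \left( e^{-\frac{\beta}{P} T}\, e^{-\frac{\beta}{P} V} \right)^{P}
  \quad \text{(primitive, error } \mathcal{O}(P^{-2})\text{)},
\qquad
V_{\mathrm{eff}} = V + \frac{1}{24} \left( \frac{\beta \hbar}{P} \right)^{2}
  \frac{|\nabla V|^{2}}{m}
  \quad \text{(Takahashi--Imada, error } \mathcal{O}(P^{-4})\text{)}.
```

The practical payoff is that a higher-order scheme reaches a given accuracy with far fewer Trotter slices P, which is what makes the convergence comparison across the model systems above meaningful.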