999 results for line intersect sampling
Abstract:
Distance sampling using line transects had not previously been used or tested for estimating koala abundance. In July 2001, a pilot survey was conducted to compare line transects with strip transects for estimating koala abundance. Both methods provided similar density estimates. On the basis of the pilot survey, the distribution and abundance of koalas in the Pine Rivers Shire, south-east Queensland, were determined using line-transect sampling. In total, 134 lines (total length 64 km) were used to sample bushland areas. Eighty-two independent koalas were sighted. Analysis of the frequency distribution of sighting distances with the software program DISTANCE enabled a global detection function to be estimated for survey sites in bushland areas across the Shire. Abundance in urban parts of the Shire was estimated from densities obtained by total counts at eight urban sites ranging from 26 to 51 ha in size. Koala abundance in the Pine Rivers Shire was estimated at 4584 (95% confidence interval, 4040-5247). Line-transect sampling is a useful method for estimating koala abundance, provided experienced koala observers conduct the surveys.
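The abstract does not spell out the estimator behind the DISTANCE analysis; purely as an illustration, the sketch below implements the conventional line-transect density estimator assuming a half-normal detection function, one of the standard models DISTANCE can fit. The function name and inputs are hypothetical, not taken from the study.

```python
import math

def halfnormal_density_estimate(distances_m, total_line_length_m):
    """Estimate density from perpendicular sighting distances (illustrative).

    Assumes a half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)).
    Its maximum-likelihood estimate is sigma^2 = mean(x^2), and the effective
    strip half-width is ESW = integral of g over [0, inf) = sigma*sqrt(pi/2).
    """
    n = len(distances_m)
    sigma2 = sum(x * x for x in distances_m) / n
    esw = math.sqrt(sigma2) * math.sqrt(math.pi / 2.0)
    # Animals per square metre: n sightings over a strip of width 2*ESW.
    return n / (2.0 * total_line_length_m * esw)
```

A survey with wider scatter in the sighting distances yields a larger effective strip width and hence a lower density for the same number of sightings.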
Abstract:
Killer whale (Orcinus orca Linnaeus, 1758) abundance in the North Pacific is known only for a few populations for which extensive longitudinal data are available, with little quantitative data from more remote regions. Line-transect ship surveys were conducted in July and August of 2001–2003 in coastal waters of the western Gulf of Alaska and the Aleutian Islands. Conventional and Multiple Covariate Distance Sampling methods were used to estimate the abundance of different killer whale ecotypes, which were distinguished based upon morphological and genetic data. Abundance was calculated separately for two data sets that differed in the method by which killer whale group size data were obtained. Initial group size (IGS) data corresponded to estimates of group size at the time of first sighting, and post-encounter group size (PEGS) corresponded to estimates made after closely approaching sighted groups.
Abstract:
Estimates of greenhouse-gas emissions from deforestation are highly uncertain because of high variability in key parameters and because of the limited number of studies providing field measurements of these parameters. One such parameter is burning efficiency, which determines how much of the original forest's aboveground carbon stock will be released in the burn, as well as how much will later be released by decay and how much will remain as charcoal. In this paper we examined the fate of biomass from a semideciduous tropical forest in the "arc of deforestation," where clearing activity is concentrated along the southern edge of the Amazon forest. We estimated carbon content, charcoal formation and burning efficiency by direct measurements (cutting and weighing) and by line-intersect sampling (LIS) along the axis of each plot before and after burning of the felled vegetation. The total aboveground dry biomass found here (219.3 Mg ha^-1) is lower than the values found in studies in other parts of the Amazon region. Values for burning efficiency (65%) and charcoal formation (6.0%, or 5.98 Mg C ha^-1) were much higher than those found in past studies in tropical areas. The percentage of trunk biomass lost in burning (49%) was substantially higher than found in previous studies. This difference may be explained by the concentration of more stems in the smaller diameter classes and the low humidity of the fuel (the dry season was unusually long in 2007, the year of the burn). This study provides the first measurements of forest burning parameters for a group of forest types that is now undergoing rapid deforestation. The burning parameters estimated here indicate substantially higher burning efficiency than has been found in other Amazonian forest types. Quantification of burning efficiency is critical to estimates of trace-gas emissions from deforestation. (C) 2009 Elsevier B.V. All rights reserved.
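The abstract does not state which LIS estimator was applied to the felled material. A common choice for line-intersect sampling of downed wood is Van Wagner's formula, sketched below under that assumption (the function and parameter names are illustrative, not from the paper):

```python
import math

def lis_volume_per_hectare(diameters_cm, transect_length_m):
    """Van Wagner-style line-intersect estimate of downed-wood volume.

    V = pi^2 * sum(d_i^2) / (8 * L), where d_i is the diameter of each piece
    at the point where the transect crosses it, and L is transect length.
    With d in metres and L in metres this gives volume per square metre;
    multiplying by 10_000 converts to m^3 per hectare.
    """
    sum_d2 = sum((d / 100.0) ** 2 for d in diameters_cm)  # cm -> m, squared
    v_per_m2 = math.pi ** 2 * sum_d2 / (8.0 * transect_length_m)
    return v_per_m2 * 10_000.0
```

Running LIS along the same axis before and after the burn, as the authors describe, lets the difference in such estimates quantify the biomass consumed.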
Abstract:
In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots; however, the root properties measured with ImageJ and Safira are presumed to differ. This study compared values of root length and surface area obtained with these public-domain systems against values obtained by a reference method. Root samples were collected in a banana plantation, in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths with five replications. Root images were digitized, and ImageJ and Safira were used to determine root length and surface area. The line-intersect method modified by Tennant was used as the reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by confidence interval and t-test. Both ImageJ and Safira showed positive correlations with the reference method for root length and surface area in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of the surface area of roots sampled in CXve. The 95 % confidence interval revealed that root length measurements with Safira did not differ from those of the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) or in CXve (-4231 to 612.1 mm²). However, measurements with ImageJ differed from those obtained by the reference method, underestimating length and surface area in samples collected in CXk and CXve. Both ImageJ and Safira allow identification of increases or decreases in root length and surface area.
However, Safira results for root length and surface area are closer to the results obtained with the reference method.
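As a point of reference, the Tennant-modified line-intersect method used here as the benchmark reduces to a simple count-times-grid-unit formula: root length is approximately 11/14 of the number of root-gridline intersections multiplied by the grid unit. A minimal sketch (the function name is illustrative):

```python
def tennant_root_length(intersections, grid_unit_mm):
    """Tennant's modification of the line-intersect method for root length.

    Root length ~= (11/14) * N * G, where N is the number of intersections
    between roots and the grid lines, and G is the side of a grid square.
    Returns length in the same unit as grid_unit_mm.
    """
    return 11.0 / 14.0 * intersections * grid_unit_mm
```

For example, 140 intersections counted on a 10 mm grid correspond to roughly 1100 mm of root length.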
Abstract:
Optimization and control of a chemical process are strongly correlated with the quantity of information that can be obtained from the system. In biotechnological processes, where the transforming agent is a cell, many variables can interfere with the process, leading to changes in the microorganism's metabolism and affecting the quantity and quality of the final product. Therefore, continuous monitoring of the variables that interfere with the bioprocess is crucial in order to act on certain variables of the system and keep it under desirable operational conditions and control. In general, during a fermentation process, the analysis of important parameters such as substrate, product and cell concentrations is done off-line, requiring sampling, pretreatment and analytical procedures; these steps demand significant run time and the use of high-purity chemical reagents. In order to implement a real-time monitoring system for a benchtop bioreactor, this study was conducted in two steps: (i) development of software providing a communication interface between bioreactor and computer, based on the acquisition and recording of process variables, namely pH, temperature, dissolved oxygen, level, foam level, agitation frequency and the input setpoints of the operational parameters of the bioreactor control unit; (ii) development of an analytical method using near-infrared spectroscopy (NIRS) to enable monitoring of substrate, product and cell concentrations during a fermentation process for ethanol production using the yeast Saccharomyces cerevisiae. Three fermentation runs (F1, F2 and F3) were monitored by NIRS, with subsequent sampling for analytical characterization. The data obtained were used for calibration and validation, with pre-treatments, combined or not with smoothing filters, applied to the spectral data.
The most satisfactory results were obtained when the calibration models were constructed from real samples of culture medium removed from fermentation assays F1, F2 and F3, showing that the analytical method based on NIRS can be used as a fast and effective way to quantify cell, substrate and product concentrations, which enables the implementation of in situ real-time monitoring of fermentation processes.
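The abstract mentions spectral pre-treatments combined with smoothing filters but does not name them. One widely used pretreatment in NIRS calibration is the Standard Normal Variate (SNV) transform, sketched here purely as an illustration of the kind of step involved (it is an assumption that SNV was among those tested):

```python
def snv(spectrum):
    """Standard Normal Variate pretreatment of a single spectrum.

    Centres the spectrum on its own mean and scales by its own standard
    deviation, reducing baseline shifts and multiplicative scatter effects
    before a calibration model is fitted.
    """
    n = len(spectrum)
    mean = sum(spectrum) / n
    var = sum((x - mean) ** 2 for x in spectrum) / (n - 1)
    sd = var ** 0.5
    return [(x - mean) / sd for x in spectrum]
```

Each spectrum is transformed independently, so the pretreatment needs no reference spectrum from the calibration set.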
Abstract:
Quirimbas National Park (PNQ) is the first and only area with National Park status in northern Mozambique. The need to study medium- and large-sized mammals made this work the first study dedicated to primate density in the south of the PNQ. From October 2014 to January 2015, censuses using line-transect distance sampling were carried out in two areas of the southern PNQ, Taratibu and Mareja. The estimated densities of each of the species studied, Papio cynocephalus, Cercopithecus albogularis and Chlorocebus pygerythrus, proved to be higher in Mareja than in Taratibu, an area characterized by the presence of inselbergs. Over the two study areas combined (southern PNQ), the estimated density was 12.4855 individuals/km2 for P. cynocephalus, 2.8700 individuals/km2 for C. albogularis and 0.5172 individuals/km2 for C. pygerythrus. These estimates are higher than those of the only general study of mammal density available for the PNQ. Since the method used in the present study made direct sighting of primates easier and yielded a larger number of individuals sighted, we conclude that these density estimates may be more reliable than those previously obtained for this area. Compared with other studies, the densities obtained are higher for P. cynocephalus, which may make the PNQ an invaluable area of great importance for the conservation of this species, and lower for C. albogularis, which may indicate a decline in this population. As for C. pygerythrus, although the number of sightings was small, the density estimate obtained follows the population's trend of stability.
Abstract:
We present a brief historical note on the evolution of line-transect sampling and its theoretical developments. We describe line-transect sampling theory as proposed by Buckland (1992) and present the most relevant issues in modelling the detection function. We present a description of the CDM principle (Rissanen, 1978) and its application to histogram density estimation (Kontkanen and Myllymäki, 2006), with a practical example using a mixture of densities. We proceed with the application, estimating the probability of detection and animal population density in the context of line-transect sampling. Two classical examples from the distance-sampling literature are analyzed and compared. In order to evaluate the proposed methodology, we carry out a simulation study based on the wooden-stakes example, using the half-normal, hazard-rate, exponential and uniform-with-cosine detection functions. The results were obtained using the program DISTANCE (Thomas et al., in press) and an algorithm written in the C language, kindly offered by Professor Petri Kontkanen (Department of Computer Science, University of Helsinki). We developed programs to estimate confidence intervals using the bootstrap technique (Efron, 1978). Finally, the results are discussed and suggestions for future developments are presented.
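The percentile bootstrap mentioned above can be sketched in a few lines. This is a generic illustration (the names, resample count and seed are arbitrary), not the thesis's actual program:

```python
import random

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval (Efron).

    Resample the data with replacement, recompute the statistic on each
    resample, and take the empirical alpha/2 and 1-alpha/2 quantiles of
    the resulting distribution as the interval endpoints.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    stats = sorted(
        statistic([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

The same resampling scheme applies whether the statistic is a sample mean or a full density estimate evaluated through the detection-function fit.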
Abstract:
One of the key aspects in 3D-image registration is the computation of the joint intensity histogram. We propose a new approach to compute this histogram using uniformly distributed random lines to sample stochastically the overlapping volume between two 3D-images. The intensity values are captured from the lines at evenly spaced positions, taking an initial random offset different for each line. This method provides us with an accurate, robust and fast mutual information-based registration. The interpolation effects are drastically reduced, due to the stochastic nature of the line generation, and the alignment process is also accelerated. The results obtained show that the introduced method performs better than the classic computation of the joint histogram.
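As an illustration of the second half of this pipeline, mutual information can be computed from a joint intensity histogram built over corresponding sample positions. In this sketch plain intensity pairs stand in for samples taken along random lines through the overlap; the bin count, intensity range and names are assumptions, not the authors' parameters:

```python
import math
from collections import Counter

def mutual_information(pairs, bins=8, vmax=256):
    """Mutual information of two intensity streams sampled at corresponding
    positions (e.g. along random lines through the overlap of two images).

    Builds a joint histogram over `bins` x `bins` cells for intensities in
    [0, vmax), derives the marginals, and sums p(i,j)*log2(p(i,j)/(p(i)p(j))).
    """
    scale = bins / vmax
    joint = Counter((int(a * scale), int(b * scale)) for a, b in pairs)
    n = len(pairs)
    px = Counter()
    py = Counter()
    for (i, j), c in joint.items():
        px[i] += c
        py[j] += c
    mi = 0.0
    for (i, j), c in joint.items():
        pij = c / n
        # pij / (p(i) * p(j)) == c * n / (px[i] * py[j])
        mi += pij * math.log2(c * n / (px[i] * py[j]))
    return mi
```

When the two streams are identical, the mutual information equals the entropy of the binned marginal, which is its maximum for that binning.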
Diagnostic errors and repetitive sequential classifications in on-line process control by attributes
Abstract:
The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting the m-th item (a single item) for every m produced items and deciding, at each inspection, whether the fraction of conforming items has been reduced. If the inspected item is non-conforming, production is stopped for adjustment. As the inspection system can be subject to diagnosis errors, a probabilistic model is developed that classifies the examined item repeatedly until a conforming or b non-conforming classifications are observed; whichever event occurs first (a conforming classifications or b non-conforming classifications) determines the final classification of the examined item. Properties of an ergodic Markov chain were used to obtain an expression for the average cost of the control system, which can be optimized over three parameters: the sampling interval of the inspections (m); the number of repeated conforming classifications (a); and the number of repeated non-conforming classifications (b). The optimum design is compared with two alternative approaches. The first is a simple preventive policy: the production system is adjusted after every n produced items, with no inspection. The second classifies the examined item a fixed number of times, r, and considers it conforming if the majority of the classifications are conforming. Results indicate that the current proposal performs better than the procedure that fixes the number of repeated classifications and applies a majority rule. On the other hand, depending on the degree of errors and the costs, the preventive policy can, on average, be more economical than the alternatives that require inspection. A numerical example illustrates the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
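The race between a conforming and b non-conforming classifications can be illustrated with a short recursion. This sketch (hypothetical names) computes only the final-classification probability for one item, not the paper's full Markov-chain cost model:

```python
from functools import lru_cache

def prob_declared_conforming(p, a, b):
    """Probability that `a` 'conforming' results occur before `b`
    'non-conforming' results, when each independent repeated classification
    comes out conforming with probability p (p embeds the diagnosis errors
    and the item's true state).
    """
    @lru_cache(maxsize=None)
    def f(i, j):
        # i conforming and j non-conforming classifications observed so far.
        if i == a:
            return 1.0  # race won: item finally declared conforming
        if j == b:
            return 0.0  # race lost: item finally declared non-conforming
        return p * f(i + 1, j) + (1 - p) * f(i, j + 1)

    return f(0, 0)
```

With a = b = 1 the rule collapses to a single classification, so the probability is just p; raising a makes the final "conforming" verdict harder to reach and thus guards against false-conforming diagnosis errors.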
Abstract:
Objectives: Chorionic villus sampling (CVS) has several advantages over amniocentesis: it may be performed at an earlier gestational age, the results are quicker to obtain, and there is a lower miscarriage risk (about 1%). However, the higher prevalence of discrepant karyotype findings between fetal and villus sampling material (about 0.5%) is a disadvantage of this technique. This is caused, among other causes, by placental mosaicism, which consists of two genetically different cell lines. There are three types of placental mosaicism, according to the location of the abnormal cell line: Type I, in the cytotrophoblast; Type II, in the villus stroma; Type III, in both of the above locations. Material and Methods: We present a case report of a 36-year-old pregnant woman going through our Department's 1st-trimester combined screening program; a CVS was performed, which showed confined placental mosaicism (CPM). Results and Conclusion: Although the pregnant woman was in the low-risk group for aneuploidy, she wanted the cytogenetic study to be performed in order to reduce maternal anxiety. CVS was performed at a gestational age of 12 weeks + 5 days, and the karyotype was 47,XY,+2/46,XY. For the correct interpretation of this result, an amniocentesis was performed at a gestational age of 15 weeks + 6 days, which showed a 46,XY karyotype. We therefore concluded that the cytogenetic result of the CVS was due to a CPM. A careful follow-up, including fetal echocardiogram and serial ultrasonographic monitoring, was used to safely exclude malformations and fetal growth restriction. No complications occurred throughout pregnancy, delivery or the perinatal period. CVS practice was recently implemented in our country and has many advantages over amniocentesis.
Besides the fact that an earlier gestational age usually means less affective bonding to the fetus, and therefore makes medical termination of pregnancy somewhat less difficult, one should consider specific situations like the one reported, in which CPM may be diagnosed. This condition is associated with an increased risk of fetal growth restriction, so the clinician should be aware of the need for a more careful follow-up, since perinatal complications, which should be anticipated and treated, can be expected in 16-21% of these cases.
Abstract:
The aim of this study is to investigate the influence of unusual writing positions on a person's signature, in comparison with a standard writing position. Ten writers were asked to sign their signature six times in each of four different writing positions, including the standard one. To take day-to-day variation into consideration, this process was repeated over 12 sessions, giving a total of 288 signatures per subject. The signatures were collected simultaneously in off-line and on-line acquisition modes, using an interactive tablet and a ballpoint pen. Unidimensional variables (height-to-width ratio; time with or without in-air displacement) and time-dependent variables (pressure; X and Y coordinates; altitude and azimuth angles) were extracted from each signature. For the unidimensional variables, the position effect was assessed through ANOVA and Dunnett contrast tests. For the time-dependent variables, the signatures were compared using dynamic time warping, and the position effect was evaluated through classification by linear discriminant analysis. Both types of variables yielded similar results: no general tendency regarding the position factor could be highlighted. The influence of the position factor varies according to the subject as well as the variable studied. The impact of the session factor was shown to mask the impact that could be ascribed to the writing-position factor; indeed, day-to-day variation has a greater effect on the studied signature variables than the position factor. The results of this study suggest guidelines for best practice in the area of signature comparison and demonstrate the importance of a signature collection procedure covering an adequate number of sampling sessions, with a sufficient number of samples per session.
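The dynamic time warping comparison mentioned above can be sketched as follows. This is a textbook unit-weight DTW on a single time-dependent variable (e.g. a pressure profile), not the authors' exact implementation:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Fills the classic (n+1) x (m+1) cost matrix where each cell adds the
    local cost |a[i]-b[j]| to the cheapest of the three admissible
    predecessor cells (match, insertion, deletion).
    """
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Because the warping path can stretch one sequence against the other, two signatures written at different speeds but with the same shape of pressure profile yield a small distance.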
Abstract:
The aim of this work is to present a new concept, called on-line desorption of dried blood spots (on-line DBS), allowing the direct analysis of a dried blood spot coupled to a liquid chromatography-mass spectrometry (LC/MS) device. The system is based on a stainless-steel cell which receives a blood sample (10 microL) previously spotted on filter paper. The cell is then integrated into the LC/MS system, where the analytes are desorbed from the paper towards a column-switching system ensuring purification and separation of the compounds before their detection on a single-quadrupole MS coupled to an atmospheric pressure chemical ionisation (APCI) source. With this procedure no pretreatment is necessary, even though the analysis is based on a whole-blood sample. To demonstrate the applicability of the concept, saquinavir, imipramine and verapamil were chosen. Despite the use of a small sampling volume and a single-quadrupole detector, on-line DBS allowed the analysis of these three compounds over their therapeutic concentration ranges, from 50 to 500 ng/mL for imipramine and verapamil and from 100 to 1000 ng/mL for saquinavir. Moreover, the method showed good repeatability, with a relative standard deviation (RSD) lower than 15% at two concentration levels (low and high). Response functions were found to be linear over the therapeutic range for each compound and were used to determine the concentrations in real patient samples for saquinavir. Comparison of the values obtained with those of a validated method used routinely in a reference laboratory showed good correlation between the two methods. Moreover, good selectivity was observed, ensuring that no endogenous or chemical components interfered with the quantitation of the analytes. This work demonstrates the feasibility and applicability of the on-line DBS procedure for bioanalysis.
Abstract:
In this paper, we present view-dependent, information-theoretic quality measures for pixel sampling and scene discretization in flatland. The measures are based on a definition of the mutual information of a line and have a purely geometrical basis. Several algorithms exploiting them are presented and compare well with an existing one based on depth differences.