967 results for Capture-recapture Data


Relevance:

30.00%

Publisher:

Abstract:

Determining with good accuracy the position of a mobile terminal immersed in an indoor environment (shopping centres, office buildings, airports, stations, tunnels, etc.) is the cornerstone on which a large number of applications and services rest. Many of those services are already available in outdoor environments, although indoor environments lend themselves to other services specific to them. Their number, however, could be significantly greater than it currently is if positioning with the accuracy required by each hypothetical service did not demand a costly infrastructure; or, equally, if that infrastructure could serve other purposes besides positioning. Usability of the same infrastructure for other ends would offer the chance of it already being present at the different locations, because it had previously been deployed for those other uses, or would ease its deployment, because the cost of that operation would yield a greater return on usability for whoever undertakes it. Wireless radio-frequency communication technologies already in use for voice and data (mobile, WLAN, etc.) meet this requirement and would therefore foster the growth of positioning-based applications and services if they could be employed for that purpose. However, determining position with an adequate level of accuracy by means of these technologies is a major challenge today. The present work aims to contribute significant advances in this field. It begins with a study of the main positioning algorithms and auxiliary techniques applicable in indoor environments, focusing the review on those suited both to last-generation mobile technologies and to WLAN environments. The aim is to highlight the advantages and drawbacks of each of these algorithms, with their applicability both to 3G and 4G mobile networks (especially LTE femtocells and small cells) and to the WLAN environment as the final motivation, always bearing in mind that the ultimate goal is their use indoors. The main conclusion of that review is that triangulation techniques, commonly employed for localisation in outdoor environments, prove useless indoors owing to adverse effects inherent to this type of environment, such as the loss of line of sight or multipath propagation of the signal. Radio fingerprinting methods, which compare the signal-strength values being received by a mobile terminal at positioning time against the values recorded in a radio power map built during an initial calibration phase, emerge as the best of the available options for indoor scenarios. However, these systems are also affected by other problems, such as the considerable work required to bring them into operation, and the variability of the channel.
To address these, the present work introduces two original contributions that improve fingerprinting-based systems. The first describes a simple method for determining the basic characteristics of the system: the number of samples needed to create the reference radio map, together with the minimum number of radio-frequency emitters to deploy; all derived from initial requirements on the positioning error and accuracy sought, combined with the dimensions and physical reality of the environment. This establishes initial guidelines for dimensioning the system and counters the negative effects on the cost or performance of the system as a whole that stem from an inefficient deployment of the radio-frequency emitters and of the fingerprint capture points. The second contribution increases the real-time accuracy of the system through a technique for automatic recalibration of the radio power map. This technique takes into account the measurements continuously reported by a few static reference points, strategically distributed throughout the environment, to recompute and update the power values recorded in the radio map. An additional operational benefit of this technique is that it prolongs the period over which the system remains reliably usable, reducing how often the complete radio power map must be recaptured. The improvements described are directly applicable to indoor positioning mechanisms based on the wireless voice and data communications infrastructure. From there, they extend to location services (knowing where one is oneself), monitoring (a third party knowing that location) and tracking (monitoring prolonged over time), since all of these rely on correct positioning for proper performance. ABSTRACT To find the position where a mobile terminal is located with good accuracy, when it is immersed in an indoor environment (shopping centres, office buildings, airports, stations, tunnels, etc.), is the cornerstone on which a large number of applications and services are supported. Many of these services are already available in outdoor environments, although indoor environments are suitable for other services specific to them. That number, however, could be significantly higher than it is now if an expensive infrastructure were not required to perform positioning with adequate precision for each one of the hypothetical services; or, equally, if that infrastructure could have other uses beyond those associated with positioning. The usability of the same infrastructure for purposes other than positioning would give the opportunity of having it already available in the different locations, because it was previously deployed for those other uses, or would facilitate its deployment, because the cost of that operation would offer a higher return on usability for the deployer.
Wireless technologies based on radio communications, already in use for voice and data (mobile, WLAN, etc.), meet the requirement of additional usability and could therefore facilitate the growth of applications and services based on positioning, if they can be used for that purpose. However, determining the position with the appropriate degree of accuracy using these technologies is a major challenge today. This work provides significant advances in this field. First, a study of the main algorithms and auxiliary techniques related to indoor positioning is carried out. The review focuses on those suitable for use both with last-generation mobile technologies and in WLAN environments, in order to highlight the advantages and disadvantages of each of these algorithms, having as final motivation their applicability both in the world of 3G and 4G mobile networks (especially LTE femtocells and small cells) and in the WLAN world, and always keeping in mind that the final aim is indoor use. The main conclusion of that review is that triangulation techniques, commonly used for localisation in outdoor environments, are useless in indoor environments owing to adverse effects such as the lack of line of sight or multipath. Fingerprinting methods, based on comparing the received-signal-strength values measured by the mobile terminal against a radio map of RSSI values recorded during a calibration phase, emerge as the best methods for indoor scenarios. However, these systems are also affected by other problems, for example the substantial amount of work needed to make the system ready to operate, and the variability of the channel. To address these, this work presents two original contributions to improve fingerprinting-based systems. The first describes a method for finding, in a simple way, the basic characteristics of the system: the number of samples needed to create the reference radio map, and the minimum number of radio-frequency emitters that need to be deployed; both derived from initial requirements on the positioning error and accuracy sought, combined with the dimensions and physical reality of the environment. Thus, initial guidelines for dimensioning the system are put in place, and the negative effects on the cost or performance of the whole system caused by an inefficient deployment of the radio-frequency emitters and of the radio-map capture points are minimised. The second contribution increases the resulting accuracy of the system in real time, thanks to a technique for automatic recalibration of the power measurements stored in the radio map. This technique takes into account the measurements continuously reported by a few static reference points, strategically distributed in the environment, to recalculate and update the values stored in the radio map. An additional operational benefit of this technique is the extension of the system's reliable lifetime, decreasing how often the full radio map must be recaptured.
The above-mentioned improvements are directly applicable to indoor positioning mechanisms based on the wireless voice and data communications infrastructure. From there, they are also extensible and applicable to location services (personal knowledge of where oneself is), monitoring (knowledge of that location by third parties) and tracking (monitoring prolonged over time), since all of these rely on correct positioning for proper performance.
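At its core, the fingerprinting approach described above reduces to a nearest-neighbour search in signal space. The sketch below is a minimal illustration of that idea on made-up numbers, not the system proposed in the thesis; the map layout, emitter count and choice of k are all hypothetical.

```python
import numpy as np

# Hypothetical calibration radio map: each row is a survey point with
# known (x, y) coordinates and the RSSI (dBm) heard from three emitters.
radio_map_xy = np.array([[0.0, 0.0], [0.0, 5.0], [5.0, 0.0], [5.0, 5.0]])
radio_map_rssi = np.array([
    [-40.0, -65.0, -70.0],
    [-62.0, -42.0, -71.0],
    [-63.0, -66.0, -45.0],
    [-70.0, -60.0, -52.0],
])

def knn_fingerprint_position(observed_rssi, k=3):
    """Estimate position as the centroid of the k calibration points
    whose stored fingerprints are closest (Euclidean) in signal space."""
    dists = np.linalg.norm(radio_map_rssi - observed_rssi, axis=1)
    nearest = np.argsort(dists)[:k]
    return radio_map_xy[nearest].mean(axis=0)

print(knn_fingerprint_position(np.array([-45.0, -63.0, -68.0])))
```

The online recalibration contribution would, in this picture, amount to updating the rows of radio_map_rssi from measurements at the static reference points instead of re-surveying the whole map.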

Relevance:

30.00%

Publisher:

Abstract:

Humans' desire for knowledge regarding animal species and their interactions with the natural world has spurred centuries of studies. The relatively new development of remote sensing systems using satellite or aircraft-borne sensors has opened up a wide field of research, which unfortunately remains largely dependent on coarse-scale image spatial resolution, particularly for habitat modeling. For habitat-specialized species, such data may not be sufficient to successfully capture the nuances of their preferred areas. Of particular concern are those species for which topographic feature attributes are a main limiting factor for habitat use. Coarse spatial resolution data can smooth over details that may be essential for habitat characterization. Three studies focusing on sea turtle nesting beaches were completed to serve as an example of how topography can be a main deciding factor for certain species. Light Detection and Ranging (LiDAR) data were used to illustrate that fine spatial scale data can provide information not readily captured by either field work or coarser spatial scale sources. The variables extracted from the LiDAR data could successfully model nesting density for loggerhead (Caretta caretta), green (Chelonia mydas), and leatherback (Dermochelys coriacea) sea turtle species using morphological beach characteristics, highlight beach changes over time and their correlations with nesting success, and provide comparisons for nesting density models across large geographic areas. Comparisons between the LiDAR dataset and other digital elevation models (DEMs) confirmed that fine spatial scale data sources provide more similar habitat information than those with coarser spatial scales. Although these studies focused solely on sea turtles, the underlying principles are applicable for many other wildlife species whose range and behavior may be influenced by topographic features.
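The morphological beach characteristics mentioned above are typically derived from the gridded elevation surface itself. As a hedged illustration (a toy grid and the generic gradient-based slope formula, not the study's actual workflow), a slope variable can be pulled from a DEM like so:

```python
import numpy as np

# Toy 1 m-resolution DEM (elevations in metres); a real LiDAR-derived
# grid would be read from a raster file instead.
dem = np.array([
    [0.2, 0.4, 0.9, 1.6],
    [0.3, 0.6, 1.1, 1.9],
    [0.4, 0.8, 1.4, 2.3],
])
cell_size = 1.0  # metres per cell, assumed

# Slope magnitude in degrees from the elevation gradient.
dz_dy, dz_dx = np.gradient(dem, cell_size)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
print(slope_deg.round(1))
```

Coarsening cell_size smooths exactly the kind of detail the abstract warns about, which is why grid resolution matters for topography-limited species.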

Relevance:

30.00%

Publisher:

Abstract:

A hydrological–economic model is introduced to describe the dynamics of groundwater-dependent economic activities (agriculture and tourism) for sustainable use in sparse-data drylands. The Amtoudi Oasis, a remote area of southern Morocco in the northern Sahara that is attractive for tourism and shows evidence of groundwater degradation, was chosen to demonstrate the model's operation. Governing system variables were identified and put into action through System Dynamics (SD) modeling causal diagrams to program basic formulations into a model having two modules coupled by the nexus 'pumping': (1) the hydrological module represents the net groundwater balance (G) dynamics; and (2) the economic module reproduces the variation in the consumers of water, both the population and tourists. The model was operated under a similar influx of tourists and different scenarios of water availability, such as the wet 2009–2010 and the average 2010–2011 hydrological years. The rise in international tourism is identified as the main driving force reducing emigration and introducing new social habits in the population, in particular concerning water consumption. The urban water allotment (PU) was doubled for less than a 100-inhabitant net increase in recent decades. The water allocation for agriculture (PI), the largest consumer of water, had remained constant for decades. Although the 2-year monitoring period is not long enough to draw long-term conclusions, groundwater imbalance was reflected by net aquifer recharge (R) less than PI + PU (G < 0) in the average year 2010–2011, with net lateral inflow from adjacent Cambrian formations being the largest recharge component. R is expected to be much less than PI + PU in recurrent dry spells. Some low-technology actions are tentatively proposed to mitigate groundwater degradation, such as: wastewater capture, treatment, and reuse for irrigation; storm-water harvesting for irrigation; and active maintenance of the irrigation system to improve its efficiency.
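A minimal sketch of the coupled balance the two modules describe, with entirely hypothetical parameter values; the actual SD formulation (recharge components, emigration, social habits) is richer than this loop.

```python
# Net groundwater balance G = R - (PI + PU), stepped annually.
# All numbers are illustrative placeholders, not calibrated data.
R = 1.2e6          # net aquifer recharge (m^3/yr), wet-year guess
PI = 0.9e6         # irrigation allocation (m^3/yr), held constant
population = 900
tourists = 4000
per_capita = 80.0      # m^3/yr per resident (assumed)
per_tourist = 0.3      # m^3 per tourist visit (assumed)
storage = 5.0e6        # initial usable storage (m^3), assumed

for year in range(10):
    PU = population * per_capita + tourists * per_tourist
    G = R - (PI + PU)          # pumping is the nexus coupling the modules
    storage += G
    tourists *= 1.05           # rising international tourism drives PU up
    print(f"year {year}: G = {G:,.0f} m^3, storage = {storage:,.0f} m^3")
```

Running the same loop with a dry-year R reproduces the qualitative point of the abstract: G turns negative and storage is drawn down whenever R falls below PI + PU.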

Relevance:

30.00%

Publisher:

Abstract:

Jellyfishes have functionally replaced several overexploited commercial stocks of planktivorous fishes. This is paradoxical, because they use a primitive prey-capture mechanism requiring direct contact with the prey, whereas fishes use more efficient visual detection. We have compiled published data to show that, in spite of their primitive life-style, jellyfishes exhibit instantaneous prey clearance and respiration rates similar to those of their fish competitors, and a similar potential for growth and reproduction. To achieve this production, they have evolved large, water-laden bodies that increase prey contact rates. Although larger bodies are less efficient for swimming, optimization analysis reveals that large collectors are advantageous if they move through the water sufficiently slowly.

Relevance:

30.00%

Publisher:

Abstract:

The leatherback turtle Dermochelys coriacea is considered to be at serious risk of global extinction, despite ongoing conservation efforts. Intensive long-term monitoring of a leatherback nesting population on Sandy Point (St. Croix, US Virgin Islands) offers a unique opportunity to quantify basic population parameters and evaluate the effectiveness of nesting beach conservation practices. We report a significant increase in the number of females nesting annually, from ca. 18-30 in the 1980s to 186 in 2001, with a corresponding increase in annual hatchling production from ca. 2000 to over 49,000. We then analyzed resighting data from 1991 to 2001 with an open robust-design capture-mark-recapture model to estimate annual nester survival and adult abundance for this population. The expected annual survival probability was estimated at ca. 0.893 (95% CL 0.87-0.92) and the population was estimated to have been increasing ca. 13% per annum since the early 1990s. Taken together with DNA fingerprinting that identifies mother-daughter relations, our findings suggest that the increase in the size of the nesting population since 1991 was probably due to an aggressive program of beach protection and egg relocation initiated more than 20 years ago. Beach protection and egg relocation provide a simple and effective conservation strategy for this Northern Caribbean nesting population as long as adult survival at sea remains relatively high. (c) 2005 Elsevier Ltd. All rights reserved.
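The open robust-design model used here is normally fitted with specialised software rather than hand-written code. For orientation only, the snippet below shows the simplest ancestor of such capture-mark-recapture estimators, the Chapman-corrected Lincoln-Petersen abundance estimate, on made-up counts.

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimator.

    n1: animals marked in the first session
    n2: animals caught in the second session
    m2: marked animals among the second-session catch
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical two-session survey of nesting females.
print(chapman_estimate(n1=120, n2=100, m2=60))  # approx. 199 females
```

Open robust-design models generalise this idea across many sessions, separating survival, temporary emigration and detection probability instead of assuming a closed population.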

Relevance:

30.00%

Publisher:

Abstract:

The Australian energy market is in the final stages of deregulation. These changes have created a dynamic environment which is highly volatile and competitive with respect to both demand and price. Our current research seeks to visualise aspects of the National Energy Market with a view to developing techniques which may be useful in identifying significant characteristics and/or drivers of these characteristics. In order to capture the complexity of the problem we explore a suite of different visualisation techniques, which, when combined into a unified package, highlight aspects of the problem. The particular problem visualised here is "Does the data exhibit characteristics which suggest that the time of day, day of the week, or the season, affect the variation in demand and/or price?" © Austral. Mathematical Soc. 2005.
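One simple way to probe the quoted question is to pivot demand on time of day against day of week and inspect the resulting table (or heat map) for calendar structure. A hedged pandas sketch on synthetic data follows; the series name, frequency and values are assumptions, not the NEM data format.

```python
import numpy as np
import pandas as pd

# Synthetic half-hourly demand series standing in for market data.
idx = pd.date_range("2005-01-01", periods=48 * 365, freq="30min")
rng = np.random.default_rng(0)
demand = pd.Series(
    6000 + 1500 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 200, idx.size),
    index=idx, name="demand_mw",
)

# Average demand by (day of week, hour): systematic row or column
# patterns suggest calendar-driven variation worth visualising further.
table = demand.groupby([idx.dayofweek, idx.hour]).mean().unstack()
print(table.round(0))
```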

Relevance:

30.00%

Publisher:

Abstract:

A longitudinal capture-mark-recapture study was conducted to determine the temporal dynamics of rabbit haemorrhagic disease (RHD) in a European rabbit (Oryctolagus cuniculus) population of low to moderate density on sand-hill country in the lower North Island of New Zealand. A combination of sampling (trapping and radio-tracking) and diagnostic (cELISA, PCR and isotype ELISA) methods was employed to obtain data weekly from May 1998 until June 2001. Although rabbit haemorrhagic disease virus (RHDV) infection was detected in the study population in all 3 years, disease epidemics were evident only in the late summer or autumn months in 1999 and 2001. Overall, 20% of 385 samples obtained from adult animals older than 11 weeks were seropositive. An RHD outbreak in 1999 contributed to an estimated population decline of 26%. A second RHD epidemic in February 2001 was associated with a population decline of 52% over the subsequent month. Following the outbreaks, the seroprevalence in adult survivors was between 40% and 50%. During 2000, no deaths from RHDV were confirmed and mortalities were predominantly attributed to predation. Influx of seronegative immigrants was greatest in the 1999 and 2001 breeding seasons, and preceded the RHD epidemics in those years. Our data suggest that RHD epidemics require the population immunity level to fall below a threshold where propagation of infection can be maintained through the population.
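The threshold argument in the final sentence is the standard herd-immunity condition. In its simplest well-mixed SIR form, which is an illustrative assumption rather than the paper's model, infection can propagate only while the susceptible fraction $s$ satisfies

```latex
R_{\mathrm{eff}} = R_0\, s > 1
\quad\Longleftrightarrow\quad
s > \frac{1}{R_0},
```

so an epidemic requires the immune fraction $1 - s$ to have fallen below $1 - 1/R_0$; the influx of seronegative immigrants before the 1999 and 2001 outbreaks is exactly the mechanism that pushes $s$ back above the threshold.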

Relevance:

30.00%

Publisher:

Abstract:

We developed a method to rapidly and safely live-capture wild dugongs, based on the “rodeo method” employed to catch marine turtles. This method entails close pursuit of a dugong by boat until it is fatigued. The dugong is then caught around the peduncle region by a catcher leaping off the boat, and the dugong is restrained at the water surface by several people while data are collected. Our sampling protocol involves a short restraint time, typically < 5 min. No ropes or nets were attached to the dugong, to avoid the risk of entanglement and subsequent drowning. This method is suitable for shallow, open-water captures when weather and water conditions are fair, and may be adapted for deeper waters.

Relevance:

30.00%

Publisher:

Abstract:

Hierarchical visualization systems are desirable because a single two-dimensional visualization plot may not be sufficient to capture all of the interesting aspects of complex high-dimensional data sets. We extend an existing locally linear hierarchical visualization system, PhiVis [1], in several directions: (1) we allow for non-linear projection manifolds (the basic building block is the Generative Topographic Mapping, GTM); (2) we introduce a general formulation of hierarchical probabilistic models consisting of local probabilistic models organized in a hierarchical tree; (3) we describe folding patterns of a low-dimensional projection manifold in high-dimensional data space by computing and visualizing the manifold's local directional curvatures. Quantities such as magnification factors [3] and directional curvatures are helpful for understanding the layout of the nonlinear projection manifold in the data space and for further refinement of the hierarchical visualization plot. Like PhiVis, our system is statistically principled and is built interactively in a top-down fashion using the EM algorithm. We demonstrate the principle of the approach on a complex 12-dimensional data set and mention possible applications in the pharmaceutical industry.
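For reference, the magnification factor mentioned above has a compact closed form for a smooth mapping $y(z)$ from latent to data space; this is the standard differential-geometry statement, not anything specific to PhiVis:

```latex
\mathrm{MF}(z) \;=\; \sqrt{\det\!\left(J(z)^{\mathsf{T}} J(z)\right)},
\qquad J_{ij}(z) = \frac{\partial y_i}{\partial z_j},
```

i.e. the local ratio of data-space area to latent-space area swept out by the projection manifold, which is why it complements the directional curvatures when judging how the manifold folds through the data.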

Relevance:

30.00%

Publisher:

Abstract:

Exploratory analysis of data in all sciences seeks to find common patterns to gain insights into the structure and distribution of the data. Typically, visualisation methods like principal components analysis are used, but these methods are not easily able to deal with missing data, nor can they capture non-linear structure in the data. One approach to discovering complex, non-linear structure in the data is through the use of linked plots, or brushing, while ignoring the missing data. In this technical report we discuss a complementary approach based on a non-linear probabilistic model. The generative topographic mapping enables the visualisation of the effects of very many variables on a single plot, which is able to incorporate far more structure than a two-dimensional principal components plot could, while dealing at the same time with missing data. We show that using the generative topographic mapping provides us with an optimal method to explore the data while being able to replace missing values in a dataset, particularly where a large proportion of the data is missing.
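To make the preceding description concrete, here is a compressed numpy sketch of a plain GTM fitted by EM on synthetic data. It follows the standard Bishop-Svensén-Williams formulation; the grid sizes, RBF width and regularisation constant are arbitrary choices, and the missing-data machinery discussed in the report is not included.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 3-D data lying near a curved 2-D sheet.
N = 300
u = rng.uniform(-1, 1, (N, 2))
X = np.column_stack([u[:, 0], u[:, 1], (u ** 2).sum(axis=1)])
X += rng.normal(0, 0.05, X.shape)

# Latent grid (K points) and RBF basis (M centres), as in standard GTM.
g = np.linspace(-1, 1, 10)
Z = np.array([[a, b] for a in g for b in g])            # K x 2 latent grid
c = np.linspace(-1, 1, 4)
mu = np.array([[a, b] for a in c for b in c])           # M x 2 RBF centres
sigma = 2.0 / 3.0
Phi = np.exp(-((Z[:, None, :] - mu[None]) ** 2).sum(-1) / (2 * sigma ** 2))
Phi = np.column_stack([Phi, np.ones(len(Z))])           # add bias column

W = rng.normal(0, 0.1, (Phi.shape[1], X.shape[1]))      # mapping weights
beta = 1.0                                              # noise precision

for it in range(30):                                    # EM iterations
    Y = Phi @ W                                         # K x D prototypes
    d2 = ((X[None] - Y[:, None]) ** 2).sum(-1)          # K x N sq. distances
    logR = -0.5 * beta * d2
    R = np.exp(logR - logR.max(axis=0))
    R /= R.sum(axis=0)                                  # responsibilities
    G = np.diag(R.sum(axis=1))
    W = np.linalg.solve(Phi.T @ G @ Phi + 1e-3 * np.eye(Phi.shape[1]),
                        Phi.T @ R @ X)                  # regularised M-step
    beta = X.size / (R * d2).sum()                      # update noise precision

# Posterior-mean latent position of each point: the 2-D plot coordinates.
latent_means = R.T @ Z
print(latent_means[:5])
```

Imputation then follows naturally from the fitted model: a point's missing coordinates can be read off the responsibility-weighted prototypes, since each latent grid node carries a full data-space vector.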

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a data envelopment analysis (DEA) based method for assessing the comparative efficiencies of units operating production processes where input-output levels are inter-temporally dependent. One cause of inter-temporal dependence between input and output levels is capital stock, which influences output levels over many production periods. Such units cannot be assessed by traditional or 'static' DEA, which assumes input-output correspondences are contemporaneous in the sense that the output levels observed in a time period are the product solely of the input levels observed during that same period. The method developed in the paper overcomes the problem of inter-temporal input-output dependence by using input-output 'paths' mapped out by operating units over time as the basis of assessing them. As an application we compare the results of the dynamic and static models for a set of UK universities. The paper suggests that the dynamic model captures efficiency better than the static model. © 2003 Elsevier Inc. All rights reserved.
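The static DEA that the paper contrasts against can be stated as a small linear programme. The sketch below is the textbook input-oriented CCR envelopment model on made-up data, offered as a baseline illustration only; the paper's dynamic, path-based variant is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

# Toy cross-section: 4 units, 2 inputs, 1 output (illustrative only).
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])  # n x m inputs
Y = np.array([[1.0], [1.0], [1.2], [1.4]])                      # n x s outputs

def ccr_input_efficiency(o):
    """Static input-oriented CCR score for unit o: minimise theta such
    that a non-negative mix of all units uses <= theta * inputs of o
    while producing >= outputs of o."""
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                        # objective: theta
    A_ub = np.vstack([
        np.hstack([-X[o][:, None], X.T]),              # input constraints
        np.hstack([np.zeros((Y.shape[1], 1)), -Y.T]),  # output constraints
    ])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(len(X)):
    print(f"unit {o}: efficiency = {ccr_input_efficiency(o):.3f}")
```

A dynamic assessment in the spirit of the paper would instead score each unit's whole input-output path, so that capital invested in one period is allowed to justify outputs in later ones.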

Relevance:

30.00%

Publisher:

Abstract:

Few works address methodological issues of how to conduct strategy-as-practice research, and even fewer focus on how to analyse the subsequent data in ways that illuminate strategy as an everyday, social practice. We address this gap by proposing a quantitative method for analysing observational data, which can complement more traditional qualitative methodologies. We propose that rigorous but context-sensitive coding of transcripts can render everyday practice analysable statistically. Such statistical analysis provides a means for analytically representing patterns and shifts within the mundane, repetitive elements through which practice is accomplished. We call this approach the Event Database (EDB); it consists of five basic coding categories that help us capture the stream of practice. Indexing codes help to index or categorise the data, in order to give context and offer some basic information about the event under discussion. Indexing codes are descriptive codes, which allow us to catalogue and classify events according to their assigned characteristics. Content codes are to do with the qualitative nature of the event; this is the essence of the event. It is a description that helps to inform judgements about the phenomenon. Nature codes help us distinguish between discursive and tangible events. We include this code to acknowledge that some events differ qualitatively from other events. Type codes are abstracted from the data in order to help us classify events based on their description or nature. This involves significantly more judgement than the index codes but consequently is also more meaningful. Dynamics codes help us capture some of the movement or fluidity of events. This category has been included to let us capture the flow of activity over time.
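As a hedged illustration of how such a scheme might be operationalised (the field names and example values are guesses, not the authors' actual database design), each coded event could be held in a record like this:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One observed event coded under the five EDB categories."""
    index: dict       # indexing codes: who/when/where context
    content: str      # content code: the qualitative essence of the event
    nature: str       # nature code: 'discursive' or 'tangible'
    event_type: str   # type code: analyst-abstracted event class
    dynamics: str     # dynamics code: movement/flow over time

# Hypothetical single entry in the event stream.
log = [
    Event(index={"meeting": "board-03", "speaker": "CEO"},
          content="proposes revisiting the acquisition criteria",
          nature="discursive", event_type="agenda-shaping",
          dynamics="opens new thread"),
]
print(len(log), "coded event(s)")
```

Once events sit in a structure like this, the statistical analysis the abstract describes amounts to counting and sequencing over the coded fields.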

Relevance:

30.00%

Publisher:

Abstract:

Foley [J. Opt. Soc. Am. A 11 (1994) 1710] has proposed an influential psychophysical model of masking in which mask components in a contrast gain pool are raised to an exponent before summation and divisive inhibition. We tested this summation rule in experiments in which contrast detection thresholds were measured for a vertical 1 c/deg (or 2 c/deg) sine-wave component in the presence of a 3 c/deg (or 6 c/deg) mask that had either a single component oriented at -45° or a pair of components oriented at ±45°. Contrary to the predictions of Foley's model 3, we found that for masks of moderate contrast and above, threshold elevation was predicted by linear summation of the mask components in the inhibitory stage of the contrast gain pool. We built this feature into two new models, referred to as the early adaptation model and the hybrid model. In the early adaptation model, contrast adaptation controls a threshold-like nonlinearity on the output of otherwise linear pathways that provide the excitatory and inhibitory inputs to a gain control stage. The hybrid model involves nonlinear and nonadaptable routes to excitatory and inhibitory stages as well as an adaptable linear route. With only six free parameters, both models provide excellent fits to the masking and adaptation data of Foley and Chen [Vision Res. 37 (1997) 2779] but unlike Foley and Chen's model, are able to do so with only one adaptation parameter. However, only the hybrid model is able to capture the features of Foley's (1994) pedestal plus orthogonal fixed mask data. We conclude that (1) linear summation of inhibitory components is a feature of contrast masking, and (2) that the main aftereffect of spatial adaptation on contrast increment thresholds can be assigned to a single site. © 2002 Elsevier Science Ltd. All rights reserved.
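In symbols, the structural point at issue can be sketched as follows; this is a schematic paraphrase of the divisive gain-control family with generic weights and exponents, not the papers' exact parameterisation. A Foley-style model computes the masked response as

```latex
R \;=\; \frac{(S_e\, c_t)^{p}}{z + \sum_i (w_i\, c_i)^{q}}
\qquad \text{(exponent applied before summation)},
```

whereas the masking data reported here favour summing the mask components linearly inside the gain pool before any nonlinearity:

```latex
R \;=\; \frac{(S_e\, c_t)^{p}}{z + \left(\sum_i w_i\, c_i\right)^{q}}
\qquad \text{(linear summation of inhibitory components)}.
```

Here $c_t$ is the target contrast, $c_i$ are the mask components, and $S_e$, $w_i$, $z$, $p$ and $q$ are free parameters of the gain-control stage.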

Relevance:

30.00%

Publisher:

Abstract:

Exploratory analysis of data seeks to find common patterns to gain insights into the structure and distribution of the data. In geochemistry it is a valuable means to gain insights into the complicated processes making up a petroleum system. Typically, linear visualisation methods like principal components analysis, linked plots, or brushing are used. These methods cannot be employed directly when dealing with missing data, and they struggle to capture global non-linear structures in the data, although they can do so locally. This thesis discusses a complementary approach based on a non-linear probabilistic model. The generative topographic mapping (GTM) enables the visualisation of the effects of very many variables on a single plot, which is able to incorporate more structure than a two-dimensional principal components plot. The model can deal with uncertainty and missing data, and allows for the exploration of the non-linear structure in the data. In this thesis a novel approach to initialise the GTM with arbitrary projections is developed. This makes it possible to combine GTM with algorithms like Isomap and to fit complex non-linear structures like the Swiss roll. Another novel extension is the incorporation of prior knowledge about the structure of the covariance matrix. This extension greatly enhances the modelling capabilities of the algorithm, resulting in a better fit to the data and better imputation capabilities for missing data. Additionally, an extensive benchmark study of the missing-data imputation capabilities of GTM is performed. Furthermore, a novel approach based on missing data is introduced to benchmark the fit of probabilistic visualisation algorithms on unlabelled data. Finally, the work is complemented by evaluating the algorithms on real-life datasets from geochemical projects.

Relevance:

30.00%

Publisher:

Abstract:

Exploratory analysis of petroleum geochemical data seeks to find common patterns to help distinguish between different source rocks, oils and gases, and to explain their source, maturity and any intra-reservoir alteration. However, at the outset, one is typically faced with (a) a large matrix of samples, each with a range of molecular and isotopic properties, (b) a spatially and temporally unrepresentative sampling pattern, (c) noisy data and (d) often, a large number of missing values. This inhibits analysis using conventional statistical methods. Typically, visualisation methods like principal components analysis are used, but these methods are not easily able to deal with missing data nor can they capture non-linear structure in the data. One approach to discovering complex, non-linear structure in the data is through the use of linked plots, or brushing, while ignoring the missing data. In this paper we introduce a complementary approach based on a non-linear probabilistic model. Generative topographic mapping enables the visualisation of the effects of very many variables on a single plot, while also dealing with missing data. We show how using generative topographic mapping also provides an optimal method with which to replace missing values in two geochemical datasets, particularly where a large proportion of the data is missing.