993 results for Reference Area


Relevance: 30.00%

Abstract:

We present the results of the Iberian lynx survey carried out with camera-trapping in Sierra Morena between 2000 and 2007. Two important aspects of camera-trapping efficiency are also analyzed. The first is the year-by-year evolution of two efficiency indicators according to the type of camera-trapping used. The results show that the most efficient lure is rabbit, although it is also the least tested (92 trap-nights), followed by camera-trapping at the most frequented marking places (latrines). Finally, we propose the novel concept of the use area as a spatial reference unit for camera-trap monitoring of non-radio-marked animals, and discuss its validity.
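The two efficiency indicators are not defined in the abstract; a common choice in camera-trap studies is the detection rate per 100 trap-nights. Below is a minimal sketch of such an indicator; apart from the 92 trap-nights mentioned above for the rabbit lure, all figures are hypothetical.

```python
# Minimal sketch of a standard camera-trapping efficiency indicator:
# independent detections per 100 trap-nights. Except for the 92
# trap-nights of the rabbit lure, the figures are invented.

records = {
    # lure/placement: (independent detections, trap-nights of effort)
    "rabbit lure": (14, 92),
    "latrine (marking place)": (48, 510),
    "scent lure": (9, 300),
}

for method, (detections, trap_nights) in records.items():
    rate = 100.0 * detections / trap_nights  # detections per 100 trap-nights
    print(f"{method}: {rate:.1f} detections / 100 trap-nights")
```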

Relevance: 30.00%

Abstract:

This work presents a method to estimate and correct slow, time-dependent position errors caused by imperfect ground-station synchronization and tropospheric propagation. It uses opportunity traffic emissions, i.e., signals transmitted by the aircraft within the coverage zone. The method overcomes the difficulty of installing reference beacons that are simultaneously visible to all the base stations of a given Wide Area Multilateration (WAM) system.
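The abstract does not give the estimation algorithm; the sketch below illustrates one plausible reading, assuming the opportunity aircraft's position is approximately known (e.g., from the multilateration solution itself): average the TDOA residuals at each station to recover its slowly varying timing offset relative to a reference station. All names and numbers are illustrative.

```python
# Hedged sketch (not the paper's algorithm): estimate per-station timing
# offsets from "opportunity traffic". The unknown emission time cancels
# in the TDOA against a reference station, so the residual between the
# measured and geometry-predicted TDOA is the station's clock offset.

import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tdoa_residual(aircraft_pos, station_i, station_0, measured_tdoa):
    """Measured minus predicted TDOA (station_i relative to station_0)."""
    predicted = (np.linalg.norm(aircraft_pos - station_i)
                 - np.linalg.norm(aircraft_pos - station_0)) / C
    return measured_tdoa - predicted

def estimate_relative_offsets(stations, observations):
    """stations: array (n, 3); station 0 is the timing reference.
    observations: list of (aircraft_pos, measured_tdoas[n-1]).
    The mean residual per station is the least-squares estimate of a
    constant clock offset relative to station 0."""
    resid = [[] for _ in stations[1:]]
    for aircraft_pos, tdoas in observations:
        for i, sta in enumerate(stations[1:]):
            resid[i].append(tdoa_residual(aircraft_pos, sta, stations[0], tdoas[i]))
    return np.array([np.mean(r) for r in resid])

# Toy usage: three stations, station 1 carries a +50 ns offset.
stations = np.array([[0., 0., 0.], [30_000., 0., 0.], [0., 40_000., 0.]])
offsets_true = np.array([50e-9, 0.0])
obs = []
for pos in (np.array([10_000., 20_000., 9_000.]),
            np.array([25_000., 5_000., 10_000.])):
    tdoa = np.array([(np.linalg.norm(pos - s) - np.linalg.norm(pos - stations[0])) / C
                     for s in stations[1:]]) + offsets_true
    obs.append((pos, tdoa))
print(estimate_relative_offsets(stations, obs))  # approx. [5e-08, 0.]
```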

Relevance: 30.00%

Abstract:

A high-definition video quality metric built from full-reference ratios. Visual Quality Assessment (VQA) is one of the major open challenges in the multimedia environment. Video quality has a very strong impact on the end user's (the consumer's) perception of services built on the delivery of multimedia content, and it is therefore a key factor in the assessment of the new paradigm known as Quality of Experience (QoE). Video quality measurement models can be grouped into several branches according to the technical basis of the measurement system. The most prominent are those that employ psychovisual models aimed at reproducing the characteristics of the Human Visual System (HVS), and those that instead take an engineering approach, in which the quality computation is based on extracting intrinsic image features and comparing them. Despite the advances made in this field in recent years, research on video quality metrics, whether in the presence of the reference (full-reference models), in the presence of part of it (reduced-reference models), or in its absence (no-reference models), still has ample room for improvement and goals to reach. Among these, the measurement of high-definition signals, especially the very high quality ones used in the early stages of the value chain, is of special interest because of its influence on the final quality of the service, and no reliable measurement models currently exist for it.

This doctoral thesis presents a full-reference quality measurement model that we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), based on the weighting of four quality ratios computed from intrinsic image features:

- the Fidelity Ratio, computed by means of the morphological (Beucher) gradient;
- the Visual Similarity Ratio, computed from the visually significant points of the image obtained through local contrast filtering;
- the Sharpness Ratio, derived from the Haralick contrast texture statistic;
- the Complexity Ratio, obtained from the homogeneity measure of the set of Haralick texture statistics.

The novelty of PARMENIA lies in using mathematical morphology and Haralick statistics as the basis of a quality metric, since these techniques have traditionally been tied to remote sensing and object segmentation. The formulation of the metric as a weighted set of ratios is equally novel, because it draws both on structural similarity models and on more classical ones based on the perceptibility of the error introduced by compression-related signal degradation. PARMENIA shows a very high correlation with the MOS scores obtained from the subjective user tests carried out for its validation. The working corpus was selected from internationally validated sets of sequences, so that the reported results are of the highest possible quality and rigor.

The methodology consisted of generating a set of test sequences of different qualities by encoding with different quantization steps, obtaining subjective scores for them through subjective quality tests (based on ITU Recommendation BT.500), and validating the metric by computing the correlation of PARMENIA with these subjective values, quantified through the Pearson correlation coefficient. Once the ratios had been validated and their influence on the final measure optimized, confirming their high correlation with perception, a second evaluation was carried out on sequences from the HDTV Test Dataset 1 of the Video Quality Experts Group (VQEG), whose results show the clear advantages of the approach.

Abstract: Visual Quality Assessment has so far been one of the most intriguing challenges in the media environment. The progressive evolution towards higher resolutions, together with rising quality requirements (e.g. high definition and better image quality), calls for redefined quality measurement models. Given the growing interest in multimedia service delivery, perceptual quality measurement has become a very active area of research. First, this work introduces a classification of objective video quality metrics based on their underlying methodologies and approaches to measuring video quality, summing up the state of the art. The thesis then describes an enhanced solution for full-reference objective quality measurement, based on mathematical morphology, texture features, and visual similarity information, that provides a normalized metric, PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), highly correlated with MOS scores. The PARMENIA metric is based on the pooling of quality ratios obtained from three different approaches: Beucher's gradient, local contrast filtering, and the contrast and homogeneity Haralick texture features. The metric performs excellently and improves on the current state of the art by providing a wide dynamic range that makes it easier to discriminate between sequences coded at very similar qualities, especially at the very high bit rates whose quality is currently transparent to existing metrics. PARMENIA introduces a degree of novelty with respect to other working metrics: on the one hand, it exploits structural information variation to build the metric's kernel, complementing the measure with texture information and a ratio of visually meaningful points that is closer to typical error-sensitivity-based approaches. PARMENIA is, to our knowledge, the only metric built upon full-reference ratios that uses mathematical morphology and texture features (typically used in segmentation) for quality assessment. On the other hand, its results span a wide dynamic range that allows measuring the quality of high-definition sequences from bit rates of hundreds of megabits per second (Mbps) down to typical distribution rates (5-6 Mbps) and even streaming rates (1-2 Mbps). A direct correlation between PARMENIA and MOS scores is thus easily constructed. PARMENIA may further broaden the available choices in objective quality measurement, especially for very high quality HD material.

All these results come from a validation carried out on internationally validated datasets, on which subjective tests based on the ITU-R BT.500 methodology were performed. The Pearson correlation coefficient was calculated to verify the accuracy and reliability of PARMENIA.
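As a worked illustration of the validation step, the sketch below pools four quality ratios with weights and computes the Pearson correlation of the pooled score against MOS values. The ratio values, weights, and MOS scores are invented; PARMENIA's actual coefficients are not given here.

```python
# Schematic sketch of the validation described above: pool four quality
# ratios with weights, then correlate the pooled score with subjective
# MOS values via Pearson's coefficient. Ratios, weights and MOS scores
# are all hypothetical, not PARMENIA's real values.

import numpy as np
from scipy.stats import pearsonr

# One row per test sequence: [fidelity, visual similarity, sharpness, complexity]
ratios = np.array([
    [0.91, 0.88, 0.90, 0.86],
    [0.75, 0.70, 0.72, 0.69],
    [0.55, 0.52, 0.58, 0.50],
    [0.33, 0.30, 0.35, 0.31],
])
weights = np.array([0.4, 0.3, 0.2, 0.1])  # hypothetical pooling weights
parmenia_scores = ratios @ weights

mos = np.array([4.6, 3.9, 3.1, 2.0])      # hypothetical subjective scores

r, p_value = pearsonr(parmenia_scores, mos)
print(f"Pearson r = {r:.3f} (p = {p_value:.3g})")
```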

Relevance: 30.00%

Abstract:

Coupling a shop floor software system (SFS) with the set of production equipment (SPE) is a complex task. It involves open and proprietary standards and information and communication technologies, among other tools and techniques. Due to market turbulence, both custom solutions and standards-based solutions eventually require a considerable adaptation effort. The concept of loose coupling has been identified in the organizational design community as a support for organizational survival; its presence reduces the organization's resistance to changes in the environment. In this article, the results obtained by the organizational design community are identified, translated, and organized to support the solution of the SFS-SPE integration problem. A classical loose coupling model, developed by the organizational studies community, is abstracted and translated to the area of interest. Key aspects are identified for use as promoters of loose coupling between SFS and SPE, and presented in the form of a reference scheme. This reference scheme is in turn proposed as a basis for the design and implementation of a generic coupling solution, or coupling framework, to be included as a loose coupling stage between SFS and SPE. A validation example with several sets of manufacturing equipment, using different physical communication media, controller commands, equipment programming languages, and wire protocols, is presented, showing an acceptable level of autonomy gained by the SFS.
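The abstract does not detail the coupling framework; as a loose illustration of the idea, the sketch below hides heterogeneous equipment controllers behind a common adapter interface, so that the SFS stays unaware of physical media, command sets, and wire protocols, and adding new equipment only requires a new adapter. All class and method names are hypothetical.

```python
# Hypothetical sketch of a loose-coupling stage between an SFS and
# heterogeneous production equipment: the SFS talks only to a small
# adapter interface; each adapter hides one controller's medium,
# command set and wire protocol. Names are illustrative only.

from abc import ABC, abstractmethod

class EquipmentAdapter(ABC):
    """Stable interface the SFS depends on, regardless of the equipment."""

    @abstractmethod
    def start_job(self, program_id: str) -> None: ...

    @abstractmethod
    def status(self) -> str: ...

class SerialCncAdapter(EquipmentAdapter):
    """Adapter for a CNC reached over a serial line (details stubbed out)."""

    def start_job(self, program_id: str) -> None:
        print(f"RS-232 -> CNC: RUN {program_id}")

    def status(self) -> str:
        return "IDLE"

class OpcUaPlcAdapter(EquipmentAdapter):
    """Adapter for a PLC reached over an OPC UA session (stubbed out)."""

    def start_job(self, program_id: str) -> None:
        print(f"OPC UA -> PLC: write job node = {program_id}")

    def status(self) -> str:
        return "RUNNING"

class ShopFloorSystem:
    """The SFS dispatches work without knowing any controller specifics."""

    def __init__(self, cells: dict[str, EquipmentAdapter]):
        self.cells = cells

    def dispatch(self, cell: str, program_id: str) -> None:
        self.cells[cell].start_job(program_id)

sfs = ShopFloorSystem({"mill-1": SerialCncAdapter(), "press-2": OpcUaPlcAdapter()})
sfs.dispatch("mill-1", "P-1001")
sfs.dispatch("press-2", "P-2002")
```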

Relevance: 30.00%

Abstract:

Nanostructured TiO2 photocatalysts with small crystal sizes have been synthesized by sol-gel using the amphiphilic triblock copolymer Pluronic P123 as a template. A new synthesis route, based on treating TiO2 xerogels with acid-ethanol mixtures in two different steps, synthesis and extraction-crystallization, has been investigated, and two acids, hydrochloric and hydriodic, were compared. As a reference, samples were also prepared by extraction-crystallization in ethanol; these TiO2 materials are amorphous and present higher porosities. The prepared materials show different degrees of crystallinity depending on the experimental conditions used. In general, they exhibit high surface areas, with an important contribution of microporosity and mesoporosity, and very small anatase crystals, ranging from 5 to 7 nm. The activity of the obtained photocatalysts was assessed in the gas-phase oxidation of propene at low concentration (100 ppmv) under a UVA lamp with a 365 nm wavelength. Under the conditions studied, these photocatalysts show different activities in the oxidation of propene that depend not on their surface areas but on their crystallinity and band-gap energies; the sample prepared with HCl both during synthesis and in the extraction-crystallization step is the most active one, outperforming Evonik P25.

Relevance: 30.00%

Abstract:

The application of nonlocal density functional theory (NLDFT) to determine the pore size distribution (PSD) of activated carbons using a nongraphitized carbon black, instead of graphitized thermal carbon black, as a reference system is explored. We show that in this case nitrogen and argon adsorption isotherms in activated carbons are precisely correlated by the theory; such an excellent correlation would never be possible if the pore wall surface were assumed to be identical to that of graphitized carbon black. This suggests that the pore wall surfaces of activated carbon are closer to those of amorphous solids, because of crystal lattice defects, finite pore length, the presence of active centers, and so on. Application of the NLDFT adapted to amorphous solids resulted in a quantitative description of N2 and Ar adsorption isotherms on nongraphitized carbon black BP280 at their respective boiling points. In the present paper we determined solid-fluid potentials from experimental adsorption isotherms on nongraphitized carbon black and subsequently used those potentials to model adsorption in slit pores and generate a corresponding set of local isotherms, which we used to determine the PSD functions of different activated carbons.
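The final step described above, recovering the PSD from a set of local isotherms, amounts to inverting the adsorption integral equation N(P) = sum over w of f(w) n(P, w), where f(w) is the PSD and n(P, w) the local isotherm for pore width w. A minimal sketch of that inversion using non-negative least squares, with a toy Langmuir-like kernel standing in for the NLDFT local isotherms (which would require the fitted solid-fluid potentials):

```python
# Minimal sketch of the PSD inversion step: the measured isotherm is a
# weighted sum of local isotherms over pore widths, and the weights
# (the PSD) are recovered by non-negative least squares. The Langmuir-
# like kernel below is a toy stand-in for the NLDFT local isotherms.

import numpy as np
from scipy.optimize import nnls

pressures = np.logspace(-5, 0, 40)        # relative pressures P/P0
pore_widths = np.linspace(0.4, 4.0, 30)   # nm

def local_isotherm(p, w):
    """Toy local isotherm: narrower pores fill at lower pressure."""
    k = np.exp(4.0 / w)                   # crude width-dependent affinity
    return k * p / (1.0 + k * p)

# Kernel matrix: rows = pressure points, columns = pore widths.
A = np.array([[local_isotherm(p, w) for w in pore_widths] for p in pressures])

# Synthetic "measured" isotherm from a known bimodal PSD, plus noise.
true_psd = np.exp(-((pore_widths - 0.8) / 0.15) ** 2) \
         + 0.5 * np.exp(-((pore_widths - 2.0) / 0.4) ** 2)
measured = A @ true_psd + 0.01 * np.random.default_rng(0).normal(size=len(pressures))

psd, residual = nnls(A, measured)         # non-negative PSD estimate
print("residual norm:", residual)
```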

Relevance: 30.00%

Abstract:

Assessment of the extent of coral bleaching has become an important part of studies that aim to understand the condition of coral reefs. In this study a reference card that uses differences in coral colour was developed as an inexpensive, rapid, and non-invasive method for assessing bleaching. The card uses a 6-point brightness/saturation scale within four colour hues to record changes in bleaching state. Changes of 2 or more units on the scale reflect a change in symbiont density and chlorophyll a content, and therefore in the bleaching state of the coral. When used by non-specialist observers in the field (here on an intertidal reef flat), there was an inter-observer error of 1 colour score. This technique improves on the existing subjective assessment of bleaching state by visual observation and offers the potential for rapid, wide-area assessment of changing coral condition.
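As a small illustration of the decision rule above: a shift of 2 or more units between surveys exceeds the roughly 1-unit inter-observer error and can be read as a real change in bleaching state. The scores below are invented.

```python
# Tiny sketch of the decision rule above: a shift of >= 2 colour-score
# units between surveys exceeds the ~1-unit inter-observer error and is
# taken as a real change in bleaching state. Scores are hypothetical.

CHANGE_THRESHOLD = 2  # units on the 6-point colour card scale

surveys = {  # colony id: (score at time 1, score at time 2)
    "colony-A": (5, 2),   # strong paling: likely bleaching
    "colony-B": (4, 3),   # within observer error: no call
    "colony-C": (2, 5),   # darkening: likely recovery
}

for colony, (before, after) in surveys.items():
    delta = after - before
    if abs(delta) < CHANGE_THRESHOLD:
        verdict = "no detectable change"
    else:
        verdict = "recovery (darkening)" if delta > 0 else "bleaching (paling)"
    print(f"{colony}: {before} -> {after}: {verdict}")
```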

Relevance: 30.00%

Abstract:

The 2012 Emilia seismic sequence struck the area between Mirandola and Ferrara with considerable secondary coseismic and post-seismic phenomena, mainly related to sand liquefaction and the formation of surface ground fractures. Given that the main deformation, observed through remote-sensing techniques, made it possible to locate the causative structure, the question arose of the relationship between deep structures and secondary surface manifestations. This thesis integrates data at various scales, from the surface down to depths of several kilometers, to analyze the link between the geological structures that generated the earthquake and the surface effects perceived by observers. This was done not only with reference to the specific case of the 2012 Emilia earthquake, but also in order to draw useful information, in a historical and geological perspective, on the effects of a "typical" earthquake in a region where the causative structures do not crop out at the surface. The elements analyzed comprise new acquisitions as well as reprocessed existing data, and include geomorphological maps, remote-sensing data, shallow and deep seismic reflection profiles, stratigraphies, and information on the seismic risk characterization of the area. Part of the newly acquired data is the result of developing and testing innovative seismic prospecting methodologies in inland watercourses and water bodies, which were successfully applied along the Cavo Napoleonico, an artificial canal that cuts orthogonally across the zone of maximum deformation of the May 20 earthquake. The development of this new geophysical survey methodology, applied to a concrete case, made it possible to improve subsurface imaging techniques, to report new coseismic evidence that had remained hidden beneath the waters of the canal, and to provide elements useful for the stratigraphy of the area. The comparison between geophysical and geomorphological data allowed the surficial sedimentary bodies and landforms linked to fluvial migration since the 8th century BC to be mapped in greater detail. The geophysical data, both shallow and deep, highlighted the link between the seismogenic structures and the surface manifestations that followed the Emilia earthquake. The integration of the available data, both new and from the literature, highlighted the relationship between deep structures and sedimentation, and made it possible to calculate the geological uplift rates of the structure that generated the May 20 earthquake. The results of this work have implications in several fields, including seismic risk assessment and seismic microzonation based on a detailed geomorphological, geological, and geophysical characterization of the first 20 meters below the topographic surface. The 2012 Emilia earthquake indeed revealed the importance of the substrate for the development of secondary coseismic and post-seismic phenomena in a highly heterogeneous territory such as the Po Plain.

Relevance: 30.00%

Abstract:

Previous research suggests that changing consumer and producer knowledge structures play a role in market evolution and that the sociocognitive processes of product markets are revealed in the sensemaking stories of market actors that are rebroadcast in commercial publications. In this article, the authors lend further support to the story-based nature of market sensemaking and to the use of the sociocognitive approach in explaining the evolution of high-technology markets. They examine the content (i.e., subject matter or topic) and volume (i.e., the number) of market stories and the extent to which the content and volume of market stories evolve as a technology emerges. Data were obtained from a content analysis of 10,412 article abstracts, published in key trade journals, pertaining to Local Area Network (LAN) technologies and spanning the period 1981 to 2000. Hypotheses concerning the evolving nature (content and volume) of market stories in technology evolution are tested. The analysis identified four categories of market stories: technical, product availability, product adoption, and product discontinuation. The findings show that the emerging technology passes first through a "technical-intensive" phase, in which technology-related stories dominate; then through a "supply-push" phase, in which stories presenting products embracing the technology tend to exceed technical stories while the number of product adoption stories rises; and finally into a "product-focus" phase, with stories predominantly focusing on product availability. Overall story volume declines as a technology matures, since the need for sensemaking diminishes. When stories about product discontinuation surface, they signal the decline of the current technology. New technologies that fail to reach the "product-focus" stage also reflect limited market acceptance. The article also discusses the theoretical and managerial implications of the study's findings.
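As a rough sketch of the volume analysis described above (not the authors' actual coding scheme), the snippet below tallies labeled story records per category and year and reports the dominant category; the records are invented, whereas the real study content-analyzed 10,412 trade-journal abstracts.

```python
# Rough sketch of the volume analysis: count market stories per category
# per year and see which phase dominates. Records are invented examples
# of what coded abstracts might look like.

from collections import Counter

CATEGORIES = ("technical", "availability", "adoption", "discontinuation")

records = [  # (year, category) pairs, as a coder might label them
    (1982, "technical"), (1983, "technical"), (1983, "technical"),
    (1986, "availability"), (1986, "technical"), (1987, "adoption"),
    (1992, "availability"), (1992, "availability"), (1993, "adoption"),
    (1999, "discontinuation"), (2000, "discontinuation"),
]

by_year = Counter((year, cat) for year, cat in records)

for year in sorted({y for y, _ in records}):
    counts = {cat: by_year[(year, cat)] for cat in CATEGORIES}
    dominant = max(counts, key=counts.get)
    print(year, counts, "-> dominant:", dominant)
```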