932 results for Retrieval


Relevance: 10.00%

Abstract:

The business consultancy Ribas Álvarez currently has a problem with the management of its internal documents, which is handled through internal mail and a simple file-indexing application (HTML), without any kind of supervision or restriction. The company has a certain number of employees, who belong to different sections (private or public) within the company. The information circulating inside the company has no security at all: any employee can access it even when it is of no use to them. The aim is to create an application that meets the company's needs for the administration and management of internal documents, with user control and secure access to the application. The basic objective of the application is the creation and management of an intranet for the control and tracking of documents for a business consultancy.
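As a rough illustration of the access-control requirement described above, the sketch below (with entirely hypothetical users, sections and documents; it is not part of the Ribas Álvarez application) filters documents by the intersection of a user's sections with each document's allowed sections.

```python
# Hypothetical sketch of section-based access control (not the actual application).
from dataclasses import dataclass, field


@dataclass
class User:
    name: str
    sections: set[str] = field(default_factory=set)      # e.g. {"accounting", "public"}


@dataclass
class Document:
    title: str
    allowed_sections: set[str] = field(default_factory=set)


def visible_documents(user: User, docs: list[Document]) -> list[Document]:
    """Return only the documents whose allowed sections overlap the user's sections."""
    return [d for d in docs if d.allowed_sections & user.sections]


docs = [
    Document("payroll_2024.pdf", {"accounting"}),
    Document("holiday_calendar.pdf", {"public"}),
]
print([d.title for d in visible_documents(User("ana", {"public"}), docs)])
# -> ['holiday_calendar.pdf']
```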

Relevance: 10.00%

Abstract:

This work sought an answer to the apparent contradiction between the acts of preserving and developing in museum work, and with that answer it aimed to reach a deeper understanding of Museology. Using the "Grounded Theory" research methodology (Glaser & Strauss, 1967; Ellen, 1992; Mark, 1996; Marshall & Rossman, 1999), it adopted the definition of museum in the ICOM Statutes (2001) as the conceptual starting point for the research. A - Through the effort required to obtain the initial answer, the work achieved the following results: i) It discerned the phases and the rationale of the museological process through which objects acquire "heritage identity". ii) It formulated the concept of "museological object" in a sense distinct from that of Heritage or of the "heritage object", confirming that the contradiction stated in the initial hypothesis could only disappear, or be reconciled, within a paradigm of museum work conceived as an act of communication. iii) It consequently proposed a different Programme for guiding museum work, demonstrating that it would give heritage greater durability and transmissibility, while also being able to include the heritage relating to the materiality, iconicity, orality and gestuality of objects. iv) It proposed a Lexicon of Concepts capable of justifying these new proposals. v) It suggested a museal development index (IDM = Σ ƒξ [IP.ID.IC] / CT.CR) to make it possible to evaluate and quantify museum work. B - Towards the objective of a deeper understanding of Museology, the work achieved the following results: vi) It verified the need to master Management skills so that museum work is not restricted to only one type of collection or heritage. vii) In order to make it possible to continue investigating Museology as a new branch or discipline of knowledge, it suggested the strategic need to link it to the broader study of Memory, pointing to two paths: on the one hand, considering the phylogenetic inheritance of the "ways of storing information" among different organisms and systems (Lecointre & Le Guyader, 2001); on the other hand, considering the constraints arising during ontogeny and individual maturation, which require the molecular biology of cognition (Squire & Kandel, 2002) to be taken into account in the processing of memory and heritage (encoding, storage, evocation and retrieval, forgetting).

Relevance: 10.00%

Abstract:

The central focus of this dissertation is the challenge of making it easier to access the information held in the bibliographic database of the Biblioteca Universitária João Paulo II (BUJPII) of the Universidade Católica Portuguesa (UCP), whose subject content has so far been represented by the Universal Decimal Classification (UDC), a documentary language that is not very accessible to most of our users. These are mainly university students who consider it an unfriendly search tool, because they are barely or not at all familiar with this kind of numerical classification and prefer to use keywords to access the subject content of works. With this objective in mind, we undertook this research by harmonising (mapping) the UDC notations used to classify the BUJPII collection with a simplified list of Library of Congress Subject Headings, in order to begin assigning subject headings, mapped from the UDC notations, to part of that collection, whose content has until now been retrieved through the Universal Decimal Classification. The study focused experimentally on a sample of monographs from areas that had been classified but not yet indexed, whose bibliographic records are in the database of the Biblioteca Universitária João Paulo II. The project consisted of assigning subject headings translated manually into Portuguese from the English list of Library of Congress Subject Headings (LCSH), chosen to be semantically as close as possible to the subjects corresponding to the UDC notations with which the monographs had previously been classified. The work was first carried out manually and then loaded into the Horizon software, since this is the integrated library management system in use at the Biblioteca Universitária João Paulo II; the future goal is to index all areas of its bibliographic holdings, as a privileged complementary means of access to information.
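The core of the harmonisation step can be pictured as a correspondence table from UDC notations to subject headings, applied to records that so far carry only a UDC notation. The Python sketch below is purely illustrative; the notations and headings are examples, not the BUJPII's actual mapping table.

```python
# Illustrative correspondence table from UDC notations to (translated) subject
# headings; these entries are examples, not the library's actual table.
udc_to_heading = {
    "004.8": "Inteligência artificial",        # LCSH: Artificial intelligence
    "027.7": "Bibliotecas universitárias",     # LCSH: Academic libraries
    "811.134.3": "Língua portuguesa",          # LCSH: Portuguese language
}

# Bibliographic records that so far carry only a UDC notation.
records = [
    {"title": "Exemplo A", "udc": "004.8"},
    {"title": "Exemplo B", "udc": "027.7"},
    {"title": "Exemplo C", "udc": "930.85"},   # no mapping defined yet
]

for rec in records:
    heading = udc_to_heading.get(rec["udc"])
    if heading:
        rec["subject_heading"] = heading       # candidate heading, to be reviewed
    else:
        rec["needs_review"] = True             # left for a cataloguer to resolve

print(records)
```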

Relevance: 10.00%

Abstract:

The article analyses the conservation of architectural heritage in Ecuador. It proposes recovering a notion of heritage linked to the notion of citizenship. The essay outlines the state of the question in Ecuador, reviews the cases of Quito and Guayaquil, and focuses on the situation in Cuenca, a World Heritage city since 1999. It argues that research into, and dissemination of, the history of these cities should be adopted as part of a permanent public policy.

Relevance: 10.00%

Abstract:

The representation of the diurnal cycle in the Hadley Centre climate model is evaluated using simulations of the infrared radiances observed by Meteosat 7. In both the window and water vapour channels, the standard version of the model with 19 levels produces a good simulation of the geographical distributions of the mean radiances and of the amplitude of the diurnal cycle. Increasing the vertical resolution to 30 levels leads to further improvements in the mean fields. The timing of the maximum and minimum radiances reveals significant model errors, however, which are sensitive to the frequency with which the radiation scheme is called. In most regions, these errors are consistent with well documented errors in the timing of convective precipitation, which peaks before noon in the model, in contrast to the observed peak in the late afternoon or evening. When the radiation scheme is called every model time step (half an hour), as opposed to every three hours in the standard version, the timing of the minimum radiance is improved for convective regions over central Africa, due to the creation of upper-level layer-cloud by detrainment from the convection scheme, which persists well after the convection itself has dissipated. However, this produces a decoupling between the timing of the diurnal cycles of precipitation and window channel radiance. The possibility is raised that a similar decoupling may occur in reality and the implications of this for the retrieval of the diurnal cycle of precipitation from infrared radiances are discussed.
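A diagnostic commonly used for this kind of evaluation is a first-harmonic fit to the diurnal cycle at each grid point, which yields the amplitude and the local times of the maximum and minimum radiance. The sketch below illustrates the idea on synthetic 3-hourly values; it is not the code used in the study.

```python
# First-harmonic fit to a diurnal cycle: estimates the mean, the amplitude and the
# local times of maximum and minimum. Synthetic data only; not the evaluation code.
import numpy as np


def diurnal_harmonic(hours, values):
    """Least-squares fit of v(t) = mean + A*cos(omega*t) + B*sin(omega*t), omega = 2*pi/24."""
    hours = np.asarray(hours, dtype=float)
    values = np.asarray(values, dtype=float)
    omega = 2.0 * np.pi / 24.0
    design = np.column_stack([np.ones_like(hours),
                              np.cos(omega * hours),
                              np.sin(omega * hours)])
    mean, ca, cb = np.linalg.lstsq(design, values, rcond=None)[0]
    amplitude = np.hypot(ca, cb)
    t_max = (np.arctan2(cb, ca) / omega) % 24.0   # local hour of the fitted maximum
    t_min = (t_max + 12.0) % 24.0                 # minimum is half a cycle later
    return mean, amplitude, t_max, t_min


hours = np.arange(0, 24, 3)                                     # 3-hourly sampling
radiance = 95.0 - 10.0 * np.cos(2 * np.pi * (hours - 17) / 24)  # synthetic window-channel cycle
print(diurnal_harmonic(hours, radiance))   # minimum near 17 local time in this example
```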

Relevance: 10.00%

Abstract:

The water vapour continuum absorption is an important component of molecular absorption of radiation in the atmosphere. However, uncertainty in the value of the continuum absorption can currently reach 100% in some spectral regions, leading to errors in flux calculations of up to 3-5 W/m2 in the global mean. This work uses line-by-line calculations to identify the best spectral intervals for experimental verification of the CKD water vapour continuum models in the currently least studied near-infrared spectral region. Possible sources of error in the continuum retrieval taken into account in the simulation include the sensitivity of laboratory spectrometers and uncertainties in the spectral line parameters in HITRAN-2004 and the Schwenke-Partridge database. It is shown that a number of micro-windows in the near-IR can currently be used for laboratory detection of the water vapour continuum with an estimated accuracy of between 5 and 30%.
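The micro-window selection can be thought of as flagging spectral points where the continuum accounts for a large share of the total absorption while the transmitted signal remains measurable. The sketch below uses synthetic optical depths purely to illustrate the criterion; a real study would use HITRAN-based line-by-line calculations and a CKD continuum model.

```python
# Synthetic illustration of the micro-window criterion: continuum-dominated points
# with still-measurable transmission. Optical depths here are placeholders.
import numpy as np

wavenumber = np.linspace(5000.0, 5500.0, 501)                  # cm^-1, near-IR grid
tau_lines = 0.05 + 2.0 * np.abs(np.sin(wavenumber / 3.0))      # placeholder line optical depth
tau_continuum = np.full_like(wavenumber, 0.02)                 # placeholder continuum optical depth

continuum_fraction = tau_continuum / (tau_continuum + tau_lines)
transmission = np.exp(-(tau_continuum + tau_lines))

# Candidate micro-windows: the continuum contributes a sizeable share of the
# absorption and the signal is above a (spectrometer-dependent) detection floor.
candidate = (continuum_fraction > 0.2) & (transmission > 0.5)
print(wavenumber[candidate][:10])
```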

Relevance: 10.00%

Abstract:

The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) suggest that it is a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001–2025: one control run with natural variability only, and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On the basis of this, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the “observed” climatology and the “true” climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001–2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment. They indicated that observational and sampling errors (both contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data and that they should be sufficiently small for monitoring the expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
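Assuming the observational and sampling contributions are independent, combining them in quadrature gives a feel for the numbers quoted above; the trend value in the sketch below is a hypothetical placeholder, not a MAECHAM5 result.

```python
# Quadrature combination of the error terms quoted above, assuming independence;
# the trend value is a hypothetical placeholder, not a MAECHAM5 result.
import math

obs_error_K = 0.2            # instrument- and retrieval-processing-related error
sampling_error_K = 0.2       # spatial-temporal undersampling error
total_error_K = math.hypot(obs_error_K, sampling_error_K)
print(f"combined climatology error ~ {total_error_K:.2f} K")      # ~0.28 K

trend_K_per_decade = 0.3     # hypothetical UTLS temperature trend
years_to_emerge = 10.0 * total_error_K / trend_K_per_decade
print(f"trend exceeds the combined error after ~ {years_to_emerge:.0f} years")
```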

Relevance: 10.00%

Abstract:

In this paper, we introduce a novel high-level visual content descriptor devised for performing semantic-based image classification and retrieval. The work can be treated as an attempt to bridge the so-called “semantic gap”. The proposed image feature vector model is fundamentally underpinned by the image labelling framework called Collaterally Confirmed Labelling (CCL), which combines the collateral knowledge extracted from the collateral texts of the images with state-of-the-art low-level image processing and visual feature extraction techniques to automatically assign linguistic keywords to image regions. Two different high-level image feature vector models are developed based on the CCL labelling results, for the purposes of image data clustering and retrieval respectively. A subset of the Corel image collection has been used for evaluating our proposed method. The experimental results to date already indicate that our proposed semantic-based visual content descriptors outperform both traditional visual and textual image feature models.
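The general idea of turning region-level keyword labels into a vector that supports clustering and retrieval can be sketched as follows; the CCL framework itself is considerably more elaborate, so this is only an illustration of the concept, with a made-up vocabulary and labels.

```python
# Bag-of-keywords sketch: region labels become a count vector over a small
# vocabulary, and retrieval ranks images by cosine similarity to the query.
# Vocabulary and labels are invented; this is not the CCL descriptor itself.
import numpy as np

vocabulary = ["sky", "water", "grass", "building", "person"]


def keyword_vector(region_labels):
    """Count how often each vocabulary keyword was assigned to the image's regions."""
    v = np.zeros(len(vocabulary))
    for label in region_labels:
        if label in vocabulary:
            v[vocabulary.index(label)] += 1
    return v


def cosine(u, v):
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v / denom) if denom else 0.0


database = {
    "img_001": keyword_vector(["sky", "water", "water"]),
    "img_002": keyword_vector(["building", "person", "sky"]),
}
query = keyword_vector(["water", "sky"])
ranking = sorted(database, key=lambda k: cosine(query, database[k]), reverse=True)
print(ranking)   # img_001 ranks first for a water/sky query
```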

Relevance: 10.00%

Abstract:

This paper presents the model SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes), which is a vertical (1-D) integrated radiative transfer and energy balance model. The model links visible to thermal infrared radiance spectra (0.4 to 50 μm) as observed above the canopy to the fluxes of water, heat and carbon dioxide, as a function of vegetation structure and the vertical profiles of temperature. Output of the model is the spectrum of outgoing radiation in the viewing direction and the turbulent heat fluxes, photosynthesis and chlorophyll fluorescence. A special routine is dedicated to the calculation of photosynthesis rate and chlorophyll fluorescence at the leaf level as a function of net radiation and leaf temperature. The fluorescence contributions from individual leaves are integrated over the canopy layer to calculate top-of-canopy fluorescence. The calculation of radiative transfer and the energy balance is fully integrated, allowing for feedback between leaf temperatures, leaf chlorophyll fluorescence and radiative fluxes. Leaf temperatures are calculated on the basis of energy balance closure. Model simulations were evaluated against observations reported in the literature and against data collected during field campaigns. These evaluations showed that SCOPE is able to reproduce realistic radiance spectra, directional radiance and energy balance fluxes. The model may be applied for the design of algorithms for the retrieval of evapotranspiration from optical and thermal earth observation data, for validation of existing methods to monitor vegetation functioning, to help interpret canopy fluorescence measurements, and to study the relationships between synoptic observations and diurnally integrated quantities. The model has been implemented in Matlab and has a modular design, thus allowing for great flexibility and scalability.
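The energy balance closure that determines leaf temperature can be illustrated with a toy residual equation solved for the temperature at which absorbed radiation balances the turbulent fluxes. The sketch below is not SCOPE's scheme (which couples this to photosynthesis and fluorescence); all parameter values are invented.

```python
# Toy energy-balance closure for a single leaf: find the leaf temperature at which
# absorbed radiation minus emitted longwave equals sensible plus (here prescribed)
# latent heat. All parameter values are invented; SCOPE's scheme is more complete.
from scipy.optimize import brentq

SIGMA = 5.67e-8       # Stefan-Boltzmann constant, W m-2 K-4
RHO_CP = 1200.0       # volumetric heat capacity of air, J m-3 K-1


def energy_residual(t_leaf, sw_abs=400.0, lw_in=350.0, emissivity=0.97,
                    t_air=293.0, r_a=50.0, latent_heat=150.0):
    """Net radiation minus turbulent fluxes; zero at the balanced leaf temperature."""
    net_radiation = sw_abs + emissivity * lw_in - emissivity * SIGMA * t_leaf**4
    sensible = RHO_CP * (t_leaf - t_air) / r_a     # aerodynamic resistance r_a in s m-1
    return net_radiation - sensible - latent_heat


t_leaf = brentq(energy_residual, 250.0, 350.0)     # bracket the root in kelvin
print(f"balanced leaf temperature: {t_leaf:.1f} K")
```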

Relevance: 10.00%

Abstract:

Mainframes, corporate and central servers are becoming information servers. The requirement for more powerful information servers is the best opportunity to exploit the potential of parallelism. ICL recognized the opportunity of the 'knowledge spectrum', namely to convert raw data into information and then into high-grade knowledge. Its response to this, and to the underlying search problems, was to introduce the CAFS retrieval engine. The CAFS product demonstrates that it is possible to move functionality within an established architecture, introduce a different technology mix and exploit parallelism to achieve radically new levels of performance. CAFS also demonstrates the benefit of achieving this transparently behind existing interfaces. ICL is now working with Bull and Siemens to develop the information servers of the future by exploiting new technologies as they become available. The objective of the joint Esprit II European Declarative System project is to develop a smoothly scalable, highly parallel computer system, EDS. EDS will in the main be an SQL server and an information server. It will support the many data-intensive applications which the companies foresee; it will also support application-intensive and logic-intensive systems.

Relevance: 10.00%

Abstract:

An efficient method is described for the approximate calculation of the intensity of multiply scattered lidar returns. It divides the outgoing photons into three populations, representing those that have experienced zero, one, and more than one forward-scattering event. Each population is parameterized at each range gate by its total energy, its spatial variance, the variance of photon direction, and the covariance of photon direction and position. The result is that for an N-point profile the calculation is O(N^2) efficient and implicitly includes up to N-order scattering, making it ideal for use in iterative retrieval algorithms for which speed is crucial. In contrast, models that explicitly consider each scattering order separately are at best O(N^m/m!) efficient for m-order scattering and often cannot be performed to more than the third or fourth order in retrieval algorithms. For typical cloud profiles and a wide range of lidar fields of view, the new algorithm is as accurate as an explicit calculation truncated at the fifth or sixth order but faster by several orders of magnitude. (C) 2006 Optical Society of America.
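The quoted scalings translate into a large gap in operation counts, as the back-of-envelope comparison below shows for an example profile length; the numbers are scalings only, not timings of the actual implementation.

```python
# Back-of-envelope operation counts behind the efficiency claim: O(N^2) for the
# moment-based scheme versus roughly N^m / m! for an explicit calculation of
# scattering order m. Scalings only, not timings.
from math import factorial

N = 200                                      # example number of range gates
moment_scheme = N ** 2                       # implicitly includes up to N-order scattering
explicit = {m: N ** m // factorial(m) for m in range(2, 7)}

print(f"moment-based scheme     ~ {moment_scheme:.1e} operations")
for m, count in explicit.items():
    print(f"explicit, order m = {m}   ~ {count:.1e} operations")
```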

Relevance: 10.00%

Abstract:

The combination of radar and lidar in space offers the unique potential to retrieve vertical profiles of ice water content and particle size globally, and two algorithms developed recently claim to have overcome the principal difficulty with this approach: that of correcting the lidar signal for extinction. In this paper "blind tests" of these algorithms are carried out, using realistic 94-GHz radar and 355-nm lidar backscatter profiles simulated from aircraft-measured size spectra, and including the effects of molecular scattering, multiple scattering, and instrument noise. Radiation calculations are performed on the true and retrieved microphysical profiles to estimate the accuracy with which radiative flux profiles could be inferred remotely. It is found that the visible extinction profile can be retrieved independent of assumptions on the nature of the size distribution, the habit of the particles, the mean extinction-to-backscatter ratio, or errors in instrument calibration. Local errors in retrieved extinction can occur in proportion to local fluctuations in the extinction-to-backscatter ratio, but down to 400 m above the height of the lowest lidar return, optical depth is typically retrieved to better than 0.2. Retrieval uncertainties are greater at the far end of the profile, and errors in total optical depth can exceed 1, which changes the shortwave radiative effect of the cloud by around 20%. Longwave fluxes are much less sensitive to errors in total optical depth, and may generally be calculated to better than 2 W m^-2 throughout the profile. It is important for retrieval algorithms to account for the effects of lidar multiple scattering, because if this is neglected, then optical depth is underestimated by approximately 35%, resulting in cloud radiative effects being underestimated by around 30% in the shortwave and 15% in the longwave. Unlike the extinction coefficient, the inferred ice water content and particle size can vary by 30%, depending on the assumed mass-size relationship (a problem common to all remote retrieval algorithms). However, radiative fluxes are almost completely determined by the extinction profile, and if this is correct, then errors in these other parameters have only a small effect in the shortwave (around 6%, compared to that of clear sky) and a negligible effect in the longwave.

Relevance: 10.00%

Abstract:

Despite the success of studies attempting to integrate remotely sensed data and flood modelling, the need to provide near-real-time data routinely on a global scale, and the setting up of online data archives, there is to date a lack of spatially and temporally distributed hydraulic parameters to support these ongoing modelling efforts. The objective of this project is therefore to provide a global evaluation and benchmark data set of floodplain water stages with uncertainties, and to assimilate these data in a large-scale flood model, using space-borne radar imagery. An algorithm is developed for the automated retrieval of water stages with uncertainties from a sequence of radar images, and the data are assimilated in a flood model using the Tewkesbury 2007 flood event as a feasibility study. The retrieval method we employ is based on possibility theory, which is an extension of fuzzy sets and encompasses probability theory. We first attempt to identify the main sources of uncertainty in the retrieval of water stages from radar imagery, for which we define physically meaningful ranges of parameter values. Possibilities are then computed for each parameter value using a triangular ‘membership’ function. This procedure allows the computation of possible values of water stage at maximum flood extent at many different locations along a river. At a later stage in the project these data are used in the assimilation, calibration or validation of a flood model. The application is subsequently extended to the global scale using wide-swath radar imagery and a simple global flood forecasting model, thereby providing improved river discharge estimates with which to update the latter.
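The triangular ‘membership’ function at the heart of the possibilistic retrieval assigns each candidate parameter value a possibility between 0 and 1, given a physically plausible range and a most-plausible value. The sketch below illustrates this with an invented parameter and range, not the project's actual settings.

```python
# Triangular membership/possibility function over a physically meaningful range;
# the parameter (a backscatter threshold) and its range are illustrative only.
def triangular_possibility(x, lower, mode, upper):
    """Possibility of value x under a triangular membership function on [lower, upper]."""
    if x <= lower or x >= upper:
        return 0.0
    if x <= mode:
        return (x - lower) / (mode - lower)
    return (upper - x) / (upper - mode)


# Possibility that the threshold separating flooded from dry pixels takes a given
# value (in dB), assuming a plausible range of [-15, -5] dB with mode -10 dB.
for threshold_db in (-14.0, -11.0, -8.0, -5.0):
    p = triangular_possibility(threshold_db, lower=-15.0, mode=-10.0, upper=-5.0)
    print(f"threshold {threshold_db:>6.1f} dB -> possibility {p:.2f}")
```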

Relevance: 10.00%

Abstract:

A method to estimate the size and liquid water content of drizzle drops using lidar measurements at two wavelengths is described. The method exploits the differential absorption of infrared light by liquid water at 905 nm and 1.5 μm, which leads to a different backscatter cross section for water drops larger than ≈50 μm. The ratio of backscatter measured from drizzle samples below cloud base at these two wavelengths (the colour ratio) provides a measure of the median volume drop diameter D0. This is a strong effect: for D0=200 μm, a colour ratio of ≈6 dB is predicted. Once D0 is known, the measured backscatter at 905 nm can be used to calculate the liquid water content (LWC) and other moments of the drizzle drop distribution. The method is applied to observations of drizzle falling from stratocumulus and stratus clouds. High resolution (32 s, 36 m) profiles of D0, LWC and precipitation rate R are derived. The main sources of error in the technique are the need to assume a value for the dispersion parameter μ in the drop size spectrum (leading to at most a 35% error in R) and the influence of aerosol returns on the retrieval (≈10% error in R for the cases considered here). Radar reflectivities are also computed from the lidar data, and compared to independent measurements from a colocated cloud radar, offering independent validation of the derived drop size distributions.
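The first step of the retrieval, inverting the colour ratio to a median volume diameter, can be sketched as a lookup-table interpolation; apart from the 6 dB ≈ 200 μm point quoted above, the table values below are invented placeholders, and real use requires scattering calculations for the assumed drop size distribution.

```python
# Colour-ratio inversion sketch: a hypothetical monotonic table relating the
# 905 nm / 1.5 um colour ratio (dB) to the median volume diameter D0 (um); only
# the 6 dB ~ 200 um entry comes from the abstract, the rest are placeholders.
import numpy as np

cr_table_db = np.array([0.0, 2.0, 4.0, 6.0, 8.0])           # colour ratio, dB
d0_table_um = np.array([50.0, 100.0, 150.0, 200.0, 260.0])  # median volume diameter, um


def retrieve_d0(beta_905, beta_1500):
    """Median volume drop diameter from the measured backscatter colour ratio."""
    colour_ratio_db = 10.0 * np.log10(beta_905 / beta_1500)
    return float(np.interp(colour_ratio_db, cr_table_db, d0_table_um))


beta_905, beta_1500 = 4.0e-6, 1.0e-6      # sr-1 m-1, synthetic measurements
d0 = retrieve_d0(beta_905, beta_1500)     # 10*log10(4) ~ 6 dB -> ~200 um
print(f"D0 ~ {d0:.0f} um")
# With D0 fixed, the 905 nm backscatter then scales the liquid water content and
# the other moments of the assumed drop size distribution, as described above.
```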