854 results for Concept-based Retrieval
Abstract:
We establish a methodology for calculating uncertainties in sea surface temperature (SST) estimates from coefficient-based satellite retrievals. The uncertainty estimates are derived independently of in-situ data. This enables validation of both the retrieved SSTs and their uncertainty estimates using in-situ data records. The total uncertainty budget comprises a number of components, arising from uncorrelated (e.g. noise), locally systematic (e.g. atmospheric), large-scale systematic, and sampling effects (for gridded products). Distinguishing these components is important when propagating uncertainty across spatio-temporal scales. We apply the method to SST data retrieved from the Advanced Along Track Scanning Radiometer (AATSR) and validate the results for two different SST retrieval algorithms, both at a per-pixel level and for gridded data. We find good agreement between our estimated uncertainties and validation data. This approach to calculating uncertainties in SST retrievals has a wider application to data from other instruments and retrieval of other geophysical variables.
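A minimal sketch of the propagation idea described above, assuming a simple three-component error model (uncorrelated, locally systematic, large-scale systematic) and illustrative numbers; the function name and the treatment of within-cell correlation are assumptions, not the paper's exact method:

```python
import numpy as np

def gridded_sst_uncertainty(u_random, u_local, u_systematic):
    """Illustrative propagation of per-pixel SST uncertainty components
    into a single grid-cell uncertainty (all values in kelvin).

    u_random     -- per-pixel uncorrelated (noise-like) uncertainties
    u_local      -- per-pixel locally systematic (e.g. atmospheric) uncertainties
    u_systematic -- large-scale systematic uncertainty for the cell
    """
    u_random = np.asarray(u_random, dtype=float)
    u_local = np.asarray(u_local, dtype=float)
    n = u_random.size

    # Uncorrelated errors average down with the number of pixels in the cell.
    u_rand_cell = np.sqrt(np.mean(u_random**2) / n)

    # Locally systematic errors are assumed fully correlated within the cell,
    # so they do not reduce with n (a simplifying assumption).
    u_local_cell = np.mean(u_local)

    # Combine the three components in quadrature.
    return np.sqrt(u_rand_cell**2 + u_local_cell**2 + u_systematic**2)


# Example: 100 pixels with 0.08 K noise, 0.15 K locally systematic,
# and 0.05 K large-scale systematic uncertainty.
print(gridded_sst_uncertainty(np.full(100, 0.08), np.full(100, 0.15), 0.05))
```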
Abstract:
Texture is one of the most important visual attributes used in image analysis. It is used in many content-based image retrieval systems, where it allows a larger number of images from distinct origins to be identified. This paper presents a novel approach to image analysis and retrieval based on complexity analysis. The approach consists of a texture segmentation step, performed by complexity analysis through the box-counting fractal dimension, followed by estimation of the complexity of each computed region by the multiscale fractal dimension. Experiments have been performed with an MRI database in both pattern recognition and image retrieval contexts. The results show the accuracy of the method and also indicate how performance changes as the texture segmentation process is altered.
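A minimal sketch of the box-counting fractal dimension estimate mentioned above, for a binary image; the box sizes and the demo image are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def box_counting_dimension(binary_image, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a 2-D binary image.

    For each box size s, count the boxes containing at least one foreground
    pixel, then fit log(count) against log(1/s); the slope is the estimate.
    """
    img = np.asarray(binary_image, dtype=bool)
    counts = []
    for s in box_sizes:
        # Trim so the image tiles exactly into s x s boxes.
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Example: a filled square region should give a dimension close to 2.
demo = np.zeros((128, 128), dtype=bool)
demo[32:96, 32:96] = True
print(box_counting_dimension(demo))
```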
Abstract:
This work develops a methodology, using the degree-days concept and linear regression, to forecast the duration of phenological phases in crops. An experiment was conducted in a greenhouse with three cultivars of cowpea (Vigna unguiculata (L.) Walp.): cv. California-781, Tvx 5058-09C and IT 81D-1032. The methodology is based on the relative thermal efficiency rate, determined for each species or cultivar. The results show that the proposed methodology may be a good alternative in studies involving crops, especially because it does not require the repetition of experiments.
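A minimal sketch of the degree-days idea underlying the methodology (not the paper's exact model); the base temperature and thermal requirement are hypothetical values:

```python
def days_to_complete_phase(daily_mean_temps, base_temp=10.0, thermal_requirement=500.0):
    """Return the number of days needed to accumulate the required
    growing degree-days (GDD); thresholds here are illustrative."""
    accumulated = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        accumulated += max(temp - base_temp, 0.0)   # daily GDD contribution
        if accumulated >= thermal_requirement:
            return day
    return None  # requirement not reached within the record


# Example: a constant 25 degree C season reaches 500 GDD in about 34 days.
print(days_to_complete_phase([25.0] * 60))
```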
Abstract:
Accurate long-term monitoring of total ozone is one of the most important requirements for identifying possible natural or anthropogenic changes in the composition of the stratosphere. For this purpose, the NDACC (Network for the Detection of Atmospheric Composition Change) UV-visible Working Group has made recommendations for improving and homogenizing the retrieval of total ozone columns from twilight zenith-sky visible spectrometers. These instruments, deployed all over the world at about 35 stations, allow total ozone to be measured twice daily with limited sensitivity to stratospheric temperature and cloud cover. The NDACC recommendations address both the DOAS spectral parameters and the calculation of the air mass factors (AMF) needed for the conversion of O3 slant column densities into vertical column amounts. The most important improvement is the use of O3 AMF look-up tables calculated using the TOMS V8 (TV8) O3 profile climatology, which accounts for the dependence of the O3 AMF on the seasonal and latitudinal variations of the O3 vertical distribution. To investigate their impact on the retrieved ozone columns, the recommendations have been applied to measurements from the NDACC/SAOZ (Système d'Analyse par Observation Zénithale) network. The revised SAOZ ozone data from eight stations deployed at all latitudes have been compared to TOMS, GOME-GDP4, SCIAMACHY-TOSOMI, SCIAMACHY-OL3, OMI-TOMS, and OMI-DOAS satellite overpass observations, as well as to those of collocated Dobson and Brewer instruments at Observatoire de Haute Provence (44° N, 5.5° E) and Sodankylä (67° N, 27° E), respectively. A significantly better agreement is obtained between SAOZ and correlative reference ground-based measurements after applying the new O3 AMFs. However, systematic seasonal differences between SAOZ and satellite instruments remain. These are shown to originate mainly from (i) a possible problem in the satellite retrieval algorithms in dealing with the temperature dependence of the ozone cross-sections in the UV and with the solar zenith angle (SZA) dependence, (ii) zonal modulations and seasonal variations of tropospheric ozone columns not accounted for in the TV8 profile climatology, and (iii) uncertainty in the stratospheric ozone profiles at high latitude in winter in the TV8 climatology. For those measurements most sensitive to stratospheric temperature, such as TOMS, OMI-TOMS, Dobson and Brewer, or to SZA, such as SCIAMACHY-TOSOMI, the application of temperature and SZA corrections results in the almost complete removal of the seasonal difference with SAOZ, significantly improving the consistency between all ground-based and satellite total ozone observations.
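A minimal sketch of the conversion step described above: the vertical column is obtained by dividing the measured slant column density by an air mass factor interpolated from a look-up table. The toy table and values below are illustrative assumptions; the NDACC look-up tables, based on the TV8 O3 profile climatology, are far more detailed:

```python
import numpy as np

def vertical_column(scd, sza_deg, amf_table_sza, amf_table_values):
    """Convert an O3 slant column density (molecules/cm^2) to a vertical
    column by dividing by the AMF interpolated at the solar zenith angle."""
    amf = np.interp(sza_deg, amf_table_sza, amf_table_values)
    return scd / amf

# Hypothetical twilight AMF values increasing with solar zenith angle.
sza_grid = np.array([70.0, 80.0, 86.0, 90.0, 91.0])
amf_grid = np.array([2.9, 5.6, 11.0, 17.0, 18.5])

print(vertical_column(scd=2.4e19, sza_deg=88.0,
                      amf_table_sza=sza_grid, amf_table_values=amf_grid))
```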
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept that has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and their social context; from this need, an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. In Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (interconnected conditions that occur in an activity) and individualization (characteristics that distinguish an individual). This shift of focus to the individual's need undermines the rigid linearity of the classical model, which has been superseded by the "berry picking" model: search terms change thanks to the informational feedback received from the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query together with its own context. The method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the aim of optimizing any IR system led to its development as a middleware of interaction between the user and the IR system. The system therefore has just two possible actions: rewriting the query and reordering the results. Equivalent actions are described in PS, which generally exploits information derived from the analysis of user behavior, whereas the proposed approach exploits knowledge provided by the user. The thesis goes further by proposing a novel assessment procedure, following the "Cranfield paradigm", in order to evaluate this type of IR system. The results achieved are interesting, considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
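A minimal sketch (not the thesis implementation) of the two middleware actions described above: rewriting the query with context terms drawn from a local knowledge base, and reordering the results returned by the underlying IR system. All names and the toy knowledge base are illustrative assumptions:

```python
LOCAL_KNOWLEDGE_BASE = {
    # User-provided context terms associated with query concepts.
    "retrieval": ["information retrieval", "ranking"],
    "jaguar": ["animal", "wildlife"],
}

def rewrite_query(query):
    """Expand the query with context terms from the local knowledge base."""
    terms = query.lower().split()
    expansions = [ctx for t in terms for ctx in LOCAL_KNOWLEDGE_BASE.get(t, [])]
    return " ".join(terms + expansions)

def reorder_results(results, context_terms):
    """Rank results higher the more context terms they mention."""
    def score(doc):
        text = doc.lower()
        return sum(text.count(term) for term in context_terms)
    return sorted(results, key=score, reverse=True)

# Example round trip through the middleware.
query = rewrite_query("jaguar speed")
docs = ["Jaguar cars top speed", "Jaguar: the fastest animal sprint in wildlife"]
print(query)
print(reorder_results(docs, LOCAL_KNOWLEDGE_BASE["jaguar"]))
```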
Abstract:
This article reports on the internet-based second multicenter study (MCS II) of the spine study group (AG WS) of the German trauma association (DGU). It represents a continuation of the first study conducted between 1994 and 1996 (MCS I). For the purpose of a common, centralised data capture methodology, a newly developed internet-based data collection system ( http://www.memdoc.org ) of the Institute for Evaluative Research in Orthopaedic Surgery of the University of Bern was used. The aim of this first publication on MCS II is to describe in detail the new method of data collection via the internet and the structure of the developed database system. The goal of the study was to assess the current state of treatment for fresh traumatic injuries of the thoracolumbar spine in the German-speaking part of Europe. For that reason, we intended to collect a large number of cases and representative, valid information about the radiographic, clinical and subjective treatment outcomes. Thanks to the new study design of MCS II, not only the common surgical treatment concepts, but also the new and constantly broadening spectrum of spine surgery, i.e. vertebro-/kyphoplasty, computer-assisted surgery and navigation, and minimally invasive and endoscopic techniques, could be documented and evaluated. We present a first statistical overview and preliminary analysis of 18 centers from Germany and Austria that participated in MCS II. Real-time data capture at source was made possible by the constant availability of the data collection system via internet access. Following the principle of an application service provider, software, questionnaires and validation routines are located on a central server, which is accessed from the periphery (hospitals) by means of standard internet browsers. In this way, costly and time-consuming software installation and maintenance of local data repositories are avoided and, more importantly, cumbersome migration of data into one integrated database becomes obsolete. Finally, this set-up also replaces traditional systems in which paper questionnaires were mailed to the central study office and entered by hand, where incomplete or incorrect forms always represent a resource-consuming problem and a source of error. With the new study concept and the expanded inclusion criteria of MCS II, 1,251 case histories with admission and surgical data were collected. This remarkable number of interventions documented during 24 months represents an increase of 183% compared to the previously conducted MCS I. The concept and technical feasibility of the MEMdoc data collection system were proven, as the participants of MCS II succeeded in collecting the largest series of data ever published on patients with spinal injuries treated within a 2-year period.
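A minimal sketch of the server-side validation idea mentioned above, which lets incomplete submissions be flagged at source rather than discovered later on mailed paper forms; the field names and rules are hypothetical and not the actual MEMdoc questionnaires:

```python
REQUIRED_FIELDS = ["patient_id", "admission_date", "injury_level", "treatment"]

def validate_case_record(record):
    """Return a list of validation errors for an incoming case record."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    # Hypothetical plausibility rule for a thoracolumbar spine registry.
    if record.get("injury_level") and not record["injury_level"].startswith(("T", "L")):
        errors.append("injury_level must be a thoracic (T) or lumbar (L) level")
    return errors

# Example: an incomplete submission is rejected immediately at source.
print(validate_case_record({"patient_id": "A-102", "injury_level": "L1"}))
```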
Abstract:
The ancient Indian system of Ayurveda revolves around the concept of tridosha, namely vata, pitta and kapha. A disturbance of the steady state of these three factors is said to be the starting point of the development of diseases. Even though the diseases and their associated symptoms are well described in the medical system, the basic texts make no direct mention of quantification. We offer here a specific protocol to quantify a disease in terms of ayurvedic principles.