891 results for "information source"
Abstract:
This paper reports research evaluating the potential and effects of using annotated paraconsistent logic in automatic indexing. This logic, concerned with studying and developing inconsistency-tolerant systems, attempts to deal with contradictions. Being flexible and containing logical states that go beyond the yes/no dichotomy, it supports the hypothesis that indexing results could be better than those obtained by traditional methods. Interactions between different disciplines, such as information retrieval, automatic indexing, information visualization, and nonclassical logics, were considered in this research. From the methodological point of view, an algorithm for the treatment of uncertainty and imprecision, developed under paraconsistent logic, was used to modify the values of the weights assigned to indexing terms of the text collections. The tests were performed on an information visualization system named Projection Explorer (PEx), created at the Institute of Mathematics and Computer Science (ICMC - USP Sao Carlos), with available source code. PEx uses the traditional vector space model to represent the documents of a collection. The results were evaluated by criteria built into the information visualization system itself and demonstrated measurable gains in the quality of the displays, confirming the hypothesis that the para-analyser, under the conditions of the experiment, can generate more effective clusters of similar documents. This is a noteworthy point, since the constitution of more significant clusters can be used to enhance information indexing and retrieval. It can be argued that the adoption of non-dichotomous (non-exclusive) parameters provides new possibilities for relating similar information.
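The abstract does not detail the para-analyser algorithm. In annotated paraconsistent logic, a pair of evidence degrees (mu, lambda) is commonly mapped to a degree of certainty and a degree of contradiction; a term weight could then be modulated by these degrees. A minimal sketch under that assumption (the `adjust_weight` rule is hypothetical, not the paper's algorithm):

```python
def para_analyser(mu, lam):
    """Basic para-analyser for an annotated evidence pair.

    mu  -- degree of favorable evidence, in [0, 1]
    lam -- degree of unfavorable evidence, in [0, 1]
    Returns (certainty, contradiction), each in [-1, 1].
    """
    certainty = mu - lam           # Gc: +1 = true, -1 = false
    contradiction = mu + lam - 1   # Gct: +1 = inconsistent, -1 = undetermined
    return certainty, contradiction


def adjust_weight(weight, mu, lam):
    """Hypothetical rule: scale an indexing-term weight by the positive
    part of the certainty degree, damped by the contradiction degree."""
    gc, gct = para_analyser(mu, lam)
    return weight * max(gc, 0.0) * (1.0 - abs(gct))
```

For example, fully favorable, uncontradicted evidence (mu=1, lam=0) leaves a weight unchanged, while contradictory evidence (mu=1, lam=1) drives it to zero.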
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Agro-industrial wastes of vegetable origin have been seen as a problem since the beginning of industrial processing; however, they are becoming attractive as raw material for numerous purposes, such as sources of active enzymes and in the bioprospecting of molecules. Moreover, in studies on agro-industrial waste it is often difficult to understand what the studied residue consists of, since waste names and constituents may vary according to the equipment used, as with waste from orange and mango processing. Thus, defining a specific waste, including comparisons between botanical and industrial descriptions, can help in understanding studies about wastes. The current review sought to contextualize this scenario by gathering definitions, relevant information, and studies on agro-industrial wastes and by-products, the international enzyme market, and recent studies on bioactive compounds. In this context, wastes from orange and mango are interesting because of the importance of these fruits on the world market; moreover, their processing does not include steps that could disrupt these biomolecules.
Abstract:
Setting: White-tailed deer represent the first wildlife reservoir of Mycobacterium bovis in the United States. The behavior of does with nursing fawns provides several potential mechanisms for disease transmission. Little information exists concerning transmission between doe and fawn, specifically transmammary transmission. Objective: Determine whether fawns can become infected by ingestion of milk replacer containing M. bovis, thus simulating transmission from doe to fawn through contaminated milk. Design: Seventeen 21-day-old white-tailed deer fawns were inoculated orally with 2 x 10^8 CFU (high dose, n=5), 2.5 x 10^5 to 2.5 x 10^6 CFU (medium dose, n=5), or 1 x 10^4 CFU (low dose, n=5) of M. bovis in milk replacer. Dosages were divided equally and fed daily over a 5-day period. Positive control fawns (n=2) received 1 x 10^5 CFU of M. bovis instilled in the tonsillar crypts. Fawns were euthanized and examined 35-115 days after inoculation, and various tissues were collected for bacteriologic and microscopic analysis. Results: All fawns in the tonsillar, high oral, and medium oral dose groups developed generalized tuberculosis involving numerous organs and tissues by 35-84 days after inoculation. Three of five fawns in the low-dose oral group had tuberculous lesions in the mandibular lymph node, and one of five had lesions in the medial retropharyngeal lymph node when examined 115 days after inoculation. Conclusion: White-tailed deer fawns can become infected through oral exposure to M. bovis. Therefore, the potential exists for fawns to acquire M. bovis while nursing tuberculous does.
Abstract:
Discussions on the future of cataloging have received increased attention in the last ten years, mainly due to the impact of the rapid development of information and communication technologies in the same period, which has provided access to the Web anytime, anywhere. These discussions revolve around the need for a new bibliographic framework to meet the demands of this new reality in the digital environment, i.e., how can libraries process, store, deliver, share, and integrate their collections (physical, digital, or scanned) in the current post-PC era? Faced with this question, Open Access, Open Source, and Open Standards are three concepts that need to receive greater attention in the field of Library and Information Science, as they are believed to be fundamental elements for a paradigm change in descriptive representation, which is currently based conceptually on the physical item rather than on the intellectual work. This paper aims to raise and discuss such issues and to encourage information professionals, especially librarians, to think about, discuss, and propose initiatives for these problems, contributing and sharing ideas and possible solutions in multidisciplinary teams. Finally, it suggests the creation of multidisciplinary and inter-institutional study groups on the future of cataloging and its impact on national collections, in order to contribute to the area of descriptive representation at national and international levels.
Abstract:
[EN] This abstract describes the development of a wildfire forecasting plugin using Capaware. Capaware is designed as an easy-to-use open source framework for developing 3D graphics applications over large geographic areas, offering high-performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community.
Abstract:
[EN] This paper describes a wildfire forecasting application based on a 3D virtual environment and a fire simulation engine. A novel open source framework is presented for the development of 3D graphics applications over large geographic areas, offering high-performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community. The application includes a remote module that allows simultaneous connection of several users for monitoring a real wildfire event.
Abstract:
[EN] One of the main issues of the current education system is the lack of student motivation. This aspect, together with the permanent change that Information and Communications Technologies involve, represents a major challenge for the teacher: to continuously update contents and to keep the students' interest awake. A tremendously useful tool in classrooms is the integration of projects with participative and collaborative dynamics, where the teacher acts mainly as a guide to the students' activity instead of being a mere transmitter of knowledge and evaluation. As a specific example of project-based learning, the EDUROVs project consists of building an economical underwater robot using low-cost materials, while still allowing the integration and programming of many accessories and sensors on a minimal budget using open-source hardware and software.
Abstract:
To understand a city and its urban structure, it is necessary to study its history. This is feasible through GIS (Geographical Information Systems) and their by-products on the web. Starting from a cartographic view, they allow an initial understanding of, and a comparison between, present and past data, together with easy and intuitive access to database information. The research done led to the creation of a GIS for the city of Bologna, based on varied data such as historical maps and vector and alphanumeric historical data. After providing information about GIS, we considered spreading and sharing the collected data on the Web, studying two solutions available on the market: Web Mapping and WebGIS. In this study we discuss the stages, beginning with the development of the Historical GIS of Bologna, which led to the making of an open source WebGIS (MapServer and Chameleon) and Web Mapping services (Google Earth, Google Maps, and OpenLayers).
Abstract:
The present study has been carried out with the following objectives: i) to investigate the attributes of source parameters of local and regional earthquakes; ii) to estimate, as accurately as possible, M0, fc, Δσ and their standard errors, in order to infer their relationship with source size; iii) to quantify high-frequency earthquake ground motion and to study the source scaling. This work is based on observational data of micro-, small, and moderate earthquakes from three selected seismic sequences, namely Parkfield (CA, USA), Maule (Chile), and Ferrara (Italy). For the Parkfield seismic sequence, a data set of 757 repeating micro-earthquakes (0 ≤ MW ≤ 2) in 42 clusters, collected using the borehole High Resolution Seismic Network (HRSN), has been analyzed and interpreted. We used the coda methodology to compute spectral ratios and obtain accurate values of fc, Δσ, and M0 for three target clusters (San Francisco, Los Angeles, and Hawaii) of our data. We also performed a general regression on peak ground velocities to obtain reliable seismic spectra of all earthquakes. For the Maule seismic sequence, a data set of 172 aftershocks of the 2010 MW 8.8 earthquake (3.7 ≤ MW ≤ 6.2), recorded by more than 100 temporary broadband stations, has been analyzed and interpreted to quantify high-frequency earthquake ground motion in this subduction zone. We completely calibrated the excitation and attenuation of the ground motion in Central Chile. For the Ferrara sequence, we calculated moment tensor solutions for 20 events, from MW 5.63 (the largest main event, which occurred on May 20, 2012) down to MW 3.2, using a 1-D velocity model for the crust beneath the Pianura Padana and all the geophysical and geological information available for the area. The PADANIA model allowed a numerical study of the characteristics of the ground motion in the thick sediments of the flood plain.
Abstract:
The composition of the atmosphere is frequently perturbed by the emission of gaseous and particulate matter from natural as well as anthropogenic sources. While the impact of trace gases on the radiative forcing of the climate is relatively well understood, the role of aerosol is far more uncertain. Therefore, the study of the vertical distribution of particulate matter in the atmosphere and of its chemical composition contributes valuable information to bridge this gap in knowledge. The chemical composition of aerosol reveals information on properties such as radiative behavior and hygroscopicity, and therefore on cloud condensation or ice nucleus potential.
This thesis focuses on aerosol pollution plumes observed in 2008 during the POLARCAT (Polar Study using Aircraft, Remote Sensing, Surface Measurements and Models, of Climate, Chemistry, Aerosols, and Transport) campaign over Greenland in June/July and the CONCERT (Contrail and Cirrus Experiment) campaign over Central and Western Europe in October/November. Measurements were performed with an Aerodyne compact time-of-flight aerosol mass spectrometer (AMS) capable of online size-resolved chemical characterization of non-refractory submicron particles. In addition, the origins of pollution plumes were determined by means of modeling tools. The characterized pollution episodes originated from a large variety of sources and were encountered at distinct altitudes. They included pure natural emissions from two volcanic eruptions in 2008. By the time of detection over Western Europe, between 10 and 12 km altitude, the plume was about 3 months old and composed of 71 % particulate sulfate and 21 % carbonaceous compounds. Also, biomass burning (BB) plumes originating from Canada and East Siberia were observed over Greenland between 4 and 7 km altitude (free troposphere). The long-range transport took roughly one and two weeks, respectively. The aerosol was composed of 78 % organic matter and 22 % particulate sulfate. Some Canadian and all Siberian BB plumes were mixed with anthropogenic emissions from fossil fuel combustion (FF) in North America and East Asia. It was found that the contribution of particulate sulfate increased with growing influence from anthropogenic activity and Asia, reaching up to 37 % after more than two weeks of transport time. The most exclusively anthropogenic emission source probed in the upper troposphere was engine exhaust from commercial airliners over Germany. However, in-situ characterization of this aerosol type during aircraft chasing was not possible. All long-range transport aerosol was found to have an O:C ratio close to or greater than 1, implying that low-volatility oxygenated organic aerosol was present in each case despite the variety of origins and the large range in age from 3 to 100 days. This leads to the conclusion that organic particulate matter reaches a final and uniform state of oxygenation after at least 3 days in the free troposphere.
Except for aircraft exhaust, all emission sources mentioned above are surface-bound and thus rely on different types of vertical transport mechanisms, such as direct high-altitude injection in the case of a volcanic eruption or severe BB, or uplift by convection, to reach higher altitudes where particles can travel long distances before removal, which is mainly caused by cloud scavenging. A lifetime of 7 to 11 days was derived for North American mixed BB and FF aerosol. This in consequence means that emissions from surface point sources, e.g. volcanoes, or regions, e.g. East Asia, have a relevant impact not only on their immediate surroundings but on a hemispheric scale, including such climate-sensitive zones as the tropopause or the Arctic.
Abstract:
Human molecular biology is a highly complex and diverse field with active research in many areas, with particular focus on genomics, proteomics, transcriptomics, and metabolomics, and years of research have accumulated large amounts of valuable data. This collection grows steadily, and no stagnation is in sight for the future. By now, however, this permanent flood of information has buried valuable knowledge in unmanageable digital mountains of data and has turned the gathering of research-specific and reliable information into a major challenge. The work presented in this dissertation has generated a comprehensive compendium of human tissues for biomedical analyses. It is named medicalgenomics.org and has solved various biomedical problems in the search for specific knowledge across numerous databases. The compendium is the first of its kind, and the knowledge it has gathered will help scientists gain a better systematic overview of specific genes or functional profiles, with a view to regulation as well as pathological and physiological conditions. In addition, various query methods enable efficient analysis of signaling events and metabolic pathways, as well as the study of genes at the expression level. The full variety of these query options enables scientists to create highly specialized genetic road maps with whose help future experiments can be planned more precisely. As a result, valuable resources and time can be saved while the prospects of success increase. Furthermore, the comprehensive knowledge of the compendium can be used to generate and test biomedical hypotheses.
Abstract:
A major barrier to widespread clinical implementation of Monte Carlo dose calculation is the difficulty in characterizing the radiation source within a generalized source model. This work aims to develop a generalized three-component source model (target, primary collimator, flattening filter) for 6- and 18-MV photon beams that matches full phase-space data (PSD). Subsource-by-subsource comparison of dose distributions, using either the source PSD or the source model as input, allows accurate source characterization and has the potential to ease the commissioning procedure, since it is possible to obtain information about which subsource needs to be tuned. This source model is unique in that, compared to previous source models, it retains additional correlations among PS variables, which improves accuracy at nonstandard source-to-surface distances (SSDs). In our study, three-dimensional (3D) dose calculations were performed for SSDs ranging from 50 to 200 cm and for field sizes from 1 x 1 to 30 x 30 cm^2, as well as for a 10 x 10 cm^2 field 5 cm off axis in each direction. The 3D dose distributions, using either the full PSD or the source model as input, were compared in terms of dose difference and distance to agreement. With this model, over 99% of the voxels agreed within +/-1% or 1 mm for the target, within +/-2% or 2 mm for the primary collimator, and within +/-2.5% or 2 mm for the flattening filter in all cases studied. For the full dose distributions, 99% of the dose voxels agreed within 1% or 1 mm when the combined source model (including a charged particle source) and the full PSD were used as input. The accurate and general characterization of each photon source and knowledge of the subsource dose distributions should facilitate source model commissioning procedures by allowing the histogram distributions representing the subsources to be scaled and tuned.
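The abstract does not give the comparison algorithm behind the "1% or 1 mm" criterion. As a rough illustration only, a composite dose-difference / distance-to-agreement test of that kind can be sketched in one dimension (tolerances and array layout here are hypothetical, not the paper's implementation):

```python
import numpy as np

def dd_dta_pass(ref, evl, spacing, dd_tol=0.01, dta_tol=1.0):
    """Fraction of voxels passing a composite DD/DTA test (1-D sketch).

    A voxel passes if its dose difference is within dd_tol (as a
    fraction of the reference maximum), OR a reference voxel with a
    matching dose lies within dta_tol millimetres.

    ref, evl -- 1-D dose arrays (reference and evaluated)
    spacing  -- voxel spacing in mm
    """
    ref = np.asarray(ref, dtype=float)
    evl = np.asarray(evl, dtype=float)
    dmax = ref.max()
    radius = int(np.ceil(dta_tol / spacing))  # DTA search radius in voxels
    passed = np.zeros(ref.size, dtype=bool)
    for i in range(ref.size):
        if abs(evl[i] - ref[i]) <= dd_tol * dmax:
            passed[i] = True  # dose-difference criterion met
            continue
        # otherwise, look for a matching reference dose within the DTA radius
        lo, hi = max(0, i - radius), min(ref.size, i + radius + 1)
        if np.any(np.abs(ref[lo:hi] - evl[i]) <= dd_tol * dmax):
            passed[i] = True
    return passed.mean()
```

A statement such as "99% of voxels agreed within 1% or 1 mm" then corresponds to this function returning at least 0.99 for the profiles being compared.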