935 results for International Institute for Applied Systems Analysis
Abstract:
Most people living with some type of illness tend to adopt a greater number of healthy habits and may develop new ways of seeing life and themselves. The objective of this study was to verify the relationship between quality-of-life indices and body image perception among patients enrolled in a cardiovascular rehabilitation program in Florianópolis, Brazil. The sample consisted of 24 male subjects aged 62 ± 1.3 years with Coronary Artery Disease. Quality of life was assessed with the Minnesota Living With Heart Failure Questionnaire (MLHFQ), and the degree of dissatisfaction with body image was identified with the Stunkard and Sorensen (1993) questionnaire. Statistical analysis was performed with the SPSS 11.0 software. The degree of association between variables was examined with Kendall's test. It was verified that the higher the BMI and the current silhouette, the greater the degree of dissatisfaction with body image. Emotional symptoms also appear to be significantly correlated with the desire for a smaller body silhouette and with indicators of lower quality of life (r = 0.474 and r = 0.735; p > 0.05). Physical symptoms are also significantly correlated with emotional symptoms. These results suggest that quality-of-life variables carry significant weight in body image, and that satisfaction with body image seems to correlate with fewer emotional problems and better coping with the disease. Cardiovascular rehabilitation programs that incorporate physical activity into daily habits prove to be an adequate tool for improving these conditions in this post-acute phase.
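As an illustration of the rank-correlation test reported above, the sketch below computes Kendall's tau with SciPy; the arrays are hypothetical placeholders, not the study data, and the study itself used SPSS 11.0.

```python
# Minimal sketch of a Kendall rank-correlation test (illustrative only;
# the arrays below are hypothetical placeholders, not the study data).
from scipy.stats import kendalltau

bmi              = [24.1, 27.3, 29.8, 31.2, 26.5, 33.0]  # hypothetical BMI values
body_image_score = [1, 2, 3, 4, 2, 5]                    # hypothetical dissatisfaction scores

tau, p_value = kendalltau(bmi, body_image_score)
print(f"Kendall tau = {tau:.3f}, p = {p_value:.3f}")
```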
Abstract:
The study of the links between different communities has generated diverse theoretical models and interpretations developed for the study of the exchange relations existing among ancient populations. In general, these theoretical positions can be divided between those that focus specifically on the exchange links established between different communities and different areas, and those that analyze the effect of such links on the internal processes of a community. In this paper we focus on the first aspect, since our aim is to address the exchange relations existing from Upper Nubia to the Levant, passing through Lower Nubia, Upper Egypt and Lower Egypt, during the period extending from 3400 BC to 3000 BC, applying world-system theory and world-systems analysis.
Abstract:
Secchi depth is a measure of water transparency. In the Baltic Sea region, Secchi depth maps are used to assess eutrophication and as input for habitat models. Due to their spatial and temporal coverage, satellite data would be the most suitable data source for such maps. But the Baltic Sea's optical properties are so different from the open ocean that globally calibrated standard models suffer from large errors. Regional predictive models that take the Baltic Sea's special optical properties into account are thus needed. This paper tests how accurately generalized linear models (GLMs) and generalized additive models (GAMs) with MODIS/Aqua and auxiliary data as inputs can predict Secchi depth at a regional scale. It uses cross-validation to test the prediction accuracy of hundreds of GAMs and GLMs with up to 5 input variables. A GAM with 3 input variables (chlorophyll a, remote sensing reflectance at 678 nm, and long-term mean salinity) made the most accurate predictions. Tested against field observations not used for model selection and calibration, the best model's mean absolute error (MAE) for daily predictions was 1.07 m (22%), more than 50% lower than for other publicly available Baltic Sea Secchi depth maps. The MAE for predicting monthly averages was 0.86 m (15%). Thus, the proposed model selection process was able to find a regional model with good prediction accuracy. It could be useful to find predictive models for environmental variables other than Secchi depth, using data from other satellite sensors, and for other regions where non-standard remote sensing models are needed for prediction and mapping. Annual and monthly mean Secchi depth maps for 2003-2012 come with this paper as Supplementary materials.
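The exhaustive cross-validated search over candidate predictor subsets described above can be sketched as follows; this is a simplified stand-in that uses an ordinary linear model from scikit-learn rather than the paper's GAM/GLM implementations, and the file name and column names are assumptions.

```python
# Sketch of model selection by cross-validated MAE over predictor subsets
# (simplified: a plain linear model stands in for the paper's GAMs/GLMs;
# file and column names are assumptions, not the actual MODIS/Aqua variables).
from itertools import combinations
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

df = pd.read_csv("secchi_matchups.csv")                  # hypothetical match-up file
candidates = ["chl_a", "rrs_678", "salinity", "rrs_555", "sst"]
y = df["secchi_depth_m"]

best = None
for k in range(1, 6):                                    # up to 5 input variables
    for subset in combinations(candidates, k):
        mae = -cross_val_score(
            LinearRegression(), df[list(subset)], y,
            cv=KFold(n_splits=10, shuffle=True, random_state=0),
            scoring="neg_mean_absolute_error",
        ).mean()
        if best is None or mae < best[0]:
            best = (mae, subset)

print(f"best subset: {best[1]}, cross-validated MAE = {best[0]:.2f} m")
```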
Abstract:
Aims. We carried out an investigation of the surface variegation of comet 67P/Churyumov-Gerasimenko, the detection of regions showing activity, the determination of active and inactive surface regions of the comet with spectral methods, and the detection of fallback material. Methods. We analyzed multispectral data generated with Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS) narrow angle camera (NAC) observations via spectral techniques, reflectance ratios, and spectral slopes in order to study active regions. We applied clustering analysis to the results of the reflectance ratios, and introduced the new technique of activity thresholds to detect areas potentially enriched in volatiles. Results. Local color inhomogeneities are detected over the investigated surface regions. Active regions, such as Hapi, the active pits of Seth and Ma'at, the clustered and isolated bright features in Imhotep, the alcoves in Seth and Ma'at, and the large alcove in Anuket, have bluer spectra than the overall surface. The spectra generated with OSIRIS NAC observations are dominated by cometary emissions at around 700 nm to 750 nm as a result of the coma between the comet's surface and the camera. One of the two isolated bright features in the Imhotep region displays an absorption band at around 700 nm, which probably indicates the existence of hydrated silicates. An absorption band centered between 800 and 900 nm is tentatively observed in some regions of the nucleus surface. This absorption band can be explained by the crystal field absorption of Fe2+, which is a common spectral feature seen in silicates.
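For reference, a spectral slope of the kind used in such analyses can be written as below; the normalization wavelengths (e.g., 535 nm and 882 nm, often used for OSIRIS data) are assumptions for illustration, not values quoted in this abstract.

```latex
% Spectral slope between two wavelengths, normalized to the shorter one
% (the wavelength pair, e.g. 535 nm and 882 nm, is assumed for illustration).
S = \frac{R(\lambda_2) - R(\lambda_1)}{R(\lambda_1)\,(\lambda_2 - \lambda_1)} \times 10^{4}
\quad \left[\%\,/\,100\ \mathrm{nm}\right]
```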
Abstract:
Most pixel-level satellite image fusion methodologies introduce false spatial details, i.e., artifacts, in the resulting fused images. In many cases, these artifacts appear because image fusion methods do not consider the differences in roughness or textural characteristics between different land covers; they only consider the digital values associated with single pixels. This effect increases as the spatial resolution of the images increases. To minimize this problem, we propose a new paradigm based on local measurements of the fractal dimension (FD). Fractal dimension maps (FDMs) are generated for each of the source images (panchromatic and each band of the multi-spectral images) with the box-counting algorithm and by applying a windowing process. The average of the source image FDMs, previously indexed between 0 and 1, has been used to discriminate the different land covers present in satellite images. This paradigm has been applied through the fusion methodology based on the discrete wavelet transform (DWT), using the à trous algorithm (WAT). Two different scenes registered by optical sensors on board the FORMOSAT-2 and IKONOS satellites were used to study the behaviour of the proposed methodology. The implementation of this approach, using the WAT method, allows the fusion process to adapt to the roughness and shape of the regions present in the image to be fused. This improves the quality of the fused images and their classification results when compared with the original WAT method.
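A minimal sketch of the box-counting fractal dimension estimate used in this kind of windowed analysis is shown below (pure NumPy; image I/O, window extraction and the binarization threshold are assumptions added for illustration).

```python
# Minimal box-counting fractal-dimension estimate for a 2-D image window
# (a simplified sketch; thresholding and window handling are assumptions).
import numpy as np

def box_counting_dimension(window, threshold=0.5):
    """Estimate the fractal dimension of a binarized square window."""
    binary = window > threshold
    n = binary.shape[0]
    sizes = [s for s in (2, 4, 8, 16, 32) if s <= n]
    counts = []
    for s in sizes:
        # count boxes of side s containing at least one foreground pixel
        c = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if binary[i:i + s, j:j + s].any():
                    c += 1
        counts.append(c)
    # FD is minus the slope of log(count) versus log(box size)
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

window = np.random.rand(64, 64)        # hypothetical image window
print(box_counting_dimension(window))
```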
Abstract:
This poster raises the issue of a research work oriented to the storage, retrieval, representation and analysis of dynamic GI, taking into account the semantic, the temporal and the spatiotemporal components. The ultimate objective is the modelling and representation of the dynamic nature of geographic features, establishing mechanisms to store geometries enriched with a temporal structure (regardless of space) and a set of semantic descriptors detailing and clarifying the nature of the represented features and their temporality. We intend to define a set of methods, rules and restrictions for the adequate integration of these components into the primary elements of GI: theme, location, time [1]. We intend to establish and incorporate three new structures (layers) into the core of data storage by using mark-up languages: a semantic-temporal structure, a geosemantic structure, and an incremental spatiotemporal structure. Thus, data would be provided with the capability of pinpointing and expressing their own basic and temporal characteristics, enabling them to interact with each other according to their context and to the time and meaning relationships that could eventually be established.
Abstract:
The aim is to obtain computationally more powerful, neurophysiologically founded, artificial neurons and neural nets. Artificial Neural Nets (ANN) of the Perceptron type evolved from the original proposal in the classical paper by McCulloch and Pitts [1]. Essentially, they keep the computing structure of a linear machine followed by a nonlinear operation. The McCulloch-Pitts formal neuron (which was never considered by its authors to be a model of real neurons) consists of the simplest case: a linear computation on the inputs followed by a threshold. Networks of one layer cannot compute every logical function of the inputs, but only those which are linearly separable. Thus, even the simple exclusive OR (contrast detector) function of two inputs requires two layers of formal neurons.
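A minimal sketch of how a second layer of threshold (McCulloch-Pitts style) units makes XOR computable is given below; the particular weights and thresholds are one of many valid choices, picked here purely for illustration.

```python
# Two-layer network of threshold units computing XOR
# (weights/thresholds are one illustrative choice, not a unique solution).
import numpy as np

def step(x):
    return (x >= 0).astype(int)          # Heaviside threshold unit

def xor_net(a, b):
    x = np.array([a, b])
    h_or   = step(x.sum() - 0.5)         # hidden unit 1: OR(a, b)
    h_nand = step(1.5 - x.sum())         # hidden unit 2: NAND(a, b)
    return step(h_or + h_nand - 1.5)     # output unit: AND of the hidden units

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```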
Abstract:
Trillas et al. (1999, Soft computing, 3 (4), 197–199) and Trillas and Cubillo (1999, On non-contradictory input/output couples in Zadeh's CRI proceeding, 28–32) introduced the study of contradiction in the framework of fuzzy logic because of the significance of avoiding contradictory outputs in inference processes. Later, the study of contradiction in the framework of Atanassov's intuitionistic fuzzy sets (A-IFSs) was initiated by Cubillo and Castiñeira (2004, Contradiction in intuitionistic fuzzy sets proceeding, 2180–2186). The axiomatic definition of contradiction measure was stated in Castiñeira and Cubillo (2009, International journal of intelligent systems, 24, 863–888). Likewise, the concept of continuity of these measures was formalized through several axioms. To be precise, they defined continuity when the sets ‘are increasing’, denominated continuity from below, and continuity when the sets ‘are decreasing’, or continuity from above. The aim of this paper is to provide some geometrical construction methods for obtaining contradiction measures in the framework of A-IFSs and to study what continuity properties these measures satisfy. Furthermore, we show the geometrical interpretations motivating the measures.
Abstract:
Copper nitride is a metastable material that is very attractive because of its potential use in functional devices. Cu3N easily decomposes into Cu and N2 by annealing [1] or irradiation (electrons, ions, laser) [2, 3]. Previous studies carried out on N-rich Cu3N films irradiated with Cu at 42 MeV evidence a very efficient sputtering of N whose yield (5×10³ atoms/ion), for a film with a thickness of just 100 nm, suggests that the origin of the sputtering is electronic in nature. This N depletion was observed to be responsible for the formation of a new phase (Cu2O) and of pure Cu [4].
Abstract:
Interlinking text documents with Linked Open Data enables the Web of Data to be used as background knowledge within document-oriented applications such as search and faceted browsing. As a step towards interconnecting the Web of Documents with the Web of Data, we developed DBpedia Spotlight, a system for automatically annotating text documents with DBpedia URIs. DBpedia Spotlight allows users to configure the annotations to their specific needs through the DBpedia Ontology and quality measures such as prominence, topical pertinence, contextual ambiguity and disambiguation confidence. We compare our approach with the state of the art in disambiguation, and evaluate our results in light of three baselines and six publicly available annotation systems, demonstrating the competitiveness of our system. DBpedia Spotlight is shared as open source and deployed as a Web Service freely available for public use.
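A call to the public annotation endpoint might look like the sketch below; the endpoint URL and parameter values shown are assumptions about the currently deployed public instance, so check the project documentation before relying on them.

```python
# Sketch of annotating text with the public DBpedia Spotlight web service
# (endpoint URL and parameter values are assumptions; verify against the docs).
import requests

resp = requests.get(
    "https://api.dbpedia-spotlight.org/en/annotate",
    params={"text": "Berlin is the capital of Germany.", "confidence": 0.5},
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
for res in resp.json().get("Resources", []):
    print(res["@surfaceForm"], "->", res["@URI"])
```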
Abstract:
An important competence of human data analysts is to interpret and explain the meaning of the results of data analysis to end-users. However, existing automatic solutions for intelligent data analysis provide limited help to interpret and communicate information to non-expert users. In this paper we present a general approach to generating explanatory descriptions about the meaning of quantitative sensor data. We propose a type of web application: a virtual newspaper with automatically generated news stories that describe the meaning of sensor data. This solution integrates a variety of techniques from intelligent data analysis into a web-based multimedia presentation system. We validated our approach in a real world problem and demonstrate its generality using data sets from several domains. Our experience shows that this solution can facilitate the use of sensor data by general users and, therefore, can increase the utility of sensor network infrastructures.
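A very reduced sketch of the data-to-text step described above, turning a quantitative sensor series into a short explanatory sentence, is given below; the thresholds, wording and sample readings are illustrative assumptions, not the paper's templates.

```python
# Reduced data-to-text sketch: summarize a sensor series as a short sentence
# (thresholds, wording and the sample readings are illustrative assumptions).
import statistics

def describe(sensor_name, readings, unit):
    mean = statistics.mean(readings)
    trend = readings[-1] - readings[0]
    if trend > 0.05 * abs(mean):
        direction = "rose"
    elif trend < -0.05 * abs(mean):
        direction = "fell"
    else:
        direction = "remained stable"
    return (f"{sensor_name} averaged {mean:.1f} {unit} over the period "
            f"and {direction} between the first and last readings.")

print(describe("River level", [2.1, 2.3, 2.6, 2.9], "m"))
```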
Abstract:
A Probabilistic Safety Assessment (PSA) is being developed for a steam-methane reforming hydrogen production plant linked to a High-Temperature Gas Cooled Nuclear Reactor (HTGR). This work is based on the Japan Atomic Energy Research Institute’s (JAERI) High Temperature Test Reactor (HTTR) prototype in Japan. This study has two major objectives: calculate the risk to onsite and offsite individuals, and calculate the frequency of different types of damage to the complex. A simplified HAZOP study was performed to identify initiating events, based on existing studies. The initiating events presented here are methane pipe break, helium pipe break, and PPWC heat exchanger pipe break. Generic data was used for the fault tree analysis and the initiating event frequency. Saphire was used for the PSA analysis. The results show that the average frequency of an accident at this complex is 2.5E-06, distributed among the various end states. The dominant sequences result in graphite oxidation, which does not pose a health risk to the population. The dominant sequences that could affect the population are those that result in a methane explosion and occur at 6.6E-8/year, while the other sequences are much less frequent. The health risk presents itself if there are people in the vicinity who could be affected by the explosion. This analysis also demonstrates that an accident in one of the plants has little effect on the other. This is true given the design-basis distance between the plants, the fact that the reactor is underground, as well as other safety characteristics of the HTGR. Sensitivity studies are being performed in order to determine where additional and improved data are needed.
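To illustrate the kind of gate logic combined in such a fault-tree analysis (performed in the study with Saphire and generic data), a toy calculation with hypothetical basic-event probabilities is sketched below; none of the numbers correspond to the study's results.

```python
# Toy fault-tree gate arithmetic (hypothetical probabilities for illustration;
# the study itself used Saphire with generic data).
def or_gate(*p):            # at least one independent basic event occurs
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):           # all independent basic events occur
    q = 1.0
    for pi in p:
        q *= pi
    return q

valve_fails   = 1.0e-3      # hypothetical basic-event probability
sensor_fails  = 5.0e-4      # hypothetical basic-event probability
operator_miss = 1.0e-2      # hypothetical basic-event probability

top_event = and_gate(or_gate(valve_fails, sensor_fails), operator_miss)
print(f"top-event probability ~ {top_event:.2e}")
```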
Abstract:
Collaborative filtering recommender systems contribute to alleviating the problem of information overload that exists on the Internet as a result of the mass use of Web 2.0 applications. The use of an adequate similarity measure becomes a determining factor in the quality of the prediction and recommendation results of the recommender system, as well as in its performance. In this paper, we present a memory-based collaborative filtering similarity measure that provides extremely high-quality and balanced results; these results are complemented with a low processing time (high performance), similar to the one required to execute traditional similarity metrics. The experiments have been carried out on the MovieLens and Netflix databases, using a representative set of information retrieval quality measures.
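The paper's specific similarity measure is not reproduced here; the sketch below shows the generic memory-based pipeline such a measure plugs into, using cosine similarity as a stand-in traditional metric and a small hypothetical user-item matrix, followed by a similarity-weighted prediction.

```python
# Generic memory-based collaborative filtering sketch (cosine similarity as a
# stand-in traditional metric; the ratings matrix is hypothetical, 0 = unrated).
import numpy as np

ratings = np.array([          # rows: users, columns: items
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 3, 4, 4],
], dtype=float)

def cosine_sim(u, v):
    mask = (u > 0) & (v > 0)                 # only co-rated items
    if not mask.any():
        return 0.0
    return float(u[mask] @ v[mask] /
                 (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

def predict(user, item, k=2):
    sims = [(cosine_sim(ratings[user], ratings[v]), v)
            for v in range(len(ratings)) if v != user and ratings[v, item] > 0]
    top = sorted(sims, reverse=True)[:k]     # k most similar neighbours
    num = sum(s * ratings[v, item] for s, v in top)
    den = sum(abs(s) for s, _ in top)
    return num / den if den else 0.0

print(predict(user=1, item=1))
```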
Abstract:
Validation of the generated terrain cartography using a newly proposed validation system.