44 results for Knowledge Information Objects
Abstract:
The observation that real complex networks have internal structure has important implications for dynamic processes occurring on such topologies. Here we investigate the impact of community structure on a model of information transfer able to deal with both search and congestion simultaneously. We show that networks with fuzzy community structure are more efficient in terms of packet delivery than those with pronounced community structure. We also propose an alternative packet routing algorithm that takes advantage of knowledge of the communities to improve information transfer, and we show that, in the context of the model, an intermediate level of community structure is optimal. Finally, we show that in a hierarchical network setting, providing knowledge of communities at the level of highest modularity improves network capacity by the largest amount.
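As an illustrative sketch of how knowledge of communities can be exploited in routing (not the authors' algorithm), the snippet below compares a blind local search with a community-aware variant on a synthetic two-level graph; the graph generator, community detection call and forwarding rule are assumptions made for the example.

```python
# A minimal sketch, assuming networkx; the routing rules below are illustrative,
# not the algorithm proposed in the paper.
import random
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def route(G, node2comm, src, dst, use_communities, max_hops=500):
    """Hop-by-hop forwarding with only local knowledge; returns hops used."""
    current, hops = src, 0
    while current != dst and hops < max_hops:
        nbrs = list(G.neighbors(current))
        if dst in nbrs:                      # destination is one hop away
            nxt = dst
        elif use_communities:
            # prefer neighbours already in the destination's community
            in_target = [n for n in nbrs if node2comm[n] == node2comm[dst]]
            nxt = random.choice(in_target) if in_target else random.choice(nbrs)
        else:
            nxt = random.choice(nbrs)        # blind random walk
        current, hops = nxt, hops + 1
    return hops

# Synthetic network with planted community structure (parameters are arbitrary).
G = nx.planted_partition_graph(4, 25, p_in=0.3, p_out=0.02, seed=1)
comms = greedy_modularity_communities(G)
node2comm = {n: i for i, c in enumerate(comms) for n in c}

pairs = [(random.randrange(100), random.randrange(100)) for _ in range(200)]
for flag in (False, True):
    hops = [route(G, node2comm, s, d, flag) for s, d in pairs if s != d]
    print("community-aware" if flag else "blind search", sum(hops) / len(hops))
```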
Abstract:
Transmission electron microscopy is a proven technique in the field of cell biology and a very useful tool in biomedical research. Innovation and improvements in equipment, together with the introduction of new technology, have allowed us to improve our knowledge of biological tissues, to visualize structures better, and both to identify and to locate molecules. Of all the types of microscopy exploited to date, electron microscopy is the one with the most advantageous resolution limit and therefore it is a very efficient technique for deciphering the cell architecture and relating it to function. This chapter aims to provide an overview of the most important techniques that we can apply to a biological sample, tissue or cells, to observe it with an electron microscope, from the most conventional to the latest generation. Processes and concepts are defined, and the advantages and disadvantages of each technique are assessed along with the image and information that we can obtain by using each one of them.
Abstract:
This working paper seeks to establish a new field of research at the crossroads between migration flows and information and communication flows. Several factors make this perspective worth adopting. The central point is that contemporary international migration is embedded in the dynamics of the information society, following common patterns and interconnected dynamics. Consequently, information flows are beginning to be identified as key issues in migration policies. In addition, there is a lack of empirical knowledge on the design of information networks and the use of information and communication technologies in migratory contexts. This working paper also aims to be a source of hypotheses for further research.
Abstract:
Contemporary international migration is embedded in a process of global interconnection defined by the revolutions in transport and in information and communication technologies. One consequence of this global interconnection is that migrants have a greater capacity to process information both before and after leaving. These changes could have unexpected implications for contemporary migration regarding migrants' ability to make more informed decisions, the reduction of uncertainty in migratory contexts, the blurring of the concept of distance, or the decision to emigrate to more distant places. This research is important because the lack of knowledge on this issue could contribute to widening the gap between the objectives of migration policies and their outcomes. The role of information agents in migratory contexts could also change. In this scenario, for migration policies to be more effective, they will have to take into account the greater capacity of the migrant population to process information and the information sources that are trusted. This article shows that the equation "more information equals better informed" does not always hold. Even in the information age, unreliable sources, false expectations, information overload and rumours are still present in migratory contexts. Nevertheless, we argue that these unintended effects could be reduced by meeting four requirements of reliable information: that it be comprehensive, relevant, trusted and up to date.
Abstract:
Observers are often required to adjust their actions to objects that change their speed. However, no evidence for a direct sense of acceleration has been found so far. Instead, observers seem to detect changes in velocity within a temporal window when confronted with motion in the frontal plane (2D motion). Furthermore, recent studies suggest that motion-in-depth is detected by tracking changes of position in depth. Therefore, in order to sense acceleration in depth, a kind of second-order computation would have to be carried out by the visual system. In two experiments, we show that observers misperceive the acceleration of head-on approaches, at least within the range we used (600-800 ms), resulting in an overestimation of arrival time. Regardless of the viewing condition (monocular only, or monocular and binocular), the response pattern conformed to a constant-velocity strategy. However, when binocular information was available, the overestimation was greatly reduced.
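A small numerical illustration of why a constant-velocity strategy overestimates arrival time for an accelerating head-on approach; the distance, speed and acceleration values are assumed for the example, not taken from the experiments.

```python
# Minimal sketch: arrival-time estimate under a constant-velocity assumption
# versus the true arrival time of an accelerating approach. Values are assumed
# for illustration only.
import math

d0 = 20.0   # initial distance (m)
v0 = 25.0   # initial approach speed (m/s)
a  = 10.0   # constant acceleration toward the observer (m/s^2)

# Estimate if the observer assumes the current speed stays constant
t_constant_velocity = d0 / v0

# True arrival time: solve d0 = v0*t + 0.5*a*t^2 for t > 0
t_true = (-v0 + math.sqrt(v0**2 + 2 * a * d0)) / a

print(f"constant-velocity estimate: {t_constant_velocity*1000:.0f} ms")
print(f"true arrival time:          {t_true*1000:.0f} ms")
# The estimate exceeds the true value, i.e. arrival time is overestimated.
```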
Abstract:
The Goliath grouper, Epinephelus itajara, a large-bodied (~2.5 m TL, >400 kg) and critically endangered fish (Epinephelidae), is highly vulnerable to overfishing. Although protected from fishing in many countries, its exploitation in Mexico is unregulated, a situation that puts its populations at risk. Fishery records of E. itajara are scarce, which prevents determination of its fishery status. This work aimed to elucidate the E. itajara fishery in the northern Yucatan Peninsula by 1) analyzing available catch records and 2) interviewing veteran fishermen (local ecological knowledge) from two traditional landing sites: Dzilam de Bravo and Puerto Progreso. Historic fishery records from two fishing cooperatives were analyzed in order to elucidate the current situation and offer viable alternatives for conservation and management. Catches have decreased severely. Local knowledge obtained from fishermen represented a very important source of information for reconstructing the fishery history of this species. Conservation measures that incorporate regional and international regulations on critically endangered fish species are suggested.
Abstract:
Semantic Web technology is able to provide the required computational semantics for interoperability of learning resources across different Learning Management Systems (LMS) and Learning Object Repositories (LOR). The EU research project LUISA (Learning Content Management System Using Innovative Semantic Web Services Architecture) addresses the development of a reference semantic architecture for the major challenges in the search, interchange and delivery of learning objects in a service-oriented context. One of the key issues, highlighted in this paper, is Digital Rights Management (DRM) interoperability. A Semantic Web approach to copyright management has been followed, which places a Copyright Ontology as the key component for interoperability among existing DRM systems and other licensing schemes like Creative Commons. Moreover, Semantic Web tools like reasoners, rule engines and semantic queries facilitate the implementation of an interoperable copyright management component in the LUISA architecture.
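As a hedged illustration of the Semantic Web approach described here, the sketch below encodes a Creative Commons licence for a learning object as RDF triples and checks a usage right with a SPARQL query. The namespace, class and property names are invented placeholders, not the actual Copyright Ontology or LUISA vocabulary.

```python
# Minimal sketch with rdflib; the example.org terms are hypothetical placeholders,
# not the real Copyright Ontology used by LUISA.
from rdflib import Graph, Namespace, URIRef, RDF

EX = Namespace("http://example.org/copyright#")
g = Graph()
g.bind("ex", EX)

lo = URIRef("http://example.org/learning-objects/42")
g.add((lo, RDF.type, EX.LearningObject))
g.add((lo, EX.licensedUnder, EX.CC_BY_NC))
# Rights granted by the (hypothetical) CC BY-NC licence class
g.add((EX.CC_BY_NC, EX.permits, EX.Reproduction))
g.add((EX.CC_BY_NC, EX.permits, EX.Distribution))
g.add((EX.CC_BY_NC, EX.prohibits, EX.CommercialUse))

# Does the repository allow redistribution of this learning object?
q = """
SELECT ?obj WHERE {
    ?obj ex:licensedUnder ?lic .
    ?lic ex:permits ex:Distribution .
}
"""
for row in g.query(q, initNs={"ex": EX}):
    print("distribution permitted for", row.obj)
```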
Abstract:
This empirical study investigates the effects of long-term, embedded, structured and supported instruction in Secondary Education on the development of Information Problem Solving (IPS) skills. Forty secondary students in 7th and 8th grades (13–15 years old) participated in the 2-year study: twenty received the IPS instruction designed in this study, and the remaining twenty formed the control group. All students were pre- and post-tested in their regular classrooms, and their IPS process and performance were logged by means of screen-capture software to ensure ecological validity. The IPS constituent skills, the web search sub-skills and the answers given by each participant were analyzed. The main findings suggest that experimental students showed a more expert pattern than control students regarding the constituent skill 'defining the problem' and the following two web search sub-skills: 'search terms' typed into a search engine, and 'selected results' from a SERP. In addition, task-performance scores were statistically better for experimental students than for control-group students. The paper contributes to the discussion of how well-designed and well-embedded scaffolds can be built into instructional programs to guarantee the development and efficiency of students' IPS skills, so that they use online information better and participate fully in the global knowledge society.
Abstract:
This paper aims to explore asynchronous communication in computer-supported collaborative learning (CSCL). Thirty virtual forums are analysed both quantitatively and qualitatively. Quantitatively, the number of messages written, message threads, and original and reply messages are counted. Qualitatively, the content of the notes is analysed and catalogued at two different levels: on the one hand, into a set of knowledge-building process categories, and on the other, according to the scaffolds that Knowledge Forum offers. The results show that both an exchange of information and collaborative work take place. Nevertheless, the construction of knowledge is superficial.
Abstract:
Background: Information about the composition of regulatory regions is of great value for designing experiments to functionally characterize gene expression. The multiplicity of available applications to predict transcription factor binding sites in a particular locus contrasts with the substantial computational expertise demanded to manipulate them, which may constitute a potential barrier for the experimental community. Results: CBS (Conserved regulatory Binding Sites, http://compfly.bio.ub.es/CBS) is a public platform of evolutionarily conserved binding sites and enhancers predicted in multiple Drosophila genomes, furnished with published chromatin signatures associated with transcriptionally active regions and other experimental sources of information. Rapid access to this novel body of knowledge through a user-friendly web interface enables non-expert users to identify the binding sequences available for any particular gene, transcription factor, or genome region. Conclusions: The CBS platform is a powerful resource that provides tools for mining individual sequences and groups of co-expressed genes together with epigenomic information to conduct regulatory screenings in Drosophila.
Abstract:
In this paper we seek to verify the hypothesis that trust and cooperation between individuals, and between them and public institutions, can encourage technological innovation and the adoption of knowledge. Additionally, we test the extent to which the interaction of social capital with human capital and R&D expenditure improves their effect on a region's ability to innovate. Our empirical evidence is taken from the Spanish regions and employs a knowledge production function and longitudinal count-data models. Our results suggest that social capital correlates positively with innovation. Further, our analysis reveals a powerful interaction between human and social capital in the production of knowledge, whilst the complementarity with R&D efforts appears less clear.
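A stylized sketch of the kind of count-data specification described here: a Poisson knowledge production function with an interaction between human and social capital, fitted on simulated regional data. All variable names and coefficients are invented for the illustration and do not reproduce the paper's estimates.

```python
# Minimal sketch, assuming statsmodels and pandas; the data are simulated and
# the specification is only indicative of a knowledge production function.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300                                   # region-year observations (invented)
df = pd.DataFrame({
    "rd":     rng.normal(0, 1, n),        # R&D expenditure (standardized)
    "human":  rng.normal(0, 1, n),        # human capital
    "social": rng.normal(0, 1, n),        # social capital (trust, cooperation)
})
# Simulated patent counts with a positive human x social interaction
lam = np.exp(0.5 + 0.3 * df.rd + 0.2 * df.human + 0.2 * df.social
             + 0.15 * df.human * df.social)
df["patents"] = rng.poisson(lam)

model = smf.poisson("patents ~ rd + human + social + human:social", data=df).fit()
print(model.summary())                    # the interaction term captures complementarity
```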
Abstract:
Observational and theoretical studies point to microquasars (MQs) as possible counterparts of a significant fraction of the unidentified gamma-ray sources detected so far. At present, a proper scenario to explain the emission beyond soft X-rays from these objects is not known, nor is the precise connection between the radio and the high-energy radiation. We develop a new model in which the MQ jet is dynamically dominated by cold protons and radiatively dominated by relativistic leptons. The matter content and power of the jet are both related to the accretion process. The magnetic field is assumed to be close to equipartition, although it is attached to and dominated by the jet matter. For the relativistic particles in the jet, the maximum energy depends on both the acceleration efficiency and the energy losses. The model takes into account the interaction of the relativistic jet particles with the magnetic field and all the photon and matter fields. This interaction produces significant amounts of radiation from radio to very high energies through synchrotron, relativistic Bremsstrahlung, and inverse Compton (IC) processes. Variability of the emission produced by changes in the accretion process (e.g. via orbital eccentricity) is also expected. The effects of gamma-ray absorption by the external photon fields on the gamma-ray spectrum have been taken into account, revealing clear spectral features that might be observed. This model is consistent with the accretion scenario, energy conservation laws, and current observational knowledge, and can provide deeper physical information about the source when tested against multiwavelength data.
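For context, the following back-of-the-envelope sketch uses standard textbook expressions (not the paper's full model) to balance a simple acceleration rate against synchrotron losses and estimate the maximum electron energy; the magnetic field strength and acceleration efficiency are assumed values.

```python
# Minimal sketch in cgs units: maximum electron energy from equating an
# acceleration rate eta*e*c*B with the synchrotron loss rate. B and eta are
# assumed values, not taken from the paper.
import math

e       = 4.803e-10    # electron charge (esu)
sigma_T = 6.652e-25    # Thomson cross-section (cm^2)
mec2    = 8.187e-7     # electron rest energy (erg)

def e_max(B, eta):
    """E_max = m_e c^2 * sqrt(6 pi eta e / (sigma_T B)), in erg."""
    return mec2 * math.sqrt(6 * math.pi * eta * e / (sigma_T * B))

B, eta = 1.0, 0.1       # magnetic field (G) and acceleration efficiency (assumed)
E = e_max(B, eta)
print(f"E_max ~ {E / 1.602e-12 / 1e12:.1f} TeV")   # erg -> eV -> TeV
```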
Abstract:
One main assumption in the theory of rough sets applied to information tables is that elements that exhibit the same information are indiscernible (similar) and form blocks that can be understood as elementary granules of knowledge about the universe. We propose a variant of this concept by defining a measure of similarity between the elements of the universe, so that two objects can be considered indiscernible even though they do not share all attribute values, because the knowledge is partial or uncertain. The set of similarities defines the matrix of a fuzzy relation satisfying reflexivity and symmetry but not transitivity, so a partition of the universe is not attained. This problem can be solved by computing the transitive closure of the relation, which ensures a partition for each level in the unit interval [0,1]. This procedure allows the theory of rough sets to be generalized depending on the minimum level of similarity accepted. This new point of view increases the rough character of the data because it enlarges the set of indiscernible objects. Finally, we apply our results to a synthetic (non-real) application in order to highlight the differences and improvements between this methodology and the classical one.
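A compact sketch of the construction described here: a reflexive, symmetric fuzzy similarity matrix is closed under max-min composition, and each alpha-cut of the closure yields a partition of the universe. The similarity values in the example matrix are made up for illustration.

```python
# Minimal sketch: max-min transitive closure of a fuzzy similarity relation and
# the partition induced by an alpha-cut. The similarity values are invented.
import numpy as np

def maxmin_closure(R):
    """Iterate R <- max(R, R o R) with max-min composition until it stabilizes."""
    R = R.copy()
    while True:
        comp = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
        new = np.maximum(R, comp)
        if np.array_equal(new, R):
            return new
        R = new

def alpha_cut_classes(R, alpha):
    """Equivalence classes of the crisp relation R >= alpha (R must be transitive)."""
    n = len(R)
    seen, classes = set(), []
    for i in range(n):
        if i not in seen:
            block = {j for j in range(n) if R[i, j] >= alpha}
            classes.append(sorted(block))
            seen |= block
    return classes

# Reflexive and symmetric, but not transitive, similarity matrix (invented values)
R = np.array([[1.0, 0.8, 0.0, 0.1],
              [0.8, 1.0, 0.4, 0.0],
              [0.0, 0.4, 1.0, 0.9],
              [0.1, 0.0, 0.9, 1.0]])

T = maxmin_closure(R)
for alpha in (0.9, 0.5, 0.2):
    print(f"alpha = {alpha}: {alpha_cut_classes(T, alpha)}")
```

Lowering the similarity threshold alpha merges more elements into the same granule, which is the sense in which the generalization increases the rough character of the data.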