959 results for Information integration
Abstract:
Hybridisation is a systematic process through which the characteristic features of hybrid logic, at both the syntactic and the semantic levels, are developed on top of an arbitrary logic framed as an institution. It also captures the construction of first-order encodings of such hybridised institutions into theories in first-order logic. The method was originally developed to build suitable logics for the specification of reconfigurable software systems on top of whatever logic is used to describe the local requirements of each system’s configuration. Hybridisation has, however, a broader scope, providing a fresh example of yet another development in combining and reusing logics driven by a problem from Computer Science. This paper offers an overview of the method, proposes some new extensions, namely the introduction of full quantification leading to the specification of dynamic modalities, and exemplifies its potential through a didactic application. The paper discusses how hybridisation can be successfully used in a formal specification course in which students progress from equational to hybrid specifications in a uniform setting, integrating paradigms, combining data and behaviour, and dealing appropriately with system evolution and reconfiguration.
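To make the kind of machinery involved more concrete, the grammar below sketches the syntax a hybridised language typically adds on top of an arbitrary base logic; the symbols used (Prop for base-logic sentences, Nom for nominals, Λ for modalities, x for a state variable) are illustrative assumptions rather than the paper's exact notation:

\[
  \varphi \;::=\; \rho \;\mid\; i \;\mid\; \neg\varphi \;\mid\; \varphi \wedge \varphi \;\mid\; [\lambda]\varphi \;\mid\; @_{i}\,\varphi \;\mid\; \forall x.\,\varphi
  \qquad (\rho \in \mathrm{Prop},\; i \in \mathrm{Nom},\; \lambda \in \Lambda)
\]

Here a nominal i names a single state and @_i φ asserts that φ holds at the state named i; allowing quantification over state variables x corresponds to the "full quantification" extension the paper proposes for specifying dynamic modalities.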
Abstract:
The continuous flow of technological developments in the communications and electronics industries has led to the growing expansion of the Internet of Things (IoT). By leveraging the capabilities of smart networked devices and integrating them into existing industrial, leisure and communication applications, the IoT is expected to positively impact both the economy and society, reducing the gap between the physical and digital worlds. Therefore, several efforts have been dedicated to the development of networking solutions addressing the diversity of challenges associated with such a vision. In this context, the integration of Information Centric Networking (ICN) concepts into the core of the IoT is a research area gaining momentum and involving both research and industry actors. The massive number of heterogeneous devices, as well as the data they produce, is a significant challenge for wide-scale adoption of the IoT. In this paper, we propose a service discovery mechanism, based on Named Data Networking (NDN), that leverages a semantic matching mechanism to achieve a flexible discovery process. The development of appropriate service discovery mechanisms enriched with semantic capabilities for understanding and processing context information is a key feature for turning raw data into useful knowledge and ensuring interoperability among different devices and applications. We assessed the performance of our solution through the implementation and deployment of a proof-of-concept prototype. The obtained results illustrate the potential of integrating semantic and ICN mechanisms to enable flexible service discovery in IoT scenarios.
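As a rough illustration of the semantic matching idea behind such a discovery process (a minimal sketch with a made-up concept hierarchy and made-up NDN-like name prefixes, not the prototype described above):

# Minimal sketch of semantic matching for NDN-style service discovery.
# The concept hierarchy, service registry and name prefixes are hypothetical
# illustrations, not the mechanism implemented in the paper's prototype.

IS_A = {  # toy "is-a" hierarchy of service concepts (child -> parent)
    "temperature-sensor": "environmental-sensor",
    "humidity-sensor": "environmental-sensor",
    "environmental-sensor": "sensor",
    "camera": "sensor",
}

def ancestors(concept):
    """Return the concept and all of its ancestors in the hierarchy."""
    chain = [concept]
    while concept in IS_A:
        concept = IS_A[concept]
        chain.append(concept)
    return chain

def semantic_match(requested, advertised):
    """1.0 for an exact match, 0.5 if the advertised concept specialises the
    requested one, 0.0 otherwise."""
    if requested == advertised:
        return 1.0
    if requested in ancestors(advertised):
        return 0.5
    return 0.0

REGISTRY = {  # advertised services, keyed by an NDN-like name prefix (hypothetical)
    "/campus/room1/temp": "temperature-sensor",
    "/campus/room1/cam": "camera",
}

def discover(requested_concept, threshold=0.5):
    """Return name prefixes whose advertised concept matches the request."""
    return [name for name, concept in REGISTRY.items()
            if semantic_match(requested_concept, concept) >= threshold]

print(discover("environmental-sensor"))  # ['/campus/room1/temp']

The point of the sketch is that a request phrased at a more general concept level can still discover more specific advertised services, which is the flexibility a purely exact-name lookup would lack.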
Abstract:
This qualitative study investigated foreign language teachers’ attitudes toward the use of information and communication technology (ICT) in their instruction. Insight was gained through the teachers’ reported experience of ICT implementation: in what ways and for what purposes they use technology, what kind of support and training they receive, and what beliefs they express about the influence of ICT implementation. The case study took place in one of the training schools in Finland. Five teachers participated in face-to-face semi-structured interviews. The findings demonstrated positive attitudes of the teachers toward the integration of ICT. The teachers shared their opinions about the positive influence that ICT implementation has on both teaching and learning processes. However, they also pointed out the negative sides of ICT use: student distraction caused by technology use and technical problems that frustrate teachers. In addition, the responses revealed that the teachers receive adequate training aimed at enhancing their qualifications, together with well-timed technical support and collegial collaboration that facilitate an efficient and smooth teaching process. In the teachers’ view, ICT integration in education appears to have changed the role of the teacher. As the field of ICT keeps developing, teachers are required to upgrade their skills. The paper concludes with the limitations of the study and recommendations for further research.
Abstract:
Background: Understanding transcriptional regulation by genome-wide microarray studies can contribute to unravel complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. The existing software systems for microarray data analysis implement the mentioned standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays, and for the most common synthesized oligo arrays such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach for automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web-services is advantageous in a distributed client-server environment as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extensions towards future transcriptomics methods based on high-throughput sequencing approaches which have much higher computational requirements than microarrays.
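As a small illustration of the kind of annotation integration the abstract calls for (the gene identifiers, expression values and vocabulary terms below are made up; this is not EMMA 2's data model or API):

# Minimal sketch of integrating expression measurements with external
# controlled-vocabulary annotation keyed by gene identifier. All data values
# are hypothetical examples.
expression = {          # gene -> normalised log-ratio (hypothetical)
    "geneA": 1.8,
    "geneB": -0.4,
    "geneC": 2.3,
}
annotation = {          # gene -> controlled-vocabulary terms (hypothetical)
    "geneA": {"DNA repair"},
    "geneC": {"DNA repair", "cell cycle"},
}

def genes_for_term(term, min_ratio=1.0):
    """Genes annotated with `term` whose expression exceeds `min_ratio`."""
    return sorted(g for g, value in expression.items()
                  if value >= min_ratio and term in annotation.get(g, set()))

print(genes_for_term("DNA repair"))   # ['geneA', 'geneC']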
Abstract:
Extreme natural events, such as tsunamis or earthquakes, regularly lead to catastrophes with dramatic consequences. In recent years natural disasters have caused hundreds of thousands of deaths, destroyed infrastructure, disrupted economic activity and caused billions of dollars' worth of property losses, revealing considerable deficits that hinder their effective management: stakeholders, decision-makers and affected people need systematic risk identification and evaluation, a way to assess countermeasures, awareness raising, and decision support systems to be employed before, during and after crisis situations. The overall goal of this study is the interdisciplinary integration of various scientific disciplines to contribute to a tsunami early warning information system. In contrast to most studies, our focus is on high-end geometric and thematic analysis to meet the requirements of small-scale, heterogeneous and complex coastal urban systems. Data, methods and results from engineering, remote sensing and the social sciences are interlinked and provide comprehensive information for disaster risk assessment, management and reduction. In detail, we combine inundation modeling, urban morphology analysis, population assessment, socioeconomic analysis of the population and evacuation modeling. The interdisciplinary results eventually lead to recommendations for mitigation strategies in the fields of spatial planning and coping capacity.
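A toy sketch of how such layers can be combined into a per-district risk index (the districts, normalised values and the multiplicative form are illustrative assumptions, not the study's actual assessment method):

# Combine hazard, exposure and vulnerability layers into a simple risk index
# per district: risk = hazard * exposure * vulnerability (hypothetical data).
districts = {
    "harbour":  {"hazard": 0.9, "exposure": 0.7, "vulnerability": 0.6},
    "old town": {"hazard": 0.4, "exposure": 0.9, "vulnerability": 0.5},
    "hillside": {"hazard": 0.1, "exposure": 0.3, "vulnerability": 0.4},
}

def risk(layers):
    """Multiplicative risk index over the three normalised layers."""
    return layers["hazard"] * layers["exposure"] * layers["vulnerability"]

for name, layers in sorted(districts.items(), key=lambda kv: -risk(kv[1])):
    print(f"{name:9s} risk index = {risk(layers):.3f}")
# harbour   risk index = 0.378
# old town  risk index = 0.180
# hillside  risk index = 0.012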
Abstract:
The integration of Information and Communication Technologies (ICT) in the tourism industry is an essential element for the success of any tourism enterprise. ICTs provide access to information about tourism products from anywhere and at any time. Tour companies can also reach target customers around the world through a range of emerging technologies. This paper reviews the key factors of ICT in tourism, discussing aspects such as website quality, digital marketing, social networking, multimedia, mobile technologies and intelligent environments.
Abstract:
The Library of the Institute of Alajuela carried out a user induction and training experience, ventured into information literacy, and engaged in teaching and learning as an integral part of the curriculum. The library's actions in developing strategies for searching, locating, selecting and using information brought, in the service provided, changes to the role of the library, the librarian, the book and information in the educational environment. By sharing this experience, the aim is to provide information that can motivate staff of educational institutions who wish to enter the field of information literacy as a strategy to support the development of lifelong independent learning skills and meaningful learning. Currently, the library should be a proactive part in the education not only of students but also of teachers, administrative staff and families. This will result in a benefit to Costa Rica: the development of young people and their proper integration into the workplace.
Abstract:
The main metabolic pathway databases, such as KEGG, BRENDA, Reactome and BioCyc, provide only partially interlinked data on metabolic pathways. This limitation means that cross-database information on metabolism can only be retrieved through independent searches, and it restricts the use of more complex queries to discover new knowledge or relationships.
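A minimal sketch of the kind of cross-database linking this points to, joining records from two hypothetical exports on a shared enzyme identifier (the records are made up and do not reflect the schemas of KEGG, BRENDA, Reactome or BioCyc):

# Join pathway records from one (hypothetical) export with kinetic records
# from another via shared EC numbers.
db_a = [  # pathway -> enzyme, as exported from one source (hypothetical)
    {"pathway": "Glycolysis", "ec": "2.7.1.1"},
    {"pathway": "Glycolysis", "ec": "5.3.1.9"},
]
db_b = [  # enzyme -> kinetic entry, as exported from another source (hypothetical)
    {"ec": "2.7.1.1", "km_mM": 0.1},
    {"ec": "1.1.1.1", "km_mM": 1.2},
]

def link_by_ec(pathways, kinetics):
    """Join the two record sets on the shared EC-number field."""
    by_ec = {rec["ec"]: rec for rec in kinetics}
    return [{**p, **by_ec[p["ec"]]} for p in pathways if p["ec"] in by_ec]

for row in link_by_ec(db_a, db_b):
    print(row)   # {'pathway': 'Glycolysis', 'ec': '2.7.1.1', 'km_mM': 0.1}

In practice such joins fail exactly where the abstract says the databases are only partially interlinked: records without a shared identifier simply drop out, which is why richer integration is needed.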
Abstract:
People possess different sensory modalities to detect, interpret, and efficiently act upon various events in a complex and dynamic environment (Fetsch, DeAngelis, & Angelaki, 2013). Much empirical work has been done to understand the interplay of modalities (e.g. audio-visual interactions, see Calvert, Spence, & Stein, 2004). On the one hand, the integration of multimodal input as a functional principle of the brain enables the versatile and coherent perception of the environment (Lewkowicz & Ghazanfar, 2009). On the other hand, sensory integration does not necessarily mean that input from the modalities is always weighted equally (Ernst, 2008). Rather, when two or more modalities are stimulated concurrently, one often finds one modality dominating over another. Studies 1 and 2 of the dissertation addressed the developmental trajectory of sensory dominance. In both studies, 6-year-olds, 9-year-olds, and adults were tested in order to examine sensory (audio-visual) dominance across different age groups. In Study 3, sensory dominance was put into an applied context by examining verbal and visual overshadowing effects among 4- to 6-year-olds performing a face recognition task. The results of Study 1 and Study 2 support the default auditory dominance in young children proposed by Napolitano and Sloutsky (2004), which persists up to 6 years of age. For 9-year-olds, results on privileged modality processing were inconsistent: whereas visual dominance was revealed in Study 1, privileged auditory processing was revealed in Study 2. Among adults, visual dominance was observed in Study 1, as has also been demonstrated in preceding studies (see Spence, Parise, & Chen, 2012). No sensory dominance was revealed in Study 2 for adults. Potential explanations are discussed. Study 3 addressed verbal and visual overshadowing effects in 4- to 6-year-olds. The aim was to examine whether verbalization (i.e., verbally describing a previously seen face) or visualization (i.e., drawing the seen face) might affect later face recognition. No effect of visualization on recognition accuracy was revealed. As opposed to a verbal overshadowing effect, a verbal facilitation effect occurred. Moreover, verbal intelligence was a significant predictor of recognition accuracy in the verbalization group but not in the control group. This suggests that strengthening verbal intelligence in children can pay off in non-verbal domains as well, which might have educational implications.
Abstract:
Effective and efficient implementation of intelligent and/or recently emerged networked manufacturing systems requires enterprise-level integration. Networked manufacturing offers several advantages in the current competitive environment by shortening manufacturing cycle times and maintaining production flexibility, thereby yielding several feasible process plans. The first step in this direction is to integrate manufacturing functions such as process planning and scheduling for multiple jobs in a network-based manufacturing system. It is difficult to determine a proper plan that meets conflicting objectives simultaneously. This paper describes a mobile-agent-based negotiation approach to integrate manufacturing functions in a distributed manner; its fundamental framework and functions are presented. Moreover, an ontology has been constructed using the Protégé software, which can convert knowledge into Extensible Markup Language (XML) schemas of Web Ontology Language (OWL) documents. The generated XML schemas are used to transfer information throughout the manufacturing network for the intelligent, interoperable integration of product data models and manufacturing resources. To validate the feasibility of the proposed approach, an illustrative example covering varied production environments, including production demand fluctuations, is presented, and the performance and effectiveness of the proposed approach are compared with an evolutionary-algorithm-based Hybrid Dynamic-DNA (HD-DNA) algorithm. The results show that the proposed scheme is very effective and reasonably acceptable for the integration of manufacturing functions.
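As a rough, contract-net-style illustration of agent negotiation for assigning a job to a machine (the agent names, costs and bidding rule are hypothetical; this is not the paper's mobile-agent framework or the HD-DNA algorithm it is compared against):

# A job agent announces a task, machine agents bid with a completion-time
# estimate, and the lowest bid wins the job.
class MachineAgent:
    def __init__(self, name, queue_time, unit_time):
        self.name = name
        self.queue_time = queue_time   # time already committed on this machine
        self.unit_time = unit_time     # processing time per operation

    def bid(self, n_operations):
        """Estimated completion time for the announced job."""
        return self.queue_time + n_operations * self.unit_time

def negotiate(job_operations, agents):
    """Collect bids and award the job to the lowest completion-time bid."""
    bids = {agent.name: agent.bid(job_operations) for agent in agents}
    winner = min(bids, key=bids.get)
    return winner, bids

agents = [MachineAgent("M1", queue_time=5, unit_time=2),
          MachineAgent("M2", queue_time=2, unit_time=2)]
print(negotiate(4, agents))   # ('M2', {'M1': 13, 'M2': 10})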
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification – that is, to the disregard for the fact that instances can be conceptualized independently of any class assignment. By separating instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but they are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema, the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve, and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence such as Parsons and Wand propose for information modeling.
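A minimal sketch of the layered idea in executable form: instances are kept as property bags independent of any class, and classification is a separate layer of predicates applied afterwards (the properties and classes are made-up examples, not Parsons and Wand's formal model):

# Instance layer: records described only by their properties, with no class
# assignment baked in.
instances = [
    {"id": 1, "title": "Intro to Cataloguing", "pages": 200, "issn": None},
    {"id": 2, "title": "KO Quarterly 12(3)", "pages": 96, "issn": "1234-5678"},
]

# Classification layer: each class is a predicate over instance properties,
# so instances can later be (re)classified without being restructured.
classes = {
    "Monograph": lambda inst: inst["issn"] is None,
    "Serial":    lambda inst: inst["issn"] is not None,
    "Long work": lambda inst: inst["pages"] > 150,
}

def classify(inst):
    """Return every class whose predicate the instance satisfies."""
    return [name for name, pred in classes.items() if pred(inst)]

for inst in instances:
    print(inst["id"], classify(inst))
# 1 ['Monograph', 'Long work']
# 2 ['Serial']

Because the classification layer sits apart from the instances, adding or revising a class (a new facet, a merged schema) never requires touching the instance records themselves, which is the interoperability point the abstract draws from Parsons and Wand.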
Abstract:
Information and communication technologies (ICTs, henceforth) have become ubiquitous in our society. The plethora of devices competing with the computer, from iPads to the interactive whiteboard, just to name a few, has provided teachers and students alike with the ability to communicate and access information with unprecedented accessibility and speed. It is only logical that schools reflect these changes, given that their purpose is to prepare students for the future. Surprisingly enough, research indicates that ICT integration into teaching activities is still marginal. Many elementary and secondary school teachers are not making effective use of ICTs in their teaching activities or in their assessment practices. The purpose of the current study is a) to describe Quebec ESL teachers’ profiles of ICT use in their daily teaching activities; b) to describe teachers’ ICT integration and assessment practices; and c) to describe teachers’ social representations regarding the utility and relevance of ICT use in their daily teaching activities and assessment practices. In order to attain our objectives, we based our theoretical framework principally on social representations (SR, henceforth) theory and defined the related constructs deemed fundamental to the current thesis. We also collected data from 28 ESL elementary and secondary school teachers working in the public and private sectors. The interview guide used to that end included a range of items to elicit teachers’ SR in terms of daily ICT use in teaching activities as well as in assessment practices. In addition, we carried out our data analyses from a textual-statistics perspective, a particular mode of content analysis, in order to extract the indicators underlying the teachers’ representations. The findings suggest that although almost all participants use a wide range of ICT tools in their practices, ICT implementation is seemingly not exploited to its fullest potential and, correspondingly, is likely to produce limited effects on students’ learning. Moreover, none of the interviewees claims to use ICTs in their assessment practices; they still hold to the traditional paper-based assessment (PBA, henceforth) approach to assessing students’ learning. The teachers’ common discourse reveals a gap between their positive standpoint with regard to ICT integration, on the one hand, and their actual uses of instructional technology, on the other. These results are useful for better understanding the way ESL teachers in Quebec currently view their use of ICTs, particularly for evaluation purposes. In fact, they provide a starting place for reconsidering the implementation of ICTs in elementary and secondary schools. They may also be useful for opening up avenues for the development of a future research program in this regard.