925 results for Semantic Web, Exploratory Search, Recommendation Systems
Impact of Commercial Search Engines and International Databases on Engineering Teaching and Research
Abstract:
For the last three decades, engineering higher education and professional environments have been completely transformed by the "electronic/digital information revolution", which has included the introduction of the personal computer, the development of email and the World Wide Web, and broadband Internet connections at home. Herein the writer compares the performance of several digital tools with traditional library resources. While new specialised search engines and open-access digital repositories may fill a gap between conventional search engines and traditional references, they should not be confused with real libraries and international scientific databases that encompass textbooks and peer-reviewed scholarly works. An absence from some Internet search listings, databases and repositories is not an indication of standing. Researchers, engineers and academics should bear these key differences in mind when assessing the quality of bibliographic "research" based solely upon Internet searches.
Abstract:
An order-of-magnitude sensitivity gain is described for using quasar spectra to investigate possible time or space variation in the fine structure constant alpha. Applied to a sample of 30 absorption systems spanning redshifts 0.5 < z < 1.6, we derive limits on variations in alpha over a wide range of epochs. For the whole sample, Delta alpha/alpha = (-1.1 +/- 0.4) x 10^(-5). This deviation is dominated by measurements at z > 1, where Delta alpha/alpha = (-1.9 +/- 0.5) x 10^(-5). For z < 1, Delta alpha/alpha = (-0.2 +/- 0.4) x 10^(-5). While this is consistent with a time-varying alpha, further work is required to explore possible systematic errors in the data, although careful searches have so far revealed none.
Abstract:
Ecological interface design (EID) is proving to be a promising approach to the design of interfaces for complex dynamic systems. Although the principles of EID and examples of its effective use are widely available, few readily available examples exist of how the individual displays that constitute an ecological interface are developed. This paper presents the semantic mapping process within EID in the context of prior theoretical work in this area. The semantic mapping process that was used in developing an ecological interface for the Pasteurizer II microworld is outlined, and the results of an evaluation of the ecological interface against a more conventional interface are briefly presented. Subjective reports indicate features of the ecological interface that made it particularly valuable for participants. Finally, we outline the steps of an analytic process for using EID. The findings presented here can be applied in the design of ecological interfaces or of configural displays for dynamic processes.
Abstract:
Formal Concept Analysis is an unsupervised machine learning technique that has been applied successfully to document organisation by treating documents as objects and keywords as attributes. The basic algorithms of Formal Concept Analysis then allow an intelligent information retrieval system to cluster documents according to keyword views. This paper investigates the scalability of this idea. In particular, we present the results of applying spatial data structures to large datasets in Formal Concept Analysis. Our experiments are motivated by the application of the Formal Concept Analysis idea to a virtual filesystem [11,17,15], in particular the libferris [1] Semantic File System. This paper presents customisations to an RD-Tree Generalized Index Search Tree based index structure to better support the application of Formal Concept Analysis to large data sources.
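The abstract above treats documents as objects and keywords as attributes. As a minimal illustration of that idea (not taken from the paper, and using an invented toy context), the following sketch enumerates the formal concepts of a tiny document-keyword context by closing each subset of documents through the derivation operators:

```python
from itertools import combinations

# Toy formal context: documents (objects) x keywords (attributes).
context = {
    "doc1": {"semantic", "web"},
    "doc2": {"semantic", "search"},
    "doc3": {"web", "search"},
}

def intent(docs):
    """Attributes shared by every document in `docs`."""
    sets = [context[d] for d in docs]
    if not sets:  # empty set of documents shares all attributes
        return {a for kws in context.values() for a in kws}
    return set.intersection(*sets)

def extent(attrs):
    """Documents possessing every attribute in `attrs`."""
    return {d for d, kws in context.items() if attrs <= kws}

# Enumerate formal concepts (extent, intent) by closing every document subset.
concepts = set()
docs = list(context)
for r in range(len(docs) + 1):
    for combo in combinations(docs, r):
        e = extent(intent(set(combo)))  # closure of the document subset
        concepts.add((frozenset(e), frozenset(intent(e))))

for e, i in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(e), sorted(i))
```

This brute-force closure is exponential in the number of documents; the paper's point is precisely that index structures are needed before the technique scales to large data sources.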
Abstract:
Numerical optimisation methods are being more commonly applied to agricultural systems models to identify the most profitable management strategies. The available optimisation algorithms are reviewed and compared, with the literature and our studies identifying evolutionary algorithms (including genetic algorithms) as superior in this regard to simulated annealing, tabu search, hill-climbing, and direct-search methods. Results of a complex beef property optimisation, using a real-value genetic algorithm, are presented. The relative contributions of the range of operational options and parameters of this method are discussed, and general recommendations are listed to assist practitioners applying evolutionary algorithms to agricultural systems problems. (C) 2001 Elsevier Science Ltd. All rights reserved.
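The real-value genetic algorithm mentioned above can be sketched in a few lines. This is a generic illustration, not the paper's implementation: the objective function is an invented stand-in for a profit model, and the operator choices (tournament selection, blend crossover, Gaussian mutation, elitism) are common defaults rather than the authors' settings:

```python
import random

random.seed(42)

# Hypothetical stand-in for a farm-systems profit model:
# maximise f(x, y) = -(x - 3)^2 - (y + 1)^2, optimum at (3, -1).
def fitness(ind):
    x, y = ind
    return -(x - 3) ** 2 - (y + 1) ** 2

POP, GENS, BOUND = 40, 60, 10.0

def clip(v):
    return max(-BOUND, min(BOUND, v))

pop = [[random.uniform(-BOUND, BOUND) for _ in range(2)] for _ in range(POP)]
for _ in range(GENS):
    def select():  # binary tournament selection
        a, b = random.sample(pop, 2)
        return a if fitness(a) > fitness(b) else b
    nxt = []
    while len(nxt) < POP:
        p1, p2 = select(), select()
        w = random.random()
        # Blend (arithmetic) crossover, then Gaussian mutation
        child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
        child = [clip(g + random.gauss(0, 0.3)) for g in child]
        nxt.append(child)
    nxt[0] = max(pop, key=fitness)[:]  # elitism: carry the best forward
    pop = nxt

best = max(pop, key=fitness)
print(best, fitness(best))
```

The population converges close to the optimum at (3, -1); the "operational options and parameters" the abstract refers to are exactly knobs like population size, mutation spread, and the crossover scheme.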
Abstract:
This paper examines the effects of information request ambiguity and construct incongruence on end users' ability to develop SQL queries with an interactive relational database query language. In this experiment, ambiguity in information requests adversely affected accuracy and efficiency. Incongruities among the information request, the query syntax, and the data representation adversely affected accuracy, efficiency, and confidence. The results for ambiguity suggest that organizations might elicit better query development if end users were sensitized to the nature of ambiguities that could arise in their business contexts. End users could translate natural language queries into pseudo-SQL that could be examined for precision before the queries were developed. The results for incongruence suggest that better query development might ensue if semantic distances could be reduced by giving users data representations and database views that maximize construct congruence for the kinds of queries in typical domains. (C) 2001 Elsevier Science B.V. All rights reserved.
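To make the notion of information request ambiguity concrete (this example is not from the paper; the schema and data are invented), consider the request "list customers and their order totals". It admits two SQL readings that return different results:

```python
import sqlite3

# Hypothetical two-table schema for the ambiguous request
# "list customers and their order totals".
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Ana'), (2, 'Bruno');
    INSERT INTO orders VALUES (1, 1, 50.0), (2, 1, 25.0);
""")

# Reading 1: INNER JOIN silently drops customers with no orders.
inner = con.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
""").fetchall()

# Reading 2: LEFT JOIN keeps them, with a total of zero.
left = con.execute("""
    SELECT c.name, COALESCE(SUM(o.amount), 0)
    FROM customers c LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
""").fetchall()

print(inner)  # Bruno (no orders) is excluded
print(left)   # Bruno appears with a total of 0
```

Both queries are syntactically valid answers to the same English request, which is the kind of ambiguity the study found to harm accuracy and efficiency.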
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
Abstract:
Regional planners, policy makers and policing agencies all recognize the importance of better understanding the dynamics of crime. Theoretical and application-oriented approaches which provide insights into why and where crimes take place are much sought after. Geographic information systems and spatial analysis techniques, in particular, are proving to be essential for studying criminal activity. However, the capabilities of these quantitative methods continue to evolve. This paper explores the use of geographic information systems and spatial analysis approaches for examining crime occurrence in Brisbane, Australia. The analysis highlights novel capabilities for the analysis of crime in urban regions.
Abstract:
This paper discusses a document discovery tool based on Conceptual Clustering by Formal Concept Analysis. The program allows users to navigate e-mail using a visual lattice metaphor rather than a tree. It implements a virtual file structure over e-mail where files and entire directories can appear in multiple positions. The content and shape of the lattice formed by the conceptual ontology can assist in e-mail discovery. The system described provides more flexibility in retrieving stored e-mails than what is normally available in e-mail clients. The paper discusses how conceptual ontologies can leverage traditional document retrieval systems and aid knowledge discovery in document collections.
Abstract:
This paper is concerned with methods for refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems which involve state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics of Object-Z classes identical to CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components in an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed in a refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP where we change the structure of the specification when performing the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model and they are illustrated via a number of examples.
Abstract:
The impact of basal ganglia dysfunction on semantic processing was investigated by comparing the performance of individuals with nonthalamic subcortical (NS) vascular lesions, Parkinson's disease (PD), cortical lesions, and matched controls on a semantic priming task. Unequibiased lexical ambiguity primes were used in auditory prime-target pairs comprising four critical conditions: dominant related (e.g., bank-money), subordinate related (e.g., bank-river), dominant unrelated (e.g., foot-money) and subordinate unrelated (e.g., bat-river). Participants made speeded lexical decisions (word/nonword) on targets using a go-no-go response. When a short prime-target interstimulus interval (ISI) of 200 ms was employed, all groups demonstrated priming for dominant and subordinate conditions, indicating nonselective meaning facilitation and intact automatic lexical processing. Differences emerged at the long ISI (1250 ms), where control and cortical lesion participants evidenced selective facilitation of the dominant meaning, whereas NS and PD groups demonstrated a protracted period of nonselective meaning facilitation. This finding suggests a circumscribed deficit in the selective attentional engagement of the semantic network on the basis of meaning frequency, possibly implicating a disturbance of frontal-subcortical systems influencing inhibitory semantic mechanisms.
Abstract:
The goal of the present study is to map the possible contributions of participatory online platforms to citizen actions that may aid the fight against cancer and its associated consequences. These platforms are usually associated with entertainment; in that sense, we intend to test their validity in other domains such as health, and to contribute to an expanded perception of their potential by their users. The research is based on the analysis of online solidarity networks, namely those residing on Facebook, Orkut and the blogosphere, that citizens have been gradually resorting to. The research is also based on the development of newer and more efficient solutions that provide the individual (directly or indirectly affected by issues of oncology) with the means to overcome feelings of impotence and fatality. In this article, we aim to summarize the processes of usage of these decentralized, freer participatory platforms by citizens and institutions, while attempting to unravel existing hype and stigma; we also provide a first survey of the importance and the role of institutions in this kind of endeavor; lastly, we present a prototype, developed in the context of the present study, that is specifically dedicated to addressing oncology through social media. This prototype is already available online at www.talkingaboutcancer.org, although it is still under development and testing. The main objective of this platform is to allow every citizen to freely build their network of contacts and information, according to their own individual and/or collective needs and desires.
Abstract:
Background and Purpose: Precise needle puncture of the kidney is a challenging and essential step for successful percutaneous nephrolithotomy (PCNL). Many devices and surgical techniques have been developed to easily achieve suitable renal access. This article presents a critical review to address the methodologies and techniques for conducting kidney targeting and the puncture step during PCNL. Based on this study, research paths are also provided for PCNL procedure improvement. Methods: Most relevant works concerning PCNL puncture were identified by a search of the Medline/PubMed, ISI Web of Science, and Scopus databases from 2007 to December 2012. Two authors independently reviewed the studies. Results: A total of 911 abstracts and 346 full-text articles were assessed and discussed; 52 were included in this review as a summary of the main contributions to kidney targeting and puncturing. Conclusions: Multiple paths and technologic advances have been proposed in the field of urology and minimally invasive surgery to improve PCNL puncture. The most relevant contributions, however, have been provided by the application of medical imaging guidance, new surgical tools, motion tracking systems, robotics, and image processing and computer graphics. Despite the multiple research paths for PCNL puncture guidance, no widely acceptable solution has yet been reached, and it remains an active and challenging research field. Future developments should focus on real-time methods, robust and accurate algorithms, and radiation-free imaging techniques.