13 results for Web Science
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
For smart cities applications, a key requirement is to disseminate data collected from both scalar and multimedia wireless sensor networks to thousands of end-users. Furthermore, the information must be delivered to non-specialist users in a simple, intuitive and transparent manner. In this context, we present Sensor4Cities, a user-friendly tool that enables data dissemination to large audiences by using social networks and/or web pages. The user can request and receive monitored information through social networks, e.g., Twitter and Facebook, owing to their popularity, user-friendly interfaces and easy dissemination. Additionally, the user can collect or share information from smart city services by using web pages, which also include a mobile version for smartphones. Finally, the tool can be configured to periodically monitor environmental conditions, specific behaviors or abnormal events, and to notify users in an asynchronous manner. Sensor4Cities improves data delivery for individuals or groups of users of smart cities applications and encourages the development of new user-friendly services.
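The asynchronous monitoring described above can be pictured as a simple poll-and-notify loop. The sketch below is a hypothetical Python illustration; the names (SensorReading, NotificationChannel, monitor) and the rule/channel abstractions are assumptions made for this example and are not part of the actual Sensor4Cities tool or of the Twitter/Facebook APIs.

```python
# Illustrative sketch only: periodically poll a sensor, check a user-defined
# rule, and push a notification to the user's chosen delivery channel.
import time
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class SensorReading:
    sensor_id: str
    quantity: str   # e.g., "temperature"
    value: float
    unit: str       # e.g., "degC"


class NotificationChannel:
    """Abstract delivery channel (social-network post, web page update, ...)."""
    def publish(self, message: str) -> None:
        raise NotImplementedError


class ConsoleChannel(NotificationChannel):
    def publish(self, message: str) -> None:
        print(f"[notify] {message}")


def monitor(read_sensor: Callable[[], SensorReading],
            rule: Callable[[SensorReading], bool],
            channels: List[NotificationChannel],
            period_s: float = 60.0,
            iterations: int = 3) -> None:
    """Poll the sensor and notify all channels whenever the rule fires."""
    for _ in range(iterations):
        reading = read_sensor()
        if rule(reading):
            msg = (f"{reading.quantity} at {reading.sensor_id} is "
                   f"{reading.value} {reading.unit}")
            for channel in channels:
                channel.publish(msg)
        time.sleep(period_s)


if __name__ == "__main__":
    fake_read = lambda: SensorReading("park-01", "temperature", 36.5, "degC")
    too_hot = lambda r: r.quantity == "temperature" and r.value > 35.0
    monitor(fake_read, too_hot, [ConsoleChannel()], period_s=0.1, iterations=2)
```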
Abstract:
Spiders are the most important terrestrial predators among arthropods. Their ecological success is reflected by a high biodiversity and the conquest of nearly every terrestrial habitat. Spiders are closely associated with silk, a material often seen as responsible for their great ecological success and one that receives high attention in the life sciences. However, it is often overlooked that more than half of all Recent spider species have abandoned web building or never developed such an adaptation. These species must have found other, more economic solutions for prey capture and retention, compensating for the higher energy costs of increased locomotion activity. Here we show that hairy adhesive pads (scopulae) are closely associated with the convergent evolution of a vagrant life style, resulting in highly diversified lineages of at least equal importance to the derived web-building taxa. Previous studies often highlighted the idea that scopulae primarily assist locomotion, neglecting the fact that only the most distal pads (claw tufts) are suitable for that purpose. Earlier observations that scopulae are used in prey capture have been largely overlooked. Our results suggest that scopulae evolved as a substitute for silk in controlling prey and that claw tufts are, in most cases, a secondary development. Evolutionary trends towards specialized claw tufts and their composition, from a low number of enlarged setae to a dense array of slender ones, as well as the secondary loss of those pads, are discussed further. Hypotheses about the origin of the adhesive setae and their diversification throughout evolution are provided.
Abstract:
Researchers suggest that personalization on the Semantic Web eventually adds up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the biggest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. The fuzzy grassroots ontology underlying an emergent Semantic Web can be built by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. If augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for a Web implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By considering the real world's fuzziness, RRC differs from traditional approaches because it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts towards a natural Web 3.0. Based on natural language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, through Web agents' understanding of natural language, they can react to humans more intuitively and thus generate and process information.
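To make the notion of a restriction with associated reliability more concrete, the following minimal Python sketch encodes a Z-number as a pair of membership functions: one restricting the value of a variable ("about one hour") and one restricting the reliability of that restriction ("quite sure"). The triangular membership functions and all parameter values are illustrative assumptions, not taken from the cited works.

```python
# Illustrative sketch of a Z-number (A, B): A restricts the value of a
# variable, B restricts the reliability (certainty) of A.
from dataclasses import dataclass
from typing import Callable


def triangular(a: float, b: float, c: float) -> Callable[[float], float]:
    """Triangular membership function with support [a, c] and peak at b."""
    def mu(x: float) -> float:
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu


@dataclass
class ZNumber:
    value_restriction: Callable[[float], float]        # A: membership of values
    reliability_restriction: Callable[[float], float]  # B: membership of certainty


# "The duration of the appointment is about one hour, quite sure."
about_one_hour = triangular(45.0, 60.0, 75.0)   # minutes (assumed shape)
quite_sure = triangular(0.6, 0.8, 1.0)          # degree of certainty (assumed shape)
duration = ZNumber(about_one_hour, quite_sure)

print(duration.value_restriction(55.0))        # ~0.67: 55 min partly fits "about one hour"
print(duration.reliability_restriction(0.9))   # 0.5: certainty 0.9 partly fits "quite sure"
```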
Abstract:
In his influential article about the evolution of the Web, Berners-Lee [1] envisions a Semantic Web in which humans and computers alike are capable of understanding and processing information. This vision has yet to materialize. The main obstacle is that, in today's Web, meaning is most often rooted not in formal semantics but in natural language and, in the sense of semiology, emerges only upon interpretation and processing. Yet an automated form of interpretation and processing can be tackled by precisiating raw natural language. To do that, Web agents extract fuzzy grassroots ontologies through induction from existing Web content. Inductive fuzzy grassroots ontologies thus constitute organically evolved knowledge bases that resemble automated gradual thesauri, which allow precisiating natural language [2]. The Web agents' underlying dynamic, self-organizing, and best-effort induction enables a sub-syntactical, bottom-up learning of semiotic associations. Thus, knowledge is induced from users' natural use of language in mutual Web interactions and stored in a gradual, thesaurus-like lexical world-knowledge database as a top-level ontology, eventually allowing a form of computing with words [3]. Since the objects of computation in computing with words are words, phrases and propositions drawn from natural language, it proves to be a practical notion for yielding emergent semantics for the Semantic Web. In the end, an improved understanding by computers should, on the one hand, upgrade human-computer interaction on the Web and, on the other hand, allow an initial version of human-intelligence amplification through the Web.
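As a rough illustration of the bottom-up induction of a gradual thesaurus, the Python sketch below derives graded word associations from co-occurrence in short user texts and stores them with degrees in [0, 1]. The corpus, the tokenization, and the normalization are assumptions made for this example and do not reproduce the authors' actual induction procedure.

```python
# Illustrative sketch: induce a thesaurus-like structure of graded (fuzzy)
# word associations from co-occurrence in users' texts.
from collections import defaultdict
from itertools import combinations
from typing import Dict, List


def induce_gradual_thesaurus(texts: List[str]) -> Dict[str, Dict[str, float]]:
    cooc = defaultdict(lambda: defaultdict(int))
    freq = defaultdict(int)
    for text in texts:
        tokens = set(text.lower().split())
        for t in tokens:
            freq[t] += 1
        for a, b in combinations(sorted(tokens), 2):
            cooc[a][b] += 1
            cooc[b][a] += 1
    # Degree of association: co-occurrence count relative to the rarer term.
    thesaurus: Dict[str, Dict[str, float]] = {}
    for a, neighbours in cooc.items():
        thesaurus[a] = {b: n / min(freq[a], freq[b]) for b, n in neighbours.items()}
    return thesaurus


if __name__ == "__main__":
    corpus = [
        "semantic web agents process information",
        "web agents interpret natural language",
        "users share information on the web",
    ]
    thesaurus = induce_gradual_thesaurus(corpus)
    print(thesaurus["web"]["agents"])  # 1.0: "agents" always co-occurs with "web" here
```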
Abstract:
The human face is a vital component of our identity, and many people undergo medical aesthetic procedures in order to achieve an ideal or desired look. However, communication between physician and patient is fundamental to understanding the patient’s wishes and achieving the desired results. To date, most plastic surgeons rely on either “free hand” 2D drawings on picture printouts or computerized picture morphing. Alternatively, hardware-dependent solutions allow facial shapes to be created and planned in 3D, but they are usually expensive or complex to handle. To offer a simple and hardware-independent solution, we propose a web-based application that uses three standard 2D pictures to create a 3D representation of the patient’s face, on which facial aesthetic procedures such as filling, skin clearing or rejuvenation, and rhinoplasty are planned in 3D. The proposed application couples a set of well-established methods in a novel manner to optimize 3D reconstructions for clinical use. Face reconstructions performed with the application were evaluated by two plastic surgeons and also compared to ground-truth data. Results showed that the application can provide accurate 3D face representations for clinical use (with an average error of 2 mm) in less than 5 minutes.
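The reported accuracy figure can be understood as an average point-to-surface distance between the reconstruction and the ground-truth data. The sketch below shows one plausible way to compute such a mean error with NumPy on synthetic point clouds; it is illustrative only and does not reproduce the paper's evaluation protocol or data.

```python
# Illustrative sketch: mean nearest-neighbour distance (in mm) between a
# reconstructed face point cloud and ground-truth points.
import numpy as np


def mean_surface_error(reconstructed: np.ndarray, ground_truth: np.ndarray) -> float:
    """Both inputs are (N, 3) and (M, 3) point clouds in millimetres."""
    # For each ground-truth point, distance to the closest reconstructed point.
    diffs = ground_truth[:, None, :] - reconstructed[None, :, :]   # (M, N, 3)
    dists = np.linalg.norm(diffs, axis=-1)                         # (M, N)
    return float(dists.min(axis=1).mean())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.uniform(-80, 80, size=(500, 3))          # synthetic "ground-truth" scan
    rec = gt + rng.normal(scale=1.5, size=gt.shape)   # synthetic reconstruction, mm-scale noise
    print(f"mean error: {mean_surface_error(rec, gt):.2f} mm")
```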
Abstract:
This chapter presents fuzzy cognitive maps (FCM) as a vehicle for Web knowledge aggregation, representation, and reasoning. The corresponding Web KnowARR framework incorporates findings from fuzzy logic. A first emphasis is on the Web KnowARR framework itself; a second focal point is a stakeholder management use case that illustrates the framework’s usefulness. This form of management is intended to help projects gain acceptance and assertiveness by actively involving claims on company decisions in the management process. Stakeholder maps visually (re)present these claims; they draw on non-public content on the one hand and on content that is available to the public (mostly on the Web) on the other. The Semantic Web offers opportunities not only to present public content descriptively but also to show relationships. The proposed framework can serve as the basis for the public content of stakeholder maps.
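For readers unfamiliar with fuzzy cognitive maps, the following small Python sketch shows the generic FCM reasoning step: concept activations are propagated over a signed, weighted adjacency matrix and squashed with a sigmoid until they stabilize. The concepts and weights are invented for illustration and are not the Web KnowARR stakeholder use case itself.

```python
# Illustrative sketch of generic fuzzy cognitive map (FCM) inference.
import numpy as np


def sigmoid(x: np.ndarray, lam: float = 1.0) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-lam * x))


def run_fcm(weights: np.ndarray, state: np.ndarray,
            max_iter: int = 100, eps: float = 1e-5) -> np.ndarray:
    """Iterate A(t+1) = sigmoid(A(t) @ W) until the activations converge."""
    for _ in range(max_iter):
        new_state = sigmoid(state @ weights)
        if np.max(np.abs(new_state - state)) < eps:
            return new_state
        state = new_state
    return state


if __name__ == "__main__":
    # Invented concepts: 0 = stakeholder claims, 1 = management attention, 2 = project acceptance
    W = np.array([
        [0.0, 0.7, 0.0],   # claims raise management attention
        [0.0, 0.0, 0.8],   # attention raises acceptance
        [0.3, 0.0, 0.0],   # acceptance feeds back into new claims
    ])
    A0 = np.array([1.0, 0.0, 0.0])
    print(run_fcm(W, A0))   # stable activation levels of the three concepts
```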
Abstract:
Because knowledge on the World Wide Web is continuously expanding, Web Knowledge Aggregation, Representation and Reasoning (abbreviated as KR) is becoming increasingly important. This article demonstrates how fuzzy ontologies can be used in KR to improve the interactions between humans and computers. The gap between the Social and the Semantic Web can be reduced, and a Social Semantic Web may become possible. As an illustrative example, we demonstrate how fuzzy logic and KR can enhance technologies for cognitive cities. The underlying notion of these technologies is based on connectivism, which can be improved by incorporating the results of digital humanities research.
Abstract:
A vast amount of temporal information is provided on the Web. Even though many facts expressed in documents are time-related, the temporal properties of Web presentations have not received much attention. In database research, temporal databases have become a mainstream topic in recent years. In Web documents, temporal data may exist as metadata in the header and as user-directed data in the body of a document. Whereas temporal data can easily be identified in the semi-structured metadata, it is more difficult to determine temporal data and its role in the body. We propose procedures for maintaining the temporal integrity of Web pages and outline different approaches to applying bitemporal data concepts to Web documents. In particular, we consider desirable functionalities of Web repositories and other Web-related tools that may support Webmasters in managing the temporal data of their Web documents. Some properties of a prototype environment are described.
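Bitemporal data concepts attach two time dimensions to each fact: a valid-time interval (when the fact holds in the real world) and a transaction-time interval (when the document asserted it). The Python sketch below illustrates this with a generic, textbook-style record and an "as of" query; it is not the prototype environment's actual schema.

```python
# Illustrative sketch: bitemporal facts attached to Web-page content.
from dataclasses import dataclass
from datetime import date
from typing import List

FOREVER = date.max


@dataclass
class BitemporalFact:
    statement: str
    valid_from: date
    valid_to: date   # exclusive; FOREVER = "until further notice"
    tx_from: date    # when the page started asserting this
    tx_to: date      # exclusive; FOREVER = still asserted on the page


def as_of(facts: List[BitemporalFact], valid_at: date, seen_at: date) -> List[str]:
    """What did the page claim (as stored at seen_at) to be true at valid_at?"""
    return [f.statement for f in facts
            if f.valid_from <= valid_at < f.valid_to
            and f.tx_from <= seen_at < f.tx_to]


if __name__ == "__main__":
    facts = [
        BitemporalFact("Office hours: Mon-Fri 9-17",
                       date(2020, 1, 1), date(2021, 1, 1),
                       date(2020, 1, 1), FOREVER),
        BitemporalFact("Office hours: Mon-Fri 8-16",
                       date(2021, 1, 1), FOREVER,
                       date(2021, 2, 15), FOREVER),
    ]
    print(as_of(facts, valid_at=date(2021, 3, 1), seen_at=date(2021, 3, 1)))
```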
Abstract:
Following terms such as Web 2.0, in which Internet users can participate through a variety of activities, and Enterprise 2.0, the concept of "Maintenance 2.0" ("Wartung 2.0") is developed. This approach focuses on involving user communities to improve a company's web presence and, in particular, its B2C systems. Maintenance 2.0 is a component of Web engineering and thus also an element of the concept of information engineering advocated by Lutz J. Heinrich.