844 results for New knowledge systems


Relevance: 90.00%

Abstract:

Software developers are often unsure of the exact name of the method they need to use to invoke the desired behavior in a given context. This leads to a search for the correct method name in documentation, which can be lengthy and distracting. We can decrease this method search time by enhancing the documentation of a class with its most frequently used methods. Usage frequency data for methods is gathered by analyzing other projects from the same ecosystem, i.e., projects written in the same language and sharing dependencies. We implemented a proof of concept of the approach for Pharo Smalltalk and Java. In Pharo Smalltalk, methods are commonly searched for with a code browser tool called "Nautilus"; in Java, with a web browser displaying HTML-based documentation (Javadoc). We developed plugins for both browsers and gathered method usage data from open-source projects in order to increase developer productivity by reducing method search time. A small initial evaluation shows promising results in improving developer productivity.
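As a rough sketch of the mining step described above, the Python fragment below counts how often each known method name occurs at call sites across an ecosystem corpus and ranks a class's methods by that frequency. It is a minimal sketch under simplifying assumptions (regex-based call-site detection over a directory of Java sources); count_method_usage and rank_methods are illustrative names, not the paper's tools.

import re
from collections import Counter
from pathlib import Path

def count_method_usage(corpus_dir: str, method_names: set[str]) -> Counter:
    """Count how often each known method name appears at call sites
    across an ecosystem corpus (rough heuristic: a regex for `.name(`)."""
    usage = Counter()
    for path in Path(corpus_dir).rglob("*.java"):
        source = path.read_text(errors="ignore")
        for match in re.finditer(r"\.(\w+)\s*\(", source):
            if match.group(1) in method_names:
                usage[match.group(1)] += 1
    return usage

def rank_methods(method_names: set[str], usage: Counter) -> list[str]:
    """Order a class's methods by ecosystem usage, most frequent first."""
    return sorted(method_names, key=lambda m: usage[m], reverse=True)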

Relevance: 90.00%

Abstract:

Software developers are often unsure of the exact name of the API method they need to use to invoke the desired behavior. Most state-of-the-art documentation browsers present API artefacts in alphabetical order. Albeit easy to implement, alphabetical order does not help much: if developers knew the name of the required method, they could have just searched for it in the first place. In a context where multiple projects use the same API and their source code is available, we can improve the API presentation by organizing the elements in the order in which they are most likely to be used by the developer. Usage frequency data for methods is gathered by analyzing other projects from the same ecosystem, and this data is then used to improve tools. We present a preliminary study on the potential of this approach to improve API presentation by reducing the time it takes to find the method that implements a given feature. We also briefly present our experience with two proof-of-concept tools implemented for Smalltalk and Java.
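To illustrate the presentation idea, the toy sketch below replaces an alphabetical API listing with a frequency-ordered one; the usage counts are made up, and reorder_api_listing is a hypothetical helper, not part of the described tools.

from collections import Counter

def reorder_api_listing(methods: list[str], usage: Counter) -> list[str]:
    """Replace the alphabetical ordering of an API listing with a
    usage-frequency ordering; ties fall back to alphabetical order."""
    return sorted(methods, key=lambda m: (-usage[m], m))

# Toy comparison: how far down the listing a popular method appears.
usage = Counter({"toString": 950, "equals": 700, "hashCode": 650, "clone": 40})
alphabetical = sorted(usage)
by_frequency = reorder_api_listing(alphabetical, usage)
print(alphabetical.index("toString"))  # 3 (last alphabetically)
print(by_frequency.index("toString"))  # 0 (first by frequency)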

Relevance: 90.00%

Abstract:

Dynamically typed languages lack information about the types of variables in the source code. Developers care about this information as it supports program comprehension. Basic type inference techniques are helpful, but may yield many false positives or negatives. We propose to mine information from the software ecosystem on how frequently given types are inferred unambiguously to improve the quality of type inference for a single system. This paper presents an approach to augment existing type inference techniques by supplementing the information available in the source code of a project with data from other projects written in the same language. For all available projects, we track how often messages are sent to instance variables throughout the source code. Predictions for the type of a variable are made based on the messages sent to it. The evaluation of a proof-of-concept prototype shows that this approach works well for types that are sufficiently popular, like those from the standard libraries, and tends to create false positives for unpopular or domain-specific types. The false positives are, in most cases, fairly easy to identify. The evaluation data also shows a substantial increase in the number of correctly inferred types when compared to non-augmented type inference.
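The prediction step can be pictured with a small sketch: given hypothetical ecosystem counts of which receiver types were unambiguously inferred for each message selector, candidate types for a variable are scored by summing those counts over the messages sent to it. The selectors, types, and counts below are invented for illustration.

from collections import Counter

# Hypothetical ecosystem data: for each message selector, how often each
# class was observed (unambiguously) as the receiver's type.
ECOSYSTEM: dict[str, Counter] = {
    "add:":        Counter({"OrderedCollection": 1200, "Set": 400}),
    "at:put:":     Counter({"Dictionary": 900, "Array": 300}),
    "removeFirst": Counter({"OrderedCollection": 350}),
}

def infer_type(messages_sent: list[str]) -> list[tuple[str, int]]:
    """Score candidate types by summing, over every message sent to the
    variable, how often each type received that message ecosystem-wide.
    Types that do not understand some message are filtered out."""
    candidates: Counter = Counter()
    for selector in messages_sent:
        candidates.update(ECOSYSTEM.get(selector, Counter()))
    return [
        (t, score) for t, score in candidates.most_common()
        if all(t in ECOSYSTEM.get(sel, {}) for sel in messages_sent)
    ]

print(infer_type(["add:", "removeFirst"]))  # [('OrderedCollection', 1550)]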

Relevance: 90.00%

Abstract:

Scoping behavioral variations to dynamic extents is useful to support non-functional requirements that otherwise result in cross-cutting code. Unfortunately, such variations are difficult to achieve with traditional reflection or aspects. We show that with a modification of dynamic proxies, called delegation proxies, it becomes possible to reflectively implement variations that propagate to all objects accessed in the dynamic extent of a message send. We demonstrate our approach with examples of variations scoped to dynamic extents that help simplify code related to safety, reliability, and monitoring.
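Delegation proxies as described are a reflective mechanism (a modification of dynamic proxies); the Python sketch below only approximates the propagation behavior in plain object-oriented terms: every message send passes through a proxy that applies the variation and wraps each non-primitive result in the same proxy, so the variation reaches all objects accessed in the dynamic extent. All names are illustrative.

class PropagatingProxy:
    _PRIMITIVES = (int, float, str, bool, bytes, type(None))

    def __init__(self, target, before):
        object.__setattr__(self, "_target", target)
        object.__setattr__(self, "_before", before)  # the variation

    def __getattr__(self, name):
        target = object.__getattribute__(self, "_target")
        before = object.__getattribute__(self, "_before")
        before(target, name)              # e.g., log or check the access
        value = getattr(target, name)
        if callable(value):
            def wrapped(*args, **kwargs):
                result = value(*args, **kwargs)
                if isinstance(result, PropagatingProxy._PRIMITIVES):
                    return result
                return PropagatingProxy(result, before)  # propagate
            return wrapped
        if isinstance(value, PropagatingProxy._PRIMITIVES):
            return value
        return PropagatingProxy(value, before)           # propagate

# Usage: record every access in the dynamic extent of sends to `proxied`.
log = []
proxied = PropagatingProxy([1, 2, 3], lambda obj, name: log.append(name))
proxied.append(4)   # mutates the underlying list
print(log)          # ['append']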

Relevance: 90.00%

Abstract:

In this work, a method that synchronizes two video sequences is proposed. Unlike previous methods, which require correspondences between features tracked in the two sequences and/or cameras that are static or jointly moving, the proposed approach imposes none of these constraints. It works when the cameras move independently, even if different features are tracked in the two sequences. The assumptions underlying the proposed strategy are that the intrinsic parameters of the cameras are known and that two rigid objects, with independent motions in the scene, are visible in both sequences. The relative motion between these objects is used as a clue for the synchronization. The extrinsic parameters of the cameras are assumed to be unknown. A new synchronization algorithm for static or jointly moving cameras that see (possibly) different parts of a common rigidly moving object is also proposed. Proof-of-concept experiments that illustrate the performance of these methods are presented, as well as a comparison with a state-of-the-art approach.
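As an assumption-laden stand-in for the final alignment step only (not the paper's algorithm), suppose each sequence has already been reduced to a per-frame scalar signature of the relative motion between the two objects; the temporal offset can then be estimated by normalized cross-correlation.

import numpy as np

def estimate_offset(sig_a: np.ndarray, sig_b: np.ndarray) -> int:
    """Estimate the integer frame offset between two sequences from
    per-frame scalar signatures of the relative motion of the two
    objects (e.g., relative rotation angle per frame)."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    corr = np.correlate(a, b, mode="full")
    # Lag at the correlation peak; positive means sig_a lags sig_b.
    return int(np.argmax(corr)) - (len(b) - 1)

# Toy check: sequence B is A delayed by 7 frames.
rng = np.random.default_rng(0)
a = rng.standard_normal(200)
b = np.roll(a, 7)  # wrap-around roll; interior behaves like a pure delay
print(estimate_offset(b, a))  # ~7 under the stated assumptions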

Relevance: 90.00%

Abstract:

The finite depth of field of a real camera can be used to estimate the depth structure of a scene. The distance of an object from the plane in focus determines the defocus blur size. The shape of the blur depends on the shape of the aperture. The blur shape can be designed by masking the main lens aperture. In fact, aperture shapes different from the standard circular aperture give improved accuracy of depth estimation from defocus blur. We introduce an intuitive criterion to design aperture patterns for depth from defocus. The criterion is independent of a specific depth estimation algorithm. We formulate our design criterion by imposing constraints directly in the data domain and optimize the amount of depth information carried by blurred images. Our criterion is a quadratic function of the aperture transmission values. As such, it can be numerically evaluated to estimate optimized aperture patterns quickly. The proposed mask optimization procedure is applicable to different depth estimation scenarios. We use it for depth estimation from two images with different focus settings, for depth estimation from two images with different aperture shapes as well as for depth estimation from a single coded aperture image. In this work we show masks obtained with this new evaluation criterion and test their depth discrimination capability using a state-of-the-art depth estimation algorithm.
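Since the criterion is quadratic in the aperture transmission values, mask design can be pictured as maximizing t^T Q t over candidate transmission vectors t. The sketch below does this by random search over binary masks; the matrix Q is assumed to come from the paper's design criterion, and the random Q in the usage lines is purely for illustration.

import numpy as np

def best_mask(Q: np.ndarray, n_candidates: int = 10000,
              seed: int = 0) -> np.ndarray:
    """Random search over binary aperture masks, scoring each candidate
    with a quadratic criterion t^T Q t on the transmission vector t."""
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    best_t, best_score = None, -np.inf
    for _ in range(n_candidates):
        t = rng.integers(0, 2, size=n).astype(float)  # open/closed cells
        score = t @ Q @ t
        if score > best_score:
            best_t, best_score = t, score
    return best_t

# Toy usage with a random symmetric Q over a 5x5-cell aperture.
rng = np.random.default_rng(1)
A = rng.standard_normal((25, 25))
Q = (A + A.T) / 2
mask = best_mask(Q).reshape(5, 5)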

Relevance: 90.00%

Abstract:

Indoor localization systems have become increasingly interesting to researchers because of attractive business cases in various application fields. A WiFi-based passive localization system can provide user location information to third-party providers of positioning services. However, indoor localization techniques are prone to multipath and Non-Line-Of-Sight (NLOS) propagation, which lead to significant performance degradation. To overcome these problems, we provide a passive localization system for WiFi targets with several improved localization algorithms. Using Software Defined Radio (SDR) techniques, we extract Channel Impulse Response (CIR) information at the physical layer. The CIR is then used to mitigate the multipath fading problem. We propose a Nonlinear Regression (NLR) method to relate the filtered power information to propagation distances, which significantly improves ranging accuracy compared to the commonly used log-distance path loss model. To mitigate the influence of ranging errors, we also design a new trilateration algorithm by combining Weighted Centroid and Constrained Weighted Least Squares (WC-CWLS) algorithms. Experiment results show that our algorithm is robust against ranging errors and outperforms both the linear least squares algorithm and the weighted centroid algorithm.
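A simplified sketch of the two-stage pipeline: a nonlinear regression from filtered power to distance, followed by a weighted-centroid position estimate. The exponential model form and the plain weighted centroid are stand-ins chosen for illustration; the paper fits its own NLR model and combines Weighted Centroid with Constrained Weighted Least Squares.

import numpy as np
from scipy.optimize import curve_fit

# Step 1: nonlinear regression from filtered power to distance.
# The model form d = a * exp(b * p) is an assumption for illustration.
def nlr_model(p, a, b):
    return a * np.exp(b * p)

def fit_ranging(power: np.ndarray, distance: np.ndarray):
    params, _ = curve_fit(nlr_model, power, distance, p0=(1.0, -0.1))
    return params

# Step 2: weighted-centroid position estimate from anchor positions and
# estimated ranges (short ranges get large weights).
def weighted_centroid(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    weights = 1.0 / np.maximum(ranges, 1e-6)
    return (weights[:, None] * anchors).sum(axis=0) / weights.sum()

# Toy usage: fit ranging on synthetic data, then localize.
power = np.array([-40.0, -50.0, -60.0, -70.0])   # dB, each -10 dB doubles d
dist = np.array([1.0, 2.0, 4.0, 8.0])
a, b = fit_ranging(power, dist)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
print(weighted_centroid(anchors, np.array([5.2, 5.0, 4.1])))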

Relevance: 90.00%

Abstract:

Ethnobiology research contributes significantly to initiatives that aim to enhance food sovereignty among indigenous and/or traditional peoples. In Bolivia, one of the Latin American countries with the highest poverty and undernourishment levels, the purpose of this action-research project was to enhance food sovereignty through the revitalization of local ecological knowledge and to promote local technological innovation processes in the Andean community of Tallija-Confital. In a first step, the endogenous knowledge and strategies related to food security and sovereignty were investigated, based on the principles and tools of Revitalizing Participatory Research (RPR). In a second step, local technical innovation processes were supported through a "knowledge dialogue" between exogenous and endogenous knowledge systems, focusing on the processing of cañahua (Chenopodium pallidicaule Aellen) grain. The research results demonstrate that Andean people have developed complex endogenous knowledge and strategies to adapt to socio-environmental changes, and that these show great potential to contribute to the enhancement of food sovereignty. Nevertheless, in the current globalized context, which brings new challenges for local communities, a dialogue between different knowledge systems, going beyond the revitalization of local ecological knowledge, can lead to important local technological innovation for the improvement of their well-being. Keywords: food sovereignty, knowledge dialogue, endogenous development, technological innovation

Relevance: 90.00%

Abstract:

Information-centric networking (ICN) is a new communication paradigm that aims at increasing security and efficiency of content delivery in communication networks. In recent years, many research efforts in ICN have focused on caching strategies to reduce traffic and increase overall performance by decreasing download times. Since caches need to operate at line speed, they have only a limited size and content can only be stored for a short time. However, if content needs to be available for a longer time, e.g., for delay-tolerant networking or to provide high content availability similar to content delivery networks (CDNs), persistent caching is required. We base our work on the Content-Centric Networking (CCN) architecture and investigate persistent caching by extending the current repository implementation in CCNx. We show by extensive evaluations in a YouTube and webserver traffic scenario that repositories can be efficiently used to increase content availability by significantly increasing cache hit rates.
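The two-tier idea can be sketched as a small line-speed cache backed by a persistent repository: evicted content survives in the repository and can repopulate the cache on a later request, which is what raises the long-term hit rate. Sizes, names, and the in-memory "repository" below are illustrative, not the CCNx implementation.

from collections import OrderedDict

class TwoTierStore:
    def __init__(self, cache_size: int):
        self.cache = OrderedDict()   # fast, bounded, evicts LRU entries
        self.repository = {}         # persistent store (unbounded here)
        self.cache_size = cache_size
        self.hits = self.misses = 0

    def put(self, name: str, content: bytes, persist: bool = False):
        self.cache[name] = content
        self.cache.move_to_end(name)
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)   # evict least recently used
        if persist:
            self.repository[name] = content  # survives cache eviction

    def get(self, name: str):
        if name in self.cache:
            self.cache.move_to_end(name)
            self.hits += 1
            return self.cache[name]
        if name in self.repository:          # persistent hit: repopulate
            self.hits += 1
            self.put(name, self.repository[name])
            return self.repository[name]
        self.misses += 1
        return None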

Relevance: 90.00%

Abstract:

The new computing paradigm known as cognitive computing attempts to imitate the human capabilities of learning, problem solving, and considering things in context. To do so, an application (a cognitive system) must learn from its environment, e.g., by interacting with various interfaces. These interfaces can run the gamut from sensors to humans to databases. Accessing data through such interfaces allows the system to conduct cognitive tasks that can support humans in decision-making or problem-solving processes. Cognitive systems can be integrated into various domains (e.g., medicine or insurance). For example, a cognitive system in a city can collect data, learn from various data sources, and then attempt to connect these sources to provide real-time optimizations of subsystems within the city (e.g., the transportation system). In this study, we provide a methodology for integrating a cognitive system that allows data to be verbalized, making the causalities and hypotheses generated by the cognitive system more understandable to humans. We abstract a city subsystem (passenger flow for a taxi company) by applying fuzzy cognitive maps (FCMs). FCMs can be used as a mathematical tool for modeling complex systems built from directed graphs with concepts (e.g., policies, events, and/or domains) as nodes and causalities as edges. As a verbalization technique we introduce the restriction-centered theory of reasoning (RCT). RCT addresses the imprecision inherent in language by introducing restrictions. Using this underlying combinatorial design, our approach can handle large data sets from complex systems and make the output understandable to humans.
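An FCM inference step can be made concrete with a few lines: concept activations form a vector, causal edges a weight matrix, and the state is iterated through a squashing function until it stabilizes. The taxi-related concepts and weights below are invented for illustration, not the paper's model.

import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-x))

def run_fcm(W: np.ndarray, state: np.ndarray,
            steps: int = 100, tol: float = 1e-6) -> np.ndarray:
    """Iterate the FCM state until it stabilizes; the `+ state` term is
    the common self-memory variant of the update rule."""
    for _ in range(steps):
        new_state = sigmoid(W.T @ state + state)
        if np.max(np.abs(new_state - state)) < tol:
            return new_state
        state = new_state
    return state

concepts = ["rain", "traffic", "taxi demand"]
W = np.array([  # W[i, j]: causal influence of concept i on concept j
    [0.0, 0.7, 0.4],   # rain increases traffic and taxi demand
    [0.0, 0.0, 0.5],   # traffic increases taxi demand
    [0.0, 0.0, 0.0],
])
final = run_fcm(W, np.array([0.9, 0.1, 0.1]))
print(dict(zip(concepts, final.round(2))))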

Relevance: 90.00%

Abstract:

We present applicative theories of words corresponding to weak, and especially logarithmic, complexity classes. The theories for the logarithmic hierarchy and alternating logarithmic time formalise function algebras with concatenation recursion as the main principle. We present two theories for logarithmic space: the first formalises a new two-sorted algebra which is very similar to Cook and Bellantoni's famous two-sorted algebra B for polynomial time [4]; the second describes logarithmic space by formalising concatenation recursion and sharply bounded recursion. All theories contain the predicate W, representing words, and the predicate V, representing temporarily inaccessible words. They are inspired by Cantini's theories [6] formalising B.
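For reference, one standard formulation of concatenation recursion on notation, in the style of Clote's function algebras (the theories' exact scheme may differ in detail), defines f from g and bit-valued functions h_0, h_1 by:

\[
  f(\epsilon, \vec{x}) = g(\vec{x}), \qquad
  f(s_i(y), \vec{x}) = s_{h_i(y,\vec{x})}\bigl(f(y,\vec{x})\bigr)
  \quad \text{for } i \in \{0,1\},\ h_i(y,\vec{x}) \le 1,
\]

where s_0 and s_1 append the digit 0 or 1 to a word. Since each recursion step appends a single symbol, output growth stays linear in the recursion length, which is what keeps such algebras within small complexity classes.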

Relevance: 90.00%

Abstract:

This book attempts to synthesize research that contributes to a better understanding of how to reach sustainable business value through information systems (IS) outsourcing. Important topics in this realm are how IS outsourcing can contribute to innovation, how it can be dynamically governed, how to cope with its increasing complexity through multi-vendor arrangements, how service quality standards can be met, how corporate social responsibility can be upheld, and how to cope with increasing demands of internationalization and new sourcing models, such as crowdsourcing and platform-based cooperation. These issues are viewed from either the client or vendor perspective, or both. The book should be of interest to all academics and students in the fields of Information Systems, Management, and Organization as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.

Relevance: 90.00%

Abstract:

Information systems (IS) outsourcing projects often fail to achieve their initial goals. To avoid project failure, managers need to design formal controls that meet the specific contextual demands of the project. However, the dynamic and uncertain nature of IS outsourcing projects makes it difficult to design such specific formal controls at the outset of a project. It is hence crucial to translate high-level project goals into specific formal controls during the course of a project. This study seeks to understand the underlying patterns of such translation processes. Based on a comparative case study of four outsourced software development projects, we inductively develop a process model that consists of three unique patterns. The process model shows that the performance implications of emergent controls with higher specificity depend on differences in the translation process. Specific formal controls have positive implications for goal achievement if only the stakeholder context is adapted, but negative implications if tasks are unintentionally adapted in the translation process. In the latter case, projects incrementally drift away from their initial direction. Our findings help to better understand control dynamics in IS outsourcing projects. We contribute to a process-theoretic understanding of IS outsourcing governance, and we derive implications for control theory and the IS project escalation literature.