954 results for context-aware applications
Abstract:
The purpose of this project is to understand, under a social constructionist approach, the meanings that external facilitators and organizational members (sponsors) working with dialogic methods attribute to themselves and their work. Dialogic methods, whose objective is to engage groups in flows of conversation to envisage and co-create their own future, are growing fast within organizations as a means to achieve collective change. Sharing constructionist ideas about the possibility of multiple realities and about language as constitutive of such realities, dialogue has become a promising way to pursue transformation, especially in a macro context of constant change and increasing complexity, in which traditional structures, relationships and forms of work are questioned. Research on the topic has mostly focused on specific methods or applications, with few attempts to study it in a broader sense. Also, despite the fact that dialogic methods work on the assumption that realities are socially constructed, few studies approach the topic from a social constructionist perspective as a research methodology per se. Thus, while most existing research aims at explaining whether or how particular methods produce particular results, my intention is to explore the meanings sustaining these new forms of organizational practice. Data were collected through semi-structured interviews with 25 people working with dialogic methods: 11 facilitators and 14 sponsors from 8 different organizations in Brazil. Firstly, the research findings indicate several contextual elements that seem to sustain the choice of dialogic methods. Within this context, there does not seem to be a clear or specific demand for dialogic methods, but rather a set of different motivations, objectives and focuses, bringing about several contrasts in the way participants name, describe and explain their experiences with such methods, including tensions around power relations, knowledge creation, identity and communication. Secondly, some central ideas or images were identified within such contrasts, pointing in both directions: dialogic methods as opportunities for the creation of new organizational realities (with images of a ‘door’ or a ‘flow’, for instance, which suggest that dialogic methods may open up access to other perspectives and the creation of new realities); and dialogic methods as new instrumental mechanisms that seem to reproduce traditional, non-dialogical forms of work and relationship. The individualistic tradition and its tendency toward rational schematism (pointed out by social constructionist scholars as strong traditions in Western culture) could be observed in some participants’ accounts, with the image of dialogic methods as a ‘gym’, for instance, in which dialogical (and idealized) ‘abilities’ could be taught and trained, turning dialogue into a tool rather than a means for transformation. In conclusion, I discuss the implications of such taken-for-granted assumptions and offer some insights into dialogue (and dialogic methods) as ‘the art of being together’.
Abstract:
Internet applications such as media streaming, collaborative computing and massively multiplayer applications are on the rise. This leads to a need for multicast communication, but group communication support based on IP multicast has unfortunately not been widely adopted, due to a combination of technical and non-technical problems. Therefore, a number of different application-layer multicast schemes have been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services, and are thus called peer-to-peer applications, in which participants come and go very dynamically. Server-centric architectures for membership management therefore have well-known problems related to scalability and fault tolerance, and even traditional peer-to-peer solutions need some mechanism that takes members' volatility into account. The idea of location awareness is to distribute the participants in the overlay network according to their proximity in the underlying network, allowing better performance. Given this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account when assembling the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and it was implemented using a distributed, shared and bi-directional tree. The algorithm also has a sub-optimal heuristic to minimize the cost of the membership process. The protocol was evaluated in two ways. First, through a simulator developed in this work, in which we evaluated the quality of the distribution tree by metrics such as out-degree and path length. Second, real-life scenarios were built in the ns-3 network simulator, in which we evaluated the protocol's network performance by metrics such as stress, stretch, time to first packet and group reconfiguration time.
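Since the abstract does not spell out the protocol internals, the sketch below only illustrates the two tree-quality metrics it names, stress and stretch, on a toy overlay tree laid over a hypothetical physical topology; it is not the LAALM implementation, and all node names and edges are made up.

```python
# Illustrative sketch of the stress and stretch metrics for an application-layer
# multicast tree; topology, members and tree are hypothetical.
import networkx as nx
from collections import defaultdict

# Physical (underlying) network: nodes are hosts/routers, edges are links.
phys = nx.Graph()
phys.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])

source = "A"
overlay_tree = [("A", "C"), ("C", "E")]        # group members: A (source), C, E

# Stress: number of identical packet copies each physical link carries.
stress = defaultdict(int)
for u, v in overlay_tree:
    hops = nx.shortest_path(phys, u, v)        # route the overlay edge really takes
    for a, b in zip(hops, hops[1:]):
        stress[frozenset((a, b))] += 1

# Stretch: delivery path length via the overlay vs. direct unicast path length.
def overlay_hops(member):
    tree = nx.Graph(overlay_tree)
    route = nx.shortest_path(tree, source, member)   # member-level path in the tree
    return sum(nx.shortest_path_length(phys, u, v) for u, v in zip(route, route[1:]))

for member in ("C", "E"):
    direct = nx.shortest_path_length(phys, source, member)
    print(member, "stretch =", overlay_hops(member) / direct)
print("max link stress =", max(stress.values()))
```

A location-aware protocol such as the one proposed aims to keep both numbers low by attaching members to overlay neighbors that are close in the underlying network.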
Abstract:
By means of a mod(N)-invariant operator basis, s-parametrized phase-space functions associated with bounded operators in a finite-dimensional Hilbert space are introduced in the context of the extended Cahill-Glauber formalism, and their properties are discussed in detail. The discrete Glauber-Sudarshan, Wigner, and Husimi functions emerge from this formalism as specific cases of s-parametrized phase-space functions; in particular, a hierarchical process among them is promptly established. In addition, a phase-space description of quantum tomography and quantum teleportation is presented and new results are obtained.
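The discrete mod(N) construction itself is not reproducible from the abstract; for orientation only, the standard continuous-variable Cahill-Glauber definitions that the formalism extends are:

```latex
% Continuous-variable counterpart of the s-parametrized functions (standard
% Cahill-Glauber construction); the paper's mod(N)-invariant basis yields a
% discrete analogue of this.
F^{(s)}(\alpha) = \mathrm{Tr}\!\left[\hat{\rho}\,\hat{T}^{(s)}(\alpha)\right],
\qquad
\hat{T}^{(s)}(\alpha) = \frac{1}{\pi}\int d^{2}\xi\;
  e^{\tfrac{s}{2}|\xi|^{2}}\, e^{\alpha\bar{\xi}-\bar{\alpha}\xi}\,\hat{D}(\xi).
```

Setting s = 1, 0, -1 recovers the Glauber-Sudarshan P, Wigner, and Husimi Q functions, respectively, and Gaussian smoothing takes functions of larger s to smaller s, which is the hierarchical process the abstract refers to.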
Abstract:
By means of a well-established algebraic framework, Rogers-Szegő functions associated with a circular geometry in the complex plane are introduced in the context of q-special functions, and their properties are discussed in detail. The eigenfunctions related to the coherent and phase states emerge from this formalism as infinite expansions of Rogers-Szegő functions, the coefficients being determined through proper eigenvalue equations in each situation. Furthermore, a complementary study of the Robertson-Schrödinger and symmetrical uncertainty relations for the cosine, sine and nondeformed number operators is also conducted, corroborating in this way certain features of q-deformed coherent states.
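The circular-geometry functions used in the paper are not given here; as background only, the standard Rogers-Szegő polynomials on which such expansions are built are, in the usual convention:

```latex
% Standard Rogers-Szegő polynomials (usual convention); the coherent- and
% phase-state expansions mentioned in the abstract are built from functions
% of this family.
H_{n}(z;q) = \sum_{k=0}^{n} \binom{n}{k}_{\!q} z^{k},
\qquad
\binom{n}{k}_{\!q} = \frac{(q;q)_{n}}{(q;q)_{k}\,(q;q)_{n-k}},
\qquad
(q;q)_{n} = \prod_{j=1}^{n}\bigl(1-q^{j}\bigr).
```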
Abstract:
In recent years there has been exponential growth in the offering of Web-enabled distance courses and in the number of enrolments in corporate and higher education using this modality. However, the lack of efficient mechanisms to assure user authentication in this sort of environment, both at system login and throughout the session, has been pointed out as a serious deficiency. Some studies have been conducted on possible biometric applications for web authentication; however, password-based authentication still prevails. With the popularization of biometric-enabled devices and the resulting fall in prices for collecting biometric traits, biometrics is being reconsidered as a secure form of remote authentication for web applications. In this work, the accuracy of face recognition captured online by a webcam over the Internet is investigated, simulating the natural interaction of a person in the context of a distance-course environment. Partial results show that this technique can be successfully applied to confirm users' presence throughout attendance of a distance course. An efficient client/server architecture is also proposed. © 2009 Springer Berlin Heidelberg.
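The abstract does not specify the recognition pipeline or the client/server split; the sketch below only illustrates the kind of periodic webcam check it describes, assuming OpenCV for capture and the face_recognition library for embeddings. The library choice, distance threshold and file name are assumptions, not the paper's actual method.

```python
# Minimal sketch of periodic webcam-based identity confirmation during a
# distance-course session. Libraries, threshold and file names are assumed.
import cv2
import face_recognition

# Reference image enrolled at login (hypothetical file name).
reference = face_recognition.load_image_file("enrolled_user.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

def user_still_present(threshold=0.6):
    """Grab one webcam frame and compare it against the enrolled face."""
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return False                      # camera unavailable: treat as absent
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    encodings = face_recognition.face_encodings(rgb)
    if not encodings:
        return False                      # no face found in the frame
    distance = face_recognition.face_distance([reference_encoding], encodings[0])[0]
    return distance < threshold

if __name__ == "__main__":
    print("present" if user_still_present() else "absent")
```

In a client/server architecture like the one the paper proposes, the capture would happen on the client, while matching against the enrolled template could be done on either side depending on bandwidth and privacy constraints.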
Abstract:
This paper presents new inverter topologies based on the integration of a DC-DC Zeta or Cuk converter with a voltage source inverter (VSI). The proposed integration procedure aims to reduce the number of components, meaning lower volume, weight and cost. In this context, new families of single-phase and three-phase integrated inverters are also presented. Considering the novelty of the Zeta and Cuk integrated inverter structures, the proposed single-phase and three-phase versions are analyzed for grid-tied and stand-alone applications. Furthermore, in order to demonstrate the feasibility of the proposal, the main simulation and experimental results are presented. © 2011 IEEE.
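The integrated topologies themselves cannot be reconstructed from the abstract; for reference only, the textbook ideal continuous-conduction-mode static gains of the two DC-DC stages being integrated are:

```latex
% Ideal CCM static gains of the DC-DC stages named in the abstract
% (textbook values, not derived from the proposed integrated topologies).
\text{Zeta:}\quad \frac{V_o}{V_{in}} = \frac{D}{1-D},
\qquad
\text{Cuk:}\quad \frac{V_o}{V_{in}} = -\frac{D}{1-D},
\qquad 0 < D < 1 .
```

Both stages can step the voltage up or down depending on the duty cycle D, which is what makes them attractive front ends for grid-tied and stand-alone inverters.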
Abstract:
Digital data sets constitute rich sources of information, which can be extracted and evaluated by applying computational tools, for example those for Information Visualization. Web-based applications, such as social network environments, forums and virtual environments for Distance Learning, are good examples of such sources. The large amount of data has a direct impact on processing and analysis tasks. This paper presents the computational tool Mapper, defined and implemented to use visual representations (maps, graphics and diagrams) to support the decision-making process by analyzing data stored in the Virtual Learning Environment TelEduc-Unesp. © 2012 IEEE.
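Mapper's own visualizations are not described in implementable detail; the fragment below is only a generic illustration of the kind of visual summary built from VLE activity data, with hypothetical values and matplotlib assumed as the plotting backend.

```python
# Illustrative only: a bar chart summarizing per-student activity in a virtual
# learning environment, the kind of visual representation Mapper produces.
import matplotlib.pyplot as plt

# Hypothetical access counts extracted from VLE logs (student -> forum posts).
activity = {"student_01": 14, "student_02": 3, "student_03": 22, "student_04": 7}
names = list(activity)
counts = [activity[n] for n in names]

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(names, counts)
ax.set_ylabel("forum posts")
ax.set_title("Participation per student (hypothetical data)")
fig.tight_layout()
fig.savefig("participation.png")          # diagram consumed by the analyst
```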
Abstract:
Evolutionary algorithms have been widely used for Artificial Neural Network (ANN) training, the idea being to update the neurons' weights using the social dynamics of living organisms in order to decrease the classification error. In this paper, we introduce Social-Spider Optimization (SSO) to improve the training phase of ANNs with multilayer perceptrons (MLP), and we validate the proposed approach in the context of Parkinson's Disease recognition. The experimental section compares the proposal against five other well-known meta-heuristic techniques and shows that SSO can be a suitable approach for the ANN-MLP training step.
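The abstract does not give the SSO update equations, so the sketch below uses a generic population search (perturbation around the best candidate) purely as a stand-in to show what metaheuristic weight training of an MLP looks like; a real SSO implementation would replace the update step with the spiders' cooperative dynamics, and the data here is synthetic rather than the Parkinson's dataset.

```python
# Sketch of metaheuristic MLP weight training. The update rule is a generic
# population search, NOT the actual Social-Spider Optimization dynamics.
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic two-class problem (stand-in for the real dataset).
X = rng.normal(size=(100, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

N_HIDDEN = 8
N_WEIGHTS = 4 * N_HIDDEN + N_HIDDEN          # input->hidden plus hidden->output (no biases)

def forward(w, X):
    W1 = w[: 4 * N_HIDDEN].reshape(4, N_HIDDEN)
    w2 = w[4 * N_HIDDEN:]
    h = np.tanh(X @ W1)
    return 1.0 / (1.0 + np.exp(-(h @ w2)))   # sigmoid output

def error(w):
    pred = (forward(w, X) > 0.5).astype(float)
    return np.mean(pred != y)                 # classification error to minimize

# Population-based search over the flattened weight vector.
pop = rng.normal(size=(30, N_WEIGHTS))
for _ in range(200):
    fitness = np.array([error(ind) for ind in pop])
    best = pop[fitness.argmin()]
    # Generic move around the best individual (placeholder for the SSO rules).
    pop = best + rng.normal(scale=0.3, size=pop.shape)
    pop[0] = best                             # elitism: keep the best candidate

print("final classification error:", error(best))
```

The key idea shared with the paper is that the search operates directly on the weight vector and is guided only by the classification error, with no gradients involved.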
Abstract:
In this paper we study the intersection of Knowledge Organization with Information Technologies and the challenges and opportunities for Knowledge Organization experts that, in our view, are important for them to study and be aware of. We start by giving some definitions necessary to provide the context for our work. Then we review the history of the Web, beginning with the Internet and continuing with the World Wide Web, the Semantic Web, problems of Artificial Intelligence, Web 2.0, and Linked Data. Finally, we conclude our paper with IT applications for Knowledge Organization in libraries, such as FRBR, BIBFRAME, and several OCLC initiatives, as well as some of the challenges and opportunities in which Knowledge Organization experts and researchers might play a key role in relation to the Semantic Web.
Abstract:
The clinical use of platelet-rich plasma (PRP) is based on the increase in the concentration of growth factors and in the secretion of proteins able to maximize the healing process at the cellular level. Since PRP is an autologous biological material, it involves minimal risk of immune reactions and of transmission of infectious and contagious diseases, and it has been widely used for the recovery of musculoskeletal lesions. Despite its great potential for applicability, the implementation of PRP as a therapeutic alternative in clinical practice has been hindered by the lack of studies on the standardization of techniques and/or by insufficient description of the adopted procedures. Therefore, it is necessary to establish standard criteria to be followed for obtaining high-quality PRP, as well as a larger number of studies establishing the proper platelet concentration for different clinical conditions. In this context, the purpose of this review is to discuss some of the methodological aspects used for obtaining PRP, to discuss the bioactive properties of PRP, and to point out its therapeutic use in different fields of regenerative medicine.
Abstract:
Multicommodity flow (MF) problems have a wide variety of applications in areas such as VLSI circuit design and network design, and are therefore very well studied. The fractional MF problems are solvable in polynomial time, while the integer versions are NP-complete. However, exact algorithms for the fractional MF problems have high computational complexity, so approximation algorithms have been explored in the literature to reduce it. Using these approximation algorithms and the randomized rounding technique, polynomial-time approximation algorithms for the integer versions have also been explored in the literature. In the design of high-speed networks, such as optical wavelength division multiplexing (WDM) networks, providing survivability carries great significance. Survivability is the ability of the network to recover from failures. It further increases the complexity of network design and presents network designers with more formidable challenges. In this work we formulate the survivable versions of the MF problems. We build approximation algorithms for the survivable multicommodity flow (SMF) problems based on the framework of the approximation algorithms for the MF problems presented in [1] and [2]. We discuss applications of the SMF problems to solve survivable routing in capacitated networks.
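The survivable formulations are the paper's contribution and are not reproduced from the abstract; for reference, the standard path-based LP for fractional maximum multicommodity flow, which the cited approximation frameworks target and on top of which the SMF variants add constraints, is:

```latex
% Standard path-based LP for fractional maximum multicommodity flow, where
% \mathcal{P}_i is the set of paths between the i-th source-sink pair and
% c(e) is the capacity of edge e.
\begin{aligned}
\max\quad & \sum_{i=1}^{k} \sum_{P \in \mathcal{P}_i} x_P \\
\text{s.t.}\quad & \sum_{i=1}^{k} \sum_{P \in \mathcal{P}_i :\, e \in P} x_P \;\le\; c(e)
  && \forall e \in E, \\
& x_P \ge 0 && \forall P .
\end{aligned}
```

Although the number of path variables is exponential, fully polynomial approximation schemes of the kind referenced in [1] and [2] avoid enumerating them by repeatedly routing flow along shortest paths under exponentially updated edge lengths.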
Abstract:
End-user programmers are increasingly relying on web authoring environments to create web applications. Although often consisting primarily of web pages, such applications increasingly go further, harnessing content available on the web through “programs” that query other web applications for information to drive other tasks. Unfortunately, errors can be pervasive in web applications, impacting their dependability. This paper reports the results of an exploratory study of end-user web application developers, performed with the aim of exposing prevalent classes of errors. The results suggest that end-users struggle the most with the identification and manipulation of variables when structuring requests to obtain data from other web sites. To address this problem, we present a family of techniques that help end-user programmers perform this task, reducing possible sources of error. The techniques focus on simplifying and characterizing the data that end-users must analyze while developing their web applications. We report the results of an empirical study in which these techniques were applied to several popular web sites. Our results reveal several potential benefits for end-users who wish to “engineer” dependable web applications.
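The study's concrete techniques are not detailed in the abstract; the fragment below only illustrates the task its participants found error-prone, structuring a parameterized request to another site and extracting a value from the response. The URL, parameter names and response fields are hypothetical.

```python
# Sketch of the error-prone task the abstract describes: building a
# parameterized request to another web site and extracting a value from the
# response. URL, parameters and response structure are hypothetical.
import requests

def fetch_temperature(city):
    # Keeping the request parameters in one explicit dict is one way to avoid
    # the variable-identification errors the study observed.
    params = {"q": city, "units": "metric"}
    resp = requests.get("https://api.example.com/weather", params=params, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    return data["main"]["temp"]               # hypothetical response field

if __name__ == "__main__":
    print(fetch_temperature("Sao Paulo"))
```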