25 results for Distributed knowledge
Abstract:
This work develops a new model of Absorptive Capacity that takes two variables into account, namely learning and knowledge, to explain how companies transform information into knowledge.
Abstract:
Today, information overload and the lack of systems that enable locating employees with the right knowledge or skills are common challenges that large organisations face. As a result, knowledge workers re-invent the wheel and struggle to retrieve information from both internal and external resources. In addition, information is changing dynamically, and ownership of data is moving from corporations to individuals. However, there is a set of web-based tools that may bring about major progress in the way people collaborate and share their knowledge. This article aims to analyse the impact of ‘Web 2.0’ on organisational knowledge strategies. A comprehensive literature review is presented to establish the academic background, followed by a review of current ‘Web 2.0’ technologies and an assessment of their strengths and weaknesses. As the framework of this study is oriented towards business applications, the characteristics of the relevant segments and tools are reviewed from an organisational point of view. Moreover, the ‘Enterprise 2.0’ paradigm does not only imply tools; it also changes the way people collaborate and the way work is done (processes), and it impacts other technologies. Finally, gaps in the literature in this area are outlined.
Abstract:
One of the most efficient approaches to generating the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated from past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it is useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second corresponds to a motion compensated quality enhancement (MCQE) technique in which a low-quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. For blocks where MCI produces SI with lower correlation, the novel MCQE mode can be advantageous overall from a rate-distortion point of view, even if some rate has to be invested in the low-quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements up to 2 dB, especially for high-motion video sequences and long Group of Pictures (GOP) sizes.
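A minimal Python sketch of the two-mode, block-level SI decision described above. The block size, search radius, threshold T, and the simple averaging interpolation are illustrative assumptions, not values or details from the paper:

```python
import numpy as np

BS, RADIUS, T = 8, 4, 2000  # block size, search radius, mode threshold (illustrative)

def sad(a, b):
    # sum of absolute differences between two equally sized blocks
    return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

def best_match(target, ref, y, x):
    # full search in `ref` around block position (y, x)
    best, cost = None, float("inf")
    h, w = ref.shape
    for dy in range(-RADIUS, RADIUS + 1):
        for dx in range(-RADIUS, RADIUS + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy <= h - BS and 0 <= xx <= w - BS:
                c = sad(target, ref[yy:yy + BS, xx:xx + BS])
                if c < cost:
                    best, cost = ref[yy:yy + BS, xx:xx + BS], c
    return best, cost

def side_info_block(past, future, y, x, intra_block=None):
    # MCI mode: match the co-located future block in the past frame, then
    # interpolate; switch to MCQE when the match is poor and a low-quality
    # Intra block was received from the encoder for this position
    fut = future[y:y + BS, x:x + BS]
    match, cost = best_match(fut, past, y, x)
    if cost <= T or intra_block is None:
        return ((match.astype(np.int32) + fut.astype(np.int32)) // 2).astype(np.uint8)
    # MCQE mode: refine the Intra block by motion estimation in the reference
    refined, _ = best_match(intra_block, past, y, x)
    return ((refined.astype(np.int32) + intra_block.astype(np.int32)) // 2).astype(np.uint8)

rng = np.random.default_rng(0)
past = rng.integers(0, 256, (32, 32), dtype=np.uint8)
future = rng.integers(0, 256, (32, 32), dtype=np.uint8)
print(side_info_block(past, future, 8, 8).shape)  # (8, 8) SI block
```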
Abstract:
Motion compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion compensation errors in some frame regions and rather small errors in others, depending on the video content. In this paper, a low-complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and help the decoder with some reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra-based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements up to 1.2 dB over a WZ-only coding solution.
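A small sketch of the per-block mode decision idea: estimate a rate for each mode and keep the cheaper one. The two rate proxies below (pixel entropy for Intra, temporal difference for WZ) are assumptions for illustration; the paper's actual estimators are not reproduced here:

```python
import numpy as np

def intra_rate_bits(block):
    # zeroth-order entropy of the block's pixels: a crude Intra rate proxy
    hist = np.bincount(block.ravel(), minlength=256).astype(np.float64)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum()) * block.size

def wz_rate_bits(block, co_located):
    # temporal difference with the co-located block of the previous frame:
    # the worse the expected SI, the more parity bits the decoder requests
    diff = np.abs(block.astype(np.int32) - co_located.astype(np.int32))
    return float(np.log2(1.0 + diff.mean())) * block.size

def select_mode(block, co_located):
    # low-complexity decision: keep the mode with the lower estimated rate
    return "intra" if intra_rate_bits(block) < wz_rate_bits(block, co_located) else "wz"

rng = np.random.default_rng(1)
prev = rng.integers(0, 256, (8, 8), dtype=np.uint8)
static = prev.copy()                                   # well-predicted block
moving = rng.integers(0, 256, (8, 8), dtype=np.uint8)  # 'critical' block
print(select_mode(static, prev), select_mode(moving, prev))  # wz intra
```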
Abstract:
Knowledge of forced magma injection and magma flow in dykes is crucial for understanding how magmas migrate through the crust to the Earth's surface. Because many questions still persist, we used the long, thick, and deep-seated Foum Zguid dyke (Morocco) to investigate dyke emplacement and internal flow by means of magnetic methods, structural analysis, petrography, and scanning electron microscopy. We also investigated how the host rocks accommodated the intrusion. Regarding internal flow: (1) important variations of the rock magnetic properties and magnetic fabric occur with distance from the dyke wall; (2) anisotropy of anhysteretic remanent magnetization reveals that the anisotropy of magnetic susceptibility (AMS) results mainly from the superposition of subfabrics with distinct coercivities, and that the imbrication between the magnetic foliation and the dyke plane is more reliable for deducing flow than the orientation of the AMS maximum principal axis; and (3) a dominant upward flow near the margins can be inferred. The magnetic fabric closest to the dyke wall likely records magma flow best owing to fast cooling, whereas in the core the magnetic properties have been affected by high-temperature exsolution and metasomatic effects due to slow cooling. Regarding dyke emplacement, this study shows that the thick forceful intrusion induced deformation by homogeneous flattening and/or folding of the host sedimentary strata. Dewatering related to heat, as recorded by thick quartz veins bordering the dyke in some localities, may also have helped to accommodate the dyke intrusion. The spatial arrangement of the quartz veins and their geometrical relationship with the dyke indicate a preintrusive to synintrusive sinistral strike-slip component.
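For readers unfamiliar with AMS: the susceptibility is a symmetric second-rank tensor whose eigenvectors give the principal axes (k1 >= k2 >= k3), and the magnetic foliation is the plane normal to k3; the imbrication discussed above is the angle between that foliation and the dyke plane. A small Python sketch with a purely illustrative tensor and dyke-plane normal:

```python
import numpy as np

def ams_principal_axes(K):
    # eigen-decomposition of the symmetric susceptibility tensor K (3x3);
    # eigenvalues sorted so that k1 >= k2 >= k3
    vals, vecs = np.linalg.eigh(K)
    order = np.argsort(vals)[::-1]
    return vals[order], vecs[:, order]

def foliation_dyke_angle(K, dyke_normal):
    # foliation plane is normal to k3; the angle between two planes equals
    # the angle between their normals
    _, vecs = ams_principal_axes(K)
    k3 = vecs[:, 2]
    n = np.asarray(dyke_normal, dtype=float)
    n /= np.linalg.norm(n)
    cosang = abs(float(k3 @ n))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

K = np.array([[1.02, 0.01, 0.00],
              [0.01, 1.00, 0.02],
              [0.00, 0.02, 0.97]])          # illustrative tensor
print(foliation_dyke_angle(K, [1.0, 0.0, 0.0]))  # dyke plane normal along x
```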
Abstract:
Integrated manufacturing constitutes a complex system made of heterogeneous information and control subsystems. Those subsystems are not designed for cooperation: typically, each subsystem automates specific processes and establishes closed application domains, so it is very difficult to integrate them with other subsystems in order to respond to the required process dynamics. Furthermore, to cope with ever-growing market competition and demands, manufacturing/enterprise systems must increase their responsiveness based on up-to-date knowledge and in-time data gathered from the diverse information and control systems. This has created new challenges for the manufacturing sector, and even bigger challenges for collaborative manufacturing. The growing complexity of information and communication technologies, when coping with innovative business services based on collaborative contributions from multiple stakeholders, requires novel and multidisciplinary approaches. Service orientation is a strategic approach to deal with such complexity and with the various stakeholders' information systems. Services, or more precisely the autonomous computational agents implementing them, provide an architectural pattern able to cope with the needs of integrated and distributed collaborative solutions. This paper proposes a service-oriented framework aiming to support a virtual organizations breeding environment, which is the basis for establishing short- or long-term goal-oriented virtual organizations. A key element is the notion of integrated business services, whereby customers receive value developed through the contribution of a network of companies.
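The architectural pattern can be illustrated with a minimal, hypothetical Python sketch (all class and function names are assumptions, not the paper's framework): agents expose services, a breeding environment registers them, and a virtual organization chains their contributions into one integrated business service.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ServiceAgent:
    # an autonomous computational agent exposing one business service
    name: str
    service: Callable[[dict], dict]

@dataclass
class BreedingEnvironment:
    # registry of available agents from which virtual organizations are formed
    agents: Dict[str, ServiceAgent] = field(default_factory=dict)

    def register(self, agent: ServiceAgent) -> None:
        self.agents[agent.name] = agent

    def form_virtual_organization(self, names: List[str]) -> List[ServiceAgent]:
        # goal-oriented selection of member services for a short- or long-term VO
        return [self.agents[n] for n in names if n in self.agents]

def integrated_business_service(vo: List[ServiceAgent], order: dict) -> dict:
    # the value delivered to the customer is the chained contribution of the network
    for agent in vo:
        order = agent.service(order)
    return order

be = BreedingEnvironment()
be.register(ServiceAgent("design", lambda o: {**o, "designed": True}))
be.register(ServiceAgent("manufacture", lambda o: {**o, "built": True}))
vo = be.form_virtual_organization(["design", "manufacture"])
print(integrated_business_service(vo, {"order": 42}))
```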
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each correlated with the source according to a different correlation noise channel. It is therefore proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
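The fusion principle can be sketched in a few lines: when each SI hypothesis is modelled as the output of an independent correlation channel, the per-bit log-likelihood ratios simply add before (or inside) belief propagation. The binary-symmetric channel model and the crossover probabilities below are illustrative assumptions; the paper's scheme exchanges information between full syndrome decoders, which this sketch does not reproduce:

```python
import numpy as np

def llr_from_hypothesis(si_bits, crossover_p):
    # per-bit log-likelihood ratios LLR = log P(x=0|y) / P(x=1|y)
    # under a binary symmetric correlation channel with crossover probability p
    l = np.log((1.0 - crossover_p) / crossover_p)
    return np.where(si_bits == 0, l, -l)

def fuse_hypotheses(si_list, p_list):
    # independent channels -> the LLRs of the hypotheses add up
    return sum(llr_from_hypothesis(si, p) for si, p in zip(si_list, p_list))

# two SI hypotheses for the same 8-bit source, with different reliabilities
si_a = np.array([0, 1, 1, 0, 0, 1, 0, 1])
si_b = np.array([0, 1, 0, 0, 0, 1, 1, 1])
llr = fuse_hypotheses([si_a, si_b], [0.05, 0.15])
hard = (llr < 0).astype(int)   # hard decision fed to the LDPC decoder
print(llr, hard)
```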
Abstract:
This work describes a methodology to extract symbolic rules from trained neural networks. In our approach, patterns in the network are codified using formulas in a Lukasiewicz logic. For this we take advantage of the fact that every connective in this multi-valued logic can be evaluated by a neuron in an artificial network having as activation function the identity truncated to zero and one. This fact simplifies symbolic rule extraction and allows the easy injection of formulas into a network architecture. We trained this type of neural network using a back-propagation algorithm based on the Levenberg-Marquardt algorithm, where in each learning iteration we restricted the knowledge dissemination in the network structure. This makes the descriptive power of the produced neural networks similar to the descriptive power of the Lukasiewicz logic language, minimizing the information loss in the translation between connectionist and symbolic structures. To avoid redundancy in the generated networks, the method simplifies them in a pruning phase using the "Optimal Brain Surgeon" algorithm. We tested this method on the task of finding the formula used in the generation of a given truth table. For tests on real data, we selected the Mushroom data set, available in the UCI Machine Learning Repository.
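The key fact the abstract relies on is easy to verify: each Lukasiewicz connective is exactly one neuron whose activation is the identity truncated to [0, 1]. A minimal sketch (the weights and biases follow directly from the connectives' definitions):

```python
import numpy as np

def trunc_id(z):
    # activation function: the identity truncated to zero and one
    return np.clip(z, 0.0, 1.0)

def neuron(weights, bias, inputs):
    return float(trunc_id(np.dot(weights, inputs) + bias))

def luk_and(x, y):      # strong conjunction: max(0, x + y - 1)
    return neuron([1.0, 1.0], -1.0, [x, y])

def luk_or(x, y):       # strong disjunction: min(1, x + y)
    return neuron([1.0, 1.0], 0.0, [x, y])

def luk_implies(x, y):  # implication: min(1, 1 - x + y)
    return neuron([-1.0, 1.0], 1.0, [x, y])

def luk_not(x):         # negation: 1 - x
    return neuron([-1.0], 1.0, [x])

print(luk_and(0.7, 0.6), luk_or(0.7, 0.6), luk_implies(0.7, 0.6))  # 0.3 1.0 0.9
```

Because each connective is a single neuron, a formula injects into the network as a wiring of such neurons, and conversely a trained network of this shape reads back as a formula, which is what makes the rule extraction tractable.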
Abstract:
Master's degree in Physiotherapy.
Abstract:
A replicate evaluation of increased micronucleus (MN) frequencies in peripheral lymphocytes of workers occupationally exposed to formaldehyde (FA) was undertaken to verify the observed effect and to determine scoring variability. May–Grünwald–Giemsa-stained slides were obtained from a previously performed cytokinesis-block micronucleus test (CBMNT) with 56 workers in anatomy and pathology laboratories and 85 controls. The first evaluation by one scorer (scorer 1) had led to a highly significant difference between workers and controls (3.96 vs 0.81 MN per 1000 cells). The slides were coded before re-evaluation and the code was broken after the complete re-evaluation of the study. A total of 1000 binucleated cells (BNC) were analysed per subject and the frequency of MN (in ‰) was determined. Slides were distributed equally and randomly between two scorers, so that the scorers had no knowledge of the exposure status. Scorer 2 (32 exposed, 36 controls) measured increased MN frequencies in exposed workers (9.88 vs 6.81). Statistical analysis with the two-sample Wilcoxon test indicated that this difference was not significant (p = 0.17). Scorer 3 (20 exposed, 46 controls) obtained a similar result, but slightly higher values for the comparison of exposed and controls (19.0 vs 12.89; p = 0.089). Combining the results of the two scorers (13.38 vs 10.22), a significant difference between exposed and controls (p = 0.028) was obtained when the stratified Wilcoxon test with the scorers as strata was applied. Interestingly, the re-evaluation of the slides led to clearly higher MN frequencies for exposed and controls compared with the first evaluation. Bland–Altman plots indicated that the agreement between the measurements of the different scorers was very poor, as shown by mean differences of 5.9 between scorer 1 and scorer 2 and 13.0 between scorer 1 and scorer 3. Calculation of the intra-class correlation coefficient (ICC) revealed that all scorer comparisons in this study were far from acceptable for the reliability of this assay. Possible implications for the use of the CBMNT in human biomonitoring studies are discussed.
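The two statistical tools named above are standard and easy to reproduce. A short Python sketch using SciPy's two-sample Wilcoxon rank-sum test and a Bland-Altman style agreement summary; the MN frequencies below are invented for illustration, not the study's data:

```python
import numpy as np
from scipy.stats import ranksums

# illustrative MN frequencies (per 1000 binucleated cells), not the study data
exposed = np.array([9.0, 12.0, 8.0, 15.0, 11.0, 7.0])
controls = np.array([6.0, 9.0, 5.0, 8.0, 7.0, 10.0])

stat, p = ranksums(exposed, controls)  # two-sample Wilcoxon rank-sum test
print(f"rank-sum statistic={stat:.2f}, p={p:.3f}")

def bland_altman(m1, m2):
    # agreement between two scorers measuring the same slides
    diff = m1 - m2
    bias = diff.mean()                # mean difference (systematic bias)
    loa = 1.96 * diff.std(ddof=1)     # 95% limits of agreement
    return bias, bias - loa, bias + loa

scorer1 = np.array([4.0, 3.0, 5.0, 2.0, 6.0])
scorer2 = np.array([10.0, 8.0, 11.0, 9.0, 12.0])
print(bland_altman(scorer1, scorer2))  # large bias -> poor inter-scorer agreement
```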
Abstract:
This paper presents a distributed model predictive control (DMPC) scheme for indoor thermal comfort that simultaneously optimizes the consumption of a limited shared energy resource. The control objective of each subsystem is to minimize the heating/cooling energy cost while keeping the indoor temperature and the power used within bounds. In a distributed coordinated environment, the control uses multiple dynamically decoupled agents (one per subsystem/house) aiming to satisfy the coupling constraints. According to its hourly power demand profile, each house assigns a priority level that indicates how much it is willing to bid in an auction to consume the limited clean resource. This procedure allows the bidding value to vary hourly and, consequently, the order in which the agents access the clean energy also varies. In addition to the power constraints, all houses also have thermal comfort constraints that must be fulfilled. The system is simulated with several houses in a distributed environment.
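A minimal sketch of the hourly auction mechanism: houses are ordered by their priority bids and the limited clean resource is granted in that order. The bid values, the greedy allocation rule, and all names are illustrative assumptions, not the paper's algorithm:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class House:
    name: str
    demand: float  # requested power for this hour (kW)
    bid: float     # priority level derived from the hourly demand profile

def allocate_clean_energy(houses: List[House], available: float) -> Dict[str, float]:
    # auction: the highest bidder accesses the limited clean resource first
    allocation = {}
    for h in sorted(houses, key=lambda h: h.bid, reverse=True):
        granted = min(h.demand, available)
        allocation[h.name] = granted
        available -= granted
    return allocation

houses = [House("A", 3.0, 0.8), House("B", 2.0, 1.2), House("C", 4.0, 0.5)]
print(allocate_clean_energy(houses, 5.0))  # B served first, then A; C gets nothing
```

Re-running the allocation every hour with new bids reproduces the behaviour described in the abstract: the access order changes as the bids change.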
Abstract:
In distributed video coding, motion estimation is typically performed at the decoder to generate the side information, increasing the decoder complexity while providing low-complexity encoding in comparison with predictive video coding. Motion estimation can be performed once to create the side information or several times to refine its quality along the decoding process. In this paper, motion estimation is performed at the decoder side to generate multiple side information hypotheses, which are adaptively and dynamically combined whenever additional decoded information is available. The proposed iterative side information creation algorithm is inspired by video denoising filters and requires some statistics of the virtual channel between each side information hypothesis and the original data. With the proposed denoising algorithm for side information creation, an RD performance gain of up to 1.2 dB is obtained for the same bitrate.
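One classic denoising-style way to combine hypotheses using channel statistics is inverse-variance weighting: each hypothesis is weighted inversely to its estimated virtual-channel noise. This is a sketch of the principle under that assumption, not the paper's exact combination rule:

```python
import numpy as np

def fuse_side_info(hypotheses, noise_vars):
    # inverse-variance weighting: more reliable hypotheses weigh more,
    # in the spirit of denoising filters (weights sum to one)
    w = 1.0 / np.asarray(noise_vars, dtype=np.float64)
    w /= w.sum()
    stack = np.stack([h.astype(np.float64) for h in hypotheses])
    return np.tensordot(w, stack, axes=1)

# two SI hypotheses for a 2x2 block; the second is noisier
h1 = np.array([[100, 102], [ 98, 101]])
h2 = np.array([[110, 112], [108, 111]])
fused = fuse_side_info([h1, h2], noise_vars=[4.0, 16.0])
print(fused)  # closer to h1, which has the smaller estimated channel noise
```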
Abstract:
Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit error rate performance and low complexity. In order to exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error-correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to gracefully change the compression ratio of the source (DCT coefficient bitplane) according to the correlation between the original and the side information. The proposed LDPC codes reach a good performance for a wide range of source correlations and achieve a better RD performance when compared to the popular turbo codes.
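In matrix terms, merging two parity-check nodes amounts to replacing two rows of H by their mod-2 sum and XOR-ing the corresponding syndrome bits, so one fewer syndrome bit is sent and the compression ratio increases. A tiny sketch of this row-merging idea (the matrix and source vector are illustrative, and real designs choose which nodes to merge much more carefully):

```python
import numpy as np

def merge_checks(H, s, i, j):
    # merge parity-check rows i and j: the new row is their mod-2 sum,
    # and the new syndrome bit is the XOR of the two syndrome bits
    merged_row = (H[i] + H[j]) % 2
    merged_bit = (s[i] + s[j]) % 2
    keep = [k for k in range(H.shape[0]) if k not in (i, j)]
    H2 = np.vstack([H[keep], merged_row])
    s2 = np.append(s[keep], merged_bit)
    return H2, s2  # one fewer syndrome bit -> higher compression

H = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]])
x = np.array([1, 0, 1, 1])
s = H @ x % 2                    # syndrome sent by the encoder
H2, s2 = merge_checks(H, s, 0, 1)
assert np.all(H2 @ x % 2 == s2)  # the merged code still checks the source
print(H2, s2)
```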
Abstract:
Processes are a central entity in enterprise collaboration. Collaborative processes need to be executed and coordinated on a distributed computational platform where computers are connected through heterogeneous networks and systems. Life cycle management of such collaborative processes requires a framework able to handle their diversity based on different computational and communication requirements. This paper proposes a rationale for such a framework, points out key requirements, and proposes a strategy for a supporting technological infrastructure. Beyond the portability of collaborative process definitions among different technological bindings, a framework to handle the different life cycle phases of those definitions is presented and discussed.
Abstract:
This paper presents a distributed predictive control methodology for indoor thermal comfort that optimizes the consumption of a limited shared energy resource using an integrated demand-side management approach involving a power price auction and an appliance load allocation scheme. The control objective of each subsystem (house or building) is to minimize the energy cost while maintaining the indoor temperature within comfort limits. In a distributed coordinated multi-agent ecosystem, each house or building control agent achieves its objectives while sharing the available energy with the others through the introduction of particular coupling constraints in its underlying optimization problem. Coordination is maintained by a daily green energy auction that brings in a demand-side management approach. The implemented distributed MPC algorithm is also described and validated with simulation studies.
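To make the coupling constraint concrete, here is a centralized one-step illustration of the shared-resource problem (not the distributed algorithm of the paper): each house has first-order thermal dynamics and comfort bounds, and the summed heating power must respect the available green energy. All coefficients, bounds, and prices are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog

# one-step sketch: each house i has first-order thermal dynamics
#   T_next[i] = a[i] * T[i] + b[i] * u[i] + d[i]
# with comfort bounds Tmin <= T_next[i] <= Tmax, and the coupling
# constraint sum(u) <= P_avail shares the limited green energy
a = np.array([0.9, 0.85, 0.92])
b = np.array([0.5, 0.6, 0.4])
d = np.array([1.0, 0.8, 1.2])        # outdoor/disturbance term
T = np.array([19.0, 18.5, 19.5])     # current indoor temperatures
Tmin, Tmax, P_avail = 20.0, 23.0, 12.0
price = np.array([1.0, 1.2, 0.9])    # energy cost per unit power

# per-house power bounds implied by the comfort band
lo = (Tmin - a * T - d) / b
hi = (Tmax - a * T - d) / b
bounds = [(max(l, 0.0), h) for l, h in zip(lo, hi)]

# minimize total cost subject to the shared-resource coupling constraint
res = linprog(price, A_ub=np.ones((1, 3)), b_ub=[P_avail], bounds=bounds)
print(res.x, res.x.sum())  # heating powers; total respects the shared limit
```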