804 results for Data mining and knowledge discovery


Relevance:

100.00%

Publisher:

Abstract:

The chlorophyll fluorescence kinetics of the marine red alga Grateloupia turuturu Yamada, the green alga Ulva pertusa Kjellm and the brown alga Laminaria japonica Aresch during natural sustained dehydration were monitored and investigated. The pulse amplitude modulation (PAM) system was used to analyze the distinct fluorescence parameters during thallus dehydration. The results showed that the fluorescence kinetics of the different seaweeds all passed through three patterns with sustained water loss: 1) a peak kinetic pattern (at the early stage of dehydration fluorescence rose and subsequently quenched, representing a normal physiological state); 2) a plateau kinetic pattern (with sustained water loss fluorescence continued to rise but quenching became slower, finally reaching its maximum); and 3) a platform kinetic pattern (fluorescence fell, with a kinetic curve similar in shape to the plateau pattern). A critical water content (CWC) could be identified, defined as the percentage water content just prior to the fluorescence drop, and is proposed as a significant physiological index for evaluating plant drought tolerance. Once thallus water content fell below this value the normal peak pattern could not be recovered even by rehydration, indicating irreversible damage to the thylakoid membrane. The CWC values of the different marine species varied and were negatively correlated with their desiccation tolerance; for example, Laminaria japonica had the highest CWC value (around 90%) and the lowest dehydration tolerance of the three. In addition, a fluorescence "burst" was found only in the red alga during rehydration. The fluorescence parameters F-o, F-v and F-v/F-m were measured and compared during water loss. Both F-o and F-v increased in the first stage of dehydration while F-v/F-m remained almost constant, so the immediate response of in vivo chlorophyll fluorescence to dehydration was an enhancement. Later, with sustained dehydration, F-o increased continuously while F-v decreased progressively. The major changes in fluorescence (including the fluorescence drop during dehydration and the burst during rehydration) were all attributed to changes in F-o rather than F-v. The significance of F-o indicates that more research is needed on F-o and on its relationship with the state of the thylakoid membrane.
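
As an illustration of how the reported parameters relate, the sketch below (hypothetical helper functions, not the authors' analysis code) computes F-v/F-m from F-o and F-m readings and picks a critical water content as the last water content recorded before the fluorescence signal drops below a chosen fraction of its maximum.

```python
import numpy as np

def fv_over_fm(f_o, f_m):
    """Maximum PSII quantum yield: Fv/Fm = (Fm - Fo) / Fm."""
    f_o, f_m = np.asarray(f_o, float), np.asarray(f_m, float)
    return (f_m - f_o) / f_m

def critical_water_content(water_content, fluorescence, drop_fraction=0.9):
    """Last water content (%) recorded before fluorescence falls below
    drop_fraction of its maximum; readings must be ordered in time as
    dehydration proceeds, and drop_fraction is an illustrative choice."""
    water_content = np.asarray(water_content, float)
    fluorescence = np.asarray(fluorescence, float)
    below = fluorescence < drop_fraction * fluorescence.max()
    if not below.any():
        return None                            # no drop observed
    first_drop = int(np.argmax(below))         # index of first sub-threshold reading
    return water_content[max(first_drop - 1, 0)]
```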

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a data acquisition and information fusion technique for the navigation sensors of a manned submersible. The navigation control computer acquires data from each sensor over a data communication system based on industrial Ethernet, and a Kalman filter is used to fuse the sensor data in order to improve data accuracy and control-system performance; the fused results are then sent to the monitoring computer for attitude display of the manned submersible.
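
As a rough illustration of the fusion step described above, here is a minimal scalar Kalman filter (a generic textbook sketch, not the submersible's actual control code; the variable names and noise values are invented):

```python
import numpy as np

def kalman_fuse(measurements, meas_var, process_var, x0=0.0, p0=1.0):
    """Scalar Kalman filter: fold a stream of noisy readings into one estimate."""
    x, p = x0, p0                        # state estimate and its variance
    estimates = []
    for z in measurements:
        p += process_var                 # predict: random-walk model, uncertainty grows
        k = p / (p + meas_var)           # Kalman gain
        x += k * (z - x)                 # update with the new measurement
        p *= (1.0 - k)
        estimates.append(x)
    return np.array(estimates)

# example: a noisy heading-style reading hovering around 5.0
readings = 5.0 + 0.3 * np.random.randn(50)
print(kalman_fuse(readings, meas_var=0.3 ** 2, process_var=1e-4)[-5:])
```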

Relevance:

100.00%

Publisher:

Abstract:

This report describes a system which maintains canonical expressions for designators under a set of equalities. Substitution is used to maintain all knowledge in terms of these canonical expressions. A partial order on designators, termed the better-name relation, is used in the choice of canonical expressions. It is shown that with an appropriate better-name relation an important engineering reasoning technique, propagation of constraints, can be implemented as a special case of this substitution process. Special-purpose algebraic simplification procedures are embedded such that they interact effectively with the equality system. An electrical circuit analysis system is developed which relies upon constraint propagation and algebraic simplification as primary reasoning techniques. The reasoning is guided by a better-name relation in which referentially transparent terms are preferred to referentially opaque ones. Multiple descriptions of subcircuits are shown to interact strongly with the reasoning mechanism.
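
A minimal sketch of constraint propagation in the circuit-analysis spirit described above (a toy Python propagator for the product constraint V = I * R; the class names and structure are illustrative, not the report's actual system):

```python
class Cell:
    """A quantity whose value is filled in by constraint propagation."""
    def __init__(self, name):
        self.name, self.value, self.constraints = name, None, []
    def set(self, value):
        if self.value is None:
            self.value = value
            for c in self.constraints:   # inform attached constraints
                c.propagate()
        elif abs(self.value - value) > 1e-9:
            raise ValueError(f"contradiction at {self.name}")

class Product:
    """Constraint a = b * c, e.g. Ohm's law V = I * R."""
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for cell in (a, b, c):
            cell.constraints.append(self)
    def propagate(self):
        a, b, c = self.a, self.b, self.c
        if b.value is not None and c.value is not None:
            a.set(b.value * c.value)
        elif a.value is not None and b.value not in (None, 0):
            c.set(a.value / b.value)
        elif a.value is not None and c.value not in (None, 0):
            b.set(a.value / c.value)

# knowing any two of V, I, R determines the third
v, i, r = Cell("V"), Cell("I"), Cell("R")
Product(v, i, r)
r.set(100.0)
i.set(0.05)
print(v.value)   # 5.0
```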

Relevance:

100.00%

Publisher:

Abstract:

Pyatt, B., Gilmore, G., Grattan, J., Hunt, C. & McLaren, S. (2000). An imperial legacy? An exploration of the environmental impact of ancient metal mining and smelting in southern Jordan. Journal of Archaeological Science, 27, 771-778.

Relevance:

100.00%

Publisher:

Abstract:

Seeley, H. & Urqhart, C. (2007). Action research in developing knowledge networks. In P. Bath, K. Albright & T. Norris (Eds.), Proceedings of ISHIMR 2007, The twelfth international symposium on health information management research (pp. 217-235). Sheffield: Centre for Health Information Management Research, University of Sheffield.

Relevance:

100.00%

Publisher:

Abstract:

'Data retention and the war against terrorism - a considered and proportionate response'. Journal of Information Law & Technology, 2004 (3).

Relevance:

100.00%

Publisher:

Abstract:

Wireless sensor networks are characterized by limited energy resources. To conserve energy, application-specific aggregation (fusion) of data reports from multiple sensors can be beneficial in reducing the amount of data flowing over the network. Furthermore, controlling the topology by scheduling the activity of nodes between active and sleep modes has often been used to uniformly distribute the energy consumption among all nodes by de-synchronizing their activities. We present an integrated analytical model to study the joint performance of in-network aggregation and topology control. We define performance metrics that capture the tradeoffs among delay, energy, and fidelity of the aggregation. Our results indicate that to achieve high fidelity levels under medium to high event reporting load, shorter and fatter aggregation/routing trees (toward the sink) offer the best delay-energy tradeoff as long as topology control is well coordinated with routing.
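
A back-of-the-envelope illustration of why in-network aggregation and tree shape matter (a toy complete k-ary tree model with invented unit costs, not the paper's analytical model):

```python
def kary_tree_stats(depth, fanout, per_hop_delay=1.0):
    """Transmissions and sink delay for a complete k-ary aggregation tree
    rooted at the sink (toy model; per_hop_delay is an invented unit cost)."""
    levels = range(depth + 1)
    n_nodes = sum(fanout ** l for l in levels)
    # no aggregation: every report is relayed once per hop on its way to the sink
    tx_no_agg = sum(l * fanout ** l for l in levels)
    # full in-network aggregation: each non-sink node sends one fused packet per round
    tx_agg = n_nodes - 1
    # the fused report reaches the sink after one hop per level
    delay = depth * per_hop_delay
    return n_nodes, tx_no_agg, tx_agg, delay

# taller/thinner vs shorter/fatter trees of roughly comparable size
for depth, fanout in [(6, 2), (3, 4), (2, 8)]:
    print(f"depth={depth} fanout={fanout}:", kary_tree_stats(depth, fanout))
```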

Relevance:

100.00%

Publisher:

Abstract:

Spectral methods of graph partitioning have been shown to provide a powerful approach to the image segmentation problem. In this paper, we adopt a different approach, based on estimating the isoperimetric constant of an image graph. Our algorithm produces the high-quality segmentations and data clustering of spectral methods, but with improved speed and stability.
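
The general flavour of isoperimetric-style graph partitioning can be sketched as follows (assumptions: a 4-connected pixel graph with Gaussian intensity weights, unit node volumes, a single grounded node, and a simple median threshold rather than a sweep over isoperimetric ratios; this is not the authors' exact algorithm):

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def isoperimetric_partition(img, beta=30.0, ground=0):
    """Partition a 2-D float image (values in [0, 1]) into two regions."""
    h, w = img.shape
    n = h * w
    idx = np.arange(n).reshape(h, w)
    rows, cols, vals = [], [], []
    def link(a, b, diff):                       # add symmetric weighted edges
        wgt = np.exp(-beta * diff ** 2)
        rows.extend(np.r_[a, b]); cols.extend(np.r_[b, a]); vals.extend(np.r_[wgt, wgt])
    link(idx[:, :-1].ravel(), idx[:, 1:].ravel(), (img[:, :-1] - img[:, 1:]).ravel())
    link(idx[:-1, :].ravel(), idx[1:, :].ravel(), (img[:-1, :] - img[1:, :]).ravel())
    W = sparse.csr_matrix((vals, (rows, cols)), shape=(n, n))
    L = sparse.diags(np.asarray(W.sum(axis=1)).ravel()) - W       # graph Laplacian
    keep = np.arange(n) != ground               # remove the grounded node
    x = np.zeros(n)
    x[keep] = spsolve(L.tocsr()[keep][:, keep].tocsc(), np.ones(n - 1))
    # simple median threshold; the real method sweeps thresholds and keeps the
    # cut with the smallest isoperimetric ratio
    return (x > np.median(x)).reshape(h, w)
```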

Relevance:

100.00%

Publisher:

Abstract:

Fusion ARTMAP is a self-organizing neural network architecture for multi-channel, or multi-sensor, data fusion. Single-channel Fusion ARTMAP is functionally equivalent to Fuzzy ART during unsupervised learning and to Fuzzy ARTMAP during supervised learning. The network has a symmetric organization such that each channel can be dynamically configured to serve as either a data input or a teaching input to the system. An ART module forms a compressed recognition code within each channel. These codes, in turn, become inputs to a single ART system that organizes the global recognition code. When a predictive error occurs, a process called parallel match tracking simultaneously raises vigilances in multiple ART modules until reset is triggered in one of them. Parallel match tracking thereby resets only that portion of the recognition code with the poorest match, or minimum predictive confidence. This internally controlled selective reset process is a type of credit assignment that creates a parsimoniously connected learned network. Fusion ARTMAP's multi-channel coding is illustrated by simulations of the Quadruped Mammal database.
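
Since single-channel Fusion ARTMAP reduces to Fuzzy ART, a minimal Fuzzy ART category-learning sketch conveys the per-channel building block (a generic implementation of the standard choice, vigilance and learning rules; the multi-channel parallel match tracking itself is not shown):

```python
import numpy as np

def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
    """Minimal Fuzzy ART: inputs is an iterable of vectors with values in [0, 1]."""
    weights = []                                    # one weight vector per category
    labels = []
    for a in inputs:
        I = np.concatenate([a, 1.0 - a])            # complement coding
        if not weights:
            weights.append(I.copy()); labels.append(0); continue
        W = np.array(weights)
        match = np.minimum(I, W)                    # fuzzy AND (component-wise min)
        T = match.sum(axis=1) / (alpha + W.sum(axis=1))        # choice function
        for j in np.argsort(-T):                    # search categories by choice value
            if match[j].sum() / I.sum() >= rho:     # vigilance (match) criterion
                weights[j] = beta * match[j] + (1 - beta) * W[j]   # learning rule
                labels.append(int(j)); break
        else:
            weights.append(I.copy()); labels.append(len(weights) - 1)
    return np.array(weights), labels
```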

Relevance:

100.00%

Publisher:

Abstract:

The Internet and World Wide Web have had, and continue to have, an incredible impact on our civilization. These technologies have radically influenced the way that society is organised and the manner in which people around the world communicate and interact. The structure and function of individual, social, organisational, economic and political life begin to resemble the digital network architectures upon which they are increasingly reliant. It is increasingly difficult to imagine how our ‘offline’ world would look or function without the ‘online’ world; it is becoming less meaningful to distinguish between the ‘actual’ and the ‘virtual’. Thus, the major architectural project of the twenty-first century is to “imagine, build, and enhance an interactive and ever changing cyberspace” (Lévy, 1997, p. 10). Virtual worlds are at the forefront of this evolving digital landscape. Virtual worlds have “critical implications for business, education, social sciences, and our society at large” (Messinger et al., 2009, p. 204). This study focuses on the possibilities of virtual worlds in terms of communication, collaboration, innovation and creativity. The concept of knowledge creation is at the core of this research. The study shows that scholars increasingly recognise that knowledge creation, as a socially enacted process, goes to the very heart of innovation. However, efforts to build upon these insights have struggled to escape the influence of the information processing paradigm of old and have failed to move beyond the persistent but problematic conceptualisation of knowledge creation in terms of tacit and explicit knowledge. Based on these insights, the study leverages extant research to develop the conceptual apparatus necessary to carry out an investigation of innovation and knowledge creation in virtual worlds. The study derives and articulates a set of definitions (of virtual worlds, innovation, knowledge and knowledge creation) to guide research. The study also leverages a number of extant theories in order to develop a preliminary framework to model knowledge creation in virtual worlds. Using a combination of participant observation and six case studies of innovative educational projects in Second Life, the study yields a range of insights into the process of knowledge creation in virtual worlds and into the factors that affect it. The study's contributions to theory are expressed as a series of propositions and findings and are represented as a revised and empirically grounded theoretical framework of knowledge creation in virtual worlds. These findings highlight the importance of prior related knowledge and intrinsic motivation in shaping and stimulating knowledge creation in virtual worlds. At the same time, they highlight the importance of meta-knowledge (knowledge about knowledge) in guiding the knowledge creation process, whilst revealing the diversity of behavioural approaches actually used to create knowledge in virtual worlds. This theoretical framework is itself one of the chief contributions of the study, and the analysis explores how it can be used to guide further research in virtual worlds and on knowledge creation. The study's contributions to practice are presented as an actionable guide to stimulate knowledge creation in virtual worlds. This guide utilises a theoretically based classification of four knowledge-creator archetypes (the sage, the lore master, the artisan, and the apprentice) and derives an actionable set of behavioural prescriptions for each archetype. The study concludes with a discussion of its implications for future research.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: X-ray computed tomography (CT) is widely used, both clinically and preclinically, for fast, high-resolution anatomic imaging; however, compelling opportunities exist to expand its use in functional imaging applications. For instance, spectral information combined with nanoparticle contrast agents enables quantification of tissue perfusion levels, while temporal information details cardiac and respiratory dynamics. The authors propose and demonstrate a projection acquisition and reconstruction strategy for 5D CT (3D+dual energy+time) which recovers spectral and temporal information without substantially increasing radiation dose or sampling time relative to anatomic imaging protocols. METHODS: The authors approach the 5D reconstruction problem within the framework of low-rank and sparse matrix decomposition. Unlike previous work on rank-sparsity constrained CT reconstruction, the authors establish an explicit rank-sparse signal model to describe the spectral and temporal dimensions. The spectral dimension is represented as a well-sampled time- and energy-averaged image plus regularly undersampled principal components describing the spectral contrast. The temporal dimension is represented as the same time- and energy-averaged reconstruction plus contiguous, spatially sparse, and irregularly sampled temporal contrast images. Using a nonlinear, image-domain filtration approach that the authors refer to as rank-sparse kernel regression, the authors transfer image structure from the well-sampled time- and energy-averaged reconstruction to the spectral and temporal contrast images. This regularization strategy strictly constrains the reconstruction problem while approximately separating the temporal and spectral dimensions. Separability results in a highly compressed representation for the 5D data in which projections are shared between the temporal and spectral reconstruction subproblems, enabling substantial undersampling. The authors solve the 5D reconstruction problem using the split Bregman method and GPU-based implementations of backprojection, reprojection, and kernel regression. Using a preclinical mouse model, the authors apply the proposed algorithm to study myocardial injury following radiation treatment of breast cancer. RESULTS: Quantitative 5D simulations are performed using the MOBY mouse phantom. Twenty data sets (ten cardiac phases, two energies) are reconstructed with 88 μm isotropic voxels from 450 total projections acquired over a single 360° rotation. In vivo 5D myocardial injury data sets acquired in two mice injected with gold and iodine nanoparticles are also reconstructed with 20 data sets per mouse using the same acquisition parameters (dose: ∼60 mGy). For both the simulations and the in vivo data, the reconstruction quality is sufficient to perform material decomposition into gold and iodine maps to localize the extent of myocardial injury (gold accumulation) and to measure cardiac functional metrics (vascular iodine). Their 5D CT imaging protocol represents a 95% reduction in radiation dose per cardiac phase and energy and a 40-fold decrease in projection sampling time relative to their standard imaging protocol. CONCLUSIONS: Their 5D CT data acquisition and reconstruction protocol efficiently exploits the rank-sparse nature of spectral and temporal CT data to provide high-fidelity reconstruction results without increased radiation dose or sampling time.
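
For readers unfamiliar with rank-sparsity constrained reconstruction, the generic low-rank plus sparse split that underlies such methods can be illustrated with a small robust-PCA-style ADMM loop (a textbook sketch on a 2-D matrix, not the authors' 5D rank-sparse kernel regression or their split Bregman implementation):

```python
import numpy as np

def low_rank_plus_sparse(M, lam=None, mu=None, n_iter=200):
    """Decompose M ≈ L + S with L low-rank and S sparse (robust-PCA style ADMM)."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))        # standard sparsity weight
    mu = mu or 0.25 * m * n / np.abs(M).sum()    # common step-size heuristic
    Y = np.zeros_like(M)                         # dual variable
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # low-rank update: singular value thresholding
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # sparse update: element-wise soft thresholding
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # dual ascent on the constraint M = L + S
        Y += mu * (M - L - S)
    return L, S

# toy usage: a rank-1 background plus a few sparse spikes
rng = np.random.default_rng(0)
Lo = np.outer(rng.standard_normal(60), rng.standard_normal(40))
Sp = np.zeros_like(Lo); Sp[rng.random(Lo.shape) < 0.05] = 5.0
Lhat, Shat = low_rank_plus_sparse(Lo + Sp)
print(np.linalg.norm(Lhat - Lo) / np.linalg.norm(Lo))   # relative recovery error
```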

Relevance:

100.00%

Publisher:

Abstract:

Book review of: Chance Encounters: A First Course in Data Analysis and Inference by Christopher J. Wild and George A.F. Seber. 2000, John Wiley & Sons Inc. Hardbound, xviii + 612 pp. ISBN 0-471-32936-3.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a formal method for representing and recognizing scenario patterns with rich internal temporal aspects. A scenario is represented as a collection of time-independent fluents, together with the corresponding temporal knowledge, which can be relative and/or have absolute values. A graphical representation for temporal scenarios is introduced which supports consistency checking of the temporal constraints. Based on such a graphical representation, graph-matching algorithms and methodologies can be directly adopted for recognizing scenario patterns.
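
A simplified quantitative analogue of the consistency checking mentioned above is the negative-cycle test on the distance graph of a simple temporal network (a generic sketch, not the paper's graphical formalism or its treatment of relative temporal knowledge):

```python
import math

def stn_consistent(n, constraints):
    """Check consistency of a simple temporal network over n time points.
    constraints: list of (i, j, ub) meaning t_j - t_i <= ub; a pair of such
    bounds encodes an interval constraint. Consistent iff the distance graph
    has no negative cycle (Floyd-Warshall shortest paths)."""
    d = [[0.0 if i == j else math.inf for j in range(n)] for i in range(n)]
    for i, j, ub in constraints:
        d[i][j] = min(d[i][j], ub)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return all(d[i][i] >= 0 for i in range(n))

# example: event A starts at least 5 and at most 10 time units before event B,
# i.e. t_B - t_A <= 10 and t_A - t_B <= -5
print(stn_consistent(2, [(0, 1, 10), (1, 0, -5)]))   # True
```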

Relevance:

100.00%

Publisher:

Abstract:

Analysis of generic attacks on, and countermeasures for, block-cipher-based message authentication code (MAC) algorithms in sensor applications is undertaken; the conclusions are used in the design of two new MAC constructs, Quicker Block Chaining MAC1 (QBC-MAC1) and Quicker Block Chaining MAC2 (QBC-MAC2). Using software simulation we show that our new constructs offer improvements in CPU instruction clock-cycle usage and energy requirements when benchmarked against the de facto Cipher Block Chaining MAC (CBC-MAC) based construct used in the TinySec security protocol for wireless sensor networks.
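
For reference, the de facto CBC-MAC baseline mentioned above can be sketched with AES as the block cipher using the Python `cryptography` package (an illustrative sketch; the QBC-MAC1/QBC-MAC2 constructs are not reproduced here, and the zero-padding shown is insecure for variable-length messages):

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

BLOCK = 16  # AES block size in bytes

def cbc_mac(key: bytes, msg: bytes) -> bytes:
    """Classic CBC-MAC: CBC-encrypt with a zero IV and keep the last ciphertext block."""
    if len(msg) % BLOCK:                           # naive zero-padding (illustration only)
        msg += b"\x00" * (BLOCK - len(msg) % BLOCK)
    enc = Cipher(algorithms.AES(key), modes.CBC(b"\x00" * BLOCK)).encryptor()
    return (enc.update(msg) + enc.finalize())[-BLOCK:]

print(cbc_mac(b"\x01" * 16, b"sensor report 42").hex())
```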

Relevance:

100.00%

Publisher:

Abstract:

Information on past trends is essential to inform future predictions and underpin the attribution needed to drive policy responses. It has long been recognised that sustained observations are essential for disentangling climate-driven change from other regional and local-scale anthropogenic impacts and environmental fluctuations or cycles in natural systems. This paper highlights how data rescue and re-use have contributed to the debate on climate change responses of marine biodiversity and ecosystems. It also illustrates, via two case studies, the re-use of old data to address new policy concerns. The case studies focus on (1) plankton, fish and benthos from the Western English Channel and (2) broad-scale and long-term studies of intertidal species around the British Isles. Case study 1, using the Marine Biological Association of the UK's English Channel data, has shown the influence of climatic fluctuations on phenology (migration and breeding patterns), has helped to disentangle responses to fishing pressure from those driven by climate, and has provided insights into ecosystem-level change in the English Channel. Case study 2 has shown recent range extensions, increases in abundance and changes in phenology (breeding patterns) of southern, warm-water intertidal species in relation to recent rapid climate change, and fluctuations in northern and southern barnacle species, enabling modelling and prediction of future states. The case is made for continuing targeted sustained observations and their importance for marine management and policy development.