975 results for semantic content annotation
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that can help application developers understand problems with the supporting software or the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important to analyse not only the application but also the underlying middleware and the operating system; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder. We believe that one approach to profiling and analysing distributed systems and their associated applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
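As an illustration of the general idea only (the abstract gives no implementation details), the sketch below lifts log lines from different layers into a common RDF store and queries it with SPARQL using the Python rdflib library; the example.org namespace, predicate names, and log entries are hypothetical assumptions, not Slogger's actual vocabulary.

    # Minimal sketch: unify log lines from different layers as RDF and query them.
    # The EX namespace and predicate names are hypothetical, not Slogger's schema.
    from rdflib import Graph, Literal, Namespace, URIRef

    EX = Namespace("http://example.org/log#")   # hypothetical vocabulary
    g = Graph()

    def add_log_entry(graph, layer, timestamp, level, message):
        """Record one log line as a small set of triples in the shared store."""
        entry = URIRef(f"http://example.org/entry/{layer}-{timestamp}")
        graph.add((entry, EX.layer, Literal(layer)))
        graph.add((entry, EX.timestamp, Literal(timestamp)))
        graph.add((entry, EX.level, Literal(level)))
        graph.add((entry, EX.message, Literal(message)))

    # Logs from the application and middleware layers end up in the same store.
    add_log_entry(g, "application", "2024-01-01T12:00:01", "ERROR", "request timed out")
    add_log_entry(g, "middleware", "2024-01-01T12:00:00", "WARN", "queue backlog growing")

    # Query the unified store, e.g. for all non-INFO events across the layers.
    results = g.query("""
        PREFIX ex: <http://example.org/log#>
        SELECT ?layer ?timestamp ?message
        WHERE {
            ?e ex:layer ?layer ; ex:timestamp ?timestamp ;
               ex:level ?level ; ex:message ?message .
            FILTER (?level != "INFO")
        }
        ORDER BY ?timestamp
    """)
    for row in results:
        print(row.layer, row.timestamp, row.message)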
Abstract:
There are three key driving forces behind the development of Internet Content Management Systems (CMS): a desire to manage the explosion of content, a desire to provide structure and meaning to content in order to make it accessible, and a desire to work collaboratively to manipulate content in some meaningful way. Yet the traditional CMS has been unable to meet the last of these requirements, often failing to provide sufficient tools for collaboration in a distributed context. Peer-to-Peer (P2P) systems are networks in which every node is an equal participant (whether transmitting data, exchanging content, or invoking services) and there is an absence of any centralised administrative or coordinating authority. P2P systems are inherently more scalable than equivalent client-server implementations as they tend to use resources at the edge of the network much more effectively. This paper details the rationale and design of a P2P middleware for collaborative content management.
Abstract:
The THz water content index of a sample is defined, and the advantages of using such a metric to estimate a sample's relative water content are discussed. The errors from reflectance measurements performed at two different THz frequencies using a quasi-optical null-balance reflectometer are propagated to the errors in estimating the sample water content index.
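The functional form of the index is not given in the abstract; purely as a generic illustration, if the index is some function W = f(R_1, R_2) of the reflectances measured at the two frequencies, first-order propagation of uncorrelated measurement errors would give

    \sigma_W^2 \approx \left(\frac{\partial f}{\partial R_1}\right)^2 \sigma_{R_1}^2 + \left(\frac{\partial f}{\partial R_2}\right)^2 \sigma_{R_2}^2 ,

where \sigma_{R_1} and \sigma_{R_2} are the reflectance measurement errors at the two frequencies.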
Abstract:
Information provision that addresses changing requirements is best supported by content management. Current information technology enables information to be stored in and provided from various distributed sources. Identifying and retrieving relevant information requires effective mechanisms for information discovery and assembly. This paper presents a method that enables the design of such mechanisms, with a set of techniques for articulating and profiling users' requirements, formulating information provision specifications, realising management of information content in repositories, and responding to users' requirements dynamically during the process of knowledge construction. These functions are represented in an ontology which integrates the capabilities of the mechanisms. The ontological modelling in this paper adopts semiotic principles with embedded norms to ensure a coherent course of action in these mechanisms.
Abstract:
Over the last decade, there has been an increasing body of work that explores whether sensory and motor information is a necessary part of semantic representation and processing. This is the embodiment hypothesis. This paper presents a theoretical review of this work that is intended to be useful for researchers in the neurosciences and neuropsychology. Beginning with a historical perspective, relevant theories are placed on a continuum from strongly embodied to completely unembodied representations. Predictions are derived and neuroscientific and neuropsychological evidence that could support different theories is reviewed; finally, criticisms of embodiment are discussed. We conclude that strongly embodied and completely disembodied theories are not supported, and that the remaining theories agree that semantic representation involves some form of Convergence Zones (Damasio, 1989) and the activation of modal content. For the future, research must carefully define the boundaries of semantic processing and tackle the representation of abstract entities.
Abstract:
We provide a unified framework for a range of linear transforms that can be used for the analysis of terahertz spectroscopic data, with particular emphasis on their application to the measurement of leaf water content. The use of linear transforms for filtering, regression, and classification is discussed. For illustration, a classification problem involving leaves at three stages of drought and a prediction problem involving simulated spectra are presented. Issues resulting from scaling the data set are discussed. Using Lagrange multipliers, we arrive at the transform that yields the maximum separation between the spectra and show that this optimal transform is equivalent to computing the Euclidean distance between the samples. The optimal linear transform is compared with the average for all the spectra as well as with the Karhunen–Loève transform to discriminate a wet leaf from a dry leaf. We show that taking several principal components into account is equivalent to defining new axes in which data are to be analyzed. The procedure shows that the coefficients of the Karhunen–Loève transform are well suited to the process of classification of spectra. This is in line with expectations, as these coefficients are built from the statistical properties of the data set analyzed.
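For illustration only, the following sketch applies a Karhunen–Loève (principal component) transform to simulated spectra and classifies a test spectrum by Euclidean distance to the class means in the coefficient space; the simulated "wet"/"dry" spectra and the choice of three components are assumptions made for the example, not the paper's data or pipeline.

    # Generic sketch: Karhunen-Loeve (PCA) coefficients of simulated THz spectra,
    # then nearest-class-mean (Euclidean distance) classification of wet vs dry.
    import numpy as np

    rng = np.random.default_rng(0)
    freqs = np.linspace(0.1, 0.5, 200)        # frequency axis in THz (illustrative)

    def simulate_spectrum(water_level):
        """Toy absorbance spectrum: more water gives stronger broadband absorption."""
        return water_level * (1.0 + freqs) + 0.05 * rng.standard_normal(freqs.size)

    wet = np.array([simulate_spectrum(1.0) for _ in range(20)])
    dry = np.array([simulate_spectrum(0.3) for _ in range(20)])
    X = np.vstack([wet, dry])
    labels = np.array([1] * 20 + [0] * 20)

    # Karhunen-Loeve transform: project mean-centred spectra onto the leading eigenvectors.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:3].T                    # first three KL coefficients per spectrum

    # Classify a new spectrum by Euclidean distance to the class means in KL space.
    wet_mean = scores[labels == 1].mean(axis=0)
    dry_mean = scores[labels == 0].mean(axis=0)
    test = (simulate_spectrum(0.9) - X.mean(axis=0)) @ Vt[:3].T
    print("wet" if np.linalg.norm(test - wet_mean) < np.linalg.norm(test - dry_mean) else "dry")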
Abstract:
A novel technique for the noninvasive continuous measurement of leaf water content is presented. The technique is based on transmission measurements of terahertz radiation with a null-balance quasi-optical transmissometer operating at 94 GHz. A model for the propagation of terahertz radiation through leaves is presented. This, in conjunction with leaf thickness information determined separately, may be used to quantitatively relate transmittance measurements to leaf water content. Measurements using a dispersive Fourier transform spectrometer in the range of 100 GHz-500 GHz using Phormium tenax and Fatsia japonica leaves are also reported.
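The propagation model itself is not reproduced in the abstract; as a minimal sketch under a much simpler assumption (Beer–Lambert attenuation by liquid water only, reflection and scattering losses ignored, and a hypothetical absorption coefficient), transmittance and separately measured leaf thickness could be related to water content as follows.

    # Minimal sketch, not the paper's model: Beer-Lambert attenuation by liquid
    # water only, with an assumed (illustrative) absorption coefficient at 94 GHz.
    import numpy as np

    ALPHA_WATER_CM = 45.0   # assumed absorption coefficient of liquid water, 1/cm

    def water_path_cm(transmittance):
        """Equivalent liquid-water path length from measured power transmittance."""
        return -np.log(transmittance) / ALPHA_WATER_CM

    def water_content_fraction(transmittance, leaf_thickness_cm):
        """Volumetric water fraction, using separately measured leaf thickness."""
        return water_path_cm(transmittance) / leaf_thickness_cm

    print(water_content_fraction(transmittance=0.4, leaf_thickness_cm=0.035))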
Abstract:
Aluminium is not a physiological component of the breast but has been measured recently in human breast tissues and breast cyst fluids at levels above those found in blood serum or milk. Since the presence of aluminium can lead to iron dyshomeostasis, levels of aluminium and iron-binding proteins (ferritin, transferrin) were measured in nipple aspirate fluid (NAF), a fluid present in the breast duct tree and mirroring the breast microenvironment. NAFs were collected noninvasively from healthy women (NoCancer; n = 16) and breast cancer-affected women (Cancer; n = 19), and compared with levels in serum (n = 15) and milk (n = 45) from healthy subjects. The mean level of aluminium, measured by ICP-mass spectrometry, was significantly higher in Cancer NAF (268.4 ± 28.1 μg l⁻¹; n = 19) than in NoCancer NAF (131.3 ± 9.6 μg l⁻¹; n = 16; P < 0.0001). The mean level of ferritin, measured by immunoassay, was also higher in Cancer NAF (280.0 ± 32.3 μg l⁻¹) than in NoCancer NAF (55.5 ± 7.2 μg l⁻¹); furthermore, a positive correlation was found between levels of aluminium and ferritin in the Cancer NAF (correlation coefficient R = 0.94, P < 0.001). These results may suggest a role for raised levels of aluminium, and for modulation of proteins that regulate iron homeostasis, as biomarkers for identifying women at higher risk of developing breast cancer. The reasons for the high levels of aluminium in NAF remain unknown, but possibilities include exposure to aluminium-based antiperspirant salts in the adjacent underarm area and/or preferential accumulation of aluminium by breast tissues.
Abstract:
Explosive volcanic eruptions cause episodic negative radiative forcing of the climate system. Using coupled atmosphere-ocean general circulation models (AOGCMs) subjected to historical forcing since the late nineteenth century, previous authors have shown that each large volcanic eruption is associated with a sudden drop in ocean heat content and sea level, from which the subsequent recovery is slow. Here we show that this effect may be an artefact of experimental design, caused by the AOGCMs not having been spun up to a steady state with volcanic forcing before the historical integrations begin. Because volcanic forcing has a long-term negative average, a cooling tendency is imposed on the ocean in the historical simulation. We recommend that an extra experiment be carried out in parallel to the historical simulation, with constant time-mean historical volcanic forcing, in order to correct for this effect and avoid misinterpretation of ocean heat content changes.
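The recommended correction amounts to differencing the historical run against the parallel experiment run with constant time-mean volcanic forcing; a minimal bookkeeping sketch (the arrays below are illustrative placeholders, not model output) is:

    # Sketch of the recommended correction: subtract the drift of a parallel run
    # with constant time-mean volcanic forcing from the historical ocean heat
    # content series. All arrays are illustrative placeholders, not model output.
    import numpy as np

    years = np.arange(1850, 2006)
    rng = np.random.default_rng(1)

    # Historical run: internal variability plus a spurious cooling drift.
    ohc_historical = np.cumsum(rng.normal(0.0, 0.1, years.size)) - 0.02 * (years - 1850)
    # Parallel run with constant time-mean volcanic forcing: drift only.
    ohc_constant_volcanic = -0.02 * (years - 1850)

    # Corrected series retains only the response to time-varying forcing.
    ohc_corrected = ohc_historical - ohc_constant_volcanic
    print(ohc_corrected[-5:])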