939 results for Evidence Containers, Representation, Provenance, Tool Interoperability
Abstract:
A single, easy-to-use MATLAB-based Graphical User Interface (GUI), built on the topological information contained in the Gibbs energy of mixing function, has been developed as a friendly tool to check the coherence of NRTL parameters obtained in a data-correlation procedure. Thus, analysis of the GM/RT surface, the GM/RT curves for the binaries, and the GM/RT in planes containing the tie lines is necessary to validate the parameters obtained for the different models used to correlate phase equilibrium data.
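As background for the GM/RT quantity discussed above: for a binary mixture the Gibbs energy of mixing is the ideal term plus the NRTL excess term. The sketch below is a minimal illustration of that standard textbook form, not the authors' GUI; the parameter values (tau12, tau21, alpha) are arbitrary assumptions chosen for demonstration.

```python
import numpy as np

def gm_rt_binary(x1, tau12, tau21, alpha=0.3):
    """Gibbs energy of mixing (GM/RT) for a binary NRTL mixture:
    ideal mixing term x_i ln x_i plus the NRTL excess term GE/RT."""
    x2 = 1.0 - x1
    g12 = np.exp(-alpha * tau12)
    g21 = np.exp(-alpha * tau21)
    ge_rt = x1 * x2 * (tau21 * g21 / (x1 + x2 * g21)
                       + tau12 * g12 / (x2 + x1 * g12))
    return x1 * np.log(x1) + x2 * np.log(x2) + ge_rt

# Scanning composition lets one inspect the curve's shape; for coherent
# parameters of a fully miscible binary, GM/RT stays convex, while a
# concave region would signal a predicted (possibly spurious) phase split.
x = np.linspace(0.001, 0.999, 999)
gm = gm_rt_binary(x, tau12=0.5, tau21=0.7)
print(bool(gm.min() < 0))   # True: mixing lowers G for this parameter set
```

In the same spirit, the GUI described in the abstract inspects this function's topology (surface, binary edges, tie-line planes) rather than only the fitted deviations.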
Abstract:
In order to characterize the provenance of lithogenic surface sediments from the Eastern Mediterranean Sea (EMS), the residual (leached) fractions of 34 surface samples have been analysed for their 143Nd/144Nd and 87Sr/86Sr isotope ratios. The sample locations bracket all important entry points of riverine suspended matter into the EMS as well as all sub-basins of the EMS. The combined analyses of these two isotope ratios provide a precise characterization of the lithogenic fraction of surface sediments and record their dilution towards the central sub-basins. We reconstruct provenance and possible pathways of riverine dispersal and current redistribution, assuming more or less homogeneous isotopic signatures and flux rates of the eolian fraction over the EMS. Lithogenic sediments entering the Ionian Sea from the Calabrian Arc and the Adriatic Sea are characterized by high 87Sr/86Sr isotope ratios and low epsilon-Nd(0) values (average 87Sr/86Sr=0.718005 and epsilon-Nd(0)=-11.06, n=5). Aegean Sea terrigenous sediments show an average ratio of 87Sr/86Sr=0.713089 (n=5) and values of epsilon-Nd(0)=-7.89 (n=5). The Aegean isotopic signature is traceable to the southwest, south, and southeast of Crete. The sediment loads entering the EMS via the Aegean Sea are low and spread out mainly through the Strait of Casos (east of Crete). Surface sediments from the eastern Levantine Basin are marked by the highest epsilon-Nd(0) values (-3.3, n=6) and the lowest 87Sr/86Sr isotope ratios (average 0.709541, n=6), reflecting the predominant input of Nile sediment. The influence of Nile sediment is traceable up to the NE-trending eastern flank of the Mediterranean Ridge. The characterization of the modern riverine dispersal and eolian flux, based on isotope data, may serve as a tool to reconstruct climate-coupled variations of lithogenic sediment input into the EMS.
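For readers unfamiliar with the epsilon-Nd(0) notation used above: it expresses the deviation of a measured 143Nd/144Nd ratio from the CHUR (Chondritic Uniform Reservoir) reference in parts per 10^4. A minimal sketch, using the standard present-day CHUR value 0.512638 and an illustrative sample ratio (neither value is taken from this abstract):

```python
# Convert a measured 143Nd/144Nd ratio to epsilon-Nd(0) notation.
CHUR_ND = 0.512638  # standard present-day CHUR reference value

def epsilon_nd(ratio_143_144: float) -> float:
    """Deviation of the sample ratio from CHUR, in parts per 10^4."""
    return (ratio_143_144 / CHUR_ND - 1.0) * 1.0e4

# A sample matching CHUR exactly has epsilon-Nd(0) = 0 by definition.
print(round(epsilon_nd(0.512638), 2))   # 0.0
# An illustrative ratio below CHUR gives a negative value, as for the
# Ionian Sea sediments described above.
print(round(epsilon_nd(0.512072), 2))   # -11.04
```

Negative epsilon-Nd(0) thus indicates old continental crust sources, while less negative values (as in the Levantine Basin) reflect younger, more radiogenic inputs such as Nile sediment.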
Abstract:
"February 1974."
Abstract:
Plants accumulate isotopes of carbon at different rates because of discrimination against C-13 relative to C-12. In plants that fix carbon by the C-3 pathway, the amount of discrimination correlates negatively with transpiration efficiency (TE), where TE is the amount of dry matter accumulated per unit water transpired. Therefore, carbon isotope discrimination (Delta) has become a useful tool for selecting genotypes with improved TE and performance in dry environments. Surveys of 161 sunflower (Helianthus spp.) genotypes of diverse origin revealed a large and unprecedented range of genetic variation for Delta (19.5-23.8 parts per thousand). A strong negative genetic correlation (r(g)) between TE and Delta (r(g) = -0.87, P < 0.001) was observed in glasshouse studies. Gas exchange measurements of field-grown plants indicated that Delta was strongly correlated with stomatal conductance to water vapor (g) (r(g) = 0.64, P < 0.01) and with the ratio of net assimilation rate (A) to g (r(g) = 0.86, P < 0.001), an instantaneous measure of TE. Genotype CMSHA89MAX1 had the lowest TE (and highest Delta) of all genotypes tested in these studies and low yields in hybrid combination. Backcrossing studies showed that the low TE of this genotype was due to an adverse effect of the MAX1 cytoplasm, which was inherited from the diploid perennial H. maximiliani Schrader. Overall, these studies suggest that there is an excellent opportunity for breeders to develop sunflower germplasm with improved TE. This can be achieved, in part, by avoiding cytoplasms such as the MAX1 cytoplasm.
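For context on the Delta notation: carbon isotope discrimination is conventionally derived from the delta-13C values of source air and plant tissue. The sketch below uses the standard Farquhar-style formula; the specific delta values (air near -8 per mil, a C3 plant near -28 per mil) are typical assumed numbers for illustration, not measurements from this study.

```python
# Carbon isotope discrimination (Delta) from delta-13C values in per mil.
def discrimination(delta_air: float, delta_plant: float) -> float:
    """Delta = (delta_air - delta_plant) / (1 + delta_plant/1000), per mil."""
    return (delta_air - delta_plant) / (1.0 + delta_plant / 1000.0)

# Typical values: atmospheric CO2 ~ -8 per mil; a C3 plant ~ -28 per mil.
d = discrimination(-8.0, -28.0)
print(round(d, 2))   # 20.58 -- within the 19.5-23.8 range reported above
```

Because plants with higher TE discriminate less (keep more C-13), lower Delta flags candidate genotypes for dry environments, which is the screening logic the abstract describes.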
Abstract:
Enterprise systems interoperability (ESI) is currently an important topic for business. This situation is evidenced, at least in part, by the number and extent of potential candidate protocols for such process interoperation, viz., ebXML, BPML, BPEL, and WSCI. Wide-ranging support for each of these candidate standards already exists. However, despite broad acceptance, a sound theoretical evaluation of these approaches has not yet been provided. We use the Bunge-Wand-Weber (BWW) models, in particular the representation model, to provide the basis for such a theoretical evaluation. We, and other researchers, have shown the usefulness of the representation model for analyzing, evaluating, and engineering techniques in the areas of traditional and structured systems analysis, object-oriented modeling, and process modeling. In this work, we address the question: what are the potential semantic weaknesses of using ebXML alone for process interoperation between enterprise systems? We find that users will lack important implementation information because of representational deficiencies; that ontological redundancy unnecessarily increases the complexity of the specification; and that users of the specification will have to bring in extra-model knowledge to understand constructs in the specification owing to instances of ontological excess.
Abstract:
Objectives: To report the research and development of a new approach to Functional Capacity Evaluation, the Gibson Approach to Functional Capacity Evaluation (GAPP FCE), for chronic back pain clients. Methods: Four studies, including pilot and feasibility testing, expert review, and a preliminary interrater reliability examination, are described here. Participants included 7 healthy young adults and 19 rehabilitation clients with back pain who underwent assessment using the GAPP FCE. Thirteen therapists were trained in the approach and were silently observed administering the Functional Capacity Evaluations by at least 1 other trained therapist, the first investigator, or both. An expert review using 5 expert occupational therapists was also conducted. Results: Study 1, the pilot with healthy individuals, indicated that the GAPP FCE was a feasible approach with good utility. Study 2, a pilot using 2 trained therapists assessing 5 back pain clients, supported the clinical feasibility of the approach. The expert review in Study 3 found support for the GAPP FCE. Study 4, a trial of the approach with 14 rehabilitation clients, found support for the interrater reliability of recommendations for return to work based on performance in the GAPP FCE. Discussion: The evidence thus far available supports the GAPP FCE as an approach that provides a sound method for evaluating the performance of the physical demands of work with clients with chronic back pain. The tool has been shown to have good face and content validity, to meet acceptable test standards, and to have reasonable interrater reliability. Further research is under way to conduct a larger interrater reliability study, to further examine content validity, and to examine predictive validity.
Abstract:
We review recent findings that, using fractal analysis, have demonstrated systematic regional and species differences in the branching complexity of neocortical pyramidal neurons. In particular, attention is focused on how fractal analysis is being applied to the study of specialization in pyramidal cell structure during the evolution of the primate cerebral cortex. These studies reveal variation in pyramidal cell phenotype that cannot be attributed solely to increasing brain volume. Moreover, the results of these studies suggest that the primate cerebral cortex is composed of neurons of different structural complexity. There is growing evidence to suggest that regional and species differences in neuronal structure influence function at both the cellular and circuit levels. These data challenge the prevailing dogma of cortical uniformity.
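As an aside on the method itself: the fractal (branching) complexity mentioned above is commonly estimated by box counting, fitting the number of occupied grid cells N(s) against the grid resolution s on log-log axes. The sketch below is a generic box-counting estimator on 2D points, an illustration of the technique rather than the specific pipeline used in the reviewed studies.

```python
import numpy as np

def box_count_dimension(points: np.ndarray, sizes=(2, 4, 8, 16, 32, 64)) -> float:
    """Estimate the fractal dimension of 2D points in [0,1)^2 by box
    counting: the slope of log N(s) versus log s, where N(s) is the
    number of occupied cells on an s-by-s grid."""
    counts = []
    for s in sizes:
        # Assign each point to a grid cell of side 1/s; count occupied cells.
        cells = np.unique(np.floor(points * s).astype(int), axis=0)
        counts.append(len(cells))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return slope

# Sanity check: points along a straight line should give dimension near 1;
# a space-filling dendritic arbor would score between 1 and 2.
line = np.column_stack([np.linspace(0, 0.999, 5000)] * 2)
print(round(box_count_dimension(line), 1))   # 1.0
```

Higher estimated dimensions for traced dendritic arbors correspond to the greater branching complexity the review associates with particular cortical regions and species.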
Abstract:
Objectives. It has been proposed that disruption of the internal proprioceptive representation, via incongruent sensory input, may underpin pathological pain states, but experimental evidence relies on conflicting visual input, which is not clinically relevant. We aimed to determine the symptomatic effect of incongruent proprioceptive input, imparted by vibration of the wrist tendons, which evokes the illusion of perpetual wrist flexion and disrupts cortical proprioceptive representation. Methods. Twenty-nine healthy and naive volunteers reported symptoms during five conditions: control, active and passive wrist flexion, extensor carpi radialis tendon vibration to evoke illusion of perpetual wrist flexion, and ulnar styloid (sham) vibration. No advice was given about possible illusions. Results. Twenty-one subjects reported the illusion of perpetual wrist flexion during tendon vibration. There was no effect of condition or of whether or not subjects reported an illusion on discomfort/pain (P > 0.28). Peculiarity, swelling and foreignness were greater during tendon vibration than during the other conditions, and greater during tendon vibration in those who reported an illusion of wrist flexion than in those who did not (P < 0.05 for all). Symptoms were reported by at least two subjects in each condition and four subjects reported systemic symptoms (e.g. nausea). Conclusions. In healthy volunteers, incongruent proprioceptive input does not cause discomfort or pain but does evoke feelings of peculiarity, swelling and foreignness in the limb.
Abstract:
Conventionally, research on document classification focuses on improving the learning capabilities of classifiers. Nevertheless, according to our observation, the effectiveness of classification is limited by the suitability of the document representation. Intuitively, the more features used in a representation, the more comprehensively documents are represented. However, if a representation contains too many irrelevant features, the classifier suffers not only from the curse of high dimensionality but also from overfitting. To address this problem of the suitability of document representations, we present a classifier-independent approach to measuring the effectiveness of document representations. Our approach utilises a labelled document corpus to estimate the distribution of documents in the feature space. By looking at documents in this way, we can clearly identify the contributions made by different features toward document classification. Some experiments have been performed to show how the effectiveness is evaluated. Our approach can be used as a tool to assist feature selection, dimensionality reduction and document classification.
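One common classifier-independent way to score a feature's contribution from a labelled corpus, in the spirit of the approach above, is the mutual information between a term's presence and the class label. The sketch below is a generic illustration of that idea (the toy corpus and function names are invented for the example), not the authors' actual measure.

```python
from collections import Counter
from math import log2

def feature_class_mi(docs, labels, feature):
    """Mutual information (in bits) between a binary feature
    (term presence) and the class label, estimated from a corpus."""
    n = len(docs)
    joint = Counter((feature in doc, lab) for doc, lab in zip(docs, labels))
    f_marg = Counter(feature in doc for doc in docs)
    c_marg = Counter(labels)
    mi = 0.0
    for (f, c), cnt in joint.items():
        p_joint = cnt / n
        mi += p_joint * log2(p_joint / ((f_marg[f] / n) * (c_marg[c] / n)))
    return mi

# Toy labelled corpus: each document is its set of terms.
docs = [{"ball", "goal"}, {"ball", "score"}, {"vote", "law"}, {"vote", "tax"}]
labels = ["sport", "sport", "politics", "politics"]
# A perfectly class-separating feature outscores a partially informative one.
print(feature_class_mi(docs, labels, "ball")
      > feature_class_mi(docs, labels, "goal"))   # True
```

Ranking features this way supports exactly the uses the abstract lists: feature selection and dimensionality reduction before any particular classifier is trained.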
Abstract:
Current database technologies do not support contextualised representations of multi-dimensional narratives. This paper outlines a new approach to this problem using a multi-dimensional database served in a 3D game environment. Preliminary results indicate it is a particularly efficient method for the types of contextualised narratives used by Australian Aboriginal peoples to tell their stories about their traditional landscapes and knowledge practices. We discuss the development of a tool that complements rather than supplants direct experience of these traditional knowledge practices.
Abstract:
In this paper, we present a formal hardware verification framework linking ASM with MDG. ASM (Abstract State Machine) is a state-based language for describing transition systems. MDG (Multiway Decision Graphs) provides a symbolic representation of transition systems with support for abstract sorts and functions. We implemented a transformation tool that automatically generates MDG models from ASM specifications; formal verification techniques provided by the MDG tool, such as model checking or equivalence checking, can then be applied to the generated models. We support this work with a case study of an Island Tunnel Controller, whose behavior and structure were specified in ASM and then successfully verified within the MDG tool using our ASM-MDG tool.
Abstract:
In this paper we describe an approach to interfacing Abstract State Machines (ASM) with Multiway Decision Graphs (MDG) to enable tool support for the formal verification of ASM descriptions. ASM is a specification method for software and hardware providing a powerful means of modeling various kinds of systems. MDGs are decision diagrams based on abstract representation of data and are used primarily for modeling hardware systems. The notions of ASM and MDG are hence closely related to each other, making it appealing to link these two concepts. The proposed interface between ASM and MDG uses two steps: first, the ASM model is transformed into a flat, simple transition system as an intermediate model. Second, this intermediate model is transformed into the syntax of the input language of the MDG tool, MDG-HDL. We have successfully applied this transformation scheme to a case study, the Island Tunnel Controller, where we automatically generated the corresponding MDG-HDL models from ASM specifications.
Abstract:
In this thesis work we develop a new generative model of social networks belonging to the family of Time Varying Networks. Correctly modelling the mechanisms that shape the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality and time-resolved datasets. This wealth of digital data has allowed us to deepen our understanding of the structure and properties of many real-world networks. Moreover, the empirical evidence of a temporal dimension in networks prompted a switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modeling tool belonging to the family of Time-Varying-Networks) to develop a general dynamical model that encodes the fundamental mechanisms shaping social networks' topology and temporal structure: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We then empirically measure the properties of these two mechanisms from seven real-world datasets and develop a data-driven model, solving it analytically. We then check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two.
Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of Time-Varying-Networks. This model is inspired by Kauffman's adjacent-possible theory and is based on a generalized version of Pólya's urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
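The activity-driven mechanism with social capital allocation described above can be illustrated with a toy simulation: each node fires with probability given by its activity, and an active node with k past contacts attaches to a brand-new node with probability c/(c+k), otherwise revisiting a known one. This is a minimal generic sketch of that family of models; the parameter values and function names are illustrative assumptions, not taken from the thesis.

```python
import random

def activity_driven_step(activities, contacts, m=2, memory_c=1.0, rng=random):
    """One time step of an activity-driven network with a simple memory
    (reinforcement) rule standing in for social capital allocation."""
    edges = []
    n = len(activities)
    for i, a in enumerate(activities):
        if rng.random() >= a:          # node i fires with probability a_i
            continue
        for _ in range(m):
            known = contacts[i]
            # With probability c/(c+k) explore a new node, else reinforce.
            if known and rng.random() >= memory_c / (memory_c + len(known)):
                j = rng.choice(sorted(known))      # revisit an old tie
            else:
                j = rng.randrange(n)               # explore a random node
                if j == i:
                    continue
            contacts[i].add(j)
            contacts[j].add(i)
            edges.append((i, j))
    return edges

rng = random.Random(42)
n = 200
# Heavy-tailed activities (Pareto draws, capped at 1) mimic empirical data.
activities = [min(1.0, 0.05 * rng.paretovariate(2.5)) for _ in range(n)]
contacts = [set() for _ in range(n)]
snapshots = [activity_driven_step(activities, contacts, rng=rng)
             for _ in range(100)]
# The time-aggregated contact network is far denser than any single snapshot.
print(sum(len(c) for c in contacts) > 0)   # True
```

Burstiness, the second mechanism the thesis measures, would enter such a sketch by drawing heavy-tailed inter-activation times for each node instead of firing with a fixed per-step probability.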
Abstract:
This paper uses evidence gathered in two perception studies of Australasian and British accounting academics to reflect on aspects of the knowledge production system within accounting academe. We provide evidence of the representation of multiple paradigms in many journals that are scored by participants as being of high quality. Indeed, most of the journals we surveyed are perceived by accounting academics as incorporating research from more than one paradigm. It is argued that this 'catholic' approach by journal editors and the willingness of many respondents in our surveys to score journals highly on material they publish from both paradigm categories reflects a balanced acceptance of the multi-paradigmatic state of accounting research. Our analysis is set within an understanding of systems of accounting knowledge production as socially constructed and as playing an important role in the distribution of power and reward in the academy. We explore the impact of our results on concerns emerging from the work of a number of authors who carefully expose localised 'elites'. The possibilities for a closer relationship between research emerging from a multi-paradigm discipline and policy setting and practice are also discussed. The analysis provides a sense of optimism that the broad constituency of accounting academics operates within an environment conducive to the exchange of ideas. That optimism is dampened by concerns about the impact of local 'elites' and the need for more research on their impact on accounting academe.