12 results for software, translation, validation tool, VMNET, Wikipedia, XML
in CentAUR: Central Archive University of Reading - UK
Abstract:
In this paper we present an architecture for network and applications management, which is based on the Active Networks paradigm and shows the advantages of network programmability. The stimulus to develop this architecture arises from an actual need to manage a cluster of active nodes, where it is often required to redeploy network assets and modify node connectivity. In our architecture, a remote front-end of the managing entity allows the operator to design new network topologies, to check the status of the nodes and to configure them. Moreover, the proposed framework allows the operator to explore an active network, to monitor the active applications, to query each node and to install programmable traps. In order to take advantage of the Active Networks technology, we introduce active SNMP-like MIBs and agents, which are dynamic and programmable. The programmable management agents make tracing distributed applications a feasible task. We propose a general framework that can inter-operate with any active execution environment. In this framework, both the manager and the monitor front-ends communicate with an active node (the Active Network Access Point) through the XML language. A gateway service translates the queries from XML to an active packet language and injects the code into the network. We demonstrate the implementation of an active network gateway for PLAN (Packet Language for Active Networks) in a testbed of forty active nodes. Finally, we discuss an application of the active management architecture to detect the causes of network failures by tracing network events in time.
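As a rough illustration of the XML-to-active-packet translation performed by the gateway service, the Python sketch below parses a hypothetical management query and emits a PLAN-like code string. The query schema and the emitted syntax are illustrative assumptions, not the formats used in the paper.

```python
# Minimal sketch of the XML-to-active-packet translation step described above.
# The query schema and the PLAN-like output are illustrative assumptions only.
import xml.etree.ElementTree as ET

QUERY = """
<query target="node-7">
  <mib object="ifInOctets" instance="2"/>
</query>
"""

def xml_to_plan(xml_text: str) -> str:
    """Translate a simple management query into an active-packet code string."""
    root = ET.fromstring(xml_text)
    target = root.get("target")
    mib = root.find("mib")
    obj, inst = mib.get("object"), mib.get("instance")
    # Emit a program that evaluates a MIB lookup on the target node
    # (PLAN-like pseudocode, illustrative syntax only).
    return f'OnRemote(getMIB("{obj}", {inst}), "{target}")'

if __name__ == "__main__":
    print(xml_to_plan(QUERY))
```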
Abstract:
Motivation: There is a frequent need to apply a large range of local or remote prediction and annotation tools to one or more sequences. We have created a tool able to dispatch one or more sequences to assorted services by defining a consistent XML format for data and annotations. Results: By analyzing annotation tools, we have determined that annotations can be described using one or more of six forms of data: numeric or textual annotation of residues, of domains (residue ranges), or of whole sequences. With this in mind, XML DTDs have been designed to store the input and output of any server. Plug-in wrappers to a number of services have been written, which are called from a master script. The resulting APATML is then formatted for display in HTML. Alternatively, further tools may be written to perform post-analysis.
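To make the six forms of annotation data concrete, the sketch below builds a small annotation document in the spirit of APATML. The element and attribute names are assumed for illustration and are not the published DTDs.

```python
# Illustrative sketch only: the element and attribute names below are assumed
# for demonstration and are not the actual APATML DTD.
import xml.etree.ElementTree as ET

seq = ET.Element("sequence", id="P12345")
# Per-residue numeric annotation (e.g. a score for residue 10).
ET.SubElement(seq, "residue_annotation", pos="10", type="numeric").text = "0.82"
# Domain (residue-range) textual annotation.
ET.SubElement(seq, "domain_annotation", start="5", end="60",
              type="text").text = "predicted coiled-coil"
# Whole-sequence textual annotation.
ET.SubElement(seq, "sequence_annotation", type="text").text = "putative kinase"

print(ET.tostring(seq, encoding="unicode"))
```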
Abstract:
Among the more peculiar literary papyri uncovered in the past century are numerous bilingual texts of Virgil and Cicero, with the Latin original and a Greek translation arranged in distinctive narrow columns. These materials, variously classified as texts with translations or as glossaries, were evidently used by Greek-speaking students when they first started to read Latin literature. They thus provide a unique window into the experience of the first of many groups of non-native Latin speakers to struggle with reading the classics of Latin literature.
Abstract:
Results are presented from a new web application called OceanDIVA - Ocean Data Intercomparison and Visualization Application. This tool reads hydrographic profiles and ocean model output and presents the data on either depth levels or isotherms for viewing in Google Earth, or as probability density functions (PDFs) of regional model-data misfits. As part of the CLIVAR Global Synthesis and Observations Panel, an intercomparison of water mass properties of various ocean syntheses has been undertaken using OceanDIVA. Analysis of model-data misfits reveals significant differences between the water mass properties of the syntheses, such as the ability to capture mode water properties.
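As a rough illustration of the "PDF of regional model-data misfits" diagnostic, the sketch below histograms synthetic model-minus-observation temperature differences. It is not OceanDIVA code, and the data are invented.

```python
# Sketch of a model-data misfit PDF: histogram the model-minus-observation
# temperature differences for a region. Synthetic data, not OceanDIVA code.
import numpy as np

rng = np.random.default_rng(0)
obs_temp = rng.normal(12.0, 2.0, size=500)          # observed temperatures (degC)
model_temp = obs_temp + rng.normal(0.3, 0.8, 500)   # model values with bias and noise

misfit = model_temp - obs_temp
density, edges = np.histogram(misfit, bins=30, density=True)
peak = edges[np.argmax(density)]
print(f"mean misfit = {misfit.mean():.2f} degC, "
      f"spread = {misfit.std():.2f} degC, PDF peak near {peak:.2f} degC")
```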
Abstract:
The density (BSG) of bone increases at the osteon scale during lifetime aging within the bone. In addition, post-mortem diagenetic change due to microbial attack produces denser bioapatite. Thus, fractionation of finely powdered bone on the basis of density should not only enable younger and older populations of osteons to be separated but should also make it possible to separate out a less diagenetically altered component. We show that the density fractionation method can be used as a tool to investigate the isotopic history within an individual's lifetime, in both recent and archaeological contexts, and we use the bomb C-14 atmospheric pulse to validate the method.
Abstract:
A desktop tool for replay and analysis of gaze-enhanced multiparty virtual collaborative sessions is described. We linked three CAVE(TM)-like environments, creating a multiparty collaborative virtual space where avatars are animated with 3D gaze as well as head and hand motions in real time. Log files are recorded for subsequent playback and analysis using the proposed software tool. During replay, the user can rotate the viewpoint and navigate in the simulated 3D scene. The playback mechanism relies on multiple distributed log files captured at every site. This structure enables an observer to experience latencies of movement and information transfer for every site, as this is important for conversation analysis. Playback uses an event-replay algorithm, modified to allow fast traversal of the scene by selective rendering of nodes, and to simulate fast random access. The tool's analysis module can show each participant's 3D gaze points and areas where gaze has been concentrated.
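A minimal sketch of the replay idea follows: per-site log files are merged by timestamp, and a seek fast-forwards past earlier events without rendering them. The log format and seek strategy are illustrative assumptions, not the tool's actual algorithm.

```python
# Minimal sketch of replaying events merged from per-site log files, with a
# simple "random access" that fast-forwards without rendering. The log format
# and the seek strategy are illustrative assumptions.
import heapq

# One event list per site: (timestamp_seconds, site, event_description)
site_logs = [
    [(0.0, "siteA", "avatar1 gaze -> object3"), (1.2, "siteA", "avatar1 head turn")],
    [(0.4, "siteB", "avatar2 hand raise"), (1.5, "siteB", "avatar2 gaze -> avatar1")],
    [(0.9, "siteC", "avatar3 joins session")],
]

def replay(logs, seek_to=0.0):
    """Merge per-site logs by timestamp; skip events silently before seek_to,
    then 'render' (print) events from that point on."""
    for ts, site, event in heapq.merge(*logs):
        if ts < seek_to:
            continue            # fast-forward: advance state without rendering
        print(f"[{ts:5.1f}s] {site}: {event}")

replay(site_logs, seek_to=0.5)
```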
An empirical study of process-related attributes in segmented software cost-estimation relationships
Abstract:
Parametric software effort estimation models consisting of a single mathematical relationship suffer from poor adjustment and predictive characteristics in cases in which the historical database considered contains data coming from projects of a heterogeneous nature. Segmentation of the input domain according to clusters obtained from the database of historical projects serves as a tool for building more realistic models that use several local estimation relationships. Nonetheless, it may be hypothesized that using clustering algorithms without prior consideration of the influence of well-known project attributes misses the opportunity to obtain more realistic segments. In this paper, we describe the results of an empirical study using the ISBSG-8 database and the EM clustering algorithm that studies the influence of considering two process-related attributes as drivers of the clustering process: the use of engineering methodologies and the use of CASE tools. The results provide evidence that such consideration significantly conditions the final model obtained, even though the resulting predictive quality is of a similar magnitude.
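A minimal sketch of the segmentation approach, under assumed synthetic data: an EM-based Gaussian mixture clusters projects on size and a binary process attribute, and a local effort relationship is then fitted per cluster. The feature names and data are illustrative and do not reproduce the ISBSG-8 fields used in the study.

```python
# Sketch of segmented effort estimation: EM clustering of projects, then one
# local log-log effort relationship per cluster. Synthetic, illustrative data.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 200
size = rng.lognormal(4.0, 1.0, n)                      # functional size
uses_case = rng.integers(0, 2, n)                      # process attribute (0/1)
effort = size * (8 - 3 * uses_case) * rng.lognormal(0, 0.2, n)

# Cluster on size and the process-related attribute (EM algorithm).
X = np.column_stack([np.log(size), uses_case])
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(X)

# Fit a local estimation relationship within each segment.
for k in range(2):
    mask = labels == k
    model = LinearRegression().fit(np.log(size[mask]).reshape(-1, 1),
                                   np.log(effort[mask]))
    print(f"cluster {k}: n={mask.sum()}, log-log slope={model.coef_[0]:.2f}")
```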
Abstract:
The construction sector is under growing pressure to increase productivity and improve quality, most notably in reports by Latham (1994, Constructing the Team, HMSO, London) and Egan (1998, Rethinking Construction, HMSO, London). A major problem for construction companies is the lack of project predictability. One method of increasing predictability and delivering increased customer value is through the systematic management of construction processes. However, the industry has no methodological mechanism to assess process capability and prioritise process improvements. Standardized Process Improvement for Construction Enterprises (SPICE) is a research project that is attempting to develop a stepwise process improvement framework for the construction industry, utilizing experience from the software industry, and in particular the Capability Maturity Model (CMM), which has resulted in significant productivity improvements in the software industry. This paper introduces SPICE concepts and presents the results from two case studies conducted on design and build projects. These studies have provided further insight into the relevance and accuracy of the framework, as well as its value for the construction sector.
Abstract:
To investigate the perception of emotional facial expressions, researchers rely on shared sets of photos or videos, most often generated by actor portrayals. The drawback of such standardized material is a lack of flexibility and controllability, as it does not allow the systematic parametric manipulation of specific features of facial expressions on the one hand, and of more general properties of the facial identity (age, ethnicity, gender) on the other. To remedy this problem, we developed FACSGen: a novel tool that allows the creation of realistic synthetic 3D facial stimuli, both static and dynamic, based on the Facial Action Coding System. FACSGen provides researchers with total control over facial action units, and corresponding informational cues in 3D synthetic faces. We present four studies validating both the software and the general methodology of systematically generating controlled facial expression patterns for stimulus presentation.
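To illustrate the kind of parametric control described, the sketch below specifies an expression as a set of Action Unit intensities and ramps them over time. This is a hypothetical interface for illustration only, not FACSGen's actual API.

```python
# Illustrative sketch of specifying a facial expression as Action Unit (AU)
# intensities, in the spirit of the Facial Action Coding System. This is NOT
# FACSGen's API; the class, function and parameters are hypothetical.
from dataclasses import dataclass

@dataclass
class ExpressionSpec:
    action_units: dict       # AU number -> target intensity in [0, 1]
    duration_s: float = 1.0  # onset-to-apex time for a dynamic stimulus

# A parametric smile-like pattern: AU6 (cheek raiser) + AU12 (lip corner puller).
happy = ExpressionSpec(action_units={6: 0.7, 12: 0.9}, duration_s=0.8)

def interpolate(spec: ExpressionSpec, t: float) -> dict:
    """Linearly ramp each AU from neutral (0) to its target intensity by time t."""
    frac = min(max(t / spec.duration_s, 0.0), 1.0)
    return {au: frac * level for au, level in spec.action_units.items()}

print(interpolate(happy, t=0.4))
```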
Abstract:
The Virtual Lightbox for Museums and Archives (VLMA) is a tool for collecting and reusing, in a structured fashion, the online contents of museums and archive datasets. It is not restricted to datasets with visual components, although VLMA includes a lightbox service that enables comparison and manipulation of visual information. With VLMA, one can browse and search collections, construct personal collections, annotate them, export these collections to XML or Impress (Open Office) presentation format, and share collections with other VLMA users. VLMA was piloted as an e-Learning tool as part of JISC’s e-Learning focus in its first phase (2004-2005), and in its second phase (2005-2006) it has incorporated new partner collections while improving and expanding interfaces and services. This paper concerns its development as a research and teaching tool, especially for teachers using museum collections, and discusses the recent development of VLMA.
Abstract:
This paper assesses the performance of a vocabulary test designed to measure second language productive vocabulary knowledge. The test, Lex30, uses a word association task to elicit vocabulary, and uses word frequency data to measure the vocabulary produced. Here we report firstly on the reliability of the test as measured by a test-retest study, a parallel test forms experiment and an internal consistency measure. We then investigate the construct validity of the test by looking at changes in test performance over time, analyses of correlations with scores on similar tests, and comparison of spoken and written test performance. Last, we examine the theoretical bases of the two main test components: eliciting vocabulary and measuring vocabulary. Interpretations of our findings are discussed in the context of test validation research literature. We conclude that the findings reported here present a robust argument for the validity of the test as a research tool, and encourage further investigation of its validity in an instructional context.
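As a rough illustration of measuring elicited vocabulary with frequency data, the sketch below credits word-association responses that fall outside a common-word band. The word list, cutoff and scoring rule are invented for illustration and do not reproduce Lex30's published scoring scheme.

```python
# Sketch of scoring elicited vocabulary by frequency band: lower-frequency
# responses earn credit. The ranks, cutoff and rule are illustrative only.
FREQUENCY_RANK = {          # hypothetical corpus frequency ranks
    "dog": 812, "animal": 950, "bark": 3100, "kennel": 7400,
    "leash": 5200, "walk": 600,
}

def score_responses(words, common_cutoff=1000):
    """Count responses whose frequency rank falls outside the most common band.
    Unknown words are treated as infrequent here (an illustrative choice)."""
    return sum(1 for w in words if FREQUENCY_RANK.get(w, 10**6) > common_cutoff)

responses = ["dog", "bark", "kennel", "leash", "walk"]
print(score_responses(responses))  # -> 3 (bark, kennel, leash)
```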
Abstract:
The general circulation models used to simulate global climate typically feature resolution too coarse to reproduce many smaller-scale processes, which are crucial to determining the regional responses to climate change. A novel approach to downscaling climate change scenarios is presented which includes the interactions between the North Atlantic Ocean and the European shelves as well as their impact on the North Atlantic and European climate. The goal of this paper is to introduce the global ocean-regional atmosphere coupling concept and to show the potential benefits of this model system for simulating present-day climate. A global ocean-sea ice-marine biogeochemistry model (MPIOM/HAMOCC) with regionally high horizontal resolution is coupled to an atmospheric regional model (REMO) and a global terrestrial hydrology model (HD) via the OASIS coupler. Moreover, results obtained with ROM using NCEP/NCAR reanalysis and ECHAM5/MPIOM CMIP3 historical simulations as boundary conditions are presented and discussed for the North Atlantic and North European region. The validation of all the model components, i.e., ocean, atmosphere, terrestrial hydrology, and ocean biogeochemistry, is performed and discussed. The careful and detailed validation of ROM provides evidence that the proposed model system improves the simulation of many aspects of the regional climate, notably the ocean, even though some biases persist in other model components, thus leaving potential for future improvement. We conclude that ROM is a powerful tool to estimate possible impacts of climate change on the regional scale.