193 results for: Reti calcolatori, Protocolli comunicazione, Gerarchie protocolli, Software Defined Networking, Internet
Abstract:
Porosity is one of the key parameters of the macroscopic structure of porous media, generally defined as the ratio of the free space (the volume of air) within a material to the total volume of the material. It is determined by measuring the skeletal volume and the envelope volume. The solid displacement method is an inexpensive and simple way to determine the envelope volume of an irregularly shaped sample. In this method, glass beads are generally used as the displacement solid because of their uniform size, compactness and flowability. Their small size, however, means that they can enter open pores whose diameter is larger than that of the beads. Although extensive research has been carried out on porosity determination using the displacement method, no study adequately reports micro-level observation of the sample during measurement. This study set out to assess the accuracy of the solid displacement method for bulk density measurement of dried foods through micro-level observation. Solid displacement measurements were conducted using a cylindrical plastic vial and 57 µm glass beads to determine the bulk density of apple slices at different moisture contents. A scanning electron microscope (SEM), a profilometer and ImageJ software were used to investigate the penetration of glass beads into the surface pores during the porosity determination of the dried food. A helium pycnometer was used to measure the particle density of the sample. The results show that a significant number of pores were large enough to admit the glass beads, causing erroneous results. It was also found that coating the dried sample with an appropriate coating material prior to measurement can resolve this problem.
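For reference, the quantities mentioned in this abstract are related by the standard porosity identity (a textbook relation, not quoted from the paper), where the envelope volume comes from solid displacement and the skeletal volume from helium pycnometry:

```latex
% Porosity of a sample of mass m from envelope and skeletal measurements
\phi = \frac{V_{\mathrm{env}} - V_{\mathrm{skel}}}{V_{\mathrm{env}}}
     = 1 - \frac{\rho_{\mathrm{bulk}}}{\rho_{\mathrm{particle}}},
\qquad
\rho_{\mathrm{bulk}} = \frac{m}{V_{\mathrm{env}}}, \quad
\rho_{\mathrm{particle}} = \frac{m}{V_{\mathrm{skel}}}
```

The identity also makes the direction of the reported error clear: beads that penetrate surface pores reduce the measured envelope volume, which inflates the apparent bulk density and thus underestimates the porosity.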
Abstract:
PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). The approach was motivated by the need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought, so that simulations can be set up easily, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller creating ABMs. RESULTS: A description of dynamic agent composition is given in this paper, along with details of its implementation within MODAM (MODular Agent-based Model), a software framework applied to the planning of the electricity distribution network. Illustrations of the implementation of dynamic agent composition are given for that domain throughout the paper. It is, however, expected that the approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built, for users and developers as well as for agent-based modelling as a scientific approach. Developers can extend the model without needing to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix and match already implemented components to form large-scale ABMs, allowing them to set up simulations quickly and compare scenarios easily without the need to program. Dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation; verification and validation of models is also facilitated by the ability to quickly set up alternative simulations.
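A minimal sketch of the composition idea follows, using hypothetical interface and class names (MODAM's actual API is not shown in the abstract): an agent is assembled at runtime from independently developed atomic components, so new behaviour can be added without touching existing code.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical atomic unit: a piece of behaviour contributed to an agent.
interface AgentComponent {
    void step(double simTimeHours);
}

// An agent composed at runtime; its behaviour is the union of its components'.
class ComposedAgent {
    private final List<AgentComponent> components = new ArrayList<>();

    // Components are added without modifying previously written code.
    void add(AgentComponent c) { components.add(c); }

    void step(double simTimeHours) {
        for (AgentComponent c : components) c.step(simTimeHours);
    }
}

// Example atomic units for an electricity-distribution model.
class LoadProfile implements AgentComponent {
    public void step(double t) { /* draw household load for hour t */ }
}
class SolarPanel implements AgentComponent {
    public void step(double t) { /* inject solar generation for hour t */ }
}

public class ComposeDemo {
    public static void main(String[] args) {
        ComposedAgent household = new ComposedAgent();
        household.add(new LoadProfile());
        household.add(new SolarPanel()); // mix-and-match extension at setup time
        for (double t = 0; t < 24; t++) household.step(t);
    }
}
```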
Abstract:
This paper describes a software architecture for real-world robotic applications. We discuss issues of software reliability, testing and realistic off-line simulation that allows the majority of the automation system to be tested off-line in the laboratory before deployment in the field. A recent project, the automation of a very large mining machine, is used to illustrate the discussion.
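The abstract does not give implementation details, but the off-line testing it describes is commonly achieved by isolating control logic behind a hardware abstraction; the following is a hypothetical sketch of that pattern, not the paper's architecture:

```java
// Thin interface between control logic and the machine, so the same
// automation code can run against a simulator before field deployment.
interface MachineIO {
    double readSensor(String name);
    void commandActuator(String name, double value);
}

class SimulatedIO implements MachineIO {
    public double readSensor(String name) { return 0.0; /* physics model here */ }
    public void commandActuator(String name, double value) { /* update model */ }
}

class FieldIO implements MachineIO {
    public double readSensor(String name) { return 0.0; /* real bus read */ }
    public void commandActuator(String name, double value) { /* real bus write */ }
}

// The controller is exercised unchanged against SimulatedIO in the lab,
// then bound to FieldIO in the field.
class Controller {
    private final MachineIO io;
    Controller(MachineIO io) { this.io = io; }
    void controlStep() {
        double pos = io.readSensor("boom_position");
        io.commandActuator("boom_drive", -0.1 * pos); // simple proportional law
    }
}
```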
Abstract:
In 2005, Ginger Myles and Hongxia Jin proposed a software watermarking scheme based on converting jump instructions or unconditional branch statements (UBSs) into calls to a fingerprint branch function (FBF) that computes the correct target address of the UBS as a function of the generated fingerprint and an integrity check. If the program is tampered with, the fingerprint and integrity check values change and the target address is no longer computed correctly. In this paper, we present an attack that breaks the scheme by tracking stack pointer modifications, and we provide implementation details. The key element of the attack is to remove the fingerprint- and integrity-check-generating code from the program after disassociating the target address from the fingerprint and integrity value. Using debugging tools that give the attacker extensive control to track stack pointer operations, we perform both subtractive and watermark replacement attacks. The major steps in the attack are automated, resulting in a fast and low-cost attack.
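The scheme operates on native jump instructions, so it cannot be reproduced literally here; the following is only a conceptual sketch, with an invented hash and dispatch, of how a branch target can be made to depend on a fingerprint and an integrity value so that tampering diverts control flow:

```java
// Conceptual model of the FBF idea, not the original machine-code scheme.
public class FbfSketch {
    static int fingerprint = 1; // evolves as the fingerprint is generated

    // Stand-in integrity check over the protected "code".
    static int integrityCheck(int[] code) {
        int h = 17;
        for (int c : code) h = 31 * h + c;
        return h;
    }

    // The "fingerprint branch function": folds the integrity value into the
    // fingerprint and derives the branch target from both. Tampering with
    // the code changes the integrity value and, in general, the target.
    static int fbf(int integrity) {
        fingerprint = fingerprint * 31 + integrity;
        return Math.floorMod(fingerprint ^ integrity, 3);
    }

    public static void main(String[] args) {
        int[] code = {42, 7, 99};
        switch (fbf(integrityCheck(code))) { // replaces an unconditional jump
            case 0:  System.out.println("block A"); break;
            case 1:  System.out.println("block B"); break;
            default: System.out.println("block C");
        }
    }
}
```

The attack described in the abstract works at exactly this seam: once the computed target is recorded, the call can be rewired to a direct jump and the fingerprint/integrity code removed.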
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technology. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems are now largely computerised, we still undertake commissioning testing using the same philosophy, as if each signal were hard wired. This is slow and tedious, and it does not do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it "abstracts" the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data in this way, rather than by the traditional position-based method (as in DNP 3.0), is that generic tools can be created to manipulate data. Self-describing data contains the information these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interfaces between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with data within a tightly bound and compliant IEC 61850 Substation Automation System could remove much of the need for traditional point-to-point testing. Using the outcomes of an undergraduate thesis project, we demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. The tool has been implemented in Java, using an open-source IEC 61850 library to facilitate the client-server association with the relay.
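A rough sketch of that check follows, with hypothetical names and the relay-side read stubbed out (the abstract names Java and an open-source IEC 61850 library but not its API): the data set names declared in the CID/SCL file, which is XML, are compared with those reported by the device.

```java
import java.util.HashSet;
import java.util.Set;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Compare DataSet declarations in a CID file against the relay's actual
// configuration. The online read is a stub; a real tool would use an
// IEC 61850 client library to establish the association.
public class DataSetCheck {

    // Collect <DataSet name="..."> entries from the CID file (SCL is XML).
    static Set<String> dataSetsFromCid(String cidPath) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(cidPath);
        Set<String> names = new HashSet<>();
        NodeList nodes = doc.getElementsByTagName("DataSet");
        for (int i = 0; i < nodes.getLength(); i++) {
            names.add(((Element) nodes.item(i)).getAttribute("name"));
        }
        return names;
    }

    // Placeholder for the read over the client-server association.
    static Set<String> dataSetsFromRelay(String host) {
        return new HashSet<>(); // stub
    }

    public static void main(String[] args) throws Exception {
        Set<String> expected = dataSetsFromCid(args[0]);
        Set<String> actual = dataSetsFromRelay(args[1]);
        if (actual.equals(expected)) {
            System.out.println("PASS: relay data sets match the CID file");
        } else {
            System.out.println("FAIL: expected " + expected + " but found " + actual);
        }
    }
}
```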
Abstract:
Recent models of language comprehension have assumed a tight coupling between the semantic representations of action words and cortical motor areas. We combined functional MRI with cytoarchitectonically defined probabilistic maps of left hemisphere primary and premotor cortices to analyse responses of functionally delineated execution- and observation-related regions during comprehension of action word meanings associated with specific effectors (e.g., punch, bite or stomp) and processing of items with various levels of lexical information (non body part-related meanings, nonwords, and visual character strings). The comprehension of effector-specific action word meanings did not elicit preferential activity corresponding to the somatotopic organisation of effectors in either primary or premotor cortex. However, generic action word meanings did show increased BOLD signal responses compared to all other classes of lexical stimuli in the pre-SMA. As expected, the majority of the BOLD responses elicited by the lexical stimuli were in association cortex adjacent to the motor areas. We contrast our results with those of previous studies reporting significant effects for only one or two effectors outside cytoarchitectonically defined motor regions, and discuss the importance of controlling for potentially confounding lexical variables such as imageability. We conclude that there is no strong evidence for a somatotopic organisation of action word meaning representations and argue that the pre-SMA might have a role in maintaining abstract representations of action words as instructional cues.
Abstract:
Modularity has been suggested to be connected to evolvability because a higher degree of independence among parts allows them to evolve as separate units. Recently, the Escoufier RV coefficient has been proposed as a measure of the degree of integration between modules in multivariate morphometric datasets. However, it has been shown, using randomly simulated datasets, that the value of the RV coefficient depends on sample size. Moreover, so far there is no statistical test for the difference in the RV coefficient between a priori defined groups of observations. Here, we (1) use a rarefaction analysis to show that the value of the RV coefficient depends on sample size in real geometric morphometric datasets as well; (2) propose a permutation procedure to test for the difference in the RV coefficient between a priori defined groups of observations; (3) show, through simulations, that this permutation procedure has an appropriate Type I error rate; (4) suggest that a rarefaction procedure can be used to obtain sample-size-corrected values of the RV coefficient; and (5) propose a nearest-neighbour procedure for studying the variation of modularity in geographic space. The approaches outlined here, which extend readily to non-morphometric datasets, allow the study of variation in the degree of integration between a priori defined modules. A Java application with a graphical user interface that performs the proposed test has also been developed and is available at the Morphometrics at Stony Brook web page (http://life.bio.sunysb.edu/morph/).
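For context, Escoufier's RV coefficient between two blocks of mean-centred variables X and Y measured on the same observations is conventionally defined as follows (the standard definition, not quoted from this abstract); it ranges from 0 (no association between blocks, i.e. complete modularity) to 1 (complete integration):

```latex
\mathrm{RV}(X, Y) =
\frac{\operatorname{tr}\left(S_{XY}\, S_{YX}\right)}
     {\sqrt{\operatorname{tr}\left(S_{XX}^{2}\right)\,
            \operatorname{tr}\left(S_{YY}^{2}\right)}}
```

Here $S_{XY}$ is the cross-covariance matrix between the two blocks and $S_{XX}$, $S_{YY}$ are the within-block covariance matrices; the proposed permutation test presumably compares the observed between-group difference in RV against differences obtained after reshuffling observations between the a priori groups.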
Abstract:
Species identification based on short DNA marker sequences, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding datasets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. More accurate current software, however, requires sequence alignment or complex calculations, which are time-consuming for large datasets during data preprocessing or during the search stage. It is therefore imperative to develop a practical program for accurate and scalable species identification in DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is used to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller dataset generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that the platform can handle both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding in modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
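A toy sketch of such a two-stage pipeline (illustrative only; VIP Barcoding's own CV formulation and code are not given in the abstract): stage 1 screens references with an alignment-free k-mer vector similarity, and stage 2 picks the nearest neighbour among the survivors by the Kimura two-parameter (K2P) distance, assuming those survivors are already aligned to the query.

```java
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy two-stage identifier: alignment-free k-mer screen, then K2P ranking.
public class TwoStageId {

    // k-mer count vector of a sequence (stage 1 representation).
    static Map<String, Integer> kmerCounts(String seq, int k) {
        Map<String, Integer> m = new HashMap<>();
        for (int i = 0; i + k <= seq.length(); i++)
            m.merge(seq.substring(i, i + k), 1, Integer::sum);
        return m;
    }

    // Cosine similarity between two k-mer count vectors.
    static double cosine(Map<String, Integer> a, Map<String, Integer> b) {
        double dot = 0, na = 0, nb = 0;
        for (Map.Entry<String, Integer> e : a.entrySet()) {
            na += (double) e.getValue() * e.getValue();
            dot += (double) e.getValue() * b.getOrDefault(e.getKey(), 0);
        }
        for (int v : b.values()) nb += (double) v * v;
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // K2P distance between two aligned, equal-length sequences.
    static double k2p(String s1, String s2) {
        int transitions = 0, transversions = 0, sites = 0;
        for (int i = 0; i < s1.length(); i++) {
            char a = s1.charAt(i), b = s2.charAt(i);
            if (a == '-' || b == '-') continue; // skip alignment gaps
            sites++;
            if (a == b) continue;
            boolean bothPurine = "AG".indexOf(a) >= 0 && "AG".indexOf(b) >= 0;
            boolean bothPyrimidine = "CT".indexOf(a) >= 0 && "CT".indexOf(b) >= 0;
            if (bothPurine || bothPyrimidine) transitions++; // A<->G, C<->T
            else transversions++;
        }
        double P = (double) transitions / sites, Q = (double) transversions / sites;
        return -0.5 * Math.log(1 - 2 * P - Q) - 0.25 * Math.log(1 - 2 * Q);
    }

    public static void main(String[] args) {
        String query = "ACGTACGTTCAG";
        List<String> refs = List.of("ACGTACGTTCAG", "ACGTACGATCAG", "TTTTGGGGCCAA");
        Map<String, Integer> q = kmerCounts(query, 3);
        refs.stream()
            // Stage 1: keep the two references most similar to the query.
            .sorted(Comparator.comparingDouble(
                (String r) -> -cosine(q, kmerCounts(r, 3))))
            .limit(2)
            // Stage 2: nearest neighbour by K2P distance among survivors.
            .min(Comparator.comparingDouble((String r) -> k2p(query, r)))
            .ifPresent(best -> System.out.println("Best match: " + best));
    }
}
```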
Abstract:
This research explored how small and medium enterprises can achieve success with software as a service (SaaS) applications from the cloud. Based on an empirical investigation of six growth-oriented, early-technology-adopting small and medium enterprises, the study proposes a model of SaaS success for small and medium enterprises with two parts: one for basic and one for advanced benefits. The basic model explains the effective use of SaaS for achieving informational and transactional benefits. The advanced model explains the enhanced use of SaaS for achieving strategic and transformational benefits. Both models explicate the information systems capabilities and organizational complementarities needed to achieve success with SaaS.
Abstract:
Bug fixing is a highly cooperative work activity in which developers, testers, product managers and other stakeholders collaborate using a bug tracking system. In the context of Global Software Development (GSD), where software development is distributed across different geographical locations, we focus on understanding the role of bug trackers in supporting software bug fixing activities. We carried out small-scale ethnographic fieldwork in a software product team distributed between Finland and India at a multinational engineering company. Using semi-structured interviews and in-situ observations of 16 bug cases, we show that the bug tracker 1) supported the information needs of different stakeholders, 2) established common ground, and 3) reinforced issues related to ownership, performance and power. Consequently, we provide implications for design around these findings.
Abstract:
Information sharing in distance collaboration: A software engineering perspective. Factors in software engineering workgroups such as geographical dispersion and background discipline can be conceptually characterized as "distances", and they obstruct team collaboration and information sharing. This thesis focuses on information sharing across multidimensional distances and develops an information sharing distance model with six core dimensions: geography, time zone, organization, multi-discipline, heterogeneous roles, and varying project tenure. The research suggests that the effectiveness of workgroups may be improved through mindful conduct of information sharing, especially proactive consideration of, and explicit adjustment for, the distances of the recipient when sharing information.
Abstract:
This demonstration highlights the applications of our research work, i.e. the second-generation multi-agent system SAGE (Scalable Fault Tolerant Agent Grooming Environment), the integration of software agents and grid computing, and an autonomous agent architecture within the agent platform. It is a conference planner application that uses the collaborative effort of geographically distributed services, implemented in different technologies, i.e. software agents, grid computing and web services, to perform useful tasks as required.
Abstract:
Free software is viewed as a revolutionary and subversive practice, and in particular has dealt a strong blow to the traditional conception of intellectual property law (although in its current form it could be considered a 'hack' of IP rights). However, other (capitalist) areas of law have been swift to embrace free software, or at least to incorporate it into their own tenets. One such area is competition (antitrust) law, which has itself long been in theoretical conflict with intellectual property, owing to the restriction on competition inherent in the grant of 'monopoly' rights by copyrights, patents and trademarks. This contribution examines how competition law has approached free software by considering instances in which courts have had to deal with such initiatives, for instance the Oracle/Sun Microsystems merger, and the implications that these decisions have for free software initiatives. The presence or absence of corporate involvement in initiatives will be an important factor in this investigation; it is posited that true instances of 'commons-based peer production' can still subvert the capitalist system, including perplexing its laws beyond intellectual property.