897 results for "Contracts of execution"
Abstract:
This paper presents the use of a multiprocessor architecture to improve the performance of tomographic image reconstruction. Image reconstruction in computed tomography (CT) is a computationally intensive task for single-processor systems. We investigate the suitability of filtered image reconstruction on DSPs organized for parallel processing and compare it with an implementation based on the Message Passing Interface (MPI) library. The experimental results show that the speedups observed on both platforms increased with image resolution. In addition, the execution-time to communication-time ratios (Rt/Rc) as a function of sample size showed a narrower variation for the DSP platform than for the MPI platform, indicating its better suitability for parallel image reconstruction.
Abstract:
Specific choices about how to represent complex networks can have a substantial impact on the execution time required for the construction and analysis of those structures. In this work we report a comparison of the effects of representing complex networks statically by adjacency matrices or dynamically by adjacency lists. Three theoretical models of complex networks are considered: two variants of the Erdos-Renyi model and the Barabasi-Albert model. We investigate the effect of the different representations on the construction and measurement of several topological properties (degree, clustering coefficient, shortest path length, and betweenness centrality). We find that the form of representation generally has a substantial effect on execution time, with the sparse representation frequently yielding markedly superior performance. (C) 2011 Elsevier B.V. All rights reserved.
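A minimal sketch (not the paper's code) of the two representations the abstract contrasts, using an invented five-node undirected graph. The memory and access-cost trade-off is the point: the matrix always costs O(n^2) space, the adjacency list O(n + m).

```python
# Static (adjacency matrix) vs dynamic/sparse (adjacency list)
# representations of the same undirected graph. Graph is illustrative.
n = 5
edges = [(0, 1), (0, 2), (1, 2), (3, 4)]

# Static: n x n matrix -- O(n^2) memory regardless of edge density.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[u][v] = matrix[v][u] = 1

# Dynamic (sparse): one neighbour list per node -- O(n + m) memory.
adj = [[] for _ in range(n)]
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

# Degree: an O(n) row scan on the matrix vs an O(1) length on the list.
deg_matrix = [sum(row) for row in matrix]
deg_list = [len(neigh) for neigh in adj]
assert deg_matrix == deg_list == [2, 2, 2, 1, 1]
```

For sparse graphs (m much smaller than n^2), iterating over a node's neighbours via the list skips the empty matrix entries entirely, which is one source of the speedups the paper reports.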
Abstract:
The InteGrade project is a multi-university effort to build a novel grid computing middleware based on the opportunistic use of resources belonging to user workstations. The InteGrade middleware currently enables the execution of sequential, bag-of-tasks, and parallel applications that follow the BSP or MPI programming models. This article presents the lessons learned over the last five years of InteGrade's development and describes the solutions achieved in supporting robust application execution. The contributions cover the related fields of application scheduling, execution management, and fault tolerance. We present our solutions, describing their implementation principles, and evaluate them through the analysis of several experimental results. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
In this project, two broad facets in the design of a methodology for performance optimization of indexable carbide inserts were examined: physical destructive testing and software simulation. For the physical testing, statistical research techniques were used to design the methodology. A five-step method, beginning with problem definition and proceeding through system identification, statistical model formulation, data collection, and statistical analysis of results, was elaborated upon in depth. The set-up and execution of an experiment with a compression machine were examined, together with roadblocks to quality data collection and possible solutions to them. The 2^k factorial design was illustrated and recommended for process improvement. Instances of first-order and second-order response surface analyses were encountered. Where curvature was present, a test for curvature significance with center-point analysis was recommended. Process optimization with the method of steepest ascent and central composite design, or process robustness studies based on response surface analysis, were also recommended. For the simulation test, the AdvantEdge program was identified as the most widely used software for tool development. Challenges to the efficient application of this software were identified and possible solutions proposed. In conclusion, both software simulation and physical testing were recommended to meet the objectives of the project.
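As a sketch of the 2^k factorial analysis the abstract recommends, the snippet below computes main effects from a hypothetical 2^2 design; the factor names (cutting speed A, feed rate B) and response values are invented for illustration, not taken from the thesis.

```python
# Hypothetical 2^2 factorial design: two factors at coded levels -1/+1,
# one response per run. A main effect is the average response at the
# high level minus the average at the low level.
runs = [  # (A, B, response)
    (-1, -1, 60.0),
    (+1, -1, 72.0),
    (-1, +1, 54.0),
    (+1, +1, 68.0),
]

def main_effect(runs, factor):
    """factor: 0 for A, 1 for B."""
    hi = [y for *x, y in runs if x[factor] == +1]
    lo = [y for *x, y in runs if x[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effect_A = main_effect(runs, 0)  # (72+68)/2 - (60+54)/2 = 13.0
effect_B = main_effect(runs, 1)  # (54+68)/2 - (60+72)/2 = -5.0
```

With k factors the full design has 2^k runs; adding replicated center points, as the abstract suggests, allows a test for curvature before moving to steepest ascent or a central composite design.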
Abstract:
Recent developments in biological research have shown that the initial maximum permissible exposure (MPE) limits for protecting workers from risks associated with artificial optical radiation were more stringent than needed. Using the most recent MPE limits for artificial optical radiation, this work focused on investigating the level of visible-light attenuation needed by automatic welding filters in the case of switching failure. Results from a comparison of different exposure standards were used to investigate the need for Vis/IR and blue-light transmittance requirements for automatic welding filters. Real and arbitrary spectra were considered for the worst- and best-case scenarios of artificial optical radiation. An Excel worksheet developed during the execution of this project took into account the exposure from different light sources and the precision of the spectrometer used to measure the transmittances of a welding filter. The worksheet was developed and tested with known product properties to verify the validity of its formulation. The conclusion drawn from this project was that attenuation in the light state will be needed for products with a darkest-state shade of 11 or higher. It was also shown that current welding filters protect the eye well enough even in the case of switching failure.
Abstract:
E-Science experiments typically involve many distributed services maintained by different organisations. After an experiment has been executed, it is useful for a scientist to verify that the execution was performed correctly or is compatible with some existing experimental criteria or standards, not necessarily anticipated prior to execution. Scientists may also want to review and verify experiments performed by their colleagues. There are no existing frameworks for validating such experiments in today's e-Science systems. Users therefore have to rely on error checking performed by the services, or adopt other ad hoc methods. This paper introduces a platform-independent framework for validating workflow executions. The validation relies on reasoning over the documented provenance of experiment results and semantic descriptions of services advertised in a registry. This validation process ensures experiments are performed correctly, and thus results generated are meaningful. The framework is tested in a bioinformatics application that performs protein compressibility analysis.
Abstract:
Electronic contracts are a means of representing agreed responsibilities and expected behaviour of autonomous agents acting on behalf of businesses. They can be used to regulate behaviour by providing negative consequences, penalties, where the responsibilities and expectations are not met, i.e. the contract is violated. However, long-term business relationships require some flexibility in the face of circumstances that do not conform to the assumptions of the contract, that is, mitigating circumstances. In this paper, we describe how contract parties can represent and enact policies on mitigating circumstances. As part of this, we require records of what has occurred within the system leading up to a violation: the provenance of the violation. We therefore bring together contract-based and provenance systems to solve the issue of mitigating circumstances.
Abstract:
Mirroring the paper versions exchanged between businesses today, electronic contracts offer the possibility of dynamic, automatic creation and enforcement of restrictions and compulsions on agent behaviour that are designed to ensure business objectives are met. However, where there are many contracts within a particular application, it can be difficult to determine whether the system can reliably fulfil them all; computer-parsable electronic contracts may allow such verification to be automated. In this paper, we describe a conceptual framework and architecture specification in which normative business contracts can be electronically represented, verified, established, renewed, etc. In particular, we aim to allow systems containing multiple contracts to be checked for conflicts and violations of business objectives. We illustrate the framework and architecture with an aerospace example.
Abstract:
Electronic contracts mirror the paper versions exchanged between businesses today, and offer the possibility of dynamic, automatic creation and enforcement of restrictions and compulsions on service behaviour that are designed to ensure business objectives are met. Where there are many contracts within a particular application, it can be difficult to determine whether the system can reliably fulfil them all, yet computer-parsable electronic contracts may allow such verification to be automated. In this chapter, we describe a conceptual framework and architecture specification in which normative business contracts can be electronically represented, verified, established, renewed, and so on. In particular, we aim to allow systems containing multiple contracts to be checked for conflicts and violations of business objectives. We illustrate the framework and architecture with an aerospace aftermarket example.
Abstract:
Before signing electronic contracts, a rational agent should estimate the expected utilities of these contracts and calculate the violation risks related to them. In order to perform such pre-signing procedures, this agent has to be capable of computing a policy taking into account the norms and sanctions in the contracts. In relation to this, the contribution of this work is threefold. First, we present the Normative Markov Decision Process, an extension of the Markov Decision Process for explicitly representing norms. In order to illustrate the usage of our framework, we model an example in a simulated aerospace aftermarket. Second, we specify an algorithm for identifying the states of the process which characterize the violation of norms. Finally, we show how to compute policies with our framework and how to calculate the risk of violating the norms in the contracts by adopting a particular policy.
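A toy sketch of the last idea in the abstract: computing the risk of norm violation under a fixed policy. The states, actions, and transition probabilities below are invented for illustration and do not reproduce the paper's Normative Markov Decision Process formalism; the sketch simply propagates a state distribution forward and reads off the probability mass in an absorbing "violate" state.

```python
# P[state][action] -> {next_state: probability}. "violate" marks states
# characterizing the violation of a contract norm; it is absorbing here.
# All numbers are illustrative.
P = {
    "ok":      {"safe": {"ok": 0.9, "late": 0.1},
                "fast": {"ok": 0.6, "late": 0.4}},
    "late":    {"safe": {"violate": 1.0},
                "fast": {"ok": 0.5, "violate": 0.5}},
    "violate": {"safe": {"violate": 1.0},
                "fast": {"violate": 1.0}},
}

def violation_risk(policy, start="ok", horizon=10):
    """Probability of having violated the norm within `horizon` steps
    when following the given deterministic policy (state -> action)."""
    dist = {s: 0.0 for s in P}
    dist[start] = 1.0
    for _ in range(horizon):
        nxt = {s: 0.0 for s in P}
        for s, p in dist.items():
            for s2, q in P[s][policy[s]].items():
                nxt[s2] += p * q
        dist = nxt
    return dist["violate"]

cautious = {"ok": "safe", "late": "fast", "violate": "safe"}
risky    = {"ok": "fast", "late": "safe", "violate": "safe"}
assert violation_risk(cautious) < violation_risk(risky)
```

An agent comparing contracts before signing could, in this spirit, weigh each contract's expected utility under its best policy against the violation risk that policy incurs.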
Abstract:
The current system of controlling oil spills involves a complex relationship of international, federal and state law, which has not proven to be very effective. The multiple layers of regulation often leave shipowners unsure of the laws facing them. Furthermore, nations have had difficulty enforcing these legal requirements. This thesis deals with the role marine insurance can play within the existing system of legislation to provide a strong preventative influence that is simple and cost-effective to enforce. In principle, insurance has two ways of enforcing higher safety standards and limiting the risk of an accident occurring. The first is through insurance premiums based on the level of care taken by the insured: a person engaging in riskier behavior faces a higher premium, because their actions increase the probability of an accident occurring. The second method, available to the insurer, is known collectively as cancellation provisions or underwriting clauses. These are clauses written into an insurance contract that invalidate the agreement when certain conditions are not met by the insured. The problem has been that obtaining information about the behavior of an insured party requires monitoring, which incurs a cost to the insurer. The application of these principles proves to be a more complicated matter. The modern marine insurance industry is a complicated system of multiple contracts, through different insurers, covering the many facets of oil transportation. Its business practices have resulted in policy packages that cross the neat bounds of individual, specific insurance coverage. This paper shows that insurance can improve safety standards in three general areas: crew training, hull and equipment construction and maintenance, and routing schemes and exclusionary zones. For crew, hull, and equipment, underwriting clauses can be used to ensure that minimum standards are met by the insured.
Premiums can then be structured to reflect the additional care taken by the insured above and beyond these minimum standards. Routing schemes are traffic flow systems applied to congested waterways, such as the entrance to New York harbor. Using natural obstacles or manmade dividers, ships are separated into two lanes of opposing traffic, similar to a road. Exclusionary zones are marine areas designated off limits to tanker traffic either because of a sensitive ecosystem or because local knowledge is required of the region to ensure safe navigation. Underwriting clauses can be used to nullify an insurance contract when a tanker is not in compliance with established exclusionary zones or routing schemes.