862 results for Deadlock Analysis, Distributed Systems, Concurrent Systems, Formal Languages
Abstract:
Traditionally, geostatistical algorithms are contained within specialist GIS and spatial statistics software. Such packages are often expensive, with relatively complex user interfaces and steep learning curves, and cannot be easily integrated into more complex process chains. In contrast, Service Oriented Architectures (SOAs) promote interoperability and loose coupling within distributed systems, typically using XML (eXtensible Markup Language) and Web services. Web services provide a mechanism for a user to discover and consume a particular process, often as part of a larger process chain, with minimal knowledge of how it works. Wrapping current geostatistical algorithms with a Web service layer would thus increase their accessibility, but raises several complex issues. This paper discusses a solution to providing interoperable, automatic geostatistical processing through the use of Web services, developed in the INTAMAP project (INTeroperability and Automated MAPping). The project builds upon Open Geospatial Consortium standards for describing observations, typically used within sensor webs, and employs Geography Markup Language (GML) to describe the spatial aspect of the problem domain. Thus the interpolation service is extremely flexible, being able to support a range of observation types, and can cope with issues such as change of support and differing error characteristics of sensors (by utilising descriptions of the observation process provided by SensorML). XML is accepted as the de facto standard for describing Web services, due to its expressive capabilities which allow automatic discovery and consumption by ‘naive’ users. Any XML schema employed must therefore be capable of describing every aspect of a service and its processes. However, no schema currently exists that can define the complex uncertainties and modelling choices that are often present within geostatistical analysis. We show a solution to this problem, developing a family of XML schemata to enable the description of a full range of uncertainty types. These types will range from simple statistics, such as the kriging mean and variances, through to a range of probability distributions and non-parametric models, such as realisations from a conditional simulation. By employing these schemata within a Web Processing Service (WPS) we show a prototype moving towards a truly interoperable geostatistical software architecture.
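As a rough illustration of the kind of uncertainty description such a schema instance might carry, here is a minimal Python sketch that emits a kriging mean and variance as XML; the element names are hypothetical, not the actual INTAMAP/WPS schema.

```python
# Minimal sketch: serialize interpolation uncertainty (kriging mean and
# variance) as XML, as a WPS response might carry it. Element names are
# hypothetical, not the actual INTAMAP schema.
import xml.etree.ElementTree as ET

def uncertainty_xml(mean, variance):
    root = ET.Element("GaussianDistribution")
    ET.SubElement(root, "mean").text = str(mean)
    ET.SubElement(root, "variance").text = str(variance)
    return ET.tostring(root, encoding="unicode")

print(uncertainty_xml(12.7, 3.4))
# <GaussianDistribution><mean>12.7</mean><variance>3.4</variance></GaussianDistribution>
```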
Abstract:
Constructing and executing distributed systems that can adapt to their operating context in order to sustain the provided services and their qualities are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent, and sometimes tangled with other services. Furthermore, the exponential growth in the number of potential system configurations, derived from the variabilities of each service, needs to be handled. The current practice of writing low-level reconfiguration scripts as part of the system code to handle runtime adaptation is both error prone and time consuming, and makes adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of constructing and executing adaptive systems, and to handle the exponential growth in the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants, and limit the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
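A minimal sketch of the models-at-runtime idea described above, assuming a simple dictionary representation of configurations; the component names and action vocabulary are hypothetical, not the paper's notation.

```python
# Minimal sketch: generate adaptation logic by diffing a runtime model
# of the current configuration against a target composed model.
# Component names and the action vocabulary are hypothetical.

def diff_configurations(current, target):
    """Each configuration maps component name -> selected variant.
    Returns a list of reconfiguration actions to reach the target."""
    actions = []
    for comp in current.keys() - target.keys():
        actions.append(("remove", comp, current[comp]))
    for comp in target.keys() - current.keys():
        actions.append(("deploy", comp, target[comp]))
    for comp in current.keys() & target.keys():
        if current[comp] != target[comp]:
            actions.append(("replace", comp, current[comp], target[comp]))
    return actions

current = {"cache": "in-memory", "logger": "verbose", "codec": "h264"}
target  = {"cache": "distributed", "codec": "h264", "monitor": "basic"}
for action in diff_configurations(current, target):
    print(action)  # e.g. ('replace', 'cache', 'in-memory', 'distributed')
```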
Abstract:
* This work was partially supported by grant No 24-7/05 of the National Academy of Sciences of Ukraine for the support of scientific research by young scientists, "Development of a Desktop Grid system and optimization of its performance" ("Розробка Desktop Grid-системи і оптимізація її продуктивності").
Abstract:
We propose and experimentally demonstrate a new method to extend the range of Brillouin optical time domain analysis (BOTDA) systems. It exploits the virtual transparency created by second-order Raman pumping in optical fibers. The idea is theoretically analyzed and experimentally demonstrated in a 50 km fiber. By working close to transparency, we also show that the measurement length of the BOTDA can be increased up to 100 km with 2 meter resolution. We envisage extensions of this technique to measurement lengths well beyond this value, as long as the issue of relative intensity noise (RIN) of the primary Raman pump can be avoided. © 2010 Optical Society of America.
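As a back-of-the-envelope illustration of the virtual-transparency idea (illustrative numbers, not the paper's model): when the distributed Raman gain exactly offsets fiber attenuation, the net signal level stays flat along the fiber.

```python
# Illustrative calculation: distributed Raman gain that exactly offsets
# fiber attenuation keeps the signal level flat ("virtual transparency").
# Loss figure and span length are assumed, typical values.
alpha_db_km = 0.2                 # typical fiber loss near 1550 nm (assumed)
length_km = 100.0

def net_level_db(z_km, raman_gain_db_km):
    """Signal level in dB relative to the input after z km."""
    return (raman_gain_db_km - alpha_db_km) * z_km

print(net_level_db(length_km, 0.0))          # -20.0 dB without Raman assistance
print(net_level_db(length_km, alpha_db_km))  #   0.0 dB at virtual transparency
```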
Abstract:
The reverse time migration (RTM) algorithm has been widely used in the seismic industry to generate images of the underground and thus reduce the risk of oil and gas exploration. Its widespread use is due to the high quality of its underground imaging, but RTM is also known for its high computational cost, so parallel computing techniques have been used in its implementations. In general, parallel approaches to RTM use coarse granularity, distributing the processing of subsets of seismic shots among the nodes of a distributed system. Coarse-grained parallel approaches to RTM have been shown to be very efficient, since each seismic shot can be processed independently. For this reason, RTM performance can be considerably improved by using a finer-grained parallel approach for the processing assigned to each node. This work presents an efficient parallel algorithm for 3D reverse time migration with fine granularity using OpenMP. The propagation of the 3D acoustic wave makes up much of the RTM, so different load-balancing strategies were analyzed in order to minimize possible losses of parallel performance at this stage. The results served as a basis for the implementation of the other RTM phases: backpropagation and the imaging condition. The proposed algorithm was tested with synthetic data representing some of the possible underground structures. Metrics such as speedup and efficiency were used to analyze its parallel performance. The migrated sections show that the algorithm performed satisfactorily in identifying subsurface structures, and the parallel performance analysis clearly demonstrates the scalability of the algorithm, achieving a speedup of 22.46 for the propagation of the wave and 16.95 for the full RTM, both with 24 threads.
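For context, here is a minimal NumPy sketch (not the authors' code) of one explicit finite-difference time step for the 3D acoustic wave equation; loop nests of this shape are what the fine-grained OpenMP parallelization targets. Grid size, velocity model, and step sizes are illustrative assumptions.

```python
import numpy as np

def step_wavefield(p_prev, p_curr, vel, dt, dx):
    """One explicit step of the 3D acoustic wave equation using a
    2nd-order-in-time, 2nd-order-in-space 7-point stencil."""
    lap = np.zeros_like(p_curr)
    c = p_curr
    lap[1:-1, 1:-1, 1:-1] = (
        c[2:, 1:-1, 1:-1] + c[:-2, 1:-1, 1:-1] +
        c[1:-1, 2:, 1:-1] + c[1:-1, :-2, 1:-1] +
        c[1:-1, 1:-1, 2:] + c[1:-1, 1:-1, :-2] -
        6.0 * c[1:-1, 1:-1, 1:-1]
    ) / dx**2
    # p(t+dt) = 2 p(t) - p(t-dt) + (v*dt)^2 * Laplacian(p)
    return 2.0 * p_curr - p_prev + (vel * dt) ** 2 * lap

# Illustrative toy grid: 101^3 points, homogeneous velocity model.
n, dx, dt = 101, 10.0, 1e-3        # satisfies the CFL condition: v*dt/dx = 0.2
vel = np.full((n, n, n), 2000.0)   # m/s
p_prev = np.zeros((n, n, n))
p_curr = np.zeros((n, n, n))
p_curr[n // 2, n // 2, n // 2] = 1.0   # point-source initial impulse
for _ in range(100):
    p_prev, p_curr = p_curr, step_wavefield(p_prev, p_curr, vel, dt, dx)
```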
Abstract:
Secure Access For Everyone (SAFE) is an integrated system for managing trust using a logic-based declarative language. Logical trust systems authorize each request by constructing a proof from a context: a set of authenticated logic statements representing credentials and policies issued by various principals in a networked system. A key barrier to practical use of logical trust systems is the problem of managing proof contexts: identifying, validating, and assembling the credentials and policies that are relevant to each trust decision.

SAFE addresses this challenge by (i) proposing a distributed authenticated data repository for storing credentials and policies, and (ii) introducing a programmable credential discovery and assembly layer that generates the appropriate tailored context for a given request. The authenticated data repository is built upon a scalable key-value store whose contents are named by secure identifiers and certified by the issuing principal. The SAFE language provides scripting primitives to generate and organize logic sets representing credentials and policies, materialize the logic sets as certificates, and link them to reflect delegation patterns in the application. The authorizer fetches the logic sets on demand, then validates and caches them locally for further use. Upon each request, the authorizer constructs the tailored proof context and provides it to the SAFE inference engine for certified validation. Delegation-driven credential linking with certified data distribution provides flexible and dynamic policy control, enabling the security and trust infrastructure to be agile while addressing the perennial problems of today's certificate infrastructure: automated credential discovery, scalable revocation, and issuing credentials without relying on a centralized authority.

We envision SAFE as a new foundation for building secure network systems. We used SAFE to build secure services based on case studies drawn from practice: (i) a secure name-service resolver, similar to DNS, that resolves a name across multi-domain federated systems; (ii) a secure proxy shim that delegates access-control decisions in a key-value store; (iii) an authorization module for a networked infrastructure-as-a-service system with a federated trust structure (the NSF GENI initiative); and (iv) a secure cooperative data analytics service that adheres to individual secrecy constraints while disclosing data. We present an empirical evaluation based on these case studies and demonstrate that SAFE supports a wide range of applications with low overhead.
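A minimal sketch of the context-assembly idea, assuming a dictionary-backed key-value store and placeholder signature checking; identifiers and field names are hypothetical, not SAFE's actual representation.

```python
# Minimal sketch of SAFE-style context assembly: logic sets are fetched
# by secure identifier and followed through their delegation links to
# build a tailored proof context. Store layout and names are hypothetical.

def assemble_context(store, root_id):
    """Transitively fetch linked logic sets, validating each once."""
    context, pending, seen = [], [root_id], set()
    while pending:
        set_id = pending.pop()
        if set_id in seen:
            continue
        seen.add(set_id)
        logic_set = store[set_id]           # fetch by secure identifier
        assert verify_signature(logic_set)  # certified by issuing principal
        context.extend(logic_set["statements"])
        pending.extend(logic_set.get("links", []))
    return context

def verify_signature(logic_set):
    # Placeholder: a real implementation checks the issuer's signature.
    return True

store = {
    "hash:alice/policy": {"statements": ["alice: grants(bob, read)"],
                          "links": ["hash:bob/creds"]},
    "hash:bob/creds":    {"statements": ["ca: speaksFor(bob, key42)"],
                          "links": []},
}
print(assemble_context(store, "hash:alice/policy"))
```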
Abstract:
In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly, and so on. There is a tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real-time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods. I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
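A minimal sketch of the hybrid replication idea from the first part, assuming simple per-node read/write counters and illustrative thresholds; this is not the dissertation's actual policy.

```python
# Minimal sketch: per-node read/write frequencies decide whether a
# node's data is replicated eagerly, lazily, or not at all.
# Thresholds and names are illustrative assumptions.

class ReplicationPolicy:
    def __init__(self, replicate_ratio=1.0, eager_ratio=5.0):
        self.reads, self.writes = {}, {}
        self.replicate_ratio = replicate_ratio
        self.eager_ratio = eager_ratio

    def record_read(self, node):
        self.reads[node] = self.reads.get(node, 0) + 1

    def record_write(self, node):
        self.writes[node] = self.writes.get(node, 0) + 1

    def decide(self, node):
        r = self.reads.get(node, 0)
        w = max(self.writes.get(node, 0), 1)
        if r / w >= self.eager_ratio:
            return "eager"      # read-heavy: push updates immediately
        if r / w >= self.replicate_ratio:
            return "lazy"       # replicate, but refresh on demand
        return "none"           # write-heavy: remote reads are cheaper

policy = ReplicationPolicy()
for _ in range(12):
    policy.record_read("u42")
policy.record_write("u42")
print(policy.decide("u42"))     # 'eager'
```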
Abstract:
The challenge of detecting a change in the distribution of data is a sequential decision problem that is relevant to many engineering solutions, including quality control and machine and process monitoring. This dissertation develops techniques for exact solution of change-detection problems with discrete time and discrete observations. Change-detection problems are classified as Bayes or minimax based on the availability of information on the change-time distribution. A Bayes optimal solution uses prior information about the distribution of the change time to minimize the expected cost, whereas a minimax optimal solution minimizes the cost under the worst-case change-time distribution. Both types of problems are addressed. The most important result of the dissertation is the development of a polynomial-time algorithm for the solution of important classes of Markov Bayes change-detection problems. Existing techniques for epsilon-exact solution of partially observable Markov decision processes have complexity exponential in the number of observation symbols. A new algorithm, called constellation induction, exploits the concavity and Lipschitz continuity of the value function, and has complexity polynomial in the number of observation symbols. It is shown that change-detection problems with a geometric change-time distribution and identically- and independently-distributed observations before and after the change are solvable in polynomial time. Also, change-detection problems on hidden Markov models with a fixed number of recurrent states are solvable in polynomial time. A detailed implementation and analysis of the constellation-induction algorithm are provided. Exact solution methods are also established for several types of minimax change-detection problems. Finite-horizon problems with arbitrary observation distributions are modeled as extensive-form games and solved using linear programs. Infinite-horizon problems with linear penalty for detection delay and identically- and independently-distributed observations can be solved in polynomial time via epsilon-optimal parameterization of a cumulative-sum procedure. Finally, the properties of policies for change-detection problems are described and analyzed. Simple classes of formal languages are shown to be sufficient for epsilon-exact solution of change-detection problems, and methods for finding minimally sized policy representations are described.
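For context, a minimal sketch of a cumulative-sum procedure of the kind referenced above, with assumed Bernoulli pre- and post-change distributions and an illustrative threshold.

```python
# Minimal CUSUM sketch: accumulate the log-likelihood ratio of
# post-change vs. pre-change distributions, resetting at zero, and
# raise an alarm when the statistic crosses a threshold.
import math

def cusum(observations, log_lr, threshold):
    """log_lr(x) = log p_post(x) - log p_pre(x); returns alarm index or None."""
    s = 0.0
    for t, x in enumerate(observations):
        s = max(0.0, s + log_lr(x))   # reset at zero, accumulate evidence
        if s >= threshold:
            return t                  # declare a change at time t
    return None

# Example: Bernoulli observations, success rate 0.2 before, 0.8 after.
def log_lr(x):
    p0, p1 = 0.2, 0.8
    return math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))

data = [0, 0, 1, 0, 0, 0, 1, 1, 1, 1, 1]   # change occurs around t = 6
print(cusum(data, log_lr, threshold=3.0))  # alarm at t = 8
```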
Abstract:
The PROSPER (Proof and Specification Assisted Design Environments) project advocates the use of toolkits which allow existing verification tools to be adapted to a more flexible format so that they may be treated as components. A system incorporating such tools becomes another component that can be embedded in an application. This paper describes the PROSPER Toolkit which enables this. The nature of communication between components is specified in a language-independent way. It is implemented in several common programming languages to allow a wide variety of tools to have access to the toolkit.
Abstract:
To analyze the characteristics and predict the dynamic behaviors of complex systems over time, comprehensive research to enable the development of systems that can intelligently adapt to evolving conditions and infer new knowledge with algorithms that are not predesigned is crucially needed. This dissertation research studies the integration of techniques and methodologies resulting from the fields of pattern recognition, intelligent agents, artificial immune systems, and distributed computing platforms, to create technologies that can more accurately describe and control the dynamics of real-world complex systems. The need for such technologies is emerging in manufacturing, transportation, hazard mitigation, weather and climate prediction, homeland security, and emergency response. Motivated by the ability of mobile agents to dynamically incorporate additional computational and control algorithms into executing applications, mobile agent technology is employed in this research for adaptive sensing and monitoring in a wireless sensor network. Mobile agents are software components that can travel from one computing platform to another in a network and carry the programs and data states needed to perform their assigned tasks. To support the generation, migration, communication, and management of mobile monitoring agents, an embeddable mobile agent system (Mobile-C) is integrated with sensor nodes. Mobile monitoring agents visit distributed sensor nodes, read real-time sensor data, and perform anomaly detection using the equipped pattern recognition algorithms. The optimal control of agents is achieved by mimicking the adaptive immune response and applying multi-objective optimization algorithms. The mobile agent approach has the potential to reduce the communication load and energy consumption in monitoring networks. The major research work of this dissertation project includes: (1) studying effective feature extraction methods for time series measurement data; (2) investigating the impact of the feature extraction methods and dissimilarity measures on the performance of pattern recognition; (3) researching the effects of environmental factors on the performance of pattern recognition; (4) integrating an embeddable mobile agent system with wireless sensor nodes; (5) optimizing agent generation and distribution using artificial immune system concepts and multi-objective algorithms; (6) applying mobile agent technology and pattern recognition algorithms to adaptive structural health monitoring and driving cycle pattern recognition; (7) developing a web-based monitoring network to enable the remote visualization and analysis of real-time sensor data. Techniques and algorithms developed in this dissertation project will contribute to research advances in networked distributed systems operating under changing environments.
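A minimal sketch of the monitoring pattern described above, with an assumed (mean, standard deviation) feature and Euclidean dissimilarity; the dissertation evaluates several such feature and measure choices, so treat these as placeholders.

```python
# Minimal sketch: a monitoring agent "visits" sensor nodes, extracts a
# simple feature from each node's time series, and flags anomalies by
# dissimilarity to a learned reference pattern. Feature, measure, and
# threshold are illustrative assumptions.
import statistics

def visit(node_readings, reference, threshold):
    """Return True if the node's feature vector deviates from the reference."""
    feature = (statistics.mean(node_readings), statistics.stdev(node_readings))
    dissimilarity = sum((a - b) ** 2 for a, b in zip(feature, reference)) ** 0.5
    return dissimilarity > threshold

reference = (20.0, 1.0)   # learned normal (mean, std) pattern
nodes = {
    "node-1": [19.8, 20.1, 20.3, 19.9, 20.0],
    "node-2": [25.2, 31.0, 18.4, 27.9, 22.5],  # erratic readings
}
for name, readings in nodes.items():
    print(name, "anomalous" if visit(readings, reference, 2.0) else "normal")
```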
Abstract:
Purpose – The purpose of this paper is to propose a theoretical framework, based on contemporary philosophical aesthetics, from which principled assessments of the aesthetic value of information organization frameworks may be conducted. Design/methodology/approach – This paper identifies appropriate discourses within the field of philosophical aesthetics, and constructs from them a framework for assessing aesthetic properties of information organization frameworks. This framework is then applied in two case studies examining the Library of Congress Subject Headings (LCSH) and Sexual Nomenclature: A Thesaurus. Findings – In both information organization frameworks studied, the aesthetic analysis was useful in identifying judgments of the frameworks as aesthetic judgments, in promoting discovery of further areas of aesthetic judgments, and in prompting reflection on the nature of these aesthetic judgments. Research limitations/implications – This study provides proof-of-concept for the aesthetic evaluation of information organization frameworks. Areas of future research are identified as the role of cultural relativism in such aesthetic evaluation and the identification of appropriate aesthetic properties of information organization frameworks. Practical implications – By identifying a subset of judgments of information organization frameworks as aesthetic judgments, aesthetic evaluation of such frameworks can be made explicit and principled. Aesthetic judgments can be separated from questions of economic feasibility, functional requirements, and user-orientation. Design and maintenance of information organization frameworks can be based on these principles. Originality/value – This study introduces a new evaluative axis for information organization frameworks based on philosophical aesthetics. By improving the evaluation of such novel frameworks, design and maintenance can be guided by these principles. Keywords: Evaluation, Research methods, Analysis, Bibliographic systems, Indexes, Retrieval languages
Abstract:
Exact analytical solutions of the critical Rayleigh numbers have been obtained for a hydrothermal system consisting of a horizontal porous layer with temperature-dependent viscosity. The boundary conditions considered are constant temperature and zero vertical Darcy velocity at both the top and bottom of the layer. Not only can the derived analytical solutions be readily used to examine the effect of the temperature-dependent viscosity on the temperature-gradient driven convective flow, but they can also be used to validate numerical methods, such as the finite-element method and finite-difference method, for dealing with the same kind of problem. The related analytical and numerical results demonstrated that the temperature-dependent viscosity destabilizes the temperature-gradient driven convective flow and, therefore, may affect ore body formation and mineralization in the upper crust of the Earth. Copyright (C) 2003 John Wiley & Sons, Ltd.
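For reference, the constant-viscosity benchmark that these solutions generalise is the classical Horton-Rogers-Lapwood problem; with impermeable, isothermal top and bottom boundaries the critical Darcy-Rayleigh number is the well-known value below (notation assumed here, not taken from the paper).

```latex
% Darcy-Rayleigh number of a horizontal porous layer and its classical
% critical value for impermeable, isothermal boundaries. Notation assumed:
% K permeability, H layer thickness, \kappa thermal diffusivity.
\[
  Ra = \frac{\rho_0 \, g \, \beta \, \Delta T \, K H}{\mu \kappa},
  \qquad
  Ra_c = 4\pi^2 .
\]
```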
Abstract:
Consumers worldwide are increasingly concerned with sustainable production and consumption. Recently, a comprehensive study ranked 17 countries in regard to environmentally friendly behaviour among consumers, and Brazil was one of the top countries in the list. Yet several studies highlight significant differences between consumers' intentions to consume ethically and their actual purchase behaviour: the so-called 'Attitude-Behaviour Gap'. In developing countries, few studies have been conducted on this issue. The objective of this study is therefore to investigate the gap between citizens' sustainability-related attitudes and food purchasing behaviour using empirical data from Brazil. To this end, Brazilian citizens' attitudes towards pig production systems were mapped through conjoint analysis, and their coexistence with relevant pork product-related purchasing behaviour of consumers was investigated through cluster analysis. The conjoint experiment was carried out with empirical data collected from 475 respondents surveyed in the South and Center-West regions of Brazil. The results of the conjoint analysis were used for a subsequent cluster analysis in order to identify clusters of Brazilian citizens with diversified attitudes towards pig production systems, using socio-demographics, attitudes towards sustainability-related themes that are expected to influence the way they evaluate pig production systems, and consumption frequency of various pork products as clusters' background information. Three clusters were identified: 'indifferent', 'environmentally conscious' and 'sustainability-oriented' citizens. Although attitudes towards environment and nature did have an influence on citizens' specific attitudes towards pig farming at the cluster level, the relationship between 'citizenship' and consumption behaviour was found to be weak. This finding is similar to previous research conducted with European consumers: what people (in their role as citizens) think about pig production systems does not appear to significantly influence their pork consumption choices. Improvements in the integrated management of this chain would better meet consumers' sustainability-related expectations towards pig production systems.
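A minimal sketch of the second stage of such an analysis: clustering respondents' conjoint part-worth utilities with k-means. The data, attribute names, and number of clusters are illustrative assumptions, not the study's actual inputs (scikit-learn API).

```python
# Minimal sketch: group respondents by their conjoint part-worth
# utilities using k-means, mirroring the conjoint-then-cluster pipeline.
# Data, attribute names, and k are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows: respondents; columns: part-worth utilities for pig-production
# attributes, e.g. (animal welfare, environmental impact, price).
part_worths = rng.normal(size=(475, 3))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(part_worths)
for label in range(3):
    members = part_worths[kmeans.labels_ == label]
    print(f"cluster {label}: n={len(members)}, "
          f"mean utilities={members.mean(axis=0).round(2)}")
```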
Abstract:
The popular Newmark algorithm, used for implicit direct integration of structural dynamics, is extended by means of a nodal partition to permit the use of different timesteps in different regions of a structural model. The algorithm developed has as a special case an explicit-explicit subcycling algorithm previously reported by Belytschko, Yen and Mullen. That algorithm has been shown, in the absence of damping or other energy dissipation, to exhibit instability over narrow timestep ranges that become narrower as the number of degrees of freedom increases, making them unlikely to be encountered in practice. The present algorithm avoids such instabilities in the case of a one-to-two timestep ratio (two subcycles), achieving unconditional stability in an exponential sense for a linear problem. However, with three or more subcycles, the trapezoidal rule exhibits stability that becomes conditional, falling towards that of the central difference method as the number of subcycles increases. Instabilities over narrow timestep ranges, which become narrower as the model size increases, also appear with three or more subcycles. However, by moving the partition between timesteps one row of elements into the region suitable for integration with the larger timestep, these unstable timestep ranges become extremely narrow, even in simple systems with a few degrees of freedom. Accuracy is also improved. Using a version of the Newmark algorithm that dissipates high frequencies minimises or eliminates these narrow bands of instability. Viscous damping is also shown to remove these instabilities, at the expense of having more effect on the low-frequency response.
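For context, a minimal sketch of the standard single-timestep Newmark update that the subcycling algorithm partitions, in scalar (single-degree-of-freedom) form; beta = 1/4, gamma = 1/2 gives the trapezoidal rule discussed above.

```python
# Minimal sketch of the Newmark update for m*a + c*v + k*u = f.
# Scalar form for brevity; the multi-DOF version replaces the division
# by a linear solve with the effective stiffness matrix.

def newmark_step(u, v, a, f_next, m, c, k, dt, beta=0.25, gamma=0.5):
    """Advance displacement, velocity, and acceleration by one timestep."""
    # Predictors from the Taylor expansions of u and v.
    u_pred = u + dt * v + dt**2 * (0.5 - beta) * a
    v_pred = v + dt * (1.0 - gamma) * a
    # Enforce the equation of motion at t + dt to get the new acceleration.
    a_next = (f_next - c * v_pred - k * u_pred) / \
             (m + gamma * dt * c + beta * dt**2 * k)
    u_next = u_pred + beta * dt**2 * a_next
    v_next = v_pred + gamma * dt * a_next
    return u_next, v_next, a_next

# Free vibration of an undamped oscillator with omega = 1 rad/s.
u, v, a = 1.0, 0.0, -1.0           # a0 = -(k/m) * u0
for _ in range(628):               # roughly one period with dt = 0.01
    u, v, a = newmark_step(u, v, a, 0.0, m=1.0, c=0.0, k=1.0, dt=0.01)
print(round(u, 3))                 # close to 1.0 after one full period
```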