897 results for Abstraction.


Relevance: 10.00%

Abstract:

This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical, self-contained model of concurrency which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential level of parallelism.

Model of Concurrency. The concurrency model is designed to be a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronisation of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The principles presented are based upon information hiding, sharing and containment of objects, and the division of methods on a command/query basis. When followed, they maximise the level of potential parallelism within the presented concurrency model. Further, these principles arise naturally from good programming practice.

Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e. no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
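
Since the command/query division is what licenses parallel execution within such a concurrency model, a minimal sketch of the division may help (Python, purely for illustration; the thesis itself concerns Eiffel, and the class below is invented):

```python
class Account:
    """Hypothetical illustration of command/query separation.

    Queries read state and have no side effects; commands change state
    and return nothing. Under a model like the one described above,
    queries on an object can safely run in parallel, while commands
    must be serialised per object.
    """

    def __init__(self, opening: int = 0) -> None:
        self._balance = opening

    # Query: side-effect free, safe to evaluate concurrently.
    def balance(self) -> int:
        return self._balance

    # Commands: mutate state, so the model must lock the object.
    def deposit(self, amount: int) -> None:
        self._balance += amount

    def withdraw(self, amount: int) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount
```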

Relevance: 10.00%

Abstract:

Jackson System Development (JSD) is an operational software development method which addresses most of the software lifecycle, either directly or by providing a framework into which more specialised techniques can fit. The method has two major phases: first, an abstract specification is derived that is in principle executable; second, the specification is implemented using a variety of transformations. The object-oriented paradigm is based on data abstraction and encapsulation, coupled to an inheritance architecture able to support software reuse. Its claims of improved programmer productivity and easier program maintenance make it an important technology to consider for building complex software systems. The mapping of JSD specifications into procedural languages typified by Cobol, Ada, etc., involves techniques such as inversion and state vector separation to produce executable systems of acceptable performance. At present, however, no strategy exists to map JSD specifications into object-oriented languages. The aim of this research is to investigate the relationship between JSD and the object-oriented paradigm, and to identify and implement transformations capable of mapping JSD specifications into an object-oriented language typified by Smalltalk-80. The transformational strategy followed is one whereby the concurrency of a specification is removed. Two approaches to implementing inversion - an architectural transformation that generates a simulated coroutine mechanism - are described in detail. The first approach realises inversion directly by manipulating Smalltalk-80 system contexts. This is possible in Smalltalk-80 because contexts are first-class objects and are accessible to the user like any other system object; however, the problems associated with this approach are expounded. The second approach realises coroutine-like behaviour in a structure called a 'followmap'. A followmap is the result of a transformation on a JSD process in which a collection of followsets is generated. Each followset represents all possible state transitions a process can undergo from its current state. Followsets, together with exploitation of the class/instance mechanism for implementing state vector separation, form the basis for mapping JSD specifications into Smalltalk-80. A tool, itself built in Smalltalk-80, supports these derived transformations and enables a user to generate Smalltalk-80 prototypes of JSD specifications.
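
A close modern analogue of the simulated coroutine mechanism is a generator: a sequential process is 'inverted' into a routine that suspends at each input and is resumed by its caller. A minimal Python sketch (illustrative only; this is not the thesis's Smalltalk-80 transformation):

```python
def summing_process():
    """A JSD-style long-running process written as one sequential loop.

    Writing it as a generator inverts it: each send() resumes the
    process at its last suspension point, giving the coroutine-like
    behaviour that the followmap structure reconstructs.
    """
    total = 0
    while True:
        record = yield            # suspension point: wait for input
        if record is None:        # assumed end-of-stream marker
            print("total:", total)
            return
        total += record

proc = summing_process()
next(proc)                        # advance to the first suspension point
for r in (3, 5, 9):
    proc.send(r)                  # the caller drives the inverted process
try:
    proc.send(None)               # prints: total: 17
except StopIteration:
    pass
```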

Relevance: 10.00%

Abstract:

This research investigates the general user interface problems in using networked services. Some of the problems are: users have to recall machine names and procedures to invoke networked services; interactions with some of the services are by means of menu-based interfaces which are quite cumbersome to use; and inconsistencies exist between the interfaces for different services because they were developed independently. These problems have to be removed so that users can use the services effectively. A prototype system has been developed to help users interact with networked services. This consists of software which gives the user an easy and consistent interface to the various services. The prototype is based on a graphical user interface and includes the following applications: Bath Information & Data Services; electronic mail; and a file editor. The prototype incorporates an online help facility to assist users using the system. The prototype can be divided into two parts: the user interface part, which manages interaction with the user, and the communication part, which enables the communication with networked services to take place. The implementation is carried out using an object-oriented approach in which both the user interface part and the communication part are objects. The essential characteristics of object-orientation - abstraction, encapsulation, inheritance and polymorphism - can all contribute to the better design and implementation of the prototype. The Smalltalk Model-View-Controller (MVC) methodology has been the framework for the construction of the prototype user interface. The purpose of the development was to study the effectiveness of users' interaction with networked services. Once the prototype was complete, test users were asked to use the system so that its effectiveness could be evaluated. The evaluation of the prototype is based on observation, i.e. observing the way users use the system, and on the opinion ratings given by the users. Recommendations to improve the prototype further are given based on the results of the evaluation.
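
The division of labour in Smalltalk's MVC can be sketched briefly; the Python stand-ins below are invented, and the real prototype's classes were of course Smalltalk-80:

```python
class Model:
    """Holds application state and notifies dependent views of changes."""
    def __init__(self) -> None:
        self._views = []
        self.value = ""

    def attach(self, view) -> None:
        self._views.append(view)

    def update(self, value) -> None:
        self.value = value
        for view in self._views:    # dependents are told to re-display
            view.refresh(self)

class View:
    """Renders the model; in the prototype this was a GUI widget."""
    def refresh(self, model) -> None:
        print("display:", model.value)

class Controller:
    """Translates user input into model updates."""
    def __init__(self, model) -> None:
        self.model = model

    def on_user_input(self, text) -> None:
        self.model.update(text)

model = Model()
model.attach(View())
Controller(model).on_user_input("3 new mail messages")
```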

Relevance: 10.00%

Abstract:

Classification of metamorphic rocks is normally carried out using a poorly defined, subjective classification scheme making this an area in which many undergraduate geologists experience difficulties. An expert system to assist in such classification is presented which is capable of classifying rocks and also giving further details about a particular rock type. A mixed knowledge representation is used with frame, semantic and production rule systems available. Classification in the domain requires that different facets of a rock be classified. To implement this, rocks are represented by 'context' frames with slots representing each facet. Slots are satisfied by calling a pre-defined ruleset to carry out the necessary inference. The inference is handled by an interpreter which uses a dependency graph representation for the propagation of evidence. Uncertainty is handled by the system using a combination of the MYCIN certainty factor system and the Dempster-Shafer range mechanism. This allows for positive and negative reasoning, with rules capable of representing necessity and sufficiency of evidence, whilst also allowing the implementation of an alpha-beta pruning algorithm to guide question selection during inference. The system also utilizes a semantic net type structure to allow the expert to encode simple relationships between terms enabling rules to be written with a sensible level of abstraction. Using frames to represent rock types where subclassification is possible allows the knowledge base to be built in a modular fashion with subclassification frames only defined once the higher level of classification is functioning. Rulesets can similarly be added in modular fashion with the individual rules being essentially declarative allowing for simple updating and maintenance. The knowledge base so far developed for metamorphic classification serves to demonstrate the performance of the interpreter design whilst also moving some way towards providing a useful assistant to the non-expert metamorphic petrologist. The system demonstrates the possibilities for a fully developed knowledge base to handle the classification of igneous, sedimentary and metamorphic rocks. The current knowledge base and interpreter have been evaluated by potential users and experts. The results of the evaluation show that the system performs to an acceptable level and should be of use as a tool for both undergraduates and researchers from outside the metamorphic petrography field.
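
The MYCIN half of the uncertainty scheme uses the standard certainty-factor combination rule; a small sketch of it (Python; the example rule values are invented, and the Dempster-Shafer range mechanism is not shown):

```python
def combine_cf(cf1: float, cf2: float) -> float:
    """Standard MYCIN combination of two certainty factors in [-1, 1].

    Agreeing evidence reinforces, whether positive or negative;
    conflicting evidence partially cancels.
    """
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

print(combine_cf(0.6, 0.4))    # 0.76: two supporting rules
print(combine_cf(0.6, -0.4))   # about 0.33: conflicting evidence
```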

Relevance: 10.00%

Abstract:

There has been a resurgence of interest in values in recent public administration research, based on two distinct arguments. For different reasons, neither approach is likely to secure a robust normative basis for public endeavours. These reasons are assessed using an alternative body of theory, rooted in contemporary social theory, that we term 'new pragmatism'. New pragmatic ideas are deployed to critique the divorce of values from facts; the abstraction of values from concrete situations; the anthropocentric foundation of social choice; the poorly developed understanding of the process of governance, with its inherent pluralism; and the seeming reluctance to articulate principles of political discourse. © 2010 The Authors. Public Administration © 2010 Blackwell Publishing Ltd.

Relevance: 10.00%

Abstract:

This thesis includes analysis of disordered spin ensembles corresponding to Exact Cover, a multi-access channel problem, and composite models combining sparse and dense interactions. The satisfiability problem in Exact Cover is addressed using a statistical analysis of a simple branch-and-bound algorithm. In the large-system limit the algorithm can be formulated as a branching process, for which critical properties can be analysed. Far from the critical point a set of differential equations may be used to model the process, and these are solved by numerical integration and exact bounding methods. The multi-access channel problem is formulated as an equilibrium statistical physics problem for the case of bit transmission on a channel with power control and synchronisation. A sparse code division multiple access method is considered; the optimal detection properties are examined in the typical case by use of the replica method and compared to the detection performance achieved by iterative decoding methods. These codes are found to exhibit phenomena closely resembling those of the well-understood dense codes. The composite model is introduced as an abstraction of canonical sparse and dense disordered spin models; it includes couplings due to both dense and sparse topologies simultaneously. The new codes are shown to outperform sparse and dense codes in some regimes, both in optimal performance and in the performance achieved by iterative detection methods in finite systems.
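
The branching process arises from the search tree of a branch-and-bound solver; a toy Exact Cover solver shows the shape of that tree (Python; the instance is invented for illustration, and this is not the algorithm analysed in the thesis):

```python
def exact_cover(universe, subsets, chosen=()):
    """Naive branch-and-bound search for an exact cover.

    Branch: try each subset that covers one chosen uncovered element.
    Bound: a subset is admissible only if it covers nothing twice.
    The recursion tree is the branching process referred to above.
    """
    if not universe:
        return chosen                       # every element covered once
    elem = min(universe)                    # branch on one uncovered element
    for name, s in subsets.items():
        if elem in s and s <= universe:     # bound: no double cover
            hit = exact_cover(universe - s, subsets, chosen + (name,))
            if hit is not None:
                return hit
    return None                             # dead branch: backtrack

subsets = {"A": {1, 4, 7}, "B": {1, 4}, "C": {4, 5, 7},
           "D": {3, 5, 6}, "E": {2, 3, 6, 7}, "F": {2, 7}}
print(exact_cover(frozenset(range(1, 8)), subsets))  # ('B', 'F', 'D')
```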

Relevance: 10.00%

Abstract:

In many areas of northern India, salinity renders groundwater unsuitable for drinking and even for irrigation. Though membrane treatment can be used to remove the salt, this approach has some drawbacks, e.g. (1) depletion of the groundwater due to over-abstraction, (2) saline contamination of surface water and soil caused by concentrate disposal, and (3) high electricity usage. To address these issues, a system is proposed in which a photovoltaic-powered reverse osmosis (RO) system is used to irrigate a greenhouse (GH) in a stand-alone arrangement. The concentrate from the RO is supplied to an evaporative cooling system, reducing its volume so that it can finally be evaporated to solid in a pond for safe disposal. Based on typical meteorological data for Delhi, calculations based on mass and energy balance are presented to assess the sizing and cost of the system. It is shown that solar radiation, freshwater output and evapotranspiration demand are readily matched due to the approximately linear relation among these variables. The demand for concentrate varies independently, however, thus favouring the use of a variable-recovery arrangement. Though enough water may be harvested from the GH roof to provide year-round irrigation, this would require considerable storage; some practical options for storage tanks are discussed. An alternative use of rainwater is in misting to reduce peak temperatures in the summer. An example optimised design provides internal temperatures below 30°C (monthly average daily maxima) for 8 months of the year and costs about €36,000 for the whole system with a GH floor area of 1000 m². Further work is needed to assess technical risks relating to scale deposition in the membrane and evaporative pads, and to develop a business model that will allow such a project to succeed in the Indian rural context.
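
The mass balance linking RO recovery, cooling demand and pond load can be made concrete with a back-of-envelope sketch (Python; all numbers are placeholder assumptions, not values from the paper):

```python
# Illustrative daily water balance for the proposed PV-RO/greenhouse scheme.
feed = 10.0        # m^3/day of saline groundwater abstracted (assumed)
recovery = 0.7     # RO recovery ratio (assumed; a variable in the paper)

permeate = feed * recovery            # freshwater available for irrigation
concentrate = feed * (1 - recovery)   # brine routed to the cooling pads

pad_evaporation = 2.5                 # m^3/day lost through the pads (assumed)
to_pond = max(concentrate - pad_evaporation, 0.0)

print(f"irrigation supply : {permeate:.1f} m^3/day")
print(f"concentrate       : {concentrate:.1f} m^3/day")
print(f"residual to pond  : {to_pond:.1f} m^3/day")
# Raising the recovery ratio shrinks the concentrate stream, which is why
# a variable-recovery arrangement helps track the cooling demand.
```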

Relevance: 10.00%

Abstract:

This paper addresses the dearth of research into material artifacts and how they are engaged in strategizing activities. Building on the strategy-as-practice perspective and the notion of epistemic objects, we develop a typology of strategy practices that shows how managers use material artifacts to strategize through a dual process of knowledge abstraction and substitution. Empirically, we study the practice of underwriting managers in reinsurance companies. Our findings first identify the artifacts – pictures, maps, data packs, spreadsheets and graphs – that these managers use to appraise reinsurance deals. Second, the analysis of each artifact's situated use led to the identification of five practices for doing strategy with artifacts: physicalizing, locating, enumerating, analyzing, and selecting. Last, we develop a typology that shows how practices vary in their level of abstraction from the physical properties of the risk being reinsured and unfold through a process of substitution. Our conceptual framework extends existing work in the strategy-as-practice field that calls for research into the role of material artifacts.

Relevance: 10.00%

Abstract:

Classification is the most basic method for organizing resources in the physical space, cyber space, socio space and mental space. Creating a unified model that can effectively manage resources in these different spaces is a challenge. The Resource Space Model (RSM) manages versatile resources with a multi-dimensional classification space and supports generalization and specialization on multi-dimensional classifications. This paper introduces the basic concepts of RSM and proposes the Probabilistic Resource Space Model, P-RSM, to deal with uncertainty in managing various resources in the different spaces of the cyber-physical society. P-RSM's normal forms, operations and integrity constraints are developed to support effective management of the resource space. Characteristics of P-RSM are analyzed through experiments. The model also enables various services to be described, discovered and composed from multiple dimensions and abstraction levels with normal-form and integrity guarantees. Some extensions and applications of P-RSM are introduced.
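
A toy multi-dimensional classification space conveys the flavour of generalization and specialization in a model like RSM (Python; the names and API below are invented, not the published model's):

```python
from collections import defaultdict

class ResourceSpace:
    """Resources indexed by one coordinate per dimension.

    A query fixes some coordinates (specialization) and leaves others
    as '*' (generalization), mimicking operations on a
    multi-dimensional classification space.
    """
    def __init__(self, dimensions):
        self.dimensions = dimensions
        self._index = defaultdict(list)

    def put(self, resource, **coords):
        key = tuple(coords[d] for d in self.dimensions)
        self._index[key].append(resource)

    def get(self, **coords):
        want = tuple(coords.get(d, "*") for d in self.dimensions)
        return [r for key, rs in self._index.items()
                if all(w in ("*", k) for w, k in zip(want, key))
                for r in rs]

space = ResourceSpace(["medium", "topic"])
space.put("paper.pdf", medium="document", topic="semantics")
space.put("talk.mp4", medium="video", topic="semantics")
print(space.get(topic="semantics"))   # both resources: medium generalised
```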

Relevance: 10.00%

Abstract:

Much research pursues machine intelligence through better representation of semantics. What is semantics? People in different areas view semantics from different facets, although it has accompanied interaction throughout civilization. Some researchers believe that humans have some innate structure in mind for processing semantics. What, then, is that structure like? Others argue that humans evolve a structure for processing semantics through constant learning. How, then, does that process work? Humans have invented various symbol systems to represent semantics. Can semantics be accurately represented? Turing machines are good at processing symbols according to algorithms designed by humans, but they are limited in their ability to process semantics and to interact actively. Supercomputers and high-speed networks do not help solve this issue, as they have no semantic worldview and cannot reflect on themselves. Can a future cyber-society have semantic images that enable machines and individuals (humans and agents) to reflect on themselves and interact with each other, aware of the social situation, through time? This paper addresses these issues in the context of studying an interactive semantics for the future cyber-society. It first distinguishes social semantics from natural semantics, and then explores interactive semantics within the category of social semantics. Interactive semantics consists of an interactive system and its semantic image, which co-evolve and influence each other. The semantic worldview and the interactive semantic base are proposed as the semantic basis of interaction. The process of building and explaining a semantic image can be based on an evolving structure incorporating an adaptive multi-dimensional classification space and a self-organized semantic link network. A semantic lens is proposed to enhance the potential of the structure and to help individuals build and retrieve semantic images from different facets, abstraction levels and scales through time.

Relevance: 10.00%

Abstract:

Remote sensing data are routinely used in ecology to investigate the relationship between landscape pattern, as characterised by land use and land cover maps, and ecological processes. Multiple factors related to the representation of geographic phenomena have been shown to affect the characterisation of landscape pattern, resulting in spatial uncertainty. This study statistically investigated the effect of the interaction between landscape spatial pattern and geospatial processing methods, unlike most papers, which consider the effect of each factor in isolation only. This is important since data used to calculate landscape metrics typically undergo a series of data abstraction processing tasks, and these tasks are rarely performed in isolation. The geospatial processing methods tested were the aggregation method and the choice of pixel size used to aggregate data. These were compared to two components of landscape pattern: spatial heterogeneity and the proportion of landcover class area. The interactions and their effect on the final landcover map were described using landscape metrics to measure landscape pattern and classification accuracy (the response variables). All landscape metrics and classification accuracy were shown to be affected both by landscape pattern and by processing methods. Large variability in the response of those variables, and interactions between the explanatory variables, were observed. However, even where interactions occurred, they affected only the magnitude of the difference in landscape metric values. Thus, provided that the same processing methods are used, landscapes should retain their ranking when their landscape metrics are compared. For example, highly fragmented landscapes will always have larger values for the landscape metric "number of patches" than less fragmented landscapes. But the magnitude of the difference between landscapes may change, and therefore absolute values of landscape metrics may need to be interpreted with caution. The explanatory variables with the largest effects were spatial heterogeneity and pixel size; these tended to produce large main effects and large interactions. The high variability in the response variables and the interaction of the explanatory variables indicate that it would be difficult to generalise about the impact of processing on landscape pattern, as only two processing methods were tested and untested processing methods will potentially result in even greater spatial uncertainty. © 2013 Elsevier B.V.
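
The pixel-size effect on a landscape metric is easy to demonstrate on an invented binary map with majority-rule aggregation (Python with numpy/scipy; illustrative only, not the study's data or methods):

```python
import numpy as np
from scipy import ndimage

# A tiny invented binary land-cover map (1 = class of interest).
fine = np.array([[1, 0, 1, 0],
                 [0, 0, 0, 0],
                 [1, 0, 0, 1],
                 [0, 0, 1, 1]])

# Majority-rule aggregation into 2x2 blocks, i.e. a coarser pixel size.
coarse = (fine.reshape(2, 2, 2, 2).mean(axis=(1, 3)) >= 0.5).astype(int)

# "Number of patches" = connected components of the class.
_, n_fine = ndimage.label(fine)
_, n_coarse = ndimage.label(coarse)
print(n_fine, n_coarse)   # 4 1: the metric changes with pixel size
```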

Relevance: 10.00%

Abstract:

Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been tied to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes in response to variations in the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts focus away from implementation details to the whole view of the system and its runtime change, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
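
A minimal sketch of deriving a configuration from a model as the environment fluctuates (Python; the model format and mapping rule are invented for illustration):

```python
# Hypothetical variability model: one component with two variants.
model = {
    "component": "VideoStreamer",
    "variants": {"low_bandwidth":  {"codec": "h264", "bitrate": 500},
                 "high_bandwidth": {"codec": "av1",  "bitrate": 4000}},
}

def generate_config(model, environment):
    """Map the monitored environment onto a variant and emit a config."""
    variant = ("low_bandwidth" if environment["bandwidth_kbps"] < 1000
               else "high_bandwidth")
    return {"component": model["component"],
            "variant": variant,
            **model["variants"][variant]}

# An environment change drives a (re)configuration of the architecture.
print(generate_config(model, {"bandwidth_kbps": 600}))
print(generate_config(model, {"bandwidth_kbps": 8000}))
```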

Relevance: 10.00%

Abstract:

Reactions of chloroform over triphenylphosphine-protected Au nanoparticles have been studied using electron paramagnetic resonance (EPR) spectroscopy and a spin-trapping technique. Two competing reactions, abstraction of hydrogen and abstraction of halogen atoms, were identified. The hydrogen abstraction reaction showed an inverse kinetic isotope effect. Treatment of the nanoparticles with oxidizing or reducing reagents made it possible to tune the selectivity of radical formation from halogen to hydrogen (deuterium) abstraction. Treatment with PbO2 promoted the deuterium abstraction reaction followed by loss of nanoparticle activity, whereas treatment with NaBH4 regenerated the nanoparticle activity towards Cl atom abstraction. X-ray photoelectron spectroscopy showed an increased Au:P ratio upon treatment with oxidizing reagents, likely due to the oxidation of some phosphine ligands to phosphine oxides, which then desorb from the nanoparticle surface. © 2009 The Royal Society of Chemistry.

Relevance: 10.00%

Abstract:

In different fields, the concept of a granule is applied both to a group of elements defined by internal properties and to an inseparable whole reflecting external properties. Granular computing may be interpreted in terms of abstraction, generalization, clustering, levels of abstraction, levels of detail, and so on. We have proposed the use of multialgebraic systems as a mathematical tool for the synthesis and analysis of granules and granule structures. A theorem giving necessary and sufficient conditions for the existence of multialgebraic systems has been proved.

Relevance: 10.00%

Abstract:

There are many approaches in artificial intelligence, some of them also coming from biology and neurophysiology. In this paper we review and discuss many of them, arranging the discussion around autonomous agent research. We highlight three aspects in our classification: the type of abstraction applied to represent agent knowledge, the implementation of the hypothesis-processing mechanism, and the degree of freedom allowed in behaviour and self-organization. Using this classification, many approaches in artificial intelligence are evaluated. We then summarize all the ideas discussed and propose a series of general principles for building an autonomous adaptive agent.