803 results for nature-based
Abstract:
Most object-based approaches to Geographical Information Systems (GIS) have concentrated on the representation of geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain we have a requirement not only to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. Previous studies of the application of spatial rules (often called 'business logic') in GIS have emphasised the representation of topological constraints and data integrity checks. Very little GIS literature describes models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends them using a Geography Markup Language (GML) based representation. Our application has an additional requirement: the geometric rules change often and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses the feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We present the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1. © 2006 Springer-Verlag Berlin Heidelberg.
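The offset regulation quoted above lends itself to a very small parameterized-rule sketch. The Python below is purely illustrative (the paper's actual model is expressed over GML 3.1 feature collections, and every class and attribute name here is invented); it shows how a rule such as the UK give-way line offset reduces to a handful of parameters plus a check.

```python
# Illustrative sketch only: a parameterized offset rule of the kind the paper
# describes, written in plain Python rather than the paper's GML 3.1 model.
# The feature names and tolerances follow the give-way line example.
from dataclasses import dataclass

@dataclass
class OffsetRule:
    """A geometric rule constraining the distance between two features."""
    source: str          # feature the rule applies to, e.g. "give_way_line"
    reference: str       # feature measured against, e.g. "crossing_pattern_edge"
    min_mm: float        # minimum permitted offset in millimetres
    max_mm: float        # maximum permitted offset in millimetres

    def check(self, measured_offset_mm: float) -> bool:
        return self.min_mm <= measured_offset_mm <= self.max_mm

# UK pedestrian-crossing rule from the abstract; another country would supply
# different parameters rather than a different rule model.
uk_rule = OffsetRule("give_way_line", "crossing_pattern_edge", 1100, 3000)
print(uk_rule.check(2500))  # True: within the permitted band
print(uk_rule.check(900))   # False: too close to the crossing pattern
```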
Abstract:
The majority of the literature about computer-based monitoring (CBM) is American in origin, and (inter alia) notes that there were differing uses of similar technology, indicating that context has an important role to play in the use of CBM. The literature maps the psychological effects of CBM in considerable detail, but only two published studies examine the context of CBM. These grounded results provide scant support for any systematic, quantitative, large-scale analysis of computer-based monitoring in the UK context. This thesis thus aims to systematically examine the context of CBM using discourse analysis. Forty-four interviewees were theoretically sampled using a structured sampling technique in four organizations, all of which were national or multinational enterprises. The interviews were semi-structured in nature and divided into three sections. The first addressed the respondents' thoughts and perceptions about CBM, the second elicited talk about the departmental context (focusing on the management-worker relationship), and the final section addressed the organizational context. The cases demonstrated variation in the use of CBM, measured according to the criteria of Westin (1987, 1988) and according to the interpretive repertoires used by the respondents in each case. Seven analytical categories of talk emerged from the data: three at the organizational level and four at the departmental level of analysis. Discourse analysis revealed two discrete interpretive repertoires - the procedural and the substantive - in respondents' talk, whose main variation occurred at the departmental level of analysis. Furthermore, patterns were found in the use of these repertoires within cases and between categories. Between the cases, variation in the use of the repertoires matched the between-case variation according to the criteria of Westin. It would thus appear that the source of variation in the use of CBM lies in its context, more specifically in the relative emphasis of humanistic, interpersonal and idiosyncratic values within the management-worker relationship.
Abstract:
With the growing appreciation of the contribution of small technology-based ventures to a healthy economy, an analysis of the individual who initiates and manages such ventures - the technical entrepreneur - is highly desirable, predominantly because of the influence of such an individual on the management and future strategy of the venture. An examination of recent research indicates that a study of the experience and expertise the entrepreneur gained in previous occupations may be highly relevant in determining the possible success of a new venture. This is particularly true where the specific expertise of the entrepreneur forms the main strategic advantage of the business, as in the case of small technology-based firms. Despite this, very little research has attempted to examine the relationship between the previous occupational background of the technical entrepreneur and the management of the small technology-based firm. This thesis examines this relationship, as well as providing an original contribution to the study of technical entrepreneurship in the UK. The exploratory nature of the research prompted the adoption of an inductive, qualitative approach for the thesis. Through a two-stage, multiple-site research approach, an examination was made of technical entrepreneurs heading award-winning technology-based small firms in the UK. The main research questions focused on management within the firm, the novelty and origin of the technology adopted, and the personal characteristics of the entrepreneur under study. The results of this study led to the creation of a specific typology for technical entrepreneurs, based on the individual's role in the development of technology within his or her previous occupation.
Abstract:
Some of the problems arising from the inherent instability of emulsions are discussed. Aspects of emulsion stability are described and particular attention is given to the influence of the chemical nature of the dispersed phase on adsorbed film structure and stability. Emulsion stability has been measured by a photomicrographic technique. Electrophoresis, interfacial tension and droplet rest-time data were also obtained. Emulsions were prepared using a range of oils, including aliphatic and aromatic hydrocarbons, dispersed in a solution of sodium dodecyl sulphate. In some cases a small amount of alkane or alkanol was incorporated into the oil phase. In general the findings agree with the classical view that the stability of oil-in-water emulsions is favoured by a closely packed interfacial film and appreciable electric charge on the droplets. The inclusion of non-ionic alcohol leads to enhanced stability, presumably owing to the formation of a "mixed" interfacial film which is more closely packed and probably more coherent than that of the anionic surfactant alone. In some instances differences in stability cannot be accounted for simply by differences in interfacial adsorption or droplet charge. Alternative explanations are discussed and it is postulated that the coarsening of emulsions may occur not only by coalescence but also through the migration of oil from small droplets to larger ones by molecular diffusion. The viability of using the coalescence rates of droplets at a plane interface as a guide to emulsion stability has been investigated. The construction of a suitable apparatus and the development of a standard testing procedure are described. Coalescence-time distributions may be correlated by equations similar to those presented by other workers, or by an analysis based upon the log-normal function. Stability parameters for a range of oils are discussed in terms of differences in film drainage and the nature of the interfacial film. Despite some broad correlations there is generally poor agreement between droplet and emulsion stabilities. It is concluded that hydrodynamic factors largely determine droplet stability in the systems studied. Consequently droplet rest-time measurements do not provide a sensible indication of emulsion stability.
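The log-normal analysis of coalescence-time distributions mentioned above can be illustrated with a short fitting sketch. This is not the thesis's procedure; the data are synthetic and the SciPy-based fit is simply one modern way to fit a log-normal to rest-time data.

```python
# Illustrative only: fitting a log-normal distribution to droplet rest-time
# (coalescence-time) data. The sample below is a synthetic stand-in, not
# measurements from the thesis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rest_times_s = rng.lognormal(mean=2.0, sigma=0.5, size=200)  # synthetic sample

# Fix the location at zero so the fit is a pure two-parameter log-normal.
shape, loc, scale = stats.lognorm.fit(rest_times_s, floc=0)
median_s = scale  # for a log-normal, scale = exp(mu), which is the median
print(f"sigma={shape:.2f}, median rest time={median_s:.1f} s")
```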
Abstract:
This thesis explores how the world-wide web can be used to support English language teachers doing further studies at a distance. The future of education worldwide is moving towards a requirement that we, as teacher educators, use the latest web technology not as a gambit, but as a viable tool to improve learning. By examining the literature on knowledge, teacher education and web training, a model of teacher knowledge development is developed, along with statements of advice for web developers based upon the model. Next, the applicability and viability of both the model and the statements of advice are examined by developing a teacher support site (http://www.philseflsupport.com) according to these principles. The data collected from one focus group of users from sixteen different countries, all studying on the same distance Masters programme, is then analysed in depth. The outcomes from the research are threefold. A functioning website that is averaging around 15,000 hits a month provides a professional contribution. An expanded model of teacher knowledge development, based upon five theoretical principles that reflect the ever-expanding cyclical nature of teacher learning, provides an academic contribution. A series of six statements of advice for developers of teacher support sites, grounded in the theoretical principles behind the model and incorporating nine keys to effective web facilitation, provides a forward-looking contribution to the praxis of web-supported teacher education, and thus to the potential dissemination of the research presented here. The research has succeeded in reducing the proliferation of terminology in teacher knowledge into a succinct model of teacher knowledge development. The model may now be used to further our understanding of how teachers learn and develop as other research builds upon the individual study here. NB: Appendix 4 is available only for consultation at Aston University Library, by prior arrangement.
Abstract:
The work described was carried out as part of a collaborative Alvey software engineering project (project number SE057). The project collaborators were the Inter-Disciplinary Higher Degrees Scheme of the University of Aston in Birmingham, BIS Applied Systems Ltd. (BIS) and the British Steel Corporation. The aim of the project was to investigate the potential application of knowledge-based systems (KBSs) to the design of commercial data processing (DP) systems. The work was primarily concerned with BIS's Structured Systems Design (SSD) methodology for DP systems development and how users of this methodology could be supported using KBS tools. The problems encountered by users of SSD are discussed and potential forms of computer-based support for inexpert designers are identified. An architecture for a support environment for SSD, the Intellipse system, is proposed, based on the integration of KBS and non-KBS tools for individual design tasks within SSD. The Intellipse system has two modes of operation: Advisor and Designer. The design, implementation and user evaluation of Advisor are discussed. The results of a Designer feasibility study, the aim of which was to analyse major design tasks in SSD to assess their suitability for KBS support, are reported. The potential role of KBS tools in the domain of database design is discussed. The project involved extensive knowledge engineering sessions with expert DP systems designers, and some practical lessons in relation to KBS development are derived from this experience. The nature of the expertise possessed by expert designers is discussed. The need for operational KBSs to be built to the same standards as other commercial and industrial software is identified. A comparison between current KBS and conventional DP systems development is made. On the basis of this analysis, a structured development method for KBSs is proposed - the POLITE model. Some initial results of applying this method to KBS development are discussed. Several areas for further research and development are identified.
Abstract:
Swarm intelligence is a popular paradigm for algorithm design. Frequently drawing inspiration from natural systems, it assigns simple rules to a set of agents with the aim that, through local interactions, they collectively solve some global problem. Current variants of a popular swarm-based optimization algorithm, particle swarm optimization (PSO), are investigated with a focus on premature convergence. A novel variant, dispersive PSO, is proposed to address this problem and is shown to lead to increased robustness and performance compared to current PSO algorithms. A nature-inspired decentralised multi-agent algorithm is proposed to solve a constrained problem of distributed task allocation. Agents must collect and process mail batches, without global knowledge of their environment or communication between agents. New rules for specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. These new rules are compared with a market-based approach to agent control. The efficiency (average number of tasks performed), the flexibility (ability to react to changes in the environment), and the sensitivity to load (ability to cope with differing demands) are investigated in both static and dynamic environments. A hybrid algorithm combining both approaches is shown to exhibit improved efficiency and robustness. Evolutionary algorithms are employed, both to optimize parameters and to allow the various rules to evolve and compete; we also observe extinction and speciation. In order to interpret algorithm performance we analyse the causes of efficiency loss, derive theoretical upper bounds for the efficiency, as well as a complete theoretical description of a non-trivial case, and compare these with the experimental results. Motivated by this work we introduce agent "memory" (the possibility for agents to develop preferences for certain cities) and show that not only does it lead to emergent cooperation between agents, but also to a significant increase in efficiency.
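For readers unfamiliar with PSO, the sketch below shows the standard global-best update that variants such as dispersive PSO modify; the dispersive mechanism itself is not described in the abstract and is not shown here. Parameter values are conventional defaults, not those from the thesis.

```python
# A minimal sketch of standard global-best PSO, the baseline that variants such
# as dispersive PSO build on. The inertia and acceleration values are common
# textbook defaults, not the thesis's settings.
import numpy as np

def pso_minimise(f, dim, n_particles=30, iters=200, w=0.72, c1=1.49, c2=1.49):
    rng = np.random.default_rng(0)
    x = rng.uniform(-5, 5, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                         # particle velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + pull towards personal and global bests.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best, best_f = pso_minimise(lambda p: np.sum(p**2), dim=5)  # sphere function
print(best_f)
```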
Abstract:
Adaptability for distributed object-oriented enterprise frameworks is critical for system evolution. Today, building adaptive services is a complex task due to the lack of adequate framework support in the distributed computing environment. In this thesis, we propose a Meta Level Component-Based Framework (MELC) which uses distributed computing design patterns as components to develop an adaptable pattern-oriented framework for distributed computing applications. We describe our novel approach of combining a meta architecture with a pattern-oriented framework, resulting in an adaptable framework which provides a mechanism to facilitate system evolution. The critical nature of distributed technologies requires frameworks to be adaptable. Our framework employs a meta architecture: it supports dynamic adaptation of feasible design decisions in the framework design space by specifying and coordinating meta-objects that represent various aspects of the distributed environment. The meta architecture in MELC thus provides the adaptability needed for system evolution, resolving the problem of dynamic adaptation in the framework that is encountered in most distributed applications. The concept of using a meta architecture to produce an adaptable pattern-oriented framework for distributed computing applications is new and has not previously been explored in research. Because the framework is adaptable, the proposed architecture can dynamically adopt new design patterns to address technical system issues in the domain of distributed computing, and these patterns can be woven together to shape the framework in future. We show how MELC can be used effectively to enable dynamic component integration and to separate system functionality from business functionality. We demonstrate how MELC provides an adaptable and dynamic run-time environment using our system configuration and management utility. We also highlight how MELC supports adaptability in system evolution through a prototype E-Bookshop application that assembles its business functions with distributed computing components at the meta level in the MELC architecture. Our performance tests show that MELC does not entail prohibitive performance tradeoffs. The work to develop the MELC framework for distributed computing applications has emerged as a promising way to meet current and future challenges in the distributed environment.
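The meta-object idea can be illustrated with a deliberately tiny sketch. Everything here is hypothetical (MELC's real architecture is pattern-oriented and far richer); the point is only that routing calls through a meta level lets a design decision be swapped at run time without touching business code.

```python
# Hypothetical sketch (all names invented) of the meta-level idea: base-level
# components are reached only through a meta-object, so a design decision
# (here, the transport mechanism) can be adapted at run time.
class MetaObject:
    """Holds a replaceable base-level component behind a stable interface."""
    def __init__(self, component):
        self._component = component

    def adapt(self, new_component):
        # Dynamic adaptation point: swap the underlying design decision.
        self._component = new_component

    def __getattr__(self, name):
        # Delegate everything else to the current base-level component.
        return getattr(self._component, name)

class SocketTransport:
    def send(self, msg): print(f"socket: {msg}")

class QueueTransport:
    def send(self, msg): print(f"queue: {msg}")

transport = MetaObject(SocketTransport())
transport.send("order #1")          # business code ignores the mechanism
transport.adapt(QueueTransport())   # framework evolves without redeployment
transport.send("order #2")
```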
Abstract:
Multi-agent algorithms inspired by the division of labour in social insects and by markets are applied to a constrained problem of distributed task allocation. The efficiency (average number of tasks performed), the flexibility (ability to react to changes in the environment), and the sensitivity to load (ability to cope with differing demands) are investigated in both static and dynamic environments. A hybrid algorithm combining both approaches is shown to exhibit improved efficiency and robustness. We employ nature-inspired particle swarm optimisation to obtain optimised parameters for all algorithms in a range of representative environments. Although results are obtained for large population sizes to avoid finite-size effects, the influence of population size on the performance is also analysed. From a theoretical point of view, we analyse the causes of efficiency loss, derive theoretical upper bounds for the efficiency, and compare these with the experimental results.
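A common formalisation of insect-inspired specialisation in this literature is the fixed response-threshold rule of Bonabeau et al.; the exact rules used in this work may differ. The sketch below shows the rule's shape: agents with a low threshold for a task engage it at low stimulus, producing specialists.

```python
# Sketch of the fixed response-threshold model often used for division of
# labour in social insects (Bonabeau et al.); illustrative, not necessarily
# the rules evaluated in this work.
import random

def engage_probability(stimulus: float, threshold: float, steepness: int = 2) -> float:
    """Probability that an agent takes up a task at the current stimulus level."""
    s, t = stimulus ** steepness, threshold ** steepness
    return s / (s + t)

def step(stimulus: float, thresholds: list[float]) -> list[bool]:
    """One decision round: each agent independently decides whether to engage."""
    return [random.random() < engage_probability(stimulus, th) for th in thresholds]

# Specialists (low threshold) respond readily; generalists wait for higher demand.
print(step(stimulus=5.0, thresholds=[1.0, 5.0, 20.0]))
```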
Abstract:
Objective: Much recent research has applied nature-inspired algorithms to complex machine learning tasks. Ant colony optimization (ACO) is one such algorithm based on swarm intelligence, derived from a model inspired by the collective foraging behavior of ants. Taking advantage of ACO traits such as self-organization and robustness, this paper investigates ant-based algorithms for gene expression data clustering and associative classification. Methods and material: An ant-based clustering algorithm (Ant-C) and an ant-based association rule mining algorithm (Ant-ARM) are proposed for gene expression data analysis. The proposed algorithms make use of the natural behavior of ants, such as cooperation and adaptation, to allow for a flexible, robust search for a good candidate solution. Results: Ant-C has been tested on three datasets selected from the Stanford Genomic Resource Database and achieved relatively high accuracy compared to other classical clustering methods. Ant-ARM has been tested on the acute lymphoblastic leukemia (ALL)/acute myeloid leukemia (AML) dataset and generated about 30 classification rules with high accuracy. Conclusions: Ant-C can generate an optimal number of clusters without incorporating any other algorithms such as K-means or agglomerative hierarchical clustering. For associative classification, while several well-known algorithms such as Apriori, FP-growth and Magnum Opus are unable to mine any association rules from the ALL/AML dataset within a reasonable period of time, Ant-ARM is able to extract associative classification rules.
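The abstract does not give Ant-C's rules, so the sketch below shows the classic pick/drop probabilities of Lumer and Faieta that ant-based clustering algorithms commonly adapt: ants tend to pick up items that sit among dissimilar neighbours and drop them where similar items already cluster.

```python
# Classic Lumer-Faieta pick/drop rules for ant-based clustering; illustrative
# of the family Ant-C belongs to, not Ant-C's own (unpublished-here) rules.
def pick_probability(local_density: float, k_pick: float = 0.1) -> float:
    """High when an item is dissimilar to its neighbourhood (low density)."""
    return (k_pick / (k_pick + local_density)) ** 2

def drop_probability(local_density: float, k_drop: float = 0.15) -> float:
    """High where similar items already cluster (high density)."""
    if local_density < k_drop:
        return 2.0 * local_density
    return 1.0

# local_density is the average similarity of the carried item to items in the
# ant's neighbourhood, in [0, 1]: isolated items move, well-placed ones stay.
print(pick_probability(0.05), drop_probability(0.8))
```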
Abstract:
Two types of sodium carbonate powder, produced by spray drying (SD) and dry neutralisation (DN), were studied for their compaction properties using a uniaxial compression tester. Dry neutralised sodium carbonate showed a greater resistance to compression and also produced a weaker compact when compressed to 100 kPa. Differential Scanning Calorimetry (DSC) showed that both types of powder were predominantly amorphous in nature. Moisture sorption measurements showed that both powders behaved in a similar way below 50% RH; however, dry neutralised sodium carbonate had a high moisture affinity above this RH. On examining the particle structures using Scanning Electron Microscopy (SEM), the most likely explanation for the increased tendency of spray dried sodium carbonate to form strong compacts was found to be its hollow particle structure.
Abstract:
The focus of this study is the development of a parallelised version of severely sequential and iterative numerical algorithms on a multi-threaded parallel platform such as a graphics processing unit (GPU). This requires the design and development of a platform-specific numerical solution that can benefit from the parallel capabilities of the chosen platform. A graphics processing unit was chosen as the parallel platform for the design and development of a numerical solution for a specific physical model in non-linear optics. This problem appears in describing ultra-short pulse propagation in bulk transparent media, which has recently been the subject of several theoretical and numerical studies. The mathematical model describing this phenomenon is a challenging and complex problem, and its numerical modelling is limited on current modern workstations. Numerical modelling of this problem requires the parallelisation of essentially serial algorithms and the elimination of numerical bottlenecks. The main challenge to overcome is the parallelisation of the globally non-local mathematical model. This thesis presents a numerical solution that eliminates the numerical bottleneck associated with the non-local nature of the mathematical model. The accuracy and performance of the parallel code are verified by back-to-back testing against a similar serial version.
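One standard way to remove a globally non-local bottleneck of this kind, offered here only as an illustration and not necessarily the thesis's scheme, is to evaluate the non-local (convolution-type) term spectrally: in Fourier space the global coupling becomes a pointwise product, which parallelises naturally on a GPU.

```python
# Illustrative only: a convolution couples every grid point to every other
# (O(N^2) and hard to thread), but via the convolution theorem it becomes a
# pointwise product in Fourier space (O(N log N)), which maps well onto a GPU.
# NumPy stands in for the GPU implementation.
import numpy as np

def nonlocal_term_direct(field, kernel):
    """Direct O(N^2) evaluation: every output point sums over the whole grid."""
    n = field.size
    return np.array([np.sum(kernel[(i - np.arange(n)) % n] * field)
                     for i in range(n)])

def nonlocal_term_spectral(field, kernel):
    """Equivalent O(N log N) evaluation via the convolution theorem."""
    return np.real(np.fft.ifft(np.fft.fft(field) * np.fft.fft(kernel)))

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
field = np.sin(x)
kernel = np.exp(-np.minimum(x, 2 * np.pi - x) ** 2)  # symmetric non-local kernel
assert np.allclose(nonlocal_term_direct(field, kernel),
                   nonlocal_term_spectral(field, kernel))
```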
Abstract:
In response to the increasing interest in the growth and developments in the Indian economy, and the dynamic nature of the rapidly changing Indian business environment, this textbook is designed to provide a comprehensive guide to doing business in the Indian context. Written by academic experts in their respective fields, this book is divided into three parts: the Indian business context, conducting business in India, and India and the world. Key information is presented on a wide range of topics, including:
• Both the shortcomings and opportunities associated with the Indian business environment
• The economic development model in India
• Critical skills for negotiation and incentives for foreign investors, including case studies of Italian companies that have entered the Indian market in different ways
• Business culture in India, including particular customs and etiquette
In addition to the pedagogical features, each chapter contains a set of key issues, and there is also a list of useful websites covering a wide range of business needs. This book introduces students to business in India, and will also be of use to investors, organisations and managers who are already doing business in India, or intend to start doing so.
Abstract:
In repetitive operations the productivity dilemma has been widely studied, but there is a lack of research in non-repetitive operations, such as project-based firms. This paper investigates why project-based firms foster or hinder project flexibility through an embedded multi-case study of six projects within a large German project-based firm. The results suggest that although such firms have projects as their key source of revenue, their focus lies in longevity and survival, and this logic is, in some instances, at odds with the temporary nature of the project context.
Abstract:
In the face of global population growth and the uneven distribution of water supply, a better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method to detect water surfaces automatically, and to monitor them in near-real-time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. The validation of the algorithm showed very few omission errors and no commission errors, demonstrating the ability of the proposed algorithm to perform as effectively as human interpretation of the images. The validation of the permanent water surface product with an independent dataset derived from high-resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method have been identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient for a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without too many difficulties.
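The colour-space step described above can be sketched compactly. The hue and value bounds below are purely illustrative (the published method's decision rules are not given in the abstract), and the input is a stand-in array rather than real imagery.

```python
# Minimal sketch of the RGB-to-HSV step the abstract describes: transform an
# RGB composite into HSV and threshold it to flag candidate water pixels.
# The thresholds here are illustrative, not the published method's rules.
import numpy as np
from matplotlib.colors import rgb_to_hsv

def candidate_water_mask(rgb, hue_range=(0.5, 0.7), max_value=0.6):
    """rgb: float array (H, W, 3) scaled to [0, 1]; returns a boolean mask."""
    hsv = rgb_to_hsv(rgb)
    hue, value = hsv[..., 0], hsv[..., 2]
    return (hue >= hue_range[0]) & (hue <= hue_range[1]) & (value <= max_value)

rgb = np.random.default_rng(0).random((64, 64, 3))  # stand-in for a composite
mask = candidate_water_mask(rgb)
print(mask.mean())  # fraction of pixels flagged as candidate water
```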