895 results for data types and operators


Relevance: 100.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
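
To make the data-driven idea concrete, the following is a minimal sketch of how a self-describing XML content file can determine which components an application instantiates at runtime. It is not taken from the Fluid project itself; the element names, attributes and component classes are hypothetical.

```python
# Hypothetical scene description: the data names the components and their
# parameters, so runtime behaviour is driven by content rather than code.
import xml.etree.ElementTree as ET

SCENE_XML = """
<scene>
    <component type="Sprite" image="hero.png" x="10" y="20"/>
    <component type="Sound" file="theme.ogg" volume="0.8"/>
</scene>
"""

class Sprite:
    def __init__(self, image, x, y):
        self.image, self.x, self.y = image, int(x), int(y)

class Sound:
    def __init__(self, file, volume):
        self.file, self.volume = file, float(volume)

REGISTRY = {"Sprite": Sprite, "Sound": Sound}  # available component types

def load_scene(xml_text):
    """Instantiate whatever components the content data asks for."""
    components = []
    for elem in ET.fromstring(xml_text).findall("component"):
        attrs = dict(elem.attrib)
        cls = REGISTRY[attrs.pop("type")]   # data selects the component type
        components.append(cls(**attrs))     # data supplies its configuration
    return components

print(load_scene(SCENE_XML))
```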

Relevance: 100.00%

Abstract:

SPOT simulation imagery was acquired for a test site in the Forest of Dean in Gloucestershire, U.K. This data was qualitatively and quantitatively evaluated for its potential application in forest resource mapping and management. A variety of techniques are described for enhancing the image with the aim of providing species-level discrimination within the forest. Visual interpretation of the imagery was more successful than automated classification. The heterogeneity within the forest classes, and in particular between the forest and urban classes, resulted in poor discrimination using traditional 'per-pixel' automated methods of classification. Different means of assessing classification accuracy are proposed. Two techniques for measuring textural variation were investigated in an attempt to improve classification accuracy. The first of these, a sequential segmentation method, was found to be beneficial. The second, a parallel segmentation method, resulted in little improvement, though this may be related to a combination of the resolution and the size of the texture extraction area. The effect on classification accuracy of combining the SPOT simulation imagery with other data types is investigated. A grid cell encoding technique was selected as most appropriate for storing digitised topographic (elevation, slope) and ground truth data. Topographic data were shown to improve species-level classification, though with sixteen classes overall accuracies were consistently below 50%. Neither sub-division into age groups nor the incorporation of principal components and a band ratio significantly improved classification accuracy. It is concluded that SPOT imagery will not permit species-level classification within forested areas as diverse as the Forest of Dean. The imagery will be most useful as part of a multi-stage sampling scheme. The use of texture analysis is highly recommended for extracting maximum information content from the data. Incorporation of the imagery into a GIS will both aid discrimination and provide a useful management tool.
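
For illustration, the sketch below (synthetic data; it reproduces neither the sequential nor the parallel segmentation method of the thesis) shows the kind of windowed texture measure involved: two classes with the same mean reflectance are indistinguishable to a 'per-pixel' classifier on raw values, but separate cleanly on local variance.

```python
# Local variance over a sliding window as a simple texture band.
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(band, size=5):
    """Variance of pixel values in a size x size window around each pixel."""
    mean = uniform_filter(band.astype(float), size)
    mean_sq = uniform_filter(band.astype(float) ** 2, size)
    return mean_sq - mean ** 2

rng = np.random.default_rng(0)
smooth = rng.normal(100, 2, (64, 64))   # e.g. uniform plantation canopy
rough = rng.normal(100, 15, (64, 64))   # e.g. mixed or urban cover

print(local_variance(smooth).mean())    # low texture
print(local_variance(rough).mean())     # high texture
```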

Relevance: 100.00%

Abstract:

This thesis is concerned with the role of diagenesis in forming ore deposits. Two sedimentary 'ore-types' have been examined: the Proterozoic copper-cobalt orebodies of the Konkola Basin on the Zambian Copperbelt, and the Permian Marl Slate of North East England. Facies analysis of the Konkola Basin shows the Ore-Shale to have formed in a subtidal to intertidal environment. A sequence of diagenetic events is outlined from which it is concluded that the sulphide ores are an integral part of the diagenetic process. Sulphur isotope data establish that the sulphides formed as a consequence of the bacterial reduction of sulphate, while the isotopic and geochemical composition of carbonates is shown to reflect changes in the compositions of diagenetic pore fluids. Geochemical studies indicate that the copper and cobalt bearing mineralising fluids probably had different sources. Veins which crosscut the orebodies contain hydrocarbon inclusions, and are shown to be of late diagenetic lateral secretion origin. Rb/Sr dating indicates that the Ore-Shale was subject to metamorphism at 529 ± 20 myrs. The sedimentology and petrology of the Marl Slate are described. Textural and geochemical studies suggest that much of the pyrite (framboidal) in the Marl Slate formed in an anoxic water column, while euhedral pyrite and base metal sulphides formed within the sediment during early diagenesis. Sulphur isotope data confirm that conditions were almost "ideal" for sulphide formation during Marl Slate deposition, the limiting factor in ore formation being the restricted supply of chalcophile elements. Carbon and oxygen isotope data, along with petrographic observations, indicate that much of the calcite and dolomite occurring in the Marl Slate is primary, and probably formed in isotopic equilibrium. A depositional model is proposed which explains all of the data presented and links the lithological variations with fluctuations in the anoxic/oxic boundary layer of the water column.

Relevance: 100.00%

Abstract:

This is a study of specific aspects of classroom interaction at primary school level in Kenya. The study entailed the identification of the sources of particular communication problems during the change-over period from Kiswahili to English medium teaching in two primary schools. There was subsequently an examination of the language resources which were employed by teachers to maintain pupil participation in communication in the light of the occurrence, or possibility of occurrence, of specific communication problems. The language resources which were found to be significant in this regard concerned firstly the use of different elicitation types by teachers to stimulate pupils into giving responses, and secondly teachers' recourse to code-switching from English to Kiswahili and vice-versa. It was also found in this study that although the use of English as the medium of instruction in the classrooms which were observed resulted in certain communication problems, some of these problems need not have arisen if teachers had been more careful in their use of language. This finding, considered alongside the role of different elicitation types and code-switching as interpretable from the data samples, had certain implications, specified in the study, for teaching in Kenyan primary schools. The corpus for the study consisted of audio-recordings of English, Science and Number-Work lessons which were later transcribed. Relevant data samples were subsequently extracted from the transcripts for analysis. Many of the samples contain examples of communication breakdowns, but they also illustrate how teachers maintained interaction with pupils who had yet to acquire an operational mastery of English. This study thus differs from most studies on classroom interaction because of its basic concern with the examination of the resources available to teachers for overcoming the problem areas of classroom communication.

Relevance: 100.00%

Abstract:

This paper assesses the impact of regional technological diversification on the emergence of new innovators across EU regions. Integrating analyses from the regional economics, economic geography and technological change literatures, we explore the role that the regional embeddedness of actors characterised by diverse technological competencies may have in fostering novel and sustained interactions leading to new technological combinations. In particular, we test whether greater technological diversification improves regional ‘combinatorial’ opportunities, leading to the emergence of new innovators. The analysis is based on panel data obtained by merging regional economic data from Eurostat and patent data from the CRIOS-PATSTAT database over the period 1997–2006, covering 178 regions across 10 EU countries. Accounting for different measures of economic and innovative activity at the NUTS2 level, our findings suggest that the regional co-location of diverse technological competencies contributes to the entry of new innovators, thereby shaping technological change and industry dynamics. Thus, this paper brings to the fore a better understanding of the relationship between regional diversity and technological change.
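
As a hedged sketch of this style of analysis (invented data and variable names; the paper's exact specification is not reproduced here), a count of new innovators can be regressed on a diversification measure with region and year fixed effects:

```python
# Illustrative panel count regression on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
regions, years = 20, 10
df = pd.DataFrame({
    "region": np.repeat(np.arange(regions), years),
    "year": np.tile(np.arange(years), regions),
})
# Hypothetical diversification measure, e.g. entropy over patent classes.
df["diversification"] = rng.normal(5, 1, len(df))
df["new_innovators"] = rng.poisson(np.exp(0.3 * df["diversification"] - 1))

# Poisson model for the count outcome; fixed effects enter as dummies.
model = smf.poisson(
    "new_innovators ~ diversification + C(region) + C(year)", data=df
).fit(disp=0)
print(model.params["diversification"])
```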

Relevance: 100.00%

Abstract:

Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored to the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even one that is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
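
As a minimal sketch of the interpolation step (invented station readings; the paper's actual system and its UncertML encoding are not reproduced here), inverse distance weighting estimates the weather at an uninstrumented query point and yields a crude spread that could be reported as uncertainty:

```python
import numpy as np

stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # x, y
temps = np.array([14.2, 15.1, 13.8, 15.6])  # observed temperatures, deg C

def idw(query, points, values, power=2.0):
    """Inverse-distance-weighted estimate at an uninstrumented location."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d == 0):                       # query coincides with a station
        return float(values[np.argmin(d)]), 0.0
    w = 1.0 / d ** power
    w /= w.sum()
    estimate = float(w @ values)
    # Weighted spread of the neighbours: a crude stand-in for the structured
    # uncertainty that UncertML would carry.
    spread = float(np.sqrt(w @ (values - estimate) ** 2))
    return estimate, spread

print(idw(np.array([0.4, 0.3]), stations, temps))
```

A geostatistical method such as kriging would give a more principled interpolation variance; IDW is used here only to keep the sketch short.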

Relevance: 100.00%

Abstract:

We address the important bioinformatics problem of predicting protein function from a protein's primary sequence. We consider the functional classification of G-Protein-Coupled Receptors (GPCRs), whose functions are specified in a class hierarchy. We tackle this task using a novel top-down hierarchical classification system where, for each node in the class hierarchy, the predictor attributes to be used at that node and the classifier to be applied to the selected attributes are chosen in a data-driven manner. Compared with a previous hierarchical classification system that selects classifiers only, our new system significantly reduced processing time without significantly sacrificing predictive accuracy.
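
The sketch below illustrates the general top-down idea on synthetic data with a hypothetical two-level hierarchy; it is not the paper's actual GPCR system. Each node selects its own predictor attributes and fits its own classifier, and prediction descends from the root.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))
family = rng.integers(0, 2, 300)                   # level 1 labels
subfamily = family * 2 + rng.integers(0, 2, 300)   # level 2, nested in level 1

def node_model():
    """Per-node pipeline: attribute selection, then a classifier."""
    return make_pipeline(SelectKBest(f_classif, k=10), GaussianNB())

root = node_model().fit(X, family)
children = {f: node_model().fit(X[family == f], subfamily[family == f])
            for f in (0, 1)}

def predict(x):
    """Top-down: the root picks a family, that node picks a subfamily."""
    f = int(root.predict(x.reshape(1, -1))[0])
    return f, int(children[f].predict(x.reshape(1, -1))[0])

print(predict(X[0]))
```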

Relevance: 100.00%

Abstract:

The Securities and Exchange Commission (SEC) in the United States mandated a new digital reporting system for US companies in late 2008. The new generation of information provision has been dubbed by Chairman Cox ‘interactive data’ (SEC, 2006a). Despite the promise of its name, we find that in the development of the project, retail investors are invoked as calculative actors rather than engaged in dialogue. Similarly, the potential for the underlying technology to be applied in ways that encourage new forms of accountability appears to be forfeited in the interests of enrolling company filers. We theorise the activities of the SEC, and in particular its chairman at the time, Christopher Cox, over a three-year period, both prior to and following the ‘credit crisis’. We argue that individuals and institutions play a central role in advancing the socio-technical project that is constituted by interactive data. We adopt insights from ANT (Callon, 1986; Latour, 1987, 2005b) and governmentality (Miller, 2008; Miller and Rose, 2008) to show how regulators and the proponents of the technology have acted as spokespersons for the interactive data technology and the retail investor. We examine the way in which calculative accountability has been privileged in the SEC’s construction of the retail investor as concerned with atomised, quantitative data (Kamuf, 2007; Roberts, 2009; Tsoukas, 1997). We find that the possibilities for the democratising effects of digital information on the Internet have not been realised in the interactive data project, and that it contains risks for the very investors the SEC claims to seek to protect.

Relevance: 100.00%

Abstract:

Although crisp data are fundamentally indispensable for determining the profit Malmquist productivity index (MPI), the observed values in real-world problems are often imprecise or vague. These imprecise or vague data can be suitably characterized with fuzzy and interval methods. In this paper, we reformulate the conventional profit MPI problem as an imprecise data envelopment analysis (DEA) problem, and propose two novel methods for measuring the overall profit MPI when the inputs, outputs, and price vectors are fuzzy or vary in intervals. We develop a fuzzy version of the conventional MPI model by using a ranking method, and solve the model with a commercial off-the-shelf DEA software package. In addition, we define an interval for the overall profit MPI of each decision-making unit (DMU) and divide the DMUs into six groups according to the intervals obtained for their overall profit efficiency and MPIs. We also present two numerical examples to demonstrate the applicability of the two proposed models and exhibit the efficacy of the procedures and algorithms. © 2011 Elsevier Ltd.
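
For reference, the conventional output-oriented Malmquist index that the profit variant builds on has the standard textbook form below, where D_o^t is the output distance function measured against the period-t frontier; the paper's fuzzy and interval extensions replace the crisp inputs, outputs and prices entering such distance functions.

```latex
\[
M_o\!\left(x^{t+1}, y^{t+1}, x^{t}, y^{t}\right)
  = \left[
      \frac{D_o^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D_o^{t}\!\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D_o^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D_o^{t+1}\!\left(x^{t}, y^{t}\right)}
    \right]^{1/2},
\qquad M_o > 1 \;\Rightarrow\; \text{productivity growth}.
\]
```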

Relevance: 100.00%

Abstract:

One of the aims of the Science and Technology Committee (STC) of the Group on Earth Observations (GEO) was to establish a GEO Label: a label to certify geospatial datasets and their quality. As proposed, the GEO Label will be used as a value indicator for geospatial data and datasets accessible through the Global Earth Observation System of Systems (GEOSS). It is suggested that the development of such a label will significantly improve user recognition of the quality of geospatial datasets and that its use will help promote trust in datasets that carry the established GEO Label. Furthermore, the GEO Label is seen as an incentive to data providers. At the moment GEOSS contains a large amount of data and is constantly growing. Taking this into account, a GEO Label could assist in searching by providing users with visual cues of dataset quality and possibly relevance; a GEO Label could effectively stand as a decision support mechanism for dataset selection. Currently our project, GeoViQua, together with EGIDA and ID-03, is undertaking research to define and evaluate the concept of a GEO Label. The development and evaluation process will be carried out in three phases. In Phase I we have conducted an online survey (GEO Label Questionnaire) to identify the initial user and producer views on a GEO Label and its potential role. In Phase II we will conduct a further study presenting some GEO Label examples based on Phase I, and will elicit feedback on these examples under controlled conditions. In Phase III we will create physical prototypes which will be used in a human subject study. The most successful prototypes will then be put forward as potential GEO Label options. At the moment we are in Phase I, where we have developed an online questionnaire to collect the initial GEO Label requirements and to identify the role that a GEO Label should serve from the user and producer standpoint. The GEO Label Questionnaire consists of generic questions to identify whether users and producers believe a GEO Label is relevant to geospatial data; whether they want a single "one-for-all" label or separate labels that each serve a particular role; the function that would be most relevant for a GEO Label to carry; and the functionality that users and producers would like to see from the common rating and review systems they use. To distribute the questionnaire, relevant user and expert groups were contacted at meetings or by email. At this stage we have successfully collected over 80 valid responses from geospatial data users and producers. This communication will provide a comprehensive analysis of the survey results, indicating to what extent the users surveyed in Phase I value a GEO Label, and suggesting in what directions a GEO Label may develop. Potential GEO Label examples based on the results of the survey will be presented for use in Phase II.

Relevance: 100.00%

Abstract:

The project consists of an experimental and numerical modelling study of the application of ultra-long Raman fibre laser (URFL) based amplification techniques to high-speed multi-wavelength optical communications systems. The research is focused on 40 Gb/s transmission data rates in the telecommunications C-band, with direct and coherent detection. The optical transmission performance of URFL based systems in terms of optical noise, gain bandwidth and gain flatness is evaluated for different system configurations. Systems with different overall span lengths, transmission fibre types and data modulation formats are investigated. Performance is compared with conventional Erbium-doped fibre amplifier based systems to identify system configurations where URFL based amplification provides performance or commercial advantages.
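
As a sketch of the numerical modelling involved (standard coupled power equations with illustrative parameter values; not the project's full URFL model, which includes the laser cavity and multiple Stokes orders), forward-pumped Raman amplification of a single C-band signal along a span can be integrated as:

```python
import numpy as np
from scipy.integrate import solve_ivp

gR = 0.4      # Raman gain efficiency, 1/(W km), typical for standard SMF
a_s = 0.046   # signal attenuation, 1/km (~0.2 dB/km near 1550 nm)
a_p = 0.057   # pump attenuation, 1/km (~0.25 dB/km near 1455 nm)
ratio = 1550.0 / 1455.0   # photon energy ratio nu_p/nu_s = lambda_s/lambda_p

def rhs(z, P):
    """Coupled signal/pump power evolution for a co-propagating pump."""
    Ps, Pp = P
    return [gR * Pp * Ps - a_s * Ps,
            -ratio * gR * Pp * Ps - a_p * Pp]

# 80 km span, 1 mW signal launch, 0.5 W pump launch.
sol = solve_ivp(rhs, [0.0, 80.0], [1e-3, 0.5])
Ps_out = sol.y[0, -1]
print(f"net signal gain: {10 * np.log10(Ps_out / 1e-3):.1f} dB")
```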

Relevance: 100.00%

Abstract:

Mobile technologies have yet to be widely adopted by the Architectural, Engineering, and Construction (AEC) industry despite being one of the major growth areas in computing in recent years. This lack of uptake in the AEC industry is likely due, in large part, to the combination of small screen size and the inappropriate interaction demands of current mobile technologies. This paper discusses the scope for multimodal interaction design, with a specific focus on speech-based interaction, to enhance the suitability of mobile technology use within the AEC industry by broadening the field data input capabilities of such technologies. To investigate the appropriateness of using multimodal technology for field data collection in the AEC industry, we have developed a prototype Multimodal Field Data Entry (MFDE) application. This application, which allows concrete testing technicians to record quality control data in the field, has been designed to support two different modalities of data input: speech-based data entry and stylus-based data entry. To compare the effectiveness or usability of, and user preference for, the different input options, we have designed a comprehensive lab-based evaluation of the application. To appropriately reflect the anticipated context of use within the study design, careful consideration had to be given to the key elements of a construction site that would potentially influence a test technician's ability to use the input techniques. These considerations and the resultant evaluation design are discussed in detail in this paper.

Relevance: 100.00%

Abstract:

This research has been undertaken to determine how far successful multi-organisational enterprise strategy relies on the correct type of Enterprise Resource Planning (ERP) information systems being used. However, there appears to be a dearth of research on strategic alignment between ERP systems development and multi-organisational enterprise governance: guidelines and frameworks to assist practitioners in making decisions about multi-organisational collaboration supported by different types of ERP systems are still missing from both theoretical and empirical perspectives. This calls for the present research, which investigates ERP systems development and emerging practices in the management of multi-organisational enterprises (i.e. parts of companies working with parts of other companies to deliver complex product-service systems) and identifies how different ERP systems fit into different multi-organisational enterprise structures, in order to achieve sustainable competitive success. An empirical inductive study was conducted using a Grounded Theory-based methodological approach with successful manufacturing and service companies in the UK and China. This involved an initial pre-study literature review and data collection via 48 semi-structured interviews with 8 companies delivering complex products and services across organisational boundaries whilst adopting ERP systems to support their collaborative business strategies: 4 cases cover the printing, semiconductor manufacturing, and parcel distribution industries in the UK, and 4 cases cover the crane manufacturing, concrete production, and banking industries in China. These interviews yielded a set of 29 tentative propositions that were validated via a questionnaire receiving 116 responses from 16 companies. The research has resulted in the consolidation of the validated propositions into a novel concept referred to as the ‘Dynamic Enterprise Reference Grid for ERP’ (DERG-ERP), which draws from multiple theoretical perspectives. The core of the DERG-ERP concept is a contingency management framework which indicates that different multi-organisational enterprise paradigms and the supporting ERP information systems are not the result of different strategies, but are best considered part of a strategic continuum with the same overall business purpose of multi-organisational cooperation. At different times and circumstances in a partnership lifecycle, firms may prefer particular multi-organisational enterprise structures and the use of different types of ERP systems to satisfy business requirements. Thus the DERG-ERP concept helps decision makers in selecting, managing and co-developing the most appropriate multi-organisational enterprise strategy and its corresponding ERP systems by drawing on core competence, expected competitiveness, and information systems strategic capabilities as the main contingency factors. Specifically, this research suggests that traditional ERP(I) systems are associated with the Vertically Integrated Enterprise (VIE), whilst ERPII systems can be correlated to Extended Enterprise (EE) requirements and ERPIII systems can best support the operations of the Virtual Enterprise (VE). The contribution of this thesis is threefold.
Firstly, this work addresses a gap in the extant literature concerning the best fit between ERP system types and multi-organisational enterprise structure types, and proposes a new contingency framework – the DERG-ERP – which can be used to explain how and why enterprise managers need to change and adapt their ERP information systems in response to changing business and operational requirements. Secondly, with respect to a priori theoretical models, the new DERG-ERP furthers multi-organisational enterprise management thinking by incorporating information systems strategy, rather than focusing purely on the strategic, structural, and operational aspects of enterprise design and management. Simultaneously, the DERG-ERP makes theoretical contributions to the current IS Strategy Formulation Model, which does not explicitly address multi-organisational enterprise governance. Thirdly, this research clarifies and emphasises the concept of future ERP systems (referred to as ERPIII), which are inadequately covered in the extant literature. The novel DERG-ERP concept and its elements have also been applied to the 8 empirical cases to serve as a practical guide for ERP vendors, information systems managers, and operations managers hoping to grow and sustain their competitive advantage with respect to effective enterprise strategy, enterprise structures, and ERP systems use; this is referred to in this thesis as the “enterprisation of operations”.

Relevance: 100.00%

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is a computationally intensive process. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative choice is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, recently graphics processing unit (GPU) based data processing methods have been developed to minimise this data processing and rendering time. These techniques include standard-processing methods comprising a set of algorithms to process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU based processing developed during the PhD was later implemented into a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time, with processing throughput currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine tuning of the operating conditions of OCT systems, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding has led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
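
As a sketch of the standard FD-OCT processing chain mentioned above (synthetic spectrum, assumed already linear in wavenumber; not the thesis's actual pipeline), one A-scan is obtained by background subtraction, windowing, an FFT and log scaling. Written with numpy here, these are the steps a GPU implementation accelerates, for instance by substituting an array library such as cupy.

```python
import numpy as np

n = 2048                                   # spectrometer pixels
k = np.linspace(-1, 1, n)                  # normalised wavenumber axis
background = 1000 * np.exp(-k**2)          # reference-arm spectrum
fringe = 40 * np.cos(2 * np.pi * 180 * k)  # interference from one reflector
spectrum = background + fringe + np.random.default_rng(0).normal(0, 2, n)

def a_scan(spectrum, background):
    """One depth profile from one spectral acquisition."""
    s = spectrum - background              # remove the DC / reference term
    s *= np.hanning(len(s))                # suppress FFT side lobes
    depth = np.abs(np.fft.rfft(s))         # depth-resolved reflectivity
    return 20 * np.log10(depth + 1e-12)    # log scale, as displayed

profile = a_scan(spectrum, background)
print(profile.argmax())                    # depth bin of the reflector
```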

Relevance: 100.00%

Abstract:

A spatial object consists of data assigned to points in a space. Spatial objects, such as memory states and three dimensional graphical scenes, are diverse and ubiquitous in computing. We develop a general theory of spatial objects by modelling abstract data types of spatial objects as topological algebras of functions. One useful algebra is that of continuous functions, with operations derived from operations on space and data, and equipped with the compact-open topology. Terms are used as abstract syntax for defining spatial objects and conditional equational specifications are used for reasoning. We pose a completeness problem: Given a selection of operations on spatial objects, do the terms approximate all the spatial objects to arbitrary accuracy? We give some general methods for solving the problem and consider their application to spatial objects with real number attributes. © 2011 British Computer Society.
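
As an informal illustration (not the paper's formal topological-algebra construction), spatial objects can be modelled as functions from points of a space to data, with operations on spatial objects derived pointwise from operations on the data; terms built from such operations then denote spatial objects.

```python
from typing import Callable

Point = tuple[float, float]
SpatialObject = Callable[[Point], float]   # data assigned to points in a space

def const(c: float) -> SpatialObject:
    return lambda p: c

def add(f: SpatialObject, g: SpatialObject) -> SpatialObject:
    """An operation on data (addition) lifted pointwise to spatial objects."""
    return lambda p: f(p) + g(p)

def scale(c: float, f: SpatialObject) -> SpatialObject:
    return lambda p: c * f(p)

# A term over the operations denoting a continuous spatial object.
height: SpatialObject = add(const(1.0), scale(2.0, lambda p: p[0] * p[1]))
print(height((0.5, 4.0)))   # 1.0 + 2.0 * 0.5 * 4.0 = 5.0
```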