976 results for Information Requirements: Data Availability


Relevance: 100.00%

Abstract:

Hydrographers have traditionally referred to the nearshore area as the "white ribbon" area due to the challenges associated with the collection of elevation data in this highly dynamic transitional zone between terrestrial and marine environments. Accordingly, available information in this zone is typically characterised by a range of datasets from disparate sources. In this paper we propose a framework to 'fill' the white ribbon area of a coral reef system by integrating multiple elevation and bathymetric datasets acquired by a suite of remote-sensing technologies into a seamless digital elevation model (DEM). A range of datasets are integrated, including field-collected GPS elevation points, terrestrial and bathymetric LiDAR, single and multibeam bathymetry, nautical chart depths and empirically derived bathymetry estimates from optical remote sensing imagery. The proposed framework ranks data reliability internally, thereby avoiding the requirement to quantify absolute error, and results in a high-resolution, seamless product. Nested within this approach is an effective spatially explicit technique for improving the accuracy of bathymetry estimates derived empirically from optical satellite imagery by modelling the spatial structure of residuals. The approach was applied to data collected on and around Lizard Island in northern Australia. Collectively, the framework holds promise for filling the white ribbon zone in coastal areas characterised by similar data availability scenarios. The seamless DEM is referenced horizontally to MGA Zone 55 (GDA 1994) and vertically to the mean sea level (MSL) datum, and has a spatial resolution of 20 m.
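The internal reliability ranking described above can be pictured with a minimal sketch: for each grid cell, keep the value from the most reliable source that covers it, and fall back to lower-ranked sources only where gaps remain. The dataset names, ranks, and values below are illustrative, not taken from the paper.

```python
import numpy as np

def fuse_by_rank(grids, ranks):
    """grids: dict name -> 2-D array with np.nan where the source has no
    coverage. ranks: dict name -> rank (smaller = more reliable)."""
    names = sorted(grids, key=lambda n: ranks[n])
    fused = np.full(next(iter(grids.values())).shape, np.nan)
    for name in names:
        gap = np.isnan(fused)          # cells not yet filled
        fused[gap] = grids[name][gap]  # fill from the next-best source
    return fused

# Hypothetical sources: LiDAR (sparse but trusted), chart depths (dense)
lidar = np.array([[1.0, np.nan], [np.nan, np.nan]])
chart = np.array([[9.0, 2.0], [3.0, np.nan]])
dem = fuse_by_rank({"lidar": lidar, "chart": chart},
                   {"lidar": 1, "chart": 2})
# lidar wins where present; chart fills the remaining gaps
```

Note that no absolute error needs to be quantified: only the relative ordering of sources drives the fusion, which is the point the abstract makes.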

Relevance: 100.00%

Abstract:

Remotely sensed data have been used extensively for environmental monitoring and modeling at a number of spatial scales; however, a limited range of satellite imaging systems often constrained the scales of these analyses. A wider variety of data sets is now available, allowing image data to be selected to match the scale of environmental structure(s) or process(es) being examined. A framework is presented for use by environmental scientists and managers, enabling their spatial data collection needs to be linked to a suitable form of remotely sensed data. A six-step approach is used, combining image spatial analysis and scaling tools within the context of hierarchy theory. The main steps involved are: (1) identification of information requirements for the monitoring or management problem; (2) development of ideal image dimensions (scene model); (3) exploratory analysis of existing remotely sensed data using scaling techniques; (4) selection and evaluation of suitable remotely sensed data based on the scene model; (5) selection of suitable spatial analytic techniques to meet information requirements; and (6) cost-benefit analysis. Results from a case study show that the framework provided an objective mechanism to identify relevant aspects of the monitoring problem and environmental characteristics for selecting remotely sensed data and analysis techniques.
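One common exploratory scaling tool of the kind step (3) refers to is the average-local-variance curve: compute the mean windowed variance of an image at several aggregation factors and look for the scale at which it peaks. The sketch below is a generic illustration of that idea under invented data, not the framework's own code.

```python
import numpy as np

def local_variance(img, win=3):
    """Mean of the variances computed in each win x win window."""
    h, w = img.shape
    vals = [img[i:i + win, j:j + win].var()
            for i in range(h - win + 1)
            for j in range(w - win + 1)]
    return float(np.mean(vals))

def aggregate(img, factor):
    """Block-average the image by an integer factor (coarser pixels)."""
    h, w = img.shape
    img = img[:h - h % factor, :w - w % factor]
    return img.reshape(h // factor, factor, -1, factor).mean(axis=(1, 3))

rng = np.random.default_rng(0)
img = rng.random((32, 32))                     # stand-in for real imagery
curve = {f: local_variance(aggregate(img, f)) for f in (1, 2, 4)}
# a peak in this curve suggests the scale of dominant scene structure
```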

Relevance: 100.00%

Abstract:

The design and implementation of data bases involve, firstly, the formulation of a conceptual data model by systematic analysis of the structure and information requirements of the organisation for which the system is being designed; secondly, the logical mapping of this conceptual model onto the data structure of the target data base management system (DBMS); and thirdly, the physical mapping of this structured model onto the storage structures of the target DBMS. The accuracy of both the logical and physical mappings determines the performance of the resulting systems. This thesis describes research which develops software tools to facilitate the implementation of data bases. A conceptual model describing the information structure of a hospital is derived using the Entity-Relationship (E-R) approach, and this model forms the basis for mapping onto the logical model. Rules are derived for automatically mapping the conceptual model onto relational and CODASYL types of data structures. Further algorithms are developed for partly automating the implementation of these models on INGRES, MIMER and VAX-11 DBMS.
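One of the standard E-R-to-relational mapping rules of the kind the thesis automates is: each entity becomes a relation keyed by its identifier, and a many-to-many relationship becomes its own relation whose primary key combines both entity keys. A minimal sketch using the hospital domain mentioned above (the table and attribute names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- entities become relations
CREATE TABLE patient (patient_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE doctor  (doctor_id  INTEGER PRIMARY KEY, name TEXT);
-- an M:N relationship ('treats') becomes its own relation
CREATE TABLE treats (
    patient_id INTEGER REFERENCES patient(patient_id),
    doctor_id  INTEGER REFERENCES doctor(doctor_id),
    PRIMARY KEY (patient_id, doctor_id)
);
""")
conn.execute("INSERT INTO patient VALUES (1, 'Ann')")
conn.execute("INSERT INTO doctor VALUES (7, 'Dr. Lee')")
conn.execute("INSERT INTO treats VALUES (1, 7)")
rows = conn.execute(
    "SELECT p.name, d.name FROM treats t "
    "JOIN patient p ON p.patient_id = t.patient_id "
    "JOIN doctor d ON d.doctor_id = t.doctor_id").fetchall()
```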

Relevance: 100.00%

Abstract:

The concept of data independence designates the techniques that allow data to be changed without affecting the applications that process it. Different information-base structures require corresponding tools for supporting data independence. The paper considers one kind of information base, the Multi-Dimensional Numbered Information Spaces, and discusses data independence in such information bases.

Relevance: 100.00%

Abstract:

Local Government Authorities (LGAs) are mainly characterised as information-intensive organisations. To satisfy their information requirements, effective information sharing within and among LGAs is necessary. Nevertheless, the dilemma of Inter-Organisational Information Sharing (IOIS) has been regarded as an inevitable issue for the public sector. Despite a decade of active research and practice, the field lacks a comprehensive framework to examine the factors influencing Electronic Information Sharing (EIS) among LGAs. The research presented in this paper contributes towards resolving this problem by developing a conceptual framework of factors influencing EIS in Government-to-Government (G2G) collaboration. By presenting this model, we attempt to clarify that EIS in LGAs is affected by a combination of environmental, organisational, business process, and technological factors and that it should not be scrutinised merely from a technical perspective. To validate the conceptual rationale, a multiple-case-study research strategy was selected. From an analysis of the empirical data from two case organisations, this paper exemplifies the importance (i.e. prioritisation) of these factors in influencing EIS by utilising the Analytical Hierarchy Process (AHP) technique. The intent herein is to offer LGA decision-makers a systematic decision-making process for realising the importance (i.e. from most important to least important) of EIS influential factors. This systematic process will also assist LGA decision-makers in better interpreting EIS and its underlying problems. The research reported herein should be of interest to both academics and practitioners who are involved in IOIS, in general, and collaborative e-Government, in particular. © 2013 Elsevier Ltd. All rights reserved.
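The AHP prioritisation step works by reducing a reciprocal pairwise-comparison matrix of the factors to a normalised priority vector. A small sketch using the geometric-mean approximation of the principal eigenvector; the comparison judgements below are invented for illustration, not the study's data:

```python
import numpy as np

factors = ["environmental", "organisational", "process", "technological"]
# A[i, j] = judged importance of factor i relative to factor j
# (reciprocal matrix: A[j, i] = 1 / A[i, j])
A = np.array([[1,   3,   5,   2],
              [1/3, 1,   2,   1/2],
              [1/5, 1/2, 1,   1/3],
              [1/2, 2,   3,   1]], dtype=float)
gm = A.prod(axis=1) ** (1 / len(A))   # row geometric means
weights = gm / gm.sum()               # normalised priorities, sum to 1
ranking = sorted(zip(factors, weights), key=lambda t: -t[1])
# ranking lists factors from most to least important
```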

Relevance: 100.00%

Abstract:

The original contribution of this work is threefold. Firstly, this thesis develops a critical perspective on current evaluation practice of business support, with a focus on the timing of evaluation. The general time frame applied for business support policy evaluation is limited to one to two, seldom three, years post intervention. This is despite calls for long-term impact studies by various authors, concerned about time lags before effects are fully realised. This desire for long-term evaluation opposes the requirements of policy-makers and funders, who seek quick results. Also, current 'best practice' frameworks do not refer to timing or its implications, and data availability affects the ability to undertake long-term evaluation. Secondly, this thesis provides methodological value for follow-up and similar studies by linking scheme-beneficiary data with official performance datasets; data availability problems are thus avoided through the use of secondary data. Thirdly, this thesis builds the evidence through the application of a longitudinal impact study of small business support in England, covering seven years of post-intervention data. This illustrates the variability of results for different evaluation periods, and the value in using multiple years of data for a robust understanding of support impact. For survival, the impact of assistance is found to be immediate, but limited. Concerning growth, significant impact centres on a two to three year period post intervention for the linear selection and quantile regression models – positive for employment and turnover, negative for productivity. Attribution of impact may present a problem for subsequent periods. The results clearly support the argument for the use of longitudinal data and analysis, and a greater appreciation by evaluators of the time factor. This analysis recommends a time frame of four to five years post intervention for soft business support evaluation.


Relevance: 100.00%

Abstract:

The virtual quadrilateral is the coalescence of novel data structures that reduces the storage requirements of spatial data without jeopardizing the quality and operability of the inherent information. The data representative of the observed area is parsed to ascertain the necessary contiguous measures that, when contained, implicitly define a quadrilateral. The virtual quadrilateral then represents a geolocated area of the observed space where all of the measures are the same. The area, contoured as a rectangle, is pseudo-delimited by the opposite coordinates of the bounding area. Once defined, the virtual quadrilateral is representative of an area in the observed space and is represented in a database by the attributes of its bounding coordinates and the measure of its contiguous space. Virtual quadrilaterals have been found to ensure a lossless reduction of the physical storage, maintain the implied features of the data, facilitate the rapid retrieval of vast amounts of the represented spatial data and accommodate complex queries. The methods presented herein demonstrate that virtual quadrilaterals are created quite easily, are stable and versatile objects in a database and have proven to be beneficial to exigent spatial data applications such as geographic information systems.
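The core idea, a lossless cover of a grid by rectangles of uniform measure, each stored as its bounding coordinates plus the shared value, can be sketched with a simple greedy decomposition. This is an illustrative stand-in, not the dissertation's actual algorithm:

```python
import numpy as np

def to_quads(grid):
    """Greedily cover a 2-D grid with rectangles of a single value,
    each stored as (row0, col0, row1, col1, value)."""
    grid = np.asarray(grid)
    done = np.zeros(grid.shape, dtype=bool)
    quads = []
    for r in range(grid.shape[0]):
        for c in range(grid.shape[1]):
            if done[r, c]:
                continue
            v = grid[r, c]
            c2 = c                     # grow right while uniform
            while (c2 + 1 < grid.shape[1] and not done[r, c2 + 1]
                   and grid[r, c2 + 1] == v):
                c2 += 1
            r2 = r                     # then grow down while uniform
            while (r2 + 1 < grid.shape[0]
                   and not done[r2 + 1, c:c2 + 1].any()
                   and (grid[r2 + 1, c:c2 + 1] == v).all()):
                r2 += 1
            done[r:r2 + 1, c:c2 + 1] = True
            quads.append((r, c, r2, c2, v))
    return quads

def restore(quads, shape):
    """Exact reconstruction from the rectangle records (lossless)."""
    out = np.empty(shape)
    for r, c, r2, c2, v in quads:
        out[r:r2 + 1, c:c2 + 1] = v
    return out

g = [[5, 5, 1], [5, 5, 1]]
q = to_quads(g)   # two rectangles instead of six cells
```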

Relevance: 100.00%

Abstract:

Time series analysis of multispectral satellite data offers an innovative way to extract valuable information about our changing planet. This is now a real option for scientists thanks to data availability as well as innovative cloud-computing platforms, such as Google Earth Engine. The integration of different missions would mitigate known issues in multispectral time series construction, such as gaps due to clouds or other atmospheric effects. With this purpose, harmonization among Landsat-like missions is possible through statistical analysis. This research offers an overview of the different instruments from the Landsat and Sentinel missions (TM, ETM, OLI, OLI-2 and MSI sensors) and product levels (Collection-2 Level-1 and Surface Reflectance for Landsat, and Level-1C and Level-2A for Sentinel-2). Moreover, a cross-sensor comparison was performed to assess the interoperability of the sensors on board the Landsat and Sentinel-2 constellations, with a possible combined use for time series analysis in mind. Firstly, more than 20,000 pairs of images almost simultaneously acquired all over Europe were selected over a period of several years. The study performed a cross-comparison analysis on these data, and provided an assessment of the calibration coefficients that can be used to minimize differences in the combined use. Four of the most popular vegetation indices were selected for the study: NDVI, EVI, SAVI and NDMI. As a result, it is possible to reconstruct a longer and denser harmonized time series since 1984, useful for vegetation monitoring purposes. Secondly, the spectral characteristics of the recent Landsat-9 mission were assessed for a combined use with Landsat-8 and Sentinel-2. A cross-sensor analysis of common bands of more than 3,000 almost simultaneous acquisitions verified a high consistency between datasets. The most relevant discrepancy was observed in the blue and SWIR bands, often used in vegetation and water related studies. This analysis was supported with spectroradiometer ground measurements.
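The cross-calibration step can be illustrated in miniature: compute a vegetation index for paired, near-simultaneous observations from two sensors, fit a linear relation, and use its gain and offset to map one sensor onto the other's scale. The reflectances and bias below are synthetic, not the study's data or coefficients:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

rng = np.random.default_rng(1)
red = rng.uniform(0.05, 0.2, 200)          # synthetic red reflectance
nir = rng.uniform(0.3, 0.5, 200)           # synthetic NIR reflectance
v_a = ndvi(nir, red)                       # "sensor A" NDVI
v_b = 0.98 * v_a + 0.01                    # "sensor B": small invented bias
gain, offset = np.polyfit(v_a, v_b, 1)     # cross-calibration coefficients
harmonised = (v_b - offset) / gain         # B mapped onto A's scale
```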

Relevance: 100.00%

Abstract:

As end-user computing becomes more pervasive, an organization's success increasingly depends on the ability of end-users, usually in managerial positions, to extract appropriate data from both internal and external sources. Many of these data sources include or are derived from the organization's accounting information systems. Managerial end-users with different personal characteristics and approaches are likely to compose queries of differing levels of accuracy when searching the data contained within these accounting information systems. This research investigates how cognitive style elements of personality influence managerial end-user performance in database querying tasks. A laboratory experiment was conducted in which participants generated queries to retrieve information from an accounting information system to satisfy typical information requirements. The experiment investigated the influence of personality on the accuracy of queries of varying degrees of complexity. Relying on the Myers–Briggs personality instrument, results show that perceiving individuals (as opposed to judging individuals) who rely on intuition (as opposed to sensing) composed queries more accurately. As expected, query complexity and academic performance also explain the success of data extraction tasks.

Relevance: 100.00%

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance: 100.00%

Abstract:

In-network storage of data in wireless sensor networks helps reduce communication inside the network and favors data aggregation. In this paper, we consider the use of n out of m codes and data dispersal in combination with in-network storage. In particular, we provide an abstract model of in-network storage to show how n out of m codes can be used, and we discuss how this can be achieved in five case studies. We also define a model aimed at evaluating the probability of correct data encoding and decoding, and we exploit this model and simulations to show how, in the case studies, the parameters of the n out of m codes and of the network should be configured in order to achieve correct data encoding and decoding with high probability.
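An n out of m code disperses n data words into m shares such that any n shares suffice to reconstruct the data. A minimal sketch of that property, using a Reed-Solomon-style scheme over the rationals for readability (a real sensor deployment would use finite-field arithmetic); the data values are arbitrary examples:

```python
from fractions import Fraction

def poly_mul_linear(p, xj):
    """Multiply polynomial p (coefficients, lowest degree first) by (x - xj)."""
    out = [Fraction(0)] * (len(p) + 1)
    for k, c in enumerate(p):
        out[k] -= xj * c   # -xj * c contributes to degree k
        out[k + 1] += c    # x * c contributes to degree k + 1
    return out

def encode(data, m):
    """Disperse len(data) words into m shares: values of the data polynomial."""
    return [(x, sum(Fraction(d) * x**k for k, d in enumerate(data)))
            for x in range(1, m + 1)]

def decode(shares, n):
    """Recover the n data words from any n shares via Lagrange interpolation."""
    pts = shares[:n]
    coeffs = [Fraction(0)] * n
    for i, (xi, yi) in enumerate(pts):
        basis, denom = [Fraction(1)], Fraction(1)
        for j, (xj, _) in enumerate(pts):
            if j != i:
                basis = poly_mul_linear(basis, xj)
                denom *= xi - xj
        scale = Fraction(yi) / denom
        for k in range(n):
            coeffs[k] += scale * basis[k]
    return [int(c) for c in coeffs]

shares = encode([42, 7, 13], m=5)    # disperse 3 words into 5 shares
recovered = decode(shares[2:], n=3)  # any 3 shares recover the data
```

Losing up to m − n shares (here, two) still allows exact reconstruction, which is what makes dispersal attractive for unreliable in-network storage.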

Relevance: 100.00%

Abstract:

Information Technologies (IT) are increasingly vital within organisations; they are the engine that supports the business. For most organisations, the operation and development of IT rest on dedicated infrastructures (internal or external) known as Data Centres (DC). These infrastructures concentrate an organisation's data-processing and storage equipment, and are therefore increasingly challenged on factors such as scalability, availability, fault tolerance, performance, available or provisioned resources, security, energy efficiency and, inevitably, the associated costs. With the emergence of cloud computing and virtualisation technologies, a whole range of new ways to address these challenges opens up. Under this new paradigm, new opportunities for DC consolidation arise, which may pose new challenges for DC managers. It is therefore unrealistic, at the very least, for organisations simply to eliminate their DCs or to rebuild them to the highest quality standards. Organisations must optimise their DCs; however, an efficient project of this nature, able to support the demands imposed by the market, business needs and the pace of technological evolution, requires solutions that are complex and costly both to implement and to manage. This is the context of the present work. With the aim of studying DCs, a study of the topic is first carried out, detailing the concept, its historical evolution, topology, architecture and the standards that govern DCs. The study then details some of the main trends shaping the future of DCs. Building on the theoretical knowledge resulting from this study, a DC evaluation methodology based on decision criteria is developed. The study culminates in the analysis of a new technological solution and the evaluation of three possible implementation scenarios: the first based on maintaining the current DC; the second based on implementing the new solution in another DC under an external hosting arrangement; and the third based on an IaaS implementation.
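A criteria-based comparison of the three scenarios can be sketched as a simple weighted-sum score; the criteria, weights, and scores below are invented for illustration and are not those of the study:

```python
# Decision criteria with illustrative weights (sum to 1.0)
criteria = {"scalability": 0.25, "availability": 0.25,
            "security": 0.20, "cost": 0.30}

# Illustrative 1-5 scores for each scenario per criterion
scores = {
    "keep current DC":  {"scalability": 2, "availability": 3,
                         "security": 4, "cost": 4},
    "external hosting": {"scalability": 3, "availability": 4,
                         "security": 3, "cost": 3},
    "IaaS":             {"scalability": 5, "availability": 4,
                         "security": 3, "cost": 3},
}

# Weighted-sum score per scenario
totals = {name: sum(w * s[c] for c, w in criteria.items())
          for name, s in scores.items()}
best = max(totals, key=totals.get)
```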

Relevance: 100.00%

Abstract:

ABSTRACT: The aim of this analysis was to analyze and describe the steps that have been taken in the development of the mental health policy in Suriname after the WHO AIMS. The objectives are: 1. To review the steps to be taken in developing a mental health policy and plan for a country; 2. To gather information and data concerning mental health policy and plan development in Suriname; 3. To draw conclusions from the experience gained that can be applied to other countries. In general, the information that was gathered from the four countries Guyana, Barbados, Trinidad & Tobago and Suriname was compared with the WHO steps for developing a mental health policy and plan. Were these steps taken into consideration when developing their mental health policy and plan? If not, what were the reasons why it did not happen? The checklist for evaluating a mental health plan was used in Suriname. This checklist helped assess whether the recommendations given by the WHO AIMS to develop an effective and balanced mental health plan were taken into consideration. The major findings of the analysis are that Suriname as well as Guyana used the steps in developing their mental health policy and plan. Barbados and Trinidad & Tobago did not develop a mental health policy and plan. Suriname and Guyana have a mental health coordinating body at the Ministry of Health. Trinidad & Tobago as well as Barbados have a mental health focal person at the Ministry of Health of the respective countries. It can be concluded that successfully improving health systems and services for mental health requires combining theoretical concepts, expert knowledge and the cooperation of many stakeholders. The appointment of a mental health coordinating unit at the Ministry of Health is crucial for the development of mental health in a country. Furthermore, mental health is everyone's business and responsibility. Implementing the steps to be taken when developing a mental health policy and plan as recommended by WHO may be a slow process requiring the mobilization of political will. That is why it is crucial that persons responsible for this process work closely with all stakeholders in relevant sectors, taking their needs into consideration and trying to translate them into clear objectives. It is common knowledge that improving the quality of mental health must be accompanied by the availability of financial and human resources. Finally, a mental health policy and plan should be one document tackling all aspects of the mental health of a community.

Relevance: 100.00%

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.