949 results for GIS-portal
Abstract:
This research presents a case study whose objective was to analyse the acceptance of the Portal Inovação, identifying the predictive factors of behavioural intention to use and of usage behaviour that drive the adoption of the technology by its users, through an extension of the Unified Theory of Acceptance and Use of Technology (UTAUT) of Venkatesh et al. (2003). The object of the research, the Portal Inovação, was developed by the Ministério da Ciência, Tecnologia e Inovação (MCTI) in partnership with the Centro de Gestão e Estudos Estratégicos (CGEE), the Associação Brasileira de Desenvolvimento Industrial (ABDI) and the Instituto Stela, to meet the demands of the country's Sistema Nacional de Ciência, Tecnologia e Inovação (SNCTI). To achieve the proposed objectives, the study combined a qualitative approach, supported by the case study method (YIN, 2005), and a quantitative approach, based on the UTAUT methodology, applied to users of the portal and yielding 264 validated respondents. The analysis material comprised bibliographic research on electronic government (e-Gov), the Internet, the National Innovation System and technology acceptance models, together with official public data and legislation pertaining to the technological innovation sector. The quantitative analysis technique consisted of structural equation modelling based on the PLS (Partial Least Squares) algorithm with a bootstrap of 1,000 resamples. The main results showed high magnitude and predictive significance of Performance Expectancy and Social Influence on the Behavioural Intention to Use the portal, and further showed that Facilitating Conditions have a significant impact on users' Usage Behaviour. The main conclusion of this study is that, when considering the acceptance of a government portal whose adoption is voluntary, the social factor is highly influential on the intention to use the technology, as are aspects related to the user's resulting productivity and sense of usefulness, together with ease of interaction with and mastery of the tool. These findings open new perspectives for research and studies on e-Gov initiatives, as well as for the proper direction of the planning, monitoring and evaluation of government projects.
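The abstract reports structural equation modelling with PLS and a bootstrap of 1,000 resamples. The sketch below illustrates only the bootstrap idea on a single structural path (performance expectancy to behavioural intention) using plain NumPy/pandas; the column names, the synthetic data and the simple-regression stand-in for the PLS path model are assumptions for illustration, not the study's actual analysis.

```python
# Minimal sketch of bootstrap significance testing for a single structural path,
# illustrating the 1,000-resample idea behind the reported PLS analysis.
# Column names (performance_expectancy, behavioural_intention) are hypothetical.
import numpy as np
import pandas as pd

def bootstrap_path_coefficient(df, x_col, y_col, n_boot=1000, seed=42):
    """Bootstrap the standardised regression coefficient of y on x."""
    rng = np.random.default_rng(seed)
    coefs = []
    for _ in range(n_boot):
        sample = df.sample(len(df), replace=True, random_state=int(rng.integers(1 << 31)))
        x = (sample[x_col] - sample[x_col].mean()) / sample[x_col].std()
        y = (sample[y_col] - sample[y_col].mean()) / sample[y_col].std()
        coefs.append(np.polyfit(x, y, 1)[0])  # slope of the simple regression
    coefs = np.asarray(coefs)
    return coefs.mean(), coefs.std(), np.percentile(coefs, [2.5, 97.5])

# Synthetic survey-like data (264 respondents, as in the study):
df = pd.DataFrame({"performance_expectancy": np.random.normal(4, 1, 264)})
df["behavioural_intention"] = 0.5 * df["performance_expectancy"] + np.random.normal(0, 1, 264)
beta, se, ci = bootstrap_path_coefficient(df, "performance_expectancy", "behavioural_intention")
print(f"path = {beta:.3f}, SE = {se:.3f}, 95% CI = {ci}")
```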
Abstract:
The 1988 Constitution and subsequent laws require the State to provide information to citizens and to foster their participation in public affairs; this is the legal principle of Administrative Transparency, which comprises the following sub-principles: (1) Information; (2) Motivation; and, most importantly, (3) Citizen participation and interactivity. High investment in State Communication and technological advances do not, by themselves, guarantee the practice of public transparency or the democratisation of information. From a multidisciplinary perspective, this research discussed the legal principle of Administrative Transparency in comparison with Communication Theory, with the aim of proposing a concept of State Communication that truly corresponds to the ideals and ethics required of Public Communication. To develop this study, the relationship between communication and the degree of transparency achieved on the Senado Federal (Federal Senate) portal was investigated. The study analysed the passage of the Judiciary reform between 2000 and 2004 in light of the three legal sub-principles of Public Transparency. The analysis covered, on the Senate portal, the journalistic work and the online availability of digital texts of original documents, such as public minutes and stenographic notes. The methodology, with quantitative and qualitative approaches, had the New Rhetoric as its main instrument for the analysis of journalistic pieces and documentary texts. To assess interactivity, a concept that underpins the ideal of justice, analytical criteria were established from the intersection between the concepts of transparency and E-parliaments. It was found that the Senate portal, with regard to the Judiciary reform, achieved degrees of transparency, satisfying the sub-principles of information and motivation more fully than those of citizen participation and interactivity.
Abstract:
Contemporary education is embedded in a context of fast and dynamic social and cultural transformations, especially with the advance and incorporation of Digital Information and Communication Technologies (TDIC) into people's everyday lives. In the Information Society, in the Age of Knowledge, it is necessary to go beyond knowing how to read, write and type. Schools, in turn, are still slow to adapt to the demands of the digital universe in which their agents participate. Secondary education (Ensino Médio), a focus of concern and reflection for everyone involved in that stage of the educational process, strives to fulfil its proposal of comprehensively preparing young people for work and citizenship. The subject of Portuguese Language is assigned the mission of reconciling the teaching of the standard norm with discursive genres in a way that promotes students' digital inclusion in the various literacy circumstances to which they are exposed. In this context, this work investigated: What perceptions of the formative processes emerge when students reflect on the pedagogical practices and experiences of Portuguese Language classes in activities mediated by an educational portal? The overall objective of the research is to prompt reflection among teachers so that they rethink their pedagogical practices and their role in the educational process, in order to promote an educational experience more consistent with students' reality. The methodology adopted was qualitative, investigative research in the narrative modality, in the light of Clandinin and Connelly (2011). Owing to the researcher's immersion in the setting and affective closeness to the participants, the challenge of conducting action research was taken on; the investigative instruments adopted were: semi-structured interviews, a logbook, activities carried out on the portal, informal conversations and field notes. Data analysis led to eight analytical categories emerging from the participants' narratives: interaction and communication; the extended classroom; learning management; recording of self and other; collaborative and transformative learning; encouragement of research; autonomous study; and challenges. The results point to reflections that do not end with this work, among which the following stand out: the importance of listening to students so that pedagogical proposals can be revised and improved; experimenting in daily practice is essential, seeking something beyond the traditional in pursuit of a defined learning objective; the desire to learn can awaken students' interest in knowledge, making them more autonomous in their choices and paths; and TDIC can contribute to teaching and learning, but they demand the involvement of the subjects, since as instruments they do not constitute knowledge in themselves; it is the agents who, by appropriating them, are able to draw out the best of their potential. Future work may continue this study and add considerably to it by examining the influence of TDIC use in everyday school life, which will certainly be a great contribution to the current landscape of Brazilian education.
Abstract:
The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resulting impacts. The approach emphasizes the need for a greater dependency on the potential pollution sources, rather than the traditional approach in which assessment is based mainly on the intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the same distribution as historically occurring events or an a priori potential probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch processing file. GIS software was employed to produce the input files and to present the results. The functionalities of the method, as well as its sensitivity to model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, were evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps. Varying the model grid sizes indicates no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. The migration of the contaminant plume also advances faster with coarse grid sizes than with finer grid sizes. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases non-linearly with the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a large benefit over contemporary risk and vulnerability methods.
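A minimal sketch of the Monte Carlo exceedance-counting logic the abstract describes: wherever a random number falls below a cell's probability of a pollution event, a source term is generated, concentrations are simulated, and threshold exceedances at monitoring points are counted as risk. The MODFLOW-2000/MT3DMS transport step is replaced by a placeholder function, and the grid size, probabilities and concentration threshold are hypothetical.

```python
# Sketch of the Monte Carlo exceedance-counting logic described above.
# The MODFLOW/MT3DMS run is replaced by a placeholder transport function;
# grid size, probabilities and the concentration threshold are hypothetical.
import numpy as np

def run_transport(source_terms):
    """Placeholder for the MODFLOW-2000 + MT3DMS run: returns concentrations
    at monitoring points given a 2-D array of source terms (kg/day)."""
    return source_terms.mean() * np.ones(3)  # dummy concentrations at 3 points

def risk_of_exceedance(p_occurrence, n_realisations=500, threshold=250.0,
                       source_mass=100.0, seed=1):
    """Fraction of realisations in which each monitoring point exceeds `threshold` (mg/l)."""
    rng = np.random.default_rng(seed)
    exceedances = np.zeros(3)
    for _ in range(n_realisations):
        # A source term is generated wherever a random number falls below the
        # cell's probability of a pollution event (as in the RAM method).
        events = rng.random(p_occurrence.shape) < p_occurrence
        source_terms = np.where(events, source_mass, 0.0)
        concentrations = run_transport(source_terms)
        exceedances += concentrations > threshold
    return exceedances / n_realisations

p_map = np.full((50, 50), 0.01)   # uniform 1% daily occurrence probability per cell
print(risk_of_exceedance(p_map))  # per-monitoring-point risk estimates
```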
Abstract:
Overlaying maps using a desktop GIS is often the first step of a multivariate spatial analysis. The potential of this operation has increased considerably as data sources, and Web services to manipulate them, become widely available via the Internet. Standards from the OGC enable such geospatial mashups to be seamless and user driven, involving discovery of thematic data. The user is naturally inclined to look for spatial clusters and correlation of outcomes. Using classical cluster detection scan methods to identify multivariate associations can be problematic in this context because of a lack of control on, or knowledge about, background populations. For public health and epidemiological mapping this limiting factor can be critical, but often the focus is on spatial identification of risk factors associated with health or clinical status. In this article we point out that this association itself can ensure some control on underlying populations, and we develop an exploratory scan statistic framework for multivariate associations. Inference using statistical map methodologies can be used to test the clustered associations. The approach is illustrated with a hypothetical data example and an epidemiological study on community MRSA. Scenarios of potential use for online mashups are introduced, but full implementation is left for further research.
Abstract:
In recent years there has been a great effort to combine the technologies and techniques of GIS and process models. This project examines the issues involved in linking a standard current-generation 2½D GIS with several existing model codes. The focus for the project has been the Shropshire Groundwater Scheme, which is being developed to augment flow in the River Severn during drought periods by pumping water from the Shropshire Aquifer. Previous authors have demonstrated that under certain circumstances pumping could reduce the soil moisture available for crops. This project follows earlier work at Aston in which the effects of drawdown were delineated and quantified through the development of a software package implementing a technique that brought together the significant spatially varying parameters. This technique is repeated here, but using a standard GIS called GRASS. The GIS proved adequate for the task, and the added functionality provided by the general-purpose GIS (the data capture, manipulation and visualisation facilities) was of great benefit. The bulk of the project is concerned with examining the issues of linking GIS and environmental process models. To this end a groundwater model (Modflow) and a soil moisture model (SWMS2D) were linked to the GIS, and a crop model was implemented within the GIS. A loose-linked approach was adopted, and secondary and surrogate data were used wherever possible. The implications relate to: justification of a loose-linked versus a closely integrated approach; how, technically, to achieve the linkage; how to reconcile the different data models used by the GIS and the process models; control of the movement of data between models of environmental subsystems, in order to model the total system; the advantages and disadvantages of using a current-generation GIS as a medium for linking environmental process models; generation of input data, including the use of geostatistics, stochastic simulation, remote sensing, regression equations and mapped data; issues of accuracy, uncertainty and simply providing adequate data for the complex models; and how such a modelling system fits into an organisational framework.
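A loose-linked coupling, as adopted in this project, typically means the GIS and the model codes exchange data through files rather than shared data structures. The sketch below illustrates that pattern with a simple ESRI-style ASCII grid writer/reader and a surrogate recharge relationship; the file names, header layout and recharge formula are illustrative assumptions, not the project's actual interface.

```python
# Minimal sketch of a "loose-linked" coupling: GIS and process model exchange
# data through files. File names and the simple recharge rule are illustrative.
import numpy as np

def write_ascii_grid(path, data, cellsize=100.0, xll=0.0, yll=0.0, nodata=-9999):
    """Write a simple ESRI ASCII grid (the kind a GIS export such as GRASS can produce)."""
    nrows, ncols = data.shape
    with open(path, "w") as f:
        f.write(f"ncols {ncols}\nnrows {nrows}\nxllcorner {xll}\nyllcorner {yll}\n"
                f"cellsize {cellsize}\nNODATA_value {nodata}\n")
        np.savetxt(f, data, fmt="%.4f")

def read_ascii_grid(path):
    """Read the grid back, returning the six-line header and the data array."""
    header = {}
    with open(path) as f:
        for _ in range(6):
            key, value = f.readline().split()
            header[key.lower()] = float(value)
        data = np.loadtxt(f)
    return header, data

# GIS -> file -> model: a soil-moisture raster exported from the GIS becomes a
# recharge input file handed to the external groundwater model; model output
# would travel back the same way. Values are purely synthetic.
soil_moisture = np.random.uniform(0.1, 0.4, size=(20, 20))
write_ascii_grid("soil_moisture.asc", soil_moisture)
header, grid = read_ascii_grid("soil_moisture.asc")
recharge = np.clip(grid - 0.2, 0.0, None)         # simple surrogate relationship
np.savetxt("recharge.dat", recharge, fmt="%.4e")  # file passed to the model code
```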
Abstract:
Forests play a pivotal role in timber production, maintenance and development of biodiversity and in carbon sequestration and storage in the context of the Kyoto Protocol. Policy makers and forest experts therefore require reliable information on forest extent, type and change for management, planning and modeling purposes. It is becoming increasingly clear that such forest information is frequently inconsistent and unharmonised between countries and continents. This research paper presents a forest information portal that has been developed in line with the GEOSS and INSPIRE frameworks. The web portal provides access to forest resources data at a variety of spatial scales, from global through to regional and local, as well as providing analytical capabilities for monitoring and validating forest change. The system also allows for the utilisation of forest data and processing services within other thematic areas. The web portal has been developed using open standards to facilitate accessibility, interoperability and data transfer.
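As a hedged illustration of consuming the kind of OGC-standard services such a portal exposes, the following sketch uses OWSLib to connect to a WMS endpoint, list its layers and request a map image; the service URL, layer name and bounding box are placeholders, not the portal's actual endpoints.

```python
# Sketch of a client consuming an OGC WMS service of the kind the forest portal
# builds on, using OWSLib. The URL and the layer name are placeholders.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
print(list(wms.contents))                          # layers advertised by the service

img = wms.getmap(layers=["forest_cover"],          # hypothetical layer name
                 styles=[""],
                 srs="EPSG:4326",
                 bbox=(-10.0, 35.0, 30.0, 70.0),   # lon/lat box over Europe
                 size=(800, 600),
                 format="image/png",
                 transparent=True)
with open("forest_cover.png", "wb") as f:
    f.write(img.read())
```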
Abstract:
Although the importance of dataset fitness-for-use evaluation and intercomparison is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise and allow interrogation of key informational aspects of geospatial datasets upon which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby hopefully improve user recognition of the quality of datasets. To date we have conducted three user studies to (1) identify the informational aspects of geospatial datasets upon which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role, and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating the quality of data, users consider 8 facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all 8 informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation. When integrated into the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are being used to dynamically assess information availability and generate the GEO labels. The producer metadata document can either be a standard ISO-compliant metadata record supplied with the dataset, or an extended version of a GeoViQua-derived metadata record, and is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these metadata records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality which will allow a user to navigate to a GEO label page offering detailed quality information for its associated dataset. At this stage, we are developing the GEO label service that will be used to provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
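The following is a minimal sketch of how availability of the eight informational facets might be assessed from producer and feedback metadata records, in the spirit of the dynamic assessment described above; the dictionary field names are hypothetical stand-ins for the ISO/GeoViQua metadata elements, not the project's actual schema.

```python
# Sketch: derive facet availability for a GEO label from producer and feedback
# metadata. Field names are hypothetical stand-ins for the real metadata elements.
PRODUCER_FACETS = {
    "producer_information":  "contact",
    "producer_comments":     "supplemental_information",
    "standards_compliance":  "conformance_results",
    "citations":             "citations",
    "quantitative_quality":  "quality_measures",
}
FEEDBACK_FACETS = {
    "community_advice": "comments",
    "ratings":          "ratings",
    "expert_reviews":   "expert_reviews",
}

def assess_facets(producer_md, feedback_md):
    """Return facet -> 'available' / 'not available' for a single dataset."""
    availability = {}
    for facet, field in PRODUCER_FACETS.items():
        availability[facet] = "available" if producer_md.get(field) else "not available"
    for facet, field in FEEDBACK_FACETS.items():
        availability[facet] = "available" if feedback_md.get(field) else "not available"
    return availability

producer_md = {"contact": "EO Data Centre", "quality_measures": [{"rmse": 0.4}]}
feedback_md = {"ratings": [4, 5], "comments": []}
print(assess_facets(producer_md, feedback_md))
```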
Abstract:
* This research is partially supported by a grant (bourse Lavoisier) from the French Ministry of Foreign Affairs (Ministère des Affaires Etrangères).
Abstract:
The evaluation of geospatial data quality and trustworthiness presents a major challenge to geospatial data users when making a dataset selection decision. The research presented here therefore focused on defining and developing a GEO label: a decision support mechanism to assist data users in efficient and effective geospatial dataset selection on the basis of quality, trustworthiness and fitness for use. This thesis thus presents six phases of research and development conducted to: (1) identify the informational aspects upon which users rely when assessing geospatial dataset quality and trustworthiness; (2) elicit initial user views on the GEO label role in supporting dataset comparison and selection; (3) evaluate prototype label visualisations; (4) develop a Web service to support GEO label generation; (5) develop a prototype GEO label-based dataset discovery and intercomparison decision support tool; and (6) evaluate the prototype tool in a controlled human-subject study. The results of the studies revealed, and subsequently confirmed, eight geospatial data informational aspects that users considered important when evaluating geospatial dataset quality and trustworthiness, namely: producer information, producer comments, lineage information, compliance with standards, quantitative quality information, user feedback, expert reviews, and citations information. Following an iterative user-centred design (UCD) approach, it was established that the GEO label should visually summarise the availability of, and allow interrogation of, these key informational aspects. A Web service was developed to support generation of dynamic GEO label representations and was integrated into a number of real-world GIS applications. The service was also utilised in the development of the GEO LINC tool: a GEO label-based dataset discovery and intercomparison decision support tool. The results of the final evaluation study indicated that (a) the GEO label effectively communicates the availability of dataset quality and trustworthiness information and (b) GEO LINC successfully facilitates 'at a glance' dataset intercomparison and fitness-for-purpose-based dataset selection.
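As a rough sketch of the on-demand Web service idea, the snippet below exposes a single endpoint that accepts metadata as JSON and reports facet availability; the route, parameters and response format are invented for illustration and do not reflect the actual GeoViQua service API.

```python
# Minimal sketch of an on-demand GEO label endpoint in the spirit of the Web
# service described above; the route, the facet keys expected in the JSON body
# and the JSON response are assumptions, not the real service interface.
from flask import Flask, jsonify, request

app = Flask(__name__)
FACETS = ["producer_information", "producer_comments", "lineage_information",
          "standards_compliance", "quantitative_quality", "user_feedback",
          "expert_reviews", "citations"]

@app.route("/geolabel", methods=["POST"])
def geolabel():
    """Accept producer/feedback metadata as JSON and report facet availability."""
    metadata = request.get_json(force=True) or {}
    availability = {facet: bool(metadata.get(facet)) for facet in FACETS}
    return jsonify(availability)

if __name__ == "__main__":
    app.run(port=5000)
```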
Abstract:
* The work is partly supported by RFFI grant 08-07-00062-a
Abstract:
Agile methodologies are becoming more popular in the software development process nowadays. The iterative development lifecycle, openness to frequent changes, and tight cooperation with the client and among the software engineers are proving to be increasingly effective practices that respond better to current business needs. It is natural to ask which methodology is the most suitable when starting and managing a project. This depends on many factors: product characteristics, the technologies used, the client's and developer's experience, and the project type. A systematic analysis is proposed of the most common problems that appear when developing a particular type of project, public portal solutions. In such projects, very close interaction with various types of end users is observed. This is a prerequisite for permanent changes during the development and support cycles, which makes them ideal candidates for an agile methodology. We compare the ways in which each methodology addresses the specific problems that arise, and finish by ranking the methodologies according to their relevance. This may help the project manager in choosing one methodology, or a combination of them.
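A toy illustration of the ranking step, assuming an invented problem list, a few example methodologies and made-up relevance scores:

```python
# Toy illustration of the ranking idea: score each methodology against recurring
# portal-project problems and rank by total relevance. The problem list, the
# methodologies and all scores are invented for illustration only.
problems = ["frequent requirement changes", "diverse end users", "long support cycle"]
scores = {               # 0 = addresses poorly ... 3 = addresses well
    "Scrum":  [3, 2, 2],
    "XP":     [3, 1, 2],
    "Kanban": [2, 2, 3],
}
ranking = sorted(scores.items(), key=lambda kv: sum(kv[1]), reverse=True)
for method, s in ranking:
    print(f"{method:7s} total relevance = {sum(s)}")
```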
Abstract:
This is an extended version of an article presented at the Second International Conference on Software, Services and Semantic Technologies, Sofia, Bulgaria, 11–12 September 2010.
Abstract:
Floods are among the most dangerous and common disasters worldwide, and these disasters are closely linked to the geography of the affected area. As a result, several papers in the academic field of humanitarian logistics have incorporated the use of Geographical Information Systems (GIS) for disaster management. However, most contributions in the literature use these systems for network analysis and display, with only a few papers exploiting the capabilities of GIS to improve planning and preparedness. To show the capabilities of GIS for disaster management, this paper uses raster GIS to analyse potential flooding scenarios and provide input to an optimisation model. The combination is applied to two real-world floods in Mexico to evaluate the value of incorporating GIS into disaster planning. The results provide evidence that including GIS analysis in a decision-making tool for disaster management can improve the outcome of disaster operations by reducing the number of facilities used that are at risk of flooding. The empirical results point to the importance of integrating advanced remote sensing images and GIS in future humanitarian logistics systems.
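A small sketch of the GIS preprocessing step the paper describes, in which a flood-depth raster is used to screen candidate facilities before they enter the facility-location optimisation; the raster, facility coordinates and depth threshold below are synthetic assumptions, not the paper's data.

```python
# Sketch of the raster GIS preprocessing step: flag candidate facilities located
# in flooded cells before the optimisation model runs. All values are synthetic.
import numpy as np

cellsize = 500.0                                   # metres per raster cell
flood_depth = np.random.gamma(0.3, 1.0, (40, 40))  # synthetic flood scenario (m)

facilities = {"shelter_A": (2_500.0, 7_500.0),     # (x, y) in raster coordinates
              "shelter_B": (12_000.0, 3_000.0),
              "depot_C":   (18_500.0, 19_000.0)}

def at_risk(xy, depth_raster, threshold=0.5):
    """True if the raster cell containing the facility floods above `threshold` metres."""
    col = int(xy[0] // cellsize)
    row = int(xy[1] // cellsize)
    return depth_raster[row, col] > threshold

safe_candidates = {name: xy for name, xy in facilities.items()
                   if not at_risk(xy, flood_depth)}
print("candidates passed to the optimisation model:", list(safe_candidates))
```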