918 results for Spatial analysis statistics -- Data processing


Relevance: 100.00%

Publisher:

Abstract:

Modern geographical databases, which are at the core of geographic information systems (GIS), store a rich set of aspatial attributes in addition to geographic data. Typically, aspatial information comes in textual and numeric form. Retrieving information constrained on both spatial and aspatial data gives GIS users the ability to perform more interesting spatial analyses, and lets applications support composite location-aware searches; for example, in a real estate database: "Find the homes for sale nearest to my current location that have a backyard and whose prices are between $50,000 and $80,000". Efficient processing of such queries requires combined indexing strategies over multiple types of data. Existing spatial query engines commonly apply a two-filter approach (a spatial filter followed by a nonspatial filter, or vice versa), which can incur large performance overheads. Meanwhile, the amount of geolocation data in databases has grown rapidly, due in part to advances in geolocation technologies (e.g., GPS-enabled smartphones) that allow users to associate location data with objects or events. This poses data-ingestion challenges for practical GIS databases handling large data volumes. In this dissertation, we first show how indexing spatial data with R-trees (a typical data pre-processing task) can be scaled in MapReduce, a widely adopted parallel programming model for data-intensive problems. The evaluation of our algorithms on a Hadoop cluster showed close to linear scalability in building R-tree indexes. Subsequently, we develop efficient algorithms for processing spatial queries with aspatial conditions, based on novel techniques for simultaneously indexing spatial, textual, and numeric data. Experimental evaluations with real-world, large spatial datasets measured query response times within the sub-second range for most cases, and up to a few seconds in a small number of cases, which is reasonable for interactive applications. Overall, these results show that the MapReduce parallel model is suitable for indexing tasks in spatial databases, and that an adequate combination of spatial and aspatial attribute indexes can attain acceptable response times for interactive spatial queries with constraints on aspatial data.

Abstract:

Unequal improvements in processor and I/O speeds have made many applications, such as databases and operating systems, increasingly I/O-bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors were proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we study traditional mirroring to provide a common basis for comparison.
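A toy sketch of the traditional mirroring scheme described above (not the thesis's simulation model): every write updates both physical disks, reads alternate between disks in normal mode, and all reads fall back to the surviving disk in failure mode.

```python
import itertools

class MirroredDisk:
    """Toy logical disk mirrored on two physical disks (illustrative sketch)."""
    def __init__(self):
        self.disks = [{}, {}]                 # two physical copies
        self._rr = itertools.cycle([0, 1])    # round-robin read balancing

    def write(self, block, data):
        for d in self.disks:                  # writes hit BOTH disks (the write cost)
            d[block] = data

    def read(self, block, failed=None):
        i = next(self._rr)
        if i == failed:                       # failure mode: only the survivor serves
            i = 1 - i
        return self.disks[i][block]

m = MirroredDisk()
m.write(7, b"payload")
print(m.read(7), m.read(7, failed=0))
```

The failure-mode branch makes the load-balancing drawback visible: with one disk failed, every request lands on the same survivor.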

Abstract:

The full-scale base-isolated structure studied in this dissertation is the only base-isolated building in the South Island of New Zealand. It experienced hundreds of earthquake ground motions from September 2010 well into 2012. Several large earthquake responses were recorded in December 2011 by NEES@UCLA and by a GeoNet recording station near Christchurch Women's Hospital. The primary focus of this dissertation is to advance the state of the art in methods for evaluating the performance of seismically isolated structures and the effects of soil-structure interaction, by developing new data-processing methodologies to overcome current limitations and by implementing advanced numerical modeling in OpenSees for direct analysis of soil-structure interaction.

This dissertation presents a novel method for recovering force-displacement relations within the isolators of building structures with unknown nonlinearities from sparse seismic-response measurements of floor accelerations. The method requires only direct matrix calculations (factorizations and multiplications); no iterative trial-and-error methods are required. The method requires a mass matrix, or at least an estimate of the floor masses. A stiffness matrix may be used, but is not necessary. Essentially, the method operates on a matrix of incomplete measurements of floor accelerations. In the special case of complete floor measurements of systems with linear dynamics, real modes, and equal floor masses, the principal components of this matrix are the modal responses. In the more general case of partial measurements and nonlinear dynamics, the method extracts a number of linearly dependent components from Hankel matrices of measured horizontal response accelerations, assembles these components row-wise, and extracts principal components from the singular value decomposition of this large matrix of linearly dependent components. These principal components are then interpolated between floors in a way that minimizes the curvature energy of the interpolation. This interpolation step can make use of a reduced-order stiffness matrix, a backward-difference matrix, or a central-difference matrix. The measured and interpolated floor acceleration components at all floors are then assembled and multiplied by a mass matrix. The recovered in-service force-displacement relations are then incorporated into the OpenSees soil-structure interaction model.
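The Hankel/SVD step at the heart of the method can be sketched in a few lines of NumPy; the channel signals, window length, and component count below are illustrative, not the dissertation's data:

```python
import numpy as np

def hankel_pcs(signals, L, r):
    """Stack a Hankel matrix of each measured acceleration channel row-wise
    and return the r leading principal components (illustrative sketch)."""
    blocks = []
    for y in signals:
        n = len(y) - L + 1
        H = np.array([y[i:i + n] for i in range(L)])   # L x n Hankel matrix
        blocks.append(H)
    M = np.vstack(blocks)                              # row-wise assembly
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:r]          # r dominant components of the stacked responses

t = np.linspace(0, 1, 200)
y1 = np.sin(2 * np.pi * 3 * t)            # two floors dominated by one mode
y2 = 0.5 * np.sin(2 * np.pi * 3 * t + 0.3)
pcs = hankel_pcs([y1, y2], L=20, r=1)
print(pcs.shape)  # (1, 181)
```

In the real pipeline these components are then interpolated between floors and scaled by the mass matrix to recover isolator forces.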

Numerical simulations of soil-structure interaction involving non-uniform soil behavior are conducted following the development of a complete soil-structure interaction model of Christchurch Women's Hospital in OpenSees. In these 2D OpenSees models, the superstructure is modeled as two-dimensional frames in the short-span and long-span directions, respectively. The lead-rubber bearings are modeled as elastomeric bearing (Bouc-Wen) elements. The soil underlying the concrete raft foundation is modeled with linear elastic plane-strain quadrilateral elements. The non-uniformity of the soil profile is incorporated by extracting and interpolating shear-wave velocity profiles from the Canterbury Geotechnical Database. The validity of the complete two-dimensional soil-structure interaction OpenSees model of the hospital is checked by comparing peak floor responses and force-displacement relations within the isolation system obtained from the OpenSees simulations against the recorded measurements. General explanations and implications of the effects of soil-structure interaction are described, supported by story drifts, floor acceleration and displacement responses, and force-displacement relations.
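In its simplest form, the extraction-and-interpolation of the shear-wave velocity profile reduces to resampling sparse measurements onto the soil-element grid; a minimal sketch with made-up depth/velocity pairs (the real values would come from the Canterbury Geotechnical Database):

```python
import numpy as np

# Hypothetical shear-wave velocity measurements (depth in m, Vs in m/s).
depth = np.array([0.0, 5.0, 12.0, 20.0])
vs    = np.array([150.0, 220.0, 300.0, 400.0])

# Resample onto the soil-element grid by linear interpolation.
grid = np.arange(0.0, 20.5, 0.5)
vs_grid = np.interp(grid, depth, vs)
print(round(float(vs_grid[grid == 2.5][0]), 1))  # 185.0 at 2.5 m depth
```

Each soil element then receives elastic properties consistent with its interpolated Vs value.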

Abstract:

Cloud computing offers massive scalability and elasticity required by many scientific and commercial applications. Combining the computational and data handling capabilities of clouds with parallel processing also has the potential to tackle Big Data problems efficiently. Science gateway frameworks and workflow systems enable application developers to implement complex applications and make these available for end-users via simple graphical user interfaces. The integration of such frameworks with Big Data processing tools on the cloud opens new opportunities for application developers. This paper investigates how workflow systems and science gateways can be extended with Big Data processing capabilities. A generic approach based on infrastructure-aware workflows is suggested and a proof of concept is implemented based on the WS-PGRADE/gUSE science gateway framework and its integration with the Hadoop parallel data processing solution based on the MapReduce paradigm in the cloud. The provided analysis demonstrates that the methods described to integrate Big Data processing with workflows and science gateways work well in different cloud infrastructures and application scenarios, and can be used to create massively parallel applications for scientific analysis of Big Data.
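The MapReduce paradigm that such a gateway delegates to Hadoop can be sketched in-memory; this toy version (not WS-PGRADE/gUSE code) shows the map, shuffle, and reduce phases that Hadoop distributes across the cloud:

```python
from collections import defaultdict

def mapreduce(records, mapper, reducer):
    """Minimal in-memory sketch of the MapReduce model:
    map -> shuffle (group by key) -> reduce."""
    groups = defaultdict(list)
    for rec in records:
        for key, val in mapper(rec):      # map phase
            groups[key].append(val)       # shuffle/group phase
    return {k: reducer(k, vs) for k, vs in groups.items()}  # reduce phase

# Word count, the canonical example.
docs = ["big data on the cloud", "big science gateways"]
out = mapreduce(docs,
                mapper=lambda d: [(w, 1) for w in d.split()],
                reducer=lambda k, vs: sum(vs))
print(out["big"])  # 2
```

The grouping step is the part Hadoop parallelizes across nodes; the gateway's job is to generate and submit such jobs from a graphical workflow description.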

Abstract:

The large upfront investment required for game development poses a severe barrier to the wider uptake of serious games in education and training. There is also a lack of well-established methods and tools that support game developers in preserving and enhancing the games' pedagogical effectiveness. The RAGE project, a Horizon 2020 funded research project on serious games, addresses these issues by making available reusable software components that aim to support the pedagogical qualities of serious games. In order to easily deploy and integrate these game components across a multitude of game engines, platforms, and programming languages, RAGE has developed and validated a hybrid component-based software architecture that preserves component portability and interoperability. While a first set of software components is being developed, this paper presents selected examples to explain the overall system's concept and its practical benefits. First, the Emotion Detection component uses the learners' webcams to capture their emotional states from facial expressions. Second, the Performance Statistics component is an add-on for learning-analytics data processing, which allows instructors to track and inspect learners' progress without having to deal with the required statistical computations. Third, a set of language-processing components supports the analysis of learners' textual inputs, facilitating comprehension assessment and prediction. Fourth, the Shared Data Storage component provides a technical solution for data storage, e.g. for player data or game-world data, across multiple software components. The presented components are exemplary of the anticipated RAGE library, which will include up to forty reusable software components for serious gaming, addressing diverse pedagogical dimensions.
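As a hypothetical sketch of what a portable component contract might look like (the actual RAGE architecture defines its own per-engine bridges; all names below are invented):

```python
from abc import ABC, abstractmethod

class GameComponent(ABC):
    """Hypothetical portable component interface: a component receives game
    events and returns results, independent of any particular game engine."""
    @abstractmethod
    def process(self, event: dict) -> dict: ...

class PerformanceStatistics(GameComponent):
    """Toy stand-in for a statistics add-on: tracks scores, reports the mean."""
    def __init__(self):
        self.scores = []

    def process(self, event):
        self.scores.append(event["score"])
        return {"mean": sum(self.scores) / len(self.scores)}

stats = PerformanceStatistics()
stats.process({"score": 4})
print(stats.process({"score": 8}))  # {'mean': 6.0}
```

Keeping the contract this narrow is what lets the same component logic sit behind thin bridges in different engines and languages.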

Abstract:

This paper synthesizes and discusses the spatial and temporal patterns of archaeological sites in Ireland, spanning the Neolithic period and the Bronze Age transition (4300–1900 cal BC), in order to explore the timing and implications of the main changes that occurred in the archaeological record of that period. Large amounts of new data are sourced from unpublished developer-led excavations and combined with national archives, published excavations and online databases. Bayesian radiocarbon models and context- and sample-sensitive summed radiocarbon probabilities are used to examine the dataset. The study captures the scale and timing of the initial expansion of Early Neolithic settlement and the ensuing attenuation of all such activity—an apparent boom-and-bust cycle. The Late Neolithic and Chalcolithic periods are characterised by a resurgence and diversification of activity. Contextualisation and spatial analysis of radiocarbon data reveals finer-scale patterning than is usually possible with summed-probability approaches: the boom-and-bust models of prehistoric populations may, in fact, be a misinterpretation of more subtle demographic changes occurring at the same time as cultural change and attendant differences in the archaeological record.
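As a rough illustration of the summed-probability idea, each dated sample can be represented as a density over calendar time and the densities summed across samples; this naive sketch uses normal densities and invented dates, whereas real analyses calibrate each determination against a radiocarbon calibration curve:

```python
import numpy as np

def summed_probability(dates, errors, grid):
    """Naive summed-probability sketch: each date approximated by a normal
    density, summed over samples and normalized by sample count."""
    total = np.zeros_like(grid, dtype=float)
    for mu, sig in zip(dates, errors):
        total += np.exp(-0.5 * ((grid - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    return total / len(dates)

# Invented dates (cal BC expressed as negative years) and 1-sigma errors.
grid = np.arange(-4300, -1900, 5)
spd = summed_probability([-3700, -3650, -2500], [40, 60, 50], grid)
print(grid[int(np.argmax(spd))])
```

The paper's point is precisely that such curves need contextual and spatial scrutiny: a peak or trough in the summed curve may reflect taphonomy or cultural change rather than population size.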

Abstract:

This information release, produced by the Department of Health, Social Services and Public Safety's Information and Analysis Directorate, provides information on smoking cessation services. Data are included on the monitoring of smoking cessation services in Northern Ireland during the period 1st April 2014 to 31st March 2015. This report also provides an analysis of data collected in 2014/15 in respect of clients who set a quit date during 2013/14 (52 week follow-up). Information contained within this report was downloaded from a web based recording system. Figures here are correct as of 1st September 2015. The Ten Year Tobacco Control Strategy for Northern Ireland aims to see fewer people starting to smoke, more smokers quitting and protecting people from tobacco smoke. It is aimed at the entire population of Northern Ireland as smoking and its harmful effects cut across all barriers of class, race and gender. There is a strong relationship between smoking and inequalities, with more people dying of smoking-related illnesses in disadvantaged areas of Northern Ireland than in its more affluent areas. In order to ensure that more focused action is directed to where it is needed most, three priority groups have been identified:
· children and young people;
· disadvantaged people who smoke; and
· pregnant women, and their partners, who smoke.
The Public Health Agency (PHA) is responsible for implementing the strategy, and the development of cessation services is a key element of the overall aim to tackle smoking. The 2013/14 Health Survey Northern Ireland reported that 22% of adults currently smoke (23% of males and 21% of females). In addition, in 2013, the Young Persons' Behaviour and Attitude Survey (YPBAS) found that 6% of pupils aged between 11 and 16 smoked (7% of males and 5% of females).

Abstract:

Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general-purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications for reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.

Abstract:

Portuguese schools below the higher-education level are equipped with infrastructure and equipment that make it possible to bring the world into the classroom, making the teaching and learning process richer and more motivating for students. The institutional adoption of a platform that follows the principles of the social web, SAPO Campus (SC), defined by openness, sharing, integration, innovation, and personalization, can catalyse processes of change and innovation. The purpose of this study was to follow the adoption of SC in five schools, and to analyse its impact on the teaching and learning process and the way students and teachers relate to this technology. The participating schools were divided into two groups: in the first group, of three schools, the follow-up was more interventionist and hands-on, while in the second group, of two schools, only the dynamics that developed during the adoption and use of SC were observed. In this study, a longitudinal multi-case study, data-analysis techniques such as descriptive statistics, content analysis, and Social Network Analysis (SNA) were applied, with the aim of analysing, through ongoing triangulation, the impacts observed from the use of SC. These impacts can be situated at three levels: the institution, the teachers, and the students. Regarding the institutional adoption of a technology, it was found that such adoption sends a message to the whole organization and that, in the case of SC, it calls for collective participation in an open environment where hierarchies dissipate. It was also found that adoption should involve students in meaningful activities and dynamic strategies, preferably integrated into a mobilizing project.
The adoption of SC also catalysed dynamics that changed patterns of content consumption and production, as well as attitudes towards the role of the social web in the teaching and learning process. The conclusions further point to a set of factors, observed in the study, that affected the adoption process, such as the role of leadership, the importance of teacher training, school culture, integration into a pedagogical project and, at a more basic level, issues of access to technology. Some communities built around SAPO Campus, involving teachers, students, and the wider community, evolved towards self-sustainability, along a path of reflection on pedagogical practices and sharing of experiences.
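The Social Network Analysis applied in the study can be illustrated with a minimal degree-centrality computation over a hypothetical SC interaction graph (names and edges below are invented):

```python
# Hypothetical interaction edges (who commented on whose post) in an SC group.
edges = [("ana", "rui"), ("ana", "sofia"), ("rui", "sofia"),
         ("sofia", "ana"), ("rui", "ana")]

# Build the undirected neighbour sets, ignoring repeated interactions.
neighbors = {}
for a, b in edges:
    neighbors.setdefault(a, set()).add(b)
    neighbors.setdefault(b, set()).add(a)

# Degree centrality: fraction of the other participants each one is tied to.
n = len(neighbors)
centrality = {p: len(ns) / (n - 1) for p, ns in neighbors.items()}
print(centrality["ana"])  # 1.0
```

In the study such measures help show how participation concentrates or spreads as a school community adopts the platform.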

Abstract:

Faculty of Geographical and Geological Sciences (Wydział Nauk Geograficznych i Geologicznych)

Abstract:

Doctorate in Forestry and Natural Resources Engineering - Instituto Superior de Agronomia - UL

Abstract:

Progress in the control of bovine tuberculosis (bTB) is often not uniform, usually due to one or more epidemiological factors, sometimes unknown, that impair the success of eradication programs. Spatial analysis can help identify clusters of disease persistence, leading to the identification of these factors and thus allowing the implementation of targeted control measures, and may provide insights into disease transmission, particularly when combined with molecular typing techniques. Here, the spatial dynamics of bTB in a high-prevalence region of Spain were assessed over a three-year period (2010-2012) using data from the eradication campaigns to detect clusters of positive bTB herds and of herds infected with certain Mycobacterium bovis strains (characterized using spoligotyping and VNTR typing). In addition, the within-herd transmission coefficient (β) was estimated in infected herds, and its spatial distribution and association with other potential outbreak and herd variables were evaluated. Significant clustering of positive herds was identified in the same location ("high-risk area") in all three years of the study. Three spoligotypes (SB0339, SB0121 and SB1142) accounted for >70% of the outbreaks detected over the three years. VNTR subtyping revealed the presence of a few highly prevalent strains within the high-risk area, suggesting maintained transmission there. The spatial autocorrelation found in the distribution of the estimated within-herd transmission coefficients in herds located within distances <14 km, together with the results of the spatial regression analysis, supports the hypothesis of shared local factors affecting disease transmission on farms located in close proximity.
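Spatial autocorrelation of the kind reported for the within-herd transmission coefficients is conventionally measured with a statistic such as Moran's I; a minimal sketch with invented β values and binary contiguity weights (the study's actual analysis uses real herd locations):

```python
import numpy as np

def morans_i(values, W):
    """Moran's I spatial autocorrelation statistic (standard formula):
    I = (n / sum(W)) * (z' W z) / (z' z), with z the centred values."""
    z = values - values.mean()
    n = len(values)
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

# Hypothetical within-herd beta estimates on four farms along a line;
# neighbours are adjacent farms (binary contiguity weights).
beta = np.array([0.9, 0.8, 0.2, 0.1])
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(round(morans_i(beta, W), 3))  # 0.4
```

A positive value, as here, indicates that nearby farms carry similar transmission coefficients, which is the pattern the abstract interprets as shared local risk factors.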

Abstract:

The literature clearly links the quality and capacity of a country's infrastructure to its economic growth and competitiveness. This thesis analyses the historic national and spatial distribution of investment by the Irish state in its physical networks (water, wastewater and roads) across the 34 local authorities and examines how Ireland is perceived internationally relative to its economic counterparts. An appraisal of the current status and shortcomings of Ireland's infrastructure is undertaken with key stakeholders from foreign direct investment companies and national policymakers to identify Ireland's infrastructural gaps, along with current challenges in how the country delivers infrastructure. The output of these interviews identified many issues with how infrastructure decision-making is currently undertaken. This led to an evaluation of how other countries inform decision-making, and thus this thesis presents a framework for how and why Ireland should embrace a Systems of Systems (SoS) methodology for infrastructure decision-making going forward. In undertaking this study a number of other infrastructure challenges were identified: significant political interference in infrastructure decision-making and delivery; the need for a national agency to remove the existing 'silo' mentality in infrastructure delivery; and the significance of tax incentives and how they can interfere with the market. The two key infrastructure gaps identified during the interview process were: the need for government intervention in the rollout of sufficient communication capacity, at a competitive cost, outside of Dublin; and the urgent need to address water quality and capacity, with approximately 25% of the population currently served by water of unacceptable quality. Despite considerable investment in its national infrastructure, Ireland's infrastructure performance continues to trail behind that of its economic partners in the Eurozone and OECD. Ireland is projected to have the highest growth rate in the euro-zone region in 2015 and 2016, albeit having required a bailout in 2010, and, at the time of writing, is beginning to invest in its infrastructure networks again. This thesis proposes the development and implementation of a SoS approach to infrastructure decision-making based on: existing spatial and capacity data for each of the constituent infrastructure networks; and scenario computation and analysis of alternative drivers, e.g. demographic change, economic variability, and demand/capacity constraints. The output from such an analysis would provide valuable evidence upon which policymakers and decision-makers alike could rely, which has been lacking in historic investment decisions.

Abstract:

This thesis builds a framework for evaluating downside risk from multivariate data via a special class of risk measures (RM). The distinguishing feature of the analysis is that it dispenses with strong distributional assumptions about the data and focuses on the data most critical in risk management: those with asymmetries and heavy tails. At the same time, under typical assumptions, such as ellipticity of the data's probability distribution, conformity with classical methods is shown. The constructed class of RM is a multivariate generalization of the coherent distortion RM, which possesses properties valuable to a risk manager. The framework has two parts. The first part contains new computational-geometry methods for high-dimensional data. The developed algorithms demonstrate the computability of the geometrical concepts used to construct the RM; these concepts aid visualization and simplify interpretation of the RM. The second part develops models for applying the framework to practical problems. The spectrum of applications ranges from robust portfolio selection to broader areas, such as stochastic conic optimization with risk constraints and supervised machine learning.
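For the univariate building block, a coherent distortion risk measure of an empirical loss sample can be sketched as a discrete Choquet integral; CVaR arises from one standard choice of distortion function (the loss data below are illustrative, and this sketch is not the thesis's multivariate generalization):

```python
import numpy as np

def distortion_rm(losses, g):
    """Distortion risk measure of an empirical loss sample:
    rho = sum_k [g(k/n) - g((k-1)/n)] * (k-th largest loss),
    a standard discrete Choquet-integral form."""
    x = np.sort(losses)[::-1]                  # largest loss first
    n = len(x)
    p = np.arange(1, n + 1) / n
    w = g(p) - g(np.concatenate(([0.0], p[:-1])))
    return float(w @ x)

# CVaR at level alpha corresponds to the concave distortion g(u) = min(u/(1-alpha), 1).
alpha = 0.75
g = lambda u: np.minimum(u / (1 - alpha), 1.0)
losses = np.array([1.0, 2.0, 3.0, 4.0])
print(distortion_rm(losses, g))  # 4.0, the mean of the worst 25% of losses
```

Choosing g(u) = u recovers the plain expected loss, illustrating how the distortion function reweights the tail.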

Abstract:

Given the persistence of regional differences in labour income in Colombia, this article quantifies the share of this differential that is attributable to differences in labour-market structure, understood as differences in the returns to labour-force characteristics. To this end, an Oaxaca-Blinder decomposition method is used to compare Bogotá, the city with the highest labour income, with other major cities. The results of the decomposition exercise show that the structural differences favour Bogotá and explain more than half of the total difference, indicating that if labour-income disparities between cities are to be reduced, upskilling the labour force is not enough, and the causes behind the differing returns to characteristics across cities must be investigated.
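A sketch of the two-fold Oaxaca-Blinder decomposition on synthetic data (the coefficients and "schooling" variable below are invented; the article's actual estimates come from household-survey data):

```python
import numpy as np

def oaxaca_blinder(XA, yA, XB, yB):
    """Two-fold Oaxaca-Blinder decomposition of mean(yA) - mean(yB):
    endowments (XbarA - XbarB)' bB  +  structure XbarA' (bA - bB)."""
    bA, *_ = np.linalg.lstsq(XA, yA, rcond=None)
    bB, *_ = np.linalg.lstsq(XB, yB, rcond=None)
    xA, xB = XA.mean(axis=0), XB.mean(axis=0)
    endow = (xA - xB) @ bB          # gap from different characteristics
    struct = xA @ (bA - bB)         # gap from different returns (structure)
    return endow, struct

# Synthetic example: city A pays higher returns to schooling than city B.
rng = np.random.default_rng(0)
n = 5000
sA = rng.normal(12, 2, n); sB = rng.normal(11, 2, n)
XA = np.column_stack([np.ones(n), sA]); XB = np.column_stack([np.ones(n), sB])
yA = 1.0 + 0.10 * sA + rng.normal(0, 0.1, n)   # log-wage, return 0.10
yB = 1.0 + 0.06 * sB + rng.normal(0, 0.1, n)   # log-wage, return 0.06
endow, struct = oaxaca_blinder(XA, yA, XB, yB)
print(round(endow + struct, 2))
```

By construction the two components sum exactly to the mean gap, and in this synthetic setup the structure term dominates, mirroring the article's finding for Bogotá.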