896 results for computation- and data-intensive applications


Abstract:

The proportion of the population living in or around cities is higher than ever and still growing. Urban sprawl and car dependence have taken over the pedestrian-friendly compact city. Environmental problems such as air pollution, land waste and noise, as well as health problems, are the result of this continuing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to gain a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scale laws are used to characterise urban clusters. In a final section, population evolution is modelled using a model close to the well-established gravity model. The work covers a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
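As an illustration of the kind of self-organising map analysis mentioned above, the following is a minimal SOM trainer in Python. The grid size, decay schedule and synthetic data are assumptions made for the sketch, not the thesis's actual configuration.

```python
# Minimal self-organising map (SOM) sketch for clustering high-dimensional
# socio-economic indicators. Hypothetical data: rows = municipalities,
# columns = standardised indicators.
import numpy as np

def train_som(data, grid=(10, 10), epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a rectangular SOM with exponentially decaying
    learning rate and neighbourhood radius."""
    rng = np.random.default_rng(seed)
    n_rows, n_cols = grid
    dim = data.shape[1]
    weights = rng.normal(size=(n_rows, n_cols, dim))
    # Grid coordinates, used to compute neighbourhood distances on the map.
    coords = np.stack(np.meshgrid(np.arange(n_rows), np.arange(n_cols),
                                  indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr = lr0 * np.exp(-3.0 * t)          # decaying learning rate
            sigma = sigma0 * np.exp(-3.0 * t)    # shrinking neighbourhood
            # Best-matching unit (BMU): node whose weight is closest to x.
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighbourhood around the BMU on the grid.
            grid_d2 = ((coords - np.array(bmu)) ** 2).sum(axis=2)
            h = np.exp(-grid_d2 / (2.0 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)
            step += 1
    return weights

# Usage with synthetic data standing in for real socio-economic indicators.
data = np.random.default_rng(1).normal(size=(500, 8))
som = train_som(data)
print(som.shape)  # (10, 10, 8): one prototype vector per map node
```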

Abstract:

The emergence of powerful new technologies, the existence of large quantities of data, and increasing demands to extract added value from these technologies and data have created significant challenges for those charged with both corporate and information technology management. The possibilities are great, the expectations high, and the risks significant. Organisations seeking to employ cloud technologies and to exploit the value of the data to which they have access, whether in the form of "Big Data" from external sources or data held within the organisation, in structured or unstructured formats, need to understand the risks involved in such activities. Data owners have responsibilities towards the subjects of the data and must frequently also demonstrate compliance with current standards, laws and regulations. This thesis sets out to explore the nature of the technologies that organisations might utilise, to identify the most pertinent constraints and risks, and to propose a framework for the management of data from discovery to external hosting that allows the most significant risks to be managed through the definition, implementation and performance of appropriate internal control activities.

Abstract:

Applications of phosphogypsum (PG) provide nutrients to the soil and reduce Al3+ activity, favouring soil fertility and root growth, but allow Mg2+ mobilisation through the soil profile, resulting in variations in the PG rate required to achieve the optimum crop yield. This study evaluated the effect of application rates and of splitting PG applications on the fertility of a Typic Hapludox, as well as the influence on annual crops under no-tillage. Using a (4 × 3) + 1 factorial structure, the treatments consisted of four PG rates (3, 6, 9, and 12 Mg ha-1) and three split-application schemes (P1 = 100 % in 2009; P2 = 50+50 % in 2009 and 2010; P3 = 33+33+33 % in 2009, 2010 and 2011), plus a control without PG. The soil was sampled six months after the last PG application, in stratified layers to a depth of 0.8 m. Corn, wheat and soybean were sown between November 2011 and December 2012, and leaf samples were collected for analysis when at least 50 % of the plants showed reproductive structures. The application of PG increased Ca2+ concentrations in all sampled soil layers and the soil pH between 0.2 and 0.8 m, and reduced the concentrations of Al3+ in all layers and of Mg2+ to a depth of 0.6 m, with no effect of splitting the applications. The soil Ca/Mg ratio increased linearly with the rates to a depth of 0.6 m and was higher in the 0.0-0.1 m layer of the P2 and P3 treatments than without splitting (P1). Sulfur concentrations increased linearly with application rate to a depth of 0.8 m, decreased in the order P3>P2>P1 to a depth of 0.4 m, and were higher in treatments P3 and P2 than in P1 between 0.4 and 0.6 m, whereas no differences were observed in the 0.6-0.8 m layer. No effect was recorded for K, P or potential acidity (H+Al). Leaf Ca and S concentrations increased, while Mg decreased, for all crops treated with PG, with no effect of splitting the application. The yield response of corn to PG rates was quadratic, with the maximum technical efficiency achieved at 6.38 Mg ha-1 of PG, while wheat yield increased linearly in a growing season with a drought period. Soybean yield was not affected by the PG rate, and splitting had no effect on the yield of any of the crops. Phosphogypsum improved soil fertility throughout the profile; however, Mg2+ migrated downwards regardless of application splitting. Splitting the PG application induced a higher Ca/Mg ratio in the 0.0-0.1 m layer and less S leaching, but did not affect crop yield. The application rates had no effect on soybean yield, but were beneficial for corn and especially for wheat, which was affected by a drought period during growth.
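As a hedged numeric illustration of how a maximum-technical-efficiency rate is read off a quadratic yield response: fit y = a + bx + cx² and take the vertex x* = -b/(2c). The yield values below are invented placeholders, not data from the study.

```python
# Illustrative only: fitting a quadratic yield-response curve and locating
# the rate of maximum technical efficiency, as in the corn result above.
# The yield numbers are hypothetical, not measurements from the study.
import numpy as np

rates = np.array([0.0, 3.0, 6.0, 9.0, 12.0])   # PG rates, Mg ha-1
yields = np.array([8.1, 9.0, 9.5, 9.3, 8.8])   # hypothetical corn yield, Mg ha-1

c, b, a = np.polyfit(rates, yields, 2)         # y = a + b*x + c*x**2
x_star = -b / (2.0 * c)                        # vertex of the parabola
print(f"maximum technical efficiency at {x_star:.2f} Mg ha-1 of PG")
```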

Abstract:

Brain fluctuations at rest are not random but are structured in spatial patterns of correlated activity across different brain areas. The question of how resting-state functional connectivity (FC) emerges from the brain's anatomical connections has motivated several experimental and computational studies of structure-function relationships. However, the mechanistic origin of the resting state is obscured by the complexity of large-scale models, and a close structure-function relation remains an open problem. Thus, a realistic but sufficiently simple description of relevant brain dynamics is needed. Here, we derived a dynamic mean field model that consistently summarizes the realistic dynamics of a detailed spiking and conductance-based synaptic large-scale network, in which connectivity is constrained by diffusion imaging data from human subjects. The dynamic mean field approximates the ensemble dynamics, whose temporal evolution is dominated by the longest time scale of the system. With this reduction, we demonstrated that FC emerges as structured linear fluctuations around a stable low-firing-rate state close to destabilization. Moreover, the model can be further and crucially simplified into a set of motion equations for statistical moments, providing a direct analytical link between anatomical structure, neural network dynamics, and FC. Our study suggests that FC arises from noise propagation and the dynamical slowing down of fluctuations in an anatomically constrained dynamical system. Altogether, the reduction from spiking models to statistical moments presented here provides a new framework for explicitly understanding the build-up of FC through neuronal dynamics underpinned by anatomical connections, and for driving hypotheses in task-evoked studies and clinical applications.
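The "structured linear fluctuations" result lends itself to a compact sketch: for a linearised system, the stationary second moments follow from a Lyapunov equation, giving model FC directly from the coupling matrix. The surrogate connectivity, coupling value and noise level below are assumptions, not the paper's fitted model.

```python
# Minimal sketch: for a linearised system dx = A x dt + dW with A built from
# an anatomical connectivity matrix C (here random, standing in for
# diffusion-imaging data), the stationary covariance solves the Lyapunov
# equation A P + P A^T + Q = 0, and model FC is the correlation matrix of P.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(0)
n = 66                                   # number of brain regions (illustrative)
C = np.abs(rng.normal(size=(n, n)))      # surrogate structural connectivity
np.fill_diagonal(C, 0.0)
C /= C.sum(axis=1, keepdims=True)        # row-normalise the coupling

g = 0.6                                  # global coupling; g < 1 keeps A stable
A = -np.eye(n) + g * C                   # linearised drift around the fixed point
Q = np.eye(n)                            # white-noise input covariance

# Stationary second moments: A P + P A^T = -Q
P = solve_continuous_lyapunov(A, -Q)
d = np.sqrt(np.diag(P))
FC = P / np.outer(d, d)                  # model functional connectivity
print(FC.shape, FC[0, :5])
```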

Abstract:

The paper presents the Multiple Kernel Learning (MKL) approach as a modelling and data-exploration tool and applies it to the problem of wind speed mapping. Support Vector Regression (SVR) is used to predict spatial variations of the mean wind speed from terrain features (slopes, terrain curvature, directional derivatives) generated at different spatial scales. Multiple Kernel Learning is applied to learn kernels for individual features and for thematic feature subsets, both in the context of feature selection and of optimal parameter determination. An empirical study on real-life data confirms the usefulness of MKL as a tool that enhances the interpretability of data-driven models.
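A schematic stand-in for the MKL-with-SVR setup is sketched below: one RBF kernel per thematic feature subset, combined as a convex sum whose weights are chosen here by a simple validation search. Real MKL optimises the weights jointly with the SVR; the features, kernel width and data are invented for the sketch.

```python
# Schematic MKL-style regression: K = sum_m d_m K_m over feature subsets,
# with weights d_m selected by validation rather than joint optimisation.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))          # hypothetical terrain features
y = X[:, 0] * 0.8 + np.sin(X[:, 3]) + 0.1 * rng.normal(size=300)

subsets = [slice(0, 3), slice(3, 6)]   # e.g. slope vs. curvature feature groups
idx_tr, idx_te = train_test_split(np.arange(len(X)), random_state=0)

def combined_kernel(A, B, weights, gamma=0.5):
    # Convex combination of per-subset RBF kernels.
    return sum(w * rbf_kernel(A[:, s], B[:, s], gamma=gamma)
               for w, s in zip(weights, subsets))

best = None
for w0 in np.linspace(0.0, 1.0, 11):   # scan candidate weightings
    w = (w0, 1.0 - w0)
    K_tr = combined_kernel(X[idx_tr], X[idx_tr], w)
    K_te = combined_kernel(X[idx_te], X[idx_tr], w)
    model = SVR(kernel="precomputed").fit(K_tr, y[idx_tr])
    err = mean_squared_error(y[idx_te], model.predict(K_te))
    if best is None or err < best[0]:
        best = (err, w)

print("selected kernel weights:", best[1])  # larger weight = more informative subset
```

The selected weights play the interpretability role described in the abstract: a subset whose kernel receives negligible weight contributes little to the prediction.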

Abstract:

The Powell Basin is a small oceanic basin located at the NE end of the Antarctic Peninsula. It developed during the Early Miocene and is mostly surrounded by the continental crust of the South Orkney Microcontinent, the South Scotia Ridge and the Antarctic Peninsula margins. Gravity data from the SCAN 97 cruise of the R/V Hespérides and data from the Global Gravity Grid and Sea Floor Topography (GGSFT) database (Sandwell and Smith, 1997) are used to determine the 3D geometry of the crust-mantle interface (CMI) by numerical inversion methods. The water-layer contribution and sedimentary effects were removed from the free-air anomaly to obtain the total anomaly. Sedimentary effects were obtained from the analysis of existing and new SCAN 97 multichannel seismic (MCS) profiles. The regional anomaly was obtained after spectral analysis and filtering. The smooth 3D geometry of the crust-mantle interface obtained after inversion of the regional anomaly shows an increase in crustal thickness towards the continental margins and a NW-SE oriented axis of symmetry coinciding with the position of an older oceanic spreading axis. The interface shows a moderate uplift towards the western part and two main uplifts in the northern and eastern sectors.
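One step of the described workflow, separating the regional (long-wavelength) anomaly by spectral low-pass filtering before inversion, can be sketched as follows. The grid spacing, cutoff wavelength and synthetic anomaly are placeholders, not the SCAN 97 data or the authors' exact filter.

```python
# Sketch: regional/residual separation of a gridded gravity anomaly by
# low-pass filtering in the wavenumber domain.
import numpy as np

def regional_anomaly(grav, dx_km=5.0, cutoff_km=100.0):
    """Low-pass filter a gridded anomaly, keeping wavelengths longer than
    roughly `cutoff_km` (the regional field)."""
    ny, nx = grav.shape
    kx = np.fft.fftfreq(nx, d=dx_km)
    ky = np.fft.fftfreq(ny, d=dx_km)
    k = np.hypot(*np.meshgrid(kx, ky))          # radial wavenumber (1/km)
    lowpass = np.exp(-(k * cutoff_km) ** 2)     # smooth Gaussian taper
    return np.real(np.fft.ifft2(np.fft.fft2(grav) * lowpass))

rng = np.random.default_rng(0)
total = rng.normal(size=(128, 128))             # stand-in total anomaly (mGal)
regional = regional_anomaly(total)
residual = total - regional                     # short-wavelength component
print(regional.shape, residual.std())
```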

Abstract:

In this paper, we present an efficient numerical scheme for the recently introduced geodesic active fields (GAF) framework for geometric image registration. This framework considers the registration task as a weighted minimal surface problem, in which the data term and the regularization term are combined through multiplication in a single, parametrization-invariant and geometric cost functional. The multiplicative coupling provides an intrinsic, spatially varying and data-dependent tuning of the regularization strength, and the parametrization invariance allows working with images of non-flat geometry, generally defined on any smoothly parametrizable manifold. The resulting energy-minimizing flow, however, has poor numerical properties. Here, we provide an efficient numerical scheme based on a splitting approach: the data and regularity terms are optimized over two distinct deformation fields that are constrained to be equal via an augmented Lagrangian approach. Our approach is more flexible than standard Gaussian regularization, since one can interpolate freely between isotropic Gaussian and anisotropic TV-like smoothing. We compare the resulting FastGAF method with the popular Demons method and three more recent state-of-the-art algorithms: NL-optical flow, MRF image registration, and landmark-enhanced large displacement optical flow. FastGAF compares favourably against Demons, both in terms of registration speed and quality, and over the range of example applications it consistently produces results close to those of more dedicated state-of-the-art methods, illustrating the flexibility of the proposed framework.
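The splitting idea can be illustrated on a toy problem: a data term optimised over u, a regularity term over v, and the constraint u = v enforced by an augmented Lagrangian. Only the alternating structure mirrors the paper; the 1-D field, quadratic data term and moving-average "regulariser" are stand-ins, not the GAF energy.

```python
# Toy splitting scheme: alternate a data-term update on u, a regularity
# update on v, and a dual ascent enforcing u = v.
import numpy as np

rng = np.random.default_rng(0)
d = np.cumsum(rng.normal(size=200)) * 0.1     # noisy target field for the data term

def smooth(x, width=5):
    """Proxy regulariser: moving-average smoothing of the field."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

u = np.zeros_like(d)      # field driven by the data term
v = np.zeros_like(d)      # field driven by the regularity term
lam = np.zeros_like(d)    # Lagrange multipliers for the constraint u = v
rho = 1.0                 # augmented Lagrangian penalty weight

for _ in range(100):
    # u-step: argmin_u ||u - d||^2 + rho/2 ||u - v + lam||^2 (closed form)
    u = (2 * d + rho * (v - lam)) / (2 + rho)
    # v-step: regularise the current estimate (stands in for the smoothing prox)
    v = smooth(u + lam)
    # dual ascent on the constraint u = v
    lam += u - v

print(np.abs(u - v).max())   # residual constraint violation after iterating
```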

Abstract:

The present study focuses on single-case data analysis, and specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique that yields similar information. The comparison is carried out on generated data representing a variety of patterns (i.e., independent measurements, different serial-dependence processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that the two techniques perform adequately over a wide range of conditions, and researchers can use both of them with certain guarantees. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to applied researchers to help them choose among the plurality of single-case data analysis techniques.
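A hedged sketch of the regression side of the comparison: a two-phase level-change model fitted by GLS with an AR(1) error structure. The simulated series and the generic specification below are assumptions, not the paper's exact models.

```python
# Two-phase (baseline/treatment) level-change model, GLS with AR(1) errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_a, n_b = 10, 10                                  # baseline / treatment lengths
phase = np.r_[np.zeros(n_a), np.ones(n_b)]         # 0 = baseline, 1 = treatment
e = np.zeros(n_a + n_b)
for t in range(1, len(e)):                         # AR(1) errors, rho = 0.4
    e[t] = 0.4 * e[t - 1] + rng.normal()
y = 2.0 + 1.5 * phase + e                          # true level change = 1.5

X = sm.add_constant(phase)
ols = sm.OLS(y, X).fit()
rho = sm.OLS(ols.resid[1:], ols.resid[:-1]).fit().params[0]  # estimate rho
lags = np.abs(np.subtract.outer(np.arange(len(y)), np.arange(len(y))))
sigma = rho ** lags                                # AR(1) covariance structure
gls = sm.GLS(y, X, sigma=sigma).fit()
print(gls.params)                                  # intercept and level change
```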

Abstract:

This Master's thesis studies ways to brand and vary S60 software dynamically at run time. S60 is a development platform used by several phone manufacturers, whose phones are in turn used by numerous different operators. Operators want their phones, or a subset of the phone's applications, to stand out from competitors with their own brand, so there must be means to brand either the whole phone or selected applications. Some applications may need to switch the brand they use according to the resources they access, such as a network server. It must also be possible to share variation data between different applications or parts of applications. The thesis introduces the Symbian operating system and the S60 development environment, and considers the challenges that Symbian's security policies pose for sharing variation data between applications. Existing variation techniques are examined as a possible basis for the work. The thesis includes a description of a project in which a dynamic branding implementation was developed for an S60 application; the implementation also enables sharing the variation data with other applications.

Abstract:

Intensifying competition has confronted companies with difficult challenges. Products should reach the market faster, new products should be better than the old ones and, in particular, better than competitors' equivalents. At the same time, design, manufacturing and other costs should remain low. Product data, its management and its exchange are often used to help meet these challenges. Andritz, like other companies, has to take these issues into account to succeed in the competition. This thesis was written for Andritz, one of the world's leading manufacturers of pulp and paper production equipment and providers of related maintenance services. Andritz is introducing an ERP system at all of its sites and wants to exploit it as effectively as possible, so product data covering the entire life cycle should reside in the system. Part of the product data is created by Andritz's partners and subcontractors, so the data exchange with partners should also be arranged so that the data flows directly into the ERP system. The goal of this thesis is therefore to find a solution for handling the data exchange between Andritz and its partners. The thesis presents the purpose and importance of product data, its management and its exchange. Different alternatives for implementing the data exchange system are presented, some based on general and industry-specific standards, and two commercial products are also introduced. The following standards are examined: PaperIXI, papiNet, X-OSCO, the PSK standards, and RosettaNet. In addition, the data exchange solutions of the ERP vendor, SAP, are examined. The best of these alternatives are analysed in more detail, and finally the different solutions are compared with each other in order to find the alternative best suited to Andritz's needs.

Abstract:

Performance and load testing of applications is nowadays a very important part of the production process, and Web applications are being tested more and more. The need for performance and load testing is clear: correctly designed tests, and the corrective actions that follow them, guarantee that the tested environment works both now and in the future. Testing large numbers of users manually, however, is very difficult, and a fragmented environment, such as a service-based Web application environment, is a challenge to test. The topic of this thesis is to evaluate tools and methods for testing heavy industrial Web applications. The goal is to find testing methods that can reliably simulate large numbers of users, and to evaluate the effect of different connections and protocols on the performance of a Web application.
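In the spirit of the thesis topic, a minimal load-generation sketch follows: it simulates concurrent users against an HTTP endpoint and records response times. The URL and load parameters are placeholders; production tools add ramp-up profiles, think times and protocol mixes.

```python
# Minimal concurrent load generator using only the standard library.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://localhost:8080/"   # hypothetical system under test
USERS = 50                       # simulated concurrent users
REQUESTS_PER_USER = 20

def user_session(user_id):
    """One simulated user issuing a fixed number of sequential requests."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            with urlopen(URL, timeout=10) as resp:
                resp.read()
            timings.append(time.perf_counter() - start)
        except OSError:
            timings.append(float("nan"))   # failed request
    return timings

with ThreadPoolExecutor(max_workers=USERS) as pool:
    results = list(pool.map(user_session, range(USERS)))

ok = [t for session in results for t in session if t == t]  # drop NaN failures
if ok:
    print(f"{len(ok)} successful responses, mean latency {sum(ok) / len(ok):.3f}s")
else:
    print("no successful responses - is the target reachable?")
```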

Abstract:

The convergence of e-business and mobility, together with the accelerating pace of technological innovation, has generated interest in wireless business solutions. The goal of this Master's thesis was to study the evaluation and development process of wireless e-business applications, focusing on wireless tracking of the paper industry supply chain. The study presents a definition of wireless e-business, describes different application areas of wireless technology, and outlines the strategic and technological dimensions of the application evaluation and development process. The thesis builds a framework for examining the significance of wireless technologies in logistics. The most significant result of the study is a process model for evaluating and developing applications; a wireless application developed with the model proved useful in supply chain management.

Abstract:

The most suitable method for the estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size, taking the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches for the estimation of the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions have been used to generate simulated samples using parameters estimated from real samples. Nonparametric methods include the discrete computation of data histograms based on size intervals and the continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which shows additional advantages: the same size diversity value is obtained whether original sizes or log-transformed data are used, and size measurements of different dimensionality (lengths, areas, volumes or biomasses) may be compared immediately by the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation after standardization by division by the sample geometric mean emerges as the most reliable and generalizable method of size diversity evaluation.
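The recommended recipe translates directly into code: standardise sizes by the sample geometric mean, estimate the pdf with a Gaussian kernel, and integrate -p ln p numerically. The synthetic log-normal sample below is an assumption made for the sketch.

```python
# Shannon size diversity H = -integral of p(x) ln p(x) dx, with a kernel
# pdf estimate after geometric-mean standardisation.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
sizes = rng.lognormal(mean=1.0, sigma=0.6, size=500)   # synthetic body sizes

std_sizes = sizes / np.exp(np.mean(np.log(sizes)))     # divide by geometric mean
kde = gaussian_kde(std_sizes)

x = np.linspace(1e-3, std_sizes.max() * 1.5, 2000)     # integration grid
dx = x[1] - x[0]
p = kde(x)
H = -np.sum(p * np.log(p)) * dx                        # numerical integral
print(f"size diversity H = {H:.3f}")
```

Running the same computation on log-transformed standardised sizes should, per the abstract's claim, give essentially the same H.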

Abstract:

In the world of transport management, the term 'anticipation' is gradually replacing 'reaction'. Indeed, the ability to forecast traffic evolution in a network should ideally form the basis for many traffic management strategies and multiple ITS applications. Real-time prediction capabilities are therefore becoming a concrete need for the management of networks, in both urban and interurban environments, and today's road operator has increasingly complex and exacting requirements. Recognising temporal patterns in traffic, or the manner in which sequential traffic events evolve over time, has been an important consideration in short-term traffic forecasting. However, little work has been conducted on identifying traffic pattern occurrence or associating it with prevailing traffic conditions. This paper presents a framework for traffic pattern identification based on finite mixture models, using the EM algorithm for parameter estimation. The computations have been carried out on traffic data available from an urban network.
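The modelling step can be sketched with an off-the-shelf EM fit of a finite Gaussian mixture; the two synthetic traffic regimes below (free-flow vs. congested flow-speed pairs) are invented for illustration and are not the paper's data or exact model.

```python
# Finite Gaussian mixture fitted by EM; each component plays the role of
# one traffic pattern, and observations are assigned to their most likely one.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two synthetic regimes: (flow veh/h, speed km/h) pairs.
free_flow = rng.normal([1200.0, 55.0], [150.0, 4.0], size=(300, 2))
congested = rng.normal([1800.0, 20.0], [200.0, 5.0], size=(200, 2))
X = np.vstack([free_flow, congested])

gmm = GaussianMixture(n_components=2, covariance_type="full",
                      random_state=0).fit(X)       # EM parameter estimation
labels = gmm.predict(X)                            # most likely pattern per point
print(gmm.means_)           # recovered regime centres
print(np.bincount(labels))  # observations assigned to each pattern
```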

Abstract:

In this paper, we report a preliminary analysis of the impact of Global Navigation Satellite System Reflections (GNSS-R) data on ionospheric monitoring over the oceans. The focus is on a single polar Low Earth Orbiter (LEO) mission exploiting GNSS-R as well as navigation (GNSS-N) and occultation (GNSS-O) total electron content (TEC) measurements. In order to assess the impact of the data, we have simulated GNSS-R/O/N TEC data as would be measured from the LEO and from International Geodesic Service (IGS) ground stations, with an electron density (ED) field generated using a climatic ionospheric model. We have also developed a new tomographic approach, inspired by the physics of the hydrogen atom, and used it to effectively retrieve the ED field from the simulated TEC data near the orbital plane. The tomographic inversion results demonstrate the significant impact of GNSS-R: three-dimensional ionospheric ED fields are retrieved over the oceans quite accurately, even though, in the spirit of this initial study, the simulation and inversion approaches avoided intensive computation and sophisticated algorithmic elements (such as spatio-temporal smoothing). We conclude that GNSS-R data over the oceans can contribute significantly to a Global/GNSS Ionospheric Observation System (GIOS).
Index terms: Global Navigation Satellite System (GNSS), Global Navigation Satellite System Reflections (GNSS-R), ionosphere, Low Earth Orbiter (LEO), tomography.
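A toy version of such a tomographic inversion: TEC measurements modelled as line integrals y = Ax through a gridded ED field, recovered by Tikhonov-regularised least squares. The ray geometry, grid and noise level are simplified placeholders, and the method here is generic least squares rather than the paper's hydrogen-atom-inspired approach.

```python
# Toy linear tomography: y = A x with a sparse random path-length matrix A
# standing in for real ray geometry; recover x by Tikhonov regularisation.
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_rays = 100, 400

# Sparse random path-length matrix (placeholder for LEO/ground-station rays).
A = (rng.random((n_rays, n_cells)) < 0.1) * rng.random((n_rays, n_cells))

x_true = np.exp(-((np.arange(n_cells) - 50.0) / 15.0) ** 2)  # smooth ED profile
y = A @ x_true + 0.01 * rng.normal(size=n_rays)               # noisy TEC data

# Tikhonov regularisation: minimise ||A x - y||^2 + alpha^2 ||x||^2
alpha = 0.5
A_aug = np.vstack([A, alpha * np.eye(n_cells)])
y_aug = np.concatenate([y, np.zeros(n_cells)])
x_hat, *_ = np.linalg.lstsq(A_aug, y_aug, rcond=None)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```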