39 results for Numerical integration


Relevance: 70.00%

Abstract:

Numerical models, used for atmospheric research, weather prediction and climate simulation, describe the state of the atmosphere over the heterogeneous surface of the Earth. Several fundamental properties of atmospheric models depend on orography, i.e. on the average elevation of the land over a model area. The higher the model's resolution, the more directly the details of orography influence the simulated atmospheric processes. This sets new requirements for the accuracy of the model formulations with respect to the spatially varying orography. Orography is always averaged, representing the surface elevation within the horizontal resolution of the model. In order to remove the smallest scales and steepest slopes, the continuous spectrum of orography is normally filtered (truncated) even further, typically beyond a few gridlengths of the model. This means that numerical weather prediction (NWP) models will always contain subgrid-scale orography effects, which cannot be explicitly resolved by numerical integration of the basic equations and therefore require parametrization. At the subgrid scale, different physical processes contribute at different scales. The parametrized processes interact with the resolved-scale processes and with each other.

This study contributes to the building of a consistent, scale-dependent system of orography-related parametrizations for the High Resolution Limited Area Model (HIRLAM). The system comprises schemes for handling the effects of mesoscale (MSO) and small-scale (SSO) orography on the simulated flow and a scheme for the orographic effects on the surface-level radiation fluxes. The representation of orography, the scale dependencies of the simulated processes and the interactions between the parametrized and resolved processes are discussed. From high-resolution digital elevation data, orographic parameters are derived for both the momentum and the radiation flux parametrizations. Tools for diagnostics and validation are developed and presented. The parametrization schemes applied, developed and validated in this study are currently being implemented into the reference version of HIRLAM.
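The spectral truncation described above, removing orographic scales beyond a few gridlengths, can be illustrated with a simple FFT low-pass filter. This is a generic one-dimensional sketch, not the HIRLAM implementation; the grid spacing, the cutoff and the synthetic terrain are all assumptions:

```python
import numpy as np

def filter_orography(elevation, dx, cutoff_gridlengths=4.0):
    """Remove orographic scales shorter than cutoff_gridlengths * dx
    with a sharp spectral (FFT) low-pass filter."""
    n = elevation.size
    spec = np.fft.rfft(elevation)
    freqs = np.fft.rfftfreq(n, d=dx)          # cycles per unit length
    cutoff = 1.0 / (cutoff_gridlengths * dx)  # shortest retained scale
    spec[freqs > cutoff] = 0.0                # truncate small scales
    return np.fft.irfft(spec, n=n)

# Synthetic 1-D terrain: a broad ridge plus steep small-scale bumps.
dx = 2.5  # grid spacing in km (assumed)
x = np.arange(512) * dx
terrain = 800 * np.exp(-((x - 640) / 200) ** 2) + 50 * np.sin(2 * np.pi * x / 5.0)
smooth = filter_orography(terrain, dx)
# The filtered field retains the ridge but loses the 5-km bumps; those
# removed scales are what a subgrid-scale scheme must then parametrize.
```

Everything the filter discards is, by construction, subgrid-scale orography in the sense of the abstract: present in the real surface but invisible to the resolved dynamics.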

Relevance: 60.00%

Abstract:

Hamiltonian systems in stellar and planetary dynamics are typically near-integrable. For example, Solar System planets follow almost two-body orbits, and in simulations of the Galaxy the orbits of stars appear regular. For such systems, sophisticated numerical methods can be developed through integrable approximations. Following this theme, we discuss three distinct problems. We start by considering numerical integration techniques for planetary systems. Perturbation methods (which utilize the integrability of the two-body motion) are preferred over conventional "blind" integration schemes. We introduce perturbation methods formulated in Cartesian variables. In our numerical comparisons, these are superior to their conventional counterparts but, by definition, lack the energy-preserving properties of symplectic integrators. They are, however, exceptionally well suited for relatively short-term integrations in which moderately high positional accuracy is required.

The next exercise falls into the category of stability questions in solar systems. Traditionally, the interest has been in the orbital stability of planets, which has been quantified, e.g., by Lyapunov exponents. We offer a complementary aspect by considering the protective effect that massive gas giants, like Jupiter, can offer to Earth-like planets inside the habitable zone of a planetary system. Our method produces a single quantity, called the escape rate, which characterizes the system of giant planets. We obtain some interesting results by computing escape rates for the Solar System.

Galaxy modelling is our third and final topic. Because of the sheer number of stars (about 10^11 in the Milky Way), galaxies are often modelled as smooth potentials hosting distributions of stars. Unfortunately, only a handful of suitable potentials are integrable (the harmonic oscillator, the isochrone and the Stäckel potential). This severely limits the possibilities of finding an integrable approximation for an observed galaxy. A solution to this problem is torus construction: a method for numerically creating a foliation of invariant phase-space tori corresponding to a given target Hamiltonian. Canonically, the invariant tori are constructed by deforming the tori of some existing integrable toy Hamiltonian. Our contribution is to demonstrate how this can be accomplished using a Stäckel toy Hamiltonian in ellipsoidal coordinates.
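The energy-preserving property of symplectic integrators mentioned above can be demonstrated on the simplest near-integrable case, the Kepler two-body problem. The sketch below uses a generic leapfrog scheme, not the thesis's Cartesian perturbation methods; the units (GM = 1) and the circular initial orbit are assumptions:

```python
import numpy as np

def kepler_accel(r):
    """Acceleration in a two-body problem with GM = 1."""
    return -r / np.linalg.norm(r) ** 3

def leapfrog(r, v, dt, steps):
    """Symplectic leapfrog (kick-drift-kick): energy error stays bounded
    instead of drifting secularly as in non-symplectic schemes."""
    for _ in range(steps):
        v = v + 0.5 * dt * kepler_accel(r)
        r = r + dt * v
        v = v + 0.5 * dt * kepler_accel(r)
    return r, v

def energy(r, v):
    return 0.5 * v @ v - 1.0 / np.linalg.norm(r)

# Circular orbit of radius 1: speed 1 gives total energy -0.5.
r0, v0 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
r, v = leapfrog(r0, v0, dt=0.01, steps=10_000)   # ~16 orbital periods
drift = abs(energy(r, v) - energy(r0, v0))       # remains very small
```

A perturbation method in the spirit of the abstract would instead integrate only the deviation from the analytic two-body solution, trading this long-term energy behaviour for higher short-term positional accuracy.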

Relevance: 20.00%

Abstract:

The earliest stages of human cortical visual processing can be conceived of as the extraction of local stimulus features. However, more complex visual functions, such as object recognition, require the integration of multiple features. Recently, the neural processes underlying feature integration in the visual system have been under intensive study. A specialized mid-level stage preceding the object recognition stage has been proposed to account for the processing of contours, surfaces and shapes, as well as configuration. This thesis consists of four experimental, psychophysical studies on human visual feature integration. In two studies, the classification image technique, a recently developed psychophysical reverse-correlation method, was used. In this method, visual noise is added to near-threshold stimuli. By investigating the relationship between the random features in the noise and the observer's perceptual decision on each trial, it is possible to estimate which features of the stimuli are critical for the task. The method allows the critical features used in a psychophysical task to be visualized directly as a spatial correlation map, yielding an effective "behavioral receptive field".

Visual context is known to modulate the perception of stimulus features. Some of these interactions are quite complex, and it is not known whether they reflect early or late stages of perceptual processing. The first study investigated the mechanisms of collinear facilitation, where nearby collinear Gabor flankers increase the detectability of a central Gabor. The behavioral receptive field of the mechanism mediating the detection of the central Gabor stimulus was measured with the classification image method. The results show that collinear flankers increase the extent of the behavioral receptive field for the central Gabor in the direction of the flankers. The increased sensitivity at the ends of the receptive field suggests a low-level explanation for the facilitation.

The second study investigated how visual features are integrated into percepts of surface brightness. A novel variant of the classification image method with a brightness-matching task was used. Many theories assume that perceived brightness is based on the analysis of luminance border features. Here, for the first time, this assumption was directly tested. The classification images show that the perceived brightness of both an illusory Craik-O'Brien-Cornsweet stimulus and a real uniform step stimulus depends solely on the border. Moreover, the spatial tuning of the features remains almost constant when the stimulus size is changed, suggesting that brightness perception is based on the output of a single spatial frequency channel.

The third and fourth studies investigated global form integration in random-dot Glass patterns. In these patterns, a global form can be immediately perceived if even a small proportion of the random dots are paired into dipoles according to a geometrical rule. In the third study, the discrimination of orientation structure in highly coherent concentric and Cartesian (straight) Glass patterns was measured. The results showed that the global form was discriminated more efficiently in concentric patterns. The fourth study investigated how form detectability depends on the global regularity of the Glass pattern. The local structure was either Cartesian or curved. Randomizing the local orientation deteriorated performance only with the curved pattern. The results support the idea that curved and Cartesian patterns are processed in at least partially separate neural systems.
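The logic of the classification image method described above can be sketched with a simulated observer: on each trial the decision is driven by the match between the noise and a hidden template, and averaging the noise by response recovers that template. The Gaussian template, the noise-only stimuli and the trial count are illustrative assumptions, not the stimuli of the actual experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observer template: a small Gaussian "receptive field".
x = np.linspace(-3, 3, 64)
template = np.exp(-x ** 2)

def observer_says_yes(stimulus):
    """Simulated observer: internal response = template match + noise."""
    return stimulus @ template + rng.normal(0, 1.0) > 0

n_trials = 20_000
noise = rng.normal(0, 1.0, size=(n_trials, x.size))
responses = np.array([observer_says_yes(n) for n in noise])

# Classification image: mean noise on "yes" trials minus "no" trials.
# Random features that pushed the decision toward "yes" survive the
# averaging; everything uncorrelated with the decision cancels out.
classification_image = noise[responses].mean(0) - noise[~responses].mean(0)
corr = np.corrcoef(classification_image, template)[0, 1]
```

In this toy version the recovered image correlates strongly with the hidden template, which is exactly the sense in which the method yields a "behavioral receptive field".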

Relevance: 20.00%

Abstract:

Most countries of Europe, as well as many countries in other parts of the world, are experiencing an increased impact of natural hazards. It is often speculated, but not yet proven, that climate change might influence the frequency and magnitude of certain hydro-meteorological natural hazards. What has certainly been observed is a sharp increase in the financial losses caused by natural hazards worldwide. Even though Europe appears to be a space that is not affected by natural hazards to such catastrophic extents as other parts of the world, the damages experienced here are certainly increasing too. Natural hazards, climate change and, in particular, risks have therefore recently been put high on the political agenda of the EU. In the search for appropriate instruments for mitigating the impacts of natural hazards and climate change, as well as risks, the integration of these factors into spatial planning practices is receiving ever higher attention. The focus of most approaches lies on single hazards and on climate change mitigation strategies. The current paradigm shift from climate change mitigation to adaptation is used as a basis for drawing conclusions and recommendations on which concepts could be further incorporated into spatial planning practices. Multi-hazard approaches, especially, are discussed as an important approach that should be developed further. One focal point is the definition and applicability of the terms natural hazard, vulnerability and risk in spatial planning practices. Vulnerability and risk concepts, especially, are so manifold and complicated that their application in spatial planning has to be analysed most carefully. The PhD thesis is based on six published articles that describe the results of European research projects which have elaborated strategies and tools for integrated communication and assessment practices on natural hazards and climate change impacts.

The papers describe approaches at the local, regional and European levels, from both theoretical and practical perspectives. Based on these, past, current and potential future spatial planning applications are reviewed and discussed. In conclusion, it is recommended to shift from single-hazard assessments to multi-hazard approaches that integrate potential climate change impacts. Vulnerability concepts should play a stronger role than at present, and adaptation to natural hazards and climate change should be emphasized more in relation to mitigation. It is outlined that the integration of risk concepts in planning is rather complicated and would need very careful assessment to ensure applicability. Future spatial planning practices should also be more interdisciplinary, i.e. integrate as many stakeholders and experts as possible, to ensure the sustainability of investments.

Relevance: 20.00%

Abstract:

Determining the environmental factors that control earth surface processes and landform patterns is one of the central themes in physical geography. However, identifying the main drivers of geomorphological phenomena is often challenging. Novel spatial analysis and modelling methods could provide new insights into process-environment relationships. The objective of this research was to map and quantitatively analyse the occurrence of cryogenic phenomena in subarctic Finland. More precisely, utilising a grid-based approach, the distribution and abundance of periglacial landforms were modelled to identify important landscape-scale environmental factors. The study was performed using a comprehensive empirical data set of periglacial landforms from an area of 600 km² at a 25-ha resolution. The statistical methods used were generalized linear modelling (GLM) and hierarchical partitioning (HP). GLMs were used to produce the distribution and abundance models, and HP to reveal independently the most likely causal variables. The GLM models were assessed utilising statistical evaluation measures, prediction maps, field observations and the results of the HP analyses. A total of 40 different landform types and subtypes were identified. Topographical, soil property and vegetation variables were the primary correlates of the occurrence and cover of active periglacial landforms at the landscape scale. In the model evaluation, most of the GLMs were shown to be robust, although the explanatory power, the prediction ability and the selected explanatory variables varied between the models. This study demonstrated the great potential of combining a spatial grid system, terrain data and novel statistical techniques to map the occurrence of periglacial landforms.

GLM proved to be a useful modelling framework for testing the shapes of the response functions and the significance of the environmental variables, and the HP method helped to draw better inferences about the important factors behind earth surface processes. Hence, the numerical approach presented in this study can be a useful addition to the current range of techniques available to researchers for mapping and monitoring different geographical phenomena.
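The GLM part of this approach can be sketched numerically: a logistic (binomial) GLM fitted by Newton-Raphson to synthetic presence/absence data on a grid. The predictors (standardized elevation and slope), the coefficients and the cell count are invented for illustration and are not the thesis's actual variables:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic grid cells: intercept plus two standardized predictors.
n = 2_000
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
true_beta = np.array([-1.0, 2.0, 0.5])   # intercept, "elevation", "slope"
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.random(n) < p                    # landform present / absent

# Logistic GLM fitted by Newton-Raphson (iteratively reweighted
# least squares), the standard fitting algorithm for GLMs.
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))     # fitted occurrence probability
    W = mu * (1 - mu)                    # binomial variance weights
    grad = X.T @ (y - mu)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)
# beta now approximates true_beta; applying the fitted model to every
# grid cell would yield a probability-of-occurrence prediction map.
```

Hierarchical partitioning would then go beyond this single fit, averaging each predictor's contribution over all submodels to judge its independent importance.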

Relevance: 20.00%

Abstract:

The module of a quadrilateral is a positive real number which divides quadrilaterals into conformal equivalence classes. This is an introductory text on the module of a quadrilateral, with some historical background and some numerical aspects. The work discusses the following topics: 1. Preliminaries; 2. The module of a quadrilateral; 3. The Schwarz-Christoffel mapping; 4. Symmetry properties of the module; 5. Computational results; 6. Other numerical methods. The appendices include the numerical evaluation of elliptic integrals of the first kind, Matlab programs and scripts, and possible topics for future research. The numerical results section covers additive quadrilaterals and the module of a quadrilateral under the movement of one of its vertices.
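The numerical evaluation of complete elliptic integrals of the first kind, mentioned in the appendices, is classically done with the arithmetic-geometric mean (AGM). The sketch below is a generic Python reimplementation (the thesis works in Matlab), together with the standard ratio K'(k)/K(k) through which the module of certain canonical quadrilaterals is expressed; using that ratio here is only an illustration of the connection, not the thesis's computational method:

```python
import math

def agm(a, b, tol=1e-15):
    """Arithmetic-geometric mean of a and b (quadratic convergence)."""
    while abs(a - b) > tol:
        a, b = (a + b) / 2, math.sqrt(a * b)
    return a

def ellipk(k):
    """Complete elliptic integral of the first kind K(k), via the
    classical identity K(k) = pi / (2 * AGM(1, sqrt(1 - k^2)))."""
    return math.pi / (2 * agm(1.0, math.sqrt(1.0 - k * k)))

def module_ratio(k):
    """K'(k)/K(k): the module of canonical quadrilaterals that are
    conformally parametrized by the modulus k (illustrative)."""
    return ellipk(math.sqrt(1.0 - k * k)) / ellipk(k)

# Sanity checks: K(0) = pi/2, and at k = 1/sqrt(2) the quadrilateral
# is conformally square, so the ratio equals 1.
```

The AGM iteration converges quadratically, which is why it remains the method of choice for these integrals even in modern numerical libraries.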

Relevance: 20.00%

Abstract:

This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurements of the expression of tens of thousands of genes simultaneously. In a single study, such data are traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, this data has been largely unavailable, and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text mining and decision tree based method for the automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors that are characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology.

A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression at a global level.
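The principal component analysis step used to explore the structure of the map can be sketched on a toy expression matrix. The sample counts, the two-group structure and the noise model below are synthetic stand-ins, not the integrated GEO/ArrayExpress data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy expression matrix: 60 samples x 500 genes, with two hypothetical
# sample groups (e.g. two tissue types) separated along a hidden axis.
n_samples, n_genes = 60, 500
group = np.repeat([0, 1], n_samples // 2)
signal = np.outer(group - 0.5, rng.normal(size=n_genes))  # group effect
expr = signal * 4 + rng.normal(size=(n_samples, n_genes)) # plus noise

# Principal component analysis via SVD of the column-centred matrix.
centred = expr - expr.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
pc1 = U[:, 0] * S[0]   # sample coordinates on the first component

# PC1 recovers the hidden biological axis: the two groups' scores
# are far apart relative to the within-group scatter.
sep = abs(pc1[group == 0].mean() - pc1[group == 1].mean())
```

On the real integrated data the same projection is what reveals whether samples organize by cell type, disease state or laboratory, which is why the quality filtering described above matters so much.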

Relevance: 20.00%

Abstract:

For the first time, the attempt of Denmark, Finland, Norway and Sweden to increase Nordic economic co-operation and integration (NORDEK, 1968-1970) is analysed using records from the four governments' archives and interviews with the central actors who participated. A dominant argument has until now been that the dynamics of Nordic economic integration differ from the dynamics of European integration. This archive-based study, however, disproves the myth that ideological Nordism and short-term political developments outside Norden were the most important drivers of the NORDEK initiative. The NORDEK initiative was in fact more a consequence of a long-term socioeconomic and socio-political path-dependent process. The study also disproves the myth that the NORDEK plan was a political and ideological symbol without socioeconomic substance. The purpose of NORDEK was to create a better basis for generating economic growth and social welfare, and the proposed NORDEK institutions were accordingly developed to promote economic progress. Finally, the study shows that the NORDEK failure in 1970 was not a result of a lacking economic rationale or of incompatible economic interests. The failure was the result of a power struggle in Finnish domestic politics and of a lack of political will in the other Nordic countries to continue without Finland.

Relevance: 20.00%

Abstract:

The study explores the role of the state in regional integration processes. The question is approached through theoretical discussion and two case studies: SADC (the Southern African Development Community) and the EU. The main research question of the study is what the possibilities and problems of the integration process in Southern Africa are, and how they differ from the possibilities and problems of the integration process in Europe. The underlying question of the study is why states decide to participate in an integration process in which they have to limit their sovereignty. A review of the theoretical discussion in integration studies shows that the integration process is affected by several factors on different levels of the international system. But the state plays a central role in integration processes: integration processes are initiated and carried on by the participating states. The European integration process shows that the interests of the state can change over time. At the beginning of the integration process, the objective was to strengthen the participating states. Later, EU member states have decided that it is in their interest to deepen the process even when it has meant limiting their sovereignty. The determinant factor has been that the member states have considered it to be in their interest to deepen the process. In Southern Africa, the integration process is only at its beginning. SADC aims to establish a free trade area by 2008. The biggest challenge is how to implement the integration process so that it benefits all member states in a region that is economically dominated by South Africa. In practice this can be achieved through the establishment of corrective mechanisms which ensure an equitable distribution of benefits. This would require deeper integration, and South Africa accepting responsibility towards its regional partners. African integration processes in general have not been as successful as, for example, the EU's.

African states have been reluctant to limit their sovereignty in favour of regional organisations. This can be explained by the differences between European and African states. The EU member states have been democracies, while African states have been characterised by a concentration of power in the executive branch. Furthermore, the political systems in Africa have been characterised by vertical clientelist relationships. As a result, it has not been in the interest of the political elite to limit state sovereignty in favour of regional organisations. In recent years SADC has been relatively successful in its integration process and reforms, but a lot remains to be done before the implementation of the free trade area can be successful. The institutional structure and treaties of SADC differ from the structures of the EU. Member states are the main actors of the integration processes. Their differences are reflected in the process and produce different kinds of integration in different parts of the world.

Relevance: 20.00%

Abstract:

This dissertation consists of an introductory section and three essays investigating the effects of economic integration on labour demand, using theoretical models and empirical analysis. The essays adopt an intra-industry trade approach to specify a theoretical framework of estimation for determining the effects of economic integration on employment. In all the essays, the empirical aim is to explore the labour demand consequences of European integration. The first essay analyzes how own-price labour-demand elasticities have changed during the process of economic integration. As a theoretical result, intensified trade competition increases labour-demand elasticity, whereas a better advantage of economies of scale decreases labour-demand elasticity by decreasing the elasticity of substitution between differentiated products. Furthermore, if integration gives rise to an increase in input substitutability and/or outsourcing activities, labour demand will become more elastic. Using data from the manufacturing sector from 1975 to 2002, the empirical results support the hypothesis that European integration has contributed to increased elasticities of total labour demand in Finland.

The second essay analyzes how economic integration affects the impact of welfare policies on employment. The essay considers the viability of financing the public sector, i.e. public consumption and social security expenses, by general labour taxation in an economy which has become more integrated into international product markets. The theoretical results of the second essay indicate that, as increased trade competition crowds out the better economies of scale, it becomes more costly to maintain welfare systems financed by labour taxation. Using data from European countries for the years 1975 to 2004, the empirical results provide inconsistent evidence for the hypothesis that economic integration has contributed to the distortionary effects of welfare policies on employment.

The third essay analyzes the impact of profit sharing on employment as a way to introduce wage flexibility into the process of economic integration. The results of the essay suggest that, in theory, the effects of economic integration on the employment impact of profit sharing clearly depend on a trade-off between intensified competition and a better advantage of economies of scale. If product market competition increases, the ability of profit sharing to improve employment through economic integration increases with moderated wages, whereas economic integration associated with market power in turn decreases the possibility of profit sharing, with higher wages, to improve employment. Using data from the manufacturing sector for the years 1996 to 2004, the empirical results show that profit sharing has had a positive impact on employment during the process of European integration, but can have ambiguous effects on the stability of employment in Finland.

Relevance: 20.00%

Abstract:

MEG directly measures neuronal events and has a greater temporal resolution than fMRI, whose temporal resolution is limited mainly by the larger timescale of the hemodynamic response. On the other hand, fMRI has the advantage in spatial resolution, while localization results with MEG can be ambiguous owing to the non-uniqueness of the electromagnetic inverse problem. Thus, these methods can provide complementary information and can be used to create both spatially and temporally accurate models of brain function. We investigated the degree of overlap, revealed by the two imaging methods, in areas involved in sensory or motor processing in healthy subjects and neurosurgical patients. Furthermore, we used the spatial information from fMRI to construct a spatiotemporal model of the MEG data in order to investigate the sensorimotor system and to create a spatiotemporal model of its function. We compared the localization results from MEG and fMRI with invasive electrophysiological cortical mapping. We used a recently introduced method, contextual clustering, for hypothesis testing of fMRI data and assessed the effect of the use of neighbourhood information on the reproducibility of fMRI results. Using MEG, we identified the ipsilateral primary sensorimotor cortex (SMI) as a novel source area contributing to the somatosensory evoked fields (SEF) elicited by median nerve stimulation. Using combined MEG and fMRI measurements, we found that two separate areas in the lateral fissure may be the generators of the SEF responses from the secondary somatosensory cortex region. The two imaging methods indicated activation in corresponding locations. By using complementary information from MEG and fMRI, we established a spatiotemporal model of somatosensory cortical processing.
This spatiotemporal model of cerebral activity was in good agreement with results from several studies using invasive electrophysiological measurements and with anatomical studies in monkey and man concerning the connections between somatosensory areas. In neurosurgical patients, the MEG dipole model turned out to be more reliable than fMRI in the identification of the central sulcus. This was due to prominent activation in non-primary areas in fMRI, which in some cases led to erroneous or ambiguous localization of the central sulcus.