32 results for DISCRETE-SCALE-INVARIANCE


Relevance:

20.00%

Publisher:

Abstract:

The Capercaillie (Tetrao urogallus L.) is often used as a focal species for landscape ecological studies: the minimum size for its lekking area is 300 ha, and the annual home range for an individual may cover 30–80 km². In Finland, Capercaillie populations have decreased by approximately 40–85%, with the declines likely to have started in the 1940s. Although the declines have partly stabilized from the 1990s onwards, it is obvious that the negative population trend was at least partly caused by changes in human land use. The aim of this thesis was to study the connections between human land use and Capercaillie populations in Finland, using several spatial and temporal scales. First, the effect of forest age structure on Capercaillie population trends was studied in 18 forestry board districts in Finland during 1965–1988. Second, the abundances of Capercaillie and Moose (Alces alces L.) were compared in terms of several land-use variables on a scale of 50 × 50 km grids and in five regions in Finland. Third, the effects of forest cover and fine-grain forest fragmentation on Capercaillie lekking area persistence were studied in three study locations in Finland, on 1000 and 3000 m spatial scales surrounding the leks. The analyses considering lekking areas were performed with two definitions for forest: > 60 and > 152 m³ ha⁻¹ of timber volume. The results show that patterns and processes at large spatial scales strongly influence Capercaillie in Finland. In particular, in southwestern and eastern Finland, high forest cover and low human impact were found to be beneficial for this species. Forest cover (> 60 m³ ha⁻¹ of timber) surrounding the lekking sites positively affected lekking area persistence only at the larger landscape scale (3000 m radius). The effects of older forest classes were hard to assess due to the scarcity of older forests in several study areas.
Young and middle-aged forest classes were common in the vicinity of areas with high Capercaillie abundances, especially in northern Finland. The increase in the amount of younger forest classes did not provide a good explanation for the Capercaillie population decline in 1965–1988. In addition, there was no significant connection between mature forests (> 152 m³ ha⁻¹ of timber) and lekking area persistence in Finland. It seems that in present-day Finnish landscapes, the area covered with old forest is either too scarce to efficiently explain the abundance of Capercaillie and the persistence of the lekking areas, or the effect of forest age is only important at smaller spatial scales than the ones studied in this thesis. In conclusion, larger spatial scales should be considered when planning future Capercaillie management. According to the proposed multi-level planning, the first priority should be to secure large, regional-scale forest cover, and the second priority should be to maintain a fine-grained, heterogeneous structure within the separate forest patches. A management unit should cover hundreds of hectares, or even tens or hundreds of square kilometers, which requires regional-level land-use planning and co-operation between forest owners.
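The landscape-scale forest-cover measure used above, i.e. the fraction of land around a lek exceeding a timber-volume threshold within a 1000 m or 3000 m radius, can be sketched in a few lines. This is only an illustration: the raster, cell size, threshold defaults and lek position below are hypothetical, not data from the thesis.

```python
import numpy as np

def forest_cover_fraction(timber_volume, lek_rc, radius_m,
                          cell_m=25.0, threshold=60.0):
    """Fraction of raster cells within `radius_m` of the lek whose timber
    volume exceeds `threshold` (m3/ha), i.e. 'forest' in the study's sense."""
    rows, cols = np.indices(timber_volume.shape)
    dist = np.hypot(rows - lek_rc[0], cols - lek_rc[1]) * cell_m
    mask = dist <= radius_m
    return float(np.mean(timber_volume[mask] > threshold))

# Hypothetical 25 m raster, 10 x 10 km: forest (120 m3/ha) in the left half.
grid = np.zeros((400, 400))
grid[:, :200] = 120.0
lek = (200, 200)  # lek at the forest edge

print(forest_cover_fraction(grid, lek, 1000))  # roughly half forested
print(forest_cover_fraction(grid, lek, 3000))  # same pattern at 3000 m
```

Comparing the two radii around many leks is what lets an analysis like the one above separate local from landscape-scale effects of forest cover.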

Relevance:

20.00%

Publisher:

Abstract:

The Earth's ecosystems are protected from the dangerous part of solar ultraviolet (UV) radiation by stratospheric ozone, which absorbs most of the harmful UV wavelengths. Severe depletion of stratospheric ozone has been observed in the Antarctic region and, to a lesser extent, in the Arctic and at midlatitudes. Concern about the effects of increasing UV radiation on human beings and the natural environment has led to ground-based monitoring of UV radiation. In order to achieve high-quality UV time series for scientific analyses, proper quality control (QC) and quality assurance (QA) procedures have to be followed. In this work, practices of QC and QA are developed for Brewer spectroradiometers and NILU-UV multifilter radiometers, which measure in the Arctic and Antarctic regions, respectively. These practices are applicable to other UV instruments as well. The spectral features and the effect of different factors affecting UV radiation were studied for the spectral UV time series at Sodankylä. The QA of the Finnish Meteorological Institute's (FMI) two Brewer spectroradiometers included daily maintenance, laboratory characterizations, the calculation of long-term spectral responsivity, data processing and quality assessment. New methods for the cosine correction, the temperature correction and the calculation of long-term changes of spectral responsivity were developed. Reconstructed UV irradiances were used as a QA tool for spectroradiometer data. The actual cosine correction factor was found to vary within 1.08–1.12 and 1.08–1.13 for the two instruments. The temperature characterization showed a linear dependence between the instrument's internal temperature and the photon counts per cycle. Both Brewers have participated in international spectroradiometer comparisons and have shown good stability. The differences between the Brewers and the portable reference spectroradiometer QASUME have been within 5% during 2002–2010.
The features of the spectral UV radiation time series at Sodankylä were analysed for the period 1990–2001. No statistically significant long-term changes in UV irradiances were found, and the results were strongly dependent on the time period studied. Ozone was the dominant factor affecting UV radiation during springtime, whereas clouds played a more important role during summertime. During this work, the Antarctic NILU-UV multifilter radiometer network was established by the Instituto Nacional de Meteorología (INM) as a joint Spanish-Argentinian-Finnish cooperation project. As part of this work, the QC/QA practices of the network were developed. They included the training of operators, daily maintenance, regular lamp tests and solar comparisons with the travelling reference instrument. Drifts of up to 35% in the sensitivity of the channels of the NILU-UV multifilter radiometers were found during the first four years of operation. This work emphasized the importance of proper QC/QA, including regular lamp tests, also for multifilter radiometers. The effect of the drifts was corrected by scaling the channels of the site NILU-UVs to those of the travelling reference NILU-UV. After correction, the mean ratios of erythemally-weighted UV dose rates measured during solar comparisons between the reference NILU-UV and the site NILU-UVs were 1.007±0.011 and 1.012±0.012 for Ushuaia and Marambio, respectively, for solar zenith angles up to 80°. Solar comparisons between the NILU-UVs and spectroradiometers showed a ±5% difference near local noon, which can be seen as proof of successful QC/QA procedures and transfer of irradiance scales. This work also showed that UV measurements made in the Arctic and Antarctic can be comparable with each other.
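The drift correction described above amounts to scaling a drifted site channel onto the travelling reference using simultaneous solar-comparison measurements. A minimal sketch of such a scaling, with a hypothetical 25% sensitivity loss and synthetic signals (not data from the thesis):

```python
import numpy as np

def channel_scale_factor(ref_signal, site_signal):
    """Least-squares scale factor k such that k * site ~ ref, estimated
    from simultaneous measurements during a solar comparison."""
    ref = np.asarray(ref_signal, dtype=float)
    site = np.asarray(site_signal, dtype=float)
    return float(np.dot(site, ref) / np.dot(site, site))

# Hypothetical comparison: the site channel has drifted to 75% sensitivity.
true_irradiance = np.linspace(5.0, 50.0, 20)   # arbitrary units
ref = true_irradiance                          # reference reads correctly
site = 0.75 * true_irradiance                  # drifted site channel

k = channel_scale_factor(ref, site)
corrected = k * site                           # site channel rescaled to ref
print(round(k, 3))                             # ≈ 1.333, undoing the 25% loss
```

After such a rescaling, ratios of dose rates between reference and site instruments should cluster near 1, as in the 1.007±0.011 and 1.012±0.012 figures reported above.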

Relevance:

20.00%

Publisher:

Abstract:

Carbon nanotubes, seamless cylinders made from carbon atoms, have outstanding characteristics: inherent nano-size, record-high Young's modulus, high thermal stability and chemical inertness. They also have extraordinary electronic properties: in addition to extremely high conductance, they can be both metals and semiconductors without any external doping, due only to minute changes in the arrangements of atoms. As traditional silicon-based devices are reaching the level of miniaturisation where leakage currents become a problem, these properties make nanotubes a promising material for applications in nanoelectronics. However, several obstacles must be overcome for the development of nanotube-based nanoelectronics. One of them is the ability to locally modify the electronic structure of carbon nanotubes and to create reliable interconnects between nanotubes and metal contacts, which can likely be used for integrating nanotubes into macroscopic electronic devices. In this thesis, the possibility of using ion and electron irradiation as a tool to introduce defects in nanotubes in a controllable manner, and thereby achieve these goals, is explored. Defects are known to modify the electronic properties of carbon nanotubes. Some defects are always present in pristine nanotubes, and more are naturally introduced during irradiation. Their density can be controlled by the irradiation dose. Since different types of defects have very different effects on the conductivity, knowledge of their abundance as induced by ion irradiation is central for controlling the conductivity. In this thesis, the response of single-walled carbon nanotubes to ion irradiation is studied. It is shown that the conductance can indeed be controlled by energy-selective irradiation. Not only the conductivity but also the local electronic structure of single-walled carbon nanotubes can be changed by the defects.
The presented studies show a variety of changes in the electronic structures of semiconducting single-walled nanotubes, ranging from individual new states in the band gap to changes in the band gap width. The extensive simulation results for various types of defects make it possible to unequivocally identify defects in single-walled carbon nanotubes by combining electronic structure calculations and scanning tunneling spectroscopy, offering reference data for the wide community of researchers studying nanotubes with surface probe microscopy methods. In electronics applications, carbon nanotubes have to be interconnected to the macroscopic world via metal contacts. Interactions between nanotubes and metal particles are also essential for nanotube synthesis, as single-walled nanotubes are always grown from metal catalyst particles. In this thesis, both the growth of nanotubes and the creation of nanotube-metal nanoparticle interconnects driven by electron irradiation are studied. The surface curvature and size of metal nanoparticles are demonstrated to determine the local carbon solubility in these particles. As for nanotube-metal contacts, previous experiments have proved the possibility of creating junctions between carbon nanotubes and metal nanoparticles under irradiation in a transmission electron microscope. In this thesis, the microscopic mechanism of junction formation is studied by atomistic simulations carried out at various levels of sophistication. It is shown that structural defects created by the electron beam, together with the efficient reconstruction of the nanotube atomic network inherently related to the nanometer size and quasi-one-dimensional structure of nanotubes, are the driving force for junction formation. Thus, the results of this thesis not only address practical aspects of irradiation-mediated engineering of nanosystems, but also contribute to our understanding of the behaviour of point defects in low-dimensional nanoscale materials.

Relevance:

20.00%

Publisher:

Abstract:

Earlier work has suggested that large-scale dynamos can reach and maintain equipartition field strengths on a dynamical time scale only if magnetic helicity of the fluctuating field can be shed from the domain through open boundaries. We test this scenario in convection-driven dynamos by comparing results for open and closed boundary conditions. Three-dimensional numerical simulations of turbulent compressible convection with shear and rotation are used to study the effects of boundary conditions on the excitation and saturation level of large-scale dynamos. Open (vertical field) and closed (perfect conductor) boundary conditions are used for the magnetic field. The contours of shear are vertical, crossing the outer surface, and are thus ideally suited for driving a shear-induced magnetic helicity flux. We find that for a given shear and rotation rate, the growth rate of the magnetic field is larger if open boundary conditions are used. The growth rate first increases for small magnetic Reynolds number, Rm, but then levels off at an approximately constant value for intermediate values of Rm. For large enough Rm, a small-scale dynamo is excited, and the growth rate in this regime increases proportional to Rm^(1/2). In the nonlinear regime, the saturation level of the energy of the mean magnetic field is independent of Rm when open boundaries are used. In the case of perfect-conductor boundaries, the saturation level first increases as a function of Rm, but then decreases proportional to Rm^(-1) for Rm > 30, indicative of catastrophic quenching. These results suggest that the shear-induced magnetic helicity flux is efficient in alleviating catastrophic quenching when open boundaries are used. The horizontally averaged mean field is still weakly decreasing as a function of Rm even for open boundaries.
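The two saturation regimes above (Rm-independent for open boundaries, ~Rm^(-1) for closed ones at Rm > 30) are power laws, so they show up as slopes in a log-log fit of saturation energy versus Rm. The sketch below uses synthetic data chosen only to echo the reported scalings, not simulation output from the study:

```python
import numpy as np

def scaling_exponent(rm, energy):
    """Power-law exponent p from a log-log least-squares fit, E ~ Rm^p."""
    p, _ = np.polyfit(np.log(rm), np.log(energy), 1)
    return float(p)

# Toy saturation data mimicking the two regimes (Rm > 30 branch).
rm = np.array([40.0, 60.0, 100.0, 150.0, 250.0])
open_bc = np.full_like(rm, 0.2)     # Rm-independent saturation level
closed_bc = 6.0 / rm                # catastrophic quenching, ~Rm^(-1)

print(scaling_exponent(rm, open_bc))    # ≈ 0: no quenching
print(scaling_exponent(rm, closed_bc))  # ≈ -1: catastrophic quenching
```

A fitted exponent near zero versus one near -1 is the quantitative signature distinguishing the open- and closed-boundary cases described above.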

Relevance:

20.00%

Publisher:

Abstract:

The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology, based on generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special kind of change, called an intervention, that alters the value of its variables. Whether a generalization remains invariant under such interventions is the criterion that determines whether it is explanatory. A generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns the generality of a generalization: how widely it continues to hold across possible background conditions. The more stable a generalization, the less dependent it is on background conditions to remain true. Although it is invariance rather than stability that furnishes us with explanatory generalizations, stability has an important function in the context of explanations: it provides the extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which a single model is studied with regard to its stability conditions by varying the values of the model's parameters. As a general definition of robustness analyses, I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, focusing on whether they produce similar or convergent results.
Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out why they do so. They show which conditions or assumptions the results of models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
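The distinction drawn above can be made concrete with toy models: a sensitivity analysis varies parameters within one model, while a robustness analysis compares structurally different models of the same phenomenon. The logistic and Gompertz growth models below are standard textbook examples, not models from the dissertation:

```python
import numpy as np

def logistic(n0, r, K, steps):
    """Discrete logistic growth: a standard toy model of a population
    approaching carrying capacity K."""
    n = n0
    for _ in range(steps):
        n = n + r * n * (1 - n / K)
    return n

def gompertz(n0, r, K, steps):
    """Gompertz growth: a structurally different model of the same
    phenomenon, used here for a robustness comparison."""
    n = n0
    for _ in range(steps):
        n = n * np.exp(r * np.log(K / n))
    return n

# Sensitivity analysis: vary one parameter (r) of one model.
for r in (0.5, 1.0, 1.5):
    print(r, logistic(10.0, r, 100.0, 50))   # all approach K = 100

# Robustness analysis: do different model structures converge on the
# same result (equilibrium at K)?
print(gompertz(10.0, 0.5, 100.0, 50))        # also approaches K = 100
```

Convergence of both models on the carrying capacity is a robust result in the sense defined above; how fast the population gets there is sensitive to the parameter r.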

Relevance:

20.00%

Publisher:

Abstract:

The human resource (HR) function is under pressure both to change roles and to play a large variety of roles. Questions of change and development in the HR function become particularly interesting in the context of mergers and acquisitions when two corporations are integrated. The purpose of the thesis is to examine the roles played by the HR function in the context of large-scale mergers and thus to understand what happens to the HR function in such change environments, and to shed light on the underlying factors that influence changes in the HR function. To achieve this goal, the study seeks first to identify the roles played by the HR function before and after the merger, and second, to identify the factors that affect the roles played by the HR function. It adopts a qualitative case study approach including ten focal case organisations (mergers) and four matching cases (non-mergers). The sample consists of large corporations originating from either Finland or Sweden. HR directors and members of the top management teams within the case organisations were interviewed. The study suggests that changes occur within the HR function, and that the trend is for the HR function to become increasingly strategic. However, the HR function was found to play strategic roles only when the HR administration ran smoothly. The study also suggests that the HR function has become more versatile. An HR function that was perceived to be mainly administrative before the merger is likely after the merger to perform some strategically important activities in addition to the administrative ones. Significant changes in the roles played by the HR function were observed in some of the case corporations. This finding suggests that the merger integration process is a window of opportunity for the HR function. 
HR functions that take a proactive and leading role during the integration process might expand the number of roles played and move from being an administrator before the merger to also being a business partner after integration. The majority of the HR functions studied remained mainly reactive during the organisational change process and although the evidence showed that they moved towards strategic tasks, the intra-functional changes remained comparatively small in these organisations. The study presents a new model that illustrates the impact of the relationship between the top management team and the HR function on the role of the HR function. The expectations held by the top management team for the HR function and the performance of the HR function were found to interact. On a dimension reaching from tactical to strategic, HR performance is likely to correspond to the expectations held by top management.

Relevance:

20.00%

Publisher:

Abstract:

A distributed system is a collection of networked autonomous processing units which must work in a cooperative manner. Currently, large-scale distributed systems, such as various telecommunication and computer networks, are abundant and used in a multitude of tasks. The field of distributed computing studies what can be computed efficiently in such systems. Distributed systems are usually modelled as graphs where nodes represent the processors and edges denote communication links between processors. This thesis concentrates on the computational complexity of the distributed graph colouring problem. The objective of the graph colouring problem is to assign a colour to each node in such a way that no two nodes connected by an edge share the same colour. In particular, it is often desirable to use only a small number of colours. This task is a fundamental symmetry-breaking primitive in various distributed algorithms. A graph that has been coloured in this manner using at most k different colours is said to be k-coloured. This work examines the synchronous message-passing model of distributed computation: every node runs the same algorithm, and the system operates in discrete synchronous communication rounds. During each round, a node can communicate with its neighbours and perform local computation. In this model, the time complexity of a problem is the number of synchronous communication rounds required to solve the problem. It is known that 3-colouring any k-coloured directed cycle requires at least ½(log* k - 3) communication rounds and is possible in ½(log* k + 7) communication rounds for all k ≥ 3. This work shows that for any k ≥ 3, colouring a k-coloured directed cycle with at most three colours is possible in ½(log* k + 3) rounds. In contrast, it is also shown that for some values of k, colouring a directed cycle with at most three colours requires at least ½(log* k + 1) communication rounds. 
Furthermore, in the case of directed rooted trees, reducing a k-colouring into a 3-colouring requires at least log* k + 1 rounds for some k and is possible in log* k + 3 rounds for all k ≥ 3. The new positive and negative results are derived using computational methods, as the existence of distributed colouring algorithms corresponds to the colourability of so-called neighbourhood graphs. The colourability of these graphs is analysed using Boolean satisfiability (SAT) solvers. Finally, this thesis shows that similar methods are applicable in capturing the existence of distributed algorithms for other graph problems, such as the maximal matching problem.
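The iterated logarithm log* that governs all of these bounds can be computed directly. The sketch below is illustrative only, with base-2 logarithms assumed; it evaluates the cycle bounds quoted above for a given k:

```python
import math

def log_star(k, base=2.0):
    """Iterated logarithm: the number of times log must be applied to k
    before the result drops to at most 1. Grows extremely slowly."""
    count = 0
    while k > 1:
        k = math.log(k, base)
        count += 1
    return count

def cycle_upper_bound(k):
    """Rounds sufficient to 3-colour a k-coloured directed cycle:
    1/2 (log* k + 3), per the result described above."""
    return 0.5 * (log_star(k) + 3)

def cycle_lower_bound(k):
    """Rounds necessary for some k: 1/2 (log* k + 1)."""
    return 0.5 * (log_star(k) + 1)

for k in (3, 16, 65536):
    print(k, log_star(k), cycle_lower_bound(k), cycle_upper_bound(k))
```

Because log* grows so slowly (log* 65536 = 4 in base 2), these bounds mean that even cycles with astronomically many initial colours can be 3-coloured in a handful of rounds, with the upper and lower bounds only one round apart.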