200 results for Fysik


Abstract:

The need for mutual recognition of accurate measurement results produced by competent laboratories has been widely accepted at the international level, for example by the World Trade Organization. A partial solution to the problem was provided by the International Committee for Weights and Measures (CIPM) in setting up the Mutual Recognition Arrangement (CIPM MRA), which was signed by National Metrology Institutes (NMIs) around the world. The core idea of the CIPM MRA is to have global arrangements for the mutual acceptance of the calibration certificates of National Metrology Institutes. The CIPM MRA covers all the fields of science and technology for which NMIs maintain national standards. The infrastructure for the metrology of the gaseous compounds carbon monoxide (CO), nitrogen monoxide (NO), nitrogen dioxide (NO2), sulphur dioxide (SO2) and ozone (O3) has been constructed at the national level at the Finnish Meteorological Institute (FMI). The calibration laboratory at the FMI was set up to provide calibration services for air quality measurements and to fulfil the requirements of a metrology laboratory. The laboratory participated, with good results, in the first comparison project aimed at defining the state of the art in the preparation and analysis of the gas standards used by European metrology institutes and calibration laboratories in the field of air quality. To confirm the competence of the laboratory, an international external surveillance study was conducted at the laboratory. Based on this evidence, the Centre for Metrology and Accreditation (MIKES) designated the calibration laboratory at the FMI as a National Standard Laboratory in the field of air quality. With this designation, the MIKES-FMI Standards Laboratory became a member of the CIPM MRA, and Finland joined the internationally accepted forum in the field of gas metrology. The concept of 'once measured - everywhere accepted' is the leading theme of the CIPM MRA. The calibration service of the MIKES-FMI Standards Laboratory realizes the SI traceability system for these gas components and is constructed to meet the requirements of the European air quality directives. In addition, all the relevant uncertainty sources that influence the measurement results have been evaluated, and uncertainty budgets for the measurement results have been created.
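For context, an uncertainty budget of the kind mentioned above is normally assembled along the lines of the GUM: the combined standard uncertainty of a calibration result y = f(x_1, ..., x_N) is obtained from the standard uncertainties of the input quantities, and an expanded uncertainty is reported with a coverage factor. This is the generic GUM relation, quoted here for illustration rather than the specific budget of the MIKES-FMI laboratory:

    u_c(y) = \sqrt{ \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i) }, \qquad U = k \, u_c(y) \quad (k = 2 \text{ for approximately 95 % coverage})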

Abstract:

The aim of this thesis was to study the seismic tomographic structure of the Earth's crust together with the earthquake distribution and mechanisms beneath the central Fennoscandian Shield, mainly in southern and central Finland. The earthquake foci and some fault plane solutions are correlated with 3-D images of the velocity tomography. The results are discussed in relation to the stress field of the Shield and to other geophysical studies of the Shield, e.g. geomagnetic, gravimetric, tectonic and anisotropy studies. The earthquake data of the Fennoscandian Shield have been extracted from the Nordic earthquake parameter database, which was founded at the inception of the earthquake catalogue for northern Europe. Eight earlier earthquake source mechanisms are included in a pilot study on creating a novel technique for calculating an earthquake fault plane solution. Altogether, eleven source mechanisms of shallow, weak earthquakes are related to the 3-D tomography model in order to trace stresses of the crust in southern and central Finland. The earthquakes in the eastern part of the Fennoscandian Shield represent low-activity, intraplate seismicity. Earthquake mechanisms with NW-SE oriented horizontal compression confirm that the dominant stress field originates from the ridge-push force in the North Atlantic Ocean. Earthquakes accumulate in coastal areas, at intersections of tectonic lineaments and in main fault zones, or are bordered by fault lines. The majority of Fennoscandian earthquakes concentrate in the south-western Shield in southern Norway and Sweden. From there, epicentres spread via the ridge of the Shield along the west coast of the Gulf of Bothnia, northwards along the Tornio River - Finnmark fault system to the Barents Sea, and branch out north-eastwards via the Kuusamo region to the White Sea - Kola Peninsula faults. The local seismic tomographic method was applied to find the terrane distribution within the central part of the Shield, the Svecofennian Orogen. From 300 local explosions, a total of 19765 crustal Pg- and Sg-wave arrival times were inverted to create independent 3-D Vp and Vs tomographic models, from which the Vp/Vs ratio was calculated. The 3-D structure of the crust is presented as a P-wave and, for the first time, as an S-wave velocity model, and also as a Vp/Vs-ratio model of the SVEKALAPKO area, which covers 700 x 800 km2 in southern and central Finland. Some P-wave Moho-reflection data were also interpolated to image the relief of the crust-mantle boundary (the Moho). In the tomography model the seismic velocities vary smoothly. The lateral variations are larger for Vp (dVp = 0.7 km/s) than for Vs (dVs = 0.4 km/s). The Vp/Vs ratio varies spatially more distinctly than the P- and S-wave velocities, usually from 1.70 to 1.74 in the upper crust and from 1.72 to 1.78 in the lower crust. Schist belts and their continuations at depth are associated with lower velocities and lower Vp/Vs ratios than the granitoid areas. The tomography modelling suggests that the Svecofennian Orogen was accreted from crustal blocks ranging in cross-sectional area from 100 x 100 km2 to 200 x 200 km2. The intervening sedimentary belts have ca. 0.2 km/s lower P- and S-wave velocities and ca. 0.04 lower Vp/Vs ratios. Thus, the tomographic model supports the concept that the thick Svecofennian crust was accreted from several crustal terranes, some hidden, and that the crust was later modified by intra- and underplating.
In conclusion, as a novel approach, the earthquake focal mechanisms and focal depth distribution are discussed in relation to the 3-D tomography model. The schist belts and the transformation zones between the high- and low-velocity anomaly blocks are characterized by deeper earthquakes than the granitoid areas, where shallow events dominate. Although only a few focal mechanisms were solved for southern Finland, there is a trend towards strike-slip and oblique strike-slip movements inside the schist areas. Normal dip-slip earthquakes are typical of the seismically active Kuusamo district on the NE edge of the SVEKALAPKO area, where the Archean crust is ca. 15-20 km thinner than the Proterozoic Svecofennian crust. Two near-vertical dip-slip earthquakes occurred at the NE-SW junction between the Central Finland Granitoid Complex and the Vyborg rapakivi batholith, where a deep-set intrusion with a high Vp/Vs ratio splits the southern Finland schist belt into two parts in the tomography model.
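For context, the diagnostic value of the Vp/Vs ratio comes from its direct relation to Poisson's ratio; this is a standard seismological identity quoted for illustration, not a result of the thesis:

    \nu = \frac{ (V_p/V_s)^2 - 2 }{ 2 \left[ (V_p/V_s)^2 - 1 \right] }

so that, for example, V_p/V_s = 1.73 corresponds to \nu = 0.25, while higher ratios correspond to higher Poisson's ratios.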

Abstract:

The main method of modifying the properties of semiconductors is to introduce small amounts of impurities into the material. This is used to control the magnetic and optical properties of materials and to realize p- and n-type semiconductors out of intrinsic material in order to manufacture fundamental components such as diodes. As diffusion can be described as random mixing of material due to the thermal movement of atoms, it is essential to know the diffusion behaviour of the impurities in order to manufacture working components. In the modified radiotracer technique, diffusion is studied using radioactive isotopes of elements as tracers. The technique is called modified because the atoms are deployed inside the material by ion beam implantation. With ion implantation, a distinct distribution of impurities can be deployed beneath the sample surface with good control over the amount of implanted atoms. As the electromagnetic radiation and other nuclear decay products emitted by radioactive materials can be easily detected, only very small amounts of impurities are needed. This makes it possible to study diffusion in pure materials without essentially modifying their initial properties by doping. In this thesis, the modified radiotracer technique is used to study the diffusion of beryllium in GaN, ZnO, SiGe and glassy carbon. GaN, ZnO and SiGe are of great interest to the semiconductor industry, and beryllium, as a small and possibly rapid dopant, has not been studied previously using this technique. Glassy carbon has been added to demonstrate the feasibility of the technique. In addition, the diffusion of the magnetic impurities Mn and Co has been studied in GaAs and ZnO, respectively, with spintronic applications in mind.
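As an illustration of how diffusion coefficients are typically extracted in tracer experiments of this kind (textbook relations, not necessarily the exact analysis of the thesis), an implanted, approximately Gaussian tracer profile broadens during annealing, and the temperature dependence of the extracted diffusivity usually follows an Arrhenius law:

    \sigma^2(t) = \sigma_0^2 + 2 D t, \qquad D(T) = D_0 \exp\!\left( -\frac{E_A}{k_B T} \right)

where \sigma_0 is the as-implanted profile width, D the diffusion coefficient, D_0 the pre-exponential factor and E_A the activation energy of the diffusion process.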

Abstract:

Inflation is a period of accelerated expansion in the very early universe, which has the appealing feature that it can create primordial perturbations via quantum fluctuations. These primordial perturbations have been observed in the cosmic microwave background, and they also act as the seeds of all large-scale structure in the universe. Curvaton models are simple modifications of the standard inflationary paradigm, in which inflation is driven by the energy density of the inflaton, but another field, the curvaton, is responsible for producing the primordial perturbations. The curvaton decays after inflation has ended, whereupon the isocurvature perturbations of the curvaton are converted into adiabatic perturbations. Since the curvaton must decay, it must have some interactions. In addition, realistic curvaton models typically have some self-interactions. In this work we consider self-interacting curvaton models, where the self-interaction is a monomial in the potential, suppressed by the Planck scale, and thus the self-interaction is very weak. Nevertheless, since the self-interaction makes the equations of motion non-linear, it can modify the behaviour of the model drastically. The most intriguing aspect of this behaviour is that the final properties of the perturbations become highly dependent on the initial values. Departures from a Gaussian distribution are important observables of the primordial perturbations. Owing to the non-linearity of the self-interacting curvaton model and its sensitivity to initial conditions, it can produce significant non-Gaussianity in the primordial perturbations. In this work we investigate the non-Gaussianity produced by the self-interacting curvaton and demonstrate that the non-Gaussianity parameters do not obey the analytically derived approximate relations often cited in the literature. Furthermore, we also consider a self-interacting curvaton with a mass at the TeV scale. Motivated by realistic particle physics models such as the Minimal Supersymmetric Standard Model, we demonstrate that a curvaton model in this mass range can be responsible for the observed perturbations if the curvaton decays late enough.
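For orientation, the model class and the observables discussed above can be summarized with standard definitions from the curvaton literature (quoted for context, not as results of this thesis): the potential is a quadratic term plus a Planck-suppressed monomial, the local non-Gaussianity parameter f_NL is defined through the expansion of the curvature perturbation \zeta, and for the non-interacting quadratic curvaton an often-cited approximate relation ties f_NL to the curvaton energy fraction r at decay:

    V(\sigma) = \tfrac{1}{2} m^2 \sigma^2 + \lambda \, \frac{\sigma^{n+4}}{M_{\rm Pl}^{\,n}}, \qquad \zeta = \zeta_g + \tfrac{3}{5} f_{\rm NL} \zeta_g^2 + \dots, \qquad f_{\rm NL} \simeq \frac{5}{4r} - \frac{5}{3} - \frac{5r}{6}

It is relations of this last type that the self-interacting model is shown not to obey.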

Abstract:

The magnetic field of the Earth is 99 % of internal origin and is generated in the liquid outer core by the dynamo principle. In the 19th century, Carl Friedrich Gauss proved that the field can be described by a sum of spherical harmonic terms. Presently, this theory is the basis of, e.g., the IGRF models (International Geomagnetic Reference Field), which are the most accurate description available of the geomagnetic field. On average, the dipole forms 3/4 and the non-dipolar terms 1/4 of the instantaneous field, but the temporal mean of the field is assumed to be a purely geocentric axial dipolar field. The validity of this GAD (Geocentric Axial Dipole) hypothesis has been tested using several methods. In this work, the testing rests on the frequency distribution of inclinations with respect to latitude. Each combination of dipole (GAD), quadrupole (G2) and octupole (G3) produces a distinct inclination distribution. These theoretical distributions have been compared with those calculated from empirical observations from different continents and, finally, from the entire globe. Only data from Precambrian rocks (over 542 million years old) have been used in this work. The basic assumption is that, during the long-term course of drifting continents, the globe is sampled adequately. There were 2823 observations altogether in the paleomagnetic database of the University of Helsinki. The effects of the quality of the observations, as well as of the age and rock type, have been tested. For the comparison between theoretical and empirical distributions, chi-square testing has been applied. In addition, spatiotemporal binning has been used effectively to remove the errors caused by multiple observations. The modelling from igneous rock data indicates that the average magnetic field of the Earth is best described by a combination of a geocentric dipole and a very weak octupole (less than 10 % of GAD). Filtering and binning gave the distributions a more GAD-like appearance, but the deviation from GAD increased as a function of the age of the rocks. The distribution calculated from the so-called key poles, the most reliable determinations, behaves almost like GAD, having a zero quadrupole and an octupole of 1 % of GAD. In no earlier study have rocks older than 400 Ma given a result so close to GAD, but low inclinations have been prominent especially in the sedimentary data. Despite these results, a greater amount of high-quality data and a proof of the long-term randomness of the Earth's continental motions are needed to make sure the dipole model holds true.
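The inclination test used here rests on the textbook dipole relation between magnetic inclination I and latitude \lambda (quoted for context):

    \tan I = 2 \tan \lambda

For a pure GAD field this fixes the expected frequency distribution of inclinations for a globe sampled uniformly by drifting continents; adding zonal quadrupole (G2) and octupole (G3) contributions distorts that distribution, which is what the chi-square comparison detects.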

Abstract:

Interaction between forests and the atmosphere occurs through radiative and turbulent transport. The fluxes of energy and mass between the surface and the atmosphere directly influence the properties of the lower atmosphere and, on longer time scales, the global climate. Boreal forest ecosystems are central to the global climate system and to its responses to human activities, because they are significant sources and sinks of greenhouse gases and of aerosol particles. The aim of the present work was to improve our understanding of the interplay between the biologically active canopy, the microenvironment and the turbulent flow. In particular, the aim was to quantify the contribution of different canopy layers to whole-forest fluxes. For this purpose, long-term micrometeorological and ecological measurements made in a Scots pine (Pinus sylvestris) forest at the SMEAR II research station in southern Finland were used. The properties of the turbulent flow are strongly modified by the interaction with the canopy elements: momentum is efficiently absorbed in the upper layers of the canopy, mean wind speed and turbulence intensities decrease rapidly towards the forest floor, and the power spectra are modulated by the 'spectral short-cut'. In the relatively open forest, diabatic stability above the canopy explained much of the variation in the velocity statistics within the canopy, except in strongly stable stratification. Large eddies, ranging from tens to hundreds of metres in size, were responsible for the major fraction of turbulent transport between the forest and the atmosphere. Because of this, the eddy-covariance (EC) method proved successful for measuring energy and mass exchange inside the forest canopy, with the exception of strongly stable conditions. Vertical variations in the within-canopy microclimate, light attenuation in particular, strongly affect the assimilation and transpiration rates. According to model simulations, the assimilation rate decreases with height more rapidly than stomatal conductance (gs) and transpiration, and consequently the vertical source-sink distributions of carbon dioxide (CO2) and water vapor (H2O) diverge. Upscaling from the shoot scale to the canopy scale was found to be sensitive to the chosen description of stomatal control. The upscaled canopy-level CO2 fluxes can vary by as much as 15 % and the H2O fluxes by 30 %, even if the gs models are calibrated against the same leaf-level dataset. A pine forest has distinct overstory and understory layers, which both contribute significantly to the canopy-scale fluxes. The forest floor vegetation and soil accounted for between 18 and 25 % of evapotranspiration and between 10 and 20 % of sensible heat exchange. The forest floor was also an important deposition surface for aerosol particles; between 10 and 35 % of the dry deposition of particles in the size range 10-30 nm occurred there. Because of the northern latitude, the seasonal cycle of climatic factors strongly influences the surface fluxes. Besides the seasonal constraints, the partitioning of available energy into sensible and latent heat depends, through stomatal control, on the physiological state of the vegetation. In spring, available energy was consumed mainly as sensible heat, and the latent heat flux peaked about two months later, in July-August. On the other hand, annual evapotranspiration remains rather stable over a range of environmental conditions, and thus any increase in accumulated radiation affects primarily the sensible heat exchange.
Finally, autumn temperature had a strong effect on ecosystem respiration, but its influence on photosynthetic CO2 uptake was restricted by low radiation levels. Therefore, the projected autumn warming in the coming decades will presumably reduce the positive effects of earlier spring recovery on the carbon uptake potential of boreal forests.
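For reference, the eddy-covariance fluxes referred to above are defined as the time-averaged covariance between fluctuations of the vertical wind speed w and the transported quantity (a standard definition, not specific to this site):

    F_c = \overline{w' c'}, \qquad H = \rho c_p \, \overline{w' T'}, \qquad LE = \rho L_v \, \overline{w' q'}

where c is a scalar concentration, H and LE the sensible and latent heat fluxes, \rho the air density, c_p the specific heat of air, L_v the latent heat of vaporization, T the temperature and q the specific humidity.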

Abstract:

This research is connected with an education development project for the four-year officer education programme at the National Defence University. In this curriculum, physics was studied in two alternative course plans, namely scientific and general. Observations connected to the latter, e.g. student feedback and learning outcomes, indicated that action was needed to support the course. The reform work focused on producing aligned, course-related instructional material. The learning material project produced a customized textbook set for the students of the general basic physics course. The research adapts phases that are typical of Design-Based Research (DBR). The research analyses the feature requirements for a physics textbook aimed at a specific sector and the frames supporting instructional material development, and summarizes the experiences gained in the learning material project when the selected frames were applied. The quality of instructional material is an essential part of high-quality teaching. The goal of instructional material customization is to increase the product's customer-centric nature and to enhance its function as a support medium for the learning process. Textbooks are still one of the core elements in physics teaching. The idea of a textbook will remain, but its form and appearance may change according to the prevailing technology. The work deals with substance-connected frames (the demands on a physics textbook from the PER viewpoint, quality thinking in educational material development), frames of university pedagogy, and instructional material production processes. A wide knowledge and understanding of the different frames are useful in development work if they are utilized to aid inspiration without limiting new reasoning and new kinds of models. Applying customization even in the use of the frames supports creative and situation-aware design and diminishes the gap between theory and practice. Generally, physics teachers produce their own supplementary instructional material. Even though customization thinking is not unknown, the threshold to produce an entire textbook may be high. Although the observations here are from the general physics course at the NDU, the research also gives tools for development in other discipline-related educational contexts. This research is an example of instructional material development work, together with the questions it uncovers, and presents thoughts on when textbook customization is rewarding. At the same time, the research aims to further creative customization thinking in instruction and development. Key words: Physics textbook, PER (Physics Education Research), Instructional quality, Customization, Creativity

Abstract:

Volatile organic compounds (VOCs) are emitted into the atmosphere from natural and anthropogenic sources, vegetation being the dominant source on a global scale. Some of these reactive compounds are deemed major contributors or inhibitors to aerosol particle formation and growth, thus making VOC measurements essential for current climate change research. This thesis discusses ecosystem scale VOC fluxes measured above a boreal Scots pine dominated forest in southern Finland. The flux measurements were performed using the micrometeorological disjunct eddy covariance (DEC) method combined with proton transfer reaction mass spectrometry (PTR-MS), which is an online technique for measuring VOC concentrations. The measurement, calibration, and calculation procedures developed in this work proved to be well suited to long-term VOC concentration and flux measurements with PTR-MS. A new averaging approach based on running averaged covariance functions improved the determination of the lag time between wind and concentration measurements, which is a common challenge in DEC when measuring fluxes near the detection limit. The ecosystem scale emissions of methanol, acetaldehyde, and acetone were substantial. These three oxygenated VOCs made up about half of the total emissions, with the rest comprised of monoterpenes. Contrary to the traditional assumption that monoterpene emissions from Scots pine originate mainly as evaporation from specialized storage pools, the DEC measurements indicated a significant contribution from de novo biosynthesis to the ecosystem scale monoterpene emissions. This thesis offers practical guidelines for long-term DEC measurements with PTR-MS. In particular, the new averaging approach to the lag time determination seems useful in the automation of DEC flux calculations. Seasonal variation in the monoterpene biosynthesis and the detailed structure of a revised hybrid algorithm, describing both de novo and pool emissions, should be determined in further studies to improve biological realism in the modelling of monoterpene emissions from Scots pine forests. The increasing number of DEC measurements of oxygenated VOCs will probably enable better estimates of the role of these compounds in plant physiology and tropospheric chemistry. Keywords: disjunct eddy covariance, lag time determination, long-term flux measurements, proton transfer reaction mass spectrometry, Scots pine forests, volatile organic compounds
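As an illustration of the lag-time step mentioned above, the lag between the wind and the concentration record is commonly taken as the shift that maximizes the absolute cross-covariance of the two series. The sketch below shows that general idea only; it is not the running-averaged algorithm developed in the thesis, and all names in it are hypothetical.

    import numpy as np

    def lag_by_max_covariance(w, c, max_lag):
        """Return the lag (in samples) that maximizes |cov(w, c)| and that covariance.

        w       -- vertical wind speed samples
        c       -- scalar (e.g. VOC) concentration samples, same length as w
        max_lag -- largest lag to search, in samples
        """
        w = np.asarray(w, dtype=float) - np.mean(w)
        c = np.asarray(c, dtype=float) - np.mean(c)
        lags = np.arange(-max_lag, max_lag + 1)
        cov = np.empty(lags.size)
        for i, lag in enumerate(lags):
            if lag >= 0:
                # positive lag: the concentration signal arrives later than the wind signal
                cov[i] = np.mean(w[:w.size - lag] * c[lag:])
            else:
                cov[i] = np.mean(w[-lag:] * c[:c.size + lag])
        best = int(np.argmax(np.abs(cov)))
        return lags[best], cov[best]

    # Synthetic example: the concentration lags the wind by 25 samples.
    rng = np.random.default_rng(0)
    w = rng.normal(size=20000)
    c = np.roll(w, 25) + 0.5 * rng.normal(size=20000)
    print(lag_by_max_covariance(w, c, max_lag=100))  # lag close to 25

In flux processing, the covariance at the chosen lag is the kinematic flux estimate; the thesis improves on exactly this step for noisy signals near the detection limit.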

Abstract:

Atmospheric particles affect the radiation balance of the Earth and thus the climate. New particle formation by nucleation has been observed in diverse atmospheric conditions, but the actual formation path is still unknown. The prevailing conditions can be exploited to evaluate proposed formation mechanisms. This study aims to improve our understanding of new particle formation from the viewpoint of atmospheric conditions. The role of atmospheric conditions in particle formation was studied by atmospheric measurements, theoretical model simulations and simulations based on observations. Two separate column models were further developed for aerosol and chemical simulations. The model simulations allowed us to expand the study from local conditions to varying conditions in the atmospheric boundary layer, while the long-term measurements described especially the characteristic mean conditions associated with new particle formation. The observations show a statistically significant difference in meteorological and background aerosol conditions between observed event and non-event days. New particle formation above the boreal forest is associated with strong convective activity, low humidity and a low condensation sink. The probability of a particle formation event is predicted by an equation formulated for upper boundary layer conditions. The model simulations call into question whether kinetic sulphuric-acid-induced nucleation is the primary particle formation mechanism in the presence of organic vapours. At the same time, the simulations show that ignoring spatial and temporal variation in new particle formation studies may lead to faulty conclusions. On the other hand, the theoretical simulations indicate that short-scale variations in temperature and humidity are unlikely to have a significant effect on the mean binary water-sulphuric acid nucleation rate. The study emphasizes the significance of mixing and fluxes in particle formation studies, especially in the atmospheric boundary layer. The further developed models allow extensive aerosol physical and chemical studies in the future.
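For context, the condensation sink mentioned above is the standard aerosol-physics measure of how rapidly condensable vapours are scavenged by the pre-existing particle population (the commonly used definition, quoted for illustration):

    CS = 4 \pi D \sum_i \beta_{m}(r_i) \, r_i \, N_i

where D is the diffusion coefficient of the condensing vapour, r_i and N_i the radius and number concentration of particles in size class i, and \beta_m the transition-regime correction; a low CS leaves more vapour available for nucleation and growth, consistent with its role as a favourable condition above.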

Abstract:

Thin films are the basis of much recent technological advance, ranging from coatings with mechanical or optical benefits to platforms for nanoscale electronics. In the latter, semiconductors have been the norm ever since silicon became the main construction material for a multitude of electronic components. The range of characteristics of silicon-based systems can be widened by manipulating the structure of the thin films at the nanoscale - for instance, by making them porous. The different characteristics of different films can then, to some extent, be combined by simple superposition. Thin films can be manufactured using many different methods. One emerging field is cluster beam deposition, where aggregates of hundreds or thousands of atoms are deposited one by one to form a layer, the characteristics of which depend on the parameters of deposition. One critical parameter is the deposition energy, which dictates how porous, if at all, the layer becomes. Other parameters, such as the sputtering rate and the aggregation conditions, affect the size and consistency of the individual clusters. Understanding nanoscale processes, which cannot be observed experimentally, is fundamental to optimizing experimental techniques and inventing new possibilities for advances at this scale. Atomistic computer simulations offer a window to the world of nanometres and nanoseconds in a way unparalleled by the most accurate of microscopes. Transmission electron microscope image simulations can then bridge this gap by providing a tangible link between the simulated and the experimental. In this thesis, the entire process of cluster beam deposition is explored using molecular dynamics and image simulations. The process begins with the formation of the clusters, which is investigated for Si/Ge in an Ar atmosphere. The structure of the clusters is optimized to bring it as close to the experimental ideal as possible. Then, clusters are deposited, one by one, onto a substrate until a sufficiently thick layer has been produced. Finally, the concept is expanded by further deposition with different parameters, resulting in multiple superimposed layers of different porosities. This work demonstrates that the aggregation of clusters is not entirely understood within the scope of the approximations used in the simulations; yet it is also shown how the continued deposition of clusters with a varying deposition energy can lead to a novel kind of nanostructured thin film: a multielemental porous multilayer. According to theory, these new structures have characteristics that can be tailored for a variety of applications, with precision heretofore unseen in conventional multilayer manufacture.

Abstract:

Nucleation is the first step in a phase transition, where small nuclei of the new phase start appearing in the metastable old phase, such as the appearance of small liquid clusters in a supersaturated vapor. Nucleation is important in various industrial and natural processes, including atmospheric new particle formation: between 20 % and 80 % of the atmospheric particle concentration is due to nucleation. These atmospheric aerosol particles have a significant effect on both climate and human health. Different simulation methods are often applied when studying things that are difficult or even impossible to measure, or when trying to distinguish between the merits of various theoretical approaches. Such simulation methods include, among others, molecular dynamics and Monte Carlo simulations. In this work, molecular dynamics simulations of the homogeneous nucleation of Lennard-Jones argon have been performed. Homogeneous means that the nucleation does not occur on a pre-existing surface. The simulations include runs where the starting configuration is a supersaturated vapor and the nucleation event is observed during the simulation (direct simulations), as well as simulations of a cluster in equilibrium with a surrounding vapor (indirect simulations). The latter type is a necessity when the conditions prevent the occurrence of a nucleation event within a reasonable time frame in the direct simulations. The effect of various temperature control schemes on the nucleation rate (the rate of appearance of clusters that are equally able to grow to macroscopic sizes and to evaporate) was studied and found to be relatively small. The method used to extract the nucleation rate was also found to be of minor importance. The cluster sizes from the direct and indirect simulations were used in conjunction with the nucleation theorem to calculate formation free energies for the clusters in the indirect simulations. The results agreed with density functional theory but were higher than values from Monte Carlo simulations. The formation energies were also used to calculate the surface tension of the clusters. The sizes of the clusters in the direct and indirect simulations were compared, showing that the direct-simulation clusters have more atoms between the liquid-like core of the cluster and the surrounding vapor. Finally, the performance of various nucleation theories in predicting the simulated nucleation rates was investigated, and the results, among other things, highlighted once again the inadequacy of the classical nucleation theory that is commonly employed in nucleation studies.
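Two textbook relations underlie the analysis described above and are quoted here for context: the classical nucleation theory (CNT) expression for the nucleation rate and barrier, and the first nucleation theorem that connects the measured rate to the critical cluster size n*:

    J = J_0 \exp\!\left( -\frac{\Delta G^{*}}{k_B T} \right), \qquad \Delta G^{*} = \frac{16 \pi \sigma^3 v_l^2}{3 (k_B T \ln S)^2}, \qquad \left( \frac{\partial \ln J}{\partial \ln S} \right)_T \approx n^{*} + 1

where S is the supersaturation, \sigma the planar surface tension and v_l the molecular volume in the liquid; the formation free energies obtained via the nucleation theorem are the quantities compared with density functional theory and Monte Carlo results above.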

Abstract:

Physics teachers are in a key position to form the attitudes and conceptions of future generations toward science and technology, as well as to educate future generations of scientists. Therefore, good teacher education is one of the key areas of a physics department's education programme. This dissertation is a contribution to the research-based development of high-quality physics teacher education, designed to meet three central challenges of good teaching. The first challenge relates to the organization of physics content knowledge. The second challenge, connected to the first one, is to understand the role of experiments and models in (re)constructing the content knowledge of physics for the purposes of teaching. The third challenge is to provide pre-service physics teachers with opportunities and resources for reflecting on and assessing their knowledge and experience of physics and physics education. This dissertation demonstrates how these challenges can be met when the content knowledge of physics, the relevant epistemological aspects of physics and the pedagogical knowledge of teaching and learning physics are combined. The theoretical part of this dissertation is concerned with designing two didactical reconstructions for the purposes of physics teacher education: the didactical reconstruction of processes (DRoP) and the didactical reconstruction of structures (DRoS). This part starts by taking into account the required professional competencies of physics teachers, the pedagogical aspects of teaching and learning, and the benefits of graphical ways of representing knowledge. It then continues with a conceptual and philosophical analysis of physics, especially with an analysis of the role of experiments and models in constructing knowledge. This analysis is condensed in the form of an epistemological reconstruction of knowledge justification. Finally, these two parts are combined in the design and production of the DRoP and DRoS. The DRoP captures the formation of knowledge about physical concepts and laws in a concise and simplified form while still retaining authenticity with respect to the processes by which the concepts have been formed. The DRoS is used for representing the structural knowledge of physics, the connections between physical concepts, quantities and laws, to varying extents. Both the DRoP and the DRoS are represented graphically by means of flow charts consisting of nodes and directed links connecting the nodes. The empirical part discusses two case studies that show how the three challenges are met through the use of the DRoP and DRoS and how the outcomes of teaching solutions based on them are evaluated. The research approach is qualitative; it aims at an in-depth evaluation and understanding of the usefulness of the didactical reconstructions. The data, which were collected from the advanced course for prospective physics teachers during 2001-2006, consisted of DRoP and DRoS flow charts made by the students and of student interviews. The first case study discusses how student teachers used DRoP flow charts to understand the process of forming knowledge about the law of electromagnetic induction. The second case study discusses how student teachers learned to understand the development of physical quantities related to the temperature concept by using DRoS flow charts. In both studies, the attention is focused on the use of the DRoP and DRoS to organize knowledge and on the role of experiments and models in this organization process.
The results show that the students' understanding of physics knowledge production improved and that their knowledge became more organized and coherent. It is shown that the flow charts, and the didactical reconstructions behind them, had an important role in achieving these positive learning results. On the basis of the results reported here, the designed learning tools have been adopted as a standard part of the teaching solutions used in the physics teacher education courses in the Department of Physics, University of Helsinki.

Abstract:

Sea level rise is among the most worrying consequences of climate change, and the biggest uncertainty in sea level predictions lies in the future behaviour of the ice sheets of Greenland and Antarctica. In this work, a literature review is made concerning the future of the Greenland ice sheet and the effect of its melting on Baltic Sea level. The relation between sea level and ice sheets is also considered more generally from a theoretical and historical point of view. Lately, surprisingly rapid changes in the amount of ice discharging into the sea have been observed along the coastal areas of the ice sheets, and the mass deficit of the Greenland and West Antarctic ice sheets, which are considered vulnerable to warming, has been increasing since the 1990s. The changes are probably related to atmospheric or oceanic temperature variations, which affect the flow speed of the ice either via meltwater penetrating to the bottom of the ice sheet or via changes in the flow resistance generated by the floating parts of an ice stream. These phenomena are assumed to increase the mass deficit of the ice sheets in a warming climate; however, there is no comprehensive theory to explain and model them. Thus, it is not yet possible to make reliable predictions of the ice sheet contribution to sea level rise. On the grounds of the historical evidence, it appears that sea level can rise rather rapidly, 1-2 metres per century, even during warm climate periods. Sea level rise projections of a similar magnitude have been made with so-called semi-empirical methods that are based on modelling the link between sea level and global mean temperature. Such a rapid rise would require considerable acceleration of the ice sheet flow. A stronger rise appears rather unlikely, among other things because the mountainous coastline restricts ice discharge from Greenland. The upper limit of the sea level rise from Greenland alone has been estimated at half a metre by the end of this century. Due to changes in the Earth's gravity field, the sea level rise caused by melting ice is not spatially uniform. Near the melting ice sheet the sea level rise is considerably smaller than the global average, whereas farther away it is slightly greater than the average. Because of this phenomenon, the effect of the Greenland ice sheet on Baltic Sea level will probably be rather small during this century, 15 cm at most. Melting of the Antarctic ice sheet is clearly more dangerous for the Baltic Sea, but also very uncertain. It is likely that sea level predictions will become more accurate in the near future as ice sheet models develop.
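The semi-empirical methods mentioned above link the rate of sea level rise to warming; in the simplest, widely cited form (a generic version, quoted for illustration rather than the exact formulation reviewed here):

    \frac{dH}{dt} = a \left( T(t) - T_0 \right)

where H is global mean sea level, T the global mean temperature, T_0 the temperature at which sea level is in equilibrium and a an empirically fitted sensitivity; integrating this over projected temperature scenarios gives century-scale rise estimates of the magnitude quoted above, of the order of one to two metres.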

Abstract:

At mid-latitudes, day-to-day weather variations are largely tied to the movements of cyclones. It is therefore important to find out how cyclone activity may change as the greenhouse effect intensifies. Research to date has both examined existing reanalysed weather observation time series and simulated changes in cyclone activity with climate models. The problem with the reanalyses is their inhomogeneity and shortness. Climate models, by contrast, can be used to generate longer time series extending into the future, in which the effect of the climate forcing can be brought out more clearly. Based on 30 climate-model-based studies published between 1993 and 2009, this thesis aims to establish what results have so far been obtained when simulating changes in cyclone activity. The results are partly contradictory, which stems from differences in, among other things, the properties of the models, the climate forcings used, and the ways in which the time series have been analysed. In particular, the way in which cyclone climatologies are extracted from the time series creates differences between the studies. The most common methods are band-pass filtering (BP filtering) and various pattern-recognition-based cyclone detection and tracking methods. Although research has largely moved towards detection and tracking methods, a problem is that they operate in different ways, which makes them difficult to compare with one another. Despite the diversity of methods, there is reasonable agreement on some qualitative changes in cyclone activity: at mid-latitudes the number of cyclones will decrease, their mean intensity will increase, and the cyclone tracks will shift poleward in both hemispheres. Results obtained from reanalyses support the intensification and the shift of the tracks but disagree on the number of cyclones. It is possible that the increasing trend in cyclone numbers observed in the reanalyses during the latter half of the 20th century is explained by improved observation methods or by long-term natural variability in cyclone activity.
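To illustrate the band-pass filtering approach mentioned above, synoptic-scale (cyclone-related) variability is typically isolated by filtering a field such as sea level pressure to periods of roughly 2-6 days. The sketch below is a generic example of such a filter; the variable names and parameter choices are illustrative and do not reproduce the procedure of any particular study reviewed here.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass_2_6_days(series, samples_per_day=4, order=4):
        """Band-pass filter a time series to the synoptic 2-6 day band.

        series          -- 1-D array, e.g. sea level pressure at one grid point
        samples_per_day -- sampling frequency (4 corresponds to 6-hourly data)
        """
        nyquist = 0.5 * samples_per_day           # in cycles per day
        low = (1.0 / 6.0) / nyquist               # 6-day period -> lower cutoff
        high = (1.0 / 2.0) / nyquist              # 2-day period -> upper cutoff
        b, a = butter(order, [low, high], btype="band")
        return filtfilt(b, a, series)             # zero-phase filtering

    # Synthetic example: a 4-day "cyclone" signal on top of a slow 30-day variation.
    t = np.arange(0.0, 90.0, 0.25)                # 90 days of 6-hourly samples
    slp = (10 * np.sin(2 * np.pi * t / 4)
           + 5 * np.sin(2 * np.pi * t / 30)
           + np.random.default_rng(1).normal(size=t.size))
    synoptic = bandpass_2_6_days(slp)
    print(np.std(synoptic))  # standard deviation of the 2-6 day variability

The variance of the band-passed field is a common measure of storm-track activity, whereas the detection and tracking methods instead identify and follow individual pressure minima or vorticity maxima.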

Abstract:

This doctoral thesis concerns the influence of the solar wind on the atmosphere of the planet Venus. A numerical plasma simulation model was developed for the interaction between Venus and the solar wind in order to study the erosion of charged particles from the Venus upper atmosphere. The developed model is a hybrid simulation, in which ions are treated as particles and electrons are modelled as a fluid. The simulation was used to study the solar wind induced ion escape from Venus as observed by the European Space Agency's Venus Express and NASA's Pioneer Venus Orbiter spacecraft. In particular, observations made by the ASPERA-4 particle instrument onboard Venus Express were studied. The thesis consists of an introductory part and four peer-reviewed articles published in scientific journals. In the introduction, Venus is presented as one of the terrestrial planets of the Solar System, and the main findings of the work are discussed within the wider context of planetary physics. Venus is the Earth's closest neighbouring planet and, in its size and mass, the most Earth-like planet orbiting the Sun. Whereas the atmosphere of the Earth consists mainly of nitrogen and oxygen, Venus has a hot carbon dioxide atmosphere dominated by the greenhouse effect. Venus has all of its water in the atmosphere, and this amounts to only a fraction of the Earth's total water supply. Since the planets presumably developed in similar conditions in the young Solar System, why did Venus and Earth become so different in many respects? One important feature of Venus is that the planet does not have an intrinsic magnetic field. This makes it possible for the solar wind, a continuous stream of charged particles from the Sun, to flow close to Venus and to pick up ions from the planet's upper atmosphere. The strong intrinsic magnetic field of the Earth, by contrast, dominates the terrestrial magnetosphere and deflects the solar wind flow far away from the atmosphere. The region around Venus where the planet's atmosphere interacts with the solar wind is called the plasma environment or the induced magnetosphere. The main findings of the work include new knowledge about the movement of escaping planetary ions in the Venusian induced magnetosphere. Further, the developed simulation model was used to study how the solar wind conditions affect the ion escape from Venus. In particular, the global three-dimensional structure of the Venusian particle and magnetic environment was studied. The results help in interpreting spacecraft observations around the planet. Finally, several remaining questions were identified, the study of which could improve our knowledge of the Venus ion escape and guide the future development of planetary plasma simulations.
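For context, the hybrid description referred to above treats the ions kinetically and the electrons as a massless, charge-neutralizing fluid; in its commonly used form (generic hybrid-model equations, not necessarily the exact closure of this particular code) the ions obey the Lorentz force and the electric field follows from the electron momentum equation, with the current obtained from Ampere's law:

    m_i \frac{d\mathbf{v}_i}{dt} = q_i \left( \mathbf{E} + \mathbf{v}_i \times \mathbf{B} \right), \qquad \mathbf{E} = - \mathbf{U}_e \times \mathbf{B} - \frac{\nabla p_e}{e n_e}, \qquad \mathbf{U}_e = \mathbf{U}_i - \frac{\nabla \times \mathbf{B}}{\mu_0 e n_e}

where \mathbf{U}_i is the charge-averaged ion bulk velocity, n_e the electron density and p_e the electron pressure; the magnetic field is advanced with Faraday's law, \partial \mathbf{B} / \partial t = - \nabla \times \mathbf{E}.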