997 results for Division of Biological Sciences


Relevance:

100.00%

Publisher:

Abstract:

Nucleation is the first step in a phase transition, where small nuclei of the new phase start appearing in the metastable old phase, such as small liquid clusters appearing in a supersaturated vapor. Nucleation is important in various industrial and natural processes, including atmospheric new particle formation: between 20% and 80% of the atmospheric particle number concentration is due to nucleation. These atmospheric aerosol particles have a significant effect on both climate and human health. Simulation methods are often applied when studying phenomena that are difficult or even impossible to measure, or when trying to distinguish between the merits of various theoretical approaches. Such methods include, among others, molecular dynamics and Monte Carlo simulations. In this work, molecular dynamics simulations of the homogeneous nucleation of Lennard-Jones argon have been performed; homogeneous means that the nucleation does not occur on a pre-existing surface. The simulations include runs where the starting configuration is a supersaturated vapor and the nucleation event is observed during the simulation (direct simulations), as well as simulations of a cluster in equilibrium with a surrounding vapor (indirect simulations). The latter type is a necessity when the conditions prevent the occurrence of a nucleation event within a reasonable timeframe in the direct simulations. The effect of various temperature control schemes on the nucleation rate (the rate of appearance of clusters that are equally likely to grow to macroscopic size or to evaporate) was studied and found to be relatively small. The method used to extract the nucleation rate was also found to be of minor importance. The cluster sizes from the direct and indirect simulations were used in conjunction with the nucleation theorem to calculate formation free energies for the clusters in the indirect simulations. The results agreed with density functional theory but were higher than values from Monte Carlo simulations. The formation free energies were also used to calculate the surface tension of the clusters. Comparing the sizes of the clusters in the direct and indirect simulations showed that the direct-simulation clusters have more atoms between the liquid-like core of the cluster and the surrounding vapor. Finally, the performance of various nucleation theories in predicting the simulated nucleation rates was investigated, and the results, among other things, once again highlighted the inadequacy of the classical nucleation theory commonly employed in nucleation studies.
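
As a loose illustration of the direct-simulation analysis, nucleation rates are often extracted with a threshold method of the Yasuoka-Matsumoto type: the cumulative number of clusters that have grown past a threshold size is tracked against time, and the slope divided by the system volume gives the rate. The sketch below is hypothetical (the function, threshold and data are not from the thesis):

```python
import numpy as np

def nucleation_rate_threshold(times, n_clusters_past, volume):
    """Threshold (Yasuoka-Matsumoto-style) estimate of the nucleation
    rate J: fit the cumulative number of clusters that have exceeded a
    threshold size against time; slope / volume gives J."""
    slope, _ = np.polyfit(times, n_clusters_past, 1)  # clusters per unit time
    return slope / volume

# Hypothetical data: 10 clusters crossing a 50-atom threshold in 100 ns
t = np.linspace(0.0, 100.0, 11)                    # ns
n = np.arange(11, dtype=float)                     # cumulative cluster count
J = nucleation_rate_threshold(t, n, volume=1.0e6)  # box of 1e6 nm^3
print(f"J ~ {J:.1e} clusters nm^-3 ns^-1")
```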

Relevance:

100.00%

Publisher:

Abstract:

Mesoscale weather phenomena, such as the sea breeze circulation or lake-effect snow bands, are typically too large to be observed at one point, yet too small to be caught in a traditional network of weather stations. Hence, the weather radar is one of the best tools for observing, analyzing and understanding their behavior and development. A weather radar network is a complex system with many structural and technical features to be tuned, from the location of each radar to the number of pulses averaged in the signal processing. These design parameters have no universally optimal values; their selection depends on the nature of the weather phenomena to be monitored as well as on the applications for which the data will be used. The priorities and critical values are different for forest fire forecasting, aviation weather service or the planning of snow ploughing, to name a few radar-based applications. The main objective of the work performed within this thesis has been to combine knowledge of the technical properties of radar systems with our understanding of weather conditions, in order to produce better applications able to efficiently support decision making in weather- and safety-related services for modern society in northern conditions. When a new application is developed, it must be tested against "ground truth". Two new verification approaches for radar-based hail estimates are introduced in this thesis. For mesoscale applications, finding a representative reference can be challenging, since these phenomena are by definition difficult to catch with surface observations. Hence, almost any valuable information that can be distilled from unconventional data sources, such as newspapers and holiday snapshots, is welcome. However, obtaining estimates of data quality, and judging to what extent two disparate information sources can be compared, is just as important as obtaining the data itself. The new applications presented here do not rely on radar data alone but ingest information from auxiliary sources such as temperature fields. The author concludes that the radar will continue to be a key source of data and information in the future, especially when used effectively together with other meteorological data.
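
As one hedged illustration of the kind of radar-based hail estimate whose verification is at issue here, a Waldvogel-type criterion relates the probability of hail to how far the 45 dBZ echo top extends above the freezing level. The thresholds below are rounded literature values, not necessarily those used in the thesis:

```python
def hail_probability(echotop45_km, freezing_level_km):
    """Waldvogel-style hail indicator: the probability of hail grows with
    the height difference between the 45 dBZ echo top and the freezing
    level; below ~1.4 km it is taken as zero. Coefficients illustrative."""
    dh = echotop45_km - freezing_level_km
    if dh < 1.4:
        return 0.0
    # simple saturating ramp from 0 % at 1.4 km to 100 % at ~5.5 km
    return min(1.0, (dh - 1.4) / (5.5 - 1.4))

print(hail_probability(7.2, 3.0))   # hypothetical storm -> ~0.68
```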

Relevance:

100.00%

Publisher:

Abstract:

Aerosol particles deteriorate air quality, atmospheric visibility and our health. They affect the Earth's climate by absorbing and scattering sunlight, by forming clouds, and via several feedback mechanisms. Their net effect on the radiative balance is negative, i.e. cooling, which means that particles counteract the effect of greenhouse gases. However, particles are one of the poorly known pieces in the climate puzzle. Some airborne particles are natural, some anthropogenic; some enter the atmosphere in particle form, while others form by gas-to-particle conversion. Unless the sources and the dynamical processes shaping the particle population are quantified, they cannot be incorporated into climate models. The molecular-level understanding of new particle formation is still inadequate, mainly due to the lack of suitable measurement techniques for detecting the smallest particles and their precursors. This thesis has contributed to our ability to measure newly formed particles. Three new condensation particle counter applications for measuring the concentration of nano-particles were developed. The suitability of the methods for detecting both charged and electrically neutral particles and molecular clusters as small as 1 nm in diameter was thoroughly tested in both laboratory and field conditions. It was shown that condensation particle counting has reached the size scale of individual molecules, and that, besides measuring concentrations, the counters can be used to obtain size information. In addition to atmospheric research, the particle counters could have various applications in other fields, especially in nanotechnology. Using the new instruments, the first continuous time series of neutral sub-3 nm particle concentrations were measured at two field sites representing two different kinds of environments: the boreal forest and the Atlantic coastline, both of which are known to be hot-spots for new particle formation. The contribution of ions to the total concentrations in this size range was estimated, and it could be concluded that the fraction of ions was usually minor, especially in boreal forest conditions. Since the ionization rate is connected to the amount of cosmic rays entering the atmosphere, the relative contribution of neutral and charged nucleation mechanisms extends beyond academic interest and links this research directly to the current climate debate.
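
Conceptually, turning the raw counts of such an instrument into an ambient concentration requires the sample flow, the counting time and the size-dependent detection efficiency. The sketch below shows only this bookkeeping; the sigmoidal efficiency curve and all parameters are hypothetical, not calibrations of the thesis's instruments:

```python
import numpy as np

def detection_efficiency(dp_nm, d50_nm=1.5, k=4.0):
    """Hypothetical sigmoidal detection efficiency of a nano-CPC:
    50 % efficiency at d50_nm, steepness set by k."""
    return 1.0 / (1.0 + np.exp(-k * (dp_nm - d50_nm)))

def ambient_concentration(counts, flow_cm3_per_s, t_s, dp_nm):
    """counts / (flow * time) gives the raw concentration (cm^-3);
    dividing by the detection efficiency estimates the true value."""
    raw = counts / (flow_cm3_per_s * t_s)
    return raw / detection_efficiency(dp_nm)

# Hypothetical 1.8 nm particles: 1200 counts in 60 s at 2.5 cm^3/s
print(f"{ambient_concentration(1200, 2.5, 60.0, 1.8):.1f} cm^-3")
```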

Relevance:

100.00%

Publisher:

Abstract:

Floating in the air that surrounds us are numerous small particles, invisible to the human eye. The mixture of air and particles, liquid or solid, is called an aerosol. Aerosols have significant effects on air quality, visibility and health, and on the Earth's climate, their effect on climate being the least understood. They can scatter the incoming radiation from the Sun, or act as seeds onto which cloud droplets form. Aerosol particles are created directly, by human activity or by natural causes such as breaking ocean waves or sandstorms. They can also be created indirectly, when vapors or very small particles emitted into the atmosphere combine to form small particles that later grow to climatically or health-relevant sizes. The mechanisms through which these particles are formed are still under scientific discussion, even though this knowledge is crucial for making air quality and climate predictions, and for understanding how aerosols will influence, and be influenced by, the climate's feedback loops. One of the proposed mechanisms for new particle formation is ion-induced nucleation, based on the idea that newly formed particles are ultimately formed around an electric charge. The number of available charges in the atmosphere varies with the radon concentrations in the soil and in the air, as well as with the ionizing radiation arriving from outer space. In this thesis, ion-induced nucleation is investigated through long-term measurements in two different environments: the background site of Hyytiälä and the urban site of Helsinki. The main conclusion is that ion-induced nucleation generally plays a minor role in new particle formation. The fraction of particles formed this way varies from day to day and from place to place. The relative importance of ion-induced nucleation, i.e. the fraction of particles formed through it, is larger in cleaner areas, where the absolute number of particles formed is smaller. Moreover, ion-induced nucleation contributes a larger fraction of particles on warmer days, when the sulfuric acid and water vapor saturation ratios are lower. This analysis will help in understanding the feedbacks associated with climate change.
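
The "fraction of particles formed through ion-induced nucleation" reduces to a simple ratio of formation rates; the sketch below only makes the bookkeeping explicit, and the rates in the example are invented:

```python
def ion_induced_fraction(j_neg, j_pos, j_total):
    """Fraction of new particles formed via ion-induced nucleation:
    formation rates of negative and positive ion clusters divided by
    the total particle formation rate (all in cm^-3 s^-1)."""
    return (j_neg + j_pos) / j_total

# Hypothetical rates for a clean-air event: ~9 % of particles via ions
print(ion_induced_fraction(j_neg=0.05, j_pos=0.04, j_total=1.0))
```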

Relevance:

100.00%

Publisher:

Abstract:

We examine institutional work from a discursive perspective and argue that reasonability, the existence of acceptable justifying reasons for beliefs and practices, is a key part of legitimation. Drawing on the philosophy of language, we maintain that institutional work takes place in the context of a ‘space of reasons’ determined by widely held assumptions about what is reasonable and what is not. We argue that reasonability provides the main contextual constraint on institutional work, its major outcome, and a key trigger for actors to engage in it. We draw on Hilary Putnam’s concept of the ‘division of linguistic labor’ to highlight the specialized distribution of knowledge and authority in defining valid ways of reasoning. In this view, individuals use institutionalized vocabularies to reason about their choices and understand their context with limited understanding of how and why these structures have become what they are. We highlight the need to understand how professions and other actors establish and maintain the criteria of reasoning in various areas of expertise through discursive institutional work.

Relevance:

100.00%

Publisher:

Abstract:

Aerosol particles have an effect on climate, visibility, air quality and human health. However, the strength with which aerosol particles affect our everyday life is neither well described nor entirely understood. Therefore, investigations of different processes and phenomena are required, including primary particle sources, the initial steps of secondary particle formation and growth, the significance of charged particles in particle formation, and redistribution mechanisms in the atmosphere. In this work, the sources, sinks and concentrations of air ions (charged molecules, clusters and particles) were investigated directly, by measuring the components that ionise air molecules (i.e. radon activity concentrations and external radiation dose rates) and charged particle size distributions, as well as through a literature review. The results give a comprehensive and valuable picture of the spatial and temporal variation of air ion sources, sinks and concentrations for use as input parameters in local- and global-scale climate models. Newly developed air ion spectrometers (Airel Ltd.) made it possible to investigate atmospheric (charged) particle formation and growth at sub-3 nm sizes. To this end, new visual classification schemes for charged particle formation events were developed, and a newly developed particle growth rate method was tested on a dataset spanning more than one year. These data analysis methods have been widely used by other researchers since their introduction. The thesis revealed interesting characteristics of atmospheric particle formation and growth: for example, particle growth may sometimes be suppressed below the detection limit (~3 nm) of traditional aerosol instruments; particle formation may take place during the daytime as well as in the evening; and the growth rates of sub-3 nm particles were quite constant throughout the year, while the growth rates of larger particles (3-20 nm in diameter) were higher in summer than in winter. These observations are thought to be a consequence of the availability of condensing vapours, and they offer new understanding of particle formation in the atmosphere. However, the role of ions in particle formation is not well understood with current knowledge and requires further research.
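
The growth rates discussed above are commonly obtained by tracking the diameter of the growing mode over time and fitting a straight line; a minimal sketch with invented data (not the thesis's method implementation):

```python
import numpy as np

def growth_rate_nm_per_h(hours, mode_diameter_nm):
    """Particle growth rate as the slope of a linear fit of the growing
    mode's diameter against time (nm per hour)."""
    slope, _ = np.polyfit(hours, mode_diameter_nm, 1)
    return slope

# Hypothetical event: the mode grows from 3 nm to 15 nm in six hours
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
dp = np.array([3.0, 5.1, 7.0, 9.2, 11.1, 12.9, 15.0])
print(f"GR ~ {growth_rate_nm_per_h(t, dp):.1f} nm/h")   # ~2 nm/h
```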

Relevance:

100.00%

Publisher:

Abstract:

The eddy covariance (EC) flux measurement technique is based on measuring the turbulent motions of air with accurate and fast instruments. For instance, in order to measure the methane flux, a fast methane gas analyser is needed that samples the methane concentration at least ten times per second, together with a sonic anemometer that measures the three wind components at the same sampling rate. Previously, measuring methane fluxes with the EC technique was almost impossible due to the lack of sufficiently fast gas analysers; however, new instruments developed during the last decade have made methane EC flux measurements more common. The performance of four methane gas analysers suitable for eddy covariance measurements is assessed in this thesis. The assessment and comparison were performed by analysing EC data obtained during summer 2010 (1 April to 26 October) at the Siikaneva fen. The four participating methane gas analysers are the TGA-100A (Campbell Scientific Inc., USA), RMT-200 (Los Gatos Research, USA), G1301-f (Picarro Inc., USA) and Prototype-7700 (LI-COR Biosciences, USA). The RMT-200 functioned most reliably throughout the measurement campaign, and the corresponding methane flux data had the smallest random error. In addition, the methane fluxes calculated from the G1301-f and RMT-200 data agree remarkably well throughout the campaign. The calculated cospectra and power spectra agree well with the corresponding temperature spectra. The Prototype-7700 functioned for only slightly over one month at the beginning of the campaign, and thus its accuracy and long-term performance are difficult to assess.
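
The covariance at the heart of the technique is simple: after removing the means (Reynolds decomposition), the flux is the mean product of the fluctuations of the vertical wind and the gas concentration. A minimal sketch on synthetic data (not the Siikaneva processing chain, which also involves despiking, coordinate rotation and spectral corrections):

```python
import numpy as np

def ec_flux(w, c):
    """Eddy covariance flux F = mean(w' * c'), where the primes are
    fluctuations about the averaging-period means of vertical wind w
    and scalar concentration c."""
    return np.mean((w - np.mean(w)) * (c - np.mean(c)))

# Synthetic 10 Hz data for a 30 min averaging period (18000 samples)
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18000)                     # vertical wind, m/s
c = 1.9 + 0.05 * w + rng.normal(0.0, 0.01, 18000)   # CH4, correlated with w
print(f"flux ~ {ec_flux(w, c):.4f} (concentration units) m/s")
```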

Relevance:

100.00%

Publisher:

Abstract:

A two-dimensional numerical model, which employs the depth-averaged forms of the continuity and momentum equations along with a k-ε turbulence closure scheme, is used to simulate the flow at open channel divisions. The model is generalised to flows of arbitrary geometry, and the MacCormack finite volume method is used for solving the governing equations. An application of the Cartesian version of the model to analyse the flow at a right-angled junction is presented. The numerical predictions are compared with the experimental data of earlier investigators and with measurements made as part of the present study. The performance of the model in predicting the discharge distribution, surface profiles, separation zone parameters and energy losses is evaluated and discussed in detail. To illustrate the application of the numerical model to the flow in acute-angled offtakes and streamlined branch entries, a few computational results are presented.
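
As a hedged illustration of the MacCormack predictor-corrector idea (reduced here to 1D linear advection rather than the full depth-averaged shallow-water system with k-ε closure):

```python
import numpy as np

def maccormack_advection(u, a, dx, dt, nsteps):
    """MacCormack scheme for du/dt + a*du/dx = 0 with periodic
    boundaries: forward difference in the predictor, backward in the
    corrector, averaged to give second-order accuracy."""
    for _ in range(nsteps):
        u_pred = u - a * dt / dx * (np.roll(u, -1) - u)   # predictor
        u = 0.5 * (u + u_pred - a * dt / dx * (u_pred - np.roll(u_pred, 1)))
    return u

x = np.linspace(0.0, 1.0, 101)
u0 = np.exp(-200.0 * (x - 0.3) ** 2)     # Gaussian pulse, advected at a = 1
u = maccormack_advection(u0.copy(), a=1.0, dx=x[1] - x[0], dt=0.005, nsteps=60)
print(f"pulse peak after 60 steps: {u.max():.3f}")
```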

Relevance:

100.00%

Publisher:

Abstract:

In this work, a procedure is presented for the reconstruction of biological organs from image sequences obtained through CT scans. Although commercial software that can accomplish this task is readily available, the procedure presented here needs only free software. The procedure has been applied to reconstruct a liver from scan data available in the literature. 3D biological organs obtained this way can be used for finite element analysis, and this has been demonstrated by carrying out an FE analysis on the reconstructed liver.

Relevance:

100.00%

Publisher:

Abstract:

Electron transfer is an essential activity in biological systems. The migrating electron originates from water-oxygen in photosynthesis and reverts to dioxygen in respiration. In this cycle, two kinds of metal-porphyrin complexes possessing circular conjugated systems and macrocyclic pi-clouds, the chlorophylls and hemes, play a decisive role in mobilising electrons to travel over biological structures as extraneous electrons. Transport of electrons within proteins (as in cytochromes) and within DNA (during oxidative damage and repair) is known to occur. Initial evaluations did not favour the formation of semiconducting pathways of delocalized electrons along the peptide bonds in proteins or the bases in nucleic acids. Direct measurements of the conductivity of the bulk materials and quantum chemical calculations on their polymeric structures also did not support electron transfer in either proteins or nucleic acids. New experimental approaches have revived interest in the process of charge transfer through the DNA duplex. The fluorescence on photoexcitation of a Ru-complex was found to be quenched by a Rh-complex when both were tethered to DNA and intercalated in the base stack. Similar experiments showed that damage to G-bases and repair of T-T dimers in DNA can occur through possible long-range electron transfer through the base stack. The novelty of this phenomenon prompted the apt name 'chemistry at a distance'. Based on experiments with ruthenium-modified proteins, intramolecular electron transfer in proteins is now proposed to use pathways that include C-C sigma-bonds and, surprisingly, hydrogen bonds, which remained out of favour for a long time. In support of this, some experimental evidence is now available showing that hydrogen-bond bridges facilitate the transfer of electrons between metal-porphyrin complexes. Through molecular orbital calculations over 20 years ago, we found that "delocalization of an extraneous electron is pronounced when it enters low-lying virtual orbitals of the electronic structures of peptide units linked by hydrogen bonds". This review focuses on the supramolecular electron transfer pathways that can emerge on interlinking, by hydrogen bonds and metal coordination, some unnoticed structures with pi-clouds in proteins and nucleic acids, potentially useful in catalysis and energy applications.

Relevance:

100.00%

Publisher:

Abstract:

This work presents a methodology to reconstruct 3D biological organs from image sequences or other scan data using readily available free software, with the final goal of using the organs (3D solids) for finite element analysis. The methodology deals with issues such as segmentation, conversion to polygonal surface meshes, and finally conversion of these meshes to 3D solids. The user is able to control the detail, or the level of complexity, of the solid constructed. The methodology is illustrated using the 3D reconstruction of a porcine liver as an example. Finally, the reconstructed liver is imported into the commercial software ANSYS and, together with a cyst inside the liver, a nonlinear analysis is performed. The results confirm that the methodology can be used for obtaining the 3D geometry of biological organs, and that the geometry obtained this way can be used for nonlinear finite element analysis of organs. The methodology would be of use in surgery planning and surgery simulation, since both make extensive use of finite element simulations, which are best carried out on patient-specific organ geometries. In contrast, buying commercial software that can reconstruct 3D biological organs from scanned image sequences would be costly.
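
A minimal sketch of the segmentation-to-surface step in Python; scikit-image and meshio stand in here for whatever free software the paper actually used, the volume is synthetic, and the threshold is hypothetical:

```python
import numpy as np
from skimage import measure
import meshio

# Stand-in for a CT stack: a synthetic 3D intensity volume containing a
# bright ellipsoid (the "organ"); a real run would load the scan slices.
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = 255.0 * (((x / 0.8) ** 2 + (y / 0.5) ** 2 + (z / 0.6) ** 2) < 1.0)

# Segmentation by a (hypothetical) intensity threshold, then extraction
# of a triangulated surface mesh with marching cubes
verts, faces, normals, values = measure.marching_cubes(volume, level=127.0)

# Save the polygonal surface; a 3D solid for FE analysis can then be
# built from it in a free mesher such as Gmsh
meshio.write("organ_surface.stl",
             meshio.Mesh(points=verts, cells=[("triangle", faces)]))
```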

Relevance:

100.00%

Publisher:

Abstract:

Several concepts have been developed in recent years for nanomaterial-based integrated MEMS platforms, in order to accelerate the process of biological sample preparation followed by the selective screening and identification of target molecules. In this context, several challenges need to be addressed in the electrical lysis of biological cells. These concern (i) achieving maximal lysis in low-resource settings; (ii) a high throughput of the target molecules to be detected; (iii) the automated extraction and purification of relevant molecules, such as DNA and protein, from extremely small sample volumes; (iv) the requirement of fast, accurate and yet scalable methods; (v) multifunctionality toward process monitoring; and (vi) downward compatibility with existing diagnostic protocols. This paper reports on the optimization of the electrical lysis process based on various nanocomposite-coated electrodes placed in a microfluidic channel. The nanocomposites are synthesized using different nanomaterials, such as zinc nanorods dispersed in a polymer. The efficiency of electrical lysis with the different electrode coatings has been experimentally verified in terms of DNA concentration, amplification and protein yield. The influence of the coating thickness on the injection current densities has been analyzed, and the current density versus voltage relationship has been correlated experimentally with the extent of bacterial cell lysis. A coupled multiphysics simulation model is used to predict the cell trajectories and lysis efficiencies under the various electrode boundary conditions estimated from the experimental results. Detailed in-situ fluorescence imaging and spectroscopy studies are performed to validate the various hypotheses.
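
One standard back-of-the-envelope model behind electrical lysis (from the general literature, not taken from this paper) is the steady-state Schwan relation: a uniform field E induces a transmembrane potential dV = 1.5*E*r*cos(theta) on a spherical cell of radius r, and irreversible breakdown, hence lysis, is expected once dV exceeds roughly 1 V at the poles:

```python
import numpy as np

def induced_transmembrane_potential(E_V_per_m, radius_m, theta_rad=0.0):
    """Steady-state Schwan relation for a spherical cell in a uniform
    field: dV = 1.5 * E * r * cos(theta), maximal at the poles."""
    return 1.5 * E_V_per_m * radius_m * np.cos(theta_rad)

def min_lysis_field(radius_m, critical_V=1.0):
    """Field strength at which the polar transmembrane potential reaches
    the approximate irreversible-breakdown threshold (~1 V)."""
    return critical_V / (1.5 * radius_m)

# Hypothetical bacterium of 1 um radius: ~6.7e5 V/m needed for lysis
print(f"E_min ~ {min_lysis_field(1.0e-6):.2e} V/m")
```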

Relevance:

100.00%

Publisher:

Abstract:

In this work, a Fortran code is first developed for three-dimensional linear elastostatics using constant boundary elements; the code is based on a MATLAB code developed earlier by the author. Next, the code is parallelized using BLACS, MPI and ScaLAPACK. The parallelized code is then used to demonstrate the usefulness of the Boundary Element Method (BEM) as applied to the real-time computational simulation of biological organs, focusing on the speed and accuracy offered by BEM. A computer cluster is used in this part of the work. The commercial software package ANSYS is used to obtain the 'exact' solution against which the BEM solution is compared; analytical solutions, wherever available, are also used to establish the accuracy of BEM. A pig liver is the biological organ considered. Next, instead of the computer cluster, a Graphics Processing Unit (GPU) is used as the parallel hardware. The results indicate that BEM is an interesting choice for the simulation of biological organs. Although the use of BEM for this purpose is not new, the results presented here are not found elsewhere in the literature. A serial MATLAB code, together with serial and parallel versions of a Fortran code, which can solve three-dimensional (3D) linear elastostatic problems using constant boundary elements, is provided as supplementary files that can be freely downloaded.
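
Structurally, constant-element BEM assembles dense influence matrices H and G that link boundary displacements u and tractions t through H u = G t; after applying boundary conditions this collapses to a dense linear system A x = b, which is exactly the step the paper distributes with ScaLAPACK or moves to a GPU. A toy serial stand-in (not the authors' Fortran/MATLAB code):

```python
import numpy as np

# Dense system of the kind a constant-element BEM assembly produces;
# the entries here are random (diagonally dominated for solvability),
# not actual boundary-integral influence coefficients.
rng = np.random.default_rng(1)
n = 500                                   # number of boundary elements
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)

# Serial LAPACK solve; the parallel code replaces this single call with
# a distributed ScaLAPACK solve (or a GPU linear-algebra library).
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))              # True
```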

Relevance:

100.00%

Publisher:

Abstract:

In this work, the possibility of simulating biological organs in real time using the Boundary Element Method (BEM) is investigated, with specific reference to the speed and accuracy offered by BEM. First, a Graphics Processing Unit (GPU) is used to speed up the BEM computations to achieve real-time performance. Next, instead of the GPU, a computer cluster is used. A pig liver is the biological organ considered. The results indicate that BEM is an interesting choice for the simulation of biological organs. Although the use of BEM for this purpose is not new, the results presented here are not found elsewhere in the literature.