37 results for OPEN CLUSTERS


Relevance: 20.00%

Abstract:

A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even cosmology. This is because in most phase transitions the new phase is separated from the mother phase by a free energy barrier, which is crossed in a process called nucleation. A significant fraction of all atmospheric particles is nowadays thought to be produced by vapour-to-liquid nucleation. In atmospheric sciences, as in other fields, the theoretical treatment of nucleation is mostly based on Classical Nucleation Theory, which, however, has only limited success in predicting the rate at which vapour-to-liquid nucleation takes place under given conditions. This thesis studies unary homogeneous vapour-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapour and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied by Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain from a few up to some tens of molecules, depending on the interaction potential and temperature. However, the error made in modelling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law.
By calculating correction factors to Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, in other words, in the validity range of the non-interacting cluster theory of Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction, and they indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapour density, the size dependence of the surface tension, and the planar surface tension directly from cluster simulations, and we show that the size dependence of the cluster surface tension at the equimolar surface is a function of virial coefficients, a result confirmed by our cluster simulations.
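The liquid drop model underlying Classical Nucleation Theory can be written down in a few lines. The sketch below evaluates the CNT work of formation, W(n) = -n kT ln S + theta n^(2/3), and the critical cluster size at the top of the barrier; the temperature, supersaturation, surface tension, and molecular volume are hypothetical water-like values chosen only to illustrate the shape of the barrier, not any system or result from the thesis.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def cnt_barrier(n, temperature, supersaturation, sigma, v_liquid):
    """CNT work of formation W(n), in units of kT, for an n-molecule cluster.

    theta = (36*pi)^(1/3) * v_liquid^(2/3) * sigma lumps together the
    surface tension sigma and the liquid molecular volume v_liquid.
    """
    theta = (36.0 * math.pi) ** (1.0 / 3.0) * v_liquid ** (2.0 / 3.0) * sigma
    kt = K_B * temperature
    return -n * math.log(supersaturation) + theta * n ** (2.0 / 3.0) / kt

def critical_size(temperature, supersaturation, sigma, v_liquid):
    """Cluster size n* at the top of the barrier, from dW/dn = 0."""
    theta = (36.0 * math.pi) ** (1.0 / 3.0) * v_liquid ** (2.0 / 3.0) * sigma
    kt = K_B * temperature
    return (2.0 * theta / (3.0 * kt * math.log(supersaturation))) ** 3

# Hypothetical water-like parameters: T = 280 K, S = 5,
# sigma = 0.072 N/m, v_liquid = 3e-29 m^3 per molecule.
n_star = critical_size(280.0, 5.0, 0.072, 3.0e-29)  # a few tens of molecules
```

Within this model the bulk term grows linearly in n while the surface term grows only as n^(2/3), which is exactly why a barrier with a single maximum at n* appears.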

Relevance: 20.00%

Abstract:

Atmospheric aerosol particles have a strong impact on the global climate. A deep understanding of the physical and chemical processes affecting the atmospheric aerosol-climate system is crucial for describing those processes properly in global climate models. Besides their climatic effects, aerosol particles can deteriorate, for example, visibility and human health. Nucleation is a fundamental step in atmospheric new particle formation, but details of the atmospheric nucleation mechanisms have remained unresolved, mainly because no instruments have been capable of measuring neutral newly formed particles below 3 nm in diameter. This thesis aims to extend the detectable particle size range towards the close-to-molecular sizes (~1 nm) of freshly nucleated clusters, and to obtain, by direct measurement, the concentrations of sub-3 nm particles in the atmosphere and in well-defined laboratory conditions. In the work presented in this thesis, new methods and instruments for sub-3 nm particle detection were developed and tested. The selected approach comprises four condensation-based techniques and one electrical detection scheme. All of them can detect particles with diameters well below 3 nm, some even down to ~1 nm. The developed techniques and instruments were deployed in field measurements as well as in laboratory nucleation experiments. Ambient air studies showed that a persistent population of 1-2 nm particles or clusters exists in the boreal forest environment. The observation was made with four different instruments, showing a consistent capability for the direct measurement of atmospheric nucleation. The laboratory experiments showed that sulphuric acid is a key species in atmospheric nucleation, and explained the mismatch between earlier laboratory data and ambient observations of the dependence of the nucleation rate on the sulphuric acid concentration.
The mismatch was shown to result from the inefficient growth of the nucleated clusters and from the insufficient detection efficiency of the particle counters used in the previous experiments. Even though the exact molecular steps of nucleation remain an open question, the instrumental techniques developed in this work, and their application in laboratory and ambient studies, opened a new view into atmospheric nucleation and prepared the way for investigating nucleation processes with more suitable tools.
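The dependence of the nucleation rate on the sulphuric acid concentration discussed above is commonly summarised as a power law, J = k·[H2SO4]^n, with the exponent extracted by a least-squares fit in log-log space. A minimal sketch of that fitting step follows; the data points are made up, and their exponent of 2 is by construction, not a measured value from this work.

```python
import math

def fit_power_law(concentrations, rates):
    """Return (k, n) for J = k * C**n via linear regression on logarithms."""
    xs = [math.log(c) for c in concentrations]
    ys = [math.log(j) for j in rates]
    m = len(xs)
    mean_x = sum(xs) / m
    mean_y = sum(ys) / m
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return math.exp(intercept), slope

# Hypothetical concentrations (molecules/cm^3) and rates (particles/cm^3/s);
# the rates follow an exact n = 2 power law by construction.
conc = [1e6, 3e6, 1e7, 3e7]
rate = [1e-2 * (c / 1e6) ** 2 for c in conc]
k, n = fit_power_law(conc, rate)
```

In real data the fitted exponent, and how it compares between chamber experiments and ambient observations, is precisely the kind of quantity whose earlier mismatch the thesis set out to explain.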

Relevance: 20.00%

Abstract:

Earlier work has suggested that large-scale dynamos can reach and maintain equipartition field strengths on a dynamical time scale only if the magnetic helicity of the fluctuating field can be shed from the domain through open boundaries. We test this scenario in convection-driven dynamos by comparing results for open and closed boundary conditions. Three-dimensional numerical simulations of turbulent compressible convection with shear and rotation are used to study the effects of boundary conditions on the excitation and saturation level of large-scale dynamos. Open (vertical-field) and closed (perfect-conductor) boundary conditions are used for the magnetic field. The contours of shear are vertical and cross the outer surface, and are thus ideally suited for driving a shear-induced magnetic helicity flux. We find that, for a given shear and rotation rate, the growth rate of the magnetic field is larger when open boundary conditions are used. The growth rate first increases with the magnetic Reynolds number, Rm, but then levels off at an approximately constant value for intermediate values of Rm. For large enough Rm, a small-scale dynamo is excited, and the growth rate in this regime increases proportionally to Rm^(1/2). In the nonlinear regime, the saturation level of the energy of the mean magnetic field is independent of Rm when open boundaries are used. With perfect-conductor boundaries, the saturation level first increases as a function of Rm, but then decreases proportionally to Rm^(-1) for Rm > 30, indicative of catastrophic quenching. These results suggest that the shear-induced magnetic helicity flux is efficient in alleviating catastrophic quenching when open boundaries are used. The horizontally averaged mean field still decreases weakly as a function of Rm even for open boundaries.
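The two saturation behaviours contrasted above can be captured in a toy function (this is an illustration of the reported scalings, not the simulation code): with open boundaries the mean-field energy saturates at a level independent of Rm, while with perfect-conductor boundaries it falls off as Rm^(-1) beyond Rm ~ 30. The normalisation E0 is arbitrary.

```python
E0 = 1.0        # arbitrary saturation energy scale
RM_QUENCH = 30  # onset of the Rm^(-1) decline for closed boundaries

def saturation_energy(rm, boundaries):
    """Schematic mean magnetic field energy at saturation versus Rm."""
    if boundaries == "open":
        # open (vertical-field) boundaries: level independent of Rm
        return E0
    # closed (perfect-conductor) boundaries: flat up to RM_QUENCH,
    # then catastrophic quenching, E ~ Rm^(-1)
    return E0 if rm <= RM_QUENCH else E0 * RM_QUENCH / rm
```

Doubling Rm in the quenched regime halves the closed-boundary saturation level while leaving the open-boundary level unchanged, which is the signature distinguishing the two cases.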

Relevance: 20.00%

Abstract:

The open access (OA) model for journals is compared to the open source principle for computer software. Since the early 1990s, nearly 1,000 OA scientific journals have emerged, mostly as voluntary community efforts, although recently some professionally operating publishers have used author charges or institutional membership. This study of OA journals without author charges shows that their impact is still relatively small, but awareness of them is increasing. The average number of research articles per year is lower than for major scientific journals, but the publication times are shorter.

Relevance: 20.00%

Abstract:

Background: The Internet has recently made possible the free global availability of scientific journal articles. Open Access (OA) can occur either via OA scientific journals or via authors posting manuscripts of articles published in subscription journals in open web repositories. So far there have been few systematic studies showing the extent of OA, in particular studies covering all fields of science. Methodology/Principal Findings: The proportion of peer-reviewed scholarly journal articles available openly in full text on the web was studied using a random sample of 1837 titles and a web search engine. Of articles published in 2008, 8.5% were freely available at the publishers' sites. For an additional 11.9%, free manuscript versions could be found using search engines, making the overall OA percentage 20.4%. Chemistry (13%) had the lowest overall share of OA and Earth Sciences (33%) the highest. In medicine, biochemistry, and chemistry, publishing in OA journals was more common; in all other fields, author-posted manuscript copies dominated the picture. Conclusions/Significance: The results show that OA already has a significant positive impact on the availability of the scientific journal literature and that there are big differences between scientific disciplines in its uptake. Given the lack of awareness of OA publishing among scientists in most fields outside physics, the results should be of general interest to all scholars. They should also interest academic publishers, who need to take OA into account in their business strategies and copyright policies, as well as research funders, who, like the NIH, are starting to require OA availability of results from research projects they fund. The method and search tools developed also offer a good basis for more in-depth as well as longitudinal studies.
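Because such estimates come from a random sample, each reported share is a binomial proportion and carries a confidence interval that follows directly from the sample size. The sketch below shows the arithmetic with a Wilson score interval; the counts are hypothetical, back-calculated from the quoted shares (8.5% at publishers' sites, 11.9% self-archived, 20.4% overall) and the sample size of 1837, not taken from the study's data tables.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

n = 1837                  # sample size used in the study
gold = round(0.085 * n)   # hypothetical count found at publishers' sites
green = round(0.119 * n)  # hypothetical count of self-archived manuscripts
lo, hi = wilson_interval(gold + green, n)
```

With a sample of this size the overall OA share of about 20% is pinned down to within roughly two percentage points either way, which is tight enough to support the between-discipline comparisons the abstract draws.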

Relevance: 20.00%

Abstract:

One of the effects of the Internet is that the dissemination of scientific publications has, in a few years, migrated to electronic formats, while the basic business practices between libraries and publishers for selling and buying content have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.

Relevance: 20.00%

Abstract:

The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
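The gradual scenario sketched above, bundling "big deal" licenses with author charges for opening individual articles, can be expressed as a simple revenue identity: the publisher's total income from the bundle is held constant while the open share of articles grows, so the subscription component shrinks exactly as the author-charge component takes over. The figures below are invented for illustration and are not from the paper.

```python
TOTAL_REVENUE = 1_000_000  # hypothetical yearly income from the bundle
ARTICLES = 500             # hypothetical articles published per year
APC = 2_000                # hypothetical per-article opening charge

def bundle_split(open_share):
    """Return (author_charge_income, subscription_income) for a given
    fraction of articles opened, keeping total revenue unchanged."""
    apc_income = open_share * ARTICLES * APC
    return apc_income, TOTAL_REVENUE - apc_income
```

In this toy model the subscription part vanishes at a 100% open share exactly when APC times the article count equals the old total revenue, which is the condition under which the transition is revenue-neutral for the publisher.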

Relevance: 20.00%

Abstract:

Introduction: This case study is based on the experiences with the Electronic Journal of Information Technology in Construction (ITcon), founded in 1995.

Development: This journal is an example of a particular category of open access journals, which use neither author charges nor subscriptions to finance their operations, but rely largely on unpaid voluntary work in the spirit of the open source movement. The journal has, after some initial struggle, survived its first decade and is now established as one of half-a-dozen peer-reviewed journals in its field.

Operations: The journal publishes articles as they become ready, but creates virtual issues through alerting messages to "subscribers". It has also started to publish special issues, since this helps in attracting submissions and in sharing the workload of review management. From the start the journal adopted a rather traditional layout for its articles. After the first few years the HTML version was dropped, and papers are now published only in PDF format.

Performance: The journal has recently been benchmarked against the competing journals in its field. Its acceptance rate of 53% is slightly higher, and its average turnaround time of seven months is almost a year faster, than for those journals in the sample for which data could be obtained. The server log files for the past three years have also been studied.

Conclusions: Our overall experience demonstrates that it is possible to publish this type of OA journal, with a yearly publishing volume equal to that of a quarterly journal and involving the processing of some fifty submissions a year, using a networked volunteer-based organization.
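The volume figures reported above are mutually consistent, as a quick arithmetic check shows: roughly fifty submissions a year at a 53% acceptance rate yields the annual output of a quarterly journal. The issue size implied below (about six to seven articles) is derived from those two numbers, not stated in the study.

```python
SUBMISSIONS_PER_YEAR = 50   # "some fifty submissions a year"
ACCEPTANCE_RATE = 0.53      # reported acceptance rate

published = SUBMISSIONS_PER_YEAR * ACCEPTANCE_RATE  # articles per year
per_quarterly_issue = published / 4                 # implied issue size
```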

Relevance: 20.00%

Abstract:

Introduction: We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide, as well as the share of these articles available openly on the web, either directly or as copies in e-print repositories.

Method: We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory), supplemented by sampling and Google searches.

Analysis: A central finding is that ISI-indexed journals publish far more articles per year (111) than non-ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies, which counted the copies in identified repositories: we start from a random sample of articles and then test whether copies can be found by a web search engine.

Results: We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number, 4.6% became immediately openly available, and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors.

Conclusions: We believe our results are the most reliable so far published and should therefore be useful in the ongoing debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.
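The shares quoted above translate directly into absolute article counts, which is worth spelling out since the three routes to openness (immediate gold, delayed gold, and green copies) add up to the overall OA share. The total of approximately 1,350,000 articles for 2006 is the study's own figure; the category counts below simply apply the quoted percentages to it.

```python
TOTAL_ARTICLES_2006 = 1_350_000  # study's estimate of world output in 2006

gold_immediate = 0.046 * TOTAL_ARTICLES_2006  # openly available at once
gold_delayed = 0.035 * TOTAL_ARTICLES_2006    # open after ~1-year embargo
green_copies = 0.113 * TOTAL_ARTICLES_2006    # repository/home-page copies

openly_available = gold_immediate + gold_delayed + green_copies
open_share = openly_available / TOTAL_ARTICLES_2006  # 19.4% overall
```

That is, roughly a quarter of a million articles from 2006 could be read freely in some form, with author self-archiving accounting for the largest single portion.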