911 results for FIELD ANALYSIS
Abstract:
The neutron skin thickness of nuclei is a sensitive probe of the nuclear symmetry energy and has multiple implications for nuclear and astrophysical studies. However, precision measurements of this observable are difficult to obtain. The analysis of the experimental data may imply some assumptions about the bulk or surface nature of the formation of the neutron skin. Here we study the bulk or surface character of neutron skins of nuclei following from calculations with Gogny, Skyrme, and covariant nuclear mean-field interactions. These interactions are successful in describing nuclear charge radii and binding energies but predict different values for neutron skins. We perform the study by fitting two-parameter Fermi distributions to the calculated self-consistent neutron and proton densities. We note that the equivalent sharp radius is a more suitable reference quantity than the half-density radius parameter of the Fermi distributions to discern between the bulk and surface contributions in neutron skins. We present calculations for nuclei in the stability valley and for the isotopic chains of Sn and Pb.
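The comparison between the half-density radius of a two-parameter Fermi (2pF) distribution and the equivalent sharp radius can be made concrete numerically. The sketch below is a minimal illustration, not code from the paper: the density parameters C and a are hypothetical, Pb-like values. It integrates a 2pF profile and checks the standard leptodermous expansion ⟨r²⟩ ≈ (3/5)C² + (7/5)π²a², from which the equivalent sharp radius R = sqrt((5/3)⟨r²⟩) follows.

```python
import math

def fermi_2p(r, C, a):
    """Two-parameter Fermi (2pF) density profile, unnormalized."""
    return 1.0 / (1.0 + math.exp((r - C) / a))

def mean_square_radius(C, a, n=50_000):
    """<r^2> = int r^4 rho dr / int r^2 rho dr by the trapezoidal rule.
    The step size cancels in the ratio."""
    rmax = C + 25.0 * a            # tail is negligible beyond this
    h = rmax / n
    num = den = 0.0
    for i in range(n + 1):
        r = i * h
        w = 0.5 if i in (0, n) else 1.0
        rho = fermi_2p(r, C, a)
        num += w * r**4 * rho
        den += w * r**2 * rho
    return num / den

# Hypothetical Pb-like parameters in fm (illustrative, not from the paper):
C, a = 6.7, 0.55
msr = mean_square_radius(C, a)
msr_approx = 0.6 * C**2 + 1.4 * math.pi**2 * a**2   # leptodermous expansion
R_sharp = math.sqrt(5.0 / 3.0 * msr)                # equivalent sharp radius
print(f"<r^2> = {msr:.3f} fm^2 (expansion: {msr_approx:.3f}), R = {R_sharp:.3f} fm")
```

Applying the same fit separately to neutron and proton densities and subtracting the resulting radii is one way to decompose a neutron skin into bulk (radius) and surface (diffuseness) contributions.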
Abstract:
Quartz Tuning Fork (QTF)-based Scanning Probe Microscopy (SPM) is an important field of research. A suitable model of the QTF is important to obtain quantitative measurements with these devices. Analytical models have the limitation of being based on the double cantilever configuration. In this paper, we present an electromechanical finite element model of the QTF electrically excited with two free prongs. The model goes beyond the state of the art of numerical simulations currently found in the literature for this QTF configuration. We present the first numerical analysis of both the electrical and the mechanical behavior of QTF devices. Experimental measurements obtained with 10 units of the same QTF model validate the finite element model with good agreement.
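The analytical cantilever picture that the abstract mentions can be sketched as follows. Each QTF prong is treated as a clamped rectangular Euler-Bernoulli beam; the material constants are approximate quartz values and the prong dimensions are assumed for illustration, not taken from the paper.

```python
import math

# Fundamental flexural frequency of one QTF prong modeled as a clamped-free
# rectangular cantilever: f1 = (lambda1^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A)).
# Material constants and dimensions below are illustrative assumptions.
E = 7.87e10      # Young's modulus of quartz, Pa (approximate)
rho = 2650.0     # density of quartz, kg/m^3 (approximate)
L = 4.0e-3       # prong length, m (assumed)
w = 0.3e-3       # prong width, m (assumed)
t = 0.6e-3       # prong thickness in the vibration direction, m (assumed)

A = w * t                 # cross-sectional area
I = w * t**3 / 12.0       # second moment of area about the bending axis
lam1 = 1.8751             # first root of the clamped-free frequency equation

f1 = (lam1**2 / (2.0 * math.pi * L**2)) * math.sqrt(E * I / (rho * A))
print(f"estimated fundamental frequency: {f1 / 1000:.1f} kHz")
```

With these assumed dimensions the estimate lands near the 32.768 kHz nominal frequency of commercial tuning forks; the finite element model in the paper is needed precisely because this single-beam approximation ignores the coupled two-prong, electromechanical behavior.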
Abstract:
With the aim of monitoring the dynamics of the Livingston Island ice cap, the Departament de Geodinàmica i Geofísica of the Universitat de Barcelona began yearly surveys in the austral summer of 1994-95 on Johnsons Glacier. During this field campaign 10 shallow ice cores were sampled with a manual vertical ice-core drilling machine. The objectives were: i) to detect the tephra layer accumulated on the glacier surface, attributed to the 1970 Deception Island pyroclastic eruption, today interstratified; ii) to verify whether this layer might serve as a reference level; iii) to measure the 137Cs radio-isotope concentration accumulated in the 1965 snow stratum; iv) to use the isochrone layer as a means of verifying the age of the 1970 tephra layer; and, v) to calculate both the equilibrium line of the glacier and the average mass balance over the last 28 years (1965-1993). The stratigraphy of the cores, their cumulative density curves and the isothermal ice temperatures recorded confirm that Johnsons Glacier is a temperate glacier. Wind, solar radiation heating and liquid water are the main agents controlling the vertical and horizontal redistribution of the volcanic and cryoclastic particles that are sedimented and remain interstratified within the glacier. It is because of this redistribution that the 1970 tephra layer does not always serve as a very good reference level. The position of the equilibrium line altitude (ELA) in 1993, obtained by the 137Cs spectrometric analysis, varies from about 200 m a.s.l. to 250 m a.s.l. This indicates a rising trend in the equilibrium line altitude from the beginning of the 1970s to the present day. The varying slope orientation of Johnsons Glacier relative to the prevailing NE wind gives rise to large local differences in snow accumulation, which locally modifies the equilibrium line altitude. In the cores studied, 137Cs appears to be associated with the 1970 tephra layer.
This indicates an intense ablation episode throughout the sampled area (at least up to 330 m a.s.l.), which probably occurred synchronously with the 1970 tephra deposition or later. A rough estimate of the specific mass balance reveals a considerable accumulation gradient with altitude.
Abstract:
Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to human biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), each followed by a web conference, were organized between March 2011 and February 2012. The ICI/EQUAS exercises used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS proved to be a new and effective tool for improving analytical performance and building capacity. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury.
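The interlaboratory comparability figure quoted above is the relative standard deviation (coefficient of variation). A minimal sketch of the calculation, using made-up laboratory results rather than data from the ICI/EQUAS exercises:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (coefficient of variation), in percent,
    using the sample standard deviation."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical results (ug/g) reported by five labs for one hair sample;
# these numbers are illustrative only.
lab_results = [0.50, 0.52, 0.48, 0.55, 0.51]
print(f"RSD = {rsd_percent(lab_results):.2f}%")
```

An RSD in the single digits across laboratories, as in the high-concentration range reported above, indicates good interlaboratory agreement.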
Abstract:
The number of qualitative research methods has grown substantially over the last twenty years, both in the social sciences and, more recently, in the health sciences. This growth came with questions about the quality criteria needed to evaluate such work, and numerous guidelines were published. These guidelines, however, contain many discrepancies, both in their vocabulary and in their construction. Many expert evaluators decry the absence of consensual and reliable evaluation tools. The authors present the results of an evaluation of 58 existing guidelines in 4 major health science fields (medicine and epidemiology; nursing and health education; social sciences and public health; psychology/psychiatry, research methods and organization) by expert users (article reviewers, experts allocating funds, editors, etc.). The results propose a toolbox containing 12 consensual criteria with the definitions given by expert users. They also indicate in which disciplinary field each type of criterion is considered more or less essential. Nevertheless, the authors highlight the limits of the criteria's comparability as soon as one focuses on their specific definitions. They conclude that each criterion in the toolbox must be made explicit in order to reach a broader consensus and to identify definitions that are consensual across all the fields examined and easily operational.
Abstract:
A method to evaluate the physical realizability of an arbitrary three-dimensional vectorial field distribution in the focal area is proposed. A parameter that measures the similarity between the designed (target) field and the physically achievable beam is provided. This analysis is carried out within the framework of the closest electromagnetic field to a given vectorial function, and the procedure is applied to two illustrative cases.
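The paper's exact similarity parameter is not reproduced here, but a natural candidate for measuring the closeness of a target field to an achievable one is a normalized overlap integral. The sketch below is a generic illustration under that assumption, with toy sampled fields.

```python
def similarity(target, achievable):
    """Normalized overlap |<Et, Ea>|^2 / (||Et||^2 * ||Ea||^2) between two
    sampled complex vector fields, flattened to sequences of complex numbers.
    By the Cauchy-Schwarz inequality the result lies in [0, 1], and it
    equals 1 exactly when the fields are proportional."""
    inner = sum(t.conjugate() * a for t, a in zip(target, achievable))
    norm_t = sum(abs(t) ** 2 for t in target)
    norm_a = sum(abs(a) ** 2 for a in achievable)
    return abs(inner) ** 2 / (norm_t * norm_a)

# Toy sampled field values (hypothetical):
Et = [1 + 0j, 0 + 1j, 0.5 + 0j]
Ea = [2 + 0j, 0 + 2j, 1.0 + 0j]   # proportional to Et -> similarity 1
Eb = [0 + 1j, 1 + 0j, 0 + 0j]     # a different field
print(similarity(Et, Ea), similarity(Et, Eb))
```

In the context of the paper, `achievable` would be the closest physically realizable electromagnetic field to the designed target distribution.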
Abstract:
Defining the digital humanities might be an endless debate if we stick to discussing the boundaries of this concept as an academic "discipline". In an attempt to identify this field and its actors concretely, this paper shows that it is possible to analyse them through Twitter, a social media platform widely used by this "community of practice". Based on a network analysis of 2,500 users identified as members of this movement, the visualisation of the "who's following whom?" graph allows us to highlight the structure of the network's relationships and to identify users who occupy particular positions. Specifically, we show that linguistic groups are key factors in explaining clustering within a network whose characteristics resemble those of a small world.
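One of the small-world characteristics alluded to is a high clustering coefficient. A minimal sketch of the metric on a toy follower graph (simplified to an undirected graph; the node names are made up, and a real analysis would work on the directed "who follows whom" relation):

```python
from itertools import combinations

def avg_clustering(adj):
    """Average local clustering coefficient of an undirected graph given as
    {node: set_of_neighbours}. Nodes of degree < 2 contribute 0
    (one common convention)."""
    total = 0.0
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # count edges among the node's neighbours
        links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

# Toy follower graph (hypothetical users):
graph = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B"},
    "D": {"A"},
}
print(avg_clustering(graph))  # (1/3 + 1 + 1 + 0) / 4
```

A small-world network combines clustering well above that of a random graph of the same size with short average path lengths between users.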
Abstract:
Water withdrawal from Mediterranean reservoirs in summer is usually very high. Because of this, stratification is often continuous and far from the typical two-layered structure, favoring the excitation of higher vertical modes. The analysis of wind, temperature, and current data from Sau reservoir (Spain) shows that the third vertical mode of the internal seiche (baroclinic mode) dominated the internal wave field at the beginning of September 2003. We used a continuous-stratification two-dimensional model to calculate the period and velocity distribution of the various modes of the internal seiche, and we found that the period of the third vertical mode is ~24 h, which coincides with the period of the dominant winds. As a result of the resonance between the third mode and the wind, the other oscillation modes were not excited during this period.
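As a point of reference for the seiche periods discussed above, the classical two-layer formula gives the period of the first vertical mode. The sketch below uses invented basin and stratification parameters, not Sau's actual geometry; it only illustrates the scaling, since the higher vertical modes studied in the paper require a continuous-stratification model like the one the authors used.

```python
import math

def internal_seiche_period(L, h1, h2, delta_rho, rho0=1000.0, g=9.81, n=1):
    """Period (s) of the n-th horizontal mode of a two-layer internal seiche:
    T = 2L / (n * c), with internal wave speed c = sqrt(g' * h1*h2/(h1+h2))
    and reduced gravity g' = g * delta_rho / rho0."""
    g_prime = g * delta_rho / rho0
    c = math.sqrt(g_prime * h1 * h2 / (h1 + h2))
    return 2.0 * L / (n * c)

# Illustrative parameters (NOT measured values for Sau reservoir):
# basin length 3 km, 5 m upper layer, 20 m lower layer, 2 kg/m^3 density step.
T = internal_seiche_period(L=3000.0, h1=5.0, h2=20.0, delta_rho=2.0)
print(f"two-layer V1H1 period: {T / 3600:.1f} h")
```

The two-layer period depends on the layer depths and the density contrast, which is why summer withdrawal, by reshaping the stratification, can shift the resonant mode toward the ~24 h wind forcing.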
Abstract:
For years, a literature on the uses that political parties make of information and communication technologies (ICTs) has been developing. It is a rapidly growing, rich, and interesting field at the forefront of research in political science. Generally, these works start from the expectation that ICTs have a regenerative potential for liberal democracies and for political parties as well. In developed societies, political parties have undergone transformations that have led them to a growing divorce from the public. This divorce is shown by the decline of party affiliation and membership, and also by the decline of conventional political participation. In the theoretical discussion this situation has been described as "the crisis of democracy" (Norris, 1999). According to the more radically oriented scholars, this crisis reflects the incapacities of liberal democracies; in this sense, ICTs represent a great opportunity to bypass representative institutions and to institutionalize new forms of direct democracy. More moderate scholars have considered that ICTs offer the opportunity of a "renaissance" for representative institutions, as they can reinforce the bonds between the public and its representatives.
Abstract:
This review has collected and correlated the various equations for the g matrix of strong-field d5 systems obtained from different basis sets using full electron and hole formalism calculations. It has corrected mistakes found in the literature and shown how the failure to properly take into account symmetry boundary conditions has produced a variety of apparently inconsistent equations in the literature. The review has reexamined the problem of spin-orbit interaction with excited t4e states and finds that earlier reports of it being zero in octahedral symmetry are not correct. It has shown how redefining the x, y, and z axes of the principal coordinate system simplifies the analysis of experimental g values with the equations, compared with previous methods.
Abstract:
Coating and filler pigments have a strong influence on the properties of paper. Filler content can exceed 30%, and the pigment content of a coating is about 85-95% by weight. The physical and chemical properties of pigments differ, and knowledge of these properties is important for optimising the optical and printing properties of paper. The size and shape of pigment particles can be measured by different analysers, which can be based on sedimentation, laser diffraction, changes in an electric field, etc. This master's thesis investigated particle properties, in particular by scanning electron microscopy (SEM) and image analysis programs. The research included nine pigments of different particle size and shape. The pigments were analysed by two image analysis programs (INCA Feature and Poikki), a Coulter LS230 (laser diffraction) and a SediGraph 5100 (sedimentation). The results were compared in order to assess the effect of particle shape on the performance of the analysers. Only the image analysis programs gave parameters describing particle shape. One part of the research was also sample preparation for SEM: in an ideal sample, individual particles should be separated and distinct. The analysing methods gave different results, but the results from the image analysis programs corresponded to either sedimentation or laser diffraction, depending on the particle shape. Detailed analysis of particle shape required high magnification in SEM, but the measured parameters described the shape of the particles very well. Large particles (ecd ~1 µm) could also be used in 3D modelling, which enabled measurement of particle thickness. The scanning electron microscope and image analysis programs were effective and versatile tools for particle analysis. Further development and experience will determine the usability of the method in routine use.
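The "ecd" quoted above is the equivalent circle diameter, the standard image-analysis size measure: the diameter of a circle with the same projected area as the particle. A minimal sketch, with hypothetical projected areas:

```python
import math

def ecd(area):
    """Equivalent circle diameter: diameter of the circle whose area equals
    the particle's projected area, ecd = 2 * sqrt(A / pi)."""
    return 2.0 * math.sqrt(area / math.pi)

# Hypothetical projected areas (um^2) measured from SEM images:
areas = [0.196, 0.785, 3.14]
for A in areas:
    print(f"A = {A:5.3f} um^2 -> ecd = {ecd(A):.2f} um")
```

Because ecd collapses any outline to a circle, two particles of equal area but very different shape (e.g. a platelet and a sphere) share the same ecd, which is why the shape parameters from the image analysis programs carry information the sizing instruments cannot provide.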
Abstract:
This article focuses on the analysis of the regulatory framework of citizen participation in local government, which organises direct and participatory democracy at the local level, and identifies the laws and mechanisms through which the constitutional requirements for participation are fulfilled. Municipalities, the authorities closest to citizens, are the best level of government for this purpose, since they directly involve civil society in the decision-making process, testing the scope and appropriateness of the instruments through which participation is channelled.
Abstract:
This paper proposes a calibration method which can be utilized for the analysis of SEM images. The field of application of the developed method is the calculation of the surface potential distribution of a biased silicon edgeless detector. The suggested processing of the data collected by SEM consists of several stages and takes into account different aspects affecting the SEM image. The calibration method does not claim to be precise, but it captures the basic features of the potential distribution when different bias voltages are applied to the detector.