982 results for Fast methods


Relevance:

20.00%

Publisher:

Abstract:

By detecting leading protons produced in the Central Exclusive Diffractive process, p+p → p+X+p, one can measure the missing mass and scan for possible new particle states such as the Higgs boson. This process augments, in a model-independent way, the standard methods for new particle searches at the Large Hadron Collider (LHC) and will allow detailed analyses of the produced central system, such as the spin-parity properties of the Higgs boson. The exclusive central diffractive process makes precision studies of gluons possible at the LHC and complements the physics scenarios foreseen at the next e+e− linear collider. This thesis first presents the conclusions of the first systematic analysis of the expected precision of the leading proton momentum measurement and the accuracy of the reconstructed missing mass. In this initial analysis, the scattered protons are tracked along the LHC beam line, and the uncertainties expected in beam transport and in the detection of the scattered leading protons are accounted for. The main focus of the thesis is on developing the radiation-hard precision detector technology necessary for coping with the extremely demanding experimental environment of the LHC. This will be achieved by using a 3D silicon detector design, which, in addition to radiation hardness up to 5×10^15 neutrons/cm², offers properties such as a high signal-to-noise ratio, a fast signal response to radiation and sensitivity close to the very edge of the detector. This work reports on the development of a novel semi-3D detector design that simplifies the 3D fabrication process but preserves the properties of the 3D detector design required at the LHC and in other imaging applications.
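For orientation, the measurement principle can be summarised by the standard missing-mass relation of central exclusive production (a textbook formula, not a result of the thesis):

```latex
% Missing mass of the centrally produced system X, expressed through the
% four-momenta of the incoming protons (p_1, p_2) and of the scattered
% leading protons (p_1', p_2'):
M_X^2 = \left(p_1 + p_2 - p_1' - p_2'\right)^2 \approx \xi_1 \, \xi_2 \, s
```

where ξ1 and ξ2 are the fractional momentum losses of the two leading protons and s is the squared centre-of-mass energy; measuring the leading protons therefore determines M_X without reconstructing the central system itself.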

Relevance:

20.00%

Publisher:

Abstract:

Visual content is a critical component of everyday social media: on platforms explicitly framed around the visual (Instagram and Vine), on those offering a mix of text and images in myriad forms (Facebook, Twitter, and Tumblr), and in apps and profiles where visual presentation and the provision of information are important considerations. However, although the visual is prominent in forms such as selfies, looping media, infographics, memes, online videos, and more, sociocultural research into the visual as a central component of online communication has lagged behind the analysis of popular, predominantly text-driven social media. This paper underlines the increasing importance of visual elements to digital, social, and mobile media within everyday life, addressing the significant research gap in methods for tracking, analysing, and understanding visual social media as both image-based and intertextual content. In this paper, we build on our previous methodological considerations of Instagram in isolation to examine further questions, challenges, and benefits of studying visual social media more broadly, including methodological and ethical considerations. Our discussion is intended as a rallying cry and provocation for further research into visual (and textual and mixed) social media content, practices, and cultures, mindful of the specificities of each form but also, importantly, of the ongoing dialogues and interrelations between them as communication forms.

Relevance:

20.00%

Publisher:

Abstract:

Modern-day weather forecasting is highly dependent on Numerical Weather Prediction (NWP) models as the main data source. The evolving state of the atmosphere can be numerically predicted by solving a set of hydrodynamic equations, provided the initial state is known. However, such a modelling approach always contains approximations that by and large depend on the purpose of use and the resolution of the models. Present-day NWP systems operate with horizontal model resolutions in the range from about 40 km to 10 km. Recently, the aim has been to reach scales of 1 to 4 km operationally. This requires fewer approximations in the model equations, more complex treatment of physical processes and, in turn, more computing power. This thesis concentrates on the physical parameterization methods used in high-resolution NWP models. The main emphasis is on the validation of the grid-size-dependent convection parameterization in the High Resolution Limited Area Model (HIRLAM) and on a comprehensive intercomparison of radiative-flux parameterizations. In addition, the problems related to wind prediction near the coastline are addressed with high-resolution meso-scale models. The grid-size-dependent convection parameterization is clearly beneficial for NWP models operating with a dense grid. Results show that the current convection scheme in HIRLAM is still applicable down to a 5.6 km grid size. However, with further improved model resolution, the tendency of the model to overestimate strong precipitation intensities increases in all the experiment runs. For the clear-sky longwave radiation parameterization, the schemes used in NWP models provide much better results than simple empirical schemes. On the other hand, for the shortwave part of the spectrum the empirical schemes are more competitive, producing fairly accurate surface fluxes. Overall, even the complex radiation parameterization schemes used in NWP models seem to be slightly too transparent for both long- and shortwave radiation in clear-sky conditions. For cloudy conditions, simple cloud correction functions are tested. In the case of longwave radiation, the empirical cloud correction methods provide rather accurate results, whereas for shortwave radiation the benefit is only marginal. Idealised high-resolution two-dimensional meso-scale model experiments suggest that the observed formation of the afternoon low-level jet (LLJ) over the Gulf of Finland is caused by an inertial oscillation mechanism when the large-scale flow is from the south-east or the west. The LLJ is further enhanced by the sea-breeze circulation. A three-dimensional HIRLAM experiment with a 7.7 km grid size is able to generate an LLJ flow structure similar to that suggested by the 2D experiments and observations. It is also pointed out that improved model resolution does not necessarily lead to better wind forecasts in the statistical sense. In nested systems, the quality of the large-scale host model is of great importance, especially if the inner meso-scale model domain is small.
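The inertial-oscillation mechanism invoked for the Gulf of Finland LLJ can be illustrated with a minimal, idealised sketch (textbook dynamics with purely illustrative numbers, not the HIRLAM or 2D model configuration used in the thesis): an initial ageostrophic wind perturbation rotates at the inertial frequency and periodically adds to the geostrophic wind, producing a wind-speed maximum.

```python
import numpy as np

# Idealised inertial oscillation of the ageostrophic wind (u_a, v_a):
#   du_a/dt =  f * v_a
#   dv_a/dt = -f * u_a
# The ageostrophic vector rotates clockwise (Northern Hemisphere) with
# period 2*pi/f, periodically reinforcing the geostrophic wind (a jet).

lat = 60.0                                   # latitude of the Gulf of Finland, deg (assumed)
f = 2 * 7.292e-5 * np.sin(np.radians(lat))   # Coriolis parameter, 1/s
ug, vg = 8.0, 0.0                            # geostrophic wind, m/s (illustrative)
ua0, va0 = -3.0, 0.0                         # initial ageostrophic deficit, m/s (illustrative)

t = np.linspace(0, 24 * 3600, 97)            # 24 hours in 15-minute steps
ua = ua0 * np.cos(f * t) + va0 * np.sin(f * t)
va = -ua0 * np.sin(f * t) + va0 * np.cos(f * t)
speed = np.hypot(ug + ua, vg + va)           # total wind speed

print(f"inertial period: {2 * np.pi / f / 3600:.1f} h")
print(f"maximum wind speed during the day: {speed.max():.1f} m/s")
```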

Relevance:

20.00%

Publisher:

Abstract:

Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge, as the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important because the effect of the radiation on the tissue is correlated with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international recommendations for conventional radiotherapy dosimetry are not directly applicable to BNCT. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability test with activation detectors has been generally reproducible within the recommended tolerance value of 2%. An established toolkit for the determination of the dose components in epithermal neutron beams is presented and applied in an international dosimetric intercomparison. The quantities measured by the participating groups (neutron flux, fast neutron dose and photon dose) were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3% to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centers. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years. The presented results exclude the severe sensitivity changes to thermal neutrons that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, the comparison of results with ionisation chambers and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For the neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution, normalised to the dose maximum, measured by the MAGIC polymer gel was found to agree well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, the interpretation of the results was complicated by the presence of high-LET radiation.
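For context, the several dose components mentioned above are conventionally combined into a single biologically weighted quantity; the following is the standard textbook form used in the BNCT literature, not a result of this thesis:

```latex
% Weighted ("photon-equivalent") dose as a weighted sum of the separately
% determined dose components: boron dose, thermal-neutron (nitrogen capture)
% dose, fast-neutron dose and photon dose, each with its weighting factor w_i.
D_w = w_B D_B + w_N D_N + w_{\mathrm{fast}} D_{\mathrm{fast}} + w_\gamma D_\gamma
```

Accurate determination of each component separately, as addressed in the intercomparison above, is what makes the weighted dose meaningful.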

Relevance:

20.00%

Publisher:

Abstract:

In recent years there has been growing interest in selecting suitable wood raw material to improve end-product quality and the efficiency of industrial processes. Genetic background and growing conditions are known to affect the properties of growing trees, but only a few parameters reflecting wood quality, such as volume and density, can be measured on an industrial scale. Therefore, research on cellular-level structures of trees grown in different conditions is needed to increase understanding of the growth processes that lead to desired wood properties. In this work the cellular and cell wall structures of wood were studied. Parameters such as the mean microfibril angle (MFA), the spiral grain angles, the fibre length, the tracheid cell wall thickness and the cross-sectional shape of the tracheid were determined as a function of distance from the pith towards the bark, and the mutual dependencies of these parameters were discussed. Samples from fast-grown trees belonging to the same clone, grown in fertile soil, and also from fertilised trees were measured. It was found that in fast-grown trees the mean MFA decreased more gradually from the pith to the bark than in reference stems. In fast-grown samples the cells were shorter and more thin-walled, and their cross-sections were rounder, than in slower-grown reference trees. Increased growth rate was found to cause an increase in spiral grain variation both within and between annual rings. Furthermore, methods for the determination of the mean MFA using X-ray diffraction were evaluated. Several experimental arrangements, including synchrotron-radiation-based microdiffraction, were compared. For the evaluation of the data-analysis procedures, a general form for the diffraction conditions was derived in terms of the angles describing the fibre orientation and the shape of the cell. The effects of these parameters on the obtained microfibril angles were discussed. The use of symmetrical transmission geometry and tangentially cut samples gave the most reliable MFA values.
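As a simplified, hypothetical illustration of how a mean MFA is commonly extracted from an azimuthal X-ray diffraction profile (the thesis derives a more general treatment that also accounts for the fibre orientation and the cell shape), one can fit a symmetric pair of Gaussians to the azimuthal intensity of a cellulose reflection and read the mean MFA from the peak position; all numbers below are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(phi, a, mu, sigma, bg):
    """Symmetric pair of Gaussians centred at +/- mu on the azimuthal angle axis."""
    g = lambda m: a * np.exp(-0.5 * ((phi - m) / sigma) ** 2)
    return g(mu) + g(-mu) + bg

# Synthetic azimuthal profile I(phi) of a cellulose reflection, as measured in
# symmetrical transmission geometry (degrees, arbitrary intensity units).
phi = np.linspace(-60.0, 60.0, 121)
true_mfa = 18.0
intensity = two_gaussians(phi, 100.0, true_mfa, 6.0, 5.0)
intensity += np.random.default_rng(0).normal(0.0, 2.0, phi.size)  # measurement noise

popt, _ = curve_fit(two_gaussians, phi, intensity, p0=[80.0, 15.0, 8.0, 0.0])
print(f"fitted mean MFA ~ {abs(popt[1]):.1f} deg")  # half the peak separation
```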

Relevance:

20.00%

Publisher:

Abstract:

An efficient and statistically robust solution for the identification of asteroids among numerous sets of astrometry is presented. In particular, numerical methods have been developed for the short-term identification of asteroids at discovery and for the long-term identification of scarcely observed asteroids over apparitions, a task which has lacked a robust method until now. The methods are based on the solid foundation of statistical orbital inversion, properly taking into account the observational uncertainties, which allows for the detection of practically all correct identifications. Through the use of dimensionality-reduction techniques and efficient data structures, the exact methods have a log-linear, that is, O(n log n), computational complexity, where n is the number of included observation sets. The methods developed are thus suitable for future large-scale surveys, which anticipate a substantial increase in the astrometric data rate. Due to the discontinuous nature of asteroid astrometry, separate sets of astrometry must be linked to a common asteroid from the very first discovery detections onwards. The reason for the discontinuity in the observed positions is the rotation of the observer with the Earth as well as the motion of the asteroid and the observer about the Sun. Therefore, the aim of identification is to find a set of orbital elements that reproduces the observed positions with residuals similar to the inevitable observational uncertainty. Unless the astrometric observation sets are linked, the corresponding asteroid is eventually lost as the uncertainty of the predicted positions grows too large to allow successful follow-up. Whereas the presented identification theory and the numerical comparison algorithm are generally applicable, that is, also in fields other than astronomy (e.g., in the identification of space debris), the numerical methods developed for asteroid identification can immediately be applied to all objects on heliocentric orbits with negligible effects due to non-gravitational forces in the time frame of the analysis. The methods developed have been successfully applied to various identification problems. Simulations have shown that the methods are able to find virtually all correct linkages despite challenges such as numerous scarce observation sets, astrometric uncertainty, numerous objects confined to a limited region on the celestial sphere, long linking intervals, and substantial parallaxes. Tens of previously unknown main-belt asteroids have been identified with the short-term method in a preliminary study to locate asteroids among numerous unidentified sets of single-night astrometry of moving objects, and scarce astrometry obtained nearly simultaneously with Earth-based and space-based telescopes has been successfully linked despite a substantial parallax. Using the long-term method, thousands of realistic 3-linkages, typically spanning several apparitions, have so far been found among designated observation sets each spanning less than 48 hours.
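A hypothetical sketch of how log-linear scaling can be obtained in practice (the feature vectors and names below are illustrative placeholders, not the descriptors used in the thesis): map each observation set to a low-dimensional point, index the points with a spatial tree, and pass only nearby pairs on to the expensive statistical orbital-inversion check.

```python
import numpy as np
from scipy.spatial import cKDTree

def candidate_linkages(features, radius):
    """Return pairs of observation sets whose reduced-space features are close.

    features : (n, d) array, one low-dimensional descriptor per observation set
               (e.g. sky position and apparent motion at a common epoch; illustrative).
    radius   : linking tolerance in the reduced space.
    Building the tree and querying all points scales as O(n log n) on average,
    avoiding the O(n^2) comparison of every pair of observation sets.
    """
    tree = cKDTree(features)
    return tree.query_pairs(r=radius)

# Toy example: 1000 observation sets described by 3 reduced coordinates.
rng = np.random.default_rng(1)
feats = rng.uniform(0.0, 1.0, size=(1000, 3))
pairs = candidate_linkages(feats, radius=0.02)
print(f"{len(pairs)} candidate pairs to pass to the full orbital-inversion check")
```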

Relevance:

20.00%

Publisher:

Abstract:

"We thank MrGilder for his considered comments and suggestions for alternative analyses of our data. We also appreciate Mr Gilder’s support of our call for larger studies to contribute to the evidence base for preoperative loading with high-carbohydrate fluids..."

Relevance:

20.00%

Publisher:

Abstract:

Existing business process drift detection methods do not work with event streams. As such, they are designed to detect inter-trace drifts only, i.e. drifts that occur between complete process executions (traces), as recorded in event logs. However, process drift may also occur during the execution of a process and may impact ongoing executions. Existing methods either do not detect such intra-trace drifts or detect them with a long delay. Moreover, they do not perform well with unpredictable processes, i.e. processes whose logs exhibit a high ratio of distinct executions to the total number of executions. We address these two issues by proposing a fully automated and scalable method for online detection of process drift from event streams. We perform statistical tests over distributions of behavioral relations between events, as observed in two adjacent windows of adaptive size that slide along the stream. An extensive evaluation on synthetic and real-life logs shows that our method is fast and accurate in the detection of typical change patterns, and performs significantly better than the state of the art.
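A minimal sketch of the underlying idea, not the authors' implementation (the choice of relation, statistical test and smoothing below is illustrative): count the frequencies of directly-follows relations between events in two adjacent windows and compare the two distributions with a statistical test; a sustained run of small p-values then signals a drift.

```python
from collections import Counter
from scipy.stats import chi2_contingency

def follows_counts(window):
    """Count directly-follows relations (a, b) within each trace fragment."""
    counts = Counter()
    for trace in window:
        counts.update(zip(trace, trace[1:]))
    return counts

def drift_pvalue(ref_window, det_window):
    """Chi-square test comparing relation frequencies in two adjacent windows."""
    ref, det = follows_counts(ref_window), follows_counts(det_window)
    relations = sorted(set(ref) | set(det))
    table = [[ref.get(r, 0) + 1 for r in relations],   # +1: smoothing for empty cells
             [det.get(r, 0) + 1 for r in relations]]
    _, p, _, _ = chi2_contingency(table)
    return p

# Toy example: behaviour changes from a->b->c to a->c->b in the second window.
before = [list("abc")] * 50
after = [list("acb")] * 50
print(f"p-value across the change: {drift_pvalue(before, after):.3g}")
```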

Relevance:

20.00%

Publisher:

Abstract:

Nucleation is the first step of the process by which gas molecules in the atmosphere condense to form liquid or solid particles. Despite the importance of atmospheric new-particle formation for both climate and health-related issues, little information exists on its precise molecular-level mechanisms. In this thesis, potential nucleation mechanisms involving sulfuric acid together with either water and ammonia or reactive biogenic molecules are studied using quantum chemical methods. Quantum chemistry calculations are based on the numerical solution of Schrödinger's equation for a system of atoms and electrons subject to various sets of approximations, the precise details of which give rise to a large number of model chemistries. A comparison of several different model chemistries indicates that the computational method must be chosen with care if accurate results for sulfuric acid - water - ammonia clusters are desired. Specifically, binding energies are incorrectly predicted by some popular density functionals, and vibrational anharmonicity must be accounted for if quantitatively reliable formation free energies are desired. The calculations reported in this thesis show that a combination of different high-level energy corrections and advanced thermochemical analysis can quantitatively replicate experimental results concerning the hydration of sulfuric acid. The role of ammonia in sulfuric acid - water nucleation was revealed by a series of calculations on molecular clusters of increasing size with respect to all three co-ordinates: sulfuric acid, water and ammonia. As indicated by experimental measurements, ammonia significantly assists the growth of clusters in the sulfuric acid co-ordinate. The calculations presented in this thesis predict that in atmospheric conditions this effect becomes important as the number of acid molecules increases from two to three. On the other hand, small molecular clusters are unlikely to contain more than one ammonia molecule per sulfuric acid. This implies that the average NH3:H2SO4 mole ratio of small molecular clusters in atmospheric conditions is likely to be between 1:3 and 1:1. Calculations on charged clusters confirm the experimental result that the HSO4- ion is much more strongly hydrated than neutral sulfuric acid. Preliminary calculations on clusters of HSO4- and NH3 indicate that ammonia is likely to play at most a minor role in ion-induced nucleation in the sulfuric acid - water system. Calculations of thermodynamic and kinetic parameters for the reaction of stabilized Criegee Intermediates with sulfuric acid demonstrate that quantum chemistry is a powerful tool for investigating chemically complicated nucleation mechanisms. The calculations indicate that if the biogenic Criegee Intermediates have sufficiently long lifetimes in atmospheric conditions, the studied reaction may be an important source of nucleation precursors.
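To illustrate how computed formation free energies relate to the hydrate populations discussed above, the following sketch applies the standard law of mass action; the free-energy values, relative humidity and saturation pressure are placeholders, not results from the thesis.

```python
import numpy as np

R = 8.314462618e-3        # gas constant, kJ/(mol K)
T = 298.15                # temperature, K
p_ref = 101325.0          # Pa, reference pressure of the free-energy values
rh = 0.5                  # relative humidity (assumed)
p_sat = 3169.0            # Pa, approximate saturation vapour pressure of water at 298 K
p_w = rh * p_sat          # water partial pressure

# Placeholder standard formation free energies dG_n (kJ/mol) for
# H2SO4*(H2O)_n, n = 1..3, at the reference pressure. Illustrative only.
dG = {1: -7.0, 2: -12.0, 3: -15.0}

# Law of mass action: [A*(H2O)_n]/[A] = (p_w/p_ref)^n * exp(-dG_n / (R*T))
rel = {n: (p_w / p_ref) ** n * np.exp(-g / (R * T)) for n, g in dG.items()}
total = 1.0 + sum(rel.values())
for n, r in rel.items():
    print(f"fraction with {n} water molecule(s): {r / total:.2%}")
print(f"unhydrated fraction: {1.0 / total:.2%}")
```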

Relevance:

20.00%

Publisher:

Abstract:

Foliage density and leaf area index are important vegetation structure variables. They can be measured by several methods, but few have been tested in tropical forests, which have high structural heterogeneity. In this study, foliage density estimates by two indirect methods, the point quadrat and photographic methods, were compared with those obtained by direct leaf counts in the understorey of a wet evergreen forest in southern India. The point quadrat method has a tendency to overestimate, whereas the photographic method consistently and significantly underestimates foliage density. There was stratification within the understorey, with areas close to the ground having higher foliage densities.
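For reference, the point quadrat estimator rests on the standard relation below (textbook form, not specific to this study), which also hints at why indirect estimates can deviate from direct counts:

```latex
% Expected number of foliage contacts n along a probe of length L inclined
% at angle \theta, through foliage of density F (leaf area per unit volume):
%   n = F \, G(\theta) \, L
% hence the point-quadrat estimate of foliage density is
F = \frac{n}{G(\theta)\, L}
```

Here G(θ) is the mean projection of unit foliage area on a plane perpendicular to the probe; it depends on the leaf angle distribution, so an assumed G(θ) that does not match the actual canopy is one possible source of bias in indirect estimates.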

Relevance:

20.00%

Publisher:

Abstract:

Solar UV radiation is harmful to life on planet Earth, but fortunately atmospheric oxygen and ozone absorb almost all of the most energetic UVC photons. However, part of the UVB radiation and much of the UVA radiation reach the surface of the Earth, where they affect human health, the environment and materials, and drive atmospheric and aquatic photochemical processes. In order to quantify these effects and processes, there is a need for ground-based UV measurements and radiative transfer modeling to estimate the amounts of UV radiation reaching the biosphere. Satellite measurements, with their near-global spatial coverage and long-term data continuity, offer an attractive option for estimation of the surface UV radiation. This work focuses on radiative transfer theory based methods used for estimation of the UV radiation reaching the surface of the Earth. The objectives of the thesis were to implement the surface UV algorithm originally developed at NASA Goddard Space Flight Center for estimation of the surface UV irradiance from the measurements of the Dutch-Finnish built Ozone Monitoring Instrument (OMI), to improve the original surface UV algorithm especially in relation to snow cover, to validate the OMI-derived daily surface UV doses against ground-based measurements, and to demonstrate how the satellite-derived surface UV data can be used to study the effects of UV radiation. The thesis consists of seven original papers and a summary. The summary includes an introduction to the OMI instrument, a review of the methods used for modeling of the surface UV using satellite data, and the conclusions of the main results of the original papers. The first two papers describe the algorithm used for estimation of the surface UV amounts from the OMI measurements as well as the unique Very Fast Delivery processing system developed for processing the OMI data received at the Sodankylä satellite data centre. The third and fourth papers present algorithm improvements related to the surface UV albedo of snow-covered land. The fifth paper presents the results of the comparison of the OMI-derived daily erythemal doses with those calculated from ground-based measurement data. It gives an estimate of the expected accuracy of the OMI-derived surface UV doses for various atmospheric and other conditions, and discusses the causes of the differences between the satellite-derived and ground-based data. The last two papers demonstrate the use of the satellite-derived surface UV data. The sixth paper presents an assessment of photochemical decomposition rates in the aquatic environment. The seventh paper presents the use of satellite-derived daily surface UV doses for planning outdoor material weathering tests.
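The final step of any such algorithm, converting a surface spectral irradiance into an erythemally weighted dose, can be sketched as follows; the weighting function is an approximate piecewise form of the CIE erythemal action spectrum, and the flat toy spectrum is purely illustrative (this is not the OMI algorithm itself, which also involves the full radiative transfer computation).

```python
import numpy as np

def erythemal_weight(wl_nm):
    """Approximate CIE erythemal action spectrum (piecewise, wavelength in nm)."""
    wl = np.asarray(wl_nm, dtype=float)
    return np.where(wl <= 298.0, 1.0,
           np.where(wl <= 328.0, 10.0 ** (0.094 * (298.0 - wl)),
                                 10.0 ** (0.015 * (139.0 - wl))))

def erythemal_dose_rate(wl_nm, spectral_irradiance):
    """Erythemally weighted irradiance (W/m^2) from spectral irradiance (W/m^2/nm)."""
    return np.trapz(erythemal_weight(wl_nm) * spectral_irradiance, wl_nm)

# Toy spectrum: flat 1 mW/m^2/nm between 290 and 400 nm (illustrative only).
wl = np.linspace(290.0, 400.0, 221)
E = np.full_like(wl, 1e-3)
rate = erythemal_dose_rate(wl, E)                  # W/m^2
print(f"erythemal dose rate: {rate * 1e3:.2f} mW/m^2")
print(f"daily dose if constant for 8 h: {rate * 8 * 3600:.1f} J/m^2")
```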

Relevance:

20.00%

Publisher:

Abstract:

Knowledge of the physical properties of asteroids is crucial in many branches of solar-system research. Knowledge of the spin states and shapes is needed, e.g., for accurate orbit determination and for studying the history and evolution of the asteroids. In my thesis, I present new methods for using photometric lightcurves of asteroids in the determination of their spin states and shapes. The convex inversion method makes use of a general polyhedron shape model and provides us, at best, with an unambiguous spin solution and a convex shape solution that reproduces the main features of the original shape. Deriving information about the non-convex shape features is, in principle, also possible, but usually requires a priori information about the object. Alternatively, a distribution of non-convex solutions, describing the scale of the non-convexities, can also be obtained. Due to an insufficient number of absolute observations and inaccurately defined asteroid phase curves, the c/b ratio, i.e., the flatness of the shape model, is often somewhat ill-defined. However, especially in the case of elongated objects, the flatness seems to be quite well constrained, even when only relative lightcurves are available. The results prove that it is, contrary to the earlier misconception, possible to derive shape information from lightcurve data if a sufficiently wide range of observing geometries is covered by the observations. Along with the more accurate shape models, the rotational states, i.e., spin vectors and rotation periods, are also determined with improved accuracy. The shape solutions obtained so far reveal a population of irregular objects whose most descriptive shape characteristics can, however, be expressed with only a few parameters. Preliminary statistical analyses of the shapes suggest that there are correlations between shape and other physical properties, such as the size, rotation period and taxonomic type of the asteroids. More shape data, especially on the smallest and largest asteroids as well as on the fast and slow rotators, are called for in order to study the statistics more thoroughly.
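A much-simplified sketch of the forward model underlying lightcurve inversion (illustrative only; the scattering law, the equal-area random facets and all numbers are assumptions, not the models used in the thesis): the disk-integrated brightness of a convex polyhedron is a sum over facets that are both illuminated and visible, and in convex inversion the facet areas together with the spin state are, roughly speaking, the quantities adjusted to reproduce the observed lightcurves.

```python
import numpy as np

def model_brightness(normals, areas, sun_dir, obs_dir):
    """Disk-integrated brightness of a convex shape model.

    normals : (m, 3) outward unit normals of the surface facets
    areas   : (m,) facet areas
    sun_dir, obs_dir : unit vectors towards the Sun and the observer
    Only facets that are both illuminated and visible contribute; a simple
    Lommel-Seeliger scattering law is used for each facet.
    """
    mu0 = normals @ sun_dir          # cosine of the incidence angle per facet
    mu = normals @ obs_dir           # cosine of the emission angle per facet
    lit = (mu0 > 0) & (mu > 0)
    return np.sum(areas[lit] * mu0[lit] * mu[lit] / (mu0[lit] + mu[lit]))

# Toy convex model: a sphere approximated by random, equal-area facet normals.
rng = np.random.default_rng(2)
normals = rng.normal(size=(2000, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
areas = np.full(2000, 1.0 / 2000)
sun = np.array([1.0, 0.0, 0.0])
obs = np.array([np.cos(0.3), np.sin(0.3), 0.0])  # 0.3 rad solar phase angle
print(f"relative model brightness: {model_brightness(normals, areas, sun, obs):.4f}")
```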