955 results for TAC, Radon, ricostruzione, tomografia


Relevance:

10.00%

Publisher:

Abstract:

Floating in the air around us are numerous small particles, invisible to the human eye. The mixture of air and particles, liquid or solid, is called an aerosol. Aerosols have significant effects on air quality, visibility and health, and on the Earth's climate; their effect on climate is the least understood of these. They can scatter the incoming radiation from the Sun, or they can act as seeds onto which cloud droplets form. Aerosol particles are created directly, by human activity or by natural processes such as breaking ocean waves or sandstorms. They can also be created indirectly, when vapors or very small particles emitted into the atmosphere combine to form small particles that later grow to climatically or health-relevant sizes. The mechanisms through which these particles form are still under scientific discussion, even though this knowledge is crucial for air quality and climate predictions, and for understanding how aerosols will influence and be influenced by the climate's feedback loops. One of the proposed mechanisms for new particle formation is ion-induced nucleation, based on the idea that newly formed particles are ultimately formed around an electric charge. The amount of available charge in the atmosphere varies with radon concentrations in the soil and in the air, and with incoming ionizing radiation from outer space. In this thesis, ion-induced nucleation is investigated through long-term measurements in two different environments: the background site of Hyytiälä and the urban site of Helsinki. The main conclusion of this thesis is that ion-induced nucleation generally plays a minor role in new particle formation. The fraction of particles formed varies from day to day and from place to place. The relative importance of ion-induced nucleation, i.e. the fraction of particles formed through it, is larger in cleaner areas, where the absolute number of particles formed is smaller. Moreover, ion-induced nucleation contributes a larger fraction of particles on warmer days, when the sulfuric acid and water vapor saturation ratios are lower. This analysis will help in understanding the feedbacks associated with climate change.

Relevance:

10.00%

Publisher:

Abstract:

Aerosol particles affect climate, visibility, air quality and human health. However, the extent to which aerosol particles affect our everyday life is neither well quantified nor entirely understood. Therefore, investigations of different processes and phenomena are required, including primary particle sources, the initial steps of secondary particle formation and growth, the significance of charged particles in particle formation, and redistribution mechanisms in the atmosphere. In this work, sources, sinks and concentrations of air ions (charged molecules, clusters and particles) were investigated directly, by measuring air-ionising components (radon activity concentrations and external radiation dose rates) and charged-particle size distributions, and through a literature review. The results give a comprehensive and valuable picture of the spatial and temporal variation of air ion sources, sinks and concentrations, for use as input parameters in local- and global-scale climate models. Newly developed air ion spectrometers (Airel Ltd.) made it possible to investigate atmospheric (charged) particle formation and growth at sub-3 nm sizes. New visual classification schemes for charged-particle formation events were therefore developed, and a newly developed particle growth rate method was tested on a dataset spanning more than one year. These data analysis methods have been widely used by other researchers since their introduction. The thesis revealed interesting characteristics of atmospheric particle formation and growth: particle growth may sometimes be suppressed below the detection limit (~3 nm) of traditional aerosol instruments; particle formation may take place during the daytime as well as in the evening; and growth rates of sub-3 nm particles were fairly constant throughout the year, while growth rates of larger particles (3-20 nm in diameter) were higher in summer than in winter. These observations are thought to reflect the availability of condensing vapours. The observations of this thesis offer new understanding of particle formation in the atmosphere; the role of ions in particle formation, however, remains poorly understood and requires further research.
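The growth-rate analysis referred to above lends itself to a simple numerical illustration. The sketch below is not the thesis's method; it merely fits a first-order polynomial to hypothetical mode-diameter observations to obtain a growth rate in nm/h, which is the quantity the abstract compares across seasons and size ranges.

# Illustrative growth-rate estimate: the slope of a least-squares line
# fitted to the mode diameter of a growing particle population.
import numpy as np

# Hypothetical data: mode diameters (nm) tracked hourly during an event.
t_hours = np.array([9.0, 10.0, 11.0, 12.0, 13.0, 14.0])
d_mode_nm = np.array([2.1, 3.0, 4.2, 5.1, 6.3, 7.0])

# GR = d(d_mode)/dt from a first-order polynomial fit.
growth_rate, intercept = np.polyfit(t_hours, d_mode_nm, 1)
print(f"growth rate ~ {growth_rate:.2f} nm/h")  # ~1.0 nm/h for this data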

Relevance:

10.00%

Publisher:

Abstract:

In a number of applications of computerized tomography, the ultimate goal is to detect and characterize objects within a cross section; detecting the edges of regions of different contrast yields the required information. The problem of detecting edges directly from projection data is addressed. It is shown that the class of linear edge detection operators used on images can be applied directly to projection data. This not only reduces the computational burden but also avoids the difficulties of postprocessing a reconstructed image. It is accomplished by a convolution backprojection operation. For example, with the Marr-Hildreth edge detection operator, the filtering function applied to the projection data is the Radon transform of the Laplacian of the 2-D Gaussian, combined with the reconstruction filter. Simulation results showing the efficacy of the proposed method, and a comparison with edges detected from the reconstructed image, are presented.
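To make the convolution backprojection idea concrete, the following sketch (assuming parallel-beam geometry; function and parameter names are our own) filters each projection with the ramp filter combined with the 1-D second derivative of a Gaussian, which is the Radon transform of the 2-D Laplacian-of-Gaussian, and then backprojects.

import numpy as np

def log_edge_backprojection(sinogram, thetas_deg, sigma=2.0):
    # sinogram: (n_angles, n_det) parallel-beam projections.
    n_angles, n_det = sinogram.shape
    omega = 2.0 * np.pi * np.fft.fftfreq(n_det)
    # Ramp filter |w| combined with the Fourier transform of d^2/dt^2 of
    # a Gaussian, w^2 exp(-sigma^2 w^2 / 2); the overall sign only flips
    # the polarity of the zero crossings.
    H = np.abs(omega) * omega ** 2 * np.exp(-0.5 * (sigma * omega) ** 2)
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * H, axis=1))
    # Backproject onto an n_det x n_det grid centred on the origin,
    # with linear interpolation along the detector coordinate t.
    xs = np.arange(n_det) - n_det / 2.0
    X, Y = np.meshgrid(xs, xs)
    out = np.zeros((n_det, n_det))
    for p, theta in zip(filtered, np.deg2rad(thetas_deg)):
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
        t0 = np.clip(t.astype(int), 0, n_det - 2)
        frac = np.clip(t - t0, 0.0, 1.0)
        out += (1 - frac) * p[t0] + frac * p[t0 + 1]
    return out * np.pi / n_angles

Edges are then the zero crossings of the returned map, exactly as in image-domain Marr-Hildreth detection, but without ever reconstructing the image itself.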

Relevance:

10.00%

Publisher:

Abstract:

In this paper we construct low decoding complexity STBCs using the Pauli matrices as linear dispersion matrices. In this case the Hurwitz-Radon orthogonality condition is shown to be easily checked by transferring the problem to the $\mathbb{F}_4$ domain. The problem of constructing low decoding complexity STBCs is shown to be equivalent to finding certain codes over $\mathbb{F}_4$. It is shown that almost all known low complexity STBCs can be obtained by this approach. New codes are given that have the least known decoding complexity in particular ranges of rate.
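The Hurwitz-Radon condition in question, $A_i A_j^H + A_j A_i^H = 0$ for $i \neq j$, is particularly easy to verify for Pauli dispersion matrices: since the Paulis are Hermitian and unitary, the condition reduces to anticommutation, which is what the $\mathbb{F}_4$ transfer exploits algebraically. A minimal numerical check (illustrative only, not the paper's $\mathbb{F}_4$ machinery):

import itertools
import numpy as np

# The three non-identity Pauli matrices.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

for (na, A), (nb, B) in itertools.combinations({"X": X, "Y": Y, "Z": Z}.items(), 2):
    hr = A @ B.conj().T + B @ A.conj().T
    print(na, nb, "Hurwitz-Radon orthogonal:", np.allclose(hr, 0))
# All three pairs print True: distinct non-identity Paulis anticommute.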

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we address the reconstruction problem from laterally truncated helical cone-beam projections. Reconstruction under lateral truncation, though similar to the interior Radon problem, differs from it, and from local (lambda) and pseudo-local tomography, in that we aim to reconstruct the entire object being scanned from region-of-interest (ROI) scan data. The method proposed in this paper is a projection data completion approach followed by any standard accurate FBP-type reconstruction algorithm. In particular, we explore a windowed linear prediction (WLP) approach for data completion and compare the quality of reconstruction with the linear prediction (LP) technique proposed earlier.
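As an illustration of the data-completion step, the sketch below performs a plain linear-prediction extrapolation of a single truncated detector row; the windowed variant explored in the paper would additionally taper the extrapolated tail. Function and parameter names are our own.

import numpy as np

def lp_extrapolate(row, n_missing, order=8):
    # Fit AR coefficients a_j so that row[k] ~ sum_j a_j * row[k - j],
    # then extend the row outward by n_missing predicted samples.
    N = len(row)
    A = np.column_stack([row[order - j - 1: N - j - 1] for j in range(order)])
    a, *_ = np.linalg.lstsq(A, row[order:], rcond=None)
    ext = list(row)
    for _ in range(n_missing):
        ext.append(np.dot(a, ext[-1: -order - 1: -1]))
    return np.asarray(ext)

Each truncated projection row would be extended on both sides in this way (mirroring the row to extend its leading edge), after which the completed sinogram can be passed to any standard FBP-type algorithm.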

Relevance:

10.00%

Publisher:

Abstract:

Decoding of linear space-time block codes (STBCs) with sphere decoding (SD) is well known. A fast version of SD, known as fast sphere decoding (FSD), has recently been studied by Biglieri, Hong and Viterbo. Viewing a linear STBC as a vector space spanned by its defining weight matrices over the real number field, we define a quadratic form (QF) on this vector space, called the Hurwitz-Radon QF (HRQF), and give a QF interpretation of the FSD complexity of a linear STBC. It is shown that the FSD complexity is a function only of the weight matrices defining the code and their ordering, and not of the channel realization (even though the equivalent channel when SD is used depends on the channel realization) or the number of receive antennas. It is also shown that the FSD complexity is completely captured in a single matrix obtained from the HRQF. Moreover, for a given set of weight matrices, an algorithm to obtain an ordering that leads to the least FSD complexity is presented. The well-known classes of low FSD complexity codes (multi-group decodable, fast decodable and fast group decodable codes) are presented in the HRQF framework.
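As a small illustration of the HRQF idea (a proxy computation, not the paper's exact construction): collect, for each pair of weight matrices, the Frobenius norm of the Hurwitz-Radon sum $A_i A_j^H + A_j A_i^H$ into a matrix. Vanishing off-diagonal entries indicate real symbols that can be decoded independently; for the Alamouti code, every off-diagonal entry vanishes.

import numpy as np

def hr_matrix(weights):
    # M[i, k] = ||A_i A_k^H + A_k A_i^H||_F for the weight matrices A_i.
    n = len(weights)
    M = np.zeros((n, n))
    for i in range(n):
        for k in range(n):
            hr = weights[i] @ weights[k].conj().T + weights[k] @ weights[i].conj().T
            M[i, k] = np.linalg.norm(hr)
    return M

# Weight matrices of the Alamouti code for real symbols (Re/Im of x1, x2).
alamouti = [
    np.array([[1, 0], [0, 1]], dtype=complex),
    np.array([[1j, 0], [0, -1j]]),
    np.array([[0, 1], [-1, 0]], dtype=complex),
    np.array([[0, 1j], [1j, 0]]),
]
print(hr_matrix(alamouti))  # nonzero on the diagonal only: single-symbol decodable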

Relevance:

10.00%

Publisher:

Abstract:

Planar triazinium cationic species from vanadyl-assisted cyclization of 1-(2-thiazolylazo)-2-naphthol (H-TAN, 1), 1-(2-pyridylazo)-2-naphthol (H-PAN, 2), 2-(2'-thiazolylazo)-p-cresol (H-TAC, 3) and 6-(2'-thiazolylazo)resorcinol (H-TAR, 5) were prepared and characterized. A dioxovanadium(V) species [VO2(TAR)] (4) was also isolated. Compounds 1, 2 and 4 were structurally characterized; both 1 and 2 have planar structures, and complex 4 has a VO3N2 coordination geometry. The cyclised triazinium compound forms a radical species within -0.06 to -0.29 V vs. SCE in DMF-0.1 M tetrabutylammonium perchlorate, with a second response due to formation of an anionic species. A confocal microscopic study showed higher nuclear uptake for 1, with its fused thiazole moiety, than for 2 with a fused pyridine ring. The compounds showed a partial intercalative mode of binding to calf thymus DNA. Compound 1 showed plasmid DNA photocleavage activity under argon, and photocytotoxicity in HeLa and MCF-7 cells with IC50 values of 15.1 and 3.4 μM respectively in visible light of 400-700 nm, while being essentially non-toxic in the dark (IC50 values of 90.4 and 21.9 μM). A TDDFT study was carried out to rationalize the experimental data.

Relevance:

10.00%

Publisher:

Abstract:

Using a Girsanov change of measure, we propose novel variations within a particle filtering algorithm, applied to the inverse problem of state and parameter estimation of nonlinear dynamical systems of engineering interest, so as to weakly correct for the linearization or integration errors that almost invariably occur when numerically propagating the process dynamics, typically governed by nonlinear stochastic differential equations (SDEs). Specifically, the correction for linearization, provided by the likelihood or Radon-Nikodym derivative, is incorporated within the evolving flow in two steps. The likelihood, an exponential martingale, is first split into a product of two factors; the correction owing to the first factor is implemented via rejection sampling. The second factor, which is directly computable, is accounted for via two different schemes: one employing resampling, the other using a gain-weighted innovation term added to the drift field of the process dynamics, thereby avoiding the sample dispersion caused by resampling. The proposed strategies, employed as add-ons to existing particle filters (the bootstrap and auxiliary SIR filters in this work), are found to non-trivially improve the convergence and accuracy of the estimates, yielding reduced mean-square errors vis-a-vis those obtained through the parent filtering schemes.
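For readers unfamiliar with the baseline, the sketch below is a plain bootstrap (SIR) particle filter for a hypothetical scalar SDE; it is not the paper's scheme. The observation likelihood used as the importance weight is where the paper's split Radon-Nikodym correction, rejection sampling and gain-weighted innovation would enter as add-ons. Model and parameters are illustrative.

import numpy as np

rng = np.random.default_rng(0)
dt, n_steps, n_particles = 0.01, 200, 500
sigma_proc, sigma_obs = 0.5, 0.2
f = lambda x: x - x ** 3  # hypothetical double-well drift

# Simulate a "true" trajectory (Euler-Maruyama) and noisy observations.
x_true = np.zeros(n_steps)
for k in range(1, n_steps):
    x_true[k] = (x_true[k - 1] + f(x_true[k - 1]) * dt
                 + sigma_proc * np.sqrt(dt) * rng.standard_normal())
y = x_true + sigma_obs * rng.standard_normal(n_steps)

# Bootstrap filter: propagate, weight by the likelihood, resample.
particles = rng.standard_normal(n_particles)
estimates = np.zeros(n_steps)
for k in range(n_steps):
    particles = (particles + f(particles) * dt
                 + sigma_proc * np.sqrt(dt) * rng.standard_normal(n_particles))
    logw = -0.5 * ((y[k] - particles) / sigma_obs) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates[k] = np.dot(w, particles)
    particles = rng.choice(particles, size=n_particles, p=w)  # multinomial resampling
print("RMSE:", np.sqrt(np.mean((estimates - x_true) ** 2)))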

Relevance:

10.00%

Publisher:

Abstract:

Automated security is one of the major concerns of modern times, and secure, reliable authentication systems are in great demand. A biometric trait like the finger knuckle print (FKP) of a person is unique and secure. The finger knuckle print is a novel biometric trait that has not been explored much for real-time implementation. In this paper, three different algorithms based on this trait are proposed. The first approach uses the Radon transform for feature extraction, providing two levels of security based on eigenvalues and the peak points of the Radon graph. In the second approach, the Gabor wavelet transform is used to extract features, again with two levels of security based on the Gabor wavelet magnitude values and the peak points of the Gabor wavelet graph. The third approach is intended to authenticate a person even if the finger knuckle is damaged by injury: the FKP image is divided into modules, and module-wise feature matching is performed for authentication. The performance of these algorithms was found to be much better than that of the few existing works, and the algorithms are designed to be implementable in a real-time system with minimal changes.
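To fix ideas for the first approach, here is a rough stand-in (not the paper's exact feature definitions) that builds a Radon-domain feature vector from the top singular values of the sinogram together with the location of its global peak; the threshold-based match test at the end is likewise illustrative.

import numpy as np
from skimage.transform import radon

def fkp_radon_features(image, n_singular=8):
    # image: preprocessed grayscale FKP region of interest.
    thetas = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(image, theta=thetas)
    s = np.linalg.svd(sinogram, compute_uv=False)   # spectral summary
    peak = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    return np.concatenate([s[:n_singular], np.asarray(peak, dtype=float)])

# Illustrative matching: accept if the feature vectors are close enough.
# match = np.linalg.norm(fkp_radon_features(probe) - fkp_radon_features(gallery)) < threshold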

Relevance:

10.00%

Publisher:

Abstract:

The Girsanov linearization method (GLM), proposed earlier in Saha, N., and Roy, D., 2007, "The Girsanov Linearisation Method for Stochastically Driven Nonlinear Oscillators," J. Appl. Mech., 74, pp. 885-897, is reformulated to arrive at a nearly exact, semianalytical, weak and explicit scheme for nonlinear mechanical oscillators under additive stochastic excitations. At the heart of the reformulated linearization is a temporally localized rejection sampling strategy that, combined with a resampling scheme, enables selecting from, and appropriately modifying, an ensemble of locally linearized trajectories while weakly applying the Girsanov correction (the Radon-Nikodym derivative) for the linearization errors. The semianalyticity is due to an explicit linearization of the nonlinear drift terms, which plays a crucial role in keeping the Radon-Nikodym derivative "nearly bounded" above by the inverse of the linearization time step (meaning that only a subset of linearized trajectories, of low yet finite probability, exceeds this bound). Drift linearization is conveniently accomplished via the first few (lower-order) terms of the associated stochastic (Ito) Taylor expansion, so that (multiple) stochastic integrals are excluded from the numerical treatment. Similarly, the Radon-Nikodym derivative, a strictly positive exponential (super-)martingale, is converted to a canonical form and evaluated over each time step without directly computing the stochastic integrals appearing in its argument. Through numerical implementations for a few low-dimensional nonlinear oscillators, the proposed variants of the scheme, referred to as the Girsanov corrected linearization method (GCLM), are shown to exhibit remarkably higher numerical accuracy over a much larger range of time step sizes than is possible with the local drift-linearization schemes on their own.
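For reference, the Radon-Nikodym derivative invoked here is the generic Girsanov exponential martingale; with $u_s$ denoting the (diffusion-scaled) mismatch between the nonlinear drift and its local linearization, it takes the form

\[
\left.\frac{d\mathbb{P}}{d\mathbb{Q}}\right|_{\mathcal{F}_t}
= \exp\!\left( \int_0^t u_s \, dW_s \;-\; \frac{1}{2} \int_0^t \lVert u_s \rVert^2 \, ds \right),
\]

and the "nearly bounded" property above refers to controlling this exponential over each linearization step.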

Relevance:

10.00%

Publisher:

Abstract:

Decoding of linear space-time block codes (STBCs) with sphere decoding (SD) is well known. A fast version of SD, known as fast sphere decoding (FSD), was introduced by Biglieri, Hong and Viterbo. Viewing a linear STBC as a vector space spanned by its defining weight matrices over the real number field, we define a quadratic form (QF) on this vector space, called the Hurwitz-Radon QF (HRQF), and give a QF interpretation of the FSD complexity of a linear STBC. It is shown that the FSD complexity is a function only of the weight matrices defining the code and their ordering, and not of the channel realization (even though the equivalent channel when SD is used depends on the channel realization) or the number of receive antennas. It is also shown that the FSD complexity is completely captured in a single matrix obtained from the HRQF. Moreover, for a given set of weight matrices, an algorithm to obtain an optimal ordering leading to the least FSD complexity is presented. The well-known classes of low FSD complexity codes (multi-group decodable, fast decodable and fast group decodable codes) are presented in the HRQF framework.

Relevance:

10.00%

Publisher:

Abstract:

We propose a Monte Carlo filter for recursive estimation of diffusive processes that modulate the instantaneous rates of Poisson measurements. A key aspect is the additive update, through a gain-like correction term empirically approximated from the innovation integral in the time-discretized Kushner-Stratonovich equation. The additive filter-update scheme eliminates the particle collapse encountered in many conventional particle filters. A few numerical demonstrations illustrate the versatility of the proposed filter.
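For context, one standard form of the filtering equation behind this construction, for a diffusion $x_t$ with generator $\mathcal{L}$ modulating the intensity $\lambda(x_t)$ of a counting process $N_t$, is

\[
d\pi_t(\varphi) = \pi_t(\mathcal{L}\varphi)\, dt
+ \left( \frac{\pi_{t^-}(\varphi \lambda)}{\pi_{t^-}(\lambda)} - \pi_{t^-}(\varphi) \right)
\left( dN_t - \pi_{t^-}(\lambda)\, dt \right),
\]

where $\pi_t(\varphi)$ denotes the conditional expectation of $\varphi(x_t)$ given the observations. The gain-like correction term in the abstract is an empirical, particle-level approximation of the innovation $dN_t - \pi_{t^-}(\lambda)\, dt$ appearing above.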

Relevance:

10.00%

Publisher:

Abstract:

Abstract: The late antique destruction of two bronze statues of Pausanias - the Spartan general responsible for the Greek victory at the Battle of Plataea (479 BC) - housed in the temple of Athena Chalkioikos in Sparta (Lib. Ep. 1518) has been interpreted as one of the few cases of violent conflict between the pagan and Christian populations in Greece. Nevertheless, the sources suggest that late antique Sparta was a bastion of Hellenic paganism, and they give a picture of a small, quiet town ruled by a pagan educated élite, where pagans like Libanius wanted to live. Since there is no evidence of violent conflict between pagans and Christians in Sparta, and Libanius confirms that in 365 AD all the temples and cult statues were still in place, this paper addresses the issue from a different point of view and offers a new contribution to the history of Sparta in Late Antiquity. Using literary, archaeological and epigraphic evidence, the paper explores: 1) the relationship between the Roman administration and the Spartan élite in the 4th century AD; 2) the historical memory of Pausanias in Late Antiquity. It is emphasized that the obscure burning of the two statues helped remove from Sparta the memory of Pausanias - a controversial figure, misrepresented in Late Antiquity and connected to the ancient staseis in Laconia - in order to promote a positive image of Sparta as a city without conflicts, ruled by the political system of Lycurgus (eunomia). As documented by local inscriptions in praise of late Roman governors, the mythical lawgiver Lycurgus was the paradigm for the imperial governors who rebuilt the town in the 4th century AD. It can be assumed that while Rome, Constantinople, Antioch and Athens were troubled by political and religious violence, or by seditions between different factions, Sparta aimed to revive its traditional model of civic order in the new historical context of Late Antiquity.

Relevance:

10.00%

Publisher:

Abstract:

EXECUTIVE SUMMARY
INTRODUCTION
OVERVIEW OF INTERNATIONAL EBM HISTORY
  References
CANADA
  Overview
  Activities to date
  Integrated Management implementation in Canada
  Objectives, indicators and reference points
  Assessment approaches
  Research directions for the future
  Management directions for the future
  References
JAPAN
  Overview
  Conservation and sustainable use of marine living resources
  Harvest control by TAC system
  Stock Recovery Plan and effort regulation system
  Stock enhancement by hatchery-produced juvenile release
  Conservation and sustainable development on coastal waters
  The implementation of ecosystem-based management
PEOPLE'S REPUBLIC OF CHINA
  Overview
  Current actions
  Output control
  Input control
  Summer fishing ban
  Enhance ecosystem health
REPUBLIC OF KOREA
  Initiatives and actions of ecosystem-based management in Korea
  Current ecosystem-based management initiatives in Korea
  Precautionary TAC-based fishery management
  Closed fishing season/areas
  Fish size- and sex-controls
  Fishing gear design restrictions
  Marine protected areas (MPA)
RUSSIA
  Existing and anticipated ecosystem-based management initiatives
  Issues related to the implementation of ecosystem-based management
UNITED STATES OF AMERICA
  Definitions and approaches to ecosystem-based fishery management in the United States
  Present U.S. legislative mandates relating to ecosystem-based fishery management
  Target species
  Bycatch species
  Threatened or endangered species
  Habitats
  Food webs
  Ecosystems
  Integration of legislative mandates into an ecosystem approach
  Scientific issues in implementing ecosystem-based approaches
  References
DISCUSSION AND RECOMMENDATIONS
APPENDICES
  Appendix 10.1 Study group membership and participants
  Appendix 10.2 Terminology definitions
  Appendix 10.3 Present state of implementing ecosystem-based fishery management in Alaska: Alaska groundfish fisheries
  Appendix 10.4 Present state of implementing ecosystem-based fishery management off the West Coast of the United States: Pacific Coast groundfish fisheries
  Appendix 10.5 Descriptions of multi-species and ecosystem models developed or under development in the U.S. North Pacific region that might be used to predict effects of fishing on ecosystems
  Appendix 10.6 A potential standard reporting format (developed by Australia, and currently being used by the U.S.A. in their contribution to this report)
(83-page document)

Relevance:

10.00%

Publisher:

Abstract:

Setting total allowable catches (TACs) is an endogenous process in which different agents and institutions, often with conflicting interests and opportunistic behaviour, try to influence policy-makers. Such policy-makers, far from being the benevolent social planners many would wish them to be, may also pursue self-interest when making final decisions. Although limited knowledge of stock abundance and population dynamics, and weak enforcement, play a part, these other factors may explain why TAC management has failed to guarantee sustainable exploitation of fish resources. Rejecting the exogeneity of the TAC, and drawing on the fruitful debate on economic policy (the rules-versus-discretion debate, and that surrounding the independence of central banks), two institutional developments are analysed as potential mechanisms to address these misconceptions about TACs: long-term harvest control rules, and a central bank of fish.