961 results for counting
Abstract:
Large-scale chromosome rearrangements such as copy number variants (CNVs) and inversions encompass a considerable proportion of the genetic variation between human individuals. In a number of cases, they have been closely linked with various inheritable diseases. Single-nucleotide polymorphisms (SNPs) are another large part of the genetic variation between individuals. They are typically abundant, and measuring them is straightforward and inexpensive. This thesis presents computational means of using SNPs to detect the presence of inversions and deletions, a particular variety of CNVs. Technically, the inversion-detection algorithm detects the suppressed recombination rate between inverted and non-inverted haplotype populations, whereas the deletion-detection algorithm uses the EM algorithm to estimate the haplotype frequencies of a window with and without a deletion haplotype. As a contribution to population biology, a coalescent simulator for simulating inversion polymorphisms has been developed. Coalescent simulation is a backward-in-time method of modelling population ancestry. The simulator also models multiple crossovers by using the Counting model as the chiasma interference model. Finally, this thesis includes an experimental section. The aforementioned methods were tested on synthetic data to evaluate their power and specificity. They were also applied to the HapMap Phase II and Phase III data sets, yielding a number of candidates for previously unknown inversions and deletions, and also correctly detecting known rearrangements of both kinds.
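The deletion-detection step relies on standard haplotype-frequency estimation with the EM algorithm. As a rough illustration (not the thesis's actual implementation), the sketch below estimates haplotype frequencies for a two-SNP window from unphased genotypes; in the thesis's setting a "deletion" haplotype would additionally be included in the model and the likelihoods with and without it compared. All names and data are illustrative.

```python
# Minimal EM for haplotype frequencies in a 2-SNP window from unphased
# genotypes (illustrative only; the thesis additionally adds a "deletion"
# haplotype and compares the likelihoods of the two models).
from itertools import product

HAPS = list(product((0, 1), repeat=2))            # haplotypes 00, 01, 10, 11

def compatible_pairs(genotype):
    """All ordered haplotype pairs (h1, h2) whose allele sums give the genotype."""
    return [(h1, h2) for h1, h2 in product(HAPS, repeat=2)
            if all(a + b == g for a, b, g in zip(h1, h2, genotype))]

def em_haplotype_freqs(genotypes, n_iter=100):
    freq = {h: 1.0 / len(HAPS) for h in HAPS}     # uniform start
    for _ in range(n_iter):
        counts = {h: 0.0 for h in HAPS}
        for g in genotypes:
            pairs = compatible_pairs(g)
            weights = [freq[h1] * freq[h2] for h1, h2 in pairs]
            total = sum(weights) or 1.0
            for (h1, h2), w in zip(pairs, weights):
                counts[h1] += w / total           # E-step: expected haplotype counts
                counts[h2] += w / total
        norm = sum(counts.values())
        freq = {h: c / norm for h, c in counts.items()}   # M-step: new frequencies
    return freq

# Toy data: genotypes coded as the number of '1' alleles per SNP (0, 1 or 2).
genos = [(0, 0), (1, 1), (2, 2), (1, 1), (0, 1)]
print(em_haplotype_freqs(genos))
```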
Abstract:
Much of our understanding and management of ecological processes requires knowledge of the distribution and abundance of species. Reliable abundance or density estimates are essential for managing both threatened and invasive populations, yet are often challenging to obtain. Recent and emerging technological advances, particularly in unmanned aerial vehicles (UAVs), provide exciting opportunities to overcome these challenges in ecological surveillance. UAVs can provide automated, cost-effective surveillance and offer repeat surveys for pest incursions at an invasion front. They can capitalise on manoeuvrability and advanced imagery options to detect species that are cryptic due to behaviour, life history or inaccessible habitat. UAVs may also cause less disturbance, in magnitude and duration, to sensitive fauna than other survey methods such as transect counting by humans or sniffer dogs. The surveillance approach depends upon the particular ecological context and the objective. For example, animal, plant and microbial target species differ in their movement, spread and observability. Lag times may exist between a pest species' presence at a site and its detectability, prompting a need for repeat surveys. Operationally, however, the frequency and coverage of UAV surveys may be limited by financial and other constraints, leading to errors in estimating species occurrence or density. We use simulation modelling to investigate how movement ecology should influence fine-scale decisions regarding ecological surveillance using UAVs. Movement and dispersal parameter choices allow contrasts between locally mobile but slow-dispersing populations, and species that are locally more static but invasive at the landscape scale. We find that low and slow UAV flights may offer the best monitoring strategy to predict local population densities in transects, but that the consequent reduction in overall area sampled may sacrifice the ability to reliably predict regional population density. Alternative flight plans may perform better, but this is also dependent on movement ecology and the magnitude of relative detection errors for different flight choices. Simulated investigations such as this will become increasingly useful to reveal how the spatio-temporal extent and resolution of UAV monitoring should be adjusted to reduce observation errors and thus provide better population estimates, maximising the efficacy and efficiency of unmanned aerial surveys.
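As a loose illustration of the kind of simulation experiment described (not the authors' model), the sketch below moves a population of random walkers, samples it with strip transects whose width stands in for UAV flight height and coverage, and returns a density estimate that can be compared with the true density. All parameter names and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_survey(n_animals=200, world=1000.0, step_sd=5.0,
                    strip_width=20.0, n_strips=5, n_days=30):
    """Random-walk population sampled with strip transects; returns the
    mean per-day regional density estimate (animals per unit area)."""
    pos = rng.uniform(0, world, size=(n_animals, 2))
    strips = rng.uniform(0, world - strip_width, size=n_strips)   # strip x-origins
    estimates = []
    for _ in range(n_days):
        pos = (pos + rng.normal(0, step_sd, pos.shape)) % world   # daily movement
        counted = 0
        for x0 in strips:                                         # strip transect counts
            in_strip = (pos[:, 0] >= x0) & (pos[:, 0] < x0 + strip_width)
            counted += in_strip.sum()
        sampled_area = n_strips * strip_width * world             # strips span full y-extent
        estimates.append(counted / sampled_area)
    return np.mean(estimates)

true_density = 200 / 1000.0 ** 2
print("true:", true_density, "estimated:", simulate_survey())
```

Narrower strips (a proxy for lower, slower flights) reduce the sampled area and so increase the variance of the regional estimate, which is the trade-off the abstract describes.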
Abstract:
The vastly increased popularity of the Internet as an effective publication and distribution channel for digital works has created serious challenges to enforcing intellectual property rights. Works are widely disseminated on the Internet, with and without permission. This thesis examines the current problems with licence management and copy protection and outlines a new method and system that solve these problems. The WARP system (Works, Authors, Royalties, and Payments) is based on global registration and transfer monitoring of digital works, and on the accounting and collection of Internet-levy-funded usage fees payable to the authors and rights holders of the works. The detection and counting of downloads is implemented with origrams, short and original parts picked from the contents of the digital work. The origrams are used to create digests, digital fingerprints that identify the piece of work transmitted over the Internet without the need to embed ID tags or any other easily removable metadata in the file.
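The abstract does not specify the origram algorithm in detail; the sketch below only illustrates the general idea of a content-derived digest: short substrings are drawn from the work's content, hashed, and a fixed-size set of hashes serves as a fingerprint that can be matched against monitored transfers. Function names and parameter choices are illustrative.

```python
import hashlib

def digest(text, gram_len=32, keep=16):
    """Content-derived fingerprint: hash every overlapping substring of
    length gram_len and keep the lexicographically smallest hashes
    (a MinHash-style selection, used purely as an illustration)."""
    grams = {text[i:i + gram_len] for i in range(len(text) - gram_len + 1)}
    hashes = sorted(hashlib.sha1(g.encode()).hexdigest() for g in grams)
    return set(hashes[:keep])

def similarity(d1, d2):
    """Fraction of shared fingerprints between two digests."""
    return len(d1 & d2) / max(len(d1 | d2), 1)

original = ("Works are widely disseminated on the Internet, with and "
            "without permission, which makes automatic recognition of "
            "individual works a prerequisite for collecting usage fees.")
near_copy = original.replace("Internet", "internet")

print(similarity(digest(original), digest(near_copy)))        # substantial overlap
print(similarity(digest(original), digest(near_copy[::-1])))  # essentially none
```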
Abstract:
A high-temperature source has been developed and coupled to a high-resolution Fourier transform spectrometer to record emission spectra of acetylene around 3 μm up to 1455 K under Doppler-limited resolution (0.015 cm⁻¹). The ν3-ground state (GS) and ν2+ν4+ν5 (Σu+ and Δu)-GS bands and 76 related hot bands, counting e and f parities separately, are assigned using semiautomatic methods based on a global model reproducing all related vibration-rotation states. Significantly higher J values than previously reported are observed for 40 known substates, while 37 new e or f vibrational substates, up to about 6000 cm⁻¹, are identified and characterized by vibration-rotation parameters. The 3,811 new or improved data resulting from the analysis are merged into the database presented by Robert et al. [Mol. Phys. 106, 2581 (2008)], which now includes 15,562 lines accessing vibrational states up to 8600 cm⁻¹. A global model, updated compared to the one in the previous paper, allows all lines in the database to be fitted simultaneously and successfully. The updates are discussed, taking into account in particular the systematic inclusion of Coriolis interaction.
Abstract:
Mediastinitis as a complication after cardiac surgery is rare but disastrous, increasing hospital stay, hospital costs, morbidity and mortality. It occurs in 1-3% of patients after median sternotomy. The purpose of this study was to find out the risk factors and also to investigate new ways to prevent mediastinitis. First, we assessed operating room air contamination monitoring by comparing the bacteriological technique with continuous particle counting under the low-level contamination achieved by ultra-clean garment options in 66 coronary artery bypass grafting operations. Second, we examined surgical glove perforations and the changes in the bacterial flora of surgeons' fingertips in 116 open-heart operations. Third, the effect of a gentamicin-collagen sponge on preventing surgical site infections (SSI) was studied in a randomized controlled study with 557 participants. Finally, the incidence, outcome and risk factors of mediastinitis were studied in over 10,000 patients. With the alternative garment and textile system (cotton group and clean air suit group), the air counts fell from 25 to 7 colony-forming units/m³ (P<0.01). The contamination of the sternal wound was reduced by 46% and that of the leg wound by >90%. In only 17% of operations were both gloves found unpunctured. The frequency of glove perforations and the bacterial counts of hands were found to increase with operation time. With local gentamicin prophylaxis, slightly fewer SSIs (4.0% vs. 5.9%) and cases of mediastinitis (1.1% vs. 1.9%) occurred. We identified 120 cases of postoperative mediastinitis among 10,713 patients (1.1%). During the study period, the patient population grew significantly older, and the proportions of women and of patients with an ASA score >3 increased significantly. In multivariate logistic regression analysis, the only significant predictor of mediastinitis was obesity. Continuous particle monitoring is a good intraoperative method for controlling the air contamination related to theatre staff behavior during an individual operation. When a glove puncture is detected, both gloves should be changed. Before donning a new pair of gloves, renewed disinfection of the hands helps to keep their bacterial counts lower even towards the end of a long operation. The gentamicin-collagen sponge may have beneficial effects on the prevention of SSI, but further research is needed. Mediastinitis is not diminishing. Larger populations at risk, for example growing proportions of overweight patients, reinforce the importance of surveillance and pose a challenge in focusing preventive measures.
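A minimal sketch of the kind of multivariable risk-factor analysis mentioned at the end of the abstract (logistic regression of mediastinitis on candidate predictors), using an entirely hypothetical simulated cohort; the column names and effect sizes are illustrative only and not taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
# Hypothetical cohort in which obesity raises the odds of mediastinitis.
df = pd.DataFrame({
    "obesity":  rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
    "female":   rng.integers(0, 2, n),
})
logit_p = -4.5 + 1.2 * df["obesity"]                     # simulated true model
df["mediastinitis"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["obesity", "diabetes", "female"]])
model = sm.Logit(df["mediastinitis"], X).fit(disp=False)
print(model.summary())          # coefficients with p-values per predictor
print(np.exp(model.params))     # corresponding odds ratios
```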
Abstract:
Viable stuffed fullerene-like boron carbide nanoclusters, C50B34, C48B36-2, and their isomers based on an icosahedral B84 fragment of elemental β-rhombohedral boron have been investigated using density functional theory calculations. The structure and stability of these clusters are rationalized using the polyhedral skeletal electron counting and ring-cap orbital overlap compatibility rules. The curvature of the fullerene was found to play a vital role in achieving the most stable isomer, C50B34(3B). The large highest occupied molecular orbital-lowest unoccupied molecular orbital (HOMO-LUMO) gaps, three-dimensional aromaticity, and electron detachment energies support their high stability. Further, the IR- and Raman-active modes were identified.
Abstract:
An acyclic edge coloring of a graph is a proper edge coloring such that there are no bichromatic cycles. The acyclic chromatic index of a graph, denoted a'(G), is the minimum number k such that there is an acyclic edge coloring using k colors. It was conjectured by Alon, Sudakov and Zaks (and earlier by Fiamcik) that a'(G) ≤ Δ + 2, where Δ = Δ(G) denotes the maximum degree of the graph. Alon et al. also raised the question whether the complete graphs of even order are the only regular graphs which require Δ + 2 colors to be acyclically edge colored. In this article, using a simple counting argument we observe not only that this is not true, but in fact that all d-regular graphs with 2n vertices and d > n require at least d + 2 colors. We also show that a'(Kn,n) ≥ n + 2 when n is odd, using a more non-trivial argument. (Here Kn,n denotes the complete bipartite graph with n vertices on each side.) This lower bound for Kn,n can be shown to be tight for some families of complete bipartite graphs and for small values of n. We also infer that for every d, n such that d ≥ 5, n ≥ 2d + 3 and dn even, there exist d-regular graphs which require at least d + 2 colors to be acyclically edge colored. © 2009 Wiley Periodicals, Inc. J. Graph Theory 63: 226-230, 2010.
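The smallest odd case of the Kn,n bound can be checked by exhaustive search. The sketch below (a verification aid, not the paper's argument) tests every proper 4-edge-colouring of K3,3 for a bichromatic cycle; if none is acyclic, then a'(K3,3) ≥ 5 = n + 2, as the article states.

```python
from itertools import product

# Edges of K_{3,3}: left vertices 0..2, right vertices 3..5.
edges = [(u, v) for u in range(3) for v in range(3, 6)]

def proper(col):
    """No two edges sharing a vertex receive the same colour."""
    for i, (u1, v1) in enumerate(edges):
        for j in range(i + 1, len(edges)):
            u2, v2 = edges[j]
            if col[i] == col[j] and {u1, v1} & {u2, v2}:
                return False
    return True

def has_bichromatic_cycle(col, k):
    """Does the union of some two colour classes contain a cycle?"""
    for a in range(k):
        for b in range(a + 1, k):
            parent = list(range(6))
            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]
                    x = parent[x]
                return x
            for (u, v), c in zip(edges, col):
                if c in (a, b):
                    ru, rv = find(u), find(v)
                    if ru == rv:          # this edge closes a cycle
                        return True
                    parent[ru] = rv
    return False

k = 4
exists = any(proper(col) and not has_bichromatic_cycle(col, k)
             for col in product(range(k), repeat=len(edges)))
print("acyclic 4-edge-colouring of K3,3 exists:", exists)   # expected: False
```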
Abstract:
We consider the problem of detecting statistically significant sequential patterns in multineuronal spike trains. These patterns are characterized by ordered sequences of spikes from different neurons with specific delays between spikes. We have previously proposed a data-mining scheme to efficiently discover such patterns, which occur often enough in the data. Here we propose a method to determine the statistical significance of such repeating patterns. The novelty of our approach is that we use a compound null hypothesis that not only includes models of independent neurons but also models where neurons have weak dependencies. The strength of interaction among the neurons is represented in terms of certain pair-wise conditional probabilities. We specify our null hypothesis by putting an upper bound on all such conditional probabilities. We construct a probabilistic model that captures the counting process and use this to derive a test of significance for rejecting such a compound null hypothesis. The structure of our null hypothesis also allows us to rank-order different significant patterns. We illustrate the effectiveness of our approach using spike trains generated with a simulator.
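The authors' compound null hypothesis and exact test statistic are not reproduced here; the sketch below only illustrates the counting-based flavour of such a test: if, under the null, each opportunity to start a given ordered pattern completes it with probability at most p_max^(L-1) (with p_max an assumed bound on the pairwise conditional probabilities and L the pattern length), then the observed pattern count can be compared against a binomial tail. All numbers are illustrative.

```python
from scipy.stats import binom

def pattern_significance(n_opportunities, observed_count,
                         p_cond_max=0.1, pattern_len=4):
    """Upper-bound p-value for seeing >= observed_count completions of an
    ordered pattern, assuming each of n_opportunities completes it with
    probability at most p_cond_max ** (pattern_len - 1) under the null."""
    p_complete = p_cond_max ** (pattern_len - 1)
    return binom.sf(observed_count - 1, n_opportunities, p_complete)

# e.g. 5000 chances to start a 4-neuron pattern, 25 completions observed
print(pattern_significance(5000, 25))
```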
Abstract:
The structure and operation of CdTe, CdZnTe and Si pixel detectors based on crystalline semiconductors, bump bonding and CMOS technology, developed mainly at Oy Simage Ltd. and Oy Ajat Ltd., Finland, for X- and gamma-ray imaging are presented. This detector technology evolved from the development of Si strip detectors at the Finnish Research Institute for High Energy Physics (SEFT), which later merged with other physics research units to form the Helsinki Institute of Physics (HIP). General issues of X-ray imaging, such as the benefits of direct conversion of X-rays to signal charge compared to the indirect method and the pros and cons of photon counting vs. charge integration, are discussed. A novel design of Si and CdTe pixel detectors and the analysis of their imaging performance in terms of SNR, MTF, DQE and dynamic range are presented in detail. The analysis shows that directly converting crystalline semiconductor pixel detectors operated in the charge integration mode can be used in X-ray imaging very close to the theoretical performance limits in terms of efficiency and resolution. Examples of the application of the developed imaging technology to dental intraoral, panoramic and real-time X-ray imaging are given. A CdTe photon counting gamma imager is introduced. A physical model to calculate the photopeak efficiency of photon counting CdTe pixel detectors is developed and described in detail. Simulation results indicate that the charge sharing phenomenon due to diffusion of signal charge carriers limits the pixel size of photon counting detectors to about 250 μm. Radiation hardness issues related to gamma- and X-ray imaging detectors are discussed.
Abstract:
Nucleation is the first step in the formation of a new phase inside a mother phase. Two main forms of nucleation can be distinguished. In homogeneous nucleation, the new phase is formed in a uniform substance. In heterogeneous nucleation, on the other hand, the new phase emerges on a pre-existing surface (nucleation site). Nucleation is the source of about 30% of all atmospheric aerosol, which in turn has noticeable health effects and a significant impact on climate. Nucleation can be observed in the atmosphere, studied experimentally in the laboratory, and is the subject of ongoing theoretical research. This thesis attempts to be a link between experiment and theory. By comparing simulation results to experimental data, the aim is to (i) better understand the experiments and (ii) determine where the theory needs improvement. Computational fluid dynamics (CFD) tools were used to simulate homogeneous one-component nucleation of n-alcohols in argon and helium as carrier gases, homogeneous nucleation in the water-sulfuric acid system, and heterogeneous nucleation of water vapor on silver particles. In the nucleation of n-alcohols, vapor depletion, the carrier gas effect and the carrier gas pressure effect were evaluated, with a special focus on the pressure effect, whose dependence on vapor and carrier gas properties could be specified. The investigation of nucleation in the water-sulfuric acid system included a thorough analysis of the experimental setup, determining the flow conditions, vapor losses and nucleation zone. Experimental nucleation rates were compared to various theoretical approaches. We found that none of the considered theoretical descriptions of nucleation captured the role of water in the process at all relative humidities. Heterogeneous nucleation was studied in the activation of silver particles in a TSI 3785 particle counter, which uses water as its working fluid. The role of the contact angle was investigated, and the influence of incoming particle concentrations and homogeneous nucleation on counting efficiency was determined.
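For orientation, homogeneous-nucleation simulations of this kind are usually compared against classical nucleation theory, in which the nucleation rate depends exponentially on the free-energy barrier of forming the critical cluster (this is the textbook form, not necessarily the exact formulation used in the thesis):

```latex
J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_{\mathrm{B}} T}\right),
\qquad
\Delta G^{*} = \frac{16\pi\,\sigma^{3}\,v_{\mathrm{m}}^{2}}{3\left(k_{\mathrm{B}} T \ln S\right)^{2}},
```

where σ is the surface tension, v_m the molecular volume in the liquid, S the saturation ratio, and J_0 a kinetic prefactor.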
Abstract:
A large fraction of an XML document typically consists of text data. The XPath query language allows text search via the equal, contains, and starts-with predicates. Such predicates can be efficiently implemented using a compressed self-index of the document's text nodes. Most queries, however, contain some parts querying the text of the document, plus some parts querying the tree structure. It is therefore a challenge to choose an appropriate evaluation order for a given query, which optimally leverages the execution speeds of the text and tree indexes. Here the SXSI system is introduced. It stores the tree structure of an XML document using a bit array of opening and closing brackets plus a sequence of labels, and stores the text nodes of the document using a global compressed self-index. On top of these indexes sits an XPath query engine that is based on tree automata. The engine uses fast counting queries of the text index in order to dynamically determine whether to evaluate top-down or bottom-up with respect to the tree structure. The resulting system has several advantages over existing systems: (1) on pure tree queries (without text search) such as the XPathMark queries, the SXSI system performs on par or better than the fastest known systems MonetDB and Qizx, (2) on queries that use text search, SXSI outperforms the existing systems by 1-3 orders of magnitude (depending on the size of the result set), and (3) with respect to memory consumption, SXSI outperforms all other systems for counting-only queries.
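A minimal sketch of the succinct tree representation mentioned (a bit sequence of opening and closing brackets plus a label sequence). Real systems such as SXSI answer navigation queries with constant-time rank/select structures rather than the linear scans shown here; element and function names are illustrative.

```python
import xml.etree.ElementTree as ET

def encode(elem, bits, labels):
    """Depth-first encoding: 1 = opening bracket, 0 = closing bracket."""
    bits.append(1)
    labels.append(elem.tag)
    for child in elem:
        encode(child, bits, labels)
    bits.append(0)

doc = ET.fromstring("<a><b><c/></b><d/></a>")
bits, labels = [], []
encode(doc, bits, labels)
print(bits)     # [1, 1, 1, 0, 0, 1, 0, 0]
print(labels)   # ['a', 'b', 'c', 'd']

def find_close(bits, i):
    """Position of the closing bracket matching the opening bracket at i
    (linear scan here; succinct indexes answer this in O(1))."""
    depth = 0
    for j in range(i, len(bits)):
        depth += 1 if bits[j] else -1
        if depth == 0:
            return j
    raise ValueError("unbalanced sequence")

print(find_close(bits, 1))   # the subtree rooted at <b> spans positions 1..4
```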
Abstract:
Counting-rate meters normally used for finding pulse frequencies are sluggish in their response to any rapid change in the pulse repetition frequency (P.R.F.). An instrument is described which measures each pulse interval and, immediately afterwards, provides an output voltage proportional to the reciprocal of the interval duration. A response to a change in the P.R.F. that is as rapid as is physically possible is thus obtained. The instrument has wide application in low-level radiation detection and in several other fields, especially for rapidly varying counting rates.
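A software analogue of the idea (the original instrument is an analogue circuit): the instantaneous counting rate is taken as the reciprocal of each measured pulse interval, so a step change in the P.R.F. becomes visible after a single interval. The simulated pulse train below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Poisson pulse train: about 50 pulses/s for ~1 s, then a step to ~500 pulses/s.
intervals = np.concatenate([rng.exponential(1 / 50, 50),
                            rng.exponential(1 / 500, 500)])
timestamps = np.cumsum(intervals)

# Reciprocal-interval ratemeter: one rate estimate per pulse interval, so the
# step in the pulse repetition frequency shows up after a single interval.
inst_rate = 1.0 / np.diff(timestamps)
print(inst_rate[:3].round(1), "...", inst_rate[-3:].round(1))
```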
Abstract:
Aerosol particles deteriorate air quality, atmospheric visibility and our health. They affect the Earth's climate by absorbing and scattering sunlight, forming clouds, and also via several feedback mechanisms. The net effect on the radiative balance is negative, i.e. cooling, which means that particles counteract the effect of greenhouse gases. However, particles are one of the poorly known pieces in the climate puzzle. Some of the airborne particles are natural, some anthropogenic; some enter the atmosphere in particle form, while others form by gas-to-particle conversion. Unless the sources and dynamical processes shaping the particle population are quantified, they cannot be incorporated into climate models. The molecular-level understanding of new particle formation is still inadequate, mainly due to the lack of suitable measurement techniques to detect the smallest particles and their precursors. This thesis has contributed to our ability to measure newly formed particles. Three new condensation particle counter applications for measuring the concentration of nanoparticles were developed. The suitability of the methods for detecting both charged and electrically neutral particles and molecular clusters as small as 1 nm in diameter was thoroughly tested both in laboratory and field conditions. It was shown that condensation particle counting has reached the size scale of individual molecules and that, besides measuring concentrations, the counters can be used to obtain size information. In addition to atmospheric research, the particle counters could have various applications in other fields, especially in nanotechnology. Using the new instruments, the first continuous time series of neutral sub-3 nm particle concentrations were measured at two field sites, which represent two different kinds of environments: the boreal forest and the Atlantic coastline, both of which are known to be hot-spots for new particle formation. The contribution of ions to the total concentrations in this size range was estimated, and it could be concluded that the fraction of ions was usually minor, especially in boreal forest conditions. Since the ionization rate is connected to the amount of cosmic rays entering the atmosphere, the relative contribution of neutral versus charged nucleation mechanisms extends beyond academic interest and links the research directly to the current climate debate.
Abstract:
Radiometric determination methods such as alpha spectrometry require long counting times when low activities are to be determined. Mass spectrometric techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS), Thermal Ionisation Mass Spectrometry (TIMS) and Accelerator Mass Spectrometry (AMS) have shown several advantages over traditional methods when measuring long-lived radionuclides. Mass spectrometric methods for the determination of very low concentrations of elemental isotopes, and thereby isotopic ratios, have been developed using a variety of ion sources. Although primarily applied to the determination of the lighter stable-element isotopes and of radioactive isotopes in geological studies, the techniques can equally well be applied to the measurement of activity concentrations of long-lived low-level radionuclides in various samples using "isotope dilution" methods such as those applied in ICP-MS. Due to the low specific activity of long-lived radionuclides, many of these are more conveniently detected using mass spectrometric techniques. Mass spectrometry also enables the individual determination of Pu-239 and Pu-240, which cannot be obtained by alpha spectrometry. ICP-MS is a rapidly growing technique for the ultra-trace analytical determination of stable and long-lived isotopes and has wide potential within environmental science, including ecosystem tracer and radioecological studies. Such instrumentation of course needs good radiochemical separation to give its best performance. The objectives of the project are to identify current needs and problems within the low-level determination of long-lived radioisotopes by ICP-MS, to perform intercalibration and to develop and improve ICP-MS methods for the measurement of radionuclides and isotope ratios, and to develop new methods based on modified separation chemistry applied to new auxiliary equipment.
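For reference, the basic isotope-dilution relation underlying such ICP-MS measurements, written here for a two-isotope system (practical equations additionally carry mass-bias corrections and conversion to concentrations): mixing the sample (x) with a known amount of isotopically enriched spike (s) and measuring the blend's isotope ratio R_m gives the amount of analyte directly,

```latex
n_{x} \;=\; n_{s}\,\frac{R_{s}-R_{m}}{R_{m}-R_{x}},
```

where n_x and n_s are the amounts of the reference isotope in the sample and the spike, and R_x, R_s and R_m are the isotope-amount ratios (monitored isotope over reference isotope) of the sample, the spike and the mixture.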
Abstract:
The purpose of this study was to find out whether food-related lifestyle guides and explains product evaluations, specifically consumer perceptions and choice evaluations of five different food product categories: lettuce, minced meat, savoury sauce, goat cheese, and pudding. The opinions of consumers who shop in neighbourhood stores were considered most valuable. This study applies means-end chain (MEC) theory, according to which products are seen as means by which consumers attain meaningful goals. The food-related lifestyle (FRL) instrument was created to study lifestyles that reflect these goals. Further, this research has adopted the view that the FRL functions as a script which guides consumer behaviour. Two research methods were used in this study. The first was the laddering interview, the primary aim of which was to gather information for formulating the questionnaire of the main study. The survey consisted of two separate questionnaires. The first was the FRL questionnaire modified for this study. The aim of the other questionnaire was to determine the choice criteria for buying five different categories of food products. Before these analyses could be made, several data modifications were made following MEC analysis procedures. Besides forming FRL dimensions by counting sum scores from the FRL statements, a factor analysis was run in order to elicit the latent factors underlying the dimensions. The lifestyle factors found were adventurous, conscientious, enthusiastic, snacking, moderate, and uninvolved lifestyles. The association analyses were done separately for each choice of product as well as for each attribute-consequence linkage with the non-parametric Mann-Whitney U test. The testing variables were the FRL dimensions and the FRL lifestyle factors. In addition, the relation between the attribute-consequence linkages and the demographic variables was analysed. Results from this study showed that the choice of product is sequential, so that consumers first categorize products into groups based on specific criteria like health or convenience. It was shown that food-related lifestyles function as a script in food choice and that the FRL instrument can be used to predict consumer buying behaviour. Certain lifestyles were associated with the choice of each product category. The actual product choice within a product category then appeared to be a different matter. In addition, this study proposes a modification to the FRL instrument. The 'positive towards advertising' FRL dimension was modified to examine many kinds of information search, including the internet, TV, magazines, and other people. This new dimension, designated 'open to additional information', proved to be very robust and reliable in finding differences in consumer choice behaviour. Active additional information search was linked to the adventurous and snacking food-related lifestyles. The results of this study support the previous knowledge that consumers expect to get many benefits simultaneously when they buy food products. This study brought detailed information about the benefits sought, with the combination of benefits differing between products and between respondents. Household economy, pleasure and quality were emphasized in the choice of lettuce. Quality was the most significant benefit in choosing minced meat, but health-related benefits were often evaluated as well.
The dominant benefits linked to savoury sauce were household economic benefits, expected pleasurable experiences, and a lift in self-respect. The choice of goat cheese appeared not to be an economic decision, self-respect, pleasure, and quality being included in the choice criteria. In choosing pudding, the respondents considered the well-being of family members, and indulged their family members or themselves.
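A minimal sketch of the association test used (the non-parametric Mann-Whitney U test): FRL dimension scores of respondents who did versus did not mention a given attribute-consequence linkage are compared. The data, group sizes and the dimension chosen are entirely hypothetical.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)

# Hypothetical FRL 'snacking' sum scores for respondents who did / did not
# mention a health-related attribute-consequence linkage for a product.
mentioned     = rng.normal(22, 4, 120)
not_mentioned = rng.normal(20, 4, 180)

stat, p = mannwhitneyu(mentioned, not_mentioned, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.4f}")
```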