934 results for Capture probability
Abstract:
Advancements in analysis techniques have led to a rapid accumulation of biological data in databases. Such data often take the form of sequences of observations, for example DNA sequences and the amino acid sequences of proteins. The scale and quality of the data promise answers to various biologically relevant questions in more detail than has been possible before. For example, one may wish to identify regions of an amino acid sequence that are important for the function of the corresponding protein, or investigate how characteristics at the level of the DNA sequence affect the adaptation of a bacterial species to its environment. Many of the interesting questions are intimately associated with understanding the evolutionary relationships among the items under consideration. The aim of this work is to develop novel statistical models and computational techniques to meet the challenge of deriving meaning from the increasing amounts of data. Our main concern is modeling the evolutionary relationships based on the observed molecular data. We operate within a Bayesian statistical framework, which allows a probabilistic quantification of the uncertainty related to a particular solution. As the basis of our modeling approach we use a partition model, which describes the structure of the data by dividing the data items into clusters of related items. Generalizations and modifications of the partition model are developed and applied to various problems. Large-scale data sets also pose a computational challenge. The models used to describe the data must be realistic enough to capture the essential features of the modeling task at hand but, at the same time, simple enough to make it possible to carry out the inference in practice. The partition model fulfills these two requirements. Problem-specific features can be taken into account by modifying the prior probability distributions of the model parameters. The computational efficiency stems from the ability to integrate out the parameters of the partition model analytically, which enables the use of efficient stochastic search algorithms.
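The abstract does not spell out the model, but as a minimal sketch of the idea it highlights, assuming a Dirichlet-multinomial formulation in which the per-cluster parameters are integrated out analytically, a candidate partition can be scored from count data alone (the function names and the toy alignment below are illustrative, not taken from the thesis):

```python
from collections import Counter
from math import lgamma

def cluster_log_marginal(seqs, alpha=1.0, alphabet="ACGT"):
    """Log marginal likelihood of one cluster of aligned sequences under a
    Dirichlet-multinomial model: per-column base frequencies are integrated
    out analytically, so no continuous parameters remain."""
    n_cols = len(seqs[0])
    a0 = alpha * len(alphabet)
    log_ml = 0.0
    for j in range(n_cols):
        counts = Counter(s[j] for s in seqs)
        n = sum(counts.values())
        log_ml += lgamma(a0) - lgamma(a0 + n)
        for base in alphabet:
            log_ml += lgamma(alpha + counts[base]) - lgamma(alpha)
    return log_ml

def partition_log_score(partition, seqs):
    """Score a partition (list of index lists) as the sum of per-cluster
    log marginal likelihoods; a stochastic search compares these directly."""
    return sum(cluster_log_marginal([seqs[i] for i in c]) for c in partition)

# toy example: two candidate clusterings of four short sequences
seqs = ["ACGT", "ACGA", "TTGT", "TTGA"]
print(partition_log_score([[0, 1], [2, 3]], seqs))  # related pairs together
print(partition_log_score([[0, 2], [1, 3]], seqs))  # mismatched pairs
```

Because the score is a closed-form function of counts, a stochastic search over partitions only needs to recompute the clusters affected when items are moved between them.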
Abstract:
The transport of live fish is a crucial step in establishing fish culture in captivity, and it is especially challenging for species that have not commonly been cultured before; transport and handling methods therefore need to be optimized and tailored. This study describes the use of tuna tubes for small-scale transport of medium-sized pelagic fish from the family Scombridae. Tuna tubes are an array of vertical tubes that hold the fish while fresh seawater is pumped up the tubes and through the fish's mouth and gills, providing oxygen and removing wastes. In this study, 19 fish were captured using rod and line, and 42% of the captured fish were transported alive in the custom-designed tuna tubes to an on-shore holding tank: five mackerel tuna (Euthynnus affinis) and three leaping bonito (Cybiosarda elegans). Of these, just three (15.8% of all fish) acclimatized to the tank's conditions. Based on these results, we discuss an improved design of the tuna tubes that has the potential to increase survival rates and enable a simple, low-cost method of transporting live pelagic fish.
Abstract:
Network data packet capture and replay capabilities are basic requirements for forensic analysis of faults and security-related anomalies, as well as for testing and development. Cyber-physical networks, in which data packets are used to monitor and control physical devices, must operate within strict timing constraints in order to match the hardware devices' characteristics. Standard network monitoring tools are unsuitable for such systems because they cannot guarantee to capture all data packets, may introduce their own traffic into the network, and cannot reliably reproduce the original timing of data packets. Here we present a high-speed network forensics tool specifically designed for capturing and replaying data traffic in Supervisory Control and Data Acquisition (SCADA) systems. Unlike general-purpose "packet capture" tools, it does not affect the observed network's data traffic and guarantees that the original packet ordering is preserved. Most importantly, it allows replay of network traffic that precisely matches its original timing. The tool was implemented by developing novel user interface and back-end software for a special-purpose network interface card. Experimental results show a clear improvement in data capture and replay capabilities over standard network monitoring methods and general-purpose forensics solutions.
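The tool described here relies on a special-purpose network interface card, which no software sketch can reproduce. Purely as a hypothetical illustration of timing-preserving replay in user space, one might busy-wait on the recorded inter-packet gaps before resending each frame (Linux-only raw sockets; interface name and data layout are assumptions):

```python
import socket
import time

def replay(packets, interface="eth0"):
    """Replay raw Ethernet frames, reproducing the recorded inter-packet
    gaps with a busy-wait so that sleep() jitter is avoided.
    `packets` is a list of (capture_timestamp_seconds, frame_bytes) tuples."""
    # AF_PACKET raw sockets are Linux-specific and require root privileges.
    sock = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)
    sock.bind((interface, 0))

    start_wall = time.perf_counter()
    start_capture = packets[0][0]
    for ts, frame in packets:
        target = start_wall + (ts - start_capture)
        while time.perf_counter() < target:   # busy-wait for precision
            pass
        sock.send(frame)
    sock.close()
```

Even with busy-waiting, a user-space replayer is at the mercy of the operating system scheduler, which is exactly the limitation the hardware-assisted approach in the abstract is meant to overcome.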
Abstract:
Hydrologic impacts of climate change are usually assessed by downscaling General Circulation Model (GCM) output of large-scale climate variables to local-scale hydrologic variables. Such an assessment is characterized by uncertainty resulting from the ensembles of projections generated with multiple GCMs, known as intermodel or GCM uncertainty. Ensemble averaging with the assignment of weights to GCMs based on model evaluation is one method of addressing this uncertainty, and it is used in the present study for regional-scale impact assessment. GCM outputs of large-scale climate variables are downscaled to subdivisional-scale monsoon rainfall. Weights are assigned to the GCMs on the basis of model performance and model convergence, which are evaluated with the Cumulative Distribution Functions (CDFs) generated from the downscaled GCM output (for both the 20th-century [20C3M] and future scenarios) and observed data. The ensemble averaging approach, with the assignment of weights to GCMs, is itself characterized by uncertainty caused by partial ignorance, which stems from the nonavailability of the outputs of some GCMs for a few scenarios (in the Intergovernmental Panel on Climate Change [IPCC] data distribution center for Assessment Report 4 [AR4]). This uncertainty is modeled with imprecise probability, i.e., the probability is represented as an interval gray number. Furthermore, the CDF generated with one GCM is entirely different from that with another, and therefore the use of multiple GCMs results in a band of CDFs. Representing this band of CDFs with a single-valued weighted mean CDF may be misleading. Such a band of CDFs can only be represented with an envelope that contains all the CDFs generated with the available GCMs. An imprecise CDF represents such an envelope, which not only contains the CDFs generated with all the available GCMs but also, to an extent, accounts for the uncertainty resulting from the missing GCM output. This concept of imprecise probability is also validated in the present study. The imprecise CDFs of monsoon rainfall are derived for three 30-year time slices, the 2020s, 2050s and 2080s, under the A1B, A2 and B1 scenarios. The model is demonstrated with the prediction of monsoon rainfall in the Orissa meteorological subdivision, which shows a possible decreasing trend in the future.
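As a hedged illustration of the envelope idea (not the study's interval gray-number formulation), the sketch below builds empirical CDFs from several hypothetical GCM rainfall samples and takes their pointwise lower and upper bounds, so that the band of CDFs is summarized by an interval at each rainfall value rather than by a single weighted mean:

```python
import numpy as np

def empirical_cdf(sample, grid):
    """Empirical CDF of one GCM's downscaled rainfall, evaluated on a grid."""
    sample = np.sort(np.asarray(sample))
    return np.searchsorted(sample, grid, side="right") / len(sample)

def imprecise_cdf(samples, grid):
    """Envelope (lower and upper bounds) over the band of CDFs from several
    GCMs: at every rainfall value the interval [lower, upper] brackets all
    member CDFs, a simple stand-in for an interval-valued probability."""
    cdfs = np.array([empirical_cdf(s, grid) for s in samples])
    return cdfs.min(axis=0), cdfs.max(axis=0)

# toy example: three hypothetical GCM ensembles of monsoon rainfall (mm)
rng = np.random.default_rng(0)
samples = [rng.normal(mu, 80, 200) for mu in (900, 950, 1000)]
grid = np.linspace(600, 1300, 8)
lo, hi = imprecise_cdf(samples, grid)
print(np.round(lo, 2))
print(np.round(hi, 2))
```

The width of the interval [lower, upper] at a given rainfall value is one simple indicator of how strongly the GCMs disagree there.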
Abstract:
A new simple-pole model for muon capture by 40Ca with emission of neutrons is suggested, in close analogy with radiative pion capture, and the calculated energy spectrum of the emitted neutron agrees well with the experimental results of the Columbia group for higher neutron energies.
Abstract:
Defence against pathogens is a vital need of all living organisms and has led to the evolution of complex immune mechanisms. However, although immunocompetence (the ability to resist pathogens and control infection) has in recent decades become a focus for research in evolutionary ecology, the variation in immune function observed in natural populations is still relatively little understood. This thesis examines sources of this variation (environmental, genetic and maternal effects) during the nestling stage and its fitness consequences in wild populations of passerines: the blue tit (Cyanistes caeruleus) and the collared flycatcher (Ficedula albicollis). A developing organism may face a dilemma as to whether to allocate limited resources to growth or to immune defences. The optimal level of investment in immunity is shaped by the specific requirements of the environment. If the probability of contracting an infection is low, maintaining high growth rates even at the expense of immune function may be advantageous for nestlings, as body mass is usually a good predictor of post-fledging survival. In experiments with blue tits and haematophagous hen fleas (Ceratophyllus gallinae), using two methods, methionine supplementation (to manipulate nestlings' allocation of resources to cellular immune function) and food supplementation (to increase resource availability), I confirmed that there is a trade-off between growth and immunity and that the abundance of ectoparasites is an environmental factor affecting the allocation of resources to immune function. A cross-fostering experiment also revealed that environmental heterogeneity in the abundance of ectoparasites may contribute to maintaining additive genetic variation in immunity and other traits. Animal-model analysis of extensive data collected from the population of collared flycatchers on Gotland (Sweden) allowed examination of the narrow-sense heritability of PHA-response, the most commonly used index of cellular immunocompetence in avian studies. PHA-response is not heritable in this population, but it is subject to a non-heritable (presumably maternal) effect of origin. However, experimental manipulation of yolk androgen levels indicates that the mechanism of the maternal effect on PHA-response is not in ovo deposition of androgens. The relationship between PHA-response and recruitment was studied for over 1300 collared flycatcher nestlings. Multivariate selection analysis shows that it is body mass, not PHA-response, that is under direct selection; PHA-response appears to be related to recruitment because of its positive relationship with body mass. These results imply either that PHA-response fails to capture the immune mechanisms that are relevant for defence against the pathogens encountered by fledglings, or that the selection pressure from parasites is not as strong as commonly assumed.
Abstract:
We derive a very general expression for the survival probability and the first passage time distribution of a particle executing Brownian motion in full phase space with an absorbing boundary condition at a point in position space, which is valid irrespective of the statistical nature of the dynamics. The expression, together with Jensen's inequality, naturally leads to a lower bound on the actual survival probability and an approximate first passage time distribution. These are expressed in terms of the position-position, velocity-velocity, and position-velocity variances. Knowledge of these variances enables one to compute a lower bound on the survival probability and consequently the first passage time distribution. As examples, we compute these for a Gaussian Markovian process and, in the case of a non-Markovian process, for an exponentially decaying friction kernel and also for a power-law friction kernel. Our analysis shows that the survival probability decays exponentially at long times, irrespective of the nature of the dynamics, with an exponent equal to the transition state rate constant.
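The paper's result is an analytic bound; purely as a simplified numerical sketch (a free particle with no barrier, unlike the transition-state setting of the abstract, with all parameter values invented), the survival probability and first passage time density for Brownian motion in full phase space can be estimated by simulating underdamped Langevin dynamics with an absorbing point:

```python
import numpy as np

def survival_probability(n_traj=5000, n_steps=4000, dt=0.005,
                         gamma=1.0, kT=1.0, x_abs=2.0, seed=1):
    """Monte Carlo estimate of the survival probability S(t) for an
    underdamped Langevin particle (unit mass, no potential) started at
    x=0, v=0 and absorbed at x=x_abs."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_traj)
    v = np.zeros(n_traj)
    alive = np.ones(n_traj, dtype=bool)
    surv = np.empty(n_steps)
    noise_amp = np.sqrt(2.0 * gamma * kT * dt)  # fluctuation-dissipation
    for k in range(n_steps):
        xi = rng.standard_normal(n_traj)
        v = v - gamma * v * dt + noise_amp * xi
        x = x + v * dt
        alive &= (x < x_abs)           # absorb trajectories that crossed
        surv[k] = alive.mean()
    return np.arange(1, n_steps + 1) * dt, surv

t, S = survival_probability()
# first passage time density f(t) = -dS/dt (finite-difference estimate)
f = -np.gradient(S, t)
print(S[::800])  # S(t) at a few times
```

Such a simulation can serve as a numerical check of a variance-based lower bound in cases where the variances themselves are known analytically.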
Abstract:
The study of the evolution of species or organisms is essential for various biological applications. Evolution is typically studied at the molecular level by analyzing the mutations in the DNA sequences of organisms. Techniques have been developed for building phylogenetic or evolutionary trees for a set of sequences. Though phylogenetic trees capture the overall evolutionary relationships among the sequences, they do not reveal fine-level details of the evolution. In this work, we attempt to resolve various fine-level sequence transformation details associated with a phylogenetic tree using cellular automata. In particular, our work tries to determine the cellular automata rules for neighbor-dependent mutations of segments of DNA sequences. We also determine the number of time steps needed for the evolution of a progeny from an ancestor and the unknown segments of the intermediate sequences in the phylogenetic tree. Due to the vast number of possible cellular automata rules, we have developed a grid system that performs parallel guided explorations of the rules on grid resources. We demonstrate our techniques by conducting experiments on a grid comprising machines in three countries, obtaining potentially useful statistics regarding evolution in three HIV sequences. In particular, our work is able to verify the phenomenon of neighbor-dependent mutation and finds that certain combinations of neighbor-dependent mutations, defined by a cellular automata rule, occur with greater than 90% probability. We also find the average number of time steps for mutations for some branches of the phylogenetic tree over a large number of possible transformations, with standard deviations less than 2.
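As a minimal sketch of what a neighbor-dependent mutation rule looks like when applied as a cellular automaton (the rule table and sequences below are invented for illustration; finding such rules is what the grid search in the abstract is for):

```python
# A neighbor-dependent mutation rule applied as a cellular automaton
# over a DNA segment; synchronous update, ends kept fixed.
def step(seq, rule):
    """One CA time step: each base is rewritten according to its
    (left, self, right) neighborhood."""
    out = list(seq)
    for i in range(1, len(seq) - 1):
        key = seq[i - 1:i + 2]
        out[i] = rule.get(key, seq[i])   # unlisted neighborhoods are unchanged
    return "".join(out)

def evolve(seq, rule, n_steps):
    """Apply the rule for a fixed number of time steps (ancestor -> progeny)."""
    for _ in range(n_steps):
        seq = step(seq, rule)
    return seq

# hypothetical rule: e.g. a C flanked by A and G mutates to T
rule = {"ACG": "T", "TAT": "G", "GGA": "C"}
ancestor = "AACGTATGGAC"
print(evolve(ancestor, rule, 3))
```

Searching over all rule tables of this kind grows combinatorially with the neighborhood size and alphabet, which is why the exploration is parallelized over grid resources.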
Abstract:
We consider the problem of detecting statistically significant sequential patterns in multineuronal spike trains. These patterns are characterized by ordered sequences of spikes from different neurons with specific delays between spikes. We have previously proposed a data-mining scheme to efficiently discover such patterns when they occur often enough in the data. Here we propose a method to determine the statistical significance of such repeating patterns. The novelty of our approach is that we use a compound null hypothesis that includes not only models of independent neurons but also models in which neurons have weak dependencies. The strength of interaction among the neurons is represented in terms of certain pairwise conditional probabilities. We specify our null hypothesis by putting an upper bound on all such conditional probabilities. We construct a probabilistic model that captures the counting process and use it to derive a test of significance for rejecting such a compound null hypothesis. The structure of our null hypothesis also allows us to rank-order the different significant patterns. We illustrate the effectiveness of our approach using spike trains generated with a simulator.
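The significance test in the paper is built on a dedicated counting-process model; purely as a rough, hypothetical illustration of how an upper bound on the pairwise conditional probabilities can be turned into a testable bound on pattern counts, one can dominate the count by a binomial (assuming, for simplicity, independent candidate windows, which the actual model does not require):

```python
from scipy.stats import binom

def pattern_p_value_bound(count, n_windows, p_first, eps, length):
    """Upper bound on the p-value of observing `count` or more occurrences
    of a sequential pattern of `length` neurons in `n_windows` candidate
    windows. Under the compound null every pairwise conditional probability
    is at most `eps`, so the per-window occurrence probability is bounded by
    p_first * eps**(length - 1); the count is then dominated by a binomial."""
    p_window = p_first * eps ** (length - 1)
    # P(X >= count) for X ~ Binomial(n_windows, p_window)
    return binom.sf(count - 1, n_windows, p_window)

# toy numbers: a 4-neuron pattern seen 9 times in 2000 candidate windows
print(pattern_p_value_bound(count=9, n_windows=2000,
                            p_first=0.05, eps=0.3, length=4))
```

Because the bound depends only on the pattern length and its observed count, patterns that remain significant under the same eps can be rank-ordered by their p-value bounds, in the spirit of the ranking described in the abstract.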
Abstract:
Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge because the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important because the effect of the radiation on the tissue is correlated with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international recommendations for conventional radiotherapy dosimetry are not directly applicable to BNCT. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis, the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability test with activation detectors has generally been reproducible within the recommended tolerance value of 2%. An established toolkit for epithermal neutron beams for the determination of the dose components is presented and applied in an international dosimetric intercomparison. The quantities measured by the participating groups (neutron flux, fast neutron dose and photon dose) were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3 to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centers. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years. The presented results exclude the severe sensitivity changes to thermal neutrons that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, the comparison of results with ionisation chambers and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For the neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution, normalised to the dose maximum, measured with the MAGIC polymer gel agreed well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, interpretation of the results was complicated by the presence of high-LET radiation.
Abstract:
Boron neutron capture therapy (BNCT) is a radiotherapy that has mainly been used to treat malignant brain tumours, melanomas, and head and neck cancer. In BNCT, the patient receives an intravenous infusion of a 10B carrier, which accumulates in the tumour area. The tumour is irradiated with epithermal or thermal neutrons, which induce boron neutron capture reactions that generate heavy charged particles that damage tumour cells. In Finland, boronophenylalanine-fructose (BPA-F) is used as the 10B carrier. Currently, the transfer of boron from blood to tumour, as well as the spatial and temporal accumulation of boron in the brain, are not precisely known. Proton magnetic resonance spectroscopy (1H MRS) could be used for selective BPA-F detection and quantification, as the aromatic protons of BPA resonate in a spectral region that is clear of brain metabolite signals. This study, which included both phantom and in vivo experiments, examined the validity of 1H MRS as a tool for BPA detection. In the phantom study, BPA quantification was studied at 1.5 and 3.0 T with single-voxel 1H MRS, and at 1.5 T with magnetic resonance spectroscopic imaging (MRSI). The detection limit of BPA was determined in phantom conditions at 1.5 T and 3.0 T using single-voxel 1H MRS, and at 1.5 T using MRSI. In phantom conditions, BPA quantification accuracies of ±5% and ±15% were achieved with single-voxel MRS using an external or an internal (internal water signal) concentration reference, respectively. For MRSI, a quantification accuracy of <5% was obtained using an internal concentration reference (creatine). The detection limits of BPA in phantom conditions for the PRESS sequence were 0.7 mM (3.0 T) and 1.4 mM (1.5 T) with 20 × 20 × 20 mm3 single-voxel MRS, and 1.0 mM with acquisition-weighted MRSI (nominal voxel volume 10(RL) × 10(AP) × 7.5(SI) mm3). In the in vivo study, MRSI, single-voxel MRS, or both were performed for ten patients (patients 1-10) on the day of BNCT. Three patients had glioblastoma multiforme (GBM), five patients had a recurrent or progressing GBM or anaplastic astrocytoma grade III, and two patients had head and neck cancer. For nine patients (patients 1-9), MRS/MRSI was performed 70-140 min after the second irradiation field, and for one patient (patient 10), the MRSI study began 11 min before the end of the BPA-F infusion and ended 6 min after the end of the infusion. For comparison, single-voxel MRS was performed before BNCT for two patients (patients 3 and 9), and for one patient (patient 9), MRSI was performed one month after treatment. For one patient (patient 10), MRSI was also performed four days before the infusion. Signals from the aromatic region of the tumour spectrum were detected on the day of BNCT in three patients, indicating that, in favourable cases, it is possible to detect BPA in vivo in the patient's brain after BNCT treatment or at the end of the BPA-F infusion. However, because the shape and position of the detected signals did not exactly match the BPA spectrum detected in vitro, assignment to BPA is difficult. The opportunity to perform MRS immediately after the end of the BPA-F infusion for more patients would be necessary to evaluate the suitability of 1H MRS for BPA detection or quantification for treatment planning purposes. Nevertheless, it could be possible to use MRSI as a criterion in selecting patients for BNCT.