177 results for box-counting method
at Université de Lausanne, Switzerland
Abstract:
Liquid scintillation counting (LSC) is one of the most widely used methods for determining the activity of 241Pu. One of the main challenges of this counting method is the efficiency calibration of the system for the low beta energies of 241Pu (Emax = 20.8 keV). In this paper we compare the two most frequently used methods, the CIEMAT/NIST efficiency tracing (CNET) method and the experimental quench correction curve method. Both methods proved to be reliable, and agree within their uncertainties, for the expected quenching conditions of the sources.
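A minimal sketch of the experimental quench correction curve idea mentioned above: counting efficiency is fitted against a quench-indicating parameter measured on quenched standards, and the fitted curve converts a sample's net count rate into activity. All numerical values, the choice of a second-order polynomial and the use of tSIE as the quench parameter are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Illustrative calibration data: quench-indicating parameter (e.g. tSIE)
# versus measured counting efficiency for a set of quenched 241Pu standards.
# Values are made up for this sketch.
quench = np.array([300.0, 400.0, 500.0, 600.0, 700.0])
efficiency = np.array([0.18, 0.25, 0.31, 0.36, 0.40])

# Fit a low-order polynomial as the quench correction curve.
coeffs = np.polyfit(quench, efficiency, deg=2)
quench_curve = np.poly1d(coeffs)

# Convert a measured net count rate (cps) of an unknown source to activity (Bq).
net_cps = 12.4          # hypothetical net count rate
sample_quench = 520.0   # hypothetical quench parameter of the sample
activity_bq = net_cps / quench_curve(sample_quench)
print(f"Estimated activity: {activity_bq:.1f} Bq")
```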
Abstract:
The physical disector is a method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of this method can be assessed by comparing the estimated cell number with the true number determined by a direct counting method in serial sections. We reconstructed one fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks the number of sensory neurons was estimated and counted by the physical disector and direct counting methods, respectively. In addition, using the coordinates of nuclei from the direct counting, we simulated, with a Matlab program, disector pairs separated by increasing distances in a ganglion model. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller. Under these conditions the error between the results of the physical disector and direct counting does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm), the error increases rapidly up to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
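A simplified one-dimensional sketch of the simulation described above: disector pairs of a given section thickness are placed at increasing separations along a synthetic ganglion, and the resulting estimates are compared with the direct count. The ganglion length, section thickness, nucleus count and the uniform placement of nuclei are all hypothetical; with uniformly placed nuclei the estimator stays close to the truth at every spacing, whereas the spacing dependence reported in the paper arises from the clustered arrangement of real nuclei.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ganglion model: neuron nucleus positions (um) along the section axis.
L = 1000.0                                       # reconstructed length (um)
true_positions = rng.uniform(0.0, L, size=5000)  # direct count = 5000 nuclei
t = 2.0                                          # thickness sampled by one disector pair (um)

def disector_estimate(positions, spacing, thickness, length):
    """Estimate the total neuron number from disector pairs placed every `spacing` um."""
    starts = np.arange(0.0, length - thickness, spacing)
    counted = sum(((positions >= s) & (positions < s + thickness)).sum()
                  for s in starts)
    sampled_fraction = len(starts) * thickness / length
    return counted / sampled_fraction

for spacing in (20, 60, 100, 200):
    est = disector_estimate(true_positions, spacing, t, L)
    err = abs(est - len(true_positions)) / len(true_positions) * 100
    print(f"spacing {spacing:>3} um: estimate {est:7.0f}, error {err:4.1f} %")
```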
Abstract:
The 4πβ-γ coincidence counting method and its close relatives are widely used for the primary standardization of radioactivity. Both the general formalism and specific implementations of these methods have been well documented. In particular, previous papers contain the extrapolation equations used for various decay schemes, methods for determining model parameters and, in some cases, tabulated uncertainty budgets. Two things often lacking from experimental reports are the rationale for estimating uncertainties in a specific way and the details of exactly how a specific component of uncertainty was estimated. Furthermore, correlations among the components of uncertainty are rarely mentioned. To fill in these gaps, the present article shares best practices from a few practitioners of this craft. We explain and demonstrate with examples how these approaches can be used to estimate the uncertainty of the reported massic activity. We describe uncertainties due to measurement variability, extrapolation functions, dead-time and resolving-time effects, gravimetric links, and nuclear and atomic data. Most importantly, a thorough understanding of the measurement system and its response to the decay under study can be used to derive a robust estimate of the measurement uncertainty.
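As a concrete illustration of the extrapolation equations referred to above, the sketch below applies the textbook efficiency-extrapolation form of 4πβ-γ coincidence counting: the apparent activity N_β·N_γ/N_c is regressed against the beta-channel inefficiency (1-ε_β)/ε_β and extrapolated to 100% beta efficiency. The count rates are invented for a hypothetical simple decay scheme; real evaluations also apply dead-time, resolving-time and background corrections and use decay-scheme-specific extrapolation functions, as discussed in the abstract.

```python
import numpy as np

# Hypothetical beta-channel, gamma-channel and coincidence count rates (s^-1)
# recorded while the beta efficiency is varied (e.g. by changing the threshold).
n_beta  = np.array([950., 900., 850., 800., 750.])
n_gamma = np.array([600., 600., 600., 600., 600.])
n_coinc = np.array([570., 540., 510., 480., 450.])

eff_beta = n_coinc / n_gamma          # apparent beta efficiency
y = n_beta * n_gamma / n_coinc        # apparent activity (Bq)
x = (1.0 - eff_beta) / eff_beta       # inefficiency parameter

# Linear extrapolation of the apparent activity to 100 % beta efficiency (x -> 0).
slope, intercept = np.polyfit(x, y, 1)
print(f"Extrapolated activity N0 = {intercept:.1f} Bq")
```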
Abstract:
This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of space, time and space-time clusters of environmental point data. The developed clustering methods were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can typically be modelled as stochastic point processes where each event, e.g. the forest fire ignition point, is characterised by its spatial location and occurrence in time. Additionally, information such as burned area, ignition causes, land use, and topographic, climatic and meteorological features can also be used to characterise the studied phenomenon. The space-time pattern characterisation therefore represents a powerful tool to understand the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. scan statistics) analysis. Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider the complex spatial constraints, high variability and multivariate nature of the events. We therefore propose a statistical framework that takes into account the complexities of the geographical space where phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with differently constrained geographical spaces, thus assessing the relative degree of clustering of the real distribution. Moreover, exclusively for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructures, and one for predicting fire ignition susceptibility. The main objective of this thesis was thus to carry out basic statistical and geospatial research with a strong applied component, in order to analyse and describe complex phenomena and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. This thesis thereby responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in 5 scientific journals. National and international collaborations were also established and successfully accomplished.
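A minimal sketch of the box-counting fractal method cited among the global measures above, applied to a synthetic point pattern: the number of occupied grid cells N(ε) is counted for several cell sizes ε, and the dimension is taken as the slope of log N(ε) versus log(1/ε). The point pattern, the box sizes and the absence of a validity-domain constraint are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical point pattern (e.g. fire ignition points) in a unit square.
points = rng.random((2000, 2))

def box_counting_dimension(pts, box_sizes):
    """Estimate the box-counting (capacity) dimension of a 2-D point pattern."""
    counts = []
    for eps in box_sizes:
        # Assign every point to a grid cell of side eps and count occupied cells.
        cells = np.unique(np.floor(pts / eps).astype(int), axis=0)
        counts.append(len(cells))
    # Slope of log N(eps) versus log(1/eps) gives the dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

sizes = [1/4, 1/8, 1/16, 1/32, 1/64]
print(f"Estimated box-counting dimension: {box_counting_dimension(points, sizes):.2f}")
```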
Abstract:
Fractal geometry is a fundamental approach for describing the complex irregularities of the spatial structure of point patterns. The present research characterizes the spatial structure of the Swiss population distribution in the three Swiss geographical regions (Alps, Plateau and Jura) and at the entire country level. These analyses were carried out using fractal and multifractal measures for point patterns, which enable the estimation of the degree of spatial clustering of a distribution at different scales. The Swiss population dataset is given on a grid of points and can thus be modelled as a "point process" where each point is characterized by its spatial location (geometrical support) and a number of inhabitants (measured variable). The fractal characterization was performed by means of the box-counting dimension, and the multifractal analysis was conducted through Rényi's generalized dimensions and the multifractal spectrum. Results showed that the four population patterns are all multifractal and present different clustering behaviours. Applying multifractal and fractal methods to different geographical regions and at different scales allowed us to quantify and describe the dissimilarities between the four structures and their underlying processes. This paper is the first Swiss geodemographic study applying multifractal methods to high-resolution data.
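To complement the box-counting sketch above, the following is a minimal illustration of Rényi's generalized dimensions D_q for a weighted point pattern, estimated by box counting on the measure (here, inhabitants per grid point). The coordinates, the lognormal weights and the box sizes are synthetic placeholders, not the Swiss census data used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical geodemographic grid: point locations with a number of inhabitants.
coords = rng.random((5000, 2))
weights = rng.lognormal(mean=3.0, sigma=1.0, size=5000)   # inhabitants per point

def renyi_dimension(coords, weights, q, box_sizes):
    """Estimate the Renyi generalized dimension D_q by box counting on a measure."""
    logs_eps, logs_moment = [], []
    for eps in box_sizes:
        idx = np.floor(coords / eps).astype(int)
        # Sum the measure (inhabitants) falling in each occupied box.
        _, inverse = np.unique(idx, axis=0, return_inverse=True)
        mass = np.bincount(inverse.ravel(), weights=weights)
        p = mass / mass.sum()
        logs_eps.append(np.log(eps))
        if q == 1:
            logs_moment.append(np.sum(p * np.log(p)))        # information dimension
        else:
            logs_moment.append(np.log(np.sum(p ** q)) / (q - 1))
    slope, _ = np.polyfit(logs_eps, logs_moment, 1)
    return slope

sizes = [1/4, 1/8, 1/16, 1/32]
for q in (0, 1, 2):
    print(f"D_{q} = {renyi_dimension(coords, weights, q, sizes):.2f}")
```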
Abstract:
Maintenance of corneal transparency is crucial for vision and depends mainly on the endothelium, a non-proliferative monolayer of cells covering the inner part of the cornea. When endothelial cell density falls below a critical threshold, the barrier and "pump" functions of the endothelium are compromised, which results in corneal oedema and loss of visual acuity. The conventional treatment for such a severe disorder is corneal grafting. Unfortunately, there is a worldwide shortage of donor corneas, making it necessary to improve tissue survival and storage after harvesting. It was recently reported that the ROCK inhibitor Y-27632 promotes adhesion, inhibits apoptosis, increases the number of proliferating monkey corneal endothelial cells in vitro and enhances corneal endothelial wound healing both in vitro and in vivo in animal models. Using organ-cultured human corneas (N = 34), the effect of the ROCK inhibitor was evaluated in vitro and ex vivo. Toxicity, corneal endothelial cell density, cell proliferation, apoptosis, cell morphometry, adhesion and the wound healing process were evaluated by live/dead assay, a standard cell counting method, EdU labelling, and Ki67, Caspase3, ZO-1 and Actin immunostaining. We demonstrated for the first time in human corneal endothelial cells, ex vivo and in vitro, that the ROCK inhibitor did not induce any toxic effect and did not alter cell viability. ROCK inhibitor treatment did not induce proliferation of human corneal endothelial cells. However, the ROCK inhibitor significantly enhanced adhesion and wound healing. The present study shows that the selective ROCK inhibitor Y-27632 has no effect on the proliferative capacity of human corneal endothelial cells, but alters cellular behaviour: it induces changes in cell shape, increases cell adhesion and enhances wound healing ex vivo and in vitro. Its absence of toxicity, as demonstrated herein, is relevant for its use in human therapy.
Abstract:
The Cancer Vaccine Consortium of the Sabin Vaccine Institute (CVC/SVI) is conducting an ongoing large-scale immune monitoring harmonization program through its members and affiliated associations. This effort was brought to life as an external validation program by conducting an international Elispot proficiency panel with 36 laboratories in 2005, followed by a second panel with 29 participating laboratories in 2006, which allowed the lessons from the first panel to be applied. Critical protocol choices, as well as standardization and validation practices among laboratories, were assessed through detailed surveys. Although panel participants had to follow general guidelines in order to allow comparison of results, each laboratory was able to use its own protocols, materials and reagents. The second panel recorded an overall significantly improved performance, as measured by the ability to detect all predefined responses correctly. Protocol choices and laboratory practices that can have a dramatic effect on the overall assay outcome were identified and led to the following recommendations: (A) establish a laboratory SOP for Elispot testing procedures, including (A1) a counting method for apoptotic cells for determining adequate cell dilution for plating and (A2) overnight rest of cells prior to plating and incubation; (B) use only pre-tested serum optimized for a low background to high signal ratio; (C) establish a laboratory SOP for plate reading, including (C1) human auditing during the reading process and (C2) adequate adjustments for technical artifacts; and (D) only allow trained personnel, certified per laboratory SOPs, to conduct assays. The recommendations described under (A) were found to make a statistically significant difference in assay performance, while the remaining recommendations are based on practical experience confirmed by the panel results, which could not be statistically tested. These results provide the immunotherapy community with initial harmonization guidelines to optimize Elispot assay performance. Further optimization is in progress with ongoing panels.
Abstract:
This paper addresses the issue of double counting of health impacts in the context of cost-of-illness valuation. Double counting occurs when jointly used estimates rely on valuation techniques that overlap. As a solution, we propose to limit the scope of each valuation method to a specific range of impacts. In order to limit the contingent valuation method to the exclusive valuation of intangible costs, we propose a three-step approach: (1) leave respondents free to value the consequences which matter to them, (2) elicit respondents' motivations, and (3) control for the influence these motivations have on the elicited values. This procedure was applied in a Swiss contingent-valuation survey. An econometric treatment was applied in order to limit the scope of the contingent valuation estimates to intangibles, so that the methods can be combined while keeping the risk of double counting and of underestimating costs to a minimum.
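A hypothetical sketch of step (3) of the procedure above: elicited willingness-to-pay values are regressed on indicators of the motivations respondents reported, and only the part attributable to intangible consequences is retained, so that tangible components valued elsewhere are not counted twice. The variable names, coefficients and the simple linear specification are illustrative assumptions, not the econometric treatment actually used in the Swiss survey.

```python
import numpy as np

# Illustrative data: elicited willingness-to-pay (WTP) values and dummy variables
# coding the motivations respondents reported (all names are hypothetical).
rng = np.random.default_rng(3)
n = 300
motive_intangible = rng.integers(0, 2, n)    # pain, grief, anxiety ... (intangible)
motive_treatment  = rng.integers(0, 2, n)    # avoided medical costs (tangible)
motive_production = rng.integers(0, 2, n)    # avoided production losses (tangible)
wtp = (50 + 30 * motive_intangible + 40 * motive_treatment
       + 25 * motive_production + rng.normal(0, 10, n))

# Regress WTP on the motivation indicators, then keep only the part attributable
# to intangible consequences to avoid double counting with separately estimated
# tangible cost-of-illness components.
X = np.column_stack([np.ones(n), motive_intangible, motive_treatment, motive_production])
beta, *_ = np.linalg.lstsq(X, wtp, rcond=None)
wtp_intangible_only = beta[0] + beta[1] * motive_intangible
print(f"Mean elicited WTP:             {wtp.mean():.1f}")
print(f"Mean WTP net of tangible part: {wtp_intangible_only.mean():.1f}")
```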
Abstract:
A solution of 18F was standardised with a 4πβ-4πγ coincidence counting system in which the beta detector is a one-inch-diameter cylindrical UPS89 plastic scintillator positioned at the bottom of a well-type 5''x5'' NaI(Tl) gamma-ray detector. Almost full detection efficiency, which was varied downwards electronically, was achieved in the beta channel. Aliquots of this 18F solution were also measured using 4πγ NaI(Tl) integral counting with Monte Carlo calculated efficiencies, as well as the CIEMAT-NIST method. Secondary measurements of the same solution were also performed with an IG11 ionisation chamber whose equivalent activity is traceable to the Système International de Référence through the contribution IRA-METAS made to it in 2001; IRA's degree of equivalence was found to be close to the key comparison reference value (KCRV). The 18F activity predicted by this coincidence system agrees closely with the ionisation chamber measurement and is compatible within one standard deviation with the other primary measurements. This work demonstrates that our new coincidence system can standardise short-lived radionuclides used in nuclear medicine.
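Standardising a short-lived radionuclide such as 18F requires referring every measurement back to a common reference time. The sketch below shows this decay correction, assuming the commonly tabulated half-life of about 109.77 min for 18F; the measured activity and elapsed time are invented values.

```python
import numpy as np

# Decay correction of a measured 18F activity back to the reference time.
half_life_min = 109.77                   # commonly tabulated 18F half-life (min)
lam = np.log(2) / half_life_min          # decay constant (min^-1)

elapsed_min = 45.0                       # time from reference time to measurement
measured_activity = 153.2e3              # Bq at measurement time (hypothetical)

activity_at_reference = measured_activity * np.exp(lam * elapsed_min)
print(f"Activity at reference time: {activity_at_reference / 1e3:.1f} kBq")
```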
Abstract:
When decommissioning a nuclear facility it is important to be able to estimate the activity levels of potentially radioactive samples and compare them with clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation with experimental data obtained using a simple point source permits the computation of absolute calibration factors for more complex geometries with an accuracy of a bit more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is assumed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided that sample density is taken as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that activity could be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot if the sample is assumed to be homogeneously contaminated. This demonstrates the usefulness of complementing experimental methods with Monte Carlo simulations in order to estimate calibration factors that cannot be measured directly because of a lack of available material or of specific geometries.
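A sketch of the "relative" use of the simulation described above: the simulated response for the complex geometry is rescaled by the ratio of measured to simulated response obtained for a reference configuration, and the resulting calibration factor converts a net count rate into activity. All response values are illustrative placeholders, not results from the paper.

```python
# Relative Monte Carlo calibration: scale the simulated response of the complex
# geometry by the measured/simulated ratio of a reference configuration.
measured_ref = 4.1e-3       # measured count rate per Bq, reference geometry (cps/Bq)
simulated_ref = 4.5e-3      # simulated count rate per Bq, same geometry and nuclide
simulated_complex = 2.8e-3  # simulated count rate per Bq, gravel-dumpster geometry

calibration_factor = simulated_complex * (measured_ref / simulated_ref)
print(f"Calibration factor: {calibration_factor:.2e} cps/Bq")

# Activity estimate for a sample measured in the complex geometry.
net_cps = 12.0
print(f"Estimated activity: {net_cps / calibration_factor:.0f} Bq")
```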
Abstract:
BACKGROUND: The diagnosis of malignant hematologic diseases has become increasingly complex during the last decade. It is based on the interpretation of results from different laboratory analyses, which range from microscopy to gene expression profiling. Recently, a method for the analysis of RNA phenotypes has been developed, the nCounter technology (Nanostring® Technologies), which allows for simultaneous quantification of hundreds of RNA molecules in biological samples. We evaluated this technique in a Swiss multi-center study on eighty-six samples from acute leukemia patients. METHODS: mRNA and protein profiles were established for normal peripheral blood and bone marrow samples. Signal intensities of the various tested antigens with surface expression were similar to those found in previously performed Affymetrix microarray analyses. Acute leukemia samples were analyzed for a set of twenty-two validated antigens, and the Pearson correlation coefficient between nCounter and flow cytometry results was calculated. RESULTS: Highly significant correlation values between 0.40 and 0.97 were found for the twenty-two antigens tested. A second correlation analysis performed on a per-sample basis yielded concordant results between flow cytometry and nCounter for 44-100% of the antigens tested (mean = 76%), depending on the number of blasts present in a sample, the homogeneity of the blast population, and the type of leukemia (AML or ALL). CONCLUSIONS: The nCounter technology allows for fast and easy depiction of an mRNA profile from hematologic samples. This technology has the potential to become a valuable tool for the diagnosis of acute leukemias, in addition to multi-color flow cytometry.
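A minimal sketch of the per-antigen correlation analysis mentioned in the Results: for one antigen, the nCounter signal is compared with the flow cytometry value across samples and the Pearson correlation coefficient is computed. The synthetic values below merely show the computation; they are not study data.

```python
import numpy as np

# Synthetic per-antigen comparison of flow cytometry percentages and nCounter counts.
rng = np.random.default_rng(4)
flow_percent = rng.uniform(0, 100, 30)
ncounter_counts = 5.0 * flow_percent + rng.normal(0, 40, 30)

r = np.corrcoef(flow_percent, ncounter_counts)[0, 1]
print(f"Pearson correlation for this antigen: r = {r:.2f}")
```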
Abstract:
In mammalian circadian clockwork, the CLOCK-BMAL1 complex binds to DNA enhancers of target genes and drives circadian oscillation of transcription. Here we identified 7,978 CLOCK-binding sites in mouse liver by chromatin immunoprecipitation-sequencing (ChIP-Seq), and a newly developed bioinformatics method, motif centrality analysis of ChIP-Seq (MOCCS), revealed a genome-wide distribution of previously unappreciated noncanonical E-boxes targeted by CLOCK. In vitro promoter assays showed that CACGNG, CACGTT, and CATG(T/C)G are functional CLOCK-binding motifs. Furthermore, we extensively revealed rhythmically expressed genes by poly(A)-tailed RNA-Seq and identified 1,629 CLOCK target genes within 11,926 genes expressed in the liver. Our analysis also revealed rhythmically expressed genes that have no apparent CLOCK-binding site, indicating the importance of indirect transcriptional and posttranscriptional regulations. Indirect transcriptional regulation is represented by rhythmic expression of CLOCK-regulated transcription factors, such as Krüppel-like factors (KLFs). Indirect posttranscriptional regulation involves rhythmic microRNAs that were identified by small-RNA-Seq. Collectively, CLOCK-dependent direct transactivation through multiple E-boxes and indirect regulations polyphonically orchestrate dynamic circadian outputs.
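The sketch below illustrates the general idea behind a motif-centrality analysis, not the published MOCCS implementation: occurrences of a candidate E-box are located relative to the centres of peak-flanking sequences, and a motif that is genuinely bound is expected to concentrate near the centre. The sequences, the planted CACGTG motif and all parameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

def motif_offsets(sequences, motif):
    """Return offsets of motif occurrences relative to each sequence's centre."""
    half = len(sequences[0]) // 2
    offsets = []
    for seq in sequences:
        start = seq.find(motif)
        while start != -1:
            offsets.append(start + len(motif) // 2 - half)
            start = seq.find(motif, start + 1)
    return np.array(offsets)

# Synthetic peak-centred sequences: random background with the canonical E-box
# CACGTG planted near the centre of half of them.
bases = np.array(list("ACGT"))
seqs = ["".join(rng.choice(bases, 200)) for _ in range(500)]
for i in range(0, 500, 2):
    pos = 100 + rng.integers(-10, 10)
    seqs[i] = seqs[i][:pos] + "CACGTG" + seqs[i][pos + 6:]

offsets = motif_offsets(seqs, "CACGTG")
# A sharp concentration of offsets around 0 indicates a centrally enriched,
# and therefore likely functional, binding motif.
print(f"{len(offsets)} occurrences, mean |offset| = {np.abs(offsets).mean():.1f} bp")
```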
Abstract:
The aim of this retrospective study was to compare the clinical and radiographic results after TKA (PFC, DePuy), performed either with computer assisted navigation (CAS, Brainlab, Johnson&Johnson) or by conventional means. Material and methods: Between May and December 2006 we reviewed 36 conventional TKAs performed between 2002 and 2003 (group A) and 37 navigated TKAs performed between 2005 and 2006 (group B) by the same experienced surgeon. The mean age in group A was 74 years (range 62-90) and 73 years (range 58-85) in group B, with a similar age distribution. The preoperative mechanical axes in group A ranged from -13° varus to +13° valgus (mean absolute deviation 6.83°, SD 3.86), and in group B from -13° to +16° (mean absolute deviation 5.35°, SD 4.29). Patients with a previous tibial osteotomy or revision arthroplasty were excluded from the study. Examination was done by an experienced orthopedic resident independent of the surgeon. All patients had pre- and postoperative long standing radiographs. The IKSS and the WOMAC were used to determine the clinical outcome. Patients' degree of satisfaction was assessed on a visual analogue scale (VAS). Results: 32 of the 37 navigated TKAs (86.5%) showed a postoperative mechanical axis within 3 degrees of valgus or varus deviation, compared to only 24 (66%) of the 36 standard TKAs. This difference was significant (p = 0.045). The mean absolute deviation from the neutral axis was 3.00° (range -5° to +9°, SD 1.75) in group A, compared to 1.54° (range -5° to +4°, SD 1.41) in group B, a highly significant difference (p < 0.001). Furthermore, both groups showed a significant postoperative improvement of their mean IKSS values (group A: 89 preoperatively to 169 postoperatively; group B: 88 to 176) without a significant difference between the two groups. Neither the WOMAC nor the patients' degree of satisfaction, as assessed by VAS, showed significant differences. Operation time was significantly longer in group B (mean 119.9 min) than in group A (mean 99.6 min, p < 0.001). Conclusion: Our study showed consistently and significantly better postoperative frontal alignment with computer assisted navigation (CAS) compared to standard methods, even in the hands of a surgeon well experienced in standard TKA implantation. However, the follow-up time of this study was not long enough to judge differences in clinical outcome. Thus, the relevance of computer navigation for the clinical outcome and survival of TKA remains to be proved in long-term studies to justify the longer operation time.
Abstract:
Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Several gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been described previously. The main drawbacks of these methods are a lack of sensitivity for forensic applications, including an inability to quantitatively determine the concentration of gas present. The present study provides a validated HS-GC-MS method which incorporates hydrogen sulfide as an internal standard, allowing the quantification of nitrous oxide. Upon analysis, the sample and internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column, therefore providing rapid data collection whilst preserving well-defined peaks. After validation, the method was applied to a real case of N2O intoxication, yielding concentrations for a mono-intoxication.
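A minimal sketch of internal-standard quantification as described above: the ratio of the nitrous oxide peak area to the hydrogen sulfide internal-standard peak area is calibrated against known concentrations, and an unknown case sample is quantified from its measured ratio. The calibration points, units and case value are illustrative assumptions, not the validated figures of the paper.

```python
import numpy as np

# Illustrative calibration: N2O concentration versus area ratio A(N2O)/A(H2S).
conc_n2o = np.array([5.0, 10.0, 25.0, 50.0, 100.0])      # e.g. mg/L in the vial
area_ratio = np.array([0.11, 0.22, 0.53, 1.05, 2.10])    # peak-area ratios

slope, intercept = np.polyfit(conc_n2o, area_ratio, 1)

# Quantify an unknown case sample from its measured area ratio.
case_ratio = 0.78
case_conc = (case_ratio - intercept) / slope
print(f"Estimated N2O concentration: {case_conc:.1f} mg/L")
```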