905 results for Probability of fixation


Relevance:

90.00%

Publisher:

Abstract:

Background: A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. The most common way to decide whether a given probability is sufficient is Bayesian binary classification, in which the probability of the model characterizing the sequence family of interest is compared to that of an alternative probability model. A null model can be used as this alternative; this is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, including the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final result of classification. In particular, we are interested in minimizing the number of false predictions, a crucial issue for reducing the costs of biological validation. Results: In all tests with random sequences, the target null model produced the lowest number of false positives. The study was performed on DNA sequences using GC content as the measure of compositional bias, but the results should also be valid for protein sequences. To broaden the applicability of the results, the study was performed on randomly generated sequences. Previous studies were performed on amino acid sequences, using only one probabilistic model (HMM) and a specific benchmark, and therefore lack more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results. Conclusions: Of the evaluated models, the best suited for classification are the uniform model and the target model. However, the uniform model presents a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases the target model is more dependable for biological validation due to its higher specificity.
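
As a rough illustration of the null-model scoring idea described above, the following Python sketch (with hypothetical probability tables, not those of any real tool) computes a position-independent log-odds score of a DNA sequence under a family model against a uniform null and against a target-composition null estimated from the candidate sequence itself.

```python
import math
from collections import Counter

def log_odds(seq, family_probs, null_probs):
    """Position-independent log-odds score: log P(seq|family) - log P(seq|null)."""
    score = 0.0
    for base in seq:
        score += math.log(family_probs[base]) - math.log(null_probs[base])
    return score

seq = "ATGCGCGCGCATTTACGCGC"

# Hypothetical family model with a GC-rich composition.
family_probs = {"A": 0.15, "C": 0.35, "G": 0.35, "T": 0.15}

# Uniform null model.
uniform_null = {b: 0.25 for b in "ACGT"}

# Target null model: residue frequencies of the candidate sequence itself.
counts = Counter(seq)
target_null = {b: counts[b] / len(seq) for b in "ACGT"}

print("score vs uniform null:", log_odds(seq, family_probs, uniform_null))
print("score vs target null: ", log_odds(seq, family_probs, target_null))
```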

Relevance:

90.00%

Publisher:

Abstract:

The aim of this study was to evaluate the immunoexpression of MMP-2, MMP-9 and CD31/microvascular density in squamous cell carcinomas of the floor of the mouth and to correlate the results with demographic, survival, clinical (TNM staging) and histopathological variables (tumor grade, perineural invasion, embolization and bone invasion). Data from the medical records and diagnoses of 41 patients were reviewed. Histological sections were subjected to immunostaining using primary antibodies against human MMP-2, MMP-9 and CD31 and the streptavidin-biotin-immunoperoxidase system. Histomorphometric analyses quantified positivity for MMPs (20 fields per slide, 100-point grid, ×200) and for CD31 (microvessels <50 µm in the area of highest vascularization, 5 fields per slide, 100-point grid, ×400). The statistical analysis comprised the non-parametric Mann-Whitney U test (to investigate associations between numerical variables and immunostaining), the chi-square test (for contingency tables), Fisher's exact test (when at least one expected frequency was less than 5 in 2×2 tables), the Kaplan-Meier method (to estimate probabilities of overall survival) and the log-rank test (to compare survival curves), all with a significance level of 5%. There was a statistically significant correlation between immunostaining for MMP-2 and lymph node metastasis. Factors negatively associated with survival were N stage, histopathological grade, perineural invasion and immunostaining for MMP-9. There was no significant association between immunoexpression of CD31 and the other variables. The intensity of immunostaining for MMP-2 may be indicative of lymph node metastasis, and that of MMP-9 of a lower probability of survival.
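
A minimal sketch of the survival and group-comparison parts of such a statistical design, assuming entirely hypothetical follow-up data and using the lifelines and scipy libraries:

```python
import numpy as np
from scipy.stats import mannwhitneyu
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up times (months) and event indicators (1 = death observed)
# for patients with high vs. low MMP-9 immunostaining.
t_high = np.array([12, 18, 22, 30, 35, 40, 44])
e_high = np.array([1, 1, 1, 0, 1, 1, 0])
t_low = np.array([25, 33, 41, 50, 55, 60, 62])
e_low = np.array([1, 0, 0, 1, 0, 0, 0])

# Kaplan-Meier estimate of overall survival in one group.
kmf = KaplanMeierFitter()
kmf.fit(t_high, event_observed=e_high, label="MMP-9 high")
print(kmf.survival_function_.tail(1))

# Log-rank comparison of the two survival curves (alpha = 0.05).
lr = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
print("log-rank p-value:", lr.p_value)

# Mann-Whitney U test for a numerical variable (e.g., % positive area) between groups.
pos_high = np.array([42.1, 55.3, 60.2, 48.7])
pos_low = np.array([20.5, 18.3, 25.1, 30.0])
print("Mann-Whitney p-value:", mannwhitneyu(pos_high, pos_low).pvalue)
```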

Relevance:

90.00%

Publisher:

Abstract:

Structural durability is an important criterion that must be evaluated for every type of structure. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability, especially when these structures are built in aggressive environments. Chloride ingress triggers the corrosion of the reinforcement; therefore, by modelling this phenomenon, the corrosion process, and hence structural durability, can be better evaluated. Corrosion begins when a threshold chloride concentration is reached at the steel bars of the reinforcement. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately because of the inherent randomness of this process. In this regard, structural durability can be represented more realistically using probabilistic approaches. This paper addresses the probabilistic analysis of corrosion initiation time in reinforced concrete structures exposed to chloride penetration. Chloride penetration is modelled using Fick's diffusion law, which simulates the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the first-order reliability method, with a direct coupling approach. Some examples are considered in order to study these phenomena. Moreover, a simplified method is proposed to determine optimal values for the concrete cover.
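
A minimal Monte Carlo sketch of this kind of analysis, assuming Fick's second-law solution C(x,t) = Cs·erfc(x / (2√(D·t))) and purely illustrative distributions for the random variables:

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(42)
n_sim = 100_000

# Hypothetical random variables (the distribution parameters are illustrative only).
cover = rng.normal(40.0, 5.0, n_sim)                           # concrete cover [mm]
D = rng.lognormal(mean=np.log(30.0), sigma=0.3, size=n_sim)    # diffusion coeff. [mm^2/year]
Cs = rng.normal(0.60, 0.10, n_sim)                             # surface chloride [% cement weight]
Ccr = rng.normal(0.40, 0.05, n_sim)                            # critical threshold [% cement weight]

def chloride_at_rebar(t):
    """Fick's second law solution: C(x, t) = Cs * erfc(x / (2 * sqrt(D * t)))."""
    return Cs * erfc(cover / (2.0 * np.sqrt(D * t)))

# Probability that corrosion has initiated within a 50-year design life.
t_design = 50.0
pf = np.mean(chloride_at_rebar(t_design) >= Ccr)
print(f"P(corrosion initiation within {t_design:.0f} years) ≈ {pf:.3f}")
```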

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE: The frequent occurrence of inconclusive serology in blood banks and the absence of a gold-standard test for Chagas disease led us to examine the efficacy of the blood culture test and five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, as well as to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive screening serology with respect to some epidemiological variables. METHODS: To obtain the estimates of interest we considered a Bayesian latent class model with covariates included through the logit link. RESULTS: Better performance was observed for some categories of the epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) proved to be good alternatives both for screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology means that most of these donors are probably serologically non-reactive. In addition, depending on the level of specific epidemiological variables, the absence of infection can be predicted with a probability of 100% in this group from the pairs of tests using parallel testing. CONCLUSION: The epidemiological variables can lead to improved test results and thus assist in the clarification of inconclusive serology screening results. Moreover, all combinations of pairs of the five commercial tests are good alternatives for confirming results.
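
The screening/confirmation trade-off of combining two tests can be sketched with the usual parallel/serial formulas under conditional independence; the sensitivities and specificities below are illustrative placeholders, not the values estimated in the study.

```python
def parallel(se1, sp1, se2, sp2):
    """Parallel testing: positive if either test is positive (screening-oriented)."""
    se = 1 - (1 - se1) * (1 - se2)   # misses only if both tests miss
    sp = sp1 * sp2                   # negative only if both tests are negative
    return se, sp

def serial(se1, sp1, se2, sp2):
    """Serial testing: positive only if both tests are positive (confirmation-oriented)."""
    se = se1 * se2
    sp = 1 - (1 - sp1) * (1 - sp2)   # falsely positive only if both tests fail
    return se, sp

# Illustrative performance figures for two commercial assays.
se_a, sp_a = 0.985, 0.970
se_b, sp_b = 0.990, 0.965

print("parallel (screening):    Se=%.4f Sp=%.4f" % parallel(se_a, sp_a, se_b, sp_b))
print("serial   (confirmation): Se=%.4f Sp=%.4f" % serial(se_a, sp_a, se_b, sp_b))
```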

Relevance:

90.00%

Publisher:

Abstract:

AIM: To evaluate the bond strength of brackets fixed with different materials (two light-cured nanofilled resins, Transbond Supreme LV and Flow Tain LV; a light-cured resin, Transbond XT (control); and two chemically cured resins for indirect bonding, Sondhi Rapid-Set and Custom I.Q.) using the indirect bonding technique after 10 min and 24 h, and to evaluate the type of failure. METHODS: One hundred premolars were selected and randomly divided into groups (n=10) according to the material and fixation period. The brackets were bonded using the indirect technique following the manufacturers' instructions and stored in deionized water at 37°C for 10 min or 24 h. Afterwards, the specimens were subjected to a shear bond strength (SBS) test (Instron) at 0.5 mm/min and evaluated for the adhesive remnant index (ARI). The data were analyzed by ANOVA and Tukey's test (p<0.05) and the ARI scores by the chi-square test. RESULTS: A significant difference was observed among the materials (Flow Tain LV = Transbond Supreme LV = Transbond XT > Sondhi Rapid-Set > Custom I.Q.). There was no significant difference in bond strength between 10 min and 24 h, regardless of the material. Most groups showed adhesive remnants adhered to the enamel (scores 2 and 3) without statistically significant differences (p>0.05). CONCLUSIONS: The light-cured nanofilled materials used in indirect bonding showed greater bond strength than the chemically cured materials. The fixation period had no influence on the bond strength of the different materials.
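
A minimal sketch of the ANOVA/Tukey and chi-square steps described above, with hypothetical shear bond strength values and ARI counts (scipy and statsmodels):

```python
import numpy as np
from scipy.stats import f_oneway, chi2_contingency
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)

# Hypothetical SBS values (MPa) for three of the materials, n=10 each.
sbs = {
    "Transbond XT": rng.normal(14.0, 2.0, 10),
    "Sondhi Rapid-Set": rng.normal(10.5, 2.0, 10),
    "Custom I.Q.": rng.normal(7.5, 2.0, 10),
}

# One-way ANOVA across materials, followed by Tukey's HSD pairwise comparisons.
print("ANOVA p-value:", f_oneway(*sbs.values()).pvalue)
values = np.concatenate(list(sbs.values()))
groups = np.repeat(list(sbs.keys()), 10)
print(pairwise_tukeyhsd(values, groups, alpha=0.05).summary())

# Chi-square test on hypothetical ARI score counts (rows = materials, cols = scores 0-3).
ari_counts = np.array([[1, 2, 4, 3],
                       [2, 3, 3, 2],
                       [3, 3, 2, 2]])
chi2, p, dof, _ = chi2_contingency(ari_counts)
print(f"ARI chi-square p-value: {p:.3f}")
```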

Relevance:

90.00%

Publisher:

Abstract:

[EN] The age and growth of the sand sole Pegusa lascaris from the Canarian Archipelago were studied from 2107 fish collected between January 2005 and December 2007. To find an appropriate method for age determination, sagittal otoliths were examined by surface reading and by frontal sectioning, and the results were compared. The two methods did not differ significantly in estimated age, but the surface-reading method is superior in terms of cost and time efficiency. The sand sole has a moderate life span, with ages up to 10 years recorded. Individuals grow quickly in their first two years, attaining approximately 48% of their maximum standard length; after the second year, their growth rate drops rapidly as energy is diverted to reproduction. Males and females show dimorphism in growth, with females reaching a slightly greater length and age than males. Von Bertalanffy, seasonalized von Bertalanffy, Gompertz, and Schnute growth models were fitted to length-at-age data. Akaike weights for the seasonalized von Bertalanffy growth model indicated that the probability of choosing the correct model from the group of models used was >0.999 for males and females. The seasonalized von Bertalanffy growth parameters estimated were: L∞ = 309 mm standard length, k = 0.166 yr⁻¹, t0 = −1.88 yr, C = 0.347, and ts = 0.578 for males; and L∞ = 318 mm standard length, k = 0.164 yr⁻¹, t0 = −1.653 yr, C = 0.820, and ts = 0.691 for females. Fish standard length and otolith radius are closely correlated (R² = 0.902). The relation between standard length and otolith radius is described by a power function (a = 85.11, v = 0.906).
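
A minimal sketch of the seasonalized von Bertalanffy growth function (a Somers-type parameterization, assumed here) and of Akaike weights, using the male parameters reported above and hypothetical AIC values for the four candidate models:

```python
import numpy as np

def seasonal_vbgf(t, Linf, k, t0, C, ts):
    """Seasonalized von Bertalanffy growth function (Somers-type parameterization)."""
    s = lambda x: (C * k / (2 * np.pi)) * np.sin(2 * np.pi * (x - ts))
    return Linf * (1 - np.exp(-k * (t - t0) - s(t) + s(t0)))

# Predicted standard length at age, using the male parameters reported above.
ages = np.arange(0, 11)
print(np.round(seasonal_vbgf(ages, Linf=309, k=0.166, t0=-1.88, C=0.347, ts=0.578), 1))

def akaike_weights(aic_values):
    """Akaike weights: relative probability of each candidate model being the best."""
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical AIC values for the VBGF, seasonalized VBGF, Gompertz and Schnute fits.
print(akaike_weights([2510.3, 2480.1, 2525.7, 2519.2]))
```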

Relevance:

90.00%

Publisher:

Abstract:

[EN] Introduction: Candidemia in critically ill patients is usually a severe and life-threatening condition with high crude mortality. Very few studies have focused on the impact of candidemia on ICU patient outcome, and the attributable mortality remains controversial. This study was carried out to determine the attributable mortality of ICU-acquired candidemia in critically ill patients using propensity score matching analysis. Methods: A prospective observational study was conducted of all consecutive non-neutropenic adult patients admitted for at least seven days to 36 ICUs in Spain, France, and Argentina between April 2006 and June 2007. The probability of developing candidemia was estimated using a multivariate logistic regression model. Each patient with ICU-acquired candidemia was matched with two control patients using the nearest available Mahalanobis metric matching within calipers defined by the propensity score. Standardized difference tests (SDT) for each variable before and after matching were calculated. Attributable mortality was determined by a modified Poisson regression model adjusted for those variables that still presented a certain imbalance, defined as an SDT > 10%. Results: Thirty-eight candidemias were diagnosed in 1,107 patients (34.3 episodes/1,000 ICU patients). Patients with and without candidemia had an ICU crude mortality of 52.6% versus 20.6% (P < 0.001) and a crude hospital mortality of 55.3% versus 29.6% (P = 0.01), respectively. In the propensity-matched analysis, the corresponding figures were 51.4% versus 37.1% (P = 0.222) and 54.3% versus 50% (P = 0.680). After controlling for residual confounding with the Poisson regression model, the relative risk (RR) of ICU- and hospital-attributable mortality from candidemia was 1.298 (95% confidence interval (CI) 0.88 to 1.98) and 1.096 (95% CI 0.68 to 1.69), respectively. Conclusions: ICU-acquired candidemia in critically ill patients is not associated with an increase in either ICU or hospital mortality.
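
A simplified sketch of propensity score estimation and 1:2 matching; it matches on the propensity score itself rather than the Mahalanobis metric within propensity calipers used in the study, and all data are synthetic:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical baseline covariates (e.g., severity score, antibiotic days, surgery flag)
# and exposure indicator (True = ICU-acquired candidemia).
n = 1107
X = np.column_stack([rng.normal(20, 6, n), rng.poisson(5, n), rng.integers(0, 2, n)])
exposed = rng.random(n) < 0.035

# Step 1: propensity score = P(candidemia | covariates) from a logistic model.
ps = LogisticRegression(max_iter=1000).fit(X, exposed).predict_proba(X)[:, 1]

# Step 2: match each exposed patient to the 2 nearest unexposed patients on the
# logit of the propensity score within a caliper (0.2 SD is a common choice).
logit = np.log(ps / (1 - ps))
caliper = 0.2 * logit.std()
controls_pool = np.flatnonzero(~exposed)
matches = {}
for i in np.flatnonzero(exposed):
    d = np.abs(logit[controls_pool] - logit[i])
    ok = controls_pool[d <= caliper]
    matches[i] = ok[np.argsort(np.abs(logit[ok] - logit[i]))[:2]]

print("exposed patients with two matched controls:",
      sum(len(v) == 2 for v in matches.values()))
```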

Relevance:

90.00%

Publisher:

Abstract:

This Thesis is devoted to the study of the optical companions of Millisecond Pulsars in Galactic Globular Clusters (GCs), as part of a large project started at the Department of Astronomy of the Bologna University in collaboration with other institutions (Astronomical Observatories of Cagliari and Bologna, University of Virginia), specifically dedicated to the study of the environmental effects on passive stellar evolution in Galactic GCs. Globular Clusters are very efficient "kilns" for generating exotic objects, such as Millisecond Pulsars (MSPs), low-mass X-ray binaries (LMXBs) or Blue Straggler Stars (BSSs). In particular, MSPs are formed in binary systems containing a Neutron Star which is spun up through mass accretion from the evolving companion (e.g. Bhattacharya & van den Heuvel 1991). The final stage of this recycling process is either the core of a peeled star (generally a helium white dwarf) or a very light, almost exhausted star, orbiting a very fast rotating Neutron Star (an MSP). Despite the large difference in total mass between the disk of the Galaxy and the Galactic GC system (up to a factor of 10³), the fraction of fast rotating pulsars in binary systems found in the latter is much higher. MSPs in GCs show spin periods in the range 1.3-30 ms, spin-down rates Ṗ ≲ 10⁻¹⁹ s/s and a lower magnetic field, B ≈ 10⁸ gauss, with respect to "normal" radio pulsars. The high probability that a binary system is disrupted by a supernova explosion explains why we expect only a small fraction of the whole pulsar population to be recycled millisecond pulsars: in fact, only about 10% of the ~1800 known radio pulsars are MSPs. It is not surprising that MSPs are overabundant in GCs with respect to the Galactic field, since in the Galactic disk MSPs can only form through the evolution of primordial binaries, and only if the binary survives the supernova explosion which leads to the formation of the neutron star. On the other hand, the extremely high stellar density in the cores of GCs, relative to most of the rest of the Galaxy, favours the formation of several different binary systems suitable for the recycling of NSs (Davies et al. 1998). In this thesis we present the properties of two millisecond pulsar companions discovered in two globular clusters: the helium white dwarf orbiting the MSP PSR J1911-5958A in NGC 6752, and the second known case of a tidally deformed star orbiting an eclipsing millisecond pulsar, PSR J1701-3006B in NGC 6266.

Relevance:

90.00%

Publisher:

Abstract:

[EN] The citation potential is a measure of the probability of being cited. It obviously differs among fields of science because of systematic differences in publication and citation behaviour across disciplines. In the past, the citation potential was studied at the journal level by considering the average number of references in established groups of journals. In this paper, characterizations of an author's scientific research through three different research dimensions are proposed: production (journal papers), impact (journal citations), and reference (bibliographical sources). An empirical application to a set of 120 randomly selected authors in four subject areas shows that the ratio between the production and impact dimensions is a normalized measure of the citation potential at the level of individual authors.

Relevance:

90.00%

Publisher:

Abstract:

In Performance-Based Earthquake Engineering (PBEE), evaluating the seismic performance (or seismic risk) of a structure at a given site has gained major attention, especially in the past decade. One of the objectives of PBEE is to quantify the seismic reliability of a structure (with respect to future random earthquakes) at a site. For that purpose, Probabilistic Seismic Demand Analysis (PSDA) is used as a tool to estimate the Mean Annual Frequency (MAF) of exceeding a specified value of a structural Engineering Demand Parameter (EDP). This dissertation focuses mainly on applying the average of a certain number of spectral acceleration ordinates over a certain interval of periods, Sa,avg(T1,…,Tn), as a scalar ground motion Intensity Measure (IM) when assessing the seismic performance of inelastic structures. Since the interval of periods over which Sa,avg is computed reflects the greater or lesser influence of higher vibration modes on the inelastic response, it is appropriate to speak of improved IMs. The results obtained using these improved IMs are compared with conventional elastic-based scalar IMs (e.g., pseudo-spectral acceleration, Sa(T1), or peak ground acceleration, PGA) and with an advanced inelastic-based scalar IM (inelastic spectral displacement, Sdi). The advantages of applying improved IMs are: (i) "computability" of the seismic hazard according to traditional Probabilistic Seismic Hazard Analysis (PSHA), because ground motion prediction models are already available for Sa(Ti), and hence it is possible to employ existing models to assess the hazard in terms of Sa,avg; and (ii) "efficiency", i.e., smaller variability of the structural response, which was minimized in order to assess the optimal range over which to compute Sa,avg. More work is needed to assess the "sufficiency" and "scaling robustness" properties, which are also desirable but are disregarded in this dissertation. However, for ordinary records (i.e., with no pulse-like effects), using the improved IMs is found to be more accurate than using the elastic- and inelastic-based IMs. For structural demands dominated by the first mode of vibration, the benefit of using Sa,avg can be negligible relative to the conventionally used Sa(T1) and the advanced Sdi. For structural demands with significant higher-mode contributions, an improved scalar IM that incorporates higher modes should be used. In order to fully understand the influence of the IM on the seismic risk, a simplified closed-form expression for the probability of exceeding a limit-state capacity was chosen as a reliability measure under seismic excitation and applied to Reinforced Concrete (RC) frame structures. This closed-form expression is particularly useful for the seismic assessment and design of structures, taking into account the uncertainty in the generic variables, structural "demand" and "capacity", as well as the uncertainty in the seismic excitation. The framework employs nonlinear Incremental Dynamic Analysis (IDA) procedures in order to estimate the variability in the response of the structure (demand) to seismic excitation, conditioned on the IM. The estimate of the seismic risk obtained with the simplified closed-form expression is affected by the choice of IM: the final seismic risk is not constant, although it remains of the same order of magnitude; possible reasons concern the nonlinear model assumed or the insufficiency of the selected IM. Since it is impossible to state what the "real" probability of exceeding a limit state is by looking only at the total risk, the only way forward is the optimization of the desirable properties of an IM.
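
As a sketch of the improved IM, the snippet below computes Sa,avg over a period interval; it assumes the common geometric-mean definition and a precomputed response spectrum, both of which are illustrative choices rather than the dissertation's exact implementation.

```python
import numpy as np

def sa_avg(periods, sa, t_low, t_high, n=10):
    """Sa,avg(T1,...,Tn): geometric mean of spectral accelerations at n periods
    equally spaced in [t_low, t_high], interpolated from a response spectrum."""
    ts = np.linspace(t_low, t_high, n)
    sa_interp = np.interp(ts, periods, sa)
    return float(np.exp(np.mean(np.log(sa_interp))))

# Hypothetical response spectrum of one ground-motion record (periods in s, Sa in g).
periods = np.linspace(0.05, 4.0, 80)
sa = 0.9 * np.exp(-0.8 * periods) + 0.05

# E.g., averaging between a second-mode period and 1.5 times the first-mode period.
t1, t2 = 1.0, 0.3
print("Sa,avg =", round(sa_avg(periods, sa, t2, 1.5 * t1), 4), "g")
```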

Relevance:

90.00%

Publisher:

Abstract:

Precipitation retrieval over high latitudes, particularly snowfall retrieval over ice and snow, using satellite-based passive microwave spectrometers, is currently an unsolved problem. The challenge results from the large variability of microwave emissivity spectra for snow and ice surfaces, which can mimic, to some degree, the spectral characteristics of snowfall. This work focuses on the investigation of a new snowfall detection algorithm specific to high-latitude regions, based on a combination of active and passive sensors able to discriminate between snowing and non-snowing areas. The space-borne Cloud Profiling Radar (on CloudSat), the Advanced Microwave Sounding Units A and B (on NOAA-16) and the infrared spectrometer MODIS (on AQUA) were co-located for 365 days, from October 1st, 2006 to September 30th, 2007. CloudSat products were used as the truth to calibrate and validate all the proposed algorithms. The methodological approach can be summarised in two steps. In the first step, an empirical search for a threshold aimed at discriminating the no-snow case was performed, following Kongoli et al. [2003]. Since this single-channel approach did not produce satisfactory results, a more statistically sound approach was attempted. Two different techniques, which allow the probability above and below a Brightness Temperature (BT) threshold to be computed, were applied to the available data. The first technique is based on a logistic distribution representing the probability of snow given the predictors. The second technique, called the Bayesian Multivariate Binary Predictor (BMBP), is a fully Bayesian technique that does not require any hypothesis on the shape of the probabilistic model (such as, for instance, the logistic) and only requires the estimation of the BT thresholds. The results show that both proposed methods are able to discriminate snowing and non-snowing conditions over the polar regions with a probability of correct detection larger than 0.5, highlighting the importance of a multispectral approach.
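
A minimal sketch of the logistic part of such a detection scheme (not the BMBP), trained on synthetic brightness temperatures with an assumed scattering-induced BT depression as the snowfall signature:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 5000

# Synthetic brightness temperatures [K] for two AMSU-B-like channels as predictors.
bt_89, bt_150 = rng.normal(240, 15, n), rng.normal(235, 15, n)
X = np.column_stack([bt_89, bt_150])

# Synthetic "truth" (True = snowing) loosely tied to a scattering-induced BT depression.
snowing = (bt_89 - bt_150 + rng.normal(0, 5, n)) > 8

# Logistic model: P(snow | BTs); probabilities > 0.5 flag snowing pixels.
model = LogisticRegression(max_iter=1000).fit(X, snowing)
p_snow = model.predict_proba(X)[:, 1]
detected = p_snow > 0.5
pod = np.mean(detected[snowing])   # probability of correct detection on snowing cases
print(f"probability of correct detection: {pod:.2f}")
```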

Relevance:

90.00%

Publisher:

Abstract:

The presented study carried out an analysis of rural landscape changes. In particular, the study focuses on understanding the driving forces acting on the rural built environment, using a statistical spatial model implemented through GIS techniques. It is well known that the study of landscape changes is essential for conscious decision making in land planning. A literature review reveals a general lack of studies dealing with the modelling of the rural built environment, and hence a theoretical modelling approach for this purpose is needed. Advances in technology and modernization in building construction and agriculture have gradually changed the rural built environment. In addition, the phenomenon of urbanization has determined the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two types of transformation dynamics mainly affecting the rural built environment can be observed: the conversion of rural buildings and the increase in the number of buildings. The specific aim of the presented study is to propose a methodology for the development of a spatial model that allows the identification of the driving forces that acted on building allocation. In fact, one of the most concerning dynamics nowadays is the irrational expansion of building sprawl across the landscape. The proposed methodology is composed of several conceptual steps that cover the different aspects of developing a spatial model: the selection of a response variable that best describes the phenomenon under study, the identification of possible driving forces, the sampling methodology for data collection, the choice of the most suitable algorithm in relation to the statistical theory and methods used, and the calibration and evaluation of the model. Different combinations of factors in various parts of the territory generated more or less favourable conditions for building allocation, and the existence of buildings represents the evidence of such an optimum. Conversely, the absence of buildings expresses a combination of agents which is not suitable for building allocation. The presence or absence of buildings can therefore be adopted as an indicator of these driving conditions, since it represents the expression of the action of the driving forces in the land-suitability sorting process. The existence of a correlation between site selection and hypothetical driving forces, evaluated by means of modelling techniques, provides evidence of which driving forces are involved in the allocation dynamics and an insight into their level of influence in the process. GIS software, by means of spatial analysis tools, allows the concepts of presence and absence to be associated with point features, generating a point process. The presence or absence of buildings at given site locations represents the expression of the interaction of these driving factors. In the case of presences, points represent the locations of real existing buildings; conversely, absences represent locations where buildings do not exist, and these points are generated by a stochastic mechanism. Possible driving forces are selected and the existence of a causal relationship with building allocation is assessed through a spatial model. The adoption of empirical statistical models provides a mechanism for the analysis of the explanatory variables and for the identification of the key driving variables behind the site selection process for new building allocation.
The model developed by following this methodology is applied to a case study to test its validity. The study area is the New District of Imola, characterized by a prevailing agricultural production vocation and where transformation dynamics occurred intensively. The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The calibration of the model was carried out on spatial data covering the periurban and rural areas of the study area within the 1975-2005 time period, by means of a generalized linear model. The resulting output of the model fit is a continuous grid surface whose cells assume values ranging from 0 to 1, representing the probability of building occurrence across the rural and periurban areas of the study area. Hence the response variable captures the changes in the rural built environment that occurred in this time interval, and is related to the selected explanatory variables by means of a generalized linear model using logistic regression, as sketched below. By comparing the probability map obtained from the model with the actual rural building distribution in 2005, the interpretive capability of the model can be evaluated. The proposed model can also be applied to the interpretation of trends which occurred in other study areas, and over different time intervals, depending on the availability of data. The use of suitable data in terms of time, information and spatial resolution, and the costs related to data acquisition, pre-processing and survey, are among the most critical aspects of model implementation. Future in-depth studies can focus on using the proposed model to predict short- to medium-range future scenarios for the distribution of the rural built environment in the study area. In order to predict future scenarios it is necessary to assume that the driving forces do not change and that their levels of influence within the model are not far from those assessed for the time interval used for calibration.
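
A minimal sketch of the calibration step: a binomial GLM with logit link (logistic regression) fitted to synthetic presence/absence points and hypothetical explanatory variables, using statsmodels.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000

# Hypothetical explanatory variables sampled at presence/absence points:
# distance to roads [km], slope [%], distance to urban centre [km].
dist_road = rng.exponential(1.5, n)
slope = rng.uniform(0, 30, n)
dist_urban = rng.exponential(5.0, n)

# Synthetic response: 1 = building present, generated from a known logistic surface.
lin = 1.0 - 1.2 * dist_road - 0.05 * slope - 0.15 * dist_urban
present = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(float)

# Binomial GLM with logit link (logistic regression), as in the described calibration.
X = sm.add_constant(np.column_stack([dist_road, slope, dist_urban]))
model = sm.GLM(present, X, family=sm.families.Binomial()).fit()
print(model.summary())

# The fitted model can then be applied cell by cell to produce a 0-1 probability map.
p_new_cell = model.predict([[1.0, 0.8, 12.0, 3.5]])   # [const, road, slope, urban]
print("P(building) at example cell:", round(float(p_new_cell[0]), 3))
```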

Relevance:

90.00%

Publisher:

Abstract:

In the post-genomic era, with the massive production of biological data, understanding the factors affecting protein stability is one of the most important and challenging tasks for highlighting the role of mutations in relation to human disease. The problem is at the basis of what is referred to as molecular medicine, with the underlying idea that pathologies can be detailed at a molecular level. To this purpose, scientific efforts focus on characterising mutations that hamper protein function and thereby affect the biological processes at the basis of cell physiology. New techniques have been developed with the aim of cataloguing single nucleotide polymorphisms (SNPs) at large across all the human chromosomes, and as a result the information in dedicated databases is increasing exponentially. Mutations found at the DNA level, when occurring in transcribed regions, may lead to mutated proteins, and this can be a serious medical problem, largely affecting the phenotype. Bioinformatics tools are urgently needed to cope with the flood of genomic data stored in databases and to analyse the role of SNPs at the protein level. In principle, several experimental and theoretical observations suggest that protein stability in the solvent-protein space is responsible for correct protein functioning; mutations found to be disease related in DNA analyses are therefore often assumed to perturb protein stability as well. However, so far no extensive analysis at the proteome level has investigated whether this is the case. Computational methods have also been developed to infer whether a mutation is disease related and, independently, whether it affects protein stability. Therefore, whether the perturbation of protein stability is related to what is routinely referred to as disease is still an open question. In this work we have tried for the first time to explore the relation between mutations at the protein level and their relevance to disease with a large-scale computational study of the data from different databases. To this aim, in the first part of the thesis, for each mutation type we derived two probabilistic indices (for 141 out of 150 possible SNPs): the perturbing index (Pp), which indicates the probability that a given mutation affects protein stability, considering all the available in vitro thermodynamic data, and the disease index (Pd), which indicates the probability of a mutation being disease related, given all the mutations that have been clinically associated so far. We find, with robust statistics, that the two indices correlate, with the exception of the mutations that are related to somatic cancer. In this way each of the 150 mutation types can be coded by two values that allow a direct comparison with database information. Furthermore, we also implemented a computational method that, starting from the protein structure, predicts the effect of a mutation on protein stability, and found that it outperforms a set of other predictors performing the same task. The predictor is based on support vector machines and takes protein tertiary structures as input. We show that the predicted data correlate well with the data from the databases. All our efforts therefore add to the SNP annotation process and, more importantly, establish the relationship between protein stability perturbation and the human variome leading to the diseasome.
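
A minimal sketch of how two such per-mutation-type indices might be computed from simple counts; the tallies and the exact definitions are hypothetical, since the abstract does not give the formulas actually used for Pp and Pd.

```python
from dataclasses import dataclass

@dataclass
class MutationTypeCounts:
    """Hypothetical tallies for one amino acid substitution type (e.g. A->V)."""
    perturbing: int       # in vitro measurements with |ddG| above a chosen threshold
    neutral: int          # measurements below the threshold
    disease_related: int  # clinically associated occurrences
    polymorphic: int      # occurrences annotated as neutral polymorphisms

def indices(c: MutationTypeCounts) -> tuple[float, float]:
    """Pp: probability the mutation type perturbs stability; Pd: probability it is disease related."""
    pp = c.perturbing / (c.perturbing + c.neutral)
    pd = c.disease_related / (c.disease_related + c.polymorphic)
    return pp, pd

pp, pd = indices(MutationTypeCounts(perturbing=34, neutral=21,
                                    disease_related=120, polymorphic=310))
print(f"Pp = {pp:.2f}, Pd = {pd:.2f}")
```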

Relevance:

90.00%

Publisher:

Abstract:

A sample-scanning confocal optical microscope (SCOM) was designed and constructed in order to perform local measurements of fluorescence, light scattering and Raman scattering. This instrument allows time-resolved fluorescence, Raman scattering and light scattering to be measured from the same diffraction-limited spot, so that fluorescence from single molecules and light scattering from metallic nanoparticles can be studied. First, the electric field distribution in the focus of the SCOM was modelled. This enables the design of illumination modes for different purposes, such as the determination of the three-dimensional orientation of single chromophores. Second, a method for the calculation of the de-excitation rates of a chromophore was presented. This permits the comparison of different detection schemes and experimental geometries in order to optimize the collection of fluorescence photons. Both methods were combined to calculate the SCOM fluorescence signal of a chromophore in a general layered system. The fluorescence excitation and emission of single molecules through a thin gold film was investigated experimentally and modelled. It was demonstrated that, due to the mediation of surface plasmons, single-molecule fluorescence near a thin gold film can be excited and detected with an epi-illumination scheme through the film; single-molecule fluorescence as close as 15 nm to the gold film was studied in this manner. The fluorescence dynamics (fluorescence blinking and excited-state lifetime) of single molecules was studied in the presence and in the absence of a nearby gold film in order to investigate the influence of the metal on the electronic transition rates. The trace-histogram and autocorrelation methods for the analysis of single-molecule fluorescence blinking were presented and compared via the analysis of Monte-Carlo simulated data. The nearby gold influences the total decay rate in agreement with theory. The presence of gold had no influence on the intersystem crossing rate from the excited singlet state to the triplet state, but increased the transition rate from the triplet to the singlet ground state by a factor of 2. The photoluminescence blinking of Zn0.42Cd0.58Se QDs on glass and ITO substrates was investigated experimentally as a function of the excitation power (P) and modelled via Monte-Carlo simulations. At low P, it was observed that the probability of a given on- or off-time follows a negative power law with an exponent near 1.6. As P increased, the on-time fraction decreased on both substrates whereas the off-times did not change. A weak residual memory effect was observed between consecutive on-times and consecutive off-times, but not between an on-time and the adjacent off-time. All of this suggests the presence of two independent mechanisms governing the lifetimes of the on- and off-states. The simulated data showed Poisson-distributed off- and on-intensities, demonstrating that the observed non-Poissonian on-intensity distribution of the QDs is not a product of the underlying power-law probability, and that the blinking of the QDs occurs between a non-emitting off-state and a distribution of emitting on-states with different intensities. All the experimentally observed photo-induced effects could be accounted for by introducing a characteristic lifetime tPI of the on-state in the simulations. The QDs on glass presented a tPI proportional to P⁻¹, suggesting the presence of a one-photon process. Light scattering images and spectra of colloidal and C-shaped gold nanoparticles were acquired; the minimum size of a metallic scatterer detectable with the SCOM lies around 20 nm.
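
A rough sketch of the kind of Monte-Carlo blinking simulation mentioned above, drawing on- and off-times from a truncated power law with exponent near 1.6 (all parameters are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

def power_law_times(alpha, t_min, t_max, size):
    """Draw dwell times from p(t) ~ t^(-alpha) truncated to [t_min, t_max] (inverse CDF)."""
    u = rng.random(size)
    a = 1.0 - alpha
    return (t_min**a + u * (t_max**a - t_min**a)) ** (1.0 / a)

# Hypothetical blinking trace: alternate on/off dwell times with exponent ~1.6,
# then bin Poisson photon counts to mimic an intensity time trace.
alpha, t_min, t_max = 1.6, 1e-3, 10.0       # seconds
on = power_law_times(alpha, t_min, t_max, 5000)
off = power_law_times(alpha, t_min, t_max, 5000)

trace = np.empty(2 * on.size)
trace[0::2], trace[1::2] = on, off          # interleave on/off dwell times
edges = np.cumsum(trace)                    # segment end times

bin_width = 0.01                            # 10 ms bins
t_grid = np.arange(0, edges[-1], bin_width)
state_on = (np.searchsorted(edges, t_grid, side="right") % 2) == 0  # even segment = on
counts = rng.poisson(np.where(state_on, 50, 2))   # bright vs. background count rates

print("on-time fraction:", round(state_on.mean(), 3))
print("mean counts per bin:", round(counts.mean(), 1))
```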

Relevance:

90.00%

Publisher:

Abstract:

The hydrologic risk (and the hydro-geologic risk, closely related to it) is, and has always been, a very relevant issue, due to the severe consequences that floods, and water in general, may cause in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damage can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. This type of uncertainty is what is discussed and analyzed in this thesis. In operational problems, the ultimate aim of a forecasting system is not to reproduce the river's behaviour: this is only a means of reducing the uncertainty about what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since there is often confusion on this issue in the literature. Therefore, the first objective of this thesis is to clarify this concept, starting from a key question: should the choice of the intervention strategy be based on an evaluation of the model prediction according to its ability to represent reality, or on an evaluation of what will actually happen on the basis of the information given by the model forecast? Once this idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time available to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and it is also necessary to assess the probability distribution of the flooding time.
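
A minimal ensemble-based sketch of the last two requirements (flooding probability within a decision horizon and the distribution of the flooding time), using entirely synthetic forecast hydrographs and an assumed threshold level:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical ensemble of forecast water-level hydrographs (100 members, hourly, 48 h),
# e.g. obtained by propagating precipitation/model uncertainty through a hydraulic model.
n_members, horizon_h = 100, 48
t = np.arange(horizon_h)
peak_time = rng.normal(30, 4, n_members)[:, None]
peak_level = rng.lognormal(np.log(3.0), 0.25, n_members)[:, None]
levels = peak_level * np.exp(-0.5 * ((t - peak_time) / 6.0) ** 2)   # water level [m]

flood_threshold = 3.5   # assumed bankfull/dike level [m]

# Probability of flooding within the decision horizon (fraction of members exceeding).
exceeds = (levels >= flood_threshold).any(axis=1)
print("P(flooding within 48 h):", exceeds.mean())

# Distribution of the flooding time, conditional on flooding occurring.
first_exceed = np.array([np.argmax(m) for m in (levels >= flood_threshold)])[exceeds]
print("median flooding time [h]:", np.median(first_exceed))
```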