951 results for PROBABILITIES


Relevance:

10.00%

Publisher:

Abstract:

This work studies a class of stochastic processes possessing an abstract branching property. The processes considered are time-homogeneous continuous-time Markov processes with states in multidimensional real space and its one-point compactification. Starting from minimal requirements on the associated transition function, a complete characterization of the finite-dimensional distributions of multidimensional continuous branching processes is given. Using an extended Laplace calculus, it is shown that every such process is uniquely determined by a certain spectrally positive infinitely divisible distribution. Conversely, it is proved that for every such infinitely divisible distribution a corresponding branching process can be constructed. By means of the general theory of Markovian operator semigroups it is ensured that every multidimensional continuous branching process has a version with paths in the space of càdlàg functions. Furthermore, the (functional) weak convergence of the processes can be reduced to the vague convergence of the associated characterizations. This yields general approximation and convergence theorems for the class of processes under consideration. These general results are applied to the subclass of branching diffusions. It is shown that these processes always admit a version with continuous paths. Finally, the most general form of Feller's diffusion approximation for multitype Galton-Watson processes is proved.
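For orientation, the branching property at the heart of this class of processes, and its Laplace-transform formulation, can be stated as follows (standard notation, shown only in the one-dimensional case; a generic sketch, not the thesis' multidimensional calculus):

    % Branching property of the transition kernels: the law started from x+y
    % is the convolution of the laws started from x and from y.
    P_t(x+y,\cdot) \;=\; P_t(x,\cdot) * P_t(y,\cdot), \qquad x, y \ge 0 .
    % Equivalently, via Laplace transforms there is a family u_t such that
    \mathbb{E}_x\!\left[ e^{-\lambda X_t} \right] \;=\; e^{-x\, u_t(\lambda)}, \qquad \lambda > 0,
    \quad\text{with}\quad u_{t+s} = u_t \circ u_s .

The process is thus encoded by the family of Laplace exponents u_t, which is the point where an infinitely divisible distribution enters the characterization.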

Relevance:

10.00%

Publisher:

Abstract:

The assessment of the RAMS (Reliability, Availability, Maintainability and Safety) performance of a system generally includes the evaluation of the "importance" of its components and/or of the basic parameters of the model through the use of Importance Measures. The analytical equations proposed in this study allow the first-order Differential Importance Measure to be estimated from the Birnbaum measures of the components, under the hypothesis of uniform percentage changes of the parameters. Aging phenomena are introduced into the model by assuming exponential-linear or Weibull distributions for the failure probabilities. An algorithm based on a combination of Monte Carlo simulation and Cellular Automata is applied in order to evaluate the performance of a networked system, made up of source nodes, user nodes and directed edges subject to failure and repair. Importance Sampling techniques are used to estimate the first-order and total-order Differential Importance Measures through a single simulation of the system's "operational life". All output variables are computed simultaneously from the same sequence of involved components, event types (failure or repair) and transition times. The failure/repair probabilities are forced to be the same for all components; the transition times are either sampled from the unbiased probability distributions or likewise forced, for instance by ensuring the occurrence of at least one failure within the system operational life. The algorithm allows different types of maintenance actions to be considered: corrective maintenance, performed either immediately upon component failure or, in the case of hidden failures that are not detected until an inspection, upon finding that the component has failed; and preventive maintenance, performed at fixed intervals. A restoration factor can be used to determine the age of the component after a repair or any other maintenance action.
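For reference, under the uniform-percentage-change hypothesis the first-order Differential Importance Measure reduces to a Birnbaum-weighted share of each parameter; a generic sketch, where R denotes a system risk metric and q_i the component failure probabilities (illustrative notation, not copied from the study):

    \mathrm{DIM}_i \;=\; \frac{\dfrac{\partial R}{\partial q_i}\,\delta q_i}{\sum_j \dfrac{\partial R}{\partial q_j}\,\delta q_j}
    \;=\; \frac{B_i\, q_i}{\sum_j B_j\, q_j}
    \quad\text{when } \delta q_j = \omega\, q_j \ \text{for all } j,
    \qquad B_i := \frac{\partial R}{\partial q_i},

where B_i is the Birnbaum measure of component i.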

Relevance:

10.00%

Publisher:

Abstract:

Vinylphosphonic acid (VPA) was polymerized at 80 °C by free radical polymerization to give polymers (PVPA) of different molecular weights depending on the initiator concentration. The highest molecular weight, Mw, achieved was 6.2 × 10⁴ g/mol, as determined by static light scattering. High-resolution nuclear magnetic resonance (NMR) spectroscopy was used to gain microstructure information about the polymer chain. Information based on tetrad probabilities was used to deduce an almost atactic configuration. In addition, ¹³C NMR gave evidence for the presence of head-head and tail-tail links. Refined analysis of the ¹H NMR spectra allowed the quantitative determination of the fraction of these links (23.5% of all links). Experimental evidence suggested that the polymerization proceeded via cyclopolymerization, with the vinylphosphonic acid anhydride as an intermediate. Titration curves indicated that high molecular weight poly(vinylphosphonic acid) (PVPA) behaved as a monoprotic acid. Proton conductors with phosphonic acid moieties as protogenic groups are promising because of their high charge carrier concentration, thermal stability, and resistance to oxidation. Blends and copolymers of PVPA have already been reported, but PVPA itself has not been characterized sufficiently with respect to its polymer properties. Therefore, we also studied the proton conductivity behaviour of a well-characterized PVPA. PVPA is a proton conductor; however, the conductivity depends strongly on the water content of the material. The phosphonic acid functionality in the resulting polymer, PVPA, undergoes condensation at elevated temperature, leading to the formation of phosphonic anhydride groups. Anhydride formation was found to be temperature dependent by solid-state NMR. Anhydride formation affects the proton conductivity to a large extent, because not only the number of charge carriers but also the mobility of the charge carriers appears to change.
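For context, measured tetrad intensities are usually compared with Bernoullian (zero-order Markov) statistics; writing P_m for the probability of a meso placement, the expected tetrad fractions are (a standard textbook relation, not specific to this work):

    mmm = P_m^{3}, \quad mmr = 2P_m^{2}(1-P_m), \quad rmr = P_m(1-P_m)^{2},
    \quad mrm = P_m^{2}(1-P_m), \quad rrm = 2P_m(1-P_m)^{2}, \quad rrr = (1-P_m)^{3},

so that a nearly atactic chain corresponds to P_m ≈ 0.5.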

Relevance:

10.00%

Publisher:

Abstract:

This doctoral thesis is part of the agreement between ARPA-SIMC (the funding institution), the Regional Civil Protection Agency and the Department of Earth and Geological-Environmental Sciences of the University of Bologna. The main objective is the determination of possible rainfall thresholds for the triggering of landslides in Emilia-Romagna, to be used as a forecasting support tool in the Civil Protection operations room. In such a complex geological context, a traditional empirical approach is not sufficient to discriminate unambiguously between triggering and non-triggering rainfall events, and in general the data are too scattered to draw a statistically significant threshold. It was therefore decided to apply a rigorous Bayesian statistical approach, innovative in that it computes the probability of landslide occurrence given a certain rainfall event, P(A|B), considering not only the rainfall that triggered landslides (i.e. the conditional probability of a certain rainfall event given the occurrence of a landslide, P(B|A)), but also the non-triggering rainfall (i.e. the prior probability of a rainfall event, P(B)). The Bayesian approach was applied to the time interval between 1939 and 2009. The resulting probability isolines minimise false alarms and can easily be implemented in a regional warning system, but they may have limited forecasting power for phenomena that are not represented in the historical dataset or that occur under anomalous conditions. Examples are shallow landslides evolving into debris flows, extremely rare over the last 70 years but recently increasing in frequency. This problem was addressed by testing the predictive variability of several physically based models specifically developed for this purpose, including X-SLIP (Montrasio et al., 1998), SHALSTAB (SHALlow STABility model, Montgomery & Dietrich, 1994), Iverson (2000), TRIGRS 1.0 (Baum et al., 2002), and TRIGRS 2.0 (Baum et al., 2008).
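A minimal sketch of the underlying Bayesian update, assuming a historical catalogue in which rainfall events are binned by intensity and flagged as triggering or not (the bins and counts below are made up purely for illustration):

    # Bayes-rule sketch: P(landslide | rain in bin) from historical counts.
    rain_bins = ["low", "moderate", "intense"]
    n_events = {"low": 900, "moderate": 300, "intense": 60}      # all rainfall events per bin
    n_triggering = {"low": 5, "moderate": 30, "intense": 25}     # events that triggered landslides

    n_total = sum(n_events.values())
    n_landslides = sum(n_triggering.values())

    p_landslide = n_landslides / n_total                  # P(A): prior probability that a rainfall event triggers a landslide
    for b in rain_bins:
        p_rain = n_events[b] / n_total                    # P(B): probability of a rainfall event in this bin
        p_rain_given_ls = n_triggering[b] / n_landslides  # P(B|A)
        p_ls_given_rain = p_rain_given_ls * p_landslide / p_rain   # Bayes: P(A|B)
        print(f"{b:9s}  P(landslide | rain) = {p_ls_given_rain:.3f}")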

Relevance:

10.00%

Publisher:

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast, and then explain the consistency and comparison tests used in CSEP experiments to evaluate the performance of the models. Introducing a methodology for creating ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered essential for transforming exceedance rates into exceedance probabilities in the PSHA framework. We then present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to account for epistemic uncertainty in PSHA. The most widely used is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree and show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
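As a pointer to the rate-to-probability step mentioned above: under the Poisson assumption, the probability of at least one exceedance in an exposure time follows directly from the annual exceedance rate. The sketch below also shows a simple weighted average of several models' rates as one possible ensemble; the rates and weights are illustrative assumptions, not the thesis' method.

    import math

    def poisson_exceedance_prob(rate_per_year: float, years: float) -> float:
        """P(at least one exceedance in 'years') for a Poisson process with the given annual rate."""
        return 1.0 - math.exp(-rate_per_year * years)

    # Hypothetical combination of several models' annual exceedance rates via a weighted average.
    model_rates = [0.0021, 0.0010, 0.0035]   # exceedance rates per year, one per model
    weights     = [0.5, 0.3, 0.2]            # must sum to 1

    ensemble_rate = sum(w * r for w, r in zip(weights, model_rates))
    print(poisson_exceedance_prob(ensemble_rate, years=50))   # e.g. 50-year exceedance probability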

Relevance:

10.00%

Publisher:

Abstract:

The original idea of the thesis draws on interrelated assumptions: 1) among the tools used in the markets for technology for the acquisition of external knowledge, licensing agreements are acknowledged as one of the most important contractual mechanisms; 2) the liabilities of newness and of smallness force new ventures to rely strongly on external knowledge sources; despite the relevance of this topic, little attention has been paid so far to its investigation, especially in the licensing context; 3) there is nowadays an increasing trend in licensing practices, but the literature on markets for technology focuses almost exclusively on the incentives and rationales that foster firms' decisions to trade their technologies, under-investigating the role of the acquiring firm (the licensee) and thus overlooking the demand side of the market. The thesis therefore investigates the inward licensing phenomenon within the context of new ventures. The main questions that a new venture licensee has to address if it decides to undertake an inward licensing strategy can be summarized as follows: 1) Is it convenient for a new venture to choose inward licensing as its initial technology strategy? 2) Does this decision affect its survival probabilities? 3) Does the age at which a new venture becomes a licensee affect its innovative capabilities? Is it better to undertake a licensing-in strategy soon after founding, or to postpone it until the new venture has accumulated significant resources? The findings suggest that new venture licensees survive less than their non-licensee counterparts; that survival rates are directly connected to the time taken by firms to reach the market; and that being engaged in licensing-in deals some years after inception allows a new venture licensee to increase its subsequent capacity to produce innovations.

Relevance:

10.00%

Publisher:

Abstract:

The topic of this work is nonparametric permutation-based methods aimed at finding a ranking (stochastic ordering) of a given set of groups (populations), gathering together information from multiple variables under more than one experimental design. The problem of ranking populations arises in several fields of science from the need to compare G > 2 groups or treatments when the main goal is to find an order while taking several aspects into account. As can be imagined, this problem is not only of theoretical interest: it also has recognised relevance in several fields, such as industrial experiments or the behavioural sciences, and this is reflected by the vast literature on the topic, although the problem is sometimes associated with different keywords such as "stochastic ordering", "ranking" or "construction of composite indices", or even "ranking probabilities" outside the strictly statistical literature. The properties of the proposed method are empirically evaluated by means of an extensive simulation study, in which several aspects of interest are allowed to vary within a reasonable practical range: sample size, number of variables, number of groups, and distribution of the noise/error. The flexibility of the approach lies mainly in the several available choices for the test statistic and in the different types of experimental design that can be analysed. This renders the method easy to tailor to the specific problem and to the nature of the data at hand. To perform the analyses, an R package called SOUP (Stochastic Ordering Using Permutations) has been written and is available on CRAN.
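The pairwise building block behind such rankings can be sketched with a plain two-sample permutation test; this is a generic Python illustration, not the API of the SOUP R package mentioned above.

    import numpy as np

    def permutation_pvalue(x, y, n_perm=10_000, seed=0):
        """One-sided p-value for H1: mean(x) > mean(y), via random permutations of the pooled sample."""
        rng = np.random.default_rng(seed)
        pooled = np.concatenate([x, y])
        observed = x.mean() - y.mean()
        count = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            diff = pooled[: len(x)].mean() - pooled[len(x):].mean()
            count += diff >= observed
        return (count + 1) / (n_perm + 1)

    # Toy data: a small p-value supports placing group A above group B in the ranking.
    a = np.array([5.1, 4.8, 5.6, 5.3, 4.9])
    b = np.array([4.2, 4.5, 4.1, 4.7, 4.3])
    print(permutation_pvalue(a, b))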

Relevance:

10.00%

Publisher:

Abstract:

Faxaflói bay is a short, wide and shallow bay situated in the southwest of Iceland. Although hosting a rather high level of marine traffic, the area is inhabited by many different species of cetaceans, among which the white-beaked dolphin (Lagenorhynchus albirostris), found here all year round. This study aimed to evaluate the potential effect of increasing marine traffic on white-beaked dolphin distribution and behaviour, and to determine whether a variation in sighting frequencies has occurred over the years 2008–2014. Data on sightings and behaviour, as well as photographic data, were collected daily from the vessels of the whale-watching company "Elding" operating in the bay. The results confirm the importance of this area for white-beaked dolphins, which showed a certain level of site fidelity. Despite the high level of marine traffic, the dolphins appear to tolerate the presence of boats: encounter durations and locations did not differ over the study years, although avoidance strategies increased with the number of vessels. Furthermore, seasonal differences in sighting probabilities with respect to the time of day were found, suggesting the existence of a daily cycle of movements and activities within the bay. This study also documents a major decline in sighting rates over the years, raising concern about the conservation status of the white-beaked dolphin in Icelandic waters. A new dedicated survey is therefore highly recommended in order to document the current population estimate, to better investigate the energetic costs that chronic exposure to disturbance may cause, and to plan a more suitable conservation strategy for the white-beaked dolphin around Iceland.

Relevance:

10.00%

Publisher:

Abstract:

Numerous studies published in the scientific literature over recent decades have shown that the global mean temperature has increased since the beginning of the 20th century. The phenomenon has become more evident since the 1980s; indeed, each of the last three decades has been warmer than the preceding ones. Europe and the Mediterranean area are among the regions where the warming is most marked, especially for maximum temperatures (which since 1951 have risen by +0.39 °C per decade), which have shown larger trends than the minima. The same behaviour has been observed at the national scale (+0.25 °C/dec for the maxima and +0.20 °C/dec for the minima). Alongside the increase in mean values, an increase (decrease) in extreme warm (cold) events has been observed, studied through indices based on the percentiles of the distributions. The debate remains open as to the causes of the changes in extreme events: whether they can be attributed solely to a change in the mean values, i.e. a rigid shift of the distribution, or whether part of the signal is due to a change in its shape, with a consequent change in variability. This thesis fits into this context, with the aim of studying the behaviour of daily temperatures in Trentino-Alto Adige since 1926, looking for changes in the mean and in extreme events in two elevation bands. The mean values of the maximum and minimum temperatures show clear warming over the whole period, especially for the low-elevation maxima (+0.13 ± 0.03 °C/dec), with higher values in spring (+0.22 ± 0.05 °C/dec) and summer (+0.17 ± 0.05 °C/dec). These trends are larger after 1980 and not significant before. The number of days with temperatures above and below the most extreme percentile thresholds (estimated over the whole period) shows a clear increase in warm extremes, with the highest values for the high-elevation maxima (up to +26.8% for the 99th percentile), and a decrease in cold extremes (down to -8.5% for the 1st percentile of the low-elevation minima). Moreover, by estimating the thresholds of a set of percentiles year by year and comparing their trends with that of the median, a non-uniform trend towards higher temperatures was observed for the maxima only, with the lower (upper) percentiles characterised by smaller (larger) trends than the median, suggesting a widening of the PDF.
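A minimal sketch of the kind of percentile-based index and trend estimate described above, run on synthetic daily maxima (fixed full-period thresholds, yearly exceedance counts, least-squares trend); the data and numbers are made up for illustration:

    import numpy as np

    rng = np.random.default_rng(42)
    years = np.arange(1926, 2010)
    # Synthetic daily maximum temperatures: 365 values per year with a weak warming trend.
    tmax = [15 + 0.01 * (y - 1926) + 8 * rng.standard_normal(365) for y in years]

    # Thresholds estimated over the whole period, as in fixed-baseline percentile indices.
    all_days = np.concatenate(tmax)
    p99, p01 = np.percentile(all_days, [99, 1])

    warm_days = np.array([(t > p99).sum() for t in tmax])   # days above the 99th percentile, per year
    cold_days = np.array([(t < p01).sum() for t in tmax])   # days below the 1st percentile, per year

    # Linear trends (counts per decade) via least squares.
    warm_trend = 10 * np.polyfit(years, warm_days, 1)[0]
    cold_trend = 10 * np.polyfit(years, cold_days, 1)[0]
    print(f"warm extremes: {warm_trend:+.2f} days/decade, cold extremes: {cold_trend:+.2f} days/decade")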

Relevance:

10.00%

Publisher:

Abstract:

This thesis investigates one-dimensional random walks in random environment (RWRE) whose transition probabilities may have infinite variance. The ergodicity of the dynamical system "from the point of view of the particle" is proved under the assumptions of transitivity and the existence of an absolutely continuous steady state on the space of environments. We show that, if the average of the local drift over the environments is summable and null, then the RWRE is recurrent. We provide an example satisfying all the hypotheses.
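As a toy illustration of the setting (nearest-neighbour case only, far simpler than the infinite-variance framework of the thesis): first sample a random environment of site-dependent jump probabilities, then run the walk in that fixed environment.

    import numpy as np

    rng = np.random.default_rng(0)

    # Random environment: p[x] = probability of stepping right from site x.
    # Here the p's are i.i.d. uniform on (0.4, 0.6); this is only an illustrative choice.
    L = 10_001                     # sites indexed 0 .. 10000, origin at L // 2
    p = rng.uniform(0.4, 0.6, L)

    def walk(n_steps: int, start: int = L // 2) -> int:
        """Run a nearest-neighbour random walk in the fixed environment p; return the displacement."""
        x = start
        for _ in range(n_steps):
            x += 1 if rng.random() < p[x] else -1
        return x - L // 2

    print([walk(5000) for _ in range(5)])   # a few sample endpoints in the same environment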

Relevance:

10.00%

Publisher:

Abstract:

The aim of this in vitro study was to compare the performance of two laser fluorescence devices (LF, LFpen), conventional visual criteria (VE), ICDAS and bitewing radiographic examination (BW) on occlusal surfaces of primary teeth. Thirty-seven primary human molars were selected from a pool of extracted teeth that had been stored frozen at -20 °C until use. The teeth were assessed twice by two experienced examiners using the laser fluorescence devices (LF and LFpen), conventional visual criteria, ICDAS and bitewing radiographs, with a 2-week interval between measurements. After the measurements, the teeth were histologically prepared and assessed for caries extension. The highest sensitivity was observed for ICDAS at the D1 and D3 thresholds, with no statistically significant difference compared to the LF devices, except at the D3 threshold. Bitewing radiographs presented the lowest sensitivity values. Specificity at D1 was highest for LFpen (0.90), and for VE at D3 (0.94). When VE was combined with LFpen, the post-test probabilities were the highest (94.0% and 89.2% at the D1 and D3 thresholds, respectively). High values were also observed for the combination of ICDAS and LFpen (92.0% and 80.0%, respectively). LF and LFpen showed the highest ICC values for interexaminer reproducibility. However, for ICDAS, BW and VE, intraexaminer reproducibility was not the same for the two examiners. After initial visual inspection, whether using ICDAS or not, the use of LFpen may aid in the detection of occlusal caries in primary teeth. Bitewing radiographs may be indicated only for approximal caries detection.
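For reference, a post-test probability combines sensitivity, specificity and the pre-test probability (prevalence) via Bayes' rule; a small worked sketch in which the specificity of 0.90 is taken from the abstract, while the sensitivity and prevalence are assumed values for illustration only:

    def post_test_probability(sensitivity: float, specificity: float, pre_test: float) -> float:
        """Probability of disease given a positive test result (positive predictive value)."""
        true_pos = sensitivity * pre_test
        false_pos = (1.0 - specificity) * (1.0 - pre_test)
        return true_pos / (true_pos + false_pos)

    # Specificity 0.90 as reported for LFpen at D1; sensitivity 0.85 and prevalence 0.60 are assumptions.
    print(round(post_test_probability(sensitivity=0.85, specificity=0.90, pre_test=0.60), 3))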

Relevance:

10.00%

Publisher:

Abstract:

In older patients with acute myeloid leukemia (AML), the prevention of relapse has remained one of the major therapeutic challenges, with more than 75% of patients relapsing after complete remission. The anti-CD33 immunotoxin conjugate gemtuzumab ozogamicin (GO) has shown antileukemic remission induction activity in patients with relapsed AML. Patients with AML or refractory anemia with excess blasts in first complete remission attained after intensive induction chemotherapy were randomized between 3 cycles of GO (6 mg/m² every 4 weeks) or no postremission therapy (control) to assess whether GO would improve outcome. The 2 treatment groups (113 patients receiving GO vs 119 control patients) were comparable with regard to age (60-78 years, median 67 years), performance status, and cytogenetics. A total of 110 of the 113 patients received at least 1 cycle of GO, and 65 of 113 completed the 3 cycles. Premature discontinuation was mainly attributable to incomplete hematologic recovery or intercurrent relapse. Median time to recovery of platelets to 50 × 10⁹/L and of neutrophils to 0.5 × 10⁹/L after GO was 14 days and 20 days, respectively. Nonhematologic toxicities were mild overall, but there was 1 toxic death caused by liver failure. There were no significant differences between the treatment groups with regard to relapse probabilities, nonrelapse mortality, overall survival, or disease-free survival (17% vs 16% at 5 years). Postremission treatment with GO in older AML patients does not provide a benefit for any clinical end point. The HOVON-43 study is registered at The Netherlands Trial Registry (number NTR212) and at http://www.controlled-trials.com as ISRCTN77039377.

Relevance:

10.00%

Publisher:

Abstract:

Doubly charged ion mass spectra of alkyl-substituted furans and pyrroles were obtained using a double-focusing magnetic mass spectrometer operated at 3.2 kV accelerating voltage. Molecular ions were the dominant species in the doubly charged spectra of the lower-molecular-weight heterocyclic compounds, whereas the spectra of the higher-molecular-weight homologues were typified by abundant fragment ions from extensive decomposition. Measured doubly charged ionization and appearance energies ranged from 22.8 to 47.9 eV. The ionization energies were correlated with values calculated using self-consistent field–molecular orbital techniques. A multichannel diabatic curve-crossing model was developed to investigate the fundamental organic ion reactions responsible for the development of doubly charged ion mass spectra. Probabilities for Landau–Zener-type transitions between reactant and product curves were determined and used in the collision model to predict charge-transfer cross sections, which compared favorably with experimental cross sections obtained using time-of-flight techniques.
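For reference, the single-passage Landau–Zener probability of remaining on a diabatic curve at a crossing, which underlies transition probabilities of this kind, has the textbook form (generic notation, not the specific parametrisation used in this work):

    P_{\mathrm{LZ}} \;=\; \exp\!\left( - \frac{2\pi\, |H_{12}|^{2}}{\hbar\, v\, |F_{1} - F_{2}|} \right),

where H_{12} is the diabatic coupling at the crossing point, v the relative velocity along the reaction coordinate, and F_{1}, F_{2} the slopes of the two diabatic curves at the crossing.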

Relevance:

10.00%

Publisher:

Abstract:

The WHO fracture risk assessment tool FRAX® is a computer-based algorithm that provides models for the assessment of fracture probability in men and women. The approach uses easily obtained clinical risk factors (CRFs) to estimate the 10-year probability of a major osteoporotic fracture (hip, clinical spine, humerus or wrist fracture) and the 10-year probability of a hip fracture. The estimate can be used alone or together with femoral neck bone mineral density (BMD) to enhance fracture risk prediction. FRAX® is the only risk engine that takes into account the hazard of death as well as that of fracture. The probability of fracture is calculated in men and women from age, body mass index, and dichotomized variables comprising a prior fragility fracture, parental history of hip fracture, current tobacco smoking, ever long-term use of oral glucocorticoids, rheumatoid arthritis, other causes of secondary osteoporosis, and alcohol consumption of 3 or more units daily. The relationship between risk factors and fracture probability was constructed using information from nine population-based cohorts from around the world. The CRFs for fracture had been identified in a series of meta-analyses as providing independent information on fracture risk. The FRAX® algorithm was validated in 11 independent cohorts with in excess of 1 million patient-years, including the Swiss SEMOF cohort. Since fracture risk varies markedly in different regions of the world, FRAX® models need to be calibrated to those countries where the epidemiology of fracture and death is known. Models are currently available for 31 countries across the world. The Swiss-specific FRAX® model was developed very soon after the first release of FRAX® in 2008 and was published in 2009, using Swiss epidemiological data and integrating the fracture risk and death hazard of our country. Two FRAX®-based approaches may be used to explore intervention thresholds; both have recently been investigated in the Swiss setting. In the first approach, the guideline that individuals with a fracture probability equal to or exceeding that of women with a prior fragility fracture should be considered for treatment is translated into thresholds using 10-year fracture probabilities. In that case the threshold is age-dependent and increases from 16% at the age of 60 years to 40% at the age of 80 years. The second approach is a cost-effectiveness approach. Using a FRAX®-based intervention threshold of 15% for both women and men aged 50 years and older should permit cost-effective access to therapy for patients at high fracture probability in our country and thereby contribute to further reducing the growing burden of osteoporotic fractures.
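A small sketch of the two intervention-threshold rules described above, assuming simple linear interpolation of the age-dependent threshold between the quoted anchor points (16% at 60 years, 40% at 80 years); the interpolation and the example inputs are illustrative assumptions, not the published Swiss model.

    def age_dependent_threshold(age: float) -> float:
        """10-year major-fracture probability threshold (%), interpolated linearly
        between 16% at age 60 and 40% at age 80 (interpolation is an assumption)."""
        lo_age, hi_age, lo_thr, hi_thr = 60.0, 80.0, 16.0, 40.0
        age = min(max(age, lo_age), hi_age)
        return lo_thr + (hi_thr - lo_thr) * (age - lo_age) / (hi_age - lo_age)

    FIXED_THRESHOLD = 15.0   # cost-effectiveness threshold (%) quoted in the abstract

    def consider_treatment(age: float, frax_probability: float) -> dict:
        """Compare a patient's 10-year FRAX probability (%) against both rules."""
        return {
            "age_dependent_rule": frax_probability >= age_dependent_threshold(age),
            "fixed_15pct_rule": frax_probability >= FIXED_THRESHOLD,
        }

    print(consider_treatment(age=70, frax_probability=22.0))
    # at age 70 the age-dependent threshold is 28%, so only the fixed 15% rule is met here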

Relevance:

10.00%

Publisher:

Abstract:

In birds with facultative brood reduction, survival of the junior chick is thought to be regulated primarily by food availability. In black-legged kittiwakes (Rissa tridactyla), brood reduction still occurs and varies interannually even when parents and chicks are provided with unlimited access to supplemental food during the breeding season. Survival of the junior chick is therefore affected by factors beyond the amount of food directly available to it. Maternally deposited yolk androgens affect competitive dynamics within a brood and may be one of the mechanisms by which mothers mediate brood reduction in response to a suite of environmental and physiological cues. The goal of this study was to determine whether food supplementation during the pre-lay period affected patterns of yolk androgen deposition in free-living kittiwakes in two years (2003 and 2004) that differed in natural food availability. Chick survival was measured concurrently in other nests from which eggs were not collected. In both years, supplemental feeding increased female investment in eggs by increasing egg mass. First-laid ("A") eggs were heavier but contained less testosterone and androstenedione than second-laid ("B") eggs across years and treatments. Yolk testosterone was higher across treatments in 2003, the year with higher B-chick survival. The difference in yolk testosterone levels between eggs within a clutch varied among years and treatments: it was relatively small when B chicks experienced the lowest and the highest survival probabilities, and larger at intermediate B-chick survival probabilities. The magnitude of testosterone asymmetry in a clutch may allow females to optimize fitness by either predisposing a brood for reduction or facilitating survival of younger chicks.