887 results for "probability of occurrence"


Relevance:

100.00%

Publisher:

Abstract:

The detection of signals in the presence of noise is one of the most basic and important problems encountered by communication engineers. Although the literature abounds with analyses of communications in Gaussian noise, relatively little work has appeared dealing with communications in non-Gaussian noise. In this thesis several digital communication systems disturbed by non-Gaussian noise are analysed. The thesis is divided into two main parts. In the first part, a filtered-Poisson impulse noise model is utilized to calculate error probability characteristics of a linear receiver operating in additive impulsive noise. Firstly, the effect that non-Gaussian interference has on the performance of a receiver that has been optimized for Gaussian noise is determined. The factors affecting the choice of modulation scheme so as to minimize the detrimental effects of non-Gaussian noise are then discussed. In the second part, a new theoretical model of impulsive noise that fits well with the observed statistics of noise in radio channels below 100 MHz has been developed. This empirical noise model is applied to the detection of known signals in the presence of noise to determine the optimal receiver structure. The performance of such a detector has been assessed and is found to depend on the signal shape, the time-bandwidth product, as well as the signal-to-noise ratio. The optimal signal to minimize the probability of error of the detector is determined. Attention is then turned to the problem of threshold detection. Detector structure, large-sample performance and robustness against errors in the detector parameters are examined. Finally, estimators of such parameters as the occurrence of an impulse and the parameters in an empirical noise model are developed for the case of an adaptive system with slowly varying conditions.
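
To make the error-rate comparison concrete, here is a minimal Monte Carlo sketch in Python of a sign detector designed for Gaussian noise operating in impulsive noise. A simple Bernoulli-Gaussian mixture stands in for the thesis's filtered-Poisson model, and all parameter values (impulse probability, impulse power ratio) are illustrative assumptions.

```python
# Illustrative Monte Carlo sketch (not the thesis's filtered-Poisson model):
# bit-error rate of a receiver optimized for Gaussian noise (sign detector)
# when the additive noise is instead impulsive. A two-component
# Bernoulli-Gaussian mixture stands in for impulsive noise.
import numpy as np

rng = np.random.default_rng(0)

def ber(snr_db, n_bits=200_000, impulsive=False, p_impulse=0.01, impulse_gain=100.0):
    """Estimate BER for antipodal (BPSK-like) signalling with a sign detector."""
    eb = 1.0                              # energy per bit
    n0 = eb / (10 ** (snr_db / 10))       # noise spectral density from SNR
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0                  # +1 / -1 symbols
    sigma2 = n0 / 2
    if impulsive:
        # Occasional high-variance "impulses" on top of the background noise
        hit = rng.random(n_bits) < p_impulse
        var = np.where(hit, impulse_gain * sigma2, sigma2)
    else:
        var = np.full(n_bits, sigma2)
    noise = rng.normal(0.0, np.sqrt(var))
    decisions = (s + noise) > 0
    return np.mean(decisions != bits)

for snr in (0, 4, 8):
    print(snr, "dB  Gaussian:", ber(snr), " impulsive:", ber(snr, impulsive=True))
```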

Relevance:

100.00%

Publisher:

Abstract:

This work presents a two-dimensional risk assessment method based on quantification of the probability of occurrence of contaminant source terms, as well as assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation methods, whereby synthetic contaminant source terms are generated following the same distribution as historically occurring pollution events or an a priori potential probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as the risk. The utility of the method is demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together is evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as trends within the study area. The capability to generate synthetic pollution events from numerous potential sources of pollution, based on the historical frequency of their occurrence, is a great asset of the method and a clear advantage over contemporary methods.
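
The exceedance-counting definition of risk can be sketched in a few lines of Python; the lognormal source-term distribution, the exponential attenuation used in place of the integrated groundwater models, and all parameter values below are hypothetical stand-ins.

```python
# A minimal sketch of the exceedance-counting idea. The lognormal source-term
# distribution, the exponential-attenuation "transport model", and all
# parameter values are hypothetical stand-ins for the integrated models
# used in the paper.
import numpy as np

rng = np.random.default_rng(1)

n_realisations = 10_000
threshold = 50.0                              # user-defined concentration limit (mg/L)
distances = np.array([50.0, 200.0, 500.0])    # monitoring boreholes (m)

# Synthetic source terms drawn from a (hypothetical) historical distribution
source_conc = rng.lognormal(mean=4.0, sigma=1.0, size=n_realisations)

# Placeholder transport: simple exponential attenuation with distance
decay_length = 150.0
conc_at_wells = source_conc[:, None] * np.exp(-distances[None, :] / decay_length)

# Risk = fraction of realisations in which the threshold is exceeded at each well
risk = (conc_at_wells > threshold).mean(axis=0)
for d, r in zip(distances, risk):
    print(f"borehole at {d:.0f} m: exceedance probability = {r:.3f}")
```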

Relevance:

100.00%

Publisher:

Abstract:

Everyday human behaviour relies on our ability to predict outcomes on the basis of moment-by-moment information. Long-range neural phase synchronization has been hypothesized as a mechanism by which ‘predictions’ can exert an effect on the processing of incoming sensory events. Using magnetoencephalography (MEG), we have studied the relationship between the modulation of phase synchronization in a cerebral network of areas involved in visual target processing and the predictability of target occurrence. Our results reveal a striking increase in the modulation of phase synchronization associated with an increased probability of target occurrence. These observations are consistent with the hypothesis that long-range phase synchronization plays a critical functional role in humans' ability to effectively employ predictive heuristics.
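
One standard way to quantify phase synchronization between two signals is the phase-locking value (PLV); the sketch below, on synthetic 10 Hz signals, illustrates that general measure only and is not the MEG analysis pipeline of this study.

```python
# Phase-locking value (PLV) on synthetic data: 1 = perfectly locked phases,
# 0 = no consistent phase relation. Illustrative only.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(2)
fs = 500
t = np.arange(0, 2, 1 / fs)                      # 2 s at 500 Hz

# Two 10 Hz signals with a partially consistent phase relation plus noise
phase_jitter = 0.3 * rng.standard_normal(t.size)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.5 + phase_jitter) + 0.5 * rng.standard_normal(t.size)

phi_x = np.angle(hilbert(x))                     # instantaneous phases
phi_y = np.angle(hilbert(y))
plv = np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))
print(f"phase-locking value: {plv:.2f}")
```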

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To analyze, in a general population sample, clustering of delusional and hallucinatory experiences in relation to environmental exposures and clinical parameters. METHOD: General population-based household surveys of randomly selected adults between 18 and 65 years of age were carried out. SETTING: 52 countries participating in the World Health Organization's World Health Survey were included. PARTICIPANTS: 225 842 subjects (55.6% women), from nationally representative samples, with an individual response rate of 98.5% within households, participated. RESULTS: Compared with isolated delusions and hallucinations, co-occurrence of the two phenomena was associated with poorer outcomes, including worse general health and functioning status (OR = 0.93; 95% CI: 0.92-0.93), greater severity of symptoms (OR = 2.5; 95% CI: 2.0-3.0), higher probability of lifetime diagnosis of psychotic disorder (OR = 12.9; 95% CI: 11.5-14.4), lifetime treatment for psychotic disorder (OR = 19.7; 95% CI: 17.3-22.5), and depression during the last 12 months (OR = 11.6; 95% CI: 10.9-12.4). Co-occurrence was also associated with adversity and hearing problems (OR = 2.0; 95% CI: 1.8-2.3). CONCLUSION: The results suggest that the co-occurrence of hallucinations and delusions in populations is not random but, compared with either phenomenon in isolation, can be seen as the result of greater etiologic loading leading to a more severe clinical state.
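
For readers unfamiliar with the statistics reported above, this is how an odds ratio and its 95% confidence interval are typically computed from a 2x2 table; the counts below are invented for illustration and are not taken from the World Health Survey.

```python
# Odds ratio and 95% CI from a 2x2 table (hypothetical counts, not study data).
import math

a, b = 120, 80     # exposed: with outcome / without outcome
c, d = 60, 140     # unexposed: with outcome / without outcome

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)        # SE of ln(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}; 95% CI: {lo:.2f}-{hi:.2f}")
```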

Relevance:

100.00%

Publisher:

Abstract:

Extensive data sets on water quality and seagrass distributions in Florida Bay have been assembled under complementary, but independent, monitoring programs. This paper presents the landscape-scale results from these monitoring programs and outlines a method for exploring the relationships between two such data sets. Seagrass species occurrence and abundance data were used to define eight benthic habitat classes from 677 sampling locations in Florida Bay. Water quality data from 28 monitoring stations spread across the Bay were used to construct a discriminant function model that assigned a probability of a given benthic habitat class occurring for a given combination of water quality variables. Mean salinity, salinity variability, the amount of light reaching the benthos, sediment depth, and mean nutrient concentrations were important predictor variables in the discriminant function model. Using a cross-validated classification scheme, this discriminant function identified the most likely benthic habitat type as the actual habitat type in most cases. The model predicted that the distribution of benthic habitat types in Florida Bay would likely change if water quality and water delivery were changed by human engineering of freshwater discharge from the Everglades. Specifically, an increase in the seasonal delivery of freshwater to Florida Bay should cause an expansion of seagrass beds dominated by Ruppia maritima and Halodule wrightii at the expense of the Thalassia testudinum-dominated community that now occurs in northeast Florida Bay. These statistical techniques should prove useful for predicting landscape-scale changes in community composition in diverse systems where communities are in quasi-equilibrium with environmental drivers.
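
A minimal sketch of the discriminant-function idea, using scikit-learn's LinearDiscriminantAnalysis on synthetic data: water-quality predictors yield a probability for each benthic habitat class, evaluated with cross-validation. The variables, classes, and data here are hypothetical and do not reproduce the study's model.

```python
# Discriminant analysis sketch: class probabilities for habitat types from
# water-quality variables. All data and labels are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 300
# Columns stand in for: mean salinity, salinity variability, light at benthos,
# sediment depth, mean nutrient concentration
X = rng.normal(size=(n, 5))
# Three hypothetical habitat classes loosely separated along the first two variables
y = ((X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)
     + (X[:, 0] > 1.0).astype(int))

lda = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())

lda.fit(X, y)
new_station = rng.normal(size=(1, 5))
print("class probabilities for a new station:", lda.predict_proba(new_station).round(2))
```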

Relevance:

100.00%

Publisher:

Abstract:

The use of behavioural indicators of suffering and welfare in captive animals has produced ambiguous results. In comparisons between groups, those in worse condition tend to exhibit an increased overall rate of Behaviours Potentially Indicative of Stress (BPIS), but within groups, individuals differ in their stress coping strategies. This dissertation presents analyses of the behavioural profile of a sample of 26 captive capuchin monkeys of three different species (Sapajus libidinosus, S. flavius and S. xanthosternos), kept in different enclosure types. In total, 147.17 hours of data were collected. We explored four types of analysis: activity budgets, diversity indexes, Markov chains and sequence analyses, and social network analyses, resulting in nine indexes of behavioural occurrence and organization. In Chapter One we explore group differences. Results support predictions of minor sex and species differences and major differences in behavioural profile due to enclosure type: i. individuals in less enriched enclosures exhibited a more diverse BPIS repertoire and a decreased probability of sequences of six Genus Normative Behaviours; ii. the number of most probable behavioural transitions including at least one BPIS was higher in less enriched enclosures; iii. prominence indexes indicate that BPIS function as dead ends of behavioural sequences, and the prominence of three BPIS (pacing, self-direct, Active 1) was higher in less enriched enclosures. Overall, these data do not support BPIS as a repetitive pattern with a mantra-like calming effect. Rather, the picture that emerges is more supportive of BPIS as activities that disrupt the organization of behaviour, introducing “noise” that compromises an optimal activity budget. In Chapter Two we explored individual differences in stress coping strategies. We classified individuals along six axes of exploratory behaviour. These were only weakly correlated, indicating low correlation among behavioural indicators of syndromes. Nevertheless, the results are suggestive of two broad stress coping strategies, similar to the bold/proactive and shy/reactive pattern: more exploratory capuchin monkeys exhibited increased prominence values for pacing, aberrant sexual display and Active 1 BPIS, while less active animals exhibited increased probability of significant sequences involving at least one BPIS and increased prominence of their own stereotypies. Capuchin monkeys are known for their cognitive capacities and behavioural flexibility; therefore, the search for a consistent set of behavioural indicators of welfare and individual differences requires further studies and larger data sets. With this work we aim to contribute to the design of scientifically grounded and statistically sound protocols for the collection of behavioural data that permit comparability of results and meta-analyses, whatever theoretical interpretation they may receive.
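
As an illustration of the Markov-chain component of such a behavioural profile, the sketch below estimates a first-order transition probability matrix from a coded behaviour sequence; the behaviour codes and the sequence are invented and do not reproduce the dissertation's categories or results.

```python
# First-order transition probability matrix from a coded behaviour sequence.
# Behaviour codes and the sequence are hypothetical.
import numpy as np

behaviours = ["forage", "rest", "social", "pacing"]
idx = {b: i for i, b in enumerate(behaviours)}

sequence = ["forage", "forage", "social", "rest", "pacing", "pacing",
            "rest", "forage", "social", "social", "rest", "pacing"]

counts = np.zeros((len(behaviours), len(behaviours)))
for current, nxt in zip(sequence[:-1], sequence[1:]):
    counts[idx[current], idx[nxt]] += 1

# Row-normalise to obtain P(next behaviour | current behaviour)
transition = counts / counts.sum(axis=1, keepdims=True)
for b, row in zip(behaviours, transition):
    print(b, "->", dict(zip(behaviours, row.round(2))))
```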

Relevance:

100.00%

Publisher:

Abstract:

A sudden hydrocarbon influx from the formation into the wellbore poses a serious risk to the safety of the well. This sudden influx is termed a kick, which, if not controlled, may lead to a blowout. Therefore, early detection of the kick is crucial to minimize the possibility of a blowout. A kick detection system based exclusively on surface monitoring is prone to delayed detection, among other issues. Down-hole monitoring techniques have the potential to detect a kick at an early stage, and can be particularly beneficial when the influx occurs as a result of lost circulation. In a lost circulation scenario, when the down-hole pressure becomes lower than the formation pore pressure, formation fluid may start to enter the wellbore. The lost volume of drilling fluid is compensated by the formation fluid flowing into the wellbore, making it difficult to identify the kick from pit (mud tank) volume observations at the surface. This experimental study investigates the occurrence of a kick based on relative changes in the mass flow rate, pressure, density, and conductivity of the fluid down-hole. Moreover, the parameters that are most sensitive to formation fluid are identified, and a methodology to detect a kick without false alarms is reported. A pressure transmitter, a Coriolis flow and density meter, and a conductivity sensor are employed to observe deteriorating well conditions down-hole. These observations are used to assess the occurrence of a kick and the associated blowout risk. Monitoring multiple down-hole parameters has the potential to improve the accuracy of interpretation of kick occurrence, reduce the number of false alarms, and provide a broad picture of down-hole conditions; it also has the potential to shorten the kick detection period. A down-hole assembly for a laboratory-scale drilling rig model and a kick injection setup were designed, measuring instruments were acquired, a frame was fabricated, and the experimental set-up was assembled and tested. This set-up has the necessary features to evaluate kick events while implementing down-hole monitoring techniques. Various kick events are simulated on the drilling rig model. In the first set of experiments, compressed air (which represents the formation fluid) is injected with a constant pressure margin; in the second set, the compressed air is injected with a different pressure margin. The experiments are also repeated with a different pump (flow) rate. This thesis consists of three main parts. The first part gives the general introduction, motivation, and outline of the thesis, and a brief description of influx: its causes, various leading and lagging indicators, and the several kick detection systems in use in industry. The second part describes the design and construction of the laboratory-scale down-hole assembly of the drilling rig and the kick injection setup, which is used to implement the proposed methodology for early kick detection. The third part describes the methodology for early kick detection and presents and discusses experimental results showing how different influx events affect the mass flow rate, pressure, conductivity, and density of the fluid down-hole. The last chapter contains a summary of the study and directions for future research.
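
A simplified sketch of the multi-parameter detection idea: flag a possible kick only when the relative changes of several down-hole channels exceed direction-aware thresholds together, which is intended to reduce false alarms relative to any single channel. The synthetic data, thresholds, and baseline window are illustrative assumptions, not the thesis's values.

```python
# Multi-parameter kick-flagging sketch on synthetic down-hole channels.
# Thresholds, window length, and the injected "influx" are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n = 600

# Synthetic channels: mass flow rate, pressure, density, conductivity
flow = 10 + 0.05 * rng.standard_normal(n)
pressure = 200 + 0.5 * rng.standard_normal(n)
density = 1.20 + 0.002 * rng.standard_normal(n)
conductivity = 5.0 + 0.02 * rng.standard_normal(n)

# Inject an influx-like event after sample 400: flow and conductivity rise,
# pressure and density drop
flow[400:] += 0.8
pressure[400:] -= 4.0
density[400:] -= 0.04
conductivity[400:] += 0.4

def relative_change(x, window=100):
    """Relative deviation of each sample from the mean of the first `window` samples."""
    baseline = x[:window].mean()
    return (x - baseline) / baseline

thresholds = {"flow": 0.05, "pressure": -0.01, "density": -0.02, "conductivity": 0.05}
signals = {"flow": flow, "pressure": pressure, "density": density, "conductivity": conductivity}

flags = []
for name, thr in thresholds.items():
    rc = relative_change(signals[name])
    flags.append(rc > thr if thr > 0 else rc < thr)   # direction-aware threshold

kick_alarm = np.sum(flags, axis=0) >= 3               # require agreement of >= 3 channels
print("first alarm at sample:", int(np.argmax(kick_alarm)) if kick_alarm.any() else None)
```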

Relevance:

100.00%

Publisher:

Abstract:

Sediment drifts on the continental rise are located proximal to the western side of the Antarctic Peninsula and recorded changes in glacial volume and thermal regime over the last ca. 15 m.y. At Ocean Drilling Program (ODP) Site 1101 (Leg 178), which recovered sediments back to 3.1 Ma, glacial-interglacial cyclicity was identified based on the biogenic component and sedimentary structures observed in X-radiographs, magnetic susceptibility and lithofacies descriptions. Glacial intervals are dominated by fine-grained laminated mud, and interglacial units consist of bioturbated muds enriched in biogenic components. From 2.2 to 0.76 Ma, planktonic foraminifera and calcareous nannofossils dominate in the interglacials, suggesting a shift of the Antarctic Polar Front (APF) to the south near the drifts. Prior to 2.2 Ma, cyclicity cannot be identified; diatoms dominate the biogenic component, and high opal percentages suggest warmer conditions south of the APF and reduced sea ice over the drifts. Analyses of the coarse-grained terrigenous fraction (pebbles and coarse sand) from Sites 1096 and 1101 record glaciers at sea level releasing iceberg-rafted debris (IRD) throughout the last 3.1 m.y. Analyses of quartz sand grains in IRD with the scanning electron microscope (SEM) show an abrupt change in the frequency of occurrence of microtextures at ~1.35 Ma. During the Late Pliocene to Early Pleistocene, the population of quartz grains included completely weathered grains and a low frequency of crushing and abrasion, suggesting that glaciers were small and did not inundate the topography. Debris shed from mountain peaks was transported supraglacially or englacially, allowing weathered grains to pass through the glacier unmodified. During glacial periods from 1.35-0.76 Ma, glaciers expanded in size. The IRD flux was very high and dropstones have diverse lithologies. Conditions resembling those at the Last Glacial Maximum (LGM) have been episodically present on the Antarctic Peninsula since ~0.76 Ma. Quartz sand grains show the high relief, fracturing and abrasion common under thick ice, and the IRD flux is low with a more restricted range of dropstone lithologies.

Relevance:

100.00%

Publisher:

Abstract:

While we know much about poverty (or “low income”) in Canada in a static context, our understanding of the underlying dynamics remains very limited. This is particularly problematic from a policy perspective, and the country has increasingly been left out of international comparisons in this regard. The contribution of this paper is to report the results of an empirical analysis of low income (“poverty”) dynamics in Canada using the recently available “LAD” tax-based database. The paper first describes the general nature of individuals’ poverty profiles (how many are short-term versus long-term, etc.), the breakdown of the poor population in any given year amongst these different types, and the characterisation of poverty profiles by sex and family type. It then reports the estimation of various econometric models, starting with a set which specifies entry into and exit from poverty in any given year as a function of a variety of personal attributes and situational characteristics, including family status and changes therein, province of residence, inter-provincial mobility, language, area size of residence and calendar year (to capture trend effects). A set of proper hazard models then adds duration effects to these specifications to see how exit and re-entry probabilities shift with the amount of time spent in a poverty spell or after having exited a previous spell. A final set of specifications then investigates “occurrence dependence” effects by including past poverty spells, first in an entry model and then with respect to the probability of being poor in a given year. Policy implications are discussed.
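
A hedged sketch of a discrete-time hazard specification of the kind described above: the probability of exiting low income in a given year is modelled by a logistic regression on personal characteristics plus duration dummies (years already spent in the spell). The person-year data below are synthetic; the paper itself uses the LAD tax-based database.

```python
# Discrete-time hazard sketch: logit of exiting a low-income spell, with
# duration dummies capturing duration dependence. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 5000
df = pd.DataFrame({
    "duration": rng.integers(1, 6, n),      # years in the current low-income spell
    "female": rng.integers(0, 2, n),
    "lone_parent": rng.integers(0, 2, n),
})
# Synthetic outcome: exit becomes less likely the longer the spell has lasted
logit_p = 0.5 - 0.4 * df["duration"] - 0.3 * df["lone_parent"] + 0.1 * df["female"]
df["exit"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = pd.get_dummies(df[["female", "lone_parent", "duration"]], columns=["duration"],
                   drop_first=True, dtype=float)
X = sm.add_constant(X)
model = sm.Logit(df["exit"].astype(float), X).fit(disp=0)
print(model.params.round(2))   # negative duration dummies = negative duration dependence
```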

Relevance:

100.00%

Publisher:

Abstract:

The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
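
For a toy alphabet the quantity in question can be computed exactly by brute force, which also shows why approximations are needed at realistic sizes; the letter probabilities below are arbitrary and the model is the first-order (i.i.d. letters) case.

```python
# Brute-force expected number of guesses when words are guessed in decreasing
# order of probability (first-order model, toy alphabet).
import itertools
import numpy as np

letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # arbitrary toy letter probabilities
word_length = 4

words = itertools.product(letter_probs, repeat=word_length)
word_probs = np.array([np.prod([letter_probs[ch] for ch in w]) for w in words])

# Expected guesses = sum over rank r of r * P(word ranked r), probabilities sorted descending
order = np.sort(word_probs)[::-1]
expected_guesses = np.sum(np.arange(1, order.size + 1) * order)
print(f"{order.size} possible words, expected number of guesses = {expected_guesses:.2f}")
```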

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes the inner relations between classical sub-scheme probability and statistical probability, subjective probability and objective probability, prior probability and posterior probability, and transition probability and the probability of utility. It further analyzes, from a mathematical perspective, the goal, method, and practical economic purpose represented by these various probabilities, so as to deepen understanding of their connotations and their relation to economic decision making, thus paving the way for scientific prediction and decision making.

Relevance:

100.00%

Publisher:

Abstract:

My thesis consists of three essays that investigate strategic interactions between individuals engaging in risky collective action in uncertain environments. The first essay analyzes a broad class of incomplete information coordination games with a wide range of applications in economics and politics. The second essay draws from the general model developed in the first essay to study decisions by individuals of whether to engage in protest/revolution/coup/strike. The final essay explicitly integrates state response into the analysis. The first essay, Coordination Games with Strategic Delegation of Pivotality, exhaustively analyzes a class of binary-action, two-player coordination games in which players receive stochastic payoffs only if both players take a “stochastic-coordination action”. Players receive conditionally independent noisy private signals about the normally distributed stochastic payoffs. With this structure, each player can exploit the information contained in the other player's action only when he takes the “pivotalizing action”. This feature has two consequences: (1) when the fear of miscoordination is not too large, in order to utilize the other player's information, each player takes the “pivotalizing action” more often than he would based solely on his private information, and (2) best responses feature both strategic complementarities and strategic substitutes, implying that the game is neither supermodular nor a typical global game. This class of games has applications in a wide range of economic and political phenomena, including war and peace, protest/revolution/coup/strike, interest group lobbying, international trade, and adoption of a new technology. My second essay, Collective Action with Uncertain Payoffs, studies the decision problem of citizens who must decide whether to submit to the status quo or mount a revolution. If they coordinate, they can overthrow the status quo. Otherwise, the status quo is preserved and participants in a failed revolution are punished. Citizens face two types of uncertainty: (a) non-strategic, they are uncertain about the relative payoffs of the status quo and revolution; and (b) strategic, they are uncertain about each other's assessments of the relative payoffs. I draw on the existing literature and historical evidence to argue that uncertainty in the payoffs of the status quo and revolution is intrinsic to politics. Several counter-intuitive findings emerge: (1) Better communication between citizens can lower the likelihood of revolution. In fact, when the punishment for failed protest is not too harsh and citizens' private knowledge is accurate, further communication reduces incentives to revolt. (2) Increasing strategic uncertainty can increase the likelihood of revolution attempts, and even the likelihood of successful revolution. In particular, revolt may be more likely when citizens privately obtain information than when they receive information from a common media source. (3) Two dilemmas arise concerning the intensity and frequency of punishment (repression), and the frequency of protest. Punishment Dilemma 1: harsher punishments may increase the probability that punishment materializes. That is, as the state increases the punishment for dissent, it might also have to punish more dissidents. It is only when the punishment is sufficiently harsh that harsher punishment reduces the frequency of its application. Punishment Dilemma 1 leads to Punishment Dilemma 2: the frequencies of repression and protest can be positively or negatively correlated depending on the intensity of repression. My third essay, The Repression Puzzle, investigates the relationship between the intensity of grievances and the likelihood of repression. First, I make the observation that the occurrence of state repression is a puzzle. If repression is to succeed, dissidents should not rebel. If it is to fail, the state should concede in order to save the costs of unsuccessful repression. I then propose an explanation for the “repression puzzle” that hinges on information asymmetries between the state and dissidents about the costs of repression to the state, and hence the likelihood of its application by the state. I present a formal model that combines the insights of grievance-based and political process theories to investigate the consequences of this information asymmetry for the dissidents' contentious actions and for the relationship between the magnitude of grievances (formulated here as the extent of inequality) and the likelihood of repression. The main contribution of the paper is to show that this relationship is non-monotone: as the magnitude of grievances increases, the likelihood of repression might decrease. I investigate the relationship between inequality and the likelihood of repression in all country-years from 1981 to 1999. To mitigate specification problems, I estimate the probability of repression using a generalized additive model with thin-plate splines (GAM-TPS). This technique allows for a flexible relationship between inequality, the proxy for the costs of repression and revolutions (income per capita), and the likelihood of repression. The empirical evidence supports my prediction that the relationship between the magnitude of grievances and the likelihood of repression is non-monotone.
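
A rough sketch of the flexible estimation step described for the third essay, using a logistic regression with B-spline bases (via patsy/statsmodels) rather than the GAM with thin-plate splines used in the essay; the variable names and the synthetic country-year data are illustrative only.

```python
# Flexible (spline-based) logit for P(repression) as a function of inequality
# and income per capita. B-splines stand in for the thin-plate splines of the
# original GAM-TPS; data and coefficients are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 2000
df = pd.DataFrame({
    "inequality": rng.uniform(20, 60, n),        # e.g. a Gini-type index
    "income_pc": rng.lognormal(8, 1, n),         # proxy for costs of repression/revolt
})
# Synthetic non-monotone relationship between inequality and repression
eta = (-2 + 0.3 * (df["inequality"] - 40)
       - 0.01 * (df["inequality"] - 40) ** 2
       - 0.2 * np.log(df["income_pc"]))
df["repression"] = (rng.random(n) < 1 / (1 + np.exp(-eta))).astype(int)

model = smf.logit("repression ~ bs(inequality, df=4) + bs(np.log(income_pc), df=4)",
                  data=df).fit(disp=0)
pred = model.predict(pd.DataFrame({"inequality": np.linspace(20, 60, 5),
                                   "income_pc": np.full(5, df["income_pc"].median())}))
print(pred.round(3))    # estimated P(repression) across the inequality range
```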

Relevance:

100.00%

Publisher:

Abstract:

Strong convective events can produce extreme precipitation, hail, lightning or gusts, potentially inducing severe socio-economic impacts. These events have a relatively small spatial extent and, in most cases, a short lifetime. In this study, a model is developed for estimating convective extreme events based on large-scale conditions. It is shown that strong convective events can be characterized by a Weibull distribution of radar-based rainfall with a low shape and high scale parameter value. A radius of 90 km around a station reporting a convective situation turned out to be suitable. A methodology is developed to estimate the Weibull parameters, and thus the occurrence probability of convective events, from large-scale atmospheric instability and enhanced near-surface humidity, which are usually found on a larger scale than the convective event itself. Here, the probability of occurrence of extreme convective events is estimated from the KO-index, indicating the stability, and the relative humidity at 1000 hPa. Both variables are computed from the ERA-Interim reanalysis. In a first version of the methodology, these two variables are applied to estimate the spatial rainfall distribution and the occurrence of a convective event. The developed method shows significant skill in estimating the occurrence of convective events as observed at synoptic stations, in lightning measurements, and in severe weather reports. In order to take frontal influences into account, a scheme for the detection of atmospheric fronts is implemented. While generally higher instability is found in the vicinity of fronts, the skill of this approach is largely unchanged. Additional improvements were achieved by a bias correction and the use of ERA-Interim precipitation. The resulting estimation method is applied to the ERA-Interim period (1979-2014) to establish a ranking of estimated convective extreme events. Two strong estimated events that reveal a frontal influence are analysed in detail. As a second application, the method is applied to GCM-based decadal predictions in the period 1979-2014, which were initialized every year. It is shown that decadal predictive skill for convective event frequencies over Germany is found for the first 3-4 years after initialization.
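
The Weibull characterisation step can be illustrated with a short fit on synthetic rain-rate samples; the samples and the convective/stratiform contrast below are made up, and the link to the KO-index and 1000 hPa relative humidity used in the actual model is not shown.

```python
# Fitting Weibull shape and scale parameters to rain-rate samples; a low shape
# and high scale value is the convective signature described above. Samples
# are synthetic placeholders for radar-based rainfall within the 90 km radius.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

convective = rng.weibull(0.7, 2000) * 8.0      # low shape, high scale (heavy tail)
stratiform = rng.weibull(1.4, 2000) * 2.0      # higher shape, lower scale

for name, sample in [("convective-like", convective), ("stratiform-like", stratiform)]:
    # Fix the location at zero so only shape (c) and scale are estimated
    shape, loc, scale = stats.weibull_min.fit(sample, floc=0)
    print(f"{name}: shape = {shape:.2f}, scale = {scale:.2f}")
```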

Relevance:

100.00%

Publisher:

Abstract:

1. Sawfishes are currently among the most threatened elasmobranchs in the world. Only two species inhabit Atlantic waters: the largetooth sawfish (Pristis pristis) and the smalltooth sawfish (Pristis pectinata), both having suffered dramatic declines in their ranges. 2. The goal of this study was to evaluate the status of P. pristis in the Atlantic and estimate local extinction risk based on historical and recent occurrence records. To accomplish these goals, a thorough search for historical and recent records of P. pristis in the Atlantic was conducted by reviewing scientific and popular literature and museum specimens, and by contacting regional scientists from the species’ historical range. 3. In total, 801 P. pristis records (1830–2009) document its occurrence in four major regions of the Atlantic: USA (n = 41), Mexico and Central America (n = 535), South America (n = 162), and West Africa (n = 48). Locality data were not available for 15 records. 4. Historical abundance centres were the Colorado–San Juan River system in Nicaragua and Costa Rica (and secondarily Lake Izabal in Guatemala), the Amazon estuary, and coastal Guinea-Bissau. 5. Currently, the species faces drastic depletion throughout its entire former range and centres of abundance, and it appears to have been extirpated from several areas. The probability of extinction was highest in the USA, northern South America (Colombia to Guyane), and southern West Africa (Cameroon to Namibia). 6. Currently, the Amazon estuary appears to hold the highest remaining abundance of P. pristis in the Atlantic, followed by the Colorado–San Juan River system in Nicaragua and Costa Rica and the Bissagos Archipelago in Guinea-Bissau. Protection of these populations is therefore crucial for the preservation and recovery of the species.
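
One classical way to turn a series of occurrence records into a statement about local extinction is Solow's (1993) sighting-record test, sketched below with invented sighting years; it illustrates the general idea and is not the estimation procedure used in this study.

```python
# Solow (1993) sighting-record test: under a stationary Poisson sighting
# process, the p-value for persistence is (t_last / T) ** n. Sighting years
# below are hypothetical, not actual sawfish records.

def solow_persistence_p(sighting_years, start_year, current_year):
    """p-value for the hypothesis that the population is still extant;
    small values suggest local extinction."""
    n = len(sighting_years)
    t_last = max(sighting_years) - start_year
    T = current_year - start_year
    return (t_last / T) ** n

records = [1905, 1923, 1948, 1961, 1975]      # hypothetical records for one river system
p = solow_persistence_p(records, start_year=1900, current_year=2009)
print(f"Solow p-value for persistence: {p:.3f} (small values suggest local extinction)")
```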

Relevance:

100.00%

Publisher:

Abstract:

Biomarkers are nowadays essential tools for staying one step ahead in fighting disease, enabling an enhanced focus on disease prevention and on the probability of its occurrence. Multidisciplinary research has been an important step towards the repeated discovery of new biomarkers. Biomarkers are defined as measurable biochemical indicators of the presence of disease or as indicators for monitoring disease progression. Currently, biomarkers are used in several domains such as oncology, neurology, cardiovascular, inflammatory and respiratory disease, and several endocrinopathies. Bridging biomarkers in a One Health perspective has proven useful in almost all of these domains. In oncology, humans and animals are found to be subject to the same environmental and genetic predisposing factors: examples include mutations in the BRCA1 gene predisposing to breast cancer in both humans and dogs, with increased prevalence in certain dog breeds and human ethnic groups. Also, breastfeeding frequency and duration have been related to a decreased risk of breast cancer in women and bitches. When it comes to infectious diseases, this parallelism is likely to be even more important, as up to 75% of all emerging diseases are believed to be zoonotic. Examples of the successful use of biomarkers are found in several zoonotic diseases such as Ebola, dengue, leptospirosis and West Nile virus infections. Acute Phase Proteins (APPs) have been used for quite some time as biomarkers of inflammatory conditions. These have been used in human health but also in the veterinary field, for instance in mastitis evaluation and in the diagnosis of PRRS (porcine respiratory and reproductive syndrome). One advantage is that such biomarkers can be much easier to assess than other conventional diagnostic approaches (for example, they can be measured in easy-to-collect saliva samples). Another domain in which biomarkers have been essential is food safety: the possibility of measuring exposure to chemical contaminants or other biohazards present in the food chain, which are sometimes analytical challenges due to their low bioavailability in body fluids, is nowadays a major breakthrough. Finally, biomarkers are considered key to providing more personalized therapies, with more efficient outcomes and fewer side effects. This approach is expected to be the correct path to follow in veterinary medicine as well in the near future.