402 results for Gaussian probability function


Relevance: 20.00%

Abstract:

Diabetic neuropathy is a significant clinical problem that currently has no effective therapy and, in advanced cases, leads to foot ulceration and lower limb amputation. The accurate detection, characterisation and quantification of this condition are important in order to define at-risk patients, anticipate deterioration, monitor progression and assess new therapies. This thesis evaluates novel corneal methods of assessing diabetic neuropathy. Over the past several years two new non-invasive corneal markers have emerged, and in cross-sectional studies they have demonstrated their ability to stratify the severity of this disease. Corneal confocal microscopy (CCM) allows quantification of corneal nerve parameters, and non-contact corneal aesthesiometry (NCCA), the presumed functional correlate of corneal structure, assesses the sensitivity of the cornea. Both techniques are quick to perform, produce little or no discomfort for the patient and, with the automatic analysis paradigms developed, are suitable for clinical settings. Each has advantages and disadvantages over established techniques for assessing diabetic neuropathy. New information is presented regarding measurement bias in CCM images, and a unique sampling paradigm, together with a method for determining the accuracy of different sampling combinations, is described. A novel high-speed corneal nerve mapping procedure has been developed; its application in individuals with neuropathy has revealed regions of the sub-basal nerve plexus that warrant further evaluation, as they appear to show earlier signs of damage than the central region of the cornea that has been examined to date. Corneal sensitivity measured by NCCA is shown to have reasonable discriminative potential as a marker of diabetic neuropathy. Application of these new corneal markers for longitudinal evaluation of diabetic neuropathy has the potential to reduce dependence on more invasive, costly and time-consuming assessments, such as skin biopsy.

Relevance: 20.00%

Abstract:

The serviceability and safety of bridges are crucial to people's daily lives and to the national economy. Every effort should be taken to make sure that bridges function safely and properly, as any damage or fault during the service life can lead to transport paralysis, catastrophic loss of property or even casualties. Nonetheless, aggressive environmental conditions, ever-increasing and changing traffic loads and aging can all contribute to bridge deterioration. With often constrained budgets, it is important to identify bridges and bridge elements that should be given higher priority for maintenance, rehabilitation or replacement, and to select the optimal strategy. Bridge health prediction is an essential underpinning science for bridge maintenance optimisation, since the effectiveness of an optimal maintenance decision is largely dependent on the forecasting accuracy of bridge health performance. The current approaches to bridge health prediction can be categorised into two groups: condition-rating based and structural-reliability based. A comprehensive literature review has revealed the following limitations of the current modelling approaches: (1) no integrated approach has been reported in the literature to date for modelling both serviceability and safety aspects so that both performance criteria can be evaluated coherently; (2) complex-system modelling approaches have not been successfully applied to bridge deterioration modelling, even though a bridge is a complex system composed of many inter-related bridge elements; (3) multiple bridge deterioration factors, such as deterioration dependencies among different bridge elements, observed information, maintenance actions and environmental effects, have not been considered jointly; (4) the existing approaches lack the Bayesian updating ability needed to incorporate a variety of event information; (5) structural reliability estimation of bridge systems invariably assumes series and/or parallel relationships at the bridge level. To address the deficiencies listed above, this research proposes three novel models based on the Dynamic Object Oriented Bayesian Networks (DOOBNs) approach. Model I addresses bridge deterioration in serviceability, using condition ratings as the health index. Bridge deterioration is represented in a hierarchical relationship, in accordance with the physical structure, so that the contribution of each bridge element to bridge deterioration can be tracked. A discrete-time Markov process is employed to model the deterioration of bridge elements over time. Model II addresses bridge deterioration in terms of safety. The structural reliability of bridge systems is estimated from the bridge elements up to the entire bridge. By means of conditional probability tables (CPTs), not only series-parallel relationships but also more complex probabilistic relationships in bridge systems can be effectively modelled. The structural reliability of each bridge element is evaluated from its limit state functions, considering the probability distributions of resistance and applied load. Both Models I and II are designed in three steps: modelling considerations, DOOBN development and parameter estimation. Model III integrates Models I and II to address bridge health performance in both serviceability and safety aspects jointly. The modelling of bridge ratings is modified so that every basic modelling unit denotes one physical bridge element.
According to the specific materials used, the integration of condition ratings and structural reliability is implemented through critical failure modes. Three case studies have been conducted, one to validate each of the proposed models. Carefully selected data and knowledge from bridge experts, the National Bridge Inventory (NBI) and the existing literature were utilised for model validation. In addition, event information was generated using simulation to demonstrate the Bayesian updating ability of the proposed models. The prediction results for condition ratings and structural reliability were presented and interpreted for basic bridge elements and for the whole bridge system. The results obtained from Model II were compared with those obtained from traditional structural reliability methods. Overall, the prediction results demonstrate the feasibility of the proposed modelling approach for bridge health prediction and underpin the assertion that the three models can be used separately or in an integrated manner, and that they are more effective than current bridge deterioration modelling approaches. The primary contribution of this work is to enhance knowledge in the field of bridge health prediction, where more comprehensive health performance, in both serviceability and safety aspects, is addressed jointly. The proposed models, characterised by a probabilistic, hierarchical representation of bridge deterioration, demonstrate the effectiveness and promise of the DOOBN approach for bridge health management. Additionally, the proposed models have significant potential for bridge maintenance optimisation. Working together with advanced monitoring and inspection techniques and a comprehensive bridge inventory, the proposed models can be used by bridge practitioners to achieve increased serviceability and safety as well as maintenance cost-effectiveness.
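
As an illustration of the discrete-time Markov deterioration modelling used in Model I, the short Python sketch below propagates the condition-rating distribution of a single bridge element through an assumed transition matrix; the four-state rating scale, transition probabilities and horizon are hypothetical values chosen for illustration, not figures from the thesis.

import numpy as np

# Hypothetical condition ratings: 1 (good) ... 4 (poor).
# Row i gives P(rating next year | rating i this year); deterioration only.
P = np.array([
    [0.90, 0.08, 0.02, 0.00],
    [0.00, 0.85, 0.12, 0.03],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],   # rating 4 is absorbing without maintenance
])

state = np.array([1.0, 0.0, 0.0, 0.0])   # element starts in rating 1

for year in range(1, 21):
    state = state @ P                     # one-step Markov update
    if year % 5 == 0:
        print(f"year {year:2d}: P(rating) = {np.round(state, 3)}")

In a DOOBN, a matrix of this kind would populate the conditional probability table linking an element's rating node at one time slice to the same node at the next.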

Relevance: 20.00%

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organisation and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and they are now being applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. First, a framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause efforts in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. This approach enables highly informative estimates of change point parameters to be obtained, since the results take the form of probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and the magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared to a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalisability of the developed models are also considered. The advantages of the Bayesian approach seen in this general quality control context may also extend to the industrial and business domains where quality monitoring was initially developed.
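
To make the change point idea concrete, the following Python sketch estimates the posterior distribution of a single step-change time in a Poisson process, using conjugate Gamma priors on the before- and after-change rates and a discrete uniform prior on the change point evaluated on a grid (rather than the full MCMC machinery used in the thesis); the simulated counts and prior settings are illustrative only.

import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)

# Simulated monthly adverse-event counts with a step change at month 30
# (rate 5 -> 8); purely illustrative data.
true_tau = 30
y = np.concatenate([rng.poisson(5.0, true_tau), rng.poisson(8.0, 60 - true_tau)])
n = len(y)

a, b = 1.0, 0.5   # assumed Gamma(a, b) prior on each Poisson rate

def log_marginal(seg):
    # Log marginal likelihood of a Poisson segment with a Gamma(a, b) prior
    # on its rate; the sum of log y! terms is constant in tau and omitted.
    s, m = seg.sum(), len(seg)
    return a * np.log(b) - gammaln(a) + gammaln(a + s) - (a + s) * np.log(b + m)

taus = np.arange(1, n)                        # change after observation tau
log_post = np.array([log_marginal(y[:t]) + log_marginal(y[t:]) for t in taus])
post = np.exp(log_post - log_post.max())
post /= post.sum()                            # discrete uniform prior on tau

mode = taus[post.argmax()]
print("posterior mode of change point:", mode)
print("P(tau within +/-3 of mode):", post[np.abs(taus - mode) <= 3].sum())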

Relevance: 20.00%

Abstract:

Background We have previously demonstrated that human kidney proximal tubule epithelial cells (PTEC) are able to modulate autologous T and B lymphocyte responses. It is well established that dendritic cells (DC) are responsible for the initiation and direction of adaptive immune responses and that these cells occur in the renal interstitium in close apposition to PTEC under inflammatory disease settings. However, there is no information regarding the interaction of PTEC with DC in an autologous human context. Methods Human monocytes were differentiated into monocyte-derived DC (MoDC) in the absence or presence of primary autologous activated PTEC and matured with polyinosinic:polycytidylic acid [poly(I:C)], while purified, pre-formed myeloid blood DC (CD1c+ BDC) were cultured with autologous activated PTEC in the absence or presence of poly(I:C) stimulation. DC responses were monitored by surface antigen expression, cytokine secretion, antigen uptake capacity and allogeneic T-cell-stimulatory ability. Results The presence of autologous activated PTEC inhibited the differentiation of monocytes to MoDC. Furthermore, MoDC differentiated in the presence of PTEC displayed an immature surface phenotype, efficient phagocytic capacity and, upon poly(I:C) stimulation, secreted low levels of pro-inflammatory cytokine interleukin (IL)-12p70, high levels of anti-inflammatory cytokine IL-10 and induced weak Th1 responses. Similarly, pre-formed CD1c+ BDC matured in the presence of PTEC exhibited an immature tolerogenic surface phenotype, strong endocytic and phagocytic ability and stimulated significantly attenuated T-cell proliferative responses. Conclusions Our data suggest that activated PTEC regulate human autologous immunity via complex interactions with DC. The ability of PTEC to modulate autologous DC function has important implications for the dampening of pro-inflammatory immune responses within the tubulointerstitium in renal injuries. Further dissection of the mechanisms of PTEC modulation of autologous immune responses may offer targets for therapeutic intervention in renal medicine.

Relevance: 20.00%

Abstract:

Binge-like patterns of excessive drinking during young adulthood increase the propensity for alcohol use disorders (AUDs) later in adult life; however, the mechanisms that drive this are not completely understood. Previous studies showed that the δ-opioid peptide receptor (DOP-R) is dynamically regulated by exposure to ethanol and that the DOP-R plays a role in ethanol-mediated behaviors. The aim of this study was to determine the role of the DOP-R in high ethanol consumption from young adulthood through to late adulthood by measuring DOP-R-mediated [(35)S]GTPγS binding in brain membranes and DOP-R-mediated analgesia, using a rat model of high ethanol consumption in Long Evans rats. We show that DOP-R activity in the dorsal striatum and DOP-R-mediated analgesia change during development, being highest during early adulthood and reduced in late adulthood. Intermittent access to ethanol, but not continuous access to ethanol or water, from young adulthood leads to an increase in DOP-R activity in the dorsal striatum and in DOP-R-mediated analgesia that persists into late adulthood. Multiple microinfusions of naltrindole into the dorsal striatum or multiple systemic administrations of naltrindole reduce ethanol consumption, and following termination of treatment, DOP-R activity in the dorsal striatum is attenuated. These findings suggest that DOP-R activity in the dorsal striatum plays a role in high levels of ethanol consumption and that targeting the DOP-R may be an alternative strategy for the treatment of AUDs.

Relevance: 20.00%

Abstract:

The available wind power is stochastic, and appropriate tools are required in the OPF model for economic and reliable power system operation. This paper presents an OPF formulation that accounts for the factors involved in the intermittency of wind power. A Weibull distribution is adopted to model the stochastic wind speed, from which the wind power distribution is derived. The reserve requirement is evaluated based on the wind distribution and the risk of under/over estimation of the wind power. In addition, the Wind Energy Conversion System (WECS) is represented by Doubly Fed Induction Generator (DFIG) based wind farms, and the reactive power capability of DFIG-based wind farms is analyzed. The study is performed on the IEEE 30-bus system with the wind farm located at different buses and with different wind profiles. The reactive power capacity to be installed in the wind farm to maintain a satisfactory voltage profile under the various wind scenarios is also determined.
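
The wind-modelling step described above can be illustrated with a short Python sketch: wind speeds are drawn from a Weibull distribution, mapped to farm output through a standard piecewise power curve, and the expected costs of over- and under-estimating the available power are approximated by Monte Carlo. The Weibull parameters, power-curve breakpoints and cost coefficients are assumed values for illustration, not figures from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Assumed Weibull wind-speed parameters and turbine power curve.
k, c = 2.0, 8.0                          # shape, scale (m/s)
v_in, v_rated, v_out = 3.0, 12.0, 25.0   # cut-in, rated, cut-out speeds
p_rated = 1.0                            # per-unit rated power of the farm

def wind_power(v):
    # Piecewise power curve: zero below cut-in and above cut-out,
    # linear between cut-in and rated speed, flat at rated power.
    p = np.where((v >= v_in) & (v < v_rated),
                 p_rated * (v - v_in) / (v_rated - v_in), 0.0)
    return np.where((v >= v_rated) & (v <= v_out), p_rated, p)

v = c * rng.weibull(k, size=200_000)     # Weibull wind-speed samples
p = wind_power(v)

w_sched = 0.4                            # scheduled wind power (per unit, assumed)
c_res, c_pen = 6.0, 3.0                  # reserve and penalty cost coefficients (assumed)

shortfall = np.maximum(w_sched - p, 0.0) # wind overestimated -> reserve needed
surplus = np.maximum(p - w_sched, 0.0)   # wind underestimated -> energy spilled

print("E[available power]          :", p.mean().round(3))
print("E[reserve cost, overestim.] :", (c_res * shortfall).mean().round(3))
print("E[penalty cost, underestim.]:", (c_pen * surplus).mean().round(3))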

Relevance: 20.00%

Abstract:

The term “vagueness” describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno’s sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier’s (2011) theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier, and of its probabilistic pendant, is needed. The proposed modification replaces the standard notion of probability with quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference effect.
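
The following short Python sketch illustrates, under strong simplifying assumptions, how quantum probability can assign a non-zero value to a borderline contradiction such as “x is tall and x is not tall”. Here “tall” and “not tall” are modelled as two non-commuting rank-1 projectors in a real two-dimensional space, and the conjunction is evaluated as a sequential measurement; this is an illustrative construction only, not the specific model developed in the paper.

import numpy as np

def projector(theta):
    # Rank-1 projector onto the ray at angle theta in a real 2-D space.
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Hypothetical encoding: a borderline individual is a state half-way between
# the "clearly tall" and "clearly not tall" directions, and the two predicates
# are non-commuting projectors (an assumption made purely for illustration).
psi = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4)])
P_tall = projector(0.0)
P_not_tall = projector(np.pi / 3)

# Sequential judgment "tall, then not tall": ||P_not_tall P_tall psi||^2.
p_contradiction = np.linalg.norm(P_not_tall @ (P_tall @ psi)) ** 2

print("classical conjunction with complementary predicates:", 0.0)
print("quantum sequential 'tall and not tall':", round(p_contradiction, 3))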

Relevance: 20.00%

Abstract:

Purpose Exercise for Health was a randomized, controlled trial designed to evaluate two modes of delivering an 8-month translational exercise intervention (face-to-face [FtF] and over-the-telephone [Tel]), commencing 6 weeks post-breast cancer surgery (PS). Methods Outcomes included quality of life (QoL), function (fitness and upper-body function) and treatment-related side effects (fatigue, lymphoedema, body mass index, menopausal symptoms, anxiety, depression and pain). Generalised estimating equation modelling determined time (baseline [5 weeks PS], mid-intervention [6 months PS], post-intervention [12 months PS]), group (FtF, Tel, Usual Care [UC]) and time-by-group effects. In total, 194 women representative of the breast cancer population were randomised to the FtF (n=67), Tel (n=67) and UC (n=60) groups. Results There were significant (p<0.05) interaction effects on QoL, fitness and fatigue, with differences observed between the treatment groups and the UC group. Trends observed for the two treatment groups were similar: both reported improvements in QoL, fitness and fatigue over time, and the changes observed between baseline and post-intervention were clinically relevant. In contrast, the UC group experienced no change, or a worsening of QoL, fitness and fatigue, at mid-intervention. Although improvements in the UC group occurred by 12 months post-surgery, the change did not meet the clinically relevant threshold. There were no differences in the other treatment-related side effects between groups. Conclusion This translational intervention trial, delivered either face-to-face or over the telephone, supports exercise as a form of adjuvant breast cancer therapy that can prevent declines in fitness and function during treatment and optimise recovery post-treatment.
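
As an illustration of the generalised estimating equation modelling described above, the following Python sketch fits a GEE with an exchangeable working correlation and a time-by-group interaction using statsmodels; the simulated data frame, variable names and effect sizes are assumptions made for illustration and are not the trial data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulated long-format data: one row per woman per assessment
# (baseline, mid-intervention, post-intervention); purely illustrative.
n_per_group = 60
groups = np.repeat(["FtF", "Tel", "UC"], n_per_group)
rows = []
for pid, grp in enumerate(groups):
    base = rng.normal(70, 10)                        # baseline QoL score
    gain = {"FtF": 5.0, "Tel": 5.0, "UC": 0.0}[grp]  # assumed per-visit gain
    for t, time in enumerate(["baseline", "mid", "post"]):
        rows.append({"id": pid, "group": grp, "time": time,
                     "qol": base + gain * t + rng.normal(0, 5)})
df = pd.DataFrame(rows)

# GEE with an exchangeable working correlation within each woman and a
# time-by-group interaction, mirroring the modelling strategy above.
model = smf.gee("qol ~ C(time, Treatment('baseline')) * C(group, Treatment('UC'))",
                groups="id", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
result = model.fit()
print(result.summary())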

Relevance: 20.00%

Abstract:

The standard approach to tax compliance applies the economics-of-crime methodology pioneered by Becker (1968): in its first application, due to Allingham and Sandmo (1972), it models the behaviour of agents as a decision involving a choice of how much of their income to report to the tax authorities, given a certain institutional environment represented by parameters such as the probability of detection and the penalties imposed in the event the agent is caught. While this basic framework yields important insights on tax compliance behaviour, it has some critical limitations. Specifically, it predicts a level of compliance that is significantly below what is observed in the data. This thesis revisits the original framework with a view to addressing this issue and to examining the political economy implications of tax evasion for progressivity in the tax structure. The approach followed involves building a macroeconomic, dynamic equilibrium model for the purpose of examining these issues, using a step-wise model-building procedure that starts with some very simple variations of the basic Allingham and Sandmo construct, which are eventually integrated into a dynamic general equilibrium overlapping generations framework with heterogeneous agents. One of the variations involves incorporating the Allingham and Sandmo construct into a two-period model of a small open economy of the type originally attributed to Fisher (1930). A further variation of this simple construct involves allowing agents to decide initially whether to evade taxes or not. In the event they decide to evade, the agents then have to decide the extent of income or wealth they wish to under-report. We find that the ‘evade or not’ assumption has strikingly different and more realistic implications for the extent of evasion, and we demonstrate that it is a more appropriate modelling strategy in the context of macroeconomic models, which are essentially dynamic in nature and involve consumption smoothing across time and across various states of nature. Specifically, since deciding to undertake tax evasion affects the consumption-smoothing ability of the agent by creating two states of nature in which the agent is ‘caught’ or ‘not caught’, there is a possibility that the agent’s utility under certainty, when they choose not to evade, is higher than the expected utility obtained when they choose to evade. Furthermore, the simple two-period model incorporating an ‘evade or not’ choice can be used to demonstrate some strikingly different political economy implications relative to its Allingham and Sandmo counterpart. In variations of the two models that allow for voting on the tax parameter, we find that agents typically vote for a high degree of progressivity by choosing the highest available tax rate from the menu of choices available to them. There is, however, a small range of inequality levels for which agents in the ‘evade or not’ model vote for a relatively low value of the tax rate. The final steps in the model-building procedure involve grafting the two-period models with a political economy choice onto a dynamic overlapping generations setting with more general, non-linear tax schedules and a ‘cost-of-evasion’ function that is increasing in the extent of evasion. Results based on numerical simulations of these models show further improvement in the models’ ability to match empirically plausible levels of tax evasion.
In addition, the differences between the political economy implications of the ‘evade or not’ version of the model and its Allingham and Sandmo counterpart are now very striking: there is now a large range of values of the inequality parameter for which agents in the ‘evade or not’ model vote for a low degree of progressivity. This is because, in the ‘evade or not’ version of the model, low values of the tax rate encourage a large number of agents to choose the ‘not evade’ option, so that the redistributive mechanism is more ‘efficient’ relative to situations in which tax rates are high. Some further implications of the models of this thesis relate to whether variations in the level of inequality, and in parameters such as the probability of detection and the penalties for tax evasion, matter for the political economy results. We find that (i) the political economy outcomes for the tax rate are quite insensitive to changes in inequality, and (ii) the voting outcomes change in non-monotonic ways in response to changes in the probability of detection and penalty rates. Specifically, the model suggests that changes in inequality should not matter, although the political outcome for the tax rate at a given level of inequality is conditional on whether there is a large or small extent of evasion in the economy. We conclude that further theoretical research into macroeconomic models of tax evasion is required to identify the structural relationships underpinning the link between inequality and redistribution in the presence of tax evasion. The models of this thesis provide a necessary first step in that direction.
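
The ‘evade or not’ comparison can be sketched in a few lines of Python: an agent with CRRA utility either reports truthfully or chooses an optimal under-report in a one-period Allingham and Sandmo setting, and evades only if the resulting expected utility exceeds the certain utility of full compliance. The parameter values below are illustrative and are not the calibration used in the thesis.

import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative parameters (assumed, not the thesis calibration).
y = 100.0        # true income
t = 0.30         # proportional tax rate
p = 0.05         # probability of detection
f = 1.5          # penalty multiple applied to evaded tax if caught
sigma = 2.0      # CRRA risk-aversion coefficient

def u(c):
    # CRRA utility.
    return c ** (1 - sigma) / (1 - sigma)

def expected_utility(x):
    # x = reported income in [0, y]; evaded tax is t * (y - x).
    c_not_caught = y - t * x
    c_caught = y - t * x - f * t * (y - x)
    return (1 - p) * u(c_not_caught) + p * u(c_caught)

# Best report conditional on evading (the Allingham-Sandmo margin).
res = minimize_scalar(lambda x: -expected_utility(x), bounds=(0.0, y), method="bounded")
eu_evade = expected_utility(res.x)
u_comply = u(y - t * y)          # certain utility when reporting truthfully

print("optimal report if evading:", round(res.x, 2))
print("EU(evade) =", round(eu_evade, 4), " U(comply) =", round(u_comply, 4))
print("agent evades" if eu_evade > u_comply else "agent complies")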