488 results for Reliability level
Abstract:
The serviceability and safety of bridges are crucial to people’s daily lives and to the national economy. Every effort should be made to ensure that bridges function safely and properly, as any damage or fault during the service life can lead to transport paralysis, catastrophic loss of property or even casualties. Nonetheless, aggressive environmental conditions, ever-increasing and changing traffic loads and aging can all contribute to bridge deterioration. With often constrained budgets, it is important to identify bridges and bridge elements that should be given higher priority for maintenance, rehabilitation or replacement, and to select the optimal strategy. Bridge health prediction is an essential underpinning science for bridge maintenance optimization, since the effectiveness of an optimal maintenance decision depends largely on the forecasting accuracy of bridge health performance. The current approaches to bridge health prediction can be categorised into two groups: condition-rating based and structural-reliability based.
A comprehensive literature review has revealed the following limitations of the current modelling approaches: (1) no integrated approaches evident in the literature to date model both serviceability and safety aspects so that both performance criteria can be evaluated coherently; (2) complex system modelling approaches have not been successfully applied to bridge deterioration modelling, even though a bridge is a complex system composed of many inter-related bridge elements; (3) multiple bridge deterioration factors, such as deterioration dependencies among different bridge elements, observed information, maintenance actions and environmental effects, have not been considered jointly; (4) the existing approaches lack the Bayesian updating ability to incorporate a variety of event information; (5) the assumption of series and/or parallel relationships at the bridge level is held in all existing structural reliability estimations of bridge systems. To address these deficiencies, this research proposes three novel models based on the Dynamic Object Oriented Bayesian Networks (DOOBNs) approach. Model I addresses bridge deterioration in serviceability using condition ratings as the health index. The bridge deterioration is represented in a hierarchical relationship, in accordance with the physical structure, so that the contribution of each bridge element to bridge deterioration can be tracked. A discrete-time Markov process is employed to model the deterioration of bridge elements over time. Model II addresses bridge deterioration in terms of safety. The structural reliability of bridge systems is estimated from the bridge elements up to the entire bridge. By means of conditional probability tables (CPTs), not only series-parallel relationships but also complex probabilistic relationships in bridge systems can be effectively modelled.
The structural reliability of each bridge element is evaluated from its limit state functions, considering the probability distributions of resistance and applied load. Both Models I and II are designed in three steps: modelling consideration, DOOBN development and parameter estimation. Model III integrates Models I and II to address bridge health performance in both serviceability and safety aspects jointly. The modelling of bridge ratings is modified so that every basic modelling unit denotes one physical bridge element. According to the specific materials used, the integration of condition ratings and structural reliability is implemented through critical failure modes. Three case studies, one for each model, have been conducted to validate the proposed models. Carefully selected data and knowledge from bridge experts, the National Bridge Inventory (NBI) and the existing literature were utilised for model validation. In addition, event information was generated using simulation to demonstrate the Bayesian updating ability of the proposed models. The prediction results for condition ratings and structural reliability were presented and interpreted for basic bridge elements and the whole bridge system. The results obtained from Model II were compared with those obtained from traditional structural reliability methods. Overall, the prediction results demonstrate the feasibility of the proposed modelling approach for bridge health prediction and underpin the assertion that the three models can be used separately or in an integrated manner, and are more effective than the current bridge deterioration modelling approaches. The primary contribution of this work is to enhance knowledge in the field of bridge health prediction, where more comprehensive health performance in both serviceability and safety aspects is addressed jointly.
The proposed models, characterised by probabilistic representation of bridge deterioration in hierarchical ways, demonstrate the effectiveness and promise of the DOOBN approach for bridge health management. Additionally, the proposed models have significant potential for bridge maintenance optimization. Working together with advanced monitoring and inspection techniques, and a comprehensive bridge inventory, the proposed models can be used by bridge practitioners to achieve increased serviceability and safety as well as maintenance cost effectiveness.
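The discrete-time Markov treatment of condition ratings described in this abstract can be sketched in a few lines. Everything below (the four-state rating scale and the transition probabilities) is illustrative only, not taken from the thesis:

```python
import numpy as np

# Hypothetical 4-state condition-rating scale (1 = good ... 4 = poor).
# Annual transition matrix P: row i gives the probability of moving
# from state i to each state one year later (deterioration only, so
# the matrix is upper triangular and state 4 is absorbing).
P = np.array([
    [0.90, 0.08, 0.02, 0.00],
    [0.00, 0.85, 0.12, 0.03],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])

def predict_condition(p0, P, years):
    """Propagate the condition-state distribution p0 forward in time."""
    p = np.asarray(p0, dtype=float)
    for _ in range(years):
        p = p @ P
    return p

def expected_rating(p):
    """Expected condition rating under distribution p (states 1..n)."""
    p = np.asarray(p, dtype=float)
    return float(p @ np.arange(1, len(p) + 1))

p0 = [1.0, 0.0, 0.0, 0.0]           # a new element in the best state
p10 = predict_condition(p0, P, 10)  # state distribution after 10 years
```

In the DOOBN setting described above, each element would carry such a transition model as one node of the network, with Bayesian updating revising the state distribution when inspection evidence arrives.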
Abstract:
This paper considers the conditions that are necessary at system and local levels for teacher assessment to be valid, reliable and rigorous. With sustainable assessment cultures as a goal, the paper examines how education systems can support local level efforts for quality learning and dependable teacher assessment. This is achieved through discussion of relevant research and consideration of a case study involving an evaluation of a cross-sectoral approach to promoting confidence in school-based assessment in Queensland, Australia. Building on the reported case study, essential characteristics for developing sustainable assessment cultures are presented, including: leadership in learning; alignment of curriculum, pedagogy and assessment; the design of quality assessment tasks and accompanying standards, and evidence-based judgement and moderation. Taken together, these elements constitute a new framework for building assessment capabilities and promoting quality assurance.
Abstract:
The Texas Department of Transportation (TxDOT) is concerned about the widening gap between pavement preservation needs and available funding. Thus, the TxDOT Austin District Pavement Engineer (DPE) has investigated methods for strategically allocating available pavement funding to potential projects that improve the overall performance of the District and Texas highway systems. The primary objective of the study presented in this paper is to develop a network-level project screening and ranking method that supports the development of the Austin District 4-year pavement management plan. The study developed candidate project selection and ranking algorithms that evaluate the pavement condition of each candidate project using data contained in the Pavement Management Information System (PMIS) database and incorporate insights from Austin District pavement experts, and then implemented the developed method and supporting algorithms. This process previously required weeks to complete but now takes about 10 minutes, including data preparation and running the analysis algorithm, which enables the Austin DPE to devote more time and resources to conducting field visits, performing project-level evaluation and testing candidate projects. The case study results showed that the proposed method assisted the DPE in evaluating and prioritizing projects and allocating funds to the right projects at the right time.
Abstract:
Safety at Railway Level Crossings (RLXs) is an important issue within the Australian transport system. Crashes at RLXs involving road vehicles in Australia are estimated to cost $10 million each year. Such crashes are mainly due to human factors; unintentional errors contribute to 46% of all fatal collisions and are far more common than deliberate violations. This suggests that innovative interventions targeting drivers are particularly promising for improving RLX safety. In recent years there has been rapid development of a variety of affordable technologies which can be used to increase drivers’ risk awareness around crossings. To date, no research has evaluated the potential effects of such technologies at RLXs in terms of safety, traffic and acceptance of the technology. Integrating driving and traffic simulations is a safe and affordable approach for evaluating these effects. This methodology will be implemented in a driving simulator, where we have recreated realistic driving scenarios with typical road environments and realistic traffic. This paper presents a methodology for comprehensively evaluating the potential benefits and negative effects of such interventions: it evaluates driver awareness at RLXs, and driver distraction and workload when using the technology. Subjective assessments of the perceived usefulness and ease of use of the technology are obtained from standard questionnaires. Driving simulation will provide a model of driving behaviour at RLXs which will be used to estimate the effects of such new technology on a road network featuring RLXs for different market penetrations using a traffic simulation. This methodology can assist in evaluating future safety interventions at RLXs.
Abstract:
There is consistent evidence showing that driver behaviour contributes to crashes and near-miss incidents at railway level crossings (RLXs). The development of emerging Vehicle-to-Vehicle and Vehicle-to-Infrastructure technologies is a highly promising approach to improving RLX safety. To date, research has not comprehensively evaluated the potential effects of such technologies on driving behaviour at RLXs. This paper presents an on-going research programme assessing the impacts of such new technologies on human factors and drivers’ situational awareness at RLXs. Additionally, requirements for the design of such promising technologies and ways to display safety information to drivers were systematically reviewed. Finally, a methodology which comprehensively assesses the effects of in-vehicle and road-based interventions warning the driver of incoming trains at RLXs is discussed, with a focus on both benefits and potential negative behavioural adaptations. The methodology is designed for implementation in a driving simulator and covers compliance, control of the vehicle, distraction, mental workload and drivers’ acceptance. This study has the potential to provide a broad understanding of the effects of deploying new in-vehicle and road-based technologies at RLXs and hence inform policy makers when planning safety improvements for RLXs.
Abstract:
Dengue is currently the most important arthropod-borne viral disease of humans. Recent work has shown dengue virus displays limited replication in its primary vector, the mosquito Aedes aegypti, when the insect harbors the endosymbiotic bacterium Wolbachia pipientis. Wolbachia-mediated inhibition of virus replication may lead to novel methods of arboviral control, yet the functional and cellular mechanisms that underpin it are unknown.
Abstract:
There is general agreement in the scientific community that entrepreneurship plays a central role in the growth and development of an economy in rapidly changing environments (Acs & Virgill 2010). In particular, when business activities are regarded as a vehicle for sustainable growth at large, going beyond the mere economic returns of singular entities, encompassing social problems and relying heavily on collaborative actions, then we fall more precisely into the domain of ‘social entrepreneurship’ (Robinson et al. 2009). In the entrepreneurship literature, prior studies demonstrated the role of intentionality as the best predictor of planned behaviour (Ajzen 1991), and assumed that the intention to start a business derives from the perception of desirability and feasibility and from a propensity to act upon an opportunity (Fishbein & Ajzen 1975). Recognizing that starting a business is an intentional act (Krueger et al. 2000) and entrepreneurship is a planned behaviour (Katz & Gartner 1988), models of entrepreneurial intentions have substantial implications for intentionality research in entrepreneurship. The purpose of this paper is to explore the emerging practice of social entrepreneurship by comparing the determinants of entrepreneurial intention in general with those leading to start-ups with a social mission. Social entrepreneurial intentions clearly merit investigation, given that the opportunity identification process is an intentional process not only typical of for-profit start-ups, and yet there is a lack of research examining opportunity recognition in social entrepreneurship (Haugh 2005). The key argument is that intentionality in both traditional and social entrepreneurs during the decision-making process of new venture creation is influenced by an individual's perceptions of opportunities (Fishbein & Ajzen 1975).
Besides opportunity recognition, at least two other aspects can substantially influence intentionality: human and social capital (Davidsson 2003). This paper sets out to establish whether, and to what extent, the social intentions of potential entrepreneurs, at the cognitive level, are influenced by opportunity recognition, human capital and social capital. By applying established theoretical constructs, the paper draws comparisons between ‘for-profit’ and ‘social’ intentionality using two samples of students enrolled in Economics and Business Administration at the University G. d’Annunzio in Pescara, Italy. A questionnaire was submitted to 310 potential entrepreneurs to test the robustness of the model. The collected data were used to measure the theoretical constructs of the paper. The reliability of the multi-item scale for each dimension was measured using Cronbach's alpha, and for all dimensions the reliability measures are above 0.70. We empirically tested the model using structural equation modelling with AMOS. The results allow us to contribute empirically to the argument regarding the influence of human and social cognitive capital on social and non-social entrepreneurial intentions. Moreover, we highlight the importance for future researchers of looking deeper into the determinants of traditional and social entrepreneurial intention, so that governments can one day define better policies and regulations that promote sustainable businesses with a social imprint, rather than inhibit their formation and growth.
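The reliability criterion used in this abstract (Cronbach's alpha above 0.70 for every dimension) follows from the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the total score). A minimal sketch, with entirely made-up questionnaire responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)       # variance of each item
    total_var = X.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items.
scores = [[4, 5, 4],
          [2, 3, 2],
          [5, 5, 4],
          [3, 4, 3]]
alpha = cronbach_alpha(scores)
```

A value above the conventional 0.70 threshold, as reported in the abstract, indicates acceptable internal consistency of the multi-item scale.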
Abstract:
Railway level crossings present an arguably unique interface between two transport systems that differ markedly in their performance characteristics, their degrees of regulation and their safety cultures. Railway level crossings also differ dramatically in the importance they represent as safety issues for the two modes. For rail, they are the location of a large proportion of fatalities within the system and are therefore the focus of much safety concern. For the road system, they comprise only a few percent of all fatalities, although the potential for catastrophic outcomes exists. Rail operators and regulators have traditionally required technologies to be failsafe and to demonstrate high levels of reliability. The resultant level of complexity and cost has both limited their extent of application and led to a need to better understand how motorists comprehend and respond to these systems.
Abstract:
Background: Lower extremity amputation results in significant global morbidity and mortality. Australia appears to have a paucity of studies investigating lower extremity amputation. The primary aim of this retrospective study was to investigate the key conditions associated with lower extremity amputations in an Australian population. Secondary objectives were to determine the influence of age and sex on lower extremity amputations, and the reliability of hospital-coded amputations. Methods: Lower extremity amputation cases performed at the Princess Alexandra Hospital (Brisbane, Australia) between July 2006 and June 2007 were identified through the relevant hospital discharge dataset (n = 197). All eligible clinical records were interrogated for age, sex, key condition associated with amputation, amputation site, first-ever amputation status and the accuracy of the original hospital coding. Exclusion criteria included records unavailable for audit and cases where the key condition was unable to be determined. Chi-squared tests, t-tests, ANOVA and post hoc tests were used to determine differences between groups. Kappa statistics were used to measure reliability between coded and audited amputations. A minimum significance level of p < 0.05 was used throughout. Results: One hundred and eighty-six cases were eligible and audited. Overall, 69% were male, 56% were first amputations, 54% were major amputations, and the mean age was 62 ± 16 years. Key conditions associated with amputation included type 2 diabetes (53%), peripheral arterial disease (non-diabetes) (18%), trauma (8%), type 1 diabetes (7%) and malignant tumours (5%). Mean ages at amputation differed by key condition: trauma 36 ± 10 years, type 1 diabetes 52 ± 12 years and type 2 diabetes 67 ± 10 years (p < 0.01). The reliability of the original hospital coding was high, with Kappa values over 0.8 for all variables.
Conclusions: This study, the first in over 20 years to report on all levels of lower extremity amputation in Australia, found that people undergoing amputation are more likely to be older, male and have diabetes. It is recommended that large prospective studies be implemented and national lower extremity amputation rates be established to address the large preventable burden of lower extremity amputation in Australia.
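The Kappa values above 0.8 reported for coding reliability come from Cohen's kappa, which corrects raw inter-rater agreement for the agreement expected by chance: κ = (p_o − p_e)/(1 − p_e). A minimal sketch with hypothetical coder labels, not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length sequences of category labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(freq_a) | set(freq_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical original-coder vs. auditor labels for five records.
coded   = ["diabetes", "trauma", "diabetes", "pad", "diabetes"]
audited = ["diabetes", "trauma", "diabetes", "pad", "pad"]
kappa = cohens_kappa(coded, audited)
```

Values above 0.8, as in the study, are conventionally read as almost perfect agreement between the hospital coding and the audit.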
Abstract:
We constructed a novel autonomously replicating gene expression shuttle vector, with the aim of developing a system for transiently expressing proteins at levels useful for commercial production of vaccines and other proteins in plants. The vector, pRIC, is based on the mild strain of the geminivirus Bean yellow dwarf virus (BeYDV-m) and is replicationally released into plant cells from a recombinant Agrobacterium tumefaciens Ti plasmid. pRIC differs from most other geminivirus-based vectors in that the BeYDV replication-associated elements were included in cis rather than from a co-transfected plasmid, while the BeYDV capsid protein (CP) and movement protein (MP) genes were replaced by an antigen encoding transgene expression cassette derived from the non-replicating A. tumefaciens vector, pTRAc. We tested vector efficacy in Nicotiana benthamiana by comparing transient cytoplasmic expression between pRIC and pTRAc constructs encoding either enhanced green fluorescent protein (EGFP) or the subunit vaccine antigens, human papillomavirus subtype 16 (HPV-16) major CP L1 and human immunodeficiency virus subtype C p24 antigen. The pRIC constructs were amplified in planta by up to two orders of magnitude by replication, while 50% more HPV-16 L1 and three- to seven-fold more EGFP and HIV-1 p24 were expressed from pRIC than from pTRAc. Vector replication was shown to be correlated with increased protein expression. We anticipate that this new high-yielding plant expression vector will contribute towards the development of a viable plant production platform for vaccine candidates and other pharmaceuticals. © 2009 Blackwell Publishing Ltd.
Abstract:
Purpose – As a consequence of rapid urbanisation and globalisation, cities have become the engines of population and economic growth. Hence, natural resources in and around cities have been exposed to the externalities of urban development processes. This paper introduces a new sustainability assessment approach that is tested in a pilot study. The paper aims to assist policy-makers and planners in investigating the impacts of development on environmental systems, and to produce effective policies for sustainable urban development. Design/methodology/approach – The paper introduces an indicator-based indexing model entitled “Indexing Model for the Assessment of Sustainable Urban Ecosystems” (ASSURE). The ASSURE indexing model produces a set of micro-level environmental sustainability indices intended for use in the evaluation and monitoring of the interaction between human activities and urban ecosystems. The model is an innovative approach designed to assess the resilience of ecosystems to the impacts of current development plans, and the results serve as a guide for policy-makers to take action towards achieving sustainability. Findings – The indexing model has been tested in a pilot case study within the Gold Coast City, Queensland, Australia. This paper presents the methodology of the model and outlines the preliminary findings of the pilot study. The paper concludes with a discussion of the findings and recommendations put forward for the future development and implementation of the model. Originality/value – Presently, only a few sustainability indices have been developed to measure sustainability at the local, regional, national and international levels. However, due to difficulties in data collection and the limited availability of local data, there is no effective model that assesses urban ecosystem sustainability accurately at the micro-level. The model introduced in this paper fills this gap by focusing on the parcel scale and benchmarking environmental performance at the micro-level.
Abstract:
Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcriptional regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors, grouped by their regulatory role, and the corresponding promoter strength.
Our study of E. coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigation when more experimentally confirmed data are available. In our preliminary exploration of relationships between the key regulatory components in E. coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters where the corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigation when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as to the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in ‘moderately’ conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in the full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled ‘regulatory trees’, inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While the common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to ‘hardware’, the regulatory tree informs us of the changes in regulatory circuitry, in some respects analogous to ‘software’. In this context, we explored the ‘pan-regulatory network’ for the Fur system, the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships, and increased confidence in the predicted regulatory interactions. In the present study, we distinguish between relationships found across the full set of genomes, the ‘core regulatory set’, and interactions found only in a subset of the genomes explored, the ‘sub-regulatory set’. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques, which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
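The spectrum kernel central to the SVM work above compares two sequences through the inner product of their k-mer count vectors. A minimal sketch (the value of k and the example sequences are illustrative, not from the thesis):

```python
from collections import Counter

def spectrum(seq, k):
    """k-mer count vector (the spectrum feature map) of a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(s, t, k):
    """Inner product of the k-spectra of two sequences: the number of
    shared k-mer occurrence pairs."""
    fs, ft = spectrum(s, k), spectrum(t, k)
    return sum(fs[kmer] * ft[kmer] for kmer in fs if kmer in ft)

# Two hypothetical binding-site candidates compared with k = 3.
score = spectrum_kernel("TGTGATCTAGATCAC", "TGTGACGTAGGTCAC", 3)
```

Because the kernel depends only on shared k-mers, it can be fed to any SVM implementation that accepts a precomputed Gram matrix, which is how it refines TFBS predictions in studies like the one summarised here.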
Abstract:
Reliable ambiguity resolution (AR) is essential to Real-Time Kinematic (RTK) positioning and its applications, since incorrect ambiguity fixing can lead to largely biased positioning solutions. A partial ambiguity fixing technique is developed to improve the reliability of AR, involving partial ambiguity decorrelation (PAD) and partial ambiguity resolution (PAR). Decorrelation transformations can substantially amplify the biases in the phase measurements; the purpose of PAD is to find the optimum trade-off between decorrelation and worst-case bias amplification. The concept of PAR refers to the case where only a subset of the ambiguities can be fixed correctly to their integers in the integer least-squares (ILS) estimation system at high success rates. As a result, RTK solutions can be derived from these integer-fixed phase measurements. This is meaningful provided that the number of reliably resolved phase measurements is also sufficiently large for least-squares estimation of the RTK solutions. Considering the GPS constellation alone, partially fixed measurements are often insufficient for positioning. The AR reliability is usually characterised by the AR success rate. In this contribution, an AR validation decision matrix is first introduced to understand the impact of the success rate. Moreover, the AR risk probability is included in a more complete evaluation of AR reliability. We use 16 ambiguity variance-covariance matrices with different levels of success rate to analyse the relation between success rate and AR risk probability. Next, the paper examines how, during the PAD process, a bias in one measurement is propagated and amplified onto many others, leading to more than one wrong integer and affecting the success probability. Furthermore, the paper proposes a partial ambiguity fixing procedure with a predefined success rate criterion and a ratio test in the ambiguity validation process.
In this paper, Galileo constellation data are tested with simulated observations. Numerical results from our experiment clearly demonstrate that only when the computed success rate is very high can AR validation provide decisions about the correctness of AR that are close to the real world, with both low AR risk and low false alarm probabilities. The results also indicate that the PAR procedure can automatically choose an adequate number of ambiguities to fix, at a given high success rate, from the multiple constellations instead of fixing all the ambiguities. This is a benefit that multiple GNSS constellations can offer.
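The success-rate criterion driving a PAR procedure can be illustrated with the well-known bootstrapped success-rate bound, P_s = Πᵢ (2Φ(1/(2σᵢ)) − 1), computed from the conditional standard deviations of the decorrelated ambiguities. The greedy subset selection below is a simplified sketch of partial fixing under a success-rate threshold, not the paper's algorithm, and the σ values used are invented:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bootstrap_success_rate(cond_std_devs):
    """Bootstrapped AR success rate from the conditional standard
    deviations sigma_i of the (decorrelated) ambiguities:
    P_s = prod( 2*Phi(1/(2*sigma_i)) - 1 )."""
    p = 1.0
    for s in cond_std_devs:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

def select_partial_subset(cond_std_devs, target=0.999):
    """Greedy PAR sketch: fix the most precise ambiguities first,
    keeping the largest subset whose joint success rate still meets
    the predefined target."""
    subset = []
    for s in sorted(cond_std_devs):       # most precise first
        if bootstrap_success_rate(subset + [s]) >= target:
            subset.append(s)
        else:
            break
    return subset

# Invented conditional std devs (cycles): two precise, one poor.
fixed = select_partial_subset([0.01, 0.05, 5.0], target=0.99)
```

The poorly determined ambiguity is left as a float value, mirroring the abstract's point that PAR fixes an adequate subset at a given high success rate rather than fixing all ambiguities.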
Abstract:
Extraction of groundwater for onion and other cash crop production has been increasing rapidly during the last two decades in the dry zone areas of Sri Lanka. As a result of overuse, the quantity of available groundwater is gradually declining, while water quality is deteriorating. The deteriorating water quality has a negative impact on agricultural production, especially for crops (such as onions) that are sensitive to increases in salinity levels. This issue is examined with respect to onion production in Sri Lanka. A stochastic frontier production function (SFPF) is used, in which technical efficiency and the determinants of inefficiencies are estimated simultaneously. The results show that farmers are overusing groundwater in their onion cultivation, which has resulted in decreasing yields. Factors contributing to inefficiency in production are also identified. The results have important policy implications.