503 results for Examples


Relevance:

10.00%

Publisher:

Abstract:

One approach to reducing the yield losses caused by banana viral diseases is the use of genetic engineering and pathogen-derived resistance strategies to generate resistant cultivars. The development of transgenic virus resistance requires an efficient banana transformation method, particularly for commercially important 'Cavendish' type cultivars such as 'Grand Nain'. Prior to this study, only two examples of the stable transformation of banana had been reported, both of which demonstrated the principle of transformation but did not characterise transgenic plants in terms of the efficiency at which individual transgenic lines were generated, relative activities of promoters in stably transformed plants, and the stability of transgene expression. The aim of this study was to develop more efficient transformation methods for banana, assess the activity of some commonly used and also novel promoters in stably transformed plants, and transform banana with genes that could potentially confer resistance to banana bunchy top nanovirus (BBTV) and banana bract mosaic potyvirus (BBrMV). A regeneration system using immature male flowers as the explant was established. The frequency of somatic embryogenesis in male flower explants was influenced by the season in which the inflorescences were harvested. Further, the media requirements of various banana cultivars with respect to the 2,4-D concentration in the initiation media also differed. Following the optimisation of these and other parameters, embryogenic cell suspensions of several banana (Musa spp.) cultivars including 'Grand Nain' (AAA), 'Williams' (AAA), 'SH-3362' (AA), 'Goldfinger' (AAAB) and 'Bluggoe' (ABB) were successfully generated. Highly efficient transformation methods were developed for both 'Bluggoe' and 'Grand Nain'; this is the first report of microprojectile bombardment transformation of the commercially important 'Grand Nain' cultivar.
Following bombardment of embryogenic suspension cells, regeneration was monitored from single transformed cells to whole plants using a reporter gene encoding the green fluorescent protein (gfp). Selection with kanamycin enabled the regeneration of a greater number of plants than with geneticin, while still preventing the regeneration of non-transformed plants. Southern hybridisation confirmed the neomycin phosphotransferase gene (npt II) was stably integrated into the banana genome and that multiple transgenic lines were derived from single bombardments. The activity, stability and tissue specificity of the cauliflower mosaic virus 35S (CaMV 35S) and maize polyubiquitin-1 (Ubi-1) promoters were examined. In stably transformed banana, the Ubi-1 promoter provided approximately six-fold higher β-glucuronidase (GUS) activity than the CaMV 35S promoter, and both promoters remained active in glasshouse-grown plants for the six months they were observed. The intergenic regions of BBTV DNA-1 to -6 were isolated and fused to either the uidA (GUS) or gfp reporter genes to assess their promoter activities. BBTV promoter activity was detected in banana embryogenic cells using the gfp reporter gene. Promoters derived from BBTV DNA-4 and -5 generated the highest levels of transient activity, which were greater than that generated by the maize Ubi-1 promoter. In transgenic banana plants, the activity of the BBTV DNA-6 promoter (BT6.1) was restricted to the phloem of leaves and roots, stomata and root meristems. The activity of the BT6.1 promoter was enhanced by the inclusion of intron-containing fragments derived from the maize Ubi-1, rice Act-1, and sugarcane rbcS 5' untranslated regions in GUS reporter gene constructs. In transient assays in banana, the rice Act-1 and maize Ubi-1 introns provided the most significant enhancement, increasing expression levels 300-fold and 100-fold, respectively. The sugarcane rbcS intron increased expression about 10-fold.
In stably transformed banana plants, the maize Ubi-1 intron enhanced BT6.1 promoter activity to levels similar to that of the CaMV 35S promoter, but did not appear to alter the tissue specificity of the promoter. Both 'Grand Nain' and 'Bluggoe' were transformed with constructs that could potentially confer resistance to BBTV and BBrMV, including constructs containing BBTV DNA-1 major and internal genes, BBTV DNA-5 gene, and the BBrMV coat protein-coding region all under the control of the Ubi-1 promoter, while the BT6 promoter was used to drive the npt II selectable marker gene. At least 30 transgenic lines containing each construct were identified and replicates of each line are currently being generated by micropropagation in preparation for virus challenge.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation is primarily an applied statistical modelling investigation, motivated by a case study comprising real data and real questions. Theoretical questions on modelling and computation of normalization constants arose from pursuit of these data analytic questions. The essence of the thesis can be described as follows. Consider binary data observed on a two-dimensional lattice. A common problem with such data is the ambiguity of zeroes recorded. These may represent zero response given some threshold (presence) or that the threshold has not been triggered (absence). Suppose that the researcher wishes to estimate the effects of covariates on the binary responses, whilst taking into account underlying spatial variation, which is itself of some interest. This situation arises in many contexts, and the dingo, cypress and toad case studies described in the motivation chapter are examples of this. Two main approaches to modelling and inference are investigated in this thesis. The first is frequentist and based on generalized linear models, with spatial variation modelled by using a block structure or by smoothing the residuals spatially. The EM algorithm can be used to obtain point estimates, coupled with bootstrapping or asymptotic MLE estimates for standard errors. The second approach is Bayesian and based on a three- or four-tier hierarchical model, comprising a logistic regression with covariates for the data layer, a binary Markov random field (MRF) for the underlying spatial process, and suitable priors for parameters in these main models. The three-parameter autologistic model is a particular MRF of interest. Markov chain Monte Carlo (MCMC) methods comprising hybrid Metropolis/Gibbs samplers are suitable for computation in this situation. Model performance can be gauged by MCMC diagnostics. Model choice can be assessed by incorporating another tier in the modelling hierarchy.
This requires evaluation of a normalization constant, a notoriously difficult problem. Difficulty with estimating the normalization constant for the MRF can be overcome by using a path integral approach, although this is a highly computationally intensive method. Different methods of estimating ratios of normalization constants (NCs) are investigated, including importance sampling Monte Carlo (ISMC), dependent Monte Carlo based on MCMC simulations (MCMC), and reverse logistic regression (RLR). I develop an idea present though not fully developed in the literature, and propose the Integrated Mean Canonical Statistic (IMCS) method for estimating log NC ratios for binary MRFs. The IMCS method falls within the framework of the newly identified path sampling methods of Gelman & Meng (1998) and outperforms ISMC, MCMC and RLR. It also does not rely on simplifying assumptions, such as ignoring spatio-temporal dependence in the process. A thorough investigation is made of the application of IMCS to the three-parameter autologistic model. This work introduces background computations required for the full implementation of the four-tier model in Chapter 7. Two different extensions of the three-tier model to a four-tier version are investigated. The first extension incorporates temporal dependence in the underlying spatio-temporal process. The second extension allows the successes and failures in the data layer to depend on time. The MCMC computational method is extended to incorporate the extra layer. A major contribution of the thesis is the development of a fully Bayesian approach to inference for these hierarchical models for the first time. Note: The author of this thesis has agreed to make it open access but invites people downloading the thesis to send her an email via the 'Contact Author' function.
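The binary MRF at the heart of this hierarchy can be made concrete with a small simulation. The sketch below draws from a three-parameter autologistic model on a lattice via single-site Gibbs updates, with an intercept plus separate row- and column-interaction parameters; this is a minimal illustration of the general model form, and the function name and exact conditional parameterisation are assumptions of this sketch, not taken from the thesis.

```python
import numpy as np

def gibbs_autologistic(shape, alpha, beta_r, beta_c, n_sweeps=200, rng=None):
    """Gibbs sampler for a three-parameter autologistic model on a binary
    lattice: the conditional log-odds that x[i, j] = 1 are
    alpha + beta_r * (sum of vertical neighbours) + beta_c * (sum of
    horizontal neighbours), with free (non-toroidal) boundaries."""
    rng = rng or np.random.default_rng(0)
    x = rng.integers(0, 2, size=shape)
    rows, cols = shape
    for _ in range(n_sweeps):
        for i in range(rows):
            for j in range(cols):
                v = (x[i - 1, j] if i > 0 else 0) + (x[i + 1, j] if i < rows - 1 else 0)
                h = (x[i, j - 1] if j > 0 else 0) + (x[i, j + 1] if j < cols - 1 else 0)
                logit = alpha + beta_r * v + beta_c * h
                p = 1.0 / (1.0 + np.exp(-logit))   # conditional P(x[i, j] = 1)
                x[i, j] = rng.random() < p
    return x
```

Positive interaction parameters encourage spatially clustered presences, which is the kind of underlying spatial variation the hierarchical model is designed to capture.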

Relevance:

10.00%

Publisher:

Abstract:

Stream ciphers are encryption algorithms used for ensuring the privacy of digital telecommunications. They have been widely used for encrypting military communications, satellite communications, pay TV, and for voice encryption of both fixed-line and wireless networks. The current multi-year European project eSTREAM, which aims to select stream ciphers suitable for widespread adoption, reflects the importance of this area of research. Stream ciphers consist of a keystream generator and an output function. Keystream generators produce a sequence that appears to be random, which is combined with the plaintext message using the output function. Most commonly, the output function is binary addition modulo two. Cryptanalysis of these ciphers focuses largely on analysis of the keystream generators and of relationships between the generator and the keystream it produces. Linear feedback shift registers (LFSRs) are widely used components in building keystream generators, as the sequences they produce are well understood. Many types of attack have been proposed for breaking various LFSR-based stream ciphers. A recent attack type is known as an algebraic attack. Algebraic attacks transform the problem of recovering the key into the problem of solving a multivariate system of equations, the solution of which eventually recovers the internal state bits or the key bits. This type of attack has been shown to be effective on a number of regularly clocked LFSR-based stream ciphers. In this thesis, algebraic attacks are extended to a number of well-known stream ciphers where at least one LFSR in the system is irregularly clocked. Applying algebraic attacks to these ciphers has only been discussed previously in the open literature for LILI-128. In this thesis, algebraic attacks are first applied to keystream generators using stop-and-go clocking.
Four ciphers belonging to this group are investigated: the Beth-Piper stop-and-go generator, the alternating step generator, the Gollmann cascade generator and the eSTREAM candidate, the Pomaranch cipher. It is shown that algebraic attacks are very effective on the first three of these ciphers. Although no effective algebraic attack was found for Pomaranch, the algebraic analysis led to some interesting findings, including weaknesses that may be exploited in future attacks. Algebraic attacks are then applied to keystream generators using (p, q) clocking. Two well-known examples of such ciphers, the step1/step2 generator and the self-decimated generator, are investigated. Algebraic attacks are shown to be very powerful in recovering the internal state of these generators. A more complex clocking mechanism than either stop-and-go or (p, q) clocking is known as mutual clock control. In mutual clock control generators, the LFSRs control the clocking of each other. Four well-known stream ciphers belonging to this group are investigated with respect to algebraic attacks: the bilateral stop-and-go generator, the A5/1 stream cipher, the Alpha 1 stream cipher, and the more recent eSTREAM proposal, the MICKEY family of stream ciphers. Some theoretical results with regard to the complexity of algebraic attacks on these ciphers are presented. The algebraic analysis of these ciphers showed that, generally, it is hard to generate the system of equations required for an algebraic attack on these ciphers. As the algebraic attack could not be applied directly to these ciphers, a different approach was used, namely guessing some bits of the internal state in order to reduce the degree of the equations. Finally, an algebraic attack on Alpha 1 that requires only 128 bits of keystream to recover the 128 internal state bits is presented. An essential process associated with stream cipher proposals is key initialization.
Many recently proposed stream ciphers use an algorithm to initialize the large internal state with a smaller key and possibly publicly known initialization vectors. The effect of key initialization on the performance of algebraic attacks is also investigated in this thesis. The relationships between the two have not been investigated before in the open literature. The investigation is conducted on Trivium and Grain-128, two eSTREAM ciphers. It is shown that the key initialization process has an effect on the success of algebraic attacks, unlike other conventional attacks. In particular, the key initialization process allows an attacker to first generate a small number of equations of low degree and then perform an algebraic attack using multiple keystreams. The effect of the number of iterations performed during key initialization is investigated. It is shown that both the number of iterations and the maximum number of initialization vectors to be used with one key should be carefully chosen. Some experimental results on Trivium and Grain-128 are then presented. Finally, the security with respect to algebraic attacks of the well-known LILI family of stream ciphers, including the unbroken LILI-II, is investigated. These are irregularly clock-controlled nonlinear filtered generators. While the structure is defined for the LILI family, a particular parameter choice defines a specific instance. Two well-known such instances are LILI-128 and LILI-II. The security of these and other instances is investigated to identify which instances are vulnerable to algebraic attacks. The feasibility of recovering the key bits using algebraic attacks is then investigated for both LILI-128 and LILI-II. Algebraic attacks which recover the internal state with less effort than exhaustive key search are possible for LILI-128 but not for LILI-II.
Given the internal state at some point in time, the feasibility of recovering the key bits is also investigated, showing that the parameters used in the key initialization process, if poorly chosen, can lead to a key recovery using algebraic attacks.
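The stop-and-go clocking discussed above can be illustrated with a toy generator: one LFSR's output decides whether a second LFSR is clocked before each output bit. The sketch below is a minimal Python illustration of the mechanism only; the register sizes, tap positions, and output rule are illustrative assumptions, not the parameters of any published cipher.

```python
def lfsr_step(state, taps):
    """Advance a Fibonacci LFSR one step. `state` is a mutable list of
    bits; `taps` are indices XORed to form the feedback bit. Returns the
    bit shifted out of the register."""
    fb = 0
    for t in taps:
        fb ^= state[t]
    out = state.pop()        # output bit leaves the register
    state.insert(0, fb)      # feedback bit enters at the other end
    return out

def stop_and_go(control, data, c_taps, d_taps, n):
    """Toy stop-and-go keystream in the spirit of the Beth-Piper
    generator: the data LFSR is clocked only when the control LFSR
    outputs 1; the keystream bit is the data register's output-end bit,
    which is simply held whenever the data LFSR is not clocked."""
    stream = []
    for _ in range(n):
        if lfsr_step(control, c_taps):
            lfsr_step(data, d_taps)
        stream.append(data[-1])
    return stream
```

The irregular clocking is exactly what complicates algebraic attacks: whether a given keystream bit depends on a freshly clocked state is itself a function of the (unknown) control register bits.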

Relevance:

10.00%

Publisher:

Abstract:

In the context of learning paradigms of identification in the limit, we address the question: why is uncertainty sometimes desirable? We use mind change bounds on the output hypotheses as a measure of uncertainty, and interpret ‘desirable’ as reduction in data memorization, also defined in terms of mind change bounds. The resulting model is closely related to iterative learning with bounded mind change complexity, but the dual use of mind change bounds — for hypotheses and for data — is a key distinctive feature of our approach. We show that situations exist where the more mind changes the learner is willing to accept, the less data it needs to remember in order to converge to the correct hypothesis. We also investigate relationships between our model and learning from good examples, set-driven, monotonic and strong-monotonic learners, as well as class-comprising versus class-preserving learnability.

Relevance:

10.00%

Publisher:

Abstract:

Structural health monitoring has been accepted as a justified effort for long-span bridges, which are critical to a region's economic vitality. In the most heavily instrumented bridge project in the world, the Wind And Structural Health Monitoring System (WASHMS) has been developed and installed on the cable-supported bridges in Hong Kong (Wong and Ni 2009a). This chapter aims to share some of the experience gained through the operation and study of WASHMS. It is concluded that structural health monitoring should be composed of two main components: Structural Performance Monitoring (SPM) and Structural Safety Evaluation (SSE). As an example of how the WASHMS can be used for structural performance monitoring, the layout of the sensory system installed on the Tsing Ma Bridge is briefly described. To demonstrate the two broad approaches to structural safety evaluation, structural health assessment and damage detection, three examples of the application of SHM information are presented. These three examples can be considered pioneering work for the research and development of the structural diagnosis and prognosis tools required by structural health monitoring for monitoring and evaluation applications.

Relevance:

10.00%

Publisher:

Abstract:

Most statistical methods use hypothesis testing. Analysis of variance, regression, discrete choice models, contingency tables, and other analysis methods commonly used in transportation research share hypothesis testing as the means of making inferences about the population of interest. Despite the fact that hypothesis testing has been a cornerstone of empirical research for many years, various aspects of hypothesis tests are commonly applied incorrectly, misinterpreted, and ignored—by novices and expert researchers alike. At first glance, hypothesis testing appears straightforward: develop the null and alternative hypotheses, compute the test statistic to compare to a standard distribution, estimate the probability of rejecting the null hypothesis, and then make claims about the importance of the finding. This is an oversimplification of the process of hypothesis testing. Hypothesis testing as applied in empirical research is examined here. The reader is assumed to have a basic knowledge of the role of hypothesis testing in various statistical methods. Through the use of an example, the mechanics of hypothesis testing are first reviewed. Then, five precautions surrounding the use and interpretation of hypothesis tests are developed; examples of each are provided to demonstrate how errors are made, and solutions are identified so similar errors can be avoided. Remedies are provided for common errors, and conclusions are drawn on how to use the results of this paper to improve the conduct of empirical research in transportation.
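The mechanics summarized above (hypotheses, test statistic, reference distribution, decision) can be walked through with a small worked example. The sketch below implements a two-sided one-sample z-test in Python with hypothetical numbers; it is a generic illustration of the procedure, not the example used in the paper.

```python
import math
from statistics import NormalDist

def one_sample_z_test(sample_mean, mu0, sigma, n, alpha=0.05):
    """Two-sided one-sample z-test of H0: mu = mu0 against H1: mu != mu0,
    assuming a known population standard deviation sigma.
    Returns (test statistic, p-value, reject H0 at level alpha?)."""
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))  # standardized test statistic
    p = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided p-value
    return z, p, p < alpha

# Hypothetical data: mean travel speed of 52 from n = 64 observations,
# testing against a posted target of 50 with known sigma = 8.
z, p, reject = one_sample_z_test(52, 50, 8, 64)
```

Note that a test that fails to reject H0 does not "prove" the null — one of the common misinterpretations the paper's precautions address.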

Relevance:

10.00%

Publisher:

Abstract:

Reflective skills are widely regarded as a means of improving students’ lifelong learning and professional practice in higher education (Rogers 2001). While the value of reflective practice is widely accepted in educational circles, a critical issue is that reflective writing is complex, and has high rhetorical demands, making it difficult to master unless it is taught in an explicit and systematic way. This paper argues that a functional-semantic approach to language (Eggins 2004), based on Halliday’s (1978) systemic functional linguistics can be used to develop a shared language to explicitly teach and assess reflective writing in higher education courses. The paper outlines key theories and scales of reflection, and then uses systemic functional linguistics to develop a social semiotic model for reflective writing. Examples of reflective writing are analysed to show how such a model can be used explicitly to improve the reflective writing skills of higher education students.

Relevance:

10.00%

Publisher:

Abstract:

Predictions that result from scientific research hold great appeal for decision-makers who are grappling with complex and controversial environmental issues, by promising to enhance their ability to determine the need for and outcomes of alternative decisions. A problem exists in that decision-makers and scientists in the public and private sectors solicit, produce, and use such predictions with little understanding of their accuracy or utility, and often without systematic evaluation or mechanisms of accountability. In order to contribute to a more effective role for ecological science in support of decision-making, this paper discusses three "best practices" for quantitative ecosystem modeling and prediction gleaned from research on modeling, prediction, and decision-making in the atmospheric and earth sciences. The lessons are distilled from a series of case studies and placed into the specific context of examples from ecological science.

Relevance:

10.00%

Publisher:

Abstract:

My research investigates why nouns are learned disproportionately more frequently than other kinds of words during early language acquisition (Gentner, 1982; Gleitman et al., 2004). This question must be considered in the context of cognitive development in general. Infants have two major streams of environmental information to make meaningful: perceptual and linguistic. Perceptual information flows in from the senses and is processed into symbolic representations by the primitive language of thought (Fodor, 1975). These symbolic representations are then linked to linguistic input to enable language comprehension and ultimately production. Yet how exactly does perceptual information become conceptualized? Although this question is difficult, there has been progress. One way that children might have an easier job is if they have structures that simplify the data. Thus, if particular sorts of perceptual information could be separated from the mass of input, then it would be easier for children to refer to those specific things when learning words (Spelke, 1990; Pylyshyn, 2003). It would be easier still if linguistic input was segmented in predictable ways (Gentner, 1982; Gleitman et al., 2004). Unfortunately, the frequency of patterns in lexical or grammatical input cannot explain the cross-cultural and cross-linguistic tendency to favor nouns over verbs and predicates. There are three examples of this failure: 1) a wide variety of nouns are uttered less frequently than a smaller number of verbs and yet are learnt far more easily (Gentner, 1982); 2) word order and morphological transparency offer no insight when you contrast the sentence structures and word inflections of different languages (Slobin, 1973); and 3) particular language teaching behaviors (e.g. pointing at objects and repeating names for them) have little impact on children's tendency to prefer concrete nouns in their first fifty words (Newport et al., 1977).
Although the linguistic solution appears problematic, there has been increasing evidence that the early visual system does indeed segment perceptual information in specific ways before the conscious mind begins to intervene (Pylyshyn, 2003). I argue that nouns are easier to learn because their referents directly connect with innate features of the perceptual faculty. This hypothesis stems from work done on visual indexes by Zenon Pylyshyn (2001, 2003). Pylyshyn argues that the early visual system (the architecture of the "vision module") segments perceptual data into pre-conceptual proto-objects called FINSTs. FINSTs typically correspond to physical things such as Spelke objects (Spelke, 1990). Hence, before conceptualization, visual objects are picked out by the perceptual system demonstratively, like a pointing finger indicating ‘this’ or ‘that’. I suggest that this primitive system of demonstration elaborates on Gareth Evans's (1982) theory of nonconceptual content. Nouns are learnt first because their referents attract demonstrative visual indexes. This theory also explains why infants less often name stationary objects such as ‘plate’ or ‘table’, but do name things that attract the focal attention of the early visual system, i.e., small objects that move, such as ‘dog’ or ‘ball’. This view leaves open the questions of how blind children learn words for visible objects and why children learn category nouns (e.g. ‘dog’) rather than proper nouns (e.g. ‘Fido’) or higher taxonomic distinctions (e.g. ‘animal’).

Relevance:

10.00%

Publisher:

Abstract:

Understanding the expected safety performance of rural signalized intersections is critical for (a) identifying high-risk sites where the observed safety performance is substantially worse than the expected safety performance, (b) understanding influential factors associated with crashes, and (c) predicting the future performance of sites and helping plan safety-enhancing activities. These three critical activities are routinely conducted for safety management and planning purposes in jurisdictions throughout the United States and around the world. This paper aims to develop baseline expected safety performance functions of rural signalized intersections in South Korea, which to date have not yet been established or reported in the literature. Data are examined from numerous locations within South Korea for both three-legged and four-legged configurations. The safety effects of a host of operational and geometric variables on the safety performance of these sites are also examined. In addition, supplementary tables and graphs are developed for comparing the baseline safety performance of sites with various geometric and operational features. These graphs identify how various factors are associated with safety. The expected safety prediction tables offer advantages over regression prediction equations by allowing the safety manager to isolate specific features of the intersections and examine their impact on expected safety. The examination of the expected safety performance tables through illustrated examples highlights the need to correct for regression-to-the-mean effects, emphasizes the negative impacts of multicollinearity, shows why multivariate models do not translate well to accident modification factors, and illuminates the need to examine road safety carefully and methodically. Caveats are provided on the use of the safety performance prediction graphs developed in this paper.
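The baseline prediction and the regression-to-the-mean correction mentioned above can be sketched in a few lines. The code below uses the common SPF functional form E[N] = exp(b0) · F_maj^b1 · F_min^b2 and the standard empirical Bayes blend; the coefficient values and function names are illustrative placeholders, not the estimates reported for the South Korean data.

```python
import math

def spf_expected_crashes(aadt_major, aadt_minor, b0=-8.0, b1=0.60, b2=0.45):
    """Baseline safety performance function of the common multiplicative
    form: expected crashes per year as a power function of the entering
    traffic volumes (AADT) on the major and minor approaches.
    Coefficients here are illustrative, not estimated values."""
    return math.exp(b0) * aadt_major ** b1 * aadt_minor ** b2

def eb_estimate(predicted, observed, overdispersion_k, years=1):
    """Empirical Bayes estimate: a weighted blend of the SPF prediction
    and the site's observed count, the standard correction for
    regression-to-the-mean effects."""
    w = 1.0 / (1.0 + overdispersion_k * predicted * years)
    return w * predicted * years + (1 - w) * observed
```

Because the EB weight lies strictly between 0 and 1, the corrected estimate always falls between the model prediction and the raw observed count, pulling extreme sites back toward the baseline.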

Relevance:

10.00%

Publisher:

Abstract:

An essential challenge for organizations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. Using three case examples, this paper explores how Enterprise 2.0 technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering organizations. The paper is intended to be a timely introduction to the benefits and issues associated with the use of Enterprise 2.0 technologies, with the aim of achieving the positive outcomes associated with knowledge management.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the development of a simulation model of passenger flow in a metro station. The model allows studies of passenger flow in stations with different layouts and facilities, thus providing operators with valuable information, such as passenger flow and passenger density at critical locations and passenger-handling facilities within a station. The adoption of the concept of Petri nets in the simulation model is discussed. Examples are provided to demonstrate its application to passenger flow analysis, train scheduling and the testing of alternative station layouts.
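The Petri net idea mentioned above can be sketched with a minimal place/transition net: places hold token counts (here, passengers and free gate capacity) and a transition fires only when every input place has enough tokens. This is a generic textbook-style Petri net in Python, with a hypothetical concourse/gate/platform layout; it is not the model from the paper.

```python
class PetriNet:
    """Minimal place/transition Petri net. Places map to token counts;
    each transition consumes tokens from its input places and produces
    tokens in its output places when it fires."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (inputs, outputs) weight dicts

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        """A transition is enabled when every input place holds at least
        the required number of tokens."""
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        """Fire the transition if enabled; returns True on success."""
        if not self.enabled(name):
            return False
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w
        return True

# Hypothetical station fragment: passengers on the concourse pass through
# a ticket gate (limited capacity, modelled as tokens) onto the platform.
net = PetriNet({"concourse": 10, "gate_free": 2, "platform": 0})
net.add_transition("enter_gate",
                   inputs={"concourse": 1, "gate_free": 1},
                   outputs={"platform": 1, "gate_free": 1})
```

Facility capacities fall out naturally: with only two `gate_free` tokens, at most two passengers can occupy the gate at once, which is how such models expose congestion at passenger-handling facilities.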

Relevance:

10.00%

Publisher:

Abstract:

This paper outlines some of the experiences of Indigenous women academics in higher education. The author offers these experiences not to position Indigenous women academics as victims, but to expose the problematic nature of racism, systemic marginalisation, white race privilege and racialised subjectivity played out within Australian higher education institutions. By utilising these experiences and examples, she seeks to bring the theoretical into the everyday world of being Indigenous within academe. In analysing these examples, the author reveals the relationships between oppression, white race privilege, institutional privilege and the epistemology that maintains them. She argues that, in moving from a position of being silent to speaking about what she has witnessed and experienced, she is able to move from the position of object to subject and gain a form of liberated voice (hooks 1989: 9) for herself and other Indigenous women. She seeks to challenge the practices within universities that continue to subjugate Indigenous women academics.

Relevance:

10.00%

Publisher:

Abstract:

Groundwater is increasingly recognised as an important yet vulnerable natural resource, and a key consideration in water cycle management. However, communicating the behaviour of sub-surface water systems, an important part of encouraging better water management, is visually difficult. Modern 3D visualisation techniques can be used to communicate these complex behaviours effectively, engaging and informing community stakeholders. Most software developed for this purpose is expensive and requires specialist skills. The Groundwater Visualisation System (GVS) developed by QUT integrates a wide range of surface and sub-surface data to produce a 3D visualisation of the behaviour, structure and connectivity of groundwater/surface water systems. Surface data (elevation, surface water, land use, vegetation and geology) are combined with data collected from boreholes (bore locations and subsurface geology). Time-series data (water levels, groundwater quality, rainfall, stream flow and groundwater abstraction) are displayed as an animation within the 3D framework, or graphically, to show changes in water system condition over time. GVS delivers an interactive, stand-alone 3D visualisation product that can be used in a standard PC environment. No specialised training or modelling skills are required. The software has been used extensively in the SEQ region to inform and engage water managers and the community alike. Examples will be given of GVS visualisations developed in areas where there have been community concerns around groundwater over-use and contamination.

Relevance:

10.00%

Publisher:

Abstract:

Music making affects relationships with self and others by generating a sense of belonging to a culture or ideology (Bamford, 2006; Barovick, 2001; Dillon & Stewart, 2006; Fiske, 2000; Hallam, 2001). Whilst studies from arts education research present compelling examples of these relationships, others argue that they do not present sufficiently validated evidence of a causal link between music making experiences and cognitive or social change (Winner & Cooper, 2000; Winner & Hetland, 2000a, 2000b, 2001). I have suggested elsewhere that this disconnection between compelling evidence and observations of the effects of music making is in part due to the lack of rigor in research and the incapacity of many methods to capture these experiences in meaningful ways (Dillon, 2006). Part of the answer to these questions about rigor and causality lies in the creative use of new media technologies that capture the results of relationships in music artefacts. Crucially, it is the effective management of these artefacts within computer systems that allows researchers and practitioners to collect, organize, analyse and then theorise such music making experiences.