973 results for pre-emphasis technique


Relevance:

20.00%

Publisher:

Abstract:

Most unsignalised intersection capacity calculation procedures are based on gap acceptance models. The accuracy of critical gap estimation affects the accuracy of capacity and delay estimation. Several methods have been published to estimate drivers' sample mean critical gap, with the Maximum Likelihood Estimation (MLE) technique regarded as the most accurate. This study assesses three novel methods, the Average Central Gap (ACG) method, the Strength Weighted Central Gap (SWCG) method, and the Mode Central Gap (MCG) method, against MLE for their fidelity in rendering true sample mean critical gaps. A Monte Carlo event-based simulation model was used to draw the maximum rejected gap and the accepted gap for each of a sample of 300 drivers across 32 simulation runs. The simulation mean critical gap was varied between 3 s and 8 s, while the offered gap rate was varied between 0.05 veh/s and 0.55 veh/s. This study affirms that MLE provides a close to perfect fit to simulation mean critical gaps across a broad range of conditions. The MCG method also provides an almost perfect fit and has superior computational simplicity and efficiency to the MLE. The SWCG method performs robustly under high flows but poorly under low to moderate flows. Further research is recommended using field traffic data, under a variety of minor stream and major stream flow conditions and for a variety of minor stream movement types, to compare critical gap estimates using MLE against MCG. Should the MCG method prove as robust as MLE, serious consideration should be given to its adoption for estimating critical gap parameters in guidelines.
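As a rough illustration of the simulation procedure described above, the sketch below (Python, with an assumed log-normal critical-gap spread, an exponential headway model and a naive midpoint estimator; none of these are the paper's ACG, SWCG or MCG definitions) draws a maximum rejected gap and an accepted gap for each simulated driver and recovers an estimate of the sample mean critical gap.

```python
# A minimal sketch (not the authors' code) of a Monte Carlo gap-acceptance simulation:
# each simulated driver has a fixed critical gap, rejects every offered gap shorter
# than it, and accepts the first offered gap at least as long.
import numpy as np

rng = np.random.default_rng(42)

def simulate_drivers(n_drivers=300, mean_critical_gap=5.0, gap_rate=0.3):
    """Return (max_rejected_gap, accepted_gap) pairs for each driver."""
    # Assumed log-normal spread of individual critical gaps around the mean.
    critical_gaps = rng.lognormal(mean=np.log(mean_critical_gap), sigma=0.2, size=n_drivers)
    records = []
    for cg in critical_gaps:
        max_rejected = 0.0
        while True:
            offered = rng.exponential(1.0 / gap_rate)  # headway between major-stream vehicles
            if offered >= cg:
                records.append((max_rejected, offered))
                break
            max_rejected = max(max_rejected, offered)
    return np.array(records)

pairs = simulate_drivers()
# Crude estimator: the critical gap lies between the largest rejected gap and the
# accepted gap, so average the midpoints (illustrative only, not MLE/ACG/SWCG/MCG).
estimate = np.mean((pairs[:, 0] + pairs[:, 1]) / 2.0)
print(f"estimated sample mean critical gap: {estimate:.2f} s")
```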

Relevance:

20.00%

Publisher:

Abstract:

Background: Pre-participation screening is commonly used to measure and assess potential intrinsic injury risk. The single leg squat is one such clinical screening measure used to assess lumbopelvic stability and associated intrinsic injury risk. With the addition of a decline board, the single leg decline squat (SLDS) has been shown to reduce ankle dorsiflexion restrictions and allow greater sagittal plane movement of the hip and knee. On this basis, the SLDS has been employed in the Cricket Australia physiotherapy screening protocols as a measure of lumbopelvic control in place of the more traditional single leg flat squat (SLFS). Previous research has failed to demonstrate which squatting technique allows for a more comprehensive assessment of lumbopelvic stability. Within the literature, only tenuous links are drawn between kinematics and hip strength measures for the SLS, and formal evaluation of subjective screening methods has also been suggested. Purpose: This study had several focal points, namely 1) to compare the kinematic differences between the two single leg squatting conditions, primarily the five key kinematic variables fundamental to subjectively assessing lumbopelvic stability; 2) to determine the effect that ankle dorsiflexion range of motion has on squat kinematics in the two squat techniques; 3) to examine the association between key kinematics and subjective physiotherapists' assessments; and 4) to explore the association between key kinematics and hip strength. Methods: Nineteen (n = 19) subjects performed five SLDS and five SLFS on each leg while being filmed by an eight-camera motion analysis system. Four hip strength measures (internal/external rotation and abduction/adduction) and ankle dorsiflexion range of motion were measured using a hand-held dynamometer and a goniometer respectively in 16 of these subjects. The same 16 participants were subjectively assessed for lumbopelvic stability by an experienced physiotherapist. Paired-samples t-tests were performed on the five predetermined kinematic variables to assess the differences between squat conditions, with a Bonferroni correction for multiple comparisons adjusting the significance level to p = 0.005. Linear regressions were used to assess the relationships between kinematics, ankle range of motion and hip strength measures. Bivariate correlations between hip strength measures, kinematics and pelvic obliquity were used to investigate any possible relationships. Results: 1) Significant kinematic differences between squats were observed in dominant (D) and non-dominant (ND) end-of-range hip external rotation (ND p < 0.001; D p = 0.004) and hip adduction kinematics (ND p < 0.001; D p < 0.001). For mean angles, significant differences were observed only in the non-dominant leg, in hip adduction (p = 0.001) and hip external rotation (p < 0.001). 2) Significant linear relationships were observed between clinical measures of ankle dorsiflexion and sagittal plane kinematics, namely SLFS dominant ankle (p = 0.006; R2 = 0.429), SLFS non-dominant knee (p = 0.015; R2 = 0.352) and SLFS non-dominant ankle (p = 0.027; R2 = 0.305) kinematics. Only the dominant ankle (p = 0.020; R2 = 0.331) was found to have a relationship with the decline squat. 3) Strength measures had only tenuous associations with the subjective assessments of lumbopelvic stability, with no significant relationships observed. 4) For the non-dominant leg, external rotation strength and abduction strength were found to be significantly correlated with hip rotation kinematics (Newtons: r = 0.458, p = 0.049; normalised for bodyweight: r = 0.469, p = 0.043) and pelvic obliquity (normalised for bodyweight: r = 0.498, p = 0.030) respectively, for the SLFS only. No significant relationships were observed in the dominant leg for either squat condition. Some elements of the hip strength screening protocols had linear relationships with kinematics of the lower limb, particularly the sagittal plane movements of the knee and ankle. Discussion: The key finding of this study is that kinematic differences can occur at the hip, without significant kinematic differences at the knee, as a result of the introduction of a decline board. Further observations reinforce the role of limited ankle dorsiflexion range of motion in sagittal plane movement of the hip and knee and, in turn, multiplanar kinematics of the lower limb. The kinematic differences between conditions have clinical implications for screening protocols that employ frontal plane movement of the knee as a guide for femoral adduction and rotation. Subjects who returned stronger hip strength measurements also appeared to squat deeper, as characterised by differences in sagittal plane kinematics of the knee and ankle. Despite these findings, the relationship between hip strength and lower limb kinematics remains largely tenuous in the assessment of lumbopelvic stability using the SLS, and the association between kinematics and subjective measures of lumbopelvic stability also remains tenuous between and within SLS screening protocols. More functional measures of hip strength are needed to further investigate these relationships. Conclusion: The type of SLS (flat or decline) should be taken into account when screening for lumbopelvic stability. Changes to lower limb kinematics, especially around the hip and pelvis, were observed with the introduction of a decline board despite no difference in frontal plane knee movements. Differences in passive ankle dorsiflexion range of motion yielded variations in knee and ankle kinematics during a self-selected single leg squatting task. Removing posterior ankle restraints while using the knee as a guide to changes at the hip may therefore result in inaccurate screening of lumbopelvic stability. The relationship between sagittal plane lower limb kinematics and hip strength suggests that self-selected squat depth may be a useful predictor of lumbopelvic stability. Further research in this area is required.
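A minimal sketch of the statistical comparison described above, using synthetic angles rather than the study's motion-capture data; the ten-comparison Bonferroni divisor (five variables by two legs) is an assumption consistent with the reported adjusted level of p = 0.005.

```python
# Paired t-tests between the flat (SLFS) and decline (SLDS) squat conditions for five
# kinematic variables, with a Bonferroni-adjusted significance level. All numbers are
# placeholders, not the study's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 19
variables = ["hip_flexion", "hip_adduction", "hip_ext_rotation", "knee_valgus", "pelvic_obliquity"]

alpha = 0.05
adjusted_alpha = alpha / (len(variables) * 2)  # 5 variables x 2 legs -> 0.005, matching the abstract

for var in variables:
    slfs = rng.normal(20, 5, n_subjects)        # placeholder SLFS angles (degrees)
    slds = slfs + rng.normal(2, 3, n_subjects)  # placeholder SLDS angles for the same subjects
    t, p = stats.ttest_rel(slfs, slds)
    flag = "significant" if p < adjusted_alpha else "n.s."
    print(f"{var:18s} t = {t:5.2f}, p = {p:.4f} ({flag} at adjusted alpha = {adjusted_alpha:.3f})")
```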

Relevance:

20.00%

Publisher:

Abstract:

From a law enforcement standpoint, the ability to search for a person matching a semantic description (e.g. 1.8 m tall, red shirt, jeans) is highly desirable. While a significant research effort has focused on person re-detection (the task of identifying a previously observed individual in surveillance video), these techniques require descriptors to be built from existing image or video observations. As such, person re-detection techniques are not suited to situations where footage of the person of interest is not readily available, such as when a witness reports a recent crime. In this paper, we present a novel framework that is able to search for a person based on a semantic description. The proposed approach uses size and colour cues, and does not require a person detection routine to locate people in the scene, improving utility in crowded conditions. The approach is demonstrated on a new database that will be made available to the research community, and we show that the proposed technique is able to correctly localise a person in a video based on a simple semantic description.
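The sketch below is a toy illustration of the size-and-colour-cue idea, not the paper's framework: it thresholds a synthetic frame for a "red shirt" colour range and keeps connected regions whose pixel height is plausible for a person. The colour bounds and expected height are arbitrary assumptions.

```python
# Keep red-ish regions whose pixel height is consistent with an expected person height,
# without running a person detector (illustrative only).
import numpy as np
from scipy import ndimage

def find_candidates(frame_rgb, expected_height_px=120, height_tol=0.4):
    r = frame_rgb[..., 0].astype(int)
    g = frame_rgb[..., 1].astype(int)
    b = frame_rgb[..., 2].astype(int)
    red_mask = (r > 150) & (r - g > 60) & (r - b > 60)   # crude "red shirt" colour cue
    labels, _ = ndimage.label(red_mask)
    candidates = []
    for region in ndimage.find_objects(labels):
        height = region[0].stop - region[0].start
        if abs(height - expected_height_px) / expected_height_px < height_tol:  # size cue
            candidates.append(region)
    return candidates

# Synthetic frame with one red block standing in for a person in a red shirt.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[200:320, 300:340, 0] = 200
for rows, cols in find_candidates(frame):
    print("candidate at rows", (rows.start, rows.stop), "cols", (cols.start, cols.stop))
```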

Relevance:

20.00%

Publisher:

Abstract:

Evidence is mounting that values education is providing positive outcomes for students, teachers and schools (Benninga, Berkowitz, Kuehn, & Smith, 2006; DEST, 2008; Hattie, 2003; Lovat, Clement, Dally, & Toomey, 2010). Despite this, Australian pre-service teacher education does not appear to be changing in the ways necessary to equip teachers to teach with a values focus (Lovat, Dally, Clement, & Toomey, 2011). This article presents findings from a case study that explored current teachers' perceptions of the skills pre-service teachers need to teach values education effectively. Teachers who currently teach with a values focus highlighted that pre-service teacher education degrees need to encourage an ongoing commitment to continual learning, critical reflection and growth in pre-service teachers, along with excellent questioning and listening skills. Further, they argued that pre-service teachers need to be skilled in recognising and responding to student diversity. The article concludes by arguing for changes that need to occur in pre-service teacher education if teachers are to teach effectively with a values focus, including the need for stronger connections between pre-service and experienced teachers.

Relevance:

20.00%

Publisher:

Abstract:

The most common software analysis tools available for measuring fluorescence images are designed for two-dimensional (2D) data and rely on manual settings for inclusion and exclusion of data points, along with computer-aided pattern recognition, to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed from the approximations and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current tools available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the MeasurementPro feature, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are comprised of dendrites, axons and spines (a tree-like structure). The module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and make the software more suitable to biological applications, Imaris developed Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous, because it builds the cell surface without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for objects of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but limited familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.
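For readers without access to Imaris XT/MATLAB, the sketch below shows the same general idea with generic open-source tools (scikit-image and NumPy, on a synthetic volume): automated 3D labelling and per-object morphometry of amorphous fluorescent structures. It is an illustration of the concept only, not the platform described above.

```python
# Threshold a synthetic confocal-like volume, label connected 3D components, and report
# per-object voxel counts and centroids; no pre-defined cell shape is assumed.
import numpy as np
from skimage import filters, measure

rng = np.random.default_rng(1)
volume = rng.random((64, 128, 128)) * 0.2      # dim background noise (placeholder z-stack)
volume[20:40, 30:60, 30:60] += 1.0             # two synthetic "cells" of arbitrary shape
volume[45:55, 80:110, 20:60] += 1.0

threshold = filters.threshold_otsu(volume)     # automatic, user-adjustable threshold
labels = measure.label(volume > threshold)     # 3D connected-component labelling
for props in measure.regionprops(labels):
    print(f"object {props.label}: {props.area} voxels, centroid {np.round(props.centroid, 1)}")
```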

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes the use of a battery energy storage (BES) system for the grid-connected doubly fed induction generator (DFIG). The BES helps to store or release additional power during periods of higher or lower wind speed so as to maintain constant grid power. To achieve this, the DC link capacitor in the DFIG-based wind turbine is replaced with the BES system. The control scheme is modified, and coordinated tuning of the associated controllers to enhance the damping of the oscillatory modes is carried out using the bacterial foraging technique. Results from eigenvalue analysis and time-domain simulation studies are presented to demonstrate the effectiveness of the BES system in maintaining grid stability under normal operation.
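A minimal sketch of the bacterial foraging idea used for the coordinated controller tuning: a simplified chemotaxis-only loop in which candidate gain vectors tumble in random directions and swim while the cost improves. The three-gain cost function is a stand-in; the paper's objective is a damping criterion derived from the system's oscillatory eigenvalues.

```python
# Simplified bacterial foraging optimisation (chemotaxis only; reproduction and
# elimination-dispersal omitted). The cost function is a placeholder for
# "negative damping of the dominant oscillatory mode".
import numpy as np

rng = np.random.default_rng(3)

def cost(gains):
    return np.sum((gains - np.array([2.0, 0.5, 10.0])) ** 2)  # illustrative target gains

def bacterial_foraging(n_bacteria=20, n_chemotactic=50, n_swim=4, step=0.1, dim=3):
    population = rng.uniform(0, 5, size=(n_bacteria, dim))     # candidate controller gains
    best, best_cost = None, np.inf
    for _ in range(n_chemotactic):
        for i in range(n_bacteria):
            direction = rng.normal(size=dim)
            direction /= np.linalg.norm(direction)             # tumble: random unit direction
            for _ in range(n_swim):                            # swim while the cost keeps improving
                trial = population[i] + step * direction
                if cost(trial) < cost(population[i]):
                    population[i] = trial
                else:
                    break
            if cost(population[i]) < best_cost:
                best, best_cost = population[i].copy(), cost(population[i])
    return best, best_cost

gains, c = bacterial_foraging()
print("tuned gains:", np.round(gains, 2), "cost:", round(c, 4))
```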

Relevance:

20.00%

Publisher:

Abstract:

To develop a rapid, optimized technique for wide-field imaging of the human corneal subbasal nerve plexus, a dynamic fixation target was developed and, coupled with semiautomated tiling software, used to create a rapid method of capturing and montaging multiple corneal confocal microscopy images. To illustrate the utility of this technique, wide-field maps of the subbasal nerve plexus were produced in two participants with diabetes, one with and one without neuropathy. The technique produced montages of the central 3 mm of the subbasal corneal nerve plexus. The maps suggest a general reduction in the number of nerve fibers and branches in the diabetic participant with neuropathy compared with the individual without neuropathy. This novel technique will allow more routine and widespread use of subbasal nerve plexus mapping in clinical and research settings. The significant reduction in the time needed to image the corneal subbasal nerve plexus should expedite studies of larger groups of diabetic patients and those with other conditions affecting nerve fibers. The inferior whorl and surrounding areas may show the greatest loss of nerve fibers in individuals with diabetic neuropathy, but this should be further investigated in a larger cohort.
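The sketch below illustrates the montaging step in a generic way (brute-force translation search on synthetic overlapping tiles); it is not the semiautomated tiling software described above, and the tile sizes and search window are arbitrary.

```python
# Estimate the offset of one overlapping confocal tile relative to its neighbour, then
# place both tiles into a wide-field canvas (search restricted to non-negative offsets
# for brevity).
import numpy as np

rng = np.random.default_rng(7)
scene = rng.random((400, 400))          # stand-in for the subbasal nerve plexus
tile_a = scene[100:300, 100:300]        # reference tile (200 x 200)
tile_b = scene[120:320, 150:350]        # neighbouring tile, offset by (20, 50) in the scene

def estimate_offset(ref, mov, search=60):
    """Return (dy, dx) such that mov[i, j] ~= ref[i + dy, j + dx] on the overlap."""
    best, best_err = (0, 0), np.inf
    for dy in range(search):
        for dx in range(search):
            overlap_ref = ref[dy:, dx:]
            overlap_mov = mov[:overlap_ref.shape[0], :overlap_ref.shape[1]]
            err = np.mean((overlap_ref - overlap_mov) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

dy, dx = estimate_offset(tile_a, tile_b)
print("estimated offset:", (dy, dx))    # (20, 50) for this synthetic example

canvas = np.full((500, 500), np.nan)    # wide-field montage
canvas[0:200, 0:200] = tile_a
canvas[dy:dy + 200, dx:dx + 200] = tile_b
```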

Relevance:

20.00%

Publisher:

Abstract:

Large-scale molecular dynamics simulations are performed to characterize the effects of pre-existing surface defects on the vibrational properties of Ag nanowires. It is found that the first-order natural frequency of the nanowire appears insensitive to different surface defects, indicating a defect-insensitivity of the nanowire's Young's modulus. At the same time, an increase in the quality (Q)-factor is observed due to the presence of defects. In particular, a beat phenomenon is observed for the nanowire with a surface edge defect, driven by a single actuation. It is concluded that different surface defects could act as an effective means of tuning the vibrational properties of nanowires. This study sheds light on the mechanical performance of nanowires when surface defects are present, which should benefit the development of nanowire-based devices.
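As an illustration of how the reported quantities can be extracted from a free-vibration trace, the sketch below estimates the first-mode frequency from the FFT peak and the Q-factor from the decay of the oscillation envelope (Q = pi * f0 * tau for light damping). The signal is synthetic; the frequency and decay time are placeholder values, not the MD results.

```python
# Synthetic ring-down: x(t) = exp(-t/tau) * cos(2*pi*f0*t), with illustrative values.
import numpy as np

f0, tau = 25e9, 2e-9                       # assumed 25 GHz mode with a 2 ns decay time
t = np.arange(0, 10e-9, 1e-12)
x = np.exp(-t / tau) * np.cos(2 * np.pi * f0 * t)

# Natural frequency from the FFT peak of the displacement history.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
f_est = freqs[np.argmax(spectrum)]

# Decay time from a linear fit to the log of successive positive peak amplitudes,
# then Q = pi * f0 * tau for a lightly damped oscillator.
peaks = x[(x > np.roll(x, 1)) & (x > np.roll(x, -1)) & (x > 0)]
tau_est = -1.0 / np.polyfit(np.arange(peaks.size) / f_est, np.log(peaks), 1)[0]
Q_est = np.pi * f_est * tau_est
print(f"f0 ~ {f_est / 1e9:.1f} GHz, Q ~ {Q_est:.0f}")
```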

Relevance:

20.00%

Publisher:

Abstract:

Pre-packaged administrations have been prevalent in the UK for years. However, Australia's voluntary administration regime has been more restrictive of the practice. This article analyses the evolution of UK pre-packs, why they are not prevalent in Australia and the challenges for UK and Australian lawmakers in striking the right balance with pre-packs in their respective administration regimes. The article proposes a mechanism that might make ‘connected-party’ pre-pack business sales work more fairly for stakeholders — that is, by obligating a connected-party purchaser to make a future-income contribution in favour of the insolvent company whose business has been ‘rescued’ by a pre-packaged sale in administration.

Relevance:

20.00%

Publisher:

Abstract:

How do you identify "good" teaching practice in the complexity of a real classroom? How do you know that beginning teachers can recognise effective digital pedagogy when they see it? How can teacher educators see through their students' eyes? The study in this paper arose from our interest in what pre-service teachers "see" when observing effective classroom practice and how this might reveal their own technological, pedagogical and content knowledge. We asked 104 pre-service teachers from Early Years, Primary and Secondary cohorts to watch and comment upon selected exemplary videos of teachers using ICT (information and communication technologies) in Science. The pre-service teachers recorded their observations using a simple PMI (plus, minus, interesting) matrix, and these observations were then coded using the SOLO Taxonomy to look for evidence of their familiarity with and judgements of digital pedagogies. From this, we determined that the majority of pre-service teachers we surveyed were using a descriptive rather than a reflective strategy, that is, not extending beyond what was demonstrated in the teaching exemplar or differentiating between action and purpose. We also concluded that this method warrants wider trialling as a means of evaluating students' understandings of the complexity of the digital classroom.

Relevance:

20.00%

Publisher:

Abstract:

The technique of femoral cement-in-cement revision is well established, but there are no previous series reporting its use on the acetabular side at the time of revision total hip arthroplasty. We describe the surgical technique and report the outcome of 60 consecutive cement-in-cement revisions of the acetabular component at a mean follow-up of 8.5 years (range 5-12 years). All had a radiologically and clinically well-fixed acetabular cement mantle at the time of revision. 29 patients died; no case was lost to follow-up. The 2 most common indications for acetabular revision were recurrent dislocation (77%) and to complement a femoral revision (20%). There were 2 cases of aseptic cup loosening (3.3%) requiring re-revision. No other hip was clinically or radiologically loose (96.7%) at latest follow-up. One case was re-revised for infection, 4 for recurrent dislocation and 1 for disarticulation of a constrained component. At 5 years, the Kaplan-Meier survival rate was 100% for aseptic loosening and 92.2% (95% CI 84.8-99.6%) with revision for all causes as the endpoint. These results support the use of the cement-in-cement revision technique in appropriate cases on the acetabular side. Theoretical advantages include preservation of bone stock, reduced operating time, reduced risk of complications and durable fixation.
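For reference, Kaplan-Meier survival figures of the kind quoted above come from the product-limit estimator, sketched below on synthetic follow-up data (the durations and event indicators are invented, not the series' records).

```python
# Kaplan-Meier product-limit estimate: S(t) is the product over event times t_i <= t
# of (1 - d_i / n_i), where n_i is the number of hips still at risk.
import numpy as np

rng = np.random.default_rng(11)
n = 60
followup = rng.uniform(5, 12, n)                 # years of follow-up per hip (synthetic)
event = rng.random(n) < 0.1                      # True = re-revision for any cause

def kaplan_meier(times, events):
    """Return (time, survival) pairs evaluated at each observed time."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk, surv, curve = len(times), 1.0, []
    for t, e in zip(times, events):
        if e:
            surv *= 1.0 - 1.0 / at_risk          # an event (re-revision) at time t
        curve.append((t, surv))
        at_risk -= 1                             # this hip leaves the risk set (event or censored)
    return curve

curve = kaplan_meier(followup, event)
survival_at_mean_fu = [s for t, s in curve if t <= 8.5][-1]
print(f"estimated revision-free survival at 8.5 years: {survival_at_mean_fu:.3f}")
```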

Relevance:

20.00%

Publisher:

Abstract:

Most current computer systems authorise the user at the start of a session and do not detect whether the current user is still the initial authorised user, a substitute user, or an intruder pretending to be a valid user. A system that continuously checks the identity of the user throughout the session, without being intrusive to the end-user, is therefore necessary. Such a system is called a continuous authentication system (CAS). Researchers have applied several approaches to CAS, and most of these techniques are based on biometrics. These continuous biometric authentication systems (CBAS) are supplied with user traits and characteristics. One of the main types of biometrics is keystroke dynamics, which has been widely tried and accepted for providing continuous user authentication. Keystroke dynamics is appealing for many reasons. First, it is less obtrusive, since users will be typing on the computer keyboard anyway. Second, it does not require extra hardware. Finally, keystroke dynamics will be available after the authentication step at the start of the computer session. Currently, there is insufficient research in the field of CBAS with keystroke dynamics. To date, most existing schemes ignore the continuous authentication scenarios, which might affect their practicality in different real-world applications. Also, contemporary CBAS with keystroke dynamics approaches use character sequences as features representative of user typing behavior, but their feature selection criteria do not guarantee features with strong statistical significance, which may lead to a less accurate statistical user representation. Furthermore, their selected features do not inherently incorporate user typing behavior. Finally, existing CBAS based on keystroke dynamics are typically dependent on pre-defined user-typing models for continuous authentication. This dependency restricts the systems to authenticating only known users whose typing samples have been modelled. This research addresses these limitations of existing CBAS schemes by developing a generic model to better identify and understand the characteristics and requirements of each type of CBAS and continuous authentication scenario. The research also proposes four statistical feature selection techniques that select features with the highest statistical significance and encompass different user typing behaviors, representing user typing patterns effectively. Finally, the research proposes a user-independent threshold approach that is able to authenticate a user accurately without needing any pre-defined user typing model a priori, and enhances the technique to detect an impostor or intruder who may take over at any point during the computer session.
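A minimal sketch of the keystroke-dynamics idea: dwell and flight times are extracted from key events and a session window is accepted or rejected by its normalised distance from an enrolled profile. The features, threshold and timings are illustrative assumptions, not the thesis's statistically selected features or its user-independent threshold method.

```python
# Extract simple timing features from key events and flag a window that deviates too
# far from the enrolled typing profile (all numbers are synthetic).
import numpy as np

def features(events):
    """events: list of (key, press_time_ms, release_time_ms) in typing order."""
    dwell = np.array([r - p for _, p, r in events])                                  # key hold times
    flight = np.array([events[i + 1][1] - events[i][2] for i in range(len(events) - 1)])  # release-to-press
    return np.array([dwell.mean(), dwell.std(), flight.mean(), flight.std()])

def is_same_user(profile, window, threshold=3.0):
    mu, sigma = profile
    z = np.abs(features(window) - mu) / sigma    # normalised deviation per feature
    return z.mean() < threshold

# Enrol from several windows of the genuine user's typing:
# base profile = [dwell mean, dwell std, flight mean, flight std] plus noise.
rng = np.random.default_rng(5)
enrol = np.array([[100, 6, 180, 6] + rng.normal(0, 5, 4) for _ in range(20)])
profile = (enrol.mean(axis=0), enrol.std(axis=0))

window = [("a", 0, 100), ("b", 290, 385), ("c", 560, 660), ("d", 840, 950)]
print("accept session window:", is_same_user(profile, window))
```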

Relevance:

20.00%

Publisher:

Abstract:

Background: Cervical cancer and infection with human immunodeficiency virus (HIV) are both important public health problems in South Africa (SA). The aim of this study was to determine the prevalence of cervical squamous intraepithelial lesions (SILs), high-risk human papillomavirus (HR-HPV), HPV viral load and HPV genotypes in HIV-positive women initiating anti-retroviral (ARV) therapy. Methods: A cross-sectional survey was conducted at an ARV treatment clinic in Cape Town, SA in 2007. Cervical specimens were taken for cytological analysis and HPV testing. The Digene Hybrid Capture 2 (HC2) test was used to detect HR-HPV. Relative light units (RLU) were used as a measure of HPV viral load. HPV types were determined using the Roche Linear Array HPV Genotyping test. Crude associations with abnormal cytology were tested and multiple logistic regression was used to determine independent risk factors for abnormal cytology. Results: The median age of the 109 participants was 31 years, the median CD4 count was 125/mm3, 66.3% had an abnormal Pap smear, the HR-HPV prevalence was 78.9% (Digene), the median HPV viral load was 181.1 RLU (HC2-positive samples only) and 78.4% had multiple genotypes. Among women with abnormal smears the most prevalent HR-HPV types were HPV 16, 58 and 51, each with a prevalence of 28.5%. On univariate analysis, HR-HPV, multiple HPV types and HPV viral load were significantly associated with the presence of low- and high-grade SILs (LSIL/HSIL). Multivariate logistic regression showed that HPV viral load was associated with increased odds of LSIL/HSIL: an odds ratio of 10.7 (95% CI 2.0 - 57.7) for those who were HC2 positive with a viral load ≤ 181.1 RLU (the median HPV viral load), and 33.8 (95% CI 6.4 - 178.9) for those who were HC2 positive with a viral load > 181.1 RLU. Conclusion: Women initiating ARVs have a high prevalence of abnormal Pap smears and HR-HPV. Our results underscore the need for locally relevant, rigorous screening protocols for the increasing numbers of women accessing ARV therapy, so that the benefits of ARVs are not partially offset by an excess risk of cervical cancer.
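A minimal sketch of the multivariate step described above, on synthetic data: a multiple logistic regression of LSIL/HSIL against a categorical viral-load variable (adjusted for age, as an assumed covariate), with adjusted odds ratios and 95% confidence intervals taken from the exponentiated coefficients.

```python
# Multiple logistic regression with odds ratios (synthetic data, not the study's dataset).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(13)
n = 109
df = pd.DataFrame({
    "age": rng.normal(31, 6, n),
    # 0 = HC2 negative, 1 = HC2 positive with load <= median RLU, 2 = positive above median
    "load_cat": rng.integers(0, 3, n),
})
logit_p = -1.5 + 1.2 * df["load_cat"] + 0.01 * (df["age"] - 31)    # assumed true model
df["lsil_hsil"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("lsil_hsil ~ C(load_cat) + age", data=df).fit(disp=False)
odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```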

Relevance:

20.00%

Publisher:

Abstract:

Pretreatment is an essential and expensive processing step in the manufacture of ethanol from lignocellulosic raw materials. Ionic liquids (ILs) are a new class of solvents that have the potential to be used as pretreatment agents. The attractive characteristics of ionic liquid pretreatment of lignocellulosics, such as thermal stability, dissolution properties, fractionation potential, cellulose decrystallisation capacity and saccharification impact, are investigated in this thesis. Dissolution of bagasse in 1-butyl-3-methylimidazolium chloride ([C4mim]Cl) at high temperatures (110 °C to 160 °C) is investigated as a pretreatment process. Material balances are reported and used, along with enzymatic saccharification data, to identify optimum pretreatment conditions (150 °C for 90 min). At these conditions, the dissolved and reprecipitated material is enriched in cellulose, has a low crystallinity, and its cellulose component is efficiently hydrolysed (93 %, 3 h, 15 FPU). At pretreatment temperatures < 150 °C, the undissolved material has only slightly lower crystallinity than the starting material. At pretreatment temperatures ≥ 150 °C, the undissolved material has low crystallinity and, when combined with the dissolved material, has a saccharification rate and extent similar to completely dissolved material (100 %, 3 h, 15 FPU). Complete dissolution is therefore not necessary to maximize saccharification efficiency at temperatures ≥ 150 °C. Fermentation of [C4mim]Cl-pretreated, enzyme-saccharified bagasse to ethanol is successfully conducted (85 % molar glucose-to-ethanol conversion efficiency). Compared with standard dilute acid pretreatment, the optimised [C4mim]Cl pretreatment achieves substantially higher ethanol yields (79 % cf. 52 %) in less than half the processing time (pretreatment, saccharification, fermentation). Fractionation of bagasse partially dissolved in [C4mim]Cl into a polysaccharide-rich and a lignin-rich fraction is attempted using aqueous biphasic systems (ABSs) and single-phase systems with preferential precipitation. ABSs of ILs and concentrated aqueous inorganic salt solutions are achievable (e.g. [C4mim]Cl with 200 g L-1 NaOH), although they exhibit a number of technical problems, including phase convergence (which increases with increasing biomass loading) and deprotonation of imidazolium ILs (5 % to 8 % mol). Single-phase fractionation systems comprising lignin solvents / cellulose antisolvents, viz. NaOH (2 M) and acetone in water (1:1, volume basis), afford solids with, respectively, 40 % mass and 29 % mass less lignin than water-precipitated solids. However, this delignification imparts little increase in the saccharification rates and extents of these solids. An alternative single-phase fractionation system is achieved simply by using water as an antisolvent. Regulating the water : IL ratio results in a solution that precipitates cellulose while maintaining lignin in solution (0.5 water : IL mass ratio), in both [C4mim]Cl and 1-ethyl-3-methylimidazolium acetate ([C2mim]OAc). This water-based fractionation is applied in three IL pretreatments of bagasse ([C4mim]Cl, 1-ethyl-3-methylimidazolium chloride ([C2mim]Cl) and [C2mim]OAc). Lignin removal of 10 %, 50 % and 60 % mass, respectively, is achieved, although only 0.3 %, 1.5 % and 11.7 % is recoverable even after ample water addition (3.5 water : IL mass ratio) and acidification (pH ≤ 1). In addition, the recovered lignin fraction contains 70 % mass hemicelluloses.
The delignified, cellulose-rich bagasse recovered from these three ILs is subjected to enzymatic saccharification. The saccharification (24 h, 15 FPU) of the cellulose mass in the starting bagasse achieved by these pretreatments ranks as: [C2mim]OAc (83 %) >> [C2mim]Cl (53 %) = [C4mim]Cl (53 %). Mass balance determinations accounted for 97 % of the starting bagasse mass for the [C4mim]Cl pretreatment, 81 % for [C2mim]Cl and 79 % for [C2mim]OAc. For all three IL treatments, the remaining bagasse mass (not accounted for by the mass balance determinations) is mainly (more than half) lignin that is not recoverable from the liquid fraction. After pretreatment, 100 % mass of both ions of all three ILs was recovered in the liquid fraction. Compositional characteristics of the [C2mim]OAc-treated solids, such as low lignin, low acetyl group content and preservation of arabinosyl groups, are opposite to those of the chloride IL-treated solids. The former biomass characteristics resemble those imparted by aqueous alkali pretreatment, while the latter resemble those of aqueous acid pretreatments. The 100 % mass recovery of cellulose in [C2mim]OAc, as opposed to 53 % mass recovery in [C2mim]Cl, further demonstrates this, since cellulose glycosidic bonds are protected under alkali conditions. Decreasing the alkyl chain length of the imidazolium cation of these ILs imparts higher rates of dissolution and higher losses, and increases the severity of the treatment without changing the chemistry involved.
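The mass-balance bookkeeping referred to above can be illustrated with a short calculation; the component masses below are invented round numbers, not the thesis's measured compositions.

```python
# Track cellulose, hemicellulose and lignin across the recovered solid and liquid
# fractions of a pretreatment and report closure against the starting bagasse.
bagasse_in = {"cellulose": 42.0, "hemicellulose": 25.0, "lignin": 22.0, "other": 11.0}  # g per 100 g

recovered_solid = {"cellulose": 41.5, "hemicellulose": 14.0, "lignin": 15.0, "other": 6.0}
recovered_liquid = {"cellulose": 0.0, "hemicellulose": 8.0, "lignin": 3.0, "other": 3.0}

total_in = sum(bagasse_in.values())
total_out = sum(recovered_solid.values()) + sum(recovered_liquid.values())
print(f"overall closure: {100 * total_out / total_in:.1f} % of starting bagasse mass")

for component, mass_in in bagasse_in.items():
    mass_out = recovered_solid[component] + recovered_liquid[component]
    print(f"{component:13s}: {100 * mass_out / mass_in:5.1f} % recovered")
```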

Relevance:

20.00%

Publisher:

Abstract:

Exponential growth of genomic data over the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures, which model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics and, when used in conjunction with comparative genomics, have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcriptional regulatory networks. In our preliminary exploration of the relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Of chief interest was the relationship observed between promoter strength and TFs grouped by their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Although the observations were specific to σ70, such suggestive results strongly encourage additional investigation when more experimentally confirmed data become available.
Much of the remainder of the thesis concerns a machine learning study of binding site prediction using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains, which revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees are constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival.
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
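A minimal sketch of the spectrum-kernel SVM approach used for refining TFBS predictions: each sequence is mapped to its k-mer count vector and the kernel is the inner product of those vectors, supplied to an SVM as a precomputed Gram matrix. The sequences, k value and labels are toy examples, not the thesis's E. coli CRP data.

```python
# Spectrum-kernel SVM for short DNA sequence classification (toy example).
from itertools import product
import numpy as np
from sklearn.svm import SVC

def spectrum_vector(seq, k=3, alphabet="ACGT"):
    """Count vector over all k-mers of the alphabet for one sequence."""
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    index = {kmer: i for i, kmer in enumerate(kmers)}
    v = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        v[index[seq[i:i + k]]] += 1
    return v

# Toy positive (binding-site-like) and negative sequences.
pos = ["TGTGATCTAGATCACA", "TGTGACCTAGGTCACA", "TGTGATTTAGATCACA"]
neg = ["ACGTACGTACGTACGT", "GGGCCCGGGCCCGGGC", "ATATATATATATATAT"]
X = np.array([spectrum_vector(s) for s in pos + neg])
y = np.array([1, 1, 1, 0, 0, 0])

K = X @ X.T                                      # spectrum kernel = inner product of k-mer counts
clf = SVC(kernel="precomputed").fit(K, y)

test = spectrum_vector("TGTGATCTAGCTCACA")       # unseen CRP-like sequence (hypothetical)
print(clf.predict((test @ X.T).reshape(1, -1)))  # kernel row between test and training sequences
```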