540 results for Refining
Abstract:
Background: Integrating 3D virtual world technologies into educational subjects continues to draw the attention of educators and researchers alike. The focus of this study is the use of a virtual world, Second Life, in higher education teaching. In particular, it explores the potential of using a virtual world experience as a learning component situated within a curriculum delivered predominantly through face-to-face teaching methods. Purpose: This paper reports on a research study into the development of a virtual world learning experience designed for marketing students taking a Digital Promotions course. The experience was a field trip into Second Life to allow students to investigate how business branding practices were used for product promotion in this virtual world environment. The paper discusses the issues involved in developing and refining the virtual course component over four semesters. Methods: The study used a pedagogical action research approach, with iterative cycles of development, intervention and evaluation over four semesters. The data analysed were quantitative and qualitative student feedback collected after each field trip, as well as lecturer reflections on each cycle. Sample: Small-scale convenience samples of second- and third-year students studying in a Bachelor of Business degree, majoring in marketing, taking the Digital Promotions subject at a metropolitan university in Queensland, Australia, participated in the study. The samples included students who had and had not experienced the field trip. The numbers of students taking part in the field trip ranged from 22 to 48 across the four semesters. Findings and Implications: The findings from the four iterations of the action research plan helped identify key considerations for incorporating technologies into learning environments. Feedback and reflections from the students and lecturer suggested that an innovative learning opportunity had been developed. However, its pedagogical potential was limited, in part, by technological difficulties and by student perceptions of relevance.
Abstract:
This paper presents an overview of the NTCIR-9 Cross-lingual Link Discovery (Crosslink) task. The overview includes: the motivation for cross-lingual link discovery; the Crosslink task definition; the run submission specification; the assessment and evaluation framework; the evaluation metrics; and the evaluation results of submitted runs. Cross-lingual link discovery (CLLD) is a way of automatically finding potential links between documents in different languages. The goal of this task is to create a reusable resource for evaluating automated CLLD approaches. The results of this research can be used in building and refining systems for automated link discovery. The task is focused on linking between English source documents and Chinese, Korean, and Japanese target documents.
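To make the evaluation setup concrete, here is a minimal sketch of link-level precision/recall scoring over predicted (anchor, target-document) pairs. The pairs, function name, and gold set are illustrative assumptions, not the task's official metric or data:

```python
# Sketch of a simple link-level evaluation for cross-lingual link discovery:
# compare predicted (anchor, target_document) pairs against a gold set.
def evaluate_links(predicted: set, gold: set) -> tuple:
    """Return (precision, recall, F1) over anchor-target link pairs."""
    true_pos = len(predicted & gold)
    precision = true_pos / len(predicted) if predicted else 0.0
    recall = true_pos / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical English anchors linked to Chinese target articles.
gold = {("Mount Fuji", "zh:富士山"), ("sushi", "zh:寿司"), ("Tokyo", "zh:東京")}
predicted = {("Mount Fuji", "zh:富士山"), ("sushi", "zh:刺身"), ("Tokyo", "zh:東京")}
print(evaluate_links(predicted, gold))  # ≈ (0.67, 0.67, 0.67)
```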
Abstract:
The presence of colour in raw sugar plays a key role in the marketing strategy of the Australian raw sugar industry. Some sugars are relatively difficult to decolourise during refining and develop colour during storage. A new approach that might result in efficient and cost-effective colour removal during the sugar manufacturing process is the use of an advanced oxidation process (AOP) known as Fenton oxidation, that is, the catalytic production of hydroxyl radicals from the decomposition of hydrogen peroxide using ferrous iron. As a first step towards developing this technology, this study determined the composition of colour precursors present in the juice of cane harvested by three different methods: harvesting cane after burning, harvesting the whole crop with half of the trash extracted, and harvesting the whole crop with no trash extracted. The study also investigated the degradation at pH 3, 4 and 5 of a phenolic compound, caffeic acid (3,4-dihydroxycinnamic acid), which is present in sugar cane juice, using both hydrogen peroxide and Fenton's reagent. The results show that juice expressed from whole crop cane has significantly higher colour than juice expressed from burnt cane. However, the concentrations of phenolic acids were lower in the juices expressed from whole crop cane. The main phenolic acids present in these juices were p-coumaric, vanillic, 2,3-dihydroxybenzoic, gallic and 3,4-dihydroxybenzoic acids. The degradation of caffeic acid improved significantly when Fenton's reagent was used in comparison to hydrogen peroxide alone. The Fenton oxidation was optimal at pH 5, where up to ~86% of caffeic acid degraded within 5 min.
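For reference, the hydroxyl-radical generation step invoked above is the textbook Fenton reaction (standard chemistry, stated here for context rather than taken from the abstract):

```latex
\mathrm{Fe^{2+} + H_2O_2 \;\longrightarrow\; Fe^{3+} + {}^{\bullet}OH + OH^{-}}
```

The hydroxyl radical is a strong, non-selective oxidant, which is what makes Fenton oxidation a candidate for degrading organic colour precursors such as caffeic acid.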
Abstract:
Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics, and when used in conjunction with comparative genomics they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcriptional regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Of chief interest was the relationship observed between promoter strength and transcription factors grouped by their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggests support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Although the observations were specific to σ70, such suggestive results strongly encourage additional investigation when more experimentally confirmed data become available.
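A minimal sketch of the kind of one-sided two-sample t-test described above, using hypothetical promoter-strength scores grouped by the regulatory role of their associated binding sites (the data, group sizes, and score scale are all illustrative assumptions):

```python
# One-sided two-sample t-test: are activator-associated promoters weaker?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical promoter-strength scores (e.g. similarity to the sigma-70 consensus)
activator_assoc = rng.normal(loc=0.45, scale=0.10, size=30)  # only activator TFBSs
repressor_assoc = rng.normal(loc=0.52, scale=0.10, size=30)  # only repressor TFBSs

# H1: activator-associated promoters have lower strength scores
t_stat, p_value = stats.ttest_ind(activator_assoc, repressor_assoc,
                                  alternative='less')
print(f"t = {t_stat:.3f}, one-sided p = {p_value:.4f}")
# A p-value below 0.1 would support the hypothesis at the 0.1 significance level.
```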
Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival.
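Returning to the SVM classifier discussed at the start of this passage: a generic k-mer spectrum feature map with a linear SVM might look like the sketch below (sequences, k, and labels are hypothetical; this is an illustration of the technique, not the thesis's actual pipeline). A linear kernel on spectrum counts is exactly the k-spectrum kernel; for large k, explicit feature maps are usually replaced by suffix-tree or hashing tricks.

```python
# Minimal spectrum-kernel-style classifier for binding-site candidates.
# The k-mer spectrum maps each sequence to counts of all length-k substrings;
# a linear kernel on these counts is equivalent to the spectrum kernel.
from itertools import product
import numpy as np
from sklearn.svm import SVC

K = 3
ALPHABET = "ACGT"
KMER_INDEX = {"".join(p): i for i, p in enumerate(product(ALPHABET, repeat=K))}

def spectrum(seq: str) -> np.ndarray:
    """Count every overlapping k-mer in seq (the spectrum feature map)."""
    counts = np.zeros(len(KMER_INDEX))
    for i in range(len(seq) - K + 1):
        kmer = seq[i:i + K]
        if kmer in KMER_INDEX:          # skip ambiguous bases such as 'N'
            counts[KMER_INDEX[kmer]] += 1
    return counts

# Hypothetical training data: candidate sites labelled 1 (true site) or 0.
seqs = ["TGTGATCTAGATCACA", "TGTGAGTTAGCTCACA",
        "GCGCGGCCGCGCCGGC", "ATATATATTATATATA"]
labels = [1, 1, 0, 0]

X = np.array([spectrum(s) for s in seqs])
clf = SVC(kernel="linear").fit(X, labels)  # linear kernel on spectra = spectrum kernel
print(clf.predict([spectrum("TGTGATCGAGATCACA")]))
```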
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity; and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
Abstract:
"Seventeen peer-reviewed papers cover the latest research on the ignition and combustion of metals and non-metals, oxygen compatibility of components and systems, analysis of ignition and combustion, failure analysis and safety. It includes aerospace, military, scuba diving, and industrial oxygen applications. Topics cover: • Development of safe oxygen systems • Ignition mechanisms within oxygen systems and how to avoid them • Specific hazards that exist with the oxygen mixture breathed by divers in the scuba industry • Issues related to oxygen system level safety • Issues related to oxygen safety in breathing systems • Detailed investigations and discussions related to the burn curves that have been generated for metals that are burning in a standard test fixture This new publication is a valuable resource for professionals in the air separation industries, oxygen manufacturers, manufacturers of materials intended for oxygen service, and users of oxygen and oxygen-enriched atmospheres, including aerospace, medical, industrial gases, chemical processing, steel and metals refining, as well as to military, commercial or recreational diving."--- publisher website
Abstract:
In a previous study we found evidence for an X-linked genetic component for familial typical migraine in two large Australian white pedigrees, designated MF7 and MF14. Significant excess allele sharing was indicated by nonparametric linkage (NPL) analysis using GENEHUNTER (P=0.031 and P=0.012, respectively), with a combined analysis of the two pedigrees showing further increased evidence for linkage, producing a maximum NPL score of 2.87 (P=0.011) at DXS1123 on Xq27. The present study was aimed at refining the localization of the migraine X-chromosomal component by typing additional markers, performing haplotype analysis and applying a more powerful technique in the analysis of linkage data from these two pedigrees. Results from the haplotype analyses, coupled with linkage analyses that produced a peak GENEHUNTER-PLUS LOD* score of 2.388 (P=0.0005), provide compelling evidence for the presence of a migraine susceptibility locus on chromosome Xq24-28.
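For readers unfamiliar with the statistic, the LOD score reported above follows the standard parametric linkage definition (the textbook form; the GENEHUNTER-PLUS LOD* statistic is a refinement of this idea, not reproduced here):

```latex
\mathrm{LOD}(\theta) \;=\; \log_{10}\,\frac{L(\text{pedigree data} \mid \theta)}{L(\text{pedigree data} \mid \theta = 0.5)}
```

where θ is the recombination fraction between the marker and the putative trait locus, and θ = 0.5 corresponds to no linkage.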
Abstract:
Background: Alcohol is a major contributor to road crashes in China (Li, Xie, Nie, & Zhang, 2012; Cochrane & Chen, 2003). Two levels of offence are defined in legislation: the lower level is driving under the influence (DUI, also translated as "drink driving") and the higher level is driving while intoxicated (DWI, also translated as "drunk driving", where the driver has BAC ≥ 80 mg/100 ml). This study focuses on a 2011 legislative amendment that made drunk driving (DWI) a criminal offence. However, it is not known whether drivers are aware of the law, and whether this knowledge, their exposure to enforcement and the existence of alcohol use disorders relate to their drink driving behaviour. This study explored these relationships in a sample of convicted drunk drivers. Method: A survey collected information about offenders' knowledge and practices related to drunk driving in Guangzhou. The Alcohol Use Disorders Identification Test (AUDIT) (Babor & Grant, 1989; Chen & Cheng, 2005) assessed hazardous drinking levels. In total, 101 drunk driving offenders were recruited while in detention. Results: Males represented 90% of the sample; the average age was 33.6 years (SD=8.7; range 17-59 years). The average age at which offenders reported starting to drink alcohol was 19.5 years (SD=4.1; range 8-30 years). Driver's licences had been held for a median of 7 years. Knowledge about the legal limits for DUI and DWI offences was surprisingly low, at 27.7% and 40.6%, respectively. On average, offenders had experienced 1.5 police alcohol breath tests in the previous year (SD=1.3; range 1-10). AUDIT scores indicated that a substantial proportion of the offenders had high levels of alcohol use disorders. Higher AUDIT scores were found among the least experienced drivers, those lacking knowledge of the legal limits, and recidivist drunk drivers. Discussion and conclusions: Limited awareness of legal alcohol limits might contribute to offending; the high levels of alcohol consumption by many offenders suggest that hazardous drinking also contributes. Novice drivers are a concern and their higher AUDIT scores merit follow-up. Overall, this study provides important information to assist in refining community education and prevention efforts to align with China's new regulations.
Abstract:
Lesson studies are a powerful form of professional development (Doig and Groves, 2011). The processes of creating, enacting, analyzing, and refining lessons to improve teaching practices are key components of lesson studies. Lesson studies have been the primary form of professional development in Japanese classrooms for many years (Lewis, Perry and Hurd, 2009). This model is now used to improve instruction in many South-East Asian countries (White and Lim, 2008), as well as, increasingly, in North America (Lesson Study Research Group, 2004) and South Africa (Ono and Ferreira, 2010). In China, this form of professional development aimed at improving teaching has also been adopted; it originated from Soviet models of teacher professional development introduced after 1949 (China Education Yearbook, 1986). Thus, China too has a long history of improving teaching and learning through this form of school-based professional learning.
Abstract:
Climate change is affecting and will increasingly influence human health and wellbeing. Children are particularly vulnerable to the impact of climate change. An extensive literature review regarding the impact of climate change on children's health was conducted in April 2012 by searching the electronic databases PubMed, Scopus, ProQuest, ScienceDirect, and Web of Science, as well as relevant websites, such as those of the IPCC and WHO. Climate change affects children's health through increased air pollution, more weather-related disasters, more frequent and intense heat waves, decreased water quality and quantity, food shortage and greater exposure to toxicants. As a result, children experience greater risk of mental disorders, malnutrition, infectious diseases, allergic diseases and respiratory diseases. Mitigation measures, such as reducing carbon pollution emissions, and adaptation measures, such as early warning systems and post-disaster counseling, are strongly needed. Future health research should focus on: (1) identifying whether the impacts of climate change on children will be modified by gender, age and socioeconomic status; (2) refining outcome measures of children's vulnerability to climate change; (3) projecting children's disease burden under climate change scenarios; (4) exploring children's disease burden related to climate change in low-income countries; and (5) identifying the most cost-effective mitigation and adaptation actions from a children's health perspective.
Abstract:
BACKGROUND There is a growing volume of open-source 'education material' on energy efficiency now available; however, the Australian government has identified a need to increase the use of such materials in undergraduate engineering education. Furthermore, there is a reported need to rapidly equip engineering graduates with capabilities in conducting energy efficiency assessments, to improve energy performance across major sectors of the economy. In January 2013, building on several years of preparatory action-research initiatives, the former Department of Industry, Innovation, Climate Change, Science, Research and Tertiary Education (DIICCSRTE) offered $600,000 to develop resources for energy efficiency related graduate attributes, targeting Engineers Australia college disciplines, accreditation requirements and opportunities to address such requirements. PURPOSE This paper discusses a successful $430,000 bid by a university consortium led by QUT and including RMIT, UA, UOW, and VU, to design and pilot several innovative, targeted open-source resources for curriculum renewal related to energy efficiency assessments in Australian engineering programs (2013-2014), including 'flat-pack', 'media-bites', 'virtual reality' and 'deep dive' case study initiatives. DESIGN/METHOD The paper draws on a literature review and lessons learned by the consortium partners in resource development over the last several years to discuss methods for selecting key graduate attributes and providing targeted resources, supporting materials, and innovative delivery options to assist universities in delivering the knowledge and skills needed to develop such attributes. This includes strategic engagement with industry and key stakeholders. The paper also discusses processes for piloting, validating, peer reviewing, and refining these resources using a rigorous and repeatable approach to engaging with academic and industry colleagues. RESULTS The paper provides an example of innovation in resource development through an engagement strategy that takes advantage of existing networks, initiatives, and funding arrangements, while informing program accreditation requirements, to produce a cost-effective plan for rapid integration of energy efficiency within education. By the conference, stakeholder workshops will be complete, and resources will be in the process of being drafted, building on findings from the stakeholder engagement workshops. Reporting on this project 'in progress' provides a significant opportunity to share lessons learned and take on board feedback and input. CONCLUSIONS This paper provides a useful reference document for others considering significant resource development in a consortium approach, summarising benefits and challenges. The paper also provides a basis for documenting the second half of the project, which comprises piloting resources and producing a 'good practice guide' for energy efficiency related curriculum renewal.
Abstract:
The endothelins and their associated receptors are important controllers of vascular growth, inflammation and vascular tone. In cancer, they have roles in the control of numerous factors in cancer development and progression, including angiogenesis, stromal reaction, epithelial-mesenchymal transitions, apoptosis, invasion, metastases and drug resistance. We also consider current information on the role of this signalling system in cancer and examine the state of current cell, animal and clinical trials utilizing endothelin-targeted drugs for cancer management. Although targeting the endothelin axis in cell lines and xenografts shows some promise in retarding cellular growth, results from limited clinical trials in prostatic cancer are less encouraging and have not offered a significant survival benefit. The ability to target both cancer cells and vasculature via endothelin is an important consideration that necessitates further refinement of therapeutic strategies as we continue to explore the possibilities of the endothelin axis in cancer treatment.
Abstract:
'Approximate Bayesian Computation' (ABC) represents a powerful methodology for the analysis of complex stochastic systems for which the likelihood of the observed data under an arbitrary set of input parameters may be entirely intractable – the latter condition rendering useless the standard machinery of tractable likelihood-based, Bayesian statistical inference [e.g. conventional Markov chain Monte Carlo (MCMC) simulation]. In this paper, we demonstrate the potential of ABC for astronomical model analysis by application to a case study in the morphological transformation of high-redshift galaxies. To this end, we develop, first, a stochastic model for the competing processes of merging and secular evolution in the early Universe, and secondly, through an ABC-based comparison against the observed demographics of massive (Mgal > 10^11 M⊙) galaxies (at 1.5 < z < 3) in the Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS)/Extended Groth Strip (EGS) data set, we derive posterior probability densities for the key parameters of this model. The 'Sequential Monte Carlo' implementation of ABC exhibited herein, featuring both a self-generating target sequence and a self-refining MCMC kernel, is amongst the most efficient of contemporary approaches to this important statistical algorithm. We highlight as well, through our chosen case study, the value of careful summary statistic selection, and demonstrate two modern strategies for assessment and optimization in this regard. Ultimately, our ABC analysis of the high-redshift morphological mix returns tight constraints on the evolving merger rate in the early Universe and favours major merging (with disc survival or rapid reformation) over secular evolution as the mechanism most responsible for building up the first generation of bulges in early-type discs.
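For readers unfamiliar with ABC, a minimal rejection-ABC sketch illustrates the core idea of replacing likelihood evaluation with simulation and distance comparison. This is not the paper's Sequential Monte Carlo implementation; the toy model, summary statistic, prior, and tolerance are all hypothetical:

```python
# Toy rejection-ABC: infer a merger-rate-like parameter of a stochastic
# simulator with an intractable likelihood by keeping parameter draws whose
# simulated summary statistics fall within a tolerance of the observed ones.
import numpy as np

rng = np.random.default_rng(1)

def simulate(rate: float, n: int = 200) -> np.ndarray:
    """Hypothetical stochastic model: counts of merger events per galaxy."""
    return rng.poisson(rate, size=n)

def summary(x: np.ndarray) -> float:
    """Summary statistic: here simply the sample mean."""
    return x.mean()

observed = simulate(rate=1.7)           # stand-in for the real survey data
s_obs = summary(observed)

tolerance = 0.05
accepted = []
for _ in range(20000):
    theta = rng.uniform(0.0, 5.0)       # draw from a flat prior on the rate
    if abs(summary(simulate(theta)) - s_obs) < tolerance:
        accepted.append(theta)          # keep draws that reproduce the data

posterior = np.array(accepted)
print(f"posterior mean ≈ {posterior.mean():.2f}, n_accepted = {posterior.size}")
```

ABC-SMC improves on this by propagating a population of accepted draws through a sequence of shrinking tolerances with a perturbation kernel, which is far more sample-efficient than plain rejection.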
Abstract:
Castration is the standard therapy for advanced prostate cancer (PC). Although this treatment is initially effective, tumors invariably relapse as incurable, castration-resistant PC (CRPC). Adaptation of androgen-dependent PC cells to an androgen-depleted environment, or selection of pre-existing CRPC cells, have been proposed as mechanisms of CRPC development. Stem cell (SC)-like PC cells have been implicated not only as tumor-initiating/maintaining cells in PC but also as tumor-reinitiating cells in CRPC. Recently, castration-resistant cells expressing NK3 homeobox 1 (Nkx3-1) (CARNs), together with the other luminal markers cytokeratin 18 (CK18) and androgen receptor (AR), and possessing SC properties, have been found in castrated mouse prostate and proposed as the cell-of-origin of CRPC. However, the human counterpart of CARNs has not yet been identified. Here, we demonstrate that in the human PC xenograft BM18, pre-existing SC-like and neuroendocrine (NE) PC cells are selected by castration and survive in a totally quiescent state. SC-like BM18 cells, displaying the SC markers aldehyde dehydrogenase 1A1 or NANOG, coexpress the luminal markers NKX3-1 and CK18 and a low level of AR (ARlow), but not basal or NE markers. These castration-resistant luminal SC-like cells, but not NE cells, reinitiate BM18 tumor growth after androgen replacement. The ARlow phenotype seems to directly mediate both castration survival and tumor reinitiation. This study identifies, for the first time in human PC, SC-/CARN-like cells that may represent the cell-of-origin of tumor reinitiation as CRPC. This finding will be fundamental for refining the hierarchy among human PC cells and may have important clinical implications.
Abstract:
A new mesh adaptivity algorithm that combines a posteriori error estimation with a bubble-type local mesh generation (BLMG) strategy for elliptic differential equations is proposed. The size function used in the BLMG is defined on each vertex during the adaptive process, based on the obtained error estimator. In order to avoid excessive coarsening and refining in each iterative step, two factor thresholds are introduced in the size function. The advantages of the BLMG-based adaptive finite element method, compared with other known methods, are as follows: refining and coarsening are handled fluently within the same framework; the local a posteriori error estimation is easy to implement through the adjacency list of the BLMG method; and at all levels of refinement the updated triangles remain very well shaped, even if the mesh size at any particular refinement level varies by several orders of magnitude. Several numerical examples with singularities for the elliptic problems, where explicit error estimators are used, verify the efficiency of the algorithm. The analysis of the parameters introduced in the size function shows that the algorithm has good flexibility.
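A minimal sketch of a thresholded, vertex-wise size-function update of the kind described above (the update rule, factor values, and names are illustrative assumptions, not the paper's formulation):

```python
# Sketch of a vertex-wise size-function update driven by an error estimator.
# Two factor thresholds bound how much the target mesh size may shrink
# (refine) or grow (coarsen) in a single adaptive iteration.
import numpy as np

REFINE_FACTOR = 0.5   # lower bound on the size ratio: refine at most 2x per step
COARSEN_FACTOR = 2.0  # upper bound on the size ratio: coarsen at most 2x per step

def update_size_function(h: np.ndarray, eta: np.ndarray) -> np.ndarray:
    """Return new target sizes per vertex from current sizes h and
    vertex-wise a posteriori error estimates eta."""
    target = h * np.sqrt(eta.mean() / eta)   # shrink where the error is large
    ratio = np.clip(target / h, REFINE_FACTOR, COARSEN_FACTOR)
    return h * ratio

h = np.full(5, 0.1)                          # current vertex sizes
eta = np.array([0.4, 0.1, 0.01, 0.02, 0.1])  # error estimator per vertex
print(update_size_function(h, eta))
```

Clamping the per-step ratio is what prevents the oscillation between over-refinement and over-coarsening that the thresholds in the abstract are introduced to avoid.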