913 results for orders of worth
Abstract:
The results of a numerical investigation into the errors for least squares estimates of function gradients are presented. The underlying algorithm is obtained by constructing a least squares problem using a truncated Taylor expansion. An error bound associated with this method contains in its numerator terms related to the Taylor series remainder, while its denominator contains the smallest singular value of the least squares matrix. Perhaps for this reason the error bounds are often found to be pessimistic by several orders of magnitude. The circumstance under which these poor estimates arise is elucidated and an empirical correction of the theoretical error bounds is conjectured and investigated numerically. This is followed by an indication of how the conjecture is supported by a rigorous argument.
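For illustration only, a minimal Python sketch of this kind of least-squares gradient estimate, assuming a first-order truncated Taylor expansion about a point x0 and scattered samples nearby (the function names and test function are hypothetical, not taken from the paper); the quantity that appears in the denominator of the error bound corresponds to the smallest singular value of the displacement matrix A:

```python
import numpy as np

def ls_gradient(x0, f0, xs, fs):
    """Least-squares gradient estimate from a first-order truncated Taylor expansion.

    Solves f(x_i) - f(x0) ~= (x_i - x0) . g for g in the least-squares sense.
    The smallest singular value of the displacement matrix A governs how
    strongly data and truncation errors are amplified in the estimate.
    """
    A = xs - x0                                   # rows are displacements x_i - x0
    b = fs - f0                                   # sampled function differences
    g, *_ = np.linalg.lstsq(A, b, rcond=None)     # least-squares gradient
    sigma_min = np.linalg.svd(A, compute_uv=False)[-1]
    return g, sigma_min

# Hypothetical test function f(x, y) = sin(x) * y, sampled near x0
f = lambda p: np.sin(p[..., 0]) * p[..., 1]
rng = np.random.default_rng(0)
x0 = np.array([0.3, 0.7])
xs = x0 + 1e-2 * rng.standard_normal((8, 2))
g, sigma_min = ls_gradient(x0, f(x0), xs, f(xs))
print(g, "exact:", [np.cos(0.3) * 0.7, np.sin(0.3)], "sigma_min:", sigma_min)
```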
Abstract:
In 2009, Religious Education is a designated key learning area in Catholic schools in the Archdiocese of Brisbane and, indeed, across Australia. Over the years, though, different conceptualisations of the nature and purpose of religious education have led to the construction of different approaches to the classroom teaching of religion. By investigating the development of religious education policy in the Archdiocese of Brisbane from 1984 to 2003, the study seeks to trace the emergence of new discourses on religious education. The study understands religious education to refer to a lifelong process that occurs through a variety of forms (Moran, 1989). In Catholic schools, it refers both to co-curricular activities, such as retreats and school liturgies, and the classroom teaching of religion. It is the policy framework for the classroom teaching of religion that this study explores. The research was undertaken using a policy case study approach to gain a detailed understanding of how new conceptualisations of religious education emerged at a particular site of policy production, in this case, the Archdiocese of Brisbane. The study draws upon Yeatman’s (1998) description of policy as occurring “when social actors think about what they are doing and why in relation to different and alternative possible futures” (p. 19) and views policy as consisting of more than texts themselves. Policy texts result from struggles over meaning (Taylor, 2004) in which specific discourses are mobilised to support particular views. The study has a particular interest in the analysis of Brisbane religious education policy texts, the discursive practices that surrounded them, and the contexts in which they arose. Policy texts are conceptualised in the study as representing “temporary settlements” (Gale, 1999). Such settlements are asymmetrical, temporary and dependent on context: asymmetrical in that dominant actors are favoured; temporary because dominant actors are always under challenge by other actors in the policy arena; and context-dependent because new situations require new settlements. To investigate the official policy documents, the study used Critical Discourse Analysis (hereafter referred to as CDA) as a research tool that affords the opportunity for researchers to map and chart the emergence of new discourses within the policy arena. As developed by Fairclough (2001), CDA is a three-dimensional application of critical analysis to language. In the Brisbane religious education arena, policy texts formed a genre chain (Fairclough, 2004; Taylor, 2004) which was a focus of the study. There are two features of texts that form genre chains: texts are systematically linked to one another; and, systematic relations of recontextualisation exist between the texts. Fairclough’s (2005) concepts of “imaginary space” and “frameworks for action” (p. 65) within the policy arena were applied to the Brisbane policy arena to investigate the relationship between policy statements and subsequent guidelines documents. Five key findings emerged from the study. First, application of CDA to policy documents revealed that a fundamental reconceptualisation of the nature and purpose of classroom religious education in Catholic schools occurred in the Brisbane policy arena over the last twenty-five years. Second, a disjuncture existed between catechetical discourses that continued to shape religious education policy statements, and educational discourses that increasingly shaped guidelines documents.
Third, recontextualisation between policy documents was evident and dependent on the particular context in which religious education occurred. Fourth, at subsequent links in the chain, actors created their own “imaginary space”, thereby altering orders of discourse within the policy arena, with different actors being either foregrounded or marginalised. Fifth, intertextuality was more evident in the later links in the genre chain (i.e. 1994 policy statement and 1997 guidelines document) than in earlier documents. On the basis of the findings of the study, six recommendations are made. First, the institutional Church should carefully consider the contribution that the Catholic school can make to the overall pastoral mission of the diocese in twenty-first century Australia. Second, policymakers should articulate a nuanced understanding of the relationship between catechesis and education with regard to the religion classroom. Third, there should be greater awareness of the connections among policies relating to Catholic schools – especially the connection between enrolment policy and religious education policy. Fourth, there should be greater consistency between policy documents. Fifth, policy documents should be helpful for those to whom they are directed (i.e. Catholic schools, teachers). Sixth, “imaginary space” (Fairclough, 2005) in policy documents needs to be constructed in a way that allows for multiple “frameworks for action” (Fairclough, 2005) through recontextualisation. The findings of this study are significant in a number of ways. For religious educators, the study highlights the need to develop a shared understanding of the nature and purpose of classroom religious education. It argues that this understanding must take into account the multifaith nature of Australian society and the changing social composition of Catholic schools themselves. Greater recognition should be given to the contribution that religious studies courses such as Study of Religion make to the overall religious development of a person. In view of the social composition of Catholic schools, there is also an issue of ecclesiological significance concerning the conceptualisation of the relationship between the institutional Catholic Church and Catholic schools. Finally, the study is of significance because of its application of CDA to religious education policy documents. Use of CDA reveals the foregrounding, marginalising, or excluding of various actors in the policy arena.
Abstract:
There is an urgent need to develop safe, effective, dual-purpose contraceptive agents that combine the prevention of pregnancy with protection against sexually transmitted diseases. Here we report the identification of a group of compounds that on contact with human spermatozoa induce a state of “spermostasis,” characterized by the extremely rapid inhibition of sperm movement without compromising cell viability. These spermostatic agents were more active and significantly less toxic than the reagent in current clinical use, nonoxynol 9, giving therapeutic indices (ratio of spermostatic to cytotoxic activity) that were orders of magnitude greater than this traditional spermicide. Although certain compounds could trigger reactive oxygen species generation by spermatozoa, this activity was not correlated with spermostasis. Rather, the latter was associated with alkylation of two major sperm tail proteins that were identified as A Kinase-Anchoring Proteins (AKAP3 and AKAP4) by mass spectrometry. As a consequence of disrupted AKAP function, the abilities of cAMP to drive protein kinase A-dependent activities in the sperm tail, such as the activation of SRC and the consequent stimulation of tyrosine phosphorylation, were suppressed. Furthermore, analysis of microbicidal activity using Chlamydia muridarum revealed powerful inhibitory effects at the same low micromolar doses that suppressed sperm movement. In this case, the microbicidal action was associated with alkylation of Major Outer Membrane Protein (MOMP), a major chlamydial membrane protein. Taken together, these results have identified for the first time a novel set of cellular targets and chemical principles capable of providing simultaneous defense against both fertility and the spread of sexually transmitted disease.
Abstract:
The performance and electron recombination kinetics of dye-sensitized solar cells based on TiO2 films consisting of one-dimensional nanorod arrays (NR-DSSCs), sensitized with the dyes N719, C218 and D205 respectively, have been studied. It has been found that the best efficiency is obtained with the dye C218 based NR-DSSCs, benefiting from a 40% higher short-circuit photocurrent density. However, the open circuit photovoltage of the N719 based cell is 40 mV higher than that of the organic dye C218 and D205 based devices. Investigation of the electron recombination kinetics of the NR-DSSCs has revealed that the effective electron lifetime, τn, of the N719 based NR-DSSC is the lowest whereas the τn of the C218 based NR-DSSC is the highest among the three dyes. The higher Voc of the N719 based NR-DSSC originates from the more negative energy level of the conduction band of the TiO2 film. In addition, in comparison to DSSCs with conventional nanocrystalline particle based TiO2 films, the NR-DSSCs have shown over two orders of magnitude higher τn when employing N719 as the sensitizer. Nevertheless, the τn of the DSSCs with the C218 based nanorod arrays is only ten-fold higher than that of the nanoparticle based devices. The remarkable characteristic of the dye C218 in suppressing the electron recombination of DSSCs is discussed.
Practical improvements to simultaneous computation of multi-view geometry and radial lens distortion
Abstract:
This paper discusses practical issues related to the use of the division model for lens distortion in multi-view geometry computation. A data normalisation strategy is presented, which has been absent from previous discussions on the topic. The convergence properties of the Rectangular Quadric Eigenvalue Problem solution for computing division model distortion are examined. It is shown that the existing method can require more than 1000 iterations when dealing with severe distortion. A method is presented for accelerating convergence to less than 10 iterations for any amount of distortion. The new method is shown to produce equivalent or better results than the existing method with up to two orders of magnitude reduction in iterations. Through detailed simulation it is found that the number of data points used to compute geometry and lens distortion has a strong influence on convergence speed and solution accuracy. It is recommended that more than the minimal number of data points be used when computing geometry using a robust estimator such as RANSAC. Adding two to four extra samples improves the convergence rate and accuracy sufficiently to compensate for the increased number of samples required by the RANSAC process.
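As background to the method discussed above, a short Python sketch of the single-parameter division model together with a typical data-normalisation step (centring and isotropic scaling); the normalisation constants, point set and distortion parameter are illustrative assumptions, and the paper's Rectangular Quadric Eigenvalue Problem solver is not reproduced here:

```python
import numpy as np

def normalise(points):
    """Centre the points and scale them so the mean distance from the
    origin is sqrt(2), a common normalisation in multi-view geometry."""
    centroid = points.mean(axis=0)
    centred = points - centroid
    scale = np.sqrt(2) / np.mean(np.linalg.norm(centred, axis=1))
    return centred * scale, centroid, scale

def undistort_division(points, lam):
    """Single-parameter division model: x_u = x_d / (1 + lam * |x_d|^2),
    with points expressed relative to the distortion centre."""
    r2 = np.sum(points ** 2, axis=1, keepdims=True)
    return points / (1.0 + lam * r2)

# Illustrative use with hypothetical image points and distortion parameter
pts = np.array([[320.0, 240.0], [100.0, 50.0], [600.0, 400.0]])
norm_pts, centroid, scale = normalise(pts)
print(undistort_division(norm_pts, lam=-0.1))
```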
Abstract:
Due to their large surface area, complex chemical composition and high alveolar deposition rate, ultrafine particles (UFPs) (< 0.1 µm) pose a significant risk to human health and their toxicological effects have been acknowledged by the World Health Organisation. Since people spend most of their time indoors, there is a growing concern about the UFPs present in some indoor environments. Recent studies have shown that office machines, in particular laser printers, are a significant indoor source of UFPs. The majority of printer-generated UFPs are organic carbon and it is unlikely that these particles are emitted directly from the printer or its supplies (such as paper and toner powder). Thus, it was hypothesised that these UFPs are secondary organic aerosols (SOA). Considering the widespread use of printers and human exposure to these particles, understanding the processes involved in particle formation is of critical importance. However, few studies have investigated the nature (e.g. volatility, hygroscopicity, composition, size distribution and mixing state) and formation mechanisms of these particles. In order to address this gap in scientific knowledge, a comprehensive study including state-of-the-art instrumental methods was conducted to characterise the real-time emissions from modern commercial laser printers, including particles, volatile organic compounds (VOCs) and ozone (O3). The morphology, elemental composition, volatility and hygroscopicity of generated particles were also examined. The large set of experimental results was analysed and interpreted to provide insight into: (1) Emissions profiles of laser printers: The results showed that UFPs dominated the number concentrations of generated particles, with a quasi-unimodal size distribution observed for all tests. These particles were volatile, non-hygroscopic and mixed both externally and internally. Particle microanalysis indicated that semi-volatile organic compounds occupied the dominant fraction of these particles, with only trace quantities of particles containing Ca and Fe. Furthermore, almost all laser printers tested in this study emitted measurable concentrations of VOCs and O3. A positive correlation between submicron particles and O3 concentrations, as well as a contrasting negative correlation between submicron particles and total VOC concentrations were observed during printing for all tests. These results proved that UFPs generated from laser printers are mainly SOAs. (2) Sources and precursors of generated particles: In order to identify the possible particle sources, particle formation potentials of both the printer components (e.g. fuser roller and lubricant oil) and supplies (e.g. paper and toner powder) were investigated using furnace tests. The VOCs emitted during the experiments were sampled and identified to provide information about particle precursors. The results suggested that all of the tested materials had the potential to generate particles upon heating. Nine unsaturated VOCs were identified from the emissions produced by paper and toner, which may contribute to the formation of UFPs through oxidation reactions with ozone. (3) Factors influencing the particle emission: The factors influencing particle emissions were also investigated by comparing two popular laser printers, one showing particle emissions three orders of magnitude higher than the other.
The effects of toner coverage, printing history, type of paper and toner, and working temperature of the fuser roller on particle number emissions were examined. The results showed that the temperature of the fuser roller was a key factor driving the emission of particles. Based on the results for 30 different types of laser printers, a systematic positive correlation was observed between temperature and particle number emissions for printers that used the same heating technology and had a similar structure and fuser material. It was also found that temperature fluctuations were associated with intense bursts of particles and therefore may have an impact on particle emissions. Furthermore, the results indicated that the type of paper and toner powder contributed to particle emissions, while no apparent relationship was observed between toner coverage and levels of submicron particles. (4) Mechanisms of SOA formation, growth and ageing: The overall hypothesis that UFPs are formed by reactions between the VOCs and O3 emitted from laser printers was examined. The results proved this hypothesis and suggested that O3 may also play a role in particle ageing. In addition, knowledge about the mixing state of generated particles was utilised to explore the detailed processes of particle formation for different printing scenarios, including warm-up, normal printing, and printing without toner. The results indicated that polymerisation may have occurred on the surface of the generated particles to produce thermoplastic polymers, which may account for the expandable characteristics of some particles. Furthermore, toner and other particle residues on the idling belt from previous print jobs were a very clear contributing factor in the formation of laser printer-emitted particles. In summary, this study not only improves scientific understanding of the nature of printer-generated particles, but also provides significant insight into the formation and ageing mechanisms of SOAs in the indoor environment. The outcomes will also be beneficial to governments, industry and individuals.
Abstract:
Purpose This chapter investigates an episode where a supervising teacher on playground duty asks two boys to each give an account of their actions over an incident that had just occurred on some climbing equipment in the playground. Methodology This paper employs an ethnomethodological approach using conversation analysis. The data are taken from a corpus of video recorded interactions of children, aged 7-9 years, and the teacher, in school playgrounds during the lunch recess. Findings The findings show the ways that children work up accounts of their playground practices when asked by the teacher. The teacher initially provided interactional space for each child to give their version of the events. Ultimately, the teacher’s version of how to act in the playground became the sanctioned one. The children and the teacher formulated particular social orders of behavior in the playground through multi-modal devices, direct reported speech and scripts. Such public displays of talk work as socialization practices that frame teacher-sanctioned morally appropriate actions in the playground. Value of paper This chapter shows the pervasiveness of the teacher’s social order, as she presented an institutional social order of how to interact in the playground, showing clearly the disjunction of adult-child orders between the teacher and children.
Abstract:
We constructed a novel autonomously replicating gene expression shuttle vector, with the aim of developing a system for transiently expressing proteins at levels useful for commercial production of vaccines and other proteins in plants. The vector, pRIC, is based on the mild strain of the geminivirus Bean yellow dwarf virus (BeYDV-m) and is replicationally released into plant cells from a recombinant Agrobacterium tumefaciens Ti plasmid. pRIC differs from most other geminivirus-based vectors in that the BeYDV replication-associated elements were included in cis rather than from a co-transfected plasmid, while the BeYDV capsid protein (CP) and movement protein (MP) genes were replaced by an antigen encoding transgene expression cassette derived from the non-replicating A. tumefaciens vector, pTRAc. We tested vector efficacy in Nicotiana benthamiana by comparing transient cytoplasmic expression between pRIC and pTRAc constructs encoding either enhanced green fluorescent protein (EGFP) or the subunit vaccine antigens, human papillomavirus subtype 16 (HPV-16) major CP L1 and human immunodeficiency virus subtype C p24 antigen. The pRIC constructs were amplified in planta by up to two orders of magnitude by replication, while 50% more HPV-16 L1 and three- to seven-fold more EGFP and HIV-1 p24 were expressed from pRIC than from pTRAc. Vector replication was shown to be correlated with increased protein expression. We anticipate that this new high-yielding plant expression vector will contribute towards the development of a viable plant production platform for vaccine candidates and other pharmaceuticals. © 2009 Blackwell Publishing Ltd.
Abstract:
The present paper presents and discusses the use of different codes regarding the numerical simulation of a radial-inflow turbine. A radial-inflow turbine test case was selected from published literature [1] and commercial codes (Fluent and CFX) were used to perform the steady-state numerical simulations. An in-house compressible-flow simulation code, Eilmer3 [2], was also adapted in order to make it suitable to perform turbomachinery simulations and preliminary results are presented and discussed. The code itself as well as its adaptation, comprising the addition of terms for the rotating frame of reference, programmable boundary conditions for periodic boundaries and a mixing plane interface between the rotating and non-rotating blocks are also discussed. Several cases with different orders of complexity in terms of geometry were considered and the results were compared across the different codes. The agreement between these results and published data is also discussed.
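For reference, the extra momentum source terms required when a block is solved in a rotating frame of reference take the standard textbook form (this is the generic expression, not necessarily the exact implementation added to Eilmer3):

$$\mathbf{S}_{\text{rot}} = -\rho\left[\,2\,\boldsymbol{\Omega}\times\mathbf{u} + \boldsymbol{\Omega}\times\left(\boldsymbol{\Omega}\times\mathbf{r}\right)\right],$$

where $\boldsymbol{\Omega}$ is the angular velocity of the rotor, $\mathbf{u}$ the velocity relative to the rotating frame and $\mathbf{r}$ the position vector; the two bracketed terms are the Coriolis and centrifugal contributions.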
Abstract:
The Analytical Electron Microscope (AEM), with which secondary X-ray emission from a thin (<150nm), electron-transparent material is measured, has rapidly become a versatile instrument for qualitative and quantitative elemental analyses of many materials, including minerals. With due regard for sources of error in experimental procedures, it is possible to obtain high spatial resolution (~20nm diameter) and precise elemental analyses (~3% to 5% relative) from many silicate minerals. In addition, by utilizing the orientational dependence of X-ray emission for certain multi-substituted crystal structures, site occupancies for individual elements within a unit cell can be determined though with lower spatial resolution. The relative ease with which many of these compositional data may be obtained depends in part on the nature of the sample, but, in general, is comparable to other solid state analytical techniques such as X-ray diffraction and electron microprobe analysis. However, the improvement in spatial resolution obtained with the AEM (up to two orders of magnitude in analysis diameter) significantly enhances interpretation of fine-grained assemblages in many terrestrial or extraterrestrial rocks.
Abstract:
The steady problem of free surface flow due to a submerged line source is revisited for the case in which the fluid depth is finite and there is a stagnation point on the free surface directly above the source. Both the strength of the source and the fluid speed in the far field are measured by a dimensionless parameter, the Froude number. By applying techniques in exponential asymptotics, it is shown that there is a train of periodic waves on the surface of the fluid with an amplitude which is exponentially small in the limit that the Froude number vanishes. This study clarifies that periodic waves do form for flows due to a source, contrary to a suggestion by Chapman & Vanden-Broeck (2006, J. Fluid Mech., 567, 299--326). The exponentially small nature of the waves means they appear beyond all orders of the original power series expansion; this result explains why attempts at describing these flows using a finite number of terms in an algebraic power series incorrectly predict a flat free surface in the far field.
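Schematically, with $\epsilon$ denoting the small parameter (a power of the Froude number), a quantity on the free surface behaves like

$$q(\epsilon) \sim \sum_{n=0}^{N} a_n \epsilon^{n} \;+\; A\,\epsilon^{-\alpha} e^{-\chi/\epsilon}, \qquad \epsilon \to 0^{+},$$

where $A$, $\alpha$ and $\chi>0$ are constants; since $e^{-\chi/\epsilon} = o(\epsilon^{n})$ for every fixed $n$, the oscillatory second term is invisible at any algebraic order, which is the sense in which the waves lie "beyond all orders". (This is the generic shape of an exponential-asymptotics result, not the specific expansion derived in the paper.)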
Abstract:
Purpose – This chapter examines an episode of pretend play amongst a group of young girls in an elementary school in Australia, highlighting how they interact within the membership categorization device ‘family’ to manage their social and power relationships. Approach – Using conversation analysis and membership categorization analysis, an episode of video-recorded interaction that occurs amongst a group of four young girls is analyzed. Findings – As disputes arise amongst the girls, the mother category is produced as authoritative through authoritative actions by the girl in the category of mother, and displays of subordination on the part of the other children, in the categories of sister, dog and cat. Value of paper – Examining play as a social practice provides insight into the social worlds of children. The analysis shows how the children draw upon and co-construct family-style relationships in a pretend play context, in ways that enable them to build and organize peer interaction. Authority is highlighted as a joint accomplishment that is part of the social and moral order continuously being negotiated by the children. The authority of the mother category is produced and oriented to as a means of managing the disputes within the pretend frame of play.
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results when compared to the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results acquired using Monte Carlo simulation, however, often require orders of magnitude more calculation time to attain high precision, thereby reducing its utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high performance computing environments and simpler, alternative yet equivalent, representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with n cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer-aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalent in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to three orders of magnitude performance improvement with the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and present a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan, and representing them in a mesh-based form similar to those used in computer-aided design, the above-mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling motion augmentation for time-dependent dose calculation, for example. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulations, like the ones made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
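As a toy illustration of why the tetrahedral meshes described above simplify geometry navigation relative to large triangular surface meshes, locating a point inside a tetrahedron reduces to one small linear solve and four sign checks on barycentric coordinates, rather than ray casting against many surface facets (illustrative Python sketch, not GEANT4 code):

```python
import numpy as np

def point_in_tetrahedron(p, v0, v1, v2, v3, tol=1e-12):
    """Locate a point by its barycentric coordinates: p lies inside the
    tetrahedron iff all four coordinates are non-negative."""
    T = np.column_stack((v1 - v0, v2 - v0, v3 - v0))
    b = np.linalg.solve(T, p - v0)        # coordinates w.r.t. v1, v2, v3
    coords = np.append(1.0 - b.sum(), b)  # coordinate w.r.t. v0
    return bool(np.all(coords >= -tol))

# Illustrative use on a unit tetrahedron
v = [np.zeros(3), np.eye(3)[0], np.eye(3)[1], np.eye(3)[2]]
print(point_in_tetrahedron(np.array([0.1, 0.1, 0.1]), *v))  # True
print(point_in_tetrahedron(np.array([0.9, 0.9, 0.9]), *v))  # False
```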
Abstract:
The overall aim of our research was to characterize airborne particles from selected nanotechnology processes and to utilize the data to develop and test quantitative particle concentration-based criteria that can be used to trigger an assessment of particle emission controls. We investigated particle number concentration (PNC), particle mass (PM) concentration, count median diameter (CMD), alveolar deposited surface area, elemental composition, and morphology from sampling of aerosols arising from six nanotechnology processes. These included fibrous and non-fibrous particles, including carbon nanotubes (CNTs). We adopted standard occupational hygiene principles in relation to controlling peak emissions and exposures, as outlined by both Safe Work Australia (1) and the American Conference of Governmental Industrial Hygienists (ACGIH®) (2). The results from the study were used to analyse peak and 30-minute averaged particle number and mass concentration values measured during the operation of the nanotechnology processes. Analysis of peak (highest value recorded) and 30-minute averaged particle number and mass concentration values revealed: Peak PNC20–1000 nm emitted from the nanotechnology processes was up to three orders of magnitude greater than the local background particle concentration (LBPC). Peak PNC300–3000 nm was up to an order of magnitude greater, and PM2.5 concentrations up to four orders of magnitude greater. For three of these nanotechnology processes, the 30-minute averaged particle number and mass concentrations were also significantly different from the LBPC (p-value < 0.001). We propose that emission or exposure controls may need to be implemented or modified, or further assessment of the controls be undertaken, if concentrations exceed three times the LBPC, which is also used as the local particle reference value, for more than a total of 30 minutes during a workday, and/or if a single short-term measurement exceeds five times the local particle reference value. The use of these quantitative criteria, which we are terming the universal excursion guidance criteria, will account for the typical variation in LBPC and the inaccuracy of instruments, while remaining precautionary enough to highlight peaks in particle concentration likely to be associated with particle emission from the nanotechnology process. Recommendations on when to utilize local excursion guidance criteria are also provided.
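A small Python sketch of the proposed excursion guidance logic, assuming evenly spaced concentration readings over a workday; the thresholds mirror the criteria stated above (more than 30 minutes in total above three times the local particle reference value, or any single short-term reading above five times it), while the function name and data are hypothetical:

```python
def excursion_flagged(readings, reference_value, minutes_per_reading,
                      ratio_30min=3.0, ratio_peak=5.0, max_minutes=30.0):
    """Return True if the universal excursion guidance criteria suggest
    that particle emission controls should be assessed.

    readings            -- particle concentration time series for one workday
    reference_value     -- local particle reference value (background-based)
    minutes_per_reading -- sampling interval in minutes
    """
    # Total time spent above 3x the reference value during the workday
    minutes_above = sum(minutes_per_reading
                        for c in readings if c > ratio_30min * reference_value)
    # Any single short-term measurement above 5x the reference value
    peak_exceeded = any(c > ratio_peak * reference_value for c in readings)
    return minutes_above > max_minutes or peak_exceeded

# Illustrative use: one reading per minute over an hour, hypothetical values
background = 1.0e4  # LBPC in particles/cm^3
day = [1.2e4] * 20 + [4.0e4] * 35 + [1.1e4] * 5
print(excursion_flagged(day, background, minutes_per_reading=1.0))  # True
```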
Abstract:
Process-Aware Information Systems (PAISs) support executions of operational processes that involve people, resources, and software applications on the basis of process models. Process models describe vast, often infinite, amounts of process instances, i.e., workflows supported by the systems. With the increasing adoption of PAISs, large process model repositories emerged in companies and public organizations. These repositories constitute significant information resources. Accurate and efficient retrieval of process models and/or process instances from such repositories is interesting for multiple reasons, e.g., searching for similar models/instances, filtering, reuse, standardization, process compliance checking, verification of formal properties, etc. This paper proposes a technique for indexing process models that relies on their alternative representations, called untanglings. We show the use of untanglings for retrieval of process models based on process instances that they specify via a solution to the total executability problem. Experiments with industrial process models testify that the proposed retrieval approach is up to three orders of magnitude faster than the state of the art.