929 results for EVEN NUMBERS
Abstract:
The topic of the present work is to study the relationship between the power of the learning algorithms on the one hand, and the expressive power of the logical language which is used to represent the problems to be learned on the other hand. The central question is whether enriching the language results in more learning power. In order to make the question relevant and nontrivial, it is required that both texts (sequences of data) and hypotheses (guesses) be translatable from the “rich” language into the “poor” one. The issue is considered for several logical languages suitable to describe structures whose domain is the set of natural numbers. It is shown that enriching the language does not give any advantage for those languages which define a monadic second-order language that is decidable in the following sense: there is a fixed interpretation in the structure of natural numbers such that the set of sentences of this extended language true in that structure is decidable. But enriching the original language even by only one constant gives an advantage if this language contains a binary function symbol (which will be interpreted as addition). Furthermore, it is shown that behaviourally correct learning has exactly the same power as learning in the limit for those languages which define a monadic second-order language with the property given above, but has more power in the case of languages containing a binary function symbol. Adding the natural requirement that the set of all structures to be learned is recursively enumerable, it is shown that it pays off to enrich the language of arithmetic for both finite learning and learning in the limit, but it does not pay off to enrich the language for behaviourally correct learning.
Abstract:
The effective atomic number is widely employed in radiation studies, particularly for the characterisation of interaction processes in dosimeters, biological tissues and substitute materials. Gel dosimeters are unique in that they comprise both the phantom and dosimeter material. In this work, effective atomic numbers for total and partial electron interaction processes have been calculated for the first time for a Fricke gel dosimeter, five hypoxic and nine normoxic polymer gel dosimeters. A range of biological materials are also presented for comparison. The spectrum of energies studied spans 10 keV to 100 MeV, over which the effective atomic number varies by 30 %. The effective atomic numbers of gels match those of soft tissue closely over the full energy range studied; greater disparities exist at higher energies but are typically within 4 %.
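The energy-dependent effective atomic numbers above are derived from tabulated interaction cross sections for each process. As a much simpler point of reference, the classical energy-independent power-law estimate (a textbook approximation with exponent ≈ 2.94, not the method used in this work) can be sketched in a few lines:

```python
# Hedged sketch: the classic Mayneord power-law effective atomic number.
# This is a single energy-independent figure, so it cannot reproduce the
# 30% variation with energy reported in the abstract.

def z_eff_power_law(composition, m=2.94):
    """composition: list of (Z, atomic_mass, mass_fraction) tuples."""
    # Fractional electron contribution of each element is w*Z/A, normalised.
    electrons = [(z, w * z / a) for z, a, w in composition]
    total = sum(f for _, f in electrons)
    return sum((f / total) * z ** m for z, f in electrons) ** (1.0 / m)

# Water, H2O: H (Z=1, A=1.008, 11.19% by mass), O (Z=8, A=15.999, 88.81%)
water = [(1, 1.008, 0.1119), (8, 15.999, 0.8881)]
print(round(z_eff_power_law(water), 2))  # ≈ 7.4, the textbook value for water
```

The value near 7.4 for water is close to the soft-tissue figures gel dosimeters are matched against, which is one reason gels serve as both phantom and dosimeter.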
Abstract:
We have developed a bioreactor vessel design which has the advantages of simplicity and ease of assembly and disassembly, and with the appropriately determined flow rate, even allows for a scaffold to be suspended freely regardless of its weight. This article reports our experimental and numerical investigations to evaluate the performance of a newly developed non-perfusion conical bioreactor by visualizing the flow through scaffolds with 45° and 90° fiber lay down patterns. The experiments were conducted at the Reynolds numbers (Re) 121, 170, and 218, based on the local velocity and width of the scaffolds. The flow fields were captured using short-time exposures of 60 µm particles suspended in the bioreactor and illuminated using a thin laser sheet. The effects of scaffold fiber lay down pattern and Reynolds number were obtained and compared to corresponding results from a computational fluid dynamics (CFD) software package. The objectives of this article are twofold: first, to investigate the hypothesis that there may be an insufficient exchange of medium within the interior of the scaffold when using our non-perfusion bioreactor; and second, to compare the flows within and around scaffolds of 45° and 90° fiber lay down patterns. Scaffold porosity was also found to influence flow patterns. It was thus shown that fluidic transport could be achieved within scaffolds with our bioreactor design, despite it being a non-perfusion vessel. Fluid velocities were generally of the same order of magnitude as, or one order lower than, the inlet flow velocity. Additionally, the 90° fiber lay down pattern scaffold was found to allow for slightly higher fluid velocities within, as compared to the 45° fiber lay down pattern scaffold. This was due to the architecture and pore arrangement of the 90° fiber lay down pattern scaffold, which allows for fluid to flow directly through (channel-like flow).
Abstract:
If current population and accommodation trends continue, Australian cities will, in the future, have noticeable numbers of apartment buildings over 60 storeys high. With an aging population it follows that a significant proportion of those occupying these buildings will be senior citizens, many of whom will have some form of disability. For these occupants a fire emergency in a high-rise building presents a serious problem. Currently lifts cannot be used for evacuation, and going down 60 storeys in a fire-isolated staircase would be physically impossible for many. Therefore, for many, the temptation to remain in one’s unit will be very strong. With an awareness of this behaviour trend in older residents, many researchers have, in recent years, explored the possible wider use of lifts in a fire emergency. So far the use of lifts for evacuation has been approved for a small number of buildings, but wide acceptance of this solution is still to be achieved. This paper concludes that even in high-rise apartment buildings where lifts are approved for evacuation, architects should design the building with alternative evacuation routes and provide suitable safe refuge areas for those who cannot use the stairs when the lifts are unavailable.
Abstract:
Despite the continued popularity of travel blogs and virtual travel communities, there is currently a lack of contemporary criticism surrounding the coded structures and connotations of online travel writings, especially those arising out of the Australian context. While there have been few significant studies of Australian women’s travel to date, there have been even fewer about female wandering. Reimagining the archetype of Penelope, this paper considers liminal accounts of wandering in contemporary travel blogs of Australian women abroad. When women travel as wanderers, they undermine normative accounts of travel and trace out alternative movements fused with gendered meaning.
Abstract:
In a previous chapter (Dean and Kavanagh, Chapter 37), the authors made a case for applying low intensity (LI) cognitive behaviour therapy (CBT) to people with serious mental illness (SMI). As in other populations, LI CBT interventions typically deal with circumscribed problems or behaviours. LI CBT retains an emphasis on self-management, has restricted content and segment length, and does not necessarily require extensive CBT training. In applying these interventions to SMI, adjustments may be needed to address the cognitive and symptomatic difficulties often faced by these groups. What may take a single session in a less affected population may require several sessions or a thematic application of the strategy within case management. In some cases, the LI CBT may begin to appear more like a high-intensity (HI) intervention, albeit simple and with many LI CBT characteristics still retained. So, if goal setting were introduced in one or two sessions, it could clearly be seen as an LI intervention. When applied to several different situations and across many sessions, it may be indistinguishable from a simple HI treatment, even if it retains the same format and is effectively applied by a practitioner with limited CBT training.

In some ways, LI CBT should be well suited to case management of patients with SMI. Treating staff typically have heavy workloads, and find it difficult to apply time-consuming treatments (Singh et al. 2003). LI CBT may allow provision of support to greater numbers of service users, and allow staff to spend more time on those who need intensive and sustained support. However, the introduction of any change in practice has to address significant challenges, and LI CBT is no exception.
Many of the issues that we face in applying LI CBT to routine case management in a mental health service, and their potential solutions, are essentially the same as in a range of other problem domains (Turner and Sanders 2006) and, indeed, are similar to those in any adoption of innovation (Rogers 2003). Over the last 20 years, several commentators have described barriers to implementing evidence-based innovations in mental health services (Corrigan et al. 1992; Deane et al. 2006; Kavanagh et al. 1993). The aim of the current chapter is to present a cognitive behavioural conceptualisation of problems and potential solutions for dissemination of LI CBT.
Abstract:
The delay stochastic simulation algorithm (DSSA) by Barrio et al. [PLoS Comput. Biol. 2, e117 (2006)] was developed to simulate delayed processes in cell biology in the presence of intrinsic noise, that is, when there are small-to-moderate numbers of certain key molecules present in a chemical reaction system. These delayed processes can faithfully represent complex interactions and mechanisms that imply a number of spatiotemporal processes often not explicitly modeled, such as transcription and translation, which are basic to the modeling of cell signaling pathways. However, for systems with widely varying reaction rate constants or large numbers of molecules, the simulation time steps of both the stochastic simulation algorithm (SSA) and the DSSA can become very small, causing considerable computational overheads. In order to overcome the limit of small step sizes, various τ-leap strategies have been suggested for improving the computational performance of the SSA. In this paper, we present a binomial τ-DSSA method that extends the τ-leap idea to the delay setting and avoids drawing insufficient numbers of reactions, a common shortcoming of existing binomial τ-leap methods that becomes evident when dealing with complex chemical interactions. The resulting inaccuracies are most evident in the delayed case, even when considering reaction products as potential reactants within the same time step in which they are produced. Moreover, we extend the framework to account for multicellular systems with different degrees of intercellular communication. We apply these ideas to two important genetic regulatory models, namely, the hes1 gene, implicated as a molecular clock, and a Her1/Her7 model for coupled oscillating cells.
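As background, the delay mechanism the DSSA family handles can be illustrated with a minimal Gillespie-style loop in which one reaction's products appear only after a fixed delay. This is an illustrative sketch only (the system, the rate constants `k` and `d`, and the delay `tau` are arbitrary choices), not the Barrio et al. algorithm nor the binomial τ-DSSA proposed here:

```python
import heapq
import random

def delayed_ssa(k=5.0, d=0.2, tau=2.0, t_end=50.0, seed=1):
    """Minimal direct-method loop with one delayed reaction:
      R1: 0 -> P at rate k, product released tau time units later
      R2: P -> 0 at rate d*P, instantaneous
    Returns the copy number of P at t_end."""
    rng = random.Random(seed)
    t, p = 0.0, 0
    pending = []  # min-heap of release times for delayed products
    while t < t_end:
        a1, a2 = k, d * p
        dt = rng.expovariate(a1 + a2)
        # If a queued product is due before the next reaction, release it
        # first and redraw (valid because exponentials are memoryless).
        if pending and pending[0] <= t + dt:
            t = heapq.heappop(pending)
            p += 1
            continue
        t += dt
        if t >= t_end:
            break
        if rng.random() < a1 / (a1 + a2):
            heapq.heappush(pending, t + tau)  # R1 fired: schedule product
        else:
            p -= 1  # R2 fired: degrade one protein
    return p

# The copy number should fluctuate around the stationary mean k/d = 25.
print(delayed_ssa())
```

The per-event time steps here shrink as propensities grow, which is exactly the overhead the binomial τ-DSSA of the abstract is designed to avoid.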
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture this fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time stepping, so faster approximations such as the Poisson and Binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
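As an illustration of the leap idea discussed above, a basic Poisson τ-leap fires each reaction channel a Poisson-distributed number of times per fixed step, in direct analogy with an Euler step for a Poisson-driven SDE. The birth-death system and parameters below are illustrative assumptions, not taken from the paper:

```python
import math
import random

def poisson_tau_leap(k=10.0, d=0.5, x0=0, tau=0.05, t_end=20.0, seed=7):
    """Poisson tau-leap for the birth-death system
        R1: 0 -> X (propensity k)     R2: X -> 0 (propensity d*x)
    Each leap fires Poisson(a_j * tau) copies of every reaction at once."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplicative algorithm; adequate for the small
        # per-step means used here.
        L, p, n = math.exp(-lam), 1.0, 0
        while True:
            p *= rng.random()
            if p <= L:
                return n
            n += 1

    x, t = x0, 0.0
    while t < t_end:
        births = poisson(k * tau)
        deaths = poisson(d * x * tau)
        x = max(x + births - deaths, 0)  # clamp: leaps can overshoot zero
        t += tau
    return x

# The exact stationary mean is k/d = 20; a tau-leap path ends near it.
print(poisson_tau_leap())
```

The clamp at zero is the crude fix for the positivity issue the abstract mentions; the SDE viewpoint developed in the paper gives more principled higher-order alternatives.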
Abstract:
Despite various approaches, the production of biodegradable plastics such as polyhydroxybutyrate (PHB) in transgenic plants has met with limited success, due largely to low expression levels. Even in the few instances where high levels of protein expression have been reported, the transgenic plants have been stunted, indicating PHB is phytotoxic (Poirier 2002). This PhD describes the application of a novel virus-based gene expression technology, termed InPAct ('In Plant Activation'), for the production of PHB in tobacco and sugarcane. InPAct is based on the rolling circle replication mechanism by which circular ssDNA viruses replicate and provides a system for controlled, high-level gene expression. Based on these features, InPAct was thought to represent an ideal system to enable the controlled, high-level expression of the three phb genes (phbA, phbB and phbC) required for PHB production in sugarcane at a preferred stage of plant growth. A Tobacco yellow dwarf virus (TbYDV)-based InPAct-phbA vector, as well as linear vectors constitutively expressing phbB and phbC, were constructed and different combinations were used to transform tobacco leaf discs. A total of four, eight, three and three phenotypically normal tobacco lines were generated from discs transformed with InPAct-phbA, InPAct-phbA + p1300-TaBV P-phbB/phbC-35S T, p1300-35S P-phbA-NOS T + p1300-TaBV P-phbB/phbC-35S T and InPAct-GUS, respectively. To determine whether the InPAct cassette could be activated in the presence of the TbYDV Rep, leaf samples from the eight InPAct-phbA + p1300-TaBV P-phbB/phbC-35S T plants were agroinfiltrated with p1300-TbYDV-Rep/RepA. Three days later, successful activation was indicated by the detection of episomes using both PCR and Southern analysis.
Leaf discs from the eight InPAct-phbA + p1300-TaBV P-phbB/phbC-35S T transgenic plant lines were agroinfiltrated with p1300-TbYDV-Rep/RepA, and leaf tissue was collected ten days post-infiltration and examined for the presence of PHB granules. Confocal microscopy and TEM revealed the presence of typical PHB granules in five of the eight lines, thus demonstrating the functionality of InPAct-based PHB production in tobacco. However, analysis of leaf extracts by HPLC failed to detect the presence of PHB, suggesting only very low expression levels. Subsequent molecular analysis of three lines revealed low levels of correctly processed mRNA from the catalase intron contained within the InPAct cassette, and also the presence of cryptic splice sites within the intron. In an attempt to increase expression levels, new InPAct-phb cassettes were generated in which the castor bean catalase intron was replaced with a synthetic intron (syntron). Further, in an attempt to both increase and better control Rep/RepA-mediated activation of InPAct cassettes, Rep/RepA expression was placed under the control of a stably integrated alc switch. Leaf discs from a transgenic tobacco line (Alc ML) containing 35S P-AlcR-AlcA P-Rep/RepA were supertransformed with InPAct-phbAsyn or InPAct-GUSsyn using Agrobacterium, and three plants (lines) were regenerated for each construct. Analysis of the RNA processing of the InPAct-phbAsyn cassette revealed highly efficient and correct splicing of the syntron, thus supporting its inclusion within the InPAct system. To determine the efficiency of the alc switch to activate InPAct, leaf material from the three Alc ML + InPAct-phbAsyn lines was either agroinfiltrated with 35S P-Rep/RepA or treated with ethanol. Unexpectedly, episomes were detected not only in the infiltrated and ethanol-treated samples, but also in non-treated samples. Subsequent analysis of transgenic Alc ML + InPAct-GUS lines confirmed that the alc switch was leaky in tissue culture.
Although this was shown to be reversible once plants were removed from the tissue culture environment, it made the regeneration of Alc ML + InPAct-phbsyn plant lines extremely difficult, due to unintentional Rep expression and therefore high levels of phb expression and phytotoxic PHB production. Two Alc ML + InPAct-phbAsyn + p1300-TaBV P-phbB/phbC-35S T transgenic lines were able to be regenerated, and these were acclimatised, alcohol-treated and analysed. Although episome formation was detected as late as 21 days post-activation, no PHB was detected in the leaves of any plants using either microscopy or HPLC, suggesting the presence of a corrupt InPAct-phbA cassette in both lines. The final component of this thesis involved the application of both the alc switch and the InPAct systems to sugarcane in an attempt to produce PHB. Initial experiments using transgenic Alc ML + InPAct-GUS lines indicated that the alc system was not functional in sugarcane under the conditions tested. The functionality of the InPAct system, independent of the alc gene switch, was subsequently examined by bombarding the 35S Rep/RepA cassette into leaf and immature leaf whorl cells derived from InPAct-GUS transgenic sugarcane plants. No GUS expression was observed in leaf tissue, whereas weak and irregular GUS expression was observed in immature leaf whorl tissue derived from two InPAct-GUS lines and two InPAct-GUS + 35S P-AlcR-AlcA P-GUS lines. The most plausible explanation for the inconsistent and low levels of GUS expression in leaf whorls is a combination of low numbers of sugarcane cells in the DNA replication-conducive S-phase and the irregular and random nature of the sugarcane cells bombarded with Rep/RepA. This study constitutes the first report of a TbYDV-based InPAct system under the control of the alc switch for the production of PHB in tobacco and sugarcane.
Despite the inability to detect quantifiable levels of PHB in either tobacco or sugarcane, the findings of this study should nevertheless assist in the further development of both the InPAct system and the alc system, particularly for sugarcane, and should ultimately lead to an ethanol-inducible InPAct gene expression system for the production of bioplastics and other proteins of commercial value in plants.
Abstract:
In this paper we pursue the task of aligning an ensemble of images in an unsupervised manner. This task has been commonly referred to as “congealing” in the literature. A form of congealing, using a least-squares criterion, has recently been demonstrated to have desirable properties over conventional congealing. Least-squares congealing can be viewed as an extension of the Lucas & Kanade (LK) image alignment algorithm. It is well understood that the alignment performance of the LK algorithm, when aligning a single image with another, is theoretically and empirically equivalent for additive and compositional warps. In this paper we: (i) demonstrate that this equivalence does not hold for the extended case of congealing, (ii) characterize the inherent drawbacks associated with least-squares congealing when dealing with large numbers of images, and (iii) propose a novel method for circumventing these limitations through the application of an inverse-compositional strategy that maintains the attractive properties of the original method while being able to handle very large numbers of images.
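For readers unfamiliar with the LK building block referred to above, the least-squares update can be shown in its simplest form: a one-dimensional, pure-translation warp solved by Gauss-Newton with an additive parameter update. This is a deliberately reduced sketch; congealing as studied in the paper involves 2-D warps over whole image ensembles:

```python
import math

def lk_translation(template, image, iters=50):
    """Estimate shift p such that image(x + p) ~= template(x), by
    Gauss-Newton iterations on the sum-of-squared-differences cost."""
    def sample(f, x):  # linear interpolation with edge clamping
        x = min(max(x, 0.0), len(f) - 1.0)
        i = int(x)
        if i + 1 >= len(f):
            return f[i]
        frac = x - i
        return f[i] * (1 - frac) + f[i + 1] * frac

    p = 0.0
    for _ in range(iters):
        num = den = 0.0
        for x in range(len(template)):
            # Central-difference gradient of the warped image w.r.t. p.
            grad = sample(image, x + p + 0.5) - sample(image, x + p - 0.5)
            err = template[x] - sample(image, x + p)
            num += grad * err
            den += grad * grad
        if den == 0:
            break
        p += num / den  # additive parameter update
    return p

# A smooth bump shifted by 3 samples: the estimate recovers the shift.
sig = [math.exp(-((i - 20) ** 2) / 30.0) for i in range(60)]
shifted = [math.exp(-((i - 23) ** 2) / 30.0) for i in range(60)]
print(round(lk_translation(sig, shifted), 1))
```

In this single-pair, single-parameter setting the additive and compositional updates coincide, which is precisely the equivalence the paper shows breaks down for congealing.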
Abstract:
This paper presents a comprehensive study to find the most efficient bitrate requirement for delivering mobile video that optimizes bandwidth while at the same time maintaining a good user viewing experience. In the study, forty participants were asked to choose the lowest quality video that would still provide for a comfortable and long-term viewing experience, knowing that higher video quality is more expensive and bandwidth intensive. This paper proposes the lowest pleasing bitrates and corresponding encoding parameters for five different content types: cartoon, movie, music, news and sports. It also explores how the lowest pleasing quality is influenced by content type, image resolution, bitrate, and user gender, prior viewing experience, and preference. In addition, it analyzes the trajectory of users’ progression while selecting the lowest pleasing quality. The findings reveal that the lowest bitrate requirement for a pleasing viewing experience is much higher than that of the lowest acceptable quality. Users’ criteria for the lowest pleasing video quality are related to the video’s content features, as well as its usage purpose and the user’s personal preferences. These findings can provide video providers with guidance on what quality they should offer to please mobile users.
Abstract:
Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
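The bit-level blocking idea can be sketched with a toy taint propagator over expression trees. The tuple representation and the operator rules below are illustrative assumptions, not the paper's actual model library; they show only how masking with a constant can downgrade specific bits of a classified input:

```python
# Hedged sketch: per-bit taint propagation through C-like expression
# operators. A set bit in the mask means that bit of the result may
# carry classified data.

FULL = 0xFF  # track one byte per value for simplicity

def taint(expr, env):
    """Return a bitmask of which result bits can carry classified data.
    expr is a nested tuple: ('var', name) | ('const', v) | (op, a, b)."""
    kind = expr[0]
    if kind == "var":
        return env.get(expr[1], 0)
    if kind == "const":
        return 0  # constants carry no classified bits
    op, a, b = expr
    ta, tb = taint(a, env), taint(b, env)
    if op == "&":
        # AND with a known constant blocks the cleared bits entirely.
        if a[0] == "const":
            return tb & a[1]
        if b[0] == "const":
            return ta & b[1]
        return ta | tb
    if op == ">>" and b[0] == "const":
        return (ta >> b[1]) & FULL  # shifted-out bits are discarded
    return (ta | tb) & FULL  # conservative default for +, |, ^, ...

# secret is fully classified; (secret & 0x0F) >> 2 can leak only two bits.
env = {"secret": FULL}
e = (">>", ("&", ("var", "secret"), ("const", 0x0F)), ("const", 2))
print(bin(taint(e, env)))  # → 0b11
```

An evaluator using such models can mark expressions whose output mask is empty (or acceptably small) as safe downgrades, pruning the false-positive paths a purely conservative whole-value analysis would report.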
Abstract:
Across the industrialized west there has been a sharp decline in union membership (Frege and Kelly 2003, Peetz 2002). Even more alarming are the lower unionization rates of young people and the steeper decline in these rates compared to older workers (Serrano and Waddington 2000). At the same time increasing numbers of young people still at school are participating in the labour market. There have been a number of explorations internationally of young people's union membership, but most either track membership decline over time, comparing adult and youth union density (Blanden and Machin 2003, Bryson et al. 2005, Haynes, Vowles and Boxall 2005, Canny 2002, OECD 2006), explore the general experience of young people in the labour market (for example, Lizen, Bolton and Pole 1999) or examine young people's view of unions (for example, Bulbeck 2008). This chapter, however, takes a different approach, exploring union officials' constructions of 'the problem' of low union density amongst youth. While the data in this study was obtained from Australia, the Australian context has strong similarities with those in other industrialized economies, not least because globalization has meant the spread of neo-liberal industrial relations (IR) policies and structures. Assuming that unions have choices open to them as to how they recruit and retain young people, it is important to analyse officials' construction of 'the problem', as this affects union strategizing and action.
Abstract:
Over the last decade nations around the world have renewed their efforts to address the problem of human trafficking, following the introduction of the United Nations Protocol to Prevent, Suppress and Punish Trafficking in Persons, Especially Women and Children. In Australia and the United States, legislators sought to quantify and characterise the human trafficking phenomenon, seeking to answer the question: how large is the problem of trafficking? This article explores the attempts of legislators in Australia and the United States to determine how many victims are trafficked into their countries, highlighting the significant uncertainty that still surrounds data on human trafficking. The challenges researchers face in measuring human trafficking are also explored. These challenges include disputes over the definition of a trafficking victim, the limitations of research using sampling to measure the trafficked population, and the mischaracterisation of the trafficking problem as a result of politicisation of the trafficking debate and a focus on trafficking for sexual exploitation versus other forms of labour. This article argues that in the absence of reliable data on trafficking, policy is often informed by misleading or false information.
Abstract:
Purpose: To determine the effect of moderate levels of refractive blur and simulated cataracts on nighttime pedestrian conspicuity in the presence and absence of headlamp glare. Methods: The ability to recognize pedestrians at night was measured in 28 young adults (M=27.6 years) under three visual conditions: normal vision, refractive blur and simulated cataracts; mean acuity was 20/40 or better in all conditions. Pedestrian recognition distances were recorded while participants drove an instrumented vehicle along a closed road course at night. Pedestrians wore one of three clothing conditions and oncoming headlamps were present for 16 participants and absent for 12 participants. Results: Simulated visual impairment and glare significantly reduced the frequency with which drivers recognized pedestrians and the distance at which the drivers first recognized them. Simulated cataracts were significantly more disruptive than blur even though photopic visual acuity levels were matched. With normal vision, drivers responded to pedestrians at 3.6x and 5.5x longer distances on average than for the blur or cataract conditions, respectively. Even in the presence of visual impairment and glare, pedestrians were recognized more often and at longer distances when they wore a “biological motion” reflective clothing configuration than when they wore a reflective vest or black clothing. Conclusions: Drivers’ ability to recognize pedestrians at night is degraded by common visual impairments even when the drivers’ mean visual acuity meets licensing requirements. To maximize drivers’ ability to see pedestrians, drivers should wear their optimum optical correction, and cataract surgery should be performed early enough to avoid potentially dangerous reductions in visual performance.