412 results for Averaging Principle
Abstract:
Historical information can be used, in addition to pedigree, traits and genotypes, to map quantitative trait loci (QTL) in general populations via maximum likelihood estimation of variance components. This analysis is known as linkage disequilibrium (LD) and linkage mapping, because it exploits both linkage in families and LD at the population level. The search for QTL in the wild population of Soay sheep on St. Kilda is a proof of principle. We analysed the data from a previous study and confirmed some of the QTL reported. The most striking result was the confirmation of a QTL affecting birth weight that had been reported using association tests but not when using linkage-based analyses. Copyright © Cambridge University Press 2010.
Abstract:
Food in schools is typically understood from a biomedical perspective. At practical, ideational and material levels, whether addressed pedagogically or bureaucratically, food in schools is generally considered from a natural sciences perspective. This perspective manifests as the bioenergetic principle of energy in versus energy out and appears in policy focused on issues such as obesity and physical activity. Despite the considerable literature on the sociology of food and eating, little is understood about food in schools from a sociological perspective. This oversight of one of the most fundamental requirements of the human condition, namely food, should be of concern for educators. Investigating food through a political economy lens means understanding food in schools as part of broader economic, political, social and cultural conditions. Hence, a political economy of food and schooling is concerned with the formation of ideas about food relative to political, economic, and cultural ideologies in social practice. From a critical sociology study of food messages students receive in the primary school curriculum, this paper reports on some of the official food messages of an Australian state's education policy, as a case to highlight the current political economy of food in Australia. It examines the role of the corporate food industry in the formation of Australian food policy and how that policy created artefacts infused with competing messages. The paper highlights how food and nutrition policy moved from solely a health concern to incorporate an economic dimension and links that shift with the quality of food available in Queensland schools.
Abstract:
For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step but rather they require computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence in interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, approximation of φ(A)b, where φ(z) = (e^z - 1)/z, A ∈ R^{n×n} and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations.
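The exponential Euler step and the φ function above can be made concrete with a small sketch. This is only an illustration, not the TransPore implementation: it evaluates φ(A)b densely via the standard augmented-matrix identity (for large systems a Krylov subspace approximation would replace this), and the function names here are invented for the example.

```python
import numpy as np
from scipy.linalg import expm

def phi_times_b(A, b):
    """Compute phi(A) @ b, where phi(z) = (e^z - 1)/z, using the
    augmented-matrix identity:
        expm([[A, b], [0, 0]]) = [[expm(A), phi(A) b], [0, 1]].
    """
    n = A.shape[0]
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A
    M[:n, n] = b
    return expm(M)[:n, n]

def exponential_euler_step(y, h, f, J):
    """One exponential Euler (EEM) step for y' = f(y):
        y_{n+1} = y_n + h * phi(h J) f(y_n),
    with J the Jacobian of f evaluated at y_n."""
    return y + h * phi_times_b(h * J, f(y))
```

A useful sanity check is that EEM is exact for linear problems y' = Ay, since y + hφ(hA)Ay = e^{hA}y, the exact propagator.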
The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur. In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale).
The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
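The idea of obtaining a macroscopic coefficient by averaging the microscopic flux over a unit cell can be illustrated with a deliberately simple one-dimensional analogue (this is not the thesis model): for steady diffusion through a periodic layered cell under a unit imposed gradient, the steady flux is spatially uniform, and the cell-averaged flux yields an effective diffusivity equal to the width-weighted harmonic mean of the layer values.

```python
import numpy as np

def effective_diffusivity_1d(d_layers, widths):
    """Effective diffusivity of a 1D periodic layered unit cell.

    Under a unit macroscopic gradient, the steady microscopic flux is
    uniform across the cell, so averaging it over the cell gives
    D_eff = 1 / sum_i (w_i / D_i), the width-weighted harmonic mean.
    """
    d = np.asarray(d_layers, dtype=float)
    w = np.asarray(widths, dtype=float)
    w = w / w.sum()            # normalise widths over the unit cell
    return 1.0 / np.sum(w / d)
```

Note the averaging step recovers the intuitive limits: identical layers return their common value, and a single low-diffusivity layer dominates D_eff, as it must for layers in series.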
Abstract:
At a quite fundamental level, the very way in which Public Service Broadcasting (PSB) may envisage its future, usually captured in the semantic shift from PSB to Public Service Media (PSM), is at stake when considering the recent history of public value discourse and the public value test. The core Reithian PSB idea assumed that public value would be created through the application of core principles of universality of availability and appeal, provision for minorities, education of the public, distance from vested interests, quality programming standards, program maker independence, and fostering of national culture and the public sphere. On the other hand, the philosophical import of the public value test is that potentially any excursion into the provision of new media services needs to be justified ex ante. In this era of New Public Management, greater transparency and accountability, and the proposition that resources for public value deliverables be contestable and not sequestered in public sector institutions, what might be the new Archimedean point around which a contemporised normativity for PSM can be built? This paper will argue for the innovation imperative as an organising principle for contemporary PSM. This may appear counterintuitive, as it is precisely PSB’s predilection for innovating in new media services (in online, mobile, and social media) that has produced the constraining apparatus of the ex ante/public value/Drei-Stufen-Test in Europe, based on principles of competitive neutrality and transparency in the application of public funds for defined and limited public benefit. However, I argue that a commitment to innovation can define the new products and services that PSM can, and should, be delivering into a post-scarcity, superabundant all-media marketplace as complementary to, rather than as competitively ‘crowding out’, commercial provision. The evidence presented in this paper for this argument is derived mostly from analysis of PSM in the Australian media ecology.
While no PSB outside Europe is subject to a formal public value test, the crowding-out arguments are certainly run in Australia, particularly by powerful commercial interests for whom free news is a threat to monetising quality news journalism. Take right-wing opinion leader, herself a former ABC Board member, Judith Sloan: ‘… the recent expansive nature of the ABC – all those television stations, radio stations and online offerings – is actually squeezing activity that would otherwise be undertaken by the private sector. From partly correcting market failure, the ABC is now causing it. We are now dealing with a case of unfair competition and wasted taxpayer funds’ (The Drum, 1 August, http://www.abc.net.au/unleashed/2818220.html). But I argue that the crowding-out argument is difficult to sustain in Australia because of the PSB’s non-dominant position and the fact that much of the innovation generated by the two PSBs, the ABC and the SBS, has not been imitated or competed for by the commercial networks. The paper will bring cases forward, such as SBS’ Go Back to Where You Came From (2011) as an example of product innovation, and a case study of process and organisational innovation which has also resulted in specific product and service innovation – the ABC’s Innovation Unit. In summary, at least some of the old Reithian dicta, along with spectrum scarcity and market failure arguments, have faded or are fading. Contemporary PSM need to justify their role in the system, and to society, in terms of innovation.
Abstract:
In McIntosh & Anor as Trustees of the Estate of Camm (A Bankrupt) v Linke Nominees Pty Ltd & Anor [2008] QCA 410 the Queensland Court of Appeal considered the extent of the court’s power under r 7(1) of the Uniform Civil Procedure Rules 1999 (Qld) (“UCPR”) to extend time, and in particular whether the rule applied so as to permit extension of the period specified under rule 667 for varying or setting aside an order. The case also provides an illustration of circumstances in which the court might be expected to depart from the general principle that a successful litigant is entitled to the costs of the litigation.
Abstract:
We demonstrated for the first time by large-scale ab initio calculations that a graphene/titania interface in the ground electronic state forms a charge-transfer complex due to the large difference of work functions between graphene and titania, leading to substantial hole doping in graphene. Interestingly, electrons in the upper valence band can be directly excited from graphene to the conduction band, that is, the 3d orbitals of titania, under visible light irradiation. This should yield well-separated electron−hole pairs, with potentially high photocatalytic or photovoltaic performance in hybrid graphene and titania nanocomposites. Experimental wavelength-dependent photocurrent generation of the graphene/titania photoanode demonstrated noticeable visible light response and evidently verified our ab initio prediction.
Abstract:
The assembly of retroviruses is driven by oligomerization of the Gag polyprotein. We have used cryo-electron tomography together with subtomogram averaging to describe the three-dimensional structure of in vitro-assembled Gag particles from human immunodeficiency virus, Mason-Pfizer monkey virus, and Rous sarcoma virus. These represent three different retroviral genera: the lentiviruses, betaretroviruses and alpharetroviruses. Comparison of the three structures reveals the features of the supramolecular organization of Gag that are conserved between genera and therefore reflect general principles of Gag-Gag interactions, and the features that are specific to certain genera. All three Gag proteins assemble to form approximately spherical hexameric lattices with irregular defects. In all three genera, the N-terminal domain of CA is arranged in hexameric rings around large holes. Where the rings meet, 2-fold densities, assigned to the C-terminal domain of CA, extend between adjacent rings, and link together at the 6-fold symmetry axis with a density that extends toward the center of the particle into the nucleic acid layer. Although this general arrangement is conserved, differences can be seen throughout the CA and spacer peptide regions. These differences can be related to sequence differences among the genera. We conclude that the arrangement of the structural domains of CA is well conserved across genera, whereas the relationship between CA, the spacer peptide region, and the nucleic acid is more specific to each genus.
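The averaging step at the heart of subtomogram averaging rests on a simple statistical fact: averaging many aligned, independently noisy copies of the same density suppresses noise roughly as 1/sqrt(n). The toy sketch below illustrates only that principle; alignment is omitted, and the density map is entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "structure": a small cubic density inside an 8x8x8 volume.
template = np.zeros((8, 8, 8))
template[2:6, 2:6, 2:6] = 1.0

def average_subtomograms(n, sigma=1.0):
    """Average n pre-aligned noisy copies of the template.

    Independent zero-mean noise cancels on averaging, so the error of
    the average shrinks roughly as 1/sqrt(n)."""
    stack = template + rng.normal(0.0, sigma, size=(n,) + template.shape)
    return stack.mean(axis=0)

# Mean absolute error of a single noisy copy vs. an average of 100.
err1 = np.abs(average_subtomograms(1) - template).mean()
err100 = np.abs(average_subtomograms(100) - template).mean()
```

In real workflows the hard part is the alignment (rotations and translations recovered by cross-correlation); the averaging itself is exactly this reduction step.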
Abstract:
Paramedic education has been undergoing major development in Australia in the past 20 years, with many different educational programmes being developed across all Australian jurisdictions. This paper aims to review the current paramedic education programmes in Australia to identify the similarities and differences between the programmes, and the strengths and challenges in these programmes. A literature search was performed using six scientific databases to identify any systematic reviews, literature reviews or relevant articles on the topic. Additional searches included journal articles and text references from 1995 to 2011. The search was conducted between December 2010 and November 2011. Included in this review are a total of 28 articles, which are focused around five major issues in paramedic education: (i) principles of paramedic programmes and the involvement of industry partners; (ii) clinical placements; (iii) contemporary methods of education; (iv) needs for specific programmes within paramedic education; and (v) articles related to the accreditation process for paramedic programmes. Paramedic programmes across Australian universities vary, with many different practices, especially relating to clinical placements in the field. Further advances in paramedic education programmes should aim to respond to population change and industry development, which would enhance the paramedic profession across Australia.
Abstract:
Unless sustained, coordinated action is generated in road safety, road traffic deaths are poised to rise from approximately 1.3 to 1.9 million a year by 2020 (Krug, 2012). To generate this harmonised response, road safety management agencies are being urged to adopt multisectoral collaboration (WHO, 2009b), which is achievable through the principle of policy integration. Yet policy integration, in its current hierarchical format, is marred by a lack of universality in its interpretation, a failure to anticipate the complexities of coordinated effort, a dearth of information about its design and the absence of a normative perspective to share responsibility. This paper addresses this flawed conception of policy integration by reconceptualising it through a qualitative examination of 16 road safety stakeholders’ written submissions, lodged with the Australian Transport Council in 2011. The resulting new principle of policy integration, Participatory Deliberative Integration, provides a conceptual framework for the alignment of effort across stakeholders in transport, health, traffic law enforcement, relevant trades and the community. With the adoption of Participatory Deliberative Integration, road safety management agencies should secure the commitment of key stakeholders in the development and implementation of, amongst other policy measures, National Road Safety Strategies and Mix Mode Integrated Timetabling.
Abstract:
Different types of defects can be introduced into graphene during material synthesis, and significantly influence the properties of graphene. In this work, we investigated the effects of structural defects, edge functionalisation and reconstruction on the fracture strength and morphology of graphene by molecular dynamics simulations. The minimum energy path analysis was conducted to investigate the formation of Stone-Wales defects. We also employed out-of-plane perturbation and the energy minimization principle to study the possible morphology of graphene nanoribbons with edge-termination. Our numerical results show that the fracture strength of graphene is dependent on defects and environmental temperature. However, pre-existing defects may be healed, resulting in strength recovery. Edge functionalization can induce compressive stress and ripples in the edge areas of graphene nanoribbons. On the other hand, edge reconstruction contributed to the tensile stress and curved shape in the graphene nanoribbons.
Abstract:
As sustainability becomes an important principle guiding various human activities around the globe, the higher education sector is being asked to take an active part in educating for and promoting sustainability, due to its moral responsibility, its social obligation and its own need to adapt to new circumstances. There is a global trend of higher education institutions embarking on responses to the sustainability challenge. On-campus building performance is one of the most important indicators for “sustainable universities”, because buildings carry a substantial environmental burden, including considerable consumption of raw materials and energy as well as huge amounts of waste generation and greenhouse gas emission. Moreover, much research shows that building performance can influence students’ and staff’s awareness of, and behaviours related to, sustainability. Past studies have rarely discussed sustainable construction projects in universities’ unique context. Universities are marked by distinct characteristics such as complex governance, multiple cultures and competing missions. It is necessary and meaningful to examine the project management system in terms of universities’ organizational environment. Thus, this research project applies a Delphi study to identify primary barriers to green technology application in on-campus buildings, critical factors for sustainable project success, key actions in project phases and strategies for project improvement. Through three rounds of questionnaires among panel experts, the authors obtain a profound understanding of the project delivery system in universities. The research results are expected to provide sustainability practitioners with a holistic understanding and generic information about sustainable construction project performance on campus as an assistance tool.
Abstract:
The emerging ‘responsibility to protect’ (R2P) principle presents a significant challenge to the BRICS (Brazil, Russia, India, China and South Africa) states’ traditional emphasis on a strict Westphalian understanding of state sovereignty and non-interference in domestic affairs. Despite formally endorsing R2P at the 2005 World Summit, each of the BRICS has, to varying degrees, retained misgivings about coercive measures under the doctrine’s third pillar. This paper examines how these rising powers engaged with R2P during the 2011–2012 Libyan and Syrian civilian protection crises. The central finding is that although all five states expressed similar concerns over NATO’s military campaign in Libya, they have been unable to maintain a common BRICS position on R2P in Syria. Instead, the BRICS have splintered into two sub-groups. The first, consisting of Russia and China, remains steadfastly opposed to any coercive measures against Syria. The second, comprising the democratic IBSA states (India, Brazil and South Africa) has displayed softer, more flexible stances towards proposed civilian protection measures in Syria, although these three states also remain cautious about the implementation of R2P’s coercive dimension. This paper identifies a number of factors which help to explain this split, arguing that the failure to maintain a cohesive BRICS position on R2P is unsurprising given the many internal differences and diverging national interests between the BRICS members. Overall, the BRICS’ ongoing resistance to intervention is unlikely to disappear quickly, indicating that further attempts to operationalize R2P’s third pillar may prove difficult.
Abstract:
Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, as theories need factual support, so do we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy, and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs.
Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.
Abstract:
Two approaches are described, which aid the selection of the most appropriate procurement arrangements for a building project. The first is a multi-attribute technique based on the National Economic Development Office procurement path decision chart. A small study is described in which the utility factors involved were weighted by averaging the scores of five 'experts' for three hypothetical building projects. A concordance analysis is used to provide some evidence of any abnormal data sources. When applied to the study data, one of the experts was seen to be atypical. The second approach is by means of discriminant analysis. This was found to provide reasonably consistent predictions through three discriminant functions. The analysis also showed the quality criteria to have no significant impact on the decision process. Both approaches provided identical and intuitively correct answers in the study described. Some concluding remarks are made on the potential of discriminant analysis for future research and development in procurement selection techniques.
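The multi-attribute step described above can be sketched as follows. The factor names, expert weights and path scores below are invented for illustration and are not those of the NEDO chart or the study's data: each expert weights the utility factors, the weights are averaged across experts, and each procurement path is ranked by its weighted score.

```python
import numpy as np

# Hypothetical utility factors and expert weightings (one row per expert).
factors = ["speed", "certainty", "flexibility", "quality"]
expert_weights = np.array([
    [0.4, 0.3, 0.2, 0.1],
    [0.3, 0.3, 0.2, 0.2],
    [0.5, 0.2, 0.2, 0.1],
    [0.4, 0.2, 0.3, 0.1],
    [0.4, 0.3, 0.1, 0.2],
])
weights = expert_weights.mean(axis=0)  # average the five experts' weights

# Hypothetical scores (0-100) of each procurement path on each factor.
paths = {
    "traditional":  [40, 80, 30, 70],
    "design_build": [80, 60, 50, 60],
    "management":   [70, 40, 80, 65],
}

# Rank paths by weighted score, highest first.
ranked = sorted(paths, key=lambda p: -np.dot(weights, paths[p]))
best = ranked[0]
```

An atypical expert of the kind the concordance analysis flags would show up here as a row of `expert_weights` far from the column means; averaging dampens, but does not remove, that influence.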
Abstract:
Neutrophils constitute 50-60% of all circulating leukocytes; they present the first line of microbicidal defense and are involved in inflammatory responses. To examine immunocompetence in athletes, numerous studies have investigated the effects of exercise on the number of circulating neutrophils and their response to stimulation by chemotactic stimuli and activating factors. Exercise causes a biphasic increase in the number of neutrophils in the blood, arising from increases in catecholamine and cortisol concentrations. Moderate intensity exercise may enhance neutrophil respiratory burst activity, possibly through increases in the concentrations of growth hormone and the inflammatory cytokine IL-6. In contrast, intense or long duration exercise may suppress neutrophil degranulation and the production of reactive oxidants via elevated circulating concentrations of epinephrine (adrenaline) and cortisol. There is evidence of neutrophil degranulation and activation of the respiratory burst following exercise-induced muscle damage. In principle, improved responsiveness of neutrophils to stimulation following exercise of moderate intensity could mean that individuals participating in moderate exercise may have improved resistance to infection. Conversely, competitive athletes undertaking regular intense exercise may be at greater risk of contracting illness. However, there are limited data to support this concept. To elucidate the cellular mechanisms involved in the neutrophil responses to exercise, researchers have examined changes in the expression of cell membrane receptors, the production and release of reactive oxidants and more recently, calcium signaling. The investigation of possible modifications of other signal transduction events following exercise has not been possible because of current methodological limitations. 
At present, variation in exercise-induced alterations in neutrophil function appears to be due to differences in exercise protocols, training status, sampling points and laboratory assay techniques.