352 results for Zero-One Matrices
Abstract:
The purpose of this paper is to investigate the Japanese answer to the 1990s depression by (i) presenting a case study of the framework developed in Japan to address new business challenges and value creation in a complex, ambiguous and uncertain environment, i.e., the Development of Project and Programme Management for Enterprise Innovation (P2M) and the Project Management Association Japan (PMAJ); and (ii) exposing what in our view are the underlying theoretical bases supporting this framework, and drawing from these some theoretical lessons that could be helpful to the development of sound PM standards and a PM competence model. This theoretical approach is assumed to be useful for transposing the Japanese experience to other analogous contexts and situations.
Abstract:
Traffic-generated semi-volatile and non-volatile organic compounds (SVOCs and NVOCs) pose a serious threat to human and ecosystem health when washed off into receiving water bodies by stormwater. Rainfall characteristics influenced by climate change make the estimation of these pollutants in stormwater complex. The research study discussed in this paper developed a prediction framework for such pollutants under the dynamic influence of climate change on rainfall characteristics. Principal component analysis (PCA) established that the intensity and duration of low to moderate rain events induced by climate change mainly affect the wash-off of SVOCs and NVOCs from urban roads. The study outcomes overcame the limitations of stringent laboratory preparation of calibration matrices by extracting uncorrelated underlying factors in the data matrices through the systematic application of PCA and factor analysis (FA). Based on the initial findings from PCA and FA, the framework incorporated an orthogonal rotatable central composite experimental design to set up calibration matrices, and partial least squares regression to identify significant variables for predicting the target SVOCs and NVOCs in four particulate fractions spanning >300 μm to 1 μm and one dissolved fraction of <1 μm. For the particulate fractions in the >300-1 μm range, similar distributions of predicted and observed concentrations of the target compounds from the minimum to the 75th percentile were achieved. The inter-event coefficients of variation for particulate fractions of >300-1 μm were 5% to 25%. The limited solubility of the target compounds in stormwater restricted the predictive capacity of the proposed method for the dissolved fraction of <1 μm.
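As an illustrative sketch of the statistical core of such a framework, the snippet below applies PCA to a calibration matrix to expose uncorrelated underlying factors and then fits a partial least squares regression, mirroring the PCA/FA-then-PLS sequence described above. The data, dimensions and variable names are assumptions for demonstration only, not the study's actual calibration matrices.

```python
# Minimal sketch of a PCA + PLS calibration pipeline, loosely analogous to
# the framework described above. All data and names are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Hypothetical calibration matrix: rows = rain events, columns = predictors
# (e.g., rainfall intensity, duration, antecedent dry days, traffic volume).
X = rng.normal(size=(40, 6))
# Hypothetical response: pollutant load in one particulate fraction.
y = X @ rng.normal(size=6) + rng.normal(scale=0.1, size=40)

# PCA to gauge how many uncorrelated underlying factors drive the data.
pca = PCA().fit(X)
print("Explained variance ratios:", pca.explained_variance_ratio_.round(2))

# PLS regression to relate predictors to the target compound; the number of
# components would normally be chosen from the PCA/FA results and
# cross-validation.
pls = PLSRegression(n_components=2).fit(X, y)
print("R^2 on calibration data:", round(pls.score(X, y), 3))
```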
Abstract:
Radial Hele-Shaw flows are treated analytically using conformal mapping techniques. The geometry of interest has a doubly-connected annular region of viscous fluid surrounding an inviscid bubble that is either expanding or contracting due to a pressure difference caused by injection or suction of the inviscid fluid. The zero-surface-tension problem is ill-posed for both bubble expansion and contraction, as both scenarios involve viscous fluid displacing inviscid fluid. Exact solutions are derived by tracking the location of singularities and critical points in the analytic continuation of the mapping function. We show that by treating the critical points, it is easy to observe finite-time blow-up, and the evolution equations may be written in exact form using complex residues. We present solutions that start with cusps on one interface and end with cusps on the other, as well as solutions that have the bubble contracting to a point. For the latter solutions, the bubble approaches an ellipse in shape at extinction.
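For orientation on the conformal-mapping formulation, the zero-surface-tension evolution in the simply connected case (a single interface, with fluid injected or extracted at rate Q) is governed by the Polubarinova-Galin equation, reproduced below. The paper's doubly-connected annular geometry generalises this to a mapping of a circular annulus, so this form is an orienting sketch rather than the equation actually solved.

```latex
% Polubarinova-Galin equation (simply connected case, zero surface tension):
% z = f(zeta, t) maps the unit disc onto the viscous fluid region;
% Q > 0 corresponds to injection, Q < 0 to suction.
\operatorname{Re}\!\left[\,\frac{\partial f}{\partial t}\,
  \overline{\zeta\,\frac{\partial f}{\partial \zeta}}\,\right]
  = \frac{Q}{2\pi}, \qquad |\zeta| = 1 .
```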
Abstract:
The purpose of this study was to determine the effects of cryotherapy, in the form of cold water immersion, on knee joint position sense. Fourteen healthy volunteers, with no previous knee injury or pre-existing clinical condition, participated in this randomized cross-over trial. The intervention consisted of a 30-min immersion, to the level of the umbilicus, in either cold (14 ± 1°C) or tepid (28 ± 1°C) water. Approximately one week later, in a randomized fashion, the volunteers completed the remaining immersion. Active ipsilateral limb repositioning sense of the right knee was measured using weight-bearing and non-weight-bearing assessments, employing video-recorded 3D motion analysis. These assessments were conducted immediately before and after each immersion. No significant differences were found between treatments for the absolute (P = 0.29), relative (P = 0.21) or variable error (P = 0.86). The average effect size of the outcome measures was modest (range –0.49 to 0.9), and all the associated 95% confidence intervals for these effect sizes crossed zero. These results indicate that there is no evidence of an enhanced risk of injury, following a return to sporting activity, after cold water immersion.
Abstract:
This nuts-and-bolts session discusses QUT Library's Study Solutions service, which is staffed by academic skills advisors and librarians as the second tier of its learning and study support model. Firstly, it will discuss the rationale behind the Study Solutions model and provide a brief profile of the service. Secondly, it will outline what distinguishes it from other modes of one-to-one learning support. Thirdly, it will report findings from a student perception study conducted to determine what difference this model of individual study assistance made to academic confidence, the ability to transfer academic skills and the capacity to assist peers. Finally, the session will include small-group discussions to consider the feasibility of this model as best practice for other tertiary institutions, and student perception as a valuable measure of the impact of learning support services.
Abstract:
The opening phrase of the title is from Charles Darwin's notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or 'causes'; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, a rejection of a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution; formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework, where science is defined as that area of study where it is possible, in principle, to find evidence against hypotheses; they are, in principle, falsifiable. However, the boundaries of science keep expanding with time. In the past, some aspects of evolution lay outside the then-current boundaries of falsifiable science, but new techniques and ideas are steadily expanding those boundaries, and it is appropriate to re-examine some topics. Over the last few decades there appears to have been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; it is simply assumed that some physical factor 'drives' evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors 'driving' evolution, or by an 'explosive radiation'? Our discussion focuses on two of the six mass extinctions: the fifth, the events in the Late Cretaceous, and the sixth, which started at least 50,000 years ago and is ongoing. The Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one such test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one. This reduction is too simplistic in that we need to know about survival and about ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible. Mass extinction number six: human impacts.
On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still a large number of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the 'overkill' hypothesis). We also need to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution of animals in the subfossil record. A better understanding of human impacts worldwide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing 'blame'. While spontaneous generation was still universally accepted, there was the expectation that animals would continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains are less disrupted by time.
Abstract:
A standard method for the numerical solution of partial differential equations (PDEs) is the method of lines. In this approach the PDE is discretised in space using finite differences or similar techniques, and the resulting semidiscrete problem in time is integrated using an initial value problem solver. A significant challenge when applying the method of lines to fractional PDEs is that the non-local nature of the fractional derivatives results in a discretised system where each equation involves contributions from many (possibly every) spatial node. This has important consequences for the efficiency of the numerical solver. First, since the cost of evaluating the discrete equations is high, it is essential to minimise the number of evaluations required to advance the solution in time. Second, since the Jacobian matrix of the system is dense (partially or fully), methods that avoid the need to form and factorise this matrix are preferred. In this paper, we consider a nonlinear two-sided space-fractional diffusion equation in one spatial dimension. A key contribution of this paper is to demonstrate how an effective preconditioner is crucial for improving the efficiency of the method of lines for solving this equation. In particular, we show how to construct suitable banded approximations to the system Jacobian for preconditioning purposes that permit high orders and large stepsizes to be used in the temporal integration, without requiring dense matrices to be formed. The results of numerical experiments are presented that demonstrate the effectiveness of this approach.
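As a rough illustration of the preconditioning idea (not the paper's actual discretisation or solver), the sketch below solves a synthetic dense system, standing in for a fractional-diffusion Jacobian, with matrix-free GMRES preconditioned by a banded approximation built from a few diagonals of the matrix.

```python
# Sketch: banded preconditioning of a dense "Jacobian-like" system.
# The dense matrix is synthetic; a real fractional-diffusion Jacobian
# would come from the spatial discretisation of the PDE.
import numpy as np
from scipy.sparse import csc_matrix, diags
from scipy.sparse.linalg import LinearOperator, gmres, splu

n = 500
rng = np.random.default_rng(1)

# Synthetic matrix: dominant diagonal plus slowly decaying off-diagonal
# entries, mimicking the non-local coupling of fractional derivatives.
i, j = np.indices((n, n))
A = -1.0 / (1.0 + np.abs(i - j)) ** 2
np.fill_diagonal(A, 4.0)
b = rng.normal(size=n)

# Matrix-free application of A (in practice, a routine that evaluates the
# discretised operator without ever forming the dense matrix).
A_op = LinearOperator((n, n), matvec=lambda v: A @ v)

# Banded preconditioner: keep only a few diagonals of A and factorise.
bandwidth = 2
offsets = list(range(-bandwidth, bandwidth + 1))
banded = csc_matrix(diags([np.diag(A, k) for k in offsets], offsets))
lu = splu(banded)
M = LinearOperator((n, n), matvec=lu.solve)

x_prec, info_prec = gmres(A_op, b, M=M)
x_none, info_none = gmres(A_op, b)
print("preconditioned: info =", info_prec,
      " residual =", np.linalg.norm(A @ x_prec - b))
print("unpreconditioned: info =", info_none,
      " residual =", np.linalg.norm(A @ x_none - b))
```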
Abstract:
Soil organic carbon sequestration rates over 20 years, based on the Intergovernmental Panel on Climate Change (IPCC) methodology, were combined with local economic data to determine the potential for soil C sequestration in wheat-based production systems on the Indo-Gangetic Plain (IGP). The C sequestration potential of rice–wheat systems of India on conversion to no-tillage is estimated to be 44.1 Mt C over 20 years. Implementing no-tillage practices in maize–wheat and cotton–wheat production systems would yield an additional 6.6 Mt C. This offset is equivalent to 9.6% of India's annual greenhouse gas emissions (519 Mt C) from all sectors (excluding land use change and forestry), or less than one per cent per annum. The economic analysis was summarized as carbon supply curves expressing the total additional C accumulated over 20 years for a price per tonne of carbon sequestered ranging from zero to USD 200. At a carbon price of USD 25 Mg C⁻¹, 3 Mt C (7% of the soil C sequestration potential) could be sequestered over 20 years through the implementation of no-till cropping practices in rice–wheat systems of the Indian States of the IGP, increasing to 7.3 Mt C (17% of the soil C sequestration potential) at USD 50 Mg C⁻¹. Maximum levels of sequestration could be attained with carbon prices approaching USD 200 Mg C⁻¹ for the States of Bihar and Punjab. At this carbon price, a total of 34.7 Mt C (79% of the estimated C sequestration potential) could be sequestered over 20 years across the rice–wheat region of India, with Uttar Pradesh contributing 13.9 Mt C.
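To make the supply-curve construction concrete, the toy sketch below computes the quantity of carbon supplied at a given price as the total potential of all regions whose per-tonne sequestration cost does not exceed that price. The regions and numbers are hypothetical, not the paper's data.

```python
# Toy carbon supply curve: quantity supplied at price p is the summed
# potential of all regions with marginal cost <= p. Values are hypothetical.
regions = [
    # (region, potential in Mt C over 20 years, cost in USD per Mg C)
    ("Region A", 10.0, 20.0),
    ("Region B", 15.0, 45.0),
    ("Region C", 12.0, 120.0),
    ("Region D", 7.0, 190.0),
]

for price in (0, 25, 50, 100, 200):
    supplied = sum(pot for _, pot, cost in regions if cost <= price)
    print(f"At USD {price}/Mg C: {supplied:.1f} Mt C over 20 years")
```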
Abstract:
Background: The importance of quality-of-life (QoL) research in patients with head and neck (H&N) cancer has been recognised over the past two decades. The aims of this systematic review are to evaluate the QoL status of H&N cancer survivors one year after treatment and to identify the determinants affecting their QoL. Methods: PubMed, MEDLINE, Scopus, ScienceDirect and CINAHL (2000–2011) were searched for relevant studies, and two of the present authors assessed their methodological quality. The characteristics and main findings of the studies were extracted and reported. Results: Thirty-seven studies met the inclusion criteria, and the methodological quality of the majority was moderate to high. While patients in this group recover their global QoL by 12 months after treatment, a number of outstanding issues persist: deterioration in physical functioning, fatigue, xerostomia and sticky saliva. Age, cancer site, stage of disease, social support, smoking, feeding tube placement and alcohol consumption are significant determinants of QoL at 12 months, while gender has little or no influence. Conclusions: Regular assessments should be carried out to monitor physical functioning, degree of fatigue, xerostomia and sticky saliva. Further research is required to develop appropriate and effective interventions to deal with these issues, and thus to promote patients' QoL.
Abstract:
Nitrate reduction with nanoscale zero-valent iron (NZVI) has been reported as a potential technology for removing nitrate from contaminated water. In this paper, nitrate reduction was conducted with NZVI prepared by hydrogen reduction of natural goethite (NZVI-N, where -N denotes natural goethite) and of hydrothermal goethite (NZVI-H, where -H denotes hydrothermal goethite). The effects of reaction time, nitrate concentration and iron-to-nitrate ratio on the nitrate removal rate over NZVI-H and NZVI-N were also investigated. To demonstrate their nitrate reduction capacities, NZVI-N and NZVI-H were compared with ordinary zero-valent iron (OZVI) in static experiments. Based on these investigations, a mechanism for nitrate reduction with NZVI-N was proposed. The results showed that reaction time, nitrate concentration and iron-to-nitrate ratio all played an important role in nitrate reduction by NZVI-N and NZVI-H. Compared with OZVI, NZVI-N and NZVI-H showed little dependence on pH, and NZVI-N offered higher stability than NZVI-H for nitrate reduction because of Al-substitution in the natural goethite. Furthermore, NZVI-N, prepared by hydrogen reduction of goethite, had higher activity for nitrate reduction; the products contained hydrogen, nitrogen, NH₄⁺ and a little nitrite, but no NOₓ, while NZVI-N itself was oxidized to Fe²⁺. As a relatively easy and cost-effective method, nitrate reduction with NZVI-N has great potential for application in the removal of nitrate from groundwater.
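For orientation, the overall stoichiometry most commonly cited for nitrate reduction to ammonium by zero-valent iron under acidic conditions is shown below; this is the textbook reaction rather than necessarily the exact mechanism proposed in the paper, and side reactions can yield N₂, nitrite and H₂, consistent with the product mix reported above.

```latex
% Commonly cited overall reaction for nitrate reduction by zero-valent
% iron to ammonium under acidic conditions (textbook stoichiometry).
4\,\mathrm{Fe}^{0} + \mathrm{NO}_3^{-} + 10\,\mathrm{H}^{+}
  \longrightarrow 4\,\mathrm{Fe}^{2+} + \mathrm{NH}_4^{+} + 3\,\mathrm{H_2O}
```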
Abstract:
Over the last twenty years, the use of open content licences has become increasingly and surprisingly popular. The use of such licences challenges the traditional incentive-based model of exclusive rights under copyright. Instead of providing a means to charge for the use of particular works, what seems important is mitigating potential personal harm to the author and, in some cases, preventing non-consensual commercial exploitation. It is interesting in this context to observe the primacy of what are essentially moral rights over the exclusionary economic rights. The core elements of common open content licences map somewhat closely to continental conceptions of the moral rights of authorship. Most obviously, almost all free software and free culture licences require attribution of authorship. More interestingly, there is a tension between the social norms developed in free software communities and those that have emerged in the creative arts over integrity and commercial exploitation. For programmers interested in free software, licence terms that prohibit commercial use or modification are almost completely inconsistent with the ideological and utilitarian values that underpin the movement. For those in the creative industries, on the other hand, non-commercial terms and, to a lesser extent, terms that prohibit all but verbatim distribution continue to play an extremely important role in the sharing of copyright material. While prohibitions on commercial use often serve an economic imperative, there is also a certain personal interest for many creators in avoiding harmful exploitation of their expression, an interest that has sometimes been recognised as forming a component of the moral right of integrity. One particular continental moral right, the right of withdrawal, is present neither in Australian law nor in any of the common open content licences. Despite some marked differences, both free software and free culture participants are using contractual methods to articulate the norms of permissible sharing. Legal enforcement is rare and often prohibitively expensive, and the various communities accordingly rely upon shared understandings of acceptable behaviour. The licences that are commonly used represent a formalised expression of these community norms and provide the theoretically enforceable legal baseline that lends them legitimacy. The core terms of these licences are designed primarily to alleviate risk and minimise transaction costs in sharing and using copyright expression. Importantly, however, the range of available licences reflects different optional balances in the norms of creating and sharing material. Generally, it is possible to see that, stemming particularly from the US, open content licences are fundamentally important in providing a set of normatively accepted copyright balances that reflect the interests sought to be protected through moral rights regimes. As the cost of creation, distribution, storage and processing of expression continues to fall towards zero, there are increasing incentives to adopt open content licences to facilitate wide distribution and reuse of creative expression. Thinking of these protocols not only as reducing transaction costs but as setting normative principles of participation assists in conceptualising the role of open content licences and the continuing tensions that permeate modern copyright law.
Abstract:
The health effects of environmental hazards are often examined using time series of the association between a daily response variable (e.g., deaths) and a daily level of exposure (e.g., temperature). Exposures are usually averaged across a network of stations. This gives each station equal importance and removes the opportunity for some stations to act as better measures of exposure. We used a Bayesian hierarchical model that weighted stations using random variables between zero and one. We compared the weighted estimates to the standard model using data on health outcomes (deaths and hospital admissions) and exposures (air pollution and temperature) in Brisbane, Australia. The improvements in model fit were relatively small, and the estimated health effects of pollution were similar using either the standard or the weighted estimates. Spatially weighted exposures would probably be more worthwhile when there is either greater spatial detail in the health outcome or greater spatial variation in exposure.
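A minimal sketch of such a model, assuming a Beta prior for the station weights and a log-linear Poisson likelihood for daily deaths, is given below using PyMC; the data, priors and likelihood are illustrative assumptions, not the authors' exact specification.

```python
# Sketch: Bayesian model with station weights constrained to (0, 1),
# in the spirit of the weighted-exposure approach described above.
# All data and distributional choices here are illustrative.
import numpy as np
import pymc as pm

rng = np.random.default_rng(2)
n_days, n_stations = 365, 5

# Hypothetical daily readings from each station and daily death counts.
station_exposure = rng.normal(25.0, 3.0, size=(n_days, n_stations))
deaths = rng.poisson(10, size=n_days)

with pm.Model():
    # Station weights: random variables between zero and one, normalised
    # so the combined exposure is a convex combination of stations.
    w = pm.Beta("w", alpha=1.0, beta=1.0, shape=n_stations)
    w_norm = w / pm.math.sum(w)
    exposure = pm.math.dot(station_exposure, w_norm)
    exposure_c = (exposure - 25.0) / 3.0  # crude standardisation for stability

    # Simple log-linear Poisson regression for the health outcome.
    b0 = pm.Normal("b0", mu=0.0, sigma=10.0)
    b1 = pm.Normal("b1", mu=0.0, sigma=1.0)
    pm.Poisson("deaths", mu=pm.math.exp(b0 + b1 * exposure_c), observed=deaths)

    idata = pm.sample(1000, tune=1000, chains=2)
```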
Abstract:
This short article summarises some of the reforms to surrogacy laws in Queensland proposed by the Liberal National Party in 2012. The paper outlines some of the main objections that could be voiced in response to the proposed changes to the law.