172 results for fixed point formulae
Abstract:
Young children are most vulnerable to, and most at risk from, environmental and sustainability challenges. Until recently, however, early education investments aimed at addressing such issues have been neglected or under-rated. Fortunately, this is changing. A groundswell of practitioner interest in early childhood environmental education/education for sustainability is emerging, in contrast to the ‘patches of green’ that characterised previous decades. Indeed, an international coalition for early childhood education for sustainability (ECEfS) is beginning to develop, evidenced by The Gothenburg Recommendations on Education for Sustainable Development (2008), which identify early childhood, within a framework of lifelong learning, as a ‘natural starting point’ for all ongoing education for sustainability. This document is important as it is the first international statement to explicitly identify ECEfS as contributing to education for sustainability. The next challenge for ECEfS is for practitioner mobilisation to be matched by research activity aimed at broadening and deepening practice-based responses. This is the next exciting frontier in the legitimisation of ECEfS.
Abstract:
This study of the photocatalytic oxidation of phenol over titanium dioxide films presents a method for the evaluation of true reaction kinetics. A flat plate reactor was designed for the specific purpose of investigating the influence of various reaction parameters, specifically photocatalytic film thickness, solution flow rate (1–8 l min⁻¹), phenol concentration (20, 40 and 80 ppm), and irradiation intensity (70.6, 57.9, 37.1 and 20.4 W m⁻²), in order to further understand their impact on the reaction kinetics. Special attention was given to the mass transfer phenomena and the influence of film thickness. The kinetics of phenol degradation were investigated at different irradiation levels and initial pollutant concentrations. Photocatalytic degradation experiments were performed to evaluate the influence of mass transfer on the reaction and, in addition, the benzoic acid method was applied to evaluate the mass transfer coefficient. For this study the reactor was modelled as a batch-recycle reactor. A system of equations that accounts for irradiation, mass transfer and reaction rate was developed to describe the photocatalytic process, to fit the experimental data and to obtain kinetic parameters. The rate of phenol photocatalytic oxidation was described by a Langmuir–Hinshelwood type law that included competitive adsorption and degradation of phenol and its by-products. The by-products were modelled through their additive effect on the solution’s total organic carbon.
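The Langmuir–Hinshelwood rate form with competitive adsorption described above can be sketched as follows; the rate and adsorption constants used here are illustrative placeholders, not the fitted values from the study.

```python
def lh_rate(c_phenol, c_byproducts, k=1.0, K_p=0.05, K_b=0.02):
    """Langmuir-Hinshelwood rate with competitive adsorption:

        r = k * K_p * C_p / (1 + K_p * C_p + K_b * C_b)

    k (rate constant) and K_p, K_b (adsorption constants for phenol and
    its by-products) are illustrative values, not the paper's fitted ones.
    """
    return k * K_p * c_phenol / (1.0 + K_p * c_phenol + K_b * c_byproducts)
```

At low concentrations the rate is approximately first order in phenol; at high concentrations it saturates towards k, and adsorbed by-products compete for surface sites and slow the phenol reaction, which is the behaviour the paper captures via total organic carbon.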
Abstract:
This paper presents a guidance approach for aircraft in periodic inspection tasks. The periodic inspection task involves flying to a series of desired fixed points of inspection with specified attitude requirements so that requirements for downward looking sensors, such as cameras, are achieved. We present a solution using a precision guidance law and a bank turn dynamics model. High fidelity simulation studies illustrate the effectiveness of this approach under both ideal (nil-wind) and non-ideal (wind) conditions.
Abstract:
Fixed-wing aircraft equipped with downward-pointing cameras and/or LiDAR can be used for inspecting approximately piecewise linear assets such as oil and gas pipelines, roads and powerlines. Automatic control of such aircraft is important from a productivity and safety point of view (long periods of precision manual flight at low altitude are not considered reasonable from a safety perspective). This paper investigates the effect of any unwanted coupling between the guidance and autopilot loops (typically caused by unmodeled delays in the aircraft’s response), and the specific impact of any unwanted dynamics on the performance of aircraft undertaking inspection of piecewise linear corridor assets (such as powerlines). Simulation studies and experimental flight tests are used to demonstrate the benefits of a simple compensator in mitigating the unwanted lateral oscillatory behaviour (or coupling) that is caused by unmodeled time constants in the aircraft dynamics.
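A minimal simulation of the effect described above, with invented vehicle parameters: proportional cross-track guidance wrapped around a heading response with an unmodeled first-order lag oscillates about the track, and a simple rate-feedback (lead-style) compensator damps the oscillation. This is a sketch of the phenomenon, not the paper’s aircraft model or compensator.

```python
import numpy as np

# Invented parameters: airspeed (m/s), unmodeled heading lag (s),
# cross-track gain, compensator lead time (s), integration step (s).
V, T, K, Td, dt = 30.0, 1.0, 0.05, 0.5, 0.01

def track(Td):
    """Simulate cross-track error y under small-angle linearized dynamics.

    The guidance loop commands heading from cross-track error; the actual
    heading follows the command through a first-order lag of time constant T
    (the 'unmodeled delay').  Td > 0 adds cross-track-rate feedback
    (ydot ~= V * psi), i.e. a simple lead-style compensator.
    """
    y, psi = 20.0, 0.0                    # initial offset (m), heading (rad)
    ys = []
    for _ in range(int(20.0 / dt)):
        psi_c = -K * (y + Td * V * psi)   # commanded heading
        psi += dt * (psi_c - psi) / T     # lagged heading response
        y += dt * V * psi                 # cross-track kinematics
        ys.append(y)
    return np.array(ys)

y_plain, y_comp = track(0.0), track(Td)   # without / with compensator
```

With these numbers the uncompensated loop overshoots the track by several metres before settling, while the compensated loop approaches the track with much smaller overshoot, mirroring the oscillation-mitigation result reported in the paper.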
Abstract:
This paper formulates a node-based smoothed conforming point interpolation method (NS-CPIM) for solid mechanics. In the proposed NS-CPIM, higher-order conforming PIM (CPIM) shape functions are constructed to produce a continuous, piecewise-quadratic displacement field over the whole problem domain, and the smoothed strain field is obtained through a smoothing operation over each smoothing domain associated with the domain nodes. The smoothed Galerkin weak form is then developed to create the discretized system equations. Numerical studies demonstrate the following good properties of NS-CPIM: (1) it passes both the standard and the quadratic patch test; (2) it provides an upper bound on the strain energy; (3) it avoids volumetric locking; (4) it provides higher accuracy than the node-based smoothed schemes of the original PIMs.
Abstract:
In this article, an enriched radial point interpolation method (e-RPIM) is developed for computational mechanics. The conventional radial basis function (RBF) interpolation is augmented with suitable basis functions to reflect the natural properties of deformation. The performance of the enriched meshless RBF shape functions is first investigated using surface fitting. The surface fitting results show that the enriched RBF interpolation fits a complex surface with much better accuracy than the conventional RBF interpolation. The enriched RBF shape functions not only possess all the advantages of the conventional RBF interpolation, but also accurately reflect the deformation properties of the problem. The system of equations for two-dimensional solids is then derived based on the enriched RBF shape functions, using both the meshless strong form and weak form. A numerical example of a bar is presented to study the effectiveness and efficiency of e-RPIM. As an important application, the newly developed e-RPIM, augmented by selected trigonometric basis functions, is applied to crack problems. It is demonstrated that the present e-RPIM is very accurate and stable for fracture mechanics problems.
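The idea of enriching an RBF interpolant with extra basis functions can be sketched in one dimension; the multiquadric kernel, the constrained augmented system, and the trigonometric enrichment below follow the general e-RPIM idea but are not the paper’s exact formulation.

```python
import numpy as np

def rbf_fit(x, f, basis=None, c=1.0):
    """Fit f(x) with multiquadric RBFs, optionally augmented ('enriched')
    with extra basis functions g_j(x), e.g. trigonometric terms.

    The enriched system is the standard constrained form
        [Phi  P] [a]   [f]
        [P^T  0] [b] = [0]
    so that nodal values are interpolated exactly while the enrichment
    carries the part of the field it can represent.
    """
    n = len(x)
    Phi = np.sqrt((x[:, None] - x[None, :]) ** 2 + c ** 2)  # multiquadric
    if basis:
        P = np.column_stack([g(x) for g in basis])
        m = P.shape[1]
        A = np.block([[Phi, P], [P.T, np.zeros((m, m))]])
        rhs = np.concatenate([f, np.zeros(m)])
    else:
        A, rhs = Phi, f
    coef = np.linalg.solve(A, rhs)

    def interp(xq):
        Phi_q = np.sqrt((xq[:, None] - x[None, :]) ** 2 + c ** 2)
        val = Phi_q @ coef[:n]
        if basis:
            val += np.column_stack([g(xq) for g in basis]) @ coef[n:]
        return val
    return interp
```

When the field is oscillatory and the enrichment contains matching trigonometric terms, the enriched interpolant reproduces the field far more accurately between nodes than the plain RBF fit, which is the surface-fitting observation the abstract reports.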
Abstract:
The research objectives of this thesis were to contribute to Bayesian statistical methodology, in risk assessment and in spatial and spatio-temporal modelling, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas, and to use these applications as a springboard for developing new statistical methods as well as undertaking analyses that might answer particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems: risk assessment analyses for wastewater, and a four-dimensional dataset assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and again to use Bayesian hierarchical models to explore the necessarily complex modelling of the four-dimensional agricultural data. The specific objectives of the research were: to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure that incorporates all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day’s data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days’ data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations.
This work comprises five papers, two of which have been published, two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of that data as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for the incorporation of experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models are used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and non-structured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and the estimation of the contrast of interest together with its credible intervals.
These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that, with large datasets, the use of WinBUGS becomes more problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days’ data, and we show that soil moisture for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with variances that increase with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, it does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage being analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
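The first objective, credible intervals for the point estimates of a Bayesian net, can be illustrated with plain Monte Carlo on a toy two-node network; the network structure and the Beta distributions standing in for elicited conditional-probability uncertainty are invented for illustration (the thesis uses MCMC over the full DAG).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000  # number of Monte Carlo draws

# Hypothetical two-node network: treatment failure T -> illness I.
# Elicited uncertainty on each conditional probability is expressed as a
# Beta distribution (all parameters invented for illustration only).
p_fail = rng.beta(2, 50, n)             # P(T = fail)
p_ill_given_fail = rng.beta(5, 20, n)   # P(I | T = fail)
p_ill_given_ok = rng.beta(1, 200, n)    # P(I | T = ok)

# Propagate each draw through the network to get the marginal risk,
# turning the net's single point estimate into a full distribution.
p_ill = p_fail * p_ill_given_fail + (1 - p_fail) * p_ill_given_ok

lo, med, hi = np.percentile(p_ill, [2.5, 50, 97.5])
print(f"P(illness): median {med:.4f}, 95% CrI ({lo:.4f}, {hi:.4f})")
```

The point of the exercise is the interval, not the point estimate: two risk scenarios with the same median risk can have very different credible intervals once the uncertainty in the elicited conditional probabilities is carried through.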
Abstract:
This paper presents a comprehensive study to find the most efficient bitrate requirement for delivering mobile video that optimizes bandwidth while maintaining a good user viewing experience. In the study, forty participants were asked to choose the lowest quality video that would still provide a comfortable, long-term viewing experience, knowing that higher video quality is more expensive and bandwidth intensive. This paper proposes the lowest pleasing bitrates and corresponding encoding parameters for five different content types: cartoon, movie, music, news and sports. It also explores how the lowest pleasing quality is influenced by content type, image resolution, bitrate, and the user’s gender, prior viewing experience, and preference. In addition, it analyzes the trajectory of users’ progression while selecting the lowest pleasing quality. The findings reveal that the lowest bitrate requirement for a pleasing viewing experience is much higher than that for the lowest acceptable quality. Users’ criteria for the lowest pleasing video quality relate to the video’s content features, as well as its usage purpose and the user’s personal preferences. These findings can give video providers guidance on what quality they should offer to please mobile users.
Abstract:
For the analysis of material nonlinearity, an effective shear modulus approach based on the strain control method is proposed in this paper using a point collocation method. Hencky’s total deformation theory is used to evaluate the effective shear modulus, Young’s modulus and Poisson’s ratio, which are treated as spatial field variables. These effective properties are obtained by the strain-controlled projection method in an iterative manner. To evaluate the second-order derivatives of the shape function at the field point, the radial basis function (RBF) in the local support domain is used. Several numerical examples are presented to demonstrate the efficiency and accuracy of the proposed method, and comparisons are made with analytical solutions and the finite element method (ABAQUS).
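The iterative evaluation of an effective (secant) modulus can be sketched in one dimension with a Ramberg–Osgood-type law; the material constants and the relaxed fixed-point update below are illustrative, whereas the paper works with full 2D effective-property fields and RBF collocation.

```python
def ro_strain(sigma, E0=200e3, sigma0=250.0, alpha=0.3, n=5):
    """Ramberg-Osgood total strain for a given stress (constants are
    illustrative, roughly steel-like: MPa units)."""
    return sigma / E0 + alpha * (sigma0 / E0) * (sigma / sigma0) ** n

def secant_modulus(eps, E0=200e3, relax=0.5, tol=1e-10, max_iter=500):
    """Strain-controlled secant-modulus iteration: given a total strain,
    repeatedly update the effective modulus E_s = sigma / eps_RO(sigma)
    and the stress sigma = E_s * eps until they are consistent with the
    material law.  The relaxation factor damps the oscillation of the
    naive update, which would otherwise diverge for strongly nonlinear
    states.  Returns (converged stress, effective modulus)."""
    sigma = E0 * eps                       # elastic first guess
    for _ in range(max_iter):
        E_s = sigma / ro_strain(sigma)     # current secant modulus
        sigma_new = relax * sigma + (1.0 - relax) * E_s * eps
        if abs(sigma_new - sigma) < tol:
            sigma = sigma_new
            break
        sigma = sigma_new
    return sigma, sigma / eps
```

In the elastic range the iteration returns essentially E0 after one step; well into the nonlinear range it converges to a substantially reduced effective modulus, which is the quantity the paper distributes as a spatial field variable.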
Abstract:
The concept of produsage developed from the realisation that new language was needed to describe the new phenomena emerging from the intersection of Web 2.0, user-generated content, and social media since the early years of the new millennium. When hundreds, thousands, maybe tens of thousands of participants utilise online platforms to collaborate in the development and continuous improvement of a wide variety of content – from software to informational resources to creative works – and when this work takes place through a series of more or less unplanned, ad hoc, almost random cooperative encounters, then to describe these processes using terms which were developed during the industrial revolution no longer makes much sense. When – exactly because what takes place here is no longer a form of production in any conventional sense of the word – the outcomes of these massively distributed collaborations appear in the form of constantly changing, permanently mutable bodies of work which are owned at once by everyone and no-one, by the community of contributors as a whole but by none of them as individuals, then to conceptualise them as fixed and complete products in the industrial meaning of the term is to miss the point. When what results from these efforts is of a quality (in both depth and breadth) that enables it to substitute for, replace, and even undermine the business model of long-established industrial products, even though it relies precariously on volunteer contributions, and when those volunteering efforts make it possible for some contributors to find semi- or fully professional employment in their field, then conventional industrial logic is turned on its head.
Abstract:
Precise identification of the time when a change in a hospital outcome has occurred enables clinical experts to search for a potential special cause more effectively. In this paper, we develop change point estimation methods for the survival time of a clinical procedure, in the presence of patient mix, in a Bayesian framework. We apply Bayesian hierarchical models to formulate the change point where there exists a step change in the mean survival time of patients who underwent cardiac surgery. The data are right censored since the monitoring is conducted over a limited follow-up period. We capture the effect of risk factors prior to the surgery using a Weibull accelerated failure time regression model. Markov chain Monte Carlo is used to obtain posterior distributions of the change point parameters, including the location and magnitude of changes, and also the corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with risk-adjusted survival time CUSUM control charts for different magnitude scenarios. The proposed estimator performs better when a longer follow-up period (censoring time) is applied. In comparison with the alternative built-in CUSUM estimator, the Bayesian estimator yields more accurate and precise estimates. These advantages are enhanced when the probability quantification, flexibility and generalizability of the Bayesian change point detection model are also considered.
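The core idea, locating a step change in mean survival time under right censoring, can be sketched with a maximum-likelihood grid search on simulated exponential survival times; the paper’s estimator is Bayesian, risk-adjusted and Weibull-based, so everything below (data, constants, likelihood) is a deliberately simplified illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated survival times (days) with a step change in the mean at
# patient 120: exponential lifetimes, right-censored at 90-day follow-up.
n, tau_true, follow_up = 240, 120, 90.0
mean = np.where(np.arange(n) < tau_true, 60.0, 30.0)
t = rng.exponential(mean)
obs = np.minimum(t, follow_up)
event = t <= follow_up                  # False => right-censored record

def cens_exp_loglik(obs, event):
    """Maximized log-likelihood of a censored exponential segment.

    Under Type-I censoring the MLE of the mean is total observed time
    divided by the number of events; plug it back into the likelihood.
    """
    mu = obs.sum() / max(event.sum(), 1)
    return event.sum() * (-np.log(mu)) - obs.sum() / mu

# Profile the change point: split the series at each candidate tau and
# sum the two segments' maximized log-likelihoods.
cands = np.arange(20, n - 20)
ll = [cens_exp_loglik(obs[:k], event[:k]) + cens_exp_loglik(obs[k:], event[k:])
      for k in cands]
tau_hat = cands[int(np.argmax(ll))]
print("estimated change point:", tau_hat)
```

The profile-likelihood curve peaks near the true change point; the Bayesian version in the paper replaces the grid maximization with a posterior over the change point location and magnitude, which additionally yields the probabilistic intervals the abstract mentions.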
Abstract:
This paper presents a nonlinear gust-attenuation controller to stabilize the velocities, attitudes and angular rates of a fixed-wing unmanned aerial vehicle (UAV) in the presence of wind gusts. The proposed controller aims to achieve a steady-state flight condition such that the host UAV can avoid airspace collisions with other UAVs during cruise flight. Based on a typical UAV model capturing flight aerodynamics, a nonlinear H∞ controller with a rapid-response property is developed in consideration of actuator constraints. Simulations are conducted for the Shadow UAV to verify the performance of the proposed controller. Comparative studies with proportional-integral-derivative (PID) controllers demonstrate that the proposed controller exhibits great performance improvement in a gusty environment, making it suitable for integration into the design of flight control systems for cruise flight with safety guarantees.
Abstract:
The draft of the first stage of the national curriculum has now been published. Its final form, to be presented in December 2010, should be the centrepiece of Labor’s Educational Revolution. All the other aspects – personal computers, new school buildings, rebates for uniforms and even the MySchool report card – are marginal to the prescription of what is to be taught and learnt in schools. The seven authors in this journal’s Point and Counterpoint (Curriculum Perspectives, 30(1) 2010, pp. 53-74) raise a number of both large and small issues in education as a whole, and in science education more particularly. Two of them (Groves and McGarry) make brief reference to earlier attempts to achieve a national curriculum in Australia. Those writing from New Zealand and the USA will be unaware of just how ambitious this project is for Australia: a bold and overdue educational adventure, or a foolish political decision destined to fail, as happened in the late 1970s and the 1990s.