982 results for mastery approach
Abstract:
Index tracking is an investment approach whose primary objective is to keep the portfolio return as close as possible to a target index without purchasing all index components. The main purpose is to minimize the tracking error between the returns of the selected portfolio and the benchmark. In this paper, quadratic as well as linear models are presented for minimizing the tracking error. Uncertainty in the input data is handled using a tractable robust framework that controls the level of conservatism while maintaining linearity. The linearity of the proposed robust optimization models allows a simple implementation with an ordinary optimization software package to find the optimal robust solution. The proposed model employs the Morgan Stanley Capital International (MSCI) Index as the target index, and results are reported for six national indices: Japan, the USA, the UK, Germany, Switzerland and France. The performance of the proposed models is evaluated using several financial criteria, e.g. the information ratio, market ratio, Sharpe ratio and Treynor ratio. The preliminary results demonstrate that the proposed model reduces tracking error while improving portfolio performance measures.
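As a hedged illustration only (not the authors' exact formulation), the tracking-error objectives behind such quadratic and linear models can be sketched as follows, where w_i are portfolio weights, r_{it} is the return of asset i in period t, and I_t is the index return:

```latex
% Quadratic tracking error (mean squared deviation from the index)
\min_{w}\ \frac{1}{T}\sum_{t=1}^{T}\Bigl(\sum_{i=1}^{n} w_i r_{it} - I_t\Bigr)^{2}
% Linear (mean absolute deviation) variant
\min_{w}\ \frac{1}{T}\sum_{t=1}^{T}\Bigl|\sum_{i=1}^{n} w_i r_{it} - I_t\Bigr|
% subject to the usual budget and no-short-selling constraints
\text{s.t.}\quad \sum_{i=1}^{n} w_i = 1,\qquad w_i \ge 0 .
```

The absolute-deviation form is the one that can be linearised with auxiliary variables, which is what allows an ordinary (linear) solver to handle the robust counterpart.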
Abstract:
This paper provides an important and timely overview of a conceptual framework designed to assist with the development of message content as well as the evaluation of persuasive health messages. While an earlier version of this framework was presented in a prior publication by the authors in 2009, important refinements have seen it evolve in recent years, warranting an updated review. This paper outlines the Step approach to Message Design and Testing (SatMDT), presenting the theoretical evidence that underpins each of the framework's steps as well as the empirical evidence demonstrating their relevance and feasibility. The development and testing of the framework have thus far been based exclusively within the road safety advertising context; however, the view expressed herein is that the framework may have broader appeal and application to the health persuasion context.
Abstract:
This report describes the Year One Pilot Study processes and articulates findings from the major project components designed to address the challenges noted above (see Figure 1). Specifically, the pilot study tested the campaign research and development process, involving participatory design with young people and sector partners, and the efficacy and practicality of conducting a longitudinal, randomised controlled trial online with minors, including ways of linking survey data to campaign data. Each sub-study comprehensively considered the ethical requirements of conducting online research with minors in school settings. The theoretical and methodological framework for measuring campaign engagement and efficacy (Sub-studies 3, 4 and 5) drew on the Model of Goal-Directed Behaviour (MGB) (Perugini & Bagozzi, 2001) and Nudge Theory (Thaler & Sunstein, 2008).
Abstract:
Purpose: Health service quality is an important determinant of health service satisfaction and behavioral intentions. The purpose of this paper is to investigate requirements of e-health services and to develop a measurement model to analyze the construct of "perceived e-health service quality." Design/methodology/approach: The paper adapts the C-OAR-SE procedure for scale development by Rossiter. The focal aspect is the physician-patient relationship, which forms the core dyad in healthcare service provision. Several in-depth interviews were conducted in Switzerland, first with six patients (as raters) and then with two healthcare system experts (as judges). Based on the results and an extensive literature review, the classification of object and attributes is developed for this model. Findings: The construct of e-health service quality can be described as an abstract formative object and is operationalized with 13 items: accessibility, competence, information, usability/user friendliness, security, system integration, trust, individualization, empathy, ethical conduct, degree of performance, reliability, and ability to respond. Research limitations/implications: Limitations include the number of interviews with patients and experts as well as critical issues associated with C-OAR-SE. More empirical research is needed to confirm the quality indicators of e-health services. Practical implications: Health care providers can utilize the results to evaluate their service quality. Practitioners can use the hierarchical structure to measure service quality at different levels. The model provides a diagnostic tool to identify poor and/or excellent performance with regard to e-service delivery. Originality/value: The paper contributes to knowledge on the measurement of e-health quality and improves the understanding of how customers evaluate the quality of e-health services.
Abstract:
The microbially mediated production of nitrous oxide (N2O) and its reduction to dinitrogen (N2) via denitrification represent a loss of nitrogen (N) from fertilised agro-ecosystems to the atmosphere. Although denitrification has received great interest from biogeochemists in recent decades, the magnitude of N2 losses and the related N2:N2O ratios from soils are still largely unknown due to methodological constraints. We present a novel 15N tracer approach, based on a previously developed tracer method for studying denitrification in pure bacterial cultures, modified for use with soil incubations in a completely automated laboratory set-up. The method replaces the background air in the incubation vessels with a helium-oxygen gas mixture with a 50-fold reduced N2 background (2% v/v). This allows direct and sensitive quantification of N2 and N2O emissions from the soil by isotope-ratio mass spectrometry after 15N labelling of denitrification N substrates, while minimising sensitivity to the intrusion of atmospheric N2. The incubation set-up was used to determine the influence of different soil moisture levels on N2 and N2O emissions from a sub-tropical pasture soil in Queensland, Australia. The soil was labelled with the equivalent of 50 μg N per gram dry soil by broadcast application of KNO3 solution (4 at.% 15N) and incubated for 3 days at 80% and 100% water-filled pore space (WFPS), respectively. The headspace of the incubation vessel was sampled automatically over 12 hrs each day, and 3 samples of headspace gas (0, 6, and 12 hrs after incubation start) were analysed for N2 and N2O with an isotope-ratio mass spectrometer (DELTA V Plus, Thermo Fisher Scientific, Bremen, Germany). In addition, the soil was analysed for 15N NO3- and NH4+ using the 15N diffusion method, which enabled us to obtain a complete N balance. The method proved to be highly sensitive, detecting N2O emissions ranging from 20 to 627 μg N kg-1 soil hr-1 and N2 emissions ranging from 4.2 to 43 μg N kg-1 soil hr-1 across the different treatments. The main end-product of denitrification was N2O for both water contents, with N2 accounting for 9% and 13% of the total denitrification losses at 80% and 100% WFPS, respectively. Between 95% and 100% of the added 15N fertiliser could be recovered. Gross nitrification over the 3 days amounted to 8.6 μg N g-1 soil and 4.7 μg N g-1 soil, and denitrification to 4.1 μg N g-1 soil and 11.8 μg N g-1 soil at 80% and 100% WFPS, respectively. The results confirm that the tested method allows direct and highly sensitive detection of N2 and N2O fluxes from soils and hence offers a sensitive tool for studying denitrification and N turnover in terrestrial agro-ecosystems.
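For orientation only (generic notation, not the authors'), the N2 share of total denitrification losses quoted above (9% and 13%) corresponds to the usual product-ratio definition on an N basis:

```latex
f_{\mathrm{N_2}} \;=\; \frac{\mathrm{N_2\text{-}N}}{\mathrm{N_2\text{-}N} + \mathrm{N_2O\text{-}N}},
\qquad \text{e.g. } f_{\mathrm{N_2}} \approx 0.09 \text{ at } 80\%\ \mathrm{WFPS}.
```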
Abstract:
In life cycle assessment studies, greenhouse gas (GHG) emissions from direct land-use change have been estimated to make a significant contribution to the global warming potential of agricultural products. However, these estimates have high uncertainty due to the complexity of data requirements and the difficulty of attributing land-use change. This paper presents estimates of GHG emissions from direct land-use change from native woodland to grazing land for two beef production regions in eastern Australia, which were the subject of a multi-impact life cycle assessment study for premium beef production. Spatially and temporally consistent datasets were derived for areas of forest cover and biomass carbon stocks using published remotely sensed tree-cover data and regionally applicable allometric equations consistent with Australia's national GHG inventory report. Standard life cycle assessment methodology was used to estimate GHG emissions and removals from direct land-use change attributed to beef production. For the northern-central New South Wales region of Australia, estimates ranged from a net emission of 0.03 t CO2-e ha-1 year-1 to a net removal of 0.12 t CO2-e ha-1 year-1 under the low and high scenarios, respectively, for sequestration in regrowing forests. For the same period (1990-2010), the study region in southern-central Queensland was estimated to have net emissions from land-use change in the range of 0.45-0.25 t CO2-e ha-1 year-1. The difference between regions reflects the continuation of higher rates of deforestation in Queensland until strict regulation in 2006, whereas native vegetation protection laws were introduced earlier in New South Wales. On the basis of liveweight produced at the farm gate, emissions from direct land-use change for 1990-2010 were comparable in magnitude to those from other on-farm sources, which were dominated by enteric methane. However, calculating land-use change impacts for the Queensland region for a period starting in 2006 gave a range from net emissions of 0.11 t CO2-e ha-1 year-1 to net removals of 0.07 t CO2-e ha-1 year-1. This study demonstrated a method for deriving spatially and temporally consistent datasets to improve estimates of direct land-use change impacts in life cycle assessment. It identified areas of uncertainty, including rates of sequestration in woody regrowth and impacts of land-use change on soil carbon stocks in grazed woodlands, but also showed the potential for direct land-use change to represent a net sink for GHG.
Abstract:
Researchers have highlighted the importance of the nonprofit sector, its continued growth, and a relative lack of literature, particularly related to nonprofit organizational values. Therefore, this study investigates organizational culture in a human services nonprofit organization. The relationship between person-organization value congruence and employee and volunteer job-related attitudes is examined (N = 227). Following initial qualitative enquiry, confirmatory factor analyses of the Competing Values Framework and additional values revealed five dimensions of organizational values. The relationship between value congruence and employee and volunteer job-related attitudes was then examined using polynomial regression techniques. Analyses revealed that for employees, job-related attitudes were influenced strongly by organization value ratings, particularly when these exceeded person ratings of the same values. For volunteers, person value ratings exceeding organization value ratings were especially detrimental to their job-related attitudes. Findings are discussed in terms of their theoretical and practical implications.
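The following is a minimal sketch of the kind of polynomial (response-surface) regression referred to above; the column names and the exact quadratic specification are illustrative assumptions, not the study's model:

```python
# Hypothetical congruence-style polynomial regression: regress a job attitude
# on centred person (P) and organization (O) value ratings plus quadratic terms.
import numpy as np
import statsmodels.api as sm

def fit_congruence_surface(df):
    P = df["person_rating"] - df["person_rating"].mean()  # centre to ease interpretation
    O = df["org_rating"] - df["org_rating"].mean()
    X = sm.add_constant(np.column_stack([P, O, P**2, P * O, O**2]))  # response surface terms
    return sm.OLS(df["attitude"], X).fit()  # surface tests combine the five slope estimates
```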
Abstract:
Efficient and accurate geometric and material nonlinear analysis of structures under ultimate loads is a backbone of the success of integrated analysis and design, the performance-based design approach, and progressive collapse analysis. This paper presents an advanced computational technique, a higher-order element formulation with the refined plastic hinge approach, which can evaluate concrete and steel-concrete structures prone to nonlinear material effects (i.e. gradual yielding, full plasticity, strain hardening under the interaction between axial and bending actions, and load redistribution) as well as nonlinear geometric effects (i.e. the second-order P-δ and P-Δ effects and their associated strength and stiffness degradation). Further, this paper presents the cross-section analysis used to formulate the refined plastic hinge approach.
Abstract:
There are currently 23,500 level crossings in Australia, broadly divided into active level crossings with flashing lights and passive level crossings controlled by stop and give-way signs. The current strategy is to upgrade passive level crossings with active controls annually within a given budget, but the 5,900 public passive crossings are too numerous for all to be upgraded. The rail industry is considering alternative options to treat more crossings. One option is to use lower-cost equipment with a reduced safety integrity level but with a design that fails to a safe state: if the system cannot determine whether a train is approaching, the crossing reverts to a passive crossing. This is implemented by presenting a STOP sign in front of the flashing lights. While such a design is considered safe from an engineering standpoint, questions remain about human factors. To evaluate whether this approach is safe, we conducted a driving simulator study in which participants were familiarised with the new active crossing before the signage was changed to a passive crossing. Our results show that drivers treated the new crossing as an active crossing after the novelty effect had passed. While most participants did not experience difficulties when the crossing was turned back to a passive crossing, a number of participants had difficulty stopping in time at their first encounter with such a passive crossing. Worse, a number of drivers never realised the signage had changed, highlighting the link between the decision to brake and stop at an active crossing and the flashing lights. These results show the potential human factors issues of reverting an active crossing to a passive crossing when train detection fails.
Abstract:
The increasing prevalence of dementia in Australia (and worldwide) over the next few decades poses enormous social, health and economic challenges. In the absence of a cure, strategies to prevent, delay the onset of, or reduce the impact of dementia are required to contain a growing disease burden and rising health and care costs. A population health approach has the potential to substantially reduce the impact of dementia. Internationally, many countries have started to adopt population health strategies that incorporate elements of dementia prevention. The authors examine some of the elements of such an approach and barriers to its implementation. International dementia frameworks and strategies were reviewed to identify options utilized for a population health approach to dementia. Internationally and nationally, dementia frameworks are being developed that include population health approaches. Most of the frameworks identified included early diagnosis and intervention, and increasing community awareness, as key objectives, while several included promotion of the links between a healthy lifestyle and reduced risk of dementia. A poor evidence base (especially for illness prevention), diagnostic and technical limitations, and policy and implementation issues are significant barriers to maximizing the promise of population health approaches in this area. The review and analysis of the population health approach to dementia will inform national and jurisdictional policy development.
Abstract:
Oleaginous microorganisms have the potential to produce oils as an alternative feedstock for biodiesel production. Microalgae (Chlorella protothecoides and Chlorella zofingiensis), yeasts (Cryptococcus albidus and Rhodotorula mucilaginosa), and fungi (Aspergillus oryzae and Mucor plumbeus) were investigated for their ability to produce oil from glucose, xylose and glycerol. Multi-criteria analysis (MCA) using the analytic hierarchy process (AHP) and the preference ranking organization method for enrichment evaluations (PROMETHEE), with graphical analysis for interactive aid (GAIA), was used to rank and select the preferred microorganisms for oil production for biodiesel application. This was based on a number of criteria, viz. oil concentration, oil content, oil production rate and yield, substrate consumption rate, fatty acid composition, and biomass harvesting and nutrient costs. PROMETHEE selected A. oryzae, M. plumbeus and R. mucilaginosa as the most prospective species for oil production. However, further analysis by GAIA webs identified A. oryzae and M. plumbeus as the best performing microorganisms.
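As a hedged sketch of a PROMETHEE II ranking of the kind described above (the weights, preference function and example figures are illustrative placeholders, not the study's data):

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Minimal PROMETHEE II: net outranking flows with the 'usual' preference
    function (any favourable weighted difference counts as full preference).
    scores: (alternatives x criteria); weights sum to 1; maximize: bool per criterion."""
    m, n = scores.shape
    signed = np.where(maximize, scores, -scores)        # flip cost-type criteria
    pref = np.zeros((m, m))                             # aggregated preference pi(a, b)
    for j in range(n):
        diff = signed[:, j][:, None] - signed[:, j][None, :]
        pref += weights[j] * (diff > 0)
    phi_plus = pref.sum(axis=1) / (m - 1)               # leaving flow
    phi_minus = pref.sum(axis=0) / (m - 1)              # entering flow
    return phi_plus - phi_minus                         # net flow; rank descending

# Illustrative use: three hypothetical species scored on oil content (maximise)
# and nutrient cost (minimise).
net = promethee_ii(np.array([[0.40, 2.0], [0.35, 1.2], [0.25, 0.8]]),
                   np.array([0.6, 0.4]), [True, False])
```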
Abstract:
Structural fire safety has become one of the key considerations in the design and maintenance of the built infrastructure. Conventionally, the fire resistance rating of load-bearing light gauge steel frame (LSF) walls is determined based on the standard time-temperature curve given in ISO 834. Recent research has shown that the true fire resistance of building elements exposed to building fires can be less than the fire resistance ratings determined from standard fire tests, and it is questionable whether the standard time-temperature curve truly represents the fuel loads in modern buildings. Therefore, an equivalent fire severity approach has been used in the past to obtain fire resistance ratings, based on the performance of a structural member exposed to a realistic design fire curve in comparison to its performance under the standard fire time-temperature curve. This paper presents the details of research undertaken to develop an energy-based time equivalent approach to obtain the fire resistance ratings of LSF walls exposed to realistic design fire curves with respect to standard fire exposure. The approach is based on the amount of energy transferred to the member. The proposed method was used to predict the fire resistance ratings of single and double layer plasterboard-lined and externally insulated LSF walls. The predicted fire ratings were compared with the results from finite element analyses and fire design rules for three different wall configurations exposed to both rapid and prolonged fires. The comparison shows that the proposed energy method can be used to obtain the fire resistance ratings of LSF walls in the case of prolonged fires.
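As a hedged sketch only (the paper's precise energy measure is not reproduced here), an energy-based time equivalent of this kind finds the standard-fire exposure time t_e at which the cumulative energy received by the member under the ISO 834 curve matches that received under the realistic design fire of duration t_d:

```latex
\int_{0}^{t_e} \dot{q}\bigl(T_{\mathrm{ISO}}(t)\bigr)\,dt
\;=\;
\int_{0}^{t_d} \dot{q}\bigl(T_{\mathrm{design}}(t)\bigr)\,dt
```

where \dot{q}(\cdot) is the heat flux transferred to the member at the given fire temperature; the equivalent time t_e is then read as the fire resistance rating with respect to standard fire exposure.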
Abstract:
This paper analyzes the application of rights-based approaches to disaster displacement in the Asia-Pacific region in order to assess whether the current framework is sufficient to protect the rights of internally displaced persons. It identifies that disaster-induced displacement is increasingly prevalent in the region and that economic and social conditions in many countries mean that the impact of displacement is often prolonged and more severe. The paper identifies the relevant human rights principles which apply in the context of disaster-induced displacement and examines their implementation in a number of soft-law instruments. While it identifies shortcomings in implementation and enforcement, the paper concludes that a rights-based approach could be enhanced by greater engagement with existing human rights treaties and greater implementation of soft-law principles, and that no new instrument is required.
Abstract:
High-stakes testing has become an important element of the Australian educational landscape. As one part of the neo-liberal paradigm, in which beliefs in the individual and the free market are paramount, it raises the concern of how school leaders can respond to this phenomenon in an ethical manner. Ethics and ethical leadership have increased in prominence both in the educational administration literature and in the media (Cranston, Ehrich, & Kimber, 2006). In this paper we consider ethical theories on which school principals can draw, not only in the leadership of their own schools but in their relationships with other schools. We provide an example of a school leader sharing a successful intervention with other schools, illustrating that school leaders can create spaces for promoting the public good within the context of high-stakes testing.
Abstract:
Access to transport systems, and the connections such systems provide to essential economic and social activities, are critical in determining households' transportation disadvantage levels. Despite developments in better identifying transportation-disadvantaged groups, the lack of effective policies has allowed the issue to persist as a significant problem. This paper undertakes a pilot case investigation as a test bed for a new approach developed to reduce transportation policy shortcomings. The approach, a 'disadvantage-impedance index', aims to ease transportation disadvantage by employing representative parameters to measure the differences between policy alternatives run in a simulation environment. Implemented in the Japanese town of Arao, the index uses trip-making behaviour and resident stated-preference data. The results of the index reveal that even a slight improvement in accessibility and travel quality indicators makes a significant difference in easing disadvantage. The index, integrated into a four-step model, proves to be highly robust and useful for quick diagnosis, capturing effective actions and developing potentially efficient policies.