985 results for Vehicle Standards Costs.
Abstract:
"Standards are living documents which reflect progress in science, technology and systems" - Standards Australia.
Abstract:
Preservice teacher educators, both nationally and internationally, must negotiate a plethora of expectations, including the use of Professional Standards to enhance teacher quality. In Australia, the recent Teacher Education Ministerial Advisory Group (TEMAG) report highlighted weak application of Standards in Initial Teacher Education (ITE). However, recent findings suggest that education stakeholders feel positive about the implementation of the Australian Professional Standards for Teachers (APSTs). This study responds to these differing viewpoints by exploring how teacher educators in a large metropolitan university in Australia account for the use of Standards in their work. Discourse analytic techniques, in conjunction with socio-spatial theory, make visible particular metaphors of practice as teacher educators negotiate the real-and-imagined spaces of regulated teacher education programs. The findings highlight the importance of investigating the utility of Standards in the lived experiences of teacher educators, as they are responsible for preparing quality, classroom-ready graduates.
Abstract:
One of the central issues in making efficient use of IT in the design, construction and maintenance of buildings is the sharing of digital building data across disciplines and lifecycle stages. One technology which enables data sharing is CAD layering, which, to be of real use, requires the definition of standards. This paper focuses on the background, objectives and effectiveness of the international standard ISO 13567, Organisation and naming of layers for CAD. The focus is on the efficiency and effectiveness of the standardisation and standard implementation process, rather than on the technical details. The study was conducted as a qualitative study in which a number of experts responded to a semi-structured mail questionnaire, supplemented by personal interviews. The main results were that CAD layer standards based on the ISO standard have been implemented, particularly in northern European countries, but are not very widely used. A major problem identified was the lack of resources for marketing and implementing the standard, as national variations of it, once it had been formally accepted.
Abstract:
“Corporate governance deals with the ways in which suppliers of finance to firms assure themselves of getting a return on their investment” (Shleifer and Vishny, 1997, p. 737). According to La Porta et al. (1999), research in corporate finance relevant for most countries should focus on the incentives and capabilities of controlling shareholders to treat themselves preferentially at the expense of minority shareholders. Accordingly, this thesis sets out to answer a number of research questions regarding the role of large shareholders in public firms that have received little attention in the literature so far. A common theme in the essays stems from the costs and benefits of individual large-block owners and the role of control contestability from the perspective of outside minority shareholders. The first essay empirically examines whether there are systematic performance differences between family-controlled and nonfamily-controlled firms in Western Europe. In contrast to the widely held view that family control penalizes firm value, the essay shows that publicly traded family firms have higher performance than comparable firms. In the second essay, we present both theoretical and empirical analysis of the effects of control contestability on firm valuation. Consistent with the theoretical model, the empirical results show that minority shareholders benefit from a more contestable control structure. The third essay explores the effects of individual large-block owners on top management turnover and board appointments in Finnish listed firms. The results indicate that firm performance is an important determinant of management and board restructurings. For certain types of turnover decisions, the corporate governance structure influences the performance/turnover sensitivity. In the fourth essay, we investigate the relation between the governance structure and dividend policy in Finnish listed firms. We find evidence in support of the outcome agency model of dividends, which states that lower agency conflicts should be associated with higher dividend payouts.
Abstract:
The negative relationship between economic growth and stock market return is not an anomaly, according to evidence documented in many economies. It is argued that future economic growth is largely irrelevant for predicting future equity returns, since long-run equity returns depend mainly on dividend yields and the growth of per-share dividends. Economic growth does result in a higher standard of living for consumers, but does not necessarily translate into higher returns for owners of capital. The divergence in performance between the real sector and stock markets appears to support the above argument. However, this thesis strives to offer an alternative explanation of the apparent divergence within the framework of corporate governance. It argues that weak corporate governance standards in Chinese listed firms, exacerbated by poor investor protection, result in a marginalized capital market. Each of the three essays in the thesis addresses one particular aspect of corporate governance on the Chinese stock market in a sequential way, by gathering empirical evidence on three distinct stock market activities. The first essay questions whether significant agency conflicts do exist by building a game on rights issues. It documents significant divergence in interests among shareholders holding different classes of shares. The second essay investigates the level of agency costs by examining the value of control through a sample of block transactions. It finds that block transactions that transfer ultimate control entail higher premiums. The third essay looks into possible avenues through which corporate governance standards could be improved by investigating the economic consequences of cross-listing on the Chinese stock market. It finds that, by adopting a higher disclosure standard through cross-listings, firms voluntarily commit themselves to reducing information asymmetry, and consequently command higher valuations than their counterparts.
Abstract:
This paper presents a robust fixed-order H-2 controller design using strengthened discrete optimal projection equations, which approximate the first-order necessary optimality conditions. The novelty of this work is the application of the robust H-2 controller to a micro aerial vehicle named Sarika2, developed in-house. The controller is designed in the discrete domain for the lateral dynamics of Sarika2 in the presence of low-frequency atmospheric turbulence (gust) and high-frequency sensor noise. The design specification includes simultaneous stabilization, disturbance rejection and noise attenuation over the entire flight envelope of the vehicle. The resulting controller performance is comprehensively analyzed by means of simulation.
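The paper's fixed-order synthesis via strengthened discrete optimal projection equations is too involved for a short example, but the discrete-domain, state-feedback flavour of the design can be sketched with a standard full-order discrete LQR (an H-2-type design) on a hypothetical lateral model. The sampling rate, state-space matrices and weights below are illustrative assumptions, not Sarika2 data.

```python
# A minimal sketch: discrete LQR on a hypothetical 4-state lateral model.
# This is a simplified stand-in, not the paper's fixed-order robust H-2 method.
import numpy as np
from scipy.linalg import solve_discrete_are

dt = 0.02  # assumed 50 Hz sampling
# Hypothetical discretised lateral dynamics; states [beta, p, r, phi]
A = np.array([[ 0.98,  0.00, -0.02, 0.01],
              [-0.30,  0.90,  0.05, 0.00],
              [ 0.10, -0.01,  0.95, 0.00],
              [ 0.00,  0.02,  0.00, 1.00]])
B = np.array([[0.00,  0.01],
              [0.40,  0.05],
              [0.02, -0.30],
              [0.00,  0.00]])  # inputs: [aileron, rudder]

Q = np.diag([10.0, 1.0, 1.0, 5.0])  # state weights (placeholders)
R = np.diag([1.0, 1.0])             # control weights (placeholders)

# Solve the discrete algebraic Riccati equation and form the feedback gain
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Closed-loop check: every eigenvalue should lie strictly inside the unit circle
rho = max(abs(np.linalg.eigvals(A - B @ K)))
print("closed-loop spectral radius:", rho)
```

A fixed-order or robust design would replace the Riccati step with the optimal projection equations and add the gust and sensor-noise weighting described in the abstract; the closed-loop stability check at the end carries over unchanged.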
Abstract:
We present a generic study of inventory costs in a factory stockroom that supplies component parts to an assembly line. Specifically, we are concerned with the increase in component inventories due to uncertainty in supplier lead-times, and the fact that several different components must be present before assembly can begin. It is assumed that the suppliers of the various components are independent, that the suppliers' operations are in statistical equilibrium, and that the same amount of each type of component is demanded by the assembly line each time a new assembly cycle is scheduled to begin. We use, as a measure of inventory cost, the expected time for which an order of components must be held in the stockroom from the time it is delivered until the time it is consumed by the assembly line. Our work reveals the effects of supplier lead-time variability, the number of different types of components, and their desired service levels, on the inventory cost. In addition, under the assumptions that inventory holding costs and the cost of delaying assembly are linear in time, we study optimal ordering policies and present an interesting characterization that is independent of the supplier lead-time distributions.
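As a rough illustration of this cost measure, the Monte Carlo sketch below estimates the average time a component spends in the stockroom when assembly must wait for the slowest of several independent suppliers. The lognormal lead-time distribution and its parameters are illustrative assumptions, not taken from the paper.

```python
# Monte Carlo estimate of the expected stockroom holding time per component:
# if all orders are released together and assembly starts only when every
# component has arrived, component i waits max_j(L_j) - L_i time units.
import numpy as np

rng = np.random.default_rng(0)

def expected_holding_time(n_components, mean=5.0, cv=0.4, n_trials=100_000):
    """Average holding time per component, for assumed lognormal lead times."""
    sigma = np.sqrt(np.log(1.0 + cv ** 2))      # lognormal shape from the CV
    mu = np.log(mean) - 0.5 * sigma ** 2        # lognormal scale matching the mean
    L = rng.lognormal(mu, sigma, size=(n_trials, n_components))
    waiting = L.max(axis=1, keepdims=True) - L  # each part waits for the slowest
    return waiting.mean()

# Holding time grows with the number of distinct component types and with
# lead-time variability (cv), the effects highlighted in the abstract.
for m in (2, 5, 10, 20):
    print(m, round(expected_holding_time(m), 2))
```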
Abstract:
With the objective of better understanding the significance of New Car Assessment Program (NCAP) tests conducted by the National Highway Traffic Safety Administration (NHTSA), head-on collisions between two identical cars of different sizes and between cars and a pickup truck are studied in the present paper using LS-DYNA models. Available finite element models of a compact car (Dodge Neon), midsize car (Dodge Intrepid), and pickup truck (Chevrolet C1500) are first improved and validated by comparing the analysis-based vehicle deceleration pulses against corresponding NCAP crash test histories reported by NHTSA. In confirmation of prevalent perception, simulation-based results indicate that an NCAP test against a rigid barrier is a good representation of a collision between two similar cars approaching each other at a speed of 56.3 km/h (35 mph), both in terms of peak deceleration and intrusions. However, analyses carried out for collisions between two incompatible vehicles, such as an Intrepid or Neon against a C1500, point to the inability of the NCAP tests to represent the substantially higher intrusions in the front upper regions experienced by the cars, although peak decelerations in the cars are comparable to those observed in NCAP tests. In an attempt to improve the capability of a front NCAP test to better represent real-world crashes between incompatible vehicles, i.e., ones with contrasting ride height and lower body stiffness, two modified rigid barriers are studied. One of these barriers, which is of stepped geometry with a curved front face, leads to significantly improved correlation of intrusions in the upper regions of cars with respect to those yielded in the simulation of collisions between incompatible vehicles, while producing vehicle peak decelerations similar to those obtained in NCAP tests.
Abstract:
Submergence of land is a major impact of large hydropower projects. Such projects are often also dogged by siltation, delays in construction and heavy debt burdens, factors that are not considered in the project planning exercise. A simple constrained optimization model for the benefit-cost analysis of large hydropower projects that considers these features is proposed. The model is then applied to two sites in India. Using the potential productivity of an energy plantation on the submergible land is suggested as a reasonable approach to estimating the opportunity cost of submergence. Optimum project dimensions are calculated for various scenarios. Results indicate that the inclusion of submergence cost may lead to a substantial reduction in net present value and hence in project viability. Parameters such as project lifespan, construction time, discount rate and external debt burden are also of significance. The designs proposed by the planners are found to be uneconomic, while even the optimal design may not be viable for more typical scenarios. The concept of energy opportunity cost is useful for preliminary screening; some projects may require more detailed calculations. The optimization approach helps identify significant trade-offs between energy generation and land availability.
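The structure of such a model can be sketched in a few lines: the net present value of a hypothetical project is maximised over installed capacity, with the submerged land charged at the foregone output of an energy plantation. Every figure below (load factor, energy value, area-capacity relation, plantation yield, capital cost, discount rate) is an illustrative assumption, not a number from the study.

```python
# Toy benefit-cost optimisation with a submergence opportunity cost.
from scipy.optimize import minimize_scalar

r, life = 0.10, 40                     # assumed discount rate and lifespan (years)
annuity = (1 - (1 + r) ** -life) / r   # present-value factor for a level annual stream

def npv(capacity_mw):
    """Net present value ($) of a hypothetical project of the given capacity."""
    energy = capacity_mw * 8760 * 0.45           # MWh/yr at an assumed 45% load factor
    revenue = energy * 40.0                      # assumed energy value, $/MWh
    submerged_ha = 0.5 * capacity_mw ** 1.5      # assumed area-capacity relation, ha
    foregone = submerged_ha * 80.0 * 40.0        # plantation yield (MWh/ha/yr) x value
    capex = 1.2e6 * capacity_mw                  # assumed capital cost, $/MW, paid up front
    return annuity * (revenue - foregone) - capex

res = minimize_scalar(lambda c: -npv(c), bounds=(1.0, 500.0), method="bounded")
print(f"optimal capacity ~{res.x:.0f} MW, NPV ~{npv(res.x) / 1e6:.0f} M$")
```

In this toy model, dropping the foregone-plantation term pushes the optimum to the largest allowed capacity, echoing the finding that designs which ignore the cost of submergence tend to be uneconomic.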
Abstract:
This dissertation develops a strategic management accounting perspective on inventory routing. The thesis studies the drivers of cost efficiency gains by identifying the role of the underlying cost structure, demand, information sharing, forecasting accuracy, service levels, vehicle fleet, planning horizon and other strategic factors, as well as the interaction effects among these factors with respect to performance outcomes. The task is to enhance knowledge of the strategic situations that favor the implementation of inventory routing systems, understanding cause-and-effect relationships and linkages, and gaining a holistic view of the value proposition of inventory routing. The thesis applies an exploratory case study design, which is based on normative quantitative empirical research using optimization, simulation and factor analysis. Data and results are drawn from a real-world application to cash supply chains. The first research paper shows that performance gains require a common cost component and cannot be explained by simple linear or affine cost structures; inventory management and distribution decisions become separable in the absence of a set-dependent cost structure, and neither economies of scope nor coordination problems are present in this case. The second research paper analyzes whether information sharing improves overall forecasting accuracy. The analysis suggests that the potential for information sharing is limited to the coordination of replenishments and that central information does not yield more accurate forecasts based on joint forecasting. The third research paper develops a novel formulation of the stochastic inventory routing model that accounts for minimal service levels and forecasting accuracy. The developed model allows studying the interaction of minimal service levels and forecasting accuracy with the underlying cost structure in inventory routing. Interestingly, results show that the factors minimal service level and forecasting accuracy are not statistically significant, and subsequently not relevant for the strategic decision problem of introducing inventory routing, or in other words, of effectively internalizing inventory management and distribution decisions at the supplier. Consequently, the main contribution of this thesis is the result that cost benefits of inventory routing are derived from the joint decision model that accounts for the underlying set-dependent cost structure rather than from the level of information sharing. This result suggests that the value of sharing demand and inventory data is likely to be overstated in the prior literature. In other words, cost benefits of inventory routing are primarily determined by the cost structure (i.e. the level of fixed costs and transportation costs) rather than by the level of information sharing, joint forecasting, forecasting accuracy or service levels.
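A toy calculation makes the first paper's cost-structure argument concrete: with purely stop-wise additive (linear) transport costs there is nothing to gain from coordinating replenishments, whereas a set-dependent fixed cost per dispatched vehicle creates the economies of scope that inventory routing exploits. The customers and cost figures below are illustrative assumptions.

```python
# Joint vs. separate replenishment under two cost structures.
def trip_cost(stops, fixed_per_trip, cost_per_stop):
    """Cost of one vehicle trip serving the given set of stops."""
    return (fixed_per_trip + cost_per_stop * len(stops)) if stops else 0.0

def total_cost(schedule, fixed_per_trip, cost_per_stop=10.0):
    """schedule: list of trips, each trip being the set of customers served together."""
    return sum(trip_cost(trip, fixed_per_trip, cost_per_stop) for trip in schedule)

separate = [{"A"}, {"B"}]   # replenish each customer on its own trip
joint = [{"A", "B"}]        # coordinate both replenishments into one trip

for fixed in (0.0, 50.0):   # purely linear costs vs. a set-dependent fixed cost per trip
    print(f"fixed={fixed:>5}: separate={total_cost(separate, fixed):6.1f}  "
          f"joint={total_cost(joint, fixed):6.1f}")
# With fixed=0 both schedules cost the same, so inventory and routing decisions
# separate; with fixed=50 the joint trip saves one fixed charge, which is the
# source of the coordination gains described above.
```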
Abstract:
Using the recently developed model predictive static programming (MPSP) technique, a nonlinear suboptimal reentry guidance scheme is presented in this paper for a reusable launch vehicle (RLV). Unlike traditional RLV guidance, the problem considered here is restricted to the pitch-plane maneuver of the vehicle, which allows simpler mission planning and vehicle load management. The computationally efficient MPSP technique brings the philosophy of trajectory optimization into the framework of guidance design, which in turn results in very effective guidance schemes in general. In the problem addressed in this paper, it successfully guides the RLV through the critical reentry phase, both by constraining it to the allowable narrow flight corridor and by meeting the terminal constraints at the end of the reentry segment. The guidance design is validated by considering possible aerodynamic uncertainties as well as dispersions in the initial conditions. (C) 2010 Elsevier Masson SAS. All rights reserved.
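At its core, MPSP performs a single least-squares ("static programming") correction of an entire guessed control history, using the sensitivity of the terminal output to each control input. The sketch below illustrates that update on a linear double integrator rather than the paper's reentry dynamics, so the sensitivities are exact and one correction nulls the terminal error; the horizon, target and dynamics are illustrative assumptions, and a nonlinear problem would iterate the same step.

```python
# Toy MPSP-style update: minimum-norm control correction nulling terminal error.
import numpy as np

dt, N = 0.1, 50
A = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
B = np.array([[0.5 * dt ** 2], [dt]])
C = np.array([[1.0, 0.0]])              # terminal output: position only
y_target = np.array([10.0])

def propagate(x0, u_hist):
    """Roll the dynamics forward under a control history and return x_N."""
    x = x0.copy()
    for u_k in u_hist:
        x = A @ x + B.ravel() * u_k
    return x

x0 = np.array([0.0, 0.0])
u = np.zeros(N)                          # initial control guess
y_err = y_target - C @ propagate(x0, u)  # terminal output error dY_N

# Sensitivity of the terminal output to the control at step k: B_k = C A^(N-1-k) B
Bs = np.array([(C @ np.linalg.matrix_power(A, N - 1 - k) @ B).ravel()
               for k in range(N)])       # shape (N, 1)

# Static-programming correction (unit control weights): du = Bs (Bs^T Bs)^-1 dY_N
du = Bs @ np.linalg.solve(Bs.T @ Bs, y_err)
u_new = u + du

print("terminal position after one update:", (C @ propagate(x0, u_new))[0])  # ~10.0
```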
Abstract:
Stroke is a major cause of death and disability, imposes significant costs on healthcare systems, and inflicts a severe burden on society as a whole. Stroke care in Finland has been described in several population-based studies between 1967 and 1998, but not since. In the PERFECT Stroke study presented here, a system for monitoring the Performance, Effectiveness, and Costs of Treatment episodes in Stroke was developed in Finland. Existing nationwide administrative registries were linked at the individual patient level with personal identification numbers to depict whole episodes of care, from acute stroke, through rehabilitation, until the patients went home, were admitted to permanent institutional care, or died. For comparisons over time and between providers, patient case-mix was adjusted for. The PERFECT Stroke database includes 104 899 first-ever stroke patients over the years 1999 to 2008, of whom 79% had ischemic stroke (IS), 14% intracerebral hemorrhage (ICH), and 7% subarachnoid hemorrhage (SAH). An 18% decrease in the age- and sex-adjusted incidence of stroke was observed over the study period, a 1.8% improvement annually. The all-cause 1-year case-fatality rate improved from 28.6% to 24.6%, or by 0.5% annually. The expected median lifetime after stroke increased by 2 years for IS patients, to 7 years and 7 months, and by 1 year for ICH patients, to 4 years and 5 months. No change could be seen in median SAH patient survival, >10 years. Stroke prevalence was 82 000, 1.5% of the total population of Finland, in 2008. Modern stroke center care was shown to be associated with a decrease in both death and the risk of institutional care of stroke patients. The number needed to treat to prevent these poor outcomes at one year from stroke was 32 (95% confidence interval 26 to 42). Despite improvements over the study period, more than a third of Finnish stroke patients did not have access to stroke center care. The mean first-year healthcare cost of a stroke patient was ~€20 000, and among survivors ~€10 000 annually thereafter. Only part of this cost was incurred by stroke, as the same patients cost ~€5 000 over the year prior to stroke. Total lifetime costs after first-ever stroke were ~€85 000. A total of €1.1 billion, 7% of all healthcare expenditure, is used in the treatment of stroke patients annually. Despite a rapidly aging population, the number of new stroke patients is decreasing, and the patients are more likely to survive. This is explained in part by stroke center care, which is effective and should be made available to all stroke patients. It is possible, in a suitable setting with high-quality administrative registries and a common identifier, to avoid the huge workload and associated costs of setting up a conventional stroke registry, and still acquire a fairly comprehensive dataset on stroke care and outcomes.
Abstract:
We propose a method to compute a probably approximately correct (PAC) normalized histogram of observations with a refresh rate of Θ(1) time units per histogram sample on a random geometric graph with noise-free links. The delay in computation is Θ(√n) time units. We further extend our approach to a network with noisy links. While the refresh rate remains Θ(1) time units per sample, the delay increases to Θ(√n log n). The number of transmissions in both cases is Θ(n) per histogram sample. The achieved Θ(1) refresh rate for PAC histogram computation is a significant improvement over the refresh rate of Θ(1/log n) for histogram computation in noiseless networks. We achieve this by operating in the supercritical thermodynamic regime, where large pathways for communication build up but the network may have more than one component. The largest component, however, will contain an arbitrarily large fraction of the nodes, in order to enable approximate computation of the histogram to the desired level of accuracy. Operation in the supercritical thermodynamic regime also reduces energy consumption. A key step in the proof of our achievability result is the construction of a connected component having bounded degree and any desired fraction of nodes. This construction may also prove useful in other communication settings on the random geometric graph.
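The supercritical regime referred to above can be observed empirically: choosing the connection radius so that the expected degree is a constant above the continuum-percolation threshold typically leaves the graph disconnected, yet the largest component already contains a large fraction of the nodes. The sketch below uses networkx; the degree constant is an illustrative choice, not a value from the paper.

```python
# Largest-component fraction of a random geometric graph in the supercritical
# thermodynamic regime (constant expected degree, well below the ~log n degree
# needed for full connectivity).
import math
import networkx as nx

def giant_component_fraction(n, c=7.0, seed=0):
    """Fraction of nodes in the largest component of an RGG on the unit square,
    with the radius chosen so that the expected degree is roughly c."""
    radius = math.sqrt(c / (math.pi * n))
    g = nx.random_geometric_graph(n, radius, seed=seed)
    largest = max(nx.connected_components(g), key=len)
    return len(largest) / n

for n in (500, 2000, 8000):
    print(n, round(giant_component_fraction(n), 3))
# The fraction stays large as n grows even though the graph is usually not
# fully connected; that is enough connectivity for approximate (PAC) histogram
# computation at a smaller radius, and hence lower energy, than full
# connectivity would require.
```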