889 results for Value analysis (Cost control)


Relevance: 50.00%

Publisher:

Abstract:

Traditional real-time control systems are tightly integrated into the industrial processes they govern. Now, however, there is increasing interest in networked control systems. These provide greater flexibility and cost savings by allowing real-time controllers to interact with industrial processes over existing communications networks. New data packet queuing protocols are currently being developed to enable precise real-time control over a network with variable propagation delays. We show how one such protocol was formally modelled using timed automata, and how model checking was used to reveal subtle aspects of the control system's dynamic behaviour.
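The deadline property at stake can be illustrated with a toy discrete-time sketch (our simplification, not the protocol or the timed-automaton model from the abstract): enumerate every admissible combination of network delays, as a model checker would explore them exhaustively, and check that each command arrives before the next one is sent. All names and constants are illustrative.

```python
from itertools import product

def order_preserved(period, dmin, dmax, n_msgs):
    # Message i is sent at i*period and arrives after a nondeterministic
    # delay d in [dmin, dmax]; enumerate every delay assignment (the
    # nondeterminism a model checker would explore symbolically).
    for delays in product(range(dmin, dmax + 1), repeat=n_msgs):
        for i, d in enumerate(delays):
            if i * period + d >= (i + 1) * period:
                return False  # message i arrives at/after the next send
    return True

print(order_preserved(5, 1, 4, 6), order_preserved(5, 1, 6, 6))
```

With the maximum delay below the sending period the property holds; once the delay can reach the period, the exhaustive search finds a violating trace.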

Relevance: 50.00%

Publisher:

Abstract:

OBJECTIVES: To assess whether blood pressure control in primary care could be improved with the use of patient held targets and self monitoring in a practice setting, and to assess the impact of these on health behaviours, anxiety, prescribed antihypertensive drugs, patients' preferences, and costs. DESIGN: Randomised controlled trial. SETTING: Eight general practices in south Birmingham. PARTICIPANTS: 441 people receiving treatment in primary care for hypertension but not controlled below the target of < 140/85 mm Hg. INTERVENTIONS: Patients in the intervention group received treatment targets along with facilities to measure their own blood pressure at their general practice; they were also asked to visit their general practitioner or practice nurse if their blood pressure was repeatedly above the target level. Patients in the control group received usual care (blood pressure monitoring by their practice). MAIN OUTCOME MEASURES: Primary outcome: change in systolic blood pressure at six months and one year in both intervention and control groups. Secondary outcomes: change in health behaviours, anxiety, prescribed antihypertensive drugs, patients' preferences of method of blood pressure monitoring, and costs. RESULTS: 400 (91%) patients attended follow up at one year. Systolic blood pressure in the intervention group had significantly reduced after six months (mean difference 4.3 mm Hg (95% confidence interval 0.8 mm Hg to 7.9 mm Hg)) but not after one year (mean difference 2.7 mm Hg (- 1.2 mm Hg to 6.6 mm Hg)). No overall difference was found in diastolic blood pressure, anxiety, health behaviours, or number of prescribed drugs. Patients who self monitored lost more weight than controls (as evidenced by a drop in body mass index), rated self monitoring above monitoring by a doctor or nurse, and consulted less often. 
Overall, self monitoring did not cost significantly more than usual care (251 pounds sterling (437 dollars; 364 euros) (95% confidence interval 233 pounds sterling to 275 pounds sterling) versus 240 pounds sterling (217 pounds sterling to 263 pounds sterling)). CONCLUSIONS: Practice based self monitoring resulted in small but significant improvements in blood pressure at six months, which were not sustained after a year. Self monitoring was well received by patients, anxiety did not increase, and there was no appreciable additional cost. Practice based self monitoring is feasible and results in blood pressure control that is similar to that in usual care.
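The primary-outcome statistic reported above (a between-group mean difference with a 95% confidence interval) can be recomputed from summary data with a normal approximation. The group means, SDs and sizes below are invented for illustration; they are not the trial's data.

```python
import math

def mean_diff_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    # Two-sample mean difference with a 95% CI (normal approximation).
    diff = m1 - m2
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical summary data: intervention vs control systolic change (mm Hg)
diff, lo, hi = mean_diff_ci(146.0, 18.0, 200, 141.7, 17.0, 200)
print(f"difference {diff:.1f} mm Hg (95% CI {lo:.1f} to {hi:.1f})")
```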

Relevance: 50.00%

Publisher:

Abstract:

The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
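The DHSS cumulative cost model mentioned above is commonly quoted in the form V = S[x + Cx^2 - Cx + D(6x^3 - 9x^2 + 3x)], where x = t/T is the fraction of contract time elapsed, S the contract sum, and C and D tabulated parameters. The sketch below uses that commonly cited form with illustrative parameter values; it is not the thesis's own formulation.

```python
def dhss_value(x, S=1.0, C=-0.4, D=0.2):
    # Cumulative value of work executed at time fraction x in [0, 1],
    # per the commonly cited DHSS S-curve; C and D here are illustrative.
    return S * (x + C * x**2 - C * x + D * (6 * x**3 - 9 * x**2 + 3 * x))

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x={t:.2f}  cumulative cost = {dhss_value(t):.3f}")
```

Note the curve is anchored by construction: zero cost at x = 0 and the full contract sum S at x = 1, whatever C and D are.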

Relevance: 50.00%

Publisher:

Abstract:

This thesis deals with the problems associated with the planning and control of production, with particular reference to a small aluminium die casting company. The main problem areas were identified as: (a) a need to be able to forecast the customers' demands upon the company's facilities; (b) a need to produce a manufacturing programme in which the output of the foundry (or die casting section) was balanced with the available capacity in the machine shop; (c) the need to ensure that the resultant system enabled the company's operating budget to have a reasonable chance of being achieved. At the commencement of the research work the major customers were members of the automobile industry and had their own system of forecasting, from which they issued manufacturing schedules to their component suppliers. The errors in the forecast were analysed and the distributions noted. Using these distributions, the customer's forecast could be modified to enable the final demand to be met with a known degree of confidence. Before a manufacturing programme could be developed the actual manufacturing system had to be reviewed, and it was found that, as with many small companies, there was a remarkable lack of formal control and written data. Relevant data with regard to the components and the manufacturing process therefore had to be collected and analysed. The foundry process was fixed, but the secondary machining operations were analysed by a technique similar to Component Flow Analysis and, as a result, the machines were arranged in a series of flow lines. A system of manual production control was proposed and, for comparison, a local computer bureau was approached and a system proposed incorporating the production of additional management information. The two systems are compared, their relative merits discussed, and a proposal made for implementation.
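The forecast-modification step described above can be sketched as follows (a simplification with invented numbers, not the company's data): take the empirical distribution of past forecast errors and add the error quantile needed to cover final demand at the chosen confidence level.

```python
def safety_adjusted_forecast(schedule, errors, confidence=0.95):
    # errors: historical (actual - scheduled) quantities; inflate the
    # customer's schedule by the empirical error quantile so that final
    # demand is covered with the requested confidence.
    ordered = sorted(errors)
    k = min(len(ordered) - 1, int(confidence * len(ordered)))
    return schedule + ordered[k]

errors = [-30, -10, -5, 0, 5, 10, 15, 20, 25, 40]  # hypothetical history
print(safety_adjusted_forecast(1000, errors, 0.95))
```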

Relevance: 50.00%

Publisher:

Abstract:

The thesis investigates the value of quantitative analyses for historical studies of science through an examination of research trends in insect pest control, or economic entomology. Reviews are made of quantitative studies of science and of historical studies of pest control. The methodological strengths and weaknesses of bibliometric techniques are examined in a dedicated chapter; techniques examined include productivity studies such as paper counts, and relational techniques such as co-citation and co-word analysis. Insect pest control is then described. This includes a discussion of the socio-economic basis of the concept of `pest'; a series of classifications of pest control techniques is provided and analysed with respect to their utility for scientometric studies. The chemical and biological approaches to control are discussed as scientific and technological paradigms. Three case studies of research trends in economic entomology are provided. First, a scientometric analysis of samples of chemical control and biological control papers, providing quantitative data on the institutional, financial, national, and journal structures associated with pest control research fields. Second, a content analysis of a core journal, the Journal of Economic Entomology, over the period 1910-1985; this identifies the main research innovations and trends, in particular the changing balance between chemical and biological control. Third, an analysis of historical research trends in insecticide research; this shows the rise, maturity and decline of research into many groups of compounds. These are supplemented by a collection of seven papers on scientometric studies of pest control and quantitative techniques for analysing science.
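The co-word technique reviewed above can be illustrated in a few lines: count pairwise keyword co-occurrences across papers, so that strongly linked pairs mark out research themes. The keyword sets below are invented, not drawn from the thesis's samples.

```python
from collections import Counter
from itertools import combinations

# Toy corpus: each set is one (invented) paper's keyword list.
papers = [
    {"DDT", "resistance", "chemical control"},
    {"parasitoid", "biological control", "release"},
    {"DDT", "chemical control", "residue"},
    {"biological control", "parasitoid", "host"},
]

pairs = Counter()
for kws in papers:
    pairs.update(combinations(sorted(kws), 2))  # count each keyword pair

for pair, n in pairs.most_common(2):            # the strongest links
    print(n, pair)
```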

Relevance: 50.00%

Publisher:

Abstract:

This research develops a low-cost remote sensing system for use in agricultural applications. The important features of the system are that it monitors the near infrared and that it incorporates position and attitude measuring equipment, allowing geo-rectified images to be produced without the use of ground control points. The equipment is designed to be hand held and hence requires no structural modification to the aircraft. The portable remote sensing system consists of an inertia measurement unit (IMU), which is accelerometer based, a low-cost GPS device and a small format false colour composite digital camera. The total cost of producing such a system is below GBP 3000, which is far cheaper than equivalent existing systems. The design of the portable remote sensing device has eliminated boresight misalignment errors from the direct geo-referencing process. A new processing technique has been introduced for the data obtained from these low-cost devices, and it is found that using this technique the image can be matched (overlaid) onto Ordnance Survey MasterMap at an accuracy compatible with precision agriculture requirements. The direct geo-referencing has also been improved by introducing an algorithm capable of correcting oblique images directly. This algorithm alters the pixel values; hence, it is advised that image analysis be performed before image georectification. The drawback of this research is that the low-cost GPS device experienced bad checksum errors, which resulted in missing data. The Wide Area Augmentation System (WAAS) correction could not be employed because the satellites could not be locked onto whilst flying. The best GPS data were obtained from the Garmin eTrex instruments (15 m kinematic and 2 m static), which have a high-sensitivity receiver with good lock-on capability.
The limitation of this GPS device is the inability to effectively receive the P-Code wavelength, which is needed to gain the best accuracy when undertaking differential GPS processing. Pairing the carrier phase L1 with the pseudorange C/A-Code received, in order to determine the image coordinates by the differential technique, is still under investigation. To improve the position accuracy, it is recommended that a GPS base station should be established near the survey area, instead of using a permanent GPS base station established by the Ordnance Survey.
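The direct geo-referencing step discussed above can be sketched under strong simplifying assumptions (flat ground at z = 0, a pinhole camera looking along its -z axis, no boresight offset). This is our illustration of the general idea, not the thesis's algorithm; the function names and numbers are hypothetical.

```python
import math

def rotation(roll, pitch, yaw):
    # Body-to-ground rotation, R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def georeference(px, py, focal, cam_xyz, roll, pitch, yaw):
    # Rotate the pixel's camera-frame ray by the IMU attitude, then
    # intersect it with the ground plane z = 0.
    ray_cam = (px, py, -focal)
    R = rotation(roll, pitch, yaw)
    ray = [sum(R[i][j] * ray_cam[j] for j in range(3)) for i in range(3)]
    t = -cam_xyz[2] / ray[2]
    return cam_xyz[0] + t * ray[0], cam_xyz[1] + t * ray[1]

# Nadir image from 100 m: the principal point maps to the point below camera.
print(georeference(0.0, 0.0, 0.05, (500.0, 700.0, 100.0), 0, 0, 0))
```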

Relevance: 50.00%

Publisher:

Abstract:

Energy price is related to more than half of the total life cycle cost of asphalt pavements. Furthermore, the fluctuation in the price of energy has been much higher than general inflation and the interest rate. This makes energy price inflation an important variable that should be addressed when performing life cycle cost (LCC) studies regarding asphalt pavements. The present value of future costs is highly sensitive to the selected discount rate. Therefore, the choice of the discount rate is the most critical element in LCC analysis over the lifetime of a project. The objective of the paper is to present a discount rate for asphalt pavement projects as a function of the interest rate, general inflation and energy price inflation. The discount rate is defined based on the portion of the energy-related costs during the lifetime of the pavement. Consequently, it can reflect the financial risks related to the energy price in asphalt pavement projects. It is suggested that a discount rate sensitivity analysis for asphalt pavements in Sweden should range between −20% and 30%.
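One plausible way to realise such a discount rate (the paper's exact formula is not reproduced here, so this is an assumption) is to blend two real rates, one deflated by energy price inflation and one by general inflation, weighted by the energy-related share of life cycle cost:

```python
def blended_discount_rate(interest, general_infl, energy_infl, energy_share):
    # Real (Fisher) rate for energy-related and for other costs, blended
    # by the energy-related share w of life cycle cost.
    real_energy = (1 + interest) / (1 + energy_infl) - 1
    real_general = (1 + interest) / (1 + general_infl) - 1
    return energy_share * real_energy + (1 - energy_share) * real_general

# e.g. 5% interest, 2% general inflation, 8% energy inflation, 50% energy share
r = blended_discount_rate(0.05, 0.02, 0.08, 0.5)
print(f"{r:.4%}")
```

High energy inflation drags the blended rate down, which raises the present value of energy-heavy future costs, matching the paper's point that energy price risk belongs in the discount rate.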

Relevance: 50.00%

Publisher:

Abstract:

The extractive industry is characterized by high levels of risk and uncertainty. These attributes create challenges when applying traditional accounting concepts (such as the revenue recognition and matching concepts) to the preparation of financial statements in the industry. The International Accounting Standards Board (2010) states that the objective of general purpose financial statements is to provide useful financial information to assist the capital allocation decisions of existing and potential providers of capital. The usefulness of information is defined as being relevant and faithfully represented so as to best aid the investment decisions of capital providers. Value relevance research utilizes adaptations of the Ohlson (1995) model to assess the attribute of value relevance, which is one of the attributes that make information useful. This study firstly examines the value relevance of the financial information disclosed in the financial reports of extractive firms. The findings reveal that the value relevance of information disclosed in the financial reports depends on the circumstances of the firm, including sector, size and profitability. Traditional accounting concepts such as the matching concept can be ineffective when applied to small firms that are primarily engaged in non-production activities involving significant levels of uncertainty, such as exploration activities or the development of sites. Standard setting bodies such as the International Accounting Standards Board and the Financial Accounting Standards Board have addressed the financial reporting challenges in the extractive industry by allowing a significant amount of accounting flexibility in industry-specific accounting standards, particularly in relation to the accounting treatment of exploration and evaluation expenditure.
Therefore, secondly, this study examines whether the choice of exploration accounting policy has an effect on the value relevance of information disclosed in the financial reports. The findings show that, in general, the Successful Efforts method produces value relevant information in the financial reports of profitable extractive firms. However, specifically in the oil & gas sector, the Full Cost method produces value relevant asset disclosures if the firm is loss-making. This indicates that investors in production and non-production orientated firms have different information needs, and these needs cannot be simultaneously fulfilled by a single accounting policy. In the mining sector, a preference by large profitable mining companies towards a more conservative policy than either the Full Cost or Successful Efforts methods does not result in more value relevant information being disclosed in the financial reports. This finding supports the view that the qualitative characteristic of prudence is a form of bias which has a downward effect on asset values. The third aspect of this study is an examination of the effect of corporate governance on the value relevance of disclosures made in the financial reports of extractive firms. The findings show that the key factor influencing the value relevance of financial information is the ability of the directors to select accounting policies which reflect the economic substance of the particular circumstances facing the firm in an effective way. Corporate governance is found to have an effect on value relevance, particularly in the oil & gas sector. However, there is no significant difference between the exploration accounting policy choices made by directors of firms with good systems of corporate governance and those with weak systems of corporate governance.
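Value relevance tests of this kind typically rest on price-level regressions adapted from the Ohlson (1995) model, of the general form P = a0 + a1*BV + a2*E + error. A minimal sketch with synthetic data (not the study's sample):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
bv = rng.uniform(1, 10, n)             # book value per share (synthetic)
e = rng.uniform(-1, 2, n)              # earnings per share (synthetic)
price = 0.5 + 1.2 * bv + 3.0 * e + rng.normal(0, 0.5, n)

# OLS estimate of the price-level specification
X = np.column_stack([np.ones(n), bv, e])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)
resid = price - X @ coef
r2 = 1 - resid.var() / price.var()
print(coef.round(2), round(r2, 3))
```

In the study's setting, the R-squared (or coefficient significance) from such regressions is what is compared across sectors, firm sizes and accounting policy choices.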

Relevance: 50.00%

Publisher:

Abstract:

Waterways have many more ties with society than their role as a medium for the transportation of goods alone. Waterway systems offer society many kinds of socio-economic value. Waterway authorities responsible for management and (re)development need to optimize the public benefits for the investments made. However, due to the many trade-offs in the system, these agencies have multiple options for achieving this goal. Because they can invest resources in a great many different ways, they need a way to calculate the efficiency of the decisions they make. Transaction cost theory, and the analysis that goes with it, has emerged as an important means of justifying efficiency decisions in the economic arena. To improve our understanding of the value-creating and coordination problems for waterway authorities, such a framework is applied to this sector. This paper describes the findings for two cases, which reflect two common multi-trade-off situations for waterway (re)development. Our first case study focuses on the Miami River, an urban revitalized waterway. The second case describes the Inner Harbor Navigation Canal in New Orleans, a canal and lock in an industrialized zone, in need of an upgrade to keep pace with market developments. The transaction cost framework appears to be useful in exposing a wide variety of value-creating opportunities and the resistances that come with them. These insights can offer infrastructure managers guidance on how to seize these opportunities.

Relevance: 50.00%

Publisher:

Abstract:

Teachers frequently struggle to cope with conduct problems in the classroom. The aim of this study was to assess the effectiveness of the Incredible Years Teacher Classroom Management Training Programme for improving teacher competencies and child adjustment. The study involved a group randomised controlled trial which included 22 teachers and 217 children (102 boys and 115 girls). The average age of children included in the study was 5.3 years (standard deviation = 0.89). Teachers were randomly allocated to an intervention group (n = 11 teachers; 110 children) or a waiting-list control group (n = 11; 107 children). The sample also included 63 ‘high-risk’ children (33 intervention; 30 control), who scored above the cut-off (>12) on the Strengths and Difficulties Questionnaire for abnormal socioemotional and behavioural difficulties. Teacher and child behaviours were assessed at baseline and 6 months later using psychometric and observational measures. Programme delivery costs were also analysed. Results showed positive changes in teachers’ self-reported use of positive classroom management strategies (effect size = 0.56), as well as negative classroom management strategies (effect size = −0.43). Teacher reports also highlight improvements in the classroom behaviour of the high-risk group of children, while the estimated cost of delivering the Incredible Years Teacher Classroom Management Training Programme was modest. However, analyses of teacher and child observations were largely non-significant. A need for further research exploring the effectiveness and cost-effectiveness of the Incredible Years Teacher Classroom Management Training Programme is indicated.
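The effect sizes quoted above are standardised mean differences. A minimal sketch of that calculation (Cohen's d with a pooled SD), using invented change scores rather than the study's data:

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    # Standardised mean difference between two groups using the pooled SD.
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Hypothetical teacher-level change scores: intervention vs control
d = cohens_d(5.0, 2.0, 11, 4.0, 1.8, 11)
print(round(d, 2))
```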

Relevance: 50.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance: 50.00%

Publisher:

Abstract:

Empirical validity of the claim that overhead costs are driven not by production volume but by transactions resulting from production complexity is examined using data from 32 manufacturing plants from the electronics, machinery, and automobile components industries. Transactions are measured using number of engineering change orders, number of purchasing and production planning personnel, shop-floor area per part, and number of quality control and improvement personnel. Results indicate a strong positive relation between manufacturing overhead costs and both manufacturing transactions and production volume. Most of the variation in overhead costs, however, is explained by measures of manufacturing transactions, not volume.
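The paper's test logic can be sketched with synthetic data: regress overhead on production volume alone and on a transaction measure alone, and compare the explained variation. The data-generating process below is invented to mimic the reported finding; it is not the plant data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32                                    # 32 plants, as in the study
volume = rng.uniform(10, 100, n)
transactions = rng.uniform(5, 50, n)      # e.g. engineering change orders
overhead = 2.0 * transactions + 0.3 * volume + rng.normal(0, 3, n)

def r_squared(x, y):
    # R^2 of a simple OLS regression of y on x (with intercept).
    X = np.column_stack([np.ones(len(y)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print(round(r_squared(volume, overhead), 2),
      round(r_squared(transactions, overhead), 2))
```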

Relevance: 50.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has been traditionally used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks whose computation and execution models limit the user program to directly access the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading it onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
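The progressive-analytics idea behind NOW! can be illustrated in miniature (our simplification, not its implementation): compute an aggregate on progressively larger samples, so that early, cheap estimates refine deterministically towards the exact answer.

```python
import random

random.seed(42)
data = [random.uniform(0, 100) for _ in range(100_000)]
true_mean = sum(data) / len(data)

# Progressive prefix samples of growing size: each estimate refines the last.
sizes = (100, 1_000, 10_000, 100_000)
estimates = [sum(data[:size]) / size for size in sizes]

for size, est in zip(sizes, estimates):
    print(f"n={size:>6}  estimate={est:8.3f}  error={abs(est - true_mean):.3f}")
```

The final sample is the full data set, so the last estimate is exact; the early ones cost proportionally fewer resources, which is the source of the cost savings the abstract describes.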

Relevance: 50.00%

Publisher:

Abstract:

We explore the recently developed snapshot-based dynamic mode decomposition (DMD) technique, a matrix-free Arnoldi-type method, to predict 3D linear global flow instabilities. We apply the DMD technique to flows confined in an L-shaped cavity and compare the resulting modes to their counterparts issued from classic, matrix-forming, linear instability analysis (i.e. the BiGlobal approach) and direct numerical simulations. Results show that the DMD technique, which uses snapshots generated by a 3D non-linear incompressible discontinuous Galerkin Navier–Stokes solver, provides very similar results to classical linear instability analysis techniques. In addition, we compare DMD results issued from non-linear and linearised Navier–Stokes solvers, showing that linearisation is not necessary (i.e. a base flow is not required) to obtain linear modes, as long as the analysis is restricted to the exponential growth regime, that is, the flow regime governed by the linearised Navier–Stokes equations, and showing the potential of this type of snapshot-based analysis for general-purpose CFD codes, without the need for modifications. Finally, this work shows that the DMD technique can provide three-dimensional direct and adjoint modes through snapshots provided by the linearised and adjoint linearised Navier–Stokes equations advanced in time. Subsequently, these modes are used to provide structural sensitivity maps and sensitivity to base flow modification information for 3D flows and complex geometries, at an affordable computational cost. The information provided by the sensitivity study is used to modify the L-shaped geometry and control the most unstable 3D mode.
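A minimal sketch of the standard SVD-based snapshot DMD algorithm (not the paper's solver or test case): given snapshots of a known linear system, recover its eigenvalues from the reduced-order operator.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.9, 0.2],
              [0.0, 0.8]])                   # toy dynamics, eigenvalues 0.9, 0.8
x = rng.normal(size=2)
cols = []
for _ in range(20):                          # generate the snapshot sequence
    x = A @ x
    cols.append(x)
snapshots = np.column_stack(cols)

X, Y = snapshots[:, :-1], snapshots[:, 1:]   # time-shifted snapshot matrices
U, s, Vh = np.linalg.svd(X, full_matrices=False)
r = int((s > 1e-10 * s[0]).sum())            # truncate negligible modes
U, s, Vh = U[:, :r], s[:r], Vh[:r]
A_tilde = U.T @ Y @ Vh.T @ np.diag(1.0 / s)  # reduced-order operator
eigvals = np.linalg.eigvals(A_tilde)         # DMD (Ritz) eigenvalues
print(np.sort(eigvals.real))
```

The recovered eigenvalues match those of A; in the flow setting, their logarithms give the growth rates and frequencies of the global modes.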

Relevance: 50.00%

Publisher:

Abstract:

The fundamental objective for health research is to determine whether changes should be made to clinical decisions. Decisions made by veterinary surgeons in the light of new research evidence are known to be influenced by their prior beliefs, especially their initial opinions about the plausibility of possible results. In this paper, clinical trial results for a bovine mastitis control plan were evaluated within a Bayesian context, to incorporate a community of prior distributions that represented a spectrum of clinical prior beliefs. The aim was to quantify the effect of veterinary surgeons’ initial viewpoints on the interpretation of the trial results. A Bayesian analysis was conducted using Markov chain Monte Carlo procedures. Stochastic models included a financial cost attributed to a change in clinical mastitis following implementation of the control plan. Prior distributions were incorporated that covered a realistic range of possible clinical viewpoints, including scepticism, enthusiasm and uncertainty. Posterior distributions revealed important differences in the financial gain that clinicians with different starting viewpoints would anticipate from the mastitis control plan, given the actual research results. For example, a severe sceptic would ascribe a probability of 0.50 for a return of <£5 per cow in an average herd that implemented the plan, whereas an enthusiast would ascribe this probability for a return of >£20 per cow. Simulations using increased trial sizes indicated that if the original study was four times as large, an initial sceptic would be more convinced about the efficacy of the control plan but would still anticipate less financial return than an initial enthusiast would anticipate after the original study. In conclusion, it is possible to estimate how clinicians’ prior beliefs influence their interpretation of research evidence. 
Further research on the extent to which different interpretations of evidence result in changes to clinical practice would be worthwhile.
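The influence of prior viewpoints can be illustrated with a conjugate normal-normal model, far simpler than the paper's MCMC analysis; all numbers are invented. A sceptical and an enthusiastic prior on mean financial return (pounds per cow) are updated with the same trial evidence:

```python
def posterior(prior_mean, prior_var, data_mean, data_var):
    # Conjugate normal-normal update: the posterior mean is a
    # precision-weighted average of prior belief and trial evidence.
    w = prior_var / (prior_var + data_var)           # weight on the data
    post_mean = (1 - w) * prior_mean + w * data_mean
    post_var = prior_var * data_var / (prior_var + data_var)
    return post_mean, post_var

evidence = (20.0, 25.0)                              # trial: mean 20, variance 25
priors = [("sceptic", (0.0, 16.0)), ("enthusiast", (25.0, 16.0))]
for label, prior in priors:
    m, v = posterior(*prior, *evidence)
    print(f"{label:>10}: posterior mean {m:.1f}, sd {v**0.5:.1f}")
```

With the same evidence, the sceptic and the enthusiast land on clearly different posterior expectations, the qualitative point the paper makes; a larger trial (smaller data variance) would pull both closer to the data.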