111 results for supply lead time

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

The time for conducting Preventive Maintenance (PM) on an asset is often determined using a predefined alarm limit based on trends in a hazard function. In this paper, the authors propose using both hazard and reliability functions to improve the accuracy of the prediction, particularly when the failure characteristics over the asset's whole life are modelled using different failure distributions for the different stages of its life. The proposed method is validated using simulations and case studies.
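As a hedged illustration of that idea, the sketch below combines a hazard-limit check with a reliability floor over a two-stage life modelled by different Weibull distributions. The shape/scale values and thresholds are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """Hazard function h(t) of a Weibull distribution."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def weibull_reliability(t, shape, scale):
    """Reliability (survival) function R(t) of a Weibull distribution."""
    return np.exp(-(t / scale) ** shape)

def pm_alarm(t, stages, hazard_limit=0.01, reliability_floor=0.90):
    """Raise a PM alarm when the hazard exceeds a limit or reliability
    drops below a floor, using the failure distribution of the life
    stage that t falls in. `stages` is a list of
    (stage_end_time, shape, scale) tuples covering the asset life."""
    for end, shape, scale in stages:
        if t <= end:
            h = weibull_hazard(t, shape, scale)
            r = weibull_reliability(t, shape, scale)
            return h > hazard_limit or r < reliability_floor
    return True  # beyond the modelled life: always maintain

# Illustrative two-stage life: early wear-in, then wear-out.
stages = [(500.0, 0.8, 5000.0), (2000.0, 2.5, 1500.0)]
for t in (100, 400, 800, 1200):
    print(t, pm_alarm(t, stages))
```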

Relevance: 100.00%

Abstract:

Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost. To reach these goals, they need good-quality components from suppliers at optimum price and lead time. This has pushed companies to adopt improvement practices such as lean manufacturing, Just in Time (JIT) and effective supply chain management. Applying new improvement techniques and tools causes higher establishment costs and more Information Delay (ID); on the other hand, these techniques may reduce the risk of stock-outs and improve supply chain flexibility, giving better overall performance. Practitioners, however, cannot measure the overall effects of such improvement techniques without a standard evaluation model, so an effective overall supply chain performance evaluation model is essential for suppliers as well as manufacturers to assess their companies under different supply chain strategies. Literature on lean supply chain performance evaluation is comparatively limited, and most existing models assume random values for performance variables. The purpose of this paper is to propose an effective supply chain performance evaluation model using triangular linguistic fuzzy numbers and to recommend optimum ranges for performance variables for lean implementation. The model considers all the supply chain performance criteria (input, output and flexibility), converts the values to triangular linguistic fuzzy numbers and evaluates overall supply chain performance under different situations. Results show that, with the proposed performance measurement model, the improvement area for each variable can be accurately identified.
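A minimal sketch of this style of evaluation, assuming a five-level linguistic scale mapped to triangular fuzzy numbers and centroid defuzzification; the scale values, criteria and scores below are illustrative, not the paper's model.

```python
# Linguistic terms as (low, mode, high) triangular fuzzy numbers.
LINGUISTIC = {
    "poor":      (0.00, 0.00, 0.25),
    "fair":      (0.00, 0.25, 0.50),
    "good":      (0.25, 0.50, 0.75),
    "very good": (0.50, 0.75, 1.00),
    "excellent": (0.75, 1.00, 1.00),
}

def fuzzy_mean(tfns):
    """Average triangular fuzzy numbers component-wise."""
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number: (a + b + c) / 3."""
    return sum(tfn) / 3.0

# Linguistic ratings per performance criterion (input, output, flexibility).
scores = {
    "input":       ["good", "very good", "fair"],
    "output":      ["very good", "excellent"],
    "flexibility": ["fair", "good"],
}
per_criterion = {k: fuzzy_mean([LINGUISTIC[r] for r in v])
                 for k, v in scores.items()}
overall = fuzzy_mean(list(per_criterion.values()))
print("overall supply chain performance:", round(defuzzify(overall), 3))
```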

Relevance: 90.00%

Abstract:

This research develops a design support system that can estimate the life cycle cost of different product families at the early stage of product development. By implementing the system, a designer can develop various cost-effective product families in a shorter lead time and minimise the destructive impact of the product family on the environment.
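A hedged sketch of early-stage life cycle cost comparison across variants of a product family; the cost phases and figures are assumptions for illustration, not the system described here.

```python
from dataclasses import dataclass

@dataclass
class VariantCost:
    design: float         # one-off development cost share per unit
    manufacturing: float  # per-unit production cost
    use: float            # energy and maintenance over the service life
    end_of_life: float    # disposal/recycling cost (negative if a credit)

    def life_cycle_cost(self) -> float:
        """Total life cycle cost as the sum of the phase costs."""
        return self.design + self.manufacturing + self.use + self.end_of_life

# Compare two hypothetical variants of a product family at concept stage.
family = {
    "variant_a": VariantCost(120.0, 450.0, 300.0, 25.0),
    "variant_b": VariantCost(150.0, 420.0, 260.0, -10.0),  # recycling credit
}
for name, v in family.items():
    print(name, v.life_cycle_cost())
```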

Relevance: 80.00%

Abstract:

In truck manufacturing, exhaust and air inlet pipes are specialised components whose production requires highly skilled operators, heavy machinery and small-batch production methods. This paper describes a project to develop a computer numerically controlled (CNC) pipe bending process for a truck component manufacturer. The company supplies a wide range of heavy-duty truck parts to the domestic market and is a significant supplier in Australia. It had been using traditional machine-assisted manual pipe bending techniques. In a drive for continuous improvement, the company acquired a pre-owned CNC bending machine capable of automatically bending pipes with up to 25 bends; however, due to a process mismatch, this machine was used only for single bending operations. The researchers studied the bending system and changed the manufacturing process. Using an example exhaust pipe as the benchmark, a significant drop in manufacturing lead time from 70 minutes to 40 minutes per pipe was demonstrated. Material costs also decreased, because a multi-bend part could be formed in one piece without cutting excess material for each single bend as before.

Relevance: 80.00%

Abstract:

Delay in the delivery of manufacturing projects is a major problem in manufacturing industries. This research investigates the factors that influence the lead time of new projects in manufacturing organisations. Employing questionnaire survey and interview methodologies, this study collected data from five leading manufacturing organisations, as well as their suppliers and contractors, in Saudi Arabia to examine what delays occur in new project implementation, how they occur and why. Results show that the main factors contributing to manufacturing delays relate to people and material, while social, political and cultural factors were the least significant. The views of manufacturers, suppliers and contractors regarding the causes of delays are also analysed.

Relevance: 80.00%

Abstract:

An effective prognostics program provides ample lead time for maintenance engineers to schedule a repair and to acquire replacement components before catastrophic failures occur. This paper presents a technique for accurate assessment of the remnant life of machines based on a health state probability estimation technique. For a comparative study of the proposed model with the proportional hazards model (PHM), experimental bearing failure data from an accelerated bearing test rig were used. The results show that the proposed prognostic model based on health state probability estimation provides more accurate predictions than the commonly used PHM in the bearing failure case study.
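A hedged sketch of the underlying idea: given health state probabilities from a classifier (not shown) trained on condition-monitoring features such as bearing vibration statistics, the remnant life can be estimated as a probability-weighted average of per-state nominal remaining lives. The states and values are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

# Hypothetical health states (healthy, degrading, degraded, near-failure)
# and their nominal remaining lives in hours.
REMAINING_LIFE = np.array([5000.0, 2000.0, 500.0, 50.0])

def remnant_life(state_probs):
    """Expected remaining useful life: the probability-weighted average
    of the per-state nominal remaining lives."""
    p = np.asarray(state_probs, dtype=float)
    assert np.isclose(p.sum(), 1.0), "state probabilities must sum to 1"
    return float(p @ REMAINING_LIFE)

# Example: the machine is most likely in the 'degraded' state.
print(remnant_life([0.05, 0.20, 0.65, 0.10]))  # 980.0 hours
```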

Relevance: 80.00%

Abstract:

Mismatches between services needing to interoperate have been addressed through the adaptation of structural and behavioural interfaces of services, which in practice incurs long lead times through manual coding effort. We propose a framework, complementary to conventional service adaptation, to synthesise service interfaces in the open setting of business networks, allowing consumers to introspect service interfaces and formulate service invocations. The framework also allows service requests to evolve as new features of service capabilities are discovered through interactions with other, similar services. Finally, the framework fosters reuse of adaptation efforts through normalisation of the structural and behavioural interfaces of similar services. This paper provides a first exposition of the service interface synthesis framework, describing patterns containing novel requirements for unilateral service adaptation and detailing the interface synthesis technique. Complex examples of services drawn from commercial logistics systems are then used to validate the synthesis technique and to identify open challenges and future research directions.

Relevance: 30.00%

Abstract:

Construction sector policy makers have the opportunity to create improvements and to develop economic, social and environmental sustainability through supply chain economics. The idea of using the supply chain concept to improve firm behaviour and industry performance is not new; however, there has been limited application and little or no measurement to monitor successful implementation. Purchasing policies have often been developed with sound strategic procurement principles, but even these have had limited penetration into the processes and practices of infrastructure agencies. The research reported in this paper documents an action research study currently being undertaken in the Australian construction sector, which explores supply chain economic policy implementation for sectoral change by two government agencies. The theory informing this study is the emerging area of construction supply chain economics. There are five stages to the project: demand analysis, chain analysis, government agency organisational audit, supplier strategy and strategic alignment. The overall objective is the development of a Supplier Group Strategy Map for two public sector agencies. Two construction subsectors are examined in detail: construction and demolition waste, and precast concrete. Both subsectors are critical to the economic and environmental sustainability performance of the construction sector, and of the community as a whole, in the jurisdictions concerned. The local and state government agencies at the core of the case studies rely individually on the performance of these sectors. The study is set within the context of a sound state purchasing policy that has, however, had limited application by the two agencies. Partial results of the study are presented; early findings indicate that the standard risk-versus-expenditure procurement model does not capture the complexities of project, owner and government risk considerations. A new model is proposed in this paper, which incorporates the added dimension of time. The research results have numerous stakeholders: they will hold particular value for those interested in regional construction sector economics, for government agencies that develop and implement policy and have a large construction purchasing imprint, and for the players involved in the two subsectors. Although this is an Australian study, it has widespread applicability, as previous research indicates that procurement reform is of international significance and that policy implementation is problematic.

Relevance: 30.00%

Abstract:

Joint ventures can take many forms and can be formed for different reasons, from sharing resources to creating future business opportunities. At the same time, there is increasing interest in alternative procurement methods that move away from traditional procurement systems towards relational approaches. Business systems and strategies need to be redefined, moving from a short-term, project-to-project culture to a more strategic, long-term perspective. Joint ventures between construction organisations, global and local, have become increasingly popular for delivering large-scale infrastructure construction projects. However, successful strategic collaborations require project organisations to formulate a fit between contractual and operational arrangements for each situation. This study reviews the movement from traditional procurement methods towards relational contracting approaches in Queensland, Australia, and examines the organisational factors that facilitate sustainable relationships between project organisations and hence lead to long-term business success. This paper reports initial findings from a survey of construction contracting organisations in Australia, focusing on supply chain relationships. Contractors' perceptions of the relationship management process and the engagement of the supply chain are also presented.

Relevance: 30.00%

Abstract:

Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects jumps in the sample paths. The presence of long memory, which contradicts the efficient market hypothesis, remains a matter of debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges.

The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on identifying the scaling of the q-th-order moments and is a generalisation of standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected in stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory, and for these series heavy tails are also pronounced in their probability densities.

The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method, and are applied to the exchange rates and the electricity prices of Part I with the aim of confirming the long-range dependence established by MF-DFA.

The third part of the thesis applies the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in two- and three-dimensional spaces of data sets, and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes that may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia, and comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
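A minimal sketch of the detrending idea behind MF-DFA, restricted to the standard DFA case q = 2; the full multifractal method repeats the fluctuation computation over a range of q values and polynomial orders. The scales and the white-noise benchmark are illustrative.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Standard detrended fluctuation analysis (the q = 2 case of
    MF-DFA). Returns the scaling exponent alpha from a log-log fit of
    the fluctuation function F(s) against window size s: alpha near 0.5
    indicates no memory, alpha > 0.5 suggests long memory."""
    y = np.cumsum(x - np.mean(x))  # the profile (integrated series)
    F = []
    for s in scales:
        n = len(y) // s
        msq = []
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)  # memoryless benchmark series
print(dfa(returns, scales=[16, 32, 64, 128, 256]))  # approximately 0.5
```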

Relevance: 30.00%

Abstract:

Researching administrative history is problematic. A trail of authoritative documents is often hard to find, and useful summaries can be difficult to organise, especially if source material is in paper formats in geographically dispersed locations. In the absence of documents, the reasons for particular decisions and the rationale underpinning particular policies can be confounded as key personnel advance in their professions and retire. The rationale for past decisions may be lost for practical purposes; and if an organisation's memory of events is diminished, its learning through experience is also diminished. Publishing this document seeks to avoid unnecessary duplication of effort by other researchers who need to look into how policies of charging for public sector information have been justified. The author compiled this work within a somewhat limited time period, and the work does not pretend to be a complete or comprehensive analysis of the issues.

A significant part of the role of government is to provide a framework of legally enforceable rights and obligations that can support individuals and non-government organisations in their lawful activities. Accordingly, claims that governments should be more 'business-like' need careful scrutiny. A significant supply of goods and services occurs as non-market activity where neither benefits nor costs are quantified within conventional accounting systems or in terms of money. Where a government decides to provide information as a service, and information from land registries is archetypical, the transactions occur as a political decision made under a direct or clearly delegated authority of a parliament with the requisite constitutional powers. This is not a market transaction, and the language of the market confuses attempts to describe a number of aspects of how governments allocate resources.

Cost recovery can be construed as an aspect of taxation, which is a sole prerogative of a parliament. The issues are fundamental to political constitutions, but they become more complicated where states cede some taxing powers to a central government as part of a federal system. Nor should the absence of markets necessarily be construed as 'market failure' or even 'government failure': the absence is often attributable to particular technical, economic and political constraints that preclude the operation of markets. Arguably, greater care is needed in distinguishing between the polity and markets in raising revenues and allocating resources, and that needs to start by removing unhelpful references to 'business' in the context of government decision-making.

Relevance: 30.00%

Abstract:

In this thesis we are interested in financial risk, and the instrument we use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist, and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits; percentiles describe tail behaviour. The estimation of VaR is a complex task, and it is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, perhaps controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness: is skewness constant, or is there significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach to modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently; in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs are used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We use local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the data-generating process underlying the data and helps in choosing a good technique to model them. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. The GLDs also suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
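A hedged sketch of the semi-parametric, moving-window use of the GLDs for VaR. It shows the Ramberg-Schmeiser quantile function and a rolling loop that would refit the four lambdas on each window; the per-window fitting step (for example by moments or percentile matching) is omitted and passed in as `fit`, and the parameterisation shown is an assumption, since the thesis may use a different GLD form.

```python
import numpy as np

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Ramberg-Schmeiser GLD quantile function:
    Q(u) = l1 + (u**l3 - (1 - u)**l4) / l2."""
    u = np.asarray(u, dtype=float)
    return lam1 + (u ** lam3 - (1.0 - u) ** lam4) / lam2

def rolling_var(returns, window=250, level=0.01, fit=None):
    """One-day VaR from a moving window. With a GLD fitter supplied,
    refit the lambdas on each window and read off the level-quantile;
    with fit=None, fall back to the empirical (historical) percentile."""
    out = []
    for t in range(window, len(returns)):
        sample = returns[t - window:t]
        if fit is None:
            out.append(np.quantile(sample, level))
        else:
            out.append(gld_quantile(level, *fit(sample)))
    return np.array(out)

# These classic lambdas approximate a standard normal: Q(0.01) ~ -2.33.
print(gld_quantile(0.01, 0.0, 0.1975, 0.1349, 0.1349))

rng = np.random.default_rng(1)
rets = rng.standard_t(df=4, size=1000) * 0.01  # heavy-tailed returns
print(rolling_var(rets)[:5])  # historical-simulation 1% VaR
```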

Relevance: 30.00%

Abstract:

Background: A complete explanation of the mechanisms by which Pb2+ exerts toxic effects on the developing central nervous system remains unknown. Glutamate is critical to the developing brain, acting through various subtypes of ionotropic or metabotropic glutamate receptors (mGluRs). Ionotropic N-methyl-D-aspartate receptors have been considered a principal target in lead-induced neurotoxicity, and many recent studies have linked mGluR3/mGluR7 to synaptic plasticity. The present study aimed to examine the role of mGluR3/mGluR7 in lead-induced neurotoxicity. Methods: Twenty-four adult female rats were randomly assigned to a control diet or 0.2% lead acetate during gestation and lactation. Blood and hippocampal lead levels of pups were analysed at weaning to evaluate the actual lead content at the end of the exposure. Impairments of short-term and long-term memory in pups were assessed using the Morris water maze and by detection of hippocampal ultrastructural alterations on electron microscopy. The impact of lead exposure on mGluR3 and mGluR7 mRNA expression in the hippocampal tissue of pups was investigated by quantitative real-time polymerase chain reaction, and its potential role in lead neurotoxicity is discussed. Results: Lead levels of blood and hippocampi in the lead-exposed rats were significantly higher than those in the controls (P < 0.001). In the Morris water maze tests, the overall decrease in goal latency and swimming distance indicated that controls had shorter latencies and distances than lead-exposed rats (P = 0.001 and P < 0.001 by repeated-measures analysis of variance). Neuronal ultrastructural alterations were observed on transmission electron microscopy, and real-time polymerase chain reaction showed that exposure to 0.2% lead acetate did not substantially change mGluR3 and mGluR7 mRNA expression compared with controls. Conclusion: Exposure to lead before and after birth can damage the short-term and long-term memory of young rats and the hippocampal ultrastructure. However, the current study provides no evidence that the expression of rat hippocampal mGluR3 and mGluR7 can be altered by systemic administration of lead during gestation and lactation, a finding that is informative for the field of lead-induced developmental neurotoxicity in suggesting that it may not be worthwhile to include mGluR3 and mGluR7 in future studies.

Relevance: 30.00%

Abstract:

The increase in buyer-driven supply chains, outsourcing and other forms of non-traditional employment has created challenges for labour market regulation. One business model that has created substantial regulatory challenges is the supply chain. The supply chain model involves retailers purchasing products from brand corporations, which then outsource manufacturing to traders who contract with the factories or outworkers who actually manufacture the clothing and textiles. This business model pushes time and cost pressures down the supply chain, resulting in sweatshops where workers systematically have their labour rights violated. Literally millions of workers work in dangerous workplaces where thousands are killed or permanently disabled every year. This thesis analyses possible regulatory responses to provide workers a right to safety and health in supply chains that provide products for Australian retailers. It uses a human rights standard to determine whether Australia is discharging its human rights obligations in its approach to combating domestic and foreign labour abuses. It is beyond this thesis to analyse Occupational Health and Safety (OHS) laws in every jurisdiction; accordingly, it focuses on Australian domestic laws and the laws of one of Australia's major trading partners, the People's Republic of China (China). It is hypothesised that Australia is currently breaching its human rights obligations by failing to adequately regulate employees' safety at work in Australian-based supply chains. To test this hypothesis, the thesis adopts a three-phase approach to analysing Australia's regulatory responses. Phase 1 identifies the standard by which Australia's regulatory approach to employees' health and safety in supply chains can be judged, focusing on how workers' right to safety as a human right imposes a moral obligation on Australia to take reasonably practicable steps to regulate Australian-based supply chains; this forms a human rights standard against which Australia's conduct can be judged. Phase 2 focuses on the current regulatory environment: if existing regulatory vehicles adequately protect the health and safety of employees, then Australia will have discharged its obligations simply by maintaining the status quo. Australia currently regulates OHS through a combination of 'hard law' and 'soft law' regulatory vehicles. The first part of Phase 2 analyses the effectiveness of traditional OHS laws in Australia and in China; the final part analyses the effectiveness of the major soft law vehicle, Corporate Social Responsibility (CSR). The fact that employees are working in unsafe conditions does not of itself mean Australia is breaching its human rights obligations; Australia is only required to take reasonably practicable steps to ensure human rights are realized. Phase 3 identifies four regulatory vehicles and determines whether they would assist Australia in discharging its human rights obligations: it analyses whether Australia could unilaterally introduce supply chain regulation covering domestic and extraterritorial supply chains, and it analyses three public international law regulatory vehicles, considering the ability of the United Nations Global Compact, the ILO's Better Factory Project and a bilateral agreement to improve the detection and enforcement of workers' right to safety and health.

Relevance: 30.00%

Abstract:

Vigilance declines when people are exposed to highly predictable and uneventful tasks. Monotonous tasks provide little cognitive and motor stimulation and contribute to human error. This paper aims to model and detect vigilance decline in real time through participants' reaction times during a monotonous task. A lab-based experiment adapting the Sustained Attention to Response Task (SART) was conducted to quantify the effect of monotony on overall performance, and the relevant parameters were then used to build a model detecting hypovigilance throughout the experiment. The accuracy of different mathematical models in detecting lapses in vigilance in real time, minute by minute, is compared. We show that monotonous tasks can lead to an average decline in performance of 45%. Furthermore, vigilance modelling enables vigilance decline to be detected from reaction times with an accuracy of 72% and a 29% false alarm rate. Bayesian models are identified as better at detecting lapses in vigilance than Neural Networks and Generalised Linear Mixed Models. This modelling could be used as a framework to detect vigilance decline in any human performing monotonous tasks.
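A minimal sketch of one plausible Bayesian detector, assuming a Gaussian naive Bayes classifier on the mean reaction time of each one-minute window; the features, priors and training values are illustrative assumptions, not the model reported here.

```python
import numpy as np

def fit_gaussian(x):
    """Mean and variance of a feature under one class."""
    return np.mean(x), np.var(x) + 1e-9

def log_likelihood(x, mu, var):
    """Log density of x under a Gaussian with the given mean/variance."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def hypovigilant(rt_window, alert, lapsed, prior_lapsed=0.3):
    """Classify a window of reaction times (seconds) as hypovigilant
    when the posterior for the 'lapsed' class exceeds the 'alert' one.
    `alert` and `lapsed` are (mean, variance) pairs from labelled data."""
    m = np.mean(rt_window)
    post_lapsed = np.log(prior_lapsed) + log_likelihood(m, *lapsed)
    post_alert = np.log(1 - prior_lapsed) + log_likelihood(m, *alert)
    return post_lapsed > post_alert

# Illustrative training data: alert RTs around 0.35 s, lapsed around 0.60 s.
rng = np.random.default_rng(2)
alert = fit_gaussian(rng.normal(0.35, 0.05, 200))
lapsed = fit_gaussian(rng.normal(0.60, 0.12, 200))
print(hypovigilant(rng.normal(0.58, 0.10, 20), alert, lapsed))  # True
```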