42 results for Setup

in Deakin Research Online - Australia


Relevance: 10.00%

Abstract:

This paper provides an examination of the determinants of derivative use by Australian corporations. We analysed the characteristics of a sample of 469 firm/year observations drawn from the largest Australian publicly listed companies in 1999 and 2000 to address two issues: the decision to use financial derivatives and the extent to which they are used. Logit analysis suggests that a firm's leverage (distress proxy), size (financial distress and setup costs) and liquidity (financial constraints proxy) are important factors associated with the decision to use derivatives. These findings support the financial distress hypothesis while the evidence on the underinvestment hypothesis is mixed. Additionally, setup costs appear to be important, as larger firms are more likely to use derivatives. Tobit results, on the other hand, show that once the decision to use derivatives has been made, a firm uses more derivatives as its leverage increases and as it pays out more dividends (hedging substitute proxy). The overall results indicate that Australian companies use derivatives with a view to enhancing the firms' value rather than to maximizing managerial wealth. In particular, corporations' derivative policies are mostly concerned with reducing the expected cost of financial distress and managing cash flows. Our inability to identify managerial influences behind the derivative decision suggests a competitive Australian managerial labor market.
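The first-stage logit model described above can be sketched on synthetic data. In this minimal illustration the variable names, scales and coefficients are invented, not taken from the study's dataset; only the procedure (maximum-likelihood logit fitted by gradient ascent) mirrors the abstract.

```python
import numpy as np

# Hypothetical sketch: logit model of the derivative-use decision.
# All data below is synthetic; proxies and coefficients are assumptions.
rng = np.random.default_rng(0)
n = 469                                # firm/year observations, as in the abstract
leverage  = rng.uniform(0, 1, n)       # financial-distress proxy
size      = rng.uniform(0, 1, n)       # standardised firm size (setup-cost proxy)
liquidity = rng.uniform(0, 1, n)       # financial-constraints proxy
X = np.column_stack([np.ones(n), leverage, size, liquidity])

true_b = np.array([-2.0, 3.0, 2.0, -2.0])
p_use = 1.0 / (1.0 + np.exp(-(X @ true_b)))
y = (rng.uniform(size=n) < p_use).astype(float)   # 1 = firm uses derivatives

b = np.zeros(4)
for _ in range(2000):                  # gradient ascent on the logit log-likelihood
    mu = 1.0 / (1.0 + np.exp(-(X @ b)))
    b += 0.5 * (X.T @ (y - mu)) / n

# Signs of the fitted coefficients mirror the abstract's story: leverage and
# size raise the probability of derivative use, liquidity lowers it.
print(np.sign(b[1:]))
```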

Relevance: 10.00%

Abstract:

The cost of recovery protocols matters for system performance both during normal operation (overhead) and during failure (the time taken to recover failed transactions). The cost of recovery protocols for web database systems, however, has received little attention. In this paper, we present a quantitative study of the cost of recovery protocols. For this purpose, we use an experimental setup to evaluate the performance of two recovery algorithms, namely the two-phase commit algorithm and a log-based algorithm. Our work is a step towards building reliable protocols for web database systems.
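The two-phase commit protocol evaluated in the study can be sketched as follows. This is a generic illustration of the protocol, not the paper's implementation: a coordinator first collects votes from every participant (phase 1), then broadcasts COMMIT only if all voted yes, otherwise ABORT (phase 2).

```python
# Minimal two-phase commit sketch (illustrative, not the paper's code).

class Participant:
    def __init__(self, name, can_commit=True):
        self.name = name
        self.can_commit = can_commit
        self.state = "INIT"

    def prepare(self):                 # phase 1: vote on the transaction
        self.state = "READY" if self.can_commit else "ABORTED"
        return self.can_commit

    def finish(self, decision):        # phase 2: apply the global decision
        self.state = decision

def two_phase_commit(participants):
    votes = [p.prepare() for p in participants]         # phase 1: collect votes
    decision = "COMMITTED" if all(votes) else "ABORTED" # unanimous yes required
    for p in participants:                              # phase 2: broadcast decision
        p.finish(decision)
    return decision

print(two_phase_commit([Participant("db1"), Participant("db2")]))
# A single NO vote forces a global abort:
print(two_phase_commit([Participant("db1"), Participant("db2", can_commit=False)]))
```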

Relevance: 10.00%

Abstract:

The problem of dimensional defects in aluminum die-casting is widespread throughout the foundry industry, and their detection is of paramount importance in maintaining product quality. Due to the unpredictable factory environment and the metallic, highly reflective nature of aluminum die-castings, it is extremely hard to autonomously estimate the true dimensions of a die-casting. In this work, we propose a novel, robust 3D reconstruction algorithm capable of reconstructing dimensionally accurate 3D depth models of aluminum die-castings. The developed system is very simple and cost-effective, as it consists of only a stereo camera pair and a simple fluorescent light. The developed system is capable of estimating surface depths within a tolerance of 1.5 mm. Moreover, the system is invariant to illuminative variations and to the orientation of objects in the input image space, which makes it highly robust. Due to its hardware simplicity and robustness, it can be implemented in different factory environments without significant changes to the setup.
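The depth estimate behind any stereo-camera setup of this kind rests on triangulation: for focal length f (in pixels), camera baseline B and pixel disparity d, depth is Z = f·B/d. This is generic stereo geometry, not the authors' reconstruction algorithm, and the example numbers are invented.

```python
# Generic stereo triangulation (not the paper's algorithm): depth Z = f * B / d.

def depth_from_disparity(f_px, baseline_mm, disparity_px):
    """Depth in mm from focal length (px), baseline (mm) and disparity (px)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_mm / disparity_px

# Example (invented values): f = 1000 px, B = 60 mm, d = 40 px -> Z = 1500 mm
print(depth_from_disparity(1000, 60, 40))
```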

Relevance: 10.00%

Abstract:

Many environmental studies require accurate simulation of water and solute fluxes in the unsaturated zone. This paper evaluates one- and multi-dimensional approaches for soil water flow as well as different spreading mechanisms to model solute behavior at different scales. For quantification of soil water fluxes, the Richards equation has become the standard. Although current numerical codes show perfect water balances, the calculated soil water fluxes in the case of head boundary conditions may depend largely on the method used for spatial averaging of the hydraulic conductivity. Atmospheric boundary conditions, especially in the case of phreatic groundwater levels fluctuating above and below the soil surface, require sophisticated solutions to ensure convergence. Concepts for flow in soils with macropores and unstable wetting fronts are still in development. One-dimensional flow models are formulated to work with lumped parameters in order to account for soil heterogeneity and preferential flow. They can be used at temporal and spatial scales that are of interest to water managers and policymakers. Multi-dimensional flow models are hampered by data and computation requirements. Their main strength is detailed analysis of typical multi-dimensional flow problems, including soil heterogeneity and preferential flow. Three physically based solute-transport concepts have been proposed to describe solute spreading during unsaturated flow: the stochastic-convective model (SCM), the convection-dispersion equation (CDE), and the fractional advection-dispersion equation (FADE). A less physical concept is the continuous-time random-walk process (CTRW). Of these, the SCM and the CDE are well established, and their strengths and weaknesses are identified. The FADE and the CTRW are more recent, and only a tentative strengths, weaknesses, opportunities and threats (SWOT) analysis can be presented at this time.
We discuss the effect of the number of dimensions in a numerical model and the spacing between model nodes on solute spreading and on the values of the solute-spreading parameters. In order to meet the increasing complexity of environmental problems, two approaches to model combination are used: model integration and model coupling. A main drawback of model integration is the complexity of the resulting code. Model coupling requires a systematic physical domain and model communication analysis. The setup and maintenance of a hydrologic framework for model coupling requires substantial resources, but on the other hand, contributions can be made by many research groups.
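For reference, the two central one-dimensional governing equations named above (the Richards equation and the CDE) can be written in their standard textbook forms; the notation below is conventional and is not reproduced from this paper.

```latex
% Richards equation for vertical unsaturated flow
% (theta: water content, h: pressure head, K: hydraulic conductivity, z: depth)
\frac{\partial \theta}{\partial t}
  = \frac{\partial}{\partial z}\!\left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right]

% Convection-dispersion equation (CDE) for solute transport
% (c: concentration, D: dispersion coefficient, v: pore-water velocity)
\frac{\partial c}{\partial t}
  = D \frac{\partial^2 c}{\partial z^2} - v \frac{\partial c}{\partial z}
```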

Relevance: 10.00%

Abstract:

We propose two new classes of hash functions motivated by Maximum Rank Distance (MRD) codes, and we analyse the security of these schemes. The system setup phase is computationally expensive for general field extensions. To overcome this limitation, we derive an algebraic solution which avoids computations in special extension fields in the intended operational range of the hash functions.
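The metric underlying MRD codes can be illustrated with a toy example (this is only the rank distance itself, not the proposed hash construction): the distance between two matrices A and B is rank(A − B), computed here over GF(2) by Gaussian elimination.

```python
# Toy illustration of the rank distance behind MRD codes, over GF(2).
# Not the paper's scheme; only the underlying metric is shown.

def gf2_rank(m):
    """Rank of a 0/1 matrix (list of row lists) over GF(2)."""
    m = [row.copy() for row in m]
    rank, rows, cols = 0, len(m), len(m[0])
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if m[r][col]), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]   # move pivot row up
        for r in range(rows):
            if r != rank and m[r][col]:         # eliminate the column elsewhere
                m[r] = [a ^ b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

A = [[1, 0, 1], [0, 1, 1], [1, 1, 0]]
B = [[1, 0, 1], [0, 1, 1], [0, 0, 0]]
diff = [[a ^ b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]  # A - B over GF(2)
print(gf2_rank(diff))   # rank distance between A and B
```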

Relevance: 10.00%

Abstract:

Purpose – To examine a simple testing method that measures the force required to pull a fabric through a series of parallel pins in order to determine fabric softness.

Design/methodology/approach – A testing system was set up for fabric pulling force measurements, and the testing parameters were experimentally determined. The specific pulling forces were compared with fabric assurance by simple testing (FAST) parameters and with subjective softness rankings. Their correlations were also statistically analyzed.

Findings – The fabric pulling force reflects the physical and surface properties of the fabrics measured by the FAST instrument, and its ability to rank fabric softness appears to be close to the human hand response to fabric softness. The pulling force method can also distinguish between fabrics knitted with different wool fiber contents.

Research limitations/implications – Only 21 woven and three knitted fabrics were used in this investigation. More fabrics with different structures and finishes should be evaluated before the testing method can be put into practice.

Practical implications – The testing method could be used for objective assessment of fabric softness.

Originality/value – The testing method reported in this paper is a new concept in fabric softness measurement. It can provide objective specifications for fabric softness and should therefore be valuable to the fabric community.
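The correlation step described under Design/methodology/approach can be sketched as follows. The measurement values below are invented for illustration; only the procedure (correlating the specific pulling force against a FAST-style parameter) mirrors the paper.

```python
import numpy as np

# Hypothetical sketch of the statistical analysis: correlating pulling force
# with a FAST-style fabric parameter. All numbers are invented.
pulling_force = np.array([1.2, 1.5, 1.9, 2.3, 2.8, 3.1])    # N (invented)
fast_bending  = np.array([5.0, 6.1, 7.9, 9.2, 11.0, 12.4])  # (invented)

r = np.corrcoef(pulling_force, fast_bending)[0, 1]  # Pearson correlation
print(round(r, 3))
```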

Relevance: 10.00%

Abstract:

We study the optimal size of a pay-as-you-go social security program for an economy composed of both permanent-income and hand-to-mouth consumers. While previous work on this topic is framed within a two-period partial equilibrium setup, we study this issue in a life-cycle general equilibrium model. Because this type of welfare analysis depends critically on unobservable preference parameters, we methodically consider all parameterizations of the unobservables that are both feasible and reasonable—all parameterizations that can mimic key features of macro data (feasible) while still being consistent with micro evidence and convention (reasonable). The baseline model predicts that the optimal tax rate is between 6 percent and 15 percent of wage income.

Relevance: 10.00%

Abstract:

The future global distribution of the political regimes of countries, just like that of their economic incomes, displays a surprising tendency for polarization into only two clubs of convergence at the extrema. This, in itself, is a persuasive reason to analyze afresh the logical validity of an endogenous theory for political and economic development inherent in modernization theory. I suggest how adopting a simple evolutionary game theoretic view on the subject allows an explanation for these parallel clubs of convergence in political regimes and economic income within the framework of existing research in democratization theory. I also suggest how instrumental action can be methodically introduced into such a setup using learning strategies adopted by political actors. These strategies, based on the first principles of political competition, are motivated by introducing the theoretical concept of a Credible Polity.
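The evolutionary game theoretic intuition for two "clubs of convergence" can be sketched with replicator dynamics in a 2x2 coordination game: the population share of one regime type converges to one of the two extrema depending on where it starts. The payoff values below are illustrative assumptions, not taken from the paper.

```python
# Replicator dynamics sketch: bistability in a coordination game produces
# two convergence clubs at the extrema. Payoffs are invented for illustration.

def replicator(x, a=2.0, b=0.0, c=0.0, d=1.0, steps=2000, dt=0.01):
    """Euler-integrate x' = x(1-x)(f1 - f2) for payoff matrix [[a, b], [c, d]]."""
    for _ in range(steps):
        f1 = a * x + b * (1 - x)       # fitness of strategy 1
        f2 = c * x + d * (1 - x)       # fitness of strategy 2
        x += dt * x * (1 - x) * (f1 - f2)
    return x

# The interior equilibrium here is x* = 1/3; starting points on either side
# of it flow to opposite extrema.
print(round(replicator(0.6), 3))   # converges near 1
print(round(replicator(0.2), 3))   # converges near 0
```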

Relevance: 10.00%

Abstract:

The problem of dimensional defects in aluminum die-castings is widespread throughout the foundry industry, and their detection is of paramount importance in maintaining product quality. Due to the unpredictable factory environment and the metallic, highly reflective nature of these parts, it is extremely hard to autonomously estimate their true dimensions. Some existing vision systems are capable of estimating depth to high accuracy, but they are heavily hardware-dependent, relying on light and laser pattern projectors integrated into vision systems, or on laser scanners. Due to the reflective nature of these metallic parts and variable factory environments, such vision systems tend to exhibit unpromising performance; moreover, their hardware dependency makes them cumbersome and costly. In this work, we propose a novel, robust 3D reconstruction algorithm capable of reconstructing dimensionally accurate 3D depth models of aluminum die-castings. The developed system is very simple and cost-effective, as it consists of only a pair of stereo cameras and a diffused fluorescent light. The proposed vision system is capable of estimating surface depths to an accuracy of 0.5 mm. In addition, the system is invariant to illuminative variations as well as to the orientation and location of objects in the input image space, making it highly robust. Due to its hardware simplicity and robustness, it can be implemented in different factory environments without significant changes to the setup. The proposed system forms a major part of a quality inspection system for the automotive manufacturing industry.


Relevance: 10.00%

Abstract:

In 1991, the World Health Assembly approved a set of Guiding Principles which emphasize "voluntary donation, non-commercialization and a preference for cadavers over living donors" (World Health Organization). The objective of this paper is to identify the factors that affect the ratio of cadaveric transplants to all transplants. The paper first provides background on the problems surrounding kidney transplants and then uses a theoretical framework that employs standard economic assumptions but incorporates a setup in which persons needing kidneys can obtain one from compatible relatives or purchase one from individuals willing to sell a kidney. The methods of economic theoretical analysis are used, with conclusions drawn from definitions and assumptions. This paper finds that factors such as inequality, rule of law and religion have a significant effect on the ratio of cadaveric transplants to all transplants. The paper concludes that improvements in equality and in the rule of law will increase the use of cadaveric kidney transplants. In addition, countering religious beliefs against cadaveric kidney transplants will also lead to a higher ratio of cadaveric transplants to all transplants.

Relevance: 10.00%

Abstract:

Wireless broadcasting is an efficient way to deliver data to a large number of users. Some commercial applications of wireless broadcasting, such as satellite pay-TV, require that only users who have paid for the service can retrieve broadcast data. This is often achieved by broadcast encryption, which allows a station to broadcast data securely over open air to a dynamically changing set of privileged users. Most existing broadcast encryption schemes can only revoke a pre-specified number of users before system re-setup, or require high computation, communication and storage overheads in receivers. In this paper, we propose a new broadcast encryption scheme based on smart cards. In our scheme, smart cards are used to prevent users from leaking secret keys. Additionally, once an illegally cloned smart card is captured, our scheme allows tracing of the compromised smart card from which the illegal cards were cloned, and can then revoke all cloned smart cards. The new features of our scheme include minimal computation in the smart card (only a few modular multiplications) and the capability to revoke any number of users in one revocation. Furthermore, our scheme is secure against both passive and active attacks and has better performance than other schemes.
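The basic broadcast-encryption idea can be illustrated with a naive toy (deliberately nothing like the paper's smart-card scheme, which avoids exactly this overhead): the station picks a fresh session key and wraps it under the long-term key of every currently privileged user, while revoked users simply receive no envelope. XOR stands in for real encryption purely for illustration.

```python
import secrets

# Naive toy broadcast encryption: one envelope per non-revoked user.
# XOR is a stand-in for a real cipher; this is illustrative only.

def broadcast(user_keys, revoked):
    session_key = secrets.randbits(64)
    envelopes = {u: k ^ session_key
                 for u, k in user_keys.items() if u not in revoked}
    return session_key, envelopes

def receive(user, key, envelopes):
    # A revoked user has no envelope and cannot recover the session key.
    return envelopes[user] ^ key if user in envelopes else None

user_keys = {"alice": secrets.randbits(64), "bob": secrets.randbits(64)}
sk, env = broadcast(user_keys, revoked={"bob"})
print(receive("alice", user_keys["alice"], env) == sk)  # True
print(receive("bob", user_keys["bob"], env))            # None: revoked
```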

Relevance: 10.00%

Abstract:

This report presents the results of an analysis of three years of tide and meteorological data aimed at delineating the influences of atmospheric pressure, waves, onshore winds and longshore winds on coastal sea levels west of Port Phillip Bay, Victoria. The data were used to develop predictive and hindcasting techniques for meteorological tides on the Otway Coast, using statistical methods, an empirical method and a mathematical model. The nature and magnitude of the contributions of the various components of the meteorological tide, as well as the general monthly and seasonal variability, were also studied. It was found that meteorological tides on the Otway Coast can account for significant sea level changes, the main factors being wind and atmospheric pressure. The wind component of the meteorological tide was found to be approximately twice the pressure component, and longshore winds were found to be more significant than onshore winds for wind setup on the Otway Coast. The meteorological tide models developed enable estimates of wind setup and atmospheric pressure setup on the Otway Coast to be readily computed using data from synoptic charts. The wave setup component could not be separated from the meteorological tide and is included in the wind setup component. The results of the investigation are relevant to the design and maintenance of coastal engineering works, and point to the need for the establishment and operation of coastal management schemes on the Otway Coast.
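The two components studied above admit simple textbook first-order estimates (the report's own models are statistical fits; the generic formulas and coefficient values below are standard assumptions, for orientation only). The inverse-barometer effect raises sea level by about 1 cm per hPa of pressure drop; steady-state wind setup over a shelf of fetch L and depth h is eta = tau·L / (rho_w·g·h), with wind stress tau = rho_a·Cd·U².

```python
# Textbook first-order estimates of pressure setup and wind setup.
# Constants are typical assumed values, not fitted from the report's data.
RHO_AIR, RHO_SEA, G, CD = 1.225, 1025.0, 9.81, 1.3e-3

def inverse_barometer_m(pressure_drop_hpa):
    """Sea-level rise (m) from the inverse-barometer rule: ~1 cm per hPa."""
    return 0.01 * pressure_drop_hpa

def wind_setup_m(wind_speed, fetch, depth):
    """Steady-state wind setup (m) over a shelf of given fetch and depth."""
    tau = RHO_AIR * CD * wind_speed ** 2     # wind stress, N/m^2
    return tau * fetch / (RHO_SEA * G * depth)

print(round(inverse_barometer_m(20), 2))           # 20 hPa storm low -> 0.2 m
print(round(wind_setup_m(20.0, 50_000.0, 30.0), 3))  # ~0.1 m for these inputs
```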

Relevance: 10.00%

Abstract:

This paper focuses on the development of a hybrid phenomenological/inductive model to improve the current physical setup force model on a five-stand industrial hot strip finishing mill. We approached the problem from two directions. In the first approach, the starting point was the output of the current setup force model. A feedforward multilayer perceptron (MLP) model was then used to estimate the true roll separating force, using other available variables as additional inputs to the model.

It was found that the estimation of roll separating force can be improved significantly, from an average error of 5.3% with the current setup model to 2.5% with the hybrid model. The corresponding improvement for first coils is from 7.5% with the current model to 3.8% with the hybrid model. This was achieved by including, in addition to each stand's force from the current model, the contributions of the setup forces from the other stands, as well as a limited set of additional variables: a) aim width; b) setup thickness; c) setup temperature; and d) measured force from the previous coil.
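The hybrid architecture can be sketched as follows. Everything here is synthetic and schematic (invented inputs, shapes and targets); it shows only the structural idea of a small MLP that takes the physical model's force prediction plus extra setup variables and learns a corrected force by gradient descent.

```python
import numpy as np

# Schematic hybrid phenomenological/inductive sketch on synthetic data:
# an MLP corrects a "physical model" output using extra (invented) inputs.
rng = np.random.default_rng(1)
n = 200
setup_force = rng.uniform(10, 20, n)            # physical model output (invented)
thickness   = rng.uniform(2, 10, n)             # invented extra input
temperature = rng.uniform(900, 1100, n)         # invented extra input
X = np.column_stack([setup_force, thickness, temperature])
X = (X - X.mean(0)) / X.std(0)                  # standardise inputs
# "measured" force = physical model plus a nonlinear correction (synthetic)
y = setup_force + 0.5 * np.sin(thickness) + 0.002 * (temperature - 1000)
y = (y - y.mean()) / y.std()

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)   # one hidden tanh layer
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)               # error before training
for _ in range(5000):                           # full-batch gradient descent
    h, pred = forward(X)
    g = 2 * (pred - y)[:, None] / n             # dLoss/dpred
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)                # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W2 -= 0.02 * gW2; b2 -= 0.02 * gb2
    W1 -= 0.02 * gW1; b1 -= 0.02 * gb1
_, pred1 = forward(X)
loss1 = np.mean((pred1 - y) ** 2)               # error after training
print(loss1 < loss0)                            # the learned correction helps
```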

In the second approach, we investigated the correlation between large errors in the current model and the model's input parameters. The data set was split into two subsets, one representing a "normal" level of error between the current model and the measured force value, the other containing the coils with a "large" level of error. An additional data set, recording the changes in each coil's inputs from the previous coil's inputs, was created to investigate the dependency on the previous coil.

The data sets were then analyzed using a C4.5 decision tree. The main findings were that the level of the speed vernier variable is highly correlated with large errors in the current setup model; specifically, a high positive speed vernier value often corresponded to a large error. Secondly, large changes in the model's flow stress values between coils were frequently correlated with larger errors in the current setup force model.

Relevance: 10.00%

Abstract:

Corrosion testing (half-cell and LPR) was carried out on a number of reinforced concrete panels taken from the fascia of a twenty-five-year-old high-rise building in Melbourne, Australia. Corrosion, predominantly a result of carbonation of the concrete, was associated with a limited amount of cracking. A monitoring technique was established in which probe electrodes (reference and counter) were retro-fitted into the concrete; the probe electrode setup was identical for all panels tested. It was found that the corrosion behaviour of all panels tested closely fitted a family of results when the corrosion potential is plotted against the polarisation resistance (Rp). This enabled the development of a so-called 'control curve' relating the corrosion potential to the Rp for all of the panels under investigation. This relationship was also confirmed on laboratory samples, indicating that for a fixed geometry and fixed experimental conditions a relationship between the potential and polarisation resistance of steel can be established for the steel-concrete system. Experimental results will be presented which indicate that, for a given monitoring cell geometry, it may be possible to propose criteria for the point at which remediation measures should be considered. The establishment of such a control curve has enabled the development of a powerful monitoring tool for the assessment of a number of proposed corrosion remediation techniques: the actual effect of any remediation technique becomes clearly apparent through the type and magnitude of deviation of post-remediation data from the original (pre-remediation) control curve.
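LPR measurements of the kind described above are conventionally converted to a corrosion current density via the Stern-Geary relation, i_corr = B / Rp. The value B ≈ 26 mV used below is a commonly assumed constant for actively corroding steel, an assumption for illustration rather than a value from this study.

```python
# Stern-Geary conversion from polarisation resistance to corrosion current.
# B = 26 mV is a commonly assumed constant, not a value from the paper.

def corrosion_current_density(rp_ohm_cm2, b_volts=0.026):
    """Corrosion current density (A/cm^2) from Rp (ohm.cm^2) via i = B / Rp."""
    return b_volts / rp_ohm_cm2

# Example: Rp = 13 kOhm.cm^2 -> i_corr = 2e-6 A/cm^2
print(corrosion_current_density(13_000))
```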