229 results for time dependant cost function


Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider a time-space fractional diffusion equation of distributed order (TSFDEDO). The TSFDEDO is obtained from the standard advection-dispersion equation by replacing the first-order time derivative with the Caputo fractional derivative of order α ∈ (0,1], and the first-order and second-order space derivatives with the Riesz fractional derivatives of orders β₁ ∈ (0,1) and β₂ ∈ (1,2], respectively. We derive the fundamental solution for the TSFDEDO with an initial condition (TSFDEDO-IC). The fundamental solution can be interpreted as a spatial probability density function evolving in time. We also investigate a discrete random walk model based on an explicit finite difference approximation for the TSFDEDO-IC.
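For orientation, the single fixed-order special case of such an equation can be sketched in LaTeX notation as follows (the symbols are ours, not taken from the paper; the distributed-order form additionally integrates the time-fractional term over the order α with a non-negative weight function):

    {}^{C}_{0}D_t^{\alpha}\, u(x,t) = -a\, \frac{\partial^{\beta_1} u(x,t)}{\partial |x|^{\beta_1}} + b\, \frac{\partial^{\beta_2} u(x,t)}{\partial |x|^{\beta_2}}, \qquad \alpha \in (0,1],\ \beta_1 \in (0,1),\ \beta_2 \in (1,2]

where a, b > 0 play the roles of the advection and dispersion coefficients of the classical equation recovered at α = β₁ = 1, β₂ = 2.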

Relevance:

30.00%

Publisher:

Abstract:

This paper presents preliminary results in establishing a strategy for predicting Zenith Tropospheric Delay (ZTD) and relative ZTD (rZTD) between Continuously Operating Reference Stations (CORS) in near real-time. It is anticipated that the predicted ZTD or rZTD can assist network-based Real-Time Kinematic (RTK) performance over long inter-station distances, ultimately enabling a cost-effective method of delivering precise positioning services to sparsely populated regional areas such as Queensland. This research first investigates two ZTD solutions: 1) the post-processed IGS ZTD solution, and 2) the near real-time ZTD solution. The near real-time solution is obtained through the GNSS processing software package (Bernese) that has been deployed for this project. The predictability of the near real-time Bernese solution is analysed and compared to the post-processed IGS solution, which acts as the benchmark. The predictability analyses were conducted with prediction lead times of 15, 30, 45, and 60 minutes to determine the error with respect to timeliness. The predictability of ZTD and relative ZTD is characterised by using the previously estimated ZTD as the predicted ZTD of the current epoch. This research has shown that both the ZTD and relative ZTD prediction errors are random in nature; the standard deviation (STD) grows from a few millimetres to the sub-centimetre level as the prediction interval ranges from 15 to 60 minutes. Additionally, the rZTD predictability shows very little dependency on the length of the tested baselines, up to 1000 kilometres. Finally, the comparison of the near real-time Bernese solution with the IGS solution has shown a slight degradation in prediction accuracy: the less accurate near real-time solution has an STD error of 1 cm within a delay of 50 minutes, although some larger errors of up to 10 cm are observed.
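The persistence-style evaluation described here (the previous estimate serves as the forecast for the current epoch) is easy to state precisely. A minimal sketch, assuming a regularly sampled ZTD series held in a NumPy array; the names and synthetic data are illustrative only, since the paper's actual processing is done in Bernese:

    import numpy as np

    def persistence_error_std(ztd, epoch_minutes, lead_minutes):
        """STD of the persistence-prediction error for a given lead time.

        ztd: 1-D array of ZTD estimates (metres), one value per epoch.
        epoch_minutes: spacing between successive estimates.
        lead_minutes: prediction interval (e.g. 15, 30, 45, 60).
        """
        lag = lead_minutes // epoch_minutes   # epochs to look back
        predicted = ztd[:-lag]                # last known value ...
        actual = ztd[lag:]                    # ... used as the forecast
        return np.std(actual - predicted)

    # Example: 15-minute epochs, errors for 15..60-minute lead times
    ztd_series = np.cumsum(np.random.normal(0, 0.002, 500)) + 2.4  # synthetic, metres
    for lead in (15, 30, 45, 60):
        print(lead, persistence_error_std(ztd_series, 15, lead))

On a random-walk-like series such as this, the error STD grows with the lead time, which is consistent with the behaviour the paper reports.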

Relevance:

30.00%

Publisher:

Abstract:

Presentation provided to a PhD colloquium between two Australian universities and one Malaysian university, providing the opportunity to inform and critique the progress of students concerning their selected topics. This presentation essentially involves "The conceptualisation, sensitivity and measurement of holding costs and other selected elements impacting housing affordability", as provided by Gary Owen Garner of QUT, with the following research objectives:
1. To establish the nature and composition of holding costs over time, as related to residential property in Australia and internationally.
2. To examine the linkages that may exist between various planning instruments, the length of regulatory assessment periods, and housing affordability.
3. To develop a model that quantifies the impact of holding costs on housing affordability in Australia, with a particular focus on the consequences of extended assessment periods as a component of holding costs.
In doing so, the research aims to clarify the impact of holding costs on overall housing affordability.

Relevance:

30.00%

Publisher:

Abstract:

Automated crowd counting allows excessive crowding to be detected immediately, without the need for constant human surveillance. Current crowd counting systems are location specific, and for these systems to function properly they must be trained on a large amount of data specific to the target location. As such, configuring multiple systems for use is a tedious and time-consuming exercise. We propose a scene-invariant crowd counting system which can easily be deployed at a location different from the one where it was trained. This is achieved using a global scaling factor to relate crowd sizes from one scene to another. We demonstrate that a crowd counting system trained at one viewpoint can achieve a correct classification rate of 90% at a different viewpoint.
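The global scaling factor idea can be sketched as follows, under the assumption of a simple linear feature-to-count model; the specific features and regression used by the actual system are not detailed here, so everything below is illustrative:

    import numpy as np

    def fit_linear_counter(features, counts):
        """Least-squares fit: crowd count ~ w . features (training scene)."""
        w, *_ = np.linalg.lstsq(features, counts, rcond=None)
        return w

    def scaled_count(w, features_new, scale):
        """Apply the trained model in a new scene via a global scaling factor."""
        return scale * (features_new @ w)

    # Training scene: feature matrix (n_frames x n_features) and ground-truth counts
    X_train = np.random.rand(200, 3)
    y_train = X_train @ np.array([40.0, 10.0, 5.0]) + np.random.normal(0, 1, 200)
    w = fit_linear_counter(X_train, y_train)

    # New scene: a handful of labelled frames suffices to estimate the scale
    X_new = np.random.rand(5, 3)
    y_new_labels = 1.7 * (X_new @ np.array([40.0, 10.0, 5.0]))  # synthetic labels
    scale = np.mean(y_new_labels / (X_new @ w))
    print(scaled_count(w, X_new, scale))

The attraction of a single scalar is that adapting to a new viewpoint requires only a small amount of new data, rather than retraining the whole model.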

Relevance:

30.00%

Publisher:

Abstract:

There are increasing indications that the contribution of holding costs and their impact on housing affordability is very significant. Their importance and perceived high-level impact can be gauged from the unprecedented level of attention policy makers have given them recently. This may be evidenced by the embedding of specific strategies to address burgeoning holding costs (and particularly those cost savings associated with streamlining regulatory assessment) within statutory instruments such as the Queensland Housing Affordability Strategy and the South East Queensland Regional Plan. However, several key issues require further investigation. Firstly, the computation and methodology behind the calculation of holding costs varies widely. In fact, it is not only variable, but in some instances holding costs are completely ignored. Secondly, some ambiguity exists in terms of the inclusion of various elements of holding costs and the assessment of their relative contribution. Perhaps this may in part be explained by their nature: such costs are not always immediately apparent. They are not as visible as the more tangible cost items associated with greenfield development, such as regulatory fees, government taxes, acquisition costs, selling fees, commissions and others. Holding costs are also more difficult to evaluate since, for the most part, they must ultimately be assessed over time in an ever-changing environment, based on their strong relationship with opportunity cost, which is in turn dependent, inter alia, upon prevailing inflation and/or interest rates. This paper seeks to provide a more detailed investigation of the elements related to holding costs and, in so doing, determine the size of their impact specifically on the end user. It extends research in this area by clarifying the extent to which holding costs impact housing affordability. Geographical diversity, indicated by the considerable variation between planning instruments and the length of regulatory assessment periods, suggests that further research should adopt a case study approach in order to test the relevance of the theoretical modelling conducted.

Relevance:

30.00%

Publisher:

Abstract:

Researching administrative history is problematical. A trail of authoritative documents is often hard to find, and useful summaries can be difficult to organise, especially if source material is in paper formats in geographically dispersed locations. In the absence of documents, the reasons for particular decisions and the rationale underpinning particular policies can be confounded as key personnel advance in their professions and retire. The rationale for past decisions may be lost for practical purposes; and if an organisation's memory of events is diminished, its learning through experience is also diminished. This document is published to avoid unnecessary duplication of effort by other researchers who need to investigate how policies of charging for public sector information have been justified. The author compiled this work within a somewhat limited time period, and the work does not pretend to be a complete or comprehensive analysis of the issues.

A significant part of the role of government is to provide a framework of legally enforceable rights and obligations that can support individuals and non-government organisations in their lawful activities. Accordingly, claims that governments should be more 'business-like' need careful scrutiny. A significant supply of goods and services occurs as non-market activity, where neither benefits nor costs are quantified within conventional accounting systems or in terms of money. Where a government decides to provide information as a service (and information from land registries is archetypical), the transactions occur as a political decision made under a direct or a clearly delegated authority of a parliament with the requisite constitutional powers. This is not a market transaction, and the language of the market confuses attempts to describe a number of aspects of how governments allocate resources.

Cost recovery can be construed as an aspect of taxation, which is a sole prerogative of a parliament. The issues are fundamental to political constitutions, but they become more complicated where states cede some taxing powers to a central government as part of a federal system. Nor should the absence of markets be construed necessarily as 'market failure' or even 'government failure': the absence is often attributable to particular technical, economic and political constraints that preclude the operation of markets. Arguably, greater care is needed in distinguishing between the polity and markets in raising revenues and allocating resources; and that needs to start by removing unhelpful references to 'business' in the context of government decision-making.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the flow-shop scheduling problem with parallel machines at each stage (machine center) is studied. For each job, a release date, a due date, and a processing time for each of its operations are given. The scheduling criterion consists of three parts: the total weighted earliness, the total weighted tardiness, and the total weighted waiting time. The criterion takes into account the costs of storing semi-manufactured products in the course of production and ready-made products, as well as penalties for not meeting the deadlines stated in the conditions of the contract with the customer. To solve the problem, three constructive algorithms and three metaheuristics (based on Tabu Search and Simulated Annealing techniques) are developed and experimentally analysed. All the proposed algorithms operate on the notion of a so-called operation processing order, i.e. the order of operations on each machine. We show that the problem of schedule construction on the basis of a given operation processing order can be reduced to a linear programming task. We also propose an approximation algorithm for schedule construction and show the conditions under which it is optimal.
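The reduction to linear programming can be illustrated with a deliberately simplified sketch: a single machine and a fixed job order, rather than the paper's parallel-machine stages. All names, weights, and data below are hypothetical. With the order fixed, start times and earliness/tardiness slacks are LP variables, and the weighted criterion is linear in them:

    import numpy as np
    from scipy.optimize import linprog

    # Jobs in their fixed processing order: release dates r, processing times p,
    # due dates d, and weights for earliness, tardiness and waiting.
    r = np.array([0.0, 2.0, 3.0])
    p = np.array([4.0, 3.0, 2.0])
    d = np.array([6.0, 10.0, 11.0])
    wE, wT, wW = 1.0, 3.0, 0.5
    n = len(p)

    # Variables x = [s_0..s_{n-1}, E_0..E_{n-1}, T_0..T_{n-1}]; the waiting cost
    # wW*(s_j - r_j) contributes wW*s_j (the constant -wW*r_j is dropped).
    c = np.concatenate([np.full(n, wW), np.full(n, wE), np.full(n, wT)])

    A, b = [], []
    for j in range(n - 1):            # machine sequencing: s_j + p_j <= s_{j+1}
        row = np.zeros(3 * n); row[j] = 1; row[j + 1] = -1
        A.append(row); b.append(-p[j])
    for j in range(n):                # earliness: E_j >= d_j - (s_j + p_j)
        row = np.zeros(3 * n); row[j] = -1; row[n + j] = -1
        A.append(row); b.append(p[j] - d[j])
    for j in range(n):                # tardiness: T_j >= (s_j + p_j) - d_j
        row = np.zeros(3 * n); row[j] = 1; row[2 * n + j] = -1
        A.append(row); b.append(d[j] - p[j])

    bounds = [(r[j], None) for j in range(n)] + [(0, None)] * (2 * n)
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b), bounds=bounds, method="highs")
    print("start times:", res.x[:n])

Once the operation processing order is given, every sequencing, earliness and tardiness condition is linear in the start times, which is what makes the reduction possible; the full flow-shop version simply adds a set of such constraints per machine at each stage.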

Relevance:

30.00%

Publisher:

Abstract:

We investigated the relative importance of vision and proprioception in estimating target and hand locations in a dynamic environment. Subjects performed a position estimation task in which a target moved horizontally on a screen at a constant velocity and then disappeared. They were asked to estimate the position of the invisible target under two conditions: passively observing and manually tracking. The tracking trials included three visual conditions with a cursor representing the hand position: always visible, disappearing simultaneously with target disappearance, and always invisible. The target’s invisible displacement was systematically underestimated during passive observation. In active conditions, tracking with the visible cursor significantly decreased the extent of underestimation. Tracking of the invisible target became much more accurate under this condition and was not affected by cursor disappearance. In a second experiment, subjects were asked to judge the position of their unseen hand instead of the target during tracking movements. Invisible hand displacements were also underestimated when compared with the actual displacement. Continuous or brief presentation of the cursor reduced the extent of underestimation. These results suggest that vision–proprioception interactions are critical for representing exact target–hand spatial relationships, and that such sensorimotor representation of hand kinematics serves a cognitive function in predicting target position. We propose a hypothesis that the central nervous system can utilize information derived from proprioception and/or efference copy for sensorimotor prediction of dynamic target and hand positions, but that effective use of this information for conscious estimation requires that it be presented in a form that corresponds to that used for the estimations.

Relevance:

30.00%

Publisher:

Abstract:

It is widely held that strong relationships exist between housing, economic status, and well-being. This is exemplified by widespread housing stock surpluses in many countries, which threaten to destabilise numerous aspects of individual and community life. However, the position of housing demand and supply is not consistent. The Australian position provides a distinct contrast, whereby seemingly inexorable housing demand generally remains a critical issue affecting the socio-economic landscape. Underpinned by high levels of immigration, and further buoyed by sustained historically low interest rates, increasing income levels, and increased government assistance for first home buyers, this strong housing demand ensures that elements related to housing affordability continue to gain prominence.

A significant, but less visible, factor impacting housing affordability – particularly new housing development – relates to holding costs. These costs are in many ways "hidden" and cannot always be easily identified. Although they are only one contributor, the nature and extent of their impact requires elucidation. In its simplest form, the assessment commences with a calculation of the interest or opportunity cost of land holding. However, there is significantly more complexity for major new developments – particularly greenfield property development. Preliminary analysis conducted by the author suggests that even small shifts in the primary factors impacting holding costs can appreciably affect housing affordability – and notably, to a greater extent than commonly held. Indeed, their importance and perceived high-level impact can be gauged from the unprecedented level of attention policy makers have given them over recent years. This may be evidenced by the embedding of specific strategies to address burgeoning holding costs (and particularly those cost savings associated with streamlining regulatory assessment) within statutory instruments such as the Queensland Housing Affordability Strategy and the South East Queensland Regional Plan.

However, several key issues require investigation. Firstly, the computation and methodology behind the calculation of holding costs varies widely. In fact, it is not only variable, but in some instances holding costs are completely ignored. Secondly, some ambiguity exists in terms of the inclusion of various elements of holding costs, thereby affecting the assessment of their relative contribution. Perhaps this may in part be explained by their nature: such costs are not always immediately apparent. Some forms of holding costs are not as visible as the more tangible cost items associated with greenfield development, such as regulatory fees, government taxes, acquisition costs, selling fees, commissions and others. Holding costs are also more difficult to evaluate since, for the most part, they must ultimately be assessed over time in an ever-changing environment, based on their strong relationship with opportunity cost, which is in turn dependent, inter alia, upon prevailing inflation and/or interest rates.

By extending research in the general area of housing affordability, this thesis seeks to provide a more detailed investigation of the elements related to holding costs and, in so doing, determine the size of their impact specifically on the end user. This will involve the development of soundly based economic and econometric models which seek to clarify the component impacts of holding costs. Ultimately, there are significant policy implications in relation to the framework used in Australian jurisdictions that promotes, retains, or otherwise maximises the opportunities for affordable housing.
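The "simplest form" mentioned above (the interest or opportunity cost of capital tied up in land) can be illustrated with a short sketch; the figures and the monthly-compounding assumption are purely hypothetical and not drawn from the thesis:

    def holding_cost(land_value, annual_rate, months_held):
        """Opportunity cost of capital tied up in land, compounded monthly."""
        monthly_rate = annual_rate / 12
        return land_value * ((1 + monthly_rate) ** months_held - 1)

    # Effect of a 6-month regulatory delay on a $500,000 parcel at 7% p.a.
    base = holding_cost(500_000, 0.07, 12)      # 12-month holding period
    delayed = holding_cost(500_000, 0.07, 18)   # plus a 6-month assessment delay
    print(f"extra holding cost from delay: ${delayed - base:,.0f}")

Even this toy calculation shows how sensitive the cost is to the length of the assessment period and the prevailing rate, which is the intuition behind the thesis's focus on extended assessment periods.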

Relevância:

30.00% 30.00%

Publicador:

Resumo:

This thesis aimed to investigate the way in which distance runners modulate their speed, in an effort to understand the key processes and determinants of speed selection when encountering hills in natural outdoor environments. One factor which has limited the expansion of knowledge in this area has been a reliance on the motorized treadmill, which constrains runners to constant speeds and gradients and only linear paths. Conversely, limits in the portability or storage capacity of available technology have restricted field research to brief durations and level courses. Therefore, another aim of this thesis was to evaluate the capacity of lightweight, portable technology to measure running speed in outdoor undulating terrain.

The first study of this thesis assessed the validity of a non-differential GPS to measure speed, displacement and position during human locomotion. Three healthy participants walked and ran over straight and curved courses for 59 and 34 trials respectively. A non-differential GPS receiver provided speed data by Doppler shift and by change in GPS position over time, which were compared with actual speeds determined by chronometry. Displacement data from the GPS were compared with a surveyed 100 m section, while static positions were collected for 1 hour and compared with the known geodetic point. GPS speed values on the straight course were closely correlated with actual speeds (Doppler shift: r = 0.9994, p < 0.001; Δ GPS position/time: r = 0.9984, p < 0.001). Actual speed errors were lowest using the Doppler shift method (90.8% of values within ±0.1 m/s). Speed was slightly underestimated on the curved path, though still highly correlated with actual speed (Doppler shift: r = 0.9985, p < 0.001; Δ GPS distance/time: r = 0.9973, p < 0.001). Distance measured by GPS was 100.46 ± 0.49 m, while 86.5% of static points were within 1.5 m of the actual geodetic point (mean error 1.08 ± 0.34 m, range 0.69-2.10 m). Non-differential GPS demonstrated highly accurate estimation of speed across a wide range of human locomotion velocities using only the raw signal data, with a minimal decrease in accuracy around bends. This high level of resolution was matched by accurate displacement and position data. Coupled with its reduced size, cost and ease of use, a non-differential receiver offers a valid alternative to differential GPS in the study of overground locomotion.

The second study of this dissertation examined speed regulation during overground running on a hilly course. Following an initial laboratory session to calculate physiological thresholds (VO2max and ventilatory thresholds), eight experienced long-distance runners completed a self-paced time trial over three laps of an outdoor course involving uphill, downhill and level sections. A portable gas analyser, GPS receiver and activity monitor were used to collect physiological, speed and stride frequency data. Participants ran 23% slower on uphills and 13.8% faster on downhills compared with level sections. Speeds on level sections were significantly different for 78.4 ± 7.0 seconds following an uphill and 23.6 ± 2.2 seconds following a downhill. Speed changes were primarily regulated by stride length, which was 20.5% shorter uphill and 16.2% longer downhill, while stride frequency was relatively stable. Oxygen consumption averaged 100.4% of runners' individual ventilatory thresholds on uphills, 78.9% on downhills and 89.3% on level sections. Group-level speed was well predicted using a modified gradient factor (r² = 0.89). Individuals adopted distinct pacing strategies, both across laps and as a function of gradient. Speed was best predicted using a weighted factor accounting for prior and current gradients. Oxygen consumption (VO2) limited runners' speeds only on uphill sections, and was maintained in line with individual ventilatory thresholds. Running speed showed larger individual variation on downhill sections, while speed on the level was systematically influenced by the preceding gradient. Runners who varied their pace more as a function of gradient showed a more consistent level of oxygen consumption. These results suggest that optimising time on the level sections after hills offers the greatest potential to minimise overall time when running over undulating terrain.

The third study of this thesis investigated the effect of implementing an individualised pacing strategy on running performance over an undulating course. Six trained distance runners completed three trials involving four laps (9968 m) of an outdoor course involving uphill, downhill and level sections. The initial trial was self-paced in the absence of any temporal feedback. For the second and third field trials, runners were paced for the first three laps (7476 m) according to two different regimes (Intervention or Control) by matching desired goal times for subsections within each gradient. The fourth lap (2492 m) was completed without pacing. Goals for the Intervention trial were based on findings from study two, using a modified gradient factor and elapsed distance to predict the time for each section. To maintain the same overall time across all paced conditions, times were proportionately adjusted according to split times from the self-paced trial. The alternative pacing strategy (Control) used the original split times from this initial trial. Five of the six runners increased their range of uphill-to-downhill speeds on the Intervention trial by more than 30%, but this was unsuccessful in achieving a more consistent level of oxygen consumption, with only one runner showing a change of more than 10%. Group-level adherence to the Intervention strategy was lowest on downhill sections. Three runners successfully adhered to the Intervention pacing strategy, as gauged by a low root mean square error across subsections and gradients. Of these three, the two who had the largest change in uphill-downhill speeds ran their fastest overall times. This suggests that, for some runners, the strategy of varying speeds systematically to account for gradients and transitions may benefit race performances on courses involving hills.

In summary, a non-differential receiver was found to offer highly accurate measures of speed, distance and position across the range of human locomotion speeds. Self-selected speed was best predicted using a weighted factor accounting for prior and current gradients. Oxygen consumption limited runners' speeds only on uphills, speed on the level was systematically influenced by preceding gradients, and there was much larger individual variation on downhill sections. Individuals adopted distinct but unrelated pacing strategies as a function of durations and gradients, while runners who varied pace more as a function of gradient showed a more consistent level of oxygen consumption. Finally, the implementation of an individualised pacing strategy to account for gradients and transitions greatly increased runners' range of uphill-downhill speeds and improved performance in some runners. The efficiency of various gradient-speed trade-offs and the factors limiting faster downhill speeds will, however, require further investigation to improve the effectiveness of the suggested strategy.
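The positional speed estimate used in the first study (change in GPS position over time) amounts to differencing successive fixes. A minimal sketch, assuming positions already projected onto a local planar grid in metres; the Doppler-shift speeds, by contrast, come directly from the receiver rather than from code like this:

    import numpy as np

    def speed_from_positions(x, y, t):
        """Speed (m/s) from successive planar GPS fixes.

        x, y: position coordinates in metres; t: timestamps in seconds.
        """
        dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
        return np.hypot(dx, dy) / dt

    # Synthetic 1 Hz track at ~3 m/s with GPS-like position noise
    t = np.arange(0.0, 60.0)
    x = 3.0 * t + np.random.normal(0, 0.5, t.size)
    y = np.random.normal(0, 0.5, t.size)
    print(speed_from_positions(x, y, t).mean())

Because position noise is differenced, this estimate is noisier than the Doppler-shift speed at short sampling intervals, which is consistent with the study's finding that the Doppler method gave the lowest speed errors.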

Relevance:

30.00%

Publisher:

Abstract:

The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal caused by the rate of change of the time delay from the target. As researchers further developed conventional radar systems, it became apparent that additional information was contained in the backscattered signal, and that this information could in fact be used to describe the shape of the target itself. This is because a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time delay. Delay-Doppler parameter estimation of each of these point scatterers therefore corresponds to a mapping of the target's range and cross-range, producing an image of the target.

Much research has been done in this area since the early radar imaging work of the 1960s. At present, radar imaging falls into two main categories. The first relates to the case where the backscattered signal is considered to be deterministic; the second to the case where the backscattered signal is of a stochastic nature. In both cases, the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal.

In practical situations, it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must then be present in order to calculate the ambiguity function. This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions.

The principal aim of this thesis was the development of, and subsequent discussion of the theory behind, radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse, but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis. For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be implementable in a timely manner. For this reason, an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution when using laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer simulated using an optical compiler.
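For reference, the cross-ambiguity function referred to above is conventionally written (standard notation, not drawn from the thesis) as:

    \chi(\tau, \nu) = \int_{-\infty}^{\infty} s_r(t)\, s_t^{*}(t - \tau)\, e^{-j 2 \pi \nu t}\, dt

where s_t is the transmitted signal, s_r the backscattered (received) signal, \tau the time delay and \nu the Doppler frequency shift; |\chi(\tau, \nu)| peaks at the delay-Doppler coordinates of the target's scatterers, which is why it underlies both the deterministic and stochastic imaging formulations.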

Relevance:

30.00%

Publisher:

Abstract:

A hip fracture causes permanent changes to lifestyle for older people. Further, two important post-operative mortality indicators for this group are the time until surgery after fracture and pre-operative health status; yet no research is available investigating relationships between time to surgery and health status. The researchers aimed to establish the health status risks for patients aged over 65 years with a non-pathological hip fracture, to guide nursing care interventions. A prospective cohort design was used to investigate relationships between time to surgery and measures on pre-operative health status indicators including skin integrity risk, vigor, mental state, bowel function and continence. Twenty-nine patients, with a mean age of 81.93 years (SD 9.49), were recruited. The mean number of hours from Time 1 assessment to surgery was 52.72 (SD 58.35), with a range of 1 to 219 hours. At Time 2, the mean scores for vigor and skin integrity risk were significantly higher, indicating poorer health status. A change in health status occurred; however, possibly due to the small sample size, it was difficult to relate this result to time to surgery. Nevertheless, the results inform pre-operative care for this group.