614 results for debts incurred


Relevance: 10.00%

Abstract:

The report follows up on data and trends tabled in August 2015, drawn from two key sources: six identified case-study productions that have been tracked for eighteen months, and an online survey delivered to all APAM 2014 delegates. The comparative report has been constructed through an analysis of the data reported in August 2015 and of the most recent online survey of all 2014 APAM delegates, conducted in late November 2015. The report highlights six key trends emerging from the data: the majority of survey respondents will return to APAM 2016; the central reason for attending is the networking opportunities the Market affords; respondents are confident that the new relationships forged at the Market will yield long-term interest and buying opportunities, and real touring outcomes were realised for some respondents as a result of the 2014 event; respondents would like greater attention to a wider range of networking activities within the program to enable touring outcomes; the multi-venue model remains a concern, and is a recurrent issue from earlier surveys; and the level of expense incurred by producers to present work at APAM. Throughout the report, extracted data from the online survey responses are tabled to develop a narrative in response to the key research aims outlined in the Brisbane Powerhouse Tender document (2011). A full version of the collated responses to the survey questions can be found in the appendices of the report.

Relevance: 10.00%

Abstract:

A field experiment was carried out in southeastern Australia to assess the short-term mortality and stress incurred by juvenile school prawns (Metapenaeus macleayi) discarded from an estuarine trawler. Some 35% of the prawns died up to 72 h after being caught in a trawl, exposed to air during sorting and separation from the retained catch (as per normal commercial procedures), and then discarded into replicate cages. Total mortality was partitioned into that caused by trawling (about 16%) and that caused by subsequent sorting and grading (about 19%). Assuming that the majority of the non-penaeid bycatch is excluded from trawls (by the use of bycatch reduction devices), the latter mortalities could be almost eliminated by sorting and separating unwanted school prawns in water-filled compartments. Emersion stress was measured as the concentration of L-lactate in the haemolymph, which was elevated for at least 40 min following capture but similar among all trawled treatments. L-lactate levels decreased within the first 24 h post-capture and then remained constant over at least the next 48 h, but stayed above baseline levels. The potential benefits associated with subtle changes to handling practices onboard estuarine trawlers are discussed.

Relevance: 10.00%

Abstract:

Objectives: We sought to characterise the demographics, length of admission, final diagnoses, long-term outcomes and costs associated with the population presenting to an Australian emergency department (ED) with symptoms of possible acute coronary syndrome (ACS). Design, setting and participants: Prospectively collected data on ED patients presenting with suspected ACS between November 2008 and February 2011 were used, including data collected at presentation and at 30 days after presentation. Information on patient disposition, length of stay and costs incurred was extracted from hospital administration records. Main outcome measures: Primary outcomes were mean and median cost and length of hospital stay. Secondary outcomes were diagnosis of ACS, other cardiovascular conditions or non-cardiovascular conditions within 30 days of presentation. Results: An ACS was diagnosed in 103 (11.1%) of the 926 patients recruited. 193 patients (20.8%) were diagnosed with other cardiovascular-related conditions and 622 patients (67.2%) had non-cardiac-related chest pain. ACS events occurred in 0 patients in the low-risk group and 11 (1.9%) in the intermediate-risk group. Ninety-two (28.0%) of the 329 high-risk patients had an ACS event. Patients with a proven ACS, high-grade atrioventricular block, pulmonary embolism or other respiratory conditions had the longest lengths of stay. The mean cost was highest in the ACS group ($13 509; 95% CI, $11 794–$15 223), followed by other cardiovascular conditions ($7283; 95% CI, $6152–$8415) and non-cardiovascular conditions ($3331; 95% CI, $2976–$3685). Conclusions: Most ED patients with symptoms of possible ACS do not have a cardiac cause for their presentation. The current guideline-based process of assessment is lengthy, costly and consumes significant resources. Investigation of strategies to shorten this process, or to reduce the need for objective cardiac testing in patients at intermediate risk according to the National Heart Foundation and Cardiac Society of Australia and New Zealand guideline, is required.

Relevance: 10.00%

Abstract:

Online fraud is a global problem. Millions of individuals worldwide are losing money and experiencing the devastation associated with becoming a victim of online fraud. In 2014, Australians reported losses of $82 million as a result of online fraud to the Australian Competition and Consumer Commission (ACCC). Given that the ACCC is one of many agencies that receive victim complaints, and given the extent of under-reporting of online fraud, this figure is likely to represent only a fraction of the actual monetary losses incurred. The successful policing of online fraud is hampered by its transnational nature, the prevalence of false or stolen identities used by offenders, and a lack of resources available to investigate offences. In addition, police are restricted by the geographical boundaries of their own jurisdictions, which conflicts with the absence of boundaries afforded to offenders by the virtual world. In response, Australia is witnessing the emergence of victim-oriented policing approaches to counter online fraud victimisation, incorporating the use of financial intelligence as a tool to proactively notify potential victims of online fraud. Using a variety of Australian examples, this paper documents the history of this new approach and considers the significance of such a shift for policing in a broader context. It also details the value that this approach can have for both victims and law enforcement agencies. Overall, it is argued that a victim-oriented approach to policing online fraud can have substantial benefits for police and victims alike.

Relevance: 10.00%

Abstract:

Productivity is predicted to drive the ecological and evolutionary dynamics of predator-prey interactions through changes in resource allocation between different traits. However, resources are seldom constantly available, and temporal variation in productivity could therefore have a considerable effect on a species' potential to evolve. To study this, three long-term microbial laboratory experiments were established in which the prey bacterium Serratia marcescens was exposed to predation by the protist Tetrahymena thermophila in different prey resource environments. The consequences of prey resource availability for the ecological properties of the predator-prey system, such as trophic dynamics, stability and virulence, were determined, and the evolutionary changes in species traits and prey genetic diversity were measured. Prey defence evolved to be stronger in the high-productivity environment. Increased allocation to defence incurred a cost in terms of reduced ability to use prey resources, which probably constrained prey evolution by increasing the effect of resource competition. However, the magnitude of this trade-off diminished when measured at high resource concentrations. Predation selected for white, non-pigmented, highly defensive prey clones that produced predation-resistant biofilm. The biofilm defence was also potentially accompanied by cytotoxicity towards predators and may have been traded off against high motility. Evidence for the evolution of predators was also found in one experiment, suggesting that co-evolutionary dynamics could affect the evolution and ecology of the predator-prey interaction. Temporal variation in resource availability increased variation in predator densities, leading to temporally fluctuating selection for prey defences and resource use ability. Temporal variation in resource availability was also able to constrain prey evolution when the allocation to defence incurred a high cost. However, when the magnitude of the prey trade-off was small and resource turnover was periodically high, temporal variation facilitated the formation of predator-resistant biofilm. The evolution of prey defence constrained the transfer of energy from basal to higher trophic levels, decreasing the strength of top-down regulation on the prey community. Predation and temporal variation in productivity decreased the stability of populations and prey traits in general. However, predation-induced destabilization was less pronounced in the high-productivity environment, where the evolution of prey defence was stronger. In addition, the evolution of prey defence weakened the destabilization of predator population dynamics induced by environmental variation. Moreover, protozoan predation decreased S. marcescens virulence in the insect host, the moth Parasemia plantaginis, suggesting that species interactions outside the host-pathogen relationship could be important indirect drivers of the evolution of pathogenesis. This thesis demonstrates that rapid evolution can affect various ecological properties of predator-prey interactions. The effect of evolution on the ecological dynamics depended on the productivity of the environment, being most evident in constant environments with high productivity.

Relevance: 10.00%

Abstract:

Aims: We develop and validate tools to estimate the residual noise covariance in Planck frequency maps, quantify signal error effects, and compare different techniques for producing low-resolution maps. Methods: We derive analytical estimates of the covariance of the residual noise contained in low-resolution maps produced using a number of map-making approaches. We test these analytical predictions, and their impact on angular power spectrum estimation, using Monte Carlo simulations. We use simulations to quantify the level of signal error incurred in the different resolution downgrading schemes considered in this work. Results: We find excellent agreement between the optimal residual noise covariance matrices and Monte Carlo noise maps. For destriping map-makers, the extent of agreement is dictated by the knee frequency of the correlated noise component and the chosen baseline offset length. Signal striping is shown to be insignificant when properly dealt with. In map resolution downgrading, we find that a carefully selected window function is required to reduce aliasing to the sub-percent level at multipoles ell > 2*Nside, where Nside is the HEALPix resolution parameter. We show that sufficient characterization of the residual noise is unavoidable if one is to draw reliable constraints on large-scale anisotropy. Conclusions: We have described how to compute low-resolution maps with a controlled sky signal level and a reliable estimate of the residual noise covariance. We have also presented a method for smoothing the residual noise covariance matrices to describe the noise correlations in smoothed, bandwidth-limited maps.
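
As an illustration of the kind of resolution downgrading discussed above, the minimal sketch below uses the public healpy library to band-limit and smooth a map in harmonic space before resampling it at a lower Nside; the Gaussian window choice, its width, and the toy input map are assumptions for the example, not the window or pipeline adopted in the paper.

```python
# Minimal sketch (not the paper's pipeline): downgrade a HEALPix map by applying
# an apodizing window in harmonic space, suppressing aliasing above ell ~ 2*Nside_out.
import numpy as np
import healpy as hp

def downgrade_with_window(m, nside_out, lmax=None, fwhm_deg=None):
    """Smooth a map in harmonic space, then resample it at lower resolution."""
    if lmax is None:
        lmax = 3 * nside_out - 1          # keep only multipoles the target grid supports
    if fwhm_deg is None:
        fwhm_deg = 2.0 * hp.nside2resol(nside_out, arcmin=True) / 60.0
    alm = hp.map2alm(m, lmax=lmax)        # band-limited harmonic transform
    bl = hp.gauss_beam(np.radians(fwhm_deg), lmax=lmax)
    alm = hp.almxfl(alm, bl)              # apply the (illustrative) Gaussian window
    return hp.alm2map(alm, nside_out)     # synthesize on the low-resolution grid

# Example: downgrade a simulated Nside=512 map to Nside=16.
nside_in, nside_out = 512, 16
cl = np.ones(3 * nside_in)                # flat spectrum, purely for illustration
m = hp.synfast(cl, nside_in)
m_low = downgrade_with_window(m, nside_out)
```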

Relevance: 10.00%

Abstract:

Stroke is a major cause of death and disability, incurs significant costs to healthcare systems, and imposes a severe burden on society as a whole. Stroke care in Finland has been described in several population-based studies between 1967 and 1998, but not since. In the PERFECT Stroke study presented here, a system for monitoring the Performance, Effectiveness, and Costs of Treatment episodes in Stroke was developed in Finland. Existing nationwide administrative registries were linked at the individual patient level using personal identification numbers to depict whole episodes of care, from acute stroke, through rehabilitation, until the patients went home, were admitted to permanent institutional care, or died. For comparisons over time and between providers, patient case-mix was adjusted for. The PERFECT Stroke database includes 104 899 first-ever stroke patients over the years 1999 to 2008, of whom 79% had ischemic stroke (IS), 14% intracerebral hemorrhage (ICH), and 7% subarachnoid hemorrhage (SAH). An 18% decrease in the age- and sex-adjusted incidence of stroke was observed over the study period, or a 1.8% improvement annually. The all-cause 1-year case-fatality rate improved from 28.6% to 24.6%, or by 0.5% annually. The expected median lifetime after stroke increased by 2 years for IS patients, to 7 years and 7 months, and by 1 year for ICH patients, to 4 years and 5 months. No change could be seen in median SAH patient survival, which remained >10 years. Stroke prevalence was 82 000, or 1.5% of the total population of Finland, in 2008. Modern stroke center care was shown to be associated with a decrease in both death and the risk of institutional care for stroke patients. The number needed to treat to prevent these poor outcomes at one year from stroke was 32 (95% confidence interval, 26 to 42). Despite improvements over the study period, more than a third of Finnish stroke patients did not have access to stroke center care. The mean first-year healthcare cost of a stroke patient was approximately €20 000, and among survivors approximately €10 000 annually thereafter. Only part of this cost was incurred by stroke itself, as the same patients cost approximately €5000 over the year prior to stroke. Total lifetime costs after first-ever stroke were approximately €85 000. A total of €1.1 billion, 7% of all healthcare expenditure, is used in the treatment of stroke patients annually. Despite a rapidly aging population, the number of new stroke patients is decreasing, and the patients are more likely to survive. This is explained in part by stroke center care, which is effective and should be made available to all stroke patients. It is possible, in a suitable setting with high-quality administrative registries and a common identifier, to avoid the huge workload and associated costs of setting up a conventional stroke registry, and still acquire a fairly comprehensive dataset on stroke care and outcomes.
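
To make the registry-linkage idea concrete, the sketch below shows one simple way such linkage can be done: records from separate registries are joined on a (pseudonymous) person identifier and consecutive admissions are stitched into episodes of care. The column names, the one-day transfer threshold, and the toy data are assumptions for illustration, not the PERFECT Stroke implementation.

```python
# Illustrative sketch only: link registries on a person ID and build care episodes.
import pandas as pd

hospital = pd.DataFrame({
    "pid":   [1, 1, 2],
    "admit": pd.to_datetime(["2005-01-03", "2005-01-19", "2006-06-01"]),
    "disch": pd.to_datetime(["2005-01-18", "2005-02-10", "2006-06-15"]),
})
deaths = pd.DataFrame({"pid": [2], "death_date": pd.to_datetime(["2006-06-20"])})

# Link the cause-of-death registry to hospital admissions on the person ID.
linked = hospital.merge(deaths, on="pid", how="left").sort_values(["pid", "admit"])

# Admissions starting within 1 day of the previous discharge are treated as
# transfers within one continuous episode of care (the threshold is an assumption).
gap = linked["admit"] - linked.groupby("pid")["disch"].shift()
new_episode = gap.isna() | (gap > pd.Timedelta(days=1))
linked["episode"] = new_episode.groupby(linked["pid"]).cumsum()

print(linked)
```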

Relevance: 10.00%

Abstract:

An explicit representation of an analytical solution to the problem of the decay of a plane shock wave of arbitrary strength is proposed. The solution satisfies the basic equations exactly; the approximation lies in the (approximate) satisfaction of two of the Rankine-Hugoniot conditions. The error incurred is shown to be very small even for strong shocks. The solution analyses the interaction of a shock of arbitrary strength with a centred simple wave overtaking it, and describes the complete history of decay with remarkable accuracy even for strong shocks. For a weak shock, the limiting law of motion obtained from the solution is shown to be in complete agreement with the Friedrichs theory. The propagation law of the non-uniform shock wave is determined, and the equations for the shock and particle paths in the (x, t)-plane are obtained. The analytic solution presented here is uniformly valid for the entire flow field behind the decaying shock wave.
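
For reference, the Rankine-Hugoniot conditions mentioned above express conservation of mass, momentum and energy across the shock. In standard textbook form, for a shock moving with speed U between an upstream state 1 and a downstream state 2 (a generic statement of the conditions, not material reproduced from the paper):

```latex
% Rankine-Hugoniot jump conditions for a plane shock moving with speed U;
% rho, u, p, h denote density, velocity, pressure and specific enthalpy (h = e + p/rho).
\begin{aligned}
\rho_1 (u_1 - U) &= \rho_2 (u_2 - U) \\
p_1 + \rho_1 (u_1 - U)^2 &= p_2 + \rho_2 (u_2 - U)^2 \\
h_1 + \tfrac{1}{2}(u_1 - U)^2 &= h_2 + \tfrac{1}{2}(u_2 - U)^2
\end{aligned}
```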

Relevance: 10.00%

Abstract:

We address the problem of designing codes for specific applications using deterministic annealing. Designing a block code over any finite-dimensional space may be thought of as forming the corresponding number of clusters over that space. We have shown that the total distortion incurred in encoding a training set is related to the probability of correct reception over a symmetric channel. While conventional deterministic annealing makes use of the squared Euclidean distance measure, we have developed an algorithm that can be used for clustering with the Hamming distance as the distance measure, as required in the error-correction scenario.
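
A minimal sketch of the idea — deterministic-annealing-style soft clustering in which the distortion is the Hamming distance and codewords are updated by a weighted per-bit majority vote — is given below. The cooling schedule, the initialization, and the toy training set are assumptions for illustration, not the algorithm reported in the paper.

```python
# Sketch: deterministic-annealing-style clustering with Hamming distortion.
import numpy as np

rng = np.random.default_rng(0)

def hamming(X, Y):
    """Pairwise Hamming distances between binary matrices X (n,d) and Y (k,d)."""
    return (X[:, None, :] != Y[None, :, :]).sum(axis=2)

def da_cluster(X, k=4, T0=4.0, Tmin=0.05, cool=0.9, iters=20):
    n, d = X.shape
    Y = X[rng.choice(n, k, replace=False)].astype(float)   # initial codewords
    T = T0
    while T > Tmin:
        for _ in range(iters):
            D = hamming(X, np.rint(Y).astype(int))          # (n, k) distortions
            P = np.exp(-(D - D.min(axis=1, keepdims=True)) / T)
            P /= P.sum(axis=1, keepdims=True)               # soft assignments p(j|i)
            # Weighted per-bit majority vote minimizes expected Hamming distortion.
            Y = (P.T @ X) / P.sum(axis=0)[:, None]
        T *= cool                                           # anneal the temperature
    return np.rint(Y).astype(int)

# Toy training set: noisy copies of two 8-bit patterns.
base = np.array([[0, 0, 0, 0, 1, 1, 1, 1], [1, 0, 1, 0, 1, 0, 1, 0]])
X = np.vstack([b ^ (rng.random(8) < 0.1) for b in base for _ in range(50)]).astype(int)
print(da_cluster(X, k=2))
```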

Relevance: 10.00%

Abstract:

The SUMO ligase activity of Mms21/Nse2, a conserved member of the Smc5/6 complex, is required for resisting extrinsically induced genotoxic stress. We report that the Mms21 SUMO ligase activity is also required during the unchallenged mitotic cell cycle in Saccharomyces cerevisiae. SUMO ligase-defective cells were slow growing and spontaneously incurred DNA damage. These cells required caffeine-sensitive Mec1 kinase-dependent checkpoint signaling for survival even in the absence of extrinsically induced genotoxic stress. SUMO ligase-defective cells were sensitive to replication stress and displayed synthetic growth defects with DNA damage checkpoint-defective mutants such as mec1, rad9, and rad24. MMS21 SUMO ligase and mediator of replication checkpoint 1 gene (MRC1) were epistatic with respect to hydroxyurea-induced replication stress or methyl methanesulfonate-induced DNA damage sensitivity. Subjecting Mms21 SUMO ligase-deficient cells to transient replication stress resulted in enhancement of cell cycle progression defects such as mitotic delay and accumulation of hyperploid cells. Consistent with the spontaneous activation of the DNA damage checkpoint pathway observed in the Mms21-mediated sumoylation-deficient cells, enhanced frequency of chromosome breakage and loss was detected in these mutant cells. A mutation in the conserved cysteine 221 that is engaged in coordination of the zinc ion in Loop 2 of the Mms21 SPL-RING E3 ligase catalytic domain resulted in strong replication stress sensitivity and also conferred slow growth and Mec1 dependence to unchallenged mitotically dividing cells. Our findings establish Mms21-mediated sumoylation as a determinant of cell cycle progression and maintenance of chromosome integrity during the unperturbed mitotic cell division cycle in budding yeast.

Relevance: 10.00%

Abstract:

This paper reports new results concerning the capabilities of a family of service disciplines aimed at providing per-connection end-to-end delay (and throughput) guarantees in high-speed networks. This family consists of the class of rate-controlled service disciplines, in which traffic from a connection is reshaped to conform to specific traffic characteristics at every hop on its path. When used together with a scheduling policy at each node, this reshaping enables the network to provide end-to-end delay guarantees to individual connections. The main advantages of this family of service disciplines are their implementation simplicity and flexibility. On the other hand, because the delay guarantees provided are based on summing worst-case delays at each node, it has also been argued that the resulting bounds are very conservative, which may more than offset the benefits. In particular, other service disciplines, such as those based on Fair Queueing or Generalized Processor Sharing (GPS), have been shown to provide much tighter delay bounds. As a result, these disciplines, although more complex from an implementation point of view, have been considered for the purpose of providing end-to-end guarantees in high-speed networks. In this paper, we show that through 'proper' selection of the reshaping to which the traffic of a connection is subjected, the penalty incurred by computing end-to-end delay bounds based on worst cases at each node can be alleviated. Specifically, we show how rate-controlled service disciplines can be designed to outperform the Rate Proportional Processor Sharing (RPPS) service discipline. Based on these findings, we believe that rate-controlled service disciplines provide a powerful and practical solution to the problem of providing end-to-end guarantees in high-speed networks.
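
To illustrate the "sum of per-hop worst cases" style of bound discussed above, the back-of-the-envelope sketch below computes an end-to-end delay bound for a token-bucket-constrained flow (burst sigma, rate rho) that is reshaped at every hop and served at a guaranteed rate R >= rho. The sigma/R + Lmax/R per-hop form is a simplified textbook bound for a rate-guaranteeing server, and all numbers are made up for illustration; this is not the paper's analysis.

```python
# Sketch: end-to-end delay bound obtained by summing per-hop worst-case delays.

def per_hop_delay_bound(sigma_bits: float, R_bps: float, max_pkt_bits: float) -> float:
    """Simplified worst-case queueing + packetization delay at one rate-guaranteeing hop."""
    return sigma_bits / R_bps + max_pkt_bits / R_bps

def end_to_end_bound(sigma_bits, R_bps, max_pkt_bits, hops, prop_delay_s=0.0):
    """End-to-end bound: sum of identical per-hop worst cases plus propagation delay."""
    return hops * (per_hop_delay_bound(sigma_bits, R_bps, max_pkt_bits) + prop_delay_s)

if __name__ == "__main__":
    sigma = 64_000 * 8        # 64 kB burst
    rho = 2_000_000           # 2 Mbit/s sustained rate
    R = 5_000_000             # 5 Mbit/s reserved service rate
    assert R >= rho           # the bound assumes the reserved rate covers the flow rate
    pkt = 1500 * 8            # 1500-byte packets
    print(f"end-to-end bound over 5 hops: {end_to_end_bound(sigma, R, pkt, 5) * 1e3:.1f} ms")
```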

Relevance: 10.00%

Abstract:

Consider a single-server multiclass queueing system with K classes, where the individual queues are fed by K correlated interrupted Poisson streams generated in the states of a K-state stationary modulating Markov chain. The service times for all classes are drawn independently from the same distribution. A setup time (and/or a setup cost) is incurred whenever the server switches from one queue to another. The objective is to minimize the sum of discounted inventory and setup costs over an infinite horizon. We provide sufficient conditions under which exhaustive service policies are optimal. We then present simulation results for a two-class queueing system showing that exhaustive threshold policies outperform non-exhaustive policies.
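
A simplified simulation sketch of the setting is given below: a single server attends two queues, pays a switch-over (setup) time when it changes queues, and follows an exhaustive rule (it empties the queue it is at before switching). For simplicity the arrivals here are independent Poisson streams rather than the Markov-modulated streams of the paper, and all parameters are illustrative assumptions.

```python
# Sketch: two-class polling system with setup times under exhaustive service.
import heapq
import random

def simulate(lam=(0.3, 0.3), mu=1.0, setup=0.5, horizon=200_000.0, seed=1):
    rng = random.Random(seed)
    q = [0, 0]                    # waiting jobs per class (job in service excluded)
    at = 0                        # queue the server is currently attending
    busy = switching = False      # serving a job / performing a setup
    t, area = 0.0, 0.0            # simulation clock; time-integral of jobs in system
    ev = [(rng.expovariate(lam[k]), "arr", k) for k in (0, 1)]
    heapq.heapify(ev)

    def start_next(now):
        """Exhaustive rule: empty the attended queue, then switch (paying a setup)."""
        nonlocal at, busy, switching
        if q[at] == 0 and q[1 - at] > 0:
            at = 1 - at
            switching = True
            heapq.heappush(ev, (now + setup, "ready", at))
        elif q[at] > 0:
            q[at] -= 1
            busy = True
            heapq.heappush(ev, (now + rng.expovariate(mu), "dep", at))

    while t < horizon:
        tn, kind, k = heapq.heappop(ev)
        area += (q[0] + q[1] + busy) * (tn - t)
        t = tn
        if kind == "arr":
            q[k] += 1
            heapq.heappush(ev, (t + rng.expovariate(lam[k]), "arr", k))
            if not busy and not switching:
                start_next(t)
        elif kind == "dep":
            busy = False
            start_next(t)
        else:                     # "ready": setup complete, begin serving queue k
            switching = False
            start_next(t)
    return area / t

print("time-averaged jobs in system:", round(simulate(), 2))
```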

Relevance: 10.00%

Abstract:

The velocity distribution function for the steady shear flow of disks (in two dimensions) and spheres (in three dimensions) in a channel is determined in the limit where the frequency of particle-wall collisions is large compared with that of particle-particle collisions. An asymptotic analysis is used in the small parameter ε, which is naL in two dimensions and na²L in three dimensions, where n is the number density of particles (per unit area in two dimensions and per unit volume in three dimensions), L is the separation of the walls of the channel and a is the particle diameter. The particle-wall collisions are inelastic and are described by simple relations involving coefficients of restitution e_t and e_n in the tangential and normal directions; both elastic and inelastic binary collisions between particles are considered. In the absence of binary collisions between particles, it is found that the particle velocities converge to two constant values (u_x, u_y) = (±V, 0) after repeated collisions with the wall, where u_x and u_y are the velocities tangential and normal to the wall, V = (1 − e_t)V_w/(1 + e_t), and V_w and −V_w are the tangential velocities of the walls of the channel. The effect of binary collisions is included using a self-consistent calculation, and the distribution function is determined using the condition that the net collisional flux of particles at any point in velocity space is zero at steady state. Certain approximations are made regarding the velocities of particles undergoing binary collisions in order to obtain analytical results for the distribution function, and these approximations are justified analytically by showing that the error incurred decreases in proportion to ε^(1/2) in the limit ε → 0. A numerical calculation of the mean square of the difference between the exact flux and the approximate flux confirms that the error decreases in proportion to ε^(1/2) in the limit ε → 0. The moments of the velocity distribution function are evaluated, and it is found that ⟨u_x²⟩ → V², ⟨u_y²⟩ ~ V²ε and −⟨u_x u_y⟩ ~ V²ε log(1/ε) in the limit ε → 0. The distribution function and the scaling laws for the velocity moments are found to be similar for both two- and three-dimensional systems.

Relevance: 10.00%

Abstract:

We propose partial and full link reversal algorithms to bypass voids during geographic routing over duty-cycled wireless sensor networks. We propose a distributed approach that does not require one-hop neighbor information. Upon termination of the algorithm, the resulting network is guaranteed to be destination-oriented. Further, to reduce the delays incurred under reactive link reversal, we propose the use of 'pseudo-events', a preemptive link reversal strategy that renders the network destination-oriented before the onset of a real event. A simulation study of the effectiveness of pseudo-events is also provided.
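
For intuition, the toy sketch below shows the classical full link-reversal idea (in the Gafni-Bertsekas spirit) that the abstract builds on: any non-destination node with no outgoing links (a void) reverses all of its incident links, and the process repeats until the graph is destination-oriented. The tiny graph and the centralized loop are illustrative assumptions only; the paper's algorithms are distributed and also address partial reversal and duty cycling.

```python
# Sketch: full link reversal to restore destination orientation after a void appears.

def full_link_reversal(nodes, edges, dest):
    """edges is a set of directed pairs (u, v) meaning u -> v; assumes the graph is
    connected to dest (otherwise disconnected parts would keep reversing forever)."""
    edges = set(edges)
    while True:
        out = {n: 0 for n in nodes}
        for u, _ in edges:
            out[u] += 1
        sinks = [n for n in nodes if n != dest and out[n] == 0]
        if not sinks:
            return edges                              # destination-oriented: done
        for s in sinks:
            incoming = {(u, v) for (u, v) in edges if v == s}
            edges -= incoming
            edges |= {(s, u) for (u, _) in incoming}  # reverse every incident link

# Node 3 is a local minimum (no outgoing links) even though the destination is node 0.
nodes = {0, 1, 2, 3}
edges = {(1, 0), (1, 3), (2, 3)}
print(sorted(full_link_reversal(nodes, edges, dest=0)))
```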

Relevance: 10.00%

Abstract:

We are concerned with the situation in which a wireless sensor network is deployed in a region for the purpose of detecting an event occurring at a random time and at a random location. The sensor nodes periodically sample their environment (e.g., for acoustic energy), process the observations (in our case, using a CUSUM-based algorithm) and send a local decision (which is binary in nature) to the fusion centre. The fusion centre collects these local decisions and uses a fusion rule to process the sensors' local decisions and infer the state of nature, i.e., whether an event has occurred or not. Our main contribution is in analyzing two local detection rules in combination with a simple fusion rule. The local detection algorithms are based on the nonparametric CUSUM procedure from sequential statistics. We also propose two ways to operate the local detectors after an alarm. These alternatives, when combined in various ways, yield several approaches. Our contribution is to provide analytical techniques for calculating false alarm measures, by the use of which the local detector thresholds can be set. Simulation results are provided to evaluate the accuracy of our analysis, and as an illustration we provide a design example. We also use simulations to compare the detection delays incurred by these algorithms.
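
The sketch below illustrates the general flavour of this architecture: each sensor runs a one-sided nonparametric CUSUM on its samples and reports a binary alarm, and the fusion centre declares an event when at least k of the n sensors alarm in the same slot. The drift, threshold, k-out-of-n rule, and signal model are assumptions for the example and are not the specific rules analyzed in the paper.

```python
# Sketch: per-sensor nonparametric CUSUM detectors with a k-out-of-n fusion rule.
import numpy as np

def cusum_alarms(samples, drift=0.5, threshold=8.0):
    """One-sided nonparametric CUSUM S_t = max(0, S_{t-1} + x_t - drift);
    returns a boolean alarm sequence (alarm whenever S_t > threshold)."""
    s, alarms = 0.0, []
    for x in samples:
        s = max(0.0, s + x - drift)
        alarms.append(s > threshold)
    return np.array(alarms)

def k_out_of_n_fusion(local_alarms, k=2):
    """local_alarms: (n_sensors, T) boolean array of per-slot local decisions."""
    return local_alarms.sum(axis=0) >= k

rng = np.random.default_rng(0)
T, change = 200, 120
sensors = []
for _ in range(3):
    x = rng.normal(0.0, 1.0, T)
    x[change:] += 1.5                    # mean shift after the event occurs at t=120
    sensors.append(cusum_alarms(x))
global_alarm = k_out_of_n_fusion(np.vstack(sensors), k=2)
print("first global alarm at slot:", int(np.argmax(global_alarm)))
```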