272 results for Apparent Return Rate


Relevance: 20.00%

Abstract:

Fifty-nine persons with industrial handling of low levels of acrylonitrile (AN) were studied. As part of a medical surveillance programme an extended haemoglobin adduct monitoring [N-(cyanoethyl)valine, CEV; N-(methyl)valine, MV; N-(hydroxyethyl)valine, HEV] was performed. Moreover, the genetic states of the polymorphic glutathione transferases GSTM1 and GSTT1 were assayed by polymerase chain reaction (PCR). Repetitive analyses of CEV and MV in subsequent years resulted in comparable values (means, 59.8 and 70.3 µg CEV/l blood; 6.7 and 6.7 µg MV/l blood). Hence, the industrial AN exposures were well below current official standards. Monitoring the haemoglobin adduct CEV appears to be a suitable means of biomonitoring and medical surveillance under such exposure conditions. There was also no apparent correlation between the CEV and HEV or CEV and MV adduct levels. The MV and HEV values observed represented background levels, which apparently are not related to any occupational chemical exposure. There was no consistent effect of the genetic GSTM1 or GSTT1 state on CEV adduct levels induced by acrylonitrile exposure. Therefore, neither GSTM1 nor GSTT1 appears to be a major AN-metabolizing isoenzyme in humans. The low, physiological background levels of MV were also not influenced by the genetic GSTM1 state, but the MV adduct levels tended to be higher in GSTT1− individuals compared to GSTT1+ persons. With respect to the background levels of HEV adducts observed, there was no major influence of the GSTM1 state, but GSTT1− individuals displayed adduct levels about one-third higher than those of GSTT1+ individuals. The coincidence with known differences in rates of background sister chromatid exchange between GSTT1− and GSTT1+ persons suggests that the lower ethylene oxide (EO) detoxification rate in GSTT1− persons, indicated by elevated blood protein hydroxyethyl adduct levels, leads to an increased genotoxic effect of the physiological EO background.

Relevance: 20.00%

Abstract:

Graphyne is an allotrope of graphene. The mechanical properties of graphynes (α-, β-, γ- and 6,6,12-graphynes) under uniaxial tensile deformation at different temperatures and strain rates are studied using molecular dynamics simulations. It is found that graphynes are more sensitive to temperature changes than graphene in terms of fracture strength and Young's modulus. The temperature sensitivity of the different graphynes is proportionally related to the percentage of acetylenic linkages in their structures, with α-graphyne (having 100% acetylenic linkages) being the most sensitive to temperature. For the same graphyne, temperature exerts a more pronounced effect on the Young's modulus than on the fracture strength, which differs from the behaviour of graphene. The mechanical properties of graphynes are also sensitive to strain rate, in particular at higher temperatures.

Relevance: 20.00%

Abstract:

The global growth in institutional investors means that firms can no longer ignore their influence in capital markets. However, not all institutional investors have the same motives to influence the firms they invest in. Institutional investors' ability to influence management depends on the size of their investment and on whether they have any business relations with the firm. Using a sample of Australian firms from 2006 to 2008, our empirical results show that the proportion of a company's shares held by institutional investors is positively associated with firm governance ratings, risk and profitability. This study shows that a positive association between risk and return is associated with large active institutional ownership, which we interpret as evidence of shareholders with sufficient power to pressure management into increasing short-term profits.

Relevance: 20.00%

Abstract:

This study evaluated physiological tolerance times when wearing explosive and chemical (>35 kg) personal protective equipment (PPE) in simulated environmental extremes across a range of work intensities. Twelve healthy males undertook nine trials, each involving walking on a treadmill at 2.5, 4 and 5.5 km·h⁻¹ in one of the following environmental conditions: 21, 30 and 37°C wet bulb globe temperature (WBGT). Participants exercised for 60 min or until volitional fatigue, until core temperature reached 39°C, or until heart rate exceeded 90% of maximum. Tolerance time, core temperature, skin temperature, mean body temperature, heart rate and body mass loss were measured. Exercise time was reduced in the higher WBGT environments (WBGT37 < WBGT30 < WBGT21; P < 0.05) and at higher work intensities (5.5 < 4 < 2.5 km·h⁻¹; P < 0.001). The majority of trials (85/108; 78.7%) were terminated because participants' heart rates exceeded 90% of their maximum. Eight trials (7.4%) lasted the full duration, nine (8.3%) were terminated due to volitional fatigue and six (5.6%) due to core temperatures in excess of 39°C. These results demonstrate that physiological tolerance times are influenced by the external environment and workload, and that cardiovascular strain is the limiting factor to work tolerance when wearing this heavy, multi-layered PPE.

Relevance: 20.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, how to determine the test threshold is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modelling procedure and an approximation procedure. The modelling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modelling error and approximation error are analysed with simulated data to avoid nuisance biases and unrealistic stochastic model impacts. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modelling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
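As a rough illustration of the approximation step described above, the Python sketch below computes the standard closed-form integer bootstrapping (IB) success rate and feeds it into a hypothetical rational-function threshold model; the model form and coefficient values here are illustrative placeholders, not the fitted values from the study.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ib_success_rate(cond_stds):
    """Closed-form integer bootstrapping success rate, given the
    conditional standard deviations of the decorrelated ambiguities."""
    p = 1.0
    for s in cond_stds:
        p *= 2.0 * norm_cdf(1.0 / (2.0 * s)) - 1.0
    return p

def difference_test_threshold(p_success, a=1.5, b=-1.4, c=0.9):
    """Hypothetical rational-function threshold model mapping a success
    rate in (0, 1] to an FF-difference test threshold. Coefficients are
    placeholders for illustration only."""
    return (a + b * p_success) / (1.0 + c * p_success)

# More precise ambiguities -> higher success rate -> lower threshold
p_good = ib_success_rate([0.05, 0.05, 0.05])
p_poor = ib_success_rate([0.30, 0.30, 0.30])
```

With these placeholder coefficients the threshold decreases as the success rate rises, matching the intuition that a high-success-rate model needs a less conservative acceptance test.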

Relevance: 20.00%

Abstract:

Background: The incidence of clinically apparent stroke in transcatheter aortic valve implantation (TAVI) exceeds that of any other procedure performed by interventional cardiologists and, in the index admission, occurs more than twice as frequently with TAVI as with surgical aortic valve replacement (SAVR). However, this represents only a small component of the vast burden of neurological injury that occurs during TAVI, with recent evidence suggesting that many strokes are clinically silent or only subtly apparent. Additionally, insult may manifest as slight neurocognitive dysfunction rather than overt neurological deficits. Characterisation of the incidence and underlying aetiology of these neurological events may lead to the identification of currently unrecognised neuroprotective strategies.

Methods: The Silent and Apparent Neurological Injury in TAVI (SANITY) Study is a prospective, multicentre, observational study comparing the incidence of neurological injury after TAVI versus SAVR. It introduces an intensive, standardised, formal neurological and neurocognitive assessment for all aortic valve recipients, regardless of intervention (SAVR, TAVI), valve type (bioprosthetic, Edwards SAPIEN-XT) or access route (sternotomy, transfemoral, transapical or transaortic). Comprehensive monitoring of neurological insult will also be recorded to more fully define and compare the neurological burden of the procedures and identify targets for harm-minimisation strategies.

Discussion: The SANITY study undertakes the most rigorous assessment of neurological injury reported in the literature to date. It attempts to accurately characterise the insult and sustained injury associated with both TAVI and SAVR in order to advance understanding of this complication and its associations, thus allowing for improved patient selection and procedural modification.

Relevance: 20.00%

Abstract:

Linkage of echolocation call production with contraction of the flight muscles has been suggested to reduce the energetic cost of flight with echolocation, such that the overall cost is approximately equal to that of flight alone. However, the pattern of call production with limb movement in terrestrially agile bats has never been investigated. We used synchronised high-speed video and audio recordings to determine patterns of association between echolocation call production and limb motion as individuals of Mystacina tuberculata Gray 1843 walked and flew. Results showed no apparent linkage between call production and limb motion when bats walked. When in flight, two calls were produced per wingbeat, late in the downstroke and early in the upstroke. When bats walked, calls were produced at a higher rate, but at a slightly lower intensity, than in flight. These results suggest that M. tuberculata do not attempt to reduce the cost of terrestrial locomotion and call production through biomechanical linkage. They also suggest that the pattern of linkage seen when bats are in flight is not universal and that energetic savings cannot necessarily be explained by contraction of muscles associated with the downstroke alone.

Relevance: 20.00%

Abstract:

Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous basis in probability theory, but it employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function for the FF-difference test, is proposed. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that, with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
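The two acceptance tests compared in this abstract can be stated in a few lines. In the sketch below, q1 and q2 are the squared norms of the ambiguity residuals for the best and second-best integer candidates; the exact statistics and threshold values used in the study are not given in the abstract, so this is only a schematic illustration with made-up numbers.

```python
def ratio_test(q1, q2, c):
    """FF-ratio test: accept the best integer candidate when the
    second-best residual is at least c times the best one."""
    return q2 / q1 >= c

def difference_test(q1, q2, d):
    """FF-difference test: accept when the residual gap exceeds d.
    Its threshold d can be modelled as a function of the ILS success rate."""
    return q2 - q1 >= d

# A close runner-up candidate should be rejected by both tests
accept_ratio = ratio_test(1.0, 1.1, c=2.0)      # 1.1 < 2.0 -> reject
accept_diff = difference_test(1.0, 1.1, d=0.5)  # 0.1 < 0.5 -> reject
```

The difference test's appeal, per the abstract, is that d admits a simple functional relationship to the success rate, whereas the ratio test's critical value c does not.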

Relevance: 20.00%

Abstract:

This article explores the outcomes experienced by abducting primary-carer mothers and their children post-return to Australia under the Hague Convention on the Civil Aspects of International Child Abduction.1 The circumstances faced by families that experience international parental child abduction are examined by considering how Part VII of the Australian Family Law Act 1975 (Cth) is applied to resolve parenting disputes post-return. At present, the statutory criteria found in Part VII encourage an equal shared parental responsibility and shared-care parenting approach.2 This emphasis aligns children's best interests with collaborative parenting3 and with their parents living within close geographical proximity of each other to facilitate the practicalities of the approach.4 Arguably, these statutory criteria guide the exercise of judicial discretion to determine a child's best interests towards a parenting arrangement that is incompatible with the lifestyle and functional characteristics of these families.

Relevance: 20.00%

Abstract:

This article reports the findings of an empirical study of outcomes experienced by abducting primary-carer mothers and their children post-return to Australia under the Hague Child Abduction Convention. The study specifically focused on legal and factual outcomes post-return to Australia as the child's habitual residence. The study contributes an original critique of the Convention's operation by examining the collective operation of Convention return proceedings and Pt VII proceedings under the Family Law Act 1975 (Cth) post-return. Convention return proceedings, and the resolution of the substantive parenting dispute post-return to Australia, are not distinct stages operating in isolation. Viewing them as such is a purely theoretical exercise divorced from the reality of the lives of transnational families. Arguably, a better measure of the Convention's success is the outcomes it produces as part of the entire system designed to address the contemporary problem of international parental child abduction. When a child is returned to Australia this system includes the operation of Australian family law.

Relevance: 20.00%

Abstract:

Nb2O5 nanosheets were successfully synthesized through a facile hydrothermal reaction followed by heat treatment in air. Structural characterization reveals that the thickness of the sheets is around 50 nm and their length is 500-800 nm. This unique two-dimensional structure gives the nanosheet electrode superior performance during the charge-discharge process, such as high specific capacity (~184 mAh g-1) and good rate capability. Even at a current density of 1 A g-1, the nanosheet electrode still exhibits a specific capacity of ~90 mAh g-1. These results suggest that Nb2O5 nanosheets are a promising candidate for high-rate lithium-ion storage applications.

Relevance: 20.00%

Abstract:

Introduced predators can have pronounced effects on naïve prey species; thus, predator control is often essential for the conservation of threatened native species. Complete eradication of the predator, although desirable, may be elusive in budget-limited situations, whereas predator suppression is more feasible and may still achieve conservation goals. We used a stochastic predator-prey model based on a Lotka-Volterra system to investigate the cost-effectiveness of predator control for prey conservation. We compared five control strategies: immediate eradication, removal of a constant number of predators (fixed-number control), removal of a constant proportion of predators (fixed-rate control), removal of predators that exceed a predetermined threshold (upper-trigger harvest), and removal of predators whenever their population falls below a lower predetermined threshold (lower-trigger harvest). We looked at the performance of these strategies when managers could always remove the full number of predators targeted by each strategy, subject to budget availability. Under this assumption, immediate eradication reduced the threat to the prey population the most. We then examined the effect of reduced management success in meeting removal targets, assuming removal is more difficult at low predator densities. In this case there was a pronounced reduction in the performance of the immediate eradication, fixed-number, and lower-trigger strategies. Although immediate eradication still yielded the highest expected minimum prey population size, upper-trigger harvest yielded the lowest probability of prey extinction and the greatest return on investment (as measured by improvement in expected minimum population size per amount spent). Upper-trigger harvest was relatively successful because it operated when predator density was highest, which is when predator removal targets can be more easily met and the effect of predators on the prey is most damaging. This suggests that controlling predators only when they are most abundant is the "best" strategy when financial resources are limited and eradication is unlikely. © 2008 Society for Conservation Biology.
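The comparison described in this abstract can be caricatured in a few lines of Python. The deterministic Lotka-Volterra core below shows how a control rule plugs into the dynamics and how a minimum-prey-population criterion is evaluated; the study itself used a stochastic model, and all parameter values here are invented for illustration.

```python
def simulate_min_prey(prey0, pred0, control, steps=2000, dt=0.01):
    """Euler-integrated Lotka-Volterra dynamics with predator removal.
    `control(pred)` returns the removal rate for the current predator
    density. Returns the minimum prey population over the run."""
    r, a, b, m = 1.0, 0.1, 0.02, 0.5  # illustrative demographic rates
    prey, pred = float(prey0), float(pred0)
    min_prey = prey
    for _ in range(steps):
        removed = control(pred)
        prey = max(prey + dt * (r * prey - a * prey * pred), 0.0)
        pred = max(pred + dt * (b * prey * pred - m * pred - removed), 0.0)
        min_prey = min(min_prey, prey)
    return min_prey

# Two of the five strategies compared in the abstract:
no_control = lambda pred: 0.0
fixed_rate = lambda pred: 0.4 * pred                # constant proportion
upper_trigger = lambda pred: max(pred - 15.0, 0.0)  # cull only above a threshold
```

The upper-trigger rule removes nothing at low densities, which is exactly the regime where the abstract notes removal targets are hardest to meet anyway.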

Relevance: 20.00%

Abstract:

Most standard algorithms for prediction with expert advice depend on a parameter called the learning rate. This learning rate needs to be large enough to fit the data well, but small enough to prevent overfitting. For the exponential weights algorithm, a sequence of prior work has established theoretical guarantees for higher and higher data-dependent tunings of the learning rate, which allow for increasingly aggressive learning. But in practice such theoretical tunings often still perform worse (as measured by their regret) than ad hoc tuning with an even higher learning rate. To close the gap between theory and practice, we introduce an approach to learn the learning rate. Up to a factor that is at most (poly)logarithmic in the number of experts and the inverse of the learning rate, our method performs as well as if we knew the empirically best learning rate from a large range that includes both conservative small values and values much higher than those for which formal guarantees were previously available. Our method employs a grid of learning rates, yet runs in linear time regardless of the size of the grid.
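To make the setting concrete, here is a minimal exponential weights forecaster evaluated over a grid of learning rates. This only illustrates the objects involved (experts, per-round losses, the learning rate, and loss in hindsight for each tuning); the method described in the abstract is more sophisticated than simply picking the grid's best tuning in hindsight.

```python
import math

def exp_weights_loss(losses, eta):
    """Total loss of the exponential weights forecaster run with a fixed
    learning rate eta; losses[t][k] is expert k's loss in round t."""
    n_experts = len(losses[0])
    w = [1.0] * n_experts
    total = 0.0
    for round_losses in losses:
        norm = sum(w)
        probs = [wi / norm for wi in w]          # play the weighted mixture
        total += sum(p * l for p, l in zip(probs, round_losses))
        w = [wi * math.exp(-eta * l)             # exponential weight update
             for wi, l in zip(w, round_losses)]
    return total

# An exponentially spaced grid spans conservative and aggressive tunings
grid = [2.0 ** k for k in range(-4, 5)]
losses = [[0.1, 0.9], [0.2, 0.8], [0.3, 0.7], [0.1, 0.9]]
best_in_hindsight = min(exp_weights_loss(losses, eta) for eta in grid)
```

On this toy sequence, larger learning rates concentrate weight on the consistently better first expert faster, so the best grid point sits near the aggressive end.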