989 results for Standard map
Abstract:
Introduction: In an attempt to reduce stress shielding in the proximal femur, multiple new shorter stem designs have become available. We investigated the load to fracture of a new polished tapered cemented short stem in comparison to the conventional polished tapered Exeter stem. Method: A total of forty-two stems, twenty-one short stems and twenty-one conventional stems, each with three different offsets, were cemented in a composite sawbone model and loaded to fracture. Results: The study showed that femurs break at a significantly lower load to failure with a shorter stem than with a conventional-length Exeter stem. Conclusion: Both standard and short stem designs are safe to use, as the torque to failure is 7–10 times the torque seen in activities of daily living.
Abstract:
Aim A new method of penumbral analysis is implemented which allows an unambiguous determination of field size and penumbra size and quality for small fields and other non-standard fields. Both source occlusion and lateral electronic disequilibrium will affect the size and shape of cross-axis profile penumbrae; each is examined in detail. Method A new method of penumbral analysis is implemented in which the square of the derivative of the cross-axis profile is plotted. The resultant graph displays two peaks in place of the two penumbrae. This allows a clear visualisation of the quality of a field penumbra, as well as a mathematically consistent method of determining field size (the distance between the two peaks’ maxima) and penumbra (the full-width tenth-maximum of a peak). Cross-axis profiles were simulated in a water phantom at a depth of 5 cm using Monte Carlo modelling, for field sizes between 5 and 30 mm. The field size and penumbra size of each field were calculated using the method above, as well as the traditional definitions set out in IEC976. The effects of source occlusion and lateral electronic disequilibrium on the penumbrae were isolated by repeating the simulations with electron transport removed and with an electron spot size of 0 mm, respectively. Results All field sizes calculated using the traditional and proposed methods agreed within 0.2 mm. The penumbra size measured using the proposed method was systematically 1.8 mm larger than that of the traditional method at all field sizes. The size of the source had a larger effect on the size of the penumbra than did lateral electronic disequilibrium, particularly at very small field sizes. Conclusion Traditional methods of calculating field size and penumbra prove to be mathematically adequate for small fields. However, the field size definition proposed in this study would be more robust for other non-standard fields, such as flattening filter free beams.
Source occlusion plays a bigger role than lateral electronic disequilibrium in small field penumbra size.
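As a rough illustration of the proposed analysis, the sketch below (a hypothetical Python implementation, not the authors' code) squares the numerical derivative of a synthetic cross-axis profile, takes the field size as the distance between the two peak maxima, and takes the penumbra as the full-width tenth-maximum of one peak; the logistic-edge profile and all numbers are made up for the example:

```python
import numpy as np

def field_and_penumbra(x, profile):
    """Field size and penumbra from the squared derivative of a
    cross-axis profile (positions x in mm)."""
    d2 = np.gradient(profile, x) ** 2       # squared derivative
    mid = len(x) // 2
    left = int(np.argmax(d2[:mid]))         # left penumbra peak
    right = mid + int(np.argmax(d2[mid:]))  # right penumbra peak
    field_size = x[right] - x[left]         # distance between peak maxima
    # Penumbra: full-width tenth-maximum of the left peak
    above = np.where(d2[:mid] >= d2[left] / 10.0)[0]
    penumbra = x[above[-1]] - x[above[0]]
    return field_size, penumbra

# Synthetic 10 mm field built from two logistic edges (illustrative only)
x = np.linspace(-20.0, 20.0, 4001)
edge = lambda u: 1.0 / (1.0 + np.exp(-u))
profile = edge(x + 5.0) - edge(x - 5.0)
fs, pen = field_and_penumbra(x, profile)
```

On this synthetic profile the recovered field size is close to the nominal 10 mm, which mirrors the abstract's finding that the derivative-based field size agrees with traditional definitions.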
Abstract:
Occupational standards concerning the allowable concentrations of chemical compounds in the ambient air of workplaces have been established in several countries at national levels. With the integration of the European Union, a need exists for establishing harmonized Occupational Exposure Limits. For analytical developments, it is apparent that methods for speciation or fractionation of carcinogenic metal compounds will be of increasing practical importance for standard setting. Criteria of applicability under field conditions, cost-effectiveness, and robustness are practical driving forces for new developments. When the European Union issued a list of 62 chemical substances with Occupational Exposure Limits in 2000, 25 substances received a 'skin' notation. The latter indicates that toxicologically significant amounts may be taken up via the skin. Similar notations exist at national levels. For such substances, monitoring concentrations in ambient air will not be sufficient; biological monitoring strategies will gain further importance in the medical surveillance of workers who are exposed to such compounds. Progress in establishing legal frameworks for biological monitoring of chemical exposures within Europe is paralleled by scientific advances in this field. A new aspect is the possibility of differential adduct monitoring, using blood proteins of different half-life or lifespan. This technique allows differentiation between long-term mean exposure to reactive chemicals and short-term episodes, for example, accidental overexposure. For further analytical developments, the following issues have been identified as particularly important: new dose monitoring strategies, sensitive and reliable methods for detection of DNA adducts, cytogenetic parameters in biological monitoring, methods to monitor exposure to sensitizing chemicals, and parameters for individual susceptibilities to chemical toxicants.
Abstract:
The benefits of using eXtensible Business Reporting Language (XBRL) as a business reporting standard have been widely canvassed in the extant literature, in particular as the enabling technology for standard business reporting tools. One of the key benefits noted is the ability of standard business reporting to create significant efficiencies in the regulatory reporting process. Efficiency-driven cost reductions are highly desirable to data and report producers. However, they may not have the same potential to create long-term firm value as improved effectiveness of decision making. This study assesses the perceptions of Australian business stakeholders in relation to the benefits of the Australian standard business reporting instantiation (SBR) for financial reporting. These perceptions were drawn from interviews of persons knowledgeable in XBRL-based standard business reporting and from submissions to Treasury on SBR reporting options. The combination of interviews and submissions permits insights into the views of various groups of stakeholders in relation to the potential benefits. In line with predictions based on a transaction-cost economics perspective, interviewees who primarily came from a data and report-producer background mentioned benefits that centre largely on asset specificity and efficiency. Interviewees who principally came from a data and report-consumer background mentioned benefits that centre on reducing decision-making uncertainty and improving decision-making effectiveness. The data and report consumers also took a broader view of the benefits of SBR to the financial reporting supply chain. Our research suggests that advocates of SBR have successfully promoted its efficiency benefits to potential users. However, the effectiveness benefits of SBR, for example the decision-making benefits offered to investors via standardised reports, while becoming more broadly acknowledged, are not yet a priority for all stakeholders.
Abstract:
Millions of people with print disabilities are denied the right to read. While some important efforts have been made to convert standard books to accessible formats and to create accessible repositories, these have so far addressed this crisis only in an ad hoc way. This article argues that universally designed ebook libraries have the potential to substantially enable persons with print disabilities. As a case study of what is possible, we analyse 12 academic ebook libraries to map their levels of accessibility. The positive results from this study indicate that universally designed ebooks are more than possible; they exist. While the results are positive, however, we also found that most ebook libraries have some features that frustrate full accessibility, and some ebook libraries present critical barriers for people with disabilities. Based on these findings, we consider that some combination of private pressure and public law is both possible and necessary to advance the right-to-read cause. With access improving and recent advances in international law, now is the time to push for universal design and equality.
Abstract:
Purpose: The purpose of this work was to evaluate the patient-borne financial cost of common adverse breast cancer treatment-associated effects, comparing costs across women with and without these side effects. Methods: 287 Australian women diagnosed with early-stage breast cancer were prospectively followed for 12 months, starting at six months post-surgery, with three-monthly assessment of detailed treatment-related side effects and of their direct and indirect patient costs attributable to breast cancer. Bootstrapping statistics were used to analyze the cost data, and adjusted logistic regression was used to evaluate the association between costs and adverse events from breast cancer. Costs were inflated and converted from 2002 Australian to 2014 US dollars. Results: More than 90% of women experienced at least one adverse effect (i.e. a post-surgical issue, reaction to radiotherapy, upper-body symptoms or reduced function, lymphedema, fatigue or weight gain). On average, women paid $5,636 (95% CI: $4,694, $6,577) in total costs. Women with any one of the following symptoms (fatigue, reduced upper-body function, upper-body symptoms), or women who reported ≥4 adverse treatment-related effects, had 1.5 to nearly 4 times the odds of having higher healthcare costs than women who did not report these complaints (p<0.05). Conclusions: Women face a substantial economic burden due to a range of treatment-related health problems, which may persist beyond the treatment period. Improving breast cancer care by incorporating prospective surveillance of treatment-related side effects, together with strategies for the prevention and treatment of concerns (e.g., exercise), has real potential for reducing patient-borne costs.
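The bootstrapping approach mentioned for the cost data can be sketched generically as a percentile-bootstrap confidence interval for the mean cost. This is an assumption about the general technique, not the study's actual analysis, and the cost data below are synthetic:

```python
import random
import statistics

def bootstrap_mean_ci(costs, n_boot=5000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for the mean cost."""
    rng = random.Random(seed)
    n = len(costs)
    # Resample with replacement n_boot times and record each mean
    means = sorted(statistics.fmean(rng.choices(costs, k=n))
                   for _ in range(n_boot))
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return statistics.fmean(costs), (lo, hi)

# Synthetic right-skewed costs for 287 hypothetical patients (US$)
data_rng = random.Random(0)
costs = [round(data_rng.lognormvariate(8.0, 1.0), 2) for _ in range(287)]
mean, (lo, hi) = bootstrap_mean_ci(costs)
```

A percentile bootstrap is a common choice for skewed cost data because it avoids assuming normality of the cost distribution.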
Abstract:
Throughout the world, there is increasing pressure on governments, companies, regulators and standard-setters to respond to the global challenge of climate change. The growing number of regulatory requirements for organisations to disclose their greenhouse gas (GHG) emissions and emergent national, regional and international emissions trading schemes (ETSs) reflect key government responses to this challenge. Assurance of GHG emissions disclosures enhances the credibility of these disclosures and any associated trading schemes. The auditing and assurance profession has an important role to play in the provision of such assurance, highlighted by the International Auditing and Assurance Standards Board’s (IAASB) decision to develop an international GHG emissions assurance standard. This article sets out the developments to date on an international standard for the assurance of GHG emissions disclosures. It then provides information on the way Australian companies have responded to the challenge of GHG reporting and assurance. Finally, it outlines the types of assurance that assurance providers in Australia are currently providing in this area.
Abstract:
Worldwide public concern over climate change and the need to limit greenhouse gas (hereafter, GHG) emissions has increasingly motivated public officials to consider more stringent environmental regulation and standards. The authors argue that the development of a new international assurance standard on GHG disclosures is an appropriate response by the auditing and assurance profession to meet these challenges. At its December 2007 meeting, the International Auditing and Assurance Standards Board (hereafter, IAASB) approved a project to consider the development of such a standard aimed at promoting trust and confidence in disclosures of GHG emissions, including disclosures required under emissions trading schemes. The authors assess the types of disclosures that can be assured, and outline the issues involved in developing an international assurance standard on GHG emissions disclosures. The discussion synthesizes the insights gained from four international roundtables on the proposed IAASB assurance standard held in Asia-Pacific, North America, and Europe during 2008, and an IAASB meeting addressing this topic in December 2008.
Abstract:
This video was prepared as a teaching resource for CARRS-Q's Under the Limit Drink Driving Rehabilitation Program.
Abstract:
Natural disasters cause widespread disruption, costing the Australian economy $6.3 billion per year, and those costs are projected to rise incrementally to $23 billion by 2050. With more frequent natural disasters with greater consequences, Australian communities need the ability to prepare and plan for them, absorb and recover from them, and adapt more successfully to their effects. Enhancing Australian resilience will allow us to better anticipate disasters and assist in planning to reduce losses, rather than just waiting for the next king hit and paying for it afterwards. Given the scale of devastation, governments have been quick to pick up the pieces when major natural disasters hit. But this approach (‘The government will give you taxpayers’ money regardless of what you did to help yourself, and we’ll help you rebuild in the same risky area.’) has created a culture of dependence. This is unsustainable and costly. In 2008, ASPI published Taking a punch: building a more resilient Australia. That report emphasised the importance of strong leadership and coordination in disaster resilience policymaking, as well as the value of volunteers and family and individual preparation, in managing the effects of major disasters. This report offers a roadmap for enhancing Australia’s disaster resilience, building on the 2011 National Strategy for Disaster Resilience. It includes a snapshot of relevant issues and current resilience efforts in Australia, outlining key challenges and opportunities. The report sets out 11 recommendations to help guide Australia towards increasing national resilience, from individuals and local communities through to state and federal agencies.
Abstract:
Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high frequency (1 Hz or higher) positioning data from GPS. When such map-matching algorithms are applied to low frequency data (such as data from a fleet of private cars, buses or light duty vehicles, or from smartphones), their performance drops to around 70% in terms of correct link identification, especially in urban and sub-urban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low frequency GPS data. Therefore, this paper develops a new weight-based shortest path and vehicle trajectory aided map-matching (stMM) algorithm that enhances the map-matching of low frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and vehicle trajectory are considered: one shortest path-based weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for 30 s GPS data. If the information from the shortest path and vehicle trajectory is omitted, the accuracy of the algorithm reduces to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
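The A* component the paper builds on can be illustrated with a minimal node-based search. This is a generic sketch, not the paper's implementation: the toy graph, coordinates and link lengths are made up, and turn restrictions at junctions (which require edge-based search states) are omitted for brevity:

```python
import heapq
import math

def a_star(graph, coords, start, goal):
    """Minimal A* shortest path on a road graph.
    graph: {node: {neighbour: link_length}}; coords: {node: (x, y)}."""
    def h(n):  # straight-line (admissible) heuristic to the goal
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return math.hypot(x2 - x1, y2 - y1)

    frontier = [(h(start), 0.0, start, [start])]  # (f, g, node, path)
    settled = {}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        if settled.get(node, float("inf")) <= g:
            continue  # already reached this node more cheaply
        settled[node] = g
        for nbr, length in graph[node].items():
            heapq.heappush(frontier,
                           (g + length + h(nbr), g + length, nbr, path + [nbr]))
    return float("inf"), []

# A toy four-node network (lengths in arbitrary units)
coords = {"A": (0, 0), "B": (1, 0), "C": (1, 1), "D": (2, 1)}
graph = {"A": {"B": 1.0, "C": 1.6}, "B": {"A": 1.0, "C": 1.0, "D": 1.5},
         "C": {"A": 1.6, "B": 1.0, "D": 1.0}, "D": {"B": 1.5, "C": 1.0}}
cost, path = a_star(graph, coords, "A", "D")  # A-B-D, cost 2.5
```

In an stMM-style use, the length of this shortest path between two consecutive GPS fixes would be compared against the distance implied by the vehicle trajectory to form the shortest path-based weight.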
Abstract:
Background There has been growing interest in mixed species plantation systems because of their potential to provide a range of socio-economic and bio-physical benefits which can be matched to the diverse needs of smallholders and communities. Potential benefits include the production of a range of forest products for home and commercial use; improved soil fertility especially when nitrogen fixing species are included; improved survival rates and greater productivity of species; a reduction in the amount of damage from pests or disease; and improved biodiversity and wildlife habitats. Despite these documented services and growing interest in mixed species plantation systems, the actual planting areas in the tropics are low, and monocultures are still preferred for industrial plantings and many reforestation programs because of perceived higher economic returns and readily available information about the species and their silviculture. In contrast, there are few guidelines for the design and management of mixed-species systems, including the social and ecological factors of successful mixed species plantings. Methods This protocol explains the methodology used to investigate the following question: What is the available evidence for the relative performance of different designs of mixed-species plantings for smallholder and community forestry in the tropics? This study will systematically search, identify and describe studies related to mixed species plantings across tropical and temperate zones to identify the social and ecological factors that affect polyculture systems. The objectives of this study are first to identify the evidence of biophysical or socio-economic factors that have been considered when designing mixed species systems for community and smallholder forestry in the tropics; and second, to identify gaps in research of mixed species plantations. 
Results of the study will help create guidelines that can assist practitioners, scientists and farmers to better design mixed species plantation systems for smallholders in the tropics.