642 results for fix
Abstract:
Remote monitoring of animal behaviour in the environment can assist in managing both the animal and its environmental impact. GPS collars which record animal locations with high temporal frequency allow researchers to monitor both animal behaviour and interactions with the environment. These ground-based sensors can be combined with remotely-sensed satellite images to understand animal-landscape interactions. The key to combining these technologies is communication methods such as wireless sensor networks (WSNs). We explore this concept using a case-study from an extensive cattle enterprise in northern Australia and demonstrate the potential for combining GPS collars and satellite images in a WSN to monitor behavioural preferences and social behaviour of cattle.
Abstract:
The Trouble with Play is a radical departure from some of the ideas about play that are held dear by many in early childhood education. For many, play is considered essential to children's development and learning, and is often promoted as a universal and almost magical 'fix'. Although play does have many proven benefits for children, the authors show that play in the early years is not always innocent, fun and natural. Play can also be political and involve morals, ethics, values and power.
Abstract:
Several studies have developed metrics for software quality attributes of object-oriented designs such as reusability and functionality. However, metrics which measure the quality attribute of information security have received little attention. Moreover, existing security metrics measure either the system from a high level (i.e. the whole system’s level) or from a low level (i.e. the program code’s level). These approaches make it hard and expensive to discover and fix vulnerabilities caused by software design errors. In this work, we focus on the design of an object-oriented application and define a number of information security metrics derivable from a program’s design artifacts. These metrics allow software designers to discover and fix security vulnerabilities at an early stage, and help compare the potential security of various alternative designs. In particular, we present security metrics based on composition, coupling, extensibility, inheritance, and the design size of a given object-oriented, multi-class program from the point of view of potential information flow.
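As a purely illustrative sketch of the kind of design-level metric the abstract describes, the toy example below counts how many class-to-class links in a design involve classes holding classified attributes; the metric name, the ClassDesign structure and the simple ratio are hypothetical and are not the authors' actual metric definitions:

    # Toy design-level security metric sketch (hypothetical, for illustration only).
    from dataclasses import dataclass, field

    @dataclass
    class ClassDesign:
        name: str
        classified_attributes: int = 0                      # attributes holding sensitive data
        associations: list = field(default_factory=list)    # names of coupled classes

    def critical_coupling_ratio(design: list) -> float:
        """Fraction of class-to-class links that touch at least one class
        holding classified data (higher = more potential information flow)."""
        critical = {c.name for c in design if c.classified_attributes > 0}
        links = [(c.name, other) for c in design for other in c.associations]
        if not links:
            return 0.0
        risky = [l for l in links if l[0] in critical or l[1] in critical]
        return len(risky) / len(links)

    if __name__ == "__main__":
        design = [
            ClassDesign("Account", classified_attributes=2, associations=["Logger", "Report"]),
            ClassDesign("Logger", associations=["Report"]),
            ClassDesign("Report"),
        ]
        print(f"critical coupling ratio: {critical_coupling_ratio(design):.2f}")

In the same spirit, analogous ratios could be formed over inheritance or composition links in a design.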
Abstract:
Gibson and Tarrant discuss the range of interdependent factors needed to manage organisational resilience. Over the last few years there has been considerable interest in the idea of resilience across all areas of society. Like any new area or field, this has produced a vast array of definitions, processes, management systems and measurement tools which together have clouded the concept of resilience. Many of us have forgotten that ultimately resilience is not just about ‘bouncing back from adversity’ but is more broadly concerned with adaptive capacity and how we better understand and address uncertainty in our internal and external environments. The basis of organisational resilience is a fundamental understanding and treatment of risk, particularly non-routine or disruption-related risk. This paper presents a number of conceptual models of organisational resilience that we have developed to demonstrate the range of interdependent factors that need to be considered in the management of such risk. These conceptual models illustrate that effective resilience is built upon a range of different strategies that enhance both ‘hard’ and ‘soft’ organisational capabilities. They emphasise the concept that there is no quick fix, no single process, management system or software application that will create resilience.
Abstract:
On 12 June 2006, the lights went out in New Zealand’s largest city and major commercial centre, Auckland. Business was disrupted and many thousands of people were inconvenienced. The unscheduled power cut was the latest in a series of electric power problems in New Zealand over the past decade. Attention turned to the state-owned enterprise (SOE) Transpower, which was in charge of maintaining and developing New Zealand’s national electricity grid. The problem of 12 June was traced to two shackles in poor condition, small but essential parts of the electricity grid infrastructure. Closer examination of New Zealand’s electricity sector indicated these shackles were merely the tip of a power supply iceberg. Transpower’s Chief Executive, Ralph Craven, was now answerable to the Prime Minister for the issues creating the problems and for a workable solution to fix them. He needed to produce answers that went well beyond the problem of the two faulty shackles. The power crisis had brought to the fore wider issues of roles, responsibilities, and expectations in relation to the supply of electric power in New Zealand. Transpower was contending with these issues on a daily basis; however, the incident on 12 June publicly highlighted the urgent need for solutions that served the stakeholders in this critical industry.
Abstract:
We report on a longitudinal research study of the development of novice programmers in their first semester of programming. In the third week, almost half of our sample of students could not answer an explain-in-plain-English question about code consisting of just three assignment statements, which swapped the values in two variables. We regard code that swaps the values of two variables as the simplest case in which a programming student can manifest a SOLO relational response. Our results demonstrate that the problems many students face with understanding code can begin very early, on relatively trivial code. However, using traditional programming exercises, these problems often go undetected until late in the semester. New approaches are required to detect and fix these problems earlier.
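For concreteness, a minimal form of the kind of three-assignment fragment referred to above (the language and variable names here are ours, not necessarily those used in the study) is:

    a, b = 7, 3   # illustrative starting values
    temp = a      # remember a's original value
    a = b         # overwrite a with b
    b = temp      # put a's original value into b
    # the three assignments above swap the two values: a == 3, b == 7

A relational ("explain in plain English") answer describes the net effect, i.e. that the values of the two variables are exchanged, rather than tracing each statement in isolation.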
Abstract:
We’ve had a bit of sticker shock in these parts. Well, apparently. Since my last missive, Brisbane’s Clem Jones Tunnel, which was initially free, now has a toll, at least partially, at the introductory rate of $2.95 for a one-way car ride between 5 a.m. and midnight (free overnight). From 9 May 2010 the toll will be $4.28. Since the introductory toll was introduced, use of the tunnel appears to have declined somewhat – no surprise to transport professionals, I suppose. An additional factor may have been that the “novelty value” of driving through the tunnel for free had worn off. This demonstrates to me that much of the community may still see the use of road infrastructure as a rite of passage, with only some actually weighing up the true value of their travel time and vehicle wear and tear against their out-of-pocket (or onto credit card) cost. Thus, we’re in pioneering times and the role of transport economics in the overall transport infrastructure planning realm is of considerable importance – especially as much of the new big-ticket infrastructure is likely to be tolled into the future. The Queensland Premier, Anna Bligh, made a poignant comment about Brisbane City Council’s tunnel, namely that such infrastructure is built for future times and not just as a quick fix for current traffic problems. My expectation is that once Airport Link, which is really the northern half of the corridor, opens in 2012, there will be a significant spike in Clem7 usage.
Abstract:
BACKGROUND: Treatment of proximal humerus fractures in elderly patients is challenging because of reduced bone quality. We determined the in vitro characteristics of a new implant developed to target the remaining bone stock, and compared it with an implant in clinical use. METHODS: Following osteotomy, left and right humeral pairs from cadavers were treated with either the ButtonFix or the Humerusblock fixation system. Implant stiffness was determined for three clinically relevant load cases: axial compression, torsion, and varus bending. In addition, a cyclic varus-bending test was performed. RESULTS: We found higher stiffness values for the humeri treated with the ButtonFix system, with almost a doubling of the compression, torsion, and bending stiffness values. Under dynamic loading, the ButtonFix system had superior stiffness and less K-wire migration compared to the Humerusblock system. INTERPRETATION: When compared to the Humerusblock design, the ButtonFix system showed superior biomechanical properties, both static and dynamic. It offers a minimally invasive alternative for the treatment of proximal humerus fractures.
Abstract:
This study aimed to clarify the relationship between the mechanical environment at the fracture site and endogenous fibroblast growth factor-2 (FGF-2). We compared two types of fracture healing with different callus formations and cellular events using MouseFix(TM) plate fixation systems for murine fracture models. Left femoral fractures were induced in 72 ten-week-old mice and then fixed with a flexible (Group F) or rigid (Group R) MouseFix(TM) plate. Mice were sacrificed on days 3, 5, 7, 10, 14, and 21. The callus volumes were measured by 3D micro-CT and tissues were histologically stained with hematoxylin & eosin or safranin-O. Sections from days 3, 5, and 7 were immunostained for FGF-2 and Proliferating Cell Nuclear Antigen (PCNA). The callus in Group F was significantly larger than that in Group R. The rigid plate allowed bone union without a marked external callus or chondrogenesis. The flexible plate formed a large external callus as a result of endochondral ossification. Fibroblastic cells in the granulation tissue on days 5 and 7 in Group F showed marked FGF-2 expression compared with Group R. Fibroblastic cells showed ongoing proliferation in the granulation tissue in Group F, as indicated by PCNA expression, which explained the relative increase in granulation tissue in Group F. There were major differences in early-phase endogenous FGF-2 expression between these two fracture healing processes, due to their different mechanical environments.
Abstract:
The processes used in Australian universities for reviewing the ethics of research projects are based on the traditions of research and practice from the medical and health sciences. The national guidelines for ethical conduct in research are heavily based on presumptions that the researcher–participant relationship is similar to a doctor–patient relationship. The National Health and Medical Research Council, Australian Research Council and Australian Vice-Chancellors’ Committee have made a laudable effort to fix this problem by releasing the National Statement on Ethical Conduct in Human Research in 2007, to replace the 1999 National Statement on Ethical Conduct in Research Involving Humans. The new statement better encompasses the needs of the humanities, social sciences and creative industries. However, this paper argues that the revised National Statement and ethical review processes within universities still do not fully encompass the definitions of ‘research’ and the requirements, traditions, codes of practice and standards of the humanities, social sciences and creative industries. The paper argues that scholars within these disciplines often lack the language to articulate their modes of practice and risk management strategies to university-level ethics committees. As a consequence, scholars from these disciplines may find their research is delayed or stymied. The paper focuses on creative industries researchers, and explores the issues that they face in managing the ethical review process, particularly when engaging in practice-based research. Although the focus is on the creative industries, the issues are relevant to most fields in the humanities and social sciences.
Abstract:
In spite of significant research into efficient algorithms for three-carrier ambiguity resolution, the full performance potential of the additional frequency signals cannot be demonstrated effectively without actual triple-frequency data. In addition, all the proposed algorithms have difficulty reliably resolving the medium-lane and narrow-lane ambiguities in different long-range scenarios. In this contribution, we investigate the effects of various distance-dependent biases, identifying the tropospheric delay as the key limitation for long-range three-carrier ambiguity resolution. In order to achieve reliable ambiguity resolution in regional networks with inter-station distances of hundreds of kilometers, a new geometry-free and ionosphere-free model is proposed to fix the integer ambiguities of the medium-lane or narrow-lane observables within just a few minutes, without distance constraints. Finally, a semi-simulation method is introduced to generate the third-frequency signals from dual-frequency GPS data and to experimentally demonstrate the research findings of this paper.
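For background, the standard conditions that make a three-carrier phase combination both geometry-free and ionosphere-free are sketched below (textbook GNSS relations under the usual first-order ionosphere assumption; the specific coefficients and model used in the paper are not reproduced here):

    % Phase observables L_i (in metres) on frequencies f_i, combined as
    % L_c = a_1 L_1 + a_2 L_2 + a_3 L_3:
    \begin{align*}
      \text{geometry-free (also cancels troposphere and clocks):}\quad & a_1 + a_2 + a_3 = 0,\\
      \text{ionosphere-free (first order):}\quad & \frac{a_1}{f_1^2} + \frac{a_2}{f_2^2} + \frac{a_3}{f_3^2} = 0,
    \end{align*}
    % leaving the combined ambiguity term a_1 \lambda_1 N_1 + a_2 \lambda_2 N_2 + a_3 \lambda_3 N_3
    % plus phase noise and multipath, so such a model is not limited by the
    % distance-dependent tropospheric delay.

With three frequencies both conditions can be satisfied simultaneously while a usable ambiguity term remains, which is the basic rationale behind geometry-free, ionosphere-free three-carrier models.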
Abstract:
In herbaceous ecosystems worldwide, biodiversity has been negatively impacted by changed grazing regimes and nutrient enrichment. Altered disturbance regimes are thought to favour invasive species that have high phenotypic plasticity, although most studies measure plasticity under controlled conditions in the greenhouse and then assume plasticity is an advantage in the field. Here, we compare trait plasticity in response to grazing and fertilizer between three co-occurring C4 perennial grass species, the invader Eragrostis curvula and the natives Eragrostis sororia and Aristida personata, in a three-year field trial. We measured abundances and several leaf traits known to correlate with the strategies plants use to fix carbon and acquire resources, i.e. specific leaf area (SLA), leaf dry matter content (LDMC), leaf nutrient concentrations (N, C:N, P), assimilation rates (Amax) and photosynthetic nitrogen use efficiency (PNUE). In the control treatment (grazed only), trait values for SLA, leaf C:N ratios, Amax and PNUE differed significantly between the three grass species. When trait values were compared across treatments, E. curvula showed higher trait plasticity than the native grasses, and this correlated with an increase in its abundance across all but the grazed/fertilized treatment. The native grasses showed little trait plasticity in response to the treatments. Aristida personata decreased significantly in the treatments where E. curvula increased, and E. sororia abundance increased, possibly due to increased rainfall rather than in response to the treatments or invader abundance. Overall, we found that plasticity did not favour an increase in abundance of E. curvula under the grazed/fertilized treatment, likely because its leaf nutrient contents, and subsequently its palatability to consumers, increased. E. curvula also displayed higher resource use efficiency than the native grasses. These findings suggest that resource conditions and disturbance regimes can be manipulated to limit the success of even highly plastic exotic species.
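For reference, the leaf traits listed above follow their standard ecological definitions (general background, not protocols or values specific to this study):

    \text{SLA} = \frac{\text{one-sided leaf area}}{\text{leaf dry mass}},\qquad
    \text{LDMC} = \frac{\text{leaf dry mass}}{\text{leaf fresh mass}},\qquad
    \text{PNUE} = \frac{A_{\max}}{\text{leaf N per unit leaf area}}.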
Abstract:
Following many important developments over the last century, the orthopedic bone plate now excels over other types of internal fixators in bone fracture fixation. These developments involve the design, materials and implementation techniques of the plates. This paper reviews the evolution of the implementation techniques and biomaterials of orthopedic bone plates. Plates were initially used to fix the underlying bones firmly; accordingly, the compression plate (CP), dynamic compression plate (DCP), limited contact dynamic compression plate (LC-DCP) and point contact fixator (PC-Fix) were developed. Later, the implementation approach was changed to locking, and the Less Invasive Stabilization System (LISS) plate was introduced as a result. Finally, a combination of both approaches has been used by introducing the Locking Compression Plate (LCP). Currently, precontoured LCPs are mainly used for bone fracture fixation. In parallel with structure and implementation techniques, numerous advances have occurred in the biomaterials of the plates. Titanium and stainless steel alloys are now the most common biomaterials in the production of orthopedic bone plates. However, given their biocompatibility, bioactivity and biodegradability, Mg alloys, Ta alloys, SMAs, carbon fiber composites and bioceramics are considered potentially suitable materials for plates, although their poor mechanical properties currently limit their applications. Therefore, further studies are required to solve these problems and make such materials feasible for heavy-duty bone plates.
Abstract:
Reliable ambiguity resolution (AR) is essential to Real-Time Kinematic (RTK) positioning and its applications, since incorrect ambiguity fixing can lead to severely biased positioning solutions. A partial ambiguity fixing technique is developed to improve the reliability of AR, involving partial ambiguity decorrelation (PAD) and partial ambiguity resolution (PAR). Decorrelation transformations can substantially amplify the biases in the phase measurements, so the purpose of PAD is to find the optimum trade-off between decorrelation and worst-case bias amplification. The concept of PAR refers to the case where only a subset of the ambiguities can be fixed correctly to their integers in the integer least-squares (ILS) estimation system at high success rates. RTK solutions can then be derived from these integer-fixed phase measurements, provided that the number of reliably resolved phase measurements is sufficiently large for least-squares estimation of the RTK solutions. Considering the GPS constellation alone, partially fixed measurements are often insufficient for positioning. The AR reliability is usually characterised by the AR success rate. In this contribution, an AR validation decision matrix is first introduced to understand the impact of the success rate, and the AR risk probability is included in a more complete evaluation of AR reliability. We use 16 ambiguity variance-covariance matrices with different levels of success rate to analyse the relation between success rate and AR risk probability. Next, the paper examines how, during the PAD process, a bias in one measurement is propagated and amplified onto many others, leading to more than one wrong integer and affecting the success probability. Furthermore, the paper proposes a partial ambiguity fixing procedure with a predefined success-rate criterion and a ratio test in the ambiguity validation process. In this paper, Galileo constellation data are tested with simulated observations. Numerical results from our experiment clearly demonstrate that only when the computed success rate is very high can AR validation provide decisions about the correctness of AR that are close to the real world, with both low AR risk and low false-alarm probabilities. The results also indicate that the PAR procedure can automatically choose an adequate number of ambiguities to fix, at a given high success rate, from the multiple constellations instead of fixing all the ambiguities. This is a benefit that multiple GNSS constellations can offer.
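As a minimal sketch of the general PAR idea only (the ordering heuristic, the 0.999 criterion and the use of the bootstrapping success-rate bound below are illustrative assumptions; the paper's own procedure is based on ILS with a ratio test), one might select the largest fixable subset of decorrelated ambiguities like this:

    # Sketch: keep only the subset of decorrelated ambiguities whose cumulative
    # bootstrapping success rate stays above a preset criterion.
    import numpy as np
    from scipy.stats import norm

    def bootstrapped_success_rate(cond_std):
        """Bootstrapping success-rate bound from the conditional standard
        deviations of the (decorrelated) ambiguities, in cycles."""
        s = np.asarray(cond_std, dtype=float)
        return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * s)) - 1.0))

    def choose_subset(cond_std, p_min=0.999):
        """Return indices of the most precise ambiguities that can be fixed
        while keeping the cumulative success rate above p_min."""
        order = np.argsort(cond_std)                      # most precise first
        for k in range(len(cond_std), 0, -1):
            if bootstrapped_success_rate(np.asarray(cond_std)[order[:k]]) >= p_min:
                return order[:k]
        return order[:0]                                  # fix none

    # Example: four decorrelated ambiguities with different conditional precisions
    cond_std = [0.05, 0.08, 0.12, 0.40]                   # cycles
    subset = choose_subset(cond_std)
    print("fix ambiguities:", subset,
          "success rate:", bootstrapped_success_rate(np.asarray(cond_std)[subset]))

With the illustrative values above, the least precise ambiguity is excluded and the remaining three are retained, which mirrors the idea of fixing an adequate subset rather than all ambiguities.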
Abstract:
This paper illustrates the complexity of pointing as it is employed in a design workshop. Using the method of interaction analysis, we argue that pointing is not merely employed to index, locate, or fix reference to an object. It also constitutes a practice for reestablishing intersubjectivity and solving interactional trouble such as misunderstandings or disagreements by virtue of enlisting something as part of the participants’ shared experience. We use this analysis to discuss implications for how such practices might be supported with computer mediation, arguing for a “bricolage” approach to systems development that emphasizes the provision of resources for users to collaboratively negotiate the accomplishment of intersubjectivity rather than systems that try to support pointing as a specific gestural action.