958 results for Computer-driven foot
Abstract:
The complex interaction of the bones of the foot has been explored in detail in recent years, leading the biomechanics community to acknowledge that the foot can no longer be considered a single rigid segment. With the advance of motion analysis technology it has become possible to quantify the biomechanics of the simplified units, or segments, that make up the foot. Advances in technology, coupled with falling hardware prices, have resulted in the uptake of more advanced tools for clinical gait analysis. The increased use of these techniques in clinical practice requires defined standards for modelling and reporting foot and ankle kinematics. This systematic review aims to provide a critical appraisal of commonly used foot and ankle marker sets designed to assess kinematics, and thus a theoretical background for the development of modelling standards.
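To illustrate what multi-segment foot models compute: an inter-segment angle is just the angle between two marker-defined vectors. A minimal sketch with hypothetical marker positions (real marker sets define segments and reference frames much more carefully):

```python
import numpy as np

def joint_angle(a1, a2, b1, b2):
    """Angle (degrees) between segment vector a (a1->a2) and
    segment vector b (b1->b2), each defined by two marker positions."""
    va = np.asarray(a2, float) - np.asarray(a1, float)
    vb = np.asarray(b2, float) - np.asarray(b1, float)
    cos_t = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
    # clip guards against rounding pushing |cos| fractionally above 1
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Hypothetical hindfoot and forefoot marker positions (metres)
hindfoot = ((0.00, 0.0, 0.0), (0.05, 0.0, 0.0))
forefoot = ((0.05, 0.0, 0.0), (0.10, 0.01, 0.0))
angle = joint_angle(*hindfoot, *forefoot)
```

In a real gait analysis this would be evaluated frame-by-frame over the stance phase, in anatomically defined reference frames rather than raw lab coordinates.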
Abstract:
Footwear is designed to reduce injury and enhance performance. However, the effect footwear has on foot and ankle kinematics remains unknown. Acknowledging the need for improved understanding, the aim of this study was to describe the effect of footwear on the kinematics of a multi-segment foot during the stance phase of walking gait.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
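The design-level metrics themselves are not defined in this abstract, but their flavour can be sketched. Below is a toy metric, entirely hypothetical and not the thesis's actual definition, that scores a class design by the fraction of high-security ("classified") attributes left unencapsulated, in the spirit of combining data encapsulation with high/low-security labelling:

```python
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    classified: bool   # carries high-security data
    private: bool      # encapsulated (not publicly accessible)

@dataclass
class ClassDesign:
    name: str
    attributes: list

def classified_attribute_exposure(cls):
    """Fraction of classified attributes that are NOT private.
    Lower is better: 0.0 means every secret is encapsulated."""
    secret = [a for a in cls.attributes if a.classified]
    if not secret:
        return 0.0
    exposed = [a for a in secret if not a.private]
    return len(exposed) / len(secret)
```

A metric like this can be computed from UML design artifacts alone, which is what allows security defects to be flagged before any code is written.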
Abstract:
At St Thomas' Hospital, we have developed a computer program on a Titan graphics supercomputer to plan the stereotactic implantation of iodine-125 seeds for the palliative treatment of recurrent malignant gliomas. Use of the Gill-Thomas-Cosman relocatable frame allows planning and surgery to be carried out at different hospitals on different days. Stereotactic computed tomography (CT) and positron emission tomography (PET) scans are performed and the images transferred to the planning computer. The head, tumour and frame fiducials are outlined on the relevant images, and a three-dimensional model generated. Structures which could interfere with the surgery or radiotherapy, such as major vessels, shunt tubing etc., can also be outlined and included in the display. Catheter target and entry points are set using a three-dimensional cursor controlled by a set of dials attached to the computer. The program calculates and displays the radiation dose distribution within the target volume for various catheter and seed arrangements. The CT co-ordinates of the fiducial rods are used to convert catheter co-ordinates from CT space to frame space and to calculate the catheter insertion angles and depths. The surgically implanted catheters are after-loaded the next day and the seeds left in place for between 4 and 6 days, giving a nominal dose of 50 Gy to the edge of the target volume. To date, 25 patients have been treated.
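The CT-to-frame conversion described above is, in essence, a rigid registration between fiducial positions known in both coordinate spaces. A minimal sketch using the standard Kabsch/SVD method (an assumption for illustration; the abstract does not state which algorithm the planning program uses):

```python
import numpy as np

def fit_rigid_transform(ct_pts, frame_pts):
    """Least-squares rigid transform (R, t) mapping CT-space fiducial
    points onto frame-space points, via the Kabsch/SVD method.
    Afterwards, any CT-space point p maps to frame space as R @ p + t."""
    P = np.asarray(ct_pts, float)
    Q = np.asarray(frame_pts, float)
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # proper rotation (no reflection)
    t = qc - R @ pc
    return R, t
```

With the transform fitted from the fiducial rods, catheter target and entry points digitised in CT space can be expressed in frame space, from which insertion angles and depths follow.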
Abstract:
This series of research vignettes is aimed at sharing current and interesting research findings from our team and other international researchers. In this vignette, Dr Martie-Louise Verreynne from the University of Queensland Business School summarises the findings from a paper written in conjunction with Sarel Gronum and Tim Kastelle from the UQ Business School that examined whether networking really contributes to small firms' bottom line. Their findings show that unless networks are used for productive means, efforts to cultivate and maintain them may be wasteful.
Abstract:
The integration of unmanned aircraft into civil airspace is a complex issue. One key question is whether unmanned aircraft can operate just as safely as their manned counterparts. The absence of a human pilot on board means unmanned aircraft lack an inherent see-and-avoid capability. To date, regulators have mandated that an “equivalent level of safety” be demonstrated before UAVs are permitted to routinely operate in civil airspace. This chapter proposes techniques, methods, and hardware integrations that describe a “sense-and-avoid” system designed to address the lack of a see-and-avoid capability in UAVs.
Abstract:
STUDY DESIGN: Controlled laboratory study. OBJECTIVES: To investigate the reliability and concurrent validity of photographic measurements of hallux valgus angle compared to radiographs as the criterion standard. BACKGROUND: Clinical assessment of hallux valgus involves measuring alignment between the first toe and metatarsal on weight-bearing radiographs or visually grading the severity of deformity with categorical scales. Digital photographs offer a noninvasive method of measuring deformity on an exact scale; however, the validity of this technique has not previously been established. METHODS: Thirty-eight subjects (30 female, 8 male) were examined (76 feet, 54 with hallux valgus). Computer software was used to measure hallux valgus angle from digital records of bilateral weight-bearing dorsoplantar foot radiographs and photographs. One examiner measured 76 feet on 2 occasions 2 weeks apart, and a second examiner measured 40 feet on a single occasion. Reliability was investigated by intraclass correlation coefficients and validity by 95% limits of agreement. The Pearson correlation coefficient was also calculated. RESULTS: Intrarater and interrater reliability were very high (intraclass correlation coefficients greater than 0.96) and 95% limits of agreement between photographic and radiographic measurements were acceptable. Measurements from photographs and radiographs were also highly correlated (Pearson r = 0.96). CONCLUSIONS: Digital photographic measurements of hallux valgus angle are reliable and have acceptable validity compared to weight-bearing radiographs. This method provides a convenient and precise tool in assessment of hallux valgus, while avoiding the cost and radiation exposure associated with radiographs.
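The 95% limits of agreement used above to validate the photographic against the radiographic angles follow the Bland-Altman method: mean difference between the two methods plus or minus 1.96 standard deviations of the differences. A minimal sketch, with hypothetical angle data:

```python
import numpy as np

def limits_of_agreement(method_a, method_b):
    """Bland-Altman 95% limits of agreement between two measurement
    methods: bias +/- 1.96 * SD of the paired differences."""
    d = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = d.mean()
    sd = d.std(ddof=1)   # sample SD of differences
    return bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical hallux valgus angles (degrees): photo vs radiograph
photo = [22.0, 15.5, 31.0, 12.0, 27.5]
xray  = [23.0, 14.5, 30.0, 13.0, 28.0]
lo, hi = limits_of_agreement(photo, xray)
```

If the interval (lo, hi) is narrower than the smallest clinically important difference, the two methods can be used interchangeably, which is the sense in which the study judged agreement "acceptable".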
Abstract:
This chapter deals with technical aspects of how USDL service descriptions can be read from and written to different representations for use by humans and tools. A combination of techniques for representing and exchanging USDL have been drawn from Model-Driven Engineering and Semantic Web technologies. The USDL language's structural definition is specified as a MOF meta-model, but some modules were originally defined using the OWL language from the Semantic Web community and translated to the meta-model format. We begin with the important topic of serializing USDL descriptions into XML, so that they can be exchanged between editors, repositories, and other tools. The following topic is how USDL can be made available through the Semantic Web as a network of linked data, connected via URIs. Finally, consideration is given to human-readable representations of USDL descriptions, and how they can be generated, in large part, from the contents of a stored USDL model.
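Model-to-XML serialization of the kind described can be sketched with a drastically simplified, entirely hypothetical service description (the real USDL serialization is driven by its MOF meta-model and is far richer):

```python
import xml.etree.ElementTree as ET

def service_to_xml(name, provider, price):
    """Serialize a toy service description to an XML string so it can
    be exchanged between tools. Element names here are illustrative
    only, not the actual USDL XML vocabulary."""
    root = ET.Element("service", attrib={"name": name})
    ET.SubElement(root, "provider").text = provider
    ET.SubElement(root, "price").text = str(price)
    return ET.tostring(root, encoding="unicode")

def service_from_xml(doc):
    """Round-trip: parse the XML back into a plain dict."""
    root = ET.fromstring(doc)
    return {"name": root.get("name"),
            "provider": root.findtext("provider"),
            "price": float(root.findtext("price"))}
```

The same model could equally be emitted as RDF triples for the linked-data representation the chapter goes on to discuss, with URIs taking the place of the nested elements.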
Abstract:
Objective Factors associated with the development of hallux valgus (HV) are multifactorial and remain unclear. The objective of this systematic review and meta-analysis was to investigate characteristics of foot structure and footwear associated with HV. Design Electronic databases (Medline, Embase, and CINAHL) were searched to December 2010. Cross-sectional studies with a valid definition of HV and a non-HV comparison group were included. Two independent investigators quality rated all included papers. Effect sizes and 95% confidence intervals (CIs) were calculated (standardized mean differences (SMDs) for continuous data and risk ratios (RRs) for dichotomous data). Where studies were homogeneous, pooling of SMDs was conducted using random effects models. Results A total of 37 papers (34 unique studies) were quality rated. After exclusion of studies without reported measurement reliability for associated factors, data were extracted and analysed from 16 studies reporting results for 45 different factors. Significant factors included: greater first intermetatarsal angle (pooled SMD = 1.5, CI: 0.88–2.1), longer first metatarsal (pooled SMD = 1.0, CI: 0.48–1.6), round first metatarsal head (RR: 3.1–5.4), and lateral sesamoid displacement (RR: 5.1–5.5). Results for clinical factors (e.g., first ray mobility, pes planus, footwear) were less conclusive regarding their association with HV. Conclusions Although conclusions regarding causality cannot be made from cross-sectional studies, this systematic review highlights important factors to monitor in HV assessment and management. Further studies with rigorous methodology are warranted to investigate clinical factors associated with HV.
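Random-effects pooling of SMDs, as performed above, is commonly done with the DerSimonian-Laird estimator; the sketch below assumes that estimator, which the abstract does not name explicitly:

```python
import numpy as np

def dersimonian_laird(smd, var):
    """Random-effects pooled SMD and 95% CI from per-study
    standardized mean differences and their variances."""
    smd = np.asarray(smd, float)
    var = np.asarray(var, float)
    w = 1.0 / var                            # fixed-effect weights
    mu_fe = np.sum(w * smd) / np.sum(w)
    Q = np.sum(w * (smd - mu_fe) ** 2)       # Cochran's Q (heterogeneity)
    df = len(smd) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)            # between-study variance
    w_re = 1.0 / (var + tau2)                # random-effects weights
    mu = np.sum(w_re * smd) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, (mu - 1.96 * se, mu + 1.96 * se)
```

When the studies are homogeneous (Q no larger than its degrees of freedom), tau-squared collapses to zero and the estimate reduces to the fixed-effect result, which is why pooling was restricted to homogeneous study groups.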
Abstract:
Strategic renewal has received relatively little attention in the context of new ventures. We examine the relationship among strategic renewal, competitive advantage and performance in opportunity-driven and conservative new ventures. Based on longitudinal data from a random sample of 373 new ventures, the link between strategic renewal and performance can be better understood by adding the mediating role of competitive advantage. Our results indicate that increased levels of strategic renewal positively relate to competitive advantage in conservative ventures, but not in opportunity-driven ventures. These findings place a different perspective on the dominant view that entrepreneurs should be opportunity maximizers. They suggest that both conservative and opportunity-driven new ventures can be successful if they follow different paths of strategic renewal in shaping competitive advantage.
Abstract:
A breaker restrike is an abnormal arcing phenomenon that can lead to breaker failure. Eventually, this failure interrupts the transmission and distribution of the electricity supply system until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on measuring and interpreting restrikes produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008 a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the radiometric measurement method is limited by its band-limited frequency response and by limitations in amplitude determination. Current restrike detection methods and algorithms require wide-bandwidth current transformers and high-voltage dividers. A restrike switch model using the Alternative Transient Program (ATP) and Wavelet Transforms, which supports diagnostics, is proposed. Restrike phenomena thus become the basis of a new diagnostic process using measurements, ATP and Wavelet Transforms for online interrupter monitoring. This research project investigates restrike switch model parameter 'A', the dielectric voltage gradient, related to normal and slowed cases of contact opening velocity and the escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load, at current quenching or chopping a transient voltage is developed across the contact gap. The dielectric strength of the gap should rise quickly enough to withstand this transient voltage. If it does not, the gap will flash over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap, i.e. the points at which the gap voltage has reached a value exceeding the dielectric strength of the gap.
This research shows that a change in the opening contact velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope. To investigate the diagnostic process, the ATP restrike switch model was modified with a contact opening velocity computation for restrike waveform signature analyses, along with experimental investigations. This also enhanced a mathematical CB model with an empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if the measured and simulated waveforms show similar restrike waveform signatures. The restrike switch model is applied to: computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using Wavelet Transforms. A simulated high-frequency nozzle current magnitude was applied to an equation (derived from the literature) that calculates the life extension of the interrupter of an SF6 high-voltage CB. The restrike waveform signatures for medium- and high-voltage CBs identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using Matlab software for automatic detection. An experimental investigation of 12 kV vacuum CB diagnostics was carried out for parameter determination, and a passive antenna calibration was also successfully developed with applications for field implementation.
The degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the drop in voltage is related to the measured slow opening velocity of the mechanism, giving a degree of contact degradation. A predictive interpretation technique is a computer-modelling approach to assessing switching-device performance that allows one to vary a single parameter at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis' outcome is a non-intrusive method, developed using measurements, ATP and Wavelet Transforms, to predict and interpret breaker restrike risk. Measurements on high-voltage circuit-breakers can identify degradation that could interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for monitoring restrike phenomena developed by this research will form part of a diagnostic process valuable for detecting breaker stresses relating to interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
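Wavelet-based detection of restrike transients, of the kind the detection algorithm above relies on, can be illustrated with a single-level Haar transform: an abrupt high-frequency event produces an outsized detail coefficient against the smooth background waveform. This toy detector is only a sketch of the idea, not the thesis's algorithm:

```python
import numpy as np

def haar_detail(signal):
    """Single-level Haar wavelet detail coefficients: scaled
    differences of adjacent sample pairs. Abrupt events (e.g. a
    restrike transient) show up as large-magnitude coefficients."""
    s = np.asarray(signal, float)
    s = s[: len(s) // 2 * 2]                 # trim to even length
    return (s[0::2] - s[1::2]) / np.sqrt(2.0)

def detect_transients(signal, k=5.0):
    """Coefficient indices where the detail magnitude exceeds k times
    the median detail level of the whole record."""
    d = np.abs(haar_detail(signal))
    thresh = k * (np.median(d) + 1e-12)      # epsilon guards a flat signal
    return np.nonzero(d > thresh)[0]

# Hypothetical gap-voltage record: a smooth waveform with one injected
# restrike-like spike at sample 100 (detail coefficient index 50)
t = np.linspace(0.0, 1.0, 512, endpoint=False)
sig = np.sin(2 * np.pi * 10 * t)
sig[100] += 10.0
events = detect_transients(sig)
```

Practical schemes decompose over several wavelet levels and tune the threshold to the measurement noise floor, but the principle, thresholding detail coefficients to localise fast transients in time, is the same.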