958 results for Computer-driven foot
Abstract:
Background From conservative estimates of registrants with the National Diabetes Services Scheme, Australia will soon pass 1.1 million people affected by all types of diabetes. The diabetes complications of foot ulceration and amputation are costly to all. These costs can be reduced with appropriate prevention strategies, starting with identifying people at risk through diabetic foot screening in primary care. Yet levels of diabetic foot screening in Australia are difficult to quantify. This presentation reports foot screening rates as recorded in the existing academic literature, national health surveys and national database reports. Methods Literature searches covered diabetic foot screening in the primary care setting for populations of over 2,000 people from 2002 to 2014. Searches were performed using Medline and CINAHL, as well as internet searches of the health databases of Organisation for Economic Co-operation and Development (OECD) countries. The focus is on type 1 and type 2 diabetes in adults, not gestational diabetes or children. The two primary outcome measures were foot screening rates as a percentage of the adult diabetic population and major lower-limb amputation incidence rates from standardised OECD data. Results The most recent and accurate Australian population-level review was the AusDiab (Australian Diabetes, Obesity and Lifestyle) study from 2004, which reported screening in primary care to be as low as 50%. Countries such as the United Kingdom and the United States of America have much higher reported rates of foot screening (67-86%), recorded using national databases and web-based initiatives that involve patients and clinicians. By comparison, the major amputation rate for Australia was similar to that of the United Kingdom (6.5 versus 5.1 per 100,000 population) but dissimilar to that of the United States of America (17 per 100,000 population). Conclusions Australian rates of diabetic foot screening in primary care remain ambiguous.
There is no direct relationship between foot screening levels in a primary care environment and major lower-limb amputation, based on national health surveys and OECD data. Uptake of national registers, incentives and web-based systems improves levels of diabetic foot assessment, the first step towards a healthier diabetic population.
Abstract:
The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies. It is driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in this context and took advantage of considerable recent achievements in genomics that are shaping our understanding of the DNA of humans and animals. The basic idea behind this project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the number of successful spawners, an important parameter in the process of fisheries stock assessment. The broad objectives of this study were to: (1) critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne) by (a) conducting comprehensive computer simulations and (b) analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus); (2) lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and (3) produce software for the calculation of Ne and make it widely available. The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some of them had recently been implemented with the latest statistical methods (e.g. the Bayesian framework of Berthier, Beaumont et al. 2002), while others had lower profiles (e.g. Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as a basis for simulation experiments to evaluate the performance of the methods with an individual-based model of a prawn population.
Following the guidelines suggested by the computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis with eight microsatellite loci in three successive spring spawning seasons in 2001, 2002 and 2003. As predicted by the simulations, the estimates had non-infinite upper confidence limits, which is a major achievement for the application of the method to a naturally occurring, short-generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1,000 individuals in two consecutive years, in contrast with the roughly 500,000 prawns participating in spawning. Because it is not possible to distinguish successful from non-successful spawners, we suggest a high level of protection for the entire spawning population. We interpret the gap between the numbers of successful and non-successful spawners as reflecting large variation in the number of offspring per family that survive: a large number of families have no surviving offspring, while a few have many. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, assuming the ratio between Ne and spawning population size has been previously calculated for that species. Alternatively, it can be a surrogate for recruitment, again assuming that the ratio between Ne and recruitment has been previously determined. The number of species that can be analysed in this way, however, is likely to be small because of species-specific life-history requirements that need to be satisfied for accuracy. The most universal approach would be to integrate Ne with spawning stock-recruitment models, so that these models become more accurate when applied to fisheries populations. A pathway to achieve this was established in this project, which we predict will significantly improve fisheries sustainability in the future.
Regardless of the success of integrating Ne into spawning stock-recruitment models, Ne could be used as a fisheries monitoring tool. Declines in spawning stock size, or increases in natural or harvest mortality, would be reflected by a decline in Ne. This would be valuable for data-poor fisheries and provides fishery-independent information; however, we suggest a species-by-species approach, as some species may be too numerous, or experiencing too much migration, for the method to work. During the project, two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with preliminary genetic data collected from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in genomics are rapid, and a cheaper, more reliable substitute for microsatellite loci is already available: digital data from single nucleotide polymorphisms (SNPs) are likely to supersede 'analogue' microsatellite data, making the method cheaper and easier to apply to species with large population sizes.
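The temporal (moments-based) approach underlying estimates of this kind can be sketched briefly. The fragment below is an illustration only, not the NeEstimator implementation, and the function names are hypothetical: it computes the standardised variance in allele frequencies (Fc) between two samples taken t generations apart and converts it to a point estimate of Ne after correcting for the sampling error of the two samples (in the spirit of Nei and Tajima and of Waples).

```python
# Minimal sketch of the temporal method for estimating effective
# population size (Ne) from allele-frequency drift between two samples
# taken t generations apart. Illustrative only, not NeEstimator.

def temporal_f(p0, p1):
    """Standardised variance in allele frequency (Fc) averaged over the
    alleles of one locus; p0, p1 are allele-frequency lists at the two
    sampling times."""
    terms = []
    for x, y in zip(p0, p1):
        denom = (x + y) / 2 - x * y
        if denom > 0:
            terms.append((x - y) ** 2 / denom)
    return sum(terms) / len(terms)

def ne_temporal(p0, p1, t, s0, s1):
    """Point estimate of Ne over t generations, correcting Fc for the
    sampling error of the two samples of s0 and s1 individuals."""
    fc = temporal_f(p0, p1)
    drift = fc - 1 / (2 * s0) - 1 / (2 * s1)
    # non-positive drift signal -> estimate is unbounded (infinite Ne)
    return float('inf') if drift <= 0 else t / (2 * drift)
```

For example, two alleles drifting from frequencies (0.6, 0.4) to (0.5, 0.5) in one generation, with 50 individuals in each sample, yield an estimate of Ne = 25; with no frequency change at all the drift signal vanishes and the estimate is unbounded, which is why the non-infinite upper confidence limits reported above matter.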
Abstract:
The system (1-x)PbTiO3-(x)BiAlO3 has been investigated with regard to its solid solubility, crystal structure, microstructure, and ferroelectric transition. The unit-cell volume and the tetragonality exhibit anomalous behavior near x=0.10. The Curie point (TC) of PbTiO3 was, however, found to be nearly unchanged. The study suggests that the decrease in the stability of the ferroelectric state due to dilution of the Ti sublattice by the smaller Al3+ ions is compensated by an increase in ferroelectric stability from the Bi3+ ions.
Abstract:
Background Surgery is an example of expanded scope of practice that enhances podiatry and incorporates inter-professional collaboration. By 2050, national demand for foot and ankle procedures is predicted to rise by 61.9%. Performance management of this increase motivated the development of an online audit tool. Developed in collaboration with the Australasian College of Podiatric Surgeons (ACPS), the ACPS audit tool provides real-time data capture and reporting. It is the first audit tool designed in Australia to support and improve the outcomes of foot and ankle surgery. Methods Audit activity in general, orthopaedic, plastic and podiatric surgery was examined using a case study design, and enablers of and barriers to audit participation were explored. The case study results guided a Delphi survey of international experts experienced in or associated with foot and ankle surgery. Consensus derived from the Delphi survey informed modification of a generic data set from the Royal Australasian College of Surgeons (RACS). Based on the Delphi survey findings, the ACPS online audit tool was developed and piloted. The reliability and validity of data entry and the usability of the new tool were then assessed with an online survey. Results The case study found that surgeon attitudes and behaviours positively impacted audit participation, and indicated that audit data should be: (1) available in real time; (2) able to identify practice change; (3) applicable to safety and quality management; and (4) useful for peer-review discussion. The Delphi process established consensus on the audit variables to be captured, including the modified RACS generic data set. A total of 382 cases of foot and ankle surgery were captured over 3 months using the new tool. Data entry was found to be valid and reliable. Real-time outcome reporting and identification of practice change impacted positively on safety and quality management and assisted peer-review discussion. An online survey showed high levels of usability.
Conclusions Surgeon contribution to the development of the audit tool resulted in 100% audit participation. Data from the ACPS audit tool supported the ACPS submission to the Medical Services Advisory Committee to list podiatric surgery under Medicare, an outcome noted by the Federal Minister for Health.
Abstract:
The leader protease (L-pro) and capsid-coding sequences (P1) constitute approximately 3 kb of the foot-and-mouth disease virus (FMDV) genome. We studied the phylogenetic relationships of 46 FMDV serotype A isolates of Indian origin collected during the period 1968-2005, together with eight vaccine strains, using neighbour-joining and Bayesian tree methods. The viruses fell into three major groups - Asian, Euro-South American and European. The Indian isolates formed a distinct genetic group among the Asian isolates and were further classified into different genetic subgroups (<5% divergence). Post-1995 isolates were divided into two subgroups, while a few isolates originating in 2005 from Andhra Pradesh formed a separate group closely related to the isolates of the 1970s. The FMDV isolates appear to undergo reverse mutation or convergent evolution, wherein sequences identical to those of the ancestors are present in isolates in circulation. The eight vaccine strains included in the study were not related to each other and belonged to different genetic groups. Recombination was detected in the L-pro region in one isolate (A IND 20/82) and in the VP1-coding 1D region in another isolate (A RAJ 21/96). Positive selection was identified at amino acid position 23 in the L-pro (P<0.05; 0.046*) and at position 171 in the capsid protein VP1 (P<0.01; 0.003**).
Abstract:
Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory, yet in other situations where people are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is the retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve the products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows the storage of large representations for long periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking.
In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, such as temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval-cue encodings; thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. Another study demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed the task allows, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid, visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized, because access is too slow. These findings imply a change in thinking about the design of interfaces. Several novel design principles are presented, based on the idea of supporting the deployment of LTWM in the main task.
Abstract:
The present study examined how personality and social psychological factors affect third and fourth graders' computer-mediated communication. Personality was analysed in terms of the following strategies: optimism, pessimism and defensive pessimism. Students worked either individually or in dyads that were paired homogeneously or heterogeneously according to these strategies. The study also compared horizontal and vertical interaction and examined the role of popularity, with students divided into groups based on their popularity level. The results show that an optimistic strategy is useful: optimism was found to be related to the active production and processing of ideas. Although previous research has identified drawbacks to pessimism in achievement settings, this study shows that the pessimistic strategy is not as debilitating as is usually assumed. Pessimistic students were able to process their ideas, though defensive pessimists were somewhat cautious in introducing or changing ideas. Heterogeneous dyads were not beneficial configurations with respect to producing, introducing, or changing ideas. Moreover, many differences were found between horizontal and vertical interaction; specifically, the students expressed more opinions and feelings when teachers took no part in the discussions, and strong emotions were observed especially in the horizontal interaction. Further, group-working skills were found to be more important for boys than for girls, while rejected students were not at a disadvantage compared to popular ones. Schools can encourage emotional and social learning: the present study shows that students can use computers to express their feelings, and that students who are unpopular in non-computer contexts or who use pessimism can benefit from computers. Participation in computer discussions can give unpopular children a chance to develop confidence when relating to peers.
Abstract:
New Internet and Web-based technology applications have meant significant cost and time efficiencies for many American businesses. However, many employers have not yet fully grasped the impact of these new information and communication technologies on applicants and employees with certain disabilities, such as vision impairments, hearing problems or limited dexterity. Although not all applicants and employees who have a disability experience IT-access problems, for certain groups these technologies can pose a needless barrier. The increasing dominance of IT in the workplace presents both a challenge and an opportunity for workers with disabilities and their employers. It will be up to HR professionals to ensure that Web-based HR processes and workplace technologies are accessible to their employees with disabilities.
Abstract:
The nitrogen-driven trade-off between nitrogen utilisation efficiency (yield per unit nitrogen uptake) and water use efficiency (yield per unit evapotranspiration) is widespread and results from well-established, multiple effects of nitrogen availability on the water, carbon and nitrogen economy of crops. Here we used a crop model (APSIM) to simulate the yield, evapotranspiration, soil evaporation and nitrogen uptake of wheat, and analysed yield responses to water, nitrogen and climate using a framework analogous to the rate-duration model of determinate growth. The relationship between modelled grain yield (Y) and evapotranspiration (ET) was fitted to a linear-plateau function to derive three parameters: maximum yield (Ymax), the ET break-point at which yield reaches its maximum (ET#), and the rate of yield response in the linear phase (ΔY/ΔET). Against this framework, we tested the hypothesis that nitrogen deficit reduces maximum yield by reducing both the rate (ΔY/ΔET) and the range of yield response to evapotranspiration, i.e. ET# - Es, where Es is modelled median soil evaporation. Modelled data reproduced the nitrogen-driven trade-off between nitrogen utilisation efficiency and water use efficiency in a transect from Horsham (36°S) to Emerald (23°S) in eastern Australia. Increasing nitrogen supply from 50 to 250 kg N ha-1 reduced yield per unit nitrogen uptake from 29 to 12 kg grain kg-1 N and increased yield per unit evapotranspiration from 6 to 15 kg grain ha-1 mm-1 at Emerald. The same increment in nitrogen supply reduced yield per unit nitrogen uptake from 30 to 25 kg grain kg-1 N and increased yield per unit evapotranspiration from 6 to 25 kg grain ha-1 mm-1 at Horsham. Maximum yield ranged from 0.9 to 6.4 t ha-1. Consistent with our working hypothesis, reductions in maximum yield with nitrogen deficit were associated with both a reduction in the rate of yield response to ET and a compression of the range of yield response to ET.
Against the notion of managing crops to maximise water use efficiency in low rainfall environments, we emphasise the trade-off between water use efficiency and nitrogen utilisation efficiency, particularly under conditions of high nitrogen-to-grain price ratio. The rate-range framework to characterise the relationship between yield and evapotranspiration is useful to capture this trade-off as the parameters were responsive to both nitrogen supply and climatic factors.
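The rate-range framework above lends itself to a simple numerical sketch. The snippet below is an illustrative assumption, not APSIM code: it fits a linear-plateau function to (ET, Y) pairs by grid search over candidate break-points, recovering the slope ΔY/ΔET, the break-point ET#, and the plateau Ymax. Function and parameter names are hypothetical.

```python
import numpy as np

# Hedged sketch of fitting the linear-plateau relationship between
# yield (Y) and evapotranspiration (ET): Y rises linearly at rate
# dY/dET up to a break-point ET#, then is flat at Ymax.
# The fitting approach is illustrative, not the authors' method.

def linear_plateau(et, slope, et_break, y_max):
    """Linear up to the break-point, then a plateau at y_max."""
    intercept = y_max - slope * et_break
    return np.minimum(slope * et + intercept, y_max)

def fit_linear_plateau(et, y):
    """Crude grid-search fit returning (slope, ET#, Ymax)."""
    best = None
    for et_break in np.linspace(et.min(), et.max(), 201):
        below = et <= et_break
        if below.sum() < 2 or (~below).sum() < 1:
            continue
        y_max = y[~below].mean()                 # plateau estimate
        # least-squares slope of the linear segment, forced through
        # the point (et_break, y_max)
        dx, dy = et[below] - et_break, y[below] - y_max
        slope = (dx * dy).sum() / (dx * dx).sum()
        resid = y - linear_plateau(et, slope, et_break, y_max)
        sse = (resid ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, slope, et_break, y_max)
    return best[1], best[2], best[3]
```

On noise-free synthetic data generated from known parameters, the grid search recovers the slope, break-point and plateau, which is the sense in which the three parameters summarise a Y-ET response curve.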
Abstract:
Polytypes have been simulated, treating them as analogues of a one-dimensional spin-half Ising chain with competing short-range and infinite-range interactions. Short-range interactions are treated as random variables to approximate conditions of growth from melt as well as from vapour. Besides ordered polytypes up to 12R, short stretches of long-period polytypes (up to 33R) have been observed. Such long-period sequences could be of significance in the context of Frank's theory of polytypism. The form of short-range interactions employed in the study has been justified by carrying out model potential calculations.
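A minimal Metropolis sketch of such a chain is shown below. The Hamiltonian form (a nearest-neighbour coupling J1 plus a 1/N-scaled infinite-range coupling Jinf) and all parameter choices are assumptions for illustration; the published simulations additionally treated the short-range interactions as random variables and mapped spin sequences onto stacking orders, which is not reproduced here.

```python
import random, math

# Illustrative Metropolis simulation of a spin-half Ising chain with a
# competing nearest-neighbour coupling (j1) and an infinite-range,
# mean-field-like coupling (jinf) scaled by 1/N, the kind of model
# mapped onto polytype stacking sequences (spin +1/-1 ~ layer
# orientation). Hamiltonian form and parameters are assumptions.

def energy(spins, j1, jinf):
    n = len(spins)
    e_sr = -j1 * sum(spins[i] * spins[(i + 1) % n] for i in range(n))
    m = sum(spins)
    e_lr = -jinf * m * m / n       # infinite-range term, 1/N scaling
    return e_sr + e_lr

def metropolis(spins, j1, jinf, beta, sweeps, rng=random):
    """Single-spin-flip Metropolis dynamics; recomputes the full energy
    per trial flip for clarity (O(N) per flip, fine for a sketch)."""
    n = len(spins)
    e = energy(spins, j1, jinf)
    for _ in range(sweeps * n):
        i = rng.randrange(n)
        spins[i] = -spins[i]                     # trial flip
        e_new = energy(spins, j1, jinf)
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            e = e_new                            # accept
        else:
            spins[i] = -spins[i]                 # reject
    return spins
```

At low temperature (large beta) with a ferromagnetic j1 the chain settles into a uniform stacking; competition between the two couplings is what produces the longer-period sequences discussed above.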
Abstract:
A hot billet in contact with relatively cold dies undergoes rapid cooling in the forging operation. This may give rise to unfilled cavities, poor surface finish and stalling of the press. A knowledge of billet-die temperatures as a function of time is therefore essential for process design. A computer code using the finite-difference method is written to estimate such temperature histories and validated by comparing the predicted cooling of an integral die-billet configuration with that obtained experimentally.
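A minimal sketch of the kind of explicit finite-difference calculation involved is given below, assuming a one-dimensional slab with one face held at the die temperature and the other face insulated. The material properties, grid, and single-material simplification are illustrative assumptions, not the paper's code.

```python
# Hedged sketch: explicit (FTCS) finite-difference estimate of the
# temperature history of a hot billet cooling against a die, reduced to
# a 1-D slab. One face is held at the die temperature (Dirichlet), the
# far face is insulated (zero flux). All values are illustrative.

def cool_billet(t_billet=1100.0, t_die=300.0, alpha=1.2e-5,
                length=0.05, nx=21, dt=0.05, steps=400):
    dx = length / (nx - 1)
    r = alpha * dt / dx ** 2
    # explicit scheme is stable only for r <= 0.5
    assert r <= 0.5, "explicit scheme unstable: reduce dt or coarsen dx"
    temp = [t_billet] * nx
    temp[0] = t_die                      # die-contact face
    for _ in range(steps):
        new = temp[:]
        for i in range(1, nx - 1):
            new[i] = temp[i] + r * (temp[i+1] - 2 * temp[i] + temp[i-1])
        new[-1] = new[-2]                # insulated far face
        temp = new
    return temp                          # temperature profile after steps*dt
```

Running the sketch shows the expected behaviour: the node next to the die cools first and the profile stays monotone from the die face inward, which is the temperature-history information the process designer needs.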
Abstract:
Using the link-link incidence matrix to represent a simple-jointed kinematic chain, algebraic procedures have been developed to determine its structural characteristics, such as the type of freedom of the chain and the number of distinct mechanisms and driving mechanisms that can be derived from the chain. A computer program incorporating these graph-theory-based procedures has been applied successfully to the structural analysis of several typical chains.
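Two of the chain characteristics mentioned, the number of simple joints and the degree of freedom, can be read directly off the link-link adjacency matrix, as the sketch below illustrates using the standard planar Gruebler/Kutzbach criterion. This is not the paper's program, and the function name is hypothetical.

```python
# Hedged sketch: basic structural characteristics of a planar
# simple-jointed kinematic chain from its link-link adjacency matrix
# (adj[i][j] = 1 where links i and j share a joint). Each edge of the
# graph is a single-degree-of-freedom joint.

def chain_properties(adj):
    n = len(adj)                                   # number of links
    joints = sum(adj[i][j] for i in range(n)
                 for j in range(i + 1, n))         # edges = simple joints
    dof = 3 * (n - 1) - 2 * joints                 # planar Gruebler/Kutzbach
    return {"links": n, "joints": joints, "dof": dof}
```

For a four-bar chain (four links joined in a cycle) the criterion gives 3(4-1) - 2(4) = 1 degree of freedom; counting the non-isomorphic choices of fixed link, as the procedures in the paper do, then gives the number of distinct mechanisms derivable from the chain.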
Abstract:
It is shown that a leaky aquifer model can be used for well-field analysis in hard-rock areas, treating the upper weathered and clayey layers as a composite unconfined aquitard overlying a deeper fractured aquifer. Two long-duration pump-test studies are reported from granitic and schist regions in the Vedavati river basin. The validity of simplifications in the analytical solution is verified by finite-difference computations.
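For context, the standard analytical solution for flow to a well in a leaky aquifer involves the Hantush well function W(u, r/B); the sketch below evaluates it by composite Simpson integration. The numerical choices (integration cutoff and step count) are assumptions, and this is not the code used in the study. With r/B = 0 it reduces to the Theis well function W(u) = E1(u).

```python
import math

# Hedged sketch of the leaky-aquifer (Hantush) well function
#   W(u, r/B) = integral_u^inf exp(-y - (r/B)^2 / (4y)) / y dy
# evaluated by composite Simpson's rule. Cutoff and step count are
# illustrative assumptions; the integrand is negligible beyond u+cutoff.

def well_function(u, r_over_b=0.0, cutoff=40.0, n=4000):
    a, b = u, u + cutoff
    h = (b - a) / n

    def f(y):
        return math.exp(-y - (r_over_b ** 2) / (4.0 * y)) / y

    total = f(a) + f(b)
    for k in range(1, n):                  # Simpson weights 4, 2, 4, ...
        total += (4 if k % 2 else 2) * f(a + k * h)
    return total * h / 3.0

def drawdown(q, transmissivity, u, r_over_b=0.0):
    """Drawdown s = Q / (4*pi*T) * W(u, r/B)."""
    return q / (4.0 * math.pi * transmissivity) * well_function(u, r_over_b)
```

Leakage through the aquitard (r/B > 0) reduces W, and hence the predicted drawdown, relative to the confined Theis case, which is the qualitative behaviour the pump-test analyses exploit.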