998 results for Could computing


Relevance:

20.00%

Publisher:

Abstract:

Background The evaluation of hand function is an essential element of clinical practice. The usual assessments focus on the ability to perform activities of daily living. The inclusion of instruments that measure kinematic variables provides a new approach to assessment, and inertial sensors adapted to the hand could be used as a complement to the traditional assessment. Material Clinimetric assessments (Upper Limb Functional Index, QuickDASH), anthropometric variables (height and weight) and dynamometry (palm pressure) were taken. Functional analysis of the right hand was performed with the Acceleglove system and a computer system. The glove has six acceleration sensors, one on each finger and another on the back of the hand. Method Analytic, transversal approach. Ten healthy subjects performed six tasks on an evaluation table (tripod pinch, lateral pinch, tip pinch, extension grip, spherical grip and power grip). Each task was performed and measured three times; the second repetition was analysed for the results section. A Matlab script was created to analyse each movement and detect its phases based on the acceleration vector module. Results The acceleration vector module offers useful information about hand function. Analysis of the data obtained during the performance of functional gestures allows five different phases to be identified within the movement, three static and two dynamic, and each module-vector profile was associated with one task. Conclusion Module-vector variables could be used to analyse the different tasks performed by the hand, and inertial sensors could be used as a complement to traditional assessment systems.
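The phase-detection idea above can be sketched briefly. This is a hypothetical illustration, not the authors' Matlab script: the threshold, the gravity-normalised units and the static/dynamic labelling rule are all assumptions.

```python
import numpy as np

def module_vector(ax, ay, az):
    """Magnitude (module) of the 3-axis acceleration signal."""
    return np.sqrt(ax**2 + ay**2 + az**2)

def detect_phases(module, threshold=0.05):
    """Label each sample 'static' or 'dynamic' by comparing the
    deviation of the module from 1 g against an assumed threshold."""
    deviation = np.abs(module - 1.0)  # gravity-normalised units
    return np.where(deviation > threshold, "dynamic", "static")

# Example: a still hand (module ~ 1 g) followed by a movement burst.
ax = np.array([0.0, 0.0, 0.3, 0.5, 0.0])
ay = np.array([0.0, 0.0, 0.2, 0.4, 0.0])
az = np.array([1.0, 1.0, 1.1, 1.2, 1.0])
phases = detect_phases(module_vector(ax, ay, az))
```

Segmenting the labelled samples into runs would then yield the alternating static and dynamic phases described in the abstract.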

Abstract:

Study Design Cross-sectional study. Objectives To compare erector spinae (ES) muscle fatigue between chronic non-specific lower back pain (CNLBP) sufferers and healthy subjects from a biomechanical perspective during fatiguing isometric lumbar extensions. Background Paraspinal muscle maximal contraction and fatigue are used as a functional predictor of disability. The simplest method to determine muscle fatigue is to evaluate its evolution during specific contractions, such as isometric contractions. No studies have evaluated the evolution of the ES muscle during fatiguing isometric lumbar extensions while analysing both functional and architectural variables. Methods In a pre-calibrated system, participants performed a maximal isometric extension of the lumbar spine for 5 and 30 seconds. Functional variables (torque and muscle activation) and architectural variables (pennation angle and muscle thickness) were measured using a load cell, surface electromyography and ultrasound, respectively. The results were normalised and a reliability study of the ultrasound measurement was performed. Results The ultrasound measurements were highly reliable, with Cronbach's alpha values ranging from 0.951 to 0.981. All measured variables showed significant differences before and after the fatiguing isometric lumbar extension. Conclusion During a lumbar isometric extension test, architectural and functional variables of the ES muscle can be analysed using ultrasound, surface EMG and a load cell. In addition, during an endurance test the ES muscle shows acute changes in architectural and functional variables.
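The reliability statistic used above, Cronbach's alpha, can be computed directly. This is a minimal sketch with made-up measurement values, not the study's data or analysis pipeline.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha. items: 2-D array, rows = subjects,
    columns = repeated measures of the same quantity."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of measurements
    item_vars = items.var(axis=0, ddof=1)      # variance of each measure
    total_var = items.sum(axis=1).var(ddof=1)  # variance of subject sums
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Three repeated ultrasound measurements on five subjects (illustrative):
measures = [[1.20, 1.22, 1.21],
            [1.35, 1.34, 1.36],
            [1.10, 1.12, 1.11],
            [1.50, 1.49, 1.51],
            [1.28, 1.30, 1.29]]
alpha = cronbach_alpha(measures)  # close to 1 for consistent measures
```

Values in the 0.95-0.98 range, as reported in the abstract, indicate highly consistent repeated measurements.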

Abstract:

Asset management has broadened from a focus on maintenance management to whole-of-life-cycle asset management, requiring a suite of new competencies from asset procurement through management to disposal. Well-developed skills and competencies, as well as practical experience, are prerequisites for maintaining capability, managing demand, planning and setting priorities, and ensuring ongoing asset sustainability. This paper aims to establish a critical understanding of data, information and knowledge for asset management, along with the way in which benchmarking these attributes through computer-aided design may aid a strategic approach to asset management. The paper provides suggestions to improve the sharing, integration and creation of asset-related knowledge through the application of codification and personalization approaches.

Abstract:

Fair Use Week has celebrated the evolution and development of the defence of fair use under copyright law in the United States. As Krista Cox noted, ‘As a flexible doctrine, fair use can adapt to evolving technologies and new situations that may arise, and its long history demonstrates its importance in promoting access to information, future innovation, and creativity.’ While the defence of fair use has flourished in the United States, the adoption of the defence of fair use in other jurisdictions has often been stymied. Professor Peter Jaszi has reflected: ‘We can only wonder (with some bemusement) why some of our most important foreign competitors, like the European Union, haven’t figured out that fair use is, to a great extent, the “secret sauce” of U.S. cultural competitiveness.’ Jurisdictions such as Australia have been at a dismal disadvantage, because they lack the freedoms and flexibilities of the defence of fair use.

Abstract:

Speculative property developers, criticised for building dog boxes and the slums of tomorrow, are generally hated by urban planners and the public alike. But the doors of state governments are seemingly always open to developers and their lobbyists. Politicians find it hard to say no to the demands of the development industry for concessions because of the contribution housing construction makes to the economic bottom line and because there is a need for well-located housing. New supply is also seen as a solution to declining housing affordability. Classical economic theory, however, is too simplistic for housing supply. Instead, an offshoot of Game Theory, Market Design, not only offers greater insight into apartment supply but can also simultaneously address price, design and quality issues. New research reveals that the most significant risk in residential development is settlement risk: buyers failing to proceed with their purchase despite there being a pre-sale contract. At the point of settlement, the developer has expended all the project funds, only to see forecast revenue evaporate. While new buyers may be found, this process is likely to strip the profitability out of the project. As the global financial crisis exposed, buyers are inclined to walk away if property values slide. This settlement problem reflects a poor legal mechanism (the pre-sale contract) and a lack of incentive for truthfulness. A second problem is the search cost of finding buyers: at around 10% of project costs, pre-sales are more expensive to developers than finance. This is where Market Design comes in.

Abstract:

The requirement for distributed computing of all-to-all comparison (ATAC) problems in heterogeneous systems is increasingly important in various domains. Though Hadoop-based solutions are widely used, they are inefficient for the ATAC pattern, which is fundamentally different from the MapReduce pattern for which Hadoop is designed. They exhibit poor data locality and unbalanced allocation of comparison tasks, particularly in heterogeneous systems. This results in massive data movement at runtime and ineffective utilization of computing resources, significantly affecting overall computing performance. To address these problems, a scalable and efficient data and task distribution strategy is presented in this paper for processing large-scale ATAC problems in heterogeneous systems. It not only saves storage space but also achieves load balancing and good data locality for all comparison tasks. Experiments with bioinformatics examples show that about 89% of the ideal performance capacity of the multiple machines has been achieved using the approach presented in this paper.
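The ATAC pattern itself is easy to illustrate. The sketch below is an assumption-laden toy, not the paper's distribution strategy: it enumerates the pairwise comparison tasks and allocates them greedily to heterogeneous workers using an assumed relative-speed rating.

```python
from itertools import combinations

def atac_tasks(items):
    """All unordered pairs: n items -> n*(n-1)/2 comparison tasks."""
    return list(combinations(items, 2))

def allocate(tasks, speeds):
    """Greedy heterogeneous allocation: give each task to the worker
    with the lowest estimated finish time (task count / speed)."""
    loads = {w: 0 for w in speeds}
    plan = {w: [] for w in speeds}
    for task in tasks:
        w = min(speeds, key=lambda w: (loads[w] + 1) / speeds[w])
        plan[w].append(task)
        loads[w] += 1
    return plan

# Five sequences to compare pairwise; one worker assumed twice as fast.
sequences = ["s1", "s2", "s3", "s4", "s5"]
plan = allocate(atac_tasks(sequences), {"fast": 2.0, "slow": 1.0})
```

A real strategy, as the abstract notes, must also co-locate each task with the data items it compares, which a purely count-based allocation like this ignores.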

Abstract:

During everyday urban life, people spend time in public urban places waiting for specific events to occur. During these times, people sometimes tend to engage with their information and communication technology (ICT) devices in a way that shuts off interactions with collocated people. These devices could also be used to better connect with the urban space and collocated people within. This chapter presents and discusses the impact of three design interventions on the urban user experience enabling collocated people to share lightweight, non-privacy-sensitive data in the urban space. We investigate and discuss the impact on the urban experience under the notions of people, place, and technology with an emphasis on how the sharing of non-privacy-sensitive data can positively transform anonymous public urban places in various ways through anonymous digital augmentations.

Abstract:

The deployment of new emerging technologies, such as cooperative systems, allows the traffic community to foresee relevant improvements in terms of traffic safety and efficiency. Autonomous vehicles are able to share information about the local traffic state in real time, which could enable a better response to the mechanism of traffic jam formation. An upstream single-hop radio broadcast network can improve the perception of each cooperative driver within a specific radio range and hence improve traffic stability. The impact of vehicle-to-vehicle cooperation on the onset of traffic congestion is investigated analytically and through simulation. A Next Generation Simulation (NGSIM) field dataset is used to calibrate the full velocity difference car-following model, and the MOBIL lane-changing model is implemented. The robustness of the calibration as well as the heterogeneity of the drivers is discussed. Assuming that congestion can be triggered either by the heterogeneity of drivers' behaviours or by abnormal lane-changing behaviours, the calibrated car-following model is used to assess the impact of a microscopic cooperative law on egoistic lane-changing behaviours. The cooperative law can help reduce and delay traffic congestion and can have a positive effect on safety indicators.
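The full velocity difference (FVD) car-following model mentioned above computes acceleration as a = kappa * (V(gap) - v) + lambda * dv, where V is an optimal-velocity function of the gap to the leader and dv is the leader-follower speed difference. The sketch below uses an assumed sigmoid-shaped V and illustrative parameter values, not the paper's NGSIM-calibrated ones.

```python
import math

def optimal_velocity(gap, v_max=30.0, d0=2.0, width=10.0):
    """Assumed sigmoid-shaped optimal velocity (m/s) for a gap (m)."""
    return 0.5 * v_max * (1.0 + math.tanh((gap - d0) / width - 1.0))

def fvd_acceleration(gap, v, dv, kappa=0.4, lam=0.5):
    """FVD model: relax towards V(gap), plus a velocity-difference term.
    dv = leader speed - follower speed (m/s)."""
    return kappa * (optimal_velocity(gap) - v) + lam * dv

# A fast follower closing in on a slower leader decelerates:
a_closing = fvd_acceleration(gap=15.0, v=25.0, dv=-5.0)
# A slow follower with a large, opening gap accelerates:
a_opening = fvd_acceleration(gap=40.0, v=10.0, dv=5.0)
```

The velocity-difference term is what distinguishes FVD from the plain optimal velocity model: it lets the follower anticipate relative motion rather than react only to the gap.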

Abstract:

In this paper, we look at the concept of reversibility, that is, negating opposites, counterbalances, and actions that can be reversed. Piaget identified reversibility as an indicator of the ability to reason at a concrete operational level. We investigate to what degree novice programmers manifest the ability to work with this concept of reversibility by providing them with a small piece of code and then asking them to write code that undoes its effect. On testing entire cohorts of students in their first year of learning to program, we found that an overwhelming majority of them could not cope with such a concept. We then conducted think-aloud studies of novices, observing them working on this task and analysing their contrasting abilities to deal with it. The results of this study demonstrate the need to better understand our students' reasoning abilities, and for a teaching model aimed at that level of reasoning.
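A task of the kind described might look like the following (an illustrative example, not the study's actual instrument): students are shown a snippet that transforms two variables and asked to write code that restores the original values.

```python
def forward(a, b):
    a = a + b      # given snippet: swap a and b
    b = a - b      # using arithmetic instead of
    a = a - b      # a temporary variable
    return a, b

def undo(a, b):
    # The reversal a student is asked to write. Here the same three
    # steps happen to be self-inverse: running them again undoes the swap.
    a = a + b
    b = a - b
    a = a - b
    return a, b

x, y = forward(3, 5)   # x, y == 5, 3
x, y = undo(x, y)      # back to 3, 5
```

Recognising that each statement must be inverted (or that the whole transformation is its own inverse) is precisely the reversibility reasoning the study probes.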

Abstract:

BACKGROUND Many koala populations around Australia are in serious decline, with a substantial component of this decline in some Southeast Queensland populations attributed to the impact of Chlamydia. A Chlamydia vaccine for koalas is in development and has shown promise in early trials. This study contributes to implementation preparedness by simulating vaccination strategies designed to reverse population decline and by identifying which age and sex category it would be most effective to target. METHODS We used field data to inform the development and parameterisation of an individual-based stochastic simulation model of a koala population endemic with Chlamydia. The model took into account transmission, morbidity and mortality caused by Chlamydia infections. We calibrated the model to characteristics of typical Southeast Queensland koala populations. As there is uncertainty about the effectiveness of the vaccine in real-world settings, a variety of potential vaccine efficacies, half-lives and dosing schedules were simulated. RESULTS Assuming other threats remain constant, it is expected that current population declines could be reversed in around 5-6 years if female koalas aged 1-2 years are targeted, average vaccine protective efficacy is 75%, and vaccine coverage is around 10% per year. At lower vaccine efficacies the immunological effects of boosting become important: at 45% vaccine efficacy population decline is predicted to reverse in 6 years under optimistic boosting assumptions but in 9 years under pessimistic boosting assumptions. Terminating a successful vaccination programme at 5 years would lead to a rise in Chlamydia prevalence towards pre-vaccination levels. CONCLUSION For a range of vaccine efficacy levels it is projected that population decline due to endemic Chlamydia can be reversed under realistic dosing schedules, potentially in just 5 years. However, a vaccination programme might need to continue indefinitely in order to maintain Chlamydia prevalence at a sufficiently low level for population growth to continue.
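The shape of such an individual-based stochastic model can be sketched in a few lines. Everything below is an assumption for illustration (population size, infection and recovery probabilities, and the way efficacy scales risk); it is not the study's calibrated model and omits demography, morbidity and mortality entirely.

```python
import random

def simulate(years, vaccinate, efficacy=0.75, coverage=0.10, seed=1):
    """Toy individual-based model: each koala is a dict; vaccination
    reduces per-year infection risk by the assumed efficacy."""
    random.seed(seed)
    pop = [{"infected": random.random() < 0.4, "vaccinated": False}
           for _ in range(500)]
    for _ in range(years):
        for k in pop:
            if vaccinate and not k["vaccinated"] and random.random() < coverage:
                k["vaccinated"] = True
            risk = 0.3 * ((1 - efficacy) if k["vaccinated"] else 1.0)
            if not k["infected"] and random.random() < risk:
                k["infected"] = True
            elif k["infected"] and random.random() < 0.2:
                k["infected"] = False   # recovery / treatment
    return sum(k["infected"] for k in pop) / len(pop)

prev_no_vax = simulate(10, vaccinate=False)
prev_vax = simulate(10, vaccinate=True)
```

Even this toy version reproduces the qualitative finding that sustained annual coverage gradually pushes Chlamydia prevalence down relative to the no-vaccination baseline.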

Abstract:

Analysing wastewater samples is an innovative approach that overcomes many limitations of traditional surveys in identifying and measuring a range of chemicals consumed by, or to which people were exposed, in a sewer catchment area. Since it was first conceptualised in 2001, much progress has been made in making wastewater analysis (WWA) a reliable and robust tool for measuring chemical consumption and/or exposure. At the moment, the most popular application of WWA, sometimes referred to as sewage epidemiology, is monitoring the consumption of illicit drugs in communities around the globe, including China. The approach has been largely adopted by law enforcement agencies as a device to monitor the temporal and geographical patterns of drug consumption. In the future, the methodology can be extended to other chemicals, including biomarkers of population health (e.g. environmental or oxidative stress biomarkers, lifestyle indicators or medications taken by different demographic groups) and pollutants that people are exposed to (e.g. polycyclic aromatic hydrocarbons, perfluorinated chemicals and toxic pesticides). The extension of WWA to a huge range of chemicals may give rise to a field called sewage chemical-information mining (SCIM), with as yet unexplored potential. China has many densely populated cities with thousands of sewage treatment plants, conditions which are favourable for applying WWA/SCIM to help relevant authorities gather information about illicit drug consumption and population health status. However, some prerequisites and uncertainties of the methodology should be addressed for SCIM to reach its full potential in China.

Abstract:

Adopting a multi-theoretical approach, I examine external auditors’ perceptions of the reasons why organizations do or do not adopt cloud computing. I interview forensic accountants and IT experts about the adoption, acceptance, institutional motives, and risks of cloud computing. Although the medium to large accounting firms where the external auditors worked almost exclusively used private clouds, both private and public cloud services were gaining a foothold among many of their clients. Despite the advantages of cloud computing, data confidentiality and the involvement of foreign jurisdictions remain a concern, particularly if the data are moved outside Australia. Additionally, some organizations seem to understand neither the technology itself nor their own requirements, which may lead to poorly negotiated contracts and service agreements. To minimize the risks associated with cloud computing, many organizations turn to hybrid solutions or private clouds that include national or dedicated data centers. To the best of my knowledge, this is the first empirical study that reports on cloud computing adoption from the perspectives of external auditors.

Abstract:

In this paper, we present a decentralized dynamic load scheduling/balancing algorithm called ELISA (Estimated Load Information Scheduling Algorithm) for general purpose distributed computing systems. ELISA uses estimated state information based upon periodic exchange of exact state information between neighbouring nodes to perform load scheduling. The primary objective of the algorithm is to cut down on the communication and load transfer overheads by minimizing the frequency of status exchange and by restricting the load transfer and status exchange within the buddy set of a processor. It is shown that the resulting algorithm performs almost as well as a perfect information algorithm and is superior to other load balancing schemes based on the random sharing and Ni-Hwang algorithms. A sensitivity analysis to study the effect of various design parameters on the effectiveness of load balancing is also carried out. Finally, the algorithm's performance is tested on large dimensional hypercubes in the presence of time-varying load arrival process and is shown to perform well in comparison to other algorithms. This makes ELISA a viable and implementable load balancing algorithm for use in general purpose distributed computing systems.
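The core idea of scheduling on estimated rather than exact state can be sketched compactly. This is an illustrative reconstruction, not ELISA itself: the linear queue-growth estimate, the threshold and the buddy values are all assumptions.

```python
def estimate_load(last_reported, arrival_rate, service_rate, elapsed):
    """Estimated queue length some time after the last exact status
    exchange, assuming linear growth at (arrivals - service) rate."""
    return max(0.0, last_reported + (arrival_rate - service_rate) * elapsed)

def should_transfer(own_load, buddy_estimates, threshold=2.0):
    """Transfer load only when our queue exceeds the lightest buddy's
    estimated queue by more than a threshold, limiting overheads."""
    lightest = min(buddy_estimates.values())
    return own_load - lightest > threshold

# A node with 10 jobs; its buddy set last reported 4 and 7 jobs,
# three time units ago, with known arrival/service rates.
buddies = {
    "n1": estimate_load(4, arrival_rate=1.0, service_rate=2.0, elapsed=3.0),
    "n2": estimate_load(7, arrival_rate=2.0, service_rate=1.5, elapsed=3.0),
}
decision = should_transfer(10, buddies)
```

Because decisions between exchanges rely only on these local estimates, status messages stay infrequent, which is the overhead reduction the abstract describes.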

Abstract:

Simultaneous consideration of both performance and reliability issues is important in the choice of computer architectures for real-time aerospace applications. One of the requirements for such a fault-tolerant computer system is the characteristic of graceful degradation. A shared and replicated resources computing system represents such an architecture. In this paper, a combinatorial model is used for the evaluation of the instruction execution rate of a degradable, replicated resources computing system such as a modular multiprocessor system. Next, a method is presented to evaluate the computation reliability of such a system utilizing a reliability graph model and the instruction execution rate. Finally, this computation reliability measure, which simultaneously describes both performance and reliability, is applied as a constraint in an architecture optimization model for such computing systems.
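The flavour of such a combinatorial model can be shown with a simple assumed case (identical, independent processors and a rate that scales linearly with survivors; the numbers are illustrative, not the paper's): the expected instruction execution rate is the sum over all degraded configurations of the probability of that configuration times its rate.

```python
from math import comb

def expected_rate(n, p, rate_per_processor):
    """Expected execution rate of a gracefully degrading system of n
    identical processors, each alive independently with probability p,
    where a configuration with i survivors runs at i * rate."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) * i * rate_per_processor
               for i in range(n + 1))

# 4 processors, each 0.9 reliable, 1.0 MIPS each.
# For a linear rate this reduces to n * p = 3.6 MIPS.
r = expected_rate(4, 0.9, 1.0)
```

Real systems rarely scale linearly (shared buses and memory saturate), which is why the paper pairs the combinatorial model with a reliability graph rather than using a closed form like this.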

Abstract:

This paper is aimed at reviewing the notion of Byzantine-resilient distributed computing systems, the relevant protocols and their possible applications as reported in the literature. The three agreement problems, namely, the consensus problem, the interactive consistency problem, and the generals problem have been discussed. Various agreement protocols for the Byzantine generals problem have been summarized in terms of their performance and level of fault-tolerance. The three classes of Byzantine agreement protocols discussed are the deterministic, randomized, and approximate agreement protocols. Finally, application of the Byzantine agreement protocols to clock synchronization is highlighted.
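Two facts underlying the surveyed protocols can be illustrated in a few lines (a simplified sketch, not any of the full protocols from the literature): the oral-messages bound that agreement among n generals tolerates at most f traitors when n >= 3f + 1, and the deterministic majority vote loyal generals apply to the values they receive.

```python
from collections import Counter

def can_agree(n, f):
    """Classic oral-messages bound: Byzantine agreement is achievable
    only when n >= 3f + 1 (e.g. 4 generals tolerate 1 traitor)."""
    return n >= 3 * f + 1

def majority_value(received):
    """Deterministic majority vote over received values; ties are
    broken alphabetically so every loyal general decides the same way."""
    counts = Counter(received)
    return max(sorted(counts), key=counts.get)

# 4 generals, 1 traitor: three loyal generals relay 'attack' while the
# traitor injects a conflicting 'retreat'.
ok = can_agree(4, 1)
decision = majority_value(["attack", "attack", "attack", "retreat"])
```

With n = 3 and f = 1 the bound fails, which is why three generals cannot reach agreement against a single traitor regardless of the protocol used.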