185 results for cloud computing


Relevance: 20.00%

Abstract:

In my book Nirvana: The True Story (2006), I undertake an autoethnographical approach to biography, attempting to impart an understanding of my chosen subject - the rock band Nirvana - via discussion of my own experiences. On numerous occasions, I veer off into tangential asides, frequently using extensive footnotes to explain obscure musical references. Personal anecdotes are juxtaposed with "insider" information; at crucial points in the story (notable concerts, the first meeting of singer Kurt Cobain with his future wife Courtney Love, the news of Cobain's suicide), the linear thread of the narrative spills over into a multi-faceted approach, with several different (and sometimes opposing) voices given equal prominence. Despite my firsthand experience of the band, however, Nirvana: The True Story is not considered authoritative, even within its own field. This article considers the reasons why this may be the case.

Relevance: 20.00%

Abstract:

Biological systems are typically complex and adaptive, involving large numbers of entities, or organisms, and many-layered interactions between these. System behaviour evolves over time, and typically benefits from previous experience by retaining memory of previous events. Given the dynamic nature of these phenomena, it is non-trivial to provide a comprehensive description of complex adaptive systems and, in particular, to define the importance and contribution of low-level unsupervised interactions to the overall evolution process. In this chapter, the authors focus on the application of the agent-based paradigm in the context of the immune response to HIV. Explicit implementation of lymph nodes and the associated lymph network, including lymphatic chain structure, is a key objective, and requires parallelisation of the model. Steps taken towards an optimal communication strategy are detailed.
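As a concrete illustration of the agent-based paradigm described above, the following minimal Python sketch moves cell agents over a small, invented lymph-node chain and lets them interact only locally. The topology, probabilities and infection rule are placeholders for illustration, not the chapter's actual model.

    # Minimal agent-based sketch: agents migrate over a hypothetical
    # lymph-node chain and interact only with agents in the same node.
    # Topology, counts and probabilities are invented for illustration.
    import random

    LYMPH_NETWORK = {0: [1], 1: [0, 2], 2: [1]}   # hypothetical 3-node chain

    class Cell:
        def __init__(self, node, infected=False):
            self.node = node
            self.infected = infected

    def step(cells, p_move=0.1, p_infect=0.05):
        """One unsupervised-interaction step: migration, then local contact."""
        for c in cells:
            if random.random() < p_move:               # migrate along the chain
                c.node = random.choice(LYMPH_NETWORK[c.node])
        for node in LYMPH_NETWORK:                     # local interactions only
            local = [c for c in cells if c.node == node]
            if any(c.infected for c in local):
                for c in local:
                    if not c.infected and random.random() < p_infect:
                        c.infected = True

    cells = [Cell(random.randrange(3)) for _ in range(1000)]
    cells[0].infected = True
    for _ in range(100):
        step(cells)
    print(sum(c.infected for c in cells), "infected after 100 steps")

In a parallel implementation, each lymph node (or group of nodes) would typically map onto a separate process, with agent migration driving inter-process communication; this is where the communication strategy discussed in the chapter becomes critical.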

Relevance: 20.00%

Abstract:

Background: Recent advances in immunology have highlighted the importance of local properties in the overall progression of HIV infection. In particular, the gastrointestinal (GI) tract is seen as a key area during early infection, and the massive cell depletion associated with it may influence subsequent disease progression. This motivated the development of a large-scale agent-based model. Results: Lymph nodes are explicitly implemented, and considerations on parallel computing permit large simulations and the inclusion of local features. The results obtained show that inclusion of the GI tract in the model leads to accelerated disease progression, during both the early stages and the long-term evolution, compared to a theoretical, uniform model. Conclusions: These results confirm the potential of treatment policies currently under investigation which focus on this region. They also highlight the potential of this modelling framework, incorporating both agent-based and network-based components, in the context of complex systems where scaling up alone does not result in models providing additional insights.

Relevance: 20.00%

Abstract:

Biomedical systems involve a large number of entities and intricate interactions between them. Their direct analysis is therefore difficult, and it is often necessary to rely on computational models. These models require significant resources, for which parallel computing solutions are particularly well suited, given the inherently parallel nature of biomedical systems. Model hybridisation also permits the integration and simultaneous study of multiple aspects and scales of these systems, thus providing an efficient platform for multidisciplinary research.

Relevance: 20.00%

Abstract:

Several algorithms and techniques widely used in Computer Science have been adapted from, or inspired by, known biological phenomena. This is a consequence of the multidisciplinary background of most early computer scientists. The field has now matured, and permits development of tools and collaborative frameworks which play a vital role in advancing current biomedical research. In this paper, we briefly present examples of the former, and elaborate upon two of the latter, applied to immunological modelling and as a new paradigm in gene expression.
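By way of illustration, a genetic algorithm is one of the canonical biologically inspired techniques alluded to above; the minimal sketch below (not necessarily one of the paper's examples) evolves bit-strings towards an all-ones target through selection, crossover and mutation.

    # Minimal genetic algorithm: a canonical biologically inspired
    # technique (illustrative; not necessarily one of the paper's examples).
    # Evolves 20-bit strings toward the all-ones string.
    import random

    def fitness(bits):
        return sum(bits)                      # number of 1s

    def mutate(bits, rate=0.02):
        return [b ^ (random.random() < rate) for b in bits]

    def crossover(a, b):
        cut = random.randrange(1, len(a))     # single-point crossover
        return a[:cut] + b[cut:]

    pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(50)]
    for _ in range(200):                      # generations
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 20:             # all-ones target reached
            break
        parents = pop[:10]                    # truncation selection
        pop = [mutate(crossover(random.choice(parents), random.choice(parents)))
               for _ in range(50)]
    print("best fitness:", max(fitness(b) for b in pop))

Selection, crossover and mutation mirror their biological namesakes, which is precisely the kind of adaptation from biology the paper describes.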

Relevance: 20.00%

Abstract:

One of the main challenges in data analytics is that discovering structures and patterns in complex datasets is a computationally intensive task. Recent advances in high-performance computing provide part of the solution, as multicore systems are now more affordable and more accessible. In this paper, we investigate how this can be used to develop more advanced methods for data analytics. We focus on two specific areas: model-driven analysis and data mining using optimisation techniques.
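One simple pattern for exploiting a multicore system in optimisation-based data mining, offered here as a generic illustration rather than the paper's method, is to score candidate model parameters in parallel; the objective function below is a hypothetical stand-in.

    # Illustrative multicore pattern for optimisation-based analysis:
    # score candidate parameters in parallel and keep the best.
    # The objective function is a stand-in, not the paper's method.
    from multiprocessing import Pool

    def score(theta):
        # hypothetical model-fit objective: lower is better
        return (theta - 3.7) ** 2, theta

    if __name__ == "__main__":
        candidates = [i / 100 for i in range(1000)]   # simple 1-D grid
        with Pool() as pool:                          # one worker per core by default
            best_loss, best_theta = min(pool.map(score, candidates))
        print(f"best theta = {best_theta} (loss {best_loss:.4f})")

The same map-over-candidates pattern extends to model-driven analysis, with each worker evaluating a full model fit instead of a toy objective.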

Relevance: 20.00%

Abstract:

As computational models in fields such as medicine and engineering become more refined, their resource requirements increase. In the first instance, these needs have been met using parallel computing and HPC clusters. However, such systems are often costly and lack flexibility, so HPC users are tempted to move to elastic HPC using cloud services. One difficulty in making this transition is that HPC and cloud systems are different, and performance may vary. The purpose of this study is to evaluate cloud services as a means to minimise both cost and computation time for large-scale simulations, and to identify which system properties have the most significant impact on performance. Our simulation results show that, while virtual CPU (VCPU) performance is satisfactory, network throughput may lead to difficulties.
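The two properties the study isolates, per-VCPU compute speed and network throughput, can be probed with micro-benchmarks of the following generic shape. This is a sketch, not the study's benchmark suite; the host name and port are placeholders.

    # Generic micro-benchmark pattern for comparing instance types:
    # time a CPU-bound kernel (proxy for VCPU performance) and a fixed
    # transfer (proxy for network throughput). Host/port are placeholders.
    import socket, time

    def cpu_kernel(n=5_000_000):
        t0 = time.perf_counter()
        s = 0
        for i in range(n):        # pure-Python arithmetic, CPU-bound
            s += i * i
        return time.perf_counter() - t0

    def transfer(host, port, mb=64):
        payload = b"x" * (1024 * 1024)
        with socket.create_connection((host, port)) as sock:
            t0 = time.perf_counter()
            for _ in range(mb):
                sock.sendall(payload)
            return mb / (time.perf_counter() - t0)    # MB/s

    print(f"cpu kernel: {cpu_kernel():.2f} s")
    # print(f"throughput: {transfer('peer.example', 9000):.1f} MB/s")  # needs a listener

Comparing these two numbers across instance types separates compute-bound from communication-bound behaviour, mirroring the study's finding that VCPU performance was satisfactory while network throughput was the limiting factor.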

Relevance: 20.00%

Abstract:

The research field of urban computing – defined as “the integration of computing, sensing, and actuation technologies into everyday urban settings and lifestyles” – considers the design and use of ubiquitous computing technology in public and shared urban environments. Its impact on cities, buildings, and spaces evokes innumerable kinds of change. Embedded into our everyday lived environments, urban computing technologies have the potential to alter the meaning of physical space, and affect the activities performed in those spaces. This paper starts a multi-themed discussion of various aspects that make up the, at times, messy and certainly transdisciplinary field of urban computing and urban informatics.

Relevance: 20.00%

Abstract:

Urban agriculture refers to the production of food in urban and peri-urban spaces. It can contribute positively to the health and food security of a city, while also reducing ‘food miles.’ It takes many forms, from the large and organised community garden to the small and discreet backyard or balcony. This study focuses on small-scale food production in the form of residential gardening for home or personal use. We explore opportunities to support people’s engagement in urban agriculture via human-computer interaction (HCI) design. This research presents the findings and HCI design insights from our study of residential gardeners in Brisbane, Australia. By exploring their understanding of gardening practice with a human-centred design approach, we present six key themes, highlighting opportunities and challenges relating to available time and space; the process of learning and experimentation; and the role of existing online platforms in supporting gardening practice. Finally, we discuss the overarching theme of shared knowledge and how HCI could improve community engagement and gardening practice.

Relevance: 20.00%

Abstract:

The efficient computation of matrix function vector products has become an important area of research in recent times, driven in particular by two important applications: the numerical solution of fractional partial differential equations and the integration of large systems of ordinary differential equations. In this work we consider a problem that combines these two applications, in the form of a numerical solution algorithm for fractional reaction-diffusion equations that, after spatial discretisation, is advanced in time using the exponential Euler method. We focus on the efficient implementation of the algorithm on Graphics Processing Units (GPUs), as we wish to make use of the increased computational power available with this hardware. We compute the matrix function vector products using the contour integration method of [N. Hale, N. Higham, and L. Trefethen. Computing A^α, log(A), and related matrix functions by contour integrals. SIAM J. Numer. Anal., 46(5):2505–2523, 2008]. Multiple levels of preconditioning are applied to reduce the GPU memory footprint and to further accelerate convergence. We also derive an error bound for the convergence of the contour integral method that allows us to pre-determine the appropriate number of quadrature points. Results are presented that demonstrate the effectiveness of the method for large two-dimensional problems, showing a speedup of more than an order of magnitude compared to a CPU-only implementation.
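The structure of the contour integration approach can be illustrated without the paper's conformal maps, preconditioning or GPU code: for f analytic on and inside a circle enclosing the spectrum of A, the trapezoidal rule turns f(A)b into a sum of shifted linear solves. The sketch below is a minimal CPU-side illustration in NumPy, with an invented test matrix; it is not the Hale-Higham-Trefethen quadrature the paper uses.

    # Trapezoidal-rule contour approximation of f(A)b on a circle enclosing
    # the spectrum of A. This shows the shifted-solve structure of contour
    # methods; it is NOT the conformally mapped quadrature of Hale,
    # Higham & Trefethen used in the paper.
    import numpy as np

    def contour_fAb(f, A, b, centre, radius, n_quad=32):
        """Approximate f(A) @ b as a trapezoidal-rule contour integral."""
        n = A.shape[0]
        acc = np.zeros(n, dtype=complex)
        for k in range(n_quad):
            z = centre + radius * np.exp(2j * np.pi * k / n_quad)  # quadrature node
            # the factor (z - centre) absorbs z'(theta) and the 1/(2*pi*i) prefactor
            acc += f(z) * (z - centre) * np.linalg.solve(z * np.eye(n) - A, b + 0j)
        return (acc / n_quad).real   # real output for real A, b and a real centre

    # sanity check against an eigendecomposition of an invented 2x2 matrix
    A = np.array([[0.0, 1.0], [-2.0, -3.0]])    # eigenvalues -1 and -2
    b = np.array([1.0, 0.0])
    w, V = np.linalg.eig(A)
    ref = (V @ np.diag(np.exp(w)) @ np.linalg.inv(V) @ b).real
    print(contour_fAb(np.exp, A, b, centre=-1.5, radius=2.0), ref)

Each quadrature point contributes one independent solve with the shifted matrix zI - A; such independent solves are natural candidates for batching, which is one reason contour methods map well to GPU hardware.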

Relevance: 20.00%

Abstract:

Background: The evaluation of hand function is an essential element of clinical practice. The usual assessments focus on the ability to perform activities of daily living. The inclusion of instruments to measure kinematic variables provides a new approach to assessment, and inertial sensors adapted to the hand could be used as a complementary instrument to traditional assessment. Materials: Clinimetric assessments (Upper Limb Functional Index, QuickDASH), anthropometric variables (height and weight) and dynamometry (palmar pressure) were recorded. Functional analysis was performed with the AcceleGlove system for the right hand and a computer system. The glove has six acceleration sensors, one on each finger and another on the back of the palm. Methods: Analytic, cross-sectional approach. Ten healthy subjects performed six tasks on an evaluation table (tripod pinch, lateral pinch, tip pinch, extension grip, spherical grip and power grip). Each task was performed and measured three times, and the second repetition was analysed for the results section. A Matlab script was created for the analysis of each movement and for phase detection based on the module (magnitude) of the acceleration vector. Results: The module of the acceleration vector offers useful information on hand function. Analysis of the data obtained during the performance of functional gestures allows five different phases to be identified within each movement, three static and two dynamic, and each module-vector profile was associated with one task. Conclusion: Module-vector variables could be used for the analysis of the different tasks performed by the hand. Inertial sensors could be used as a complement to traditional assessment systems.
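The "module vector" (acceleration-magnitude) analysis can be sketched generically as follows. The study used a Matlab script; the 1 g resting baseline, tolerance and data layout below are illustrative assumptions in Python.

    # Generic sketch of the "module vector" analysis: magnitude of each
    # triaxial acceleration sample, then static/dynamic segmentation by a
    # simple threshold. The 1 g baseline, tolerance and data layout are
    # illustrative assumptions, not the study's Matlab script.
    import math

    def module_vector(samples):
        """samples: list of (ax, ay, az) in g. Returns per-sample magnitudes."""
        return [math.sqrt(x*x + y*y + z*z) for x, y, z in samples]

    def segment_phases(mags, rest=1.0, tol=0.15):
        """Label a sample 'static' when |magnitude - rest| <= tol, else 'dynamic'."""
        return ["static" if abs(m - rest) <= tol else "dynamic" for m in mags]

    samples = [(0.0, 0.0, 1.0)] * 5 + [(0.3, 0.8, 1.4)] * 5 + [(0.0, 0.0, 1.0)] * 5
    print(segment_phases(module_vector(samples)))   # static, dynamic, static runs

With real glove data, one magnitude series per sensor would be computed, and runs of consecutive labels would delimit the three static and two dynamic phases reported above.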

Relevance: 20.00%

Abstract:

Study Design: Cross-sectional study. Objectives: To compare erector spinae (ES) muscle fatigue between chronic non-specific lower back pain (CNLBP) sufferers and healthy subjects from a biomechanical perspective during fatiguing isometric lumbar extensions. Background: Paraspinal muscle maximal contraction and fatigue are used as functional predictors of disability. The simplest method to determine muscle fatigue is to evaluate its evolution during specific contractions, such as isometric contractions. No studies have evaluated the evolution of the ES muscle during fatiguing isometric lumbar extensions while analysing both functional and architectural variables. Methods: In a pre-calibrated system, participants performed a maximal isometric extension of the lumbar spine for 5 and 30 seconds. Functional variables (torque and muscle activation) and architectural variables (pennation angle and muscle thickness) were measured using a load cell, surface electromyography and ultrasound, respectively. The results were normalised and a reliability study of the ultrasound measurement was carried out. Results: The ultrasound measurements were highly reliable, with Cronbach’s alpha values ranging from 0.951 to 0.981. All measured variables showed significant differences before and after the fatiguing isometric lumbar extension. Conclusion: During a lumbar isometric extension test, architectural and functional variables of the ES muscle could be analysed using ultrasound, surface EMG and a load cell. In addition, during an endurance test, the ES muscle shows acute changes in architectural and functional variables.

Relevance: 20.00%

Abstract:

Asset management has broadened from a focus on maintenance management to whole-of-life-cycle asset management, requiring a suite of new competencies spanning asset procurement, management and disposal. Well-developed skills and competencies, as well as practical experience, are a prerequisite to maintain capability, to manage demand, and to plan, set priorities and ensure ongoing asset sustainability. This paper focuses on establishing critical understandings of data, information and knowledge for asset management, along with the way in which benchmarking these attributes through computer-aided design may aid a strategic approach to asset management. The paper provides suggestions to improve the sharing, integration and creation of asset-related knowledge through the application of codification and personalisation approaches.

Relevance: 20.00%

Abstract:

Fair Use Week has celebrated the evolution and development of the defence of fair use under copyright law in the United States. As Krista Cox noted, ‘As a flexible doctrine, fair use can adapt to evolving technologies and new situations that may arise, and its long history demonstrates its importance in promoting access to information, future innovation, and creativity.’ While the defence of fair use has flourished in the United States, the adoption of the defence of fair use in other jurisdictions has often been stymied. Professor Peter Jaszi has reflected: ‘We can only wonder (with some bemusement) why some of our most important foreign competitors, like the European Union, haven’t figured out that fair use is, to a great extent, the “secret sauce” of U.S. cultural competitiveness.’ Jurisdictions such as Australia have been at a dismal disadvantage, because they lack the freedoms and flexibilities of the defence of fair use.

Relevance: 20.00%

Abstract:

Speculative property developers, criticised for building dog boxes and the slums of tomorrow, are generally hated by urban planners and the public alike. But the doors of state governments are seemingly always open to developers and their lobbyists. Politicians find it hard to say no to the demands of the development industry for concessions because of the contribution housing construction makes to the economic bottom line, and because there is a need for well-located housing. New supply is also seen as a solution to declining housing affordability. Classical economic theory, however, is too simplistic for housing supply. Instead, an offshoot of Game Theory, Market Design, not only offers greater insight into apartment supply but can also simultaneously address price, design and quality issues. New research reveals that the most significant risk in residential development is settlement risk: buyers failing to proceed with their purchase despite there being a pre-sale contract. At the point of settlement, the developer has expended all the project funds, only to see forecast revenue evaporate. While new buyers may be found, this process is likely to strip the profitability out of the project. As the global financial crisis exposed, buyers are inclined to walk if property values slide. This settlement problem reflects a poor legal mechanism (the pre-sale contract) and a lack of incentive for truthfulness. A second problem is the search cost of finding buyers: at around 10% of project costs, pre-sales are more expensive to developers than finance. This is where Market Design comes in.