289 results for cloud computing accountability
Abstract:
Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses. © 2013 John Wiley & Sons Ltd.
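As a rough illustration of the joint-probability step described above, the following sketch assumes, purely for illustration, that patch-level extinction events are independent, so the regional extinction risk is the product of the per-patch extinction probabilities once fewer than five suitable patches remain; the probabilities shown are placeholders, not values from the study.

```python
# Minimal sketch of the joint-extinction approximation described above.
# Assumes (hypothetically) independent patch extinctions; probabilities are placeholders.

def regional_extinction_risk(patch_extinction_probs, patch_threshold=5):
    """Joint probability that all remaining populations go extinct,
    applied only when fewer than `patch_threshold` patches remain."""
    if len(patch_extinction_probs) >= patch_threshold:
        return None  # too many patches remain; the approximation is not applied
    risk = 1.0
    for p in patch_extinction_probs:
        risk *= p  # independence assumption: multiply per-patch extinction probabilities
    return risk

# Example: three remaining patches with illustrative per-patch extinction probabilities
print(regional_extinction_risk([0.6, 0.7, 0.8]))  # ~0.336
```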
Abstract:
Biological systems are typically complex and adaptive, involving large numbers of entities, or organisms, and many-layered interactions between these. System behaviour evolves over time, and typically benefits from previous experience by retaining memory of previous events. Given the dynamic nature of these phenomena, it is non-trivial to provide a comprehensive description of complex adaptive systems and, in particular, to define the importance and contribution of low-level unsupervised interactions to the overall evolution process. In this chapter, the authors focus on the application of the agent-based paradigm in the context of the immune response to HIV. Explicit implementation of lymph nodes and the associated lymph network, including lymphatic chain structure, is a key objective, and requires parallelisation of the model. Steps taken towards an optimal communication strategy are detailed.
Abstract:
Background: Recent advances in immunology have highlighted the importance of local properties for the overall progression of HIV infection. In particular, the gastrointestinal (GI) tract is seen as a key area during early infection, and the massive cell depletion associated with it may influence subsequent disease progression. This motivated the development of a large-scale agent-based model. Results: Lymph nodes are explicitly implemented, and parallel computing considerations permit large simulations and the inclusion of local features. The results obtained show that including the GI tract in the model leads to accelerated disease progression, during both the early stages and the long-term evolution, compared to a theoretical, uniform model. Conclusions: These results confirm the potential of treatment policies currently under investigation, which focus on this region. They also highlight the potential of this modelling framework, incorporating both agent-based and network-based components, in the context of complex systems where scaling up alone does not yield models providing additional insights.
Abstract:
Biomedical systems involve a large number of entities and intricate interactions between them. Their direct analysis is therefore difficult, and it is often necessary to rely on computational models. These models require significant resources and parallel computing solutions, which are particularly well suited given the inherently parallel nature of biomedical systems. Model hybridisation also permits the integration and simultaneous study of multiple aspects and scales of these systems, thus providing an efficient platform for multidisciplinary research.
Abstract:
Several algorithms and techniques widely used in Computer Science have been adapted from, or inspired by, known biological phenomena. This is a consequence of the multidisciplinary background of most early computer scientists. The field has now matured, and permits development of tools and collaborative frameworks which play a vital role in advancing current biomedical research. In this paper, we briefly present examples of the former, and elaborate upon two of the latter, applied to immunological modelling and as a new paradigm in gene expression.
Abstract:
One of the main challenges in data analytics is that discovering structures and patterns in complex datasets is a computationally intensive task. Recent advances in high-performance computing provide part of the solution: multicore systems are now more affordable and more accessible. In this paper, we investigate how this can be used to develop more advanced methods for data analytics. We focus on two specific areas: model-driven analysis and data mining using optimisation techniques.
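As a rough sketch of how a multicore system can be exploited for the kind of optimisation-driven analysis mentioned above, the example below evaluates a hypothetical model-fitting objective over a parameter grid in parallel with Python's multiprocessing module; the objective function and grid are illustrative assumptions, not taken from the paper.

```python
# Hypothetical example: score a parameter grid in parallel on a multicore machine.
from multiprocessing import Pool

import numpy as np

def score(params):
    """Illustrative least-squares objective for one candidate parameter pair."""
    a, b = params
    x = np.linspace(0.0, 1.0, 1000)
    residuals = a * x + b - np.sin(x)
    return float(np.sum(residuals ** 2))

if __name__ == "__main__":
    # Candidate parameter grid; each point is scored on a separate worker process.
    grid = [(a, b) for a in np.linspace(0.0, 2.0, 50) for b in np.linspace(-1.0, 1.0, 50)]
    with Pool() as pool:  # defaults to one worker per available core
        scores = pool.map(score, grid)
    best = grid[int(np.argmin(scores))]
    print("best parameters:", best)
```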
Abstract:
The research field of urban computing – defined as “the integration of computing, sensing, and actuation technologies into everyday urban settings and lifestyles” – considers the design and use of ubiquitous computing technology in public and shared urban environments. Its impact on cities, buildings, and spaces evokes innumerable kinds of change. Embedded into our everyday lived environments, urban computing technologies have the potential to alter the meaning of physical space, and affect the activities performed in those spaces. This paper starts a multi-themed discussion of various aspects that make up the, at times, messy and certainly transdisciplinary field of urban computing and urban informatics.
Abstract:
This article proposes a meta-regulation approach to address the gap between objectives, commitment, practice, and outcomes in the accountability practices of global supply chains in developing countries. The literature on accountability practices in global supply chains typically focuses on strategies for raising corporate social accountability standards in multinational buying firms, and seldom on these strategies in the supplier firms of developing countries. This article tries to fill this void by examining the situation in Bangladesh, the third-largest ready-made garment (RMG) supplying country in the world. It conceptualizes a meta-regulation approach with the aim of raising social accountability practice in this industry. It shows that this regulatory approach can effectively raise accountability standards in a context where non-legal drivers are weak, global buying firms are highly profit-driven, and governmental agencies are either inadequate or highly corrupt.
Abstract:
This study investigates stakeholder pressures on corporate climate change-related accountability and disclosure practices in Australia. While existing scholarship investigates stakeholder pressures on companies to discharge their broader accountability through general social and environmental disclosures, there is a lack of research investigating whether and how stakeholder pressures emerge to influence accountability and disclosure practices related to climate change. We surveyed various stakeholder groups to understand their concerns about climate change-related corporate accountability and disclosure practices. We present three primary findings: first, while NGOs and the media have some influence, institutional investors and government bodies (regulators) are perceived to be the most powerful stakeholders in generating climate change-related concern and coercive pressure on corporations to be accountable. Second, corporate climate change-related disclosures, as documented through the Carbon Disclosure Project (CDP), are positively associated with such perceived coercive pressures. Lastly, we find a positive correlation between the level of media attention to climate change and Australian corporate responses to the CDP. Our results indicate that corporations will not disclose climate change information until pressured by non-financial stakeholders. This suggests a larger role for non-financial actors than previously theorized, with several policy implications.
Abstract:
In contrast to a single robotic agent, multi-robot systems are highly dependent on reliable communication. Robots have to synchronize tasks or share poses and sensor readings with other agents, especially for cooperative mapping tasks where local sensor readings are incorporated into a global map. The drawback of existing communication frameworks is that most are based on a central component which has to be constantly within reach. Additionally, they do not prevent data loss between robots if a failure occurs in the communication link. During a distributed mapping task, loss of data is critical because it will corrupt the global map. In this work, we propose a cloud-based publish/subscribe mechanism which enables reliable communication between agents during a cooperative mission, using the Data Distribution Service (DDS) as a transport layer. The usability of our approach is verified by several experiments that take into account complete temporary communication loss.
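The sketch below is a schematic, in-process illustration of the publish/subscribe pattern with buffering across a temporary communication loss; it is an assumed stand-in for exposition and does not use the DDS transport layer or the cloud infrastructure described in the paper.

```python
# Schematic publish/subscribe broker: undelivered messages are buffered per topic
# so a subscriber that (re)connects after a communication loss still receives them.
from collections import defaultdict, deque

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks
        self.backlog = defaultdict(deque)     # topic -> messages awaiting delivery

    def publish(self, topic, message):
        if self.subscribers[topic]:
            for callback in self.subscribers[topic]:
                callback(message)
        else:
            self.backlog[topic].append(message)  # keep data instead of dropping it

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)
        while self.backlog[topic]:               # replay messages missed while offline
            callback(self.backlog[topic].popleft())

# Example: a mapping robot publishes a pose before the map-merging node is reachable.
broker = Broker()
broker.publish("robot1/pose", {"x": 1.0, "y": 2.0})
broker.subscribe("robot1/pose", lambda msg: print("received", msg))
broker.publish("robot1/pose", {"x": 1.5, "y": 2.2})
```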
Abstract:
POSTER: Information Accountability Framework (IAF) to mitigate and manage the risk of data breaches and unauthorised use of medical information (e.g., Electronic Health Records)
Abstract:
The efficient computation of matrix function vector products has become an important area of research in recent times, driven in particular by two important applications: the numerical solution of fractional partial differential equations and the integration of large systems of ordinary differential equations. In this work we consider a problem that combines these two applications, in the form of a numerical solution algorithm for fractional reaction-diffusion equations that, after spatial discretisation, is advanced in time using the exponential Euler method. We focus on the efficient implementation of the algorithm on Graphics Processing Units (GPUs), as we wish to make use of the increased computational power available with this hardware. We compute the matrix function vector products using the contour integration method in [N. Hale, N. Higham, and L. Trefethen. Computing A^α, log(A), and related matrix functions by contour integrals. SIAM J. Numer. Anal., 46(5):2505–2523, 2008]. Multiple levels of preconditioning are applied to reduce the GPU memory footprint and to further accelerate convergence. We also derive an error bound for the convergence of the contour integral method that allows us to pre-determine the appropriate number of quadrature points. Results are presented that demonstrate the effectiveness of the method for large two-dimensional problems, showing a speedup of more than an order of magnitude compared to a CPU-only implementation.
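To illustrate the contour-integration idea referenced above, the sketch below approximates exp(A)b using the Cauchy integral formula with a circular contour and the trapezoidal rule; it is a simplified stand-in for the Hale-Higham-Trefethen method and omits the conformally mapped contour, the preconditioning, and the GPU implementation described in the paper.

```python
# Simplified contour-integral evaluation of a matrix function vector product.
import numpy as np

def contour_expm_times_vector(A, b, n_quad=32):
    """Approximate exp(A) @ b via the Cauchy integral formula on a circular contour."""
    eigs = np.linalg.eigvalsh(A)                       # assumes A is symmetric
    centre = 0.5 * (eigs.min() + eigs.max())
    radius = 0.5 * (eigs.max() - eigs.min()) + 1.0     # circle strictly enclosing the spectrum
    theta = 2.0 * np.pi * (np.arange(n_quad) + 0.5) / n_quad
    z = centre + radius * np.exp(1j * theta)           # quadrature nodes on the contour
    identity = np.eye(A.shape[0])
    acc = np.zeros(A.shape[0], dtype=complex)
    for zk in z:
        resolvent_b = np.linalg.solve(zk * identity - A, b.astype(complex))
        # weight exp(z_k) * (z_k - centre) absorbs dz = i r e^{i theta} dtheta and the 1/(2 pi i) factor
        acc += np.exp(zk) * (zk - centre) * resolvent_b
    return (acc / n_quad).real                         # imaginary part is round-off for real A and b

# Small check: exp(A) @ b for a 2x2 symmetric matrix
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 0.0])
print(contour_expm_times_vector(A, b))
```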
Abstract:
Over the past decade, an exciting area of research has emerged that demonstrates strong links between specific nursing care activities and patient outcomes. This body of research has resulted in the identification of a set of "nursing-sensitive outcomes" (NSOs). These NSOs may be interpreted with more meaning when they are linked to evidence-based best practice guidelines, which provide a structured means of ensuring care is consistent among all health care team members, across geographic locations, and across care settings. Uptake of evidence-based best practices at the point of care has been shown to have a measurable positive impact on processes of care and patient outcomes. The purpose of this paper is to present a systematic, narrative review of the literature regarding the clinical effectiveness of nursing management strategies on stroke patient outcomes sensitive to nursing interventions. Subsequent investigation will explore current applications of nursing-sensitive outcomes to patients with stroke, and identify and validate measurable NSOs within stroke care delivery.