988 results for Cloud Workshop
Abstract:
An International Society of Sugar Cane Technologists (ISSCT) Engineering Workshop was held in Piracicaba, Brazil, from 30 June to 4 July 2008. The theme of the workshop was 'Design, manufacturing and maintenance of sugar mill equipment', and it consisted of a series of technical sessions and site visits. The Brazilian sugar industry is growing rapidly; this growth reflects the industry's position as a key provider of renewable energy in the form of ethanol and, more recently, electricity. The increased focus on electricity is driving investment in high-pressure (100 bar) boilers, cane cleaning plants that allow an increased biomass supply from trash, and digesters that produce biogas from dunder. It is clear that the Brazilian sugar industry has a well-defined place in the country's future. The ISSCT workshop provided a good opportunity to gain information from equipment suppliers and to discuss new technology that may have application in Australia. The new technologies of interest included IMCO sintered carbide shredder hammer tips, Fives Cail MillMax mills, planetary mill gearboxes, Bosch Projects chainless diffusers, Fives Cail Zuka centrifugals and Vaperma Siftek membrane systems.
Abstract:
In the HealthMap project for People With HIV (PWHIV), designers employed a collaborative rapid 'persona-building' workshop with health researchers to develop patient personas that embodied patient-centred design goals and contextual awareness drawn from a variety of qualitative and quantitative data. On reflection, this collaborative rapid workshop was a process for drawing the divergent user research insights and expertise of stakeholders into focus for a chronic disease self-management design. This paper discusses (i) an analysis of the transcript of the workshop and (ii) interviews with five practising senior designers, in order to reflect on how the persona-building process was enacted and on its role in the evolution of the HealthMap design. The collaborative rapid persona-building methodology supported embedding user research insights, eliciting domain expertise, introducing design thinking, facilitating stakeholder collaboration and defining early design requirements. The contribution of this paper is to model the process of collaborative rapid persona-building and to introduce the collaborative rapid persona-building framework as a method for generating design priorities from domain expertise and user research data.
Abstract:
This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.
Abstract:
Cloud computing, which grew out of early virtual machine concepts and technologies, is now a maturing technology that has revolutionized the IT industry, becoming the platform onto which many businesses choose to migrate their on-premises IT services. Cloud solutions have the potential to reduce the capital and operational expenses associated with deploying IT services independently. In this study, we implemented our own private cloud solution, infrastructure as a service (IaaS), using the OpenStack platform with high availability and a dynamic resource allocation mechanism. In addition, we hosted unified communication as a service (UCaaS) on the underlying IaaS and successfully tested voice over IP (VoIP), video conferencing, voice mail and instant messaging (IM) with clients located at a remote site. The proposed solution was developed to guide businesses that want to build their own cloud environment and IaaS and to host cloud services and applications in the cloud. This paper also aims to provide an alternative to proprietary cloud solutions for service providers to consider.
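As a rough illustration of the IaaS layer such a deployment exposes, the sketch below provisions a single instance with the openstacksdk Python client. The cloud entry, image, flavor and network names are hypothetical placeholders; this is a generic OpenStack example, not the configuration used in the study.

```python
import openstack

# Credentials are read from a clouds.yaml file; "mycloud" is a hypothetical entry.
conn = openstack.connect(cloud="mycloud")

image = conn.compute.find_image("ubuntu-22.04")    # hypothetical image name
flavor = conn.compute.find_flavor("m1.small")      # hypothetical flavor name
network = conn.network.find_network("private")     # hypothetical network name

# Boot one instance on the private IaaS and wait until it is ACTIVE.
server = conn.compute.create_server(
    name="ucaas-node-1",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```

A UCaaS workload such as a VoIP server would then be installed on instances provisioned this way, with high availability handled by the surrounding OpenStack services.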
Abstract:
Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses. © 2013 John Wiley & Sons Ltd.
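The joint-probability step in the Methods can be sketched in a few lines: assuming, purely for illustration, that the remaining populations go extinct independently, the metapopulation extinction risk is the product of the per-patch extinction probabilities, evaluated once fewer than five patches remain. The function below is a hypothetical simplification, not the authors' patch-occupancy model.

```python
from math import prod

def joint_extinction_probability(patch_probs, min_patches=5):
    """Joint probability that all remaining populations are lost, evaluated
    once fewer than `min_patches` habitat patches remain. Independence
    between patches is assumed purely for illustration."""
    if len(patch_probs) >= min_patches:
        return 0.0  # above the threshold the metapopulation is treated as viable
    return prod(patch_probs)

# Example: four remaining patches with per-patch extinction probabilities.
print(joint_extinction_probability([0.6, 0.7, 0.5, 0.8]))  # -> 0.168
```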
Abstract:
As computational models in fields such as medicine and engineering become more refined, their resource requirements increase. Initially, these needs were satisfied using parallel computing on HPC clusters. However, such systems are often costly and lack flexibility, so HPC users are tempted to move to elastic HPC using cloud services. One difficulty in making this transition is that HPC and cloud systems differ, and performance may vary. The purpose of this study is to evaluate cloud services as a means to minimise both cost and computation time for large-scale simulations, and to identify which system properties have the most significant impact on performance. Our simulation results show that, while virtual CPU (vCPU) performance is satisfactory, network throughput may lead to difficulties.
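The cost/time trade-off the study evaluates can be sketched with a toy model: compute work speeds up with more vCPUs, while the communication share of a tightly coupled simulation is bounded by network throughput. The instance catalogue, prices and the model itself below are hypothetical illustrations, not measurements from the paper.

```python
# Hypothetical instance catalogue: (name, vCPUs, USD per hour, network Gbit/s).
INSTANCES = [
    ("small",  8, 0.40, 10.0),
    ("large", 32, 1.60, 10.0),
    ("hpc",   32, 2.10, 100.0),
]

def estimated_runtime_h(cpu_hours, vcpus, comm_fraction, net_gbps, baseline_gbps=10.0):
    """Amdahl-style estimate: compute work scales with vCPU count, while the
    communication share is limited by network throughput."""
    compute = cpu_hours * (1.0 - comm_fraction) / vcpus
    comm = cpu_hours * comm_fraction * (baseline_gbps / net_gbps)
    return compute + comm

# A communication-heavy job barely benefits from more vCPUs alone,
# mirroring the paper's finding that network throughput is the bottleneck.
for name, vcpus, price, gbps in INSTANCES:
    t = estimated_runtime_h(cpu_hours=1000.0, vcpus=vcpus, comm_fraction=0.3, net_gbps=gbps)
    print(f"{name:>5}: {t:7.1f} h, ${t * price:8.2f}")
```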
Abstract:
The importance of design practice informed by urban morphology has led to an intensification of interest, signalled by the formation of the ISUF Research and Practice Task Force and voiced through several recent academic publications. Cognisant of this current debate, this paper reports on a recent urban design workshop at which morphology was set as one of the key themes. Initially planned as a concurrent event augmenting the 20th ISUF conference held in Brisbane in 2013, the two-day Bridge to Bridge: Ridge to Ridge urban design workshop instead took place the following month and involved over one hundred design professionals and academics. The workshop sought to develop several key urban design principles and recommendations addressing a major government development proposal sited in the most important heritage precinct of the city. The paper focuses specifically on one of the nine groups, in which the design proposal was purposefully guided by morphological input. The discussion examines the design outcomes and elicits review and feedback from participants, shedding critical light on the issues that arise from such a design approach.
Abstract:
Organizational and technological systems analysis and design practices such as process modeling have received much attention in recent years. However, while knowledge about related artifacts such as models, tools, or grammars has substantially matured, little is known about the actual tasks and interaction activities that are conducted as part of analysis and design acts. In particular, the key role of the facilitator has not been researched extensively to date. In this paper, we propose a new conceptual framework that can be used to examine facilitation behaviors in process modeling projects. The framework distinguishes four behavioral styles in facilitation (the driving engineer, the driving artist, the catalyzing engineer, and the catalyzing artist) that a facilitator can adopt. To distinguish between the four styles, we provide a set of ten behavioral anchors that underpin facilitation behaviors. We also report on a preliminary empirical exploration of our framework through interviews with experienced analysts in six modeling cases. Our research provides a conceptual foundation for an emerging theory describing and explaining the different behaviors associated with process modeling facilitation, offers the first preliminary empirical results about facilitation in modeling projects, and provides a fertile basis for examining facilitation in other conceptual modeling activities.
Abstract:
In contrast to a single robotic agent, multi-robot systems are highly dependent on reliable communication. Robots have to synchronize tasks and share poses and sensor readings with other agents, especially in cooperative mapping tasks where local sensor readings are incorporated into a global map. The drawback of existing communication frameworks is that most are based on a central component which has to be constantly within reach. Additionally, they do not prevent data loss between robots if a failure occurs in the communication link. During a distributed mapping task, loss of data is critical because it will corrupt the global map. In this work, we propose a cloud-based publish/subscribe mechanism which enables reliable communication between agents during a cooperative mission, using the Data Distribution Service (DDS) as a transport layer. The usability of our approach is verified by several experiments, including complete temporary communication loss.
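The buffering behaviour that makes such a mechanism robust to temporary outages can be illustrated with a toy in-process publish/subscribe bus: each subscriber gets its own queue, so messages published while a robot is unreachable are delivered once it reconnects. This is a deliberately simplified stand-in, not the DDS transport the authors use.

```python
from collections import defaultdict, deque

class BufferedBus:
    """Toy publish/subscribe bus with a per-subscriber queue, so a robot that
    temporarily loses connectivity can drain its backlog on reconnect instead
    of losing map updates."""

    def __init__(self):
        self._queues = defaultdict(dict)  # topic -> {subscriber_id: deque}

    def subscribe(self, topic, subscriber_id):
        self._queues[topic][subscriber_id] = deque()

    def publish(self, topic, message):
        # Messages are queued even for subscribers that are currently offline.
        for q in self._queues[topic].values():
            q.append(message)

    def drain(self, topic, subscriber_id):
        q = self._queues[topic][subscriber_id]
        while q:
            yield q.popleft()

bus = BufferedBus()
bus.subscribe("poses", "mapper")
bus.publish("poses", {"robot": 1, "x": 2.0, "y": 3.5})  # mapper offline: buffered
bus.publish("poses", {"robot": 2, "x": 0.4, "y": 1.1})
for msg in bus.drain("poses", "mapper"):                # mapper back online
    print(msg)
```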
Abstract:
The DC9 workshop takes place on June 27, 2015 in Limerick, Ireland and is titled “Hackable Cities: From Subversive City Making to Systemic Change”. The notion of “hacking” originates from the world of media technologies but is increasingly used for creative ideals and practices of city making. “City hacking” evokes more participatory, inclusive, decentralized, playful and subversive alternatives to the often top-down ICT implementations of smart city making. However, these discourses about “hacking the city” are used ambiguously and are loaded with various ideological presumptions, which also makes the term problematic. For some, “urban hacking” is about empowering citizens to organize around communal issues and perform aesthetic urban interventions. For others it raises questions about governance: what kind of “city hacks” should be encouraged or not, and who decides? Can city hacking be curated? For yet others, trendy participatory buzzwords like these are masquerades for deeply libertarian neoliberal values. A further question is how “city hacking” may mature from the tactical level of smart and often playful interventions to the strategic level of enduring impact. The Digital Cities 9 workshop welcomes papers that explore the idea of “hackable city making” in constructive and critical ways.
Abstract:
Cloud computing has significantly impacted a broad range of industries, but these technologies and services have been absorbed throughout the marketplace unevenly. Some industries have moved aggressively towards cloud computing, while others have moved much more slowly. For the most part, the energy sector has approached cloud computing in a measured and cautious way, with progress often in the form of private cloud solutions rather than public ones, or hybridized information technology systems that combine cloud and existing non-cloud architectures. By moving towards cloud computing in a very slow and tentative way, however, the energy industry may prevent itself from reaping the full benefit that a more complete migration to the public cloud has brought about in several other industries. This short communication is accordingly intended to offer a high-level overview of cloud computing, and to put forward the argument that the energy sector should make a more complete migration to the public cloud in order to unlock the major system-wide efficiencies that cloud computing can provide. Also, assets within the energy sector should be designed with as much modularity and flexibility as possible so that they are not locked out of cloud-friendly options in the future.
Abstract:
Guaranteeing Quality of Service (QoS) with minimum computation cost is the most important objective of cloud-based MapReduce computations. Minimizing the total computation cost of cloud-based MapReduce computations is done through MapReduce placement optimization. MapReduce placement optimization approaches can be classified into two categories: homogeneous MapReduce placement optimization and heterogeneous MapReduce placement optimization. It is generally believed that heterogeneous MapReduce placement optimization is more effective than homogeneous MapReduce placement optimization in reducing the total running cost of cloud-based MapReduce computations. This paper proposes a new approach to the heterogeneous MapReduce placement optimization problem. In this new approach, the heterogeneous MapReduce placement optimization problem is transformed into a constrained combinatorial optimization problem and is solved by an innovative constructive algorithm. Experimental results show that the running cost of the cloud-based MapReduce computation platform using this new approach is 24.3%-44.0% lower than that using the most popular homogeneous MapReduce placement approach, and 2.0%-36.2% lower than that using a heterogeneous MapReduce placement approach that does not consider the spare resources of existing MapReduce computations. The experimental results have also demonstrated the good scalability of this new approach.
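To make the flavour of a constructive placement heuristic concrete, the sketch below greedily fills spare slots on already-rented VMs before renting the cheapest VM type for what remains. The catalogue, prices and greedy rule are invented for illustration; the paper's actual constructive algorithm is not reproduced here.

```python
from math import ceil

# Hypothetical VM catalogue: slot capacity and hourly price per type.
VM_TYPES = [
    {"name": "small", "slots": 4,  "price": 0.20},
    {"name": "large", "slots": 16, "price": 0.64},
]

def greedy_placement(jobs, vm_types, spare_slots):
    """Greedy constructive heuristic: use spare slots on already-rented VMs
    first (they cost nothing extra), then rent the cheapest VM type per slot
    for whatever remains."""
    cheapest = min(vm_types, key=lambda v: v["price"] / v["slots"])
    plan, cost = [], 0.0
    for job, slots in jobs:
        from_spare = min(slots, spare_slots)
        spare_slots -= from_spare
        remaining = slots - from_spare
        if remaining:
            n = ceil(remaining / cheapest["slots"])
            cost += n * cheapest["price"]
            plan.append((job, cheapest["name"], n))
    return plan, cost

plan, cost = greedy_placement([("wordcount", 10), ("join", 24)], VM_TYPES, spare_slots=6)
print(plan, f"${cost:.2f}/h")
```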
Abstract:
Information security and privacy in the healthcare domain is a complex and challenging problem for computer scientists, social scientists, law experts and policy makers. Appropriate healthcare provision requires specialized knowledge, is information intensive and much patient information is of a particularly sensitive nature. Electronic health record systems provide opportunities for information sharing which may enhance healthcare services, for both individuals and populations. However, appropriate information management measures are essential for privacy preservation...
Abstract:
The first User-Focused Service Engineering, Consumption and Aggregation (USECA) workshop was held in 2011 in conjunction with the WISE 2011 conference in Sydney, Australia. Web services and related technologies are a widely accepted architectural paradigm for application development. The idea of reusing existing software components to build new applications has been well documented and supported in the world of enterprise computing and professional development. However, this powerful idea has not been transferred to end-users, who have limited or no computing knowledge. The current methodologies, models, languages and tools developed for Web service composition are suited to IT professionals and people with years of training in computing technologies. It is still hard to imagine any of these technologies being used by business professionals, as opposed to computing professionals. © 2013 Springer-Verlag.
Abstract:
Fair Use Week has celebrated the evolution and development of the defence of fair use under copyright law in the United States. As Krista Cox noted, ‘As a flexible doctrine, fair use can adapt to evolving technologies and new situations that may arise, and its long history demonstrates its importance in promoting access to information, future innovation, and creativity.’ While the defence of fair use has flourished in the United States, the adoption of the defence of fair use in other jurisdictions has often been stymied. Professor Peter Jaszi has reflected: ‘We can only wonder (with some bemusement) why some of our most important foreign competitors, like the European Union, haven’t figured out that fair use is, to a great extent, the “secret sauce” of U.S. cultural competitiveness.’ Jurisdictions such as Australia have been at a dismal disadvantage, because they lack the freedoms and flexibilities of the defence of fair use.