957 results for Beijing (China)--Maps


Relevance:

80.00%

Abstract:

Recently, researchers reported that nanowires (NWs) are often polycrystalline, containing grain or twin boundaries that transect the NW perpendicular to its axis and divide it into a bamboo-like structure. In this work, large-scale molecular dynamics simulation is employed to investigate the torsional behaviour of bamboo-like structured Cu NWs. The presence of grain boundaries is found to induce a considerable reduction in the critical angle, and the more grain boundaries there are, the smaller this reduction becomes; the presence of twin boundaries, by contrast, results in only a relatively small reduction in the critical angle. The introduction of grain boundaries reduces the torsional rigidity of the NW, whereas twin boundaries exert an insignificant influence on it. NWs with grain boundaries are inclined to form local HCP structure during loading, and their plastic deformation is usually evenly distributed along the axis of the NW. The plastic deformation of both the perfect NW and NWs with twin boundaries is dominated by the nucleation and propagation of parallel intrinsic stacking faults. This study enriches the current understanding of the mechanical properties of NWs, which will eventually shed light on their applications.

Relevance:

80.00%

Abstract:

Actin is the most abundant protein in living cells and plays critical roles in force generation and transmission in the cell interior. The fracture mechanism of microfilament networks, whose principal component is actin, can provide insights into the self-protective character of the cytoskeleton. In this study, molecular simulations are conducted to investigate the molecular mechanisms of disruption of microfilament networks from a biophysical viewpoint. Employing a coarse-grained (CG) model of actin filament networks, we focus on the ultimate strength and crack-growth mode of microfilament networks and their dependence on crack length. We find that the fracture mechanism of a microfilament network depends on its structural properties. Structural flaws only marginally change the strength of microfilament networks, which would explain the self-protective character of the cytoskeleton.

Relevance:

80.00%

Abstract:

Acoustic sensing is a promising approach to scaling faunal biodiversity monitoring. Scaling the analysis of audio collected by acoustic sensors is a big-data problem. Standard approaches for dealing with big acoustic data include automated recognition and crowd-based analysis. Automated methods process data quickly but are hard to design rigorously, whilst manual methods are accurate but slow. In particular, manual acoustic data analysis is constrained by a 1:1 time relationship between the data and its analysts: the inherent need to listen to the audio. This paper demonstrates how the efficiency of crowdsourced sound analysis can be increased by an order of magnitude through visual inspection of audio rendered as spectrograms. Experimental data suggest that a speedup of 12× is obtainable for suitable types of acoustic analysis when only spectrograms are shown.
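The claimed gain can be illustrated with a back-of-envelope model: listening imposes a 1:1 ratio between audio duration and analyst time, while spectrogram inspection divides it by the reported factor. The audio volume below is a hypothetical figure, not from the paper:

```python
def analyst_hours(audio_hours, speedup=1.0):
    """Analyst time needed to process a given volume of audio."""
    return audio_hours / speedup

audio = 1000.0                               # hypothetical recording volume
by_listening = analyst_hours(audio)          # 1:1 listening constraint
by_spectrogram = analyst_hours(audio, 12.0)  # reported 12x visual speedup

print(by_listening)              # 1000.0
print(round(by_spectrogram, 1))  # 83.3
```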

Relevance:

80.00%

Abstract:

The Earth and its peoples are facing great challenges. As a species, humans are over-consuming the Earth’s resources and compromising the capacity of both natural and social systems to function in healthy and sustainable ways. Education at all levels and in all contexts has a key role in helping societies move to more sustainable ways of living. Two areas in need of catch-up in relation to Education for Sustainable Development (ESD) are early childhood education and teacher education. Another challenge for ESD is the way it is currently oriented. To date, a great deal of emphasis has been placed on scientific and technological solutions to sustainability issues, which has led to an emphasis on STEM education as education’s main way of addressing sustainability. However, this paper argues that sustainability is primarily a social issue that requires interdisciplinary education approaches. STEM approaches to ESD - emphasising knowledge construction and problem-solving - cannot, on their own, deal effectively with attitudes, values and actions towards more sustainable ways of living. In China and Australia, there are already policies, frameworks, guidelines and initiatives, such as Green Schools and Sustainable Schools, that support such forms of ESD. STEM educators need to reach out to social scientists and social educators in order to engage more fully with activist and collaborative educational responses that equip learners with the knowledge, dispositions and capacities to ‘make a difference’.

Relevance:

80.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. In this paper we therefore propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it against several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and obtains a better solution in a reasonable time. Furthermore, we verify its effectiveness by comparing, for a benchmark problem, the mapper/reducer placement it generates with a conventional placement that puts a fixed number of mappers/reducers on each machine. The comparison shows that the computation using our placement is much cheaper than the computation using the conventional placement while still satisfying the computation deadline.
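As a point of reference for the bin-packing view of the placement problem, the classical first-fit-decreasing heuristic can be sketched as follows. This is a generic baseline, not the paper's proposed heuristic; the task names, demands, and machine capacity are invented for illustration:

```python
def first_fit_decreasing(task_demands, machine_capacity):
    """Classical first-fit-decreasing bin packing: assign each task
    (a mapper/reducer resource demand) to the first machine with room,
    opening a new machine when none fits."""
    machines = []   # remaining capacity of each opened machine
    placement = {}  # task -> machine index
    # Consider tasks in order of decreasing demand
    for task, demand in sorted(task_demands.items(), key=lambda kv: -kv[1]):
        for i, free in enumerate(machines):
            if demand <= free:
                machines[i] -= demand
                placement[task] = i
                break
        else:
            machines.append(machine_capacity - demand)
            placement[task] = len(machines) - 1
    return placement, len(machines)

demands = {"m1": 4, "m2": 3, "r1": 5, "r2": 2, "m3": 2}
placement, used = first_fit_decreasing(demands, machine_capacity=8)
print(used)  # 2
```

The heuristics compared in the paper address a generalization of this problem, but the packing intuition is the same: fewer machines used means lower cost, subject to capacity constraints.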

Relevance:

80.00%

Abstract:

Live migration of multiple Virtual Machines (VMs) has become an indispensable management activity in datacenters for application performance, load balancing and server consolidation. While state-of-the-art live VM migration strategies focus on improving the migration performance of a single VM, little attention has been given to migrating multiple VMs. Moreover, existing work on live VM migration ignores inter-VM dependencies and the underlying network topology and its bandwidth. Different migration sequences and different bandwidth allocations result in different total migration times and total migration downtimes. This paper concentrates on developing a scheduling algorithm for multiple VM migrations such that migration performance is maximized. We evaluate the proposed algorithm through simulation. The simulation results show that it can migrate multiple VMs in a datacenter with minimum total migration time and total migration downtime.
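A minimal model of the scheduling question can be sketched as below, assuming sequential migrations and a simple memory/bandwidth cost. This is an illustrative sketch, not the paper's algorithm: dependent VMs are grouped and migrated back-to-back so the window in which they sit in different datacenters stays short.

```python
def schedule_migrations(vms, dependencies, bandwidth_gbps):
    """vms: {name: memory_gb}; dependencies: iterable of (a, b) pairs.
    Returns a migration order and the total sequential migration time."""
    # Union-find to cluster mutually dependent VMs
    parent = {v: v for v in vms}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path compression
            v = parent[v]
        return v
    for a, b in dependencies:
        parent[find(a)] = find(b)
    clusters = {}
    for v in vms:
        clusters.setdefault(find(v), []).append(v)
    order = []
    # Migrate larger dependency groups first; within a group, smaller VMs first
    for members in sorted(clusters.values(), key=len, reverse=True):
        order.extend(sorted(members, key=lambda v: vms[v]))
    total_time = sum(vms[v] for v in order) / bandwidth_gbps
    return order, total_time

vms = {"web": 2, "db": 8, "cache": 1, "batch": 4}
order, t = schedule_migrations(vms, [("web", "db"), ("web", "cache")], 1.0)
print(order, t)  # ['cache', 'web', 'db', 'batch'] 15.0
```

Under a sequential model the total time is order-independent, which is precisely why downtime and the dependency-split window, rather than raw duration, are where sequencing matters.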

Relevance:

80.00%

Abstract:

The assumptions underlying the Probability Ranking Principle (PRP) have led to a number of alternative approaches that cater or compensate for the PRP's limitations. In this poster we focus on the Interactive PRP (iPRP), which rejects the PRP's assumption of independence between documents. Although the theoretical framework of the iPRP is appealing, no instantiation has previously been proposed or investigated. Here we propose a possible instantiation of the principle and perform the first empirical comparison of the iPRP against the PRP. For document diversification, our results show that the iPRP is significantly better than the PRP, and comparable to or better than other methods such as Modern Portfolio Theory.
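Dropping the independence assumption typically means scoring each next document conditioned on those already ranked. One common greedy scheme in this spirit, shown here as a generic illustration (maximal marginal relevance, not the paper's iPRP instantiation; scores and the toy similarity function are invented), trades relevance against redundancy:

```python
def greedy_diversify(scores, sim, k, lam=0.5):
    """Greedy re-ranking: at each step pick the document maximising
    lam * relevance - (1 - lam) * (max similarity to already-picked docs).
    scores: {doc: relevance}; sim: function(doc_a, doc_b) -> [0, 1]."""
    picked = []
    remaining = set(scores)
    while remaining and len(picked) < k:
        def gain(d):
            redundancy = max((sim(d, p) for p in picked), default=0.0)
            return lam * scores[d] - (1 - lam) * redundancy
        best = max(remaining, key=gain)
        picked.append(best)
        remaining.remove(best)
    return picked

scores = {"d1": 0.9, "d2": 0.85, "d3": 0.4}
# d1 and d2 are near-duplicates under this toy similarity function
sim = lambda a, b: 0.95 if {a, b} == {"d1", "d2"} else 0.1
print(greedy_diversify(scores, sim, k=2))  # ['d1', 'd3']
```

Note how the weaker but novel d3 displaces the near-duplicate d2: exactly the behaviour a document-independent PRP ranking cannot produce.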

Relevance:

80.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. In this paper we therefore propose a new grouping genetic algorithm for the mappers/reducers placement problem in cloud computing. Compared with the original grouping genetic algorithm, ours uses an innovative coding scheme and eliminates the inversion operator, an essential operator in the original. The new algorithm is evaluated by experiments, and the results show that it is much more efficient than four popular algorithms for the problem, including the original grouping genetic algorithm.
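The key idea of grouping genetic algorithms is that the chromosome encodes groups (here, machines) rather than a per-item gene string, so crossover manipulates whole groups. A toy group-injection crossover illustrates this; the sketch is generic and does not reproduce the paper's own coding scheme:

```python
import random

def group_crossover(parent_a, parent_b, rng):
    """Inject one randomly chosen group from parent_b into parent_a.
    Groups of parent_a that lose a member are dissolved, and their
    remaining items are reinserted greedily into the smallest groups."""
    injected = set(rng.choice(parent_b))
    child, displaced = [], []
    for g in parent_a:
        if injected & set(g):  # group touched by the injection: dissolve it
            displaced.extend(i for i in g if i not in injected)
        else:
            child.append(list(g))
    child.append(sorted(injected))
    for i in displaced:        # greedy repair of displaced items
        min(child, key=len).append(i)
    return child

rng = random.Random(0)
a = [[1, 2], [3, 4], [5]]  # each inner list = items placed on one machine
b = [[1, 3], [2, 5], [4]]
child = group_crossover(a, b, rng)
# every item appears exactly once in the repaired child
print(sorted(i for g in child for i in g))  # [1, 2, 3, 4, 5]
```

Because groups, not items, are the unit of inheritance, good packings survive crossover intact, which is the motivation for group-based codings in bin-packing-style problems.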

Relevance:

80.00%

Abstract:

A Software-as-a-Service, or SaaS, can be delivered in a composite form consisting of a set of application and data components that work together to deliver higher-level functional software. Components in a composite SaaS may need to be scaled, that is, replicated or deleted, to accommodate the user load. It may not be necessary to replicate all components of the SaaS, as some components can be shared by other instances. On the other hand, when the load is low, some instances may need to be deleted to avoid resource underutilisation. It is therefore important to determine which components to scale such that the performance of the SaaS is maintained. Extensive research on SaaS resource management in the cloud has not yet addressed the challenges of the scaling process for composite SaaS. We therefore propose a hybrid genetic algorithm that exploits knowledge of the problem and explores the best combination of scaling plans for the components. Experimental results demonstrate that the proposed algorithm outperforms existing heuristic-based solutions.

Relevance:

80.00%

Abstract:

We study two problems of online learning under restricted information access. In the first problem, prediction with limited advice, we consider a game of prediction with expert advice where on each round of the game we query the advice of a subset of M out of N experts. We present an algorithm that achieves O(√((N/M) T ln N)) regret on T rounds of this game. The second problem, the multi-armed bandit with paid observations, is a variant of the adversarial N-armed bandit game where on round t we can observe the reward of any number of arms, but each observation has a cost c. We present an algorithm that achieves O((cN ln N)^(1/3) T^(2/3) + √(T ln N)) regret on T rounds of this game in the worst case. Furthermore, we present a number of refinements that treat arm- and time-dependent observation costs and achieve lower regret under benign conditions. We present lower bounds showing that, apart from the logarithmic factors, the worst-case regret bounds cannot be improved.
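Rendered cleanly (notation as in the abstract: N experts or arms, M experts queried per round, T rounds, c the per-observation cost), the two regret bounds are:

```latex
R_T^{\mathrm{limited\ advice}} = O\!\left( \sqrt{\frac{N}{M}\, T \ln N} \right),
\qquad
R_T^{\mathrm{paid\ observations}} = O\!\left( (c N \ln N)^{1/3}\, T^{2/3} + \sqrt{T \ln N} \right).
```

Note the regime change in the second bound: for small c the √(T ln N) term dominates and full-information rates are recovered, while for larger c the T^(2/3) term reflects the price of buying observations.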

Relevance:

80.00%

Abstract:

First Asia Pacific Conference, AP-BPM 2013, Beijing, China, August 29-30, 2013. Selected Papers

Relevance:

80.00%

Abstract:

Searching for relevant peer-reviewed material is an integral part of the work of corporate and academic researchers. Researchers collect huge amounts of information over the years and sometimes struggle to organize it. Based on a study with 30 academic researchers, we explore, in combination, their searching and archiving activities for document-based information. Based on our results we provide several implications for design.

Relevance:

80.00%

Abstract:

There is considerable interest internationally in developing product libraries to support the use of BIM. Product library initiatives are driven by national bodies, manufacturers and private companies who see their potential. A major issue with the production and distribution of product information for BIM is that separate library objects need to be produced for each of the different software systems that will use the library. This increases the cost of populating product libraries and also the difficulty of maintaining consistency between the representations for the different software over time. This paper describes a project which uses “software transformation” technology from the field of software engineering to support the definition of a single generic representation of a product, which can then be automatically converted to the format required by the receiving software. The paper covers the current state of implementation of the product library, the technology underlying the transformations for the currently supported software, and the business model for creating a national library in Australia. This is placed within the context of other current product library systems to highlight the differences. The responsibilities of the various actors involved in supporting the product library are also discussed.

Relevance:

80.00%

Abstract:

The construction industry accounts for a tenth of global GDP. Still, challenges such as slow adoption of new work processes, islands of information, and legal disputes remain frequent, industry-wide occurrences despite various attempts to address them. In response, IT-based approaches have been adopted to explore collaborative ways of executing construction projects. Building Information Modelling (BIM) is an exemplar of integrative technologies whose 3D-visualisation capabilities have fostered collaboration, especially between clients and design teams. Yet the ways in which specification documents are created and used to capture clients' expectations based on industry standards have remained largely unchanged since the 18th century. As a result, specification-related errors are still commonplace in an industry where vast amounts of information are consumed as well as produced in the course of project implementation in the built environment. This paper briefly distinguishes between non-BIM-based and BIM-based specifications and reports ongoing efforts geared towards the latter. We review exemplars aimed at extending Building Information Models to specification information embedded within the objects in a product library, and explore a viable way of reasoning about a semi-automated specification process using our product library.

Relevance:

80.00%

Abstract:

4D modeling - the simulation and visualisation of the construction process - is now a common method used during the building construction process, with reasonable support from existing software. The goal of this paper is to examine the information needed to model the deconstruction/demolition process of a building. The motivation is the need to reduce impacts on the local environment during the deconstruction process. The focus is on the definition and description of the activities required to remove building components, and on the assessment of the noise, dust and vibration implications of these activities for the surrounding environment. The outcomes of the research are: i. a requirements specification for BIM models to support operational deconstruction process planning; ii. algorithms for augmenting the BIM with the derived information necessary to automate planning of the deconstruction process with respect to impacts on the surrounding environment; and iii. algorithms to build naive deconstruction activity schedules.