969 results for Point cloud
Abstract:
For the past few years, research on secure outsourcing of cryptographic computations has drawn significant attention from academics in the security and cryptology disciplines as well as from information security practitioners. One main reason for this interest is its application to resource-constrained devices such as RFID tags. While there has been significant progress in this domain since Hohenberger and Lysyanskaya provided formal security notions for secure computation delegation, some interesting challenges remain whose solutions would support a wider deployment of cryptographic protocols that enable secure outsourcing of cryptographic computations. This position paper sets out these challenging problems, with RFID technology as the use case, together with our ideas, where applicable, on directions towards solving them.
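As a minimal sketch of the kind of computation delegation at issue (a generic base-blinding trick, not the Hohenberger-Lysyanskaya construction or any scheme from this paper; all names and parameters below are illustrative), a resource-constrained client can outsource a modular exponentiation while hiding the base from the server:

```python
# Sketch: base-blinded delegation of modular exponentiation.
# Hides the base from the untrusted server; it does NOT verify the
# server's answers, unlike the formal delegation schemes discussed here.
import secrets

p = 2**127 - 1                       # public prime modulus (illustrative)
x = 65537                            # public exponent applied by the server

def server_pow(base: int) -> int:
    """Untrusted server: performs the expensive exponentiation."""
    return pow(base, x, p)

def delegate_pow(a: int) -> int:
    """Constrained client: obtains a^x mod p without revealing a."""
    r = secrets.randbelow(p - 2) + 2  # random blinding factor in [2, p-1]
    t1 = server_pow((a * r) % p)      # server sees a*r, not a
    t2 = server_pow(r)                # r^x mod p
    return (t1 * pow(t2, -1, p)) % p  # unblind: a^x = (a*r)^x / r^x

a = 1234567890
assert delegate_pow(a) == pow(a, x, p)
```

The client performs only multiplications and one modular inversion, which is the asymmetry that makes delegation attractive for devices such as RFID tags.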
Abstract:
This paper concerns the impact of the chosen bottle-point method when conducting ion exchange equilibrium experiments. As an illustration, potassium ion exchange with a strong acid cation resin was investigated, owing to its relevance to the treatment of various industrial effluents and groundwater. The “constant mass” bottle-point method was shown to be problematic in that the equilibrium isotherm profiles differed depending upon the resin mass used. Indeed, application of common equilibrium isotherm models revealed that the optimal fit could be with either the Freundlich or Temkin equations, depending upon the conditions employed. It could be inferred that the resin surface was heterogeneous in character, but precise conclusions regarding the variation in the heat of sorption were not possible. Estimation of the maximum potassium loading was also inconsistent when employing the “constant mass” method. The “constant concentration” bottle-point method showed that the Freundlich model was a good representation of the exchange process, and the isotherms recorded were relatively consistent compared with the “constant mass” approach. Unification of all the equilibrium isotherm data acquired was achieved by use of the Langmuir-Vageler expression. The maximum loading of potassium ions was predicted to be at least 116.5 g/kg resin.
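For reference, the equilibrium models named above have the following standard textbook forms (not taken from this paper), where \(q_e\) is the equilibrium loading and \(C_e\) the equilibrium concentration:

\[ q_e = K_F C_e^{1/n} \quad \text{(Freundlich)}, \qquad q_e = \frac{RT}{b_T}\ln(A_T C_e) \quad \text{(Temkin)}. \]

The Langmuir-Vageler expression plays the analogous unifying role but is written in terms of the initial solution-to-resin ratio \(V C_0 / m\) rather than \(C_e\), which is what allows data from different bottle-point conditions to be brought onto a single curve.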
Abstract:
This paper aims to develop a meshless approach based on the Point Interpolation Method (PIM) for the numerical simulation of a space fractional diffusion equation. Two fully discrete schemes for the one-dimensional space fractional diffusion equation are obtained by using the PIM and the strong forms of the space fractional diffusion equation. Numerical examples with different nodal distributions are studied to validate and investigate the accuracy and efficiency of the newly developed meshless approach.
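The abstract does not state the governing equation; a typical one-dimensional space fractional diffusion equation of the kind treated by such meshless strong-form schemes is

\[ \frac{\partial u(x,t)}{\partial t} = K_\alpha \frac{\partial^\alpha u(x,t)}{\partial x^\alpha}, \qquad 1 < \alpha \le 2, \]

where the space fractional derivative can be understood, for example, in the Riemann-Liouville sense:

\[ \frac{\partial^\alpha u(x,t)}{\partial x^\alpha} = \frac{1}{\Gamma(2-\alpha)}\,\frac{\partial^2}{\partial x^2}\int_0^x \frac{u(\xi,t)}{(x-\xi)^{\alpha-1}}\,d\xi. \]

For \(\alpha = 2\) this reduces to the classical diffusion equation.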
Abstract:
Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses.
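A sketch of the final step described in the Methods (the symbols here are introduced for illustration; the full patch-occupancy model is in the paper): if the \(n\) remaining populations go extinct approximately independently with probabilities \(p_1, \dots, p_n\), the regional extinction risk is the joint probability

\[ P_{\text{ext}} = \prod_{i=1}^{n} p_i, \]

evaluated once the number of remaining habitat patches falls below five.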
Abstract:
The April 2015 edition of Curriculum Perspectives has a special focus and casts light on the continuing development of the Australian Curriculum. This paper provides an introduction to a series of papers in the Point and Counterpoint section of this edition on the Review of the Australian Curriculum with reference to History. It makes clear that History is one of the most contested areas of the curriculum and that whilst politicians and policy makers are concerned with the importance of history in relation to national identity and nation building, history serves other purposes. The paper reiterates the need to pay attention to the particularities of discipline-based knowledge for the study of history in schools and the central role of inquiry for student learning in history. In doing so, it establishes the context for the five papers which follow.
Abstract:
As computational models in fields such as medicine and engineering become more refined, their resource requirements increase. Initially, these needs were satisfied using parallel computing and HPC clusters. However, such systems are often costly and lack flexibility. HPC users are therefore tempted to move to elastic HPC using cloud services. One difficulty in making this transition is that HPC and cloud systems are different, and performance may vary. The purpose of this study is to evaluate cloud services as a means of minimising both cost and computation time for large-scale simulations, and to identify which system properties have the most significant impact on performance. Our simulation results show that, while the performance of a Virtual CPU (VCPU) is satisfactory, network throughput may lead to difficulties.
Abstract:
The ways in which technology mediates daily activities are shifting rapidly. Global trends point toward the uptake of ambient and interactive media to create radical new ways of working, interacting and socialising. Tech giants such as Google and Apple are banking on the success of this emerging market by investing in new future-focused consumer products such as Google Glass and the Apple Watch. The potential implications of ubiquitous technological interactions via tangible and ambient media have never been more real or more accessible.
Abstract:
As cities are rapidly developing new interventions against climate change, embedding renewable energy in public spaces is an important strategy. However, most interventions primarily address environmental sustainability while neglecting the social and economic interrelationships of electricity production. Although there is a growing interest in sustainability within environmental design and landscape architecture, public spaces are still awaiting viable energy-conscious design and assessment interventions. The purpose of this paper is to investigate this issue in a renowned public space, Ballast Point Park in Sydney, using a triple bottom line (TBL) case study approach. The emerging factors and relationships of each component of TBL, within the context of public open space, are identified and discussed. With specific focus on renewable energy distribution in and around Ballast Point Park, the paper concludes with a general design framework, which conceptualizes an optimal distribution of onsite electricity produced from renewable sources embedded in public open spaces.
Abstract:
Background: It is important for nutrition intervention in malnourished patients to be guided by accurate evaluation and detection of small changes in the patient's nutrition status over time. However, the current Subjective Global Assessment (SGA) is not able to detect changes over a short period of time. The aim of the study was to determine whether the 7-point SGA is more time-sensitive to nutrition changes than the conventional SGA. Methods: In this prospective study, 67 adult inpatients assessed as malnourished using both the 7-point SGA and the conventional SGA were recruited. Each patient received nutrition intervention and was followed up post-discharge. Patients were reassessed using both tools at 1, 3 and 5 months from the baseline assessment. Results: It took significantly less time to see a one-point change using the 7-point SGA than the conventional SGA (median: 1 month vs. 3 months, p = 0.002). The likelihood of at least a one-point change was 6.74 times greater with the 7-point SGA than with the conventional SGA after controlling for age, gender and medical specialties (odds ratio = 6.74, 95% CI 2.88-15.80, p < 0.001). Fifty-six percent of patients who had no change in SGA score had changes detected using the 7-point SGA. The level of agreement was 100% (k = 1, p < 0.001) between the 7-point SGA and the 3-point SGA and 83% (k = 0.726, p < 0.001) between two blinded assessors for the 7-point SGA. Conclusion: The 7-point SGA is more time-sensitive in its response to nutrition changes than the conventional SGA. It can be used to guide nutrition intervention for patients.
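For readers unfamiliar with the statistic, an adjusted odds ratio of this kind is presumably the exponentiated coefficient of the assessment-tool term in a logistic regression (a sketch of the standard formulation, not the paper's exact model):

\[ \operatorname{logit} \Pr(\text{change} \ge 1\ \text{point}) = \beta_0 + \beta_1\,\text{tool} + \beta_2\,\text{age} + \beta_3\,\text{gender} + \beta_4\,\text{specialty}, \qquad \text{OR} = e^{\beta_1} = 6.74. \]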
Abstract:
In contrast to a single robotic agent, multi-robot systems are highly dependent on reliable communication. Robots have to synchronize tasks or share poses and sensor readings with other agents, especially for cooperative mapping tasks where local sensor readings are incorporated into a global map. The drawback of existing communication frameworks is that most are based on a central component which has to be constantly within reach. Additionally, they do not prevent data loss between robots if a failure occurs in the communication link. During a distributed mapping task, loss of data is critical because it will corrupt the global map. In this work, we propose a cloud-based publish/subscribe mechanism which enables reliable communication between agents during a cooperative mission, using the Data Distribution Service (DDS) as a transport layer. The usability of our approach is verified by several experiments taking into account complete temporary communication loss.
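A minimal sketch of the durable publish/subscribe pattern this builds on, using a hypothetical in-process broker rather than the actual DDS API (DDS itself provides this behaviour through its DURABILITY and RELIABILITY QoS policies): messages are queued per subscriber, so a temporarily unreachable robot receives the backlog on reconnect instead of losing map updates.

```python
# Toy durable publish/subscribe broker (hypothetical illustration only).
from collections import defaultdict, deque

class DurableBroker:
    def __init__(self):
        self.queues = defaultdict(dict)   # topic -> {subscriber: deque}

    def subscribe(self, topic: str, robot_id: str) -> None:
        self.queues[topic].setdefault(robot_id, deque())

    def publish(self, topic: str, msg) -> None:
        # Queue the message for every subscriber, reachable or not,
        # so a communication failure does not drop map updates.
        for q in self.queues[topic].values():
            q.append(msg)

    def poll(self, topic: str, robot_id: str) -> list:
        """Called by a robot when (re)connected: drain its backlog."""
        q = self.queues[topic][robot_id]
        msgs = list(q)
        q.clear()
        return msgs

broker = DurableBroker()
broker.subscribe("map_updates", "robot_2")
broker.publish("map_updates", {"pose": (1.0, 2.0), "scan_id": 42})
# robot_2 was offline during publish; it still receives the update later:
assert broker.poll("map_updates", "robot_2") == [{"pose": (1.0, 2.0), "scan_id": 42}]
```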
Abstract:
An increasingly regulated higher education sector is renewing its attention to those activities referred to as ‘moderation’ in its efforts to ensure that judgements of student achievement are based on appropriate standards. Moderation practices conducted throughout the assessment process can serve the purposes identified as equity, justification, accountability and community building. This paper draws on the limited studies of moderation and wider relevant research on judgement, standards and professional learning to test commonly used moderation practices against these identified purposes. The paper concludes with recommendations for maximising the potential of moderation practices to establish and maintain achievement standards.
Cooperative choice and its framing effect under threshold uncertainty in a provision point mechanism
Abstract:
This paper explores how threshold uncertainty affects cooperative behaviors in the provision of public goods and the prevention of public bads. The following facts motivate our study. First, environmental (resource) problems are framed either as public bads prevention or as public goods provision. Second, the occurrence of these problems is characterized by thresholds that are interchangeably represented as "nonconvexity," "bifurcation," "bi-stability," or "catastrophes." Third, the threshold location is mostly unknown. We employ a provision point mechanism with threshold uncertainty and analyze the responses of cooperative behaviors to uncertainty and to the framing for each type of social preference, categorized by a value orientation test. We find that aggregate framing effects are negligible, although responses to the frame run in opposite directions depending on the type of social preference. "Cooperative" subjects become more cooperative in negative frames than in positive frames, whereas "individualistic" subjects are less cooperative in negative frames than in positive ones. This finding implies that the insignificance of aggregate framing effects arises from behavioral asymmetry. We also find that the percentage of cooperative choices varies non-monotonically with the degree of threshold uncertainty, irrespective of framing and value orientation. Specifically, the degree of cooperation is highest at intermediate levels of threshold uncertainty and decreases as the uncertainty becomes sufficiently large.
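In standard notation (a sketch of the mechanism class, with symbols introduced here for illustration rather than taken from the paper), each of \(n\) subjects contributes \(c_i\), and the public good is provided if and only if total contributions reach an unknown threshold \(T\) drawn uniformly from an interval of half-width \(\delta\) around a mean \(\theta\):

\[ \text{provide} \iff \sum_{i=1}^{n} c_i \ge T, \qquad T \sim U[\theta - \delta,\ \theta + \delta], \]

so the probability of provision given total contributions \(C\) is \(\Pr(C \ge T) = \min\{1, \max\{0, (C - \theta + \delta)/(2\delta)\}\}\), and the degree of threshold uncertainty is governed by \(\delta\). The public bads frame is the mirror image: a loss is prevented rather than a gain provided.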
Abstract:
Cloud computing has significantly impacted a broad range of industries, but these technologies and services have been absorbed throughout the marketplace unevenly. Some industries have moved aggressively towards cloud computing, while others have moved much more slowly. For the most part, the energy sector has approached cloud computing in a measured and cautious way, with progress often in the form of private cloud solutions rather than public ones, or hybridized information technology systems that combine cloud and existing non-cloud architectures. By moving towards cloud computing in a very slow and tentative way, however, the energy industry may prevent itself from reaping the full benefit that a more complete migration to the public cloud has brought about in several other industries. This short communication is accordingly intended to offer a high-level overview of cloud computing, and to put forward the argument that the energy sector should make a more complete migration to the public cloud in order to unlock the major system-wide efficiencies that cloud computing can provide. Also, assets within the energy sector should be designed with as much modularity and flexibility as possible so that they are not locked out of cloud-friendly options in the future.
Abstract:
Guaranteeing Quality of Service (QoS) with minimum computation cost is the most important objective of cloud-based MapReduce computations. Minimizing the total computation cost of cloud-based MapReduce computations is done through MapReduce placement optimization. MapReduce placement optimization approaches can be classified into two categories: homogeneous MapReduce placement optimization and heterogeneous MapReduce placement optimization. It is generally believed that heterogeneous MapReduce placement optimization is more effective than homogeneous MapReduce placement optimization in reducing the total running cost of cloud-based MapReduce computations. This paper proposes a new approach to the heterogeneous MapReduce placement optimization problem. In this new approach, the heterogeneous MapReduce placement optimization problem is transformed into a constrained combinatorial optimization problem and is solved by an innovative constructive algorithm. Experimental results show that the running cost of the cloud-based MapReduce computation platform using this new approach is 24.3%-44.0% lower than that using the most popular homogeneous MapReduce placement approach, and 2.0%-36.2% lower than that using the heterogeneous MapReduce placement approach that does not consider spare resources from existing MapReduce computations. The experimental results also demonstrate the good scalability of this new approach.
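A toy sketch of a constructive placement heuristic of the general kind described (hypothetical VM types, prices, capacities and cost model; the paper's actual algorithm and constraint formulation are not reproduced here): jobs are placed one by one, reusing spare slots on already-rented heterogeneous VMs before renting the cheapest type that fits.

```python
# Toy constructive heuristic for heterogeneous MapReduce placement
# (illustrative only; all data below is made up).
vm_types = [                       # (name, hourly_cost, slot_capacity)
    ("small",  0.05, 2),
    ("medium", 0.09, 4),
    ("large",  0.16, 8),
]

def place(jobs):
    """Greedily assign each job (slots needed) to the cheapest placement,
    reusing spare slots on already-rented VMs before renting new ones."""
    rented = []                    # list of [name, cost, free_slots]
    total_cost = 0.0
    for slots in sorted(jobs, reverse=True):     # big jobs first
        # 1) try spare capacity on an existing VM (costs nothing extra)
        spare = [vm for vm in rented if vm[2] >= slots]
        if spare:
            vm = min(spare, key=lambda v: v[2])  # tightest fit
            vm[2] -= slots
            continue
        # 2) otherwise rent the cheapest VM type that fits the job
        name, cost, cap = min(
            (t for t in vm_types if t[2] >= slots), key=lambda t: t[1]
        )
        rented.append([name, cost, cap - slots])
        total_cost += cost
    return rented, total_cost

placement, cost = place([3, 1, 2, 6, 1])
print(placement, cost)
```

Exploiting spare slots on existing VMs is what step 1 models; dropping it corresponds to the weaker baseline the abstract compares against.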
Abstract:
Background: In the emergency department, portable point-of-care testing (POCT) coagulation devices may facilitate stroke patient care by providing rapid International Normalized Ratio (INR) measurement. The objective of this study was to evaluate the reliability, validity, and impact on clinical decision-making of a POCT device for INR testing in the setting of acute ischemic stroke (AIS). Methods: A total of 150 patients (50 healthy volunteers, 51 anticoagulated patients, 49 AIS patients) were assessed in a tertiary care facility. INRs were measured using the Roche CoaguChek S and the standard laboratory technique. Results: The intraclass correlation coefficient between the POCT device and standard laboratory INRs overall was high (0.932; 95% CI 0.69-0.78). In the AIS group alone, the correlation coefficient was also high (0.937; 95% CI 0.59-0.74), and the diagnostic accuracy of the POCT device was 94%. Conclusions: When used by a trained health professional in the emergency department to assess INR in acute ischemic stroke patients, the CoaguChek S is reliable and provides rapid results. However, as concordance with laboratory INR values decreases at higher INR values, it is recommended that CoaguChek S INRs in the > 1.5 range be confirmed with a standard laboratory measurement.
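For reference, a common one-way random-effects form of the intraclass correlation coefficient (the abstract does not specify which ICC variant was used, so this is only the simplest standard definition) is

\[ \text{ICC} = \frac{MS_B - MS_W}{MS_B + (k-1)\,MS_W}, \]

where \(MS_B\) and \(MS_W\) are the between-subject and within-subject mean squares from a one-way ANOVA and \(k\) is the number of measurements per subject (here \(k = 2\): the POCT device and the laboratory).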