174 results for cloud point


Relevance: 20.00%

Abstract:

The mammalian target of rapamycin (mTOR) is a highly conserved atypical serine-threonine kinase that controls numerous functions essential for cell homeostasis and adaptation in mammalian cells via two distinct protein complexes. Moreover, mTOR is a key regulatory protein in the insulin signalling cascade and has also been characterized as an insulin-independent nutrient sensor that may represent a critical mediator in obesity-related impairments of insulin action in skeletal muscle. Exercise is a remedial modality that enhances mTOR activity and subsequently promotes beneficial metabolic adaptation in skeletal muscle. Thus, the metabolic effects of nutrients and exercise can converge at the mTOR protein complexes and modify mTOR function. Accordingly, the aim of the present review is to highlight the role of mTOR in the regulation of insulin action in response to overnutrition, and the capacity for exercise to enhance mTOR activity in skeletal muscle.

Relevance: 20.00%

Abstract:

Purpose: The use of intravascular devices is associated with a number of potential complications, and despite evidence-based clinical guidelines in this area, nursing practice discrepancies persist. This study examined nursing practice in a cancer care setting to identify current practice and areas for improvement relative to the best available evidence. Methods: A point prevalence survey was undertaken in a tertiary cancer care centre in Queensland, Australia. On a randomly selected day, four nurses assessed intravascular device-related nursing practices and collected data using a standardized survey tool. Results: All 58 inpatients (100%) were assessed. Forty-eight (83%) had a device in situ, comprising 14 Peripheral Intravenous Catheters (29.2%), 14 Peripherally Inserted Central Catheters (29.2%), 14 Hickman catheters (29.2%) and six Port-a-Caths (12.4%). Suboptimal outcomes were observed, including local site complications, incorrect or inadequate documentation, lack of flushing orders, and unclean or non-intact dressings. Conclusions: This study highlighted a number of intravascular device-related nursing practice discrepancies compared with current hospital policy. Education and other implementation strategies can be applied to improve nursing practice, and repeating this survey regularly after such strategies will be valuable for providing feedback to nursing staff. More research is required to inform clinical practice with regard to intravascular device-related consumables, flushing technique and protocols.

Relevance: 20.00%

Abstract:

The topic of “the cloud” has attracted significant attention throughout the past few years (Cherry 2009; Sterling and Stark 2009) and, as a result, academics and trade journals have created several competing definitions of “cloud computing” (e.g., Motahari-Nezhad et al. 2009). Underpinning this article is the definition put forward by the US National Institute of Standards and Technology, which describes cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction” (Garfinkel 2011, p. 3). Despite the lack of consensus about definitions, there is broad agreement on the growing demand for cloud computing: some estimates suggest that spending on cloud-related technologies and services may climb as high as USD 42 billion per year in the next few years (Buyya et al. 2009).

Relevance: 20.00%

Abstract:

Timely and comprehensive scene segmentation is often a critical step for many high-level mobile robotic tasks. This paper examines a projected-area-based neighbourhood lookup approach, motivated by the need for faster unsupervised segmentation of dense 3D point clouds. The proposed algorithm exploits the projection geometry of a depth camera to find nearest neighbours in a time that is independent of the input data size. Points near depth discontinuities are also detected to reinforce object boundaries in the clustering process. The search method is evaluated on both indoor and outdoor dense depth images and demonstrates significant improvements in speed and precision compared to the commonly used Fast Library for Approximate Nearest Neighbors (FLANN) [Muja and Lowe, 2009].
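
A minimal sketch of the core idea, assuming a pinhole depth camera with hypothetical intrinsics (FX, FY, CX, CY) and an illustrative depth-jump threshold; the paper's actual algorithm and parameters are not reproduced here:

```python
import numpy as np

# Hypothetical pinhole intrinsics and depth-jump threshold (illustrative only).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5
DEPTH_EDGE_M = 0.05  # a jump larger than this marks a depth discontinuity

def backproject(depth):
    """Turn an organized depth image (H x W, metres) into an H x W x 3 cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    return np.dstack(((u - CX) * depth / FX, (v - CY) * depth / FY, depth))

def grid_neighbours(depth, row, col, radius=1):
    """Find neighbours of the point at pixel (row, col) by image-grid lookup.

    Because an organized cloud preserves the camera's projection geometry,
    spatial neighbours are adjacent pixels, so the cost is O(radius^2) and
    independent of the total number of points. Candidates across a depth
    discontinuity are rejected so object boundaries stay intact.
    """
    h, w = depth.shape
    centre = depth[row, col]
    out = []
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            r, c = row + dr, col + dc
            if (dr, dc) != (0, 0) and 0 <= r < h and 0 <= c < w:
                if abs(depth[r, c] - centre) < DEPTH_EDGE_M:
                    out.append((r, c))
    return out
```

Unlike a k-d tree or FLANN index, the lookup cost here does not grow with the number of points, which is the source of the speed advantage claimed above.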

Relevance: 20.00%

Abstract:

Point-to-point speed cameras are a relatively new and innovative technological approach to speed enforcement that is increasingly being used in a number of highly motorised countries. Previous research has provided evidence of the positive impact of this approach on vehicle speeds and crash rates, as well as on additional traffic-related outcomes such as vehicle emissions and traffic flow. This paper reports the conclusions and recommendations of a large-scale project involving extensive consultation with international and domestic (Australian) stakeholders to explore the technological, operational and legislative characteristics of the technology. More specifically, it provides a number of recommendations for better practice in the implementation of point-to-point speed enforcement in the Australian and New Zealand context. The broader implications of the research, as well as directions for future research, are also discussed.

Relevance: 20.00%

Abstract:

The geographic location of cloud data storage centres is an important issue for many organisations and individuals, owing to regulations that require data and operations to reside in specific geographic locations. Cloud users may therefore want assurance that their stored data have not been relocated to unknown geographic regions that could compromise their security. Albeshri et al. (2012) combined proof-of-storage (POS) protocols with distance-bounding protocols to address this problem. However, their scheme introduces unnecessary delay when typical POS schemes are used, because of computational overhead on the server side. The aim of this paper is to improve the basic GeoProof protocol by reducing the computational overhead at the server side. We show that this reduction maintains the same level of security while achieving more accurate geographic assurance.
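
To illustrate the distance-bounding idea (not the GeoProof protocol itself), the sketch below shows a single timed challenge-response round in which the round-trip time bounds the prover's distance. The `send`/`recv` transport callables, the shared key, and the distance policy are all hypothetical assumptions:

```python
import hashlib
import hmac
import secrets
import time

SPEED_OF_LIGHT_KM_PER_S = 299_792.458
MAX_ALLOWED_DISTANCE_KM = 100.0  # hypothetical geographic policy

def timed_round(send, recv, key):
    """One challenge-response round of a distance-bounding check.

    The verifier times how long the prover takes to return a MAC over a
    fresh nonce. Since signals cannot travel faster than light, the
    round-trip time upper-bounds the prover's distance; any server-side
    computation inflates that bound, which is why reducing the POS
    overhead (the aim of this paper) tightens the geographic guarantee.
    """
    nonce = secrets.token_bytes(16)
    start = time.perf_counter()
    send(nonce)
    reply = recv()
    rtt_s = time.perf_counter() - start
    mac_ok = hmac.compare_digest(
        reply, hmac.new(key, nonce, hashlib.sha256).digest())
    distance_bound_km = (rtt_s / 2.0) * SPEED_OF_LIGHT_KM_PER_S
    return mac_ok and distance_bound_km <= MAX_ALLOWED_DISTANCE_KM
```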

Relevance: 20.00%

Abstract:

Current GNSS computing modes fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, either in the RINEX file format or as real-time data streams in the RTCM format; very little computation is carried out by the reference station itself. Existing network-based processing modes, whether executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include the precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for the estimated parameters may also be provided. In this mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models; the distinction lies in how the user receiver software deals with corrections from the reference station solutions and with ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations. With station-based solutions from three reference stations within distances of 22–103 km, the user receiver positioning results under various schemes show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to standard float PPP solutions without station augmentation or ambiguity resolution. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to existing network-based RTK or regionally augmented PPP systems.
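
As a rough illustration of how a user receiver might consume station-derived parameters of the kind listed above, the sketch below removes a zenith tropospheric delay (mapped to the line of sight with a crude 1/sin(elevation) mapping), an ionospheric delay and a code bias from a raw pseudorange. The names, numbers and simplified model are illustrative assumptions, not the paper's formulation:

```python
import math

def slant_tropo_m(zenith_tropo_m, elevation_rad):
    """Map a station-derived zenith tropospheric delay to the user's line of
    sight with a simple 1/sin(elevation) mapping (a crude stand-in for the
    mapping functions used in practice)."""
    return zenith_tropo_m / math.sin(elevation_rad)

def corrected_pseudorange_m(raw_pseudorange_m, zenith_tropo_m, iono_delay_m,
                            code_bias_m, elevation_rad):
    """Remove station-provided atmospheric delays and code bias from a raw
    pseudorange before it enters the PPP/RTK observation equations.
    A real filter models many more terms (clocks, phase ambiguities,
    antenna effects, and so on)."""
    return (raw_pseudorange_m
            - slant_tropo_m(zenith_tropo_m, elevation_rad)
            - iono_delay_m
            - code_bias_m)

# Illustrative numbers only: 2.4 m zenith troposphere at 30 degrees elevation.
print(corrected_pseudorange_m(22_345_678.9, 2.4, 3.1, 0.8, math.radians(30.0)))
```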

Relevance: 20.00%

Abstract:

The music industry is going through a period of immense change brought about in part by the digital revolution. What is the role of music in the age of computers and the Internet? How has the music industry been transformed by the economic and technological upheavals of recent years, and how is it likely to change in the future? This thoroughly revised and updated new edition provides an international overview of the music industry and its future prospects in the world of global entertainment. Patrik Wikström illuminates the workings of the music industry, and captures the dynamics at work in the production of musical culture between the transnational media conglomerates, the independent music companies and the public. New to this second edition are expanded sections on the structure of the music industry, online business models and the links between social media and music. Engaging and comprehensive, The Music Industry will be a must-read for students and scholars of media and communication studies, cultural studies, popular music, sociology and economics.

Relevance: 20.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin-packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and obtains better solutions in reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional placement that puts a fixed number of mappers/reducers on each machine. The comparison shows that the computation using our placement is much cheaper than the computation using the conventional placement while still satisfying the computation deadline.
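
For context on the problem class, the sketch below implements first-fit decreasing, a classic bin-packing heuristic, for placing tasks with resource demands onto machines of fixed capacity. It illustrates the underlying bin-packing structure only and is not the heuristic proposed in the paper:

```python
def first_fit_decreasing(demands, capacity):
    """Place tasks (e.g., mappers/reducers with resource demands) onto as few
    machines as possible: sort tasks by decreasing demand, then put each task
    on the first machine with room, opening a new machine when none fits.
    Assumes every demand fits within a single machine's capacity."""
    free = []          # remaining capacity of each opened machine
    placement = {}     # task index -> machine index
    for task in sorted(range(len(demands)), key=lambda i: -demands[i]):
        for m, room in enumerate(free):
            if room >= demands[task]:
                free[m] -= demands[task]
                placement[task] = m
                break
        else:
            free.append(capacity - demands[task])
            placement[task] = len(free) - 1
    return placement, len(free)

# Illustrative usage: nine tasks on machines of capacity 10.
print(first_fit_decreasing([7, 5, 4, 4, 3, 3, 2, 2, 1], 10))
```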

Relevance: 20.00%

Abstract:

Rigid lenses, originally made from glass (between 1888 and 1940) and later from polymethyl methacrylate or silicone acrylate materials, are uncomfortable to wear and are now seldom fitted to new patients. Contact lenses became a popular mode of ophthalmic refractive error correction following the discovery of the first hydrogel material, hydroxyethyl methacrylate, by Czech chemist Otto Wichterle in 1960. To satisfy the requirements for ocular biocompatibility, contact lenses must be transparent and optically stable (for clear vision), have a low elastic modulus (for good comfort), have a hydrophilic surface (for good wettability), and be permeable to certain metabolites, especially oxygen, to allow normal corneal metabolism and respiration during lens wear. A major breakthrough with respect to the last of these requirements was the development of silicone hydrogel soft lenses in 1999, together with techniques for making their surfaces hydrophilic. The vast majority of contact lenses distributed worldwide are mass-produced using cast molding, although spin casting is also used. These advanced mass-production techniques have facilitated the frequent disposal of contact lenses, leading to improvements in ocular health and fewer complications. More than one-third of all soft contact lenses sold today are designed to be discarded daily (i.e., ‘daily disposable’ lenses).

Relevance: 20.00%

Abstract:

Cloud computing is an ongoing revolution in information technology that is disrupting the way individuals and corporate entities operate, while enabling distributed services that did not exist before. At the foundation of cloud computing is the broader concept of converged infrastructure and shared services. Security is often said to be a major concern of users considering migration to cloud computing. This article examines some of these security concerns and surveys recent research efforts in cryptography to provide new technical mechanisms suitable for the new scenarios of cloud computing. We consider techniques such as homomorphic encryption, searchable encryption, proofs of storage, and proofs of location. These techniques allow cloud computing users to benefit from cloud server processing capabilities while keeping their data encrypted, and to independently check the integrity and location of their data. Overall, we are interested in how users may be able to maintain and verify their own security without having to rely on the trust of the cloud provider.
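
As a concrete taste of homomorphic encryption, the sketch below demonstrates the multiplicative homomorphism of textbook RSA: a server can multiply two ciphertexts without ever seeing the plaintexts. The parameters are tiny and deliberately insecure, and this is a conceptual illustration only, not a scheme surveyed in the article:

```python
# Toy demonstration: textbook RSA satisfies E(a) * E(b) mod n = E(a * b),
# so an untrusted server can compute on encrypted values. Insecure toy
# parameters; real homomorphic schemes are far more sophisticated.

def rsa_keygen():
    p, q = 61, 53                 # toy primes (insecure)
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17
    d = pow(e, -1, phi)           # modular inverse (Python 3.8+)
    return (e, n), (d, n)

def encrypt(pub, m):
    e, n = pub
    return pow(m, e, n)

def decrypt(priv, c):
    d, n = priv
    return pow(c, d, n)

pub, priv = rsa_keygen()
a, b = 7, 6
# The "server" multiplies ciphertexts without learning a or b.
product_cipher = (encrypt(pub, a) * encrypt(pub, b)) % pub[1]
assert decrypt(priv, product_cipher) == a * b
print(decrypt(priv, product_cipher))  # 42
```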

Relevance: 20.00%

Abstract:

Introduction: When it comes to sustainable economic development, it is hard to go past the thought of investment in information technology (IT). The foundation of sustainable economic development is sustainable infrastructure, which means that investment in IT is about developing sustainable IT infrastructure. An IT infrastructure is a set of IT tools on which organisations can develop applications to manage their various business processes. At a national economic level, this means developing a national IT infrastructure to provide social and economic services to the various stakeholders. Current troubling economic times call for collaboration and centralisation in IT infrastructure development, a notion that has led to the idea of national broadband networks, sustainable telecommunication platforms, and national IT development plans and goals. However, these thoughts and actions do not directly impact the critical social and economic processes of organisations; rather, they set the tone and direction of actions.

Relevance: 20.00%

Abstract:

Enterprises, both public and private, have rapidly begun exploiting the benefits of enterprise resource planning (ERP) combined with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term dating from the 1960s, long used to describe the problem of unqualified dependence on information systems. A more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP using open data sets in a cloud environment, verifying the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need to enhance identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some currently offered technologies that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
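
One basic building block for the authenticity and integrity services discussed here is a content digest that a data consumer can check against a value published by the data set's originator. A minimal sketch follows, in which the file name and the published digest are hypothetical placeholders:

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Stream a data set file and return its SHA-256 hex digest, so a
    consumer can verify integrity against a digest published out-of-band
    by the data set's originator (e.g., in a signed manifest)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: the digest would come from the data provider.
# PUBLISHED_DIGEST = "<digest from the provider's signed manifest>"
# assert sha256_of_file("open_dataset.csv") == PUBLISHED_DIGEST
```

A digest alone only detects modification; defending against the illicit “impersonation” of whole data sets additionally requires binding the digest to an identity, for example with a digital signature from a trusted naming and addressing service.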

Relevance: 20.00%

Abstract:

Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as to identify new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The more than 50-year-old phrase reflecting mistrust in computer systems, “garbage in, garbage out” or “GIGO”, describes the problem of unqualified and unquestioning dependency on information systems. However, a more relevant GIGO interpretation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP and open data sets as well as “big data” analytics, particularly in a cloud environment, verifying the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision-making based upon questionable, unverifiable results. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need to enhance identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper, and some appropriate technologies currently on offer are examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area. (Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)