912 results for pacs: neural computing technologies
Abstract:
Currently, GNSS computing modes fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, either in the RINEX file format or as real-time data streams in the RTCM format; very little computation is carried out by the reference station itself. Existing network-based processing modes, whether executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include the precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for the estimated parameters may optionally be provided as well. In this mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, the PPP and RTK techniques are unified under the same observation models; the distinction lies in how the user receiver software handles the corrections from the reference station solutions and the ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually from single reference stations. With station-based solutions from three reference stations at distances of 22–103 km, user receiver positioning results obtained with various schemes show that the proposed station-augmented PPP and ambiguity-fixed PPP solutions improve accuracy with respect to standard float PPP solutions without station augmentation or ambiguity resolution. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to existing network-based RTK or regionally augmented PPP systems.
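To make the station-based correction idea concrete, the sketch below (Python, not the paper's software) shows a simplified container for a station-based solution and how a user receiver might strip the corresponding error terms from a raw pseudorange before position estimation; the field names, units and the correction model are assumptions for illustration only.

```python
# Minimal sketch, not the paper's software: field names and the simplified
# correction model are illustrative assumptions.
from dataclasses import dataclass

C = 299_792_458.0  # speed of light (m/s)

@dataclass
class StationSolution:
    """Reference receiver-specific parameters, as listed in the abstract."""
    receiver_clock: float          # station receiver clock (s)
    zenith_tropo_delay: float      # zenith tropospheric delay (m)
    code_bias: float               # differential code bias (m)
    iono_delay: dict               # satellite id -> slant ionospheric delay (m)

def corrected_pseudorange(raw_range, sat_id, sat_clock, station, tropo_mapping):
    """Remove modelled error terms using station-derived corrections, leaving the
    user filter to estimate position, its own clock and ambiguities."""
    return (raw_range
            + C * sat_clock                               # satellite clock (s -> m)
            - station.code_bias                           # code bias from station
            - station.iono_delay[sat_id]                  # interpolated ionosphere
            - station.zenith_tropo_delay * tropo_mapping) # tropo via mapping function
```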
Abstract:
With the advancement of new technologies, this author began in 2010 to engineer an online learning environment for investigating the nature and development of spatial abilities and the teaching and learning of geometry. This paper documents how this new digital learning environment can afford the opportunity to integrate learning about 3D shapes with direction, location and movement, and how young children can mentally and visually construct virtual 3D shapes using movements in both egocentric and fixed frames of reference (FOR). Findings suggest that Year 4 (aged 9) children can develop the capacity to construct a cube using the egocentric FOR only, the fixed FOR only, or a combination of both. However, these young participants were unable to articulate the effect of individual or combined FOR movements. Directions for future research are proposed.
Abstract:
Topic modelling has been widely used in the fields of information retrieval, text mining, machine learning, etc. In this paper, we propose a novel model, the Pattern Enhanced Topic Model (PETM), which improves topic modelling by semantically representing topics with discriminative patterns, and also makes an innovative contribution to information filtering by utilising the proposed PETM to determine document relevance based on topic distributions and the maximum matched patterns proposed in this paper. Extensive experiments are conducted to evaluate the effectiveness of PETM using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model significantly outperforms both state-of-the-art term-based models and pattern-based models.
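As a rough illustration of the relevance idea described above (a sketch under my own assumptions, not the PETM equations), a document score can be computed by weighting each topic's probability by the best discriminative pattern of that topic that the document fully matches:

```python
# Sketch only: the scoring rule and names below are illustrative assumptions.
def relevance(doc_terms, topic_dist, topic_patterns):
    """doc_terms: set of terms in the document;
    topic_dist[z]: weight of topic z in the user's interest model;
    topic_patterns[z]: list of (pattern_terms, weight) pairs for topic z."""
    score = 0.0
    for z, p_z in enumerate(topic_dist):
        # "Maximum matched pattern": the highest-weighted pattern for this topic
        # whose terms are all contained in the document.
        matched = [w for terms, w in topic_patterns[z] if set(terms) <= doc_terms]
        if matched:
            score += p_z * max(matched)
    return score

# Example: two topics with a few patterns each.
doc = {"neural", "network", "training", "gpu"}
topics = [0.7, 0.3]
patterns = [[(["neural", "network"], 0.9), (["deep", "learning"], 0.8)],
            [(["gpu", "kernel"], 0.6)]]
print(relevance(doc, topics, patterns))   # 0.7 * 0.9 = 0.63
```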
Abstract:
Study Approach
The results presented in this report are part of a larger global study on the major issues in BPM. Only one part of the larger study is reported here, viz. interviews with BPM experts. Interviews with BPM tool vendors, together with focus group studies involving user organizations, were conducted in parallel and set the groundwork for the identification of BPM issues on a global scale. Through this multi-method approach, we identify four distinct sets of outcomes. First, as is the focus of this report, we identify the BPM issues as perceived by BPM experts. Second, the research design allows us to gain insight into the opinions of organizations deploying BPM solutions. Third, an understanding of organizations’ misconceptions of BPM technologies, as confronted by BPM tool vendors, is obtained. Last, we seek to gain an understanding of BPM issues on a global scale, together with knowledge of matters of concern. This final outcome is intended to produce an industry-driven research agenda that will inform practitioners and, in particular, the research community worldwide about issues and challenges that are prevalent or emerging in BPM and related areas...
Abstract:
The interest generated by business process re-engineering and information technologies reveals the emergence of the process management paradigm. Although many studies have been published on alternative process modelling tools and techniques, little attention has been paid to the post-hoc evaluation of process modelling activities or to establishing guidelines on how to conduct process modelling effectively. The present study aims to fill this gap. We present the results of a detailed case study, conducted in a leading Australian organisation, with the goal of building a model of process modelling success.
Abstract:
It is widely acknowledged that effective asset management requires an interdisciplinary approach, in which synergies should exist between traditional disciplines such as accounting, engineering, finance, humanities, logistics, and information systems technologies. Asset management is also an important yet complex business practice. Business process modelling is proposed as an approach to managing the complexity of asset management through the modelling of asset management processes. A sound foundation for the systematic application and analysis of business process modelling in asset management has, however, yet to be developed. Fundamentally, a business process consists of activities (termed functions), events/states, and control flow logic. As both events/states and control flow logic are somewhat dependent on the functions themselves, it is a logical first step to identify the functions within a process. This research addresses the current gap in knowledge by developing a method to identify functions common to various industry types (termed core functions). This lays the foundation for extracting such functions, so as to identify both commonalities and variation points in asset management processes. The method combines manual text mining with a taxonomy approach. An example is presented.
Abstract:
Much of what is written about digital technologies in preschool contexts focuses on young children’s acquisition of skills rather than their meaning-making during the use of technologies. In this paper, we consider how the viewing of a YouTube video was used by a teacher and children to produce shared understandings about it. Conversation analysis of talk and interaction during the viewing of the video establishes some of the ways that individual accounts of events were produced for others and then endorsed as shared understandings. The analysis establishes how adults and children made use of verbal and embodied actions during interactions to produce shared understandings of the YouTube video, the events it recorded and the written commentary about those events.
Abstract:
Server consolidation using virtualization technology has become an important technique for improving the energy efficiency of data centers, and virtual machine placement is the key problem in server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing virtual machine placement approaches consider only the energy consumed by the physical machines in a data center and ignore the energy consumed by its communication network. This network energy consumption is not trivial and should therefore be considered in virtual machine placement. In our preliminary research, we proposed a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the physical machines and the communication network in a data center. Aiming to improve the performance and efficiency of that genetic algorithm, this paper presents a hybrid genetic algorithm for the energy-efficient virtual machine placement problem. Experimental results show that the hybrid genetic algorithm significantly outperforms the original genetic algorithm and that it is scalable.
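A minimal sketch of the kind of objective such an algorithm optimises (my own illustration with an assumed linear power model and per-hop network cost, not the paper's formulation): the fitness of a placement sums the power drawn by the active physical machines and the energy spent carrying inter-VM traffic across the network.

```python
# Sketch only: the power model, constants and function names are assumptions.
P_IDLE, P_PEAK = 170.0, 300.0   # assumed physical-machine power model (W)
E_PER_GB_HOP = 0.5              # assumed network energy per GB per switch hop

def pm_power(util):
    """Linear power model: idle power plus a load-proportional term."""
    return P_IDLE + (P_PEAK - P_IDLE) * util if util > 0 else 0.0

def placement_energy(placement, vm_cpu, pm_capacity, traffic, hops):
    """placement[v]: physical machine hosting VM v;
    traffic[(a, b)]: data volume between VMs a and b (GB);
    hops(p, q): network distance between PMs p and q (0 if co-located)."""
    # Energy consumed by the switched-on physical machines.
    util = {}
    for vm, pm in enumerate(placement):
        util[pm] = util.get(pm, 0.0) + vm_cpu[vm] / pm_capacity[pm]
    e_pm = sum(pm_power(u) for u in util.values())
    # Energy consumed by the communication network.
    e_net = sum(vol * E_PER_GB_HOP * hops(placement[a], placement[b])
                for (a, b), vol in traffic.items())
    return e_pm + e_net   # a genetic algorithm would minimise this as its fitness

# Tiny example: 4 VMs on 2 PMs, VMs 0 and 2 exchange traffic across 2 hops.
print(placement_energy([0, 0, 1, 1], [0.2, 0.3, 0.4, 0.1], {0: 1.0, 1: 1.0},
                       {(0, 2): 10.0}, lambda p, q: 0 if p == q else 2))
```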
Abstract:
The introduction of safety technologies into complex socio-technical systems requires an integrated and holistic approach to human factors (HF) and engineering, considering the effects of failures not only within system boundaries but also at the interfaces with other systems and humans. Level crossing warning devices are examples of such systems, where technically safe states within the system boundary can influence road user performance, giving rise to other hazards that degrade the safety of the system. Chris will discuss the challenges encountered to date in developing a safety argument in support of low-cost level crossing warning devices. The design and failure modes of level crossing warning devices are known to have a significant influence on road user performance; however, quantifying this effect is one of the ongoing challenges in determining appropriate reliability and availability targets for low-cost level crossing warning devices.
Abstract:
The purpose of this paper is to empirically examine the state of cloud computing adoption in Australia. I focus specifically on the drivers, risks, and benefits of cloud computing from the perspective of IT experts and forensic accountants, and I use thematic analysis of interview data to answer the research questions of the study. The findings suggest that cloud computing is increasingly gaining a foothold in many sectors due to advantages such as flexibility and speed of deployment. However, security remains an issue, and adoption is therefore likely to be selective and phased. Of particular concern is the involvement of third parties and foreign jurisdictions, which in the event of damage may complicate litigation and forensic investigations. This is one of the first empirical studies to report on cloud computing adoption and experiences in Australia.
Abstract:
Highlights
• Diabetic foot ulcers (DFUs) are a major complication of diabetes.
• We describe the development of next-generation technologies for DFU repair.
• We highlight the modest success of growth factor-, scaffold-, and cell-based DFU therapies.
• We rationalize that combination therapies will be necessary to enable effective and reliable DFU repair.
Abstract:
Biodiesel, produced from renewable feedstocks, represents a more sustainable source of energy and will therefore play a significant role in providing the energy requirements for transportation in the near future. Chemically, all biodiesels are fatty acid methyl esters (FAME), produced from raw vegetable oil and animal fat. However, clear differences in chemical structure are apparent from one feedstock to the next in terms of chain length, degree of unsaturation, number of double bonds and double bond configuration, which together determine the fuel properties of the biodiesel. In this study, prediction models were developed to estimate the kinematic viscosity of biodiesel using an Artificial Neural Network (ANN) modelling technique. In developing the model, 27 parameters based on the chemical composition commonly found in biodiesel were used as input variables, and the kinematic viscosity of the biodiesel was used as the output variable. The data needed to develop and simulate the network were collected from more than 120 published peer-reviewed papers. The Neural Network Toolbox of MATLAB R2012a was used to train, validate and simulate the ANN model on a personal computer. The network architecture and learning algorithm were optimised by trial and error to obtain the best prediction of the kinematic viscosity. The predictive performance of the model was assessed by calculating the coefficient of determination (R2), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found high predictive accuracy of the ANN in predicting the fuel properties of biodiesel and has demonstrated the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties. The model developed in this study can therefore be a useful tool for accurately predicting biodiesel fuel properties instead of undertaking costly and time-consuming experimental tests.
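The original model was built with MATLAB's neural network tooling; as a rough analogue, the sketch below trains a small feed-forward regressor on 27 composition descriptors in Python. The data here are random placeholders, and the layer size and train/test split are illustrative assumptions, not the optimised architecture reported in the study.

```python
# Sketch only: placeholder data and an assumed architecture, not the study's model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.random((120, 27))             # 27 composition descriptors per sample
y = 3.5 + 0.1 * (X @ rng.random(27))  # placeholder kinematic viscosity (mm^2/s)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_tr)

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)

pred = model.predict(scaler.transform(X_te))
print("R2 on held-out samples:", r2_score(y_te, pred))
```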
Abstract:
Olfactory ensheathing cells (OECs), the glial cells of the olfactory nervous system, exhibit unique growth-promoting and migratory properties that make them interesting candidates for cell therapies targeting neuronal injuries such as spinal cord injury. Transplantation of olfactory cells is feasible and safe in humans; however, functional outcomes are highly variable, with some studies showing dramatic improvements and others no improvement at all. We propose that the reason for this is that the identity and purity of the cells differ between studies. We have shown that olfactory ensheathing cells are not a uniform cell population and that individual subpopulations of OECs are present in different regions of the olfactory nervous system, with strikingly different behaviors. Furthermore, the presence of fibroblasts and other cell types in the transplant can dramatically alter the behavior of the transplanted glial cells. Thus, a thorough characterization of the differences between olfactory ensheathing cell subpopulations, and of how the behavior of these cells is affected by the presence of other cell types, is highly warranted.
Abstract:
This research introduces a general methodology for creating a Coloured Petri Net (CPN) model of a security protocol. Standard or user-defined security properties of the created CPN model are then identified. After an attacker model is added to the protocol model, the security property is verified using the state space method. This approach is applied to analyse a number of trusted computing protocols. The results show the applicability of the proposed method for analysing both standard and user-defined properties.
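As a hedged illustration of the state space method mentioned above (a generic explicit-state search, not a CPN model and not the paper's tooling), the snippet below explores protocol-plus-attacker states breadth-first and reports the first reachable state that violates a user-supplied property:

```python
# Sketch only: generic explicit-state search, not a Coloured Petri Net analysis.
from collections import deque

def check_property(initial_state, successors, violates):
    """successors(s) yields states reachable in one protocol or attacker step;
    violates(s) encodes the (standard or user-defined) property of interest."""
    seen, queue = {initial_state}, deque([initial_state])
    while queue:
        state = queue.popleft()
        if violates(state):
            return state                      # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None                               # property holds on all reachable states

# Toy example: a counter where each step may add 1 or 2; the property forbids
# ever exceeding 5. The search returns the first violating state it reaches.
trace = check_property(0, lambda s: [s + 1, s + 2] if s < 10 else [],
                       lambda s: s > 5)
print(trace)   # 6
```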
Jacobian-free Newton-Krylov methods with GPU acceleration for computing nonlinear ship wave patterns
Abstract:
The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-dimensional ship wave patterns, such as the shape of steep waves close to their limiting configuration, in a manner that has been possible in the two-dimensional analogue for some time.
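A minimal, hedged sketch of the same solver strategy on a much simpler problem (a 1D nonlinear model equation rather than the ship-wave boundary-integral system, and CPU-only rather than GPU-accelerated): SciPy's Jacobian-free Newton-Krylov solver is combined with a banded preconditioner whose entries come from the Jacobian of the linearised, constant-coefficient operator.

```python
# Sketch only: a toy 1D problem standing in for the ship-wave system.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu, LinearOperator
from scipy.optimize import newton_krylov

n = 200
h = 1.0 / (n + 1)

def residual(u):
    """Discrete residual of u'' - u**3 + 1 = 0 with zero Dirichlet boundaries."""
    lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)
    lap[0] = u[1] - 2 * u[0]       # left boundary value is zero
    lap[-1] = u[-2] - 2 * u[-1]    # right boundary value is zero
    return lap / h**2 - u**3 + 1.0

# Banded preconditioner: the Jacobian of the linearised problem (the constant
# diffusion operator), factorised once and reused in every inner Krylov solve.
main = -2.0 * np.ones(n) / h**2
off = np.ones(n - 1) / h**2
J_lin = sp.diags([off, main, off], [-1, 0, 1], format="csc")
lu = splu(J_lin)
M = LinearOperator((n, n), matvec=lu.solve)

u = newton_krylov(residual, np.zeros(n), method="lgmres", inner_M=M, f_tol=1e-10)
print("max |residual| =", np.abs(residual(u)).max())
```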