936 results for Biology, Bioinformatics|Computer Science
Abstract:
Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool for observing the heart's ability to respond to the normal regulatory impulses that affect its rhythm. A computer-based intelligent system for the analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are nonlinear in nature. Higher-order spectral analysis (HOS) is known to be a good tool for the analysis of nonlinear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeats and seven classes of arrhythmia. We present some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. We also extracted features from the HOS and performed an analysis of variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding p-values below 0.02 in the ANOVA test.
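As a hedged illustration of the signal-processing ingredient (a minimal sketch, not the authors' implementation), the bispectrum of an HRV sequence can be estimated by averaging triple products of segment FFTs; the segment length, window and overlap below are assumptions.

```python
import numpy as np

def bispectrum(x, nfft=128, overlap=0.5):
    """Direct (FFT-based) bispectrum estimate B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)].

    x    : 1-D signal such as an RR-interval series (zero-meaned below).
    nfft : segment length for each FFT -- an assumption, not a value from the paper.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    step = max(1, int(nfft * (1.0 - overlap)))
    half = nfft // 2
    f1, f2 = np.meshgrid(np.arange(half), np.arange(half))
    window = np.hanning(nfft)
    B = np.zeros((half, half), dtype=complex)
    nseg = 0
    for start in range(0, len(x) - nfft + 1, step):
        X = np.fft.fft(x[start:start + nfft] * window)
        B += X[f1] * X[f2] * np.conj(X[f1 + f2])
        nseg += 1
    return B / max(nseg, 1)

# Usage sketch on synthetic data; |B(f1, f2)| is what bispectrum plots visualise,
# and normalising by the segment power spectra gives the bicoherence in [0, 1].
rr = np.random.default_rng(0).standard_normal(1024)
print(bispectrum(rr).shape)
```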
Abstract:
The aggregate structure which occurs in aqueous smectitic suspensions is responsible for poor water clarification, difficulties in sludge dewatering and the unusual rheological behaviour of smectite-rich soils. These macroscopic properties are dictated by the 3-D structural arrangement of the finest smectite fraction within flocculated aggregates. Here, we report results from a relatively new technique, Transmission X-ray Microscopy (TXM), which makes it possible to investigate the internal structure and 3-D tomographic reconstruction of smectite clay aggregates modified by the Al13 Keggin macro-molecule [Al13O4(OH)24(H2O)12]7+. Three different treatment methods were shown to result in three different micro-structural environments in the resulting flocculated material.
Abstract:
Network-induced delay in networked control systems (NCS) is inherently non-uniformly distributed and exhibits a multifractal nature. However, such network characteristics have not been well considered in NCS analysis and synthesis. Making use of information about the statistical distribution of the NCS network-induced delay, a delay-distribution-based stochastic model is adopted to link Quality-of-Control and network Quality-of-Service for NCS with uncertainties. From this model, together with a tighter bounding technique for cross terms, H∞ NCS analysis is carried out with significantly improved stability results. Furthermore, a memoryless H∞ controller is designed to stabilize the NCS and to achieve the prescribed disturbance attenuation level. Numerical examples are given to demonstrate the effectiveness of the proposed method.
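The paper's analysis is carried out analytically; purely as an illustrative companion (an assumption-laden sketch, not the authors' method), the following Monte Carlo simulation shows how a non-uniform network delay distribution can be injected into a toy discrete-time NCS with state feedback. The plant matrices, gain and delay distribution are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete-time plant x(k+1) = A x(k) + B u(k); all values are assumptions.
A = np.array([[1.0, 0.1],
              [0.0, 0.98]])
B = np.array([[0.0],
              [0.1]])
K = np.array([[-1.2, -2.0]])          # assumed state-feedback gain, stabilising the delay-free loop

# Assumed (non-uniform) network-induced delay distribution, in sample periods.
delays = np.array([0, 1, 2, 3])
probs  = np.array([0.5, 0.3, 0.15, 0.05])

def run(steps=500):
    x_hist = [np.array([1.0, 0.0])]
    for k in range(steps):
        d = rng.choice(delays, p=probs)       # delay experienced by the feedback at step k
        x_delayed = x_hist[max(0, k - d)]     # controller acts on an outdated state
        u = K @ x_delayed
        x_hist.append(A @ x_hist[-1] + (B @ u).ravel())
    return np.array(x_hist)

traj = run()
print("final state norm:", np.linalg.norm(traj[-1]))
```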
Abstract:
Industrial applications of the simulated-moving-bed (SMB) chromatographic technology have brought an urgent demand to improve SMB process operation for higher efficiency and better robustness. Improved process modelling and more efficient model computation will pave a path to meeting this demand. However, the SMB unit operation exhibits complex dynamics, leading to challenges in SMB process modelling and model computation. One of the significant problems is how to quickly obtain the steady state of an SMB process model, as process metrics at the steady state are critical for process design and real-time control. The conventional computation method, which solves the process model cycle by cycle and takes the solution only when a cyclic steady state is reached after a certain number of switchings, is computationally expensive. Adopting the concept of the quasi-envelope (QE), this work treats the SMB operation as a pseudo-oscillatory process because of its large number of consecutive switchings. An innovative QE computation scheme is then developed to quickly obtain the steady-state solution of an SMB model from an arbitrary initial condition. The QE computation scheme allows larger steps to be taken for predicting the slow change of the starting state within each switching period. Combined with a wavelet-based technique, this scheme is demonstrated to be effective and efficient for an SMB sugar separation process. Moreover, investigations are also carried out on when the computation scheme should be activated and how the convergence of the scheme is affected by a variable stepsize.
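The quasi-envelope idea can be pictured on a toy switch-to-switch map: instead of iterating every switching period, an envelope step extrapolates the slowly varying starting state across several periods at once. The sketch below uses a simple explicit-Euler envelope step on an invented linear map; it is an illustration of the concept only, not the paper's scheme or its wavelet-based component.

```python
import numpy as np

# Toy switch-to-switch map x_{n+1} = P(x_n); a real SMB model would evaluate one
# full switching period of the column model here (an assumption for illustration).
M = np.array([[0.95, 0.02],
              [0.01, 0.93]])
c = np.array([0.5, 0.3])

def P(x):
    return M @ x + c

def cycle_by_cycle(x0, tol=1e-8, max_iter=100000):
    """Conventional approach: iterate switching by switching until the cyclic steady state."""
    x = np.asarray(x0, float)
    for n in range(max_iter):
        x_new = P(x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new, n + 1
        x = x_new
    return x, max_iter

def quasi_envelope(x0, m=10, tol=1e-8, max_iter=100000):
    """Envelope stepping: one map evaluation advances the starting state by m periods."""
    x = np.asarray(x0, float)
    evals = 0
    while evals < max_iter:
        dx = P(x) - x           # slope of the slowly varying envelope
        evals += 1
        if np.linalg.norm(dx) < tol:
            return x, evals
        x = x + m * dx          # explicit-Euler envelope step over m switchings
    return x, evals

xs, n_slow = cycle_by_cycle([0.0, 0.0])
xf, n_fast = quasi_envelope([0.0, 0.0])
print(n_slow, n_fast, np.linalg.norm(xs - xf))   # same fixed point, far fewer evaluations
```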
Abstract:
UCON is an emerging access control framework that lacks an administration model. In this paper we define the problem of administration and propose a novel administrative model. At the core of this model is the concept of attribute, which is also the central component of UCON. In our model, attributes are created by the assertions of subjects, which ascribe properties/rights to other subjects or objects. Through such a treatment of attributes, administration capabilities can be delegated from one subject to another, and as a consequence UCON is improved in three aspects. First, immutable attributes that are currently considered external to the model can be incorporated and thereby treated as mutable attributes. Second, the current arbitrary categorisation of users (as modifiers of attributes) into system and administrator can be removed; attributes and objects are only modifiable by those who possess administration capability over them. Third, the delegation of administration over objects and properties, which is not currently expressible in UCON, is made possible.
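A minimal sketch of the central idea as described above: attributes are created by subjects' assertions, and administration capability over an attribute can be delegated from one subject to another, so only holders of that capability may modify it. The class and method names are assumptions, not the authors' formal model.

```python
class Attribute:
    """An attribute ascribed to a subject/object by another subject's assertion."""
    def __init__(self, name, value, asserted_by):
        self.name = name
        self.value = value
        self.asserted_by = asserted_by      # the subject whose assertion created it
        self.admins = {asserted_by}         # who may modify or delegate it

    def delegate(self, granter, grantee):
        # Administration over the attribute can be passed on by a current admin.
        if granter not in self.admins:
            raise PermissionError(f"{granter} cannot delegate '{self.name}'")
        self.admins.add(grantee)

    def update(self, subject, new_value):
        # Attributes are modifiable only by subjects holding administration capability.
        if subject not in self.admins:
            raise PermissionError(f"{subject} cannot modify '{self.name}'")
        self.value = new_value


# Usage sketch: 'hr' asserts Alice's role, then delegates its administration to 'it_admin'.
role = Attribute("role", "engineer", asserted_by="hr")
role.delegate("hr", "it_admin")
role.update("it_admin", "senior_engineer")   # allowed
try:
    role.update("alice", "cto")              # rejected: no administration capability
except PermissionError as e:
    print(e)
```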
Abstract:
This article presents a survey of authorisation models and considers their ‘fitness-for-purpose’ in facilitating information sharing. Network-supported information sharing is an important technical capability that underpins collaboration in support of dynamic and unpredictable activities such as emergency response, national security, infrastructure protection, supply chain integration and emerging business models based on the concept of a ‘virtual organisation’. The article argues that present authorisation models are inflexible and poorly scalable in such dynamic environments due to their assumption that the future needs of the system can be predicted, which in turn justifies the use of persistent authorisation policies. The article outlines the motivation and requirement for a new flexible authorisation model that addresses the needs of information sharing. It proposes that a flexible and scalable authorisation model must allow an explicit specification of the objectives of the system and access decisions must be made based on a late trade-off analysis between these explicit objectives. A research agenda for the proposed Objective-based Access Control concept is presented.
Abstract:
The following paper presents an evaluation of airborne sensors for use in vegetation management in powerline corridors. Three integral stages in the management process are addressed: the detection of trees, relative positioning with respect to the nearest powerline, and vegetation height estimation. Multi-spectral and high-resolution image data are analyzed along with LiDAR data captured from fixed-wing aircraft. Ground truth data is then used to establish the accuracy and reliability of each sensor, thus providing a quantitative comparison of sensor options. Tree detection was achieved through crown delineation using a Pulse-Coupled Neural Network (PCNN) and morphological reconstruction applied to multi-spectral imagery. Through testing, this approach was shown to achieve a detection rate of 96%, while the accuracy in correctly segmenting groups of trees and single trees was 75%. Relative positioning using LiDAR achieved an RMSE of 1.4 m and 2.1 m for cross-track distance and along-track position respectively, while Direct Georeferencing achieved an RMSE of 3.1 m in both instances. The estimation of pole and tree heights measured with LiDAR had an RMSE of 0.4 m and 0.9 m respectively, while Stereo Matching achieved 1.5 m and 2.9 m. Overall, only a small number of poles were missed, with detection rates of 98% and 95% for LiDAR and Stereo Matching respectively.
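The accuracy figures quoted above come down to two simple measures; the sketch below shows how they could be computed once detections have been matched to ground truth (the matching itself and the sample numbers are assumptions, not the survey's data).

```python
import numpy as np

def detection_rate(n_ground_truth, n_matched_detections):
    """Fraction of ground-truth trees (or poles) matched by a detection."""
    return n_matched_detections / n_ground_truth

def rmse(estimated, ground_truth):
    """Root-mean-square error of, e.g., cross-track distances or height estimates (metres)."""
    estimated = np.asarray(estimated, float)
    ground_truth = np.asarray(ground_truth, float)
    return float(np.sqrt(np.mean((estimated - ground_truth) ** 2)))

# Usage sketch with made-up numbers, not the survey data:
print(detection_rate(100, 96))                      # -> 0.96
print(rmse([10.2, 7.9, 12.4], [10.0, 8.3, 12.0]))   # height RMSE in metres
```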
Abstract:
With the emergence of multi-cores into the mainstream, there is a growing need for systems that allow programmers and automated systems to reason about data dependencies and inherent parallelism in imperative object-oriented languages. In this paper we exploit the structure of object-oriented programs to abstract computational side-effects. We capture and validate these effects using a static type system, and use them as the basis of sufficient conditions for several different data and task parallelism patterns. We complement our static type system with a lightweight runtime system to allow for parallelization in the presence of complex data flows. We have a functioning compiler and worked examples to demonstrate the practicality of our solution.
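The paper's effect system lives in a statically typed object-oriented language; as a loose, language-shifted sketch of the runtime side of the idea, tasks can declare read/write effect sets and be run in parallel only when their declared effects do not conflict. Everything below is an assumption for illustration, not the paper's system.

```python
from concurrent.futures import ThreadPoolExecutor

class Task:
    def __init__(self, fn, reads, writes):
        self.fn = fn
        self.reads = set(reads)      # abstract regions the task reads
        self.writes = set(writes)    # abstract regions the task writes

def conflict(a, b):
    # Two tasks conflict if either writes a region the other touches.
    return bool(a.writes & (b.reads | b.writes) or b.writes & (a.reads | a.writes))

def run_parallel_if_safe(tasks):
    """Run tasks in parallel when their declared effects are pairwise non-conflicting,
    otherwise fall back to sequential execution."""
    pairwise_safe = all(not conflict(t, u)
                        for i, t in enumerate(tasks) for u in tasks[i + 1:])
    if pairwise_safe:
        with ThreadPoolExecutor() as pool:
            return [f.result() for f in [pool.submit(t.fn) for t in tasks]]
    return [t.fn() for t in tasks]

# Usage sketch: two tasks touching disjoint regions are run in parallel.
data = {"a": list(range(1000)), "b": list(range(1000))}
t1 = Task(lambda: sum(data["a"]), reads={"a"}, writes=set())
t2 = Task(lambda: sum(data["b"]), reads={"b"}, writes=set())
print(run_parallel_if_safe([t1, t2]))
```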
Abstract:
A configurable process model describes a family of similar process models in a given domain. Such a model can be configured to obtain a specific process model that is subsequently used to handle individual cases, for instance, to process customer orders. Process configuration is notoriously difficult as there may be all kinds of interdependencies between configuration decisions. In fact, an incorrect configuration may lead to behavioral issues such as deadlocks and livelocks. To address this problem, we present a novel verification approach inspired by the "operating guidelines" used for partner synthesis. We view the configuration process as an external service, and compute a characterization of all such services which meet particular requirements using the notion of configuration guideline. As a result, we can characterize all feasible configurations (i.e., configurations without behavioral problems) at design time, instead of repeatedly checking each individual configuration while configuring a process model.
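For contrast with the guideline-based approach, the "repeatedly checking each individual configuration" baseline amounts to a reachability search for deadlocks on the configured model. The sketch below does this for a tiny Petri-net-like representation; it is an illustration of that baseline under invented inputs, not the paper's partner-synthesis technique.

```python
from collections import deque

def deadlocks(transitions, initial):
    """Breadth-first search over the reachable markings of a tiny net-like model.

    transitions: list of (consume, produce) pairs, each a dict place -> tokens.
    Returns reachable markings in which nothing is enabled and the 'end' place
    is empty, i.e. behavioural problems such as deadlocks.
    """
    def enabled(marking, consume):
        return all(marking.get(p, 0) >= n for p, n in consume.items())

    def fire(marking, consume, produce):
        m = dict(marking)
        for p, n in consume.items():
            m[p] -= n
        for p, n in produce.items():
            m[p] = m.get(p, 0) + n
        return m

    seen, stuck, queue = set(), [], deque([initial])
    while queue:
        m = queue.popleft()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        fireable = [(c, p) for c, p in transitions if enabled(m, c)]
        if not fireable and m.get("end", 0) == 0:
            stuck.append(m)
        queue.extend(fire(m, c, p) for c, p in fireable)
    return stuck

# Usage sketch: configuring out the second transition strands a token in 'p1'.
configured_net = [({"start": 1}, {"p1": 1})]      # ({"p1": 1}, {"end": 1}) removed by configuration
print(deadlocks(configured_net, {"start": 1}))    # -> [{'start': 0, 'p1': 1}]
```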
Abstract:
To reduce the damage of phishing and spyware attacks, banks, governments, and other security-sensitive industries are deploying one-time password systems, where users have many passwords and use each password only once. If a single password is compromised, it can only be used to impersonate the user once, limiting the damage caused. However, existing practical approaches to one-time passwords have been susceptible to sophisticated phishing attacks. We give a formal security treatment of this important practical problem. We consider the use of one-time passwords in the context of password-authenticated key exchange (PAKE), which allows for mutual authentication, session key agreement, and resistance to phishing attacks. We describe a security model for the use of one-time passwords, explicitly considering the compromise of past (and future) one-time passwords, and show a general technique for building a secure one-time-PAKE protocol from any secure PAKE protocol. Our techniques also allow for the secure use of pseudorandomly generated and time-dependent passwords.
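The paper's contribution is the one-time-PAKE construction itself, which is not reproduced here. As a self-contained illustration of the one-time-password ingredient alone, the classic Lamport-style hash chain lets each password be used once and makes a used password worthless for future logins; this is a swapped-in textbook technique, not the authors' protocol.

```python
import hashlib
import secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def make_chain(length: int):
    """Build a hash chain; after reversal, chain[0] is the anchor the verifier stores
    and chain[1], chain[2], ... are the one-time passwords in order of use."""
    seed = secrets.token_bytes(32)
    chain = [seed]
    for _ in range(length):
        chain.append(H(chain[-1]))
    chain.reverse()
    return chain

class Verifier:
    def __init__(self, anchor: bytes):
        self.current = anchor       # last accepted value

    def check(self, otp: bytes) -> bool:
        # A fresh one-time password must hash to the previously accepted value.
        if H(otp) == self.current:
            self.current = otp      # move forward; this password cannot be replayed
            return True
        return False

chain = make_chain(5)
v = Verifier(chain[0])
print(v.check(chain[1]))   # True  -- first one-time password accepted
print(v.check(chain[1]))   # False -- reuse is rejected
print(v.check(chain[2]))   # True  -- next password in the chain
```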
Abstract:
We provide the first description of and security model for authenticated key exchange protocols with predicate-based authentication. In addition to the standard goal of session key security, our security model also provides for credential privacy: a participating party learns nothing more about the other party's credentials than whether they satisfy the given predicate. Our model also encompasses attribute-based key exchange since it is a special case of predicate-based key exchange. We demonstrate how to realize a secure predicate-based key exchange protocol by combining any secure predicate-based signature scheme with the basic Diffie-Hellman key exchange protocol, providing an efficient and simple solution.
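A bare-bones sketch of the stated generic recipe: each party signs the Diffie-Hellman transcript and the peer verifies the signature before deriving the session key. The signature here is a keyed-hash placeholder standing in for a predicate-based signature, and the toy group parameters are assumptions; none of this is the paper's concrete protocol.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters for illustration only (far too small for real use).
p = 2305843009213693951          # the Mersenne prime 2**61 - 1
g = 3
NBYTES = (p.bit_length() + 7) // 8

def sign(signing_key: bytes, message: bytes) -> bytes:
    """Placeholder for a predicate-based signature; here only a keyed hash (assumption)."""
    return hashlib.sha256(signing_key + message).digest()

def verify(signing_key: bytes, message: bytes, sig: bytes) -> bool:
    return secrets.compare_digest(sign(signing_key, message), sig)

def kdf(shared: int, transcript: bytes) -> bytes:
    return hashlib.sha256(shared.to_bytes(NBYTES, "big") + transcript).digest()

# Each party picks an ephemeral exponent and sends g**x mod p.
a, b = secrets.randbelow(p - 2) + 1, secrets.randbelow(p - 2) + 1
A, B = pow(g, a, p), pow(g, b, p)
transcript = A.to_bytes(NBYTES, "big") + B.to_bytes(NBYTES, "big")

# Each party signs the transcript; the peer verifies before deriving the session key.
key_alice, key_bob = b"alice-credential", b"bob-credential"      # placeholder credentials
sig_alice, sig_bob = sign(key_alice, transcript), sign(key_bob, transcript)
assert verify(key_alice, transcript, sig_alice) and verify(key_bob, transcript, sig_bob)

session_alice = kdf(pow(B, a, p), transcript)
session_bob = kdf(pow(A, b, p), transcript)
assert session_alice == session_bob        # both parties derive the same session key
```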
Abstract:
It is easy to create new combinatorial games but more difficult to predict those that will interest human players. We examine the concept of game quality, its automated measurement through self-play simulations, and its use in the evolutionary search for new high-quality games. A general game system called Ludi is described and experiments conducted to test its ability to synthesize and evaluate new games. Results demonstrate the validity of the approach through the automated creation of novel, interesting, and publishable games.
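Ludi's aesthetic criteria are far richer than this, but the general recipe of measuring quality indicators from self-play can be sketched on a trivial game. Both the game (a subtraction game) and the indicators (first-player balance and game length) below are assumptions chosen for illustration, not the system's actual measures.

```python
import random

def random_playout(pile, max_take):
    """Play one game of a simple subtraction game with uniformly random moves.
    Returns (winner, number_of_moves); the player taking the last counter wins."""
    player, moves = 0, 0
    while pile > 0:
        pile -= random.randint(1, min(max_take, pile))
        moves += 1
        if pile == 0:
            return player, moves
        player = 1 - player

def quality_indicators(pile, max_take, trials=10000):
    """Toy 'quality' measurements from self-play: first-player balance and game length."""
    wins = lengths = 0
    for _ in range(trials):
        winner, moves = random_playout(pile, max_take)
        wins += (winner == 0)
        lengths += moves
    return {"first_player_win_rate": wins / trials, "avg_length": lengths / trials}

# Compare two candidate rule sets; an evolutionary search would favour the better-balanced one.
for rules in [(15, 3), (21, 4)]:
    print(rules, quality_indicators(*rules))
```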
Abstract:
Recent years have seen an increased uptake of business process management technology in industry. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models, so finding and adapting these models may be more effective than developing them from scratch. As process model repositories may be large, query evaluation may be time consuming. Hence, we investigate the use of indexes to speed up this evaluation process. Experiments are conducted to demonstrate that our proposal achieves a significant reduction in query evaluation time.
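One simple way to index a repository, not necessarily the index structure evaluated in the paper, is an inverted index from task labels to model identifiers, so a query is matched in full only against the candidate models that contain its labels. A minimal sketch with invented models:

```python
from collections import defaultdict

def build_label_index(models):
    """models: dict mapping model_id -> set of task labels in that process model."""
    index = defaultdict(set)
    for model_id, labels in models.items():
        for label in labels:
            index[label].add(model_id)
    return index

def candidates(index, query_labels):
    """Models containing every label of the query; only these need full matching."""
    sets = [index.get(label, set()) for label in query_labels]
    return set.intersection(*sets) if sets else set()

# Usage sketch with made-up models.
repo = {
    "order-to-cash":  {"receive order", "check credit", "ship goods", "send invoice"},
    "procure-to-pay": {"create purchase order", "receive goods", "pay invoice"},
    "returns":        {"receive order", "inspect goods", "refund"},
}
idx = build_label_index(repo)
print(candidates(idx, {"receive order", "ship goods"}))   # -> {'order-to-cash'}
```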
Abstract:
Alzaid et al. proposed a forward and backward secure key management scheme in wireless sensor networks for Process Control Systems (PCSs) or Supervisory Control and Data Acquisition (SCADA) systems. The scheme, however, is still vulnerable to the so-called sandwich attack, which can be launched when the adversary captures two sensor nodes at times t1 and t2 and then reveals all the group keys used between times t1 and t2. In this paper, a fix to the scheme is proposed in order to limit the vulnerable time duration to an arbitrarily chosen time span while keeping the forward and backward secrecy of the scheme untouched. A performance analysis of our proposal, Alzaid et al.'s scheme, and Nilsson et al.'s scheme is then given.
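Group keys with forward and backward secrecy are commonly built from a forward hash chain combined with a backward hash chain; the sketch below uses that textbook construction, with a fresh epoch every few steps standing in for the proposed limit on the vulnerable window. The parameters and the exact re-keying rule are assumptions, not the paper's scheme.

```python
import hashlib
import secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def epoch_keys(length):
    """Group keys for one epoch, combining a forward chain and a backward chain.

    A node captured at time i yields the forward state for times >= i and the
    backward values for times <= i, so two captures reveal every key in between
    (the sandwich attack); starting a fresh epoch every `length` steps bounds
    that exposure to a single epoch.
    """
    fwd = [secrets.token_bytes(32)]
    for _ in range(length - 1):
        fwd.append(H(fwd[-1]))                 # forward chain: advances by hashing
    bwd = [secrets.token_bytes(32)]
    for _ in range(length - 1):
        bwd.append(H(bwd[-1]))
    bwd.reverse()                              # backward chain: values released in reverse order
    return [H(f + b) for f, b in zip(fwd, bwd)]

# Re-keying every `epoch_len` steps limits the sandwich-attack window to one epoch.
epoch_len = 8
keys = epoch_keys(epoch_len) + epoch_keys(epoch_len)   # two consecutive epochs
print(len(keys), keys[0].hex()[:16], keys[epoch_len].hex()[:16])
```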