Abstract:
Computational models of cardiomyocyte action potentials (APs) often use large parameter sets. A parameter set can contain elements fitted to experimental data independently of all other elements, elements derived concurrently with other elements to match experimental data, and elements derived purely from phenomenological fitting to produce the desired AP output. Furthermore, models can draw on several different data sets, not always derived under the same conditions or even from the same species. It is consequently uncertain whether the parameter set of a given model is physiologically accurate. Moreover, only recently has the possibility of degeneracy in the parameter values producing a given simulation output started to be addressed. In this study, we examine the effects of varying two parameters, the L-type calcium current (I_CaL) and the delayed rectifier potassium current (I_Ks), in a computational model of a rabbit ventricular cardiomyocyte AP on both the membrane potential (V_m) and the calcium (Ca2+) transient. We then determine whether the model is degenerate with respect to these parameter values, which has important implications for the robustness of these models to cell-to-cell parameter variation, and for whether the current methodology for generating parameter values is flawed. The accuracy of AP duration (APD) as an indicator of AP shape is also assessed.
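As a minimal sketch of the two-parameter sweep described above (hedged: the AP trace below is a toy surrogate with made-up constants, not the rabbit ventricular model used in the study), one can scale I_CaL and I_Ks and measure APD90 for each combination; the scaling factors and the apd90 helper are illustrative assumptions:

```python
import numpy as np

def toy_ap(scale_cal, scale_ks, t):
    """Toy AP trace: idealised upstroke, then a repolarisation whose rate
    depends on the two current scalings (a stand-in for an ionic model)."""
    tau = 250.0 * scale_cal / scale_ks   # ms; more I_CaL prolongs, more I_Ks shortens
    v = -85.0 + 125.0 * np.exp(-t / tau)
    v[t <= 1.0] = 40.0                   # idealised 1 ms upstroke to +40 mV
    return v

def apd90(t, v):
    """Time until V repolarises 90% of the way from peak back to rest."""
    v_rest, v_peak = v[-1], v.max()
    thresh = v_peak - 0.9 * (v_peak - v_rest)
    below = np.where(v < thresh)[0]
    return t[below[0]] if below.size else np.nan

t = np.linspace(0.0, 600.0, 6001)        # ms
# Pairs with the same scale ratio give identical APD90: a simple degeneracy.
for s_cal in (0.5, 1.0, 1.5):
    for s_ks in (0.5, 1.0, 1.5):
        apd = apd90(t, toy_ap(s_cal, s_ks, t))
        print(f"I_CaL x{s_cal:.1f}, I_Ks x{s_ks:.1f}: APD90 = {apd:6.1f} ms")
```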
Abstract:
CCTV and surveillance networks are increasingly being used for operational as well as security tasks. One emerging area of technology that lends itself to operational analytics is soft biometrics. Soft biometrics can be used to describe a person and detect them throughout a sparse multi-camera network, enabling tasks such as determining the time taken to get from point to point, and the paths taken through an environment, by detecting and matching people across disjoint views. However, in a busy environment with hundreds if not thousands of people, such as an airport, attempting to monitor everyone is highly unrealistic. In this paper we propose an average soft biometric that can be used to identify people who look distinct and are thus suitable for monitoring through a large, sparse camera network. We demonstrate how an average soft biometric can be used to identify unique people and calculate operational measures such as the time taken to travel from point to point.
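A hedged sketch of the average-soft-biometric idea, assuming a made-up feature layout and a quantile threshold (neither is specified by the paper): subjects far from the population mean feature vector are flagged as distinct and therefore suitable for network-wide monitoring.

```python
import numpy as np

rng = np.random.default_rng(0)
population = rng.random((1000, 8))   # 1000 people x 8 soft-biometric traits

mean_vec = population.mean(axis=0)   # the "average soft biometric"
dists = np.linalg.norm(population - mean_vec, axis=1)

# Flag, say, the 1% of subjects furthest from the average as "distinct".
threshold = np.quantile(dists, 0.99)
distinct_ids = np.where(dists > threshold)[0]
print(f"{distinct_ids.size} distinct-looking subjects:", distinct_ids)
```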
Abstract:
Epoxy-multiwall carbon nanotube nanocomposite thin films were prepared by spin casting. High-power air plasma was used to preferentially etch the epoxy coating and expose the underlying carbon nanotube network. Scanning electron microscopy (SEM) examination revealed a well-distributed and spatially connected carbon nanotube network in both the longitudinal direction (plasma-etched surface) and the transverse direction (through-thickness fractured surface). Topographical examination and conductive-mode imaging of the plasma-etched surface using an atomic force microscope (AFM) in contact mode enabled direct imaging of the topography and current maps of the embedded carbon nanotube network. Bundles consisting of at least three individual carbon nanotubes form part of the percolating network observed in high-resolution current maps. A predominantly non-ohmic response is obtained in this study, behaviour attributed to less-than-complete removal of polymer material by the air plasma etching.
Abstract:
Acoustic sensors play an important role in augmenting the traditional biodiversity monitoring activities carried out by ecologists and conservation biologists. With this capability, however, comes the burden of analysing large volumes of complex acoustic data, and given that complexity, fully automated analysis for a wide range of species remains a significant challenge. This research investigates the use of citizen scientists to analyse large volumes of environmental acoustic data in order to identify bird species. Specifically, it investigates ways in which a user's efficiency can be improved through species identification tools, and the use of reputation models to predict the accuracy of users with unknown skill levels. Initial experimental results are reported.
Abstract:
Micro aerial vehicles (MAVs) are a rapidly growing area of research and development in robotics. For autonomous robot operation, localization has typically been calculated using GPS, external camera arrays, or onboard range or vision sensing. In cluttered indoor or outdoor environments, onboard sensing is the only viable option. In this paper we present an appearance-based approach to visual SLAM on a flying MAV using only low-quality vision. Our approach consists of a visual place recognition algorithm that operates on 1,000-pixel images, a lightweight visual odometry algorithm, and a visual expectation algorithm that improves both the recall of place sequences and the precision with which they are recalled as the robot flies along a similar path. Using data gathered from outdoor datasets, we show that the system is able to perform visual recognition with low-quality, intermittent visual sensory data. By combining the visual algorithms with the RatSLAM system, we also demonstrate how they enable successful SLAM.
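For illustration only (this is not the authors' code), low-resolution appearance-based place recognition of the kind used in RatSLAM-style systems can be sketched as normalised template matching by sum of absolute differences over roughly 1,000-pixel frames; the threshold and image size here are assumptions:

```python
import numpy as np

def normalise(img):
    """Zero-mean, unit-variance normalisation for some lighting invariance."""
    return (img - img.mean()) / (img.std() + 1e-9)

def best_match(frame, templates, new_place_thresh=0.6):
    """Index of the closest stored place template, or -1 for a new place."""
    frame = normalise(frame)
    sads = [np.abs(frame - t).mean() for t in templates]
    if not sads or min(sads) > new_place_thresh:
        return -1
    return int(np.argmin(sads))

# Toy usage with 40 x 25 = 1,000-pixel frames.
rng = np.random.default_rng(1)
templates = [normalise(rng.random((25, 40))) for _ in range(5)]
query = templates[2] + 0.05 * rng.standard_normal((25, 40))
print("matched place:", best_match(query, templates))   # expect 2
```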
Abstract:
People all over the world are regularly hit by floods, cyclones, and other natural disasters. Many use smart phones and social media to stay connected, seek help, improvise, and cope with crises or challenging situations. This column discusses these practices after dark or during disasters to unveil challenges and opportunities for innovative designs that increase resilience and safety.
Abstract:
Experimental and theoretical studies have shown the importance of stochastic processes in genetic regulatory networks and cellular processes. Cellular networks and genetic circuits often involve small numbers of key proteins, such as transcription factors and signaling proteins. In recent years stochastic models have been used successfully for studying noise in biological pathways, and stochastic modelling of biological systems has become a very important research field in computational biology. One of the challenges in this field is reducing the huge computing time of stochastic simulations. Based on the mitogen-activated protein kinase cascade that is activated by epidermal growth factor, this work presents a parallel implementation using OpenMP, with parallelism across the simulation. Special attention is paid to the independence of the random numbers generated in parallel computing, which is a key criterion for the success of stochastic simulations. Numerical results indicate that parallel computers can be used as an efficient tool for simulating the dynamics of large-scale genetic regulatory networks and cellular processes.
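The key implementation point, independence of the random streams used by concurrent simulations, can be sketched as follows. The paper parallelises with OpenMP in C; this Python sketch uses multiprocessing instead, and a toy birth-death Gillespie simulation stands in for the EGF-activated MAPK cascade. numpy's SeedSequence.spawn provides statistically independent streams per run:

```python
import numpy as np
from multiprocessing import Pool

def ssa_run(seed, k_prod=10.0, k_deg=0.1, x0=0, t_end=50.0):
    """One Gillespie realisation of a birth-death process (toy stand-in)."""
    rng = np.random.default_rng(seed)    # an independent stream per run
    t, x = 0.0, x0
    while True:
        a = (k_prod, k_deg * x)          # reaction propensities
        a0 = a[0] + a[1]
        t += rng.exponential(1.0 / a0)   # time to the next reaction
        if t >= t_end:
            return x
        x += 1 if rng.random() * a0 < a[0] else -1

if __name__ == "__main__":
    seeds = np.random.SeedSequence(42).spawn(1000)  # independent sub-streams
    with Pool() as pool:
        finals = pool.map(ssa_run, seeds)
    print("mean copy number:", np.mean(finals))     # approaches k_prod/k_deg = 100
```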
Abstract:
Experimental action potential (AP) recordings in isolated ventricular myocytes display significant temporal beat-to-beat variability in morphology and duration. Furthermore, significant cell-to-cell differences in AP exist even between isolated cells originating from the same region of the same heart. However, current mathematical models of the ventricular AP fail to replicate the temporal and cell-to-cell variability observed experimentally. In this study, we propose a novel mathematical framework for developing phenomenological AP models capable of capturing cell-to-cell and temporal variability in cardiac APs. A novel stochastic phenomenological model of the AP is developed, based on the deterministic Bueno-Orovio/Fenton model. Experimental AP recordings are fitted to the model to produce AP models of individual cells from the apex and the base of the guinea-pig ventricles. Our results show that the phenomenological model is able to capture the considerable differences between APs recorded from isolated cells originating from these two locations. We demonstrate the closeness of fit to the available experimental data achievable with a phenomenological model, and also the ability of the stochastic form of the model to capture the observed beat-to-beat variability in action potential duration.
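As a hedged illustration of the stochastic ingredient (not the authors' model), jittering a single phenomenological time constant per beat already yields beat-to-beat APD variability; the study instead builds the stochasticity into the Bueno-Orovio/Fenton model:

```python
import numpy as np

rng = np.random.default_rng(3)
tau_base = 250.0                       # ms, nominal repolarisation constant
n_beats = 100
# Jitter the time constant by 5% per beat (an assumed noise level).
taus = tau_base * (1.0 + 0.05 * rng.standard_normal(n_beats))
apd90s = taus * np.log(10.0)           # APD90 of a single-exponential decay
print(f"APD90 over {n_beats} beats: {apd90s.mean():.1f} +/- {apd90s.std():.1f} ms")
```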
Abstract:
In this paper we pursue the task of aligning an ensemble of images in an unsupervised manner, a task commonly referred to in the literature as “congealing”. A form of congealing using a least-squares criterion has recently been demonstrated to have desirable properties over conventional congealing. Least-squares congealing can be viewed as an extension of the Lucas & Kanade (LK) image alignment algorithm. It is well understood that when aligning a single image with another, the alignment performance of the LK algorithm is theoretically and empirically equivalent for additive and compositional warps. In this paper we: (i) demonstrate that this equivalence does not hold for the extended case of congealing, (ii) characterise the inherent drawbacks of least-squares congealing when dealing with large numbers of images, and (iii) propose a novel method for circumventing these limitations through an inverse-compositional strategy that maintains the attractive properties of the original method while being able to handle very large numbers of images.
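A toy one-dimensional congealing sketch under simplifying assumptions (forward-additive updates on translations only, not the paper's inverse-compositional strategy): each signal carries an unknown shift, and a Lucas-Kanade-style least-squares step repeatedly moves every signal toward the mean of the others:

```python
import numpy as np

def warp(signal, shift, x):
    """Translate `signal` by `shift` along the x axis (linear interp)."""
    return np.interp(x + shift, x, signal)

rng = np.random.default_rng(4)
x = np.linspace(-10.0, 10.0, 201)
true_shifts = rng.uniform(-2.0, 2.0, size=8)
signals = [np.exp(-0.5 * (x - s) ** 2) for s in true_shifts]

shifts = np.zeros(len(signals))                    # current estimates
for _ in range(50):                                # congealing iterations
    warped = np.array([warp(s, p, x) for s, p in zip(signals, shifts)])
    for i in range(len(signals)):
        target = np.delete(warped, i, axis=0).mean(axis=0)
        g = np.gradient(warped[i], x)              # d(warped_i)/d(shift_i)
        shifts[i] += g @ (target - warped[i]) / (g @ g)

residual = shifts - true_shifts                    # aligned iff roughly constant
print("spread after congealing:", np.round(residual - residual.mean(), 3))
```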
Abstract:
Probabilistic topic models have recently been used for activity analysis in video processing, owing to their strong capacity to model both local activities and interactions in crowded scenes. In those applications, a video sequence is divided into a collection of uniform non-overlapping video clips, and the high-dimensional continuous inputs are quantized into a bag of discrete visual words. The hard division into video clips and the hard assignment of visual words lead to problems when an activity is split over multiple clips, or when the most appropriate visual word for quantization is unclear. In this paper, we propose a novel algorithm that uses a soft histogram technique to compensate for the information lost in the quantization process, and a soft cut technique in the temporal domain to overcome problems caused by separating an activity into two video clips. In the detection process, we also apply a soft decision strategy to detect unusual events. We show that the proposed soft decision approach outperforms its hard decision counterpart in both local and global activity modelling.
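The soft-histogram idea can be sketched as follows, assuming a Gaussian kernel over the k nearest codewords; the kernel, k and codebook here are illustrative choices, not the paper's configuration:

```python
import numpy as np

def soft_assign(feature, codebook, k=3, sigma=0.5):
    """Soft visual-word histogram for one continuous feature vector."""
    d = np.linalg.norm(codebook - feature, axis=1)
    hist = np.zeros(len(codebook))
    nearest = np.argsort(d)[:k]                      # k closest codewords
    w = np.exp(-d[nearest] ** 2 / (2 * sigma ** 2))  # Gaussian weights
    hist[nearest] = w / w.sum()                      # weights sum to one
    return hist

rng = np.random.default_rng(5)
codebook = rng.random((16, 2))    # 16 codewords in a 2-D feature space
print(np.round(soft_assign(rng.random(2), codebook), 3))
```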
Abstract:
Modelling events in densely crowded environments remains challenging, due to the diversity of events and the noise in the scene. We propose a novel approach to anomalous event detection in crowded scenes using dynamic textures described by the Local Binary Patterns from Three Orthogonal Planes (LBP-TOP) descriptor. The scene is divided into spatio-temporal patches from which LBP-TOP based dynamic textures are extracted, and hierarchical Bayesian models are applied to detect the patches containing unusual events. Our method is unsupervised and relies on neither object tracking nor background subtraction. We show that our approach outperforms existing state-of-the-art algorithms for anomalous event detection on the UCSD dataset.
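For reference, the basic 8-neighbour LBP code on a single plane can be sketched as below; LBP-TOP computes such codes on the XY, XT and YT slices of each spatio-temporal patch and concatenates the three histograms:

```python
import numpy as np

def lbp_codes(img):
    """8-bit LBP code for every interior pixel of a 2-D array."""
    c = img[1:-1, 1:-1]                              # centre pixels
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]      # clockwise neighbours
    codes = np.zeros_like(c, dtype=int)
    h, w = img.shape
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes += (nb >= c).astype(int) << bit        # set bit if neighbour >= centre
    return codes

patch = np.random.default_rng(6).random((8, 8))
hist = np.bincount(lbp_codes(patch).ravel(), minlength=256)
print("non-zero LBP bins in this patch:", np.count_nonzero(hist))
```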
Abstract:
Transmission smart grids will use a digital platform for the automation of high voltage substations. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value digital process connections for smart substation automation. A time synchronisation system is required for a sampled value process bus; however, the details are not defined in IEC 61850-9-2. PTPv2 provides the greatest accuracy of network-based time transfer systems, with timing errors of less than 100 ns achievable. The suitability of PTPv2 for synchronising sampling in a digital process bus is evaluated, with preliminary results indicating that the steady-state performance of low-cost clocks is an acceptable ±300 ns, but that corrections issued by grandmaster clocks can introduce significant transients. Extremely stable grandmaster oscillators are required to ensure that any corrections are small enough not to degrade time synchronisation performance.
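For reference, the offset and path-delay recovery at the heart of PTP time transfer follows from the four Sync/Delay_Req timestamps, assuming a symmetric network path; the timestamp values below are made up:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Classic two-step PTP arithmetic from the four event timestamps."""
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # mean one-way path delay
    return offset, delay

# Sync sent at t1, received at t2; Delay_Req sent at t3, received at t4.
t1, t2, t3, t4 = 1_000, 1_350, 2_000, 2_250   # ns, made-up values
off, d = ptp_offset_delay(t1, t2, t3, t4)
print(f"offset = {off:.0f} ns, path delay = {d:.0f} ns")  # 50 ns, 300 ns
```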
Abstract:
Computational Intelligence Systems (CIS) are a class of advanced software that plays an important role in solving single-objective, inverse and multi-objective design problems in engineering. This paper hybridises a CIS for optimisation with the concept of Nash equilibrium, used as a pre-conditioner to accelerate the optimisation process. The hybridised CIS (Hybrid Intelligence System), coupled to a Finite Element Analysis (FEA) tool and the Computer Aided Design (CAD) system GiD, is applied to an inverse engineering design problem: the reconstruction of High Lift Systems (HLS). Numerical results obtained with the hybridised CIS are compared to those obtained with the original CIS. The benefits of using the Nash equilibrium concept are clearly demonstrated in terms of solution accuracy and optimisation efficiency.
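A hedged sketch of a Nash-game pre-conditioner on a toy coupled objective (a stand-in for the FEA-based high-lift reconstruction): the design variables are split between two players, each optimising its own block while the other's is frozen, and the alternation converges to an equilibrium that can seed the global optimiser:

```python
from scipy.optimize import minimize_scalar

def objective(x, y):
    """Toy coupled cost standing in for the FEA-based design objective."""
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + 0.5 * x * y

x, y = 0.0, 0.0
for _ in range(20):                              # alternate best responses
    x = minimize_scalar(lambda x: objective(x, y)).x
    y = minimize_scalar(lambda y: objective(x, y)).x
print(f"Nash equilibrium near x = {x:.3f}, y = {y:.3f}")  # ~ (1.6, -2.4)
```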