946 results for RM (rate monotonic) algorithm


Relevance:

30.00%

Publisher:

Abstract:

The effect of temperature on respiration rate has been established, using Cartesian divers, for the meiofaunal sabellid polychaete Manayunkia aestuarina, the free-living nematode Sphaerolaimus hirsutus and the harpacticoid copepod Tachidius discipes from a mudflat in the Lynher estuary, Cornwall, U.K. Over the temperature range normally experienced in the field, i.e. 5-20 °C, the size-compensated respiration rate (R_c) was related to the temperature (T) in °C by the equations log10 R_c = -0.635 + 0.0339T for Manayunkia, log10 R_c = 0.180 + 0.0069T for Sphaerolaimus and log10 R_c = -0.428 + 0.0337T for Tachidius, equivalent to Q_10 values of 2.19, 1.17 and 2.17 respectively. In order to derive the temperature response for Manayunkia, a relationship was first established between respiration rate and body size: log10 R = 0.05 + 0.75 log10 V, where R = respiration in nl O2·ind^-1·h^-1 and V = body volume in nl. The Q_10 values are compared with values for other species derived from the literature. From these limited data a dichotomy emerges: species with a Q_10 ≈ 2, which apparently feed on diatoms and bacteria, the abundance of which is subject to large short-term variability, and species with Q_10 ≈ 1, apparently dependent on more stable food sources.
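
The reported Q_10 values follow directly from the slopes of these log-linear fits. A minimal sketch in Python (the function name is illustrative, not from the paper):

```python
# Q10 from a log-linear rate/temperature fit:
# log10(Rc) = a + b*T  implies  Rc(T+10) / Rc(T) = 10**(10*b)

def q10_from_slope(b: float) -> float:
    """Convert the slope b of log10(rate) vs temperature (deg C) into a Q10."""
    return 10 ** (10 * b)

for species, slope in [("Manayunkia", 0.0339),
                       ("Sphaerolaimus", 0.0069),
                       ("Tachidius", 0.0337)]:
    print(species, round(q10_from_slope(slope), 2))
# -> about 2.18, 1.17 and 2.17; the small differences from the reported
#    2.19, 1.17 and 2.17 come from rounding of the published slopes.
```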

Relevance:

30.00%

Publisher:

Abstract:

Coccolithophores are the largest source of calcium carbonate in the oceans and are considered to play an important role in oceanic carbon cycles. Current methods to detect the presence of coccolithophore blooms from Earth observation data often produce high numbers of false positives in shelf seas and coastal zones due to the spectral similarity between coccolithophores and other suspended particulates. Current methods are therefore unable to characterise the bloom events in shelf seas and coastal zones, despite the importance of these phytoplankton in the global carbon cycle. A novel approach to detect the presence of coccolithophore blooms from Earth observation data is presented. The method builds upon previous optical work and uses a statistical framework to combine spectral, spatial and temporal information to produce maps of coccolithophore bloom extent. Validation and verification results for an area of the north east Atlantic are presented using an in situ database (N = 432) and all available SeaWiFS data for 2003 and 2004. Verification results show that the approach produces a temporal seasonal signal consistent with biological studies of these phytoplankton. Validation using the in situ coccolithophore cell count database shows a high correct recognition rate of 80% and a low false-positive rate of 0.14 (in comparison to 63% and 0.34 respectively for the established, purely spectral approach). To guide its broader use, a full sensitivity analysis for the algorithm parameters is presented.

Relevance:

30.00%

Publisher:

Abstract:

Satellite-based remote sensing of active fires is the only practical way to consistently and continuously monitor diurnal fluctuations in biomass burning from regional, to continental, to global scales. Failure to understand, quantify, and communicate the performance of an active fire detection algorithm, however, can lead to improper interpretations of the spatiotemporal distribution of biomass burning, and flawed estimates of fuel consumption and trace gas and aerosol emissions. This work evaluates the performance of the Spinning Enhanced Visible and Infrared Imager (SEVIRI) Fire Thermal Anomaly (FTA) detection algorithm using seven months of active fire pixels detected by the Moderate Resolution Imaging Spectroradiometer (MODIS) across the Central African Republic (CAR). Results indicate that the omission rate of the SEVIRI FTA detection algorithm relative to MODIS varies spatially across the CAR, ranging from 25% in the south to 74% in the east. In the absence of confounding artifacts such as sunglint, uncertainties in the background thermal characterization, and cloud cover, the regional variation in SEVIRI's omission rate can be attributed to a coupling between SEVIRI's low spatial resolution detection bias (i.e., the inability to detect fires below a certain size and intensity) and a strong geographic gradient in active fire characteristics across the CAR. SEVIRI's commission rate relative to MODIS increases from 9% when evaluated near MODIS nadir to 53% near the MODIS scene edges, indicating that SEVIRI errors of commission at the MODIS scene edges may not be false alarms but rather true fires that MODIS failed to detect as a result of larger pixel sizes at extreme MODIS scan angles. Results from this work are expected to facilitate (i) future improvements to the SEVIRI FTA detection algorithm; (ii) the assimilation of the SEVIRI and MODIS active fire products; and (iii) the potential inclusion of SEVIRI into a network of geostationary sensors designed to achieve global diurnal active fire monitoring.
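
For readers unfamiliar with the evaluation metrics, omission and commission rates reduce to simple fractions of matched detections. A minimal sketch (hypothetical function names, not from the paper):

```python
def omission_rate(n_reference: int, n_matched: int) -> float:
    """Fraction of reference fires (e.g. MODIS) the evaluated sensor missed."""
    return 1.0 - n_matched / n_reference

def commission_rate(n_detected: int, n_matched: int) -> float:
    """Fraction of the evaluated sensor's detections with no reference match."""
    return 1.0 - n_matched / n_detected
```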

Relevance:

30.00%

Publisher:

Abstract:

A new search-space-updating technique for genetic algorithms is proposed for continuous optimisation problems. Rather than gradually reducing the search space during the evolution process at a fixed reduction rate set a priori, the upper and lower boundaries for each variable in the objective function are dynamically adjusted based on its distribution statistics. To test its effectiveness, the technique is applied to a number of benchmark optimisation problems and compared with three other techniques, namely the genetic algorithm with parameter space size adjustment (GAPSSA) technique [A.B. Djurišic, Elite genetic algorithms with adaptive mutations for solving continuous optimization problems – application to modeling of the optical constants of solids, Optics Communications 151 (1998) 147–159], the successive zooming genetic algorithm (SZGA) [Y. Kwon, S. Kwon, S. Jin, J. Kim, Convergence enhanced genetic algorithm with successive zooming method for solving continuous optimization problems, Computers and Structures 81 (2003) 1715–1725] and a simple GA. The tests show that for well-posed problems, existing search-space-updating techniques perform well in terms of convergence speed and solution precision; however, for some ill-posed problems these techniques are statistically inferior to a simple GA. All the tests show that the proposed new search-space-updating technique is statistically superior to its counterparts.
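
The abstract does not give the exact update rule, but one plausible reading of "dynamically adjusted based on its distribution statistics" is per-variable bounds recentred on the population mean with a spread proportional to the standard deviation. A hedged sketch of that idea (the parameter k and the clamping to the original bounds are assumptions, not the authors' rule):

```python
import numpy as np

def update_bounds(pop: np.ndarray, lower: np.ndarray, upper: np.ndarray,
                  k: float = 3.0):
    """Recentre each variable's bounds on the population's statistics.

    Sketch only: mean +/- k*std is one plausible reading of the
    abstract's 'distribution statistics', not the authors' exact rule.
    """
    mu, sigma = pop.mean(axis=0), pop.std(axis=0)
    new_lower = np.maximum(lower, mu - k * sigma)   # never widen past the
    new_upper = np.minimum(upper, mu + k * sigma)   # original search space
    return new_lower, new_upper
```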

Relevance:

30.00%

Publisher:

Abstract:

The Richardson-Lucy algorithm is one of the most important algorithms in the image deconvolution area. However, one of its drawbacks is slow convergence. A very significant acceleration is obtained by the technique proposed by Biggs and Andrews (BA), which is implemented in the deconvlucy function of the MATLAB Image Processing Toolbox. The BA method was developed heuristically, with no proof of convergence. In this paper, we introduce the Heavy-Ball (H-B) method for Poisson data optimization and extend it to a scaled H-B method, which includes the BA method as a special case. The method has a proven convergence rate of O(k^-2), where k is the number of iterations. We demonstrate the superior convergence performance of the scaled H-B method on both synthetic and real 3D images.
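
A rough sketch of the momentum idea applied to Richardson-Lucy, in the extrapolate-then-update style of the BA acceleration (illustrative only; the step and scaling rules of the paper's scaled H-B method, which give the proven O(k^-2) rate, are not reproduced):

```python
import numpy as np

def rl_heavy_ball(y: np.ndarray, psf: np.ndarray,
                  n_iter: int = 50, beta: float = 0.5) -> np.ndarray:
    """1D Richardson-Lucy deconvolution with a heavy-ball (momentum) term."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1]                        # adjoint of the blur
    x = np.full_like(y, y.mean(), dtype=float)  # flat non-negative start
    x_prev = x.copy()
    for _ in range(n_iter):
        v = np.clip(x + beta * (x - x_prev), 1e-12, None)        # momentum predictor
        est = np.clip(np.convolve(v, psf, mode="same"), 1e-12, None)
        x_prev, x = x, v * np.convolve(y / est, psf_flip, mode="same")
    return x
```

With beta = 0 this reduces to the plain Richardson-Lucy iteration.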

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we have developed a low-complexity algorithm for epileptic seizure detection with a high degree of accuracy. The algorithm has been designed to be feasibly implementable as a battery-powered, low-power implantable epileptic seizure detection system or epilepsy prosthesis. This is achieved by utilizing design optimization techniques at different levels of abstraction. In particular, user-specific critical parameters are identified at the algorithmic level and are explicitly used along with multiplier-less implementations at the architecture level. The system has been tested on neural data obtained from in vivo animal recordings and has been implemented in 90 nm bulk-Si technology. The results show up to 90% savings in power compared to a prevalent wavelet-based seizure detection technique, while achieving a 97% average detection rate.
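
As background on the architecture-level trick: a multiplier-less implementation expresses each coefficient as a sum of powers of two, so products become shifts and adds, which are cheap in hardware. A toy illustration, not the paper's design:

```python
def mul_by_shifts(x: int, shifts: list[int]) -> int:
    """Multiplier-less product: the coefficient is a sum of powers of two,
    so the multiply becomes shift-and-add."""
    return sum(x << s for s in shifts)

# e.g. a coefficient of 10 = 2**3 + 2**1:
assert mul_by_shifts(7, [3, 1]) == 7 * 10
```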

Relevance:

30.00%

Publisher:

Abstract:

Today there is a growing interest in the integration of health monitoring applications in portable devices, necessitating the development of methods that improve the energy efficiency of such systems. In this paper, we present a systematic approach that enables energy-quality trade-offs in spectral analysis systems for bio-signals, which are useful in monitoring various health conditions such as those associated with the heart rate. To enable such trade-offs, the processed signals are first expressed in a basis in which the significant components that carry most of the relevant information can easily be distinguished from the parts that influence the output to a lesser extent. Such a classification allows the pruning of operations associated with the less significant signal components, leading to power savings with minor quality loss, since only the less useful parts are pruned under the given requirements. To exploit the attributes of the modified spectral analysis system, thresholding rules are determined and adopted at design- and run-time, allowing the static or dynamic pruning of less-useful operations based on the accuracy and energy requirements. The proposed algorithm is implemented on a typical sensor node simulator, and results show up to 82% energy savings when static pruning is combined with voltage and frequency scaling, compared to the conventional algorithm in which such trade-offs were not available. In addition, experiments with numerous cardiac samples from various patients show that these energy savings come with a 4.9% average accuracy loss, which does not affect the system's ability to detect sinus arrhythmia, which was used as a test case.
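
A minimal sketch of the static-pruning idea (illustrative only; the basis and the design-/run-time thresholding rules are specific to the paper's system):

```python
import numpy as np

def prune_spectrum(signal: np.ndarray, keep_fraction: float = 0.1) -> np.ndarray:
    """Zero all but the largest-magnitude spectral components, so that
    downstream stages can skip the operations tied to the pruned parts."""
    spec = np.fft.rfft(signal)
    k = max(1, int(keep_fraction * spec.size))
    small = np.argsort(np.abs(spec))[:-k]   # indices of less-significant components
    spec[small] = 0.0
    return spec
```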

Relevance:

30.00%

Publisher:

Abstract:

We propose a mixed cost-function adaptive initialization algorithm for the time domain equalizer in a discrete multitone (DMT)-based asymmetric digital subscriber line. Using our approach, a higher convergence rate than that of the commonly used least-mean square algorithm is obtained, whilst attaining bit rates close to the optimum maximum shortening SNR and the upper bound SNR. Furthermore, our proposed method outperforms the minimum mean-squared error design for a range of time domain equalizer (TEQ) filter lengths. The improved performance outweighs the small increase in computational complexity required. A block variant of our proposed algorithm is also presented to overcome the increased latency imposed on the feedback path of the adaptive system.
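
For context, the least-mean square baseline mentioned above adapts the TEQ taps from the instantaneous error; a minimal sketch of one LMS step (the paper's mixed cost function is not reproduced):

```python
import numpy as np

def lms_step(w: np.ndarray, x: np.ndarray, d: float, mu: float = 0.01):
    """One standard LMS update: w <- w + mu * e * x, with error e = d - w.x."""
    e = d - w @ x
    return w + mu * e * x, e
```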

Relevance:

30.00%

Publisher:

Abstract:

Mathematical models are useful tools for the simulation, evaluation, optimal operation and control of solar cells and proton exchange membrane fuel cells (PEMFCs). To identify the model parameters of these two types of cells efficiently, a biogeography-based optimization algorithm with mutation strategies (BBO-M) is proposed. BBO-M uses the structure of the biogeography-based optimization (BBO) algorithm, and both a mutation motivated by the differential evolution (DE) algorithm and chaos theory are incorporated into the BBO structure to improve the global searching capability of the algorithm. Numerical experiments have been conducted on ten benchmark functions with 50 dimensions, and the results show that BBO-M can produce solutions of high quality and has a fast convergence rate. The proposed BBO-M is then applied to the model parameter estimation of the two types of cells. The experimental results clearly demonstrate the power of the proposed BBO-M in estimating the model parameters of both solar and fuel cells.
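
A sketch of the DE-style mutation that BBO-M folds into the BBO migration loop (the exact blend with the chaos-based mutation is not reproduced):

```python
import numpy as np

def de_rand_1(pop: np.ndarray, F: float = 0.5) -> np.ndarray:
    """DE/rand/1 mutant vector: a + F * (b - c), built from three
    distinct population members chosen at random."""
    a, b, c = pop[np.random.choice(len(pop), 3, replace=False)]
    return a + F * (b - c)
```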

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a hybrid mixed cost-function adaptive initialization algorithm for the time domain equalizer (TEQ) in a discrete multitone (DMT)-based asymmetric digital subscriber loop. Using our approach, a higher convergence rate than that of the commonly used least-mean square algorithm is obtained, whilst attaining bit rates close to the optimum maximum shortening SNR and the upper bound SNR. Moreover, our proposed method outperforms the minimum mean-squared error design for a range of TEQ filter lengths.

Relevance:

30.00%

Publisher:

Abstract:

The iterative nature of turbo-decoding algorithms increases their complexity compared to conventional FEC decoding algorithms. Two iterative decoding algorithms, the Soft-Output Viterbi Algorithm (SOVA) and the Maximum A Posteriori (MAP) algorithm, require complex decoding operations over several iteration cycles. Thus, for real-time implementation of turbo codes, reducing the decoder complexity while preserving bit-error-rate (BER) performance is an important design consideration. In this chapter, a modification to the Max-Log-MAP algorithm is presented, which scales the extrinsic information exchanged between the constituent decoders. The remainder of this chapter is organized as follows: an overview of the turbo encoding and decoding processes, the MAP algorithm and its simplified versions, the Log-MAP and Max-Log-MAP algorithms, is presented in Section 1. The extrinsic information scaling is introduced, simulation results are presented, and the performance of different methods to choose the best scaling factor is discussed in Section 2. Section 3 discusses trends and applications of turbo coding from the perspective of wireless applications.
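
The modification itself is a one-line operation on the extrinsic log-likelihood ratios exchanged between the two decoders; a minimal sketch (a factor of about 0.7 is a commonly used value; the chapter's methods for choosing the best factor are not reproduced):

```python
def scale_extrinsic(llr_ext, s: float = 0.7):
    """Scale extrinsic LLRs before feeding them to the other constituent
    decoder; Max-Log-MAP overestimates their reliability, and s < 1
    recovers much of the BER loss relative to full Log-MAP."""
    return [s * v for v in llr_ext]
```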

Relevance:

30.00%

Publisher:

Abstract:

QUESTIONS UNDER STUDY AND PRINCIPLES: Estimating glomerular filtration rate (GFR) in hospitalised patients with chronic kidney disease (CKD) is important for drug prescription, but it remains a difficult task. The purpose of this study was to investigate the reliability of selected algorithms based on serum creatinine, cystatin C and beta-trace protein to estimate GFR, and the potential added advantage of measuring muscle mass by bioimpedance. In a prospective, unselected group of patients with CKD hospitalised in a general internal medicine ward, GFR was evaluated using inulin clearance as the gold standard and the algorithms of Cockcroft, MDRD, Larsson (cystatin C), White (beta-trace) and MacDonald (creatinine and muscle mass by bioimpedance). 69 patients were included in the study. Median age (interquartile range) was 80 years (73-83); weight 74.7 kg (67.0-85.6), appendicular lean mass 19.1 kg (14.9-22.3), serum creatinine 126 μmol/l (100-149), cystatin C 1.45 mg/l (1.19-1.90), beta-trace protein 1.17 mg/l (0.99-1.53) and GFR measured by inulin 30.9 ml/min (22.0-43.3). The errors in the estimation of GFR and the areas under the ROC curves (95% confidence interval) relative to inulin were respectively: Cockcroft 14.3 ml/min (5.55-23.2) and 0.68 (0.55-0.81), MDRD 16.3 ml/min (6.4-27.5) and 0.76 (0.64-0.87), Larsson 12.8 ml/min (4.50-25.3) and 0.82 (0.72-0.92), White 17.6 ml/min (11.5-31.5) and 0.75 (0.63-0.87), MacDonald 32.2 ml/min (13.9-45.4) and 0.65 (0.52-0.78). Currently used algorithms overestimate GFR in hospitalised patients with CKD. As a consequence, eGFR-targeted prescriptions of renally cleared drugs might expose patients to overdosing. The best results were obtained with the Larsson algorithm. The determination of muscle mass by bioimpedance did not provide a significant contribution.
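
As an illustration of one of the compared algorithms, the Cockcroft-Gault estimate can be computed directly from the study's median values (a sketch of the standard formula; the study's exact implementation details are not reproduced):

```python
def cockcroft_gault(age_years: float, weight_kg: float,
                    scr_umol_per_l: float, female: bool) -> float:
    """Cockcroft-Gault creatinine clearance estimate (ml/min); serum
    creatinine is converted from umol/l to mg/dl (factor 88.4)."""
    scr_mg_per_dl = scr_umol_per_l / 88.4
    crcl = (140 - age_years) * weight_kg / (72 * scr_mg_per_dl)
    return crcl * 0.85 if female else crcl

# For the study's median patient (80 y, 74.7 kg, 126 umol/l) this gives
# roughly 37-44 ml/min depending on sex, well above the measured median
# inulin GFR of 30.9 ml/min, illustrating the reported overestimation.
```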

Relevance:

30.00%

Publisher:

Abstract:

Pregnancy induces profound hemodynamic and metabolic changes in the maternal organism that have consequences for the heart. The heart's adaptation to this physiological condition requires remodeling of its structure and, consequently, adjustments of its function. The mechanisms responsible for these adaptations are largely unknown. This knowledge is nevertheless essential for understanding cardiovascular complications, such as gestational hypertension (GH), which pose a risk to the health of the mother and the fetus. To characterize the adaptations of the heart during pregnancy, the originality of our experimental approach was to study remodeling at the scale of left-ventricular cardiomyocytes. Our first objective was thus to determine the structural and functional modifications of cardiomyocytes in the female rat, in order to identify the alterations occurring in GH. In pregnant rats, the structural remodeling of cardiomyocytes is characterized by cellular hypertrophy with a proportional increase in dimensions. GH was induced by a sodium supplement (0.9% NaCl) in the diet. The structural maladaptation in GH manifests as a decrease in cell volume. The study of functional modifications revealed that during gestation the contractile function of the cells depends on the adaptation of maternal metabolism. Indeed, the energy substrates lactate and pyruvate induce an increase in cardiomyocyte contractility. This effect is weaker in cells from hypertensive rats, suggesting abnormalities of excitation-contraction coupling, in which L-type calcium currents (I_Ca-L) play an important role. Paradoxically, lactate and pyruvate increased the density of I_Ca-L currents only in hypertensive rats. The mineralocorticoid receptor (MR) is known for its involvement in the structural and functional remodeling of the heart under pathological conditions, but not in that induced by pregnancy. Our second objective was therefore to determine the role of the MR in the adaptation of cardiomyocyte morphology and contractility. Pregnant rats were treated with potassium canrenoate (20 mg/kg/day), an MR antagonist. MR inhibition during gestation prevents cellular hypertrophy. Moreover, MR inhibition blocks the effect of lactate and pyruvate on contractility. In women, pregnancy is associated with changes in the electrical properties of the heart. On the electrocardiogram, the QTc interval is longer, reflecting prolonged repolarization. The mechanisms regulating this adaptation remain unknown. Our third objective was thus to determine the role of the MR in the adaptation of repolarization. In the pregnant rat, the QTc interval is prolonged, which is corroborated by the decrease in the potassium currents I_to and I_K1. MR inhibition during gestation prevents both the prolongation of the QTc interval and the decrease in I_to currents. The work presented in this thesis provides a more precise view of pregnancy-induced cardiac remodeling, made possible by study at the cellular scale. Our results show that during gestation and GH, cardiomyocytes undergo contrasting morphological remodeling.
Our study also revealed that during gestation, contractile function depends on metabolic adaptations, and that this relationship is altered in GH. Our work shows that the regulation of these gestational adaptations involves the MR at the level of morphology, of the metabolism/contractile-function relationship, and of repolarization. By advancing knowledge of pregnancy-induced hypertrophy, this work will improve the understanding of gestational cardiovascular complications.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel two-pass algorithm for block-based motion compensation, constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS). On the basis of research into previous algorithms, especially the leading-edge motion estimation algorithm hexagonal search (HEXBS), we propose the LHMEA and the Two-Pass Algorithm (TPA), introducing hashtables into video compression. We employ the LHMEA for a first-pass search over all Macroblocks (MBs) in the picture. The Motion Vectors (MVs) generated in the first pass are then used as predictors for the second-pass HEXBS motion estimation, which searches only a small number of MBs. The evaluation of the algorithm considers three important metrics: time, compression rate and PSNR. The performance of the algorithm is evaluated using standard video sequences, and the results are compared to current algorithms. Experimental results show that the proposed algorithm can offer the same compression rate as Full Search. The LHMEA with TPA significantly improves on HEXBS and shows a direction for improving other fast motion estimation algorithms, for example Diamond Search.
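
A compact sketch of the second-pass idea: seed a hexagon-based local search with the first-pass MV predictor (simplified; the hashtable first pass and HEXBS's final small-hexagon refinement are omitted):

```python
# Large-hexagon offsets used by a HEXBS-style search.
HEX = [(2, 0), (-2, 0), (1, 2), (-1, 2), (1, -2), (-1, -2)]

def second_pass(predictor, cost):
    """Refine the first-pass (LHMEA) motion vector with a greedy hexagon
    search; `cost` maps a candidate MV to its block-matching error
    (e.g. SAD against the reference frame)."""
    best, best_cost = predictor, cost(predictor)
    improved = True
    while improved:
        improved = False
        for dx, dy in HEX:
            cand = (best[0] + dx, best[1] + dy)
            c = cost(cand)
            if c < best_cost:
                best, best_cost, improved = cand, c, True
    return best
```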

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an improved Two-Pass Hexagonal (TPA) algorithm for motion estimation, constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS). In the TPA, Motion Vectors (MVs) generated by the first-pass LHMEA are used as predictors for the second-pass HEXBS motion estimation, which searches only a small number of Macroblocks (MBs). The hashtable structure of the LHMEA is improved over the original TPA and LHMEA. The evaluation of the algorithm considers three important metrics: processing time, compression rate and PSNR. The performance of the algorithm is evaluated using standard video sequences, and the results are compared to current algorithms.