197 results for Modified algorithms
Abstract:
With the fast development of the Internet, wireless communications and semiconductor devices, home networking has received significant attention. Consumer products can collect and transmit various types of data in the home environment. Typical consumer sensors are often equipped with tiny, irreplaceable batteries, and it is therefore of the utmost importance to design energy-efficient algorithms that prolong the home network lifetime and reduce the number of devices going to landfill. Sink mobility is an important technique for improving home network performance, including energy consumption, lifetime and end-to-end delay; it can also largely mitigate the hot spots near the sink node. Since the selection of an optimal moving trajectory for the sink node(s) is an NP-hard problem, jointly optimizing routing algorithms with the mobile sink moving strategy is a significant and challenging research issue. The influence of multiple static sink nodes on energy consumption in networks of different scales is first studied, and an Energy-efficient Multi-sink Clustering Algorithm (EMCA) is proposed and tested. Then, the influence of mobile sink velocity, position and number on network performance is studied, and a Mobile-sink based Energy-efficient Clustering Algorithm (MECA) is proposed. Simulation results validate the performance of the two proposed algorithms, which can be deployed in a consumer home network environment.
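The abstract does not give EMCA or MECA themselves; as a hedged illustration of the kind of energy accounting such clustering algorithms rest on, the sketch below assigns sensors to the nearest of several static sinks under the standard first-order radio model (all constants and coordinates are made up):

```python
import math

# First-order radio model: sending k bits over distance d costs
# E = k * (E_ELEC + EPS_AMP * d^2).  Constants are illustrative only.
E_ELEC = 50e-9      # J/bit, electronics energy
EPS_AMP = 100e-12   # J/bit/m^2, free-space amplifier energy

def tx_energy(bits, d):
    """Transmission energy under the first-order radio model."""
    return bits * (E_ELEC + EPS_AMP * d * d)

def assign_to_sinks(nodes, sinks, bits=4000):
    """Assign each node to the sink it can reach with the least energy."""
    assignment = {}
    for i, (nx, ny) in enumerate(nodes):
        costs = [tx_energy(bits, math.hypot(nx - sx, ny - sy))
                 for sx, sy in sinks]
        assignment[i] = costs.index(min(costs))
    return assignment

nodes = [(10, 10), (90, 90), (15, 80)]   # sensor positions, metres
sinks = [(0, 0), (100, 100)]             # two static sinks
print(assign_to_sinks(nodes, sinks))     # {0: 0, 1: 1, 2: 0}
```

Because the radio model is monotone in distance, least-energy assignment here coincides with nearest-sink assignment; a real multi-sink clustering algorithm would additionally rotate cluster heads to balance residual energy.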
Abstract:
The variability of results from different automated methods of detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset: the period 1989–2009 of the European Centre for Medium-Range Weather Forecasts (ECMWF) interim reanalysis (ERA-Interim). This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. Globally, methods agree well on the geographical distribution in large oceanic regions, the interannual variability of cyclone numbers, geographical patterns of strong trends, and the distribution shape of many life cycle characteristics. In contrast, the largest disparities exist for the total numbers of cyclones, the detection of weak cyclones, and the distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and the dissolution phases.
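None of the fifteen tracking algorithms is specified in the abstract; the toy sketch below shows only a common starting point shared by many of them, flagging cyclone candidates as isolated local minima in a gridded sea-level pressure field (the threshold and field values are invented):

```python
import numpy as np

def local_minima(field, threshold):
    """Flag interior grid points that are strictly lower than all eight
    neighbours and below a threshold: the simplest form of
    pressure-minimum cyclone detection."""
    rows, cols = field.shape
    hits = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = field[i - 1:i + 2, j - 1:j + 2]
            if (field[i, j] < threshold
                    and field[i, j] == window.min()
                    and np.count_nonzero(window == window.min()) == 1):
                hits.append((i, j))
    return hits

pressure = np.full((5, 5), 1015.0)   # synthetic MSLP field, hPa
pressure[2, 2] = 990.0               # one synthetic low
print(local_minima(pressure, 1000.0))  # [(2, 2)]
```

Real methods differ in exactly the choices elided here (search radius, smoothing, intensity threshold, tracking between time steps), which is what the intercomparison above quantifies.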
Abstract:
A simple storm loss model is applied to an ensemble of ECHAM5/MPI-OM1 GCM simulations in order to estimate changes of insured loss potentials over Europe in the 21st century. Losses are computed based on the daily maximum wind speed at each grid point. The calibration of the loss model is performed using wind data from the ERA40 reanalysis and German loss data. The obtained annual losses for present climate conditions (20C, three realisations) reproduce the statistical features of the historical insurance loss data for Germany. The climate change experiments correspond to the SRES scenarios A1B and A2, and for each of them three realisations are considered. On average, insured loss potentials increase for all analysed European regions at the end of the 21st century. Changes are largest for Germany and France, and smallest for Portugal/Spain. Additionally, the spread between the single realisations is large, ranging for Germany, for example, from −4% to +43% in terms of mean annual loss. Moreover, almost all simulations show an increasing interannual variability of storm damage. This effect is even more pronounced if no adaptation of building structures to climate change is assumed. The increased loss potentials are linked with enhanced values of the high percentiles of surface wind maxima over Western and Central Europe, which in turn are associated with an enhanced number and increased intensity of extreme cyclones over the British Isles and the North Sea.
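The abstract calls the loss model "simple" without stating it; a widely used formulation of this kind (the cubic exceedance over the local 98th-percentile wind, as in Klawa and Ulbrich, 2003) can be sketched as follows, with purely illustrative numbers:

```python
import numpy as np

def loss_index(v_max, v98):
    """Daily loss index per grid point: cubic exceedance of the local
    98th-percentile wind speed.  Days below the local percentile
    contribute zero loss."""
    excess = np.maximum(v_max / v98 - 1.0, 0.0)
    return excess ** 3

winds = np.array([20.0, 26.0, 30.0])   # daily max wind per grid point, m/s
v98 = np.array([25.0, 25.0, 25.0])     # local 98th-percentile climatology
print(loss_index(winds, v98))
```

The relative threshold makes the index comparable across regions with different wind climates; calibration against historical insurance data (as described above) then maps the index onto monetary losses.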
Abstract:
The spatial structure of beta-plane Rossby waves in a sinusoidal basic zonal flow U₀cos(γ₀y) is determined analytically in the (stable) asymptotic limit of weak shear, U₀γ₀²/β ≪ 1. The propagating neutral normal modes are found to take their greatest amplitude in the region of maximum westerly flow, while their most rapid phase variation is achieved in the region of maximum easterly flow. These results are shown to be consistent with what is obtained by ray-tracing methods in the limit of small meridional disturbance wavelength.
Abstract:
Recent evidence suggests that immobilization of the upper limb for 2–3 weeks induces changes in cortical thickness as well as in motor performance. In constraint-induced (CI) therapy, one of the most effective interventions for hemiplegia, the non-paretic arm is constrained to enforce the use of the paretic arm in the home setting. With the present study we aimed to explore whether non-paretic arm immobilization in CI therapy induces structural changes in the non-lesioned hemisphere, and how these changes are related to treatment benefit. Thirty-one patients with chronic hemiparesis participated in CI therapy with (N = 14) and without (N = 17) constraint. Motor ability scores were acquired before and after treatment. Diffusion tensor imaging (DTI) data were obtained prior to treatment. Cortical thickness was measured with the FreeSurfer software. In both groups, cortical thickness in the contralesional primary somatosensory cortex increased and motor function improved with the intervention. However, the cortical thickness change was not associated with the magnitude of motor function improvement. Moreover, the treatment effect and the cortical thickness change were not significantly different between the constraint and non-constraint groups. There was no correlation between fractional anisotropy changes in the non-lesioned hemisphere and treatment outcome. CI therapy induced cortical thickness changes in contralesional sensorimotor regions, but this effect does not appear to be driven by the immobilization of the non-paretic arm, as indicated by the absence of differences between the constraint and non-constraint groups. Our data do not suggest that the arm immobilization used in CI therapy is associated with noticeable cortical thinning.
Abstract:
We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time-shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
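The paper's graph-based algorithm is not reproduced here; the sketch below merely illustrates the underlying notion of similarity, scoring each reading against the other profile within a small ±w time window so that slightly shifted peaks are not penalised:

```python
def shift_tolerant_distance(a, b, w=1):
    """Sum over time of the best match of a[t] against b within a +/- w
    window: a simple dissimilarity that forgives small time-shifts.
    With w = 0 it reduces to the plain L1 distance."""
    n = len(a)
    total = 0.0
    for t in range(n):
        lo, hi = max(0, t - w), min(n, t + w + 1)
        total += min(abs(a[t] - b[s]) for s in range(lo, hi))
    return total

peak_early = [0, 5, 0, 0]   # same consumption peak, one step apart
peak_late = [0, 0, 5, 0]
print(shift_tolerant_distance(peak_early, peak_late, w=1))  # 0.0
print(shift_tolerant_distance(peak_early, peak_late, w=0))  # 10.0
```

This naive version costs O(nw) per pair; the efficiency claim in the abstract is precisely about computing such a shift-tolerant measure much faster over many household pairs.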
Abstract:
An updated empirical approach is proposed for specifying coexistence requirements for genetically modified (GM) maize (Zea mays L.) production to ensure compliance with the 0.9% labeling threshold for food and feed in the European Union. The model improves on a previously published empirical model (Gustafson et al., 2006) by adding recent data sources to supplement the original database and by including the following additional cases: (i) more than one GM maize source field adjacent to the conventional or organic field, (ii) the possibility of so-called “stacked” varieties with more than one GM trait, and (iii) lower pollen shed in the non-GM receptor field. These additional factors allow somewhat wider combinations of isolation distance and border rows than required in the original version of the empirical model. For instance, in the very conservative case of a 1-ha square non-GM maize field surrounded on all four sides by homozygous GM maize with 12 m isolation (the effective isolation distance for a single GM field), non-GM border rows of 12 m are required to be 95% confident of gene flow less than 0.9% in the non-GM field (with adventitious presence of 0.3%). Stacked traits of higher GM mass fraction and receptor fields of lower pollen shed would require a greater number of border rows to comply with the 0.9% threshold, and an updated extension to the model is provided to quantify these effects.
Abstract:
In this paper a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained comprising a considerably smaller number of parameters compared to the ones generated by means of the conventional PNN algorithm. Three benchmark examples are elaborated, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparison with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
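The exact OLS/PD construction is not given in the abstract; as a hedged sketch of the selection principle, the greedy procedure below repeatedly adds the candidate regressor that most reduces the residual variance (a matching-pursuit-style simplification, on synthetic data):

```python
import numpy as np

def greedy_select(X, y, n_terms):
    """Greedy forward selection: at each step add the candidate column of
    X whose projection removes the most residual energy.  This captures
    the spirit of OLS term selection without the full orthogonalisation
    of the candidates against each other."""
    selected = []
    residual = y.astype(float).copy()
    for _ in range(n_terms):
        best, best_gain = None, 0.0
        for j in range(X.shape[1]):
            if j in selected:
                continue
            x = X[:, j]
            gain = (x @ residual) ** 2 / (x @ x)   # error-reduction measure
            if gain > best_gain:
                best, best_gain = j, gain
        if best is None:
            break
        x = X[:, best]
        residual = residual - (x @ residual) / (x @ x) * x  # deflate
        selected.append(best)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                       # 6 candidate regressors
y = 3.0 * X[:, 2] - 2.0 * X[:, 5] + 0.01 * rng.normal(size=200)
print(sorted(greedy_select(X, y, 2)))               # [2, 5]
```

In a PNN each candidate column would be a partial-description polynomial term; pruning to the few most significant terms is what yields the parsimonious models the abstract describes.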
Abstract:
Transcriptional dysfunction is a prominent hallmark of Huntington's disease (HD). Several transcription factors have been implicated in the aetiology of HD progression and one of the most prominent is repressor element 1 (RE1) silencing transcription factor (REST). REST is a global repressor of neuronal gene expression and in the presence of mutant Huntingtin increased nuclear REST levels lead to elevated RE1 occupancy and a concomitant increase in target gene repression, including brain-derived neurotrophic factor. It is of great interest to devise strategies to reverse transcriptional dysregulation caused by increased nuclear REST and determine the consequences in HD. Thus far, such strategies have involved RNAi or mutant REST constructs. Decoys are double-stranded oligodeoxynucleotides corresponding to the DNA-binding element of a transcription factor and act to sequester it, thereby abrogating its transcriptional activity. Here, we report the use of a novel decoy strategy to rescue REST target gene expression in a cellular model of HD. We show that delivery of the decoy in cells expressing mutant Huntingtin leads to its specific interaction with REST, a reduction in REST occupancy of RE1s and rescue of target gene expression, including Bdnf. These data point to an alternative strategy for rebalancing the transcriptional dysregulation in HD.
Abstract:
We present a Bayesian image classification scheme for discriminating cloud, clear and sea-ice observations at high latitudes to improve identification of areas of clear-sky over ice-free ocean for SST retrieval. We validate the image classification against a manually classified dataset using Advanced Along Track Scanning Radiometer (AATSR) data. A three way classification scheme using a near-infrared textural feature improves classifier accuracy by 9.9 % over the nadir only version of the cloud clearing used in the ATSR Reprocessing for Climate (ARC) project in high latitude regions. The three way classification gives similar numbers of cloud and ice scenes misclassified as clear but significantly more clear-sky cases are correctly identified (89.9 % compared with 65 % for ARC). We also demonstrate the potential of a Bayesian image classifier including information from the 0.6 micron channel to be used in sea-ice extent and ice surface temperature retrieval with 77.7 % of ice scenes correctly identified and an overall classifier accuracy of 96 %.
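The AATSR feature set and priors are not described here; a minimal sketch of the Bayesian decision rule behind such a three-way classifier, using a single invented 1-D feature, might look like:

```python
import numpy as np

class GaussianBayes:
    """Minimal per-class Gaussian classifier: pick the class with the
    highest posterior under a Gaussian likelihood and empirical prior.
    The three-way cloud / clear / sea-ice scheme is the same rule over
    richer (textural, multi-channel) features."""

    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.stats = {}
        for c in self.classes:
            v = X[[i for i, t in enumerate(y) if t == c]]
            self.stats[c] = (v.mean(), v.std() + 1e-9, len(v) / len(y))
        return self

    def predict(self, x):
        def log_post(c):
            m, s, prior = self.stats[c]
            return -0.5 * ((x - m) / s) ** 2 - np.log(s) + np.log(prior)
        return max(self.classes, key=log_post)

# Invented 1-D "brightness" feature with three labelled scene types.
X = np.array([0.1, 0.15, 0.5, 0.55, 0.9, 0.95])
y = ['clear', 'clear', 'cloud', 'cloud', 'ice', 'ice']
clf = GaussianBayes().fit(X, y)
print(clf.predict(0.52))  # cloud
```

The validation step in the abstract amounts to running such a classifier over manually labelled scenes and tabulating the per-class hit rates (e.g. the 89.9 % clear-sky figure).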
Abstract:
In this paper we propose methods for computing Fresnel integrals based on truncated trapezium rule approximations to integrals on the real line, with the trapezium rules modified to take into account poles of the integrand near the real axis. Our starting point is a method for computation of the error function of complex argument due to Matta and Reichel (J Math Phys 34:298–307, 1956) and Hunter and Regan (Math Comp 26:539–541, 1972). We construct approximations which we prove are exponentially convergent as a function of N, the number of quadrature points, obtaining explicit error bounds which show that accuracies of 10⁻¹⁵ uniformly on the real line are achieved with N = 12, this being confirmed by computations. The approximations we obtain are attractive, additionally, in that they maintain small relative errors for small and large argument, are analytic on the real axis (echoing the analyticity of the Fresnel integrals), and are straightforward to implement.
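The pole-corrected, exponentially convergent rule itself is not reproduced here; for orientation, the sketch below evaluates the defining integral of the Fresnel cosine integral with a plain composite trapezium rule, a slowly converging baseline that the modified rules above improve on dramatically (reaching ~1e-15 with only N = 12 points):

```python
import math

def fresnel_C(x, n=2000):
    """Fresnel cosine integral C(x) = integral from 0 to x of
    cos(pi*t^2/2) dt, by plain composite trapezium rule with n panels.
    Error is O((x/n)^2), so thousands of points are needed for ~7 digits."""
    h = x / n
    f = lambda t: math.cos(math.pi * t * t / 2.0)
    total = 0.5 * (f(0.0) + f(x)) + sum(f(k * h) for k in range(1, n))
    return h * total

print(round(fresnel_C(1.0), 6))  # 0.779893
```

The contrast is the point: the same target value to machine precision with N = 12 is only possible because the modified rules exploit the analytic structure (the poles near the real axis) of the transformed integrand.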
Abstract:
A class of identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function which leads to a covariance matrix with low rank, a property that is then exploited for computational efficiency in both model parameter estimation and model prediction. The objective of maximizing either the marginal likelihood or the Kullback–Leibler (K–L) divergence between the estimated output probability density function (pdf) and the true pdf is used as the respective cost function. For each cost function, an efficient coordinate descent algorithm is proposed that estimates the kernel parameters using a one-dimensional derivative-free search, and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
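The proposed kernel is not specified in the abstract; the sketch below shows only why a low-rank covariance pays off, using a generic rank-m feature map and the Woodbury identity to reduce the n × n posterior-mean solve to an m × m one (all data synthetic):

```python
import numpy as np

def lowrank_gp_predict(Phi, y, phi_star, noise):
    """GP posterior mean with a rank-m kernel K = Phi @ Phi.T.  By the
    Woodbury identity,
      (K + noise*I)^-1 y = (y - Phi (noise*I + Phi.T Phi)^-1 Phi.T y) / noise,
    so only an m x m system is solved instead of an n x n one."""
    m = Phi.shape[1]
    A = noise * np.eye(m) + Phi.T @ Phi              # m x m
    alpha = (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / noise
    # k(x*, x_i) = phi_star . phi_i, so the mean is phi_star . (Phi.T alpha)
    return phi_star @ (Phi.T @ alpha)

rng = np.random.default_rng(1)
Phi = rng.normal(size=(50, 3))       # rank-3 feature map, n = 50, m = 3
w = np.array([1.0, -2.0, 0.5])
y = Phi @ w                          # noise-free synthetic targets
pred = lowrank_gp_predict(Phi, y, Phi[0], noise=1e-6)
print(round(pred, 4), round(y[0], 4))  # prediction matches the target
```

With rank m the cost of the solve drops from O(n³) to O(nm²), which is the computational saving the abstract attributes to the low-rank covariance.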
Abstract:
Satellite data are increasingly used to provide observation-based estimates of the effects of aerosols on climate. The Aerosol-cci project, part of the European Space Agency's Climate Change Initiative (CCI), was designed to provide essential climate variables for aerosols from satellite data. Eight algorithms, developed for the retrieval of aerosol properties using data from AATSR (4), MERIS (3) and POLDER, were evaluated to determine their suitability for climate studies. The primary result from each of these algorithms is the aerosol optical depth (AOD) at several wavelengths, together with the Ångström exponent (AE), which describes the spectral variation of the AOD for a given wavelength pair. Other aerosol parameters which can possibly be retrieved from satellite observations are not considered in this paper. The AOD and AE (AE only for Level 2) were evaluated against independent collocated observations from the ground-based AERONET sun photometer network and against “reference” satellite data provided by MODIS and MISR. Tools used for the evaluation were developed for daily products as produced by the retrieval with a spatial resolution of 10 × 10 km² (Level 2) and daily or monthly aggregates (Level 3). These tools include statistics for L2 and L3 products compared with AERONET, as well as scoring based on spatial and temporal correlations. In this paper we describe their use in a round robin (RR) evaluation of four months of data, one month for each season in 2008. The amount of data was restricted to only four months because of the large effort made to improve the algorithms, and to evaluate the improvement and current status before larger data sets are processed. Evaluation criteria are discussed. Results presented show the current status of the European aerosol algorithms in comparison to both AERONET and MODIS and MISR data.
The comparison leads to a preliminary conclusion that the scores are similar, including those for the references, but the coverage of AATSR needs to be enhanced and further improvements are possible for most algorithms. None of the algorithms, including the references, outperforms all others everywhere. AATSR data can be used for the retrieval of AOD and AE over land and ocean. PARASOL and one of the MERIS algorithms have been evaluated over ocean only and both algorithms provide good results.
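The project's scoring tools are not reproduced here; the sketch below computes the basic match-up statistics (bias, RMSE, Pearson correlation) of the kind such an evaluation against AERONET relies on, with invented AOD values:

```python
import math

def validation_scores(retrieved, reference):
    """Bias, RMSE and Pearson correlation of retrieved values against
    collocated reference observations: the building blocks of
    satellite-vs-AERONET match-up statistics."""
    n = len(retrieved)
    bias = sum(r - t for r, t in zip(retrieved, reference)) / n
    rmse = math.sqrt(sum((r - t) ** 2
                         for r, t in zip(retrieved, reference)) / n)
    mr = sum(retrieved) / n
    mt = sum(reference) / n
    cov = sum((r - mr) * (t - mt) for r, t in zip(retrieved, reference))
    sr = math.sqrt(sum((r - mr) ** 2 for r in retrieved))
    st = math.sqrt(sum((t - mt) ** 2 for t in reference))
    return bias, rmse, cov / (sr * st)

aod_sat = [0.12, 0.30, 0.55, 0.20]      # invented satellite AOD match-ups
aod_aeronet = [0.10, 0.28, 0.50, 0.22]  # invented AERONET AOD
b, r, c = validation_scores(aod_sat, aod_aeronet)
print(round(b, 3), round(r, 3), round(c, 3))
```

A round-robin scoring scheme would aggregate such statistics per algorithm, region and season before ranking, which is why coverage (the number of valid match-ups) matters alongside the scores themselves.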
Abstract:
Currently, infrared filters for astronomical telescopes and satellite radiometers are based on multilayer thin-film stacks of alternating high and low refractive index materials. However, the choice of suitable layer materials is limited, and this places limitations on the filter performance that can be achieved. The ability to design materials with arbitrary refractive index allows filter performance to be greatly increased, but also increases the complexity of design. Here, a differential algorithm was used as a method for the optimised design of filters with arbitrary refractive indices, and materials are then designed to these specifications as mono-materials with sub-wavelength structures using Bruggeman’s effective material approximation (EMA).
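The optimisation itself is not detailed in the abstract, but the Bruggeman EMA it relies on is standard; the sketch below solves the two-phase Bruggeman equation for the effective permittivity by bisection (the 50/50 air/germanium-like example is illustrative only):

```python
def bruggeman_eps(eps_a, eps_b, f, tol=1e-12):
    """Effective permittivity of a two-phase mixture from the Bruggeman
    effective-medium approximation:
        f (ea - e)/(ea + 2e) + (1 - f)(eb - e)/(eb + 2e) = 0,
    with f the volume fraction of phase a.  The physical root lies
    between eps_a and eps_b, so bisection on that bracket suffices."""
    g = lambda e: (f * (eps_a - e) / (eps_a + 2 * e)
                   + (1 - f) * (eps_b - e) / (eps_b + 2 * e))
    lo, hi = min(eps_a, eps_b), max(eps_a, eps_b)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# 50/50 mix of air (eps = 1) and a germanium-like host (eps = 16)
print(round(bruggeman_eps(1.0, 16.0, 0.5), 3))
```

Sweeping the fill fraction f of a sub-wavelength structure thus gives a continuum of effective indices between the two constituents, which is what makes arbitrary-index filter layers realisable as mono-material structures.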