812 results for multi-class queueing systems


Relevance:

100.00%

Publisher:

Abstract:

1. The development of sustainable, multi-functional agricultural systems involves reconciling the needs of agricultural production with the objectives for environmental protection, including biodiversity conservation. However, the definition of sustainability remains ambiguous and it has proven difficult to identify suitable indicators for monitoring progress towards, and the successful achievement of, sustainability. 2. In this study, we show that a trait-based approach can be used to assess the detrimental impacts of agricultural change on a broad range of taxonomic groupings and derive a standardised index of farmland biodiversity health, built around an objective of achieving stable or increasing populations in all species associated with agricultural landscapes. 3. To demonstrate its application, we assess the health of UK farmland biodiversity relative to this goal. Our results suggest that the populations of two-thirds of the 333 plant and animal species assessed are unsustainable under current UK agricultural practices. 4. We then explore the potential benefits of an agri-environment scheme, Entry Level Stewardship (ELS), to farmland biodiversity in the UK under differing levels of risk mitigation delivery. We show that ELS has the potential to make a significant contribution to progress towards sustainability targets but that this potential is severely restricted by current patterns of scheme deployment. 5. Synthesis and applications: We have developed a cross-taxonomic sustainability index which can be used to assess both the current health of farmland biodiversity and the impacts of future agricultural changes relative to quantitative biodiversity targets. Although biodiversity conservation is just one of a number of factors that must be considered when defining sustainability, we believe our cross-taxonomic index has the potential to be a valuable tool for guiding the development of sustainable agricultural systems.
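
The paper's index itself is not reproduced in this abstract; as a hedged illustration only, the sketch below computes the simplest headline figure consistent with the text: the fraction of assessed species whose populations meet a "stable or increasing" criterion. The species data and threshold are hypothetical placeholders, not the authors' method.

```python
# Hedged illustration (not the paper's actual index): fraction of assessed
# species meeting a "stable or increasing population" sustainability criterion.
from typing import Mapping

def sustainability_index(population_trends: Mapping[str, float]) -> float:
    """Return the proportion of species with a non-negative population trend.

    `population_trends` maps species name -> estimated annual growth rate
    (hypothetical input; the paper derives trends from trait-based models).
    """
    if not population_trends:
        raise ValueError("no species supplied")
    sustainable = sum(1 for trend in population_trends.values() if trend >= 0.0)
    return sustainable / len(population_trends)

# Toy usage: 333 species of which two-thirds are declining, as in the abstract.
toy_trends = {f"species_{i}": (-0.01 if i < 222 else 0.005) for i in range(333)}
print(f"index = {sustainability_index(toy_trends):.2f}")  # ~0.33
```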

Relevance:

100.00%

Publisher:

Abstract:

The dinuclear complex [(tpy)Ru-II(PCP-PCP)Ru-II(tpy)]Cl-2 (bridging PCP-PCP = 3,3',5,5'-tetrakis(diphenylphosphinomethyl)biphenyl, [C6H2(CH2PPh2)(2)-3,5](2)(2-)) was prepared via a transcyclometalation reaction of the bis-pincer ligand [PC(H)P-PC(H)P] and the Ru(II) precursor [Ru(NCN)(tpy)]Cl (NCN = [C6H3(CH2NMe2)(2)-2,6](-)) followed by a reaction with 2,2':6',2''-terpyridine (tpy). Electrochemical and spectroscopic properties of [(tpy)Ru-II(PCP-PCP)Ru-II(tpy)]Cl-2 are compared with those of the closely related [(tpy)Ru-II(NCN-NCN)Ru-II(tpy)](PF6)(2) (NCN-NCN = [C6H2(CH2NMe2)(2)-3,5](2)(2-)) obtained by two-electron reduction of [(tpy)Ru-III(NCN-NCN)Ru-III(tpy)](PF6)(4). The molecular structure of the latter complex has been determined by single-crystal X-ray structure determination. One-electron reduction of [(tpy)Ru-III(NCN-NCN)Ru-III(tpy)](PF6)(4) and one-electron oxidation of [(tpy)Ru-II(PCP-PCP)Ru-II(tpy)]Cl-2 yielded the mixed-valence species [(tpy)Ru-III(NCN-NCN)Ru-II(tpy)](3+) and [(tpy)Ru-III(PCP-PCP)Ru-II(tpy)](3+), respectively. The comproportionation equilibrium constants K-c (900 and 748 for [(tpy)Ru-III(NCN-NCN)Ru-III(tpy)](4+) and [(tpy)Ru-II(PCP-PCP)Ru-II(tpy)](2+), respectively) determined from cyclic voltammetric data reveal comparable stability of the [Ru-III-Ru-II] state of both complexes. Spectroelectrochemical measurements and near-infrared (NIR) spectroscopy were employed to further characterize the different redox states, with special focus on the mixed-valence species and their NIR bands. Analysis of these bands in the framework of Hush theory indicates that the mixed-valence complexes [(tpy)Ru-III(PCP-PCP)Ru-II(tpy)](3+) and [(tpy)Ru-III(NCN-NCN)Ru-II(tpy)](3+) belong to strongly coupled borderline Class II/Class III and intrinsically coupled Class III systems, respectively. Preliminary DFT calculations suggest that extensive delocalization of the spin density over the metal centers and the bridging ligand exists. TD-DFT calculations then suggested a substantial MLCT character of the NIR electronic transitions. The results obtained in this study point to a decreased metal-metal electronic interaction accommodated by the double-cyclometalated bis-pincer bridge when strong sigma-donor NMe2 groups are replaced by weak sigma-donor, pi-acceptor PPh2 groups.
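
For context, a comproportionation constant quoted from cyclic voltammetric data is conventionally obtained from the separation of the two one-electron redox waves; the relation below is the standard textbook expression (not a formula stated in this abstract), shown with the K-c = 900 value as a worked example.

```latex
% Standard relation between the comproportionation constant K_c and the
% separation \Delta E^\circ of the two one-electron redox potentials (n = 1):
K_\mathrm{c} = \exp\!\left(\frac{F\,\Delta E^{\circ}}{RT}\right)
\quad\Longleftrightarrow\quad
\Delta E^{\circ} = \frac{RT}{F}\,\ln K_\mathrm{c}
\approx 25.7\,\mathrm{mV}\times\ln(900)\approx 175\,\mathrm{mV}
\ \text{at } 298\,\mathrm{K}.
```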

Relevance:

100.00%

Publisher:

Abstract:

The Along-Track Scanning Radiometers (ATSRs) provide a long time-series of measurements suitable for the retrieval of cloud properties. This work evaluates the freely-available Global Retrieval of ATSR Cloud Parameters and Evaluation (GRAPE) dataset (version 3) created from the ATSR-2 (1995–2003) and Advanced ATSR (AATSR; 2002 onwards) records. Users are recommended to consider only retrievals flagged as high-quality, where there is a good consistency between the measurements and the retrieved state (corresponding to about 60% of converged retrievals over sea, and more than 80% over land). Cloud properties are found to be generally free of any significant spurious trends relating to satellite zenith angle. Estimates of the random error on retrieved cloud properties are suggested to be generally appropriate for optically-thick clouds, and up to a factor of two too small for optically-thin cases. The correspondence between ATSR-2 and AATSR cloud properties is high, but a relative calibration difference between the sensors of order 5–10% at 660 nm and 870 nm limits the potential of the current version of the dataset for trend analysis. As ATSR-2 is thought to have the better absolute calibration, the discussion focusses on this portion of the record. Cloud-top heights from GRAPE compare well to ground-based data at four sites, particularly for shallow clouds. Clouds forming in boundary-layer inversions are typically around 1 km too high in GRAPE due to poorly-resolved inversions in the modelled temperature profiles used. Global cloud fields are compared to satellite products derived from the Moderate Resolution Imaging Spectroradiometer (MODIS), Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) measurements, and a climatology of liquid water content derived from satellite microwave radiometers. In all cases the main reasons for differences are linked to differing sensitivity to, and treatment of, multi-layer cloud systems. The correlation coefficient between GRAPE and the two MODIS products considered is generally high (greater than 0.7 for most cloud properties), except for liquid and ice cloud effective radius, which also show biases between the datasets. For liquid clouds, part of the difference is linked to choice of wavelengths used in the retrieval. Total cloud cover is slightly lower in GRAPE (0.64) than the CALIOP dataset (0.66). GRAPE underestimates liquid cloud water path relative to microwave radiometers by up to 100 g m⁻² near the Equator and overestimates by around 50 g m⁻² in the storm tracks. Finally, potential future improvements to the algorithm are outlined.

Relevance:

100.00%

Publisher:

Abstract:

The development of large scale virtual reality and simulation systems has been mostly driven by the DIS and HLA standards community. A number of issues are coming to light about the applicability of these standards, in their present state, to the support of general multi-user VR systems. This paper pinpoints four issues that must be readdressed before large scale virtual reality systems become accessible to a larger commercial and public domain: a reduction in the effects of network delays; scalable causal event delivery; update control; and scalable reliable communication. Each of these issues is tackled through a common theme of combining wall clock and causal time-related entity behaviour, knowledge of network delays and prediction of entity behaviour, that together overcome many of the effects of network delay.
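
The abstract does not spell out its delay-compensation scheme; the sketch below only illustrates the general family of techniques it alludes to, namely dead-reckoning-style extrapolation of remote entity state using wall-clock timestamps, measured clock offset and a motion prediction. The constant-velocity model and all names are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative dead-reckoning-style extrapolation (assumed constant-velocity
# model): a receiver predicts a remote entity's current position from its last
# timestamped update, so the network delay is absorbed into the elapsed time.
from dataclasses import dataclass

@dataclass
class EntityUpdate:
    position: tuple[float, float, float]   # last reported position
    velocity: tuple[float, float, float]   # last reported velocity (units/s)
    sent_wallclock: float                  # sender's wall-clock send time (s)

def predict_position(update: EntityUpdate, now_wallclock: float,
                     clock_offset: float = 0.0) -> tuple[float, ...]:
    """Extrapolate position to 'now', correcting for clock offset between hosts."""
    dt = (now_wallclock - clock_offset) - update.sent_wallclock
    dt = max(dt, 0.0)  # never extrapolate backwards in time
    return tuple(p + v * dt for p, v in zip(update.position, update.velocity))

# Example: an update sent 120 ms ago by an entity moving at (1, 0, 0) units/s.
u = EntityUpdate(position=(10.0, 0.0, 0.0), velocity=(1.0, 0.0, 0.0),
                 sent_wallclock=100.000)
print(predict_position(u, now_wallclock=100.120))  # -> (10.12, 0.0, 0.0)
```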


Relevance:

100.00%

Publisher:

Abstract:

Motivation: A new method that uses support vector machines (SVMs) to predict protein secondary structure is described and evaluated. The study is designed to develop a reliable prediction method using an alternative technique and to investigate the applicability of SVMs to this type of bioinformatics problem. Methods: Binary SVMs are trained to discriminate between two structural classes. The binary classifiers are combined in several ways to predict multi-class secondary structure. Results: The average three-state prediction accuracy per protein (Q3) is estimated by cross-validation to be 77.07 ± 0.26% with a segment overlap (Sov) score of 73.32 ± 0.39%. The SVM performs similarly to the 'state-of-the-art' PSIPRED prediction method on a non-homologous test set of 121 proteins despite being trained on substantially fewer examples. A simple consensus of the SVM, PSIPRED and PROFsec achieves significantly higher prediction accuracy than the individual methods. Availability: The SVM classifier is available from the authors. Work is in progress to make the method available on-line and to integrate the SVM predictions into the PSIPRED server.
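
As a hedged illustration of the kind of scheme described (binary SVMs combined into a three-state predictor), the sketch below uses scikit-learn's one-vs-one combination of binary SVMs; the random features standing in for sequence-profile windows, and all parameters, are placeholder assumptions rather than the paper's setup.

```python
# Hedged sketch: combining binary SVMs into a 3-state (H/E/C) secondary-structure
# predictor via a one-vs-one scheme, in the spirit of the method described above.
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsOneClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 15 * 20))   # placeholder: 15-residue window x 20-dim profile
y = rng.integers(0, 3, size=600)      # placeholder labels: 0=H, 1=E, 2=C

clf = OneVsOneClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)  # cross-validated Q3-style accuracy
print(f"mean 3-state accuracy on random data: {scores.mean():.2f}")
```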

Relevance:

100.00%

Publisher:

Abstract:

This paper examines the implications of using marketing margins in applied commodity price analysis. The marketing-margin concept has a long and distinguished history, but it has caused considerable controversy. This is particularly the case in the context of analyzing the distribution of research gains in multi-stage production systems. We derive optimal tax schemes for raising revenues to finance research and promotion in a downstream market, derive the rules for efficient allocation of the funds, and compare the rules with and without the marketing-margin assumption. Applying the methodology to quarterly time series for the Australian beef-cattle sector, and with several caveats, we conclude that, during the period 1978:2 - 1988:4, the Australian Meat and Livestock Corporation optimally allocated research resources.

Relevance:

100.00%

Publisher:

Abstract:

In this paper a generalization of collectively compact operator theory in Banach spaces is developed. A feature of the new theory is that the operators involved are no longer required to be compact in the norm topology. Instead it is required that the image of a bounded set under the operator family is sequentially compact in a weaker topology. As an application, the theory developed is used to establish solvability results for a class of systems of second kind integral equations on unbounded domains, this class including in particular systems of Wiener-Hopf integral equations with L1 convolution kernels.
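
For readers unfamiliar with the terminology, a system of second-kind Wiener-Hopf integral equations with L1 convolution kernels has the standard form below; this is a textbook statement rather than a formula taken from the paper itself.

```latex
% Standard form of a system of second-kind Wiener-Hopf integral equations on
% the half-line, with matrix convolution kernel whose entries lie in L^1:
x_i(s) \;-\; \sum_{j=1}^{n} \int_{0}^{\infty} k_{ij}(s - t)\, x_j(t)\,\mathrm{d}t
\;=\; y_i(s), \qquad s \ge 0,\quad i = 1,\dots,n,
\qquad k_{ij} \in L^1(\mathbb{R}).
```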

Relevance:

100.00%

Publisher:

Abstract:

Recent studies have shown that features extracted from brain MRIs can discriminate well between Alzheimer's disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem via the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of the features had not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machine-based wrapper improves the accuracy of binary classification.
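
Below is a hedged sketch of the kind of filter-plus-wrapper combination mentioned above, built from scikit-learn components (a Random Forest importance filter followed by an SVM-based recursive-elimination wrapper). The synthetic data, thresholds and parameters are illustrative assumptions, not the study's configuration.

```python
# Hedged sketch: Random Forest-based filter followed by an SVM-based wrapper
# (recursive feature elimination), then a binary SVM classifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel, RFE
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=200, n_features=300, n_informative=20,
                           random_state=0)   # stand-in for ROI-based MRI features

pipe = Pipeline([
    ("filter", SelectFromModel(RandomForestClassifier(n_estimators=200,
                                                      random_state=0))),
    ("wrapper", RFE(SVC(kernel="linear", C=1.0), n_features_to_select=20)),
    ("clf", SVC(kernel="linear", C=1.0)),
])
print("binary CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```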

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude as well as the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object as would be required within a tomographic setting and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
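
The extreme learning machine idea referred to above (a random hidden layer whose output weights are fitted in closed form by least squares) can be sketched as follows. This real-valued NumPy version with synthetic data is an illustrative assumption only and omits the complex-valued treatment of amplitude and phase used in the paper.

```python
# Hedged sketch of a basic (real-valued) extreme learning machine: fixed random
# hidden-layer weights, output weights solved by least squares.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, n_hidden, n_classes = 300, 100, 256, 6
X = rng.normal(size=(n_samples, n_features))     # stand-in for THz spectra
y = rng.integers(0, n_classes, size=n_samples)   # stand-in class labels
Y = np.eye(n_classes)[y]                         # one-hot targets

W = rng.normal(size=(n_features, n_hidden))      # random input weights (fixed)
b = rng.normal(size=n_hidden)                    # random biases (fixed)
H = np.tanh(X @ W + b)                           # hidden-layer activations

beta, *_ = np.linalg.lstsq(H, Y, rcond=None)     # closed-form output weights
predictions = np.argmax(H @ beta, axis=1)
print("training accuracy:", (predictions == y).mean())
```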

Relevance:

100.00%

Publisher:

Abstract:

Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
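
For reference, the two headline metrics quoted above (multi-class accuracy and AUC) can be computed along the following lines; the one-vs-rest averaging shown is a common convention assumed here, since the challenge's exact AUC definition is not given in this abstract.

```python
# Hedged sketch: evaluating a 3-class diagnosis (AD / MCI / control) with
# overall accuracy and a one-vs-rest averaged AUC, on synthetic data only.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 3, size=354)           # 0=AD, 1=MCI, 2=control
y_score = rng.dirichlet(np.ones(3), size=354)   # predicted class probabilities
y_pred = y_score.argmax(axis=1)

print("accuracy:", accuracy_score(y_true, y_pred))
print("AUC (OvR, macro):", roc_auc_score(y_true, y_score, multi_class="ovr"))
```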

Relevance:

100.00%

Publisher:

Abstract:

Particle-conserving lattice-gas models with infinitely many absorbing states are studied on a one-dimensional lattice. As the particle density is increased, they exhibit a phase transition from an absorbing to an active phase. The models are solved exactly using the transfer matrix technique, from which the critical behavior is obtained. We find that the exponent related to the order parameter, the density of active sites, is 1 for all models studied except one, which has exponent 2.
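
In the standard notation for absorbing-state transitions (a textbook convention, not spelled out in the abstract), the exponent referred to is the order-parameter exponent β in the scaling relation below.

```latex
% Order-parameter scaling near the absorbing-state transition: the density of
% active sites \rho_a vanishes at the critical particle density \rho_c as
\rho_a \;\sim\; (\rho - \rho_c)^{\beta}, \qquad \rho \to \rho_c^{+},
% with \beta = 1 for all but one of the models studied, and \beta = 2 for that one.
```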

Relevance:

100.00%

Publisher:

Abstract:

The predictive control technique has gained a growing number of adherents in recent years because its parameters are easy to tune, its concepts extend naturally to multi-input/multi-output (MIMO) systems, nonlinear process models can be linearised around an operating point and used directly in the controller, and, above all, it is the only methodology that can take into account, during controller design, the limitations on the control signals and on the process output. The time-varying weighting generalized predictive control (TGPC) studied in this work is a further alternative to the many existing predictive controllers. It is a modification of generalized predictive control (GPC) in which a reference model, calculated according to design parameters previously established by the designer, is used together with a new criterion function that, when minimized, yields the best parameters for the controller. A genetic algorithm technique is used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through performance, stability and robustness criteria. To compare the results achieved by the TGPC controller, GPC and proportional-integral-derivative (PID) controllers are used, with all techniques applied to stable, unstable and non-minimum-phase plants. The simulated examples are carried out with MATLAB. It is verified that the modifications implemented in the TGPC demonstrate the efficiency of this algorithm.
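
For orientation, the standard GPC criterion that TGPC generalizes with time-varying weights has the familiar form below; this is the textbook expression, and the specific new criterion function proposed in the work is not reproduced in this abstract.

```latex
% Standard GPC cost over prediction horizon N_1..N_2 and control horizon N_u,
% with (in TGPC) time-varying weighting sequences \delta(j) and \lambda(j):
J = \sum_{j=N_1}^{N_2} \delta(j)\,\bigl[\hat{y}(t+j \mid t) - w(t+j)\bigr]^{2}
  \;+\; \sum_{j=1}^{N_u} \lambda(j)\,\bigl[\Delta u(t+j-1)\bigr]^{2}.
```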

Relevance:

100.00%

Publisher:

Abstract:

In this work we investigate the stochastic behavior of a large class of systems with variable damping which are described by a time-dependent Lagrangian. Our stochastic approach is based on the Langevin treatment describing the motion of a classical Brownian particle of mass m. Two situations of physical interest are considered. In the first, we discuss in detail an application of the standard Langevin treatment (white noise) to the variable-damping system. In the second, a more general viewpoint is adopted by assuming a given expression for the so-called colored noise. For both cases, the basic differential equations are solved analytically and all the physically relevant quantities are determined explicitly. The results depend on an arbitrary parameter q measuring how far the behavior of the system departs from that of the standard Brownian particle with constant viscosity. Several types of stochastic behavior (superdiffusive and subdiffusive) are obtained as the free parameter varies continuously. However, all the results of the conventional Langevin approach with constant damping are recovered in the limit q = 1.
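
As a point of reference, a Langevin equation with time-dependent damping and a white-noise force has the generic form below; this is a textbook statement, and the specific time-dependent Lagrangian and the q-dependent generalization used in the work are not given in this abstract.

```latex
% Generic Langevin equation for a Brownian particle of mass m with
% time-dependent damping \gamma(t) driven by a stochastic force \xi(t):
m\,\frac{\mathrm{d}v}{\mathrm{d}t} = -\,\gamma(t)\,v + \xi(t),
\qquad
\langle \xi(t) \rangle = 0,
\qquad
\langle \xi(t)\,\xi(t') \rangle = 2D\,\delta(t - t') \quad \text{(white noise)}.
```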