882 results for Branch and bound algorithms


Relevance:

100.00%

Publisher:

Abstract:

Elevated ocean temperatures can cause coral bleaching, the loss of colour from reef-building corals because of a breakdown of the symbiosis with the dinoflagellate Symbiodinium. Recent studies have warned that global climate change could increase the frequency of coral bleaching and threaten the long-term viability of coral reefs. These assertions are based on projecting the coarse output from atmosphere-ocean general circulation models (GCMs) to the local conditions around representative coral reefs. Here, we conduct the first comprehensive global assessment of coral bleaching under climate change by adapting the NOAA Coral Reef Watch bleaching prediction method to the output of a low- and high-climate sensitivity GCM. First, we develop and test algorithms for predicting mass coral bleaching with GCM-resolution sea surface temperatures for thousands of coral reefs, using a global coral reef map and 1985-2002 bleaching prediction data. We then use the algorithms to determine the frequency of coral bleaching and required thermal adaptation by corals and their endosymbionts under two different emissions scenarios. The results indicate that bleaching could become an annual or biannual event for the vast majority of the world's coral reefs in the next 30-50 years without an increase in thermal tolerance of 0.2-1.0 degrees C per decade. The geographic variability in required thermal adaptation found in each model and emissions scenario suggests that coral reefs in some regions, like Micronesia and western Polynesia, may be particularly vulnerable to climate change. Advances in modelling and monitoring will refine the forecast for individual reefs, but this assessment concludes that the global prognosis is unlikely to change without an accelerated effort to stabilize atmospheric greenhouse gas concentrations.
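
The NOAA Coral Reef Watch method adapted in the study above accumulates thermal stress as Degree Heating Weeks (DHW): weekly sea surface temperature exceedances of at least 1 degree C above the climatological maximum monthly mean (MMM), summed over a rolling 12-week window, with roughly 4 and 8 degree C-weeks as the conventional thresholds for likely and severe bleaching. A minimal sketch of that accumulation is given below; the numbers are hypothetical and the sketch is not the study's GCM-resolution algorithm.

    # Minimal Degree Heating Week (DHW) accumulation, following the standard
    # Coral Reef Watch convention of summing weekly SST exceedances of at
    # least 1 degree C above the maximum monthly mean (MMM) over 12 weeks.
    # The alert threshold of 4 degree C-weeks is the usual convention, not a
    # value taken from the abstract above.

    def degree_heating_weeks(weekly_sst, mmm):
        """Return a list of DHW values (degree C-weeks), one per week."""
        hotspots = [max(0.0, sst - mmm) for sst in weekly_sst]
        dhw = []
        for i in range(len(hotspots)):
            window = hotspots[max(0, i - 11): i + 1]          # trailing 12 weeks
            dhw.append(sum(h for h in window if h >= 1.0))    # only anomalies >= 1 C count
        return dhw

    sst = [28.9, 29.4, 30.1, 30.6, 30.8, 30.7, 30.2, 29.8]    # hypothetical weekly SSTs
    for week, value in enumerate(degree_heating_weeks(sst, mmm=29.3)):
        alert = "bleaching likely" if value >= 4 else "watch"
        print(f"week {week}: DHW = {value:.1f} C-weeks ({alert})")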

Relevance:

100.00%

Publisher:

Abstract:

Deep-frying, which consists of immersing a wet material in a large volume of hot oil, is a process easily adapted to drying, rather than cooking, materials. A suitable material for drying is sewage sludge, which may be dried using recycled cooking oils (RCO) as the frying oil. One advantage is that this prepares both materials for convenient disposal by incineration. This study examines fry-drying of municipal sewage sludge using recycled cooking oil. The transport processes occurring during fry-drying were monitored through sample weight, temperature, and image analysis. Because the samples are thicker and wetter than common fried foods, high residual moisture remains in the sludge when the boiling front reaches the geometric center of the sample, suggesting that the operation is heat-transfer controlled only during the first half of the process, after which additional mechanisms allow complete drying of the sample. A series of mechanisms comprising four stages (i.e., initial heating accompanied by the onset of surface boiling, film vapor regime, transitional nucleate boiling, and bound water removal) is proposed. To study the effect of the operating conditions on the fry-drying kinetics, different oil temperatures (120 to 180 degrees C), sample diameters (D = 15 to 25 mm), and initial moisture contents (4.8 and 5.6 kg water·kg⁻¹ total dry solids) were investigated.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents new laboratory data on the generation of long waves by the shoaling and breaking of transient-focused short-wave groups. Direct offshore radiation of long waves from the breakpoint is shown experimentally for the first time. High spatial resolution enables identification of the relationship between the spatial gradients of the short-wave envelope and the long-wave surface. This relationship is consistent with radiation stress theory even well inside the surf zone and appears as a result of the strong nonlinear forcing associated with the transient group. In shallow water, the change in depth across the group leads to asymmetry in the forcing, which generates a significant dynamic setup in front of the group during shoaling. Strong amplification of the incident dynamic setup occurs after short-wave breaking. The data show the radiation of a transient long wave dominated by a pulse of positive elevation, preceded and followed by weaker trailing waves of negative elevation. The instantaneous cross-shore structure of the long wave shows the mechanics of the reflection process and the formation of a transient node in the inner surf zone. The wave run-up and the relative amplitudes of the radiated and incident long waves suggest significant modification of the incident bound wave in the inner surf zone and the dominance of long waves generated by the breaking process. It is proposed that these conditions occur when the primary short waves and the bound wave are not shallow-water waves at the breakpoint. A simple criterion is given to determine these conditions, which generally occur for the important case of storm waves.
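
The abstract does not reproduce the criterion itself; a conventional check of whether waves are shallow-water waves at the breakpoint uses the linear dispersion relation and the rule of thumb that kh < pi/10 (depth less than about one-twentieth of the wavelength). The sketch below is only that generic check, with illustrative numbers, not the criterion proposed in the paper.

    import math

    def wavenumber(period, depth, g=9.81):
        """Solve the linear dispersion relation w^2 = g k tanh(k h) for k by bisection."""
        w = 2 * math.pi / period
        f = lambda k: g * k * math.tanh(k * depth) - w * w
        lo, hi = 1e-8, 10.0 / depth          # bracket: f(lo) < 0, f(hi) > 0 for realistic waves
        for _ in range(80):
            mid = 0.5 * (lo + hi)
            if f(mid) < 0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    def is_shallow_water(period, depth):
        """Common rule of thumb: shallow-water behaviour when k h < pi / 10."""
        return wavenumber(period, depth) * depth < math.pi / 10

    # Illustrative only: a 10 s wave reaching a 1.5 m deep breakpoint.
    print(is_shallow_water(period=10.0, depth=1.5))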

Relevance:

100.00%

Publisher:

Abstract:

In empirical studies of Evolutionary Algorithms, it is usually desirable to evaluate and compare algorithms using as many different parameter settings and test problems as possible, in order to have a clear and detailed picture of their performance. Unfortunately, the total number of experiments required may be very large, which often makes such research work computationally prohibitive. In this paper, the application of a statistical method called racing is proposed as a general-purpose tool to reduce the computational requirements of large-scale experimental studies in evolutionary algorithms. Experimental results are presented that show that racing typically requires only a small fraction of the cost of an exhaustive experimental study.
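
Racing, in the sense used here, evaluates all candidate configurations problem by problem and discards a candidate as soon as the accumulated results show it to be reliably worse than the current best, so that poor configurations do not consume the full experimental budget. A minimal sketch of one such scheme follows; the elimination rule uses a simple Hoeffding-style confidence bound and is illustrative only, not the specific statistical test used in the paper.

    import math
    import random

    def hoeffding_radius(n, value_range, confidence=0.95):
        """Half-width of a Hoeffding confidence interval after n observations."""
        return value_range * math.sqrt(math.log(2.0 / (1.0 - confidence)) / (2.0 * n))

    def race(candidates, evaluate, problems, value_range=1.0):
        """Evaluate candidates problem by problem, dropping reliably worse ones.

        `evaluate(candidate, problem)` returns a cost in [0, value_range]; lower is better.
        """
        alive = {c: [] for c in candidates}
        for problem in problems:
            for c in alive:
                alive[c].append(evaluate(c, problem))
            means = {c: sum(v) / len(v) for c, v in alive.items()}
            n = len(next(iter(alive.values())))
            radius = hoeffding_radius(n, value_range)
            best = min(means.values())
            # Drop every candidate whose lower bound exceeds the best upper bound.
            alive = {c: v for c, v in alive.items()
                     if means[c] - radius <= best + radius}
            if len(alive) == 1:
                break
        return min(alive, key=lambda c: sum(alive[c]) / len(alive[c]))

    # Toy usage: "candidates" are mutation rates, "problems" are random seeds.
    candidates = [0.01, 0.05, 0.1, 0.5]
    problems = list(range(30))
    evaluate = lambda rate, seed: min(1.0, abs(random.Random(seed).gauss(rate, 0.05)))
    print(race(candidates, evaluate, problems))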

Relevance:

100.00%

Publisher:

Abstract:

The rapid developments in computer technology have resulted in widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems and introduces an object-oriented methodology to enable modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of the exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software tool kit was developed and used to implement a Petri net analysis tool and Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
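
As a small illustration of the place/transition machinery such a toolkit builds on, the sketch below implements token-based enabling and firing of transitions; it is written in Python rather than the thesis's C++, and the class and example names are chosen for clarity, not taken from the thesis.

    # Minimal place/transition Petri net: a transition is enabled when every
    # input place holds at least as many tokens as the arc weight, and firing
    # moves tokens from input places to output places.

    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)          # place -> token count
            self.transitions = {}                 # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

        def fire(self, name):
            if not self.enabled(name):
                raise ValueError(f"transition {name!r} is not enabled")
            inputs, outputs = self.transitions[name]
            for p, w in inputs.items():
                self.marking[p] -= w
            for p, w in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + w

    # Toy example: a machine acquiring and releasing a shared resource.
    net = PetriNet({"idle": 1, "resource": 1, "busy": 0})
    net.add_transition("acquire", {"idle": 1, "resource": 1}, {"busy": 1})
    net.add_transition("release", {"busy": 1}, {"idle": 1, "resource": 1})
    net.fire("acquire")
    print(net.marking)   # {'idle': 0, 'resource': 0, 'busy': 1}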

Relevance:

100.00%

Publisher:

Abstract:

With the ability to collect and store increasingly large datasets on modern computers comes the need to process the data in a way that is useful to a Geostatistician or application scientist. Although the storage requirements only scale linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively, for likelihood-based Geostatistics. Various methods have been proposed, and are extensively used, in an attempt to overcome these complexity issues. This thesis introduces a number of principled techniques for treating large datasets, with an emphasis on three main areas: reduced-complexity covariance matrices, sparsity in the covariance matrix, and parallel algorithms for distributed computation. These techniques are presented individually, but it is also shown how they can be combined to produce techniques for further improving computational efficiency.
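
The quadratic and cubic scaling mentioned above comes from building and factorising the n x n covariance matrix when evaluating a Gaussian likelihood. A minimal NumPy sketch of that bottleneck is shown below, using an arbitrary squared-exponential covariance purely for illustration.

    import numpy as np

    def gaussian_log_likelihood(y, X, lengthscale=1.0, variance=1.0, noise=0.1):
        """Log-likelihood of a zero-mean Gaussian process model.

        Building K costs O(n^2) memory and the Cholesky factorisation costs
        O(n^3) time, which is the bottleneck the thesis addresses.
        """
        n = len(y)
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
        K = variance * np.exp(-0.5 * d2 / lengthscale**2) + noise * np.eye(n)
        L = np.linalg.cholesky(K)                              # the O(n^3) step
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))    # K^{-1} y via triangular solves
        return (-0.5 * y @ alpha
                - np.log(np.diag(L)).sum()
                - 0.5 * n * np.log(2 * np.pi))

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(200, 2))                      # 200 spatial locations
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    print(gaussian_log_likelihood(y, X))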

Relevance:

100.00%

Publisher:

Abstract:

The aims of the project were twofold: 1) to investigate classification procedures for remotely sensed digital data, in order to develop modifications to existing algorithms and propose novel classification procedures; and 2) to investigate and develop algorithms for contextual enhancement of classified imagery in order to increase classification accuracy. The following classifiers were examined: box, decision tree, minimum distance, and maximum likelihood. In addition to these, the following algorithms were developed during the course of the research: deviant distance, look-up table, and an automated decision tree classifier using expert systems technology. Clustering techniques for unsupervised classification were also investigated. The contextual enhancements investigated were mode filters, small area replacement, and Wharton's CONAN algorithm. Additionally, methods for noise- and edge-based declassification and contextual reclassification, non-probabilistic relaxation, and relaxation based on Markov chain theory were developed. The advantages of per-field classifiers and Geographical Information Systems were investigated. The conclusions presented suggest suitable combinations of classifier and contextual enhancement, given user accuracy requirements and time constraints. These were then tested for validity using a different data set. A brief examination of the utility of the recommended contextual algorithms for reducing the effects of data noise was also carried out.
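
As an illustration of one of the simpler classifiers listed above, a minimum-distance classifier assigns each pixel to the class whose mean spectral vector is nearest in feature space. The sketch below uses hypothetical band values and is not code from the thesis.

    import numpy as np

    def minimum_distance_classify(pixels, class_means):
        """Assign each pixel to the class with the nearest mean spectral vector."""
        labels = list(class_means)
        means = np.array([class_means[c] for c in labels])        # (n_classes, n_bands)
        d2 = ((pixels[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        return [labels[i] for i in d2.argmin(axis=1)]

    # Hypothetical training means for three land-cover classes over two bands.
    class_means = {"water": [20, 10], "vegetation": [40, 90], "urban": [80, 60]}
    pixels = np.array([[22, 12], [78, 55], [45, 85]])
    print(minimum_distance_classify(pixels, class_means))   # ['water', 'urban', 'vegetation']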

Relevance:

100.00%

Publisher:

Abstract:

This study investigated the variability of response associated with various perimetric techniques, with the aim of improving the clinical interpretation of automated static threshold perimetry. Evaluation of a third generation of perimetric threshold algorithms (SITA) demonstrated a reduction in test duration of approximately 50% both in normal subjects and in glaucoma patients. SITA produced a slightly higher, but clinically insignificant, Mean Sensitivity than the previous generations of algorithms. This was associated with a decreased between-subject variability in sensitivity and, hence, lower confidence intervals for normality. In glaucoma, the SITA algorithms gave rise to more statistically significant visual field defects and a similar between-visit repeatability to the Full Threshold and FASTPAC algorithms. The higher estimated sensitivity observed with SITA compared to Full Threshold and FASTPAC was not attributed to a reduction in the fatigue effect. The investigation of a novel method of maintaining patient fixation, a roving fixation target which paused immediately prior to the stimulus presentation, revealed a greater degree of fixational instability with the roving fixation target than with the conventional static fixation target. Previous experience with traditional white-on-white perimetry did not eradicate the learning effect in short-wavelength automated perimetry (SWAP) in a group of ocular hypertensive patients. The learning effect was smaller in an experienced group of patients than in a naive group, but was still significant enough to require that patients undertake a series of at least three familiarisation tests with SWAP.

Relevance:

100.00%

Publisher:

Abstract:

INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. The requirements were (i) to use open standards for spatial data such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) to use a suitable environment for statistical modelling and computation, and (iii) to produce an integrated, open-source solution. The system couples an open-source Web Processing Service (developed by 52°North), which accepts data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
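
As an indication of how a client would begin interacting with such a service, the sketch below issues a standard OGC WPS 1.0.0 GetCapabilities request over HTTP; the endpoint URL is hypothetical, and the actual INTAMAP request and process identifiers are not given in the abstract.

    import requests

    # Hypothetical endpoint; the real INTAMAP service address is not given above.
    # The query parameters follow the generic OGC WPS 1.0.0 key-value convention
    # for discovering which processes (e.g. automatic interpolation) a service offers.
    WPS_URL = "http://example.org/intamap/wps"

    response = requests.get(WPS_URL, params={
        "service": "WPS",
        "version": "1.0.0",
        "request": "GetCapabilities",
    })
    print(response.status_code)
    print(response.text[:500])   # XML capabilities document listing available processes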

Relevance:

100.00%

Publisher:

Abstract:

The study here highlights the potential that analytical methods based on Knowledge Discovery in Databases (KDD) methodologies have to aid both the resolution of unstructured marketing/business problems and the process of scholarly knowledge discovery. The authors present and discuss the application of KDD in these situations prior to the presentation of an analytical method based on fuzzy logic and evolutionary algorithms, developed to analyze marketing databases and uncover relationships among variables. A detailed implementation on a pre-existing data set illustrates the method. © 2012 Published by Elsevier Inc.
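
The abstract gives no implementation details, but the general combination it describes, evolutionary search over fuzzy descriptions of variables, can be sketched as follows; the data, membership function, rule, and fitness measure are all illustrative assumptions, not the authors' method.

    import random

    random.seed(1)
    # Toy marketing records (monthly_spend, repeat_buyer); purely illustrative data.
    data = []
    for _ in range(500):
        spend = random.uniform(0, 100)
        data.append((spend, random.random() < 0.2 + 0.006 * spend))

    def membership(spend, low, high):
        """Fuzzy degree to which `spend` counts as 'high spend' (linear ramp)."""
        if spend <= low:
            return 0.0
        if spend >= high:
            return 1.0
        return (spend - low) / (high - low)

    def rule_confidence(low, high):
        """Fuzzy confidence of the rule 'high spend -> repeat buyer'."""
        weights = [membership(s, low, high) for s, _ in data]
        total = sum(weights)
        if total == 0.0:
            return 0.0
        return sum(w for w, (_, repeat) in zip(weights, data) if repeat) / total

    def evolve(generations=50, pop_size=20):
        """Tiny evolutionary search over the ramp parameters (low, high)."""
        pop = [tuple(sorted((random.uniform(0, 100), random.uniform(0, 100))))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda p: rule_confidence(*p), reverse=True)
            parents = pop[: pop_size // 2]
            children = []
            for low, high in parents:                       # mutate each parent
                a = min(100.0, max(0.0, low + random.gauss(0, 5)))
                b = min(100.0, max(0.0, high + random.gauss(0, 5)))
                children.append((min(a, b), max(a, b)))
            pop = parents + children
        best = max(pop, key=lambda p: rule_confidence(*p))
        return best, rule_confidence(*best)

    print(evolve())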

Relevance:

100.00%

Publisher:

Abstract:

The standard reference clinical score quantifying average Parkinson's disease (PD) symptom severity is the Unified Parkinson's Disease Rating Scale (UPDRS). At present, UPDRS is determined by the subjective clinical evaluation of the patient's ability to adequately cope with a range of tasks. In this study, we extend recent findings that UPDRS can be objectively assessed to clinically useful accuracy using simple, self-administered speech tests, without requiring the patient's physical presence in the clinic. We apply a wide range of known speech signal processing algorithms to a large database (approx. 6000 recordings from 42 PD patients, recruited to a six-month, multi-centre trial) and propose a number of novel, nonlinear signal processing algorithms which reveal pathological characteristics in PD more accurately than existing approaches. Robust feature selection algorithms select the optimal subset of these algorithms, which is fed into non-parametric regression and classification algorithms, mapping the signal processing algorithm outputs to UPDRS. We demonstrate rapid, accurate replication of the UPDRS assessment with clinically useful accuracy (about 2 UPDRS points difference from the clinicians' estimates, p < 0.001). This study supports the viability of frequent, remote, cost-effective, objective, accurate UPDRS telemonitoring based on self-administered speech tests. This technology could facilitate large-scale clinical trials into novel PD treatments.
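
The processing chain described above, extracting many speech features, selecting a subset, and mapping the subset to UPDRS with a non-parametric learner, can be sketched with off-the-shelf tools. The example below uses scikit-learn on synthetic data; the actual features, selection method, and regressors used in the study are not reproduced here.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.feature_selection import SelectKBest, f_regression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    n_recordings, n_features = 600, 40            # synthetic stand-in for speech features
    X = rng.standard_normal((n_recordings, n_features))
    updrs = 30 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 2, n_recordings)  # toy target

    # Feature selection followed by a non-parametric regressor, mirroring the
    # selection -> regression structure described in the abstract.
    model = make_pipeline(
        SelectKBest(f_regression, k=10),
        RandomForestRegressor(n_estimators=200, random_state=0),
    )
    mae = -cross_val_score(model, X, updrs, cv=5,
                           scoring="neg_mean_absolute_error").mean()
    print(f"cross-validated mean absolute error: {mae:.2f} UPDRS points")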

Relevance:

100.00%

Publisher:

Abstract:

We review our recent progress on the study of new nonlinear mechanisms of pulse shaping in passively mode-locked fibre lasers. These include a mode-locking regime featuring pulses with a triangular distribution of the intensity, and spectral compression arising from nonlinear pulse propagation. We also report on our recent experimental studies unveiling new families of vector solitons with precessing states of polarization for multipulsing and bound-state soliton operations in a carbon nanotube mode-locked fibre laser with anomalous dispersion cavity. © 2013 IEEE.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we propose a resource allocation scheme to minimize transmit power for multicast orthogonal frequency division multiple access systems. The proposed scheme allows users to have different symbol error rates (SER) across subcarriers and guarantees an average bit error rate and transmission rate for all users. We first provide an algorithm to determine the optimal bits and target SER on subcarriers. Because the worst-case complexity of the optimal algorithm is exponential, we further propose a suboptimal algorithm that separately assigns bits and adjusts SER with a lower complexity. Numerical results show that the proposed algorithm can effectively improve the performance of multicast orthogonal frequency division multiple access systems and that the performance of the suboptimal algorithm is close to that of the optimal one. Copyright © 2012 John Wiley & Sons, Ltd.
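
A common building block for this kind of allocation is greedy bit loading: bits are added one at a time to whichever subcarrier needs the least extra transmit power for its next bit, until the rate target is met. The sketch below illustrates only that generic idea; the M-QAM power model and the parameters are textbook assumptions, not the optimal or suboptimal algorithms proposed in the paper.

    import heapq

    def greedy_bit_loading(channel_gains, target_bits, snr_gap=4.0, noise=1.0):
        """Greedy bit loading: keep adding a bit to the cheapest subcarrier.

        Uses the textbook M-QAM power model P(b) = (2^b - 1) * gap * noise / |h|^2
        (an assumption for illustration, not the paper's model).
        """
        bits = [0] * len(channel_gains)

        def extra_power(i):
            b = bits[i]
            return (2 ** (b + 1) - 2 ** b) * snr_gap * noise / channel_gains[i] ** 2

        heap = [(extra_power(i), i) for i in range(len(channel_gains))]
        heapq.heapify(heap)
        for _ in range(target_bits):
            _, i = heapq.heappop(heap)     # subcarrier whose next bit is cheapest
            bits[i] += 1
            heapq.heappush(heap, (extra_power(i), i))
        total_power = sum((2 ** b - 1) * snr_gap * noise / g ** 2
                          for b, g in zip(bits, channel_gains))
        return bits, total_power

    # Hypothetical per-subcarrier channel gains for one multicast group.
    gains = [1.2, 0.8, 0.5, 1.5, 0.9, 0.3]
    print(greedy_bit_loading(gains, target_bits=12))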

Relevance:

100.00%

Publisher:

Abstract:

In this paper we propose algorithms for combining and ranking answers from distributed heterogeneous data sources in the context of a multi-ontology Question Answering task. Our proposal includes a merging algorithm that aggregates, combines, and filters ontology-based search results, and three different ranking algorithms that sort the final answers according to different criteria, such as popularity, confidence, and the semantic interpretation of results. An experimental evaluation on a large-scale corpus indicates improvements in the quality of the search results with respect to a scenario where the merging and ranking algorithms were not applied. These collective methods for merging and ranking make it possible to answer questions whose answers are distributed across ontologies while, at the same time, filtering out irrelevant answers, fusing similar answers together, and eliciting the most accurate answer(s) to a question.
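
As a small illustration of the kind of merging and popularity-based ranking described above, the sketch below aggregates answers returned by several hypothetical sources, fuses duplicates by normalising the answer strings, and ranks by the number of supporting sources and mean confidence; the criterion is illustrative, not the specific algorithms proposed in the paper.

    from collections import defaultdict

    def merge_and_rank(results):
        """Merge answers from several ontology-based sources and rank them.

        `results` maps a source name to a list of (answer, confidence) pairs.
        Popularity = number of distinct sources returning the answer; ties are
        broken by mean confidence (an illustrative criterion only).
        """
        merged = defaultdict(list)
        for source, answers in results.items():
            for answer, confidence in answers:
                merged[answer.strip().lower()].append((source, confidence))
        ranked = sorted(
            merged.items(),
            key=lambda item: (len({s for s, _ in item[1]}),
                              sum(c for _, c in item[1]) / len(item[1])),
            reverse=True,
        )
        return [(answer, len(votes)) for answer, votes in ranked]

    # Hypothetical answers to "Which rivers flow through Paris?" from three sources.
    results = {
        "ontology_A": [("Seine", 0.9), ("Marne", 0.4)],
        "ontology_B": [("Seine", 0.8)],
        "ontology_C": [("seine", 0.7), ("Loire", 0.2)],
    }
    print(merge_and_rank(results))   # [('seine', 3), ('marne', 1), ('loire', 1)]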