25 results for diagnostic and prognostic algorithms developmen
in CentAUR: Central Archive at the University of Reading - UK
                                
Abstract:
Genetic algorithms (GAs) have been introduced into site layout planning, as reported in a number of studies. In these studies, the objective functions were defined so as to employ GAs in searching for the optimal site layout. However, few studies have investigated the actual closeness of the relationships between site facilities, and it is these relationships that ultimately govern the site layout. This study determined that the underlying factors of site layout planning for medium-size projects include work flow, personnel flow, safety and environment, and personal preferences. By finding the weightings on these factors and the corresponding closeness indices between each pair of facilities, a closeness relationship was deduced. Two contemporary mathematical approaches, fuzzy logic theory and an entropy measure, were adopted in deriving these results in order to minimize the uncertainty and vagueness of the collected data and to improve the quality of the information. GAs were then applied to search for the optimal site layout in a medium-size government project using the GeneHunter software, with an objective function that minimizes the total travel distance. An optimal layout was obtained within a short time, which shows that the application of GAs to site layout planning is highly promising and efficient.
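The abstract does not give the GeneHunter configuration, so the following is only a minimal sketch of the underlying idea: a permutation-encoded GA that assigns facilities to candidate locations and minimizes closeness-weighted travel distance. The coordinates, closeness matrix, and GA parameters are all hypothetical, and crossover is omitted for brevity.

```python
import random

# Hypothetical data: 5 facilities, 5 candidate locations (x, y), and a
# symmetric closeness-index matrix standing in for the paper's fuzzy/entropy
# weightings of work flow, personnel flow, safety and personal preference.
LOCATIONS = [(0, 0), (30, 0), (0, 40), (30, 40), (15, 20)]
CLOSENESS = [[0, 5, 2, 1, 4],
             [5, 0, 3, 2, 1],
             [2, 3, 0, 4, 2],
             [1, 2, 4, 0, 5],
             [4, 1, 2, 5, 0]]

def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def total_travel(assignment):
    """Objective: sum of closeness-weighted distances between facilities,
    where assignment[i] is the location index given to facility i."""
    cost = 0.0
    for i in range(len(assignment)):
        for j in range(i + 1, len(assignment)):
            cost += CLOSENESS[i][j] * distance(LOCATIONS[assignment[i]],
                                               LOCATIONS[assignment[j]])
    return cost

def evolve(pop_size=50, generations=200, mut_rate=0.2):
    """Mutation-only GA over permutations (crossover omitted for brevity)."""
    n = len(LOCATIONS)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_travel)
        survivors = pop[: pop_size // 2]   # elitist selection: keep best half
        pop = survivors[:]
        while len(pop) < pop_size:
            child = random.choice(survivors)[:]
            if random.random() < mut_rate:  # swap mutation keeps a valid permutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            pop.append(child)
    return min(pop, key=total_travel)

best = evolve()
print("layout:", best, "travel distance:", round(total_travel(best), 1))
```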
                                
Abstract:
A new parameter-estimation algorithm, which minimises the cross-validated prediction error for linear-in-the-parameters models, is proposed, based on stacked regression and an evolutionary algorithm. Using a criterion called the mean dispersion error (MDE), it is first shown that cross-validation is very important for prediction with linear-in-the-parameters models. Stacked regression, which can be regarded as a sophisticated type of cross-validation, is then combined with an evolutionary algorithm to produce the new parameter-estimation algorithm, which preserves the parsimony of a concise model structure determined using the forward orthogonal least squares (OLS) algorithm. The PRESS prediction errors are used for cross-validation, and the sunspot and Canadian lynx time series are used to demonstrate the new algorithm.
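The stacking and evolutionary components go beyond what the abstract specifies, but the PRESS criterion itself is standard. A minimal sketch, using the hat-matrix identity e_i / (1 - h_ii) to obtain leave-one-out residuals without refitting; the data and candidate models below are toy examples:

```python
import numpy as np

def press(X, y):
    """Sum of squared leave-one-out (PRESS) residuals for an OLS fit,
    via the hat-matrix shortcut e_i / (1 - h_ii): no refitting needed."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    h = np.diag(X @ np.linalg.pinv(X.T @ X) @ X.T)  # leverages h_ii
    return float(np.sum((resid / (1.0 - h)) ** 2))

# Compare two candidate linear-in-the-parameters models on toy data:
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 40)
y = 1.0 + 2.0 * t + rng.normal(0, 0.1, t.size)
X1 = np.column_stack([np.ones_like(t), t])               # true structure
X2 = np.column_stack([np.ones_like(t), t, t**2, t**3])   # over-parameterised
print(press(X1, y), press(X2, y))  # the parsimonious model usually wins
```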
                                
Abstract:
A new autonomous ship collision-free (ASCF) trajectory navigation and control system is introduced, with a new recursive navigation algorithm based on analytic geometry and convex set theory for collision-free ship guidance. The underlying assumption is that geometric information about the ship's environment is available in the form of a polygon-shaped free space, which may easily be generated from a 2D image or from plots of physical hazards and other constraints such as collision avoidance regulations. The navigation command is given as a heading command sequence, based on generating a waypoint that falls within a small neighborhood of the current position; using convex set theory, the sequence of waypoints along the trajectory is guaranteed to lie within a bounded obstacle-free region. A neurofuzzy network predictor, which in practice uses only observed input/output data generated by onboard sensors, external sensors, or a sensor fusion algorithm, and which controls the ship heading angle through the rudder deflection angle, is utilised in a simulation of an ESSO 190000 dwt tanker model to demonstrate the effectiveness of the system.
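The paper's recursive analytic-geometry construction and the neurofuzzy controller are not reproduced here; the sketch below only illustrates the convex-set idea, checking that a candidate waypoint near the current position stays inside a convex polygonal free region before issuing a heading command. The greedy heading search and all numbers are hypothetical.

```python
import math

def inside_convex(poly, p):
    """True if point p lies inside convex polygon poly (counterclockwise
    vertex order), by requiring p on the left of every edge."""
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) < 0:
            return False
    return True

def next_waypoint(pos, goal, free_region, step=0.5, n_headings=36):
    """Choose a nearby waypoint inside the convex free region that makes
    the most progress toward the goal; return (waypoint, heading)."""
    best = None
    for k in range(n_headings):
        theta = 2.0 * math.pi * k / n_headings
        cand = (pos[0] + step * math.cos(theta),
                pos[1] + step * math.sin(theta))
        if inside_convex(free_region, cand):
            d = math.hypot(goal[0] - cand[0], goal[1] - cand[1])
            if best is None or d < best[0]:
                best = (d, cand, theta)
    return (best[1], best[2]) if best else None

free = [(0, 0), (10, 0), (10, 8), (0, 8)]   # CCW rectangle as free space
print(next_waypoint((1.0, 1.0), (9.0, 7.0), free))
```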
                                
Abstract:
The World Weather Research Programme (WWRP) and the World Climate Research Programme (WCRP) have identified collaborations and scientific priorities to accelerate advances in analysis and prediction at subseasonal-to-seasonal time scales, which include i) advancing knowledge of mesoscale–planetary-scale interactions and their prediction; ii) developing high-resolution global–regional climate simulations, with advanced representation of physical processes, to improve the predictive skill of subseasonal and seasonal variability of high-impact events, such as seasonal droughts and floods, blocking, and tropical and extratropical cyclones; iii) contributing to the improvement of data assimilation methods for monitoring and prediction with coupled ocean–atmosphere–land and Earth system models; and iv) developing and transferring diagnostic and prognostic information tailored to socioeconomic decision making. The document puts forward the specific underpinning research, linkages, and requirements necessary to achieve the goals of the proposed collaboration.
                                
Abstract:
The variability of results from different automated methods for the detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset: the 1989–2009 period of the European Centre for Medium-Range Weather Forecasts (ECMWF) interim reanalysis (ERA-Interim) data. This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. Globally, the methods agree well on the geographical distribution in large oceanic regions, the interannual variability of cyclone numbers, the geographical patterns of strong trends, and the distribution shape of many life cycle characteristics. In contrast, the largest disparities exist for the total number of cyclones, the detection of weak cyclones, and the distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and dissolution phases.
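The fifteen IMILAST algorithms differ in the variable tracked, the thresholds used, and the matching logic, none of which the abstract specifies. The sketch below shows only one common minimal recipe, detecting local minima of mean sea level pressure and linking them between time steps by nearest neighbour; the threshold and search radius are hypothetical.

```python
import numpy as np

def detect_cyclones(mslp, threshold=100500.0):
    """Flag grid points that are 3x3 local minima of mean sea level
    pressure (Pa) below a hypothetical threshold."""
    centres = []
    for i in range(1, mslp.shape[0] - 1):
        for j in range(1, mslp.shape[1] - 1):
            patch = mslp[i - 1:i + 2, j - 1:j + 2]
            if mslp[i, j] == patch.min() and mslp[i, j] < threshold:
                centres.append((i, j))
    return centres

def link(prev, new, max_dist=5.0):
    """Greedy nearest-neighbour matching of centres (grid units) between
    consecutive time steps; unmatched centres start or end a track."""
    matches = []
    for p in prev:
        best = min(new, default=None,
                   key=lambda c: (c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2)
        if best is not None and \
           ((best[0] - p[0]) ** 2 + (best[1] - p[1]) ** 2) ** 0.5 <= max_dist:
            matches.append((p, best))
    return matches

# Toy field: two lows embedded in a uniform background (Pa).
field = np.full((20, 20), 101325.0)
field[5, 5], field[14, 12] = 99000.0, 98500.0
print(detect_cyclones(field))   # -> [(5, 5), (14, 12)]
```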
                                
Abstract:
Exact error estimates for the evaluation of multidimensional integrals are considered. An estimate is called exact if the rates of convergence of the lower- and upper-bound estimates coincide. An algorithm with such an exact rate is called optimal; such an algorithm has an unimprovable rate of convergence. The existence of exact estimates and optimal algorithms is discussed for some functional spaces that define the regularity of the integrand. Data classes important for practical computations are considered: classes of functions with bounded derivatives and functions satisfying Hölder-type conditions. The aim of the paper is to analyze the performance of two classes of optimal algorithms, deterministic and randomized, for computing multidimensional integrals. It is also shown how the smoothness of the integrand can be exploited to construct better randomized algorithms.
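As one concrete illustration of the final point, not of the paper's specific constructions, the sketch below compares crude Monte Carlo, whose error decays like n^(-1/2) regardless of regularity, with stratified sampling, a standard randomized device that exploits smoothness to reduce variance. All choices here are hypothetical.

```python
import random, math

def crude_mc(f, dim, n):
    """Plain Monte Carlo over the unit cube: O(n**-0.5) error whatever
    the smoothness of f."""
    return sum(f([random.random() for _ in range(dim)]) for _ in range(n)) / n

def stratified_mc(f, dim, cells_per_axis):
    """One jittered sample per subcell: for smooth (Lipschitz/Hoelder)
    integrands this beats the crude rate at the same sample budget."""
    total, count, idx = 0.0, 0, [0] * dim
    while True:
        x = [(i + random.random()) / cells_per_axis for i in idx]
        total += f(x)
        count += 1
        d = 0                      # odometer increment over the multi-index
        while d < dim:
            idx[d] += 1
            if idx[d] < cells_per_axis:
                break
            idx[d] = 0
            d += 1
        if d == dim:
            break
    return total / count

f = lambda x: math.exp(-sum(xi * xi for xi in x))      # smooth test integrand
print(crude_mc(f, 3, 8 ** 3), stratified_mc(f, 3, 8))  # same budget: 512 points
```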
                                
Abstract:
Ground-based remote-sensing observations from Atmospheric Radiation Measurement (ARM) and Cloudnet sites are used to evaluate the clouds predicted by a weather forecasting and climate model. By evaluating the cloud predictions using separate measures for the errors in frequency of occurrence, amount when present, and timing, we provide a detailed assessment of the model performance that is relevant to both weather and climate time-scales. Importantly, this methodology will be of great use when attempting to develop a cloud parametrization scheme, as it provides a clearer picture of the current deficiencies in the predicted clouds. Using the Met Office Unified Model, it is shown that when the cloud fractions produced by a diagnostic and a prognostic cloud scheme are compared, the prognostic scheme improves the biases in frequency of occurrence of low, medium and high cloud and the frequency distributions of cloud amount when cloud is present. The mean cloud profiles are generally improved, although it is shown that in some cases the diagnostic scheme produced misleadingly good mean profiles as a result of compensating errors in frequency of occurrence and amount when present. Some biases remain when using the prognostic scheme, notably an underprediction of mean ice cloud fraction because the amount when present is too low, and an overprediction of mean liquid cloud fraction because the frequency of occurrence is too high.
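The key methodological point, that a mean profile can look right through compensating errors, is easy to show. A minimal sketch of the decomposition, with a hypothetical occurrence threshold:

```python
import numpy as np

def cloud_metrics(cf, present=0.05):
    """Split a cloud-fraction series at one height into frequency of
    occurrence and mean amount when present; their product approximates
    the overall mean cloud fraction. The `present` threshold is
    hypothetical."""
    occ = cf >= present
    freq = occ.mean()                                # how often cloud occurs
    amount = cf[occ].mean() if occ.any() else 0.0    # amount when present
    return freq, amount

model = np.array([0.0, 0.3, 0.5, 0.0, 0.1, 0.0, 0.6])
obs   = np.array([0.0, 0.0, 0.7, 0.2, 0.0, 0.0, 0.8])
for name, cf in (("model", model), ("obs", obs)):
    freq, amount = cloud_metrics(cf)
    print(f"{name}: freq={freq:.2f} amount={amount:.2f} mean={cf.mean():.2f}")
# Similar means can hide opposite errors: too-frequent, too-thin cloud
# versus too-rare, too-thick cloud.
```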
                                
Abstract:
Applications such as neuroscience, telecommunication, online social networking, transport and retail trading give rise to connectivity patterns that change over time. In this work, we address the resulting need for network models and computational algorithms that deal with dynamic links. We introduce a new class of evolving range-dependent random graphs that gives a tractable framework for modelling and simulation. We develop a spectral algorithm for calibrating a set of edge ranges from a sequence of network snapshots and give a proof-of-principle illustration on some neuroscience data. We also show how the model can be used computationally and analytically to investigate the scenario where an evolutionary process, such as an epidemic, takes place on an evolving network. This allows us to study the cumulative effect of two distinct types of dynamics.
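The spectral calibration step is not reproduced here; the sketch below only generates the kind of object the model describes, a graph whose edges appear with a probability that decays with node range and that is rewired over discrete time steps. The range kernel and the birth/death rates are hypothetical.

```python
import random

def rdrg_snapshot(n, prob):
    """One snapshot of a range-dependent random graph on nodes 0..n-1:
    edge {i, j} appears independently with probability prob(|i - j|)."""
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if random.random() < prob(j - i)}

def evolve(n, steps, prob, birth=0.2, death=0.4):
    """Evolving version: at each step, absent edges are born with
    probability birth * prob(range) and present edges die with
    probability death."""
    edges = rdrg_snapshot(n, prob)
    history = [set(edges)]
    for _ in range(steps):
        for i in range(n):
            for j in range(i + 1, n):
                e = (i, j)
                if e in edges:
                    if random.random() < death:
                        edges.discard(e)
                elif random.random() < birth * prob(j - i):
                    edges.add(e)
        history.append(set(edges))
    return history

# Hypothetical geometric range kernel: short-range links dominate.
snaps = evolve(20, 5, prob=lambda r: 0.8 ** r)
print([len(s) for s in snaps])   # edge counts across the snapshots
```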
                                
Abstract:
The combination of radar and lidar in space offers the unique potential to retrieve vertical profiles of ice water content and particle size globally, and two recently developed algorithms claim to have overcome the principal difficulty with this approach: that of correcting the lidar signal for extinction. In this paper, "blind tests" of these algorithms are carried out, using realistic 94-GHz radar and 355-nm lidar backscatter profiles simulated from aircraft-measured size spectra, and including the effects of molecular scattering, multiple scattering, and instrument noise. Radiation calculations are performed on the true and retrieved microphysical profiles to estimate the accuracy with which radiative flux profiles could be inferred remotely. It is found that the visible extinction profile can be retrieved independently of assumptions about the nature of the size distribution, the habit of the particles, the mean extinction-to-backscatter ratio, or errors in instrument calibration. Local errors in retrieved extinction can occur in proportion to local fluctuations in the extinction-to-backscatter ratio, but down to 400 m above the height of the lowest lidar return, optical depth is typically retrieved to better than 0.2. Retrieval uncertainties are greater at the far end of the profile, and errors in total optical depth can exceed 1, which changes the shortwave radiative effect of the cloud by around 20%. Longwave fluxes are much less sensitive to errors in total optical depth, and may generally be calculated to better than 2 W m⁻² throughout the profile. It is important for retrieval algorithms to account for the effects of lidar multiple scattering: if this is neglected, optical depth is underestimated by approximately 35%, resulting in cloud radiative effects being underestimated by around 30% in the shortwave and 15% in the longwave. Unlike the extinction coefficient, the inferred ice water content and particle size can vary by 30%, depending on the assumed mass-size relationship (a problem common to all remote retrieval algorithms). However, radiative fluxes are almost completely determined by the extinction profile, and if this is correct, then errors in these other parameters have only a small effect in the shortwave (around 6%, compared to that of clear sky) and a negligible effect in the longwave.
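The retrieval algorithms themselves are too involved to reproduce from the abstract. The short sketch below only illustrates how the headline numbers relate: integrating a hypothetical extinction profile to a column optical depth, and showing how a roughly 35% low bias in extinction from neglected multiple scattering propagates directly into optical depth.

```python
import numpy as np

def optical_depth(alpha, z):
    """Column optical depth: trapezoidal integral of extinction alpha
    (m^-1) over height z (m)."""
    alpha, z = np.asarray(alpha, float), np.asarray(z, float)
    return float(np.sum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(z)))

# Hypothetical Gaussian-shaped ice cloud extinction profile.
z = np.linspace(0.0, 2000.0, 41)
alpha = 1.0e-3 * np.exp(-(((z - 1000.0) / 400.0) ** 2))
tau = optical_depth(alpha, z)
# Neglecting lidar multiple scattering biases extinction low by ~35%
# (per the abstract), which carries straight through to optical depth:
print(f"tau = {tau:.2f}; with neglected multiple scattering ~ {0.65 * tau:.2f}")
```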
                                
Abstract:
Midlatitude cyclones are important contributors to boundary layer ventilation. However, it is uncertain how efficient such systems are at transporting pollutants out of the boundary layer, and variations between cyclones are unexplained. In this study, 15 idealized baroclinic life cycles, with a passive tracer included, are simulated to identify the relative importance of two transport processes: horizontal divergence and convergence within the boundary layer, and large-scale advection by the warm conveyor belt. Results show that the amount of ventilation is insensitive to surface drag over a realistic range of values. This indicates that although boundary layer processes are necessary for ventilation, they do not control its magnitude. A diagnostic for the mass flux out of the boundary layer has been developed to identify the synoptic-scale variables controlling the strength of ascent in the warm conveyor belt. A very high level of correlation (R² values exceeding 0.98) is found between the diagnostic and the actual mass flux computed from the simulations. This demonstrates that the large-scale dynamics control the amount of ventilation, and that the efficiency with which midlatitude cyclones ventilate the boundary layer can be estimated using the new mass flux diagnostic. We conclude that meteorological analyses, such as ERA-40, are sufficient to quantify boundary layer ventilation by the large-scale dynamics.
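The form of the mass flux diagnostic is not given in the abstract, so only the verification step is sketched here: computing the squared correlation between diagnosed and simulated fluxes across the life-cycle experiments. The numbers are hypothetical stand-ins, not the paper's data.

```python
import numpy as np

# Hypothetical stand-ins for the life-cycle experiments: the mass flux
# predicted by the diagnostic versus the flux computed from the simulations
# (arbitrary units).
diagnosed = np.array([0.9, 1.2, 1.5, 1.9, 2.4, 2.8, 3.1])
actual    = np.array([1.0, 1.2, 1.6, 1.8, 2.5, 2.7, 3.2])

r = np.corrcoef(diagnosed, actual)[0, 1]   # Pearson correlation
print(f"R^2 = {r ** 2:.3f}")               # cf. the paper's > 0.98
```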