961 results for Cartographic updating


Relevance:

10.00%

Publisher:

Abstract:

A parallel processor architecture based on a communicating sequential processor chip, the transputer, is described. The architecture is easily linearly extensible, enabling separate functions to be included in the controller. To demonstrate the power of the resulting controller, some experimental results are presented comparing PID and full inverse-dynamics control on the first three joints of a Puma 560 robot. Also examined are some of the sample-rate issues raised by the asynchronous updating of inertial parameters, and the need for full inverse dynamics at every sample interval is questioned.
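The two control laws compared in the abstract can be sketched for a single joint as below. This is a minimal illustration, not the paper's implementation: all gains and dynamic parameters (inertia, damping, gravity terms) are hypothetical stand-ins.

```python
def pid_step(error, integral, prev_error, dt, kp=50.0, ki=5.0, kd=2.0):
    """One sample of a discrete PID controller; returns (torque, new_integral)."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    return kp * error + ki * integral + kd * derivative, integral

def inverse_dynamics_step(q, qd, qdd_des, inertia=1.2, damping=0.3, gravity=4.9):
    """Computed-torque (inverse-dynamics) law for a single joint, with an
    assumed linear model; a real Puma 560 has coupled nonlinear dynamics."""
    return inertia * qdd_des + damping * qd + gravity * q

# One PID sample with illustrative values
torque, integ = pid_step(error=0.1, integral=0.0, prev_error=0.08, dt=0.01)
```

The PID loop needs only the tracking error, while the inverse-dynamics law needs a dynamic model evaluated every sample, which is exactly the computational burden the paper's parallel architecture addresses.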

Relevance:

10.00%

Publisher:

Abstract:

This paper reviews the treatment of intellectual property rights in the North American Free Trade Agreement (NAFTA) and considers the welfare-theoretic bases for innovation transfer between member and nonmember states. Specifically, we consider the effects of new technology development from within the union and question whether it is efficient (in a welfare sense) to transfer that new technology to nonmember states. When the new technology contains stochastic components, the important issue of information exchange arises, and we consider this question in a simple oligopoly model with Bayesian updating. In this context, it is natural to ask at what optimal price such information should be transferred. Some simple natural-conjugate examples are used to motivate the key parameters upon which the answer depends.
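The natural-conjugate updating the abstract mentions can be illustrated with the simplest case, a normal prior and normal likelihood, where the posterior is available in closed form. The numbers below are illustrative, not drawn from the paper's model.

```python
def normal_update(mu0, var0, x, obs_var):
    """Conjugate normal-normal Bayesian update for one observation x:
    combine prior N(mu0, var0) with likelihood N(theta, obs_var).
    Returns the posterior mean and variance."""
    precision = 1.0 / var0 + 1.0 / obs_var
    mean = (mu0 / var0 + x / obs_var) / precision
    return mean, 1.0 / precision

# A vague prior pulled halfway toward an equally informative observation
post_mean, post_var = normal_update(mu0=0.0, var0=1.0, x=2.0, obs_var=1.0)
```

The posterior precision is the sum of prior and observation precisions; in an information-transfer setting, the value of the shared signal depends on how much it shrinks this posterior variance.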

Relevance:

10.00%

Publisher:

Abstract:

The possibility of using a time sequence of surface pressure observations in four-dimensional data assimilation is investigated. It is shown that a linear multilevel quasi-geostrophic model can be updated successfully with surface data alone, provided the number of time levels is at least as large as the number of vertical levels. It is further demonstrated that current statistical analysis procedures are very inefficient at assimilating surface observations, and it is shown by numerical experiments that the vertical interpolation must be carried out using the structure of the most dominant baroclinic mode in order to obtain a satisfactory updating. Different possible ways towards a practical solution are discussed.

Relevance:

10.00%

Publisher:

Abstract:

A system for continuous data assimilation described recently (Bengtsson & Gustavsson, 1971) has been further developed and tested under more realistic conditions. A balanced barotropic model is used, and the integration is performed over an octagon covering the area to the north of 20° N. Comparisons have been made between using data from the actual aerological network and data from a satellite in a polar orbit. The result of the analyses has been studied in different subregions situated in data-sparse as well as data-dense areas. The errors of the analysis have also been studied in the wave-spectrum domain. Updating is performed using data generated by the model but also by model-independent data. Considerable differences are obtained between the two experiments, especially with respect to the ultra-long waves; the more realistic approach gives much larger analysis errors. In general, the satellite updating yields somewhat better results than updating from the conventional aerological network, especially in the data-sparse areas over the oceans. Most of the experiments are performed with a satellite making 200 observations per track, a side-scan capability of 40°, and an RMS error of 20 m. It is found that the effect of increasing the number of satellite observations from 100 to 200 per orbit is almost negligible. Similarly, the effect of improving the observations by diminishing the RMS error below a certain value is small. An observing system using two satellites 90° out of phase has also been investigated and is found to give a substantial improvement. Finally, an experiment has been performed using actual SIRS soundings from NIMBUS IV. Given the very small number of soundings at 500 mb, 142 during 48 hours, the result can be regarded as quite satisfactory.

Relevance:

10.00%

Publisher:

Abstract:

A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model's computations as they become available. In the first experiment the updating is performed every 24, 12 and 6 hours with a given network. The stations are distributed at random in 4 groups in order to simulate 4 areas with different station densities. Optimum interpolation is performed on the difference between the forecast and the valid observations. The RMS error of the analyses is reduced in time, and the error is smaller the more frequently the updating is performed. Updating every 6 hours yields an analysis error less than the RMS error of the observations. In a second experiment the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the analysis error is reduced to a value below the RMS error of the observations already after 24 hours, and this yields on the whole a better result than updating from a fixed network. If the satellite data are introduced without any modification, the analysis error is reduced much more slowly, and it takes about 4 days to reach a result comparable to the one where the data have been analysed.
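The optimum-interpolation step described above can be sketched in its scalar form: the analysis weights the forecast-minus-observation difference by the relative error variances. This is a one-point illustration with made-up variances, not the paper's multivariate scheme.

```python
def oi_update(forecast, obs, var_f, var_o):
    """Scalar optimum-interpolation analysis: blend forecast and observation
    with a gain set by their error variances. Returns (analysis, analysis_var)."""
    gain = var_f / (var_f + var_o)          # weight given to the observation
    analysis = forecast + gain * (obs - forecast)
    var_a = (1.0 - gain) * var_f            # analysis error variance shrinks
    return analysis, var_a

# Equal forecast and observation error variances -> analysis halfway between
analysis, var_a = oi_update(forecast=10.0, obs=12.0, var_f=1.0, var_o=1.0)
```

Because the analysis variance is always below the smaller of the two input variances, repeated updating drives the analysis error below the observation RMS error, which is the behaviour the abstract reports for 6-hourly updating.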

Relevance:

10.00%

Publisher:

Abstract:

The quality control, validation and verification of the European Flood Alert System (EFAS) are described. EFAS is designed as a flood early-warning system at pan-European scale, to complement national systems and provide flood warnings more than 2 days before a flood. On average 20–30 alerts per year are sent out to the EFAS partner network, which consists of 24 national hydrological authorities responsible for transnational river basins. Quality control of the system includes the evaluation of hits, misses and false alarms, showing that EFAS produces hits more than 50% of the time. Furthermore, the skill of both the meteorological and the hydrological forecasts is evaluated and is included here for a 10-year period. Next, end-user needs and feedback are systematically analysed. Suggested improvements, such as real-time river discharge updating, are currently being implemented.
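The hit/miss/false-alarm evaluation mentioned above is a standard contingency-table analysis; the sketch below computes the usual summary scores. The counts are hypothetical, not EFAS statistics.

```python
def alert_skill(hits, misses, false_alarms):
    """Standard contingency-table scores for categorical warnings:
    probability of detection, false-alarm ratio, critical success index."""
    pod = hits / (hits + misses)                  # fraction of floods warned
    far = false_alarms / (hits + false_alarms)    # fraction of warnings that failed
    csi = hits / (hits + misses + false_alarms)   # overall threat score
    return pod, far, csi

# Illustrative counts over some verification period
pod, far, csi = alert_skill(hits=6, misses=4, false_alarms=2)
```

A hit rate "more than 50% of the time", as the abstract reports, corresponds to a probability of detection above 0.5 in this framework.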

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces a new adaptive nonlinear equalizer relying on a radial basis function (RBF) model, which is designed based on the minimum bit error rate (MBER) criterion, in the setting of an intersymbol-interference channel plus co-channel interference. Our proposed algorithm is referred to as the on-line mixture of Gaussians estimator aided MBER (OMG-MBER) equalizer. Specifically, a mixture-of-Gaussians-based probability density function (PDF) estimator is used to model the PDF of the decision variable, for which a novel on-line PDF update algorithm is derived to track the incoming data. With the aid of this novel on-line, sample-by-sample updated mixture-of-Gaussians PDF estimator, our adaptive nonlinear equalizer is capable of updating its parameters sample by sample to aim directly at minimizing the RBF nonlinear equalizer's achievable bit error rate (BER). The proposed OMG-MBER equalizer significantly outperforms the existing on-line nonlinear MBER equalizer, known as the least bit error rate equalizer, in terms of both the convergence speed and the achievable BER, as is confirmed in our simulation study.
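The sample-by-sample PDF tracking described above can be illustrated, in a much reduced form, by the recursive update of a single Gaussian component with a forgetting factor. This is a simplified stand-in for the paper's mixture estimator; the forgetting factor `rho` is an illustrative parameter, not one from the paper.

```python
import math

def online_gaussian_update(mean, var, x, rho=0.05):
    """One sample-by-sample update of a Gaussian density estimate:
    exponentially forget old statistics as each new sample x arrives."""
    mean_new = (1.0 - rho) * mean + rho * x
    var_new = (1.0 - rho) * var + rho * (x - mean_new) ** 2
    return mean_new, var_new

def gaussian_pdf(x, mean, var):
    """Evaluate the tracked Gaussian density at x."""
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

# One update step with an illustrative sample
m, v = online_gaussian_update(mean=0.0, var=1.0, x=1.0)
```

In a full mixture estimator, each incoming sample would update several such components weighted by their responsibilities; the per-component arithmetic is the same recursion.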

Relevance:

10.00%

Publisher:

Abstract:

This paper aims to assess the necessity of updating the intensity-duration-frequency (IDF) curves used in Portugal to design building storm-water drainage systems. A comparative analysis of the design was performed for the three predefined rainfall regions in Portugal using the IDF curves currently in use and those estimated for future decades. Data for recent and future climate conditions simulated by a global and regional climate model chain are used to estimate possible changes in rainfall extremes and their implications for the drainage systems. The methodology includes the disaggregation of precipitation down to subhourly scales, the robust development of IDF curves, and the correction of model bias. Obtained results indicate that projected changes are larger for the plains in southern Portugal (5–33%) than for mountainous regions (3–9%) and that these trends are consistent with projected changes in the long-term 95th percentile of daily precipitation throughout the 21st century. The authors conclude there is a need to review the current precipitation-regime classification and to design new drainage systems with larger dimensions to mitigate the projected changes in extreme precipitation.
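IDF curves of the kind discussed above are commonly written in the form i = a / (t + b)^n, with region-specific coefficients. The sketch below uses this generic form with entirely hypothetical coefficients to show how a change in the fitted parameters propagates to design intensity; it does not use the Portuguese coefficients from the paper.

```python
def idf_intensity(duration_min, a, b, n):
    """Rainfall intensity from an IDF curve of the common generic form
    i = a / (duration + b)**n; a, b, n are region- and return-period-specific."""
    return a / (duration_min + b) ** n

# Hypothetical example: a 20% increase in the scale coefficient a,
# e.g. from refitting the curve to future-climate extremes
current = idf_intensity(10.0, a=300.0, b=10.0, n=0.6)
future = idf_intensity(10.0, a=360.0, b=10.0, n=0.6)
increase = (future - current) / current
```

Because the design flow of a drainage system scales with the design intensity, a percentage change of this size feeds through directly to pipe and gutter dimensioning, which is why the paper argues for larger dimensions.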

Relevance:

10.00%

Publisher:

Abstract:

We develop an on-line Gaussian mixture density estimator (OGMDE) in the complex-valued domain to facilitate an adaptive minimum bit-error-rate (MBER) beamforming receiver for multiple-antenna-based space-division multiple-access systems. Specifically, the novel OGMDE is proposed to adaptively model the probability density function of the beamformer's output by tracking the incoming data sample by sample. With the aid of the proposed OGMDE, our adaptive beamformer is capable of updating the beamformer's weights sample by sample to directly minimize the achievable bit error rate (BER). We show that this OGMDE-based MBER beamformer outperforms the existing on-line MBER beamformer, known as the least-BER beamformer, in terms of both the convergence speed and the achievable BER.

Relevance:

10.00%

Publisher:

Abstract:

Many species are extending their leading-edge (cool) range margins polewards in response to recent climate change. In the present study, we investigated range-margin changes at the northern (cool) range margins of 1573 southerly-distributed species from 21 animal groups in Great Britain over the past four decades of climate change, updating previous work. Depending on data availability, range-margin changes were examined over two time intervals during the past four decades. For four groups (birds, butterflies, macromoths, and dragonflies and damselflies), there were sufficient data available to examine range-margin changes over both time intervals. We found that most taxa shifted their northern range margins polewards, and this finding was not greatly influenced by changes in recorder effort. The mean northwards range-margin change in the first time interval was 23 km per decade (N = 13 taxonomic groups) and, in the second interval, was 18 km per decade (N = 16 taxonomic groups), during periods when the British climate warmed by 0.21 and 0.28 °C per decade, respectively. For the four taxa examined over both intervals, there was evidence for a higher rate of range-margin change in the more recent time interval in the two Lepidoptera groups. Our analyses confirm a continued poleward range-margin shift in a wide range of taxonomic groups.

Relevance:

10.00%

Publisher:

Abstract:

The question is addressed whether using unbalanced updates in ocean-data assimilation schemes for seasonal forecasting systems can result in a relatively poor simulation of zonal currents. An assimilation scheme where temperature observations are used for updating only the density field is compared to a scheme where updates of the density field and zonal velocities are related by geostrophic balance. This is done for an equatorial linear shallow-water model. It is found that equatorial zonal velocities can be degraded if velocity is not updated in the assimilation procedure. Adding balanced updates to the zonal velocity is shown to be a simple remedy for the shallow-water model. Next, optimal interpolation (OI) schemes with balanced updates of the zonal velocity are implemented in two ocean general circulation models. First tests indicate a beneficial impact on equatorial upper-ocean zonal currents.
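The balanced update described above ties a velocity increment to the density (or height) increment through geostrophy. The sketch below shows the off-equatorial f-plane form of that relation; it is an illustration with made-up numbers, and note that plain geostrophy degenerates at the equator (f → 0), where the paper's balance must be handled differently.

```python
def balanced_zonal_increment(deta_dy, f=1e-4, g=9.81):
    """Geostrophically balanced zonal-velocity increment implied by a
    meridional gradient of the height increment: u' = -(g/f) * d(eta')/dy.
    f is an assumed mid-latitude Coriolis parameter (1/s), g gravity (m/s^2)."""
    return -(g / f) * deta_dy

# A small illustrative height-increment gradient (m per m)
u_increment = balanced_zonal_increment(deta_dy=1e-6)
```

The point of the paper is that applying the density/height increment alone, without this companion velocity increment, leaves the model state unbalanced and degrades the equatorial zonal currents.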

Relevance:

10.00%

Publisher:

Abstract:

This paper discusses an important issue related to the implementation and interpretation of the analysis scheme in the ensemble Kalman filter. It is shown that the observations must be treated as random variables at the analysis steps. That is, one should add random perturbations with the correct statistics to the observations and generate an ensemble of observations that then is used in updating the ensemble of model states. Traditionally, this has not been done in previous applications of the ensemble Kalman filter and, as will be shown, this has resulted in an updated ensemble with a variance that is too low. This simple modification of the analysis scheme results in a completely consistent approach if the covariance of the ensemble of model states is interpreted as the prediction error covariance, and there are no further requirements on the ensemble Kalman filter method, except for the use of an ensemble of sufficient size. Thus, there is a unique correspondence between the error statistics from the ensemble Kalman filter and the standard Kalman filter approach.
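The perturbed-observation analysis step the abstract prescribes can be sketched as follows: each ensemble member is updated against its own noisy copy of the observation, so the analysed ensemble retains the correct spread. This is a minimal NumPy illustration with a toy 3-variable state and a hypothetical single observation, not a production filter (no localization or inflation).

```python
import numpy as np

def enkf_analysis(ensemble, obs, obs_std, H, rng):
    """Perturbed-observation EnKF analysis step.
    ensemble: (n_state, n_ens) forecast members; obs: (n_obs,) observation;
    H: (n_obs, n_state) linear observation operator."""
    n_ens = ensemble.shape[1]
    X = ensemble - ensemble.mean(axis=1, keepdims=True)     # state anomalies
    HX = H @ ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)               # obs-space anomalies
    Pf_Ht = X @ HXp.T / (n_ens - 1)                         # Pf H^T from the ensemble
    S = HXp @ HXp.T / (n_ens - 1) + obs_std**2 * np.eye(H.shape[0])
    K = Pf_Ht @ np.linalg.inv(S)                            # Kalman gain
    # The key step from the abstract: perturb the observation for each member
    obs_ens = obs[:, None] + obs_std * rng.standard_normal((H.shape[0], n_ens))
    return ensemble + K @ (obs_ens - HX)

rng = np.random.default_rng(0)
ens = rng.standard_normal((3, 50)) + 5.0        # forecast ensemble around 5
H = np.array([[1.0, 0.0, 0.0]])                 # observe the first variable only
analysis = enkf_analysis(ens, np.array([4.0]), 0.1, H, rng)
```

Skipping the `obs_ens` perturbation and updating every member with the same observation vector would reproduce exactly the too-low analysis variance the paper identifies.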

Relevance:

10.00%

Publisher:

Abstract:

Tensor clustering is an important tool that exploits the intrinsically rich structure of real-world multiway, or tensor, datasets. In dealing with such datasets, standard practice is to use subspace clustering based on vectorizing the multiway data. However, vectorization of tensorial data does not exploit the complete structure information. In this paper, we propose a subspace clustering algorithm without adopting any vectorization process. Our approach is based on a novel heterogeneous Tucker decomposition model that takes cluster membership information into account. We propose a new clustering algorithm that alternates between the different modes of the proposed heterogeneous tensor model. All but the last mode have closed-form updates; updating the last mode reduces to optimizing over the multinomial manifold, for which we investigate second-order Riemannian geometry and propose a trust-region algorithm. Numerical experiments show that our proposed algorithm competes effectively with state-of-the-art clustering algorithms that are based on tensor factorization.
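The closed-form mode updates mentioned above are, in standard alternating Tucker schemes, obtained from the singular vectors of the tensor's mode unfolding. The sketch below shows that step for a toy 3-way tensor; it illustrates the generic alternating update, not the paper's heterogeneous model or its manifold trust-region step.

```python
import numpy as np

def mode_unfold(T, mode):
    """Unfold a 3-way tensor along the given mode into a matrix
    of shape (T.shape[mode], -1)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def update_factor(T, mode, rank):
    """Closed-form update of one Tucker factor matrix: the leading left
    singular vectors of the mode unfolding."""
    U, _, _ = np.linalg.svd(mode_unfold(T, mode), full_matrices=False)
    return U[:, :rank]

# Toy 2x3x4 tensor; extract a rank-1 factor for mode 0
T = np.arange(24.0).reshape(2, 3, 4)
A = update_factor(T, mode=0, rank=1)
```

Alternating such updates over all modes is what the abstract's algorithm does for all but the last mode; only the cluster-membership mode requires the Riemannian trust-region machinery.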

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this thesis is to show how to use vulnerability testing to identify and search for security flaws in computer networks. The goal is partly to give a general description of different methods of vulnerability testing and partly to present the method and results of a vulnerability test. A document containing the results of the vulnerability test, together with solutions to the high-risk vulnerabilities found, will be handed over. The goal is also to carry out and present this work in the form of a scholarly work. The problem was to show how to perform vulnerability tests and identify vulnerabilities in the organization's network and systems. Programs would be run under controlled circumstances so that they did not burden the network. Vulnerability tests were conducted sequentially, since data from the survey were needed to continue the scanning. A survey of the network was made, and data, including operating systems, were collected in tables. A number of systems were selected from the tables and scanned with Nessus. The result was a table of the network and a table of the vulnerabilities found. The table of vulnerabilities has helped the organization to prevent these vulnerabilities by updating the affected computers. A wireless network using WEP encryption, which is insecure, was also detected and decrypted.