884 results for Artificial satellites in telecommunications


Relevance:

100.00%

Publisher:

Abstract:

Human rights are, by any measure, one of the constitutive axes and problems of humanity. Much progress and work has gone into their application, implementation, and defence. Without doubt, International Human Rights Law (IHRL), together with the so-called constitutional block, provides the best practical support, at once legal, political, and social, for the defence and promotion of human rights. Nevertheless, in Colombia and worldwide, there are few works on the philosophical foundations of human rights. Universidad del Rosario presents the third edition of this book, which has become a reference for human rights defenders, jurists, academics, and researchers in the field. The thesis of the book is simple: the foundation of human rights is life, human life in general, and, with it and from it, also life on the planet.

Relevance:

100.00%

Publisher:

Abstract:

Evolutionary computation, and genetic algorithms in particular, are increasingly used by organizations to solve management and decision-making problems (Apoteker & Barthelemy, 2000). The literature on the subject is growing and several state-of-the-art reviews have been published. Even so, no study has systematically assessed the use of genetic algorithms in specific international business problems (examples include international logistics, international trade, international marketing, international finance, and international strategy). The purpose of this degree project is therefore to produce a situational review of the applications of genetic algorithms in international business.
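
As a reminder of the mechanism the abstract refers to, the following is a minimal, self-contained sketch of a genetic algorithm applied to a toy allocation problem of the international-business kind. The objective function, market count, and operator choices are illustrative assumptions, not taken from the thesis.

import random

# Minimal genetic algorithm sketch (illustrative only): allocate effort
# across five hypothetical export markets to maximise a toy return function.
MARKETS = 5
POP_SIZE = 30
GENERATIONS = 40

def fitness(weights):
    # Toy diminishing-returns objective; stands in for any business KPI.
    total = sum(weights)
    shares = [w / total for w in weights]
    return sum((i + 1) * s ** 0.5 for i, s in enumerate(shares))

def random_individual():
    return [random.random() for _ in range(MARKETS)]

def crossover(a, b):
    point = random.randint(1, MARKETS - 1)
    return a[:point] + b[point:]

def mutate(ind, rate=0.1):
    return [w if random.random() > rate else random.random() for w in ind]

population = [random_individual() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]          # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best allocation:", [round(w, 2) for w in best])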

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a case study that explores the advantages that can be derived from the use of a design support system during the design of wastewater treatment plants (WWTP). With this objective in mind, a simplified but plausible WWTP design case study has been generated with KBDS, a computer-based support system that maintains a historical record of the design process. The study shows how, by employing such a historical record, it is possible to: (1) rank different design proposals responding to a design problem; (2) study the influence of changing the weight of the arguments used in the selection of the most suitable proposal; (3) take advantage of keywords to assist the designer in the search for specific items within the historical records; (4) automatically evaluate the compliance of alternative design proposals with the design objectives; (5) verify the validity of previous decisions after the modification of the current constraints or specifications; (6) reuse the design records when upgrading an existing WWTP or when designing similar facilities; (7) generate documentation of the decision-making process; and (8) attach a variety of documents as annotations to any component in the design history. The paper also shows one possible future role of design support systems as they outgrow their current reactive role as repositories of historical information and start to proactively support the generation of new knowledge during the design process.
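
To illustrate the kind of weighted-argument ranking described in points (1) and (2), here is a small hypothetical sketch; the proposal names, argument weights, and data layout are invented for illustration and do not reflect the actual KBDS data model.

# Illustrative sketch of ranking design proposals from a design-rationale
# record: each argument supports (+1) or objects to (-1) a proposal and
# carries a weight. Hypothetical data, not the KBDS representation.
proposals = {
    "extended aeration": [("low sludge production", +1, 0.8),
                          ("large reactor volume", -1, 0.5)],
    "conventional activated sludge": [("proven performance", +1, 0.7),
                                      ("higher sludge handling cost", -1, 0.6)],
}

def score(arguments, weight_override=None):
    # Sum of signed argument weights; overriding a weight shows how the
    # ranking shifts when the importance of an argument is re-assessed.
    total = 0.0
    for text, sign, weight in arguments:
        if weight_override and text in weight_override:
            weight = weight_override[text]
        total += sign * weight
    return total

ranking = sorted(proposals, key=lambda p: score(proposals[p]), reverse=True)
print("ranking:", ranking)
print("re-weighted:", sorted(
    proposals,
    key=lambda p: score(proposals[p], {"large reactor volume": 0.9}),
    reverse=True))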

Relevance:

100.00%

Publisher:

Abstract:

The activated sludge process, the main biological technology applied in wastewater treatment plants (WWTP), depends directly on living organisms (microorganisms), and therefore on the unforeseen changes they produce. Good plant operation is possible if the supervisory control system is able to react to changes and deviations in the system and can take the actions necessary to restore the system's performance. These decisions are often based both on physical, chemical, and microbiological principles (suitable for modelling with conventional control algorithms) and on heuristic knowledge (suitable for modelling with knowledge-based systems). One of the key problems in knowledge-based control system design is the development of an architecture able to manage the different elements of the process efficiently (integrated architecture), to learn from previous cases (specific experimental knowledge), and to acquire the domain knowledge (general expert knowledge). These problems grow when the process belongs to an ill-structured domain and is composed of several complex operational units. An integrated and distributed AI architecture therefore seems a good choice. This paper proposes an integrated and distributed supervisory multi-level architecture for the supervision of WWTP that overcomes some of the main shortcomings both of classical control techniques and of knowledge-based systems applied to real-world systems.
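
A minimal sketch of what one supervisory step combining general expert rules with case-specific experience could look like is given below; the rules, thresholds, and case library are hypothetical and are not taken from the architecture proposed in the paper.

# Hypothetical sketch of a two-level supervisory step: expert rules are
# tried first, and if none fires, the nearest previously solved case is
# reused. Variable names and thresholds are illustrative only.
CASE_LIBRARY = [
    ({"do": 0.5, "mlss": 3500}, "increase aeration"),
    ({"do": 2.5, "mlss": 1500}, "increase sludge recycle"),
]

def expert_rules(state):
    if state["do"] < 1.0:
        return "increase aeration"
    if state["svi"] > 150:
        return "suspect bulking: adjust F/M ratio"
    return None

def nearest_case(state):
    def distance(case_state):
        return sum((state.get(k, 0) - v) ** 2 for k, v in case_state.items())
    return min(CASE_LIBRARY, key=lambda c: distance(c[0]))[1]

def supervise(state):
    action = expert_rules(state)                       # general expert knowledge
    return action if action else nearest_case(state)   # specific experience

print(supervise({"do": 2.2, "svi": 120, "mlss": 1600}))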

Relevance:

100.00%

Publisher:

Abstract:

In this paper we describe how we generated written explanations for ‘indirect users’ of a knowledge-based system in the domain of drug prescription. We call ‘indirect users’ the intended recipients of explanations, to distinguish them from the prescriber (the ‘direct’ user) who interacts with the system. The Explanation Generator was designed on the basis of several studies of indirect users' information needs and physicians' explanatory attitudes in this domain. It integrates text-planning techniques with ATN-based surface generation. A double modelling component makes it possible to adapt the information content, order, and style to the indirect user to whom the explanation is addressed. Several examples of computer-generated texts are provided and contrasted with the physicians' explanations, to discuss the advantages and limits of the approach adopted.
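
A hypothetical sketch of the content-adaptation step such a double modelling component performs is shown below; the message units, user-model fields, and selection policy are invented for illustration and are not the system's actual text planner.

# Hypothetical sketch of user-adapted content selection: each message unit
# is tagged with the expertise it presupposes and an importance score, and
# the planner keeps and orders units according to the recipient's model.
MESSAGES = [
    {"text": "Take one tablet every 12 hours.", "level": "lay", "importance": 3},
    {"text": "The drug inhibits angiotensin-converting enzyme.", "level": "expert", "importance": 1},
    {"text": "Avoid potassium supplements while on this drug.", "level": "lay", "importance": 2},
]

def plan_explanation(user_model):
    usable = [m for m in MESSAGES
              if user_model["expertise"] == "expert" or m["level"] == "lay"]
    ordered = sorted(usable, key=lambda m: m["importance"], reverse=True)
    return " ".join(m["text"] for m in ordered)

print(plan_explanation({"expertise": "lay"}))
print(plan_explanation({"expertise": "expert"}))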

Relevance:

100.00%

Publisher:

Abstract:

The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation, and model validation. For a given model architecture, such as associative memory networks, a common difficulty in nonlinear modelling is the "curse of dimensionality". A series of complementary data-based constructive identification schemes, mainly based on, but not limited to, operating-point-dependent fuzzy models, is introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture-of-experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious Delaunay input-space partition based piecewise locally linear modelling concept; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on Bézier-Bernstein polynomial basis functions and additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.
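
A simplified sketch of the idea behind forward (orthogonal) least squares term selection, one of the constructive schemes listed above, follows; it uses plain greedy forward selection by residual reduction rather than the exact OLS/ERR procedure, and the data, basis functions, and term count are illustrative assumptions, not those used in the paper.

import numpy as np

# Simplified greedy forward selection of basis functions by residual
# reduction, illustrating the idea behind forward (orthogonal) least squares.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(200)

# Candidate basis functions (here: Gaussians on a coarse grid of centres).
centres = np.linspace(-1, 1, 15)
candidates = [np.exp(-(x - c) ** 2 / 0.1) for c in centres]

selected, columns = [], []
for _ in range(5):                      # keep 5 terms to fight dimensionality
    best_idx, best_rss = None, np.inf
    for i, phi in enumerate(candidates):
        if i in selected:
            continue
        design = np.column_stack(columns + [phi])
        theta, *_ = np.linalg.lstsq(design, y, rcond=None)
        rss = np.sum((y - design @ theta) ** 2)
        if rss < best_rss:
            best_idx, best_rss = i, rss
    selected.append(best_idx)
    columns.append(candidates[best_idx])

print("selected centres:", np.round(centres[selected], 2))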

Relevance:

100.00%

Publisher:

Abstract:

Top Down Induction of Decision Trees (TDIDT) is the most commonly used method of constructing a model from a dataset in the form of classification rules to classify previously unseen data. Alternative algorithms have been developed, such as the Prism algorithm. Prism constructs modular rules that are qualitatively better than those induced by TDIDT. However, with the increasing size of databases, many existing rule-learning algorithms have proved to be computationally expensive on large datasets. To tackle the problem of scalability, parallel classification rule induction algorithms have been introduced. Because TDIDT is the most popular classifier, most parallel approaches to inducing classification rules are based on it, even though strongly competitive alternative algorithms exist. In this paper we describe work on a distributed classifier that induces classification rules in a parallel manner based on Prism.
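
A minimal sketch of the Prism-style 'separate and conquer' idea on a toy categorical dataset is shown below; it follows the spirit of the algorithm (greedily adding the attribute-value test with the highest target-class probability, then removing covered instances) but is a simplification for illustration, not the distributed classifier described in the paper.

# Minimal Prism-style rule induction for categorical data. Assumes a small,
# consistent, fully categorical dataset; one rule is grown at a time for the
# target class, then the covered instances are removed ('separate and conquer').
DATA = [
    {"outlook": "sunny", "windy": "no", "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rain", "windy": "no", "play": "yes"},
    {"outlook": "rain", "windy": "yes", "play": "no"},
]

def induce_rules(data, target_attr, target_class):
    rules, remaining = [], list(data)
    while any(r[target_attr] == target_class for r in remaining):
        rule, covered = {}, list(remaining)
        while any(r[target_attr] != target_class for r in covered):
            best, best_prob = None, -1.0
            for attr in covered[0]:
                if attr == target_attr or attr in rule:
                    continue
                for value in {r[attr] for r in covered}:
                    subset = [r for r in covered if r[attr] == value]
                    prob = sum(r[target_attr] == target_class for r in subset) / len(subset)
                    if prob > best_prob:
                        best, best_prob = (attr, value), prob
            rule[best[0]] = best[1]
            covered = [r for r in covered if r[best[0]] == best[1]]
        rules.append(dict(rule))
        remaining = [r for r in remaining
                     if not all(r[a] == v for a, v in rule.items())]
    return rules

print(induce_rules(DATA, "play", "yes"))   # -> [{'windy': 'no'}]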

Relevance:

100.00%

Publisher:

Abstract:

Inducing rules from very large datasets is one of the most challenging areas in data mining. Several approaches exist for scaling classification rule induction up to large datasets, namely data reduction and the parallelisation of classification rule induction algorithms. In the area of parallelisation, most of the work has concentrated on Top Down Induction of Decision Trees (TDIDT), also known as the ‘divide and conquer’ approach. However, powerful alternative algorithms exist that induce modular rules. Most of these alternatives follow the ‘separate and conquer’ approach to inducing rules, but very little work has been done to make the ‘separate and conquer’ approach scale better on large training data. This paper examines the potential of the recently developed blackboard-based J-PMCRI methodology for parallelising modular classification rule induction algorithms that follow the ‘separate and conquer’ approach. A concrete implementation of the methodology is evaluated empirically on very large datasets.
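
The sketch below illustrates, under stated assumptions, the distributed flavour of such an approach: workers compute partial term statistics on horizontal partitions of the data and a coordinating step merges them to pick the globally best rule term. It is only a generic illustration of this partition-and-merge step, not an implementation of the J-PMCRI blackboard methodology.

from concurrent.futures import ProcessPoolExecutor
from collections import Counter

# Each worker computes, for its partition, how often each attribute-value
# term occurs and how often it co-occurs with the target class; a moderator
# merges the partial counts and selects the globally best term.
PARTITIONS = [
    [{"outlook": "sunny", "windy": "no", "play": "yes"},
     {"outlook": "sunny", "windy": "yes", "play": "no"}],
    [{"outlook": "rain", "windy": "no", "play": "yes"},
     {"outlook": "rain", "windy": "yes", "play": "no"}],
]

def partial_counts(partition):
    covered, positive = Counter(), Counter()
    for row in partition:
        for attr, value in row.items():
            if attr == "play":          # target attribute, not a rule term
                continue
            covered[(attr, value)] += 1
            positive[(attr, value)] += row["play"] == "yes"
    return covered, positive

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(partial_counts, PARTITIONS))
    covered, positive = Counter(), Counter()
    for c, p in results:                # merge step ('blackboard' role)
        covered.update(c)
        positive.update(p)
    best = max(covered, key=lambda t: positive[t] / covered[t])
    print("globally best term:", best)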

Relevance:

100.00%

Publisher:

Abstract:

We present a highly accurate tool for the simulation of shear Alfvén waves (SAW) in collisionless plasma. SAW are important in space plasma environments because, for small perpendicular scale lengths, they can support an electric field parallel to the ambient magnetic field. Electrons can be accelerated by the parallel electric field, and these waves have been implicated as the source of vibrant auroral displays. However, the parallel electric field carried by SAW is small in comparison with the perpendicular electric field of the wave, making it difficult to measure directly in the laboratory or by satellites in the near-Earth plasma environment. In this paper, we present a simulation code that provides a means to study in detail the SAW-particle interaction in both space and laboratory plasma. Using idealised, small-amplitude propagating waves with a single perpendicular wavenumber, the simulation code accurately reproduces the damping rates and parallel electric field amplitudes predicted by linear theory for varying temperatures and perpendicular scale lengths. We present a rigorous kinetic derivation of the parallel electric field strength for small-amplitude SAW and show that the commonly used inertial and kinetic approximations are valid except where the ratio of thermal to Alfvén speed is between 0.7 and 1.0. We also present nonlinear simulations of large-amplitude waves and show that, in cases of strong damping, the damping rates and parallel electric field strength deviate from linear predictions when wave energies are greater than only a few percent of the plasma kinetic energy, a situation often observed in the magnetosphere. The drift-kinetic code provides reliable, testable predictions of the parallel electric field strength which can be investigated directly in the laboratory, and will help to bridge the gap between studies of SAW in man-made and naturally occurring plasma.
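
For reference, the commonly used approximations mentioned above are the standard two-fluid limits of the shear Alfvén wave dispersion relation; the notation below (electron inertial length \(\lambda_e = c/\omega_{pe}\), ion acoustic gyroradius \(\rho_s\), Alfvén speed \(v_A\), electron thermal speed \(v_{te}\)) is standard and is not taken from the paper itself.

\[
  \omega^{2} \simeq \frac{k_{\parallel}^{2} v_{A}^{2}}{1 + k_{\perp}^{2}\lambda_{e}^{2}}
  \quad (\text{inertial limit},\; v_{te} \ll v_{A}),
\qquad
  \omega^{2} \simeq k_{\parallel}^{2} v_{A}^{2}\bigl(1 + k_{\perp}^{2}\rho_{s}^{2}\bigr)
  \quad (\text{kinetic limit},\; v_{te} \gg v_{A}),
\]

with the often-quoted inertial estimate of the parallel field

\[
  \frac{E_{\parallel}}{E_{\perp}} \simeq \frac{k_{\parallel} k_{\perp} \lambda_{e}^{2}}{1 + k_{\perp}^{2}\lambda_{e}^{2}}.
\]

The regime where the abstract reports these limits to fail, \(0.7 \lesssim v_{te}/v_A \lesssim 1.0\), is the transition region between the two.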

Relevance:

100.00%

Publisher:

Abstract:

We present predictions of the signatures of magnetosheath particle precipitation (in the regions classified as open low-latitude boundary layer, cusp, mantle, and polar cap) for periods when the interplanetary magnetic field has a southward component. These are made using the “pulsating cusp” model of the effects of time-varying magnetic reconnection at the dayside magnetopause. Predictions are made both for low-altitude satellites in the topside ionosphere and for midaltitude spacecraft in the magnetosphere. Low-altitude cusp observations showing a continuous ion dispersion signature reveal “quasi-steady reconnection” (one limit of the pulsating cusp model) persisting for a period of at least 10 min. We estimate that “quasi-steady” in this context corresponds to fluctuations in the reconnection rate of a factor of 2 or less. The other limit of the pulsating cusp model explains the instantaneous jumps in the precipitating ion spectrum that have been observed at low altitudes. Such jumps are produced by isolated pulses of reconnection: that is, they are separated by intervals when the reconnection rate is zero. These also generate convecting patches on the magnetopause in which the field lines thread the boundary via a rotational discontinuity, separated by more extensive regions of tangential discontinuity. Predictions of the corresponding ion precipitation signatures seen by midaltitude spacecraft are presented. We resolve the apparent contradiction between estimates of the width of the injection region from midaltitude data and the concept of continuous entry of solar wind plasma along open field lines. In addition, we reevaluate the use of pitch angle-energy dispersion to estimate the injection distance.
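
The injection-distance estimates discussed above rest on the standard velocity-filter (time-of-flight) relation for cusp ions; the form below is the textbook relation, not a result derived in this abstract.

\[
  v_{\parallel,\min} = \frac{d}{t - t_{0}}, \qquad
  E_{\min} = \frac{1}{2}\, m_{i}\left(\frac{d}{t - t_{0}}\right)^{2},
\]

where \(d\) is the field-aligned distance from the reconnection site to the spacecraft and \(t - t_0\) is the time elapsed since the field line was opened: only ions with field-aligned speed above \(v_{\parallel,\min}\) have had time to arrive, which produces the observed energy dispersion of the precipitation.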

Relevance:

100.00%

Publisher:

Abstract:

To analyze patterns in marine productivity, harmful algal blooms, thermal stress in coral reefs, and oceanographic processes, optical and biophysical marine parameters, such as sea surface temperature, and ocean color products, such as chlorophyll-a concentration, diffuse attenuation coefficient, total suspended matter concentration, chlorophyll fluorescence line height, and remote sensing reflectance, are required. In this paper we present a novel automatic Satellite-based Ocean Monitoring System (SATMO) developed to provide, in near real-time, continuous spatial data sets of the above-mentioned variables for marine-coastal ecosystems in the Gulf of Mexico, northeastern Pacific Ocean, and western Caribbean Sea, with 1 km spatial resolution. The products are obtained from Moderate Resolution Imaging Spectroradiometer (MODIS) images received at the Direct Readout Ground Station (located at CONABIO) after each overpass of the Aqua and Terra satellites. In addition, at the end of each week and month the system provides composite images for several ocean products, as well as weekly and monthly anomaly composites for chlorophyll-a concentration and sea surface temperature. These anomaly data are reported for the first time for the study region and represent valuable information for analyzing time series of ocean color data for the study of coastal and marine ecosystems in Mexico, Central America, and the western Caribbean.
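
As an illustration of how a weekly anomaly composite of the kind described here can be formed, the following is a minimal numpy sketch (weekly mean of valid pixels minus a multi-year climatology for the same week); array shapes, the random test data, and variable names are assumptions for illustration, not the SATMO implementation.

import numpy as np

# Minimal sketch of a weekly anomaly composite: the mean of the current
# week's valid pixels minus a multi-year climatology for the same week.
rng = np.random.default_rng(1)

# Daily chlorophyll-a fields for one week (7 days, 100 x 100 pixels),
# with NaNs standing in for cloud-masked pixels.
week = rng.gamma(2.0, 0.3, size=(7, 100, 100))
week[rng.random(week.shape) < 0.2] = np.nan

# Multi-year mean for the same calendar week (the climatology).
climatology = np.full((100, 100), 0.6)

weekly_composite = np.nanmean(week, axis=0)       # ignore masked pixels
anomaly = weekly_composite - climatology          # positive = above normal
print("mean anomaly:", np.nanmean(anomaly))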

Relevance:

100.00%

Publisher:

Abstract:

We report the use of optical coherence tomography (OCT) to detect and quantify the demineralization process induced by S. mutans biofilm in human third molars. Artificial lesions were induced by an S. mutans microbiological culture and the samples (N = 50) were divided into groups according to demineralization time: 3, 5, 7, 9, and 11 days. The OCT system was implemented using a light source delivering an average power of 96 µW in the sample arm, with spectral characteristics allowing 23 µm of axial resolution. The images were produced with a lateral scan step of 10 µm and analyzed individually. From the evaluation of these images, lesion depth was calculated as a function of demineralization time. The depth of the lesion in the root dentine increased from 70 µm to 230 µm (corrected by the enamel refractive index, 1.62 at 856 nm), depending on exposure time. The lesion depth in root dentine was correlated with demineralization time, showing that it follows a geometric progression resembling a bacterial growth law. [Figure: progression of lesion depth in root dentine as a function of exposure time, following a geometric progression resembling a bacterial growth law.]
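
The depth correction mentioned above is the usual conversion from optical path length to physical depth:

\[
  d_{\text{physical}} = \frac{d_{\text{optical}}}{n},
\]

so, assuming the stated index \(n = 1.62\) at 856 nm, a reported physical lesion depth of about 230 µm corresponds to an optical depth of roughly \(1.62 \times 230 \approx 373\) µm in the raw B-scan.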