953 results for Artificial satellites in navigation.


Relevance: 100.00%

Abstract:

An expert system (ES) is a class of computer programs developed by researchers in artificial intelligence. In essence, an ES is a set of rules that analyzes information about a specific class of problems, provides an analysis of those problems, and, depending on its design, recommends a course of user action to implement corrections. ES are computerized tools designed to enhance the quality and availability of the knowledge required by decision makers in a wide range of industries. Decision-making is particularly important for financial institutions because of the high level of risk associated with wrong decisions. The decision-making process is complex and unstructured, and existing decision-making models do not capture learned knowledge well. In this study, we analyze the benefits of using ES in the decision-making process.
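The rule-based core described above can be sketched in a few lines of Python. This is an invented illustration: the rule engine, the credit-decision conditions, and the thresholds are all hypothetical, not taken from the study.

```python
# Minimal sketch of a forward-chaining rule engine, the core of a rule-based
# expert system. Rules pair a condition on the known facts with a recommendation.

def evaluate(facts, rules):
    """Fire every rule whose condition holds and collect its recommendation."""
    return [rec for cond, rec in rules if cond(facts)]

# Hypothetical credit-decision rules for a financial institution.
RULES = [
    (lambda f: f["debt_ratio"] > 0.5, "flag: high debt ratio - manual review"),
    (lambda f: f["late_payments"] >= 3, "flag: repeated late payments - reject"),
    (lambda f: f["debt_ratio"] <= 0.5 and f["late_payments"] == 0, "approve"),
]

applicant = {"debt_ratio": 0.62, "late_payments": 1}
print(evaluate(applicant, RULES))  # only the debt-ratio rule fires here
```

A production ES would add chaining (conclusions feeding further rules) and an explanation facility, but the analyze-then-recommend loop is the same.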

Relevance: 100.00%

Abstract:

Every space launch increases the overall amount of space debris, yet satellites have only limited awareness of nearby objects that might pose a collision hazard. Astrometric, radiometric, and thermal models for the study of space debris in low-Earth orbit have been developed. This modeling approach proposes analysis methods that provide increased Local Area Awareness for satellites in low-Earth and geostationary orbit, where Local Area Awareness is defined as the ability to detect, characterize, and extract useful information regarding resident space objects as they move through the space environment surrounding a spacecraft. The study of space debris is of critical importance to all space-faring nations. Characterization efforts are proposed using long-wave infrared sensors for space-based observations of debris objects in low-Earth orbit. Long-wave infrared sensors are commercially available and do not require the target to be solar-illuminated, since the received signal depends on the object's temperature. Characterizing debris objects by means of passive imaging techniques allows further studies into the origin, specifications, and future trajectory of debris objects. Conclusions are drawn regarding the thermal analysis as a function of debris orbit, geometry, orientation with respect to time, and material properties. The thermal model permits the characterization of debris objects based upon their received long-wave infrared signals; information regarding the material type, size, and tumble rate of the observed objects is extracted. This investigation proposes the use of long-wave infrared radiometric models of typical debris to develop techniques for the detection and characterization of debris objects via signal analysis of unresolved imagery. Knowledge of the orbital type and semi-major axis of the observed debris object is extracted via astrometric analysis. This knowledge may help constrain the admissible region for the initial orbit determination process. The resulting orbital information is then fused with the radiometric characterization analysis, enabling further characterization of the observed debris object. This fused analysis, yielding orbital, material, and thermal properties, significantly increases a satellite's Local Area Awareness through an intimate understanding of the debris environment surrounding the spacecraft.
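As a hedged illustration of why long-wave infrared sensing needs no solar illumination, the sketch below numerically integrates the Planck spectral radiance over an assumed 8-14 um band for two assumed debris temperatures. The temperatures and band edges are illustrative choices, not values from the study.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m/s]
KB = 1.380649e-23    # Boltzmann constant [J/K]

def lwir_radiance(T, lam_min=8e-6, lam_max=14e-6, n=2000):
    """Blackbody spectral radiance integrated over the LWIR band [W m^-2 sr^-1]."""
    lam = np.linspace(lam_min, lam_max, n)
    B = (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))
    dlam = lam[1] - lam[0]
    return float(np.sum(0.5 * (B[:-1] + B[1:])) * dlam)  # trapezoidal rule

cold, warm = lwir_radiance(250.0), lwir_radiance(300.0)
# The received signal is temperature dependent: a warmer debris object is
# markedly brighter in the LWIR band even with no sunlight on it.
print(cold, warm)
```

This temperature dependence is what a radiometric model inverts when extracting material type and size from unresolved LWIR imagery.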

Relevance: 100.00%

Abstract:

In the framework of the global energy balance, the radiative energy exchanges between Sun, Earth and space are now accurately quantified from new satellite missions. Much less is known about the magnitude of the energy flows within the climate system and at the Earth's surface, which cannot be directly measured by satellites. In addition to satellite observations, here we make extensive use of the growing number of surface observations to constrain the global energy balance not only from space but also from the surface. We combine these observations with the latest modeling efforts performed for the 5th IPCC assessment report to infer best estimates for the global mean surface radiative components. Our analyses favor global mean downward surface solar and thermal radiation values near 185 and 342 W m^-2, respectively, which are most compatible with surface observations. Combined with an estimated surface absorbed solar radiation of 161 W m^-2 and thermal emission of 397 W m^-2, this leaves 106 W m^-2 of surface net radiation available for distribution amongst the non-radiative surface energy balance components. The climate models overestimate the downward solar and underestimate the downward thermal radiation, yet simulate an adequate global mean surface net radiation through error compensation. This also suggests that, globally, the simulated surface sensible and latent heat fluxes, around 20 and 85 W m^-2 on average, represent realistic values. The findings of this study are compiled into a new global energy balance diagram, which may be able to reconcile currently disputed inconsistencies between energy and water cycle estimates.
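The quoted surface budget can be checked arithmetically; the sketch below simply recombines the numbers given in the abstract (all in W m^-2).

```python
# Surface radiative components quoted in the abstract [W m^-2].
absorbed_solar   = 161.0   # surface absorbed solar radiation
down_thermal     = 342.0   # downward thermal radiation at the surface
thermal_emission = 397.0   # surface thermal emission

# Net radiation left for the non-radiative surface fluxes.
surface_net = absorbed_solar + down_thermal - thermal_emission
print(surface_net)  # 106.0

# Non-radiative components quoted: sensible ~20 and latent ~85 W m^-2,
# summing to ~105, consistent with the 106 W m^-2 surface net radiation.
sensible, latent = 20.0, 85.0
print(sensible + latent)  # 105.0
```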

Relevance: 100.00%

Abstract:

Information processing in the human brain has always been a source of inspiration in Artificial Intelligence; in particular, it has led researchers to develop tools such as artificial neural networks. Recent findings in neurophysiology provide evidence that not only neurons but also isolated astrocytes and networks of astrocytes are responsible for processing information in the human brain. Artificial neural networks (ANNs) model neuron-neuron communications. Artificial neuron-glia networks (ANGNs) model, in addition to neuron-neuron communications, neuron-astrocyte connections. Continuing the research on ANGNs, we first propose and evaluate a model of adaptive neuro-fuzzy inference systems augmented with artificial astrocytes. We then propose a model of ANGNs that captures the communications of astrocytes in the brain; in this model, a network of artificial astrocytes is implemented on top of a typical neural network. The results of implementing both networks show that, for certain combinations of the parameter values specifying astrocytes and their connections, the new networks outperform typical neural networks. This research opens a range of possibilities for future work on designing more powerful artificial neural network architectures based on more realistic models of the human brain.
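A minimal sketch of the neuron-astrocyte idea, under invented assumptions (a threshold neuron and an astrocyte that potentiates incoming weights after sustained firing); this is a generic illustration, not the authors' ANGN model.

```python
import numpy as np

def neuron_glia_step(x, w, history, boost=1.5, window=3):
    """One forward step of a single neuron with astrocyte modulation:
    if the neuron has fired on `window` consecutive steps, the artificial
    astrocyte multiplies its synaptic weights by `boost`."""
    fired = float(np.dot(w, x) > 0.5)          # simple threshold neuron
    history.append(fired)
    if sum(history[-window:]) == window:       # astrocyte detects sustained firing
        w = w * boost                          # -> potentiates the synapses
    return fired, w, history

w = np.array([0.3, 0.4])
history = []
for _ in range(4):                             # constant input keeps the neuron firing
    fired, w, history = neuron_glia_step(np.array([1.0, 1.0]), w, history)
print(fired, w)                                # weights grow after sustained activity
```

The key design point is that the astrocyte operates on a slower timescale than the neuron (here, a window over several steps), which is what distinguishes ANGNs from weight updates in ordinary ANN training.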

Relevance: 100.00%

Abstract:

Research on the mechanisms and processes underlying navigation has traditionally been limited by the practical problems of setting up and controlling navigation in a real-world setting. Thanks to advances in technology, a growing number of researchers are making use of computer-based virtual environments to draw inferences about real-world navigation. However, little research has been done on factors affecting human-computer interactions in navigation tasks. In this study, female students completed a virtual route-learning task and filled out a battery of questionnaires, which determined levels of computer experience, wayfinding anxiety, neuroticism, extraversion, psychoticism and immersive tendencies, as well as their preference for a route or survey strategy. Scores on personality traits and individual differences were then correlated with the time taken to complete the navigation task, the length of path travelled, the velocity of the virtual walk and the number of errors. Navigation performance was significantly influenced by wayfinding anxiety, psychoticism, involvement and overall immersive tendencies, and was improved in those participants who adopted a survey strategy. In other words, navigation in virtual environments is affected not only by navigational strategy but also by an individual's personality and by other factors such as their level of experience with computers. An understanding of these differences is crucial before performance in virtual environments can be generalised to real-world navigational performance.

Relevance: 100.00%

Abstract:

We present the first 3D simulation of the last minutes of oxygen shell burning in an 18 solar mass supernova progenitor up to the onset of core collapse. A moving inner boundary is used to accurately model the contraction of the silicon and iron core according to a 1D stellar evolution model with a self-consistent treatment of core deleptonization and nuclear quasi-equilibrium. The simulation covers the full solid angle to allow the emergence of large-scale convective modes. Due to core contraction and the concomitant acceleration of nuclear burning, the convective Mach number increases to ~0.1 at collapse, and an l=2 mode emerges shortly before the end of the simulation. Aside from a growth of the oxygen shell from 0.51 to 0.56 solar masses due to entrainment from the carbon shell, the convective flow is reasonably well described by mixing-length theory, and the dominant scales are compatible with estimates from linear stability analysis. We deduce that artificial changes in the physics, such as accelerated core contraction, can have serious consequences for the state of convection at collapse. We argue that scaling laws for the convective velocities and eddy sizes furnish good estimates for the state of shell convection at collapse, and we develop a simple analytic theory for the impact of convective seed perturbations on shock revival in the ensuing supernova. We predict a reduction of the critical luminosity for explosion by 12-24% due to seed asphericities for our 3D progenitor model relative to the case without large seed perturbations.

Relevance: 100.00%

Abstract:

The new generation of artificial satellites is providing a huge amount of Earth observation imagery whose exploitation can yield invaluable benefits, both economic and environmental. However, only a small fraction of this data volume has been analyzed, mainly due to the large human resources required for that task. The development of unsupervised methodologies for the analysis of these images is therefore a priority. In this work, a new unsupervised segmentation algorithm for satellite images is proposed. The algorithm is based on rough-set theory and is inspired by a previous segmentation algorithm defined in the RGB color domain. The main contributions of the new algorithm are: (i) the original algorithm is extended to four spectral bands; (ii) the superpixel concept is used to define the neighborhood similarity of a pixel adapted to the local characteristics of each image; and (iii) two new region-merging strategies are proposed and evaluated to establish the final number of regions in the segmented image. The experimental results show that the proposed approach improves the results of the original method when both are applied to satellite images with different spectral and spatial resolutions.
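One of the listed contributions, merging neighboring regions by spectral similarity, can be sketched as follows. The 4-band region means, the adjacency list, and the distance threshold are invented, and the rough-set and superpixel machinery of the actual algorithm is omitted.

```python
import numpy as np

def merge_regions(means, adjacency, threshold):
    """Greedy merge: union adjacent regions whose mean 4-band spectral
    vectors are closer than `threshold` (Euclidean distance).
    Returns the final label of each region."""
    labels = list(range(len(means)))
    def find(i):                               # follow parent links to the root
        while labels[i] != i:
            i = labels[i]
        return i
    for i, j in adjacency:
        if np.linalg.norm(means[i] - means[j]) < threshold:
            labels[find(j)] = find(i)          # union the two regions
    return [find(i) for i in range(len(means))]

means = np.array([[10, 20, 30, 40],
                  [11, 21, 29, 41],            # spectrally close to region 0
                  [90, 80, 70, 60]], float)    # spectrally distant
labels_out = merge_regions(means, adjacency=[(0, 1), (1, 2)], threshold=5.0)
print(labels_out)                              # regions 0 and 1 merge; 2 stays apart
```

A strategy of this kind, iterated with an update of the region means, is one way the final number of regions in a segmented image can be established.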

Relevance: 100.00%

Abstract:

Resuscitation and stabilization are key issues in Intensive Care Burn Units, and early survival predictions help to decide the best clinical action during these phases. Current burn survival scores focus on clinical variables such as age or the body surface area. However, the evolution of other parameters (e.g. diuresis or fluid balance) during the first days is also valuable knowledge. In this work we propose a methodology and a Temporal Data Mining algorithm to estimate the survival condition from the patient's evolution. Experiments conducted on 480 patients show improved survival prediction.
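A hedged sketch of the kind of temporal feature such an algorithm might derive from the patient's evolution: the least-squares trend of a daily parameter over the first days. The diuresis values and their interpretation below are invented for illustration.

```python
def trend(series):
    """Least-squares slope of a daily series (day index 0, 1, 2, ...)."""
    n = len(series)
    xs = range(n)
    mx, my = sum(xs) / n, sum(series) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, series)) / \
           sum((x - mx) ** 2 for x in xs)

# Hypothetical daily diuresis over the first four days [mL/kg/h].
diuresis = [0.4, 0.7, 1.1, 1.3]
print(trend(diuresis))   # positive slope: improving evolution
```

Features like this slope can then be combined with the static variables (age, body surface area) in whatever classifier estimates the survival condition.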

Relevance: 100.00%

Abstract:

Aim: To compare soft-tissue dissolution by sodium hypochlorite with an EDTA intermediate rinse, with or without passive ultrasonic activation (PUI) or sonic activation using the Endoactivator (EA) or Eddy tips (ED). Methodology: The root canals of eighty-three human maxillary central incisors were chemo-mechanically prepared and the teeth split. A standardized longitudinal intracanal groove was created in one of the root halves. Eighty-three porcine palatal mucosa samples were collected, adapted to fit into the grooves and weighed. The re-assembled specimens were randomly divided into four experimental groups (n = 20), based on the final rinse: no activation, EA, PUI or ED, using 2.5% sodium hypochlorite with an EDTA intermediate rinse. A control group (n = 3) was irrigated with distilled water without activation. The solutions were delivered using a syringe and needle placed 2 mm from working length. Total irrigation time was 150 s, including 60 s of activation in the specific groups. The study was carried out at 36 ± 2 °C. The porcine palatal mucosa samples were weighed again after completion of the assays. The paired Student's t-test and ANOVA were used to assess the intra- and intergroup weight changes, with multiple comparisons evaluated using the Bonferroni correction (α = 0.05). Results: Weight loss occurred in all experimental groups. Irrigant activation resulted in greater weight loss than in the non-activated group [vs. EA (P = 0.001); vs. PUI (P < 0.001); vs. ED (P < 0.001)]. No significant differences were found amongst the different activation systems. Conclusions: Activation increased the tissue-dissolving activity of irrigants in artificial grooves in root canals of maxillary central incisors. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.
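The intragroup comparison (sample weight before vs after irrigation) can be illustrated with a hand-computed paired t statistic; the sample weights below are invented, not the study's data.

```python
import math

def paired_t(before, after):
    """Paired t statistic: mean difference divided by its standard error."""
    d = [b - a for b, a in zip(before, after)]      # per-sample weight loss
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / math.sqrt(var / n)

# Hypothetical tissue weights [mg] for one group, before and after irrigation.
before = [25.1, 24.8, 26.0, 25.5, 24.9]
after  = [21.0, 20.6, 22.1, 21.4, 20.8]
t = paired_t(before, after)
print(t)   # a large positive t indicates systematic weight loss
```

The intergroup step (comparing weight loss across the activation systems) would then use ANOVA with a Bonferroni-corrected post hoc, as in the study.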

Relevance: 100.00%

Abstract:

Industrial robots are both versatile and high-performing, enabling the flexible automation typical of modern Smart Factories. For safety reasons, however, they must be confined inside closed fences and/or virtual safety barriers, keeping them strictly separated from human operators. This can be a limitation in scenarios where it is useful to combine human cognitive skill with the accuracy and repeatability of a robot, or simply to allow safe coexistence in a shared workspace. Collaborative robots (cobots), on the other hand, are intrinsically limited in speed and power so that they can share workspace and tasks with human operators, and they feature the very intuitive hand-guiding programming method. Cobots, however, cannot compete with industrial robots in terms of performance, and are thus useful only in a limited niche, where their synergy with human operators actually improves productivity and/or the quality of the work. The limitations of both the purely industrial and the collaborative paradigms can be overcome by combining industrial robots with artificial vision. In particular, vision can be exploited for real-time adjustment of the pre-programmed task-based robot trajectory, by means of visual tracking of dynamic obstacles (e.g. human operators). This strategy allows the robot to modify its motion only when necessary, maintaining a high level of productivity while increasing its versatility. In addition, vision enables more intuitive programming paradigms for industrial robots as well, such as programming by demonstration. These possibilities offered by artificial vision enable an efficacious and promising way of achieving human-robot collaboration, overcoming the limitations of both previous paradigms while keeping their strengths.
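The real-time motion adjustment described above is often realized as distance-based speed scaling (speed-and-separation monitoring): the robot slows down only when a tracked obstacle gets close. The sketch below is a generic illustration with invented distance thresholds and speed limit, not the authors' implementation.

```python
def speed_limit(distance_m, stop_dist=0.5, full_dist=2.0, v_max=1.5):
    """Allowed robot speed [m/s] given the vision-tracked distance to the
    nearest dynamic obstacle: zero inside stop_dist, full speed beyond
    full_dist, linear in between."""
    if distance_m <= stop_dist:
        return 0.0                 # obstacle too close: stop
    if distance_m >= full_dist:
        return v_max               # workspace clear: full productivity
    return v_max * (distance_m - stop_dist) / (full_dist - stop_dist)

# The robot only deviates from nominal speed when a human approaches.
print(speed_limit(3.0), speed_limit(1.25), speed_limit(0.3))
```

Because the limit is recomputed each control cycle from the vision tracker, the pre-programmed trajectory itself need not change; only its time scaling does.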

Relevance: 100.00%

Abstract:

Because of their position at the interface between fresh and saline waters, estuarine and lagoonal inlets are highly complex and dynamic geomorphological systems. As a consequence of the spatial and temporal variability of tidal flows, the bed responds with great variability in its morphological and sedimentary characteristics. It is therefore possible to relate bottom circulation and sediment transport directly to the submerged bedforms generated. Echo-sounding, side-scan sonar and high-resolution seismic profiles acquired at the Cananéia lagoonal inlet revealed an extremely complex bottom dynamic, characterized by ripple marks and sand waves of metric height. The largest sand waves, located in a depression at the lagoonal inlet, show an inversion of polarity in their asymmetry, with large symmetric waves present at the inversion point. This morphological pattern shows no temporal variation on an annual scale, suggesting the persistence of a flow pattern over the bed. These dynamics also reveal the constancy of convergent flows that are apparently independent of flood or ebb tidal conditions. The results allowed the establishment of a first qualitative model of bottom circulation in the area, with potential applications in navigation and coastal protection studies.

Relevance: 100.00%

Abstract:

This work is part of research under way since 2000, whose main objective is to measure small dynamic displacements using L1 GPS receivers. A very sensitive way to detect millimetric periodic displacements is the Phase Residual Method (PRM), which is based on the frequency-domain analysis of the phase residuals resulting from the L1 double-difference static processing of two satellites at nearly orthogonal elevation angles. In this article, we propose to obtain the phase residuals directly from the raw phase observable collected over a short baseline during a limited time span, instead of obtaining the residual file from standard GPS processing programs, which do not always allow the choice of the desired satellites. To improve the ability to detect millimetric oscillations, two filtering techniques are introduced. One is autocorrelation, which reduces phase noise with random time behavior; the other is a running mean, which separates low-frequency from high-frequency phase sources. Two trials were carried out to verify the proposed method and filtering techniques: one simulates a 2.5-millimeter vertical antenna displacement, and the second uses GPS data collected during a bridge load test. The results show good consistency in detecting millimetric oscillations.
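The running-mean filter can be illustrated on a synthetic residual series: subtracting the running mean removes the slow drift, leaving the millimetric oscillation (plus noise) exposed for frequency-domain analysis. All signal parameters below (1 Hz sampling, 0.05 Hz oscillation, 2.5 mm amplitude) are invented for illustration, and the autocorrelation step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(600.0)                          # seconds, 1 Hz sampling
drift = 0.002 * t                             # slow low-frequency trend [mm]
osc = 2.5 * np.sin(2 * np.pi * 0.05 * t)      # millimetric oscillation [mm]
noise = rng.normal(0.0, 0.5, t.size)          # random phase noise [mm]
residuals = drift + osc + noise               # synthetic phase residual series

def running_mean(x, w):
    """Centered moving average of window w (samples)."""
    return np.convolve(x, np.ones(w) / w, mode="same")

low = running_mean(residuals, 101)            # low-frequency component
high = residuals - low                        # oscillation + noise remain
# The injected sine survives the high-pass step: the RMS of `high` is close
# to the 2.5/sqrt(2) mm expected for the oscillation alone.
rms = float(np.sqrt(np.mean(high ** 2)))
print(rms)
```

In the actual method, a spectral analysis of this high-frequency component would then reveal the oscillation's frequency, as in the antenna and bridge load trials.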

Relevance: 100.00%

Abstract:

Objective: We carry out a systematic assessment of a suite of kernel-based learning machines for the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of wavelet basis on the quality of the extracted features; four wavelet basis functions were considered in this study. We then provide the average cross-validation accuracy values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value.
Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of wavelet family seems less relevant. The statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged across all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
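The two kernel functions whose radius parameter the study varies can be written down directly. The exponential-RBF form below is one common convention (conventions differ on the denominator), and the test points and radii are arbitrary; the point is to show why similarity, and hence accuracy, is sensitive to the radius.

```python
import numpy as np

def gaussian_rbf(x, y, radius):
    """Gaussian RBF: similarity decays with squared Euclidean distance."""
    return float(np.exp(-np.sum((x - y) ** 2) / (2 * radius ** 2)))

def exponential_rbf(x, y, radius):
    """Exponential RBF (one common form): decays with plain Euclidean distance."""
    return float(np.exp(-np.sqrt(np.sum((x - y) ** 2)) / (2 * radius ** 2)))

x, y = np.array([0.0, 0.0]), np.array([1.0, 1.0])
for r in (0.5, 2.0, 8.0):
    print(r, gaussian_rbf(x, y, r), exponential_rbf(x, y, r))
# A small radius makes the kernel sharply local (distant points look
# dissimilar); a large radius makes almost everything look similar -
# the "zones of optimality" lie between these extremes.
```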

Relevance: 100.00%

Abstract:

Aims: We aimed to evaluate whether the co-localisation of calcium and necrosis in intravascular ultrasound virtual histology (IVUS-VH) is due to artefact, and whether this effect can be mathematically estimated. Methods and results: We hypothesised that, if calcium induces artefactual coding of necrosis, any addition of calcium content would generate an artificial increment in the necrotic tissue. Stent struts were used to simulate the "added calcium". The change in the amount and in the spatial localisation of necrotic tissue was evaluated before and after stenting (n=17 coronary lesions) by means of specially developed imaging software. The area of "calcium" increased from a median of 0.04 mm^2 at baseline to 0.76 mm^2 after stenting (p<0.01). In parallel, the median necrotic content increased from 0.19 mm^2 to 0.59 mm^2 (p<0.01). The "added" calcium strongly predicted a proportional increase in necrosis-coded tissue in the areas surrounding the calcium-like spots (model R^2=0.70; p<0.001). Conclusions: The artificial addition of calcium-like elements to the atherosclerotic plaque led to an increase in necrosis-coded tissue in virtual histology that is probably artefactual. The overestimation of necrotic tissue by calcium strictly followed a linear pattern, indicating that it may be amenable to mathematical correction.
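The "linear pattern amenable to mathematical correction" mentioned in the conclusions can be illustrated with an ordinary least-squares fit of necrosis-coded gain on added calcium-like area. The lesion measurements below are invented for illustration, not the study's data.

```python
import numpy as np

# Hypothetical per-lesion measurements [mm^2].
added_calcium = np.array([0.1, 0.3, 0.5, 0.8, 1.2])     # calcium-like area added
necrosis_gain = np.array([0.08, 0.22, 0.41, 0.63, 0.95])  # necrosis-coded increase

# Degree-1 least-squares fit: necrosis_gain ~ slope * added_calcium + intercept.
slope, intercept = np.polyfit(added_calcium, necrosis_gain, 1)
predicted = slope * added_calcium + intercept
ss_res = np.sum((necrosis_gain - predicted) ** 2)
ss_tot = np.sum((necrosis_gain - necrosis_gain.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(slope, r2)   # a strong linear fit supports a mathematical correction:
                   # corrected necrosis = measured - slope * calcium_area
```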