884 results for Artificial satellites in telecommunications
Abstract:
"Taken from monographs and periodicals published in the U.S.S.R. in the years 1955-1958."
Abstract:
Bibliography: p. 392-401.
Abstract:
In rural Australia in the early twenty-first century, telecommunications reform has seen the rise of local telecommunications as a new way to wire the country, delivering new technologies and meeting community needs and aspirations. This paper discusses the prospects for local telecommunications in light of a research project on online rural communities commissioned by the Telstra Consumer Consultative Council. Based on interviews conducted in three small towns in rural eastern Australia, the paper examines the role of community networking as a new force in telecommunications service delivery, posing questions for local and regional communications policy development.
Abstract:
Previous studies have indicated that consonant imprecision in Parkinson's disease (PD) may result from a reduction in the amplitude of lingual movements, or articulatory undershoot. While this has been postulated, direct measurement of the tongue's contact with the hard palate during speech production has not been undertaken. Therefore, the present study aimed to use electropalatography (EPG) to determine the exact nature of tongue-palate contact in a group of individuals with PD and consonant imprecision (n=9). Furthermore, the current investigation also aimed to compare the results of the participants with PD to groups of aged (n=7) and young (n=8) control speakers, to determine the relative contribution of ageing of the lingual musculature to any articulatory deficits noted. Participants were required to read aloud the phrase 'I saw a ___ today' with the artificial palate in situ. Target words included the consonants /l/, /s/ and /t/ in initial position in both the /i/ and /a/ vowel environments. Phonetic transcription of phoneme productions and description of error types were completed. Furthermore, representative frames of contact were employed to describe the features of tongue-palate contact and to calculate spatial palatal indices. Results of the perceptual investigation revealed that perceived undershooting of articulatory targets distinguished the participant group with PD from the control groups. However, objective EPG assessment indicated that undershooting of the target consonant was not the cause of the perceived articulatory errors. It is, therefore, possible that reduced pressure of tongue contact with the hard palate, sub-lingual deficits or impaired articulatory timing resulted in the perceived undershooting of the target consonants.
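As a rough illustration of how such spatial palatal indices can be computed, the sketch below treats one EPG frame as an 8-row binary contact grid (62 electrodes, Reading-style layout) and derives a whole-palate and an anterior (alveolar) contact fraction. The frame and the index definitions are generic examples, not the specific indices used in the study.

```python
# Illustrative sketch: simple spatial contact indices from one EPG frame.
# A Reading-style palate has 62 electrodes in 8 rows (6 in the front row,
# 8 in each of the other seven); a frame is a list of rows of 0/1 values.
# These index definitions are examples, not the study's exact indices.

def contact_fraction(rows):
    """Fraction of all electrodes contacted in one frame."""
    total = sum(len(r) for r in rows)
    on = sum(sum(r) for r in rows)
    return on / total

def alveolar_fraction(rows, n_front_rows=2):
    """Contact fraction restricted to the front (alveolar) rows."""
    front = rows[:n_front_rows]
    return sum(sum(r) for r in front) / sum(len(r) for r in front)

# A hypothetical frame for /t/: dense anterior contact, lateral bracing.
frame = [
    [1, 1, 1, 1, 1, 1],            # row 1 (alveolar, 6 electrodes)
    [1, 1, 1, 1, 1, 1, 1, 1],      # row 2
] + [[1, 0, 0, 0, 0, 0, 0, 1]] * 6 # rows 3-8: lateral contact only

print(f"whole-palate contact: {contact_fraction(frame):.2f}")
print(f"alveolar contact:     {alveolar_fraction(frame):.2f}")
```

Undershoot of an alveolar target would show up as a reduced alveolar fraction relative to control speakers, which is the kind of comparison the spatial indices support.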
Abstract:
Machine breakdowns are one of the main sources of disruption and throughput fluctuation in highly automated production facilities. One element in reducing this disruption is ensuring that the maintenance team responds correctly to machine failures. It is, however, difficult to determine the current practice employed by the maintenance team, let alone suggest improvements to it. 'Knowledge-based improvement' is a methodology that aims to address this issue by (a) eliciting knowledge on current practice, (b) evaluating that practice and (c) looking for improvements. The methodology, based on visual interactive simulation and artificial intelligence methods, and its application to a Ford engine assembly facility are described. Copyright © 2002 Society of Automotive Engineers, Inc.
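To make the "evaluate current practice" step concrete, here is a minimal sketch in the spirit of the methodology: a stream of random machine breakdowns is replayed against two candidate maintenance dispatch rules, which are compared on impact-weighted downtime. All rates, rules and parameters are invented for illustration; the actual study used visual interactive simulation of the Ford facility.

```python
# Minimal sketch of simulation-based evaluation of maintenance practice.
# All failure rates, repair times and dispatch rules are invented.
import random

def simulate(dispatch, n_events=500):
    """Replay one breakdown stream; return impact-weighted downtime."""
    random.seed(1)                     # same breakdown stream for every rule
    queue, clock, free_at, lost = [], 0.0, 0.0, 0.0
    for _ in range(n_events):
        clock += random.expovariate(1 / 30.0)        # mean 30 min between failures
        queue.append((clock, random.uniform(5, 60),  # (failed_at, repair_time,
                      random.random()))              #  throughput impact 0..1)
        while queue and free_at <= clock:            # technician picks next job
            failed_at, repair, impact = queue.pop(dispatch(queue))
            free_at = max(free_at, failed_at) + repair
            lost += (free_at - failed_at) * impact   # weighted machine-minutes
    return lost

fifo = lambda q: 0                                               # first come, first served
by_impact = lambda q: max(range(len(q)), key=lambda i: q[i][2])  # highest impact first

print(f"FIFO:      {simulate(fifo):8.0f} weighted machine-minutes lost")
print(f"By impact: {simulate(by_impact):8.0f} weighted machine-minutes lost")
```

Re-seeding inside `simulate` gives each rule the identical breakdown stream, so the comparison is paired rather than confounded by random variation.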
Abstract:
Regulation is subject to information asymmetries that can lead to allocative and productive inefficiencies. One solution, suggested by Shleifer in 1985 and now adopted by many regulatory bodies around the world, is 'benchmarking', sometimes called 'yardstick competition'. In this paper we consider Shleifer's original approach to benchmarking and contrast it with the actual use of benchmarking by UK regulatory bodies in the telecommunications, water and energy sectors since the privatizations of the 1980s and early 1990s. We find that benchmarking plays only one part, and sometimes a small part, in the setting of regulatory price caps in the UK. We also find that in practice benchmarking has been subject to a number of difficulties, which mean that it is never likely to be more than one tool in the regulator's armoury. The UK's experience provides lessons for regulation internationally. © 2006 Elsevier Ltd. All rights reserved.
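For readers unfamiliar with Shleifer's scheme, the sketch below shows yardstick pricing in its simplest form: each firm's price cap is set to the mean unit cost reported by the other firms, so no firm can raise its own cap by inflating its own costs. The cost figures are hypothetical, and, as the paper argues, real UK price-cap determinations use benchmarking as only one input.

```python
# Sketch of Shleifer-style yardstick pricing in its simplest form: each
# firm's cap is the mean unit cost of the *other* firms. Costs invented.

def yardstick_caps(unit_costs):
    n = len(unit_costs)
    total = sum(unit_costs)
    return [(total - c) / (n - 1) for c in unit_costs]

costs = [10.0, 12.0, 11.0, 15.0]        # hypothetical reported unit costs
for firm, (c, cap) in enumerate(zip(costs, yardstick_caps(costs))):
    note = "earns margin" if cap > c else "penalised"
    print(f"firm {firm}: cost {c:5.2f} -> cap {cap:5.2f} ({note})")
```

The efficient firms earn a margin while the high-cost firm is penalised, which is the incentive mechanism the original proposal relies on.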
Abstract:
Bilateral corneal blindness represents a quarter of the total blind worldwide. The artificial cornea, in assorted forms, was developed to replace opaque, non-functional corneas and to restore sight in otherwise hopeless cases not amenable to corneal grafts, believed to be 2% of the corneal blind. Despite technological advances in materials design and tissue engineering, no artificial cornea has provided absolute, long-term success. Formidable problems exist, due to a combination of unpredictable wound healing and unmanageable pathology. To guarantee reliable success, an artificial cornea must possess three attributes: an optical window to replace the opaque cornea; a strong, long-term union to surrounding ocular tissue; and the ability to induce desired host responses. One unique artificial cornea possesses all three functional attributes: the Osteo-odonto-keratoprosthesis (OOKP). The OOKP has a high success rate and can survive for up to twenty years, but it is complicated both in structure and in surgical procedure; it is expensive and not universally available. The aim of this project was to develop a synthetic substitute for the OOKP, based upon key features of the tooth and bone structure, thereby reducing surgical complexity and biological complications. Analysis of the biological effectiveness of the OOKP showed that the structure of bone was the most crucial component for implant retention. An experimental semi-rigid hydroxyapatite framework was fabricated with a complex bone-like architecture, which could be fused to the optical window. The first method for making such a framework was pressing and sintering of hydroxyapatite powders; however, it was not possible to fabricate a void architecture with the correct sizes and uniformity of pores. Ceramers were synthesised using alternative pore-forming methods, providing improved mechanical properties and stronger attachment to the plastic optical window. Naturally occurring skeletal structures closely match the structural features of all forms of natural bone. Synthetic casts of desirable natural structures, such as coral and sponges, were fabricated using the replamineform process. The final method of construction bypassed ceramic fabrication in favour of pre-formed coral derivatives and focused on methods for polymer infiltration, adhesion and fabrication. Prototypes were constructed and evaluated; a fully penetrative synthetic OOKP analogue was fabricated according to the dimensions of the OOKP. Fabrication of a cornea-shaped synthetic OOKP analogue was also attempted.
Abstract:
Cognitive systems research involves the synthesis of ideas from natural and artificial systems in the analysis, understanding, and design of all intelligent systems. This chapter discusses the cognitive systems associated with the hippocampus (HC) of the human brain and their possible role in behaviour and neurodegenerative disease. The HC is concerned with the analysis of highly abstract data derived from all sensory systems, but its specific role remains controversial. Hence, there have been three major theories concerning its function: the memory theory, the spatial theory, and the behavioral inhibition theory. The memory theory has its origin in the surgical destruction of the HC, which results in severe anterograde and partial retrograde amnesia. The spatial theory has its origin in the observation that neurons in the HC of animals show activity related to their location within the environment. By contrast, the behavioral inhibition theory suggests that the HC acts as a ‘comparator’, i.e., it compares current sensory events with expected or predicted events. If a set of expectations continues to be verified, then no alteration of behavior occurs. If, however, a ‘mismatch’ is detected, then the HC intervenes by initiating appropriate action through active inhibition of current motor programs and initiation of new data gathering. Understanding the cognitive systems of the hippocampus in humans may aid in the design of intelligent systems involved in spatial mapping, memory, and decision making. In addition, this information may lead to a greater understanding of the course of clinical dementia in the various neurodegenerative diseases in which there is significant damage to the HC.
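The comparator logic can be expressed schematically in a few lines of code. The sketch below is purely illustrative of the described control loop, not a model of hippocampal circuitry; all names and values are hypothetical.

```python
# Schematic sketch of the 'comparator' account of hippocampal function:
# expected and actual sensory events are compared each step; a match
# leaves ongoing behaviour untouched, a mismatch inhibits the current
# motor program and triggers renewed data gathering. Purely illustrative.

def comparator_step(expected, actual, state):
    if expected == actual:
        return state                       # expectations verified: no change
    return {"motor_program": None,         # inhibit current motor program
            "mode": "explore"}             # initiate new data gathering

state = {"motor_program": "walk_to_feeder", "mode": "exploit"}
for expected, actual in [("corridor", "corridor"),
                         ("corridor", "open_field")]:   # novel event: mismatch
    state = comparator_step(expected, actual, state)
    print(expected, "vs", actual, "->", state)
```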
Abstract:
We report on high power issues related to the reliability of fibre Bragg gratings inscribed with an infrared femtosecond laser using the point-by-point writing method. Conventionally, fibre Bragg gratings have usually been written in fibres using ultraviolet light, either holographically or using a phase mask. Since the coating is highly absorbing in the UV, this process normally requires that the protective polymer coating is stripped prior to inscription, with the fibre then being recoated. This results in a time-consuming fabrication process that, unless great care is taken, can lead to fibre strength degradation due to the presence of surface damage. The recent development of FBG inscription using NIR femtosecond lasers has eliminated the requirement for stripping the coating. At the same time, the ability to write gratings point-by-point offers the potential for great flexibility in the grating design. There is, however, a requirement for reliability testing of these gratings, particularly for use in telecommunications systems, where high powers are increasingly being used in long-haul transmission systems making use of Raman amplification. We report on a study of such gratings that has revealed the presence of broad-spectrum power losses. When high powers are used, even at wavelengths far removed from the Bragg condition, these losses produce an increase in the fibre temperature due to absorption in the coating. We have monitored this temperature rise using the wavelength shift of the grating itself. At power levels of a few watts, various temperature increases were experienced, ranging from a few degrees up to the point where the buffer completely melts off the fibre at the grating site. Further investigations are currently under way to study the optical loss mechanisms in order to optimise the inscription process and minimise such losses.
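As a sketch of how the temperature rise can be inferred from the grating itself: the Bragg wavelength lambda_B = 2 * n_eff * Lambda of a silica fibre grating near 1550 nm typically shifts by roughly 10 pm per degree Celsius, with the thermo-optic effect dominating thermal expansion. The coefficient below is that textbook value, assumed for illustration rather than taken from the paper's calibration, and the shift values are hypothetical.

```python
# Converting a measured Bragg wavelength shift into a temperature rise.
# Near 1550 nm a bare silica FBG typically shifts by ~10 pm/degC; this
# is a textbook figure assumed here, not the paper's calibration.

SENSITIVITY_PM_PER_C = 10.0      # typical value, assumed

def temperature_rise(shift_pm, sensitivity=SENSITIVITY_PM_PER_C):
    return shift_pm / sensitivity

for shift in (30.0, 250.0, 2000.0):   # hypothetical measured shifts in pm
    print(f"shift {shift:7.1f} pm  ->  dT = {temperature_rise(shift):6.1f} degC")
```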
Abstract:
Expert systems (ES) are a class of computer programs developed by researchers in artificial intelligence. In essence, they are programs made up of a set of rules that analyze information about a specific class of problems, provide an analysis of the problems and, depending upon their design, recommend a course of user action to implement corrections. ES are computerized tools designed to enhance the quality and availability of the knowledge required by decision makers in a wide range of industries. Decision-making is important for the financial institutions involved due to the high level of risk associated with wrong decisions. The process of making decisions is complex and unstructured. Existing models for decision-making do not capture the learned knowledge well enough. In this study, we analyze the beneficial aspects of using ES for the decision-making process.
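A minimal sketch of such a rule-based system is shown below: facts about a case, a set of if-then rules, and forward chaining until no rule fires. The lending facts and rules are invented for illustration; the study does not publish a rule base.

```python
# Minimal sketch of a rule-based expert system: forward chaining over
# invented lending rules until a fixed point is reached.

facts = {"income": 42_000, "debt_ratio": 0.45, "years_employed": 1}

rules = [
    (lambda f: f["debt_ratio"] > 0.40,      {"risk": "high"}),
    (lambda f: f["years_employed"] < 2,     {"history": "thin"}),
    (lambda f: f.get("risk") == "high" and f.get("history") == "thin",
                                            {"decision": "refer to analyst"}),
]

changed = True
while changed:                      # forward-chain until no rule adds facts
    changed = False
    for condition, conclusion in rules:
        if condition(facts) and not all(facts.get(k) == v
                                        for k, v in conclusion.items()):
            facts.update(conclusion)
            changed = True

print(facts["decision"])            # -> refer to analyst
```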
Abstract:
Random fiber lasers blend together attractive features of traditional random lasers, such as low cost and simplicity of fabrication, with high-performance characteristics of conventional fiber lasers, such as good directionality and high efficiency. Low coherence of random lasers is important for speckle-free imaging applications. The random fiber laser with distributed feedback proposed in 2010 led to a quickly developing class of light sources that utilize inherent optical fiber disorder in the form of the Rayleigh scattering and distributed Raman gain. The random fiber laser is an interesting and practically important example of a photonic device based on exploitation of optical medium disorder. We provide an overview of recent advances in this field, including high-power and high-efficiency generation, spectral and statistical properties of random fiber lasers, nonlinear kinetic theory of such systems, and emerging applications in telecommunications and distributed sensing.
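As a toy illustration of the gain-loss balance underlying such lasers, the sketch below integrates single-pass Raman amplification of a weak Stokes seed along a fibre with an undepleted pump; the full random-DFB laser model additionally couples forward and backward waves through distributed Rayleigh backscattering. The coefficients are order-of-magnitude values for standard silica fibre, not figures taken from the review.

```python
# Toy power-balance sketch: dP_s/dz = g_R * P_p * P_s - alpha_s * P_s,
# with an undepleted pump P_p(z) = P0 * exp(-alpha_p * z). Coefficients
# are order-of-magnitude values for silica fibre (losses ~0.2, 0.25 dB/km).
import math

g_R, alpha_s, alpha_p = 0.6e-3, 0.046e-3, 0.057e-3   # 1/(W m), 1/m, 1/m
P0, L, dz = 0.5, 50_000.0, 10.0                       # 0.5 W pump, 50 km fibre

Ps, z = 1e-6, 0.0                                     # 1 uW Stokes seed
while z < L:
    Pp = P0 * math.exp(-alpha_p * z)                  # undepleted pump
    Ps += (g_R * Pp * Ps - alpha_s * Ps) * dz         # Euler step
    z += dz

print(f"Stokes power after {L/1000:.0f} km: {Ps*1e3:.3f} mW")
```

At higher pump powers the net gain grows to the point where the weak distributed Rayleigh backscattering suffices as feedback, which is the lasing mechanism described above.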
Abstract:
Every space launch increases the overall amount of space debris. Satellites have limited awareness of nearby objects that might pose a collision hazard. Astrometric, radiometric, and thermal models for the study of space debris in low-Earth orbit have been developed. This modeling approach proposes analysis methods that provide increased Local Area Awareness for satellites in low-Earth and geostationary orbit. Local Area Awareness is defined as the ability to detect, characterize, and extract useful information regarding resident space objects as they move through the space environment surrounding a spacecraft. The study of space debris is of critical importance to all space-faring nations. Characterization efforts are proposed using long-wave infrared sensors for space-based observations of debris objects in low-Earth orbit. Long-wave infrared sensors are commercially available and do not require the target to be solar illuminated, as the received signal is temperature dependent. The characterization of debris objects through passive imaging techniques allows for further studies into the origination, specifications, and future trajectory of debris objects. Conclusions are made regarding the aforementioned thermal analysis as a function of debris orbit, geometry, orientation with respect to time, and material properties. Development of a thermal model permits the characterization of debris objects based upon their received long-wave infrared signals. Information regarding the material type, size, and tumble-rate of the observed debris objects is extracted. This investigation proposes the utilization of long-wave infrared radiometric models of typical debris to develop techniques for the detection and characterization of debris objects via signal analysis of unresolved imagery. Knowledge regarding the orbital type and semi-major axis of the observed debris object is extracted via astrometric analysis. This knowledge may aid in constraining the admissible region for the initial orbit determination process. The resultant orbital information is then fused with the radiometric characterization analysis, enabling further characterization of the observed debris object. This fused analysis, yielding orbital, material, and thermal properties, significantly increases a satellite's Local Area Awareness via an intimate understanding of the debris environment surrounding the spacecraft.
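The temperature dependence of the received LWIR signal follows directly from Planck's law. The sketch below integrates blackbody spectral radiance over a nominal 8-14 um band for a few plausible debris temperatures; emissivity, sensor geometry, and the specific band of any particular instrument are omitted for simplicity.

```python
# Why LWIR observation needs no solar illumination: the in-band (8-14 um)
# blackbody radiance emitted by a debris object depends only on its
# temperature. Planck's law, integrated numerically by trapezoid rule.
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann

def planck(lam, T):
    """Spectral radiance, W / (m^2 sr m)."""
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K * T))

def band_radiance(T, lo=8e-6, hi=14e-6, n=500):
    """Trapezoidal integral of Planck radiance over [lo, hi]."""
    step = (hi - lo) / n
    vals = [planck(lo + i * step, T) for i in range(n + 1)]
    return step * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

for T in (150, 250, 300, 350):            # plausible debris temperatures, K
    print(f"T = {T:3d} K -> L(8-14um) = {band_radiance(T):7.2f} W m^-2 sr^-1")
```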
Abstract:
In the framework of the global energy balance, the radiative energy exchanges between Sun, Earth and space are now accurately quantified from new satellite missions. Much less is known about the magnitude of the energy flows within the climate system and at the Earth surface, which cannot be directly measured by satellites. In addition to satellite observations, here we make extensive use of the growing number of surface observations to constrain the global energy balance not only from space, but also from the surface. We combine these observations with the latest modeling efforts performed for the 5th IPCC assessment report to infer best estimates for the global mean surface radiative components. Our analyses favor global mean downward surface solar and thermal radiation values near 185 and 342 W m^-2, respectively, which are most compatible with surface observations. Combined with an estimated surface absorbed solar radiation and thermal emission of 161 W m^-2 and 397 W m^-2, respectively, this leaves 106 W m^-2 of surface net radiation available for distribution amongst the non-radiative surface energy balance components. The climate models overestimate the downward solar and underestimate the downward thermal radiation, thereby nevertheless simulating an adequate global mean surface net radiation by error compensation. This also suggests that, globally, the simulated surface sensible and latent heat fluxes, around 20 and 85 W m^-2 on average, represent realistic values. The findings of this study are compiled into a new global energy balance diagram, which may be able to reconcile currently disputed inconsistencies between energy and water cycle estimates.
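The quoted components close arithmetically, as the quick check below shows; the ~1 W m^-2 residual is within the rounding of the stated estimates. All values are the global means from the abstract.

```python
# Closing the surface energy budget from the numbers quoted above:
# net radiation = absorbed solar + downward thermal - emitted thermal,
# which should roughly balance the non-radiative fluxes. Units: W m^-2.

absorbed_solar   = 161.0
downward_thermal = 342.0
emitted_thermal  = 397.0
sensible, latent = 20.0, 85.0

net_radiation = absorbed_solar + downward_thermal - emitted_thermal
print(f"surface net radiation: {net_radiation:.0f} W m^-2")        # 106
print(f"sensible + latent:     {sensible + latent:.0f} W m^-2")    # 105
print(f"residual:              {net_radiation - sensible - latent:.0f} W m^-2")
```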
Abstract:
Information processing in the human brain has always been considered a source of inspiration in Artificial Intelligence; in particular, it has led researchers to develop tools such as artificial neural networks. Recent findings in neurophysiology provide evidence that not only neurons but also individual astrocytes and networks of astrocytes are responsible for processing information in the human brain. Artificial neural networks (ANNs) model neuron-neuron communications. Artificial neuron-glia networks (ANGNs) model neuron-astrocyte connections in addition to neuron-neuron communications. Continuing the research on ANGNs, we first propose and evaluate a model of adaptive neuro-fuzzy inference systems augmented with artificial astrocytes. Then, we propose a model of ANGNs that captures the communications of astrocytes in the brain; in this model, a network of artificial astrocytes is implemented on top of a typical neural network. The results of implementing both networks show that, for certain combinations of the parameter values specifying astrocytes and their connections, the new networks outperform typical neural networks. This research opens a range of possibilities for future work on designing more powerful artificial neural network architectures based on more realistic models of the human brain.
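One simple way to put an artificial astrocyte on top of a neural network, in the spirit of the ANGN literature, is sketched below: the astrocyte watches a neuron's recent firing and transiently boosts the associated weights after sustained activity. The thresholds and gains are invented for illustration and are not the paper's parameter values.

```python
# Sketch of one artificial-astrocyte scheme: sustained firing of a
# watched neuron multiplicatively boosts its synaptic weights, and the
# boost decays back toward 1.0 otherwise. Parameter values are invented.

class Astrocyte:
    def __init__(self, boost=1.25, decay=0.9, patience=3):
        self.boost, self.decay, self.patience = boost, decay, patience
        self.streak, self.modulation = 0, 1.0

    def observe(self, fired):
        self.streak = self.streak + 1 if fired else 0
        if self.streak >= self.patience:
            self.modulation *= self.boost       # sustained activity: amplify
        else:                                    # otherwise decay toward 1.0
            self.modulation = 1.0 + (self.modulation - 1.0) * self.decay
        return self.modulation                  # multiply synaptic weights by this

astro = Astrocyte()
for t, fired in enumerate([1, 1, 1, 1, 0, 0, 0]):
    print(f"t={t} fired={fired} weight x{astro.observe(bool(fired)):.3f}")
```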
Abstract:
We present the first 3D simulation of the last minutes of oxygen shell burning in an 18 solar mass supernova progenitor up to the onset of core collapse. A moving inner boundary is used to accurately model the contraction of the silicon and iron core according to a 1D stellar evolution model with a self-consistent treatment of core deleptonization and nuclear quasi-equilibrium. The simulation covers the full solid angle to allow the emergence of large-scale convective modes. Due to core contraction and the concomitant acceleration of nuclear burning, the convective Mach number increases to ~0.1 at collapse, and an l=2 mode emerges shortly before the end of the simulation. Aside from a growth of the oxygen shell from 0.51 to 0.56 solar masses due to entrainment from the carbon shell, the convective flow is reasonably well described by mixing length theory, and the dominant scales are compatible with estimates from linear stability analysis. We deduce that artificial changes in the physics, such as accelerated core contraction, can have precarious consequences for the state of convection at collapse. We argue that scaling laws for the convective velocities and eddy sizes furnish good estimates for the state of shell convection at collapse and develop a simple analytic theory for the impact of convective seed perturbations on shock revival in the ensuing supernova. We predict a reduction of the critical luminosity for explosion by 12-24% due to seed asphericities for our 3D progenitor model relative to the case without large seed perturbations.
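The scaling laws referred to are the standard mixing-length estimates; one commonly used form, assumed here for illustration (the paper's exact calibration may differ), relates the convective velocity to the energy throughput of the shell:

```latex
% Mixing-length estimate for the convective velocity in a burning shell
% (a form commonly used in this literature; details may differ from the
% paper's own calibration):
\[
  v_\mathrm{conv} \sim \left( \frac{L_\mathrm{shell}}{4\pi r^{2}\rho} \right)^{1/3},
  \qquad
  \mathrm{Ma} = \frac{v_\mathrm{conv}}{c_\mathrm{s}} \approx 0.1 \ \text{at collapse for this model,}
\]
% where L_shell is the nuclear luminosity of the shell, r its radius,
% rho the density, and c_s the local sound speed.
```

Such estimates let the state of convection at collapse, and hence the seed perturbations for shock revival, be predicted without rerunning the expensive 3D simulation.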