930 results for "new type AgInSbTe phase change films"
Abstract:
This thesis addresses the communication processes of community radio stations in the Sertão of Northeast Brazil that have a presence on the Internet. It aims to understand, theoretically and empirically, how these processes occur in these stations and to explore their specificities; to examine their structures, programming, teams, funding and histories of digital insertion; to understand the processes of stimulus, broadcasting and interaction in terms of citizenship; to understand how new voices emerge, both territorially and on the Internet; to understand how user (Internet audience) participation takes place; and to catalogue the community radio stations of the Sertão, or those that describe themselves as community stations, identifying their programming peculiarities and what distinguishes their territorial and non-territorial (Internet) broadcasting. The methodology consists of bibliographic and documentary research, a preliminary mapping of the stations, and field research through on-site visits and semi-structured interviews covering the stations in the eight northeastern Sertão states (Alagoas, Bahia, Ceará, Paraíba, Pernambuco, Piauí, Rio Grande do Norte and Sergipe). The stations' work on the Internet was also followed, both on their websites and in their presence on social networks. On the basis of theoretical parameters, three types of community radio station on the Internet are identified: offline stations (which merely maintain a space on the Internet, with no simultaneous transmission), institutional online stations (which only simulcast their dial programming) and dynamic online stations (which carry content distinct from the dial broadcast and promote interaction and interactivity). Being on the Internet increases the capacity of Sertão community radio stations to promote participation, interaction and interactivity, because community radio communication is fed back into a new type of communication driven by deterritorialisation, which is the greatest challenge for these stations in places of low purchasing and communicational power where electronic coronelismo still persists.
Abstract:
We have designed and fabricated a new type of fibre Bragg grating (FBG) with a V-shaped dispersion profile for multi-channel dispersion compensation in communication links.
Abstract:
We propose a new type of fiber Bragg grating (FBG) with a V-shaped dispersion profile. We demonstrate that such V-shaped FBGs bring advantages in manipulation of optical signals compared to conventional FBGs with a constant dispersion, e.g., they can produce larger chirp for the same input pulsewidth and/or can be used as pulse shapers. Application of the proposed V-shaped FBGs for signal prechirping in fiber transmission is examined. The proposed design of the V-shaped FBG can be easily extended to embrace multichannel devices.
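For orientation only (the abstract quotes no formulas, so the symbols below are illustrative assumptions): a V-shaped dispersion profile can be sketched as dispersion growing linearly with detuning from the centre wavelength, which makes the grating's group delay piecewise parabolic rather than linear as it is for a constant-dispersion FBG.

% Illustrative sketch; D_0, S and \lambda_0 are assumed parameters, not values from the paper.
\begin{align}
  D(\lambda) &= \frac{d\tau}{d\lambda} = D_0 + S\,\lvert\lambda - \lambda_0\rvert
  && \text{(V-shaped dispersion profile)} \\
  \tau(\lambda) &= \tau_0 + D_0(\lambda - \lambda_0)
     + \tfrac{1}{2}\,S\,(\lambda - \lambda_0)\,\lvert\lambda - \lambda_0\rvert
  && \text{(piecewise-parabolic group delay)}
\end{align}

Setting S = 0 recovers the conventional constant-dispersion grating.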
Abstract:
The paper introduces a framework for the formal specification of autonomic computing policies, and uses it to define a new type of autonomic computing policy termed a resource-definition policy. We describe the semantics of resource-definition policies, explain how they can be used as a basis for the development of autonomic systems of systems, and present a sample data-centre application built using the new policy type.
Abstract:
A key objective of autonomic computing is to reduce the cost and expertise required for the management of complex IT systems. As a growing number of these systems are implemented as hierarchies or federations of lower-level systems, techniques that support the development of autonomic systems of systems are required. This article introduces one such technique, which involves the run-time synthesis of autonomic system connectors. These connectors are specified by means of a new type of autonomic computing policy termed a resource-definition policy, and enable the dynamic realisation of collections of collaborating autonomic systems, as envisaged by the original vision of autonomic computing. We propose a framework for the formal specification of autonomic computing policies, and use it to define the new policy type and to describe its application to the development of autonomic systems of systems. To validate the approach, we present a sample data-centre application that was built using connectors synthesised from resource-definition policies.
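The abstracts above do not reproduce the policy notation, so the following is a hypothetical Python sketch of what a resource-definition policy could capture and how a connector might be derived from it at run time; all names and fields are illustrative assumptions, not the authors' framework.

# Hypothetical sketch only: the abstract does not give the policy notation, so the
# class names, fields and example values below are assumptions for illustration.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ResourceDefinitionPolicy:
    """Declares a new resource that an autonomic manager should expose, so that a
    connector to higher-level managers can be synthesised at run time."""
    resource_name: str                                      # e.g. "dataCentreCluster"
    exposed_attributes: Dict[str, str]                      # attribute name -> type
    derived_from: List[str] = field(default_factory=list)   # lower-level resources aggregated

def synthesise_connector(policy: ResourceDefinitionPolicy) -> dict:
    """Toy stand-in for run-time connector synthesis: builds a descriptor that a
    higher-level autonomic manager could use to monitor the new resource."""
    return {
        "resource": policy.resource_name,
        "signals": sorted(policy.exposed_attributes),
        "sources": policy.derived_from,
    }

if __name__ == "__main__":
    policy = ResourceDefinitionPolicy(
        resource_name="dataCentreCluster",
        exposed_attributes={"utilisation": "float", "powerDraw": "float"},
        derived_from=["server01", "server02"],
    )
    print(synthesise_connector(policy))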
Abstract:
This investigation looks critically at conventional magnetic lenses in the light of present-day technology, with the aim of advancing electron microscopy in its broadest sense. By optimising the cooling arrangements and heat-transfer characteristics of lens windings it was possible to increase substantially the current density in the winding and achieve a large reduction in the size of conventional magnetic electron lenses. Following investigations into the properties of solenoidal lenses, a new type of lens with only one pole-piece was developed. The focal properties of such lenses, which differ considerably from those of conventional lenses, have been derived from a combination of mathematical models and experimentally measured axial flux density distributions. These properties can be profitably discussed with reference to "half-lenses". Miniature conventional twin pole-piece lenses and the proposed radial-field single pole-piece lenses have been designed and constructed, and both types of lens have been evaluated by constructing miniature electron-optical columns. A miniature experimental transmission electron microscope (TEM), a miniature scanning electron microscope (SEM) and a scanning transmission electron microscope (STEM) have been built. A single pole-piece miniature one-million-volt projector lens of only 10 cm diameter and weighing 2.1 kg was designed, built and tested at 1 million volts in a commercial electron microscope. Preliminary experiments indicate that in single pole lenses it is possible to extract secondary electrons from the specimen in spite of the presence of the magnetic field of the probe-forming lens. This may well be relevant for the SEM, in which it is desirable to examine a large specimen at a moderately good resolution.
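As standard electron-optics background (not material specific to this thesis), the focal properties referred to are conventionally obtained by integrating the paraxial ray equation through the measured axial flux density B_z(z); in the weak-lens limit this also gives a closed-form focal length:

\begin{align}
  \frac{d^2 r}{dz^2} + \frac{e}{8 m_e V_r}\,B_z^2(z)\,r &= 0
  && \text{(paraxial ray equation, rotationally symmetric field)} \\
  \frac{1}{f} &\approx \frac{e}{8 m_e V_r}\int_{-\infty}^{\infty} B_z^2(z)\,dz
  && \text{(weak-lens focal length)}
\end{align}

Here r(z) is the radial displacement of the ray and V_r the (relativistically corrected) accelerating voltage; for a single pole-piece lens the axial field distribution is markedly asymmetric, which is why its properties are discussed with reference to "half-lenses" above.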
Abstract:
This thesis attempts a psychological investigation of hemispheric functioning in developmental dyslexia. Previous work using neuropsychological methods with developmental dyslexics is reviewed, and original work is presented both of a conventional psychometric nature and also utilising a new means of intervention. At the inception of inquiry into dyslexia, comparisons were drawn between developmental dyslexia and acquired alexia, promoting a model of brain damage as the common cause. Subsequent investigators found developmental dyslexics to be neurologically intact, and so an alternative hypothesis was offered, namely that language is abnormally localised (not in the left hemisphere). Research in the last decade, using the advanced techniques of modern neuropsychology, has indicated that developmental dyslexics are probably left-hemisphere dominant for language. The development of a new type of pharmaceutical preparation (that appears to have a left-hemisphere effect) offers an opportunity to test the experimental hypothesis. This hypothesis propounds that most dyslexics are left-hemisphere language dominant, but that some of these language-related operations are dysfunctioning. The methods utilised are those of psychological assessment of cognitive function, both in a traditional psychometric situation and with a new form of intervention (Piracetam). The information resulting from intervention will be judged on its therapeutic validity and its contribution to the understanding of hemispheric functioning in dyslexics. The experimental studies using conventional psychometric evaluation revealed a dyslexic profile of poor sequencing and name-coding ability, with adequate spatial and verbal reasoning skills. Neuropsychological information would tend to suggest that this profile is indicative of adequate right-hemisphere abilities and deficits in some left-hemisphere abilities. When an intervention agent (Piracetam) was used with young adult dyslexics there were improvements in both the rate of acquisition and the conservation of verbal learning. An experimental study with dyslexic children revealed that Piracetam appeared to improve reading, writing and sequencing, but did not influence spatial abilities. This would seem to accord with other recent findings that developmental dyslexics may have left-hemisphere language localisation, although some of these language-related abilities are dysfunctioning.
Abstract:
Current analytical assay methods for ampicillin sodium and cloxacillin sodium are discussed and compared, with High Performance Liquid Chromatography (H.P.L.C.) chosen as the most accurate, specific and precise. New H.P.L.C. methods are described for the analysis of benzathine cloxacillin; benzathine penicillin V; procaine penicillin injection B.P.; benethamine penicillin injection, fortified B.P.C.; benzathine penicillin injection; benzathine penicillin injection, fortified B.P.C.; benzathine penicillin suspension; ampicillin syrups and penicillin syrups. Mechanical or chemical damage to column packings is often associated with H.P.L.C. analysis. One type, that of channel formation, is investigated. The high linear velocity of solvent and solvent pulsing during the pumping cycle were found to be the cause of this damage. The applicability of nonisothermal kinetic experiments to penicillin V preparations, including formulated paediatric syrups, is evaluated. A new type of nonisothermal analysis, based on slope estimation and using a 64K Random Access Memory (R.A.M.) microcomputer, is described; the program written for this analysis is named NONISO. The distribution of active penicillin in granules for reconstitution into ampicillin and penicillin V syrups, and its effect on the stability of the reconstituted products, are investigated. Changing the diluent used to reconstitute the syrups was found to affect the stability of the product. The dissolution and stability of benzathine cloxacillin at pH 2, pH 6 and pH 9 are described, with proposed dissolution mechanisms and kinetic analysis to support these mechanisms. Benzathine and cloxacillin were found to react in solution at pH 9, producing an insoluble amide.
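The abstract does not describe NONISO's algorithm, so the following is only a hedged Python sketch of the general slope-estimation approach to nonisothermal kinetics it alludes to, assuming first-order degradation with Arrhenius temperature dependence; every function and parameter here is an illustrative assumption, not the thesis program.

# Illustrative sketch only: assumes first-order degradation C' = -k(T) C with
# k(T) = A exp(-Ea / (R T)); local slopes of ln C vs t give k at the recorded
# temperature, and an Arrhenius fit then recovers Ea and A.
import numpy as np

R = 8.314  # J mol^-1 K^-1

def slope_based_arrhenius(t, conc, temp_K, window=5):
    """Estimate local d(ln C)/dt by least squares over a sliding window, pair each
    slope with the mid-window temperature, then fit ln k = ln A - Ea/(R T)."""
    lnC = np.log(conc)
    ks, Ts = [], []
    for i in range(len(t) - window):
        slope = np.polyfit(t[i:i + window], lnC[i:i + window], 1)[0]
        ks.append(-slope)                      # first order: slope of ln C is -k
        Ts.append(temp_K[i + window // 2])
    ks, Ts = np.array(ks), np.array(Ts)
    a, b = np.polyfit(1.0 / Ts, np.log(ks), 1)
    return -a * R, np.exp(b)                   # (Ea in J/mol, pre-exponential A)

if __name__ == "__main__":
    # Synthetic nonisothermal run: linear temperature ramp, known Ea and A.
    t = np.linspace(0, 3600, 200)              # s
    T = 298 + 0.01 * t                         # K
    Ea_true, A_true = 80e3, 1e9
    k = A_true * np.exp(-Ea_true / (R * T))
    conc = np.exp(-np.cumsum(np.concatenate(([0.0], k[:-1] * np.diff(t)))))
    Ea_est, A_est = slope_based_arrhenius(t, conc, T)
    print(f"Ea ~ {Ea_est/1e3:.1f} kJ/mol, A ~ {A_est:.2e} s^-1")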
Abstract:
Following a scene-setting introduction are detailed reviews of the relevant scientific principles, of thermal analysis as a research tool, and of the development of the zinc-aluminium family of alloys. A recently introduced simultaneous thermal analyser, the STA 1500, whose use for differential thermal analysis (DTA) is central to the investigation, is described, together with the sources of supporting information: chemical analysis, scanning electron microscopy, ingot cooling curves and fluidity spiral castings. The compositions of the alloys tested were drawn from the binary zinc-aluminium system, the ternary zinc-aluminium-silicon system at 30%, 50% and 70% aluminium levels, binary and ternary alloys with additions of copper and magnesium to simulate commercial alloys, and five widely used commercial alloys. Each alloy was shotted to provide the smaller, 100 mg, representative sample required for DTA. The STA 1500 was characterised and calibrated with commercially pure zinc, and an experimental procedure established for the determination of DTA heating curves at 10°C per minute and cooling curves at 2°C per minute. Phase change temperatures were taken from the DTA traces, most importantly the liquidus from a cooling curve and the solidus from both heating and cooling curves. The accepted zinc-aluminium binary phase diagram was endorsed, with the added detail that the eutectic lies at 5.2% aluminium rather than 5.0%. The ternary eutectic trough was found to run through the points 70% Al, 7.1% Si, 545°C; 50% Al, 3.9% Si, 520°C; and 30% Al, 1.4% Si, 482°C. The dendrite arm spacing in samples after DTA increased with increasing aluminium content, from 130 μm at 30% aluminium to 220 μm at 70%. The smallest dendrite arm spacing, 60 μm, was found in the 30% aluminium, 2% silicon alloy. A 1 kg ingot of the 10% aluminium binary alloy, insulated with Kaowool, solidified at the same 2°C per minute rate as the DTA samples. A similar-sized sand casting solidified at 3°C per minute and a chill casting at 27°C per minute. During metallographic examination the following features were observed: a heavily cored phase which decomposed into ' and '' on cooling; needles of the intermetallic phase FeAl4; copper-containing ternary eutectic; and a copper-rich T phase.
Abstract:
Tonal, textural and contextual properties are used in the manual photointerpretation of remotely sensed data. This study has used these three attributes to produce a lithological map of semi-arid northwest Argentina by semi-automatic computer classification of remotely sensed data. Three different types of satellite data were investigated: LANDSAT MSS, TM and SIR-A imagery. Supervised classification using tonal features only produced poor results: LANDSAT MSS gave classification accuracies in the range of 40 to 60%, while accuracies of 50 to 70% were achieved using LANDSAT TM data. The addition of SIR-A data produced increases in classification accuracy. The higher classification accuracy of TM over MSS arises from the better discrimination of geological materials afforded by the middle-infrared bands of the TM sensor. The maximum likelihood classifier consistently produced classification accuracies 10 to 15% higher than either the minimum-distance-to-means or the decision tree classifier; this improved accuracy was obtained at the cost of greatly increased processing time. A new type of classifier, the spectral shape classifier, which is computationally as fast as a minimum-distance-to-means classifier, is described. However, the results for this classifier were disappointing, being lower in most cases than those of the minimum-distance or decision tree procedures. The classification results using only tonal features were felt to be unacceptably poor, so textural attributes were investigated. Texture is an important attribute used by photogeologists to discriminate lithology. In the case of TM data, texture measures were found to increase the classification accuracy by up to 15%; in the case of the LANDSAT MSS data, however, texture measures did not provide any significant increase in accuracy. For TM data, second-order texture, especially the SGLDM-based measures, produced the highest classification accuracy. Contextual post-processing was found to increase classification accuracy and to improve the visual appearance of the classified output by removing isolated misclassified pixels which tend to clutter classified images. Simple contextual features, such as mode filters, were found to outperform more complex features such as the gravitational filter or minimal-area replacement methods; generally, the larger the filter, the greater the increase in accuracy. Production rules were used to build a knowledge-based system which used tonal and textural features to identify sedimentary lithologies in each of the two test sites. The knowledge-based system was able to identify six out of ten lithologies correctly.
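As a concrete illustration of the simplest of the supervised procedures compared above, here is a minimal minimum-distance-to-means classifier for multispectral pixels (tonal features only); the data layout and example values are assumptions, not the study's data.

# Minimal sketch (assumed data layout, not the thesis code): a minimum-distance-to-means
# classifier assigns each pixel to the class whose mean spectrum is nearest.
import numpy as np

def train_class_means(pixels, labels):
    """pixels: (n, bands) training spectra; labels: (n,) class ids.
    Returns the class ids and one mean spectrum per class."""
    classes = np.unique(labels)
    return classes, np.stack([pixels[labels == c].mean(axis=0) for c in classes])

def classify_min_distance(image, classes, means):
    """image: (rows, cols, bands). Labels each pixel with the class whose mean is
    nearest in Euclidean distance (no texture or context used)."""
    flat = image.reshape(-1, image.shape[-1])
    d = np.linalg.norm(flat[:, None, :] - means[None, :, :], axis=2)
    return classes[d.argmin(axis=1)].reshape(image.shape[:2])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy training set: 3 classes, 20 samples each, 4 spectral bands.
    train = rng.normal(size=(60, 4)) + np.repeat(np.arange(3), 20)[:, None]
    labels = np.repeat(np.arange(3), 20)
    classes, means = train_class_means(train, labels)
    scene = rng.normal(size=(8, 8, 4)) + 2.0   # toy "image", closest to class 2
    print(classify_min_distance(scene, classes, means))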
Abstract:
This thesis includes analysis of disordered spin ensembles corresponding to Exact Cover, a multi-access channel problem, and composite models combining sparse and dense interactions. The satisfiability problem in Exact Cover is addressed through a statistical analysis of a simple branch-and-bound algorithm. In the large-system limit the algorithm can be formulated as a branching process, for which critical properties can be analysed. Far from the critical point a set of differential equations may be used to model the process, and these are solved by numerical integration and by exact bounding methods. The multi-access channel problem is formulated as an equilibrium statistical physics problem for the case of bit transmission on a channel with power control and synchronisation. A sparse code-division multiple-access method is considered; its optimal detection properties are examined in the typical case by use of the replica method and compared with the detection performance achieved by iterative decoding methods. These codes are found to exhibit phenomena closely resembling those of the well-understood dense codes. The composite model is introduced as an abstraction of canonical sparse and dense disordered spin models; it includes couplings due to both dense and sparse topologies simultaneously. The new type of code is shown to outperform sparse and dense codes in some regimes, both in optimal performance and in the performance achieved by iterative detection methods in finite systems.
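The thesis's own equations are not given in the abstract; as a reminder of the standard background it builds on, the critical behaviour of a branching process is controlled by the mean offspring number:

\begin{equation}
  m = \sum_{k \ge 0} k\,p_k, \qquad G(s) = \sum_{k \ge 0} p_k\,s^k ,
\end{equation}

with extinction certain for m \le 1 (excluding the trivial case in which every node has exactly one offspring) and, for m > 1, survival probability 1 - q, where q is the smallest root of q = G(q) in [0, 1].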
Abstract:
This study proposes a new type of greenhouse for water re-use and energy saving in agriculture in arid and semi-arid inland regions affected by groundwater salinity. It combines desalination by reverse osmosis (RO), re-use of the saline concentrate rejected by the RO unit for cooling, and rainwater harvesting. Experimental work was carried out at GBPUAT, Pantnagar, India. Saline concentrate was fed to the evaporative cooling pads of the greenhouse and found to evaporate at rates similar to conventional freshwater. Two enhancements to the system are described: i) a jet pump, designed and tested to use the pressurized reject stream to re-circulate cooling water and thus maintain uniform wetness in the cooling pads, was found capable of multiplying the flow of cooling water by a factor of 2.5 to 4 while lifting water to a head of 1.55 m; and ii) solar power is used to drive the ventilation fans of the greenhouse, for which an electronic circuit has been produced that uses maximum power-point tracking to maximize energy efficiency. Re-use of the RO reject concentrate for cooling saves water (6 L d⁻¹ per m² of greenhouse floor area), and the improved fan drive could reduce electricity consumption by a factor of 8.
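As a quick worked illustration of the headline figures quoted above (the greenhouse floor area below is an assumed example, not a value reported by the study):

# Worked illustration using the abstract's figures; the 500 m2 floor area is an
# assumed example, not a value from the study.
WATER_SAVING_L_PER_DAY_PER_M2 = 6      # re-use of RO concentrate in the cooling pads
FAN_POWER_REDUCTION_FACTOR = 8         # MPPT-driven solar ventilation fans
JET_PUMP_FLOW_MULTIPLIER = (2.5, 4.0)  # cooling-water circulation gain at 1.55 m head

floor_area_m2 = 500                    # assumed greenhouse size
daily_saving_litres = WATER_SAVING_L_PER_DAY_PER_M2 * floor_area_m2
print(f"Assumed {floor_area_m2} m2 greenhouse saves ~{daily_saving_litres} L/day "
      f"({daily_saving_litres / 1000:.1f} m3/day) of fresh water.")
print(f"Jet pump multiplies cooling-water flow by {JET_PUMP_FLOW_MULTIPLIER[0]}-"
      f"{JET_PUMP_FLOW_MULTIPLIER[1]}x; fan electricity cut by a factor of "
      f"{FAN_POWER_REDUCTION_FACTOR}.")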
Abstract:
This article focuses on one type of institutional change: conversion. One innovative approach to institutional change, the “political-coalitional approach”, acknowledges that institutions can have unintended effects, which may privilege certain groups over others; that institutions are often created and sustained through compromise with external actors; and that institutions' external context can vary significantly over time, as different coalitions' power waxes and wanes. This approach helps explain the conversion of one institution drawn from the UK National Health Service, the National Reporting and Learning System. However, the shift of this system from producing formative information that facilitates learning and promotes safer care, towards producing summative information that supports resource-allocation decisions, cannot be explained merely by examining the actions of external power coalitions. An internal focus, which considers factors that are normally viewed as “organisational” (such as leadership and internal stability), is also required.
Abstract:
This investigation originated from work by Dr. A.H. McIlraith of the National Physical Laboratory who, in 1966, described a new type of charged particle oscillator. This makes use of two equal cylindrical electrodes to constrain the particles in such a way that they follow extremely long oscillatory paths between the electrodes under the influence of an electrostatic field alone. The object of this work has been to study the principle of the oscillator in detail and to investigate its properties and applications. Any device which is capable of creating long electron trajectories has potential application in the field of ultra-high-vacuum technology; a critical review of the problems associated with the production and measurement of ultra-high vacuum was therefore considered relevant in the initial stages of the work. The oscillator has been applied with a considerable degree of success as a high-energy electrostatic ion source, which offers several advantages over existing ion sources: it can be operated at much lower pressures and without the need for a magnetic field. The oscillator principle has also been applied as a thermionic ionization gauge and compared with other ionization gauges at pressures down to 5 × 10⁻¹¹ torr; this new gauge exhibited a number of advantages over most existing gauges. Finally, the oscillator has been used in an evaporation ion pump and has exhibited fairly high pumping speeds for argon relative to those for nitrogen. This investigation supports the original work of Dr. A.H. McIlraith and shows that his proposed oscillator has considerable potential in the fields of vacuum technology and electron physics.