Abstract:
Getsinger's original charts for parallel coupled bars between parallel plates have been reformulated for the microstrip case. Corresponding charts relating the capacitances to the width, spacing, thickness and height above the ground plane of coupled microstrips have been obtained. Examples of the use of these charts are shown in the design of hairpin line and hybrid hairpin line filters, as well as multiplexers using microstrip comb line filters. The hairpin line/hybrid hairpin line filters were designed to operate at a central frequency of 9.5 GHz with 11 per cent bandwidth and 0.5 dB ripple. The three filters constituting the comb line multiplexer have centre frequencies of 2.4, 3.0 and 3.6 GHz. The components so designed were fabricated and tested; the dielectric used for the microstrip was Teflon. Experimental curves for the attenuation (insertion loss) and VSWR are given, and the experimental results satisfy the design specifications quite well.
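The quoted passband (centre frequency 9.5 GHz, 11 per cent fractional bandwidth) fixes the band edges by simple arithmetic. A minimal sketch of that conversion; the helper name is ours, and only the two numbers come from the abstract:

```python
def band_edges(f0_ghz, frac_bw):
    """Lower and upper band edges (GHz) for a given centre frequency and
    fractional bandwidth, taking the edges symmetric about the centre."""
    half = f0_ghz * frac_bw / 2.0
    return f0_ghz - half, f0_ghz + half

# 9.5 GHz centre with 11 per cent bandwidth -> roughly 8.98 to 10.02 GHz
lo, hi = band_edges(9.5, 0.11)
```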
Abstract:
In the Himalaya, a large area is covered by glaciers and seasonal snow, and changes in their extent can influence the availability of water in the Himalayan rivers. In this paper, changes in glacial extent, glacial mass balance and seasonal snow cover are discussed. Field- and satellite-based investigations suggest that most Himalayan glaciers are retreating, though the rate of retreat varies from glacier to glacier, ranging from a few metres to almost 50 metres per year, depending on numerous glacial, terrain and meteorological parameters. Retreat was estimated for 1868 glaciers in eleven basins distributed across the Indian Himalaya from 1962 to 2001/02. Estimates show an overall reduction in glacier area from 6332 to 5329 sq km, an overall deglaciation of 16 per cent. The snow line at the end of the ablation season on the Chhota Shigri glacier suggests a change in altitude from 4900 to 5200 m from the late 1970s to the present. Seasonal snow cover monitoring of the Himalaya has shown large amounts of snow cover depletion in the early part of winter, i.e. from October to December. For many basins located at lower altitude and south of the Pir Panjal range, snow ablation was observed throughout the winter season. In addition, the average stream runoff of the Baspa basin during December shows an increase of 75 per cent. This combination of glacial retreat, negative mass balance, early melting of seasonal snow cover and wintertime increase in stream runoff suggests an influence of climate change on the Himalayan cryosphere.
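The 16 per cent deglaciation figure above follows directly from the quoted areas. A one-line check (the function name is ours; the two areas are from the abstract):

```python
def percent_loss(initial, final):
    """Percentage reduction from an initial to a final value."""
    return (initial - final) / initial * 100.0

# 6332 -> 5329 sq km between 1962 and 2001/02 is roughly a 16 per cent loss
loss = percent_loss(6332.0, 5329.0)
```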
Abstract:
The questions that one should answer in engineering computations - deterministic, probabilistic/randomized, as well as heuristic - are (i) how good the computed results/outputs are and (ii) how much the outputs cost in terms of the amount of computation and the amount of storage used to obtain them. The absolutely error-free quantities, as well as the completely errorless computations carried out in a natural process, can never be captured by any means at our disposal. While the computations in nature/natural processes, including the input real quantities, are exact, all the computations that we do using a digital computer, or that are carried out in an embedded form, are never exact. The input data for such computations are also never exact, because any measuring instrument has an inherent error of a fixed order associated with it, and this error, as a matter of hypothesis and not as a matter of assumption, is not less than 0.005 per cent. Here by error we mean relative error bounds. The fact that the exact error is never known, under any circumstances and in any context, implies that the term error is nothing but error-bounds. Further, in engineering computations it is the relative error or, equivalently, the relative error-bounds (and not the absolute error) that is supremely important in providing information about the quality of the results/outputs. Another important fact is that inconsistency and/or near-inconsistency in nature, i.e. in problems created from nature, is completely nonexistent, while in our modelling of natural problems we may introduce inconsistency or near-inconsistency due to human error, due to the inherent non-removable error associated with any measuring device, or due to assumptions introduced to make the problem solvable or more easily solvable in practice. Thus, if we discover any inconsistency, or possibly any near-inconsistency, in a mathematical model, it is certainly due to any or all of the three foregoing factors.
We do, however, go ahead and solve such inconsistent/near-inconsistent problems, and we obtain results that could be useful in real-world situations. The talk considers several deterministic, probabilistic, and heuristic algorithms in numerical optimisation, in other numerical and statistical computations, and in PAC (probably approximately correct) learning models. It highlights the quality of the results/outputs by specifying relative error-bounds along with the associated confidence level, and the cost, viz. the amount of computation and of storage, through complexity. It points out the limitations of error-free computation (wherever it is possible, i.e., where the number of arithmetic operations is finite and known a priori) as well as of the usage of interval arithmetic. Further, the interdependence among the error, the confidence, and the cost is discussed.
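To make the role of relative error-bounds concrete: to first order, the relative bounds of measured quantities add when the quantities are multiplied. A minimal sketch, not from the talk itself, using the 0.005 per cent instrument bound stated above for each of two inputs:

```python
def combined_rel_bound(*rel_bounds):
    """First-order relative error-bound of a product of measured quantities:
    relative bounds simply add under multiplication."""
    return sum(rel_bounds)

# Two inputs, each measured with a relative error-bound of 0.005 per cent,
# give a product whose relative error-bound is about 0.01 per cent.
r = combined_rel_bound(0.00005, 0.00005)
```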
Abstract:
Background & objectives: There is a need to develop an affordable and reliable tool for hearing screening of neonates in resource-constrained, medically underserved areas of developing nations. This study evaluates a strategy of health-worker-based screening of neonates using a low-cost mechanical calibrated noisemaker, followed up with parental monitoring of age-appropriate auditory milestones, for detecting severe-profound hearing impairment in infants by 6 months of age. Methods: A trained health worker, under the supervision of a qualified audiologist, screened 425 neonates, of whom 20 had confirmed severe-profound hearing impairment. Mechanical calibrated noisemakers of 50, 60, 70 and 80 dB (A) were used to elicit behavioural responses. The parents of screened neonates were instructed to monitor the normal language and auditory milestones till 6 months of age. This strategy was validated against a reference standard consisting of a battery of tests, namely auditory brain stem response (ABR), otoacoustic emissions (OAE) and behavioural assessment at 2 years of age. Bayesian prevalence-weighted measures of screening were calculated. Results: The sensitivity and specificity were high, with the fewest false positive referrals for the 70 and 80 dB (A) noisemakers. All the noisemakers had 100 per cent negative predictive value. The 70 and 80 dB (A) noisemakers had high positive likelihood ratios of 19 and 34, respectively. The differences between pre- and post-test positive probabilities were 43 and 58 per cent for the 70 and 80 dB (A) noisemakers, respectively. Interpretation & conclusions: In a controlled setting, health workers with primary education can be trained to use a mechanical calibrated noisemaker made of locally available material to reliably screen for severe-profound hearing loss in neonates. The monitoring of auditory responses could be done by informed parents.
Multi-centre field trials of this strategy need to be carried out to examine the feasibility of community health care workers using it in resource-constrained settings of developing nations to implement an effective national neonatal hearing screening programme.
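The prevalence-weighted figures above can be reproduced with the standard odds form of Bayes' rule: pre-test odds times LR+ gives post-test odds. A sketch using the study's numbers (20 confirmed cases among 425 neonates, LR+ of 19 and 34); the function name is ours:

```python
def post_test_probability(prevalence, lr_positive):
    """Post-test probability of disease after a positive screen, from the
    pre-test prevalence and the positive likelihood ratio (odds form)."""
    pre_odds = prevalence / (1.0 - prevalence)
    post_odds = pre_odds * lr_positive
    return post_odds / (1.0 + post_odds)

prev = 20.0 / 425.0                       # pre-test probability, about 4.7%
p70 = post_test_probability(prev, 19.0)   # 70 dB(A) noisemaker
p80 = post_test_probability(prev, 34.0)   # 80 dB(A) noisemaker
# The pre/post differences come out near 43 and 58 percentage points,
# matching the values reported in the abstract.
```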
Abstract:
This article presents studies conducted on turbocharged producer gas engines originally designed for natural gas (NG) as the fuel. Producer gas is used as the fuel; its properties, such as stoichiometric ratio, calorific value, laminar flame speed, adiabatic flame temperature, and related parameters, differ from those of NG. Two engines having similar turbochargers are evaluated for performance. Detailed measurements of the mass flow rates of fuel and air, and of the pressures and temperatures at various locations on the turbocharger, were carried out. On both engines, the pressure ratio across the compressor was measured to be 1.40 +/- 0.05 and the density ratio across the turbocharger with after-cooler to be 1.35 +/- 0.05. Thermodynamic analysis of the data on both engines suggests a compressor efficiency of 70 per cent. The specific energy consumption at peak load is found to be 13.1 MJ/kWh with producer gas as the fuel. Compared with the naturally aspirated mode, the mass flow and the peak load in the turbocharged after-cooled condition increased by 35 per cent and 30 per cent, respectively. The pressure ratios obtained with NG and producer gas are compared with the corrected mass flow on the compressor map.
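A compressor efficiency of this kind is typically an isentropic efficiency inferred from the measured pressure ratio and temperature rise. A minimal sketch of that estimate, assuming standard air properties and illustrative inlet/outlet temperatures; only the 1.40 pressure ratio comes from the abstract:

```python
GAMMA = 1.4  # ratio of specific heats for air (assumed)

def compressor_efficiency(pr, t_in_k, t_out_k):
    """Isentropic compressor efficiency: ideal temperature rise for the
    measured pressure ratio, divided by the actual temperature rise."""
    t_out_ideal = t_in_k * pr ** ((GAMMA - 1.0) / GAMMA)
    return (t_out_ideal - t_in_k) / (t_out_k - t_in_k)

# Illustrative temperatures chosen to land near the 70 per cent figure
eta = compressor_efficiency(1.40, 300.0, 343.0)
```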
Abstract:
This article addresses the adaptation of a low-power natural gas engine to use producer gas as fuel. The 5.9 L natural gas engine, with a compression ratio of 10.5:1 and rated at 55 kW shaft power, delivered 30 kW using producer gas as fuel in the naturally aspirated mode. The optimal ignition timing for peak power was found to be 20 degrees before top dead centre. The air-to-fuel ratio (A/F) was found to be 1.2 +/- 0.1 over a range of loads. Critical evaluation of the energy flows in the engine resulted in identifying losses and optimizing the engine cooling. The specific fuel consumption was found to be 1.2 +/- 0.1 kg of biomass per kilowatt hour. A reduction of 40 per cent in brake mean effective pressure was observed compared with natural gas operation. The governor response to load variations was studied with respect to frequency recovery time. The study also attempts to adapt a turbocharger for higher power output; preliminary results suggest a possible increase of about 30 per cent in the output.
Abstract:
The use of high-velocity sheet-forming techniques, where strain rates exceed 10^2/s, can help solve many problems that are difficult to overcome with traditional metal-forming techniques. In this investigation, thin metallic plates/foils were subjected to shock wave loading in a newly developed diaphragmless shock tube. The conventional shock tube used in aerodynamic applications employs a metal diaphragm to generate shock waves. This method of operation has its own disadvantages, including problems with the repeatable and reliable generation of shock waves; moreover, in an industrial scenario, changing metal diaphragms after every shot is not desirable. Hence, a diaphragmless shock tube is calibrated and used in this study. Shock Mach numbers up to 3 can be generated with a high degree of repeatability (+/- 4 per cent) for the pressure jumps across the primary shock wave; the shock Mach number scatter is within +/- 1.5 per cent. Copper, brass, and aluminium plates of diameter 60 mm and thickness varying from 0.1 to 1 mm are used, with peak over-pressures on the plates ranging from 1 to 10 bar. The midpoint deflection and the circumferential, radial, and thickness strains are measured, and from these the von Mises strain is also calculated. The experimental results are compared with numerical values obtained using finite element analysis, and the two match well. The plastic hinge effect was also observed in the finite element simulations. Analysis of the failed specimens shows that the aluminium plates had mode I failure, whereas the copper plates had mode II failure.
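The von Mises (equivalent) strain mentioned above can be computed from the three measured principal strains. A minimal sketch using the common equivalent plastic strain formula under the assumption of volume constancy; the strain values are illustrative, not results from the paper:

```python
import math

def von_mises_strain(e_r, e_c, e_t):
    """Equivalent (von Mises) strain from the radial, circumferential and
    thickness principal strains, using the standard 2/3 deviatoric form."""
    return math.sqrt(2.0 / 3.0 * (e_r**2 + e_c**2 + e_t**2))

e_r, e_c = 0.05, 0.05   # illustrative radial and circumferential strains
e_t = -(e_r + e_c)      # thickness strain from volume constancy (assumed)
eq = von_mises_strain(e_r, e_c, e_t)
```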
Abstract:
The implementation of semiconductor circuits and systems in nanotechnology makes it possible to achieve higher speed, lower voltage levels and smaller area. An unintended and undesirable result of this scaling is that it makes integrated circuits susceptible to soft errors, normally caused by alpha particle or neutron hits. These radiation strikes resulting in bit upsets, referred to as single event upsets (SEU), are of increasing concern for reliable circuit operation in the field. Storage elements are worst hit by this phenomenon. As we scale down further, there is greater interest in the reliability of circuits and systems, apart from the performance, power and area aspects. In this paper we propose an improved 12T SEU-tolerant SRAM cell design. The proposed SRAM cell is economical in terms of area overhead and is easier to fabricate than earlier designs. Simulation results show that the proposed cell is highly robust, as it does not flip even for a transient pulse with 62 times the Q(crit) of a standard 6T SRAM cell.
Abstract:
We examine a natural, but non-tight, reductionist security proof for deterministic message authentication code (MAC) schemes in the multi-user setting. If security parameters for the MAC scheme are selected without accounting for the non-tightness in the reduction, then the MAC scheme is shown to provide a level of security that is less than desirable in the multi-user setting. We find similar deficiencies in the security assurances provided by non-tight proofs when we analyze some protocols in the literature, including ones for network authentication and aggregate MACs. Our observations call into question the practical value of non-tight reductionist security proofs. We also exhibit attacks on authenticated encryption schemes, disk encryption schemes, and stream ciphers in the multi-user setting.
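The cost of non-tightness discussed above is easy to quantify in the standard way: a reduction that loses a multiplicative factor of n (say, the number of users) degrades the effective security level by log2(n) bits. An illustrative sketch; the numbers are assumptions, not figures from the paper:

```python
import math

def effective_security_bits(key_bits, loss_factor):
    """Effective security level after a multiplicative loss in the reduction:
    each factor of 2 in the loss costs one bit of security."""
    return key_bits - math.log2(loss_factor)

# A 128-bit MAC key with a reduction loss equal to 2^30 users leaves
# only about 98 bits of effective multi-user security.
bits = effective_security_bits(128, 2**30)
```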
Abstract:
The acoustical behaviour of an elliptical chamber muffler having a side inlet and side outlet port is analyzed in this paper, wherein a uniform velocity piston source is assumed to model the 3-D acoustic field in the elliptical chamber cavity. Towards this end, we consider the modal expansion of the acoustic pressure field in the elliptical cavity in terms of the angular and radial Mathieu functions, subject to the rigid wall condition. Then, the Green's function due to the point source located on the side (curved) surface of the elliptical chamber is obtained. On integrating this function over the elliptical piston area on the curved surface of the chamber and subsequently dividing by the area of the elliptic piston, one obtains the acoustic pressure field due to the piston-driven source, which is equivalent to considering plane wave propagation in the side ports. Thus, one can obtain the acoustic pressure response functions, i.e., the impedance matrix (Z) parameters due to the sources (ports) located on the side surface, from which one may also obtain a progressive wave representation in terms of the scattering matrix (S). Finally, the acoustic performance of the muffler is evaluated in terms of the transmission loss (TL), which is computed from the scattering parameters. The effect of the axial length of the muffler and the angular location of the ports on the TL characteristics is studied in detail. Acoustically long chambers show dominant axial plane wave propagation, while the TL spectrum of short chambers indicates the dominance of the transversal modes. The 3-D analytical results are compared with 3-D FEM simulations carried out on a commercial software package and are shown to be in excellent agreement, thereby validating the analytical procedure suggested in this work.
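Once the scattering matrix is in hand, the final TL step above is a one-line formula: for identical, anechoically terminated ports, TL = -20 log10 |S21|. A minimal sketch; the |S21| value is illustrative, not a result from the paper:

```python
import math

def transmission_loss_db(s21_mag):
    """Transmission loss (dB) from the magnitude of the transmission
    scattering parameter S21, assuming identical anechoic terminations."""
    return -20.0 * math.log10(s21_mag)

# If only one tenth of the incident wave amplitude is transmitted,
# the muffler provides 20 dB of transmission loss.
tl = transmission_loss_db(0.1)
```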
Abstract:
The objective of this paper is to empirically evaluate a framework for designing – GEMS of SAPPhIRE as req-sol – to check whether it supports design for variety and novelty. A set of observational studies is designed in which three teams of two designers each solve three different design problems in the following order: without any support, using the framework, and using a combination of the framework and a catalogue. Results from the studies reveal that both the variety and the novelty of the concept space increase with the use of the framework, or of the framework and the catalogue. However, the number of concepts and the time taken by the designers decrease with the use of the framework, or of the framework and the catalogue. Based on the results and the interview sessions with the designers, an interactive framework for designing, to be supported on a computer, is proposed as future work.
Abstract:
The role of the computer in design has expanded from modeling and analyzing concepts (ideas) to generating them. Research into methods for supporting conceptual design using automated synthesis has attracted much attention in the past decades. To find out how designers synthesize solution concepts for multi-state mechanical devices, ten experimental studies were conducted. Observations from these empirical studies are used as the basis for developing the knowledge involved in the multi-state design synthesis process. In this paper, we propose a computational representation for expressing the multi-state design task and for enumerating the multi-state behaviors of kinematic pairs and mechanisms. This computational representation will be used to formulate computational methods for the synthesis process, with the aim of developing a system that supports design synthesis of multiple-state mechanical devices by generating a comprehensive variety of solution alternatives.
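To give a flavour of what such a representation might look like, here is a purely illustrative sketch (the paper's actual representation is not reproduced here): a multi-state design task expressed as a mapping from states to required input/output behaviours, over which candidate behaviours can be enumerated. All names and motions below are assumptions:

```python
# Hypothetical multi-state task: each state names the required input motion
# and the required output motion of the device in that state.
device_task = {
    "state_1": {"input": "rotation", "output": "translation"},
    "state_2": {"input": "rotation", "output": "none"},
}

def states_requiring(kind, task):
    """Enumerate the states whose required output motion matches `kind`,
    as a candidate-matching step in a synthesis procedure might."""
    return [state for state, beh in task.items() if beh["output"] == kind]

matches = states_requiring("translation", device_task)
```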
Abstract:
Automated synthesis of mechanical designs is an important step towards the development of an intelligent CAD system. Research into methods for supporting conceptual design using automated synthesis has attracted much attention in the past decades. In our research, ten experimental studies were conducted to find out how designers synthesize solution concepts for multi-state mechanical devices. The designers were asked to think aloud while carrying out the synthesis, and these design synthesis processes were video recorded. It was found that modification of kinematic pairs and mechanisms is the major activity carried out by all the designers. This paper presents an analysis of these synthesis processes using configuration space and topology graphs to identify and classify the types of modifications that take place. Understanding these modification processes, and the contexts in which they happen, is crucial for developing a system for supporting design synthesis of multiple-state mechanical devices that is capable of creating a comprehensive variety of solution alternatives.