Abstract:
Displacement-amplifying compliant mechanisms (DaCMs) reported in the literature are mostly used in actuator applications. This paper considers them for sensor applications that rely on displacement measurement, and evaluates them objectively. The main goal is to increase sensitivity under several secondary requirements and practical constraints. A spring-mass-lever model that effectively captures the addition of a DaCM to a sensor is used to compare eight DaCMs. We observe that they differ significantly in performance criteria such as geometric advantage, stiffness, natural frequency, mode amplification, factor of safety against failure, and cross-axis stiffness, but none excels in all. Thus, a combined figure of merit is proposed with which the most suitable DaCM can be selected for a sensor application. A case study of a micromachined capacitive accelerometer and another of a vision-based force sensor illustrate the general evaluation and selection procedure for DaCMs in specific applications. Other insights gained from the analysis include the optimum size-scale for a DaCM, the effect on its natural frequency, limits on its stiffness, and the working range of the sensor.
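The abstract does not give the form of the combined figure of merit. As a hedged illustration, one common way to combine criteria that differ in kind is a weighted geometric mean of scores normalized to (0, 1]; the criteria names, scores, and weights below are hypothetical, not the paper's.

```python
def figure_of_merit(criteria, weights):
    """Weighted geometric mean of criteria scores normalized to (0, 1]."""
    fom = 1.0
    for name, w in weights.items():
        fom *= criteria[name] ** w
    return fom ** (1.0 / sum(weights.values()))

candidates = {
    # Hypothetical normalized scores; not taken from the paper.
    "DaCM-A": {"geometric_advantage": 0.9, "stiffness": 0.4, "natural_frequency": 0.7},
    "DaCM-B": {"geometric_advantage": 0.7, "stiffness": 0.8, "natural_frequency": 0.8},
}
weights = {"geometric_advantage": 2.0, "stiffness": 1.0, "natural_frequency": 1.0}

# Rank candidates by the combined figure of merit.
best = max(candidates, key=lambda name: figure_of_merit(candidates[name], weights))
```

A geometric mean is one reasonable choice here because a candidate that scores near zero on any single criterion is penalized heavily, matching the observation that no single DaCM excels in all criteria.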
Abstract:
This paper presents image reconstruction using the fan-beam filtered backprojection (FBP) algorithm with no backprojection weight from projection data completed by windowed linear prediction (WLP). Image reconstruction from truncated projections aims to reconstruct the object accurately from the available limited projection data. Due to the incomplete projection data, the reconstructed image contains truncation artifacts which extend into the region of interest (ROI), making the reconstructed image unsuitable for further use. Data completion techniques have been shown to be effective in such situations. We use the windowed linear prediction technique for projection completion and then use the fan-beam FBP algorithm with no backprojection weight for 2-D image reconstruction. We evaluate the quality of the image reconstructed with the fan-beam FBP algorithm with no backprojection weight after WLP completion.
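The abstract does not spell out the WLP algorithm. As a hedged sketch of the general idea behind prediction-based projection completion, the function below fits an order-p linear predictor to the known samples of a truncated projection row by least squares and extrapolates past the truncation edge; the fixed order, the fitting method, and the absence of the paper's windowing are simplifications.

```python
import numpy as np

def lp_extrapolate(signal, order, n_extra):
    """Extrapolate a 1-D truncated projection with an order-p linear
    predictor fit by least squares on the known samples (a generic
    sketch of prediction-based completion, not the paper's exact WLP)."""
    s = np.asarray(signal, dtype=float)
    # Each row of the design matrix holds `order` past samples;
    # the target is the sample that follows them.
    rows = [s[i - order:i] for i in range(order, len(s))]
    A = np.array(rows)
    b = s[order:]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Recursively predict past the truncation edge.
    out = list(s)
    for _ in range(n_extra):
        out.append(float(np.dot(coeffs, out[-order:])))
    return np.array(out)
```

On a linearly varying signal an order-2 predictor is exact (each sample is twice the previous minus the one before), so a ramp extrapolates without error; real projection data would of course behave less cleanly.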
Abstract:
This paper deals with the solution to the problem of multisensor data fusion for a single-target scenario as detected by an airborne track-while-scan radar. The details of a neural network implementation, various training algorithms based on standard backpropagation, and the results of training and testing the neural network are presented. The promising capabilities of the RPROP algorithm for multisensor data fusion over various parameters are shown in comparison to other adaptive techniques.
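RPROP adapts an individual step size for each weight from the sign of successive partial derivatives, which makes it insensitive to gradient magnitude. A minimal sketch of the update rule for a single weight follows (the iRPROP- variant with the standard η+ = 1.2, η- = 0.5; the radar fusion network itself is not reproduced here).

```python
# Standard RPROP hyperparameters; assumed, not taken from the paper.
ETA_PLUS, ETA_MINUS = 1.2, 0.5
STEP_MAX, STEP_MIN = 50.0, 1e-6

def rprop_step(w, grad, prev_grad, step):
    """One iRPROP- update for a single weight; returns (w, grad, step)."""
    if grad * prev_grad > 0:          # same sign: accelerate
        step = min(step * ETA_PLUS, STEP_MAX)
    elif grad * prev_grad < 0:        # sign flip: overshoot, slow down
        step = max(step * ETA_MINUS, STEP_MIN)
        grad = 0.0                    # skip the update after a sign change
    if grad > 0:                      # move against the gradient sign
        w -= step
    elif grad < 0:
        w += step
    return w, grad, step
```

Iterating this rule on a simple quadratic loss f(w) = w² (gradient 2w) drives the weight toward the minimum: the step grows while the gradient sign is stable and halves on each overshoot.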
Abstract:
Compacted clay liners are widely used in waste containment facilities because of their low cost, large leachate attenuation capacity, and resistance to damage and puncture. Commonly used bentonite possesses many limitations, such as high swelling and shrinkage potential and sensitivity to waste fluid characteristics. The paper proposes the use of a bentonite-sand mixture containing an optimal clay content as the liner material. It has been brought out, based on detailed geotechnical investigations, that a mixture containing only about 20 to 39% bentonite is better suited than clay alone.
Abstract:
In developing countries, a high rate of growth in the demand for electric energy is felt, so the addition of new generating units becomes inevitable. In deregulated power systems, private generating stations are encouraged to add new generation. Some of the factors considered while placing a new generating unit are availability of resources, ease of transmitting power, distance from the load centre, etc. Finding the most appropriate locations for generation expansion can be done by running repeated power flows and carrying out system studies such as analyzing the voltage profile, voltage stability, and losses. In this paper, a new methodology is proposed that mainly considers the existing network topology. A concept of T-index is introduced, which considers the electrical distances between generator and load nodes. This index is used for ranking the most significant new generation expansion locations and also indicates the amount of permissible generation that can be installed at these new locations. The concept facilitates medium- and long-term planning of power generation expansion within the available transmission corridors. Studies carried out on an EHV equivalent 10-bus system and the IEEE 30-bus system are presented for illustration.
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) offers huge potential for designing trade-offs involving energy, power, temperature and performance of computing systems. In this paper, we evaluate three different DVFS schemes - our enhancement of a Petri net performance-model-based DVFS method for sequential programs to stream programs, a simple profile-based Linear Scaling method, and an existing hardware-based DVFS method for multithreaded applications - using multithreaded stream applications in a full-system Chip Multiprocessor (CMP) simulator. From our evaluation, we find that the software-based methods achieve significant Energy/Throughput² (ET⁻²) improvements. The hardware-based scheme degrades performance heavily and suffers an ET⁻² loss. Our results indicate that the simple profile-based scheme achieves the benefits of the complex Petri net based scheme for stream programs, and they present a strong case for independent voltage/frequency control for the different cores of a CMP, which is lacking in most state-of-the-art CMPs. This is in contrast to the conclusions of a recent evaluation of per-core DVFS schemes for multithreaded applications on CMPs.
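Interpreting ET⁻² as energy divided by throughput squared (so lower is better, and "improvement" means a reduction), the comparison can be sketched as below; the energy and throughput numbers are illustrative, not the paper's measurements.

```python
# ET^-2 rewards schemes that save energy without hurting throughput:
# halving throughput must be paid for by a 4x energy saving to break even.

def et2(energy_j, throughput_ops_s):
    """Energy/throughput^2 metric; lower is better."""
    return energy_j / throughput_ops_s ** 2

def improvement(baseline, scheme):
    """Fractional ET^-2 improvement of `scheme` over `baseline`."""
    return 1.0 - et2(*scheme) / et2(*baseline)

# Illustrative (energy in joules, throughput in ops/s); not measured data.
baseline = (100.0, 10.0)       # all cores at nominal voltage/frequency
profile_dvfs = (70.0, 9.5)     # saves 30% energy for a 5% slowdown
improv = improvement(baseline, profile_dvfs)
```

In this toy example the 5% throughput loss costs about 10% on the squared term, but the 30% energy saving dominates, yielding a net ET⁻² improvement of roughly 22%.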
Abstract:
Pervasive use of pointers in large-scale real-world applications continues to make points-to analysis an important optimization enabler. Rapid growth of software systems demands a scalable pointer analysis algorithm. A typical inclusion-based points-to analysis iteratively evaluates constraints and computes a points-to solution until a fixpoint. In each iteration, (i) points-to information is propagated across directed edges in a constraint graph G and (ii) more edges are added by processing the points-to constraints. We observe that prioritizing the order in which the information is processed within each of the above two steps can lead to efficient execution of the points-to analysis. While earlier work in the literature focuses only on the propagation order, we argue that the other dimension, that is, prioritizing the constraint processing, can lead to even higher improvements in how fast the fixpoint of the points-to algorithm is reached. This becomes especially important as we prove that finding an optimal sequence for processing the points-to constraints is NP-Complete. The prioritization scheme proposed in this paper is general enough to be applied to any of the existing points-to analyses. Using the prioritization framework developed in this paper, we implement prioritized versions of Andersen's analysis, Deep Propagation, Hardekopf and Lin's Lazy Cycle Detection and Bloom Filter based points-to analysis. In each case, we report significant improvements in the analysis times (33%, 47%, 44%, 20% respectively) as well as the memory requirements for a large suite of programs, including SPEC 2000 benchmarks and five large open source programs.
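A hedged sketch of how a prioritized inclusion-based (Andersen-style) analysis can be organized: a single priority queue drives both propagation along copy edges and the derivation of new edges from load/store constraints. The priority function used here (variables with smaller points-to sets first) is only illustrative; the paper's prioritization scheme is more general than this.

```python
import heapq
from collections import defaultdict

def andersen(addr_of, copies, loads, stores):
    """Solve inclusion constraints: (p, a) in addr_of means p = &a,
    copies (p, q): p = q, loads (p, q): p = *q, stores (p, q): *p = q."""
    pts = defaultdict(set)    # points-to sets
    succ = defaultdict(set)   # copy edges q -> p: pts[q] flows into pts[p]
    for p, a in addr_of:
        pts[p].add(a)
    for p, q in copies:
        succ[q].add(p)
    work, seq = [], 0

    def push(v):
        nonlocal seq
        # Illustrative priority: smaller points-to sets processed first;
        # seq breaks ties so heap entries never compare variables.
        heapq.heappush(work, (len(pts[v]), seq, v))
        seq += 1

    for v in set(pts) | set(succ):
        push(v)
    while work:
        _, _, q = heapq.heappop(work)
        for p, src in loads:          # p = *src: edges a -> p, a in pts[src]
            if src == q:
                for a in pts[q]:
                    if p not in succ[a]:
                        succ[a].add(p)
                        push(a)
        for dst, r in stores:         # *dst = r: edges r -> a, a in pts[dst]
            if dst == q:
                for a in pts[q]:
                    if a not in succ[r]:
                        succ[r].add(a)
                        push(r)
        for p in succ[q]:             # propagate along copy edges
            if not pts[q] <= pts[p]:
                pts[p] |= pts[q]
                push(p)
    return dict(pts)

# Example: p = &a; q = &b; *p = q; r = *p  =>  r points to {b}.
result = andersen(addr_of=[("p", "a"), ("q", "b")], copies=[],
                  loads=[("r", "p")], stores=[("p", "q")])
```

Because nodes are re-pushed only when their state changes, the loop terminates once the constraint graph and the points-to sets reach the fixpoint.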
Abstract:
The objective of this paper is to empirically evaluate a framework for designing – GEMS of SAPPhIRE as req-sol – to check whether it supports design for variety and novelty. A set of observational studies is designed in which three teams of two designers each solve three different design problems in the following order: without any support, using the framework, and using a combination of the framework and a catalogue. Results from the studies reveal that both the variety and the novelty of the concept space increase with the use of the framework, or of the framework and the catalogue. However, the number of concepts and the time taken by the designers decrease with the use of the framework, and of the framework and the catalogue. Based on the results and interview sessions with the designers, an interactive framework for designing, to be supported on a computer, is proposed as future work.
Abstract:
The assembly of aerospace and automotive structures in recent years is increasingly carried out using adhesives. Adhesive joints have the advantages of uniform stress distribution and less stress concentration in the bonded region. Nevertheless, they may suffer due to the presence of defects in the bond line and at the interface, or due to an improper curing process. While defects like voids, cracks and delaminations present in the adhesive bond line may be detected using different NDE methods, interfacial defects in the form of kissing bonds may go undetected. Attempts using advanced ultrasonic methods like nonlinear ultrasound and guided wave inspection to detect kissing bonds have met with limited success, stressing the need for alternative methods. This paper concerns preliminary studies on the detectability of dry contact kissing bonds in adhesive joints using the Digital Image Correlation (DIC) technique. In this attempt, adhesive joint samples containing varied areas of kissing bond were prepared using glass fiber reinforced polymer (GFRP) composite as substrates and epoxy resin as the adhesive layer joining them. The samples were also subjected to conventional and high-power ultrasonic inspection. Further, these samples were loaded till failure to determine the bond strength, during which digital images were recorded and analyzed using the DIC method. This noncontact method could indicate the existence of kissing bonds at less than 50% of the failure load. Finite element studies showed a similar trend. Results obtained from these preliminary studies are encouraging, and further tests need to be done on a larger set of samples to study the experimental uncertainties and scatter associated with the method.
Abstract:
A number of spectral analysis of surface waves tests were performed on asphaltic and cement concrete pavements by freely dropping a 6.5 kg spherical mass, with a radius of 5.82 cm, from a height (h) of 0.5–1.5 m. The maximum wavelength (λmax), up to which the shear wave velocity profile can be detected using surface wave measurements, increases continuously with an increase in h. As compared to the asphaltic pavement, the values of λmax and λmin become greater for the chosen cement concrete pavement, where λmin refers to the minimum wavelength. With h = 0.5 m, a good assessment of the top layers of both the chosen asphaltic and cement concrete pavements, including the soil subgrade, can be made. For a given h, as compared to the selected asphaltic pavement, the first receiver in the case of the chosen cement concrete pavement needs to be placed at a greater distance from the source. Inverse analysis has also been performed to characterise the shear wave velocity profile of the different layers of the pavements.
Abstract:
In the present investigation, an attempt has been made to develop a new co-polymeric material for controlled-release tablet formulations. Acrylamide grafting was successfully performed on the backbone of sago starch. The modified starch was tested for acute toxicity and drug-excipient compatibility. The grafted material was used in making controlled-release tablets of lamivudine. The formulations were evaluated for physical characteristics such as hardness, friability, % drug content and weight variation. The in vitro release study showed that the optimized formulation exhibited the highest correlation (R) value for the Higuchi model, and its release mechanism predominantly exhibited a combination of diffusion and erosion. A significant difference in the pharmacokinetic parameters (Tmax, Cmax, AUC, Vd, T1/2 and MDT) of the optimized formulation, as compared to the marketed conventional tablet Lamivir(R), was observed. The pharmacokinetic parameters showed a controlled pattern and better bioavailability. The optimized formulation exhibited good stability and release profile under accelerated stability conditions.
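The Higuchi model relates cumulative drug release Q to the square root of time, Q = kH·√t. A small sketch of fitting kH by least squares through the origin and computing the correlation coefficient R that such abstracts report; the release data below are made up for illustration, not the paper's measurements.

```python
import math

def higuchi_fit(t, q):
    """Fit Q = kH * sqrt(t) by least squares through the origin and
    return (kH, R), where R is the correlation of Q versus sqrt(t)."""
    x = [math.sqrt(ti) for ti in t]
    # Slope through the origin: kH = sum(x*q) / sum(x*x).
    k = sum(xi * qi for xi, qi in zip(x, q)) / sum(xi * xi for xi in x)
    # Pearson correlation of q against sqrt(t).
    n = len(x)
    mx, mq = sum(x) / n, sum(q) / n
    cov = sum((xi - mx) * (qi - mq) for xi, qi in zip(x, q))
    sx = math.sqrt(sum((xi - mx) ** 2 for xi in x))
    sq = math.sqrt(sum((qi - mq) ** 2 for qi in q))
    return k, cov / (sx * sq)

# Hypothetical % release over time (hours), roughly following 10*sqrt(t).
k_h, r_value = higuchi_fit([1, 2, 4, 6, 8], [10.1, 14.0, 20.2, 24.4, 28.5])
```

An R close to 1 for this fit, compared against zero-order and first-order fits, is the usual basis for concluding that diffusion (with some erosion) dominates the release mechanism.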
Abstract:
In this paper, we evaluate the performance of a burst retransmission method for an optical burst switched (OBS) network with the intermediate-node-initiation (INI) signaling technique. The proposed method tries to reduce the burst contention probability at the intermediate core nodes. We develop an analytical model to obtain the burst contention probability and burst loss probability for an OBS network with the INI signaling technique, and we simulate the performance of optical burst retransmission. Simulation results show that at low traffic loads the loss probability is lower than that of conventional burst retransmission in an OBS network. Results also show that the retransmission method for an OBS network with the INI signaling technique significantly reduces the burst loss probability.
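The paper's analytical model is not given in the abstract. A toy Monte Carlo sketch makes the qualitative claim concrete: if each transmission attempt is blocked independently with probability p, a single retransmission drops the loss probability toward p². The contention probability and burst counts here are illustrative only.

```python
import random

def simulated_loss(p_contend, n_bursts, retransmit=True, seed=1):
    """Toy model: each attempt is blocked independently with probability
    p_contend; with one retransmission a burst is lost only if both
    attempts are blocked, so the loss probability tends to p_contend**2."""
    rng = random.Random(seed)
    lost = 0
    for _ in range(n_bursts):
        if rng.random() < p_contend:            # first attempt blocked
            if not retransmit or rng.random() < p_contend:
                lost += 1                        # retransmission also blocked
    return lost / n_bursts

no_rtx = simulated_loss(0.1, 100_000, retransmit=False)
with_rtx = simulated_loss(0.1, 100_000, retransmit=True)
```

With p = 0.1, loss falls from about 10% to about 1% in this idealized setting; the paper's model additionally accounts for INI signaling and correlated contention at core nodes, which this sketch ignores.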
Abstract:
Background & objectives: Pre-clinical toxicology evaluation of biotechnology products is a challenge to the toxicologist. The present investigation is an attempt to evaluate the safety profile of the first indigenously developed recombinant DNA anti-rabies vaccine [DRV (100 μg)] and the combination rabies vaccine [CRV (100 μg DRV and 1.25 IU of cell-culture-derived inactivated rabies virus vaccine)], which are intended for clinical use by the intramuscular route, in Rhesus monkeys. Methods: As per the regulatory requirements, the study was designed for acute (single dose - 14 days), sub-chronic (repeat dose - 28 days) and chronic (intended clinical dose - 120 days) toxicity tests using three dose levels, viz. therapeutic, average (2x therapeutic dose) and highest dose (10x therapeutic dose), in monkeys. The selection of the monkey model was based on affinity and the rapid, higher antibody response seen during the efficacy studies. An attempt was made to evaluate all parameters, including physical, physiological, clinical, haematological and histopathological profiles of all target organs, as well as Tier I, II and III immunotoxicity parameters. Results: In acute toxicity there was no mortality in spite of exposing the monkeys to 10x DRV. In the sub-chronic and chronic toxicity studies there were no abnormalities in physical, physiological, neurological or clinical parameters after administration of the test compound at the intended and 10 times the clinical dosage schedule of DRV and CRV under the experimental conditions. Clinical chemistry, haematology, organ weights and histopathology were essentially unremarkable, except for the presence of residual DNA at femtogram levels at the site of injection in the animal which received 10x DRV in the chronic toxicity study. The No Observed Adverse Effect Level (NOAEL) of DRV is 1000 μg/dose (10 times the therapeutic dose) if administered on days 0, 4, 7, 14 and 28.
Interpretation & conclusions: The information generated by this study not only draws attention to the need for national and international regulatory agencies to formulate guidelines for the pre-clinical safety evaluation of biotech products, but also facilitates the development of biopharmaceuticals as safe potential therapeutic agents.