966 results for Requirement

Relevance: 10.00%

Publisher:

Abstract:

Impact-angle-constrained guidance laws are important in many applications, such as the guidance of torpedoes, anti-ballistic missiles, and reentry vehicles. In this paper, we design a guidance law capable of achieving a wide range of impact angles. Biased proportional navigation guidance (BPNG) adds a bias term to the basic PN command to satisfy additional constraints. Angle-constrained BPNG (ACBPNG) uses small-angle approximations to derive the bias term for the impact angle requirement. We design a modified ACBPNG (MACBPNG) in which the required bias term is derived in closed form from the nonlinear equations of motion. Simulations are carried out for a wide range of impact angle requirements. We also analyze capturability from different initial positions, along with the launch angles possible at each position. The performance of the proposed law is compared with that of an existing law.
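As a rough illustration of the BPNG structure described above (a basic PN term plus a bias), the following planar sketch uses a simple heading-error bias; the kinematics, gains, and bias form are illustrative assumptions, not the paper's closed-form MACBPNG bias.

```python
import math

# Planar sketch of biased proportional navigation (BPNG): the latax command is
# the basic PN term plus a bias. The heading-error bias used here and all
# gains/values are illustrative assumptions, not the paper's MACBPNG law.

def bpng_latax(v_closing, los_rate, bias, nav_gain=3.0):
    """Lateral acceleration command: basic PN term plus a bias term."""
    return nav_gain * v_closing * los_rate + bias

def simulate(impact_angle_des, dt=0.01, steps=4000):
    mx, my, v, heading = 0.0, 0.0, 300.0, 0.0   # missile state
    tx, ty = 10000.0, 0.0                        # stationary target
    for _ in range(steps):
        rx, ry = tx - mx, ty - my
        r = math.hypot(rx, ry)
        if r < 5.0:
            break  # intercept: terminal heading ~ achieved impact angle
        vx, vy = v * math.cos(heading), v * math.sin(heading)
        los_rate = (rx * (-vy) - ry * (-vx)) / (r * r)   # LOS rate
        v_close = (rx * vx + ry * vy) / r                # closing speed
        bias = 0.5 * (impact_angle_des - heading)        # illustrative bias
        a = bpng_latax(v_close, los_rate, bias)
        heading += (a / v) * dt                          # latax turns velocity
        mx += v * math.cos(heading) * dt
        my += v * math.sin(heading) * dt
    return heading
```

With the bias gain set to zero this reduces to plain PN, which makes the role of the bias term in shaping the terminal geometry easy to see.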

In this paper, sliding-mode-control-based guidance laws to intercept non-maneuvering targets at a desired impact angle are presented. The desired impact angle, defined in terms of a desired line-of-sight (LOS) angle, is achieved by selecting the missile's lateral acceleration (latax) to enforce sliding mode on a sliding surface based on this LOS angle. As will be shown, this guidance law does not ensure interception for all missile and target states during the engagement. Hence, to satisfy the requirement of interception at the desired impact angle, a second sliding surface is designed, and a switching logic, based on the conditions necessary for interception, is presented that allows the latax to switch between enforcing sliding mode on one of these surfaces so that the target can be intercepted at the desired impact angle. The guidance laws are designed using nonlinear engagement dynamics.
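The switching idea above can be sketched minimally: enforce sliding mode on the impact-angle surface while interception remains feasible, otherwise on the interception surface. The surface definitions, the signum reaching law, and the gain below are illustrative assumptions, not the paper's exact design.

```python
import math

# Minimal sketch of latax switching between two sliding surfaces. The signum
# reaching law and the boolean feasibility flag standing in for the paper's
# interception conditions are illustrative assumptions.

def sliding_latax(s_angle, s_intercept, can_intercept, gain=200.0):
    """Enforce sliding mode on the impact-angle surface while interception is
    still achievable, otherwise on the interception surface."""
    s = s_angle if can_intercept else s_intercept
    return -gain * math.copysign(1.0, s) if s != 0.0 else 0.0
```

In practice the discontinuous signum term is usually smoothed (e.g. with a saturation or tanh function) to reduce chattering; the hard switch is kept here only for clarity.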

The basic requirements for an autopilot are fast response and minimum steady-state error for better guidance performance. The highly nonlinear nature of the missile dynamics, due to severe kinematic and inertial coupling of the missile airframe as well as the aerodynamics, has been a challenge for an autopilot that is required to perform satisfactorily under all flight conditions in probable engagements. Dynamic inversion is a very popular nonlinear control technique for this kind of scenario, but its drawback is sensitivity to parameter perturbation. To overcome this problem, a neural network is used to capture the parameter uncertainty online. The choice of basis function plays the major role in capturing the unknown dynamics. In this paper, several basis functions are studied for approximating the unknown dynamics. The cosine basis function yields the best response among those considered. A neural network with a cosine basis function improves both autopilot performance and robustness compared with dynamic inversion without a neural network.
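To illustrate why the basis choice matters, the sketch below fits a cosine basis to a stand-in "unknown dynamics" term by least squares; offline least squares stands in for the online neural-network weight update, and the target function and basis size are assumptions.

```python
import numpy as np

# Approximating an "unknown dynamics" term with a cosine basis. Offline least
# squares stands in for the online neural-network adaptation described in the
# abstract; the target function and basis size are illustrative assumptions.

def cosine_features(x, n_basis=8):
    """Feature matrix with columns cos(k*x), k = 0..n_basis-1."""
    return np.cos(np.outer(x, np.arange(n_basis)))

# Stand-in unknown dynamics (assumed for the demo; lies in the basis span)
f = lambda x: np.cos(x) + 0.5 * np.cos(3 * x)

x = np.linspace(-np.pi, np.pi, 200)
Phi = cosine_features(x)
w, *_ = np.linalg.lstsq(Phi, f(x), rcond=None)  # fitted basis weights
approx_err = np.max(np.abs(Phi @ w - f(x)))      # worst-case fit error
```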

Recently, it has been shown that fusing the estimates of a set of sparse recovery algorithms results in an estimate better than the best estimate in the set, especially when the number of measurements is very limited. Though these schemes provide better sparse signal recovery performance, their higher computational requirement makes them less attractive for low-latency applications. To alleviate this drawback, in this paper we develop a progressive-fusion-based scheme for low-latency applications in compressed sensing. In progressive fusion, the estimates of the participating algorithms are fused progressively as they become available. The availability of an estimate depends on the computational complexity of the participating algorithm, and in turn on its latency requirement. Unlike other fusion algorithms, the proposed progressive fusion algorithm provides quick interim results and successive refinements during the fusion process, which is highly desirable in low-latency applications. We analyse the developed scheme by providing sufficient conditions for improvement of CS reconstruction quality, and show its practical efficacy by numerical experiments using synthetic and real-world data. (C) 2013 Elsevier B.V. All rights reserved.
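A toy version of progressive fusion, fusing estimates in order of availability via least squares on the union of their supports; the fusion rule and problem sizes are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

# Toy progressive fusion: interim fused estimates are produced as each
# participating algorithm's estimate arrives (fastest algorithm first). The
# least-squares-on-union-support rule and problem sizes are assumptions.

def fuse_progressively(y, A, estimates, k):
    """Yield a refined k-sparse estimate after each incoming estimate."""
    support = set()
    for x_hat in estimates:
        support |= set(np.argsort(np.abs(x_hat))[-k:])  # top-k entries
        idx = sorted(support)
        z, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        fused = np.zeros(A.shape[1])
        fused[idx] = z
        fused[np.argsort(np.abs(fused))[:-k]] = 0.0     # keep k largest
        yield fused

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))
x_true = np.zeros(40)
x_true[[3, 11, 27]] = [1.5, -2.0, 0.8]
y = A @ x_true
crude = x_true + 0.3 * rng.standard_normal(40)  # fast, inaccurate estimate
interim, final = fuse_progressively(y, A, [crude, x_true.copy()], k=3)
```

The generator yields an interim result as soon as the fast algorithm finishes, then refines it when the slower, more accurate estimate arrives, which is the behavior the abstract highlights for low-latency use.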

Although many sparse recovery algorithms have been proposed recently in compressed sensing (CS), it is well known that the performance of any sparse recovery algorithm depends on many parameters, such as the dimension of the sparse signal, the level of sparsity, and the measurement noise power. It has been observed that satisfactory performance of a sparse recovery algorithm requires a minimum number of measurements, and this minimum differs across algorithms. In many applications, the number of measurements is unlikely to meet this requirement, and any scheme that improves performance with fewer measurements is of significant interest in CS. Empirically, it has also been observed that the performance of a sparse recovery algorithm depends on the underlying statistical distribution of the nonzero elements of the signal, which may not be known a priori in practice. Interestingly, the performance degradation of sparse recovery algorithms in these cases does not always imply a complete failure. In this paper, we study this scenario and show that, by fusing the estimates of multiple sparse recovery algorithms that work on different principles, we can improve sparse signal recovery. We present a theoretical analysis to derive sufficient conditions for performance improvement of the proposed schemes, and we demonstrate the advantage of the proposed methods through numerical simulations for both synthetic and real signals.

Landslide hazards are a major natural disaster affecting most hilly regions around the world. In India, significant damage due to earthquake-induced landslides has been reported in the Himalayan region and also in the Western Ghats. There is thus a need for a quantitative macro-level landslide hazard assessment within the Indian subcontinent to identify the regions with high hazard. In the present study, the seismic landslide hazard for the entire state of Karnataka, India was assessed using a topographic slope map derived from Digital Elevation Model (DEM) data. The available ASTER DEM data, resampled to 50 m resolution, was used to derive the slope map of the entire state. Considering a linear source model, deterministic seismic hazard analysis was carried out to estimate the peak horizontal acceleration (PHA) at bedrock for each grid point with a terrain angle of 10 degrees and above. The surface-level PHA was estimated using a nonlinear site amplification technique, considering B-type NEHRP site class. Based on the surface-level PHA and the slope angle, the seismic landslide hazard at each grid point was estimated in terms of the static factor of safety required to resist a landslide, using Newmark's analysis. The analysis was carried out at the district level, and landslide hazard maps for all districts in Karnataka were developed first; these were then merged to obtain a quantitative seismic landslide hazard map of the entire state. Spatial variations in the landslide hazard for all districts, as well as for the entire state, are presented in this paper. The present study shows that the Western Ghat region of Karnataka has high landslide hazard, where the static factor of safety required to resist a landslide is very high.
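The slope-stability screening above follows Newmark's rigid-block relation, in which the critical acceleration is a_c = (FS - 1) g sin(alpha). A minimal sketch of the per-grid-point computation (the site-amplification step is omitted and the numbers are illustrative):

```python
import math

# Newmark-type screening: the static factor of safety a slope needs so that
# its critical acceleration is not exceeded by the surface PHA. Values are
# illustrative; the study's site-amplification step is omitted.

G = 9.81  # gravitational acceleration, m/s^2

def critical_acceleration(fs_static, slope_deg):
    """Newmark critical acceleration a_c = (FS - 1) * g * sin(alpha)."""
    return (fs_static - 1.0) * G * math.sin(math.radians(slope_deg))

def required_static_fs(pha, slope_deg):
    """Static FS needed so that a_c equals the surface PHA (both in m/s^2)."""
    return 1.0 + pha / (G * math.sin(math.radians(slope_deg)))
```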

Elasticity in cloud systems provides the flexibility to acquire and relinquish computing resources on demand. However, in current virtualized systems resource allocation is mostly static: resources are allocated during VM instantiation, and any change in workload leading to a significant increase or decrease in resources is handled by VM migration. Hence, cloud users tend to characterize their workloads at a coarse-grained level, which potentially leads to under-utilized VM resources or an under-performing application. A more flexible and adaptive resource allocation mechanism would benefit variable workloads, such as those characteristic of web servers. In this paper, we present an elastic resources framework for the IaaS cloud layer that addresses this need. The framework includes an application workload forecasting engine that predicts the expected demand at run-time; this prediction is input to the resource manager, which modulates resource allocation accordingly. Because of prediction errors, resources can be over-allocated or under-allocated relative to the actual demand made by the application. Over-allocation leads to unused resources, and under-allocation can cause under-performance. To strike a good trade-off between over-allocation and under-performance, we derive an excess cost model in which excess resources allocated are captured as an over-allocation cost, and under-allocation is captured as a penalty cost for violating the application's service level agreement (SLA). The confidence interval of the predicted workload is used to minimize this excess cost with minimal effect on SLA violations. An example case study for an academic institute's web server workload is presented. Using the confidence interval to minimize excess cost, we achieve a significant reduction in resource allocation requirements while restricting application SLA violations to below 2-3%.
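The excess-cost idea can be sketched as follows: over-allocation incurs a unit resource cost, under-allocation incurs an SLA penalty, and the allocation is picked from the forecast's confidence interval. The cost weights and the z-grid below are assumptions, not the paper's calibrated values.

```python
# Sketch of the excess-cost model: over-allocated resources incur a unit cost,
# under-allocation incurs an SLA-violation penalty, and the allocation is set
# from the forecast's confidence interval. Weights and grid are assumptions.

def excess_cost(allocated, demand, c_over=1.0, c_sla=10.0):
    """Over-allocation cost plus SLA-violation penalty for one interval."""
    if allocated >= demand:
        return c_over * (allocated - demand)
    return c_sla * (demand - allocated)

def pick_allocation(forecast, sigma, demands, z_grid=(0.0, 0.5, 1.0, 1.5, 2.0)):
    """Choose the confidence-interval width z minimizing total excess cost."""
    best = min(z_grid, key=lambda z: sum(excess_cost(forecast + z * sigma, d)
                                         for d in demands))
    return forecast + best * sigma
```

The asymmetric weights (SLA penalty much larger than the unit over-allocation cost) push the chosen allocation toward the upper end of the confidence interval, mirroring the trade-off described above.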

The maintenance of ion channel homeostasis, or channelostasis, is a complex puzzle in neurons with extensive dendritic arborization, encompassing a combinatorial diversity of proteins that encode these channels and their auxiliary subunits, their localization profiles, and associated signaling machinery. Despite this, neurons exhibit amazingly stereotypic, topographically continuous maps of several functional properties along their active dendritic arbor. Here, we asked whether the membrane composition of neurons, at the level of individual ion channels, is constrained by this structural requirement of sustaining several functional maps along the same topograph. We performed global sensitivity analysis on morphologically realistic conductance-based models of hippocampal pyramidal neurons that coexpressed six well-characterized functional maps along their trunk. We generated randomized models by varying 32 underlying parameters and constrained these models with quantitative experimental measurements from the soma and dendrites of hippocampal pyramidal neurons. Analyzing valid models that satisfied experimental constraints on all six functional maps, we found topographically analogous functional maps to emerge from disparate model parameters with weak pairwise correlations between parameters. Finally, we derived a methodology to assess the contribution of individual channel conductances to the various functional measurements, using virtual knockout simulations on the valid model population. We found that the virtual knockout of individual channels resulted in variable, measurement and location-specific impacts across the population. Our results suggest collective channelostasis as a mechanism behind the robust emergence of analogous functional maps and have significant ramifications for the localization and targeting of ion channels and enzymes that regulate neural coding and homeostasis.
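The randomized-sampling and virtual-knockout workflow can be sketched on a toy model, with three parameters standing in for the 32 of the neuron models; the "measurement" function and experimental bounds are assumptions for the demo.

```python
import numpy as np

# Toy sketch of the global sensitivity workflow: sample random parameter sets,
# keep "valid" models whose measurement falls within experimental bounds, then
# zero one parameter ("virtual knockout") and record the spread of impacts
# across the valid population. The 3-parameter model is purely illustrative.

rng = np.random.default_rng(42)
params = rng.uniform(0.0, 1.0, size=(5000, 3))

def measure(p):
    # Stand-in "functional measurement" (assumption for the demo)
    return p @ np.array([1.0, 0.8, 0.5])

m = measure(params)
valid = params[(m > 1.0) & (m < 1.2)]   # experimentally constrained models

# Virtual knockout of parameter 0: re-measure with it set to zero
ko = valid.copy()
ko[:, 0] = 0.0
impact = measure(valid) - measure(ko)    # per-model impact of the knockout
```

Even in this caricature, the knockout impact varies across the valid population because disparate parameter combinations satisfy the same measurement bound, which is the qualitative point the abstract makes.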

Controlled motion of artificial nanomotors in biological environments, such as blood, can lead to fascinating biomedical applications, ranging from targeted drug delivery to microsurgery and many more. In spite of the various strategies used in fabricating and actuating nanomotors, practical issues related to fuel requirements, corrosion, and liquid viscosity have limited the motion of nanomotors to model systems such as water, serum, or biofluids diluted with toxic chemical fuels, such as hydrogen peroxide. As we demonstrate here, integrating conformal ferrite coatings with magnetic nanohelices offers a promising combination of functionalities for achieving controlled motion in practical biological fluids: chemical stability, cytocompatibility, and the generated thrust. These coatings were found to be stable in various biofluids, including human blood, even after overnight incubation, and did not have a significant influence on the propulsion efficiency of the magnetically driven nanohelices, thereby facilitating the first successful ``voyage'' of artificial nanomotors in human blood. The motion of the ``nanovoyager'' was found to show interesting stick-slip dynamics, an effect originating in the colloidal jamming of blood cells in the plasma. The system of magnetic ``nanovoyagers'' was found to be cytocompatible with C2C12 mouse myoblast cells, as confirmed using the MTT assay and fluorescence microscopy observations of cell morphology. Taken together, the results presented in this work establish the suitability of the ``nanovoyager'' with conformal ferrite coatings for biomedical applications.

In stable pit growth, mass balance between the metal and the electrolytic solution, separated by a moving interface, yields a set of governing equations that are solved for the concentration field and the interface position (pit boundary evolution). The model requires only three inputs: the solid metal concentration, the saturation concentration of the dissolved metal ions, and the diffusion coefficient. A combined eXtended Finite Element Model (XFEM) and level set method is developed in this paper. The extended finite element model handles the jump discontinuity in the metal concentration at the interface by using a discontinuous-derivative enrichment formulation. This eliminates the need for a front-conforming mesh and for re-meshing after each time step, as in the conventional finite element method. A numerical technique known as the level set method tracks the position of the moving interface and updates it over time. A numerical analysis of pitting corrosion of stainless steel 304 is presented. The proposed method is validated by comparing the numerical results with experimental results, exact solutions, and other approximate solutions.
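A 1D caricature of the moving-interface mass balance: assuming a quasi-steady linear concentration profile across the pit, the Stefan condition (c_solid - c_sat) v = D c_sat / s gives the interface velocity, and the interface depth grows as sqrt(t). The material values below are illustrative, not the stainless steel 304 data of the paper.

```python
import math

# Minimal 1D sketch of the moving-interface mass balance: a quasi-steady
# linear concentration profile across the pit gives an interface velocity via
# the Stefan condition (c_solid - c_sat) * v = D * c_sat / s. All material
# values are illustrative assumptions.

D = 8.5e-10        # diffusion coefficient, m^2/s (assumed)
c_solid = 143e3    # solid metal concentration, mol/m^3 (assumed)
c_sat = 5.1e3      # saturation concentration, mol/m^3 (assumed)

def grow_pit(t_end, dt=1e-3, s0=1e-6):
    """Advance the pit interface depth s with an explicit time march."""
    s, t = s0, 0.0
    while t < t_end:
        v = D * c_sat / ((c_solid - c_sat) * s)   # Stefan condition
        s += v * dt
        t += dt
    return s

# Closed-form check: s(t) = sqrt(s0^2 + 2*D*c_sat*t / (c_solid - c_sat))
```

The time march reproduces the sqrt(t) similarity solution, which is a common sanity check for pit-growth models before moving to the full XFEM/level-set treatment.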

In stable pit growth, mass balance between the metal and the electrolytic solution, separated by a moving interface, yields a set of governing equations that are solved for the concentration field and the interface position (pit boundary evolution). The interface experiences a jump discontinuity in metal concentration. The extended finite-element model (XFEM) handles this jump discontinuity by using a discontinuous-derivative enrichment formulation, eliminating the need for a front-conforming mesh and for re-meshing after each time step, as in the conventional finite-element method. However, the prior interface location is required to solve the governing equations for the concentration field; for this, a numerical technique, the level set method, is used to track the interface explicitly and update it over time. The level set method is chosen as it is independent of the shape and location of the interface. Thus, a combined XFEM and level set method is developed in this paper. A numerical analysis of pitting corrosion of stainless steel 304 is presented. The proposed model is validated by comparing the numerical results with experimental results, exact solutions, and other approximate solutions. An empirical model for pitting potential is also derived based on the finite-element results. Studies show that the pitting profile depends to a large extent on factors such as ion concentration, solution pH, and temperature. Studying the individual and combined effects of these factors on pitting potential is important, as pitting potential directly influences the corrosion rate.

Differential mobility analyzers (DMAs) are commonly used to generate monodisperse nanoparticle aerosols. Commercial DMAs operate at quasi-atmospheric pressures and are therefore not designed to be vacuum-tight. In certain particle synthesis methods, the use of a vacuum-compatible DMA is a requirement as a process step for producing high-purity metallic particles. A vacuum-tight radial DMA (RDMA) has been developed and tested at low pressures. Its performance has been evaluated by using a commercial NANO-DMA as the reference. The performance of this low-pressure RDMA (LP-RDMA) in terms of the width of its transfer function is found to be comparable with that of other NANO-DMAs at atmospheric pressure and is almost independent of the pressure down to 30 mbar. It is shown that LP-RDMA can be used for the classification of nanometer-sized particles (5-20 nm) under low pressure condition (30 mbar) and has been successfully applied to nanoparticles produced by ablating FeNi at low pressures.
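The mobility relation underlying DMA classification can be sketched as follows: electrical mobility Z = n e Cc / (3 pi mu d), with the Cunningham slip correction Cc growing as pressure drops, since the mean free path scales as 1/p. The slip-correction coefficients are the commonly used Davies parameterization; all values are assumptions for illustration.

```python
import math

# Electrical mobility behind DMA classification: Z = n*e*Cc / (3*pi*mu*d).
# The Cunningham slip correction Cc grows at low pressure because the mean
# free path scales as 1/p. Coefficients (1.257, 0.4, 1.1) are the common
# Davies parameterization; all numeric values are assumptions.

E = 1.602e-19          # elementary charge, C
MU = 1.81e-5           # dynamic viscosity of air, Pa*s (room temperature)
LAMBDA_ATM = 67.3e-9   # mean free path of air at 1013 mbar, m

def slip_correction(d, p_mbar):
    lam = LAMBDA_ATM * 1013.0 / p_mbar   # mean free path ~ 1/pressure
    kn = 2.0 * lam / d                   # Knudsen number
    return 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def electrical_mobility(d, p_mbar, n_charges=1):
    return n_charges * E * slip_correction(d, p_mbar) / (3.0 * math.pi * MU * d)
```

At 30 mbar the slip correction (and hence the mobility) of a given particle is far larger than at atmospheric pressure, which is why low-pressure operation changes the voltage range a DMA needs to classify the same size.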

Mitochondrial Hsp70 (mtHsp70) is essential for a vast repertoire of functions, including protein import, and requires effective interdomain communication for efficient partner-protein interactions. However, the in vivo functional significance of allosteric regulation in eukaryotes is poorly defined. Using integrated biochemical and yeast genetic approaches, we provide compelling evidence that a conserved substrate-binding domain (SBD) loop, L-4,L-5, plays a critical role in allosteric communication governing mtHsp70 chaperone functions across species. In yeast, a temperature-sensitive L-4,L-5 mutation (E467A) disrupts bidirectional domain communication, leading to compromised protein import and mitochondrial function. Loop L-4,L-5 functions synergistically with the linker in modulating the allosteric interface and conformational transitions between SBD and the nucleotide-binding domain (NBD), thus regulating interdomain communication. Second-site intragenic suppressors of E467A isolated within the SBD suppress domain communication defects by conformationally altering the allosteric interface, thereby restoring import and growth phenotypes. Strikingly, the suppressor mutations highlight that restoration of communication from NBD to SBD alone is the minimum essential requirement for effective in vivo function when primed at higher basal ATPase activity, mimicking the J-protein-bound state. Together these findings provide the first mechanistic insights into critical regions within the SBD of mtHsp70s regulating interdomain communication, thus highlighting its importance in protein translocation and mitochondrial biogenesis.

A high-temperature, high-pressure transcritical condensing CO2 cycle (TC-CO2) is compared with a transcritical steam (TC-steam) cycle. Performance indicators such as thermal efficiency, volumetric flow rates, and entropy generation are used to analyze the power cycles, wherein irreversibilities in turbo-machinery and heat exchangers are taken into account. Although both cycles yield comparable thermal efficiencies under identical operating conditions, a TC-CO2 plant is significantly more compact than a TC-steam plant; the large specific volume of steam is responsible for a bulky system. It is also found that the performance of a TC-CO2 cycle is less sensitive to source temperature variations, which is an important requirement of a solar thermal system. In addition, issues like wet expansion in the turbine and vacuum in the condenser are absent in a TC-CO2 cycle. External heat addition to the working fluid is assumed to take place through a heat transfer fluid (HTF) that receives heat from a solar receiver. A TC-CO2 system receives heat through a single HTF loop, whereas for the TC-steam cycle two HTF loops in series are proposed to avoid a high temperature differential between the steam and the HTF. (C) 2013 P. Garg. Published by Elsevier Ltd.

Introduction: Immunomodulators are agents that can modulate the immune response to specific antigens while causing the least toxicity to the host system. As part of modern vaccine formulations, these compounds have contributed remarkably to the field of therapeutics. Despite the successful record maintained by these agents, the demand for novel immunomodulators keeps increasing with the increasing severity of diseases; hence, research in this area holds great importance. Areas covered: In this review, we discuss the role of immunomodulators in improving the performance of various vaccines used to counter the most threatening infectious diseases, the mechanisms behind their action, and the criteria for development of novel immunomodulators. Expert opinion: Understanding the molecular mechanisms underlying the immune response is a prerequisite for the development of effective therapeutics, as these mechanisms are often exploited by pathogens for their own propagation. Keeping this in mind, present research in the field of immunotherapy focuses on developing immunomodulators that would not only enhance protection against the pathogen but also generate a long-term memory response. With the introduction of advanced formulations, including combinations of different kinds of immunomodulators, one can expect tremendous success in the near future.