230 results for Minimal manipulability


Relevance:

10.00%

Publisher:

Abstract:

A droplet residing on a vibrating surface, at the pressure antinode of an asymmetric standing wave, can spread radially outward and atomize. In this work, proper orthogonal decomposition of high-speed imaging data is shown to predict the likelihood of atomization for various viscous fluids based on prior information from the droplet spreading phase. Capillary instabilities are seen to affect ligament rupture. Viscous dissipation plays an important role in determining the wavelength of the most unstable mode during the inception phase of the ligaments. However, the highest ligament capillary number achieved was less than 1, and the influence of viscosity on the ligament growth and breakup phases is quite minimal. It is inferred from the data that the growth of a typical ligament is governed by a balance between the inertial force obtained from the inception phase and capillary forces. By including the effect of the acoustic pressure field around the droplet, the dynamics of the ligament growth phase are revealed, and the ligament growth profiles for different fluids are shown to collapse onto a straight line using a new characteristic time scale.
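The snapshot form of proper orthogonal decomposition used on imaging data like this can be sketched as an SVD of a mean-subtracted snapshot matrix. The synthetic snapshots, frame count, and mode energies below are illustrative assumptions, not the authors' high-speed imaging data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_frames = 64, 20
snapshots = rng.standard_normal((n_pixels, n_frames))  # columns = frames

# Subtract the temporal mean, then take the SVD: columns of U are the
# spatial POD modes; singular values rank their energy content.
mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)

# Fraction of fluctuation energy captured by each mode.
energy = s**2 / np.sum(s**2)
print(energy[:3].sum())  # fraction of energy in the three leading modes
```

A classifier trained on the leading mode amplitudes during the spreading phase is one plausible way such a decomposition could feed an atomization prediction.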


The emergence of multidrug-resistant bacteria is a global threat to human society. Recorded data show that silver was used as an antimicrobial agent by the ancient Greeks and Romans during the 8th century. Silver nanoparticles (AgNPs) are of potential interest because of their effective antibacterial and antiviral activities, with minimal cytotoxic effects on cells. However, very few reports have shown the use of AgNPs for antibacterial therapy in vivo. In this study, we deciphered the importance of the chosen methods for synthesis and capping of AgNPs for their improved activity in vivo. The interaction of AgNPs with serum albumin has a significant effect on their antibacterial activity. It was observed that uncapped AgNPs exhibited no antibacterial activity in the presence of serum proteins, owing to their interaction with bovine serum albumin (BSA), which was confirmed by UV-Vis spectroscopy. However, AgNPs capped with citrate or poly(vinylpyrrolidone) (PVP) exhibited antibacterial properties due to minimized interactions with serum proteins. Damage to the bacterial membrane was assessed by flow cytometry, which also showed that only capped AgNPs exhibited antibacterial properties, even in the presence of BSA. In order to understand the in vivo relevance of the antibacterial activities of the different AgNPs, a murine salmonellosis model was used. It was conclusively proved that AgNPs capped with citrate or PVP exhibited significant antibacterial activity in vivo against Salmonella infection compared to uncapped AgNPs. These results clearly demonstrate the importance of the capping agent and synthesis method when AgNPs are used as antimicrobial agents for therapeutic purposes.


We consider extremal limits of the recently constructed "subtracted geometry". We show that extremality makes the horizon attractive against scalar perturbations, but radial evolution of such perturbations changes the asymptotics: from a conical box to flat Minkowski space. Thus these are black holes that retain their near-horizon geometry under perturbations that drastically change their asymptotics. We also show that this extremal subtracted solution (the "subttractor") can arise as a boundary of the basin of attraction for flat-space attractors. We demonstrate this using a fairly minimal action (with connections to the STU model) in which the equations of motion are integrable, allowing us to find analytic solutions that capture the flow from the horizon to the asymptotic region. The subttractor is a boundary between two qualitatively different flows. We expect these results to generalize to other theories with charged dilatonic black holes.


Estimating program worst-case execution time (WCET) accurately and efficiently is a challenging task. Several programs exhibit phase behavior, wherein cycles per instruction (CPI) varies in phases during execution. Recent work has suggested the use of phases in such programs to estimate WCET with minimal instrumentation. However, the suggested model uses a function of mean CPI that carries no probabilistic guarantees. We propose to use Chebyshev's inequality, which applies to any arbitrary distribution of CPI samples, to probabilistically bound the CPI of a phase. Applying Chebyshev's inequality to phases that exhibit high CPI variation leads to pessimistic upper bounds. We propose a mechanism that refines such phases into sub-phases based on program counter (PC) signatures collected using profiling, and that also allows the user to control the variance of CPI within a sub-phase. We describe a WCET analyzer built along these lines and evaluate it with standard WCET and embedded benchmark suites on two different architectures for three chosen probabilities, p = {0.9, 0.95, 0.99}. For p = 0.99, refinement based on PC signatures alone reduces the average pessimism of the WCET estimate by 36% (77%) on Arch1 (Arch2). Compared to Chronos, an open-source static WCET analyzer, the average improvement in estimates obtained by refinement is 5% (125%) on Arch1 (Arch2). On limiting the variance of CPI within a sub-phase to {50%, 10%, 5%, 1%} of its original value, the average accuracy of the WCET estimate improves further to {9%, 11%, 12%, 13%}, respectively, on Arch1. On Arch2, the average accuracy of WCET improves to 159% when CPI variance is limited to 50% of its original value, and the improvement is marginal beyond that point.
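Chebyshev's inequality holds for any distribution with finite variance: P(|X − μ| ≥ kσ) ≤ 1/k², so choosing k = 1/√(1 − p) yields a CPI bound that holds with probability at least p. A minimal sketch of that step, with invented CPI samples (the profiling data and phase structure are not from the paper):

```python
import math
import statistics

# Invented CPI samples for one program phase, for illustration only.
cpi_samples = [1.2, 1.4, 1.1, 1.3, 1.5, 1.2, 1.35, 1.25]

def chebyshev_cpi_bound(samples, p):
    """Upper bound b such that P(CPI <= b) >= p for any distribution:
    P(|X - mu| >= k*sigma) <= 1/k**2, so pick k = 1/sqrt(1 - p)."""
    mu = statistics.mean(samples)
    sigma = statistics.pstdev(samples)
    k = 1.0 / math.sqrt(1.0 - p)
    return mu + k * sigma

for p in (0.9, 0.95, 0.99):
    print(p, round(chebyshev_cpi_bound(cpi_samples, p), 3))
```

The bound grows with the sample standard deviation, which is precisely why high-variance phases give pessimistic estimates and why refining them into lower-variance sub-phases tightens the WCET bound.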


We give an explicit construction of vertex-transitive tight triangulations of d-manifolds for d >= 2. More explicitly, for each d >= 2, we construct two (d^2 + 5d + 5)-vertex neighborly triangulated d-manifolds whose vertex-links are stacked spheres. The only other non-trivial series of such tight triangulated manifolds currently known is the series of non-simply connected triangulated d-manifolds with 2d + 3 vertices constructed by Kühnel. The manifolds we construct are strongly minimal. For d >= 3, they are also tight neighborly as defined by Lutz, Sulanke and Swartz. Like the Kühnel complexes, our manifolds are orientable in even dimensions and non-orientable in odd dimensions.


The problem addressed in this paper concerns an important issue faced by any green-aware global company: keeping its emissions within a prescribed cap. The specific problem is to allocate carbon reductions to the company's different divisions and supply chain partners so as to achieve a required target of reductions in its carbon reduction program. The problem becomes challenging because the divisions and supply chain partners, being autonomous, may exhibit strategic behavior. We use a standard mechanism design approach to solve this problem. While designing a mechanism for the emission reduction allocation problem, the key properties that need to be satisfied are dominant strategy incentive compatibility (DSIC) (also called strategy-proofness), strict budget balance (SBB), and allocative efficiency (AE). Mechanism design theory has shown that it is not possible to achieve these three properties simultaneously. In the literature, a mechanism that satisfies DSIC and AE has recently been proposed in this context, keeping the budget imbalance minimal. Motivated by the observation that SBB is an important requirement, in this paper we propose a mechanism that satisfies DSIC and SBB with a slight compromise in allocative efficiency. Our experimentation with a stylized case study shows that the proposed mechanism performs satisfactorily and provides an attractive alternative for carbon footprint reduction by global companies.


Waters, in 2009, introduced an important technique, called dual system encryption, to construct identity-based encryption (IBE) and related schemes. The resulting IBE scheme was described in the setting of symmetric pairing. A key feature of the construction is the presence of random tags in the ciphertext and decryption key. Later work by Lewko and Waters removed the tags and, proceeding through composite-order pairings, led to a more efficient dual system IBE scheme using asymmetric pairings whose security is based on non-standard but static assumptions. In this work, we systematically simplify Waters' 2009 IBE scheme in the setting of asymmetric pairing. The simplifications retain the tags used in the original description. This leads to several variants, the first of which is based on standard assumptions and, in comparison to Waters' original scheme, reduces ciphertexts and keys by two elements each. Going through several stages of simplification, we finally obtain a simple scheme whose security can be based on two standard assumptions and a natural and minimal extension of the decision Diffie-Hellman problem for asymmetric pairing groups. The scheme itself is also minimal in the sense that, apart from the tags, both encryption and key generation use exactly one randomiser each. This final scheme is more efficient than both the previous dual system IBE scheme in the asymmetric setting due to Lewko and Waters and the more recent dual system IBE scheme due to Lewko. We extend the IBE scheme to hierarchical IBE (HIBE) and broadcast encryption (BE) schemes. Both primitives are secure in their respective full models and are more efficient than previously known schemes offering the same level and type of security.


We consider entanglement entropy in the context of gauge/gravity duality for conformal field theories in even dimensions. The holographic prescription due to Ryu and Takayanagi (RT) leads to an equation describing how the entangling surface extends into the bulk geometry. We show that setting to zero the time-time component of the Brown-York stress tensor, evaluated on the co-dimension-1 entangling surface, leads to the same equation. By considering a spherical entangling surface as an example, we observe that Euclidean action methods in AdS/CFT lead to the RT area functional arising as a counterterm needed to regularize the stress tensor. We present arguments leading to a justification of the minimal area prescription.
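For reference, the RT area functional discussed here takes, in its standard form, the entanglement entropy of a boundary region $A$ to be the area of the minimal bulk surface $\gamma_A$ homologous to $A$:

```latex
S_A \;=\; \frac{\operatorname{Area}(\gamma_A)}{4 G_N}\,,
```

where $G_N$ is the bulk Newton constant; the extremization of this functional is what produces the equation for how the entangling surface extends into the bulk.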


The increasing industrial use of polyacrylamide to assist water clarification, sludge conditioning, papermaking, and secondary oil recovery leads to environmental pollution. In this work, an acrylamide-degrading bacterium was isolated from paper mill effluent at Charan mahadevi, Tamilnadu, India. A minimal medium containing acrylamide (40 mM) served as the sole source of carbon and nitrogen for the acrylamide-degrading bacteria. The bacterial strain grew well in 40 mM acrylamide at pH 6-7 and 30 °C. Within 24-48 h, acrylamide was converted into acrylic acid and other metabolites. Based on biochemical characteristics and the 16S rRNA gene sequence, the strain was identified as the Gram-negative diplobacillus Moraxella osloensis MSU11. The acrylamide-hydrolyzing bacterial enzyme acrylamidase was purified by HPLC. The molecular weight of the enzyme was determined to be approximately 38 kDa by SDS-PAGE, using pectinase as a reference enzyme. These results show that M. osloensis MSU11 has the potential to degrade acrylamide present in the environment.


Closed-loop current sensors used in power electronics applications are expected to have high bandwidth and minimal measurement transients. In this paper, a closed-loop compensated Hall-effect current sensor is modeled. The model is used to tune the sensor's compensator. An analytical expression of the step response is used to evaluate the performance of the PI compensator in the current sensor. This analysis is used to devise a procedure for designing the PI compensator parameters for fast dynamic response and small dynamic error. A prototype current sensor is built in the laboratory. Simulations using the model are compared with experimental results to validate the model and to study the variation in performance with compensator parameters. The performance of the designed PI compensator for the sensor is compared with a commercial current sensor. The measured bandwidth of the designed current sensor is above 200 kHz, which is comparable to commercial standards. Implementation issues of the PI compensator using operational amplifiers are also addressed.
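As a rough illustration of the kind of step-response tuning described, a discrete-time simulation of a PI compensator closing the loop around a first-order stage exposes the speed/settling trade-off. The plant time constant, gains, and step sizes below are invented for illustration; this is not the paper's sensor model.

```python
def pi_step_response(kp, ki, tau=1e-4, dt=1e-6, steps=5000):
    """Forward-Euler simulation of a PI controller driving a
    first-order plant dy/dt = (u - y)/tau toward a unit step."""
    y, integ, out = 0.0, 0.0, []
    for _ in range(steps):
        err = 1.0 - y                # unit step reference
        integ += err * dt
        u = kp * err + ki * integ    # PI control law
        y += dt * (u - y) / tau      # plant update
        out.append(y)
    return out

resp = pi_step_response(kp=2.0, ki=5000.0)
print(round(resp[-1], 3))  # the integral term drives the error to zero
```

Sweeping `kp` and `ki` in a model like this is one way to trade response speed against overshoot before committing to op-amp component values.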


Estimation of design quantiles of hydrometeorological variables at critical locations in river basins is necessary for hydrological applications. To arrive at reliable estimates for locations (sites) where no or limited records are available, various regional frequency analysis (RFA) procedures have been developed over the past five decades. The most widely used procedure is based on the index-flood approach and L-moments. It assumes that the values of the scale and shape parameters of the frequency distribution are identical across all sites in a homogeneous region. In a real-world scenario, this assumption may not be valid even if a region is statistically homogeneous. To address this issue, a novel mathematical approach is proposed. It involves (i) identification of an appropriate frequency distribution to fit the random variable being analyzed for the homogeneous region, (ii) use of a proposed transformation mechanism to map observations of the variable from the original space to a dimensionless space where the form of the distribution does not change and the variation in its parameter values across sites is minimal, (iii) construction of a growth curve in the dimensionless space, and (iv) mapping of the curve back to the original space for the target site by applying the inverse transformation, to arrive at the required quantile(s) for the site. The effectiveness of the proposed approach (PA) in predicting quantiles for ungauged sites is demonstrated through Monte Carlo simulation experiments considering five frequency distributions widely used in RFA, and by a case study on watersheds in the conterminous United States. Results indicate that the PA outperforms methods based on the index-flood approach.
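For contrast, the classical index-flood step that this approach generalizes can be sketched as: rescale each site's record by its index flood (here, the site mean), pool the dimensionless data, and transfer the pooled growth-curve quantile to a target site. Everything below (synthetic Gumbel records, empirical quantiles in place of L-moment fitting) is an illustrative assumption, not the paper's method.

```python
import numpy as np

# Synthetic annual-maximum records for three sites in one homogeneous
# region, differing only by scale (the index-flood assumption).
rng = np.random.default_rng(1)
sites = [rng.gumbel(loc=100 * c, scale=30 * c, size=50) for c in (1.0, 1.5, 2.2)]

index_floods = [s.mean() for s in sites]
pooled = np.concatenate([s / m for s, m in zip(sites, index_floods)])

def regional_quantile(p, index_flood):
    """Quantile at a target site = growth-curve quantile * index flood."""
    return np.quantile(pooled, p) * index_flood

print(round(regional_quantile(0.99, index_floods[0]), 1))
```

For an ungauged site, the index flood itself would have to be estimated from catchment characteristics, which is one reason quantile transfer of this kind degrades when the identical-parameters assumption fails.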


Femtocells are a new concept that improves the coverage and capacity of a cellular system. We consider the problem of channel allocation and power control for different users within a femtocell. Knowing the available channels, the channel states, and the rate requirements of the different users, the femtocell base station (FBS) allocates the channels to the users so as to satisfy their requirements. The femtocell should also use minimal power, so as to cause the least interference to neighboring femtocells and outside users. We develop efficient, low-complexity algorithms that can be used online by the femtocell. The users may want to transmit data or voice. We compare our algorithms with the optimal solutions.
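The abstract does not specify the algorithms, so purely as a hypothetical illustration of the problem shape, a greedy sketch can assign each user the free channel needing the least transmit power to meet its rate, with power read off the Shannon rate formula r = log2(1 + g·p). The gains, rates, and the greedy rule itself are invented, not the paper's method.

```python
def greedy_allocate(gains, rates):
    """gains[u][c]: channel gain for user u on channel c;
    rates[u]: required rate (bits/s/Hz). Assumes one channel per
    user and at least as many channels as users."""
    free = set(range(len(gains[0])))
    alloc = {}
    # Serve the most demanding users first.
    for u, r in sorted(enumerate(rates), key=lambda x: -x[1]):
        # Power to hit rate r on channel c: p = (2**r - 1) / gain.
        c = min(free, key=lambda c: (2**r - 1) / gains[u][c])
        alloc[u] = (c, (2**r - 1) / gains[u][c])
        free.remove(c)
    return alloc

alloc = greedy_allocate([[0.9, 0.4], [0.5, 0.8]], [2.0, 1.0])
print(alloc)  # user -> (channel, transmit power)
```

An optimal assignment for this one-channel-per-user case could instead be found with a min-cost bipartite matching over the same power matrix, which is the natural baseline for comparison.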


Light neutralino dark matter can be achieved in the Minimal Supersymmetric Standard Model if staus are rather light, with mass around 100 GeV. We perform a detailed analysis of the relevant supersymmetric parameter space, including also the possibility of light selectrons and smuons, and of light higgsino- or wino-like charginos. In addition to the latest limits from direct and indirect detection of dark matter, ATLAS and CMS constraints on electroweak-inos and on sleptons are taken into account using a "simplified models" framework. Measurements of the properties of the Higgs boson at 125 GeV, which constrain among others the invisible decay of the Higgs boson into a pair of neutralinos, are also implemented in the analysis. We show that viable neutralino dark matter can be achieved for masses as low as 15 GeV. In this case, light charginos close to the LEP bound are required in addition to light right-chiral staus. Significant deviations are observed in the couplings of the 125 GeV Higgs boson; these constitute a promising way to probe the light neutralino dark matter scenario in the next run of the LHC.


We introduce k-stellated spheres and consider the class W_k(d) of triangulated d-manifolds, all of whose vertex links are k-stellated, and its subclass W_k*(d), consisting of the (k + 1)-neighbourly members of W_k(d). We introduce the mu-vector of any simplicial complex and show that, in the case of 2-neighbourly simplicial complexes, the mu-vector dominates the vector of Betti numbers componentwise; the two vectors are equal precisely for tight simplicial complexes. We are able to estimate/compute certain alternating sums of the components of the mu-vector of any 2-neighbourly member of W_k(d) for d >= 2k. As a consequence of this theory, we prove a lower bound theorem for such triangulated manifolds, and we determine the integral homology type of members of W_k*(d) for d >= 2k + 2. As another application, we prove that, when d ≠ 2k + 1, all members of W_k*(d) are tight. We also characterize the tight members of W_k*(2k + 1) in terms of their kth Betti numbers. These results more or less answer a recent question of Effenberger, and also provide a uniform and conceptual tightness proof for all but two of the known tight triangulated manifolds. We also prove a lower bound theorem for homology manifolds in which the members of W_1(d) provide the equality case. This generalizes a result (the d = 4 case) due to Walkup and Kühnel. As a consequence, it is shown that every tight member of W_1(d) is strongly minimal, providing substantial evidence in favour of the conjecture of Kühnel and Lutz that tight homology manifolds should be strongly minimal.


A new breed of microscopy techniques is coming to the forefront of optical imaging. They enhance the attainable 3D resolution of imaging in live and "fixed" cells (with minimal structural perturbation) by greater than tenfold, bringing subcellular structures into sharp focus. Along with long-term imaging, deep-tissue and high-throughput capabilities, new insights are being generated in various fields of biology. The main set of these next-generation optical microscopy techniques, along with select applications, is described in this article.