988 results for Recovery framework
Abstract:
Digital human modeling (DHM) involves modeling the structure, form, and functional capabilities of human users for ergonomics simulation. This paper presents the application of geometric procedures for investigating the characteristics of human visual capabilities, which are particularly important in this context. Using the cone of unrestricted directions through the pupil on a tessellated head model as the geometric interpretation of the clinical field-of-view (FoV), the results obtained are experimentally validated. By estimating the pupil movement for a given gaze direction using Listing's law, FoVs are re-computed. Significant variation of the FoV is observed with variation in gaze direction. A novel cube-grid representation, which integrates the unit-cube representation of directions and the enhanced slice representation, has been introduced for fast and exact point classification in point visibility analysis for a given FoV. Computation of the containment frequency of every grid cell for a given set of FoVs enables determination of percentile-based FoV contours for estimating the visual performance of a given population. This new concept makes visibility analysis more meaningful from an ergonomics point of view. The algorithms are fast enough to support interactive analysis of reasonably complex scenes on a typical desktop computer. (C) 2011 Elsevier Ltd. All rights reserved.
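As a concrete illustration of the containment-frequency idea, the following Python sketch (not the authors' implementation; `fov_masks` and the grid encoding are assumptions) computes, for each direction cell of a discretized grid, the fraction of FoVs in a population that contain it, and thresholds that frequency to obtain a percentile-based FoV:

```python
import numpy as np

def containment_frequency(fov_masks):
    """Fraction of the population's FoVs that contain each direction cell.

    fov_masks: (n_subjects, n_cells) boolean array; True means the cell
    is visible within that subject's field-of-view.
    """
    return fov_masks.mean(axis=0)

def percentile_fov(fov_masks, p):
    """Cells visible to at least p percent of the population; the boundary
    of this cell set is the p-th percentile FoV contour."""
    return containment_frequency(fov_masks) >= p / 100.0

# toy example: 3 subjects, 5 direction cells
masks = np.array([[1, 1, 1, 0, 0],
                  [1, 1, 0, 0, 0],
                  [1, 1, 1, 1, 0]], dtype=bool)
print(percentile_fov(masks, 90))  # only the first two cells are seen by >= 90%
```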
Abstract:
The single-crystal X-ray structure of a cation-templated manganese-oxalate coordination polymer, [NH(C2H5)3][Mn2(ox)3]·5H2O (1), is reported. In 1, the triethylammonium cation is entrapped in the cavities of 2-D honeycomb layers constructed by oxalate and water. Acyclic tetrameric water clusters and discrete water molecules assemble the parallel 2-D honeycomb oxalate layers, via an intricate array of hydrogen bonds, into an overall 3-D network. The magnetic susceptibility, with and without the water cluster, is reported along with infrared and EPR studies.
Abstract:
Purpose: The authors aim at developing a pseudo-time, sub-optimal stochastic filtering approach based on a derivative-free variant of the ensemble Kalman filter (EnKF) for solving the inverse problem of diffuse optical tomography (DOT), while making use of a shape-based reconstruction strategy that enables representing a cross section of an inhomogeneous tumor boundary by a general closed curve. Methods: The optical parameter fields to be recovered are approximated via an expansion based on circular harmonics (CH) (Fourier basis functions), and the EnKF is used to recover the coefficients in the expansion with both simulated and experimentally obtained photon fluence data on phantoms with inhomogeneous inclusions. The process and measurement equations in the pseudo-dynamic EnKF (PD-EnKF) presently yield a parsimonious representation of the filter variables, which consist of only the Fourier coefficients and the constant scalar parameter value within the inclusion. Using fictitious, low-intensity Wiener noise processes in suitably constructed "measurement" equations, the filter variables are treated as pseudo-stochastic processes so that their recovery within a stochastic filtering framework is made possible. Results: In our numerical simulations, we have considered both elliptical inclusions (two inhomogeneities) and those with more complex shapes (such as an annular ring and a dumbbell) in 2-D objects which are cross sections of a cylinder, with background absorption and (reduced) scattering coefficients chosen as μ_a^b = 0.01 mm^-1 and μ_s'^b = 1.0 mm^-1, respectively. We also assume μ_a = 0.02 mm^-1 within the inhomogeneity (for the single-inhomogeneity case) and μ_a = 0.02 and 0.03 mm^-1 (for the two-inhomogeneities case). The reconstruction results by the PD-EnKF are shown to be consistently superior to those from a deterministic and explicitly regularized Gauss-Newton algorithm. We have also estimated the unknown μ_a from experimentally gathered fluence data and verified the reconstruction by matching the experimental data with the computed one. Conclusions: The PD-EnKF, which exhibits little sensitivity to variations in the fictitiously introduced noise processes, is also proven to be accurate and robust in recovering a spatial map of the absorption coefficient from DOT data. With the help of the shape-based representation of the inhomogeneities and an appropriate scaling of the CH expansion coefficients representing the boundary, we have been able to recover inhomogeneities representative of the shape of malignancies in medical diagnostic imaging. (C) 2012 American Association of Physicists in Medicine. [DOI: 10.1118/1.3679855]
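As a rough sketch of the two ingredients named above, the Python fragment below (illustrative only; it omits the pseudo-time process equations and other specifics of the authors' PD-EnKF, and all names are assumptions) parameterizes a closed inclusion boundary by circular harmonics and applies one standard perturbed-observation ensemble Kalman analysis step to the ensemble of expansion coefficients:

```python
import numpy as np

def ch_boundary(coeffs, theta):
    """Radius of a closed curve from circular-harmonic (Fourier) coefficients:
    r(theta) = a0 + sum_k (a_k cos k*theta + b_k sin k*theta),
    with coeffs = [a0, a1, b1, a2, b2, ...]."""
    r = np.full_like(theta, float(coeffs[0]))
    n_harmonics = (len(coeffs) - 1) // 2
    for k in range(1, n_harmonics + 1):
        r += coeffs[2 * k - 1] * np.cos(k * theta) + coeffs[2 * k] * np.sin(k * theta)
    return r

def enkf_analysis(X, forward, d, R):
    """One perturbed-observation EnKF update of the filter variables.

    X: (n_params, N) ensemble (CH coefficients plus the scalar mu_a inside
    the inclusion); forward: maps a parameter vector to predicted fluence
    measurements; d: (m,) measured data; R: (m, m) noise covariance.
    """
    N = X.shape[1]
    Y = np.column_stack([forward(X[:, i]) for i in range(N)])  # predicted data
    A = X - X.mean(axis=1, keepdims=True)                      # state anomalies
    HA = Y - Y.mean(axis=1, keepdims=True)                     # data anomalies
    K = A @ HA.T @ np.linalg.inv(HA @ HA.T + (N - 1) * R)      # Kalman gain
    eps = np.random.multivariate_normal(np.zeros(len(d)), R, size=N).T
    return X + K @ (d[:, None] + eps - Y)
```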
Abstract:
Obtaining correctly folded proteins from inclusion bodies of recombinant proteins expressed in bacterial hosts requires solubilization with denaturants and a refolding step. Aggregation competes with the second step. Refolding of eight different proteins was carried out by precipitation with smart polymers. These proteins have different molecular weights and different numbers of disulfide bridges, and some of them are known to be highly prone to aggregation. A high-throughput refolding screen based upon the fluorescence emission maximum around 340 nm (for correctly folded proteins) was developed to identify the suitable smart polymer. The proteins could be dissociated and recovered after the refolding step. The refolding could be scaled up, and high refolding yields in the range of 8 mg L^-1 (for CD4D12, the first two domains of human CD4) to 58 mg L^-1 (for malETrx, thioredoxin fused with the signal peptide of maltose binding protein) were obtained. Dynamic light scattering (DLS) showed that the polymer, if chosen correctly, acted as a pseudochaperonin and bound to the proteins. It also showed that the time for maximum binding was about 50 min, which coincided with the incubation time (with the polymer) required before precipitation for maximum recovery of folded proteins. The refolded proteins were characterized by fluorescence emission spectra, circular dichroism (CD) spectroscopy, melting temperature (T_m), and surface hydrophobicity measurement by ANS (8-anilino-1-naphthalenesulfonic acid) fluorescence. A biological activity assay for thioredoxin and a fluorescence-based assay in the case of maltose binding protein (MBP) were also carried out to confirm correct refolding. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
A hydrothermal reaction of cobalt nitrate, 4,4'-oxybis(benzoic acid) (OBA), 1,2,4-triazole, and NaOH gave rise to a deep purple compound, [Co4(triazolate)2(OBA)3], I, possessing Co4 clusters. The Co4 clusters are connected through the triazolate moieties, forming a two-dimensional layer that closely resembles the TiS2 layer. The layers are pillared by the OBA units, forming the three-dimensional structure. To the best of our knowledge, this is the first observation of a pillared TiS2 layer in a metal-organic framework compound. Magnetic studies in the temperature range 1.8-300 K indicate strong antiferromagnetic interactions within the Co4 clusters. The structure as well as the magnetic behavior of the present compound has been compared with the previously reported related compound [Co2(μ3-OH)(μ2-H2O)(pyrazine)(OBA)(OBAH)], prepared using pyrazine as the linker between the Co4 clusters.
Abstract:
Today's SoCs are complex designs with multiple embedded processors, memory subsystems, and application-specific peripherals. The memory architecture of embedded SoCs strongly influences the power and performance of the entire system. Further, the memory subsystem constitutes a major part (typically up to 70%) of the silicon area of a current-day SoC. In this article, we address on-chip memory architecture exploration for DSP processors whose memory is organized as multiple banks, where banks can be single- or dual-ported with non-uniform bank sizes. We propose two different methods for physical memory architecture exploration and identify the strengths and applicability of these methods in a systematic way. Both methods address memory architecture exploration for a given target application by considering the application's data access characteristics, and generate a set of Pareto-optimal design points that are interesting from a power, performance, and VLSI area perspective. To the best of our knowledge, this is the first comprehensive work on memory space exploration at the physical memory level that integrates data layout and memory exploration to address the system objectives from both hardware design and application software development perspectives. Further, we propose an automatic framework that explores the design space, identifying hundreds of Pareto-optimal design points within a few hours of running on a standard desktop configuration.
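As a toy illustration of the Pareto filtering that underlies such an exploration (the proposed exploration methods themselves are not reproduced here, and the candidate numbers are made up), the Python sketch below keeps only the non-dominated (power, performance, area) design points:

```python
def pareto_front(points):
    """Return the Pareto-optimal design points, all objectives minimized.

    points: list of (power, access_cycles, area) tuples for candidate
    memory architectures. A point is dropped if some other point is at
    least as good in every objective and is not identical to it.
    """
    return [p for p in points
            if not any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                       for q in points)]

# toy candidates: (power in mW, access cycles, area in mm^2)
designs = [(10, 120, 2.0), (12, 100, 2.2), (9, 140, 1.8), (13, 130, 2.5)]
print(pareto_front(designs))  # the last design is dominated by (12, 100, 2.2)
```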
Abstract:
Existing approaches to digital halftoning of images are based primarily on thresholding. We propose a general framework for image halftoning where some function of the output halftone tracks another function of the input gray-tone. This approach is shown to unify most existing algorithms and to provide useful insights. Further, the new interpretation allows us to remedy problems in existing algorithms such as error diffusion, and subsequently to achieve halftones having superior quality. The very general nature of the proposed method is an advantage, since it offers a wide choice of the three filters and an update rule. An interesting product of this framework is that equally good, or better, halftones can be obtained by thresholding a noise process instead of the image itself.
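Error diffusion, named above, is the most familiar special case of such tracking: the quantization error is fed forward so that the local average of the binary output follows the input gray level. A generic textbook Floyd-Steinberg implementation in Python (not the authors' generalized scheme) looks like this:

```python
import numpy as np

def floyd_steinberg(gray):
    """Binary halftone of a gray-tone image with values in [0, 1]."""
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new                      # quantization error
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16    # diffuse to unprocessed pixels
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out

halftone = floyd_steinberg(np.random.rand(32, 32))  # toy gray-tone input
```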
Abstract:
Rathour RK, Narayanan R. Influence fields: a quantitative framework for representation and analysis of active dendrites. J Neurophysiol 107: 2313-2334, 2012. First published January 18, 2012; doi:10.1152/jn.00846.2011. Neuronal dendrites express numerous voltage-gated ion channels (VGICs), typically with spatial gradients in their densities and properties. Dendritic VGICs, their gradients, and their plasticity endow neurons with information processing capabilities that are higher than those of neurons with passive dendrites. Despite this, frameworks that incorporate dendritic VGICs and their plasticity into neurophysiological and learning theory models have been few and far between. Here, we develop a generalized quantitative framework to analyze the extent of influence of a spatially localized VGIC conductance on different physiological properties along the entire stretch of a neuron. Employing this framework, we show that the extent of influence of a VGIC conductance is largely independent of the conductance magnitude but is heavily dependent on the specific physiological property and background conductances. Morphologically, our analyses demonstrate that the influences of different VGIC conductances located on an oblique dendrite are confined within that oblique dendrite, thus providing further credence to the postulate that dendritic branches act as independent computational units. Furthermore, distinguishing between active and passive propagation of signals within a neuron, we demonstrate that the influence of a VGIC conductance is spatially confined only when propagation is active. Finally, we reconstruct functional gradients from VGIC conductance gradients using influence fields and demonstrate that the cumulative contribution of VGIC conductances in adjacent compartments plays a critical role in determining physiological properties at a given location. We suggest that our framework provides a quantitative basis for unraveling the roles of dendritic VGICs and their plasticity in neural coding, learning, and homeostasis.
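One loose, illustrative reading of the influence-field idea (the paper's exact definitions may differ, and both function names here are assumptions) is to quantify influence as the normalized change that a localized conductance produces in some measurement along the dendrite, and then report how far that change extends:

```python
import numpy as np

def influence_field(with_channel, without_channel):
    """Percentage change in a physiological measurement (e.g., input
    resistance) at each dendritic location after inserting a spatially
    localized conductance; a crude stand-in for the paper's definition."""
    base = np.asarray(without_channel, dtype=float)
    return 100.0 * (np.asarray(with_channel, dtype=float) - base) / base

def extent_of_influence(field, locations, fraction=0.1):
    """Span of dendrite (same units as `locations`) over which the
    influence magnitude stays above a fraction of its peak."""
    locations = np.asarray(locations, dtype=float)
    magnitude = np.abs(field)
    mask = magnitude >= fraction * magnitude.max()
    return locations[mask].max() - locations[mask].min()
```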
Abstract:
Leaves and leaf sheaths of banana and areca husk (Areca catechu) constitute an important component of urban solid waste (USW) in India and are difficult to degrade under normal windrow composting conditions. A successful anaerobic digestion method built around the fermentation properties of these feedstocks has been developed; it uses no moving parts, pretreatment, or energy input while enabling recovery of four products: fiber, biogas, compost, and pest repellent. SRTs of 27 d and 35 d were found to be optimum for fiber recovery from banana leaf and areca husk, respectively. Banana leaf showed a degradation pattern different from other leaves, with slow pectin degradation (80%) and 40% lignin removal at 27 d SRT. Areca husk, however, showed a degradation pattern similar to other plant biomass. Mass recovery levels for banana leaf were fiber 20%, biogas 70% (400 ml/g TS), and compost 10%; for areca husk they were fiber 50%, biogas 45% (250 ml/g TS), and compost 5%. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
Online remote visualization and steering of critical weather applications like cyclone tracking are essential for effective and timely analysis by the geographically distributed climate science community. A steering framework for controlling high-performance simulations of critical weather events needs to take into account both the steering inputs of the scientists and the criticality needs of the application, including a minimum progress rate for the simulations and continuous visualization of significant events. In this work, we have developed INST, an integrated user-driven and automated steering framework for simulations, online remote visualization, and analysis of critical weather applications. INST gives the user control over various application parameters, including the region of interest, the resolution of the simulation, and the frequency of data output for visualization. Unlike existing efforts, our framework considers both the steering inputs and the criticality of the application, namely the minimum progress rate needed, as well as various resource constraints, including storage space and network bandwidth, to decide the best possible parameter values for simulation and visualization.
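A minimal sketch of the kind of decision such a framework has to make, assuming per-candidate estimates of progress rate and data rates are available (hypothetical structure and names, not INST's actual algorithm): among the parameter settings that satisfy the progress-rate and resource constraints, pick the finest one.

```python
def pick_parameters(candidates, min_progress, bandwidth_limit, storage_limit):
    """Choose the finest feasible (resolution, visualization frequency) setting.

    candidates: list of dicts with keys 'resolution', 'viz_frequency',
    'progress_rate' (simulated hours per wall-clock hour), 'network_rate'
    and 'storage_rate' (MB/s), estimated for each parameter choice.
    """
    feasible = [c for c in candidates
                if c["progress_rate"] >= min_progress
                and c["network_rate"] <= bandwidth_limit
                and c["storage_rate"] <= storage_limit]
    if not feasible:
        raise ValueError("no parameter setting meets the criticality constraints")
    # prefer higher resolution, then more frequent visualization
    return max(feasible, key=lambda c: (c["resolution"], c["viz_frequency"]))
```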
Abstract:
Exascale systems of the future are predicted to have a mean time between failures (MTBF) of less than one hour. Malleable applications, in which the number of processors on which an application executes can be changed during execution, can exploit their malleability to better tolerate high failure rates. We present AdFT, an adaptive fault tolerance framework for long-running malleable applications that maximizes application performance in the presence of failures. The AdFT framework includes cost models for evaluating the benefits of various fault tolerance actions, including checkpointing, live migration, and rescheduling, and runtime decisions for dynamically selecting the fault tolerance action at different points of the application's execution to maximize performance. Simulations with real and synthetic failure traces show that our approach outperforms existing fault tolerance mechanisms for malleable applications, yielding up to 23% improvement in application performance, and is effective even for petascale systems and beyond.
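A toy stand-in for the runtime decision step (the cost figures and the lost-work estimate below are illustrative assumptions, not AdFT's cost models): estimate the total overhead of each fault tolerance action at the current point of execution and pick the cheapest.

```python
def expected_lost_work(run_length, mtbf):
    """First-order estimate of time lost to failures over an unprotected
    run of `run_length` seconds: roughly run_length / mtbf failures, each
    wasting about half the work done since the last safe point."""
    return (run_length / mtbf) * (run_length / 2.0)

def choose_action(costs):
    """Pick the fault tolerance action with the lowest estimated overhead."""
    return min(costs, key=costs.get)

# toy estimates (seconds) at one decision point of a malleable run
remaining, mtbf = 3600.0, 3000.0
costs = {
    "no-action": expected_lost_work(remaining, mtbf),
    "checkpoint": 150.0 + expected_lost_work(600.0, mtbf),  # rework bounded by the interval
    "live-migration": 300.0,                                # vacate a failure-prone node
    "reschedule": 700.0,                                    # shrink to healthy processors
}
print(choose_action(costs))  # -> 'checkpoint' with these numbers
```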
Abstract:
We revisit the issue of considering stochasticity of Grassmannian coordinates in N = 1 superspace, which was analyzed previously by Kobakhidze et al. In this stochastic supersymmetry (SUSY) framework, the soft SUSY breaking terms of the minimal supersymmetric Standard Model (MSSM), such as the bilinear Higgs mixing parameter, the trilinear coupling, and the gaugino mass parameters, are all proportional to a single mass parameter ξ, a measure of supersymmetry breaking arising out of stochasticity. While a nonvanishing trilinear coupling at the high scale is a natural outcome of the framework, and a favorable feature for obtaining the lighter Higgs boson mass m_h at 125 GeV, the model produces tachyonic sleptons, or staus that turn out to be too light. The previous analyses took Λ, the scale at which the input parameters are given, to be larger than the gauge coupling unification scale M_G in order to generate acceptable scalar masses radiatively at the electroweak scale. Still, this was inadequate for obtaining m_h at 125 GeV. We find that a Higgs boson at 125 GeV is readily achievable, provided we are ready to accommodate a nonvanishing scalar-mass soft SUSY breaking term, similar to what is done in minimal anomaly-mediated SUSY breaking (AMSB), in contrast to a pure AMSB setup. Thus, the model can easily accommodate the Higgs data, LHC limits on squark masses, WMAP data on the dark matter relic density, flavor physics constraints, and XENON100 data. In contrast to the previous analyses, we consider Λ = M_G, thus avoiding any ambiguities of post-grand-unified-theory physics. The idea of stochastic superspace can easily be generalized to various scenarios beyond the MSSM. DOI: 10.1103/PhysRevD.87.035022