Abstract:
A fractal-dimension-based damage detection method is studied for a composite structure with random material properties. A composite plate with a localized matrix crack is considered; matrix cracks are often the initial damage mechanism in composites. The fractal-dimension-based method is applied to the static deformation curve of the structure to detect localized damage. The static deflection of a cantilevered composite plate under uniform loading is calculated using the finite element method. Composite materials show spatially varying random material properties because of complex manufacturing processes. The spatial variation of material properties is represented as a two-dimensional homogeneous Gaussian random field, and a Karhunen-Loeve (KL) expansion is used to generate the random field. The robustness of the fractal-dimension-based damage detection method is studied for the composite plate with this spatial variation in material properties.
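A minimal sketch of the detection step: the abstract does not name a specific fractal-dimension estimator, so Katz's estimator, applied over a sliding window of the computed deflection curve, is used here as one common choice; the window length and the NumPy-based implementation are assumptions.

```python
import numpy as np

def katz_fd(y):
    """Katz fractal dimension of a 1-D curve sampled at unit spacing."""
    x = np.arange(len(y))
    L = np.sum(np.hypot(np.diff(x), np.diff(y)))   # total curve length
    d = np.max(np.hypot(x - x[0], y - y[0]))       # farthest point from start
    n = len(y) - 1                                 # number of steps
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

def fd_profile(deflection, win=11):
    """Sliding-window fractal dimension along the static deflection curve.
    A localized peak in this profile flags a possible matrix crack."""
    half = win // 2
    fd = np.full(len(deflection), np.nan)
    for i in range(half, len(deflection) - half):
        fd[i] = katz_fd(deflection[i - half:i + half + 1])
    return fd
```

A crack produces a small local slope discontinuity in the deflection, raising the windowed fractal dimension there; the random spatial variation of material properties perturbs the baseline of the profile, which is the robustness question the study addresses.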
Abstract:
A direct discretization approach and an operator-splitting scheme are applied to the numerical simulation of a population balance system that models the synthesis of urea with a univariate population. The problem is formulated in axisymmetric form, and the setup is chosen such that a steady state is reached. Both solvers are assessed with respect to the accuracy of the results, using experimental data for comparison, and the efficiency of the simulations. Depending on the goal of the simulation, whether to track the evolution of the process accurately or to reach the steady state quickly, recommendations for the choice of solver are given.
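A minimal sketch of the operator-splitting idea for a univariate population balance, using Lie splitting of size-domain growth and a nucleation source; the first-order upwind discretization and the values of the growth rate G and nucleation rate B are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def growth_step(f, G, dl, dt):
    """First-order upwind step for df/dt + G*df/dl = 0 (growth, G > 0).
    Stable when the CFL number G*dt/dl <= 1."""
    fn = f.copy()
    fn[1:] -= (G * dt / dl) * (f[1:] - f[:-1])
    fn[0] -= (G * dt / dl) * f[0]   # zero inflow; nuclei enter via the source
    return fn

def nucleation_step(f, B, dl, dt):
    """Add B nuclei per unit time into the smallest-size cell."""
    fn = f.copy()
    fn[0] += B * dt / dl
    return fn

def lie_split_step(f, G, B, dl, dt):
    """One Lie-splitting step: solve growth, then nucleation, over dt."""
    return nucleation_step(growth_step(f, G, dl, dt), B, dl, dt)

# March to the steady state on a uniform size grid.
l = np.linspace(0.0, 1.0, 201)
f = np.zeros_like(l)
for _ in range(5000):
    f = lie_split_step(f, G=1e-3, B=1.0, dl=l[1] - l[0], dt=1.0)
```

The trade-off assessed in the paper appears here in miniature: splitting decouples the sub-processes and allows cheap sub-steps, at the cost of a splitting error that matters most when the transient, rather than the steady state, is of interest.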
Abstract:
The history of computing in India is inextricably intertwined with two interacting forces: the political climate (determined by the political party in power) and government policies, mainly driven by the technocrats and bureaucrats who acted within the boundaries drawn by the party in power. There were four break points (occurring in 1970, 1978, 1991 and 1998) that changed the direction of the development of computers and their applications. This article explains why these breaks occurred and how they affected the history of computing in India.
Abstract:
Prime movers and refrigerators based on thermoacoustics have gained considerable importance for practical applications in view of their absence of moving components, reasonable efficiency, and use of environmentally friendly working fluids. Devices such as the twin Standing Wave ThermoAcoustic Prime Mover (SWTAPM), the Traveling Wave ThermoAcoustic Prime Mover (TWTAPM) and the thermoacoustically driven Standing Wave ThermoAcoustic Refrigerator (SWTAR) have been studied by researchers, and numerical modeling and simulation play a vital role in their development. In our efforts to build these thermoacoustic systems, we have carried out numerical analysis of them using CFD procedures. The results of the analysis are compared with DeltaEC (freeware from LANL, USA) simulations and with experimental results wherever possible. For the CFD analysis, the commercial code Fluent 6.3.26 was used, along with the necessary boundary conditions, for different working fluids at various average pressures. The simulation results indicate that the choice of working fluid and the average pressure are critical to the performance of these thermoacoustic devices. It is also observed that the predictions of the CFD analysis are closer to the experimental results in most cases than those of the DeltaEC simulations.
Abstract:
Several papers have studied fault attacks on computing a pairing value e(P, Q), where P is a public point and Q is a secret point. In this paper, we observe that these attacks are in fact effective only on a small number of pairing-based protocols, and even then only when the protocols are implemented with specific symmetric pairings. We demonstrate the effectiveness of the fault attacks on a public-key encryption scheme, an identity-based encryption scheme, and an oblivious transfer protocol when implemented with a symmetric pairing derived from a supersingular elliptic curve with embedding degree 2.
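For context, the property such attacks exploit is the bilinearity of the pairing, which is why a secret point can appear as a pairing argument in these protocols at all; this identity is standard and not specific to the paper:

```latex
e([a]P, [b]Q) = e(P, Q)^{ab} \qquad \text{for all integers } a, b.
```

Broadly, faulting the pairing computation (for instance, its loop bound) yields related pairing values from which, for certain symmetric pairings with small embedding degree, information about the secret argument Q can be extracted.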
Abstract:
Exascale systems of the future are predicted to have a mean time between failures (MTBF) of less than one hour. At such low MTBFs, periodic checkpointing alone yields low efficiency, because frequent application failures cause a large amount of work to be lost to rollbacks. In such scenarios, proactive fault tolerance mechanisms that can avoid a significant number of failures are highly necessary. In this work, we have developed a mechanism for proactive fault tolerance using partial replication of a set of application processes. Our fault tolerance framework adaptively changes the set of replicated processes periodically, based on failure predictions, to avoid failures. We have developed an MPI prototype implementation, PAREP-MPI, that allows the replica set to be changed. We show that our adaptive process replication strategy significantly outperforms existing mechanisms, providing up to 20 percent improvement in application efficiency even for exascale systems.
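A schematic of the adaptive replica-selection policy; the greedy, budget-capped selection and the function names are illustrative assumptions, since the abstract does not describe PAREP-MPI's actual interface.

```python
import heapq

def choose_replica_set(failure_prob, budget):
    """Greedily replicate the ranks most likely to fail in the next interval.

    failure_prob: dict mapping MPI rank -> predicted failure probability
                  (output of a failure predictor).
    budget:       maximum number of ranks that may hold replicas.
    """
    return set(heapq.nlargest(budget, failure_prob, key=failure_prob.get))

# Schematic adaptation loop: each interval, re-run the predictor and
# migrate replicas so that the fixed replication budget always covers
# the ranks currently predicted to be most failure-prone.
```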
Abstract:
The ultimate bearing capacity of a circular footing placed over a rock mass is evaluated by using the lower bound theorem of limit analysis in conjunction with finite elements and nonlinear optimization. The generalized Hoek-Brown (HB) failure criterion was used, with the exponent kept constant at alpha = 0.5. The failure criterion was smoothened in both the meridian and pi planes. The nonlinear optimization was carried out by employing an interior point method based on the logarithmic barrier function. The obtained bearing capacities are presented in non-dimensional form for different values of GSI, m(i), sigma(ci)/(gamma b) and q/sigma(ci). Failure patterns are also examined for a few cases. For validation, computations were also performed for a strip footing. The results obtained from the analysis compare well with the data reported in the literature. Since the equilibrium conditions are precisely satisfied only at the centroids of the elements, not everywhere in the domain, the obtained lower bound solution is approximate, not a true lower bound.
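For reference, the generalized Hoek-Brown criterion in its standard 2002 form (D is the disturbance factor); the study fixes the exponent at alpha = 0.5 rather than using the GSI-dependent expression:

```latex
\sigma_1' = \sigma_3' + \sigma_{ci}\left( m_b\,\frac{\sigma_3'}{\sigma_{ci}} + s \right)^{\alpha},
\qquad
m_b = m_i \exp\!\left(\frac{\mathrm{GSI} - 100}{28 - 14D}\right),
\qquad
s = \exp\!\left(\frac{\mathrm{GSI} - 100}{9 - 3D}\right),
\qquad
\alpha = \frac{1}{2} + \frac{1}{6}\left( e^{-\mathrm{GSI}/15} - e^{-20/3} \right).
```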
Abstract:
Blood travels throughout the body, and its flow is therefore modulated by changes in body condition. As a consequence, the wrist pulse signal contains important information about the status of the human body. In this work we employ signal processing techniques to extract this information. Radial artery pulse pressure signals are acquired noninvasively at the wrist for several subjects in two cases of interest: before and after exercise, and before and after lunch. Analysis is performed by fitting a bi-modal Gaussian model to the data and extracting spatial features from the fit. The spatial features show statistically significant (p < 0.001) changes between the groups in both cases, indicating that they are effective in distinguishing the changes taking place due to exercise or food intake. A recursive cluster elimination based support vector machine classifier is used to classify between the groups, achieving high classification accuracies of 99.71% for the exercise case and 99.94% for the lunch case. This paper demonstrates the utility of these spatial features in studying wrist pulse signals obtained under various experimental conditions. Their ability to distinguish changing body conditions can potentially be used in various healthcare applications.
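A minimal sketch of the model-fitting step: the bi-modal Gaussian form follows the abstract, while the SciPy-based least-squares fit and the initial-guess handling are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def bimodal(t, a1, m1, s1, a2, m2, s2):
    """Sum of two Gaussians modeling one period of the wrist pulse."""
    g = lambda a, m, s: a * np.exp(-((t - m) ** 2) / (2.0 * s ** 2))
    return g(a1, m1, s1) + g(a2, m2, s2)

def fit_pulse(t, pulse, p0):
    """Least-squares fit of the bimodal model to one pulse period.
    The fitted amplitudes, locations, and widths are the kind of
    spatial features compared between the before/after groups."""
    params, _ = curve_fit(bimodal, t, pulse, p0=p0, maxfev=10000)
    return params
```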
Abstract:
This paper describes the development and evolution of research themes in the Design Theory and Methodology (DTM) conference. Essays containing reflections on the history of DTM, supported by an analysis of session titles and papers winning the "best paper award", describe the development of the research themes. A second set of essays describes the evolution of several key research themes. Two broad trends in research themes are evident, with a third one emerging. The topics of the papers in the first decade or so reflect an underlying aim to apply artificial intelligence toward developing systems that could "design". Doing so required understanding how human designers behave, formalizing design processes so that they could be computed, and formalizing representations of design knowledge. The themes in the first DTM conference and the recollections of the DTM founders reflect this underlying aim. The second decade of DTM saw the emergence of product development as an underlying concern and a growth in a systems view of design. More recently, there appears to be a trend toward design-led innovation, which entails both executing the design process more efficiently and understanding the characteristics of market-leading designs so as to produce engineered products and systems of exceptional quality and customer satisfaction.
Abstract:
In this paper, we integrate two or more compliant mechanisms to obtain enhanced functionality for manipulating and mechanically characterizing grasped objects of varied size (cm to sub-mm), stiffness (10^5 N/m to 10 N/m), and material (cement to biological cells). The concepts of the spring-lever (SL) model, stiffness maps, and non-dimensional kinetoelastostatic maps are used to design composite and multi-scale compliant mechanisms. Composite compliant mechanisms comprise two or more different mechanisms within a single elastic continuum, while multi-scale ones have the additional feature of a substantial difference in the sizes of the mechanisms that are combined into one. We present three applications: (i) a composite compliant device to measure the failure load of cement samples; (ii) a composite multi-scale compliant gripper to measure the bulk stiffness of zebrafish embryos; and (iii) a compliant gripper combined with a negative-stiffness element to reduce the overall stiffness. Prototypes of all three devices were made and tested: the cement sample needed a breaking force of 22.5 N; the zebrafish embryo was found to have a bulk stiffness of about 10 N/m; and the stiffness of a compliant gripper was reduced by 99.8% to 0.2 N/m.
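A worked check on the reported stiffness reduction, under our assumption that the negative-stiffness element acts in parallel with the gripper at the point of interest, so that the stiffnesses add:

```latex
k_{\text{eff}} = k_{\text{gripper}} + k_{\text{neg}},
\qquad
100\ \mathrm{N/m} + (-99.8\ \mathrm{N/m}) = 0.2\ \mathrm{N/m},
```

which is consistent with the stated 99.8% reduction to 0.2 N/m.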
Abstract:
How do we assess the capability of a compliant mechanism of given topology and shape? The kinetoelastostatic maps proposed in this paper help answer this question. These maps are drawn in 2D using two non-dimensional quantities, one capturing the nonlinear static response and the other the geometry, material, and applied forces. Geometrically nonlinear finite element analysis is used to create the maps for compliant mechanisms consisting of slender beams. In addition to the topology and shape, the overall proportions and the cross-sectional proportions of the beam segments are kept fixed for a given map. The finite region of the map is parameterized using a non-dimensional quantity defined as the slenderness ratio. The shape and size of the map and the parameterized curves inside it indicate the complete kinetoelastostatic capability of the corresponding compliant mechanism of given topology, shape, and fixed proportions. Static responses considered in this paper include input/output displacement, geometric amplification, mechanical advantage, and maximum stress. The maps can be used to compare mechanisms, to choose a suitable mechanism for an application, or to re-design one as needed. The usefulness of the non-dimensional maps is demonstrated with several varied applications; the non-dimensional portrayal of snap-through mechanisms is one such example. The effect of the cross-sectional shape of the beam segments and the role of the different segments in a mechanism are discussed, as are the extension to 3D compliant mechanisms, the cases of multiple inputs and outputs, and moment loads. The effects of disproportionate changes on the maps are also analyzed.
Abstract:
In this work, we describe a system that recognises open-vocabulary, isolated, online handwritten Tamil words, and we extend it to recognize a paragraph of writing. We explain in detail each step involved in the process: segmentation, preprocessing, feature extraction, classification and bigram-based post-processing. On our database of 45,000 handwritten words obtained through a tablet PC, we obtain symbol-level accuracies of 78.5% and 85.3% without and with post-processing using symbol-level language models, respectively; the corresponding word-level accuracies are 40.1% and 59.6%. A line- and word-level segmentation strategy is proposed, which gives promising results of 100% line segmentation and 98.1% word segmentation accuracy on our initial trials with 40 handwritten paragraphs. The two modules have been combined to obtain a full-fledged page recognition system for online handwritten Tamil data. To the authors' knowledge, this is the first attempt at recognition of open-vocabulary, online handwritten paragraphs in any Indian language.
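A schematic of the bigram-based post-processing as a Viterbi-style rescoring over symbol hypotheses; the language-model weight and back-off score are illustrative assumptions, not the paper's exact decoder.

```python
def rescore(candidates, bigram_logprob, lm_weight=0.5, backoff=-10.0):
    """Choose the best symbol sequence by combining per-position classifier
    scores with a symbol-level bigram language model (Viterbi search).

    candidates:     list over positions; each entry is a list of
                    (symbol, classifier_log_score) alternatives.
    bigram_logprob: dict mapping (prev_symbol, symbol) -> log P(symbol|prev).
    """
    # best[s] = (score of the best path ending in symbol s, that path)
    best = {s: (lp, [s]) for s, lp in candidates[0]}
    for alts in candidates[1:]:
        new = {}
        for s, lp in alts:
            score, path = max(
                (ps + lm_weight * bigram_logprob.get((p, s), backoff) + lp, pp)
                for p, (ps, pp) in best.items()
            )
            new[s] = (score, path + [s])
        best = new
    return max(best.values())[1]   # highest-scoring symbol sequence
```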
Abstract:
A discussion is provided of the comments raised by the discusser (Clausen, 2015) [1] on the article recently published by the authors (Chakraborty and Kumar, 2015). The effect of the exponent alpha becomes more critical for values of GSI smaller than about 30. On the other hand, for greater values of GSI, the results obtained by the authors earlier remain primarily independent of alpha and can readily be used.
Abstract:
Standard approaches to ellipse fitting are based on minimizing the algebraic or geometric distance between the given data and a template ellipse. When the data are noisy and come from a partial ellipse, the state-of-the-art methods tend to produce biased ellipses. We rely on the sampling structure of the underlying signal and show that the x- and y-coordinate functions of an ellipse are finite-rate-of-innovation (FRI) signals whose parameters can be estimated from partial data. We consider both uniform and nonuniform sampling scenarios in the presence of noise and show that the data can be modeled as a sum of random amplitude-modulated complex exponentials. A low-pass filter is used to suppress noise and approximate the data as a sum of weighted complex exponentials. The annihilating filter used in FRI approaches is applied to estimate the sampling interval in closed form. We perform experiments on simulated and real data, and assess both objective and subjective performance in comparison with state-of-the-art ellipse fitting methods. The proposed method produces ellipses with less bias, and the mean-squared error is lower by about 2 to 10 dB. We show applications of ellipse fitting to iris images, starting from partial edge contours, and to free-hand ellipses drawn on a touch-screen tablet.
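A noiseless, uniformly sampled sketch of the annihilating-filter step; the paper's low-pass denoising and nonuniform-sampling treatment are omitted. One coordinate of an ellipse, x(t) = c + r cos(w t + phi), sampled uniformly is a sum of K = 3 complex exponentials {1, e^{jw}, e^{-jw}}, so a length-4 filter annihilates it.

```python
import numpy as np

def annihilating_filter_roots(x, K):
    """Recover the exponents u_k of x[n] = sum_k a_k * u_k**n from uniform
    samples: find the filter h that annihilates x, then take its roots."""
    N = len(x)
    # Toeplitz system: sum_k h[k] * x[n - k] = 0 for n = K, ..., N-1.
    A = np.array([[x[n - k] for k in range(K + 1)] for n in range(K, N)])
    # h is the right singular vector for the smallest singular value.
    _, _, Vh = np.linalg.svd(A)
    h = Vh[-1].conj()            # real for real-valued samples
    return np.roots(h)           # roots are the exponentials u_k

# Example: recover the angular frequency of one ellipse coordinate.
n = np.arange(64)
w_true = 0.3
x = 2.0 + 1.5 * np.cos(w_true * n + 0.7)
roots = annihilating_filter_roots(x, K=3)
w_hat = np.abs(np.angle(roots)).max()   # angle of the conjugate root pair
```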
Abstract:
All computers process information electronically. A processing method based on magnetism is reported here, in which networks of interacting submicrometer magnetic dots are used to perform logic operations and propagate information at room temperature. The logic states are signaled by the magnetization direction of the single-domain magnetic dots; the dots couple to their nearest neighbors through magnetostatic interactions. Magnetic solitons carry information through the networks, and an applied oscillating magnetic field feeds energy into the system and serves as a clock. These networks offer a several thousandfold increase in integration density and a hundredfold reduction in power dissipation over current microelectronic technology.