7 results for women in engineering
at Indian Institute of Science - Bangalore - India
Abstract:
A pseudo-dynamical approach for a class of inverse problems involving static measurements is proposed and explored. Following linearization of the minimizing functional associated with the underlying optimization problem, the new strategy results in a system of linearized ordinary differential equations (ODEs) whose steady-state solutions yield the desired reconstruction. We consider some explicit and implicit schemes for integrating the ODEs and thus establish a deterministic reconstruction strategy without explicit use of regularization. A stochastic reconstruction strategy is then developed using an ensemble Kalman filter, wherein these ODEs serve as the measurement model. Finally, we assess the numerical efficacy of the developed tools against a few linear and nonlinear inverse problems of engineering interest.
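A minimal sketch of the pseudo-dynamical idea, assuming a linear measurement model y = Ax and the standard least-squares functional (the operator, data, step size, and explicit-Euler scheme below are illustrative assumptions, not the paper's actual formulation):

```python
import numpy as np

# Pseudo-dynamical reconstruction for a linear inverse problem y = A x.
# Linearizing the least-squares functional J(x) = 0.5 * ||A x - y||^2
# yields the gradient-flow ODE  dx/dt = -A^T (A x - y), whose steady
# state satisfies the normal equations A^T A x = A^T y.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))       # illustrative forward operator
x_true = rng.standard_normal(5)
y = A @ x_true                         # static (noise-free) measurements

x = np.zeros(5)                        # initial guess
dt = 0.01                              # explicit-Euler pseudo-time step
for _ in range(20000):
    x -= dt * (A.T @ (A @ x - y))      # march the ODE to steady state

print(np.allclose(x, x_true, atol=1e-6))  # steady state ~ reconstruction
```

An implicit (backward-Euler) variant would instead solve (I + dt * A^T A) x_new = x + dt * A^T y at each step, permitting larger stable pseudo-time steps at the cost of a linear solve per step.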
Abstract:
Design creativity involves developing novel and useful solutions to design problems. The research in this article is an attempt to understand how the novelty of a design resulting from a design process is related to the kinds of outcomes, described here as constructs, involved in the design process. A model of causality, the SAPPhIRE model, is used as the basis of the analysis. The analysis builds on previous research showing that designing involves the development and exploration of the seven basic constructs of the SAPPhIRE model, which constitute the causal connection between the various levels of abstraction at which a design can be described. The constructs are state change, action, parts, phenomenon, input, organs, and effect. The following two questions are asked: Is there a relationship between novelty and the constructs? If there is a relationship, what is the degree of this relationship? A hypothesis is developed to answer the questions: an increase in the number and variety of ideas explored while designing should enhance the variety of the concept space, leading to an increase in the novelty of the concept space. Eight existing observational studies of designing sessions are used to empirically validate the hypothesis. Each designing session involves an individual designer, experienced or novice, solving a design problem by producing concepts while following a think-aloud protocol. The results indicate a dependence of the novelty of the concept space on the variety of the concept space, and of the variety of the concept space on the variety of the idea space, thereby validating the hypothesis. The results also reveal a strong correlation between novelty and the constructs; the correlation value decreases as the abstraction level of the constructs reduces, signifying the importance of using constructs at higher abstraction levels for enhancing novelty.
Abstract:
In recent times, CFD tools have become increasingly useful in engineering design studies, especially in the area of aerospace vehicles. This is largely due to the advent of high-speed computing platforms, in addition to the development of new efficient algorithms. Algorithms based on kinetic schemes have been shown to be very robust, and meshless methods offer certain advantages over other methods. Preliminary investigations of blood flow visualization through an artery using a CFD tool have shown encouraging results, which need to be further verified and validated.
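To illustrate the meshless flavour referred to above, here is a minimal sketch of a least-squares derivative estimate on a scattered point cloud, which requires no grid connectivity (the point cloud, test field, and one-dimensional setting are invented for the example and are not the authors' solver):

```python
import numpy as np

# Meshless least-squares derivative: estimate df/dx at a point x0 from
# scattered neighbours, with no mesh connectivity required.

rng = np.random.default_rng(1)
x0 = 0.5
neighbours = x0 + rng.uniform(-0.1, 0.1, size=8)  # scattered point cloud

f = np.sin                                        # test field
dx = neighbours - x0
df = f(neighbours) - f(x0)

# One-dimensional least-squares slope: minimise sum((df_i - s*dx_i)^2) over s.
slope = (dx @ df) / (dx @ dx)
print(slope, np.cos(x0))                          # ~ exact derivative cos(x0)
```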
Abstract:
The questions that one should answer in engineering computations - deterministic, probabilistic/randomized, as well as heuristic - are (i) how good the computed results/outputs are and (ii) what the cost is, in terms of the amount of computation and the amount of storage used in obtaining the outputs. The absolutely error-free quantities, as well as the completely errorless computations occurring in a natural process, can never be captured by any means at our disposal. While the computations, including the input real quantities, in nature/natural processes are exact, all the computations that we do using a digital computer, or that are carried out in an embedded form, are never exact. The input data for such computations are also never exact, because any measuring instrument has an inherent error of a fixed order associated with it, and this error, as a matter of hypothesis and not as a matter of assumption, is not less than 0.005 per cent. Here by error we mean relative error-bounds. The fact that the exact error is never known under any circumstances and in any context implies that the term error is nothing but error-bounds. Further, in engineering computations, it is the relative error or, equivalently, the relative error-bounds (and not the absolute error) that is supremely important in providing information about the quality of the results/outputs. Another important fact is that inconsistency and/or near-inconsistency in nature, i.e., in problems created from nature, is completely nonexistent, while in our modelling of natural problems we may introduce inconsistency or near-inconsistency due to human error, due to the inherent non-removable error associated with any measuring device, or due to assumptions introduced to make the problem solvable, or more easily solvable, in practice. Thus, if we discover any inconsistency, or possibly any near-inconsistency, in a mathematical model, it is certainly due to any or all of the three foregoing factors. We do, however, go ahead and solve such inconsistent/near-inconsistent problems, and do get results that could be useful in real-world situations. The talk considers several deterministic, probabilistic, and heuristic algorithms in numerical optimisation, in other numerical and statistical computations, and in PAC (probably approximately correct) learning models. It highlights the quality of the results/outputs by specifying relative error-bounds along with the associated confidence level, and the cost, viz., the amount of computation and of storage, through complexity. It points out the limitations of error-free computation (wherever possible, i.e., where the number of arithmetic operations is finite and known a priori) as well as of the use of interval arithmetic. Further, the interdependence among the error, the confidence, and the cost is discussed.
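A small worked illustration of propagating relative error-bounds through elementary arithmetic (the input values are invented; the 0.005 per cent floor is the hypothesis stated in the abstract):

```python
# Relative error-bounds in a simple computation: if each measured input
# carries a relative error-bound of at least 0.005% (the hypothesised
# instrument floor), the bound propagates through arithmetic as follows.

r = 0.005 / 100                  # 0.005 per cent relative error-bound

a, b = 12.34, 56.78              # measured inputs (illustrative values)

# Product: relative bounds add (to first order).
rel_bound_prod = r + r
print(f"a*b = {a * b:.4f} +/- {rel_bound_prod * 100:.4f}% (relative)")

# Sum: absolute bounds add, so the relative bound of the sum is
# (|a| + |b|) * r / |a + b|, which can blow up under cancellation.
rel_bound_sum = (abs(a) + abs(b)) * r / abs(a + b)
print(f"a+b = {a + b:.4f} +/- {rel_bound_sum * 100:.4f}% (relative)")
```

The sum case shows why relative, rather than absolute, error-bounds carry the relevant quality information: under cancellation (a close to -b) the relative bound of the result grows without limit even though both inputs are accurate.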
Abstract:
The tendency of bacterial cells to adhere to and colonize a material surface, leading to biofilm formation, is a fundamental challenge underlying many different applications, including microbial infections associated with biomedical devices and products. Although bacterial attachment to surfaces has been extensively studied in the past, the effect of surface topography on bacteria-material interactions has received little attention until recently. We review recent progress in surface-topography-based approaches for engineering antibacterial surfaces. Biomimicry of antibacterial surfaces found in nature is a popular strategy. Whereas earlier endeavours in the field aimed at minimizing cell attachment, more recent efforts have focused on developing bactericidal surfaces. However, not all such topography-mediated bactericidal surfaces are necessarily cytocompatible, underscoring the need for continued research toward antibacterial yet cytocompatible surfaces for use in implantable biomedical applications. This mini-review provides a brief overview of the current strategies and challenges in the emerging field of topography-mediated antibacterial surfaces.