2 results for open-shell
at Universidad Politécnica de Madrid
Abstract:
For the past 20 years, dynamic analysis of shells has been one of the most fascinating fields of research. Using the new light materials, building engineers soon discovered that the resulting reduction of gravity forces produced not only the desired freedom of shape but also the emergence of environmental loads as the primary design factor; loads that show strong random properties and a marked dynamic influence. On the other hand, technological advances in the aeronautical and astronautical fields confronted engineers with shell structures of nonconventional shape that must sustain substantially dynamic loads. The response to the increasingly challenging problems of the last two decades has been very bright; new forms, new materials and new methods of analysis have arisen in the design of off-shore platforms, nuclear vessels, spacecraft, etc. Thanks to the intensity of these years, we have at our disposal a coherent and homogeneous body of knowledge that enables us to face problems of a complexity inconceivable when IASS was founded. The open-minded approach to classical problems and the impact of the computer are probably important factors in the renaissance we have enjoyed during these years, and good proof of this are the papers presented at previous IASS meetings as well as those we are going to consider in this one.

Particularly striking is the great number of papers based on mathematical modelling compared with the scarcity of those treating laboratory experiments on physical models. The universal entry of the computer into almost every phase of our lives, and the cost of physical models, are perhaps the reasons for this lack of experimental methods. Nevertheless, such methods continue to offer useful results, such as those obtained with the shaking table, in which the computer plays an essential role both in the application of loads and in the instantaneous treatment of control data. Plates 1 and 2 list the papers presented under the dynamics heading; 40% of them are from Japan, in good correlation with the relevance that Japanese research has traditionally shown in this area. It is also interesting to find old friends such as Professors Tanaka, Nishimura and Kostem, who presented valuable papers at previous IASS conferences. As we see, there are papers representative of all tendencies, even purely analytical ones! Rather than discuss them in detail, which can be done after the authors' presentations, I think we can comment on the general pattern of the dynamic approaches summarized in Plate 3.
Abstract:
Background: Gray scale images make up the bulk of the data in biomedical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases, and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers with little or no background in software development because they take care of otherwise complex tasks. Specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time intensive and also not appropriate for a researcher with little or no knowledge of software development. Another alternative is to use command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation by means of shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task specific and do not offer a clear path when one wants to turn a prototype shell script into a new command line tool.

Results: The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype by using the corresponding shell scripting language. Since the hard disk serves as the temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command line tools makes it easy to extend MIA, usually without the requirement to touch or recompile existing code.

Conclusion: In this article, we describe the general design of MIA, a general-purpose framework for gray scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms by using shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.
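To illustrate the shell-script prototyping workflow the abstract describes, a minimal sketch is given below. The tool names, options, and string-based plug-in descriptions (e.g. "gauss:w=2") are illustrative placeholders modelled on the single-task, string-configured design outlined above; they are not taken from the MIA documentation, and the actual command set may differ.

    #!/bin/sh
    # Hypothetical prototyping script: each processing step is a small,
    # single-task command line tool, and every intermediate result is
    # written to disk, so memory use stays modest even for large data sets.
    # Tool names and filter descriptions below are placeholders, not the
    # verified MIA command set.
    set -e
    for slice in input_*.png; do
        # smooth each slice with a string-described filter plug-in ...
        mia-2dimagefilter -i "$slice" -o "smooth_$slice" gauss:w=2
        # ... then binarize it with a second filter description
        mia-2dimagefilter -i "smooth_$slice" -o "mask_$slice" binarize:min=120
    done

Because each step's configuration is carried entirely in the string descriptions, the same pipeline can later be re-expressed in a C++ program that passes identical description strings to the corresponding plug-ins, which is the kind of transition the Results section refers to.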