892 results for Towards Seamless Integration of Geoscience Models and Data
Abstract:
This paper describes a study of the theoretical and experimental behaviour of box-columns of varying b/t ratios under loadings of axial compression and torsion and their combinations. Details of the testing rigs and the testing methods, the results obtained such as the load-deflection curves and the interaction diagrams, and experimental observations regarding the behaviour of box-models and the types of local plastic mechanisms associated with each type of loading are presented. A simplified rigid-plastic analysis is carried out to study the collapse behaviour of box-columns under these loadings, based on the observed plastic mechanisms, and the results are compared with those of experiments.
Abstract:
In this chapter we will make the transition towards the design of business models and the related critical issues. We develop a model that helps us understand the causalities that play a role in understanding the viability and feasibility of the business models, i.e. long-term profitability and market adoption. We argue that designing viable business models requires balancing the requirements and interests of the actors involved, within and between the various business model domains. Requirements in the service domain guide the design choices in the technology domain, which in turn affect network formation and the financial arrangements. It is important to understand the Critical Design Issues (CDIs) involved in business models and their interdependencies. In this chapter, we present the Critical Design Issues involved in designing mobile service business models, and demonstrate how they are linked to the Critical Success Factors (CSFs) with regard to business model viability. This results in a causal model for understanding business model viability, as well as providing grounding for the business model design approach outlined in Chapter 5.
Abstract:
The object of analysis in the present text is the issue of operational control and data retention in Poland. The analysis follows from the critical stance taken by NGOs and state institutions on the scope of operational control wielded by the Polish police and special services; it concerns, in particular, the employment of "itemized phone bills" and so-called "phone tapping". Besides a quantitative analysis of operational control and the scope of data retention, the text features the conclusions the Human Rights Defender referred to the Constitutional Tribunal in 2011. It must be noted that the main problems with the employment of operational control and data retention are caused by: (1) a lack of specification of the technical means which can be used by individual services; (2) a lack of specification of what kind of information and evidence is in question; and (3) an open catalogue of information and evidence which can be clandestinely acquired in an operational mode. Furthermore, with regard to the access to teleinformation data granted by the Telecommunications Act, attention should be drawn to the wide array of data submitted to particular services. The text also draws on so-called open interviews, conducted mainly with former police officers, with a view to pointing out some non-formal reasons for "phone tapping" in Poland; these findings are presented in a concluding summary.
Abstract:
The Indian author Rabindranath Tagore was received like royalty during his visits to the West after winning the Nobel Prize in 1913. Dreams of foreign cultures offered a retreat from a complicated age. At a time when the West appeared to be living under the threat of disintegration and industrialism seemed like a cul-de-sac, he appeared to offer the promise of a return to a lost paradise, a spiritual abode superior to the restless Western culture. However, Tagore's popularity faded rapidly, most notably in England, the main target of his criticism. Soon after Tagore had won the Nobel Prize, the English became indignant at his anti-colonial attitude. Tagore visited Sweden in 1921 and 1926 and was given a warm reception. His visits to Sweden can be seen as an episode in a longer chain of events: they brought to life old conceptions of India as the abode of spirituality on earth. Nevertheless, interest in him was a relatively short-lived phenomenon in Sweden, and only a few of his Swedish admirers appreciated the complexity of Tagore's achievements. His "anathema of mammonism", as a Swedish newspaper called it, was not properly received. After a steady stream of translations, his popularity flagged towards the end of the 1920s and then almost disappeared entirely. Tagore's visits to Sweden gave an indication that India was on the way to liberating itself from its colonial legacy, which consequently contributed to the waning of his popularity in the West. In the long run, his criticism of the drawbacks of the Western world became too obvious to maintain permanent interest. The Russian author Fyodor Dostoyevskiy's Crime and Punishment (1866) has invited numerous interpretations, such as the purely biographical approach: in the nervous main character of the novel, the young student Raskolnikov, one easily recognizes Dostoyevskiy himself. The novel can also be seen as a masterpiece of realistic fiction.
It gives a broad picture of Saint Petersburg, a metropolis in decay. Crime and Punishment can also be seen as one of the first examples of a modern psychological novel, since it is focused on the inner drama of its main character, Raskolnikov, whose actions seem to be governed by mere coincidences, dreams and the spur of the moment. It therefore seems fruitful to study the novel from a psychoanalytical approach. In his book Raskolnikov: the way of the divided towards unity in Crime and Punishment (1982), the Swedish scholar Owe Wikström has followed this line of interpretation all the way to Freud's disciple C. G. Jung. In addition, the novel functions as an exciting crime story. To a large extent it is Viktor Sjklovskij and other Russian formalists, from the 1920s onwards, who have taught the Western audience to understand the specific nature of the crime story. The novel can also be seen as a story about religious conversion: like Lazarus in the Bible (whose story attracts a great deal of attention in the novel), Raskolnikov is awakened from the dead, and together with Sonja he starts a completely new life. The theme of conversion has a special meaning for Dostoyevskiy; for him, conversion meant an acknowledgement of the specific nature of Russia itself. Crime and Punishment mirrors the conflict between traditional Russian values and Western influences that has been evident throughout the country's history; the novel reflects a dialogue that still continues in Russian society. The Russian literary historian Mikhail Bakhtin, probably the best-known interpreter of the works of Dostoyevskiy, became famous precisely by emphasizing the importance of dialogue in novels like Crime and Punishment. According to Bakhtin, this novel is characterized by its multitude of voices: various ideas are confronted with each other, and each is personified by one of the characters in the novel.
The author has resigned from his position as the superior monitor of the text, leaving it to the reader to decide which interpretation is the correct one. The aim of the present study is thus to analyze the complex reactions in the West to Tagore's visits to Sweden and to Fyodor Dostoyevskiy's novel Crime and Punishment, leading to more general conclusions on communication between cultures.
Abstract:
The miniaturization race in the hardware industry, aimed at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes very critical. Two primary aspects introduce a sophisticated trade-off. On the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth. On the other hand, it should hide the heterogeneous hardware structure from the end user, in order to support feasible high-level programmability of the system. This thesis explores heterogeneous reconfigurable hardware architectures and presents possible solutions to the problem of memory organization and data structure. Using the MORPHEUS heterogeneous platform as an example, the discussion follows the complete design cycle, from decision making and justification to hardware realization. Particular emphasis is placed on methods to support high system performance, meet application requirements, and provide a user-friendly programming interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which accomplishes its task by separating computation from communication, supplying the reconfigurable engines with computation and configuration data, and unifying the heterogeneous computational devices by means of local storage buffers.
It is distinguished from related solutions by its distributed data-flow organization, mechanisms specifically engineered to operate on data in local domains, a communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel technique to accelerate memory access was developed and implemented.
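The idea of separating computation from communication through local storage buffers can be illustrated with a generic ping-pong (double-buffering) scheme. This is a minimal sketch in plain Python, not the MORPHEUS design: the class and function names are invented, and on real hardware the next load would run concurrently with the current computation rather than sequentially as here.

```python
# Illustrative sketch (not the MORPHEUS implementation): a ping-pong
# ("double") buffer lets a compute engine work on one local buffer while
# the communication infrastructure refills the other, hiding transfer
# latency behind computation.

class DoubleBuffer:
    def __init__(self):
        self.buffers = [[], []]   # two local storage buffers
        self.active = 0           # index the compute engine reads from

    def load(self, data):
        """Communication side: fill the inactive buffer."""
        self.buffers[1 - self.active] = list(data)

    def swap(self):
        """Hand the freshly loaded buffer to the compute engine."""
        self.active = 1 - self.active

    def compute(self, fn):
        """Compute side: operate on the active buffer only."""
        return [fn(x) for x in self.buffers[self.active]]

def stream_process(chunks, fn):
    """Process a stream of chunks, overlapping the load of chunk i+1
    with (conceptual) computation on chunk i."""
    db = DoubleBuffer()
    results = []
    it = iter(chunks)
    first = next(it, None)
    if first is None:
        return results
    db.load(first)
    db.swap()
    for nxt in it:
        db.load(nxt)              # would run concurrently on real hardware
        results.extend(db.compute(fn))
        db.swap()
    results.extend(db.compute(fn))
    return results
```

On real hardware the `load` and `compute` calls map to a DMA transfer and a reconfigurable-engine kernel respectively; the sketch only shows the scheduling discipline that prevents communication stalls.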
Abstract:
In recent years an ever-increasing degree of automation has been observed in most industrial processes. This trend is driven by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low design, realization and maintenance costs. The growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy boxed products such as food or cigarettes. A further indication of their complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries, and notably packaging machine industries, are present in Italy; a great concentration of this kind of industry is located in the Bologna area, which is therefore called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. This is often the case in large-scale systems organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in defining the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activities inherent to automating the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies to different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, in support of machine maintenance operations. The facilities that designers can find directly on the market, in terms of software component libraries, provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed empirically, without a clear distinction between functional and technological-implementation concepts and without a systematic method for dealing organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured", way. No clear distinction is made between functions and implementations, or between functional architectures and technological architectures and platforms. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly tied to the adopted implementation technology (relays in the past, software nowadays), again leading to deep confusion between the functional and the technological views. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour but should also handle other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, along with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that in complex systems such as AMSs, as typically occurs in reliable mechatronic systems, alongside reliable mechanical elements an increasing number of electronic devices are present, which are more vulnerable by their very nature. The diagnosis and fault isolation problem in a generic dynamical system consists in designing an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics of industrial automated systems in Chapter 1. Chapter 2 surveys the state of the software engineering paradigms applied to industrial automation. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, to achieve better reusability and modularity of the control logic. Chapter 5 presents a new approach, based on Discrete Event Systems, to the problem of formal software verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas for new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
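As a generic illustration of the Discrete Event Systems machinery invoked above for fault diagnosis, the sketch below implements a textbook-style diagnoser for a tiny finite automaton with an unobservable fault event. The plant model, event names and three-valued verdict are invented for illustration; this is not the thesis' architecture.

```python
# Hypothetical sketch of discrete-event fault diagnosis: the plant is a
# finite automaton whose fault event 'f' is unobservable; the diagnoser
# tracks the set of (state, fault-label) pairs consistent with the
# observed event sequence.

# transitions: state -> {event: next_state}
PLANT = {
    0: {'start': 1},
    1: {'f': 2, 'done': 0},   # 'f' models an unobservable fault
    2: {'done': 3},
    3: {'retry': 3},          # behaviour only reachable after the fault
}
UNOBSERVABLE = {'f'}

def unobservable_reach(estimates):
    """Close a set of (state, faulty) pairs under unobservable events."""
    frontier = list(estimates)
    seen = set(estimates)
    while frontier:
        state, faulty = frontier.pop()
        for ev, nxt in PLANT.get(state, {}).items():
            if ev in UNOBSERVABLE:
                pair = (nxt, faulty or ev == 'f')
                if pair not in seen:
                    seen.add(pair)
                    frontier.append(pair)
    return seen

def diagnose(observed, initial=0):
    """Return 'F' if the fault surely occurred, 'N' if it surely did not,
    and 'U' (uncertain) otherwise."""
    est = unobservable_reach({(initial, False)})
    for ev in observed:
        est = unobservable_reach({
            (PLANT[s][ev], faulty)
            for s, faulty in est if ev in PLANT.get(s, {})
        })
    labels = {faulty for _, faulty in est}
    return 'F' if labels == {True} else 'N' if labels == {False} else 'U'
```

After observing `start` the diagnoser is uncertain (the fault may or may not have fired), but observing `retry`, which is only reachable through the faulty branch, resolves the estimate to 'F'. A diagnoser built this way is exactly the kind of elaboration unit the abstract describes: it processes observable inputs/outputs to detect incipient faults.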
Abstract:
Cell-based therapies and tissue engineering initiatives are gathering clinical momentum for next-generation treatment of tissue deficiencies. By using gravity-enforced self-assembly of monodispersed primary cells, we have produced adult and neonatal rat cardiomyocyte-based myocardial microtissues that could optionally be vascularized following coating with human umbilical vein endothelial cells (HUVECs). Within myocardial microtissues, individual cardiomyocytes showed native-like cell shape and structure, and established electrochemical coupling via intercalated disks. This resulted in the coordinated beating of microtissues, which was recorded by means of a multi-electrode complementary metal-oxide-semiconductor microchip. Myocardial microtissues (µm³ scale), coated with HUVECs and cast in a custom-shaped agarose mold, assembled to coherent macrotissues (mm³ scale), characterized by an extensive capillary network with typical vessel ultrastructures. Following implantation into chicken embryos, myocardial microtissues recruited the embryo's capillaries to functionally vascularize the rat-derived tissue implant. Similarly, transplantation of rat myocardial microtissues into the pericardium of adult rats resulted in time-dependent integration of myocardial microtissues and co-alignment of implanted and host cardiomyocytes within 7 days. Myocardial microtissues and custom-shaped macrotissues produced by cellular self-assembly exemplify the potential of artificial tissue implants for regenerative medicine.
Abstract:
One of the broad objectives of the Nigerian health service, vigorously pursued at all levels of government, is to make comprehensive health care available and accessible to the population at the lowest possible cost, within available resources. Some state governments in the federation have already introduced free medical service as a practical way to remove financial barriers to access and, in turn, to encourage greater utilization of publicly funded care facilities. To aid health planners and decision makers in identifying a shorter corridor through which urban dwellers can gain access to comprehensive health care, a health interview survey of metropolitan Lagos was undertaken. The primary purpose was to ascertain the magnitude of the access problems which urban households face in seeking care from existing public facilities at the time of need. Six categories of illness chosen from the 1975 edition of the International Classification of Diseases were used as indicators of health need. Choice of treatment facilities in response to illness episodes was examined in relation to distance, travel time, time of use and transportation experiences, and graphically described. The overall picture indicated that distance and travel time coexist with transportation problems in preventing a significant segment of those in need of health care from benefiting from the free medical service offered in public health facilities. Within this milieu, traditional medicine and its practitioners became the most preferred alternative. Recommendations were offered for action with regard to the decentralization of general practitioner (GP) consultations in general hospitals and the integration of traditional medicine and its practitioners into the public health service.
Abstract:
Information generated by abstract interpreters has long been used to perform program specialization. Additionally, if the abstract interpreter generates a multivariant analysis, it is also possible to perform multiple specialization. Information about values of variables is propagated by simulating program execution and performing fixpoint computations for recursive calls. In contrast, traditional partial evaluators (mainly) use unfolding for both propagating values of variables and transforming the program. It is known that abstract interpretation is a better technique for propagating success values than unfolding. However, the program transformations induced by unfolding may lead to important optimizations which are not directly achievable in the existing frameworks for multiple specialization based on abstract interpretation. The aim of this work is to devise a specialization framework which integrates the better information propagation of abstract interpretation with the powerful program transformations performed by partial evaluation, and which can be implemented via small modifications to existing generic abstract interpreters. With this aim, we will relate top-down abstract interpretation with traditional concepts in partial evaluation and sketch how the sophisticated techniques developed for controlling partial evaluation can be adapted to the proposed specialization framework. We conclude that there can be both practical and conceptual advantages in the proposed integration of partial evaluation and abstract interpretation.
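The contrast the abstract draws between value propagation and call unfolding can be made concrete with the classic power-function example. The sketch below is a generic illustration in plain Python, not the paper's framework: it unfolds a recursive pow(x, n) for a statically known exponent, fully evaluating the static part and residualizing an expression in the dynamic variable x.

```python
# Minimal partial-evaluation sketch: specializing pow(x, n) with respect
# to a known (static) exponent n. Unfolding the recursion propagates the
# static value of n and leaves a residual program over the dynamic x.

def specialize_pow(n):
    """Partially evaluate pow(x, n) for known n, returning the residual
    program as a source string in the dynamic variable x."""
    def unfold(k):
        if k == 0:
            return "1"                     # base case is fully static
        return f"(x * {unfold(k - 1)})"    # one unfolding step per call
    return f"lambda x: {unfold(n)}"

residual = specialize_pow(3)   # residual program, all recursion removed
pow3 = eval(residual)          # compile the residual into a function
```

The residual program contains no recursion and no test on n; a multiple-specialization framework driven by a multivariant abstract interpretation would additionally keep several such specialized versions, one per abstract call pattern.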
Abstract:
Fundamental research and modelling in plasma atomic physics continue to be essential for providing basic understanding of many different topics relevant to high-energy-density plasmas. The Atomic Physics Group at the Institute of Nuclear Fusion has accumulated experience over the years in developing a collection of computational models and tools for determining the atomic energy structure, ionization balance and radiative properties of, mainly, inertial fusion and laser-produced plasmas in a variety of conditions. In this work, we discuss some of the latest advances and results of our research, with emphasis on inertial fusion and laboratory-astrophysical applications.
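As a back-of-the-envelope illustration of what an ionization-balance calculation involves, the sketch below evaluates the textbook Saha equation under the assumption of local thermodynamic equilibrium (LTE). This is not the group's computational models, which for inertial-fusion and laser-produced plasmas must often go beyond LTE (e.g. collisional-radiative modelling); the function name and default statistical-weight ratio are illustrative choices.

```python
# Textbook Saha equation (LTE assumption only):
#   n_{i+1} n_e / n_i = (2 g_{i+1} / g_i) (2 pi m_e k T / h^2)^{3/2} exp(-chi / kT)
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
M_E = 9.1093837015e-31   # electron mass, kg
H   = 6.62607015e-34     # Planck constant, J*s
EV  = 1.602176634e-19    # joules per eV

def saha_ratio(T_eV, n_e, chi_eV, g_ratio=1.0):
    """Ratio n_{i+1}/n_i of adjacent ionization stages for electron
    density n_e (m^-3), temperature T_eV (eV), ionization energy
    chi_eV (eV), and g_ratio = 2*g_{i+1}/g_i."""
    T = T_eV * EV / K_B                                # temperature in K
    phase_space = (2 * math.pi * M_E * K_B * T / H**2) ** 1.5
    return (g_ratio / n_e) * phase_space * math.exp(-chi_eV / T_eV)
```

As expected physically, the ratio grows with temperature and with decreasing ionization energy, which is the qualitative behaviour any ionization-balance model must reproduce.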
Abstract:
The purpose of this project is the development of an application based on Web technologies using the Spring Framework, an open-source application framework for the Java platform. A theoretical study of the characteristics of Spring is performed first, followed by the implementation of an application using that technology as a practical example. The first part consists of an analysis of the most significant features of Spring, collecting information on all the components of the framework needed to develop a generic application. The goal is to discover and analyze how Spring facilitates the implementation of a project with an MVC architecture and how it allows seamless integration of security, internationalization and other concerns. The second part, the development of the web application, serves as a practical demonstration of how to use the knowledge gathered about Spring. The application manages a cookbook generated by a community of users. It keeps a registry of users, who must authenticate in order to view their personal data and modify it if they wish. Depending on their type, users have access to different areas of the application and to a different range of available actions. The main actions are viewing recipes, creating recipes, modifying or deleting their own recipes, and modifying or deleting other users' recipes. Each recipe consists of a name, a description, a photograph of the result, estimated times, estimated difficulty, a list of ingredients with their quantities, and a series of steps, with demonstrative photographs if desired. Administrators, a specific type of user, can access a list of users to monitor them, modify them, or grant and revoke privileges.
Abstract:
There is growing interest in the use of context-awareness as a technique for developing pervasive computing applications that are flexible, adaptable, and capable of acting autonomously on behalf of users. However, context-awareness introduces a variety of software engineering challenges. In this paper, we address these challenges by proposing a set of conceptual models designed to support the software engineering process, including context modelling techniques, a preference model for representing context-dependent requirements, and two programming models. We also present a software infrastructure and software engineering process that can be used in conjunction with our models. Finally, we discuss a case study that demonstrates the strengths of our models and software engineering approach with respect to a set of software quality metrics.
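A preference model for representing context-dependent requirements, of the general kind the abstract mentions, can be sketched as a scoring rule: each preference contributes to a candidate action only when its context condition holds. All names and the scoring scheme below are hypothetical illustrations, not the paper's actual models.

```python
# Hypothetical sketch of a context-dependent preference model: each
# preference is (condition, action, score); a preference contributes to
# an action's score only if its condition holds in the current context.

def choose(context, candidates, preferences):
    """Pick the candidate action with the highest accumulated score."""
    scores = {c: 0.0 for c in candidates}
    for condition, action, score in preferences:
        if action in scores and condition(context):
            scores[action] += score
    return max(candidates, key=lambda c: scores[c])

# Example preferences for a phone's ringer (invented for illustration).
prefs = [
    (lambda ctx: ctx["location"] == "meeting", "silent", 1.0),
    (lambda ctx: ctx["location"] == "home", "ring", 0.8),
    (lambda ctx: ctx["battery"] < 0.1, "silent", 0.5),
]
```

The point of such a model is that requirements like "silence the phone in meetings" are stated declaratively over context, so the application adapts autonomously as the sensed context changes.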
Abstract:
Increasingly, software systems are required to survive variations in their execution environment with little or no human intervention. Such systems are called "eternal software systems". In contrast to the traditional view of development and execution as separate cycles, these modern software systems should not exhibit such a separation. Research in MDE has been primarily concerned with the use of models during the first cycle, development (i.e. during design, implementation, and deployment), and has shown excellent results. In this paper the author argues that an eternal software system must have a first-class representation of itself available to enable change. These runtime representations (or runtime models) will depend on the kind of dynamic changes that we want to make available during execution, or on the kind of analysis we want the system to support. Hence, different models can be conceived. Self-representation inevitably implies the use of reflection. The author briefly summarizes research that supports the use of runtime models, and points out different issues and research questions. © 2009 IEEE.