17 results for Coining
Abstract:
A computer code is developed, as part of an ongoing project on computer-aided process modelling of forging operations, to simulate heat transfer in a die-billet system. The code, developed using a stage-by-stage technique, is based on an Alternating Direction Implicit (ADI) scheme. The experimentally validated code is used to study the effect of process specifics such as die preheat temperature, machine ascent time, rate of deformation, and dwell time on the thermal characteristics of a batch coining operation in which deformation is restricted to the surface layers only.
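For readers unfamiliar with the scheme named above, an Alternating Direction Implicit method advances the two-dimensional transient conduction equation $\partial T/\partial t = \alpha\,(\partial^2 T/\partial x^2 + \partial^2 T/\partial y^2)$ in two half-steps per time step, each implicit in one coordinate direction only. The standard Peaceman-Rachford form on a uniform grid is shown below for context; it is not claimed to be the authors' exact discretization of the die-billet system.

    $$\frac{T^{\,n+1/2}_{i,j} - T^{\,n}_{i,j}}{\Delta t/2} = \alpha\left(\frac{T^{\,n+1/2}_{i-1,j} - 2T^{\,n+1/2}_{i,j} + T^{\,n+1/2}_{i+1,j}}{\Delta x^{2}} + \frac{T^{\,n}_{i,j-1} - 2T^{\,n}_{i,j} + T^{\,n}_{i,j+1}}{\Delta y^{2}}\right)$$

    $$\frac{T^{\,n+1}_{i,j} - T^{\,n+1/2}_{i,j}}{\Delta t/2} = \alpha\left(\frac{T^{\,n+1/2}_{i-1,j} - 2T^{\,n+1/2}_{i,j} + T^{\,n+1/2}_{i+1,j}}{\Delta x^{2}} + \frac{T^{\,n+1}_{i,j-1} - 2T^{\,n+1}_{i,j} + T^{\,n+1}_{i,j+1}}{\Delta y^{2}}\right)$$

Each half-step reduces to a set of tridiagonal systems solvable line by line with the Thomas algorithm, which is what makes the stage-by-stage thermal calculation inexpensive.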
Abstract:
A numerical simulation technique has been employed to study the thermal behavior of hot-forging-type forming processes. Experiments on the coining and upsetting of an aluminum billet were conducted to validate the numerical predictions. Typical forming conditions for both the coining and upsetting processes were then studied in detail. An electrical analogy scheme was used to determine the thermal contact resistance; this scheme can conveniently provide the interface characteristics for typical processing conditions, which normally involve high pressures and temperatures. Since forging in industry is essentially a batch operation, a single forging cycle was first considered and then a batch of twenty-five forgings was studied. Each forging cycle includes the billet mounting, ascent, loading, dwelling, unloading, descent, and billet removal stages. The temperature distribution in the first forging to be formed is found to be significantly different from that at the end of the batch. The influence of forming speed and reduction on the thermal characteristics was also investigated, and the variations that differences in temperature characteristics can introduce into the process design are also discussed.
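As background on the electrical analogy mentioned above: heat flow across the die-billet interface is treated like current flow across a contact resistance, with the temperature drop in the role of voltage and the heat flux in the role of current. The relation below is the generic form of this analogy, with illustrative symbols not taken from the paper.

    $$q'' = h_c\,\bigl(T_{\text{billet}} - T_{\text{die}}\bigr) = \frac{T_{\text{billet}} - T_{\text{die}}}{R_c}, \qquad \Delta T \leftrightarrow \Delta V, \quad q \leftrightarrow I, \quad R_{\text{thermal}} \leftrightarrow R_{\text{electrical}}$$

A contact resistance measured electrically under the same pressure and surface conditions can thus be used to infer the interface heat-transfer coefficient $h_c$ for the forging conditions.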
Abstract:
“What did you think you were doing?” was the question posed to me by the conference organizers, as the inventor and constructor of the first working Tangible Interfaces over 40 years ago. I think the question was intended to encourage me to talk about the underlying ideas and intentionality rather than describe an endless sequence of electronic bricks, and that is what I shall do in this presentation. In the sixties the prevalent idea for a graphics interface was an analogue of sketching, which was somehow to be understood by the computer as three-dimensional form. I rebelled against this notion for reasons which I will explain in the presentation, and instead came up with tangible, physical, three-dimensional intelligent objects. I called these first prototypes “Intelligent Physical Modelling Systems”, which is a really dumb name for an obvious concept. I am eternally grateful to Hiroshi Ishii for coining the term “Tangible User Interfaces” - the same idea but with a much smarter name. Another motivator was user involvement in the design process, and that led to the Generator project (1979) with Cedric Price for the world’s first intelligent building, capable of organizing itself in response to the appetites of its users. The working model of that project is in MoMA. The same motivation led to a self-builders’ design kit (1980) for Walter Segal, which enabled self-builders to design their own houses. And indeed, as the organizers’ question implied, the motivation and intentionality of these projects developed over the years in step with advancing technology. The speaker will attempt to articulate these changes with medical, psychological and educational examples, much of this later work stemming from the Media Lab where we are talking. Related topics such as “tangible thinking” and “intelligent teacups” will be introduced, and the presentation will end with some speculations about the future. The presentation will be given against a background of images of early prototypes, many of which have never previously been published.
Abstract:
"The 1996 edition of ‘Harvard Educational Review’ hosted the now seminal article from the New London Group, ‘The Pedagogy of Multiliteracies’. Coining the term ‘multiliteracies’ to describe the advent of new technologies as well as the rapidly changing social and cultural literacies of the emerging new world order, the New London Group proffered an ambitiously new educational agenda constituted by four non-hierarchical and non-linear components of pedagogy: situated practice, overt instruction, critical framing and transformed practice..."
Abstract:
The work reported herein is part of an ongoing programme to develop a computer code which, given the geometrical, process and material parameters of the forging operation, is able to predict the die and billet cooling/heating characteristics in forging production. The code was experimentally validated earlier for a single forging cycle and is now validated for small-batch production. To facilitate a step-by-step development of the code, the billet deformation has so far been limited to its surface layers, a situation akin to coining. The code has been used here to study the effects of die preheat temperature, machine speed and rate of deformation on the cooling/heating of the billet and the dies over a small batch of 150 forgings. The study shows: that there is a preheat temperature at which the billet temperature changes little from one forging to the next; that beyond a particular number of forgings, the machine speed ceases to have any pronounced influence on the temperature characteristics of the billet; and that increasing the rate of deformation reduces the heat loss from the billet and gives the billet a stable temperature profile with respect to the number of forgings. The code, which is simple to use, is being extended to bulk-deformation problems. Given a practical range of possible machine, billet and process specifics, the code should be able to arrive at the combination of these parameters which gives the best thermal characteristics of the die-billet system. The code is also envisaged as being useful in the design of isothermal dies and processes.
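To make the batch behaviour described above concrete, the sketch below steps a toy lumped-capacitance model of a billet and a die through repeated forging cycles and reports how the billet temperature at the end of each dwell evolves over the batch. All names and parameter values (heat capacities, conductances, stage times) are illustrative assumptions; this is not the validated finite-difference code described in the abstract, only a minimal picture of why the die warms and the billet heat loss stabilises over successive forgings.

    # Toy lumped-capacitance model of die/billet cooling over a batch of forgings.
    # All parameter values are illustrative assumptions, not data from the paper.

    def run_batch(n_forgings=150, t_billet0=450.0, t_die_preheat=200.0, t_ambient=25.0):
        c_billet, c_die = 400.0, 8000.0      # lumped heat capacities, J/K (assumed)
        h_contact, h_die_loss = 50.0, 5.0    # interface / ambient conductances, W/K (assumed)
        dt = 0.1                             # time step, s
        dwell_time, idle_time = 2.0, 10.0    # contact and open-die periods per cycle, s

        t_die = t_die_preheat
        billet_temps = []
        for _ in range(n_forgings):
            t_billet = t_billet0             # fresh preheated billet each cycle
            for _ in range(int(dwell_time / dt)):    # dwell: heat flows billet -> die
                q = h_contact * (t_billet - t_die)
                t_billet -= q * dt / c_billet
                t_die += q * dt / c_die
            billet_temps.append(t_billet)
            for _ in range(int(idle_time / dt)):     # idle: die loses heat to surroundings
                t_die -= h_die_loss * (t_die - t_ambient) * dt / c_die
        return billet_temps, t_die

    temps, die_final = run_batch()
    print(f"billet after dwell, forging 1: {temps[0]:.1f} C; forging 150: {temps[-1]:.1f} C; "
          f"die after batch: {die_final:.1f} C")

With these assumed values the die warms over the first forgings and the billet's post-dwell temperature rises towards a stable level, which is the qualitative batch trend the abstract reports.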
Abstract:
Monitoring a distribution network implies working with a huge amount of data coming from the different elements that interact in the network. This paper presents a visualization tool that simplifies the task of searching the database for information useful for fault management or preventive maintenance of the network.
Abstract:
Postgraduate studies in Letters (Pós-graduação em Letras) - FCLAS
Abstract:
The purpose of this study was to conduct a descriptive, exploratory analysis of the utilization of both traditional healing methods and western biomedical approaches to health care among members of the Vietnamese community in Houston, Texas. The first goal of the study was to identify the type(s) of health care that the Vietnamese use. The second goal was to highlight the numerous factors that may influence why certain health care choices are made. The third goal was to examine the issue of preference, to determine which practices would be used if limiting factors did not exist. There were 81 participants, consisting of males and females who were 18 years or older. The core group of participants comprised Vietnamese students from the University of Houston-Downtown and volunteer staff members from VN TeamWork. Additional participants were recruited by the snowball method, asking the students and staff members to recommend others for the study. Surveys and informed consents were in English and Vietnamese. The participants were given the choice of taking the surveys face-to-face or on their own. Surveys consisted of structured questions with predetermined choices, as well as open-ended questions to allow more detailed information. The quantitative and qualitative data were coded and entered into a database using SPSS software version 15.0. Results indicated that participants used both traditional (38.3%) and biomedical (59.3%) healing, with 44.4% stating that the treatment depended on the illness. Coining was the most used traditional healing method, clearly still used by all ages. Coining was also the method most used when issues regarding fear and delayed western medical treatment were involved. It was determined that insurance status, more than household income, guided health care choices. A person's age, number of years spent in the United States, age at migration, and the use of certain traditional healing methods like coining all played a role in the importance of the health care practitioner speaking Vietnamese. The most important finding was that 64.2% of the participants preferred both traditional and western medicine because both methods work.
Abstract:
The Internet is evolving towards what is known as the Live Web. In this new stage in the evolution of the Internet, a multitude of social data streams is put at the service of users. Thanks to these data sources, users have gone from browsing static web pages to interacting with applications that offer personalised content based on their preferences. Each user interacts daily with multiple applications that offer notifications and alerts; in this sense each user is a source of events, and users often feel overwhelmed and unable to process all of that information on demand. To cope with this overload, a multitude of tools has appeared that automate the most common tasks, from inbox managers and social network alert managers to complex CRMs or smart-home hubs. The drawback is that, although they offer a solution to common problems, they cannot adapt to the needs of each user by offering a personalised solution. Task Automation Services (TAS) entered the scene from 2012 onwards to address this limitation. Given their similarity, these services are also regarded as a new, user-centred approach to mash-up technology. Users of these platforms can interconnect services, sensors and other Internet-connected devices, designing the automations that fit their needs. The proposal has been widely accepted by users, and this has led a multitude of platforms offering TAS to enter the scene. As this is a new field of research, this thesis presents the main characteristics of TAS, describes their components, and identifies the fundamental dimensions that define them and allow their classification. This work coins the term Task Automation Service (TAS), giving a formal description of these services and their components (called channels), and provides a reference architecture. Likewise, there is a lack of tools for describing automation services and automation rules. In this regard, this thesis proposes a common model, realised as the EWE (Evented WEb) ontology. This model makes it possible to compare and relate channels and automations from different TASs, a considerable contribution to the portability of user automations between platforms. Moreover, given the semantic nature of the model, automations can include elements from external sources over which to reason, as is the case with Linked Open Data. Using this model, a dataset of channels and automations has been generated from data obtained from some of the TASs on the market. As a final step towards a common model for describing TAS, an algorithm has been developed to learn ontologies automatically from the data in the dataset. This favours the discovery of new channels and reduces the cost of maintaining the model, which is updated semi-automatically.
In conclusion, the main contributions of this thesis are: i) describing the state of the art in task automation and coining the term Task Automation Service; ii) developing an ontology for modelling the components of TASs and automations; iii) populating a dataset of channel and automation data, used to develop an automatic ontology-learning algorithm; and iv) designing an agent architecture to assist users in creating automations.
ABSTRACT
The new stage in the evolution of the Web (the Live Web or Evented Web) puts many social data streams at the service of users, who no longer browse static web pages but interact with applications that present them with contextual and relevant experiences. Given that each user is a potential source of events, a typical user often gets overwhelmed. To deal with that huge amount of data, multiple automation tools have emerged, ranging from simple social media managers or notification aggregators to complex CRMs or smart-home hubs/apps. As a downside, they cannot be tailored to the needs of every single user. As a natural response to this downside, Task Automation Services broke into the Internet. They may be seen as a new model of mash-up technology for combining social streams, services and connected devices from an end-user perspective: end-users are empowered to connect those streams however they want, designing the automations they need. The number of platforms that appeared early on shot up, and as a consequence the number of platforms following this approach is growing fast. Being a novel field, this thesis aims to shed light on it by presenting and exemplifying the main characteristics of Task Automation Services, describing their components, and identifying several dimensions along which to classify them. This thesis coins the term Task Automation Service (TAS) by providing a formal definition of these services and their components (called channels), as well as a TAS reference architecture. There is also a lack of tools for describing automation services and automation rules. In this regard, this thesis proposes a theoretical common model of TAS and formalizes it as the EWE ontology. This model makes it possible to compare channels and automations from different TASs, which has a high impact on interoperability, and enhances automations by providing a mechanism to reason over external sources such as Linked Open Data. Based on this model, a dataset of TAS components was built by harvesting data from the web sites of actual TASs. Going a step further towards this common model, an algorithm for categorizing these components was designed, enabling their discovery across different TASs. Thus, the main contributions of the thesis are: i) surveying the state of the art on task automation and coining the term Task Automation Service; ii) providing a semantic common model for describing TAS components and automations; iii) populating a categorized dataset of TAS components, used to learn ontologies of particular domains from the TAS perspective; and iv) designing an agent architecture for assisting users in setting up automations that is aware of their context and acts accordingly.
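To illustrate the kind of machine-readable description the EWE model is meant to enable, the sketch below uses Python and rdflib to describe a hypothetical channel and a trigger-action rule in RDF. The namespace URI and the term names (Channel, Event, Action, Rule, triggeredBy, firesAction) follow the abstract's description of channels and automations but are placeholders assumed for illustration, not a verified copy of the published EWE ontology.

    # Sketch: describing a TAS channel and an automation rule with an EWE-style vocabulary.
    # Namespace and term names below are illustrative assumptions, not the published ontology.
    from rdflib import Graph, Namespace, Literal, RDF, RDFS

    EWE = Namespace("http://example.org/ewe#")   # placeholder namespace
    EX = Namespace("http://example.org/tas#")

    g = Graph()
    g.bind("ewe", EWE)
    g.bind("ex", EX)

    # A channel groups the events it can emit and the actions it can perform.
    g.add((EX.WeatherChannel, RDF.type, EWE.Channel))
    g.add((EX.RainForecast, RDF.type, EWE.Event))
    g.add((EX.RainForecast, RDFS.label, Literal("Rain forecast for tomorrow")))
    g.add((EX.MailChannel, RDF.type, EWE.Channel))
    g.add((EX.SendEmail, RDF.type, EWE.Action))

    # A rule wires an event from one channel to an action of another ("if rain, email me").
    g.add((EX.RainAlertRule, RDF.type, EWE.Rule))
    g.add((EX.RainAlertRule, EWE.triggeredBy, EX.RainForecast))
    g.add((EX.RainAlertRule, EWE.firesAction, EX.SendEmail))

    print(g.serialize(format="turtle"))

Serialised as Turtle, such descriptions from different TASs share one vocabulary, which is what makes rule portability between platforms and reasoning over external data such as Linked Open Data possible.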
Abstract:
Apoptosis, also called programmed cell death, has attracted great attention in recent years. After its discovery by Carl Vogt in 1842, apoptosis research was dormant for more than a century. Its rediscovery in the second half of this century, and the coining of the term apoptosis in 1972 by Kerr, Wyllie, and Currie, ignited an unparalleled interest in this field of science. The number of publications related to apoptosis has been growing exponentially every year ever since. This is mainly due to three major advances, two of which have been made recently and one that is currently unfolding. First, studies with the small nematode Caenorhabditis elegans have identified a number of apoptosis-regulating genes: the first evidence that cell death is an active process under genetic control. Many of these genes have mammalian homologs that, like their worm counterparts, seem to regulate mammalian apoptosis. Second, elucidation of the signal transduction pathways of apoptosis has led in particular to the identification of specific death-signaling molecules, such as a new family of cysteine proteases, the caspases. Third, it has now become clear that many diseases are characterized by dysregulation of apoptotic programs. Many of these programs involve a family of receptors and their ligands, the death receptor/ligand family. The hope now is to interfere with apoptosis regulation in these systems and to develop new therapeutic concepts.
Abstract:
Includes indexes.
Abstract:
This paper focuses upon the argument that the role played by the engineering profession within today's society has changed markedly over the past several years, from providing the foundations for contemporary life to leading societal change and becoming one of the key drivers of future social development. Coining the term 'Engineering-Sociology', this paper contributes to engineering education and engineering education research by proposing a new paradigm upon which future engineering education programmes and engineering education research might build. Developed out of an approach to learning and teaching practice, Engineering-Sociology encapsulates both traditional and applied approaches to engineering education and engineering education research. It suggests that, in order to meet future challenges, there is a need to bring together what are generally perceived to be two diametrically opposed paradigms, namely engineering and sociology. Building on contemporary theoretical and pedagogical arguments in engineering education research, the paper concludes that, by encouraging engineering educators to 'think differently', Engineering-Sociology can provide an approach to learning and teaching that both enhances the student experience and meets the changing needs of society.
Abstract:
No funding agencies or grants indicated in the publication.