21 results for Coining


Relevance:

20.00%

Publisher:

Abstract:

A computer code has been developed, as part of an ongoing project on computer-aided process modelling of forging operations, to simulate heat transfer in a die-billet system. The code, developed with a stage-by-stage technique, is based on an Alternating Direction Implicit (ADI) scheme. The experimentally validated code is used to study the effect of process parameters such as die preheat temperature, machine ascent time, rate of deformation, and dwell time on the thermal characteristics of a batch coining operation in which deformation is restricted to the surface layers of the billet.
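As an illustration of the scheme named above, the following is a minimal sketch of one Alternating Direction Implicit (Peaceman-Rachford) time step for two-dimensional heat conduction. It is not the authors' code; the grid handling, material treatment and Dirichlet boundary conditions are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import solve_banded

def adi_step(T, alpha, dt, dx, dy):
    """One Peaceman-Rachford ADI step for 2-D heat conduction.

    T: temperature field whose edge values are fixed (Dirichlet) boundaries.
    alpha: thermal diffusivity; dt, dx, dy: time step and grid spacings.
    """
    ny, nx = T.shape
    rx = alpha * dt / (2 * dx**2)
    ry = alpha * dt / (2 * dy**2)
    T_half = T.copy()

    # Sweep 1: implicit in x, explicit in y (one tridiagonal solve per row).
    ab = np.zeros((3, nx - 2))
    ab[0, 1:] = -rx          # super-diagonal
    ab[1, :] = 1 + 2 * rx    # main diagonal
    ab[2, :-1] = -rx         # sub-diagonal
    for j in range(1, ny - 1):
        rhs = ry * T[j - 1, 1:-1] + (1 - 2 * ry) * T[j, 1:-1] + ry * T[j + 1, 1:-1]
        rhs[0] += rx * T[j, 0]       # fixed boundary contributions
        rhs[-1] += rx * T[j, -1]
        T_half[j, 1:-1] = solve_banded((1, 1), ab, rhs)

    # Sweep 2: implicit in y, explicit in x (one tridiagonal solve per column).
    T_new = T_half.copy()
    ab = np.zeros((3, ny - 2))
    ab[0, 1:] = -ry
    ab[1, :] = 1 + 2 * ry
    ab[2, :-1] = -ry
    for i in range(1, nx - 1):
        rhs = rx * T_half[1:-1, i - 1] + (1 - 2 * rx) * T_half[1:-1, i] + rx * T_half[1:-1, i + 1]
        rhs[0] += ry * T_half[0, i]
        rhs[-1] += ry * T_half[-1, i]
        T_new[1:-1, i] = solve_banded((1, 1), ab, rhs)

    return T_new
```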

Relevance:

20.00%

Publisher:

Abstract:

A numerical simulation technique has been employed to study the thermal behavior of hot-forging-type forming processes. Experiments on the coining and upsetting of an aluminum billet were conducted to validate the numerical predictions, and typical forming conditions for both processes were then studied in detail. An electrical analogy scheme was used to determine the thermal contact resistance; this scheme can conveniently provide the interface characteristics for typical processing conditions, which normally involve high pressures and temperatures. A single forging cycle was considered first and then, since forging in industry is essentially a batch operation, a batch of twenty-five forgings was studied. Each forging cycle includes the billet mounting, ascent, loading, dwelling, unloading, descent, and billet removal stages. The temperature distribution in the first forging of the batch is found to differ significantly from that at the end of the batch. The influence of forming speed and reduction on the thermal characteristics was also investigated, and the variations these temperature differences can introduce into process design are also discussed.
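To make the staged cycle concrete, here is a schematic lumped-parameter sketch of a batch of forging cycles. The stage list follows the abstract, but every number (durations, conductances, heat capacities) and the simple treatment of the billet-die interface as a single contact conductance (the inverse of a thermal contact resistance) are illustrative assumptions, not the authors' model.

```python
# Schematic lumped-parameter sketch of a batch forging thermal cycle.
# All numbers below are illustrative placeholders only.

STAGES = [                 # (name, duration in s, billet-die contact?)
    ("mounting",  2.0, False),
    ("ascent",    1.0, False),
    ("loading",   0.5, True),
    ("dwelling",  1.5, True),
    ("unloading", 0.5, True),
    ("descent",   1.0, False),
    ("removal",   2.0, False),
]

def run_batch(n_forgings, T_billet0=450.0, T_die=200.0, T_amb=25.0,
              h_contact=5000.0, h_air=30.0, area=0.01,
              C_billet=400.0, C_die=40000.0, dt=0.05):
    """Return (billet, die) temperatures at the end of each forging cycle."""
    history = []
    for _ in range(n_forgings):
        T_b = T_billet0                      # each cycle starts with a fresh hot billet
        for _, duration, contact in STAGES:
            for _ in range(int(duration / dt)):
                if contact:
                    q = h_contact * area * (T_b - T_die)   # heat flow billet -> die
                    T_die += q * dt / C_die                # die warms up over the batch
                else:
                    q = h_air * area * (T_b - T_amb)       # convective loss to surroundings
                T_b -= q * dt / C_billet
        history.append((T_b, T_die))
    return history
```

Running `run_batch(25)` reproduces, in caricature, the effect described in the abstract: the die heats up from one cycle to the next, so the billet in the first forging loses noticeably more heat than the billet at the end of the batch.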

Relevance:

20.00%

Publisher:

Relevance:

10.00%

Publisher:

Abstract:

“What did you think you were doing?” was the question posed to me by the conference organizers, as the inventor and constructor of the first working Tangible Interfaces over 40 years ago. I think the question was intended to encourage me to talk about the underlying ideas and intentionality rather than describe an endless sequence of electronic bricks, and that is what I shall do in this presentation. In the sixties the prevalent idea for a graphics interface was an analogue of sketching, which was somehow to be understood by the computer as three-dimensional form. I rebelled against this notion, for reasons I will explain in the presentation, and instead came up with tangible, physical, three-dimensional intelligent objects. I called these first prototypes “Intelligent Physical Modelling Systems”, which is a really dumb name for an obvious concept. I am eternally grateful to Hiroshi Ishii for coining the term “Tangible User Interfaces” - the same idea but with a much smarter name. Another motivator was user involvement in the design process, which led to the Generator project (1979) with Cedric Price for the world’s first intelligent building, capable of organizing itself in response to the appetites of its users; the working model of that project is in MoMA. The same motivation led to a self-builders’ design kit (1980) for Walter Segal which enabled self-builders to design their own houses. And indeed, as the organizers’ question implied, the motivation and intentionality of these projects developed over the years in step with advancing technology. The speaker will attempt to articulate these changes with medical, psychological and educational examples, much of this later work stemming from the Media Lab where we are talking. Related topics such as “tangible thinking” and “intelligent teacups” will be introduced, and the presentation will end with some speculations about the future. The presentation will be given against a background of images of early prototypes, many of which have never previously been published.

Relevance:

10.00%

Publisher:

Abstract:

"The 1996 edition of ‘Harvard Educational Review’ hosted the now seminal article from the New London Group, ‘The Pedagogy of Multiliteracies’. Coining the term ‘multiliteracies’ to describe the advent of new technologies as well as the rapidly changing social and cultural literacies of the emerging new world order, the New London Group proffered an ambitiously new educational agenda constituted by four non-hierarchical and non-linear components of pedagogy: situated practice, overt instruction, critical framing and transformed practice..."

Relevance:

10.00%

Publisher:

Abstract:

The work reported herein is part of an on-going programme to develop a computer code which, given the geometrical, process and material parameters of the forging operation, is able to predict the die and billet cooling/heating characteristics in forging production. The code was experimentally validated earlier for a single forging cycle and is now validated for small-batch production. To facilitate a step-by-step development of the code, the billet deformation has so far been limited to its surface layers, a situation akin to coining. The code has been used here to study the effects of die preheat temperature, machine speed and rate of deformation on the cooling/heating of the billet and the dies over a small batch of 150 forgings. The study shows: that there is a preheat temperature at which the billet temperature changes little from one forging to the next; that beyond a particular number of forgings, the machine speed ceases to have any pronounced influence on the temperature characteristics of the billet; and that increasing the rate of deformation reduces the heat loss from the billet and gives the billet a stable temperature profile with respect to the number of forgings. The code, which is simple to use, is being extended to bulk-deformation problems. Given a practical range of possible machine, billet and process specifics, the code should be able to arrive at the combination of these parameters which gives the best thermal characteristics of the die-billet system. The code is also envisaged as being useful in the design of isothermal dies and processes.
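A toy parametric sweep like the one below illustrates the kind of die-preheat study the abstract describes, i.e. looking for a preheat temperature at which the billet exit temperature drifts little across the batch. The recurrence and all coefficients are invented placeholders, not the validated code.

```python
import numpy as np

def billet_exit_temperatures(T_preheat, n_forgings, T_billet0=450.0,
                             k_loss=0.08, k_die_gain=0.02):
    """Very simplified recurrence: the die warms over the batch, so the heat
    lost by each successive billet shrinks. All coefficients are placeholders."""
    T_die = T_preheat
    exit_temps = np.empty(n_forgings)
    for i in range(n_forgings):
        exit_temps[i] = T_billet0 - k_loss * (T_billet0 - T_die)  # billet after this cycle
        T_die += k_die_gain * (T_billet0 - T_die)                 # die creeps toward billet temp
    return exit_temps

# Sweep the die preheat temperature and report how much the billet exit
# temperature drifts between the first and the 150th forging of the batch.
for T_pre in (100, 200, 300, 400):
    temps = billet_exit_temperatures(T_pre, 150)
    print(f"preheat {T_pre:3d} C: drift over batch = {temps[-1] - temps[0]:.1f} C")
```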

Relevance:

10.00%

Publisher:

Abstract:

Monitoring a distribution network implies working with a huge amount of data coming from the different elements that interact in the network. This paper presents a visualization tool that simplifies the task of searching the database for useful information applicable to fault management or preventive maintenance of the network.

Relevance:

10.00%

Publisher:

Abstract:

'Scientific community' is common currency in the study of science, largely due to Kuhn's use of the term in his highly influential book The Structure of Scientific Revolutions. As this article explains, however, 'scientific community' was not of Kuhn's coining. It was hinted at by Peirce and expressly designated by Royce. On a few occasions Fleck affirmed a scientific community, while Polanyi studied it in some detail. The article concludes by comparing these thinkers' 'communitarian' interpretations of science.

Relevance:

10.00%

Publisher:

Abstract:

The discovery, development or invention of new objects and phenomena by humankind requires a new set of words to be coined or adopted to describe them. This is also true of the Information and Communication Technology (ICT) world. Words are not neutral, regardless of which dialect or language they occur in. They carry with them associations and connotations based on their previous applications and alliances, augmented by their shapes, sounds, rhymes and rhythms. The subtext that word choice creates, while often not recognised or acknowledged, is important in considering how communication operates in, and shapes, Information Technology (IT) environments. Many words that are now embedded in the ICT lexicon continue to be informed by these earlier meanings, some of which, in the English lexis, are drawn from myths. The vernacular of the ICT lexis reflects its openness to new ideas, the nature of its users, its English-language roots and its Western cultural origins. This contributes to a particular communication style, but such a lexis can prove problematic for users from non-English-speaking backgrounds and/or different cultures. As the ICT vocabulary continues to evolve, these language and cultural underpinnings are coming under challenge, suggesting a language and cultural future very different from the past. This, in turn, will create a subtext that affects all users.

Relevance:

10.00%

Publisher:

Abstract:

This paper explores the situated body by briefly surveying the historical studies of effect and of affect which converge in current work on attention. This common approach to the situated body through attention prompted the coining of a more inclusive term, Æffect, to indicate the situated body's mode of observation. Examples from the work of the artists-turned-architects Arakawa and Gins will be discussed to show how architectural environments can act as heuristic tools that allow the situated body to research its own conditions. Rather than isolating effect from affect, observer from subject, or organism from environment, Arakawa and Gins' work optimises the use of situated complexity in the study of the site of person. By constructing surroundings in which to observe and learn about the shape of awareness, their procedural architecture suggests ways in which the interaction of top-down conceptual knowledge and bottom-up perceptual learning may construct possibilities in emergent rather than programmatic ways.

Relevance:

10.00%

Publisher:

Abstract:

Graduate Programme in Letters (Pós-graduação em Letras) - FCLAS

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this study was to conduct a descriptive, exploratory analysis of the utilization of both traditional healing methods and Western biomedical approaches to health care among members of the Vietnamese community in Houston, Texas. The first goal of the study was to identify the type(s) of health care that the Vietnamese use. The second goal was to highlight the numerous factors that may influence why certain health care choices are made. The third goal was to examine the issue of preference, to determine which practices would be used if limiting factors did not exist. There were 81 participants, consisting of males and females aged 18 years or older. The core group of participants consisted of Vietnamese students from the University of Houston-Downtown and volunteer staff members from VN TeamWork. Additional participants were recruited by the snowball method, asking the students and staff members to recommend others for the study. Surveys and informed consent forms were in English and Vietnamese, and participants were given the choice to take the surveys face-to-face or on their own. Surveys consisted of structured questions with predetermined choices, as well as open-ended questions to allow more detailed information. The quantitative and qualitative data were coded and entered into a database using SPSS software version 15.0. Results indicated that participants used both traditional (38.3%) and biomedical (59.3%) healing, with 44.4% stating that the choice of treatment depended on the illness. Coining was the most used traditional healing method, clearly still used by all ages, and it was also the method most used when fear and delayed Western medical treatment were involved. It was determined that insurance status, more than household income, guided health care choices. A person's age, number of years spent in the United States, age at migration, and the use of certain traditional healing methods such as coining all played a role in the importance of the health care practitioner speaking Vietnamese. The most important finding was that 64.2% of the participants preferred both traditional and Western medicine because both methods work.

Relevance:

10.00%

Publisher:

Abstract:

The Internet is evolving towards what is known as the Live Web. In this new stage of the Internet's evolution, a multitude of social data streams is placed at the service of users. Thanks to these data sources, users have gone from browsing static web pages to interacting with applications that offer personalized content based on their preferences. Each user interacts daily with multiple applications that issue notifications and alerts; in this sense every user is a source of events, and users often feel overwhelmed and unable to process all of that information on demand. To deal with this overload, numerous tools that automate the most common tasks have appeared, from inbox managers and social-network alert managers to complex CRMs or smart-home hubs. The drawback is that, although they solve common problems, they cannot adapt to the needs of each individual user by offering a personalized solution. Task Automation Services (TAS) entered the scene from 2012 onwards to address this limitation. Given their similarity, these services can also be regarded as a new, user-centred approach to mash-up technology. Users of these platforms can interconnect services, sensors and other Internet-connected devices, designing the automations that fit their needs. The proposal has been widely accepted by users, and this has led a multitude of platforms offering TAS to enter the scene. As this is a new field of research, this thesis presents the main characteristics of TAS, describes their components, and identifies the fundamental dimensions that define them and allow their classification. This work coins the term Task Automation Service (TAS), giving a formal description of these services and their components (called channels), and provides a reference architecture. There is also a lack of tools for describing automation services and automation rules. In this regard, this thesis proposes a common model, realized as the EWE (Evented WEb) ontology. This model makes it possible to compare and match channels and automations from different TASs, a considerable contribution to the portability of user automations between platforms. Likewise, given the semantic nature of the model, it allows elements from external sources, such as Linked Open Data, to be included in automations and reasoned over. Using this model, a dataset of channels and automations has been generated from data obtained from some of the TASs on the market. As a final step towards a common model for describing TAS, an algorithm has been developed to learn ontologies automatically from the data in the dataset. This favours the discovery of new channels and reduces the cost of maintaining the model, which is updated semi-automatically.

In conclusion, the main contributions of this thesis are: i) describing the state of the art in task automation and coining the term Task Automation Service; ii) developing an ontology for modelling the components of TASs and automations; iii) populating a dataset of channel and automation data, used to develop an algorithm for automatic ontology learning; and iv) designing an agent architecture to assist users in creating automations.

ABSTRACT

The new stage in the evolution of the Web (the Live Web or Evented Web) puts a wealth of social data streams at the service of users, who no longer browse static web pages but interact with applications that present them with contextual and relevant experiences. Given that each user is a potential source of events, a typical user often gets overwhelmed. To deal with that huge amount of data, multiple automation tools have emerged, ranging from simple social media managers or notification aggregators to complex CRMs or smart-home hubs/apps. As a downside, they cannot be tailored to the needs of every single user. As a natural response to this downside, Task Automation Services broke onto the Internet. They may be seen as a new model of mash-up technology for combining social streams, services and connected devices from an end-user perspective: end-users are empowered to connect those streams however they want, designing the automations they need. The number of platforms following this approach shot up early on and keeps growing fast. Being a novel field, this thesis aims to shed light on it, presenting and exemplifying the main characteristics of Task Automation Services, describing their components, and identifying several dimensions along which to classify them. This thesis coins the term Task Automation Service (TAS) by providing a formal definition of these services and their components (called channels), as well as a TAS reference architecture. There is also a lack of tools for describing automation services and automation rules. In this regard, this thesis proposes a theoretical common model of TAS and formalizes it as the EWE ontology. This model makes it possible to compare channels and automations from different TASs, which has a high impact on interoperability, and it enhances automations by providing a mechanism to reason over external sources such as Linked Open Data. Based on this model, a dataset of TAS components was built, harvesting data from the web sites of actual TASs. Going a step further towards this common model, an algorithm for categorizing these components was designed, enabling their discovery across different TASs. Thus, the main contributions of the thesis are: i) surveying the state of the art on task automation and coining the term Task Automation Service; ii) providing a semantic common model for describing TAS components and automations; iii) populating a categorized dataset of TAS components, used to learn ontologies of particular domains from the TAS perspective; and iv) designing an agent architecture for assisting users in setting up automations that is aware of their context and acts accordingly.
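To illustrate the kind of channel and rule description such a semantic model enables, here is a small RDF sketch built with rdflib. The namespace IRIs and the class and property names used here (Channel, Event, Action, Rule, providedBy, triggeredBy, performs) are illustrative assumptions for the example and may not match the actual EWE vocabulary.

```python
from rdflib import Graph, Namespace, Literal, RDF

# Illustrative namespaces; the real EWE ontology may use different IRIs and terms.
EWE = Namespace("http://example.org/ewe#")
EX = Namespace("http://example.org/automations#")

g = Graph()
g.bind("ewe", EWE)
g.bind("ex", EX)

# A channel (e.g. a weather service) that can generate events.
g.add((EX.WeatherChannel, RDF.type, EWE.Channel))
g.add((EX.RainForecast, RDF.type, EWE.Event))
g.add((EX.RainForecast, EWE.providedBy, EX.WeatherChannel))

# A channel (e.g. a messaging app) that can perform actions.
g.add((EX.MessagingChannel, RDF.type, EWE.Channel))
g.add((EX.SendMessage, RDF.type, EWE.Action))
g.add((EX.SendMessage, EWE.providedBy, EX.MessagingChannel))

# The automation rule: "if rain is forecast, send me a message".
g.add((EX.UmbrellaReminder, RDF.type, EWE.Rule))
g.add((EX.UmbrellaReminder, EWE.triggeredBy, EX.RainForecast))
g.add((EX.UmbrellaReminder, EWE.performs, EX.SendMessage))
g.add((EX.UmbrellaReminder, EWE.description,
       Literal("If rain is forecast, send a reminder to take an umbrella.")))

print(g.serialize(format="turtle"))
```

Because the rule, its trigger and its action are all plain RDF resources, the same graph could be merged with channel descriptions harvested from different TAS platforms or with Linked Open Data sources and queried uniformly, which is the portability argument made above.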

Relevance:

10.00%

Publisher:

Abstract:

Apoptosis, also called programmed cell death, has attracted great attention in recent years. After its discovery by Carl Vogt in 1842, apoptosis research lay dormant for more than a century. Its rediscovery in the second half of this century, and the coining of the term apoptosis in 1972 by Kerr, Wyllie, and Currie, ignited an unparalleled interest in this field of science. The number of publications related to apoptosis has been growing exponentially every year ever since. This is mainly due to three major advances, two of which have been made recently and one that is currently unfolding. First, studies with the small nematode Caenorhabditis elegans have identified a number of apoptosis-regulating genes: the first evidence that cell death is an active process under genetic control. Many of these genes have mammalian homologs that, like their worm counterparts, seem to regulate mammalian apoptosis. Second, elucidation of the signal transduction pathways of apoptosis has led in particular to the identification of specific death-signaling molecules such as a new family of cysteine proteases, the caspases. Third, it has now become clear that many diseases are characterized by dysregulation of apoptotic programs. Many of these programs involve a family of receptors and their ligands, the death receptor/ligand family. The hope now is to interfere with apoptosis regulation in these systems and to develop new therapeutic concepts.