988 results for REFINEMENT


Relevance:

10.00%

Publisher:

Abstract:

In early stages of architectural design, as in other design domains, the language used is often very abstract. In architectural design, for example, architects and their clients use experiential terms such as "private" or "open" to describe spaces. If we are to build programs that can help designers during this early-stage design, we must give those programs the capability to deal with concepts on the level of such abstractions. The work reported in this thesis sought to do that, focusing on two key questions: How are abstract terms such as "private" and "open" translated into physical form? How might one build a tool to assist designers with this process? The Architect's Collaborator (TAC) was built to explore these issues. It is a design assistant that supports iterative design refinement, and that represents and reasons about how experiential qualities are manifested in physical form. Given a starting design and a set of design goals, TAC explores the space of possible designs in search of solutions that satisfy the goals. It employs a strategy we've called dependency-directed redesign: it evaluates a design with respect to a set of goals, then uses an explanation of the evaluation to guide proposal and refinement of repair suggestions; it then carries out the repair suggestions to create new designs. A series of experiments was run to study TAC's behavior. Issues of control structure, goal set size, goal order, and modification operator capabilities were explored. In addition, TAC's use as a design assistant was studied in an experiment using a house in the process of being redesigned. TAC's use as an analysis tool was studied in an experiment using Frank Lloyd Wright's Prairie houses.
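The dependency-directed redesign cycle described above can be sketched as a simple loop. This is an illustrative reconstruction, not TAC's actual code; the goal and repair interfaces (`evaluate`, `propose_repairs`, the `(satisfied, blamed_elements)` goal signature) are assumptions:

```python
# Minimal sketch of dependency-directed redesign (illustrative, not TAC's API).
# A goal returns (satisfied?, blamed_elements); the explanation of a failed
# evaluation -- the blamed elements -- directs where repairs are proposed.

def evaluate(design, goals):
    """Evaluate a design, returning each failed goal with its explanation."""
    return [(goal, blamed) for goal in goals
            for ok, blamed in [goal(design)] if not ok]

def redesign(design, goals, propose_repairs, max_iters=10):
    """Iteratively repair a design until all goals hold or we give up."""
    for _ in range(max_iters):
        failures = evaluate(design, goals)
        if not failures:
            return design                     # all goals satisfied
        for goal, blamed in failures:
            # Repairs are proposed only for the design elements blamed by
            # the evaluation, then carried out to create a new design.
            for repair in propose_repairs(goal, blamed):
                design = repair(design)
    return design
```

With a toy "openness" goal and a repair operator that raises openness one step at a time, the loop converges in a few iterations.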

Relevance:

10.00%

Publisher:

Abstract:

“What is value in product development?” is the key question of this paper. The answer is critical to the creation of lean in product development. By knowing how much value is added by product development (PD) activities, decisions can be more rationally made about how to allocate resources, such as time and money. In order to apply the principles of Lean Thinking and remove waste from the product development system, value must be precisely defined. Unfortunately, value is a complex entity that is composed of many dimensions and has thus far eluded definition on a local level. For this reason, research has been initiated on “Measuring Value in Product Development.” This paper serves as an introduction to this research. It presents the current understanding of value in PD, the critical questions involved, and a specific research design to guide the development of a methodology for measuring value. Work in PD value currently focuses on either high-level perspectives on value, or detailed looks at the attributes that value might have locally in the PD process. Models that attempt to capture value in PD are reviewed. These methods, however, do not capture the depth necessary to allow for application. A methodology is needed to evaluate activities on a local level to determine the amount of value they add and their sensitivity with respect to performance, cost, time, and risk. Two conceptual tools are proposed. The first is a conceptual framework for value creation in PD, referred to here as the Value Creation Model. The second tool is the Value-Activity Map, which shows the relationships between specific activities and value attributes. These maps will allow a better understanding of the development of value in PD, will facilitate comparison of value development between separate projects, and will provide the information necessary to adapt process analysis tools (such as DSM) to consider value. 
The key questions that this research entails are:
· What are the primary attributes of lifecycle value within PD?
· How can one model the creation of value in a specific PD process?
· Can a useful methodology be developed to quantify value in PD processes?
· What are the tools necessary for application?
· What PD metrics will be integrated with the necessary tools?
The research milestones are:
· Collection of value attributes and activities (September 2000)
· Development of a methodology of value-activity association (October 2000)
· Testing and refinement of the methodology (January 2001)
· Tool development (March 2001)
· Presentation of findings at the July INCOSE conference (April 2001)
· Delivery of a thesis that captures a formalized methodology for defining value in PD, including LEM data sheets (June 2001)
The research design aims for two primary deliverables: a methodology to guide the incorporation of value, and a product development tool that allows direct application.
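As a rough illustration only, a Value-Activity Map could be held as a table from activities to weighted value attributes; every activity, attribute, and weight below is an invented placeholder, not a result of this research:

```python
# Hypothetical sketch of a Value-Activity Map: a table linking PD activities
# to the value attributes (performance, cost, time, risk) they affect.
# All names and weights are invented placeholders.

value_activity_map = {
    "requirements analysis": {"performance": 0.8, "risk": 0.6},
    "detailed design":       {"performance": 0.9, "cost": 0.5},
    "prototype testing":     {"risk": 0.9, "time": -0.3},
}

def value_added(activity, weights):
    """Score an activity by weighting its attribute contributions."""
    contributions = value_activity_map.get(activity, {})
    return sum(weights.get(attr, 0.0) * w for attr, w in contributions.items())

# Rank activities for a project that prizes risk reduction twice as highly:
priorities = {"performance": 1.0, "cost": 1.0, "time": 1.0, "risk": 2.0}
ranked = sorted(value_activity_map, key=lambda a: value_added(a, priorities),
                reverse=True)
```

Such a table would also make it straightforward to compare value development between separate projects, as the paper proposes.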

Relevance:

10.00%

Publisher:

Abstract:

An unsupervised approach to image segmentation which fuses region and boundary information is presented. The proposed approach takes advantage of the combined use of three different strategies: the guidance of seed placement, the control of the decision criterion, and boundary refinement. The new algorithm uses the boundary information to initialize a set of active regions which compete for the pixels in order to segment the whole image. The method is implemented on a multiresolution representation which ensures noise robustness as well as computational efficiency. The accuracy of the segmentation results has been demonstrated through an objective comparative evaluation of the method.

Relevance:

10.00%

Publisher:

Abstract:

The mega-mining operation of CERREJON is carried out under the highest standards of safety and quality, with a commitment to deliver an excellent product to the market while avoiding deterioration of the environment. One of the company's greatest strengths lies in the integration of the production processes linking the coal mine, the railway and the port, making the operation efficient, guaranteeing consistently high results, and driving failure rates in the extraction and transport of Colombian coal to the world toward zero. The MATERIALS department is the starting point for guaranteeing the operation, since it has the major task of acquiring and delivering the goods and services the company requires at the lowest total evaluated cost, in the shortest possible time, and within the framework of Colombian legislation governing the customs-clearance process, with a strong emphasis on developing solid, synergistic relationships among all links of the chain. The department's entire process is framed within a cycle that must become ever more effective and efficient; hence the search for improvement options to fine-tune its processes. Managing a mega-operation of this kind requires tuning the supply-chain network to the utmost, seeking a flow of product, information and funds that guarantees product availability, generates high profitability for the company, and keeps operating costs under control.
Since the supply chain is open to improvement through interaction with suppliers, collaborators and customers, the aim is to take full advantage of this by analysing CERREJON's current chain and presenting an improvement option for one link of the production process. This option had been considered in previous years, but now, thanks to refinements in the information systems and the active participation of suppliers, a viable way has been found to eliminate a rework step, guaranteeing efficiency and effectiveness in speeding up CERREJON's production cycle. The count-reform project creates the improvement opportunity in the department's chain: eliminating the recount at the mine, performed after the initial count at Puerto Bolívar, of materials arriving as seaborne imports. Fine-tuning the documentary receipt at the cargo consolidator and at the largest direct-delivery suppliers (HITACHI and MCA) to the carrier opens up the use of handheld terminals which, together with the documentary adjustments, will allow the cargo to be counted and sorted by location before being sent by train to LMN, reducing delivery time to the final customer, the re-handling costs of counting again, and the costs associated with late container returns.

Relevance:

10.00%

Publisher:

Abstract:

Introduction: The emergency severity scale is a tool that improves patient safety in emergency departments. This study evaluates the application of the ESI 4.0 scale in terms of timeliness of care and resource consumption at the Fundación Santa Fé de Bogotá, comparing the results with standard parameters. Methods: Analytical, cross-sectional observational study. 385 patients, randomized by triage level, were included. Demographic data and variables such as resource consumption and patient disposition were collected for description and analysis. Results: Mean age was 44.9 years (95% CI 42.9–46.9); 54.5% were women. Mean waiting times were 1.39 min for level 1, 22.9 min for level 2, 41.9 min for level 3, 56.9 min for level 4 and 52.1 min for level 5. Mean length of stay in the emergency department was 5.9 hours, and 78.9% of patients consumed resources. Compared with international standards, waiting times for levels 1, 2 and 3 were significantly longer (p < 0.05), level 4 was similar (p = 0.51) and level 5 was significantly shorter (p = 0.00). Discussion: The ESI 4.0 scale is a safe tool, behaving similarly to emergency-department standards of care in terms of timeliness of care and resource consumption.

Relevance:

10.00%

Publisher:

Abstract:

The two proteins studied in this work (ECP, or RNase 3, and RNase 1ΔN7) belong to the RNase A superfamily and are of special interest for their potential application in cancer therapy and/or diagnosis. Beyond its ribonucleolytic capacity, ECP displays other activities, such as antibacterial, helminthotoxic and cytotoxic actions against mammalian cells and tissues. A defensive role has also been proposed for the pancreatic-type RNase 1 expressed by human endothelial cells. RNase 1ΔN7, in contrast, does not display these biological actions, although it shows markedly lower affinity for its specific inhibitor than other members of the family. Both ECP and RNase 1ΔN7 were crystallized using the hanging-drop vapour-diffusion technique, and their three-dimensional (3D) structures were determined by molecular replacement. Data to 1.75 and 1.90 Å resolution, respectively, were used for the refinement of the structures. Both molecules exhibit the typical α + β fold that characterizes all members of the RNase A superfamily. Nevertheless, the differences they show with respect to the structures of other RNases explain, on the one hand, the low ribonucleolytic activity of these enzymes and, on the other, their functional peculiarities.

Relevance:

10.00%

Publisher:

Abstract:

This thesis focuses on Computer Vision and, more specifically, on image segmentation, one of the basic stages of image analysis, which consists of dividing the image into a set of visually distinct, uniform regions with respect to intensity, colour or texture. A strategy is proposed based on the complementary use of region and boundary information during the segmentation process, an integration that alleviates some of the basic problems of traditional segmentation. The boundary information is used first to identify the number of regions present in the image and to place a seed inside each one, in order to model the regions' characteristics statistically and thereby define the region information. This information, together with the boundary information, is used to define an energy function expressing the properties required of the desired segmentation: uniformity inside the regions and contrast with neighbouring regions at the boundaries. A set of active regions then begins to grow, competing for the image pixels, in order to optimize the energy function or, in other words, to find the segmentation that best fits the requirements expressed in that function. Finally, the whole process is embedded in a pyramidal structure, which allows the segmentation result to be refined progressively and improves its computational cost. The strategy has been extended to the texture segmentation problem, which involves some basic considerations such as modelling the regions from a set of texture features and extracting the boundary information when texture is present in the image.
Finally, the approach has been extended to image segmentation that takes both colour and texture properties into account. Here, the joint use of non-parametric density-estimation techniques to describe colour, and of texture features based on the co-occurrence matrix, is proposed to model the image regions adequately and completely. The proposal has been evaluated objectively and compared with different integration techniques using synthetic images. Experiments with real images have also been included, with very positive results.
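The seeded region-competition idea can be illustrated with a minimal one-dimensional sketch. It keeps only the region-uniformity term of the energy (the boundary term and the pyramidal structure are omitted), and the image and seed positions are invented:

```python
import numpy as np

# Toy 1-D sketch of seeded region competition: each pixel is claimed by the
# region whose statistical model (here just a running mean) it fits best,
# and region models are re-estimated after every pass.

def region_competition(image, seeds, n_iters=20):
    means = image[list(seeds)].astype(float)   # seed each region's model
    for _ in range(n_iters):
        # Assignment step: minimise the uniformity term of the energy.
        labels = np.argmin(np.abs(image[:, None] - means[None, :]), axis=1)
        # Re-estimate region statistics from the new partition.
        for k in range(len(seeds)):
            if np.any(labels == k):
                means[k] = image[labels == k].mean()
    return labels

image = np.array([0.1, 0.2, 0.1, 0.9, 0.8, 0.9])
labels = region_competition(image, seeds=[0, 3])   # two seeded regions
```

In the full method the seeds come from the boundary information and the energy also penalises weak contrast at region limits; this sketch shows only the competition mechanism.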

Relevance:

10.00%

Publisher:

Abstract:

This workshop paper reports recent developments to a vision system for traffic interpretation which relies extensively on the use of geometrical and scene context. Firstly, a new approach to pose refinement is reported, based on forces derived from prominent image derivatives found close to an initial hypothesis. Secondly, a parameterised vehicle model is reported, able to represent different vehicle classes. This general vehicle model has been fitted to sample data, and subjected to a Principal Component Analysis to create a deformable model of common car types having 6 parameters. We show that the new pose recovery technique is also able to operate on the PCA model, to allow the structure of an initial vehicle hypothesis to be adapted to fit the prevailing context. We report initial experiments with the model, which demonstrate significant improvements to pose recovery.
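The PCA step, building a low-dimensional deformable model from sample shape vectors, can be sketched as follows; the data here are random placeholders rather than real vehicle measurements, and the 12-component shape vector is an assumption:

```python
import numpy as np

# Sketch of building a deformable model by PCA over sample shape vectors,
# in the spirit of the 6-parameter car model. Sample data are random
# placeholders, not fitted vehicle shapes.

rng = np.random.default_rng(0)
samples = rng.normal(size=(40, 12))         # 40 vehicles x 12 shape parameters

mean_shape = samples.mean(axis=0)
_, s, vt = np.linalg.svd(samples - mean_shape, full_matrices=False)
modes = vt[:6]                              # keep 6 principal deformation modes

def deform(params):
    """Instantiate a shape from a 6-vector of deformation parameters."""
    return mean_shape + params @ modes

shape = deform(np.zeros(6))                 # zero parameters -> mean shape
```

Fitting the model to context then amounts to searching over this 6-parameter space rather than over all shape components.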

Relevance:

10.00%

Publisher:

Abstract:

A new formulation of a pose refinement technique using ``active'' models is described. An error term derived from the detection of image derivatives close to an initial object hypothesis is linearised and solved by least squares. The method is particularly well suited to problems involving external geometrical constraints (such as the ground-plane constraint). We show that the method is able to recover both the pose of a rigid model, and the structure of a deformable model. We report an initial assessment of the performance and cost of pose and structure recovery using the active model in comparison with our previously reported ``passive'' model-based techniques in the context of traffic surveillance. The new method is more stable, and requires fewer iterations, especially when the number of free parameters increases, but shows somewhat poorer convergence.
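A single linearised least-squares update of the kind described can be sketched as follows; the residual function here is a linear toy stand-in for the image-derivative error term, and the numerical-Jacobian approach is an assumption for illustration:

```python
import numpy as np

# One linearised pose-update step: given an error vector e(pose) and its
# Jacobian J, solve J @ delta ~= -e by least squares and update the pose.

def pose_step(pose, residual, eps=1e-6):
    e = residual(pose)
    # Numerical Jacobian of the residual w.r.t. each pose parameter.
    J = np.column_stack([
        (residual(pose + eps * np.eye(len(pose))[i]) - e) / eps
        for i in range(len(pose))
    ])
    delta, *_ = np.linalg.lstsq(J, -e, rcond=None)
    return pose + delta

# Toy linear residual: recover a 2-parameter pose from 3 measurements.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([2.0, -1.0, 1.0])
residual = lambda p: A @ p - b
pose = pose_step(np.zeros(2), residual)
```

Because the toy residual is linear, one step lands on the least-squares solution; the real image-derived error term would require a few iterations, as the abstract notes.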

Relevance:

10.00%

Publisher:

Abstract:

Simulations of the global atmosphere for weather and climate forecasting require fast and accurate solutions, and so operational models use high-order finite differences on regular structured grids. This precludes the use of local refinement; techniques allowing local refinement are either expensive (e.g. high-order finite element techniques) or have reduced accuracy at changes in resolution (e.g. unstructured finite volume with linear differencing). We present solutions of the shallow-water equations for westerly flow over a mid-latitude mountain from a finite-volume model written using OpenFOAM. A second/third-order accurate differencing scheme is applied on arbitrarily unstructured meshes made up of various shapes and refinement patterns. The results are as accurate as those of equivalent-resolution spectral methods. Using lower-order differencing reduces accuracy at a refinement pattern, which allows errors from refinement of the mountain to accumulate and reduces the global accuracy over a 15-day simulation. We have therefore introduced a scheme which fits a 2D cubic polynomial approximately on a stencil around each cell. Using this scheme means that refinement of the mountain improves the accuracy after a 15-day simulation. This is a more severe test of local mesh refinement for global simulations than has previously been presented, but a realistic one if these techniques are to be used operationally. These efficient, high-order schemes may make it possible for local mesh refinement to be used by weather and climate forecast models.
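The polynomial-fit idea can be illustrated in one dimension (the paper's scheme is two-dimensional): fit a cubic by least squares over a stencil of cell values and evaluate it at a face. The stencil and field below are invented for illustration:

```python
import numpy as np

# 1-D illustration of the stencil-fit scheme: least-squares fit of a cubic
# over a 5-cell stencil, then evaluation at a face between cells.

def face_value(cell_centres, cell_values, face_x):
    V = np.vander(cell_centres, 4)          # cubic Vandermonde matrix
    coeffs, *_ = np.linalg.lstsq(V, cell_values, rcond=None)
    return np.polyval(coeffs, face_x)

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])   # 5-cell stencil
u = x**3 - x                                 # the field happens to be cubic
f = face_value(x, u, 0.5)                    # exact here: 0.5**3 - 0.5
```

Because the fit is approximate (least squares over more cells than coefficients), the scheme degrades gracefully on arbitrarily unstructured meshes rather than depending on a particular stencil shape.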

Relevance:

10.00%

Publisher:

Abstract:

Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.

Relevance:

10.00%

Publisher:

Abstract:

An algorithm is presented for the generation of molecular models of defective graphene fragments, containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which are then subject to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities are then converted to a molecular structure and subject to geometry optimization using a standard molecular modeling package to generate final atomic coordinates. On the basis of molecular mechanics with minimization, this automated method can generate structures, which conform to user-supplied criteria and avoid the potential bias associated with the manual building of structures. One application of the algorithm is the generation of structures for the evaluation of the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
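The iterative refinement cycle can be caricatured without the geometry: below, the Delaunay triangulation and Voronoi dual are replaced by direct perturbation of a ring-size list until a user-defined defect count is met. This is purely illustrative; none of the names come from the paper:

```python
import random

# Abstract sketch of the accept/refine cycle: mutate a multiset of ring
# sizes until the number of non-hexagonal 'defect' rings matches the
# user-defined target. Real point deletion/addition and re-triangulation
# are replaced here by random ring-size perturbations.

def refine(ring_sizes, target_defects, rng, max_iters=10_000):
    sizes = list(ring_sizes)
    for _ in range(max_iters):
        defects = sum(1 for s in sizes if s != 6)
        if defects == target_defects:
            return sizes                       # criteria met
        i = rng.randrange(len(sizes))
        if defects > target_defects:
            sizes[i] = 6                       # heal a defect
        else:
            sizes[i] = rng.choice([5, 7])      # introduce a 5-/7-ring defect
    raise RuntimeError("criteria not met within max_iters")

rng = random.Random(0)
result = refine([6] * 50, target_defects=4, rng=rng)
```

In the actual algorithm the termination test is the same kind of user-supplied criterion, but each mutation is a point deletion or addition followed by re-triangulation, so the ring statistics emerge from the geometry rather than being set directly.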

Relevance:

10.00%

Publisher:

Abstract:

A scale-invariant moving finite element method is proposed for the adaptive solution of nonlinear partial differential equations. The mesh movement is based on a finite element discretisation of a scale-invariant conservation principle incorporating a monitor function, while the time discretisation of the resulting system of ordinary differential equations is carried out using a scale-invariant time-stepping which yields uniform local accuracy in time. The accuracy and reliability of the algorithm are successfully tested against exact self-similar solutions where available, and otherwise against a state-of-the-art h-refinement scheme for solutions of a two-dimensional porous medium equation problem with a moving boundary. The monitor functions used are the dependent variable and a monitor related to the surface area of the solution manifold.

Relevance:

10.00%

Publisher:

Abstract:

A one-dimensional water column model using the Mellor and Yamada level 2.5 parameterization of vertical turbulent fluxes is presented. The model equations are discretized with a mixed finite element scheme. Details of the finite element discrete equations are given, and adaptive mesh refinement strategies are presented. The refinement criterion is an a posteriori error estimator based on stratification, shear and distance to the surface. The model's performance is assessed by studying the stress-driven penetration of a turbulent layer into a stratified fluid. This example illustrates the ability of the model to follow some internal structures of the flow and paves the way for truly generalized vertical coordinates.
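One possible shape for such an error estimator, combining stratification, shear and proximity to the surface, is sketched below; the weights, profiles and threshold are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

# Sketch of an a-posteriori refinement indicator: flag cells where
# stratification, shear, or proximity to the surface is large.
# Weights, profiles, and threshold are invented for illustration.

def refinement_indicator(z, density, velocity, w=(1.0, 1.0, 0.5)):
    strat = np.abs(np.gradient(density, z))      # stratification |d(rho)/dz|
    shear = np.abs(np.gradient(velocity, z))     # shear |du/dz|
    near_surface = 1.0 / (1.0 + np.abs(z))       # z = 0 at the surface
    return w[0] * strat + w[1] * shear + w[2] * near_surface

z = np.linspace(0.0, -50.0, 11)                  # depth (m), surface at z = 0
density = np.where(z > -20.0, 1025.0, 1027.0)    # pycnocline near 20 m depth
velocity = np.exp(z / 10.0)                      # wind-driven shear layer
indicator = refinement_indicator(z, density, velocity)
refine = indicator > np.median(indicator)        # cells flagged for refinement
```

Cells where the indicator exceeds the threshold would be subdivided, concentrating resolution in the entrainment zone where the turbulent layer penetrates the stratification.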

Relevance:

10.00%

Publisher:

Abstract:

Accuracy and mesh generation are key issues for the high-resolution hydrodynamic modelling of the whole Great Barrier Reef. Our objective is to generate suitable unstructured grids that can resolve topological and dynamical features like tidal jets and recirculation eddies in the wake of islands. A new strategy is suggested to refine the mesh in areas of interest taking into account the bathymetric field and an approximated distance to islands and reefs. Such a distance is obtained by solving an elliptic differential operator, with specific boundary conditions. Meshes produced illustrate both the validity and the efficiency of the adaptive strategy. Selection of refinement and geometrical parameters is discussed.
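The idea of an elliptically defined distance can be sketched with a simple Poisson solve: set the field to zero on island cells, force it with a constant source elsewhere, and use the resulting smooth field as a proxy for distance when sizing the mesh. The grid, island mask, Jacobi iteration count, and periodic domain edges below are all illustrative assumptions:

```python
import numpy as np

# Sketch of an "elliptic distance": solve lap(phi) = -1 with phi = 0 on
# island cells (Dirichlet condition) by Jacobi iteration on a unit grid.
# phi grows smoothly away from the islands and can size the mesh:
# fine near islands and reefs, coarse in the open ocean.

n = 32
phi = np.zeros((n, n))
island = np.zeros((n, n), dtype=bool)
island[12:20, 12:20] = True                    # a square "island"

for _ in range(2000):                          # Jacobi iterations
    phi = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                  + np.roll(phi, 1, 1) + np.roll(phi, -1, 1) + 1.0)
    phi[island] = 0.0                          # enforce zero on the island
```

Unlike a raw Euclidean distance, this field inherits the smoothness of the elliptic operator, so the mesh-size transition away from islands is gradual, which is exactly what unstructured mesh generators need.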