974 results for 090407 Process Control and Simulation


Relevance:

100.00%

Abstract:

The master production schedule (MPS) plays an important role in an integrated production planning system. It converts the strategic planning defined in a production plan into tactical operational execution. The MPS is also a tool for top management to control manufacturing resources, and it becomes the input to downstream planning levels such as material requirements planning (MRP) and capacity requirements planning (CRP). Hence, inappropriate decisions in MPS development may lead to infeasible execution, which ultimately causes poor delivery performance. One must ensure that the proposed MPS is valid and realistic before it is released to the real manufacturing system. In practice, where the production environment is stochastic in nature, developing the MPS is no longer a simple task. Varying processing times and random events such as machine failures are just some of the underlying causes of uncertainty that can hardly be addressed at the planning stage, so a valid and realistic MPS is difficult to achieve. The MPS creation problem becomes even more sophisticated when decision makers pursue multiple objectives: minimizing inventory, maximizing customer satisfaction, and maximizing resource utilization. This study proposes a methodology for MPS creation that deals with these obstacles: it takes uncertainty into account and, at the same time, trades off the conflicting objectives. It incorporates fuzzy multi-objective linear programming (FMOLP) and discrete event simulation (DES) for MPS development.
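To make the FMOLP component concrete, here is a minimal sketch using Zimmermann's max-min formulation, one standard way to turn a fuzzy multi-objective LP into an ordinary LP. The two products, their coefficients and the aspiration/tolerance levels are invented for illustration and are not taken from the study.

from scipy.optimize import linprog

# Toy MPS: choose quantities x1, x2 of two products for one period.
# Objective 1 (minimize inventory cost):  z1 = 2*x1 + 3*x2
# Objective 2 (maximize output/service):  z2 = x1 + x2
# Capacity: x1 + 2*x2 <= 100; demand caps: x1 <= 60, x2 <= 40.
z1_min, z1_max = 100.0, 250.0   # assumed aspiration/tolerance for cost
z2_min, z2_max = 40.0, 90.0     # assumed aspiration/tolerance for output
d1, d2 = z1_max - z1_min, z2_max - z2_min

# Decision vector [x1, x2, lam]; maximize lam == minimize -lam, where
# lam <= mu1(z1) and lam <= mu2(z2) are the linear fuzzy memberships.
c = [0.0, 0.0, -1.0]
A_ub = [[1.0, 2.0, 0.0],              # capacity constraint
        [2.0 / d1, 3.0 / d1, 1.0],    # lam <= (z1_max - z1) / d1
        [-1.0 / d2, -1.0 / d2, 1.0]]  # lam <= (z2 - z2_min) / d2
b_ub = [100.0, z1_max / d1, -z2_min / d2]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 60), (0, 40), (0, 1)])
x1, x2, lam = res.x
print(f"MPS: x1={x1:.1f}, x2={x2:.1f}, satisfaction lambda={lam:.2f}")

In the proposed methodology, a candidate MPS obtained this way would then be validated against stochastic events (e.g., machine failures) in the DES model before release.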

Relevance:

100.00%

Abstract:

In order to display a homogeneous image using multiple projectors, differences in the projected intensities must be compensated. In this paper, we present novel approaches that combine and extend existing techniques for edge blending and luminance harmonization to achieve detailed luminance control. Furthermore, we apply techniques for improving the contrast ratio of multi-segmented displays to the black offset correction as well. We also present a simple scheme that involves the displayed content in the correction process to dynamically improve the contrast of brighter images. In addition, we present a metric to evaluate the different methods and their influence on visual quality.
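As an illustration of the edge-blending ingredient only (the paper's luminance harmonization and black offset handling are omitted), the sketch below builds complementary one-dimensional attenuation ramps for two overlapping projectors; the smoothstep ramp and the assumed display gamma of 2.2 are common defaults, not values from the paper.

import numpy as np

def blend_ramp(t, gamma=2.2):
    # Smooth ramp 0 -> 1 with zero slope at both ends; designed in linear
    # light and pre-compensated for the projector's gamma response.
    s = np.clip(t, 0.0, 1.0)
    w = 3 * s**2 - 2 * s**3            # smoothstep; w(t) + w(1-t) == 1
    return w ** (1.0 / gamma)

# Pixel columns 0..99, with the projectors overlapping on columns 60..99.
x = np.arange(100)
t = np.clip((x - 60) / 39.0, 0.0, 1.0)
w_right = blend_ramp(t)                # attenuation map, right projector
w_left = blend_ramp(1.0 - t)           # complementary map, left projector
# In linear light the two contributions sum to one across the overlap.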

Relevance:

100.00%

Abstract:

Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent to both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m-long road section in the Swiss Alps, we present results from 488 rockfalls detected in 1260 trees. We illustrate that tree impact data can not only be used (i) to reconstruct the real frequency of rockfalls for individual cells, but that they also serve (ii) the calibration of the rockfall model Rockyfor3D, as well as (iii) the transformation of simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones. The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through the systematic inclusion of impact records in trees.
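A minimal sketch of the trajectory-to-frequency transformation with invented numbers: the impact rate observed in tree rings calibrates the simulated passage count of one cell, and the resulting factor converts simulated counts elsewhere into real frequencies. Corrections for tree density and the conditional probability that a passing rock actually scars a tree, which a real analysis needs, are omitted here.

def observed_rate(n_impacts, exposure_years):
    # Rockfall rate reconstructed from tree scars (events per year).
    return n_impacts / exposure_years

obs = observed_rate(12, 90)      # e.g. 12 scars over a 90-year stand age
sim_cal = 450                    # simulated passages through the same cell
k = obs / sim_cal                # calibration: passages -> events per year

sim_target = 800                 # simulated passages, a treeless cell
freq = k * sim_target
print(f"estimated frequency: {freq:.3f} events/year "
      f"(return interval about {1.0 / freq:.0f} years)")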

Relevance:

100.00%

Abstract:

Literature on hypertension treatment has demonstrated that a healthy lifestyle is one of the best strategies for hypertension control. To explore the mechanisms of behavioral change for hypertension control, a comprehensive study based on the Transtheoretical Model was carried out in Taiwan during the summer of 2000 with a sample of 350 hypertensive adults living in urban and rural areas of Taipei. The relationships among stages of change, processes of change, and demographic factors were analyzed for six health behaviors: low-fat food consumption, alcohol use, smoking, physical activity, weight control, and routine blood pressure checkups. In addition, differences between urban and rural populations in changing their behavior for hypertension control were assessed. The results showed that rural populations had more difficulty than urban populations in avoiding smoking and engaging in physical activity, and urban populations used the processes of change significantly more than rural populations. The study findings support a strong association between processes and stages of change. Individuals who use more processes of change are more inclined to move from the precontemplation stage to the maintenance stage. Counterconditioning, the substitution of alternatives for problem behaviors, significantly helped people in this study to change their diet, engage in physical activity, and check blood pressure regularly; examples are eating more vegetables instead of meat, or treating physical activity as a time to relax rather than another task to accomplish. In addition, self-reevaluation was the most important process for helping people engage in physical activity, and social liberation was the most important process for changing diet behavior. The findings of this study may be applied to improve health behaviors among rural populations with low income and low education; at the same time, obesity among urban populations should be addressed to control hypertension in Taiwan.

Relevance:

100.00%

Abstract:

Five permanent cell lines were developed from Xiphophorus maculatus, X. helleri, and their hybrids using three tissue sources, including adults and embryos of different stages. To evaluate cell line gene expression for retention of either tissue-of-origin-specific or ontogenetic stage-specific characters, the activity distribution of 44 enzyme loci was determined in 11 X. maculatus tissues, and the developmental genetics of 17 enzyme loci was charted in X. helleri and in helleri x maculatus hybrids using starch gel electrophoresis. In the process, eight new loci were discovered and characterized for Xiphophorus. No Xiphophorus cell line showed retention of tissue-of-origin-specific or ontogenetic stage-specific enzyme gene expression traits. Instead, gene expression was similar among the cell lines. One enzyme, adenosine deaminase (ADA), was an exception. Two adult-origin cell lines expressed ADA, whereas three cell lines derived independently from embryos did not. ADA-negative expression of Xiphophorus embryo-derived cell lines may represent retention of an embryonic gene expression trait. In one cell line (T3), derived from 13 pooled interspecific hybrid (F2) embryos, shifts with time were observed at enzyme loci polymorphic between the two species. This suggested shifts in the ratios of cells of different genotypes in the population rather than unstable gene expression in one dominant cell type. Verification of this hypothesis was attempted by cloning the culture to seek clones having different genetic signatures. The large number of loci electrophoretically polymorphic between the two species, whose alleles segregated independently into the 13 progeny from which this culture originated, almost guaranteed the presence of different genetic signatures (lineages) in T3. Seven lineages of cells were found within T3, each expressing genotypes at some loci not characteristic of the expression of the culture as a whole, supporting the hypothesis tested. Quantitative studies of ADA expression in the whole culture (ADA-negative) and in clones of these seven lineages suggested the predominance in T3 of ADA-deficient cell lineages, although moderate- to high-ADA-output clones also occurred. Thus, T3 has the potential to shift phenotypes from ADA-negative to ADA-positive by simply changing the proportions of its constituent cell types, demonstrating that such shifts can occur in any cell culture containing cells of mixed expression characteristics.

Relevance:

100.00%

Abstract:

The main objective of this paper is the presentation of modelling solutions for floating devices that can be used for harnessing energy from ocean currents. It is structured into three main parts. First, the growing interest in marine renewable energy in general, and in extracting energy from currents in particular, is presented, showing the large number of solutions that are emerging and some of the most significant types. The GESMEY generator is presented in the second section. It is based on a new concept patented by the Universidad Politécnica de Madrid and currently being developed through a collaborative agreement with the SOERMAR Foundation. The main feature of this generator is that it is fully submerged during operation and requires no additional facilities to move to a floating state for maintenance, which greatly increases its performance. The third part of the article presents the modelling and simulation challenges that arise in the development of devices for harnessing the energy of marine currents, along with some solutions adopted within the frame of the GESMEY Project, with particular emphasis on the dynamics of the generator and its control.

Relevance:

100.00%

Abstract:

Reliability is becoming the main concern in integrated circuits as technology scales below 22nm. Small imperfections in device manufacturing now result in important random differences between devices at the electrical level, which must be dealt with during design. The new processes and materials required to fabricate such extremely short devices are giving rise to new effects that ultimately result in increased static power consumption or higher vulnerability to radiation. SRAMs have become the most vulnerable part of electronic systems: not only do they account for more than half of the chip area of today's SoCs and microprocessors, but they are also critically affected by process variations, since the failure of a single cell makes the whole memory fail. This thesis addresses the challenges that SRAM design faces in the smallest technology nodes. In a common scenario of increasing variability, issues such as energy consumption, technology-aware design and radiation hardening are considered.

First, given the increasing magnitude of device variability in the smallest nodes, as well as the new sources of variability that appear as a consequence of new devices and shortened lengths, accurate modeling of this variability is crucial. We propose to extend the injectors method, which models variability at circuit level by abstracting its physical sources, with two new injectors that model the sub-threshold slope and drain-induced barrier lowering (DIBL), both of growing importance in FinFET technology. The two proposed injectors increase the accuracy of figures of merit at different abstraction levels of electronic design: transistor, gate and circuit. The mean square error when estimating performance and stability metrics of SRAM cells is reduced by a factor of at least 1.5 and up to 7.5, while the yield estimation is improved by several orders of magnitude.

Low-power design is a major constraint given the fast-growing market of battery-powered mobile devices. It is equally relevant because of the high power densities of today's systems, in order to reduce thermal dissipation and its impact on aging. The traditional approach of lowering the supply voltage to reduce energy consumption is challenging for SRAMs given the increased impact of process variations at low supply voltages. We propose a cell design that uses a negative bit-line write assist to overcome write failures as the main supply voltage is lowered. Despite using a second power source for the negative bit-line voltage, the design achieves an energy reduction of up to 20% compared to a conventional cell. A new metric, the hold trip point, is introduced to deal with the new failure modes of cells using a negative bit-line voltage, together with an alternative method to estimate cell speed that requires fewer simulations.

As device sizes continue to shrink, new mechanisms are introduced to ease the fabrication process and to meet the performance targets of successive nodes. One example is the compressive or tensile strain applied to the fins in FinFET technologies, which alters the mobility of the transistors built on those fins. The effects of these mechanisms are strongly layout-dependent: transistors are affected by their neighbors, and different transistor types are affected in different ways. We propose a complementary SRAM cell that uses pMOS pass-gate transistors, thereby shortening the fins of the nMOS devices and achieving long, uncut fins for the pMOS devices that extend to neighboring cells and to the limits of the cell array. Once shallow trench isolation (STI) and SiGe stressors are considered, the proposed design improves both transistor types, boosting the performance of the complementary SRAM cell by more than 10% for the same failure probability and static power consumption, with no area overhead.

Finally, while radiation has been a traditional concern in space electronics, the small currents and voltages of the latest nodes are making circuits vulnerable to radiation-induced transient noise even at ground level. Although SOI and FinFET technologies reduce the amount of energy transferred from a striking particle to the circuit, the large process variations of the smallest nodes will affect their radiation hardening capabilities. We demonstrate that process variations can increase the radiation-induced error rate by up to 40% in the 7nm node compared to the nominal case. This increase is larger than the improvement achieved by radiation-hardened memory cells, suggesting that reducing variability would bring a greater improvement.
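As a rough illustration of this kind of yield estimation, the Monte Carlo sketch below samples random threshold-voltage mismatch for a 6T cell and counts failures of a toy stability margin. The margin model, sensitivity and sigma are invented stand-ins for the SPICE-level metrics and injector-based variability models used in the thesis.

import numpy as np

rng = np.random.default_rng(1)
sigma_vth = 0.030        # assumed Vth standard deviation per transistor (V)
nominal_margin = 0.18    # assumed nominal static noise margin (V)
sensitivity = 0.9        # assumed margin loss per volt of mismatch

n = 1_000_000
dvth = rng.normal(0.0, sigma_vth, size=(n, 6))   # 6 transistors per cell
# Toy margin: nominal value degraded by the pull-down pair mismatch only.
mismatch = np.abs(dvth[:, 0] - dvth[:, 1])
margin = nominal_margin - sensitivity * mismatch
p_fail = np.mean(margin < 0.0)
print(f"estimated cell failure probability: {p_fail:.2e}")

Brute-force sampling like this becomes impractical at the very low failure probabilities large SRAM arrays require, which motivates the more accurate variability models and yield-estimation methods described above.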

Relevance:

100.00%

Abstract:

Dissection of the primary and secondary response to an influenza A virus established that the liver contains a substantial population of CD8+ T cells specific for the immunodominant epitope formed by H-2Db and the influenza virus nucleoprotein peptide fragment NP366–374 (DbNP366). The numbers of CD8+ DbNP366+ cells in the liver reflected the magnitude of the inflammatory process in the pneumonic lung, though replication of this influenza virus is limited to the respiratory tract. Analysis of surface phenotypes indicated that the liver CD8+ DbNP366+ cells tended to be more “activated” than the set recovered from lymphoid tissue but generally less so than those from the lung. The distinguishing characteristic of the lymphocytes from the liver was that the prevalence of the CD8+ DbNP366+ set was always much higher than the percentage of CD8+ T cells that could be induced to synthesize interferon γ after short-term, in vitro stimulation with the NP366–374 peptide, whereas these values were generally comparable for virus-specific CD8+ T cells recovered from other tissue sites. Also, the numbers of apoptotic CD8+ T cells were higher in the liver. The results overall are consistent with the idea that antigen-specific CD8+ T cells are destroyed in the liver during the control and resolution phases of this viral infection, though this destruction is not necessarily an immediate process.

Relevance:

100.00%

Abstract:

We have measured experimental adsorption isotherms of water in zeolite LTA4A, and studied the regeneration process by performing subsequent adsorption cycles after degassing at different temperatures. We observed incomplete desorption at low temperatures, and cation rearrangement at successive adsorption cycles. We also developed a new molecular simulation force field able to reproduce experimental adsorption isotherms in the range of temperatures between 273 K and 374 K. Small deviations observed at high pressures are attributed to the change in the water dipole moment at high loadings. The force field correctly describes the preferential adsorption sites of water at different pressures. We tested the influence of the zeolite structure, framework flexibility, and cation mobility when considering adsorption and diffusion of water. Finally, we performed checks on force field transferability between different hydrophilic zeolite types, concluding that classical, non-polarizable water force fields are not transferable.
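For readers reproducing this kind of isotherm comparison, the sketch below fits a Langmuir isotherm to made-up water-loading data; the Langmuir form is a standard first approximation and is not claimed to be the model used in the study.

import numpy as np
from scipy.optimize import curve_fit

def langmuir(p, q_max, b):
    # Loading q(p) = q_max * b * p / (1 + b * p)
    return q_max * b * p / (1.0 + b * p)

p = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # pressure (kPa), invented
q = np.array([2.0, 4.8, 9.6, 13.5, 15.8, 16.5])  # loading (mol/kg), invented
(q_max, b), _ = curve_fit(langmuir, p, q, p0=(17.0, 1.0))
print(f"q_max = {q_max:.1f} mol/kg, b = {b:.2f} 1/kPa")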

Relevance:

100.00%

Abstract:

Standards reduce production costs and increase products’ value to consumers. However, standards entail risks of anti-competitive abuse. After the adoption of a standard, the chosen technology normally lacks credible substitutes. The owner of the patented technology may thus have additional market power relative to locked-in licensees, and may exploit this power to charge higher access rates. In the economic literature this phenomenon is referred to as ‘hold-up’. To reduce the risk of hold-up, standard-setting organisations often require patent holders to disclose their standard-essential patents before the adoption of the standard and to commit to license on fair, reasonable and non-discriminatory (FRAND) terms. The European Commission normally investigates unfair pricing abuse in a standard-setting context if a patent holder who committed to FRAND ex ante is suspected of not abiding by that commitment ex post. However, this approach risks ignoring a number of potential abuses that are likely to harm welfare. That can happen if, for example, ex post a licensee is able to impose excessively low access rates (‘reverse hold-up’), or if a patent holder acquires additional market power thanks to the standard but its essential patents are not encumbered by FRAND commitments, for instance because the patent holder did not directly participate in the standard-setting process and was therefore not required by the standard-setting organisation to commit to FRAND ex ante. A consistent Commission policy capable of tackling all sources of harm should be enforced regardless of whether FRAND commitments are given. Antitrust enforcement should hinge on the identification of a distortion in the bargaining process around technology access prices which is determined by the adoption of the standard and is not attributable to the pro-competitive merits of any of the involved players.

Relevance:

100.00%

Abstract:

Fuzzy data has grown to be an important factor in data mining. Whenever uncertainty exists, simulation can be used as a model. Simulation is very flexible, although it can involve significant levels of computation. This article discusses fuzzy decision-making using the grey related analysis method. Fuzzy models are expected to better reflect decision-making uncertainty, at some cost in accuracy relative to crisp models. Monte Carlo simulation is used to incorporate experimental levels of uncertainty into the data and to measure the impact of fuzzy decision tree models using categorical data. Results are compared with decision tree models based on crisp continuous data.
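A minimal sketch of the grey relational analysis step with invented benefit-type data; the distinguishing coefficient zeta = 0.5 is the conventional default. Cost-type criteria, fuzzy inputs and the Monte Carlo perturbation used in the article are omitted.

import numpy as np

def grey_relational_grades(X, zeta=0.5):
    # X: rows = alternatives, columns = criteria (all larger-is-better here).
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))  # normalize
    delta = np.abs(Xn - Xn.max(axis=0))      # deviation from ideal series
    dmin, dmax = delta.min(), delta.max()
    gamma = (dmin + zeta * dmax) / (delta + zeta * dmax)  # GRA coefficients
    return gamma.mean(axis=1)                # grade = mean over criteria

X = np.array([[0.70, 120.0, 3.0],
              [0.90, 100.0, 4.0],
              [0.60, 140.0, 2.0]])
print(grey_relational_grades(X))             # higher grade = better ranked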

Relevance:

100.00%

Abstract:

Discrete stochastic simulations are a powerful tool for understanding the dynamics of chemical kinetics when there are small-to-moderate numbers of certain molecular species. In this paper we introduce delays into the stochastic simulation algorithm, thus mimicking delays associated with transcription and translation. We then show that this process may well explain more faithfully than continuous deterministic models the observed sustained oscillations in expression levels of hes1 mRNA and Hes1 protein.
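The toy sketch below shows the core mechanism for introducing delays into the stochastic simulation algorithm: an initiated ‘transcription’ event only releases its product after a delay tau, handled with a queue of pending completion times. Rates and the delay are illustrative, not the hes1/Hes1 parameters.

import heapq, math, random

random.seed(0)
k_birth, k_decay, tau = 2.0, 0.1, 20.0   # illustrative rate constants
t, t_end, mrna = 0.0, 500.0, 0
pending = []                              # completion times of delayed births

while t < t_end:
    a0 = k_birth + k_decay * mrna               # total propensity
    dt = -math.log(1.0 - random.random()) / a0  # time to next initiation
    if pending and pending[0] <= t + dt:
        t = heapq.heappop(pending)       # a delayed birth completes first,
        mrna += 1                        # so execute it and resample
        continue
    t += dt
    if random.random() < k_birth / a0:
        heapq.heappush(pending, t + tau) # initiate; completes at t + tau
    else:
        mrna -= 1                        # decay is immediate
print(f"mRNA copies at t = {t_end}: {mrna}")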

Relevance:

100.00%

Abstract:

The potential for the use of DEA and simulation in a mutually supporting role in guiding operating units to improved performance is presented. An analysis following a three-stage process is suggested. Stage one involves obtaining the data for the DEA analysis; this can be sourced from historical data, simulated data, or a combination of the two. Stage two involves the DEA analysis itself, which identifies benchmark operating units. In the third stage, simulation is used to offer practical guidance to operating units towards improved performance. This can be achieved through sensitivity analysis of the benchmark unit, using a simulation model to assess directly the feasibility and efficiency of any variations in operating practices to be tested. Alternatively, the simulation can be used as a mechanism to transmit the practices of the benchmark unit to weaker-performing units, by building a simulation model of the weaker unit configured to the process design of the benchmark unit. The model can then compare the performance of the current and benchmark process designs. Quantifying improvement in this way provides a useful driver for any process change initiative required to bring the performance of weaker units up to the best in class.
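As a sketch of stage two, the input-oriented CCR envelopment model below scores three invented operating units (two inputs, one output) with scipy; units scoring 1.0 are the benchmarks whose practices stage three would then explore by simulation.

import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 30.0], [25.0, 20.0], [40.0, 50.0]])  # inputs per unit
Y = np.array([[100.0], [90.0], [110.0]])                   # outputs per unit
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for o in range(n):
    # Variables [theta, lam_1..lam_n]: minimize theta subject to a composite
    # unit using at most theta * inputs of unit o and producing at least
    # unit o's outputs.
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-X[o], X.T]          # sum_j lam_j*x_ij - theta*x_io <= 0
    A_out = np.c_[np.zeros(s), -Y.T]  # -sum_j lam_j*y_rj <= -y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    print(f"unit {o}: efficiency = {res.x[0]:.3f}")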

Relevance:

100.00%

Abstract:

Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigation using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems; fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels, performing control functions that include the setting-up and clearing-down of calls, the hand-over of calls between cells, and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers interconnected via voice-grade data channels would be capable of providing satisfactory control.
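For reference, the classical Erlang B formula that the modified single-cell formulas extend can be evaluated with the standard numerically stable recursion; the channel count and offered traffic below are illustrative.

def erlang_b(channels: int, traffic: float) -> float:
    # Blocking probability for `channels` trunks offered `traffic` erlangs,
    # via B(n) = A*B(n-1) / (n + A*B(n-1)) with B(0) = 1.
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b

# A 30-channel cell offered 20 erlangs of combined call traffic:
print(f"blocking probability = {erlang_b(30, 20.0):.4f}")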

Relevance:

100.00%

Abstract:

A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analyzed by a graphics compiler which generates the programs and data structures that realise the run-time software. The run-time software has been designed as a data-driven system which allows run-time modification of both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible by using CORAL 66 as the high-level language throughout the entire system, the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
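A hypothetical sketch, in Python rather than CORAL 66, of the data-driven idea: the compiled diagram is held as a plain data structure that the run-time interprets on each scan, so both parameters and configuration can be changed without regenerating code. The block types and wiring are invented for illustration.

def gain(inputs, params):
    return inputs[0] * params["k"]

def summer(inputs, params):
    return sum(inputs)

BLOCK_TYPES = {"GAIN": gain, "SUM": summer}

# The diagram as data: each block names its type, parameters and input wires.
diagram = {
    "err":  {"type": "SUM",  "params": {},         "wires": ["setpoint", "neg_pv"]},
    "ctrl": {"type": "GAIN", "params": {"k": 2.5}, "wires": ["err"]},
}

def scan(signals):
    # One control scan: evaluate blocks in (pre-sorted) dataflow order.
    for name, blk in diagram.items():
        ins = [signals[w] for w in blk["wires"]]
        signals[name] = BLOCK_TYPES[blk["type"]](ins, blk["params"])
    return signals

print(scan({"setpoint": 10.0, "neg_pv": -7.0}))   # err = 3.0, ctrl = 7.5

Because the diagram is data, changing a controller gain or re-wiring a block at run time is a dictionary update rather than a recompilation, which mirrors the run-time modifiability described above.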