897 results for Complex Design Space


Relevance:

80.00%

Publisher:

Abstract:

The purpose of this paper is to propose a multiobjective optimization approach for solving the manufacturing cell formation problem, explicitly considering the performance of the resulting manufacturing system. Cells are formed so as to simultaneously minimize three conflicting objectives, namely the level of work-in-process, the intercell moves and the total machinery investment. A genetic algorithm performs a search in the design space in order to approximate the Pareto-optimal set. The objective values for each candidate solution in a population are assigned by running a discrete-event simulation, in which the model is automatically generated according to the number of machines and their distribution among cells implied by a particular solution. The potential of this approach is evaluated via its application to an illustrative example and to a case from the literature. The results are analyzed, and it is concluded that the approach is capable of generating a set of alternative manufacturing cell configurations considering the optimization of multiple performance measures, greatly improving the decision-making process involved in planning and designing cellular systems. (C) 2010 Elsevier Ltd. All rights reserved.
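The Pareto approximation described above rests on a dominance test over the three minimized objectives. A minimal sketch of that test follows; the candidate configurations and their objective values are hypothetical placeholders (in the paper they come from a discrete-event simulation), and the paper's genetic algorithm itself is not reproduced.

```python
# Pareto dominance filtering for three minimized objectives:
# (work-in-process, intercell moves, machinery investment).
# Objective values are illustrative, not simulation outputs.

def dominates(a, b):
    """True if a is at least as good as b on every objective and
    strictly better on at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated candidates."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# (wip, intercell_moves, investment) for four hypothetical cell configurations
candidates = [(12.0, 30, 5.0), (10.0, 35, 4.5), (11.0, 28, 6.0), (14.0, 40, 6.5)]
front = pareto_front(candidates)  # the last candidate is dominated by the first
```

In a GA this filter would be applied each generation to rank the population before selection.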

Relevance:

80.00%

Publisher:

Abstract:

In the usual formulation of quantum mechanics, groups of automorphisms of quantum states have ray representations by unitary and antiunitary operators on complex Hilbert space, in accordance with Wigner's theorem. In the phase-space formulation, they have real, true unitary representations in the space of square-integrable functions on phase space. Each such phase-space representation is a Weyl–Wigner product of the corresponding Hilbert space representation with its contragredient, and these can be recovered by 'factorizing' the Weyl–Wigner product. However, not every real, unitary representation on phase space corresponds to a group of automorphisms, so not every such representation is in the form of a Weyl–Wigner product and can be factorized. The conditions under which this is possible are examined. Examples are presented.
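For context, the Weyl–Wigner correspondence the abstract builds on can be written in the standard notation (assumed here, not quoted from the article): the Weyl symbol of a Hilbert-space operator $\hat{A}$ is the phase-space function

```latex
A(q,p) \;=\; \int_{-\infty}^{\infty} e^{-ipy/\hbar}\,
    \Big\langle q + \tfrac{y}{2} \,\Big|\, \hat{A} \,\Big|\, q - \tfrac{y}{2} \Big\rangle \, \mathrm{d}y ,
```

and the Wigner function of a state is, up to a factor $1/(2\pi\hbar)$, the Weyl symbol of its density operator. The article's Weyl–Wigner products of a representation with its contragredient live in this correspondence.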

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES Graves' disease (GD) complicates 0.1% to 0.2% of pregnancies, but congenital thyrotoxicosis is rare, occurring in one in 70 of these pregnancies independent of maternal disease status. Antenatal prediction of affected infants is imprecise; however, maternal history, coupled with a high maternal serum TSH receptor binding immunoglobulin index (TBII), predicts adverse neonatal outcome. Mortality is reported to be as high as 25% in affected infants and would therefore be expected to be higher in premature infants. This study illustrates that in sick, premature, extremely low birth weight (ELBW) or intrauterine growth retarded (IUGR) infants, the diagnosis may be overlooked, especially in the absence of antenatal risk assessment, and that management of thyrotoxicosis in this setting is complex. DESIGN AND PATIENTS The records of premature neonates born at the three main maternity units in Brisbane between January 1996 and July 1998 and diagnosed with congenital thyrotoxicosis were reviewed. Data were recorded on gestational age, birth weight (B Wt), maternal thyroid history and current status, and neonatal course. Thyroid function and TBII status were assessed using standard biochemical assays. RESULTS Seven neonates from five pregnancies were identified (four female, three male). Mean gestational age was 30 weeks (25-36 weeks) and median B Wt was 1.96 kg (0.50-2.62 kg). Only one mother received formal antenatal counselling by a paediatric endocrine service and had a TBII (54%) measured prior to delivery. Three of five mothers had an elevated TBII measured after diagnosis in their offspring (57%, 65%, 83%), and in one mother a TBII was not performed. All mothers were biochemically euthyroid at delivery. Mean age at diagnosis was 9 days (1-16 days) and mean age at commencement of treatment was 12 days (7-26 days). Two infants received propylthiouracil and five received a combination of carbimazole and propranolol. Four became biochemically hypothyroid; in three this resolved with cessation of the antithyroid drug (ATD), and one required ongoing T4 supplementation. Only one infant required treatment for cardiac failure and there were no deaths in this cohort. CONCLUSIONS This is a large series of extremely small and premature infants with neonatal thyrotoxicosis. Presentation was nonspecific. The diagnosis was delayed because of low birth weight, prematurity, multiple birth and/or an unrecognized maternal history of Graves' disease. The treatment of neonatal thyrotoxicosis was difficult in these extremely low birth weight infants, yet no infant died and significant morbidity was confined to high-output cardiac failure in one infant. With antenatal recognition of past or active Graves' disease, assessment of the maternal TSH receptor binding immunoglobulin index prior to delivery, and postnatal monitoring of cord TSH and venous fT4 and TSH on days 4 and 7, rapid treatment of affected infants may have further reduced neonatal morbidity.

Relevance:

80.00%

Publisher:

Abstract:

This article develops a weighted least squares version of Levene's test of homogeneity of variance for a general design, available for both univariate and multivariate situations. When the design is balanced, the univariate and two common multivariate test statistics turn out to be proportional to the corresponding ordinary least squares test statistics obtained from an analysis of variance of the absolute values of the standardized mean-based residuals from the original analysis of the data. The constant of proportionality is simply a design-dependent multiplier (which does not necessarily tend to unity). Explicit results are presented for randomized block and Latin square designs and are illustrated for factorial treatment designs and split-plot experiments. The distribution of the univariate test statistic is close to a standard F-distribution, although it can be slightly underdispersed. For a complex design, the test assesses homogeneity of variance across blocks, treatments, or treatment factors and offers an objective interpretation of a residual plot.
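The classical special case the article generalizes can be sketched directly: the mean-based Levene test is a one-way ANOVA performed on the absolute deviations of each observation from its group mean. The sketch below covers only this balanced one-way case with illustrative data; the article's weighted least squares extension for general designs is not reproduced.

```python
# Mean-based Levene's test: one-way ANOVA F statistic computed on the
# absolute deviations z_ij = |x_ij - group mean|. Data are illustrative.

def levene_mean_based(groups):
    """Return the ANOVA F statistic on |x - group mean| for a list of groups."""
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    k = len(z)                                   # number of groups
    n = sum(len(g) for g in z)                   # total observations
    grand = sum(map(sum, z)) / n                 # grand mean of the z values
    means = [sum(g) / len(g) for g in z]         # per-group means of z
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(z, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(z, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Middle group has visibly larger spread, so F should be well above 1.
f = levene_mean_based([[4.1, 5.2, 6.0, 5.5], [3.9, 9.0, 1.2, 6.1], [5.0, 5.1, 4.9, 5.2]])
```

The F statistic is referred to an F(k-1, n-k) distribution; the article shows that in balanced general designs the same ANOVA-of-absolute-residuals recipe applies up to a design-dependent multiplier.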

Relevance:

80.00%

Publisher:

Abstract:

Master's dissertation in Bioinformatics

Relevance:

80.00%

Publisher:

Abstract:

This work addresses questions of classical integral geometry in complex hyperbolic and complex projective space and in standard Hermitian space, the so-called spaces of constant holomorphic curvature. Classical integral geometry studies, among other things, the expression in geometric terms of the measure of planes meeting a fixed convex domain of Euclidean space. This expression is given in terms of the mean curvature integrals. One of the main results of this work expresses the measure of complex planes meeting a fixed domain in complex hyperbolic space in terms of what we define as Hermitian intrinsic volumes, which generalize the mean curvature integrals. Another question addressed by classical integral geometry is the following: given a convex domain and the space of planes, how is the integral of the s-th mean curvature integral of the convex body obtained by intersecting a plane with the fixed convex body expressed? In Euclidean space and in real projective and hyperbolic space, this integral coincides with the s-th mean curvature integral of the initial convex body: a reproducibility property is satisfied, which does not hold in the spaces of constant holomorphic curvature. In this work we give the explicit expression of the integral of the mean curvature when integrating over the space of complex planes. We express it in terms of the mean curvature integral of the initial domain and of the integral of the normal curvature in a special direction: the one obtained by applying the complex structure to the normal vector. The motivation for studying the spaces of constant holomorphic curvature and, in particular, complex hyperbolic space, comes from the study of the following classical problem in geometry: what value does the quotient between area and perimeter take for sequences of convex figures in the plane that grow to fill it? Until now, the behaviour of this quotient was known in spaces of negative sectional curvature, and it was known that in real hyperbolic space the bounds obtained are optimal. Here we prove that in complex hyperbolic space the general bounds are not optimal, and we optimize the upper one.

Relevance:

80.00%

Publisher:

Abstract:

In this paper we study the set of periods of holomorphic maps on compact manifolds, using the periodic Lefschetz numbers introduced by Dold and Llibre, which can be computed from the homology class of the map. We show that these numbers contain information about the existence of periodic points of a given period; and, if we assume the map to be transversal, then they give us the exact number of such periodic orbits. We apply this result to the complex projective space of dimension n and to some special type of Hopf surfaces, partially characterizing their set of periods. In the first case we also show that any holomorphic map of CP(n) of degree greater than one has infinitely many distinct periodic orbits, hence generalizing a theorem of Fornaess and Sibony. We then characterize the set of periods of a holomorphic map on the Riemann sphere, hence giving an alternative proof of Baker's theorem.
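The periodic Lefschetz numbers of Dold and Llibre mentioned above are obtained from the ordinary Lefschetz numbers $L(f^r)$ by Möbius inversion; in the notation usually used for them (assumed here, not quoted from the paper),

```latex
l_m(f) \;=\; \sum_{r \,\mid\, m} \mu\!\left(\frac{m}{r}\right) L\!\left(f^{r}\right),
```

where $\mu$ is the Möbius function. Since each $L(f^r)$ is computable from the action of $f$ on homology, so is $l_m(f)$; roughly, a nonzero $l_m(f)$ signals $m$-periodic behaviour, and under the transversality assumption it counts the corresponding periodic orbits, as the paper makes precise.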

Relevance:

80.00%

Publisher:

Abstract:

The SeDeM Diagram Expert System has been used to study excipients, Captopril and designed formulations for their galenic characterization and to ascertain the critical points of the formula affecting product quality, in order to obtain suitable formulations of Captopril direct-compression SR matrix tablets. The application of the SeDeM Diagram Expert System enables selecting excipients in order to optimize the formula in the preformulation and formulation studies. The methodology is based on the implementation of ICH Q8, establishing the design space of the formula with the use of design of experiments, using the parameters of the SeDeM Diagram Expert System as system responses.
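Mapping a design space with design of experiments, as the abstract describes, typically starts from a factorial grid of formulation factors. The sketch below builds such a grid; the factor names and levels are hypothetical illustrations, not the study's actual variables or SeDeM parameters.

```python
# Full-factorial design of experiments over two hypothetical
# formulation factors (names and levels are illustrative only).
from itertools import product

factors = {
    "diluent_fraction": [0.2, 0.4, 0.6],
    "matrix_polymer_pct": [10, 20, 30],
}

# One run per combination of factor levels: 3 x 3 = 9 runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

Each run would then be prepared and measured, with the SeDeM parameters recorded as the responses that delimit the design space.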

Relevance:

80.00%

Publisher:

Abstract:

As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the same tight constraints on, e.g., size, power consumption and price with embedded systems, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and relatively low time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture. The architecture offers a high degree of parallelism and modularity and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written with SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hw/sw codesign and simulation and an extendable library of automatically configured reusable hardware blocks.
Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. A simulation model of a processor for TCP/IP packet validation was designed and tested as a test case for the environment.

Relevance:

80.00%

Publisher:

Abstract:

Conventional sample holder cells used for the electrical characterization of ceramics at high temperature consist of an alumina tube and platinum wires and plates in a complex design. The high-cost materials used in the conventional sample holder cell were replaced by stainless steel and conventional ceramics. The sample holder was validated by characterizing yttria-stabilized zirconia in the temperature range of 25 to 700 ºC. The results show no variations, discontinuities or unusual noise in the electrical signals. Several samples were characterized without maintenance, which demonstrates that the sample holder is electrically and mechanically adequate for the electrical characterization of ceramics up to 700 ºC.

Relevance:

80.00%

Publisher:

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field: digital filters are typically described with boxes and arrows, also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is a set of static schedules, kept as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized, in the context of design space exploration, by the development tools to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
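The node-and-queue model described above can be illustrated with a toy actor: tokens travel only through FIFO queues, and a node fires once its firing rule (enough tokens on every input) is satisfied. This is a minimal sketch for intuition only; it models neither RVC-CAL semantics nor the thesis' quasi-static scheduler.

```python
# Toy dataflow node: consumes one token per input queue per firing,
# produces one output token. Queues are the only communication channel.
from collections import deque

class Actor:
    def __init__(self, fn, inputs, output):
        self.fn = fn            # the node's calculation
        self.inputs = inputs    # list of deques (incoming edges)
        self.output = output    # deque (outgoing edge)

    def can_fire(self):
        # Firing rule: every input queue holds at least one token.
        return all(len(q) >= 1 for q in self.inputs)

    def fire(self):
        # Consume one token per input, compute, produce one token.
        args = [q.popleft() for q in self.inputs]
        self.output.append(self.fn(*args))

src_a, src_b, out = deque([1, 2, 3]), deque([10, 20, 30]), deque()
adder = Actor(lambda a, b: a + b, [src_a, src_b], out)

# A trivially static schedule: fire the adder while its rule holds.
while adder.can_fire():
    adder.fire()
# out now holds [11, 22, 33]
```

A quasi-static scheduler, in the spirit of the thesis, would pre-compute such firing sequences wherever the firing rules can be decided statically, leaving only the genuinely data-dependent decisions to run-time.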

Relevance:

80.00%

Publisher:

Abstract:

As the efficiency of a wind turbine gearbox increases, more power can be transferred from the rotor blades to the generator and less power is lost to wear and heating in the gearbox. By using a simulation model, the behavior of the gearbox can be studied before creating expensive prototypes. The objective of the thesis is to model a wind turbine gearbox and its lubrication system in order to study power losses and heat transfer inside the gearbox, and to study the simulation methods of the software used. The software used to create the simulation model is Siemens LMS Imagine.Lab AMESim, which can be used to create one-dimensional mechatronic system simulation models from different fields of engineering. By combining components from different libraries it is possible to create a simulation model that includes mechanical, thermal and hydraulic models of the gearbox. Results for the mechanical, thermal, and hydraulic simulations are presented in the thesis. Due to the large scale of the wind turbine gearbox and the amount of power transmitted, the power loss calculations from the AMESim software are inaccurate, and power losses are therefore modelled as a constant efficiency for each gear mesh. Starting values for the thermal and hydraulic simulations were chosen from test measurements and from an empirical study, as the compact and complex design of the gearbox prevents accurate test measurements. To increase the accuracy of the simulation model in further studies, the components used for the power loss calculations need to be modified and values for unknown variables need to be determined through accurate test measurements.

Relevance:

80.00%

Publisher:

Abstract:

Strategies designed to improve educational systems have created tensions in school personnel as they struggle to respond to competing demands of ongoing change within their daily realities. The purpose of this case study was to investigate how teachers and administrators in one elementary school made sense of these tensions and to explore the factors that constrained or shaped their responses. A constructive, interpretative case study using a grounded theory approach was used. Qualitative data were collected through document analysis, semi-structured interviews, and participant observation. In-depth information about teachers' and administrators' experiences and a contextual understanding of tension were generated from inductive analysis of the data. The study found that tension was a phenomenon situated in the context in which it arose. A contextual understanding of tension revealed the interactions between the institutional, personal, and emotional domains that continually shaped individual and group behavioural responses. This contextual understanding of tension provided the means to reinterpret resistance to change. It also helped to show how teachers and administrators reconstructed identities and made sense in context. Of particular note was the crucial nature of the conditions under which teachers and administrators shaped meaning and understood change. This study sheds light on the contextual intricacies of tension that may help leaders with the complex design and implementation of educational change.

Relevance:

80.00%

Publisher:

Abstract:

At a time of change in the means of representation and communication in architecture, this research focuses on the teaching of architectural design and, more specifically, on the contribution that computing could make to this process. Using a qualitative, exploratory and participatory methodology, we proceed through a chain of questions, the starting one being: how could the teaching of architectural design take advantage of digital tools? Our objective is to propose learning methods and tools to architecture students to enrich their design processes with the computer. After a literature review in the field, and a deeper study of the role of architectural precedents and of integrated design, we carried out an exploratory observation of students' work in the architecture studio. These first stages of the research revealed discrepancies between theoretical positions and studio practice, allowing the research question to be made concrete. In order to identify effective and innovative methods to address the identified discrepancies, we undertook a study of the literature on cognitive theories of knowledge, learning and design. Several strategies could be defined, notably the need for multimodal representation of architectural precedents, the importance of representing the process and not only the result, and the advantage of encouraging students to work within their 'zone of proximal development'. Following this research, a complementary teaching method was defined. It offers students explorations of the object being designed based on the manipulation of architectural know-how.
This method was operationalized from both a pedagogical and a didactic point of view and tested with students in the studio. A prototype library of interactive architectural precedents (LibReArchI) was created for this purpose. It was designed as a design environment and a space for sharing know-how between students and teachers. The main results of this research demonstrate the positive role of the proposed method in the transfer of architectural know-how during studio learning. Its potential to support integrated design and to stimulate the emergence of ideas was observed. At the theoretical level, a model of a design process cycle with digital tools was outlined. In conclusion, avenues for future development of this research are proposed.

Relevance:

80.00%

Publisher:

Abstract:

This research addresses questions related to the design of human-computer interfaces. It is part of the research stream on usability and is particularly interested in user-centred approaches. We have very often witnessed the difficulties experienced by users of certain interactive interfaces, and we consider these difficulties to stem from a design problem. Interface design must be based on the user's needs within the context of their activities, whose characteristics must be well understood and taken into account in order to lead to the design of interfaces that meet usability criteria. Moreover, both the research community and industry now acknowledge that, to improve design, it is crucial to develop human-computer interfaces within a multidisciplinary team. Despite significant advances in the field of user-centred design, the stated goals are rarely achieved. The problem under study led us to ask the following question: as the designer on a multidisciplinary design team, how can one change the dynamics of collaboration and create the conditions for design that is truly centred on human-computer interaction? Our research approach was guided by the hypothesis that the design activity itself can be the means of facilitating the creation of a common language, constructive exchanges between disciplines, and a shared user-centred reflection. Formulating this hypothesis led us to reflect on the role of the designer. To conduct this research, we adopted a mixed methodology. First, we used a project-based research approach (research-by-project), and our role was that of designer-researcher.
Research-by-project is particularly appropriate for research in design. It favours qualitative and interpretative methods; it studies the situation in its complexity and in an engaged way. We carried out three successive case studies. The objective of the first study was to observe our own role and our interactions with the other members of the project team during the design process. In the second study, our attention focused on the team's interactions and collaboration. We used the design process as a method for building a common language among the stakeholders, enriching reflection and fostering their collaboration, leading to a redefinition of the project's objectives. The limitations of these two cases led us to a different intervention, which we implemented in the third case study. This intervention consisted of setting up an intensive design workshop in which the project stakeholders committed to developing an interdisciplinary attitude enabling reflective co-practice, in order to achieve the objectives of a project to build a complex, user-centred website. The analysis and interpretation of the data collected in these three case studies led us to create a theoretical model of human-computer interface design. This model, which informs and structures the design process involving a multidisciplinary team, aims to improve the user-centred approach. Within this model, the designer takes on the role of mediator, ensuring the effectiveness of the team's collaboration. Second, in order to validate the model and possibly refine it, we used an ethnographic approach involving interviews with three experts in the field.
The interview data confirm the validity of the model as well as its potential transferability to other contexts. Applying this design model yields results that are higher-performing and more durable, and does so within a shorter time frame.