871 results for Optimal design of experiments
Abstract:
Introduction. Results from previous studies on acupuncture for labour pain are contradictory and lack important methodological information. However, studies indicate that acupuncture has a positive effect on women's experiences of labour pain. The aim of the present study was to evaluate the efficacy of two different acupuncture stimulations, manual or electrical stimulation, compared with standard care in the relief of labour pain as the primary outcome. This paper presents in-depth information on the design of the study, following the CONSORT and STRICTA recommendations. Methods. The study was designed as a randomized controlled trial based on Western medical theories. Nulliparous women with normal pregnancies admitted to the delivery ward after spontaneous onset of labour were randomly allocated to one of three groups: manual acupuncture, electroacupuncture, or standard care. The sample size calculation gave 101 women in each group, for a total of 303 women. A Visual Analogue Scale was used to assess pain every 30 minutes for five hours and thereafter every hour until birth. Questionnaires were distributed before treatment, directly after the birth, and at one day and two months postpartum. Blood samples were collected before and after the first treatment. This trial is registered at ClinicalTrials.gov: NCT01197950.
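The abstract reports the resulting group size (101 per group) but not the assumptions behind it. Purely as an illustration of how such a number arises, here is a normal-approximation sample size formula for comparing two group means; the effect size, significance level, and power below are assumptions of this sketch, not values taken from the study.

```python
# Hypothetical per-group sample size for comparing two group means with a
# normal approximation: n = 2 * (z_{1-a/2} + z_{power})^2 / delta^2.
# Effect size, alpha, and power are assumed values, not the study's.
import math
from scipy.stats import norm

def n_per_group(delta, alpha=0.05, power=0.80):
    """delta: standardized effect size (mean difference / SD)."""
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided critical value
    z_power = norm.ppf(power)
    return 2 * (z_alpha + z_power) ** 2 / delta ** 2

# An assumed 0.4 SD difference in VAS pain scores at 5% / 80% power:
print(math.ceil(n_per_group(0.4)))  # ~99 per group, before dropout allowance
```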
Abstract:
Bin planning (the arrangement of storage bins) is a key factor in the timber industry. Improper planning of the storage bins may lead to inefficient transportation of resources, which threatens overall efficiency and thereby limits the profit margins of sawmills. To address this challenge, a simulation model has been developed. However, as numerous alternatives are available for arranging bins, simulating all possibilities would take an enormous amount of time and is computationally infeasible. A discrete-event simulation model incorporating meta-heuristic algorithms has therefore been investigated in this study. Preliminary investigations indicate that the results achieved by the GA-based simulation model are promising and better than those of the other meta-heuristic algorithms. Further, a sensitivity analysis has been performed on the GA-based optimal arrangement, which contributes to gaining insights and knowledge about the real system and ultimately leads to improved efficiency in sawmill yards. It is expected that the results achieved in this work will support timber industries in making optimal decisions with respect to the arrangement of storage bins in a sawmill yard.
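The abstract does not specify the GA encoding or operators. As a minimal sketch of the general idea, the following assumes a permutation encoding (which product class goes to which bin) and a placeholder cost function standing in for the discrete-event simulation:

```python
# Minimal genetic-algorithm sketch for a bin-arrangement problem. The cost
# function is a placeholder for the discrete-event simulation; the encoding
# and operators are assumptions, not the study's actual design.
import random

random.seed(1)
N_BINS = 20

def transport_cost(arrangement):
    # Placeholder: pretend product class k placed far from position k is costly.
    return sum(abs(pos - k) for pos, k in enumerate(arrangement))

def mutate(arrangement):
    a = arrangement[:]
    i, j = random.sample(range(N_BINS), 2)
    a[i], a[j] = a[j], a[i]            # swap two bin assignments
    return a

def crossover(p1, p2):
    cut = random.randrange(1, N_BINS)  # one-point order crossover
    head = p1[:cut]
    return head + [g for g in p2 if g not in head]

def ga(pop_size=50, generations=200):
    pop = [random.sample(range(N_BINS), N_BINS) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=transport_cost)
        parents = pop[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=transport_cost)

best = ga()
print(transport_cost(best), best)
```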
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two identically named technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, giving collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows a late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing resource addition and removal at runtime.
- The implemented CAD Framework is completely based on Java technology, relying on the Java Virtual Machine as the layer which grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the integration of multimedia metadata into the design data model, a possibility explored in the context of an online educational and training platform.
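The thesis itself is Java-based; purely as an illustration of the inversion-of-control pattern described above, here is a minimal Python sketch in which views wrap each user interaction in an event object, the semantic model validates the requested state change before updating itself and every view, and an event pool buffers events while no connection is available. All class and method names are invented for this sketch.

```python
# Illustrative sketch of the described view/semantics inversion of control:
# views emit event objects, the semantic model validates state changes, and
# an event pool buffers events for late synchronization. Names are invented.
from dataclasses import dataclass

@dataclass
class DesignEvent:
    target: str        # design object being edited
    new_value: object  # requested change

class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def submit(self, event):
        if self.is_valid(event):              # semantics decides first
            self.state[event.target] = event.new_value
            for view in self.views:           # then all views are updated
                view.refresh(event)

    def is_valid(self, event):
        return event.new_value is not None    # placeholder validity rule

class View:
    def __init__(self, name, model, pool):
        self.name, self.model, self.pool = name, model, pool
        model.views.append(self)

    def on_user_input(self, target, value, connected=True):
        event = DesignEvent(target, value)
        if connected:
            self.model.submit(event)
        else:
            self.pool.append(event)           # late-synchronization buffer

    def refresh(self, event):
        print(f"{self.name}: {event.target} -> {event.new_value}")

pool = []
model = SemanticModel()
a, b = View("schematic", model, pool), View("layout", model, pool)
a.on_user_input("net42", "width=2")           # propagated to both views
b.on_user_input("net7", "width=1", connected=False)
for e in pool:                                # connection restored: flush pool
    model.submit(e)
```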
Abstract:
The author argues that by applying problem-solving negotiation skills in the design of public policies, public administrators benefit from more effective and wide-ranging outcomes in the realization of their goals. To demonstrate this idea, the author analyzes how negotiation skills – such as identifying key actors and their interests, recognizing hard-bargaining tactics and changing the players, knowing your best alternative, creating value and building trust – permeated and contributed to the success of the City of São Paulo's Invoice Program ("Programa Nota Fiscal Paulistana"), a public policy aimed at combating evasion of the service tax in the City of São Paulo.
Abstract:
This paper investigates the importance of the flow of funds as an implicit incentive provided by investors to portfolio managers in a two-period relationship. We show that the flow of funds is a powerful incentive in an asset management contract. We build a binomial moral hazard model to explain the main trade-offs in the relationship between flow, fees and performance. The main assumption is that effort depends on the combination of implicit and explicit incentives, while the probability distribution function of returns depends on effort. In the case of full commitment, the investor's relevant trade-off is to give up expected return in the second period in order to induce effort in the first period. The more concerned the investor is with today's payoff, the more willing he will be to give up expected return in the following periods. That is, in the second period, the investor penalizes observed low returns by withdrawing resources from non-performing portfolio managers. Besides, he pays a performance fee when the observed excess return is positive. When commitment is not a plausible hypothesis, we consider that the investor also learns some symmetric and imperfect information about the ability of the manager to generate positive excess return. In this case, observed returns reveal ability as well as the effort choices exerted by the portfolio manager. We show that implicit incentives can explain the flow-performance relationship and, conversely, that endogenous expected return determines incentive provision and defines its optimal level. We provide a numerical solution in Matlab that characterizes these results.
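As a toy numerical illustration of the mechanism (the functional forms and parameters below are invented, and the sketch is in Python rather than the paper's Matlab), the flow rule can be made concrete: the manager earns a fee on positive excess returns, funds are withdrawn after a low first-period return, and that withdrawal threat raises the effort that maximizes the manager's expected payoff.

```python
# Toy two-period binomial sketch: the manager earns a fee on positive
# excess returns, and the investor withdraws funds after a low first-period
# return. Functional forms and parameters are invented for illustration;
# they are not the paper's model or calibration.
import numpy as np

FEE = 0.20     # share of positive excess return paid as a performance fee
COST = 0.10    # convex effort-cost coefficient
R_HIGH = 0.10  # excess return in the high state

def p_high(effort):
    """Probability of the high state rises with effort (same each period)."""
    return min(0.5 + effort, 0.95)

def manager_payoff(effort, funds_if_high, funds_if_low):
    p = p_high(effort)
    period1_fee = p * FEE * R_HIGH                   # fee on the period-1 gain
    expected_funds = p * funds_if_high + (1 - p) * funds_if_low
    period2_fee = p * FEE * R_HIGH * expected_funds  # fee on remaining funds
    return period1_fee + period2_fee - COST * effort ** 2

efforts = np.linspace(0.0, 0.45, 451)
flow = [manager_payoff(e, 1.0, 0.2) for e in efforts]     # withdrawal threat
no_flow = [manager_payoff(e, 1.0, 1.0) for e in efforts]  # funds stay put
print("effort with flow incentive:   ", efforts[np.argmax(flow)])
print("effort without flow incentive:", efforts[np.argmax(no_flow)])
# The withdrawal rule steepens the manager's payoff in p, so optimal effort
# is higher when low returns trigger outflows.
```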
Abstract:
This work analyzes the optimal design of an unemployment insurance program for couples, whose joint search problem in the labor market differs significantly from the problem faced by single agents. We use a version of the sequential search model of the labor market adapted to married agents to compare optimal constant policies for single and married agents, as well as to characterize the optimal constant policy when the agency faces single and married agents simultaneously. Our main result is that an agency that gives equal weights to single and married agents will want to give equal utility promises to both types of agents and to spend more on the single agent.
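For readers unfamiliar with the underlying machinery, here is a minimal sketch of the single-agent building block: a McCall-style sequential search model in which an unemployed worker receiving benefit b accepts a wage offer once it exceeds a reservation wage found by value-function iteration. The parameters and offer distribution are illustrative assumptions, not the paper's.

```python
# Minimal McCall sequential-search sketch: the single-agent building block
# of the joint-search problem. Benefit level, discount factor, and the wage
# offer distribution are illustrative assumptions.
import numpy as np

beta = 0.96                        # discount factor
b = 0.4                            # unemployment benefit (flow utility)
wages = np.linspace(0.1, 2.0, 60)  # wage offer grid
probs = np.full(wages.size, 1 / wages.size)  # uniform offer distribution

v_accept = wages / (1 - beta)      # value of accepting wage w forever
v_unemp = 0.0                      # value of remaining unemployed
for _ in range(2000):              # value-function iteration
    v_new = b + beta * probs @ np.maximum(v_accept, v_unemp)
    if abs(v_new - v_unemp) < 1e-10:
        break
    v_unemp = v_new

reservation_wage = wages[np.argmax(v_accept >= v_unemp)]
print(f"reservation wage: {reservation_wage:.3f}")
# A higher benefit b raises the reservation wage: the agency's benefit
# choice trades off insurance against longer search durations.
```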
Abstract:
We discuss a general approach to building non-asymptotic confidence bounds for stochastic optimization problems. Our principal contribution is the observation that a Sample Average Approximation of a problem supplies upper and lower bounds for the optimal value of the problem which are essentially better than the quality of the corresponding optimal solutions. At the same time, such bounds are more reliable than “standard” confidence bounds obtained through the asymptotic approach. We also discuss bounding the optimal value of MinMax Stochastic Optimization and stochastically constrained problems. We conclude with a small simulation study illustrating the numerical behavior of the proposed bounds.
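As a concrete illustration of the idea (a sketch on a toy problem of our choosing, not the paper's procedure), consider the stochastic program min_x E[F(x, ξ)]: averaging the optimal values of M independently sampled SAA problems yields a statistical lower bound on the true optimal value, while evaluating one candidate solution on a fresh sample yields an upper bound.

```python
# Sketch of SAA-based confidence bounds for min_x E[F(x, xi)], illustrated
# on the toy problem F(x, xi) = (x - xi)^2 with xi ~ N(0, 1), whose true
# optimal value is Var(xi) = 1. The problem choice is ours, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def saa_solve(sample):
    """Solve the sampled problem min_x mean((x - xi)^2): x* is the mean."""
    x_star = sample.mean()
    return x_star, ((sample - x_star) ** 2).mean()

M, N = 20, 200          # M sampled problems of size N each
lower_vals = [saa_solve(rng.normal(size=N))[1] for _ in range(M)]
lo_mean = np.mean(lower_vals)
lo_se = np.std(lower_vals, ddof=1) / np.sqrt(M)

x_hat = saa_solve(rng.normal(size=N))[0]       # one candidate solution
fresh = rng.normal(size=5000)                  # large independent sample
up_vals = (x_hat - fresh) ** 2
up_mean, up_se = up_vals.mean(), up_vals.std(ddof=1) / np.sqrt(fresh.size)

z = 1.645  # one-sided 95% normal quantile
print(f"lower bound: {lo_mean - z * lo_se:.3f}")  # E[SAA value] <= true value
print(f"upper bound: {up_mean + z * up_se:.3f}")  # E[F(x_hat)] >= true value
```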
Abstract:
Tests on printed circuit boards and integrated circuits are widely used in industry, resulting in reduced design time and cost of a project. Functional and connectivity tests on this type of circuit soon became a concern for manufacturers, driving the search for a reliable, quick, cheap and universal test solution. Initially, test schemes were based on a set of needles connected to the inputs and outputs of the integrated circuit board (bed-of-nails), to which signals were applied in order to verify whether the circuit met the specifications and could be assembled in the production line. With the growth of designs, circuit miniaturization, improvements in production processes and materials, as well as the increase in the number of circuits, another solution had to be found. Thus Boundary-Scan Testing was developed, which operates on the boundary of integrated circuits and allows testing the connectivity of the input and output ports of a circuit. The Boundary-Scan Testing method was standardized in 1990 by the IEEE, becoming known as the IEEE 1149.1 Standard. Since then a large number of manufacturers have adopted this standard in their products. The main objective of this master's thesis is the design of Boundary-Scan Testing in an image sensor in CMOS technology: analyzing the standard's requirements and the process used in prototype production, developing the design and layout of the Boundary-Scan logic, and analyzing the results obtained after production. Chapter 1 briefly presents the evolution of testing procedures used in industry, developments and applications of image sensors, and the motivation for using the Boundary-Scan Testing architecture. Chapter 2 explores the fundamentals of Boundary-Scan Testing and image sensors, starting with the Boundary-Scan architecture defined in the Standard, whose functional blocks are analyzed; this understanding is necessary to implement the design on an image sensor. It also explains the architecture of image sensors currently in use, focusing on sensors with a large number of inputs and outputs. Chapter 3 describes the design of the implemented Boundary-Scan logic, analyzing the design and functions of the prototype, the software used, and the designs and simulations of the functional blocks of the implemented Boundary-Scan. Chapter 4 presents the layout process based on the design developed in Chapter 3, describing the software used for this purpose, the planning of the layout location (floorplan) and its dimensions, the layout of the individual blocks, the layout rule checks, the comparison with the final design and, finally, the simulation. Chapter 5 describes how the functional tests were performed to verify the design's compliance with the specifications of the IEEE 1149.1 Standard; these tests focused on the application of signals to the input and output ports of the produced prototype. Chapter 6 presents the conclusions drawn throughout the execution of the work.
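At the heart of the IEEE 1149.1 architecture referenced throughout these chapters is the TAP controller, a 16-state machine advanced by the TMS signal on each TCK rising edge. The states and transitions below are the ones defined by the standard; modelling them in Python, rather than in RTL as the actual sensor design would, is purely illustrative.

```python
# Behavioural model of the IEEE 1149.1 TAP controller: the 16 states and
# TMS-driven transitions are as defined by the standard; modelling them in
# Python (rather than RTL) is only for illustration.
TAP_TRANSITIONS = {  # state: (next if TMS=0, next if TMS=1)
    "Test-Logic-Reset": ("Run-Test/Idle", "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR", "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR", "Exit1-DR"),
    "Shift-DR":         ("Shift-DR", "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR", "Update-DR"),
    "Pause-DR":         ("Pause-DR", "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR", "Update-DR"),
    "Update-DR":        ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR", "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR", "Exit1-IR"),
    "Shift-IR":         ("Shift-IR", "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR", "Update-IR"),
    "Pause-IR":         ("Pause-IR", "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR", "Update-IR"),
    "Update-IR":        ("Run-Test/Idle", "Select-DR-Scan"),
}

def walk(tms_bits, state="Test-Logic-Reset"):
    """Apply a TMS bit sequence (one bit per TCK rising edge)."""
    for bit in tms_bits:
        state = TAP_TRANSITIONS[state][bit]
    return state

# Five TMS=1 clocks reach Test-Logic-Reset from any state:
assert all(walk([1] * 5, s) == "Test-Logic-Reset" for s in TAP_TRANSITIONS)
# The classic sequence to start loading an instruction: reset -> Shift-IR.
print(walk([0, 1, 1, 0, 0]))  # -> Shift-IR
```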
Abstract:
Based on the genetic analysis of the genome of the phytopathogen Xylella fastidiosa, five media with defined composition were developed and the growth abilities of this fastidious prokaryote were evaluated in liquid media and on solid plates. All media had a common salt composition and included the same amounts of glucose and vitamins but differed in their amino acid content. XDM1 medium contained the amino acids threonine, serine, glycine, alanine, aspartic acid and glutamic acid, for which complete degradation pathways occur in X. fastidiosa; XDM2 included serine and methionine, amino acids for which biosynthetic enzymes are absent, plus asparagine and glutamine, which are abundant in the xylem sap; XDM3 had the same composition as XDM2 but with asparagine replaced by aspartic acid, due to the presence of a complete degradation pathway for aspartic acid; XDM4 was a minimal medium with glutamine as the sole nitrogen source; XDM5 had the same composition as XDM4, plus methionine. The liquid and solidified XDM2 and XDM3 media were the most effective for the growth of X. fastidiosa. This work opens the opportunity for the in silico design of bacterial defined media once their genome is sequenced. (C) 2002 Federation of European Microbiological Societies. Published by Elsevier B.V. All rights reserved.
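The closing claim, that defined media can be designed in silico from an annotated genome, can be illustrated with a toy rule of the kind the abstract itself describes: supply any amino acid whose biosynthetic pathway is missing, and favour substrates for which complete degradation pathways exist. The pathway table below is a made-up placeholder, not genome data.

```python
# Toy illustration of in-silico media design from genome annotation: supply
# amino acids the organism cannot synthesize, and favour substrates it can
# fully degrade. The pathway table is a made-up placeholder, not genome data.
pathways = {
    #  amino acid : (has biosynthesis, has complete degradation)
    "serine":     (False, True),
    "methionine": (False, False),
    "asparagine": (True,  False),   # abundant in xylem sap
    "glutamine":  (True,  True),
    "alanine":    (True,  True),
}

must_supply = [aa for aa, (biosyn, _) in pathways.items() if not biosyn]
usable_substrates = [aa for aa, (_, degrad) in pathways.items() if degrad]

print("must be supplied:", must_supply)            # no biosynthetic enzymes
print("degradable substrates:", usable_substrates) # complete catabolic pathways
```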
Abstract:
Absorbance detection in capillary electrophoresis (CE) offers excellent mass sensitivity, but poor concentration detection limits owing to the very small injection volumes (normally 1 to 10 nL). This aspect can be a limiting factor in the applicability of CE/UV to detecting species at trace levels, particularly pesticide residues. In the present work, the optical path length of an on-column detection cell was increased through a proper connection of the column (75 µm i.d.) to a capillary detection cell of 180 µm optical path length in order to improve detectability. It is shown that the cell with an extended optical path length results in a significant gain in signal-to-noise ratio. The effect of the increase in the optical path length was evaluated for six pesticides, namely carbendazim, thiabendazole, imazalil, procymidone, triadimefon, and prochloraz. The resulting optical enhancement of the detection cell provided detection limits of ca. 0.3 µg/mL for the studied compounds, thus enabling residue analysis by CE/UV.
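The gain from extending the optical path follows directly from the Beer-Lambert law, A = εlc: at fixed concentration, absorbance (and hence signal) scales linearly with path length. A quick check of the numbers given in the abstract:

```python
# Beer-Lambert check of the path-length gain reported above: absorbance
# A = epsilon * l * c scales linearly with path length l at fixed
# concentration, so the expected signal gain is the ratio of path lengths.
l_standard = 75e-6    # on-column path = capillary i.d. (m)
l_extended = 180e-6   # extended-path detection cell (m)

gain = l_extended / l_standard
print(f"expected signal gain: {gain:.1f}x")   # 2.4x
# If noise were unchanged, detection limits would improve by roughly the
# same factor; in practice stray light and band broadening erode part of it.
```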
Abstract:
The reduction of the fuel content of a monoethanolamine nitrate (MEAN) fueled explosive slurry was investigated. The work was performed in three phases. The first involved reducing the MEAN content of a reference slurry from its initial value of 36% down to 24% by weight, the balance being filled with ammonium nitrate, the least expensive item in the slurry composition. This proved successful, leading to an overall cost reduction of 17% while keeping the overall performance quite satisfactory. The second phase consisted of trying to bring the MEAN content down from 24% to 17%. Although this led to a further cost reduction, the formulations, obtained by substituting ammonium nitrate/fuel oil (ANFO) for part of the MEAN content, produced unsatisfactory results regarding ignition and density. In the third phase, the Design of Experiments technique was used to find formulations displaying not only lower cost but also acceptable overall performance. This led to a raw material cost reduction ranging from 23 to 26% relative to the initial reference slurry formulation.
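The abstract does not give the factors or levels used in the third phase; purely as an illustration of the Design of Experiments approach, here is a two-level full factorial over three hypothetical formulation factors, scored with a placeholder cost model:

```python
# Illustrative two-level full factorial design for a slurry formulation.
# Factors, levels, and the scoring model are hypothetical; the study's
# actual design and responses are not given in the abstract.
from itertools import product

factors = {                      # hypothetical low/high levels (wt%)
    "MEAN": (17, 24),
    "ANFO": (0, 8),
    "water": (12, 16),
}

def raw_material_cost(run):      # placeholder cost model, not real data
    return 2.0 * run["MEAN"] + 0.5 * run["ANFO"] + 0.1 * run["water"]

runs = [dict(zip(factors, levels))
        for levels in product(*factors.values())]   # 2^3 = 8 runs

for run in sorted(runs, key=raw_material_cost):
    print(run, "cost index:", raw_material_cost(run))
# Replicates and response measurements (ignition, density, detonation
# performance) would be added before fitting a factorial-effects model.
```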