910 results for Multiple Baseline Design
Abstract:
This paper presents a high-efficiency SEPIC rectifier for an electronic ballast application with multiple fluorescent lamps. The proposed SEPIC rectifier is based on a Zero-Current-Switching (ZCS) Pulse-Width-Modulated (PWM) soft-commutation cell. The high power factor of this structure is obtained using the instantaneous average-current control technique, in order to properly comply with the IEC 61000-3-2 standard. The inverter stage of this new electronic ballast is a classical Zero-Voltage-Switching (ZVS) half-bridge inverter. A design methodology is developed for this new electronic ballast, and a design example is presented for an application with five 40W-T12 fluorescent lamps (200W output power), 220Vrms input voltage, 130Vdc dc-link voltage, and rectifier and inverter stages operating at 50kHz. Experimental results are also presented: the input-current THD is 6.41% for an input-voltage THD of 2.14%, and the measured overall efficiency is about 92.8% at rated load.
Abstract:
Context and Objective: Liposomal daunorubicin has been used to treat hematological malignancies, including multiple myeloma (MM). The goal was to evaluate the efficacy, side effects and toxicity of liposomal daunorubicin and dexamethasone (DD protocol). Design and Setting: Prospective study at the Sírio-Libanês, São Camilo and Alemão Oswaldo Cruz hospitals, Brazil. Methods: Twenty consecutive patients with active MM received four cycles of liposomal daunorubicin intravenously for two hours (25-30 mg/m²/day) on three consecutive days per month, with oral dexamethasone (10 mg every six hours) on four consecutive days three times a month. Results: The male/female ratio was 1:1 and the median age was 60. Nine patients were stage IIA, ten IIIA and one IIIB. The median time from diagnosis to starting DD was 13 months. All patients received four cycles, except one. Fifteen had already received chemotherapy before DD. Reductions of >50% in serum monoclonal paraprotein were observed in six patients after the first cycle (30%), six after the second (30%) and four after the third (20%), while four (20%) did not reach this threshold. Initially, 17 patients (85%) had anemia; 12 (70%) achieved correction. Progressive disease was observed in three patients (15%), while one had a minimal response, four (20%) a partial response and 12 (60%) a complete response. Hematological toxicity was acceptable: three patients (15%) had neutrophils <1,000/mm³; none had thrombocytopenia. Gastrointestinal toxicity was mild: nausea (10%), anorexia (15%) and no vomiting. Conclusions: This treatment has mild toxicity and a good response rate. It may therefore be feasible before autologous bone marrow transplantation.
Abstract:
Virtual platforms are of paramount importance for design space exploration, and their use in early software development and verification is crucial. In particular, enabling accurate and fast simulation is especially useful, but these features usually conflict and tradeoffs have to be made. In this paper we describe how we integrated TLM communication mechanisms into a state-of-the-art, cycle-accurate MPSoC simulation platform. More specifically, we show how we adapted ArchC fast functional instruction set simulators to the MPARM platform in order to achieve both fast simulation speed and accuracy. Our implementation led to a much faster hybrid platform, reaching speedups of up to 2.9x (2.1x on average) with negligible impact on power estimation accuracy (3.26% on average, with a 2.25% standard deviation). © 2011 IEEE.
Abstract:
In this work, the chaotic behavior of a micro-mechanical resonator with electrostatic forces on both sides is suppressed. The aim is to control the system onto an orbit of the analytical solution obtained by the Method of Multiple Scales. Two strategies are used to control the trajectory of the system: State-Dependent Riccati Equation (SDRE) control and Optimal Linear Feedback Control (OLFC). Both strategies proved effective in controlling the trajectory of the system. Additionally, the robustness of each strategy is tested considering parametric errors and measurement noise in the control. © 2012 American Institute of Physics.
Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
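As a minimal illustration of the hierarchical idea described above, the sketch below (hypothetical code, not from the article) simulates a two-level ecological data set in which site-to-site process variation and within-site measurement error are separate layers, and then recovers the two variance components by the method of moments:

```python
import random

def simulate_hierarchical(n_sites, n_reps, mu, tau, sigma, rng):
    """Two-level data set:
    site effects a_i ~ N(0, tau^2)            (process uncertainty)
    observations y_ij = mu + a_i + e_ij,
    with e_ij ~ N(0, sigma^2)                 (measurement error)."""
    data = []
    for _ in range(n_sites):
        a = rng.gauss(0.0, tau)
        data.append([mu + a + rng.gauss(0.0, sigma) for _ in range(n_reps)])
    return data

def variance_components(data):
    """Method-of-moments split of the total variation into a
    measurement (within-site) and a process (between-site) part."""
    n_sites = len(data)
    m = len(data[0])
    site_means = [sum(row) / m for row in data]
    grand = sum(site_means) / n_sites
    # pooled within-site variance estimates sigma^2
    within = sum((y - sum(row) / m) ** 2
                 for row in data for y in row) / (n_sites * (m - 1))
    # variance of site means estimates tau^2 + sigma^2 / m
    between = sum((sm - grand) ** 2 for sm in site_means) / (n_sites - 1)
    tau2 = max(between - within / m, 0.0)
    return within, tau2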
Abstract:
1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat-modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. 
In step with these theoretical developments, state-of-the-art software that implements the methods is described, making them accessible to practicing ecologists.
Abstract:
The elimination of all external incisions is an important step in reducing the invasiveness of surgical procedures. Natural Orifice Translumenal Endoscopic Surgery (NOTES) is an incision-less surgery and provides explicit benefits such as reducing patient trauma and shortening recovery time. However, technological difficulties impede the widespread utilization of the NOTES method. A novel robotic tool has been developed, which makes NOTES procedures feasible by using multiple interchangeable tool tips. The robotic tool has the capability of entering the body cavity through an orifice or a single incision using a flexible articulated positioning mechanism and once inserted is not constrained by incisions, allowing for visualization and manipulations throughout the cavity. Multiple interchangeable tool tips of the robotic device initially consist of three end effectors: a grasper, scissors, and an atraumatic Babcock clamp. The tool changer is capable of selecting and switching between the three tools depending on the surgical task using a miniature mechanism driven by micro-motors. The robotic tool is remotely controlled through a joystick and computer interface. In this thesis, the following aspects of this robotic tool will be detailed. The first-generation robot is designed as a conceptual model for implementing a novel mechanism of switching, advancing, and controlling the tool tips using two micro-motors. It is believed that this mechanism achieves a reduction in cumbersome instrument exchanges and can reduce overall procedure time and the risk of inadvertent tissue trauma during exchanges with a natural orifice approach. Also, placing actuators directly at the surgical site enables the robot to generate sufficient force to operate effectively. 
Mounting the multifunctional robot on the distal end of an articulating tube frees the robot kinematics from restriction and helps solve some of the difficulties otherwise faced during surgery using NOTES or related approaches. The second-generation multifunctional robot is then introduced, in which the overall size is reduced and two arms provide two additional degrees of freedom, making insertion through the esophagus feasible and increasing dexterity. Improvements are necessary in future iterations of the multifunctional robot; however, the work presented is a proof of concept for NOTES robots capable of abdominal surgical interventions.
Abstract:
Although low- and middle-income countries still bear the burden of major infectious diseases, chronic noncommunicable diseases are becoming increasingly common due to rapid demographic, epidemiologic, and nutritional transitions. However, information is generally scant in these countries regarding chronic disease incidence, social determinants, and risk factors. The Brazilian Longitudinal Study of Adult Health (ELSA-Brasil) aims to contribute relevant information with respect to the development and progression of clinical and subclinical chronic diseases, particularly cardiovascular diseases and diabetes. In this report, the authors delineate the study's objectives, principal methodological features, and timeline. At baseline, ELSA-Brasil enrolled 15,105 civil servants from 5 universities and 1 research institute. The baseline examination (2008-2010) included detailed interviews, clinical and anthropometric examinations, an oral glucose tolerance test, overnight urine collection, a 12-lead resting electrocardiogram, measurement of carotid intima-media thickness, echocardiography, measurement of pulse wave velocity, hepatic ultrasonography, retinal fundus photography, and an analysis of heart rate variability. Long-term biologic sample storage will allow investigation of biomarkers that may predict cardiovascular diseases and diabetes. Annual telephone surveillance, initiated in 2009, will continue for the duration of the study. A follow-up examination is scheduled for 2012-2013.