981 results for "applying performance"


Relevance: 30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


A thorough study of the thermal performance of multipass parallel cross-flow and counter-cross-flow heat exchangers has been carried out by applying a new numerical procedure. According to this procedure, the heat exchanger is discretized into small elements following the tube-side fluid circuits. Each element is itself a one-pass mixed-unmixed cross-flow heat exchanger. Simulated results have been validated through comparisons with results from analytical solutions for one- to four-pass parallel cross-flow and counter-cross-flow arrangements. Very accurate results have been obtained over wide ranges of NTU (number of transfer units) and C* (heat capacity rate ratio) values. New effectiveness data for the aforementioned configurations and for higher numbers of tube passes are presented, along with data for a complex flow configuration proposed elsewhere. The proposed procedure constitutes a useful research tool for both theoretical and experimental studies of the thermal performance of cross-flow heat exchangers.
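The element-level building block of such a discretization can be sketched with the textbook ε-NTU relation for a one-pass cross-flow exchanger with one fluid mixed (this is the standard relation for Cmax mixed / Cmin unmixed, not the paper's own code):

```python
from math import exp

def element_effectiveness(ntu: float, c_star: float) -> float:
    """Epsilon-NTU relation for a one-pass cross-flow element with the
    mixed stream carrying the larger heat capacity rate (Cmax mixed,
    Cmin unmixed) -- the element type used in the discretization.
    ntu: number of transfer units of the element; c_star: Cmin/Cmax."""
    if c_star == 0.0:  # one stream changes phase (infinite capacity rate)
        return 1.0 - exp(-ntu)
    return (1.0 - exp(-c_star * (1.0 - exp(-ntu)))) / c_star
```

In the paper's procedure each element along a tube-side circuit would be evaluated with such a relation and the outlet temperatures marched from element to element; the marching logic itself is not reproduced here.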


OBJECTIVE: This study aimed to assess the practices of pharmacists in hospital care. METHOD: We interviewed 20 pharmacists from the Pharmacy Division by applying a structured instrument in September 2005. This instrument addressed aspects related to the main activities at the hospital pharmacy, which were assessed according to indicators organized into five areas: sector management, hospital pharmacotechniques, committee activities, information and pharmacotherapeutic follow-up, and teaching and research activities. RESULTS: The Pharmacy Division considered all structural aspects under analysis as essential for the good development and delivery of its services. We found that some essential services, such as the Medication Information Service and Pharmacotherapeutic Follow-up, were absent. The pharmacists were dissatisfied with human resource and physical structure dimensioning, and were not very active in terms of pharmaceutical care. CONCLUSION: The results indicate that care is still centered on the drug, with few clinical activities. We suggest reformulations in service management, particularly in the management of pharmacists.


An accurate estimate of machining time is very important for predicting delivery time and manufacturing costs, and for supporting production process planning. Most commercial CAM software systems estimate the machining time in milling operations simply by dividing the entire tool path length by the programmed feed rate. This estimate can differ drastically from the real process time because the feed rate is not always constant, owing to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for estimating milling time when machining free-form geometries. The method considers a variable called machine response time (MRT), which characterizes a real CNC machine's capacity to move at high feed rates over free-form geometries. MRT is a global performance feature that can be obtained for any type of CNC machine configuration by carrying out a simple test. To validate the methodology, a workpiece was used to generate NC programs for five different types of CNC machines, and a practical industrial case study was also carried out. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's capability; furthermore, the greater the MRT, the larger the difference between predicted and real milling time. The proposed method achieved errors ranging from 0.3% to 12% of the real machining time, whereas the CAM estimates erred by 211% to 1244%. The MRT-based procedure is also suggested as an instrument for machine tool benchmarking.
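The gap between the naive CAM estimate and a response-time-aware estimate can be illustrated with a toy model. The max(·) rule below is an illustrative simplification of the idea, not the paper's actual MRT formulation, and all names and numbers are hypothetical:

```python
def cam_time_s(segment_lengths_mm, feed_mm_min):
    """Naive CAM estimate: total tool path length divided by the
    programmed feed rate (seconds)."""
    return sum(segment_lengths_mm) / (feed_mm_min / 60.0)

def mrt_time_s(segment_lengths_mm, feed_mm_min, mrt_s):
    """Toy response-time-aware estimate: a short segment can never be
    executed faster than the machine response time mrt_s, so each
    segment costs at least mrt_s seconds."""
    feed_mm_s = feed_mm_min / 60.0
    return sum(max(length / feed_mm_s, mrt_s)
               for length in segment_lengths_mm)

# 100 short 1 mm segments (typical of free-form tool paths) at 3000 mm/min:
naive = cam_time_s([1.0] * 100, 3000.0)        # 2.0 s
aware = mrt_time_s([1.0] * 100, 3000.0, 0.05)  # 5.0 s
```

This reproduces the qualitative finding: the naive estimate undershoots badly precisely when segments are short relative to the machine's ability to accelerate.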


Motivated by rising drilling operation costs, the oil industry has shown a trend toward real-time measurements and control. In this scenario, drilling control becomes a challenging problem for the industry, especially due to the difficulty of modeling its parameters. The rate of penetration (ROP), one of the evaluators of drill-bit performance, has been used as a drilling control parameter. However, the relationships between the operational variables affecting the ROP are complex and not easily modeled. This work presents a neuro-genetic adaptive controller to treat this problem, based on an autoregressive model with exogenous inputs (ARX) and on a genetic algorithm (GA) to control the ROP. © [2006] IEEE.
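The predictive core of such a controller, a one-step-ahead ARX prediction, can be sketched as follows. The coefficients and operating values are illustrative, not fitted to drilling data, and the GA search around the model is omitted:

```python
def arx_predict(y_past, u_past, a, b):
    """One-step-ahead ARX prediction:
        y[t] = sum_i a[i]*y[t-1-i] + sum_j b[j]*u_j[t-1]
    y_past: past outputs, most recent first; u_past: the exogenous
    regressors at the previous step (here, one value per input)."""
    return (sum(ai * yi for ai, yi in zip(a, y_past)) +
            sum(bj * uj for bj, uj in zip(b, u_past)))

# Toy ROP prediction from the last ROP plus two operational inputs
# (weight on bit and rotary speed); all values are hypothetical.
rop_next = arx_predict(y_past=[12.0],         # last ROP, m/h
                       u_past=[8.0, 120.0],   # WOB (t), RPM
                       a=[0.5], b=[0.3, 0.02])
```

A controller would invert this relationship, searching (in the paper, with a GA) for the inputs that drive the predicted ROP toward a target.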


The objective of this article is to apply the Design of Experiments technique along with the Discrete Event Simulation technique in an automotive process. The benefits of design of experiments in simulation include the possibility of improving the performance of the simulation process, avoiding trial-and-error searches for solutions. A methodology for the joint use of Design of Experiments and computer simulation is presented to assess the effects of the variables involved in the process and their interactions. The efficacy of using process mapping and design of experiments in the conception and analysis phases is confirmed.
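The combination can be sketched generically: enumerate a full-factorial design, run each treatment through the simulation model, and estimate main effects. The toy response below stands in for the automotive discrete-event model, which is not described in enough detail to reproduce:

```python
from itertools import product

def full_factorial(levels_per_factor):
    """All level combinations -- a 2^k design when every factor has
    two coded levels (-1, +1)."""
    return list(product(*levels_per_factor))

def main_effect(runs, responses, factor):
    """Mean response at the high level (+1) minus mean response at the
    low level (-1) of one coded factor."""
    hi = [r for run, r in zip(runs, responses) if run[factor] == +1]
    lo = [r for run, r in zip(runs, responses) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

runs = full_factorial([[-1, +1], [-1, +1]])  # 2^2 design, 4 runs

def toy_simulation(x):
    """Stand-in for the discrete-event simulation model."""
    return 10.0 + 3.0 * x[0] + 1.0 * x[1]

responses = [toy_simulation(run) for run in runs]
```

Ranking `main_effect` over the factors identifies which process variables are worth tuning in the simulation, instead of exploring settings by trial and error.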


This work encompasses the direct electrodeposition of polypyrrole nanowires onto Au substrates using two different electrochemical techniques, normal pulse voltammetry (NPV) and the constant-potential method, with the aim of applying these films for the first time to ammonia sensing in solution. The performance of these nanowire-based sensors is compared and evaluated in terms of film morphology (analyzed by scanning electron microscopy), sensitivity towards ammonia, and electrochemical and contact angle measurements. For nanowires prepared by NPV, the sensitivity towards ammonia increases with increasing amount of electrodeposited polypyrrole, as expected given the role of polypyrrole as an electrochemical transducer for ammonia oxidation. On the other hand, nanowires prepared potentiostatically displayed an unexpected opposite behavior, attributed to the lower conductivity of the longer polypyrrole nanowires obtained through this technique. These results show that the analytical and physico-chemical features of nanostructured sensors can differ greatly from those of their conventional bulk analogues. (C) 2012 Elsevier B.V. All rights reserved.


BECTS represents the vast majority of childhood focal epilepsy. Owing to the age peculiarity of children who suffer from this disease, i.e., school-going age of between 6 and 9 years, the condition is often referred to as a school disorder by parents and teachers. Objective: The aim of this study was to evaluate the academic performance of children with BECTS, according to the clinical and electroencephalographic ILAE criteria, and to compare the results of neuropsychological tests of language and attention with the frequency of epileptic discharges. Methods: The performance of 40 school children with BECTS was evaluated by applying a school performance test (SPT), neuropsychological tests (WISC and Trail Making), and language tests (Illinois Test of Psycholinguistic Abilities, ITPA, and Staggered Spondaic Word, SSW). The same tests were applied in the control group. Results: Children with BECTS, when compared to those in the control group, showed lower scores in academic performance (SPT), the digits and similarities subtests of the WISC, the auditory processing subtest of the SSW, and the ITPA at the representational and automatic levels. The study showed that epileptic discharges did not influence the results. Conclusion: Children with BECTS scored significantly lower in tests of academic performance when compared with those in the control group, probably due to executive dysfunction. (C) 2011 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.


Construction of a continuous, multidimensional high-performance liquid chromatography system for the separation of proteins and peptides with integrated size-selective sample fractionation. A multidimensional HPLC separation method was developed for proteins and peptides with a molecular weight below 15 kDa. In the first step, the target analytes are separated from higher-molecular-weight and non-ionic components using restricted access materials (RAM) with ion-exchange functionality. The proteins are then separated on an analytical ion-exchange column and on reversed-phase (RP) columns. To avoid sample losses, a continuously operating, fully automated system was built, based on different separation speeds and four parallel RP columns. Two RP columns are always eluted simultaneously, but with staggered start times, so that shallow gradients still yield sufficient separation performance. While the third column is regenerated, the fourth column is loaded by enriching proteins and peptides at the column head. Over the total analysis time of 96 minutes, fractions from the first dimension are transferred onto the RP columns at 4-minute intervals and separated within 8 minutes, yielding 24 RP chromatograms. Test substances included standard proteins as well as proteins and peptides from human hemofiltrate and from lung fibroblast cell culture supernatants. Fractions were also collected and analyzed by MALDI-TOF mass spectrometry. From a single injection, more than 1000 peaks were resolved in the 24 RP chromatograms; the theoretical peak capacity is approximately 3000.
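The staggered four-column rotation can be sketched as a simple schedule. This is a simplification that tracks only the separation windows, not the load and regeneration phases; the timing numbers come from the abstract:

```python
def rp_schedule(total_min=96, interval_min=4, separation_min=8, n_columns=4):
    """Assign each first-dimension fraction to a reversed-phase column
    in rotation: fraction i is transferred at i*interval_min and its
    separation occupies the column for separation_min minutes."""
    n_fractions = total_min // interval_min  # 24 fractions
    return [
        {"fraction": i,
         "column": i % n_columns,
         "start_min": i * interval_min,
         "end_min": i * interval_min + separation_min}
        for i in range(n_fractions)
    ]

schedule = rp_schedule()
```

With four columns in rotation, each column is reused only every 16 minutes, so an 8-minute separation window never collides with the next use of the same column; that is the design point the staggering exploits.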


The technology of partial virtualization is a revolutionary approach to the world of virtualization. It lies directly in between full system virtual machines (like QEMU or Xen) and application-level virtual machines (like the JVM or the CLR). The ViewOS project is the flagship of this technique. Developed by the Virtual Square laboratory, it was created to provide an abstract view of the underlying system resources on a per-process basis and to work against the principle of the Global View Assumption. Virtual Square provides several different methods to achieve partial virtualization within the ViewOS system, at both user and kernel levels. Each of these approaches has its own advantages and shortcomings. This paper provides an analysis of the different virtualization methods and of problems related to both the generic and partial virtualization worlds. It is the result of an in-depth study and search for a new technology to provide partial virtualization based on ELF dynamic binaries. It starts with a brief analysis of currently available virtualization alternatives and then describes the ViewOS system, highlighting its current shortcomings. The vloader project is then proposed as a possible solution to some of these shortcomings, with a working proof of concept and examples to outline the potential of this new virtualization technique. By injecting specific code and libraries into the middle of the binary loading mechanism provided by the ELF standard, the vloader project enables a streamlined and simplified approach to tracing system calls. With the advantages outlined in the paper, this method presents better performance and portability than the currently available ViewOS implementations. Some of its disadvantages are also discussed, along with their possible solutions.


After almost 10 years since "The Free Lunch Is Over" article, when the need to parallelize programs started to become a real and mainstream issue, a lot has happened:
• Processor manufacturers are reaching the physical limits of most of their approaches to boosting CPU performance, and are instead turning to hyperthreading and multicore architectures;
• Applications increasingly need to support concurrency;
• Programming languages and systems are increasingly forced to deal well with concurrency.
This thesis proposes an overview of a paradigm that aims to properly abstract the problem of propagating data changes: Reactive Programming (RP). This paradigm proposes an asynchronous, non-blocking approach to concurrency and computation, abstracting away from low-level concurrency mechanisms.
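The core idea of the paradigm, derived values that update automatically when their sources change, can be sketched in a few lines. This is a synchronous toy; real RP libraries are asynchronous and non-blocking, which this sketch deliberately omits:

```python
class Signal:
    """A minimal push-based reactive value: observers are re-run
    whenever the value changes, so data changes propagate through
    derived signals automatically."""

    def __init__(self, value):
        self._value = value
        self._observers = []

    @property
    def value(self):
        return self._value

    def set(self, value):
        self._value = value
        for fn in self._observers:   # push the change downstream
            fn(value)

    def map(self, fn):
        """Derive a new signal that tracks this one through fn."""
        out = Signal(fn(self._value))
        self._observers.append(lambda v: out.set(fn(v)))
        return out

celsius = Signal(0.0)
fahrenheit = celsius.map(lambda c: c * 9 / 5 + 32)
celsius.set(100.0)  # fahrenheit recomputes without explicit calls
```

The caller never recomputes `fahrenheit`; the dependency graph does it on each change, which is exactly the propagation problem RP abstracts.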


This article discusses performance in the context of the World Trade Organization (WTO). Applying the framework of Gutner and Thompson, and inspired by principal-agent theory, it argues that existing studies have underspecified the institutional milieu that affects performance. The WTO is a member-driven organization whose Members are part of the international organization (IO) (e.g., through rule-making) and at the same time act outside the IO (e.g., through implementation). Thus, a narrow reading of the IO (focusing on the civil servants and the Director-General and his staff) will not suffice to understand IO performance in the WTO context. Selected evidence is presented to illustrate aspects of the WTO's inner workings and the institutional milieu of performance. In addition, the article discusses a number of performance parameters, including the relationship between Secretariat autonomy and performance, the role of information, and the mechanisms of performance aggregation. The article ends by cautioning against quick fixes to the system to improve performance.


Non-uniform sampling (NUS) has been established as a route to obtaining true sensitivity enhancements when recording indirect dimensions of decaying signals in the same total experimental time as traditional uniform incrementation of the indirect evolution period. Theory and experiments have shown that NUS can yield up to two-fold improvements in the intrinsic signal-to-noise ratio (SNR) of each dimension, while even conservative protocols can yield 20-40 % improvements in the intrinsic SNR of NMR data. Applications of biological NMR that can benefit from these improvements are emerging, and in this work we develop some practical aspects of applying NUS nD-NMR to studies that approach the traditional detection limit of nD-NMR spectroscopy. Conditions for obtaining high NUS sensitivity enhancements are considered here in the context of enabling H-1,N-15-HSQC experiments on natural abundance protein samples and H-1,C-13-HMBC experiments on a challenging natural product. Through systematic studies we arrive at more precise guidelines to contrast sensitivity enhancements with reduced line shape constraints, and report an alternative sampling density based on a quarter-wave sinusoidal distribution that returns the highest fidelity we have seen to date in line shapes obtained by maximum entropy processing of non-uniformly sampled data.
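One plausible reading of a quarter-wave sinusoidal sampling density can be sketched as a deterministic inverse-CDF point schedule. This illustrates quarter-sine-weighted NUS in general, not the authors' actual schedule generator, and the density form is an assumption:

```python
from math import asin, floor, pi

def quarter_sine_schedule(n_max, n_sampled):
    """Pick n_sampled of n_max indirect-dimension increments so that
    the local sampling density follows rho(t) ~ cos(pi*t/(2*n_max)):
    dense where the decaying signal is strong (early t), sparse where
    it has decayed. Points are placed by inverting the cumulative
    density F(t) = sin(pi*t/(2*n_max)) at equally spaced quantiles."""
    pts = {min(n_max - 1,
               floor((2 * n_max / pi) * asin((j + 0.5) / n_sampled)))
           for j in range(n_sampled)}
    return sorted(pts)

schedule = quarter_sine_schedule(64, 16)  # 25% sampling of 64 increments
```

The resulting gaps grow toward the end of the evolution period, which is the qualitative behavior any decay-matched NUS density should show.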


This Ultra-High Performance Concrete (UHPC) research involves observing early-age creep and shrinkage under a compressive load throughout multiple thermal curing regimes. The goal was to mimic the conditions expected of a precast/prestressing plant in the United States, where UHPC beams would be produced quickly to maximize a manufacturing plant's output. The practice of steam curing green concrete to accelerate compressive strength gain for early release of the prestressing tendons was utilized (140°F [60°C], 95% RH, 14 hrs), in addition to the full thermal treatment (195°F [90°C], 95% RH, 48 hrs), while the specimens were under compressive loading. Past experimental studies of the creep and shrinkage characteristics of UHPC have only applied a creep load after the thermal treatment had been administered, or have used ambient-cured specimens. This research instead mimicked current U.S. precast/prestressed plant procedures, and thus characterized the creep and shrinkage of UHPC as it is thermally treated under a compressive load. Michigan Tech has three movable creep frames accommodating two loading levels per frame, 0.2f'ci and 0.6f'ci. Specimens were loaded in the creep frames and moved into a custom-built curing chamber at different times, mimicking a precast plant producing several beams throughout the week and applying a thermal cure to all of the beams over the weekend. This thesis presents the effects of the varying curing regimes on creep strain, with an ambient cure regime used as the baseline for comparison. In all thermally cured specimens, the compressive creep and shrinkage strains are accelerated to a maximum value and remain essentially constant after the thermal cure. The average creep coefficient for specimens subjected to a thermal cure was found to be 1.12 and 0.78 for the high and low load levels, respectively.
Precast/prestressed plants can expect that simultaneously thermally curing UHPC elements produced throughout the week does not impact the post-cure creep coefficient.
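The creep coefficient itself is a simple strain ratio, and the reported high-load average can be reproduced from hypothetical readings (the strain values below are illustrative, chosen to match the reported 1.12, not data from the thesis):

```python
def creep_coefficient(total_strain, elastic_strain, shrinkage_strain=0.0):
    """Creep coefficient phi = creep strain / initial elastic strain,
    where the creep strain is the total measured deformation minus the
    instantaneous elastic strain and (optionally) measured shrinkage.
    Units cancel; microstrain is typical for such measurements."""
    creep_strain = total_strain - elastic_strain - shrinkage_strain
    return creep_strain / elastic_strain

# Hypothetical high-load specimen: 750 microstrain elastic response,
# 1590 microstrain total deformation after the thermal cure.
phi_high = creep_coefficient(total_strain=1590.0, elastic_strain=750.0)
```

Deducting a companion specimen's shrinkage via `shrinkage_strain` is a common data reduction; the thesis' exact treatment may differ.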


Civil infrastructure provides essential services for the development of both society and the economy. Managing these systems efficiently is very important to ensure sound performance. However, there are challenges in extracting information from available data, which necessitates methodologies and frameworks to assist stakeholders in the decision-making process. This research proposes methodologies to evaluate system performance by maximizing the use of available information, in an effort to build and maintain sustainable systems. Under the guidance of the holistic problem formulation proposed by Mukherjee and Muga, this research specifically investigates problem-solving methods that measure and analyze metrics to support decision making. Failures are inevitable in system management. A methodology is developed to describe the arrival pattern of failures in order to assist engineers in failure response and budget prioritization, especially when funding is limited. It reveals that blockage arrivals are not totally random, while smaller meaningful subsets show good random behavior. Failure rates over time are further analyzed by applying existing reliability models and non-parametric approaches, and a scheme is proposed to depict rates over the lifetime of a given facility system. Further analysis of sub-data sets is also performed, with a discussion of context reduction. Infrastructure condition is another important indicator of system performance. The challenges in predicting facility condition are transition probability estimation and model sensitivity analysis. Methods are proposed to estimate transition probabilities by investigating the long-term behavior of the model and the relationship between transition rates and probabilities. To integrate heterogeneities, a sensitivity analysis is performed for the application of a non-homogeneous Markov chain model.
Scenarios are investigated by assuming that transition probabilities follow a Weibull-regressed function and fall within an interval estimate. For each scenario, multiple cases are simulated using Monte Carlo simulation. Results show that variations in the outputs are sensitive to the probability regression, while for the interval estimate the outputs show variations similar to the inputs. Life cycle cost analysis and life cycle assessment of a sewer system are performed comparing three pipe types: reinforced concrete pipe (RCP), non-reinforced concrete pipe (NRCP), and vitrified clay pipe (VCP). The life cycle cost analysis covers the material extraction, construction, and rehabilitation phases; in the rehabilitation phase, the Markov chain model is applied to support the rehabilitation strategy. In the life cycle assessment, the Economic Input-Output Life Cycle Assessment (EIO-LCA) tools are used to estimate environmental emissions for all three phases. Emissions are then compared quantitatively among alternatives to support decision making.
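The condition-prediction machinery described above can be sketched as a toy deterioration chain with Monte Carlo replication. This is a minimal sketch, not the thesis model; the non-homogeneous and Weibull-regressed variants would make the stay probabilities depend on facility age:

```python
import random

def simulate_condition(p_stay, years, rng, n_states=5):
    """One trajectory of a deteriorating element: each year it keeps
    its condition state with probability p_stay[state], otherwise it
    drops one state; the worst state (n_states - 1) is absorbing."""
    state = 0
    for _ in range(years):
        if state < n_states - 1 and rng.random() >= p_stay[state]:
            state += 1
    return state

def mean_final_state(p_stay, years, n_runs=2000, seed=0):
    """Monte Carlo estimate of the expected condition state after
    `years` of service."""
    rng = random.Random(seed)
    return sum(simulate_condition(p_stay, years, rng)
               for _ in range(n_runs)) / n_runs
```

Running `mean_final_state` for several `p_stay` vectors drawn from an interval estimate, or regressed from a Weibull function of age, is exactly the scenario-sweep style of analysis the paragraph describes.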