891 results for Network deployment methods
Abstract:
Objectives. To evaluate the influence of different tertiary amines on degree of conversion (DC), shrinkage-strain, shrinkage-strain rate, Knoop microhardness, and color and transmittance stabilities of experimental resins containing BisGMA/TEGDMA (3:1 wt), 0.25 wt% camphorquinone, and 1 wt% amine (DMAEMA, CEMA, DMPT, DEPT, or DABE). Different light-curing protocols were also evaluated. Methods. DC was evaluated with FTIR-ATR and shrinkage-strain with the bonded-disk method. Shrinkage-strain-rate data were obtained by numerical differentiation of the shrinkage-strain data with respect to time. Color stability and transmittance were evaluated after different periods of artificial aging, according to ISO 7491:2000. Results were evaluated with ANOVA, Tukey, and Dunnett's T3 tests (alpha = 0.05). Results. The studied properties were influenced by the amines. DC and shrinkage-strain increased in the sequence: CQ < DEPT < DMPT <= CEMA ≈ DABE < DMAEMA. Both DC and shrinkage were also influenced by the curing protocol, with positive correlations between DC and shrinkage-strain and between DC and shrinkage-strain rate. The materials generally decreased in L* and increased in b*. The strong exception was the resin containing DMAEMA, which did not show dark and yellow shifts. Color varied in the sequence: DMAEMA < DEPT < DMPT < CEMA < DABE. Transmittance varied in the sequence: DEPT ≈ DABE < DABE ≈ DMPT ≈ CEMA < DMPT ≈ CEMA ≈ DMAEMA, being more evident at the wavelength of 400 nm. No correlations between DC and optical properties were observed. Significance. The resin containing DMAEMA showed higher DC, shrinkage-strain, shrinkage-strain rate, and microhardness, in addition to better optical properties. (C) 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
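Although the abstract does not give the formula, DC from FTIR-ATR spectra of BisGMA-based resins is conventionally computed from the ratio of the aliphatic C=C absorbance (≈1638 cm⁻¹) to an aromatic reference peak (≈1608 cm⁻¹) before and after curing. A minimal sketch, with hypothetical absorbance values:

```python
def degree_of_conversion(aliphatic_cured, aromatic_cured,
                         aliphatic_uncured, aromatic_uncured):
    """Degree of conversion (%) from FTIR peak absorbances.

    The aromatic C=C peak serves as an internal standard for the
    aliphatic (polymerizable) C=C peak.
    """
    ratio_cured = aliphatic_cured / aromatic_cured
    ratio_uncured = aliphatic_uncured / aromatic_uncured
    return (1.0 - ratio_cured / ratio_uncured) * 100.0

# Hypothetical peak absorbances, for illustration only
dc = degree_of_conversion(0.30, 0.50, 0.80, 0.50)
print(f"DC = {dc:.1f}%")
```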
Abstract:
Telehealth is an exciting new technique for improving access to specialist care, especially in underserviced areas. As with any new health care intervention, proper evaluations must be carried out to ensure that limited health care resources are employed appropriately. Evidence suggests that the most successful telehealth initiatives are those that have focussed on organisational and deployment issues, rather than on the technology itself. In the right circumstances, telehealth offers a cost-effective alternative to the traditional methods of health care delivery. Copyright (C) 2001 S. Karger AG, Basel.
Abstract:
The two-node tandem Jackson network serves as a convenient reference model for the analysis and testing of different methodologies and techniques in rare event simulation. In this paper we consider a new approach to efficiently estimate the probability that the content of the second buffer exceeds some high level L before it becomes empty, starting from a given state. The approach is based on a Markov additive process representation of the buffer processes, leading to an exponential change of measure to be used in an importance sampling procedure. Unlike changes of measures proposed and studied in recent literature, the one derived here is a function of the content of the first buffer. We prove that when the first buffer is finite, this method yields asymptotically efficient simulation for any set of arrival and service rates. In fact, the relative error is bounded independent of the level L; a new result which is not established for any other known method. When the first buffer is infinite, we propose a natural extension of the exponential change of measure for the finite buffer case. In this case, the relative error is shown to be bounded (independent of L) only when the second server is the bottleneck; a result which is known to hold for some other methods derived through large deviations analysis. When the first server is the bottleneck, experimental results using our method seem to suggest that the relative error is bounded linearly in L.
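The paper's change of measure depends on the content of the first buffer; as a point of contrast, the classic static heuristic simply interchanges the arrival rate with the rate of the bottleneck (second) server. A minimal importance-sampling sketch of that static exchange, with hypothetical rates and level:

```python
import random

def tandem_overflow_is(lam, mu1, mu2, L, start=(0, 1), n_runs=20000, seed=1):
    """Estimate P(Q2 reaches L before emptying) in a two-node tandem
    Jackson network, via importance sampling with the classic static
    change of measure that swaps the arrival rate and the rate of the
    second (bottleneck) server."""
    rng = random.Random(seed)
    lam_t, mu2_t = mu2, lam            # twisted rates (lam <-> mu2)
    total = 0.0
    for _ in range(n_runs):
        q1, q2 = start
        w = 1.0                        # likelihood ratio along the path
        while 0 < q2 < L:
            # enabled transitions: (original rate, twisted rate, effect)
            events = [(lam, lam_t, (1, 0)),      # arrival to queue 1
                      (mu2, mu2_t, (0, -1))]     # departure from queue 2
            if q1 > 0:
                events.append((mu1, mu1, (-1, 1)))  # transfer 1 -> 2
            r = sum(e[0] for e in events)
            rt = sum(e[1] for e in events)
            # sample the next event under the twisted measure
            u = rng.random() * rt
            acc = 0.0
            for rate, rate_t, (d1, d2) in events:
                acc += rate_t
                if u <= acc:
                    w *= (rate / r) / (rate_t / rt)
                    q1, q2 = q1 + d1, q2 + d2
                    break
        if q2 == L:
            total += w
    return total / n_runs

# Second server is the bottleneck (mu2 < mu1), so the static
# exchange is known to behave well in this regime.
print(tandem_overflow_is(1.0, 4.0, 2.0, 6))
```

The per-step likelihood ratio telescopes along the path, so averaging the weights of paths that hit L gives an unbiased estimate of the overflow probability.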
Abstract:
Epidemiological studies of drug misusers have until recently relied on two main forms of sampling: probability and convenience. The former has been used when the aim was simply to estimate the prevalence of the condition, and the latter when in-depth studies of the characteristics, profiles, and behaviour of drug users were required, but each method has its limitations. Probability samples become impracticable when the prevalence of the condition is very low, less than 0.5% for example, or when the condition being studied is a clandestine activity such as illicit drug use. When stratified random samples are used, it may be difficult to obtain a truly representative sample, depending on the quality of the information used to develop the stratification strategy. The main limitation of studies using convenience samples is that the results cannot be generalised to the whole population of drug users, due to selection bias and a lack of information concerning the sampling frame. New methods have been developed which aim to overcome some of these difficulties, for example social network analysis, snowball sampling, capture-recapture techniques, the privileged access interviewer method, and contact tracing. All these methods have been applied to the study of drug misuse. The various methods are described and examples of their use given, drawn from both the Brazilian and international drug misuse literature.
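As a concrete example of one of these methods, capture-recapture in its simplest two-sample form estimates a hidden population from the overlap between two independent samples. A sketch using Chapman's bias-corrected version of the Lincoln-Petersen estimator; the counts are hypothetical:

```python
def lincoln_petersen(n1, n2, m):
    """Estimate hidden-population size from two overlapping samples.

    n1: size of the first sample (all 'marked'),
    n2: size of the second sample,
    m:  number in the second sample also seen in the first.
    Uses Chapman's bias-corrected form of the Lincoln-Petersen estimator.
    """
    if m < 0 or m > min(n1, n2):
        raise ValueError("recaptures must satisfy 0 <= m <= min(n1, n2)")
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: 200 drug users in a treatment register,
# 150 in an arrest register, 30 appearing in both.
print(round(lincoln_petersen(200, 150, 30)))
```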
Abstract:
In the context of electricity markets, transmission pricing is an important tool to achieve an efficient operation of the electricity system. The electricity market is influenced by several factors; however, transmission network management is one of the most important, because the network is a natural monopoly. Transmission tariffs can help to regulate the market; for this reason, they must follow strict criteria. This paper presents the following methods to tariff the use of transmission networks by electricity market players: the Postage-Stamp Method; the MW-Mile Method; Distribution Factors Methods; the Tracing Methodology; Bialek's Tracing Method; and Locational Marginal Prices. A nine-bus transmission network is used to illustrate the application of the tariff methods.
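The two simplest of these schemes can be sketched directly, assuming a fixed total network cost and precomputed per-player line flows (all figures below are hypothetical):

```python
def postage_stamp(total_cost, demands):
    """Postage-Stamp: allocate the total network cost in proportion
    to each player's peak demand (MW), ignoring actual network usage."""
    total = sum(demands.values())
    return {p: total_cost * d / total for p, d in demands.items()}

def mw_mile(cost_per_mw_mile, flows):
    """MW-Mile: charge each player for the capacity-distance it uses.
    flows[player] = list of (MW flow on a line, line length in miles)."""
    return {p: sum(mw * miles for mw, miles in use) * cost_per_mw_mile
            for p, use in flows.items()}

# Hypothetical three-player example
print(postage_stamp(9000.0, {"A": 30, "B": 50, "C": 20}))
print(mw_mile(2.0, {"A": [(10, 5), (20, 3)], "B": [(40, 2)]}))
```

The contrast is visible even in this toy case: Postage-Stamp depends only on demand shares, while MW-Mile rewards players whose power travels shorter distances.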
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool that allows them to consider all the business opportunities and make strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this, the decision support tool must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case based on California Independent System Operator (CAISO) data concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve, and Non-Spinning Reserve services is included in this paper.
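For a single service, the linear program (minimise total procurement cost subject to offer capacities and the reserve requirement) reduces to merit-order clearing. A minimal sketch of that reduced case, with hypothetical bids rather than the CAISO data used in the paper:

```python
def dispatch_service(requirement, bids):
    """Clear one ancillary service by merit order.

    bids: list of (agent, capacity_mw, price) offers for the service.
    Minimising total cost subject to capacity limits reduces, for a
    single service, to accepting offers in increasing price order.
    Returns ({agent: accepted_mw}, total_cost)."""
    accepted, cost, remaining = {}, 0.0, requirement
    for agent, cap, price in sorted(bids, key=lambda b: b[2]):
        take = min(cap, remaining)
        if take <= 0:
            break
        accepted[agent] = take
        cost += take * price
        remaining -= take
    if remaining > 1e-9:
        raise ValueError("offered capacity cannot meet the requirement")
    return accepted, cost

# Hypothetical Regulation Up market: 80 MW needed
bids = [("G1", 50, 12.0), ("G2", 40, 9.0), ("G3", 30, 15.0)]
print(dispatch_service(80, bids))
```

Dispatching several coupled services at once (e.g. when the same capacity can serve Regulation Up or Spinning Reserve) is what makes the full LP or Genetic Algorithm formulation necessary.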
Abstract:
The advent of Wireless Sensor Network (WSN) technologies is paving the way for a panoply of new ubiquitous computing applications, some of them with critical requirements. In the ART-WiSe framework, we are designing a two-tiered communication architecture for supporting real-time and reliable communications in WSNs. Within this context, we have been developing a test-bed application for testing, validating, and demonstrating our theoretical findings: a search-and-rescue/pursuit-evasion application. Basically, a WSN deployment is used to detect, localize, and track a target robot, and a station controls a rescuer/pursuer robot until it gets close enough to the target robot. This paper describes how this application was engineered, particularly focusing on the implementation of the localization mechanism.
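The abstract does not specify the localization algorithm; a common building block for such mechanisms is trilateration from ranged distances to three anchor nodes, sketched below with hypothetical positions and distances:

```python
def trilaterate(anchors, dists):
    """Estimate a node's 2-D position from three anchor positions and
    ranged distances, by linearising the circle equations (subtracting
    the first from the other two) and solving the 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear")
    # Cramer's rule on the 2x2 linear system
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Target at (2, 1); anchors at three corners of a 4x4 field
anchors = [(0, 0), (4, 0), (0, 4)]
dists = [(4 + 1) ** 0.5, (4 + 1) ** 0.5, (4 + 9) ** 0.5]
print(trilaterate(anchors, dists))
```

In a real WSN the distances would come from noisy RSSI or time-of-flight measurements, so more than three anchors and a least-squares solve are typically used.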
Abstract:
ABSTRACT OBJECTIVE To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS Data envelopment analysis, a linear programming technique, creates a best practice frontier by comparing observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model, medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model, financing budget) to analyze frontier shift in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS The mean scores for health care, teaching and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best performance year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits; 34.0% in admissions; 12.0% in undergraduate students; 13.0% in multi-professional residents; 48.0% in graduate students; 7.0% in research projects; besides a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care and no variation in research. CONCLUSIONS The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best practice frontier.
Abstract:
“Many-core” systems based on a Network-on-Chip (NoC) architecture offer various opportunities in terms of performance and computing capabilities, but at the same time they pose many challenges for the deployment of real-time systems, which must fulfill specific timing requirements at runtime. It is therefore essential to identify, at design time, the parameters that have an impact on the execution time of the tasks deployed on these systems, and the upper bounds on the other key parameters. The focus of this work is to determine an upper bound on the traversal time of a packet when it is transmitted over the NoC infrastructure. Towards this aim, we first identify and explore some limitations in the existing recursive-calculus-based approaches to compute the Worst-Case Traversal Time (WCTT) of a packet. Then, we extend the existing model by integrating the characteristics of the tasks that generate the packets. For this extended model, we propose an algorithm called “Branch and Prune” (BP). Our proposed method provides tighter and safe estimates than the existing recursive-calculus-based approaches. Finally, we introduce a more general approach, namely “Branch, Prune and Collapse” (BPC), which offers a configurable parameter that provides a flexible trade-off between the computational complexity and the tightness of the computed estimate. The recursive-calculus methods and BP are special cases of BPC, obtained when the trade-off parameter is set to 1 or ∞, respectively. Through simulations, we analyze this trade-off, reason about the implications of certain choices, and also provide some case studies to observe the impact of task parameters on the WCTT estimates.
Abstract:
Paper presented at the 30th Sunbelt Social Networks Conference, Riva del Garda, Italy, 3 July 2010.
Abstract:
Demand response has gained increasing importance in the context of competitive electricity market environments. The use of demand resources is also advantageous in the context of smart grid operation. In addition to the need for new business models for integrating demand response, adequate methods are necessary for an accurate evaluation of consumers' performance after participation in a demand response event. The present paper compares some of the existing baseline methods for consumers' performance evaluation, contrasting the results obtained with these methods and with a method proposed by the authors. A case study demonstrates the application of the referred methods to real consumption data belonging to a consumer connected to a distribution network.
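The abstract does not give its baseline formulas; a widely used family is "High X of Y", which averages the event-hour load over the X highest-consumption days among the Y most recent non-event days. A minimal sketch with hypothetical parameters and data:

```python
def high_x_of_y_baseline(daily_loads, event_hour, x=5, y=10):
    """Customer baseline for a demand-response event hour.

    daily_loads: list of non-event days, each a 24-entry hourly kW profile.
    Averages the event-hour load of the x highest-consumption days
    among the y most recent non-event days ('High X of Y' baseline).
    """
    recent = daily_loads[-y:]
    top = sorted(recent, key=lambda day: sum(day), reverse=True)[:x]
    return sum(day[event_hour] for day in top) / x

def dr_performance(baseline_kw, metered_kw):
    """Load reduction credited to the consumer (kW)."""
    return baseline_kw - metered_kw

# Hypothetical history: ten days of flat profiles at 1..10 kW
history = [[float(kw)] * 24 for kw in range(1, 11)]
baseline = high_x_of_y_baseline(history, event_hour=18)
print(baseline, dr_performance(baseline, metered_kw=5.0))
```

Comparing baseline methods matters precisely because the credited reduction is the difference between an unobservable counterfactual (the baseline) and the metered load.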
Abstract:
6th International Real-Time Scheduling Open Problems Seminar (RTSOPS 2015), Lund, Sweden.
Abstract:
Chagas disease is a chronic, tropical, parasitic disease, endemic throughout Latin America. The large-scale migration of populations has increased the geographic distribution of the disease and cases have been observed in many other countries around the world. To strengthen the critical mass of knowledge generated in different countries, it is essential to promote cooperative and translational research initiatives. We analyzed authorship of scientific documents on Chagas disease indexed in the Medline database from 1940 to 2009. Bibliometrics was used to analyze the evolution of collaboration patterns. A Social Network Analysis was carried out to identify the main research groups in the area by applying clustering methods. We then analyzed 13,989 papers produced by 21,350 authors. Collaboration among authors dramatically increased over the study period, reaching an average of 6.2 authors per paper in the last five-year period. Applying a threshold of collaboration of five or more papers signed in co-authorship, we identified 148 consolidated research groups made up of 1,750 authors. The Chagas disease network identified constitutes a "small world," characterized by a high degree of clustering and a notably high number of Brazilian researchers.
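The grouping step described above can be sketched as: build weighted co-authorship ties, keep those at or above the five-joint-papers threshold, and read candidate groups off the connected components. This is a simplification of the clustering actually applied in the study, and the toy corpus is hypothetical:

```python
from itertools import combinations
from collections import defaultdict

def research_groups(papers, min_joint_papers=5):
    """Identify research groups from a co-authorship network.

    papers: list of author lists, one per paper. Ties with at least
    `min_joint_papers` joint papers are kept; connected components of
    the resulting network are returned as candidate groups."""
    weight = defaultdict(int)
    for authors in papers:
        for a, b in combinations(sorted(set(authors)), 2):
            weight[(a, b)] += 1
    adj = defaultdict(set)
    for (a, b), w in weight.items():
        if w >= min_joint_papers:
            adj[a].add(b)
            adj[b].add(a)
    seen, groups = set(), []
    for node in adj:                      # depth-first component search
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        groups.append(comp)
    return groups

# Toy corpus: A-B collaborate on 5 papers, C-D on only 2
papers = [["A", "B"]] * 5 + [["C", "D"]] * 2
print(research_groups(papers))
```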
Abstract:
Master's dissertation (Doctoral Programme in Informatics)
Abstract:
Clinical decision-making requires synthesis of evidence from literature reviews focused on a specific theme. Evidence synthesis is performed with qualitative assessments and systematic reviews of randomized clinical trials, typically covering statistical pooling with pairwise meta-analyses. Beyond pairwise comparisons, these methods include adjusted indirect comparison meta-analysis, network meta-analysis, and mixed-treatment comparison. These tools allow synthesis of evidence and comparison of effectiveness in cardiovascular research.
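Of these, the adjusted indirect comparison (Bucher method) is simple enough to state concretely: given effect estimates for A vs B and C vs B on an additive scale (e.g. log odds ratios), the indirect A-vs-C effect is their difference, with the variances adding. A sketch with hypothetical estimates:

```python
import math

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison (Bucher method).

    Estimates the A-vs-C effect from trials of A vs B and C vs B that
    share the common comparator B. Effects must be on an additive
    scale, e.g. log odds ratio or log relative risk.
    Returns (effect, standard error, 95% confidence interval)."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    ci = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)
    return d_ac, se_ac, ci

# Hypothetical log-odds-ratio estimates and standard errors
d_ac, se_ac, ci = bucher_indirect(-0.40, 0.10, -0.10, 0.12)
print(d_ac, se_ac, ci)
```

Because the variances add, indirect comparisons are inherently less precise than head-to-head trials; network meta-analysis generalises this idea to a whole network of treatments.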