964 results for SCENARIOS


Relevance: 10.00%

Publisher:

Abstract:

OBJECTIVE To propose a short version of the Brazilian Food Insecurity Scale. METHODS Two samples were used to test the results of the analyses in two distinct scenarios. One study comprised 230 low-income families from Pelotas, RS, Southern Brazil, and the other comprised 15,575 women whose data were obtained from the 2006 National Survey on Demography and Health. Two models were tested, the first containing seven questions and the second the five questions considered most relevant in the concordance analysis. The models were compared to the Brazilian Food Insecurity Scale, and the sensitivity, specificity and accuracy parameters were calculated, as well as the kappa agreement test. RESULTS Comparing the prevalence of food insecurity between the Brazilian Food Insecurity Scale and the two models, the differences were around 2 percentage points. In the sensitivity analysis, the seven-question short version obtained 97.8% and 99.5% in the Pelotas sample and in the National Survey on Demography and Health sample, respectively, while specificity was 100% in both studies. The five-question model showed similar results (sensitivity of 95.7% and 99.5%, respectively). In the Pelotas sample, the kappa test of the seven-question version totaled 97.0% and that of the five-question version 95.0%. In the National Survey on Demography and Health sample, the two models presented a 99.0% kappa. CONCLUSIONS We suggest that the five-question model be used as the short version of the Brazilian Food Insecurity Scale, as its results were similar to those of the original scale despite the smaller number of questions. This version needs to be administered to other populations in Brazil to allow for adequate assessment of its validity parameters.
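The validation statistics reported above all derive from a 2x2 table crossing each short version against the full scale. A minimal sketch with hypothetical counts, using the standard definitions (not the study's data):

```python
# Sensitivity, specificity, accuracy and Cohen's kappa from a 2x2 confusion
# matrix comparing a short-version classification against the full scale.
# The counts in the example call are hypothetical.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, accuracy, and Cohen's kappa."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / n
    # Cohen's kappa: observed agreement corrected for chance agreement.
    p_o = accuracy
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_o - p_e) / (1 - p_e)
    return sensitivity, specificity, accuracy, kappa

sens, spec, acc, kappa = diagnostic_metrics(tp=45, fp=0, fn=1, tn=54)
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside raw sensitivity and specificity.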


A noncoherent vector delay/frequency-locked loop (VDFLL) architecture for GNSS receivers is proposed. A bank of code and frequency discriminators feeds a central extended Kalman filter that estimates the receiver's position and velocity, together with the clock error. The performance of the VDFLL architecture is compared with that of the classic scalar receiver, under both scintillation and multipath scenarios, in terms of position errors. We show that the proposed solution is superior to conventional scalar receivers, which tend to lose lock rapidly due to sudden drops in the received signal power.
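The central estimator in the VDFLL is an extended Kalman filter fed by the discriminator bank. As an illustration of the predict/update mechanics only (a single random-walk state rather than the full position/velocity/clock vector of the paper), a scalar sketch:

```python
# Minimal scalar Kalman filter predict/update cycle, as a toy illustration of
# the central estimator in a vector tracking loop. The real VDFLL uses an
# extended Kalman filter over position, velocity and clock states; here a
# single random-walk state x is tracked from noisy measurements z.

def kalman_step(x, p, z, q, r):
    """One predict/update cycle for a scalar random-walk state.

    x, p : prior state estimate and its variance
    z    : new measurement (e.g. a discriminator output mapped to the state)
    q, r : process and measurement noise variances
    """
    # Predict: random-walk model, so the variance grows by the process noise.
    x_pred, p_pred = x, p + q
    # Update: blend prediction and measurement by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, z, q=0.01, r=0.25)
```

After a few updates the estimate converges toward the measurement level while its variance shrinks, which is what lets a vector loop ride out short signal-power drops.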


Cloud SLAs compensate customers with credits when average availability drops below certain levels. This is too inflexible: customers lose unmeasured amounts of performance and are compensated only later, in subsequent charging cycles. We propose to schedule virtual machines (VMs) driven by range-based non-linear reductions of utility, different for classes of users and across different ranges of resource allocations: partial utility. This customer-defined metric allows providers to transfer resources between VMs in meaningful and economically efficient ways. We define a comprehensive cost model incorporating the partial utility that clients assign to a given level of degradation when VMs are allocated in overcommitted environments (Public, Private, Community Clouds). CloudSim was extended to support our scheduling model. Several simulation scenarios with synthetic and real workloads are presented, using datacenters of different dimensions regarding the number of servers and computational capacity. We show that partial utility-driven scheduling allows more VMs to be allocated. It brings benefits to providers regarding revenue and resource utilization, allowing for more revenue per resource allocated and scaling well with the size of datacenters when compared with a utility-oblivious redistribution of resources. Regarding clients, their workloads' execution time also improves through an SLA-based redistribution of their VMs' computational power.
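A sketch of how a range-based partial-utility function might look (class names, ranges and utility values are invented for illustration; the paper defines its own cost model):

```python
# Hypothetical "partial utility": a client-defined, range-based non-linear
# mapping from the fraction of requested resources actually allocated to the
# utility the client attaches to that allocation. Ranges and values invented.

def partial_utility(allocated_fraction, ranges):
    """ranges: list of (lower_bound, utility) pairs sorted descending by bound.

    Returns the utility of the highest range the allocation falls into.
    """
    for lower_bound, utility in ranges:
        if allocated_fraction >= lower_bound:
            return utility
    return 0.0

# A "gold" class tolerates little degradation; "bronze" degrades gracefully.
gold = [(1.0, 1.0), (0.9, 0.5), (0.0, 0.0)]
bronze = [(1.0, 1.0), (0.7, 0.8), (0.4, 0.5), (0.0, 0.1)]
```

A scheduler can then shift resources toward VMs whose class loses the most utility (and revenue) per unit taken away, which is the economic intuition behind the proposal.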


Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Environmental Engineering


Dissertation submitted to the Faculty of Sciences and Technology of the New University of Lisbon to obtain the degree of Master in Environmental Management Systems


The increasing integration of larger amounts of wind energy into power systems raises important operational issues, such as the balance between power generation and demand. Pumped storage hydro (PSH) units are one possible solution to mitigate this problem, since they can store the excess energy in periods of higher generation and lower demand. However, the behaviour of a PSH unit may differ considerably from what is expected in terms of wind power integration when it operates in a liberalized electricity market under a price-maker context. In this regard, this paper models and computes the optimal weekly PSH scheduling under price-taker and price-maker scenarios, both when the PSH unit operates standalone and when it is integrated in a portfolio of other generation assets. Results show that the price-maker standalone PSH will integrate less wind power than in the price-taker situation. Moreover, when the PSH unit is integrated in a portfolio with a base load power plant, the price elasticity of demand may completely change the operational profile of the PSH unit.
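As a toy, price-taker illustration of the storage arbitrage underlying PSH scheduling (not the paper's price-maker weekly model; the prices and efficiency below are invented), one can pair cheap pumping hours with expensive generating hours for as long as the price spread covers the round-trip loss:

```python
# Toy price-taker pumped-storage arbitrage: pump in cheap hours, generate in
# expensive hours, keeping a pump/generate pair only while the price spread
# beats the round-trip efficiency loss. Prices are invented; the paper's model
# (price-maker behaviour, weekly horizon, portfolio effects) is far richer.

def schedule_psh(prices, efficiency):
    """Return (pump_hours, gen_hours) index lists for a single reservoir cycle."""
    order = sorted(range(len(prices)), key=lambda h: prices[h])
    pump, gen = [], []
    lo, hi = 0, len(order) - 1
    while lo < hi:
        cheap, dear = order[lo], order[hi]
        # Buying 1 MWh at the cheap hour returns `efficiency` MWh at the dear hour.
        if prices[dear] * efficiency <= prices[cheap]:
            break  # the spread no longer covers the round-trip loss
        pump.append(cheap)
        gen.append(dear)
        lo, hi = lo + 1, hi - 1
    return pump, gen

pump, gen = schedule_psh([20, 60, 30, 90, 25, 80], efficiency=0.75)
```

A price-maker unit cannot reason this greedily, because its own pumping raises the cheap prices and its own generation lowers the expensive ones, which is exactly why the paper's results differ between the two contexts.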


Master's degree in Electrical and Computer Engineering. Specialization area: Autonomous Systems


On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shortening the time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG, IJTAG. The controllability and observability features provided by OCD infrastructures offer a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment by spreading their cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault-tolerant architectures, and in particular the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study was captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the winIDEA 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL and the results were obtained by simulation within the same fault injection environment. The focus of this paper is on the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.
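As a conceptual sketch of the fault model such injection campaigns typically emulate (a single-event upset; the function name and interface are ours, not part of any OCD toolchain), an injector flips one bit of a target word and writes it back through the debug port:

```python
# Single-event-upset fault model: flip one bit of a target register or memory
# word. An OCD-based injector would read the word over the debug interface,
# apply this transformation, and write it back. Interface names are ours.

def inject_bit_flip(word, bit, width=32):
    """Flip bit `bit` of `word`, keeping the result inside `width` bits."""
    if not 0 <= bit < width:
        raise ValueError("bit index outside word width")
    return (word ^ (1 << bit)) & ((1 << width) - 1)

faulty = inject_bit_flip(0x0000_00FF, bit=7)  # clears bit 7 of the low byte
```

Because XOR is its own inverse, injecting the same fault twice restores the original value, a property injectors exploit to undo a fault after the workload has observed it.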


We consider the two-Higgs-doublet model as a framework in which to evaluate the viability of scenarios in which the sign of the coupling of the observed Higgs boson to down-type fermions (in particular, b-quark pairs) is opposite to that of the Standard Model (SM), while at the same time all other tree-level couplings are close to the SM values. We show that, whereas such a scenario is consistent with current LHC observations, both future running at the LHC and a future e+e− linear collider could determine the sign of the Higgs coupling to b-quark pairs. Discrimination is possible for two reasons. First, the interference between the b-quark and the t-quark loop contributions to the ggh coupling changes sign. Second, the charged-Higgs loop contribution to the γγh coupling is large and fairly constant up to the largest charged-Higgs mass allowed by tree-level unitarity bounds when the b-quark Yukawa coupling has the opposite sign from that of the SM (the change in sign of the interference terms between the b-quark loop and the W and t loops having negligible impact).
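The first discrimination handle can be made explicit. In rescaled-coupling (kappa) notation, which is ours rather than the paper's, the gluon-fusion width has the schematic form

```latex
% Interference between the top- and bottom-quark loops in gg -> h; the
% A_{1/2} are the standard spin-1/2 loop functions. Notation is ours.
\Gamma(h \to gg) \;\propto\;
\left| \kappa_t\, A_{1/2}(\tau_t) + \kappa_b\, A_{1/2}(\tau_b) \right|^2
= \kappa_t^2 \left|A_t\right|^2 + \kappa_b^2 \left|A_b\right|^2
  + 2\,\kappa_t \kappa_b\, \mathrm{Re}\!\left(A_t A_b^*\right),
```

with the shorthand $A_q = A_{1/2}(\tau_q)$. The cross term flips sign under $\kappa_b \to -\kappa_b$ while the squared terms do not, which is why the rate is sensitive to the sign of the b-quark Yukawa coupling.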


OBJECTIVE To estimate the budget impact of incorporating positron emission tomography (PET) in the mediastinal and distant staging of non-small cell lung cancer. METHODS The estimates were calculated by the epidemiological method for the years 2014 to 2018. Nationwide data were used for the incidence; data on the distribution of the disease's prevalence and on the technologies' accuracy were taken from the literature; data regarding the involved costs were taken from a micro-costing study and from the Brazilian Unified Health System (SUS) database. Two strategies for using PET were analyzed: offering it to all newly diagnosed patients, and restricting it to those with negative results in previous computed tomography (CT) exams. Univariate and extreme-scenario sensitivity analyses were conducted to evaluate the influence of sources of uncertainty in the parameters used. RESULTS The incorporation of PET-CT in SUS would imply the need for additional resources of 158.1 BRL (98.2 USD) million for the restricted offer and 202.7 BRL (125.9 USD) million for the inclusive offer over five years, with a difference of 44.6 BRL (27.7 USD) million between the two offer strategies within that period. In absolute terms, the total budget impact of its incorporation in SUS over five years would be 555 BRL (345 USD) and 600 BRL (372.8 USD) million, respectively. The cost of the PET-CT procedure was the most influential parameter in the results. In the most optimistic scenario, the additional budget impact would be reduced to 86.9 BRL (54 USD) and 103.8 BRL (64.5 USD) million, considering PET-CT for negative CT and PET-CT for all, respectively. CONCLUSIONS The incorporation of PET in the clinical staging of non-small cell lung cancer seems financially feasible considering the high budget of the Brazilian Ministry of Health. The potential reduction in the number of unnecessary surgeries may allow the available resources to be allocated more efficiently.
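At its core, the epidemiological method multiplies the eligible cases each year by the strategy's uptake and the unit cost, summing over the horizon. A minimal sketch with invented placeholder numbers (not the study's inputs):

```python
# Budget-impact arithmetic of the epidemiological method: annual eligible
# cases x uptake x unit cost, summed over the analysis horizon. All numbers
# in the example call are invented placeholders, not the study's data.

def budget_impact(annual_cases, uptake_by_year, unit_cost):
    """Total incremental budget over the horizon, one exam per eligible case."""
    return sum(annual_cases * uptake * unit_cost for uptake in uptake_by_year)

total = budget_impact(
    annual_cases=20_000,                       # hypothetical eligible cases/year
    uptake_by_year=[0.3, 0.5, 0.7, 0.9, 1.0],  # hypothetical diffusion of PET-CT
    unit_cost=2_000.0,                         # hypothetical cost per exam (BRL)
)
```

The restricted-offer strategy would simply scale `annual_cases` down by the fraction of patients with negative CT results, which is why the two strategies differ by a fixed share of the total.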


We analyse the possibility that, in two Higgs doublet models, one or more of the Higgs couplings to fermions or to gauge bosons change sign relative to the respective Higgs Standard Model couplings. Possible sign changes in the coupling of a neutral scalar to charged ones are also discussed. These wrong signs can have important physical consequences, manifesting themselves in Higgs production via gluon fusion or Higgs decay into two gluons or into two photons. We consider all possible wrong-sign scenarios, and also the symmetric limit, in all possible Yukawa implementations of the two Higgs doublet model, for two different possibilities: the observed Higgs boson is the lightest CP-even scalar, or the heaviest one. We also analyse thoroughly the impact of the currently available LHC data on such scenarios. With all 8 TeV data analysed, all wrong-sign scenarios are allowed in all Yukawa types, even at the 1σ level. However, we show that B-physics constraints are crucial in excluding the possibility of wrong-sign scenarios when tan β is below 1. We also discuss the future prospects for probing the wrong-sign scenarios at the next LHC run. Finally, we present a scenario where the alignment limit could be excluded due to non-decoupling in the case where the heavy CP-even Higgs is the one discovered at the LHC.


In the present paper we focus on the performance of clustering algorithms, using indices of paired agreement to measure the accordance between clusters and an a priori known structure. We specifically propose a method to correct all considered indices for agreement by chance; the adjusted indices are meant to provide a realistic measure of clustering performance. The proposed method enables the correction of virtually any index, overcoming previous limitations known in the literature, and provides very precise results. We use simulated datasets under diverse scenarios and discuss the pertinence of our proposal, which is particularly relevant when poorly separated clusters are considered. Finally, we compare the performance of the EM and KMeans algorithms within each of the simulated scenarios and conclude that EM generally yields the best results.
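The chance-correction scheme referred to above follows the classic template adjusted = (index − expected) / (maximum − expected). A sketch of its best-known instance, the adjusted Rand index, computed from a contingency table of two labelings (this illustrates the template, not the paper's own, more general correction):

```python
# Adjusted Rand index: the Rand-type pair-counting index corrected for the
# agreement expected by chance under a fixed-marginals permutation model.

from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Adjusted Rand index between two labelings of the same items."""
    n = len(labels_a)
    classes_a, classes_b = set(labels_a), set(labels_b)
    # Contingency table: n_ij = items in cluster i of A and cluster j of B.
    table = {(i, j): 0 for i in classes_a for j in classes_b}
    for a, b in zip(labels_a, labels_b):
        table[(a, b)] += 1
    sum_ij = sum(comb(v, 2) for v in table.values())
    sum_i = sum(comb(sum(table[(i, j)] for j in classes_b), 2) for i in classes_a)
    sum_j = sum(comb(sum(table[(i, j)] for i in classes_a), 2) for j in classes_b)
    expected = sum_i * sum_j / comb(n, 2)
    max_index = (sum_i + sum_j) / 2
    # (index - expected) / (max - expected): 1 for identical partitions,
    # about 0 for chance-level agreement.
    return (sum_ij - expected) / (max_index - expected)
```

Identical partitions score exactly 1 regardless of how the clusters are labeled, while random labelings score near 0, which is the "realistic measure" property the adjustment is after.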


OBJECTIVE To estimate the required number of public beds for adults in intensive care units (ICU) in the state of Rio de Janeiro to meet the existing demand, and to compare the results with recommendations by the Brazilian Ministry of Health. METHODS The study uses a hybrid model combining time series and queuing theory to predict the demand and estimate the number of required beds. Four patient flow scenarios were considered according to bed requests, percentage of abandonments and average length of stay in intensive care unit beds. The results were plotted against Ministry of Health parameters. Data were obtained from the State Regulation Center from 2010 to 2011. RESULTS There were 33,101 medical requests for 268 regulated intensive care unit beds in Rio de Janeiro. With an average length of stay in regulated ICUs of 11.3 days, there would be a need for 595 active beds to ensure system stability and 628 beds to ensure a maximum waiting time of six hours. Deducting current abandonment rates due to clinical improvement (25.8%), these figures fall to 441 and 417. With an average length of stay of 6.5 days, the number of required beds would be 342 and 366, respectively; deducting abandonment rates, 254 and 275. The Brazilian Ministry of Health establishes a parameter of 118 to 353 beds. Although the number of regulated beds is within the recommended range, an increase in beds of 122.0% is required to guarantee system stability and of 134.0% for a maximum waiting time of six hours. CONCLUSIONS Adequate bed estimation must consider the reasons for limited timely access and patient flow management, in a scenario that associates prioritization of requests with the lowest average length of stay.
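A plain M/M/c approximation illustrates the queueing step of such an estimate: find the smallest bed count that keeps the offered load stable and the probability of waiting (Erlang-C) acceptably low. This is a simplified sketch, not the paper's hybrid time-series/queueing model, so it will not reproduce the figures above:

```python
# M/M/c sizing sketch: offered load a = arrival rate x average length of stay
# (Little's law), stability requires a < c, and Erlang-C gives P(wait).
# The Erlang-B recurrence avoids the overflow-prone factorial formula.

def erlang_c(c, a):
    """Probability that an arriving request must wait (M/M/c, offered load a)."""
    if a >= c:
        return 1.0  # unstable: the queue grows without bound
    b = 1.0
    for k in range(1, c + 1):          # Erlang-B via its stable recurrence
        b = a * b / (k + a * b)
    rho = a / c
    return b / (1 - rho * (1 - b))     # standard Erlang-B -> Erlang-C conversion

def required_beds(arrivals_per_day, los_days, max_wait_prob):
    """Smallest bed count c with P(wait) <= max_wait_prob."""
    a = arrivals_per_day * los_days
    c = int(a) + 1                      # first stable bed count
    while erlang_c(c, a) > max_wait_prob:
        c += 1
    return c

beds = required_beds(arrivals_per_day=45.3, los_days=11.3, max_wait_prob=0.05)
```

Note how strongly the answer depends on the length of stay: halving it roughly halves the offered load, which mirrors the abstract's 11.3-day versus 6.5-day scenarios.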


Master's Dissertation in Informatics Engineering


Remote laboratories are an emergent technological and pedagogical tool at all education levels, and their widespread use is an important part of their own improvement and evolution. This paper describes several issues encountered in laboratory classes, in higher education courses, when using remote laboratories based on PXI systems, either using the VISIR system or an alternative in-house solution. Three main issues are presented and explained, all reported by teachers who supported students' use of remote laboratories. The first issue deals with the need to allow students to select the actual place where an ammeter is to be inserted in an electric circuit, even incorrectly, thereby emulating real-world difficulties. The second deals with timing problems when several measurements are required at short intervals, as in the discharge cycle of a capacitor. The last issue deals with the use of a multimeter in DC mode when reading AC values, a use that conflicts with the lab settings. All scenarios are presented and discussed, including the solution found for each case. The conclusion derived from the described work is that remote laboratories are an expanding field, where practical use leads to improvement and evolution of the available solutions, requiring close cooperation and information sharing between all actors, i.e. developers, teachers and students.