970 results for "sampling methods"
Abstract:
This paper presents five different clustering methods for identifying typical load profiles of medium voltage (MV) electricity consumers. These methods are intended to be used in a smart grid environment to extract useful knowledge about customers' behaviour. The obtained knowledge can be used to support a decision tool, not only for utilities but also for consumers. Load profiles can be used by utilities to identify the aspects that cause system load peaks and to enable the development of specific contracts with their customers. The framework presented throughout the paper consists of several steps, namely a data pre-processing phase, the application of clustering algorithms and the evaluation of the quality of the partition, supported by cluster validity indices. The process ends with the analysis of the discovered knowledge. To validate the proposed framework, a case study with a real database of 208 MV consumers is used.
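As an illustration of the kind of pipeline the abstract describes, the sketch below normalises synthetic load curves, runs k-means for several partition sizes and scores each partition with the Davies-Bouldin validity index; the synthetic data and the specific algorithm/index choices are assumptions made for the example, not the paper's exact methods.

```python
# Minimal sketch of the described pipeline (not the paper's own code):
# normalise daily load curves, cluster them, and score the partition with a
# validity index. The synthetic data and the k-means / Davies-Bouldin choices
# are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

rng = np.random.default_rng(0)
# 208 consumers x 96 quarter-hourly readings (synthetic stand-in for the MV database)
load_curves = rng.random((208, 96))

# Pre-processing: scale each feature to [0, 1]
X = MinMaxScaler().fit_transform(load_curves)

# Try several partition sizes and keep the one with the best (lowest) validity index
best_k, best_score, best_labels = None, np.inf, None
for k in range(2, 10):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = davies_bouldin_score(X, labels)
    if score < best_score:
        best_k, best_score, best_labels = k, score, labels

# Typical load profile of each cluster = centroid of its members
profiles = np.array([X[best_labels == c].mean(axis=0) for c in range(best_k)])
print(f"chosen k = {best_k}, Davies-Bouldin = {best_score:.3f}")
```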
Abstract:
OBJECTIVE: To obtain population estimates and profile risk factors for infant mortality in two birth cohorts and compare them between cities of different regions in Brazil. METHODS: In Ribeirão Preto, southeast Brazil, infant mortality was determined in a third of hospital live births (2,846 singleton deliveries) in 1994. In São Luís, northeast Brazil, data were obtained using systematic sampling of births stratified by maternity unit (2,443 singleton deliveries) in 1997-1998. Mothers answered standardized questionnaires shortly after delivery, and information on infant deaths was retrieved from hospitals, registries and the State Health Secretary's Office. The relative risk (RR) was estimated by Poisson regression. RESULTS: In São Luís, the infant mortality rate was 26.6/1,000 live births, the neonatal mortality rate was 18.4/1,000 and the post-neonatal mortality rate was 8.2/1,000, all higher than those observed in Ribeirão Preto (16.9, 10.9 and 6.0 per 1,000, respectively). Adjusted analysis revealed that previous stillbirths (RR=3.67 vs 4.13) and maternal age <18 years (RR=2.62 vs 2.59) were risk factors for infant mortality in the two cities. Inadequate prenatal care (RR=2.00) and male sex (RR=1.79) were risk factors in São Luís only, and a dwelling with 5 or more residents was a protective factor (RR=0.53). In Ribeirão Preto, maternal smoking was associated with infant mortality (RR=2.64). CONCLUSIONS: In addition to socioeconomic inequalities, differences in access to and quality of medical care between the cities had an impact on infant mortality rates.
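The relative risks reported above come from a Poisson regression; the sketch below shows, on synthetic data, how such an RR and its confidence interval are obtained as exponentiated model coefficients. statsmodels and the robust-variance option are assumed tooling choices, not the study's software.

```python
# Minimal sketch of estimating a relative risk (RR) by Poisson regression.
# The data are synthetic; statsmodels is an assumed tool choice.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2500
young_mother = rng.integers(0, 2, n)          # exposure: maternal age < 18 years
baseline_risk = 0.015
death = rng.random(n) < baseline_risk * np.where(young_mother == 1, 2.6, 1.0)

X = sm.add_constant(young_mother.astype(float))
# Poisson model for a binary outcome; robust (HC0) variance is commonly used here
model = sm.GLM(death.astype(float), X, family=sm.families.Poisson()).fit(cov_type="HC0")

rr = np.exp(model.params[1])                  # exponentiated coefficient = RR
ci_low, ci_high = np.exp(model.conf_int()[1])
print(f"RR = {rr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```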
Abstract:
Intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making small-scale energy generation and storage decisions relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, must be assured. This paper presents a new approach to solve the economic dispatch problem in smart grids. The proposed methodology for resource management involves two stages. The first one uses fuzzy set theory to define the forecast range of the natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
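As a rough illustration of the two-stage idea (fuzzy forecast ranges followed by a heuristic dispatch), the sketch below uses triangular fuzzy numbers with an alpha-cut and a simple merit-order heuristic for the pessimistic scenario; the membership functions, the alpha level and the unit data are assumptions made for the example, not the paper's formulation.

```python
# Illustrative two-stage sketch: fuzzy forecast ranges, then a heuristic dispatch.
# Triangular membership functions, the alpha-cut level and the greedy merit-order
# heuristic are assumptions made for this example.
def alpha_cut(low, peak, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (peak - low), high - alpha * (high - peak))

# Stage 1: fuzzy forecasts (MW) for wind availability and load, at alpha = 0.7
wind_min, wind_max = alpha_cut(10.0, 18.0, 25.0, 0.7)
load_min, load_max = alpha_cut(60.0, 70.0, 85.0, 0.7)

# Stage 2: heuristic dispatch for the pessimistic scenario
# (highest load, lowest renewable availability)
residual = load_max - wind_min
units = [                            # (name, capacity MW, marginal cost EUR/MWh)
    ("storage_discharge", 5.0, 30.0),
    ("chp", 20.0, 45.0),
    ("diesel", 40.0, 90.0),
]
dispatch = {}
for name, cap, cost in sorted(units, key=lambda u: u[2]):   # cheapest first
    p = min(cap, max(residual, 0.0))
    dispatch[name] = p
    residual -= p

print("residual demand covered:", residual <= 1e-9)
print(dispatch)
```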
Abstract:
In the context of electricity markets, transmission pricing is an important tool to achieve an efficient operation of the electricity system. The electricity market is influenced by several factors; however, transmission network management is one of the most important aspects, because the network is a natural monopoly. Transmission tariffs can help to regulate the market, and for this reason they must follow strict criteria. This paper presents the following methods for tariffing the use of transmission networks by electricity market players: the Postage Stamp method; the MW-Mile method; distribution factors methods; the tracing methodology; Bialek's tracing method; and Locational Marginal Pricing. A nine-bus transmission network is used to illustrate the application of the tariff methods.
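Two of the listed methods are easy to illustrate numerically. The sketch below computes Postage Stamp and MW-Mile allocations for a made-up three-player, two-line system; the total cost, demands, line lengths and flows are all invented for illustration.

```python
# Numeric sketch of two tariff methods on invented data: Postage Stamp
# (allocation proportional to each player's peak demand) and MW-Mile
# (allocation proportional to the MW-km each player imposes on every line).
TOTAL_TRANSMISSION_COST = 1_000_000.0    # EUR/year to recover

players = {"A": 120.0, "B": 60.0, "C": 20.0}    # peak demand in MW

# Postage Stamp: charge proportional to share of total demand, regardless of network use
total_demand = sum(players.values())
postage_stamp = {p: TOTAL_TRANSMISSION_COST * d / total_demand
                 for p, d in players.items()}

# MW-Mile: charge proportional to sum over lines of |flow caused by player| x line length
line_lengths_km = {"L1": 100.0, "L2": 250.0}
flows_mw = {                                     # flow each player causes on each line
    "A": {"L1": 80.0, "L2": 30.0},
    "B": {"L1": 20.0, "L2": 40.0},
    "C": {"L1": 5.0,  "L2": 10.0},
}
mw_mile_use = {p: sum(abs(f) * line_lengths_km[l] for l, f in lines.items())
               for p, lines in flows_mw.items()}
total_use = sum(mw_mile_use.values())
mw_mile = {p: TOTAL_TRANSMISSION_COST * u / total_use
           for p, u in mw_mile_use.items()}

print("Postage Stamp:", {p: round(t) for p, t in postage_stamp.items()})
print("MW-Mile     :", {p: round(t) for p, t in mw_mile.items()})
```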
Abstract:
This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a 15-bid case of Ancillary Services Dispatch in an Electricity Market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is used to demonstrate that the use of meta-heuristics is suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are the most relevant advantages of the meta-heuristics when compared with the Linear Programming approach.
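The Linear Programming benchmark mentioned above can be sketched as a per-service clearing problem: minimise the cost of accepted bids subject to meeting each reserve requirement. The bid prices, capacities and requirements below are invented, and scipy is an assumed tool choice rather than the paper's implementation.

```python
# Minimal LP sketch of ancillary services dispatch: for each service, accept
# the cheapest bids until the requirement is met. All data are invented.
from scipy.optimize import linprog

requirements_mw = {"RegDown": 50, "RegUp": 60, "Spin": 80, "NonSpin": 100}
# bids[service] = list of (capacity MW, price EUR/MW)
bids = {
    "RegDown": [(30, 12.0), (25, 15.0), (20, 18.0)],
    "RegUp":   [(40, 14.0), (30, 16.0), (20, 22.0)],
    "Spin":    [(50, 10.0), (40, 13.0), (30, 17.0)],
    "NonSpin": [(60,  8.0), (50, 11.0), (40, 14.0)],
}

for service, service_bids in bids.items():
    c = [price for _, price in service_bids]        # minimise total cost
    bounds = [(0, cap) for cap, _ in service_bids]   # accept within bid capacity
    # sum of accepted MW >= requirement  <=>  -sum <= -requirement
    A_ub = [[-1.0] * len(service_bids)]
    b_ub = [-requirements_mw[service]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    accepted = [round(x, 1) for x in res.x]
    print(f"{service}: accepted MW per bid = {accepted}, cost = {res.fun:.1f} EUR")
```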
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool, allowing them to consider all the business opportunities and make strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this, the decision support tool must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services, based on California Independent System Operator (CAISO) data, is included in this paper.
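The Genetic Algorithm side of the comparison can also be sketched in a few lines. The fragment below is not the MASCEM implementation; the bid data, penalty weight and GA settings are assumptions made purely to illustrate how a penalty-based GA can clear a single service.

```python
# Very small genetic-algorithm sketch for one ancillary service, in the spirit
# of the GA approach mentioned above (not the MASCEM code). All data and GA
# settings are illustrative assumptions.
import random

random.seed(0)
CAPACITY = [40.0, 30.0, 20.0, 25.0]       # MW offered by each bid
PRICE    = [14.0, 16.0, 22.0, 18.0]       # EUR/MW
REQUIREMENT = 60.0                         # MW to procure
PENALTY = 1000.0                           # cost per MW of unmet requirement

def fitness(x):
    cost = sum(p * q for p, q in zip(PRICE, x))
    shortfall = max(0.0, REQUIREMENT - sum(x))
    return cost + PENALTY * shortfall      # lower is better

def random_individual():
    return [random.uniform(0, c) for c in CAPACITY]

def crossover(a, b):
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(x):
    i = random.randrange(len(x))
    x[i] = random.uniform(0, CAPACITY[i])
    return x

population = [random_individual() for _ in range(40)]
for _ in range(200):                       # generations
    population.sort(key=fitness)
    parents = population[:10]              # simple truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(30)]
    population = parents + children

best = min(population, key=fitness)
print([round(q, 1) for q in best], round(fitness(best), 1))
```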
Abstract:
Objectives: The purpose of this article is to identify the differences between surveys using paper questionnaires and those using online questionnaires. The author draws on long experience with the questions that arise in the development of survey-based research, e.g. the limitations of postal and online questionnaires. Methods: Paper and online questionnaires were used in the physician studies carried out in 1995 (doctors who graduated in 1982-1991), 2000 (graduated in 1982-1996), 2005 (graduated in 1982-2001) and 2011 (graduated in 1977-2006), and in a study of 457 family doctors in 2000. The response rates were 64%, 68%, 64%, 49% and 73%, respectively. Results: The results of the physician studies showed that there were differences between the methods, connected with the use of a paper-based versus an online questionnaire and with the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time, the very low financial resources needed, and the fact that the data were loaded directly into the data analysis software, saving the time and resources associated with the data entry process. Conclusions: The current article helps researchers in planning the study design and choosing the right data collection method.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Tomographic images can be degraded, partly by patient-based attenuation. The aim of this paper is to quantitatively verify the effects of the Chang and CT attenuation correction methods in 111In studies through the analysis of profiles from abdominal SPECT corresponding to an organ with uniform radionuclide uptake, the left kidney.
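The first-order Chang method mentioned above can be illustrated for a uniformly attenuating circular phantom, where the correction factor at a point is the number of projection angles divided by the summed attenuation along the paths to the body contour. The radius, attenuation coefficient and sample points below are assumptions made for the sketch, not the study's data or geometry.

```python
# Sketch of a first-order Chang attenuation-correction factor for a point
# inside a uniformly attenuating circular phantom. Phantom radius, attenuation
# coefficient and points of interest are illustrative assumptions.
import numpy as np

MU = 0.12          # linear attenuation coefficient (1/cm), assumed
RADIUS = 15.0      # phantom radius (cm), assumed
N_ANGLES = 64      # number of projection angles

def path_length_to_edge(point, angle, radius=RADIUS):
    """Distance from `point` inside the circle to its boundary along `angle`."""
    d = np.array([np.cos(angle), np.sin(angle)])
    pd = point @ d
    return -pd + np.sqrt(pd**2 - point @ point + radius**2)

def chang_factor(point):
    """First-order Chang correction: N / sum_m exp(-mu * l_m)."""
    angles = np.linspace(0.0, 2.0 * np.pi, N_ANGLES, endpoint=False)
    att = [np.exp(-MU * path_length_to_edge(point, a)) for a in angles]
    return N_ANGLES / np.sum(att)

print(f"centre : {chang_factor(np.array([0.0, 0.0])):.2f}")   # deepest point
print(f"edge   : {chang_factor(np.array([13.0, 0.0])):.2f}")  # near the surface
```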
Abstract:
In the pharmaceutical industry, the cleaning of equipment and surfaces is very important in the manufacturing/packaging of pharmaceutical products. Possible contaminating residues must be removed from the equipment and surfaces involved in the process. According to Good Manufacturing Practices (GMP), the cleaning procedures and the analytical methods used to determine residue amounts must be validated. The analytical method, combined with the sampling method used to collect the samples, must be subjected to a recovery assay. This work presents an innovative strategy for the cleaning validation of semi-solid pharmaceutical forms. The proposed sampling method consists of collecting a sample directly after manufacturing, with the residue analysis performed directly on that sample. The products chosen to evaluate the strategy were two dermatological medicines, presented as ointments and produced in a multi-product manufacturing unit by Schering Plough Farma / Merck Sharp & Dohme (Cacém, Portugal). For residue quantification, validated spectrophotometric (HPLC) methods, used in the analysis of the finished product, were employed. Cleaning validation was assessed by analysing a known amount of ointment (product B (*)) with the analytical method of the previously manufactured ointment (product A (*)), in order to check for cleaning agent and active substances left after cleaning product A, and vice versa. The residual concentrations of the active substances and of the cleaning agent found after cleaning were null, i.e., below the limit of detection (LOD); the cleaning acceptance criteria used were 6.4 × 10⁻⁴ mg/g for active substance 1 (*), 1.0 × 10⁻² mg/g for active substance 2 (*), 1.0 × 10⁻³ mg/g for active substance 3 (*), and 10 ppm for the cleaning agent. In the recovery assay, results above 70% were obtained for all active substances and for the cleaning agent in both ointments. Before carrying out this recovery assay, the chromatographic conditions of the analytical methods for both products and for the cleaning agent had to be adjusted in order to obtain system suitability values (tailing and resolution factors) within specifications. The precision of the results, reported as relative standard deviation (RSD), was below 2.0%, except in the assays involving active substance 3, whose specification is below 10.0%. The results obtained demonstrate that the cleaning procedures used in this manufacturing unit are effective, thus eliminating cross-contamination.
Abstract:
Introduction - The prevalence of chronic obstructive pulmonary disease (COPD) varies widely around the world. The Burden of Obstructive Lung Disease (BOLD) initiative was developed so that COPD prevalence could be assessed with a standardised methodology. The aim of this study was to estimate the prevalence of COPD in adults aged 40 years or older, in a target population of 2,700,000 inhabitants in the Lisbon region, according to the BOLD protocol. Methods - The sample was drawn by stratified multistage random sampling, with 12 parishes selected. The survey comprised a questionnaire collecting information on risk factors for COPD and self-reported respiratory disease; in addition, spirometry with a bronchodilation test was performed. Results - 710 participants with acceptable questionnaires and spirometry were included. The estimated population prevalence of COPD at GOLD stage I+ was 14.2% (95% CI: 11.1; 18.1) and 7.3% at stage II+ (95% CI: 4.7; 11.3). The unadjusted prevalence was 20.2% (95% CI: 17.4; 23.3) at stage I+ and 9.5% (95% CI: 7.6; 11.9) at stage II+. The prevalence of COPD at GOLD stage II+ increased with age and was higher in males. The estimated prevalence of COPD at GOLD stage I+ was 9.2% (95% CI: 5.9; 14.0) in non-smokers versus 27.4% (95% CI: 18.5; 38.5) in smokers with a smoking history of ≥ 20 pack-years. Weak agreement was found between reported prior medical diagnosis and spirometric diagnosis, with 86.8% underdiagnosis. Conclusions - The finding of an estimated COPD prevalence of 14.2% suggests that this is a common disease in the Lisbon region, yet with a high proportion of underdiagnosis. These data point to the need to raise awareness of COPD among health professionals, as well as the need for greater use of spirometry in primary care.
Abstract:
OBJECTIVE: To examine the prevalence of body dissatisfaction and associated factors in 8- to 11-year-old schoolchildren. METHODS: A cross-sectional study including children aged 8 to 11 years enrolled in public and private schools in Porto Alegre, Southern Brazil, was carried out from August to December 2001. A total of 901 subjects were selected through cluster sampling. Participants answered a questionnaire aimed at measuring body dissatisfaction and self-esteem, with questions about family and social pressures for weight change. Height and weight were measured. The relationship between body dissatisfaction and the variables studied was assessed by logistic regression. RESULTS: The prevalence of body dissatisfaction was 82%. Fifty-five percent of the girls wanted a thinner body size and 28% desired a larger one; the estimates for the boys were 43% and 38%, respectively. Children with the lowest self-esteem (OR=1.80; 95% CI: 1.13-2.89) and those who thought their parents (OR=6.10; 95% CI: 2.95-12.60) and friends (OR=1.81; 95% CI: 1.02-3.20) expected them to be thinner had a higher chance of presenting body dissatisfaction. CONCLUSIONS: Body dissatisfaction was highly prevalent among the evaluated schoolchildren, especially in those with lower self-esteem and those who thought their parents and friends expected them to be thinner.
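The odds ratios reported above are exponentiated logistic-regression coefficients; the sketch below shows the mechanics on synthetic data. The effect size is invented and statsmodels is an assumed tool choice, not the study's software.

```python
# Minimal sketch of how odds ratios come out of a logistic regression:
# exponentiated coefficients with exponentiated confidence limits.
# The data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 900
low_self_esteem = rng.integers(0, 2, n).astype(float)
# synthetic outcome whose log-odds increase with low self-esteem
logit = -0.4 + 0.6 * low_self_esteem
dissatisfied = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

X = sm.add_constant(low_self_esteem)
fit = sm.Logit(dissatisfied, X).fit(disp=0)

odds_ratio = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```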
Abstract:
In real optimization problems, the analytical expression of the objective function is usually not known, nor are its derivatives, or they are too complex. In these cases it becomes essential to use optimization methods in which the calculation of the derivatives, or the verification of their existence, is not necessary: Direct Search Methods, or Derivative-free Methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a bi-objective problem. In this problem a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter dependent than a penalty function. In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behaviour of our algorithm through some examples. The proposed methods were implemented in Java.
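The core of a filter method is its acceptance rule: a trial point, described by its objective value f and aggregated constraint violation h, is accepted only if no stored pair dominates it. The sketch below (in Python rather than the Java used by the authors) shows a minimal filter together with a toy constrained problem; it is illustrative only, not the proposed simplex filter algorithm itself.

```python
# Sketch of the filter acceptance rule at the heart of filter methods:
# a trial point (f, h) is accepted only if no stored pair is at least as good
# in both objective value f and constraint violation h. Toy problem below.
class Filter:
    def __init__(self):
        self.entries = []                    # list of (f, h) pairs

    def accepts(self, f, h):
        """Reject the point if some stored (fk, hk) dominates it."""
        return not any(fk <= f and hk <= h for fk, hk in self.entries)

    def add(self, f, h):
        if self.accepts(f, h):
            # drop entries dominated by the newcomer, then store it
            self.entries = [(fk, hk) for fk, hk in self.entries
                            if not (f <= fk and h <= hk)]
            self.entries.append((f, h))
            return True
        return False

# Toy usage: minimise f(x) = x^2 subject to x >= 1, so h(x) = max(0, 1 - x)
flt = Filter()
for x in [3.0, 0.2, 1.5, 2.0, 1.0]:
    f, h = x**2, max(0.0, 1.0 - x)
    print(x, "accepted" if flt.add(f, h) else "rejected by filter")
```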
Abstract:
Master's degree in Geotechnical Engineering and Geoenvironment
Abstract:
Doctoral thesis, Geology (Hydrogeology), 17 December 2013, Universidade dos Açores.