Abstract:
The prevalence of rubella antibodies was evaluated through a random seroepidemiological survey of 1400 blood samples from 2- to 14-year-old children and 329 samples of umbilical cord serum. Rubella IgG antibodies were detected by ELISA, and the sera were collected in 1987, five years before the mass vaccination campaign with the measles-mumps-rubella vaccine carried out in the city of São Paulo in 1992. A significant increase in the prevalence of rubella infection was observed after 6 years of age, and 77% of the individuals aged 15 to 19 years had detectable rubella antibodies. Moreover, the seroprevalence rose to 90.5% (171/189) in cord serum samples from children whose mothers were 20 to 29 years old, and reached 95.6% in newborns of mothers who were 30 to 34 years old, indicating that a large number of women are infected during their childbearing years. This study confirms that rubella infection represents an important public health problem in the city of São Paulo. The data on the seroprevalence of rubella antibodies before the mass vaccination campaign reflect the baseline immunological status of this population before any intervention and should be used to design an adequate vaccination strategy and to assess the seroepidemiological impact of this intervention.
Abstract:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data.

In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
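As a concrete illustration of the constrained least-squares approach under the linear mixing model, the following minimal sketch (not taken from the chapter; the function names, the weighted sum-to-one trick, and the toy data are illustrative assumptions) estimates the abundance fractions of a single pixel when the endmember signatures are known:

```python
# Minimal sketch of linear-mixing-model unmixing with abundance constraints
# (non-negativity and sum-to-one), assuming the endmember signatures are known.
# Names (M, y, delta) are illustrative, not from the chapter.
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(M, y, delta=1e3):
    """Estimate abundance fractions for one pixel.

    M : (L, p) matrix of endmember signatures (L bands, p endmembers).
    y : (L,) observed pixel spectrum.
    The sum-to-one constraint is enforced approximately by appending a
    heavily weighted row of ones (a common fully constrained LS trick).
    """
    L, p = M.shape
    M_aug = np.vstack([M, delta * np.ones((1, p))])
    y_aug = np.append(y, delta)
    a, _ = nnls(M_aug, y_aug)          # non-negative least squares
    return a

# Toy example: 3 endmembers, 50 bands, pixel = 0.5*e1 + 0.3*e2 + 0.2*e3 + noise
rng = np.random.default_rng(0)
M = np.abs(rng.normal(size=(50, 3)))
a_true = np.array([0.5, 0.3, 0.2])
y = M @ a_true + 0.01 * rng.normal(size=50)
print(unmix_pixel(M, y))               # close to [0.5, 0.3, 0.2]
```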
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance.

Considering the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of purest pixels in the data.

Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, the processing of hyperspectral data, including unmixing, is very often preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced.

This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55].
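As an illustration of the dimensionality reduction step mentioned above, the following sketch (an assumption-laden example, not code from the chapter; array names and sizes are illustrative) projects a hyperspectral cube onto its top principal components via SVD before any unmixing:

```python
# Illustrative sketch: reducing a hyperspectral cube with PCA computed via SVD,
# as commonly done to lower computational cost and improve SNR before unmixing.
import numpy as np

def reduce_dimension(cube, n_components):
    """cube: (rows, cols, bands) hyperspectral image.
    Returns the data projected onto the top principal components."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X_centered = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    scores = X_centered @ Vt[:n_components].T
    return scores.reshape(rows, cols, n_components)

# Example: a random 20x20 cube with 100 bands reduced to 5 components
cube = np.random.rand(20, 20, 100)
print(reduce_dimension(cube, 5).shape)   # (20, 20, 5)
```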
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing the independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations.

The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief review of ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the ICA and IFA limitations in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
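To illustrate why Dirichlet-distributed abundances are a natural choice for such a blind unmixing model, the sketch below (purely illustrative; the parameter values and variable names are assumptions, and this is not the chapter's EM algorithm) generates synthetic linear mixtures whose abundance fractions satisfy positivity and full additivity by construction:

```python
# Minimal sketch: synthetic hyperspectral observations whose abundance
# fractions follow a Dirichlet distribution, which enforces positivity and
# the sum-to-one constraint by construction. All values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_bands, n_endmembers = 1000, 50, 3

M = np.abs(rng.normal(size=(n_bands, n_endmembers)))        # endmember signatures
alpha = np.array([2.0, 3.0, 5.0])                            # Dirichlet parameters
A = rng.dirichlet(alpha, size=n_pixels)                      # (n_pixels, p) abundances
Y = A @ M.T + 0.01 * rng.normal(size=(n_pixels, n_bands))    # noisy linear mixtures

print(A.sum(axis=1)[:5])   # each row sums to 1 (full additivity)
print(Y.shape)             # (1000, 50)
```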
Abstract:
The planet's energy sustainability is a current concern and, in this sense, energy efficiency is essential for reducing consumption in every sector of activity. In the residential sector, inappropriate user behaviour, combined with ignorance of the consumption of the various appliances, is a factor that hinders the reduction of energy consumption. An important tool in this regard is consumption monitoring, namely non-intrusive monitoring, which has economic advantages over intrusive monitoring, although it raises some challenges in load disaggregation. This document therefore addresses non-intrusive monitoring, and a tool for disaggregating residential loads was developed, focusing on appliances with high consumption. To this end, the aggregate electricity, water and gas consumption of six dwellings in the municipality of Vila Nova de Gaia was monitored. By incorporating the water and gas vectors, in addition to electricity, it was shown that the performance of the appliance disaggregation algorithm can increase for appliances that simultaneously use electricity and water, or electricity and gas. Energy efficiency is also a constituent part of this work: energy efficiency measures were implemented for one of the dwellings under study, in order to determine which ones showed the greatest savings potential as well as short payback periods. In general, the proposed objectives were achieved, and it is expected that in the near future non-intrusive consumption monitoring will become a reference solution for the energy sustainability of the residential sector.
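A minimal sketch of the idea of using an additional consumption vector to support disaggregation, assuming a simple edge-detection scheme that is not the thesis's actual algorithm (thresholds, window, and signal names are assumptions):

```python
# Illustrative sketch: using the water-consumption channel to support
# disaggregation of an appliance that draws electricity and water at the same
# time (e.g., a washing machine). A power step is only attributed to that
# appliance when a water-flow step occurs within a small window around it.
import numpy as np

def detect_steps(signal, threshold):
    """Indices where the signal jumps by more than `threshold`."""
    return np.where(np.abs(np.diff(signal)) > threshold)[0]

def coincident_events(power, water, p_thr=500.0, w_thr=0.5, window=3):
    """Power-step indices that have a water-flow step within +/- `window` samples."""
    p_steps = detect_steps(power, p_thr)
    w_steps = detect_steps(water, w_thr)
    return [i for i in p_steps if np.any(np.abs(w_steps - i) <= window)]

# Toy aggregate signals: a 2 kW appliance turning on at t=50 together with water flow
power = np.zeros(200); power[50:120] += 2000.0; power += 100.0   # base load in W
water = np.zeros(200); water[50:120] += 6.0                       # flow in L/min
print(coincident_events(power, water))   # steps near indices 49 and 119
```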
Abstract:
The WSL (Wild São Lorenço) strain of T. cruzi, isolated from a guinea pig from São Lorenço da Mata (Northeastern Brazil), was characterized through analysis of its morphobiological behaviour and isoenzymatic profile. For the study of the morphobiological behaviour, blood trypomastigotes (1 x 10^5) of the WSL strain were inoculated intraperitoneally into Swiss albino mice. The Y strain (Type I) was used as control. During the course of infection the following parameters were analysed: parasitemia, mortality, morphology of the parasites in peripheral blood, and tissue tropism. The isoenzymatic profile was analysed for the enzymes ALAT, GPI and PGM, using the Peruana (Type I), 21SF (Type II) and Colombiana (Type III) strains as reference controls. The WSL strain showed the following biological characteristics: 1) slow multiplication and a parasitemic peak between 21 and 25 days post-infection; 2) mortality of 3.3% at 40 days post-infection; 3) predominance of broad forms in peripheral blood; and 4) myotropism with predominant cardiac involvement. The isoenzymatic analysis showed a zymodeme 2 (Z2) pattern, which corresponds to Type II biological strains. The results show that the WSL strain has low virulence and pathogenicity.
Abstract:
To evaluate whether the intensity of the hepatic granulomatous response induced by S. mansoni eggs plays a role in drug metabolism, mice were infected with 40 cercariae and tested to assess the sodium pentobarbital-induced sleeping time. To decrease the inflammatory reaction, the animals were irradiated with 400 rad or received azathioprine, 20 mg/kg, 3 times a week, for 4 weeks, respectively on or beginning on the 33rd post-infection day. In infected animals receiving azathioprine, the area of the hepatic granulomas was smaller and the sleeping time was similar to that of non-infected ones (controls). In infected and irradiated mice, the granuloma dimensions were similar to those of animals that were only infected; in these two latter groups of animals, the sleeping time was more prolonged than that of the control animals. These results show that: 1) mice with an unaltered hepatic granulomatous reaction show reduced metabolism of sodium pentobarbital; 2) a granulomatous response diminished by azathioprine does not interfere with the capacity to metabolize the anesthetic drug.
Abstract:
Geociências, Museu Nac. Hist. Nat. Univ. Lisboa, nº 2, 35-84
Abstract:
Non-technical losses are not a problem with a trivial solution or of merely regional character, and their minimization helps guarantee investments in product quality and in the maintenance of power systems, in the competitive environment introduced after the period of privatization on the national scene. In this paper, we show how to improve the training phase of a neural network-based classifier using a recently proposed meta-heuristic technique called Charged System Search, which is based on the interactions between electrically charged particles. The experiments were carried out in the context of non-technical losses in power distribution systems, on a dataset obtained from a Brazilian electrical power company, and demonstrated the robustness of the proposed technique against several other nature-inspired optimization techniques for training neural networks. Thus, it is possible to improve some applications on Smart Grids.
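A generic sketch of how such a training phase can be set up (this is not the paper's Charged System Search implementation; a plain random search stands in for the metaheuristic, and the network size and data are assumptions): the classifier's weights are encoded as a single candidate vector, and the training error is the fitness to be minimized.

```python
# Sketch: any population-based metaheuristic (CSS, PSO, ...) only needs a
# fitness function mapping a candidate weight vector to classification error.
import numpy as np

def predict(weights, X, n_hidden=5):
    """Tiny one-hidden-layer network; `weights` is a flat candidate vector."""
    n_in = X.shape[1]
    W1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden : n_in * n_hidden + n_hidden]
    W2 = weights[-(n_hidden + 1) : -1]
    b2 = weights[-1]
    h = np.tanh(X @ W1 + b1)
    return (h @ W2 + b2 > 0).astype(int)      # binary: fraud / regular consumer

def fitness(weights, X, y):
    """Training error to be minimized by the chosen metaheuristic."""
    return np.mean(predict(weights, X) != y)

# Random-search stand-in for the metaheuristic, just to show the interface
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4)); y = (X[:, 0] + X[:, 1] > 0).astype(int)
dim = 4 * 5 + 5 + 5 + 1
best = min((rng.normal(size=dim) for _ in range(300)), key=lambda w: fitness(w, X, y))
print("training error:", fitness(best, X, y))
```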
Abstract:
Demand response has gained increasing importance in the context of competitive electricity markets and smart grid environments. In addition to the importance that has been given to the development of business models for integrating demand response, several methods have been developed to evaluate consumers' performance after participation in a demand response event. The present paper uses those performance evaluation methods, namely customer baseline load calculation methods, to determine the expected consumption in each period of the consumer's historical data. In cases where there is a significant difference between the actual consumption and the estimated consumption, the consumer is identified as a potential cause of non-technical losses. A case study demonstrates the application of the proposed method to real consumption data.
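A simplified sketch of the idea (the averaging rule, tolerance, and data are assumptions; actual customer baseline load methods differ in how the expected consumption is computed):

```python
# Sketch: estimate the expected load per period from the consumer's own
# history, then flag periods where the metered value falls well below the
# estimate as potential non-technical losses.
import numpy as np

def baseline(history):
    """Expected load per period: average over the historical days (rows)."""
    return history.mean(axis=0)

def flag_periods(actual, expected, tolerance=0.30):
    """Periods where actual consumption is more than `tolerance` below expected."""
    return np.where(actual < (1.0 - tolerance) * expected)[0]

# Toy data: 20 historical days x 24 hourly readings, then one day under analysis
rng = np.random.default_rng(2)
history = 1.0 + 0.1 * rng.normal(size=(20, 24))
actual = history.mean(axis=0).copy()
actual[18:22] *= 0.5                             # suspicious evening drop
print(flag_periods(actual, baseline(history)))   # -> [18 19 20 21]
```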
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aim to improve its adequacy for solving the mentioned problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics compared with the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios should be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about a one percent difference in the objective function cost value.
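The following minimal PSO sketch only illustrates the notion of adjusting velocity limits during the search; it is not the ASMPSO itself (the shrinking schedule, coefficients, and toy objective are assumptions, and the real scheduling problem also involves constraint handling):

```python
# Minimal PSO with a velocity limit that shrinks as the search progresses.
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0)):
    rng = np.random.default_rng(3)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)]
    for t in range(iters):
        v_max = (hi - lo) * (0.5 - 0.4 * t / iters)   # velocity limit shrinks over time
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        v = np.clip(v, -v_max, v_max)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

best, cost = pso(lambda z: np.sum(z**2), dim=10)   # toy "operation cost" surrogate
print(cost)
```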
Abstract:
The effects of Corynebacterium parvum on host protection, tissue reaction and "in vivo" chemotaxis in Schistosoma mansoni-infected mice were studied. C. parvum was given intraperitoneally at a dose of 0.7 mg, twice a week (for 4 weeks), thirty days before (prophylactic treatment) or after infection (curative treatment). Host protection was evaluated through the recovery of adult worms by liver perfusion, which was lower in the prophylactic group as compared to the control group (p = 0.018), resulting in 44% protection. The "in vivo" leukocyte response in both the prophylactic and curative groups was higher as compared to the infected/non-treated group (p = 0.009 and p = 0.003, respectively). Tissue reactions were described in the experimental and control groups, but there were no remarkable differences among them. The possible biological implications and the relevance of the findings for the defensive response of the host and the control of schistosomiasis are discussed.
Abstract:
Electricity demand in Brazil has been growing. Some studies estimate that energy consumption will increase by 78% through 2035. Two distinct actions are necessary to meet this growth: the construction of new generating plants and the reduction of electrical losses in the country. As the construction of power plants has a high price, coupled with growing environmental concern, electric utilities are investing in reducing losses, both technical and non-technical. In this context, this paper aims to present an overview of non-technical losses in Brazil and to raise a discussion on the reasons that contribute to energy fraud.
Abstract:
Electric utilities suffer large revenue losses annually due to commercial losses, which are caused mainly by fraud on the part of consumers and by faulty meters. Automatic detection of such losses is a complex problem, given the large number of consumers and the high cost of each inspection, not to mention the wear on the relationship between company and consumer. Given the above, this paper aims to briefly present some methodologies applied by utilities to identify consumer fraud.
Abstract:
The identification of the major agents causing human hepatitis (Hepatitis A, B, C, D and E Viruses) was achieved during the last 30 years. These viruses are responsible for the vast majority of human viral hepatitis cases, but there are still some cases epidemiologically related to infectious agents without any evidence of infection with a known virus, designated non A - E hepatitis. Those cases are considered to be associated with at least three different viruses: 1 - Hepatitis B Virus mutants expressing the surface antigen (HBsAg) with altered epitopes or in low quantities; 2 - another virus, probably associated with enterally transmitted non A-E hepatitis, called Hepatitis F Virus, for which further studies are necessary to better characterize the agent; 3 - Hepatitis G Virus, or GB virus C, recently identified throughout the world (including Brazil) as a Flavivirus responsible for about 10% of parenterally transmitted non A-E hepatitis. Other, still unknown viruses are probably responsible for human hepatitis cases without evidence of infection by any of these viruses, which could be called non A-G hepatitis.
Abstract:
A polymerase chain reaction was carried out to detect pathogenic leptospires isolated from animals and humans in Argentina. Two sets of primers (G1/G2 and B64-I/B64-II), described previously, were used to amplify by PCR a DNA fragment from serogroups belonging to Leptospira interrogans, but did not detect saprophytic strains isolated from soil and water (L. biflexa). This fact represents an advantage, since it makes it possible to differentiate pathogenic from non-pathogenic leptospires in cultures. The sensitivity of this assay was determined, allowing detection of as few as 10 leptospires in the reaction tube. These primer sets generated either a 285 bp or a 360 bp fragment, depending on the pathogenic strain.
Abstract:
Comfort is a necessity for most people. The search for clothing that adapts to environmental conditions has become essential. We want materials that keep us warm or cool, in cold or hot conditions, and that are able to keep us dry if it rains or if we sweat, due to intense activity or simply because it is hot. The main objective of this work was to develop a breathable multilayer structure for later application in a perforated shoe, making it breathable and waterproof. Materials that allow this management of heat and moisture, namely membranes, are already applied in garments and footwear. In this work, several membranes from different manufacturers and made of different materials were presented and tested in order to obtain water vapour transmission values and to classify their breathability relative to a reference membrane. Tests were performed with the membranes alone, laminated, and with two laminated membranes superimposed. It was found that lamination did not substantially reduce the breathability of the membranes, whereas superimposing two membranes reduced their breathability by 35%. The best-performing membrane is made of a polyether block amide polymer (PEBA). Nonwovens impregnated with superabsorbent polymers (SAPs) are still little used in clothing and footwear but show some potential. They can absorb up to 500 times their weight in water, depending on the amount of SAPs with which the nonwoven is impregnated and on the final application. This absorption capacity would be an asset under heavy rain, but, on the other hand, once saturated they do not allow air in or out, which could lead to user discomfort. Finally, a thermal manikin (foot) was used to test different footwear, and it was found that heat and water vapour can only be lost through the sole of the shoe if the sole is perforated and a breathable system is used. In the future, it is intended to apply another nonwoven layer on the other face of the membranes already tested, so as to create a 3-layer system, and to test its breathability. It is also suggested to create a solid, ventilated structure in which to use the SAP-impregnated nonwovens. These structures should then be applied to a shoe with a perforated sole and tested on the thermal manikin.