997 results for "Estimated parameters"
Abstract:
Due to their detrimental effects on human health, scientific interest in ultrafine particles (UFP) has been increasing, but the available information is far from comprehensive. Compared with the rest of the population, the elderly are potentially highly susceptible to the effects of outdoor air pollution. Thus, this study aimed to (1) determine the levels of outdoor pollutants in an urban area with emphasis on UFP concentrations and (2) estimate the respective dose rates of exposure for elderly populations. UFP were continuously measured over 3 weeks at 3 sites in north Portugal: 2 urban (U1 and U2) and 1 rural used as reference (R1). Meteorological parameters and outdoor pollutants including particulate matter (PM₁₀), ozone (O₃), nitric oxide (NO), and nitrogen dioxide (NO₂) were also measured. The dose rates of inhalation exposure to UFP were estimated for three elderly age categories: 64–70, 71–80, and >81 years. Over the sampling period, levels of PM₁₀, O₃, and NO₂ complied with European legislation. Mean UFP concentrations were 1.7 × 10⁴ and 1.2 × 10⁴ particles/cm³ at U1 and U2, respectively, whereas levels at the rural site were 20–70% lower (mean of 1 × 10⁴ particles/cm³). Vehicular traffic and local emissions were the predominant sources of UFP identified at the urban sites. In addition, correlation analysis showed that UFP levels were meteorologically dependent. Exposure dose rates were 1.2- to 1.4-fold higher at the urban sites than at the reference site, with the highest levels noted for adults aged 71–80 years, attributed mainly to higher inhalation rates.
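A minimal sketch of the dose-rate arithmetic described above, assuming the common form dose rate = airborne concentration × inhalation rate; the age-specific inhalation rates below are illustrative placeholders, not the values used in the study.

```python
# Illustrative sketch: inhalation dose rate of UFP for elderly age groups.
# Assumes dose rate = concentration x inhalation rate (no deposition correction);
# inhalation rates below are placeholder values, not those used in the study.

UFP_CONC = {"U1": 1.7e4, "U2": 1.2e4, "R1": 1.0e4}  # particles/cm^3 (study means)

# Hypothetical long-term inhalation rates (m^3/h) per elderly age category.
INHALATION_RATE = {"64-70": 0.50, "71-80": 0.55, ">81": 0.48}

CM3_PER_M3 = 1e6  # unit conversion

def dose_rate(conc_particles_per_cm3: float, inh_rate_m3_per_h: float) -> float:
    """Particles inhaled per hour."""
    return conc_particles_per_cm3 * CM3_PER_M3 * inh_rate_m3_per_h

for site, conc in UFP_CONC.items():
    for age, ir in INHALATION_RATE.items():
        print(f"{site} {age}: {dose_rate(conc, ir):.2e} particles/h")
```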
Abstract:
In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed in order to rank the compounds by priority. As many pharmaceuticals are acids or bases, the multimedia fate model incorporates regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables, as well as the spatial variability of landscape characteristics on the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 each from 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of the freshwater ecotoxicity impact, together with the lack of experimental abiotic degradation data for most compounds, helped in establishing priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of the output results.
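A minimal sketch of the kind of Monte Carlo uncertainty propagation described above, for a single hypothetical compound; the distributions, parameter values, and the product-form impact expression (concentration × fate factor × effect factor) are illustrative assumptions, not the study's actual model.

```python
# Illustrative Monte Carlo propagation: impact = effluent concentration x fate factor x effect factor.
# All distributions and parameter values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Lognormal inputs (illustrative geometric means and geometric standard deviations).
conc = rng.lognormal(mean=np.log(1.0), sigma=np.log(3.0), size=N)     # ug/L in effluent
fate = rng.lognormal(mean=np.log(0.2), sigma=np.log(2.0), size=N)     # dimensionless fate factor
effect = rng.lognormal(mean=np.log(0.05), sigma=np.log(4.0), size=N)  # PAF-based effect factor per ug/L

impact = conc * fate * effect  # relative freshwater ecotoxicity impact score

print("median:", np.median(impact))
print("2.5-97.5% interval:", np.percentile(impact, [2.5, 97.5]))

# Crude contribution of each input to the variance of the log-impact
# (valid here because the inputs are independent and the model is a product).
for name, x in [("concentration", conc), ("fate factor", fate), ("effect factor", effect)]:
    share = np.var(np.log(x)) / np.var(np.log(impact))
    print(f"{name}: {share:.0%} of log-variance")
```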
Abstract:
The effect of organic and conventional agricultural systems on the physicochemical parameters, bioactive compound content, and sensorial attributes of tomatoes ('Redondo' cultivar) was studied. The influence on phytochemical distribution among peel, pulp, and seeds was also assessed. Organic tomatoes were richer in lycopene (+20%), vitamin C (+30%), total phenolics (+24%), and flavonoids (+21%) and had higher (+6%) in vitro antioxidant activity. In the conventional fruits, lycopene was mainly concentrated in the pulp, whereas in the organic ones the peel and seeds contained high levels of bioactive compounds. Only the phenolic compounds had a similar distribution among the different fractions of both types of tomatoes. Furthermore, a sensorial analysis indicated that organic farming improved the gustatory properties of this tomato cultivar.
Abstract:
Motor dysfunction is consistently reported but understudied in schizophrenia. It has been hypothesized that this abnormality may reflect a neurodevelopmental disorder underlying the illness. The main goal of this study was to analyze the movement patterns used by participants with schizophrenia and healthy controls during overarm throwing, using a markerless motion capture system. Thirteen schizophrenia patients and 16 healthy controls performed the overarm throwing task while recorded by the markerless motion capture system. Participants were also examined for the presence of motor neurological soft signs (mNSS) using the Brief Motor Scale. Schizophrenia patients demonstrated a less developed movement pattern, with low individualization of components, compared with healthy controls. The schizophrenia group also displayed a higher incidence of mNSS. The presence of a less mature movement pattern can be an indicator of neuro-immaturity and a marker for atypical neurological development in schizophrenia. Our findings support the understanding of motor dysfunction as an intrinsic part of schizophrenia.
Abstract:
Master's in Chemical Engineering - Environmental Protection Technologies
Abstract:
Master's in Informatics Engineering - Specialization in Graphics Systems and Multimedia
Abstract:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been successfully applied to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of the observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
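For the case where the endmember signatures are known, the constrained least-squares route mentioned above can be illustrated with a small sketch; the endmember matrix below is synthetic, and the sum-to-one constraint is enforced with the usual row-augmentation trick rather than any specific algorithm from the cited references.

```python
# Sketch: fully constrained least-squares unmixing of one pixel (nonnegativity via NNLS,
# sum-to-one via an appended, weighted row of ones). Endmembers here are synthetic.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
L, p = 50, 3                       # number of bands, number of endmembers
M = rng.uniform(0.1, 0.9, (L, p))  # synthetic endmember signature matrix (L x p)

a_true = np.array([0.6, 0.3, 0.1])               # true abundances (nonnegative, sum to one)
y = M @ a_true + 0.001 * rng.standard_normal(L)  # observed pixel spectrum with noise

delta = 10.0                                     # weight of the sum-to-one constraint
M_aug = np.vstack([M, delta * np.ones((1, p))])
y_aug = np.append(y, delta)

a_hat, _ = nnls(M_aug, y_aug)                    # nonnegative least squares
print("estimated abundances:", a_hat)
```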
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR); see the sketch after this paragraph. Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications—namely, signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using the minimum description length (MDL) based algorithm [55].
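A minimal sketch of the dimensionality-reduction step referred to above, projecting a hyperspectral cube onto its leading principal components with a plain SVD; the cube is random and the number of retained components is an arbitrary choice for illustration.

```python
# Sketch: PCA projection of a hyperspectral cube (rows x cols x bands) onto k components.
# Data are random placeholders; k is chosen arbitrarily.
import numpy as np

rng = np.random.default_rng(2)
rows, cols, bands, k = 64, 64, 200, 10

cube = rng.random((rows, cols, bands))
X = cube.reshape(-1, bands)          # one spectral vector per row
X_centered = X - X.mean(axis=0)

# SVD of the centered data; rows of Vt are the principal directions.
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
scores = X_centered @ Vt[:k].T       # projected data, (rows*cols) x k

explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(f"variance retained by {k} components: {explained:.1%}")
```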
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need to have pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
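To make the positivity and constant-sum constraints concrete, the small sketch below draws abundance fractions from a single Dirichlet distribution and generates noisy linear mixtures; it only illustrates the generative model, not the mixture-of-Dirichlet EM inference described in the chapter, and all parameter values are placeholders.

```python
# Sketch of the generative model: Dirichlet-distributed abundances (positive, summing to one)
# mixed linearly through an endmember matrix, plus Gaussian noise. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n_pixels, n_bands, n_end = 1000, 100, 4

M = rng.uniform(0.05, 0.95, (n_bands, n_end))   # synthetic endmember signatures
alpha = np.array([2.0, 2.0, 1.0, 0.5])          # Dirichlet concentration parameters
A = rng.dirichlet(alpha, size=n_pixels)         # abundances: each row >= 0, sums to 1

Y = A @ M.T + 0.005 * rng.standard_normal((n_pixels, n_bands))  # observed spectra

print("row sums (should all be 1):", A.sum(axis=1)[:5])
print("observed data shape:", Y.shape)
```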
Abstract:
A correlation and predictive scheme for the viscosity and self-diffusivity of liquid dialkyl adipates is presented. The scheme is based on the kinetic theory for dense hard-sphere fluids, applied to the van der Waals model of a liquid to predict the transport properties. A "universal" curve for a dimensionless viscosity of dialkyl adipates was obtained using recently published experimental viscosity and density data of compressed liquid dimethyl (DMA), dipropyl (DPA), and dibutyl (DBA) adipates. The experimental data are described by the correlation scheme with a root-mean-square deviation of ±0.34 %. The parameters describing the temperature dependence of the characteristic volume, V₀, and the roughness parameter, R_η, for each adipate are well correlated with one single molecular parameter. Recently published experimental self-diffusion coefficients of the same set of liquid dialkyl adipates at atmospheric pressure were correlated using the characteristic volumes obtained from the viscosity data. The roughness factors, R_D, are well correlated with the same single molecular parameter found for viscosity. The root-mean-square deviation of the data from the correlation is less than 1.07 %. Tests are presented in order to assess the capability of the correlation scheme to estimate the viscosity of compressed liquid diethyl adipate (DEA) in a range of temperatures and pressures by comparison with literature data, and of its self-diffusivity at atmospheric pressure in a range of temperatures. It is noteworthy that no data for DEA were used to build the correlation scheme. The deviations encountered between predicted and experimental data for the viscosity and self-diffusivity do not exceed 2.0 % and 2.2 %, respectively, which are commensurate with the estimated experimental measurement uncertainty in both cases.
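A small sketch of the dimensionless-viscosity step that hard-sphere correlation schemes of this kind typically use (an Assael–Dymond-type reduced viscosity); the numerical inputs are placeholders, and the paper's universal curve, V₀(T) correlations, and roughness parameters are not reproduced here.

```python
# Sketch: reduced (dimensionless) viscosity of the Assael-Dymond hard-sphere type,
# eta* = 6.035e8 * eta * V**(2/3) / sqrt(M*R*T).
# Input values below are placeholders, not data for any of the adipates in the paper.
import math

R = 8.314  # J/(mol K)

def reduced_viscosity(eta_pa_s: float, V_m3_per_mol: float, M_kg_per_mol: float, T_K: float) -> float:
    return 6.035e8 * eta_pa_s * V_m3_per_mol ** (2.0 / 3.0) / math.sqrt(M_kg_per_mol * R * T_K)

# Placeholder example: viscosity 2 mPa.s, molar volume 170 cm^3/mol,
# molar mass 0.174 kg/mol, at 298.15 K.
eta_star = reduced_viscosity(2.0e-3, 170e-6, 0.174, 298.15)
print(f"reduced viscosity eta* = {eta_star:.1f}")

# In such a scheme, eta*/R_eta for all compounds collapses onto a single "universal"
# curve as a function of V/V0(T); R_eta and V0 come from fitting experimental data.
```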
Abstract:
Energy efficiency and concern for sustainability have been gaining prominence in modern society. This work contributes to this trend by evaluating and suggesting changes to the climate-control system of the Biorama building at the Parque Biológico de Vila Nova de Gaia (PBG). First, a physical, chemical, and geographical characterization of the 5 biomes that make up the Biorama was carried out, based on documents provided by the PBG itself, site visits, and measurements of several parameters (temperature, relative humidity, air quality). The thermal balance of the buildings was then performed, in accordance with the legislation in force, using theoretical expressions and concepts. Heating thermal gains of 15811, 10694, 7939, 9233, and 6621 kWh/year were determined for the Tropical Forest, Mesozoic, Dunes, Savannah, and Desert biomes, respectively. Summer thermal gains of 7093, 4798, 3560, 4144, and 2971 kWh were likewise determined for the Tropical Forest, Mesozoic, Dunes, Savannah, and Desert, respectively. The heating thermal loads were 149, 125, 47, 60, and 51 kW in the Tropical Forest, Mesozoic, Dunes, Savannah, and Desert, respectively. The cooling thermal loads were 59, 57, 47, 35, and 36 kW in the Tropical Forest, Mesozoic, Dunes, Savannah, and Desert, respectively. Some solutions are put forward, as well as behavioural alternatives, to correct some of the problems identified. One proposal is the installation of solar panels and heat accumulators, with which a combined average gain of 500 W is estimated in each biome, representing an investment of 1050 euros with a payback of 1 year. Regarding humidity, a more effective use of the existing sprinklers and the use of sponges are suggested to raise the relative humidity above 80%. Conversely, in winter, the use of hygroscopic material is proposed to lower the relative humidity by about 5%. The costs of the supports and the hygroscopic material are around €250. Finally, the installation of a 16,000 BTU air-conditioning unit in the connecting corridor is suggested, as it is the only way to guarantee thermal comfort conditions. This air-conditioning cooling proposal, together with the use of a plastic strip curtain to separate cold and hot air more efficiently, has an approximate cost of €350. The use of canvas covers or a climbing plant (at a cost of €5 per plant) on the south-facing roofs is also suggested, with the corridor area to be fully covered in order to avoid direct solar exposure.
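A minimal sketch of the simple-payback arithmetic behind the solar panel and heat accumulator proposal, assuming the estimated 500 W average gain per biome displaces electric heating; the tariff and the number of heating hours are illustrative assumptions, not figures from the thesis.

```python
# Sketch: simple payback for the solar panel + heat accumulator proposal.
# Assumed tariff and heating hours are illustrative placeholders.
avg_gain_w_per_biome = 500        # from the abstract
n_biomes = 5
investment_eur = 1050             # from the abstract

tariff_eur_per_kwh = 0.15         # assumed electricity price
heating_hours_per_year = 2800     # assumed hours over which the gain displaces heating

energy_saved_kwh = avg_gain_w_per_biome * n_biomes / 1000 * heating_hours_per_year
annual_savings_eur = energy_saved_kwh * tariff_eur_per_kwh
payback_years = investment_eur / annual_savings_eur

print(f"energy displaced: {energy_saved_kwh:.0f} kWh/yr")
print(f"annual savings:   {annual_savings_eur:.0f} EUR/yr")
print(f"simple payback:   {payback_years:.1f} years")  # roughly consistent with the quoted 1-year payback
```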
Abstract:
Demand response is an energy resource that has gained increasing importance in the context of competitive electricity markets and smart grids. New business models and methods designed to integrate demand response in electricity markets and smart grids have been published, reporting the need for additional work in this field. In order to adequately remunerate the participation of consumers in demand response programs, improved methods for evaluating consumers' performance are needed. The methodology proposed in the present paper determines the characterization of the baseline approach that best fits the consumer's historical consumption, in order to determine the expected consumption in the absence of participation in a demand response event and then determine the actual consumption reduction. The defined baseline can then be used to better determine the remuneration of the consumer. The paper includes a case study with real data to illustrate the application of the proposed methodology.
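A minimal sketch of the baseline idea described above, using one common baseline family (a "high 4 of 10" average of recent non-event days) to estimate expected consumption and the achieved reduction; the baseline choice and all numbers are illustrative placeholders, not the paper's methodology.

```python
# Sketch: a "high 4 of 10" baseline for a demand response event.
# Expected consumption = hourly average of the 4 highest-consumption non-event days
# among the last 10; reduction = baseline - metered consumption during the event.
# The baseline family and all numbers are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(4)

# 10 recent non-event days x 24 hourly readings (kWh), plus one event day.
history = 50 + 10 * rng.random((10, 24))
event_day = 50 + 10 * rng.random(24)
event_hours = slice(18, 21)                     # hours 18-20 are the DR event

daily_totals = history.sum(axis=1)
top4 = history[np.argsort(daily_totals)[-4:]]   # 4 highest-consumption days
baseline = top4.mean(axis=0)                    # expected hourly consumption

reduction_kwh = (baseline[event_hours] - event_day[event_hours]).sum()
print(f"estimated reduction during the event: {reduction_kwh:.1f} kWh")
```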
Abstract:
Differences in the virulence of strains of Entamoeba histolytica have long been detected by various experimental assays, both in vivo and in vitro. Discrepancies in strain characterization have arisen when different biological assays are compared. In order to evaluate different parameters of virulence in strain characterization, five strains of E. histolytica, kept under axenic culture, were characterized with respect to their capability to induce hamster liver abscess, their erythrophagocytosis rate, and their cytopathic effect upon VERO cells. Significant correlation was found between the in vitro biological assays, but not between the in vivo and in vitro assays. Good correlation was found between the cytopathic effect and the mean number of ingested erythrocytes, but not with the percentage of phagocytic amoebae, showing that great variability can be observed within the same assay, depending on the variable chosen. It was not possible to correlate isoenzyme and restriction fragment patterns with virulence indexes, since all the strains studied presented pathogenic patterns. The discordant results observed in different virulence assays suggest that virulence itself may not be directly assessed. What is in fact assessed are different biological characteristics or functions of the parasite, rather than virulence itself. These characteristics or functions may or may not be related to the pathogenic mechanisms occurring in the development of invasive amoebic disease.
Abstract:
The present study assessed the clinical significance of hepatitis C virus (HCV) genotypes and their influence on the response to long-term recombinant interferon-alpha (r-IFN-α) therapy in Brazilian patients. One hundred and thirty samples from patients previously genotyped for HCV and with histologically confirmed chronic hepatitis C (CH-C) were evaluated for clinical and epidemiological parameters (sex, age, time of HCV infection, and transmission routes). No difference in disease activity, sex, age, or mode and time of transmission was seen among patients infected with HCV types 1, 2, or 3. One hundred and thirteen of them were treated with 3 million units of r-IFN-α, 3 times a week for 12 months. Initial response (IR) was significantly better in patients with genotype 2 (100%) and 3 (46%) infections than in patients with genotype 1 (29%) (p < 0.005). Among subtypes, a difference in IR was observed between 1b and 2 (p < 0.005), and between 1b and 3a (p < 0.05). Sustained response (SR) was observed in 12% for (sub)type 1a, 13% for 1b, 19% for 3a, and 40% for type 2; significant differences were found between 1b and 2 (p < 0.001), and between 1b and 3a (p < 0.05). Moreover, the presence of cirrhosis was significantly associated with non-response and response with relapse (p < 0.05). In conclusion, non-1 HCV genotype and lack of a histological diagnosis of cirrhosis were the only baseline features associated with sustained response to treatment. These data indicate that HCV genotyping may have prognostic relevance for responsiveness to r-IFN-α therapy in Brazilian patients with chronic HCV infection, as seen in other reports worldwide.
Abstract:
We present a simple, low-cost, and rapid solid-state optical probe for screening chlorpromazine (CPZ) in aquacultures. The method exploits the colourimetric reaction between CPZ and Fe(III) ion that occurs at a solid/liquid interface, the solid layer consisting of ferric iron entrapped in a layer of plasticized PVC. When solutions containing CPZ are dropped onto such a layer, a colour change occurs from light yellow to dark pink or even light blue, depending on the concentration of CPZ. Visual inspection enables the concentration of CPZ to be estimated. The resulting colouration was also monitored by digital image collection for a more accurate quantification. The three coordinates of the hue, saturation, and lightness system were obtained by standard image processing along with mathematical data treatment. The parameters affecting the colour were assessed and optimized. Studies were conducted by both visible spectrophotometry and digital image acquisition. The response of the optimized probe to the concentration of CPZ was tested for several mathematical transformations of the colour coordinates, and a linear relation was found for the sum of hue and lightness. The limit of detection is 50 μM (corresponding to about 16 μg per mL). The probe enables quick screening for CPZ in real water samples with prior sample treatment.
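A minimal sketch of the colour-coordinate calibration described above: converting RGB readings to hue and lightness and fitting a straight line of (hue + lightness) against CPZ concentration; the RGB values and concentrations are synthetic placeholders, not the paper's data.

```python
# Sketch: calibration of (hue + lightness) against CPZ concentration.
# RGB readings and concentrations below are synthetic placeholders.
import colorsys
import numpy as np

# Hypothetical standards: concentration (uM) and mean RGB of the probe spot (0-255).
standards = [
    (50,  (230, 215, 120)),   # light yellow
    (150, (210, 140, 150)),
    (300, (180,  90, 130)),
    (600, (150,  60, 120)),   # dark pink
]

conc = np.array([c for c, _ in standards], dtype=float)
signal = []
for _, (r, g, b) in standards:
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)  # note: HLS order in colorsys
    signal.append(h + l)                                      # hue + lightness
signal = np.array(signal)

slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration
print(f"signal = {slope:.2e} * conc + {intercept:.3f}")

# Unknown sample: predict concentration from its (hue + lightness) value.
h, l, s = colorsys.rgb_to_hls(200 / 255, 120 / 255, 140 / 255)
print(f"estimated CPZ: {((h + l) - intercept) / slope:.0f} uM")
```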
Abstract:
23rd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing (PDP 2015), 4–6 March 2015, Turku, Finland.