890 results for monotone estimating
Abstract:
Copyright © 2014 The Authors. Methods in Ecology and Evolution © 2014 British Ecological Society.
Abstract:
OBJECTIVE: To estimate the prevalence of alcohol abuse/dependence and identify associated factors among demographic, family, socioeconomic and mental health variables. METHODS: A household survey was carried out in the urban area of Campinas, southeastern Brazil, in 2003. A total of 515 subjects aged 14 years or older were randomly selected using a stratified cluster sample. The Self-Report Questionnaire and the Alcohol Use Disorder Identification Test were used in the interview. Prevalences were calculated, and univariate and multivariate logistic analyses were performed by estimating odds ratios and 95% confidence intervals. RESULTS: The estimated prevalence of alcohol abuse/dependence was 13.1% (95% CI: 8.4;19.9) in men and 4.1% (95% CI: 1.9;8.6) in women. In the final multiple logistic regression model, alcohol abuse/dependence was significantly associated with age, income, schooling, religion and illicit drug use. The adjusted odds ratios were significantly higher for the following variables: income between 2,501 and 10,000 dollars (OR=10.29); income above 10,000 dollars (OR=10.20); less than 12 years of schooling (OR=13.42); no religion (OR=9.16) or religion other than Evangelical (OR=4.77); and illicit drug use during lifetime (OR=4.47). Alcohol abuse and dependence patterns differed by age group. CONCLUSIONS: There is a significantly high prevalence of alcohol abuse/dependence in this population. Knowledge of the factors associated with alcohol abuse, and of differences in consumption patterns, should be taken into account in the development of harm reduction strategies.
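The adjusted odds ratios and 95% confidence intervals reported above are conventionally derived from each logistic regression coefficient and its standard error. A minimal sketch of that computation, using illustrative numbers rather than the study's data:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic regression coefficient.

    beta: fitted log-odds coefficient; se: its standard error.
    The CI is exponentiated from the Wald interval on the log scale.
    """
    point = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return point, lower, upper

# Illustrative values only (not taken from the survey):
or_, lo, hi = odds_ratio_ci(beta=1.0, se=0.25)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 2.72 1.67 4.44
```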
Abstract:
This work aims at the environmental diagnosis of the company Lacticínios do Paiva, S.A., the evaluation of the process water and of the industrial wastewater treatment plant (ETARI), and the study of cheese whey fermentation for bioethanol production. In the environmental diagnosis of the company, it was observed that the 18,227,731 litres of milk used annually generate 5,031 t/year of cheese, 7,204 t/year of cheese whey, 74,201 m3/year of liquid effluent, 14 t/year of plastic and 20 t/year of cardboard. The main problems requiring optimization are the recovery of washing water, the evaluation of biogas production in the anaerobic digester, the recovery of the milk volume wasted in the production of long-life fresh cheese, the evaluation of the company's energy efficiency, and the valorization of cream and cheese whey. In this work it was decided to evaluate the possibility of recycling the washing waters, to assess the operation of the ETARI against existing legislation, and to study the possibility of valorizing the cheese whey. In the evaluation of the washing process waters for later recycling, no problem was found regarding pH and suspended solids, so direct recycling can be considered. However, regarding the organic load of the washing waters from the ultrafiltration system for long-life fresh cheese, it was found that these could not be reused, since they show high COD values. For their reuse it would be necessary to remove the COD, a hypothesis that was studied with positive results. It was found that treatment by adsorption on activated carbon, preceded by microfiltration, reduces COD significantly, supporting the recycling of the water, namely for the 1st and 3rd washing waters. The other waters would need more contact time with the activated carbon.
To assess the operation of the ETARI, several of its streams were analyzed, in particular the final effluent, for parameters such as pH, total suspended solids (TSS), chemical oxygen demand (COD), biochemical oxygen demand (BOD5), turbidity, nitrates, total phosphorus, Kjeldahl nitrogen, ammoniacal nitrogen and chlorides. The values observed for the final effluent were: pH between 7.21 and 8.69, TSS between 65.3 and 3,110 mg/L, COD between 92.5 and 711.5 mg/L, BOD5 between 58 and 161 mg/L, NO3- between 10.8 and 106.7 mg/L, total phosphorus between 8.3 and 64.3 mg/L, turbidity between 67.7 and 733.3 FTU, and chlorides between 459.9 and 619.81 mg/L. The analyzed parameters are almost always within the range of values imposed by the Lamego Municipal Council, so the effluent can be discharged into the Cambres Municipal Collector. Regarding the alcoholic fermentation of cheese whey, the yeast Kluyveromyces marxianus degrades practically all of the sugar present in the permeate, producing a reasonable amount of ethanol. When the yeast Saccharomyces cerevisiae was used, ethanol production was very low, as expected, since this yeast has difficulty metabolizing lactose. It was thus found that the best yeast for fermenting cheese whey permeate is Kluyveromyces marxianus, with ethanol production estimated at 150 mg per litre of whey.
Abstract:
Master's dissertation in Accounting and Financial Analysis
Abstract:
Chapter in peer-reviewed book proceedings: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
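The geometric facts that VCA exploits can be illustrated directly: a linear functional over a simplex attains its maximum at a vertex, so projecting mixed pixels onto a direction and taking the extreme recovers an endmember. The sketch below is not the VCA algorithm itself, only a toy demonstration of that property with made-up three-band signatures:

```python
import random

# Three "endmember" signatures (vertices of a simplex) in a 3-band space.
endmembers = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]

def mix(abundances):
    """Linear mixture: a convex combination of the endmember signatures."""
    return tuple(sum(a * e[b] for a, e in zip(abundances, endmembers))
                 for b in range(3))

random.seed(0)
pixels = []
for _ in range(500):
    w = [random.random() for _ in range(3)]
    s = sum(w)
    pixels.append(mix([x / s for x in w]))  # abundances sum to one
pixels.extend(endmembers)                   # pure pixels present in the scene

# Project onto a direction: the extreme of the projection is a vertex.
direction = (0.9, 0.1, 0.2)
proj = [sum(d * p[b] for b, d in enumerate(direction)) for p in pixels]
best = pixels[proj.index(max(proj))]
print(best)  # → (1.0, 0.0, 0.0), the endmember best aligned with the direction
```

VCA iterates this idea with directions chosen orthogonally to the endmembers already found, which is what keeps its complexity low.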
Abstract:
CYCLOTech is a high-tech project based on an innovative method for the direct production of a radiopharmaceutical used in more than 85% of the 35 million nuclear medicine procedures performed yearly worldwide, a market representing globally more than 3 billion euros. The CYCLOTech team has developed an innovative proprietary methodology based on the use of cyclotron centers, formally identified as the clients (there are currently around 450 such centers in operation worldwide), to directly produce and deliver the radiopharmaceutical to the final users at hospitals and other health institutions (estimated at around 25,000 worldwide). The investment still needed to complete the research and technological development (RTD), industrial, regulatory and intellectual property rights (IPR) work and allow market introduction is €4.35M, with a payback of 3 years, an internal rate of return (IRR) of 81.7% and a net present value (NPV) of €60,620,525 (in 2020).
Abstract:
Geostatistics has been successfully used to analyze and characterize the spatial variability of environmental properties. Besides giving estimated values at unsampled locations, it provides a measure of the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. In this work, universal block kriging is used for the first time to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge and estimating dilution. The results demonstrate that geostatistical methodology can provide good estimates of the dispersion of effluents that are very valuable in assessing the environmental impact and managing sea outfalls. Moreover, since accurate measurements of the plume's dilution are rare, these studies might be very helpful in the future to validate dispersion models.
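The "estimate plus accuracy measure" pair that kriging provides can be seen in a minimal example. The sketch below uses ordinary kriging in one dimension (not the universal block kriging of the paper) with an assumed exponential covariance model and invented salinity values, purely to show how the weights, the estimate and the kriging variance come out of one linear system:

```python
import math

def solve(A, b):
    """Tiny Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def cov(h, sill=1.0, rng=10.0):
    """Exponential covariance model (an assumed variogram choice)."""
    return sill * math.exp(-abs(h) / rng)

xs = [0.0, 5.0, 12.0]          # sampled locations along a survey track
zs = [35.1, 34.6, 33.9]        # salinity values (illustrative numbers)
x0 = 6.0                       # unsampled location to estimate

n = len(xs)
# Ordinary kriging system: data covariances plus unbiasedness constraint.
A = [[cov(xs[i] - xs[j]) for j in range(n)] + [1.0] for i in range(n)]
A.append([1.0] * n + [0.0])
b = [cov(xs[i] - x0) for i in range(n)] + [1.0]
sol = solve(A, b)
w, mu = sol[:n], sol[n]

estimate = sum(wi * zi for wi, zi in zip(w, zs))
variance = cov(0) - sum(wi * bi for wi, bi in zip(w, b[:n])) - mu
print(round(estimate, 2), round(variance, 4))
```

The kriging variance is what makes the accuracy maps mentioned above possible; it depends only on the sample geometry and the covariance model, not on the measured values.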
Abstract:
Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution to understand and analyze the timing behaviour of actual systems. However, care must be taken with the obtained outputs, otherwise the results may lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be attached to their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence on the tail of distributions, where the worst-case is expected to be.
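The gap between confidence on a mean and confidence on a tail probability can be made concrete. A minimal sketch, assuming exponentially distributed response times (an arbitrary choice for illustration) and a normal-approximation interval for the deadline-miss proportion:

```python
import math
import random

random.seed(42)
# Simulated response times; the exponential model is an assumption
# made only for this illustration, not a claim about real systems.
samples = [random.expovariate(1.0) for _ in range(10_000)]

deadline = 4.0
misses = sum(1 for x in samples if x > deadline)
p_hat = misses / len(samples)

# Normal-approximation confidence interval for the miss probability.
# For rare events this interval is wide relative to p_hat itself,
# which is exactly the credibility problem discussed above: far more
# samples are needed for the tail than for the mean.
se = math.sqrt(p_hat * (1 - p_hat) / len(samples))
print(f"P(miss) ~ {p_hat:.4f} +/- {1.96 * se:.4f}")
```

With the same 10,000 samples, the mean would be estimated to within about 1% of its value, while the relative uncertainty on this tail probability is roughly an order of magnitude larger.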
Abstract:
Final Master's Project submitted for the degree of Master in Electronics and Telecommunications Engineering
Abstract:
Desertification is a critical issue for Mediterranean drylands. Climate change is expected to aggravate its extension and severity by reinforcing the biophysical driving forces behind desertification processes: hydrology, vegetation cover and soil erosion. The main objective of this thesis is to assess the vulnerability of Mediterranean watersheds to climate change, by estimating impacts on desertification drivers and the watersheds' resilience to them. To achieve this objective, a modeling framework capable of analyzing the processes linking climate and the main drivers is developed. The framework couples different models adapted to different spatial and temporal scales. A new model for the event scale, the MEFIDIS model, is developed with a focus on the particular processes governing Mediterranean watersheds. Model results are compared with desertification thresholds to estimate resilience. This methodology is applied to two contrasting study areas, the Guadiana and the Tejo, which currently have a semi-arid and a humid climate, respectively.
The main conclusions of this work can be summarized as follows:
• hydrological processes show a high sensitivity to climate change, leading to a significant decrease in runoff and an increase in temporal variability;
• vegetation processes appear to be less sensitive, with negative impacts for agricultural species and forests, and positive impacts for Mediterranean species;
• changes to soil erosion processes appear to depend on the balance between changes to surface runoff and vegetation cover, itself governed by the relationship between changes to temperature and rainfall;
• as the magnitude of climate change increases, desertification thresholds are surpassed sequentially, starting with the watersheds' ability to sustain current water demands, followed by the vegetation support capacity;
• the most important thresholds appear to be a temperature increase of +3.5 to +4.5 ºC and a rainfall decrease of -10 to -20%;
• rainfall changes beyond this threshold could lead to severe water stress even if current water uses are moderated, with droughts occurring in 1 out of 4 years;
• temperature changes beyond this threshold could lead to a decrease in agricultural yield accompanied by an increase in soil erosion for croplands;
• combined changes of temperature and rainfall beyond the thresholds could shift both systems towards a more arid state, leading to severe water stress and significant changes to the support capacity for current agriculture and natural vegetation in both study areas.
Abstract:
We prove existence, uniqueness, and stability of solutions of the prescribed curvature problem (u'/sqrt(1 + u'^2))' = a u - b/sqrt(1 + u'^2) in [0, 1], with u'(0) = u(1) = 0, for any given a > 0 and b > 0. We also develop a linear monotone iterative scheme for approximating the solution. This equation has been proposed as a model of the corneal shape in the recent paper (Okrasinski and Plociniczak, Nonlinear Anal., Real World Appl. 13:1498-1505, 2012), where a simplified version obtained by partial linearization has been investigated.
Abstract:
Preemptions account for a non-negligible overhead during system execution. There has been a substantial amount of research on estimating the delay incurred due to the loss of working sets in the processor state (caches, registers, TLBs), and some on avoiding preemptions or limiting the preemption cost. We present an algorithm to reduce preemptions by further delaying the start of execution of high-priority tasks in fixed-priority scheduling. Our approach takes advantage of the floating non-preemptive regions model and exploits the fact that, during the schedule, the relative task phasing differs from the worst-case scenario in terms of admissible preemption deferral. Furthermore, approximations that reduce the complexity of the proposed approach are presented. A substantial set of experiments demonstrates that the approach and its approximations improve over existing work, in particular for high-utilisation systems, where savings of up to 22% in the number of preemptions are attained.
Abstract:
Cardiovascular diseases are the leading cause of death worldwide and in Portugal. Associated risk factors (RF) include male sex, advanced age, arterial hypertension, hypercholesterolemia, smoking, obesity and a sedentary lifestyle, whose synergy amplifies cardiovascular risk. A screening was carried out among individuals from the northern region of Portugal, with the objective of determining, using the chart derived from the SCORE project, the Absolute Cardiovascular Risk, the Relative Cardiovascular Risk and the Absolute Cardiovascular Risk Projected to age 60. Several RF were found to be present in the study sample. Risk assessment allows the interaction of individual RF to be estimated, supporting the definition of intervention strategies with potential health gains.
Abstract:
In this thesis we implement estimating procedures in order to estimate threshold parameters for continuous-time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time series context, that is, conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models such as the ones built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data, and the least squares estimation procedure is also implemented for real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company. The data for both funds are daily prices from the year 2004. The last fund considered is the Converging Europe Bond fund from the Schroder company, with daily prices from the year 2005.
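The conditional least squares idea for threshold estimation can be sketched compactly: simulate a diffusion whose drift switches at a threshold, then grid-search the threshold that minimizes the within-regime sum of squared increment deviations. This is a simplified illustration with invented parameters, not the thesis's estimator:

```python
import random

random.seed(1)
dt, sigma = 0.01, 0.5
mu_low, mu_high, true_r = 2.0, -2.0, 0.0   # illustrative parameters

# Euler discretization of a two-regime (threshold) drift diffusion:
# drift is mu_low below the threshold and mu_high above it.
x, path = 0.0, [0.0]
for _ in range(20_000):
    mu = mu_low if x < true_r else mu_high
    x += mu * dt + sigma * random.gauss(0.0, dt ** 0.5)
    path.append(x)

def cls_sse(r):
    """Conditional least squares objective for candidate threshold r:
    fit a constant drift per regime, return the residual sum of squares."""
    lo = [path[i + 1] - path[i] for i in range(len(path) - 1) if path[i] < r]
    hi = [path[i + 1] - path[i] for i in range(len(path) - 1) if path[i] >= r]
    sse = 0.0
    for grp in (lo, hi):
        if grp:
            m = sum(grp) / len(grp)          # per-regime drift estimate
            sse += sum((d - m) ** 2 for d in grp)
    return sse

grid = [i / 10 for i in range(-20, 21)]      # candidate thresholds
r_hat = min(grid, key=cls_sse)
print(r_hat)                                  # close to true_r = 0.0
```

The per-regime mean increment divided by dt also recovers the drift parameters, which is why the same objective serves for joint drift/threshold estimation.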