988 results for Normally Complemented Subgroups
Abstract:
Master's degree in Radiotherapy
Abstract:
Inventories and vertical distribution of ¹³⁷Cs were determined in undisturbed soils of the La Plata region, Argentina. A mean inventory value of 891 ± 220 Bq/m² was established, which is compatible with the values expected from atmospheric weapon test fallout. The study was complemented with pH, organic carbon fraction, texture and mineralogical soil analyses. By pooling Southern Hemisphere ¹³⁷Cs inventory data, it is possible to correlate these data with mean annual precipitation. The large differences in ¹³⁷Cs concentration profiles were attributed to soil properties, especially the clay content and the pH values. A convection-dispersion model with irreversible retention was used to fit the activity concentration profiles. The obtained effective diffusion coefficients and effective convection velocities were in the ranges 0.2 cm²/y to 0.4 cm²/y and 0.23 cm/y to 0.43 cm/y, respectively. These data are in agreement with values reported in the literature. In general, with increasing clay content in the soil, there was an increase in the transfer rate from the free to the bound state. Finally, the highest transfer rate from the free to the bound state was obtained for a soil pH value equal to 8.
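The fitted transport model can be illustrated numerically. The following is a minimal finite-difference sketch, not the authors' fitting code: the grid, time step and default parameter values are illustrative choices taken from within the reported ranges, and the irreversible retention is modelled as a first-order transfer from a mobile (free) to an immobile (bound) state.

```python
import numpy as np

def cs137_profile(D=0.3, v=0.33, k=0.02, years=50.0, depth=30.0,
                  nx=300, dt=0.01):
    """Explicit finite-difference sketch of a convection-dispersion model
    with irreversible retention for a surface-deposited tracer.

    D : effective diffusion coefficient [cm^2/y] (illustrative default)
    v : effective convection velocity   [cm/y]   (illustrative default)
    k : free-to-bound transfer rate     [1/y]    (hypothetical value)
    Returns the depth grid [cm] and the total (free + bound) profile for
    a unit surface deposition.
    """
    dx = depth / nx
    x = (np.arange(nx) + 0.5) * dx
    free = np.zeros(nx)
    free[0] = 1.0 / dx              # unit deposition in the surface cell
    bound = np.zeros(nx)
    for _ in range(int(years / dt)):
        # central diffusion with zero-flux top and bottom boundaries
        lap = np.zeros(nx)
        lap[1:-1] = (free[2:] - 2.0 * free[1:-1] + free[:-2]) / dx**2
        lap[0] = (free[1] - free[0]) / dx**2
        lap[-1] = (free[-2] - free[-1]) / dx**2
        # upwind convection (downward v > 0), no inflow at the surface
        adv = np.zeros(nx)
        adv[1:] = (free[1:] - free[:-1]) / dx
        adv[0] = free[0] / dx
        transfer = k * free          # irreversible retention term
        free = free + dt * (D * lap - v * adv - transfer)
        bound = bound + dt * transfer
    return x, free + bound
```

Increasing the transfer rate `k` moves activity into the immobile bound state closer to the surface, which is qualitatively how the higher retention reported for clay-rich, high-pH soils would appear in this sketch.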
Abstract:
OBJECTIVE: To analyze alcohol and tobacco use among Brazilian adolescents and identify higher-risk subgroups. METHODS: A systematic review of the literature was conducted. Searches were performed using four databases (LILACS, MEDLINE/PubMed, Web of Science, and Google Scholar), specialized websites and the references cited in retrieved articles. The search was done in English and Portuguese and there was no limit on the year of publication (up to June 2011). From the search, 59 studies met all the inclusion criteria: to involve Brazilian adolescents aged 10-19 years; to assess the prevalence of alcohol and/or tobacco use; to use questionnaires or structured interviews to measure the variables of interest; and to be a school- or population-based study that used methodological procedures to ensure representativeness of the target population (i.e., random sampling). RESULTS: The prevalence of current alcohol use (at the time of the investigation or in the previous month) ranged from 23.0% to 67.7%. The mean prevalence was 34.9% (reflecting the central trend of the estimates found in the studies). The prevalence of current tobacco use ranged from 2.4% to 22.0%, and the mean prevalence was 9.3%. A large proportion of the studies estimated prevalences of frequent alcohol use (66.7% of studies) and heavy alcohol use (36.8% of studies) of more than 10%. However, most studies found prevalences of frequent and heavy tobacco use of less than 10%. The Brazilian literature has highlighted that environmental factors (religiosity, working conditions, and substance use among family and friends) and psychosocial factors (such as conflicts with parents and feelings of negativity and loneliness) are associated with tobacco and alcohol use among adolescents. CONCLUSIONS: The results suggest that consumption of alcohol and tobacco among adolescents has reached alarming prevalences in various localities in Brazil.
Since unhealthy behavior tends to continue from adolescence into adulthood, public policies aimed at reducing alcohol and tobacco use among Brazilians over the medium and long term should target young people, and especially the subgroups at higher risk of such behavior.
Abstract:
Scientific dissertation submitted for the degree of Master in Civil Engineering, Structures profile
Abstract:
Project work submitted for the degree of Master in Civil Engineering, specialization in Structures
Abstract:
Medical imaging is a powerful diagnostic tool. Consequently, the number of medical images taken has increased vastly over the past few decades. The most common medical imaging techniques use X-radiation as the primary investigative tool. The main limitation of using X-radiation is the associated risk of developing cancer. Alongside this, technology has advanced and more centres now use CT scanners; these can incur significant radiation burdens compared with traditional X-ray imaging systems. The net effect is that the population radiation burden is rising steadily. Risk arising from X-radiation for diagnostic medical purposes needs minimising, and one way to achieve this is by reducing radiation dose whilst optimising image quality. All ages are affected by risk from X-radiation; however, the increasing population age highlights the elderly as a group that may require particular consideration. Of greatest concern are paediatric patients: firstly, they are more sensitive to radiation; secondly, their younger age means that the potential detriment to this group is greater. Containment of radiation exposure falls to a number of professionals within medical fields, from those who request imaging to those who produce the image. These staff are supported in their radiation protection role by engineers, physicists and technicians. It is important to realise that radiation protection is currently a major European focus of interest, and minimum competence levels in radiation protection for radiographers have been defined through the integrated activities of the EU consortium called MEDRAPET. The outcomes of this project have been used by the European Federation of Radiographer Societies to describe the European Qualifications Framework levels for radiographers in radiation protection.
Though variations exist between European countries, radiographers and nuclear medicine technologists are normally the professional groups responsible for exposing screening populations and patients to X-radiation. As part of their training they learn fundamental principles of radiation protection and theoretical and practical approaches to dose minimisation. However, dose minimisation is complex: it is not simply about reducing X-radiation without taking into account major contextual factors. These factors relate to the real world of clinical imaging and include the need to measure clinical image quality and lesion visibility when applying X-radiation dose reduction strategies. This requires the use of validated psychological and physics techniques to measure clinical image quality and lesion perceptibility.
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Supervision in Education
Abstract:
Biophysical Chemistry 110 (2004) 83–92
Abstract:
There is no single definition of a long-memory process. Such a process is usually defined as a series whose correlogram decays slowly, or whose spectrum is infinite at frequency zero. A series with this property is also said to be characterized by long-range dependence and non-periodic long cycles; alternatively, this feature is said to describe the correlation structure of a series at long lags, or it is conventionally expressed in terms of a power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of financial asset price time series. First, the lack of consistency among results calls for new studies and the use of several complementary methodologies. Second, confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may thus produce abnormal returns. The aim of the dissertation's research is twofold. On the one hand, it seeks to provide additional knowledge for the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices.
On the other hand, it seeks to contribute to improving the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates that long-maturity treasury bonds (OTs) can be used as an alternative in computing market returns, since their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors assess the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of long memory in the markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) framework, to estimate the Hurst exponent H for the full data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH). In terms of a single conclusion from all methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and hence any classification, depends on each particular market.
Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random walk hypothesis with i.i.d. increments, which underlies the weak form of the EMH. Accordingly, contributions to improving the CAPM are proposed through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (in which H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (in which H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the appropriate reference for expressing risk in models applicable to data series that follow i.i.d. processes as well as processes with nonlinear dependence. These formulations therefore include the EMH as a possible special case.
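The rescaled-range estimator of the Hurst exponent referred to above can be sketched as follows. This is a simplified illustration, not the dissertation's implementation: finite-sample bias corrections are omitted, and only dyadic window sizes are used.

```python
import numpy as np

def hurst_rs(returns, min_n=8):
    """Estimate the Hurst exponent H of a return series by rescaled-range
    (R/S) analysis: regress log(R/S) on log(n) over window sizes n.
    H ~ 0.5 suggests uncorrelated increments (gBm, weak-form EMH);
    H > 0.5 suggests persistence; H < 0.5 anti-persistence."""
    x = np.asarray(returns, dtype=float)
    N = len(x)
    sizes, rs_means = [], []
    n = min_n
    while n <= N // 2:
        m = N // n
        chunks = x[: m * n].reshape(m, n)
        dev = chunks - chunks.mean(axis=1, keepdims=True)
        z = dev.cumsum(axis=1)                 # cumulative deviations
        R = z.max(axis=1) - z.min(axis=1)      # range per window
        S = chunks.std(axis=1, ddof=0)         # standard deviation per window
        ok = S > 0                             # guard against constant windows
        if ok.any():
            sizes.append(n)
            rs_means.append((R[ok] / S[ok]).mean())
        n *= 2
    # slope of log(R/S) versus log(n) is the H estimate
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope
```

For white noise the estimate comes out near 0.5 (slightly above, due to the well-known small-sample bias of R/S), while strongly anti-persistent series yield clearly lower values, which is the qualitative ordering the dissertation's classification relies on.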
Abstract:
Master's degree in Pre-School Education
Abstract:
Final Master's project submitted for the degree of Master in Mechanical Engineering
Abstract:
Environmental monitoring has an important role in occupational exposure assessment. However, due to several factors, it is done with insufficient frequency and normally does not give the information needed to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks performed in each workplace and conducting a task-based exposure assessment helps to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability than assessing personal exposures with continuous 8-hour time-weighted average measurements. Health effects related to particle exposure have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies have supported the idea that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (bakery, grill house, cork industry and horse stable), applying these two resources: task-based exposure assessment and particle number concentration by size. The task-based approach permitted identification of the tasks with higher exposure to the smallest particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
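As a toy illustration of the task-based idea, task-level concentrations can be combined into a shift average while ranking each task's contribution to the cumulative exposure. The task names, durations and concentration values below are hypothetical, not measurements from the study.

```python
def task_weighted_average(tasks):
    """Combine task-level particle number concentrations into a full-shift
    time-weighted average and rank each task's contribution to it.
    tasks: list of (name, duration_h, concentration) tuples (hypothetical data)."""
    total_h = sum(d for _, d, _ in tasks)
    twa = sum(d * c for _, d, c in tasks) / total_h
    # fraction of the cumulative exposure (duration x concentration)
    # attributable to each task, largest contributor first
    exposure = sum(d * c for _, d, c in tasks)
    contrib = sorted(((name, d * c / exposure) for name, d, c in tasks),
                     key=lambda t: t[1], reverse=True)
    return twa, contrib

# Hypothetical bakery shift: a short high-concentration task can dominate
# the shift exposure even though it is a minority of the working time.
shift = [("oven work", 4.0, 100.0),
         ("flour sieving", 2.0, 400.0),
         ("packing", 2.0, 50.0)]
twa, ranking = task_weighted_average(shift)
```

Here the 2-hour sieving task contributes the majority of the cumulative exposure, which is exactly the kind of priority a continuous 8-hour average would obscure.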
Abstract:
OBJECTIVE To analyze the association between concentrations of air pollutants and admissions for respiratory causes in children. METHODS Ecological time series study. Daily figures for hospital admissions of children aged < 6 years, and daily concentrations of air pollutants (PM10, SO2, NO2, O3 and CO), were analyzed in the Região da Grande Vitória, ES, Southeastern Brazil, from January 2005 to December 2010. For statistical analysis, two techniques were combined: Poisson regression with generalized additive models and principal component analysis. These techniques complemented each other and provided more significant estimates of relative risk. The models were adjusted for temporal trend, seasonality, day of the week, meteorological factors and autocorrelation. In the final adjustment of the model, it was necessary to include autoregressive moving average ARMA(p, q) models in the residuals in order to eliminate the autocorrelation structures present in the components. RESULTS For every 10.49 μg/m³ increase (interquartile range) in levels of the pollutant PM10 there was a 3.0% increase in the relative risk estimated using the generalized additive model with principal components and a seasonal autoregressive term, while in the usual generalized additive model the estimate was 2.0%. CONCLUSIONS Compared with the usual generalized additive model, the proposed generalized additive model with principal component analysis in general showed better results in estimating relative risk and quality of fit.
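The core of the combined approach, replacing correlated pollutant series by their principal components inside a Poisson regression, can be sketched as follows. This is a simplified illustration with simulated data, not the paper's model: the smooth terms for trend, seasonality and weather and the ARMA residual correction are omitted, and all variable names are assumptions.

```python
import numpy as np

def poisson_pca_fit(counts, pollutants, n_comp=2, iters=50):
    """Fit a Poisson log-linear model of daily admission counts on the
    leading principal components of standardized pollutant series,
    using iteratively reweighted least squares (IRLS)."""
    # standardize pollutants, then PCA via SVD (rows of Vt are loadings)
    Z = (pollutants - pollutants.mean(axis=0)) / pollutants.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    pcs = Z @ Vt[:n_comp].T                            # component scores
    X = np.column_stack([np.ones(len(counts)), pcs])   # intercept + PCs
    beta = np.zeros(X.shape[1])
    for _ in range(iters):                             # IRLS for Poisson GLM
        mu = np.exp(X @ beta)
        z = X @ beta + (counts - mu) / mu              # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    mu = np.exp(X @ beta)
    return beta, mu
```

Because the components are orthogonal, this sidesteps the multicollinearity between pollutant series that destabilizes coefficient estimates in the usual formulation; a full analysis would add the smooth confounder terms before interpreting relative risks.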
Abstract:
OBJECTIVE To analyze the access and utilization profile of biological medications for psoriasis provided by the judicial system in Brazil. METHODS This is a cross-sectional study. We interviewed a total of 203 patients with psoriasis who were on biological medications obtained through the judicial system of the State of Sao Paulo, from 2004 to 2010. Sociodemographic, medical, and political-administrative characteristics were complemented with data obtained from dispensation orders that included biological medications to treat psoriasis and the legal actions involved. The data were analyzed using an electronic database and shown as simple variable frequencies. The prescriptions contained in the lawsuits were analyzed according to legal provisions. RESULTS A total of 190 lawsuits requesting several biological drugs (adalimumab, efalizumab, etanercept, and infliximab) were analyzed. Patients obtained these medications as a result of injunctions (59.5%) or without having ever demanded biological medication from any health institution (86.2%), i.e., public or private health services. They used the prerogative of free legal aid (72.6%), even though they were represented by private lawyers (91.1%) and treated in private facilities (69.5%). Most of the patients used a biological medication for more than 13 months (66.0%), and some patients were undergoing treatment with this medication when interviewed (44.9%). Approximately one third of the patients discontinued treatment due to worsening of their illness (26.6%), adverse drug reactions (20.5%), lack of efficacy, or because the doctor discontinued the medication (13.8%). None of the analyzed medical prescriptions matched the legal prescribing requirements.
Clinical monitoring results showed that 70.3% of the patients had not undergone laboratory examinations (blood work, liver and kidney function tests) for treatment control purposes. CONCLUSIONS The plaintiffs resorted to legal action to get access to biological medications because they were either unaware of, or had difficulty in accessing, the institutional public health system procedures for obtaining them. Access by means of legal action facilitated long-term use of this type of medication through irregular prescriptions and led to a high rate of adverse drug reactions as well as inappropriate clinical monitoring.
Abstract:
Penalty and barrier methods are normally used to solve constrained nonlinear optimization problems. Such problems appear in areas such as engineering and are often characterised by the fact that the functions involved (objective and constraints) are non-smooth and/or their derivatives are not known. This means that optimization methods based on derivatives cannot be used. A Java-based API including only derivative-free optimization methods was implemented to solve both constrained and unconstrained problems; it includes penalty and barrier methods. In this work a new penalty function, based on fuzzy logic, is presented. This function imposes a progressive penalization on solutions that violate the constraints: a low penalization when the violation of the constraints is low and a heavy penalization when the violation is high. The value of the penalization is not known beforehand; it is the outcome of a fuzzy inference engine. Numerical results comparing the proposed function with two of the classic penalty/barrier functions are presented. From the results presented one can conclude that the proposed penalty function, besides being very robust, also exhibits very good performance.
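A progressive, fuzzy-style penalty of the kind described can be sketched as follows. This is an illustrative two-rule inference with hypothetical membership ramps and rule weights, written in Python rather than the paper's Java API, and it is not the authors' rule base.

```python
def fuzzy_penalty(violation, low_cut=0.1, high_cut=1.0,
                  weak=1.0, strong=100.0):
    """Two-rule fuzzy sketch mapping constraint violation to a penalty.
    Rule 1: violation is LOW  -> penalty coefficient is WEAK
    Rule 2: violation is HIGH -> penalty coefficient is STRONG
    Membership degrees are linear ramps (hypothetical shapes); the
    defuzzified coefficient is the membership-weighted average of the
    rule outputs, so penalization grows progressively with violation."""
    v = max(0.0, violation)
    # degree to which v is "high": ramp from low_cut up to high_cut
    mu_high = min(1.0, max(0.0, (v - low_cut) / (high_cut - low_cut)))
    mu_low = 1.0 - mu_high
    return (mu_low * weak + mu_high * strong) * v

def penalised_objective(f, constraints, x):
    """Objective plus the fuzzy penalty of each violated constraint,
    for constraints written as g(x) <= 0."""
    return f(x) + sum(fuzzy_penalty(g(x)) for g in constraints)
```

Feasible points are left untouched (zero violation gives zero penalty), mild violations are charged roughly at the weak rate, and large violations are charged at the strong rate, which is the progressive behaviour the abstract attributes to the fuzzy inference engine.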