933 results for Functions of real variables
Abstract:
In memory of our beloved Professor José Rodrigues Santos de Sousa Ramos (1948-2007), whose student João Cabral, one of the authors of this paper, had the honor of being between 2000 and 2006, we wrote this paper following research by experimentation, using new technologies to capture new insight into a problem, as he so much loved to do. His passion was to create new relations between different fields of mathematics. He was a builder of bridges of knowledge, encouraging the birth of new ways to understand this science. One of the areas Sousa Ramos researched was the iteration of maps and the description of their behavior using symbolic dynamics. So, in this issue of this journal, honoring his memory, we use experimental results to find some stable regions of a specific family of real rational maps, the one he worked on with João Cabral. In this paper we describe a parameter space (a,b) for the real rational maps f_{a,b}(x) = (x^2 - a)/(x^2 - b), using tools of dynamical systems such as the study of the critical point orbit and Lyapunov exponents. We give some results regarding the stability of this family of maps under iteration, especially the ones connected to order 3 of iteration. We hope that our results will help to better understand the behavior of these maps, preparing the ground for a more efficient use of Kneading Theory on this family of maps, using symbolic dynamics.
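As a concrete illustration of the kind of numerical experiment described above, the following minimal C sketch iterates the orbit of the critical point x = 0 of f_{a,b} and estimates the Lyapunov exponent as the average of ln|f'(x_i)|, a negative value indicating an attracting (stable) regime. It is not the authors' code; the parameter values, transient length and iteration count are illustrative assumptions.

    #include <stdio.h>
    #include <math.h>

    /* The map f_{a,b}(x) = (x^2 - a)/(x^2 - b) and its derivative
       f'(x) = 2x(a - b)/(x^2 - b)^2; the only critical point is x = 0. */
    static double f(double x, double a, double b)  { return (x*x - a) / (x*x - b); }
    static double df(double x, double a, double b) {
        double d = x*x - b;
        return 2.0 * x * (a - b) / (d * d);
    }

    /* Lyapunov exponent along the critical orbit: discard a transient,
       then average ln|f'(x_i)| over n iterations. */
    static double lyapunov(double a, double b, int transient, int n) {
        double x = 0.0;                        /* start at the critical point */
        for (int i = 0; i < transient; i++) x = f(x, a, b);
        double sum = 0.0;
        for (int i = 0; i < n; i++) {
            sum += log(fabs(df(x, a, b)));
            x = f(x, a, b);
        }
        return sum / n;
    }

    int main(void) {
        /* (a, b) chosen arbitrarily for illustration; scanning a grid of
           (a, b) values and plotting the sign of lambda maps out the
           stable regions of the parameter space. */
        printf("lambda(1.5, 0.5) = %f\n", lyapunov(1.5, 0.5, 1000, 100000));
        return 0;
    }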
Abstract:
In this work I set out to build a Real-Time Data Acquisition System via the Parallel Port. To achieve this goal, a bibliographic survey of real-time operating systems was carried out, highlighting and illustrating the most important milestones of their evolution. This survey made it possible to understand why these systems have proliferated given the costs they involve, as a function of their application, as well as the scientific and technological difficulties researchers encountered and successfully overcame. For Linux to behave as a real-time system, it must be configured and extended with a patch such as RTAI or ADEOS. As there are several kinds of solutions for bringing the characteristics inherent to real-time systems to Linux, a study, accompanied by examples, was carried out on the kernel architectures most commonly used to do so. Real-time operating systems offer certain services, features and restrictions that distinguish them from general-purpose operating systems. Given the goal of this work, and supported by examples, we made a short study describing, among other things, how the scheduler works and the concepts of latency and response time. We show that there are only two types of real-time systems: 'hard', with rigid timing constraints, and 'soft', which encompasses firm and soft timing constraints. Tasks were classified according to the types of events that trigger them, highlighting their main characteristics. The real-time system chosen to build the data acquisition system via the parallel port was RTAI/Linux. To better understand its behavior, we studied the services and functions of RTAI. Special attention was given to the communication services between tasks and processes (shared memory and FIFOs), to the scheduling services (types of schedulers and tasks) and to interrupt handling (interrupt service routine - ISR). The study of these services informed the choices made regarding the communication method between tasks and services, as well as the type of task to use (sporadic or periodic). Since in this work the physical medium of communication between the external environment and the hardware is the parallel port, we also needed to understand how this interface works, namely the parallel port configuration registers. It was thus possible to configure it at the hardware level (BIOS) and at the software level (kernel module) in line with the goals of this work, optimizing the use of the parallel port, namely by increasing the number of bits available for reading data. In developing the hard real-time task, the considerations referred to above were taken into account. A sporadic task was developed, since the intent was to read data through the parallel port only when needed (on interrupt), that is, when data was available to read. We also developed an application to visualize the data collected via the parallel port. Communication between the task and the application is ensured through shared memory: while guaranteeing data consistency, it makes communication between Linux processes and the real-time (RTAI) tasks running at kernel level very simple.
To evaluate the performance of the developed system, a soft real-time task was created and its response times were compared with those of the hard real-time task. The timing responses obtained with a logic analyzer, together with graphs built from these data, show and confirm the benefits of the real-time data acquisition system via the parallel port using a hard real-time task.
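For orientation, the standard PC parallel port exposes three registers: data at the base address (conventionally 0x378 for LPT1), status at base+1 (whose nAck line can raise IRQ 7) and control at base+2. The fragment below is a minimal userspace C sketch of reading those registers on Linux/x86 with ioperm()/inb(); it only illustrates the register layout and does not reproduce the RTAI kernel task, interrupt service routine or shared-memory channel developed in the thesis.

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/io.h>          /* ioperm(), inb(); x86 Linux, needs root */

    #define BASE 0x378           /* LPT1 base address (assumed) */

    int main(void) {
        if (ioperm(BASE, 3, 1) != 0) {     /* gain access to the 3 registers */
            perror("ioperm");
            return EXIT_FAILURE;
        }
        unsigned char data   = inb(BASE);      /* 8 data bits */
        unsigned char status = inb(BASE + 1);  /* status bits, incl. nAck */
        printf("data = 0x%02x, status = 0x%02x\n", data, status);
        ioperm(BASE, 3, 0);                    /* release the registers */
        return EXIT_SUCCESS;
    }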
Abstract:
This research and development work is fundamentally based on the concept of Fuzzy Logic Control. Using the Matlab software tools, it was possible to develop a controller based on fuzzy inference capable of controlling any kind of real physical system, regardless of its characteristics. Fuzzy Logic Control is a very particular type of control, as it allows the simultaneous use of numerical data and linguistic variables based on heuristic knowledge of the systems to be controlled. In this way one can quantify, for example, whether a glass is "half full" or "half empty", whether a person is "tall" or "short", whether it is "cold" or "very cold". The PID controller is, without any doubt, the most widely used controller in systems control. Due to its simplicity of construction, its low application and maintenance costs and the results it yields, this controller is the first option when implementing a control loop in a given system. It is characterized by three tuning parameters, namely the proportional, integral and derivative components, which together allow effective tuning of any type of system. In order to automate the controller tuning process, and taking advantage of the best that Fuzzy Control and PID Control have to offer, the two controllers were combined; together, as we shall see later, they produced results that meet the stated objectives. With the aid of Matlab's Simulink, the block diagram of the control system was developed, in which the fuzzy controller has the task of supervising the response of the PID controller, correcting it over the simulation time. The developed controller is called the FuzzyPID Controller. During the practical development of the work, the response of several systems to a unit step input was simulated. The systems studied are mostly real physical systems, representing mechanical, thermal, pneumatic, electrical and other systems, which can be easily described by first-, second- and higher-order transfer functions, with and without delay.
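The supervisory idea can be sketched in a few lines of C: a PID loop whose proportional gain is adjusted online by a fuzzy-style degree of membership of |error| in "large", driving a first-order plant under a unit step. The tanh membership function, the gains and the plant are illustrative assumptions only; the FuzzyPID controller of the thesis was built with Matlab/Simulink and a full fuzzy inference system.

    #include <stdio.h>
    #include <math.h>

    /* PID state: gains plus integral and previous error for the
       integral and derivative terms. */
    typedef struct { double kp, ki, kd, integral, prev_err; } PID;

    static double pid_step(PID *c, double err, double dt) {
        /* Fuzzy-style supervision (assumed rule): the degree to which
           "error is large", modelled by tanh(|err|) in [0, 1), boosts
           the proportional gain, mimicking a linguistic rule base. */
        double large = tanh(fabs(err));
        double kp = c->kp * (1.0 + 0.5 * large);
        c->integral += err * dt;
        double deriv = (err - c->prev_err) / dt;
        c->prev_err = err;
        return kp * err + c->ki * c->integral + c->kd * deriv;
    }

    int main(void) {
        /* First-order plant dy/dt = (-y + u)/tau, unit step setpoint. */
        PID c = { 2.0, 1.0, 0.1, 0.0, 0.0 };
        double y = 0.0, tau = 1.0, dt = 0.01;
        for (int i = 0; i < 1000; i++) {
            double u = pid_step(&c, 1.0 - y, dt);
            y += dt * (-y + u) / tau;
        }
        printf("y after 10 s = %f\n", y);  /* should settle near 1.0 */
        return 0;
    }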
Abstract:
The main goals of the present work are the evaluation of the influence of several variables and test parameters on the melt flow index (MFI) of thermoplastics, and the determination of the uncertainty associated with the measurements. To evaluate the influence of test parameters on the measurement of MFI, the design of experiments (DOE) approach has been used. The uncertainty has been calculated using the "bottom-up" approach given in the Guide to the Expression of Uncertainty in Measurement (GUM). Since no analytical expression relating the output response (MFI) to the input parameters exists, it has been necessary to build mathematical models by fitting the experimental observations of the response variable with respect to each input parameter. Subsequently, the uncertainty associated with the measurement of MFI has been determined by applying the law of propagation of uncertainty to the uncertainty values of the input parameters. Finally, the activation energy (Ea) of the melt flow at around 200 degrees C and its respective uncertainty have also been determined.
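For reference, and assuming uncorrelated input quantities, the GUM law of propagation of uncertainty applied to the fitted models MFI = f(x_1, ..., x_N) takes the standard form

u_c^2(\mathrm{MFI}) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i),

where u(x_i) is the standard uncertainty of the i-th input parameter and the sensitivity coefficients \partial f / \partial x_i are evaluated from the adjusted models at the nominal operating point.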
Abstract:
The main purpose of this work was the development of procedures for the simulation of atmospheric flows over complex terrain using OpenFOAM. To this end, tools and procedures for preprocessing and data extraction were developed apart from this code, and were thereafter applied in the simulation of a real case. For the generation of the computational domain, a systematic method able to translate the terrain elevation model into a native OpenFOAM format (blockMeshDict) was developed. The outcome was a structured mesh in which the user can define the number of control volumes and their dimensions. This procedure was considered to overcome the difficulties of case setup and the high computational effort reported in the literature for snappyHexMesh, the OpenFOAM resource explored until then for this task. The procedures developed for the generation of boundary conditions allowed the automatic creation of idealized inlet vertical profiles, the definition of wall-function boundary conditions and the calculation of internal-field first guesses for the iterative solution process, taking experimental data supplied by the user as input. The applicability of the generated boundary conditions was limited to the simulation of turbulent, steady-state, incompressible and neutrally stratified atmospheric flows, always using RaNS (Reynolds-averaged Navier-Stokes) models. For the modelling of terrain roughness, the developed procedure let the user define idealized conditions, such as a uniform aerodynamic roughness length or a value varying as a function of characteristic topography values, or use real site data; it was complemented by techniques for the visual inspection of the generated roughness maps. The absence of a forest canopy model limited the applicability of this procedure to low aerodynamic roughness lengths. The developed tools and procedures were then applied in the simulation of a neutrally stratified atmospheric flow over the Askervein hill. In the simulations performed, the sensitivity of the solution to different convection schemes, mesh dimensions, ground roughness and formulations of the k-ε and k-ω models was evaluated. When compared to experimental data, the calculated values showed good agreement for the speed-up at the hill top and on the lee side, with a relative error of less than 10% at a height of 10 m above ground level. Turbulent kinetic energy was considered to be well simulated on the windward side and at the hill top, and poorly predicted on the lee side, where a zone of flow separation was also identified. Although more work is needed to evaluate the importance of the downstream recirculation zone for the quality of the results, the agreement between calculated and experimental values and the sensitivity of OpenFOAM to the tested parameters were considered to be generally in line with the simulations reported in the reviewed bibliographic sources.
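For a neutrally stratified surface layer, the idealized inlet profile mentioned above typically follows the logarithmic law

u(z) = \frac{u_*}{\kappa} \ln\!\left( \frac{z + z_0}{z_0} \right),

with u_* the friction velocity, \kappa \approx 0.41 the von Kármán constant and z_0 the aerodynamic roughness length; that the thesis uses exactly this formulation is an assumption here, but it is the standard choice for RaNS atmospheric inflow, together with matching profiles for k and ε (or ω).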
Abstract:
The occurrence of human Toxocara infection was evaluated in 1999 in three neighborhoods on the periphery of the Campinas municipality (Jardim Santa Mônica, Jardim São Marcos and Jardim Campineiro). Forty residences and 138 residents were randomly selected by drawing lots and submitted to a seroepidemiological survey, which included blood collection for the immunoenzymatic detection (ELISA) of anti-Toxocara antibodies and a blood count, and the application of a semi-structured questionnaire for the evaluation of epidemiological data. Significant levels of anti-Toxocara antibodies were detected in 23.9% of the 1999 samples. No significant difference in the frequency of infection according to age was observed. Environmental contamination with Toxocara eggs was observed in 12.3% and 14.0% of 57 soil samples collected in the same region in December 1998 and July 1999, respectively. Univariate analysis and multiple logistic regression of the questionnaire data and serological results suggest a significant influence of socioeconomic variables on the frequency of human Toxocara infection under the conditions prevalent in the study area.
Abstract:
Presented at the 23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4-6 November 2015, Lille, France.
Abstract:
Dissertation submitted to obtain the degree of Doctor in Industrial Engineering
Abstract:
Dissertation submitted to obtain the degree of Master in Chemical and Biochemical Engineering
Abstract:
Dissertation submitted to obtain the degree of Doctor in Climate Change and Sustainable Development Policies
Abstract:
To study the macroeconomic effects of unconventional monetary policy across the different countries of the eurozone, I develop an identification scheme to disentangle conventional from unconventional policy shocks, using futures contracts on overnight interest rates and the size of the European Central Bank balance sheet. Setting these shocks as endogenous variables in a structural vector autoregressive (SVAR) model, along with the CPI and the employment rate, I study the estimated impulse responses of the macroeconomic variables to policy shocks. I find that unconventional policy shocks generated mixed effects on inflation but had a positive impact on employment, with the exception of Portugal, Spain, Greece and Italy, where the employment response is close to zero or negative. The heterogeneity that characterizes the responses shows that the monetary policy measures taken in recent years were not sufficient to stabilize the economies of the eurozone countries under more severe economic conditions.
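In generic form (the abstract does not give the paper's exact specification), the reduced-form VAR stacks the endogenous variables y_t, here the futures-based interest-rate surprises, the ECB balance-sheet size, the CPI and the employment rate, as

y_t = c + \sum_{i=1}^{p} A_i \, y_{t-i} + u_t, \qquad u_t = B \, \varepsilon_t,

where restrictions on B identify the structural shocks \varepsilon_t, separating conventional (interest-rate) from unconventional (balance-sheet) policy, and the impulse response functions trace \partial y_{t+h} / \partial \varepsilon_t across horizons h.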
Abstract:
Introduction: Interleukin (IL)-18 is a well-known major proinflammatory cytokine with broad biological effects. The major immunomodulatory functions of IL-18 include enhancing T cell and natural killer cell cytotoxicity. Serum levels of this cytokine were shown to increase in chronic hepatitis C patients compared to non-infected healthy people. An association between IL-18 gene promoter polymorphisms and pegylated interferon (PEG-IFN) and ribavirin treatment outcomes has been reported for individuals with chronic hepatitis C virus genotype 1 (HCV-1). In this study, HCV genotype 4 (HCV-4) patients were assessed for IL-18 gene polymorphisms and treatment outcomes or severity of liver disease, because data concerning the impact of IL-18 gene polymorphisms on patients with HCV-4 infections are limited. Methods: This study included 123 chronic HCV-4 Egyptian patients and 123 apparently healthy volunteer blood donors who served as a control group. HCV genotyping was performed using the line probe assay. IL-18 genotyping was performed using the TaqMan Real-Time PCR method in all 246 patient and control samples. Results: All patients had HCV-4. IL-18 gene single nucleotide polymorphism (SNP) (-607C/A) genotype distributions and allele frequencies did not differ between HCV patients and normal healthy subjects, or between patient groups when compared according to therapeutic response. Moreover, the presence of the IL-18 SNP was not associated with histological disease severity. Conclusions: The IL-18 SNP rs1946518 does not affect the outcome of treatment or the severity of liver disease in chronic HCV-4 Egyptian patients.
Abstract:
Glazing is a technique used to retard fish deterioration during storage. This work focuses on the study of distinct variables (fish temperature, coating temperature, dipping time) that affect the thickness of edible coatings (water glazing and 1.5% chitosan) applied to frozen fish. Samples of frozen Atlantic salmon (Salmo salar) at -15, -20 and -25 °C were either glazed with water at 0.5, 1.5 or 2.5 °C or coated with a 1.5% chitosan solution at 2.5, 5 or 8 °C, by dipping for 10 to 60 s. For both water and chitosan coatings, lowering the salmon and coating-solution temperatures resulted in an increase of coating thickness. Under the same conditions, higher thickness values were obtained with chitosan (maximum thickness of 1.41±0.05 mm) than with water (maximum thickness of 0.84±0.03 mm). The freezing temperature and crystallization heat were found to be lower for the 1.5% chitosan solution than for water, thus favoring phase change. Salmon temperature profiles made it possible to determine, for different dipping conditions, whether the salmon temperature remained within food safety standards to prevent the growth of pathogenic microorganisms. The concept of safe dipping time is proposed to define how long a frozen product can be dipped into a solution without its temperature rising to a point where it can constitute a hazard.
Abstract:
Background: Coronary artery bypass grafting (CABG) is a standard surgical option for patients with diffuse and significant arterial plaque. This procedure, however, is not free of postoperative complications, especially pulmonary and cognitive disorders. Objective: This study aimed at comparing the impact of two different physiotherapy treatment approaches on the pulmonary and cognitive function of patients undergoing CABG. Methods: Neuropsychological and pulmonary function tests were applied, prior to and following CABG, to 39 patients randomized into two groups as follows: Group 1 (control), 20 patients who underwent one physiotherapy session daily; and Group 2 (intensive physiotherapy), 19 patients who underwent three physiotherapy sessions daily during the recovery phase at the hospital. Unpaired and paired Student t tests were used to compare continuous variables. Variables without normal distribution were compared between groups by using the Mann-Whitney test and, within the same group at different times, by using the Wilcoxon test. The chi-square test assessed differences in categorical variables. Statistical tests with a p value ≤ 0.05 were considered significant. Results: Changes in pulmonary function were not significantly different between the groups. However, while Group 2 patients showed no decline in their neurocognitive function, Group 1 patients showed a decline in their cognitive functions (p ≤ 0.01). Conclusion: These results highlight the importance of physiotherapy after CABG and support the implementation of multiple sessions per day, providing patients with better psychosocial conditions and less morbidity.
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While first versions of the central limit theorem are already due to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings, including, e.g., Banach-space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established and asymptotic expansions are employed to obtain better approximations. Classical error estimates, like the famous bound of Berry and Esseen, are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
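For concreteness, the Berry-Esseen bound referred to above states that for independent and identically distributed X_1, X_2, ... with mean \mu, variance \sigma^2 > 0 and E|X_1 - \mu|^3 < \infty,

\sup_{x \in \mathbb{R}} \left| P\!\left( \frac{S_n - n\mu}{\sigma \sqrt{n}} \le x \right) - \Phi(x) \right| \le \frac{C \, E|X_1 - \mu|^3}{\sigma^3 \sqrt{n}},

where S_n = X_1 + \cdots + X_n, \Phi is the standard normal distribution function and C is an absolute constant. The right-hand side depends only on an absolute moment, so it cannot become small when the summands are themselves nearly normal; this is precisely the gap that non-classical, pseudomoment-based estimates address.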