888 results for sandwich theorems
Outpatient care versus a childhood education program: which produces more habit change?
Abstract:
According to a recent report by the World Health Organization, obesity has reached epidemic proportions worldwide. Obesity is now very common and is beginning to replace malnutrition and infectious diseases. Obesity is associated with chronic degenerative diseases and with serious psychological consequences for the individual. Obesity is a complex, heterogeneous disease influenced by several genes; however, the combination of genes involved in the development of its various forms has not yet been definitively determined (REILLY et al., 2002). Obesity, or increased adiposity, is generally attributed to an imbalance between energy intake (dietary pattern) and energy expenditure (physical activity and basal metabolism). Management of obesity therefore consists of making this energy balance negative, with exercise considered one of the main components, together with healthy dietary and lifestyle changes (ESCRIVÃO & LOPEZ, 1998). Diets are, in most cases, transitory. Changing eating and physical-activity habits is therefore the key, especially in children, since weight maintenance alone will dramatically improve body composition while linear growth continues. Any habit change, however, requires the family's collaboration (HILL et al., 1993). The main objective of the present study was thus to compare a childhood-obesity education program with outpatient care for the management of childhood obesity, with respect to changes in eating and physical-activity habits and the acquisition of knowledge about healthy diet. A childhood-obesity education program was first developed and then compared with the usual outpatient care.
The study was a randomized clinical trial among children and adolescents aged 7 to under 13 years whose BMI was compatible with obesity for their age and sex, according to the classification of COLE et al. (2002). Subjects were randomly assigned to two groups. Each group was followed for eight months; the first and eighth visits were used to answer questionnaires assessing general characteristics, eating and physical-activity habits, general knowledge about healthy diet, and body assessment. The outpatient group had monthly visits with weight measurement and general guidance on diet and physical activity. The program group met monthly as a group: participants attended a lecture and were then divided into groups for monitored activities, while parents and/or guardians discussed their difficulties and how to change habits. The 38 children initially showed some differences in physical activity, but after the intervention the groups became similar, both tending toward favorable outcomes. The program group increased its physical activity and walking and reduced sedentary behavior. The program group was more effective in reducing total cholesterol. The program group's eating habits also improved, with lower consumption of pasta + rice, dairy drinks + milk, milk, sausage + cold cuts, and sandwiches + bauru (a Brazilian sandwich). We conclude that the two interventions were similarly successful, and that the program could be applied more widely, since it can involve fewer professionals and more subjects and be carried out anywhere, especially in schools, which are, in fact, where change happens.
Abstract:
We report results on the optimal "choice of technique" in a model originally formulated by Robinson, Solow and Srinivasan (henceforth, the RSS model) and further discussed by Okishio and Stiglitz. By viewing this vintage-capital model without discounting as a specific instance of the general theory of intertemporal resource allocation associated with Brock, Gale and McKenzie, we resolve longstanding conjectures in the form of theorems on the existence and price support of optimal paths, and of conditions sufficient for the optimality of a policy first identified by Stiglitz. We dispose of the necessity of these conditions in surprisingly simple examples of economies in which (i) an optimal path is periodic, (ii) a path following Stiglitz's policy is bad, and (iii) there is optimal investment in different vintages at different times.
Abstract:
We study the asset pricing implications of an endowment economy when agents can default on contracts that would otherwise leave them worse off. We specialize and extend the environment studied by Kocherlakota (1995) and Kehoe and Levine (1993) to make it comparable to standard studies of asset pricing. We completely characterize efficient allocations for several special cases. We introduce a competitive equilibrium with complete markets and with endogenous solvency constraints. These solvency constraints are such as to prevent default, at the cost of reduced risk sharing. We show a version of the classical welfare theorems for this equilibrium definition. We characterize the pricing kernel and compare it with the one for economies without participation constraints: interest rates are lower and risk premia can be bigger depending on the covariance of the idiosyncratic and aggregate shocks. Quantitative examples show that for reasonable parameter values the relevant marginal rates of substitution fall within the Hansen-Jagannathan bounds.
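The Hansen-Jagannathan bound invoked at the end of the abstract has a standard compact statement: any pricing kernel $m$ that prices an excess return $R^{e}$, so that $\mathbb{E}[mR^{e}]=0$, must satisfy

```latex
\mathbb{E}[mR^{e}] = 0
\;\Longrightarrow\;
\frac{\sigma(m)}{\mathbb{E}[m]} \;\ge\; \frac{\left|\mathbb{E}[R^{e}]\right|}{\sigma(R^{e})},
```

since $0=\mathbb{E}[m]\mathbb{E}[R^{e}]+\operatorname{cov}(m,R^{e})$ and $|\operatorname{cov}(m,R^{e})|\le\sigma(m)\,\sigma(R^{e})$. In words, the kernel's volatility ratio must dominate the Sharpe ratio of every traded excess return, which is the region the abstract's quantitative examples place the solvency-constrained marginal rates of substitution in.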
Abstract:
Lucas (2000) estimates that the US welfare costs of inflation are around 1% of GDP. This measurement is consistent with a specific distorting channel in terms of the Bailey triangle under the demand schedule for the monetary base (outside money): the displacement of resources from the production of consumption goods to household transaction time à la Baumol. Here, we also consider several new types of distortions in the manufacturing and banking industries. Our new evidence shows that both banks and firms demand special occupational employments to avoid the inflation tax. We define the concept of "float labor": the occupational employments that are affected by the inflation rate. More administrative workers are hired relative to the blue-collar workers producing consumption goods. This new phenomenon makes the manufacturing industry more roundabout. To take this new stylized fact and others into account, we redo, at the same time, both "The Model 5: A Banking Sector - 2" formulated by Lucas (1993) and "The Competitive Banking System" proposed by Yoshino (1993). This modelling allows us to better characterize the new types of misallocation. We find that the maximum value of the resources wasted by the US economy occurred in the years 1980-81, after the second oil shock. For these years, we estimate the excess resources allocated to each specific distorting channel: i) US commercial banks spent additional resources of around 2% of GDP; ii) between 2.4% and 4.1% of GDP was used for the firms' float time; and iii) between 3.1% and 4.5% of GDP was allocated to household transaction time. The Bailey triangle under the demand schedule for the monetary base represented around 1% of GDP, which is consistent with Lucas (2000). We estimate that the US total welfare costs of inflation were around 10% of GDP in terms of consumption goods foregone.
The big difference between our results and Lucas (2000) is mainly due to the Harberger triangle in the market for loans (inside money), which forms part of the household transaction time, of the firms' float labor, and of the distortion in the banking industry. This triangle arises from the widening interest-rate spread in the presence of a distorting inflation tax under a fractional-reserve system. The Harberger triangle can represent 80% of the total welfare costs of inflation, while the remaining percentage is split almost equally between the Bailey triangle and the resources used for bank services. Finally, we formulate several theorems on optimal non-neutral monetary policy for comparison with classical monetary theory.
Abstract:
The main objective of this Thesis was to encapsulate single viable cells within polyelectrolyte films using the Layer-by-Layer (LbL) technique. Most of the experiments used human mesenchymal stem cells (MSCs), whose characteristics (capacity for self-renewal and potential to differentiate into several types of cells) make them particularly interesting for use in biomedical applications. Also, most of the experiments used alginate (ALG) as the anionic polyelectrolyte and chitosan (CHI) or poly(allylamine hydrochloride) (PAH) as the cationic polyelectrolyte. Hyaluronic acid (HA) was also tested as an anionic polyelectrolyte. At the beginning of the work, the experimental conditions needed to encapsulate individual cells were studied and established. Through fluorescence-microscopy visualization, by staining the cell nucleus and using polyelectrolytes conjugated to fluorescent dyes, it was possible to prove that capsules containing one single cell had been obtained. Capsule aggregation was an observed problem that, despite efforts to design an experimental process to avoid it (namely, by varying cell concentration and different means of re-suspending and stirring the cells), was not completely overcome. In the second part of the project, single cells were encapsulated within polyelectrolyte layers made of CHI/ALG, PAH/ALG and PAH/HA, and their viability was evaluated through the resazurin reduction assay and the Live/Dead assay. In these experiments, during the LbL process, polyelectrolyte solutions were used at a concentration of 1 mg/mL, based on the literature. In general, the viability of the encapsulated cells was shown to be very low or absent. Then, as a consequence of this lack of viability, the LbL technique was applied to cells growing adherent to the surface of cell culture plates.
The cells were cultured in a sandwich-like configuration, between the surface of the cell culture dish and the polyelectrolyte layers. Here too, the polyelectrolyte solutions were used at a concentration of 1 mg/mL during the LbL process. Surprisingly, cell viability was also absent in these systems. A systematic study (dose-effect study) was performed to evaluate the effect of the concentration of the individual polyelectrolytes (ALG, CHI and PAH were studied) on cell viability. Experiments were performed using cells growing adherent to the surface of cell culture plates. The results indicated that a very high (cytotoxic) concentration of polyelectrolytes had been used. Also, in general, PAH was much more cytotoxic than CHI, whereas ALG was the least cytotoxic polyelectrolyte. Finally, using alginate and chitosan solutions at adequate concentrations (low concentrations: 50 ng/mL and 1 μg/mL), the encapsulation of single viable cells was attempted again. Once again, the encapsulated cells were not shown to be viable. In conclusion, the viability of the encapsulated cells depends not only on the cytotoxic characteristics (or combined cytotoxic characteristics) of the polyelectrolytes: it seems that, when detached from the culture plates, the cells become too fragile and lose their viability very easily.
Abstract:
The wind regime of a region can be described by frequency distributions that provide information and characteristics essential to the possible installation of wind-energy capture systems in the region and to their subsequent application in rural settings in remote areas. These characteristics, such as the annual mean speed, the variance of the recorded speeds, and the mean hourly wind-power density, can be obtained from the frequency of occurrence of each speed, which in turn should be studied through analytical expressions. The analytical function best suited to wind distributions is the Weibull density function, whose parameters can be determined by numerical methods and linear regressions. The objective of this work is to characterize, analytically and geometrically, all the methodological procedures needed for a complete characterization of the wind regime of a region, applied here to the region of Botucatu, SP, in order to determine the energy potential for the installation of wind turbines. It was thus possible to establish theorems related to the characterization of the wind regime, setting out an analytically concise methodology for defining the wind parameters of any region under study. A CAMPBELL anemometer was used for this research.
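The linear-regression route to the Weibull parameters that the abstract mentions can be sketched as follows. This is an illustrative example with synthetic wind speeds, not the Botucatu measurements: the CDF $F(v)=1-\exp[-(v/c)^k]$ is linearized as $\ln(-\ln(1-F)) = k\ln v - k\ln c$ and fitted by ordinary least squares.

```python
import math
import random

# Synthetic "wind speeds" drawn from a Weibull with known k and c
# (assumed values for illustration only).
random.seed(42)
k_true, c_true = 2.0, 6.0
speeds = sorted(c_true * (-math.log(1 - random.random())) ** (1 / k_true)
                for _ in range(500))

# Empirical CDF via median-rank plotting positions, then linearize:
# ln(-ln(1 - F)) = k*ln(v) - k*ln(c)
n = len(speeds)
xs, ys = [], []
for i, v in enumerate(speeds, start=1):
    F = (i - 0.3) / (n + 0.4)          # median-rank estimate of F(v)
    xs.append(math.log(v))
    ys.append(math.log(-math.log(1 - F)))

# Ordinary least squares: slope = k, intercept = -k*ln(c)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx
k_hat = slope
c_hat = math.exp(-intercept / slope)
print(f"estimated k = {k_hat:.2f}, c = {c_hat:.2f} m/s")
```

With the recovered `k_hat` and `c_hat`, quantities such as the mean wind-power density follow from the fitted density in the usual way.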
Abstract:
Trigonometry, the branch of mathematics concerned with the study of triangles, developed from practical needs, especially those of astronomy, surveying and navigation. Johann Müller, known as Regiomontanus (1436-1476), a mathematician and astronomer of the fifteenth century, played an important role in the development of this science. His work De Triangulis Omnimodis Libri Quinque, written around 1464 and published posthumously in 1533, presents the first systematic European exposition of plane and spherical trigonometry, treated independently of astronomy. In this study we present a description, translation and analysis of some aspects of this important work in the history of trigonometry. The translation was based on Barnabas Hughes's 1967 edition, Regiomontanus on Triangles, which contains the original Latin work alongside an English translation. Most of our Portuguese translation follows the English version, but doubtful utterances, statements and figures were checked against the original Latin. In this work we can see that trigonometry is treated as a branch of mathematics subordinated to geometry, that is, directed toward the study of triangles. Regiomontanus provides a large number of theorems, including the original trigonometric formula for the area of a triangle. He uses algebra to solve geometric problems and, notably, presents the first practical theorem for the law of cosines in spherical trigonometry. This study thus shows something of the development of trigonometry in the fifteenth century, especially with regard to concepts such as the sine and the versed sine. The work discussed above is of paramount importance for research in the history of mathematics, more specifically in the area of historical analysis and critique of literary sources, or the study of the work of a particular mathematician.
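Two of the results attributed to Regiomontanus above can be stated in modern notation (a standard modern rendering, not a quotation from the treatise): the trigonometric formula for the area of a plane triangle with sides $a$, $b$ and included angle $C$, and the spherical law of cosines for a spherical triangle with sides (arcs) $a$, $b$, $c$ and the angle $C$ opposite $c$:

```latex
\text{Area} \;=\; \tfrac{1}{2}\,ab\sin C,
\qquad
\cos c \;=\; \cos a\,\cos b \;+\; \sin a\,\sin b\,\cos C .
```

The second formula is the spherical counterpart of the plane law of cosines, to which it reduces for small arcs.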
Abstract:
This work develops a robustness analysis with respect to modeling errors, applied to indirect control strategies using Artificial Neural Networks (ANNs) of the multilayer feedforward perceptron class with on-line training based on the gradient method (backpropagation). The schemes presented are called Indirect Hybrid Control and Indirect Neural Control. Two Robustness Theorems are presented, one for each proposed indirect control scheme, which allow the computation of the maximum steady-state control error caused by the modeling error of the neural identifier, whether for the closed-loop configuration with a conventional controller (Indirect Hybrid Control) or for the closed-loop configuration with a neural controller (Indirect Neural Control). Since the robustness analysis is restricted to the steady-state behavior of the plant, this work also includes a stability analysis adapted to the multilayer perceptron class of ANNs trained with the backpropagation algorithm, to ensure the convergence and stability of the neural systems used. The boundedness of the initial transient behavior, in turn, is ensured by the assumption that the plant is BIBO (Bounded Input, Bounded Output) stable. The Robustness Theorems were tested on the proposed indirect control strategies, applied to regulation control of simulated examples using nonlinear plants, and the results are presented.
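The ANN class the abstract relies on can be sketched minimally: a one-hidden-layer feedforward perceptron trained on-line by the gradient (backpropagation) rule. This is an illustrative toy, not the thesis's identifier or controller; here the network identifies the nonlinear map $y=\sin(\pi u)$ one sample at a time.

```python
import math
import random

random.seed(1)
H, lr = 10, 0.1                                   # hidden units, learning rate
w1 = [[random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
      for _ in range(H)]                          # per-neuron [input weight, bias]
w2 = [random.uniform(-0.5, 0.5) for _ in range(H + 1)]  # output weights + bias

def forward(u):
    """Hidden activations and network output for scalar input u."""
    h = [math.tanh(w[0] * u + w[1]) for w in w1]
    return h, sum(w2[j] * h[j] for j in range(H)) + w2[H]

for step in range(40000):                         # on-line training loop
    u = random.uniform(-1.0, 1.0)
    h, y = forward(u)
    e = y - math.sin(math.pi * u)                 # identification error
    for j in range(H):                            # backpropagated gradient updates
        grad_h = e * w2[j] * (1.0 - h[j] ** 2)    # tanh'(z) = 1 - tanh(z)^2
        w1[j][0] -= lr * grad_h * u
        w1[j][1] -= lr * grad_h
        w2[j] -= lr * e * h[j]
    w2[H] -= lr * e

errs = [abs(forward(i / 10 - 1.0)[1] - math.sin(math.pi * (i / 10 - 1.0)))
        for i in range(21)]
print(f"max identification error on a test grid: {max(errs):.3f}")
```

The residual identification error left by such a trained identifier is exactly the modeling error that the abstract's Robustness Theorems bound in steady state.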
Abstract:
New materials made from industrial wastes have been studied as an alternative to traditional fabrication processes in building and civil engineering. These materials are produced with issues such as cost, efficiency and reduction of environmental damage in mind. Specifically for materials intended for dwellings in low-latitude regions, such as the Brazilian Northeast, efficiency is related to mechanical and thermal resistance. Thus, when thermal insulation and energy efficiency are the aim, it is important to increase thermal resistance without degrading mechanical properties. This research was conducted on a construction element made of two plates of cement mortar interspersed with a plate of recycled expanded polystyrene (EPS). This component, widely known as a sandwich panel, is commonly manufactured with commercial EPS, whose substitution is proposed in this study. To this end, a detailed methodology was applied that defines parameters for a rational batching of the elements constituting the core. Samples of recycled EPS were produced at two values of apparent specific mass (ρ = 65 kg/m³ and ρ = 130 kg/m³) and analyzed with the Quick-Line 30™, a thermophysical-property analyzer. Based on the thermal conductivity, thermal capacity and thermal diffusivity results obtained, it was possible to conclude that recycled EPS has thermal-insulation characteristics that qualify it to replace commercial EPS in the building and civil engineering industry.
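The thermal reasoning behind a mortar/EPS/mortar sandwich panel is a series sum of conduction resistances, $R = \sum_i t_i/k_i$. The sketch below uses illustrative thicknesses and textbook-order conductivities, not the Quick-Line 30 measurements from the study:

```python
def series_resistance(layers):
    """Conduction resistance of a stack of layers, in m^2.K/W.

    layers: list of (thickness_m, conductivity_W_per_mK) tuples.
    """
    return sum(t / k for t, k in layers)

mortar = (0.02, 1.15)    # 2 cm cement-mortar face, k ~ 1.15 W/m.K (assumed)
eps    = (0.05, 0.040)   # 5 cm EPS core, k ~ 0.040 W/m.K (assumed)

panel = [mortar, eps, mortar]     # face / core / face
R = series_resistance(panel)
U = 1.0 / R                       # conduction-only transmittance (no surface films)
print(f"R = {R:.3f} m2.K/W, U = {U:.3f} W/m2.K")
```

The core dominates: with these numbers the EPS layer contributes about 97% of the total resistance, which is why the core material's conductivity is the property the study measures.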
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In Einstein's theory of General Relativity the field equations relate the geometry of space-time to its content of matter and energy, the sources of the gravitational field. This content is described by a second-order tensor, known as the energy-momentum tensor. On the other hand, the energy-momentum tensors that have physical meaning are not specified by the theory. In the 1970s, Hawking and Ellis set out a few conditions, considered reasonable from a physical point of view, in order to limit the arbitrariness of these tensors. These conditions, which became known as the Hawking-Ellis energy conditions, play important roles in gravitation. They are widely used as powerful analytical tools: from the demonstration of important theorems concerning the behavior of gravitational fields and their associated geometries, and the quantum behavior of gravity, to the analysis of cosmological models. In this dissertation we present a rigorous deduction of the several energy conditions currently in vogue in the scientific literature: the Null Energy Condition (NEC), the Weak Energy Condition (WEC), the Strong Energy Condition (SEC), the Dominant Energy Condition (DEC) and the Null Dominant Energy Condition (NDEC). With the most common applications in Cosmology and Gravitation in mind, the deductions were first made for the energy-momentum tensor of a generalized perfect fluid and then extended to scalar fields with minimal and non-minimal coupling to the gravitational field. We also present a study of the possible violations of some of these energy conditions. Aiming at the study of the singular nature of some exact solutions of Einstein's General Relativity, in 1955 the Indian physicist Raychaudhuri derived an equation that is today considered fundamental to the study of the gravitational attraction of matter, which became known as the Raychaudhuri equation.
This famous equation is fundamental to the understanding of gravitational attraction in Astrophysics and Cosmology and to the comprehension of the singularity theorems, such as the Hawking and Penrose theorem on the singularity of gravitational collapse. In this dissertation we derive the Raychaudhuri equation, the Frobenius theorem and the Focusing theorem for time-like and null congruences of a pseudo-Riemannian manifold. We discuss the geometric and physical meaning of this equation, its connections with the energy conditions, and some of its several applications.
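For reference, the Raychaudhuri equation for a timelike geodesic congruence with tangent $u^{a}$, expansion $\theta$, shear $\sigma_{ab}$ and twist $\omega_{ab}$ takes the standard form

```latex
\frac{d\theta}{d\tau}
= -\frac{1}{3}\theta^{2}
- \sigma_{ab}\sigma^{ab}
+ \omega_{ab}\omega^{ab}
- R_{ab}u^{a}u^{b},
```

which makes the connection to the energy conditions explicit: for a hypersurface-orthogonal congruence ($\omega_{ab}=0$), the SEC gives $R_{ab}u^{a}u^{b}\ge 0$, so $d\theta/d\tau \le -\theta^{2}/3$ and neighboring geodesics focus. This is the mechanism behind the singularity theorems mentioned above.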
Abstract:
In this work, we study and compare two percolation algorithms, one developed by Elias and the other by Newman and Ziff, using theoretical tools of algorithm complexity together with an experimental comparison. The work is divided into three chapters. The first covers the definitions and theorems needed for a more formal mathematical study of percolation. The second presents the techniques used to estimate the complexity of the algorithms: worst case, best case and average case. We use the worst-case technique to estimate the complexity of both algorithms so that we can compare them. The last chapter presents several characteristics of each algorithm and, through the theoretical complexity estimates and a comparison of the execution time of the most important part of each, compares these important algorithms for simulating percolation.
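The efficiency that makes the Newman-Ziff algorithm worth comparing can be seen in a compact sketch of its core idea (illustrative only; the lattice size and periodic boundaries are assumptions, not the thesis's setup): bonds are added one at a time in random order and clusters are merged with union-find, so a single sweep covers every bond-occupation fraction at nearly O(number of bonds) total cost.

```python
import random

def newman_ziff_sweep(L, seed=0):
    """Bond percolation on an L x L periodic square lattice.

    Returns the largest-cluster size after each bond is added.
    """
    n = L * L
    parent = list(range(n))
    size = [1] * n

    def find(i):                       # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # All 2*n bonds of the periodic square lattice: right and down neighbors.
    bonds = ([(i, (i + 1) % L + (i // L) * L) for i in range(n)]
             + [(i, (i + L) % n) for i in range(n)])
    random.Random(seed).shuffle(bonds)

    largest, history = 1, []
    for a, b in bonds:
        ra, rb = find(a), find(b)
        if ra != rb:                   # merge two clusters, smaller into larger
            if size[ra] < size[rb]:
                ra, rb = rb, ra
            parent[rb] = ra
            size[ra] += size[rb]
            largest = max(largest, size[ra])
        history.append(largest)
    return history

hist = newman_ziff_sweep(16)
print(f"largest cluster after all bonds: {hist[-1]} of {16 * 16} sites")
```

Because the whole curve of largest-cluster size versus number of occupied bonds comes out of one pass, results at any occupation probability can then be recovered by convolution, instead of re-running the simulation at each p.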
Abstract:
This work presents a proposal for introducing the teaching of spatial geometry. The study attempts to demonstrate that the use of manipulatives as a teaching resource can be an alternative that facilitates learning, helping to fix the primitive concepts of geometry, the postulates and theorems, the positional relationships between points, lines and planes, and the calculation of distances. The development makes use of a sequence of activities aimed at enabling students to build more systematic learning; these are divided into four steps.
Abstract:
Among the several theorems taught in basic education, some can be proved in the classroom and others cannot, because of the degree of difficulty of their formal proofs. A classic example is the Fundamental Theorem of Algebra, which is not proved there, since it requires higher-level knowledge of mathematics. In this paper, we justify the validity of this theorem intuitively using the software GeoGebra. And, based on [2], we present a clear formal proof of this theorem, addressed to school teachers and undergraduate students in mathematics.
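The content of the theorem can also be checked numerically, in the same intuitive spirit as the GeoGebra illustration the abstract describes (this sketch is not the paper's construction, and the quartic below is an arbitrary example): a degree-n polynomial with real or complex coefficients has exactly n complex roots, counted with multiplicity.

```python
import numpy as np

# p(z) = z^4 - 3z^3 + 2z^2 + 5z - 7, an arbitrary degree-4 example
coeffs = [1, -3, 2, 5, -7]
roots = np.roots(coeffs)              # all complex roots via companion matrix

degree = len(coeffs) - 1
print(f"degree {degree}, roots found: {len(roots)}")
for r in roots:
    residual = abs(np.polyval(coeffs, r))   # |p(r)|, should be ~0
    print(f"  z = {r:.4f}, |p(z)| = {residual:.2e}")
```

The count of roots always equals the degree, whatever coefficients are chosen, which is exactly the statement the formal proof establishes.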