62 results for Stiff computation (Differential equations)
Abstract:
Planar antennas have recently been studied because of their characteristics, as well as the advantages they offer when compared with other types of antennas. In the mobile communications area, the need for this kind of antenna has grown steadily with the intense expansion of mobile communications, which requires antennas that operate at multiple frequencies and over wide bandwidths. Microstrip antennas present narrow bandwidth due to the dielectric losses generated by radiation; another limitation is the degradation of the radiation pattern caused by the generation of surface waves in the substrate. In this work, techniques used to minimize these disadvantages of microstrip antennas are presented, namely: substrates with PBG (Photonic Bandgap) material, multilayer antennas, and antennas with stacked patches. The analysis developed in this work uses the TTL (Transverse Transmission Line) method in the Fourier transform domain, which employs a propagation component in the y direction (transverse to the real direction of propagation z), treating the general electric and magnetic field equations as functions of the transverse components Ey and Hy. The objective of this work is to apply the TTL method to microstrip structures with single and multiple layers of rectangular and triangular patches, obtaining the resonant frequency and the radiation pattern of each structure; the method is also applied to the treatment of the fields in stacked structures. Homogenization theory is applied to obtain the effective permittivity of the substrate composed of PBG material for the s and p polarizations. Numerical results are presented for single-layer triangular and rectangular antennas and for multilayer resonators with triangular and rectangular patches, on both photonic and isotropic substrates. Conclusions and suggestions for the continuation of this work are presented.
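As a hedged illustration of the field treatment described above (notation and signs vary across TTL formulations; the spectral variables $\alpha_n$, $\beta_k$ and the propagation constant $\gamma_i$ of layer $i$ below follow one common convention and are assumptions, not necessarily this thesis's exact equations): once the Fourier-transformed components $\tilde{E}_{yi}$ and $\tilde{H}_{yi}$ are known, the remaining field components follow algebraically, e.g.

$$\tilde{E}_{xi}=\frac{1}{\gamma_i^{2}+k_i^{2}}\left(-j\,\alpha_n\,\frac{\partial\tilde{E}_{yi}}{\partial y}+\omega\mu\,\beta_k\,\tilde{H}_{yi}\right),\qquad
\tilde{H}_{xi}=\frac{1}{\gamma_i^{2}+k_i^{2}}\left(-j\,\alpha_n\,\frac{\partial\tilde{H}_{yi}}{\partial y}-\omega\varepsilon_i\,\beta_k\,\tilde{E}_{yi}\right),$$

with $\gamma_i^{2}=\alpha_n^{2}+\beta_k^{2}-k_i^{2}$ and analogous expressions for $\tilde{E}_{zi}$ and $\tilde{H}_{zi}$.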
Abstract:
Pattern classification is one of the most prominent subareas of machine learning. Among the various approaches to pattern classification problems, Support Vector Machines (SVMs) receive great emphasis due to their ease of use and good generalization performance. The Least Squares formulation of the SVM (LS-SVM) finds its solution by solving a set of linear equations instead of the quadratic programming problem solved by the standard SVM. LS-SVMs have some free parameters that must be chosen correctly to achieve satisfactory results on a given task. Although LS-SVMs perform well, many tools have been developed to improve them, mainly new classification methods and the employment of ensembles, in other words, combinations of several classifiers. In this work, we propose to use an ensemble and a Genetic Algorithm (GA), a search algorithm based on the evolution of species, to enhance LS-SVM classification (a sketch of the combination step follows this abstract). In the construction of this ensemble, we use a random selection of attributes of the original problem, which splits the original problem into smaller ones on which each classifier acts. We then apply a genetic algorithm to find effective values for the LS-SVM parameters and also a weight vector measuring the importance of each machine in the final classification. Finally, the final classification is obtained by a linear combination of the decision values of the LS-SVMs with the weight vector. Several classification problems, taken as benchmarks, were used to evaluate the performance of the algorithm, and the results were compared with those of other classifiers.
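A minimal sketch of the combination step described above, under illustrative assumptions (the RBF kernel, the hyperparameter names gamma and sigma, and the toy data are not the thesis's setup): each LS-SVM is trained on a random attribute subset by solving a linear system, and the ensemble output is a weighted linear combination of the members' decision values. In the thesis the weight vector and the LS-SVM parameters are evolved by the GA; here the weights are fixed to uniform for brevity.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma, sigma):
    # LS-SVM dual: solve the linear system
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    # instead of the SVM's quadratic program (labels y in {-1, +1}).
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                      # bias b, coefficients alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = np.sign(X[:, 0] + X[:, 3])                  # toy two-class labels
members = []
for _ in range(5):                              # ensemble of 5 LS-SVMs
    feat = rng.choice(8, size=4, replace=False) # random attribute subset
    b, alpha = lssvm_train(X[:, feat], y, gamma=10.0, sigma=1.0)
    members.append((feat, b, alpha))
w = np.full(5, 1 / 5)                           # weight vector (GA-evolved in the thesis)
scores = sum(wi * (rbf_kernel(X[:, f], X[:, f], 1.0) @ a + b)
             for wi, (f, b, a) in zip(w, members))
print("training accuracy:", np.mean(np.sign(scores) == y))
```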
Abstract:
Nowadays there have been major breakthroughs in the aerospace area with regard to rocket launches for research, experiments, telemetry systems, remote sensing, radar systems (tracking and monitoring), satellite communication systems and the insertion of satellites into orbit. This work applies a circular cylindrical microstrip antenna of the ring type, and another rectangular cylindrical antenna, to the structure of a rocket or missile for obtaining telemetry data, operating in the 2 to 4 GHz range (S band). For this purpose, the theoretical analysis was developed with the Transverse Transmission Line method, a rigorous spectral-domain analysis method, for use on rockets and missiles. The method analyzes propagation in the ρ direction, transverse to the dielectric interfaces z and φ in cylindrical coordinates, thereby taking the general electromagnetic field equations as functions of the ρ-directed field components [1]. It is worth mentioning that, to obtain results, the simulations and the analysis of the structure under study used the HFSS (High Frequency Structure Simulator) program, which is based on the finite element method. With the developed theory, computational resources were used to carry out the numerical calculations, using Fortran PowerStation, Scilab and Wolfram Mathematica®. The prototype was built using ULTRALAM® 3850, from Rogers Corporation, as the substrate, and an aluminum plate as the supporting cylindrical structure. The agreement between the measured and simulated results validates the established procedures. Conclusions and suggestions for the continuation of this work are presented.
Abstract:
The present study investigates how the content of polynomial equations can be interrelated with structured activities and with the history of mathematics through a sequence of activities presented in an e-book, so that the research results in a didactic and pedagogical proposal for teaching polynomial equations through a historical approach via the reported e-book. To this end, we rely on theoretical and methodological assumptions drawn from the History of Mathematics, from structured activities and from new technologies, with emphasis on the e-book as a tool. We adopted qualitative research as our methodological approach, since our research object fits the objectives of this research mode. As methodological instruments, we used the e-book as a synthesis tool for the sequence of activities to be evaluated, while questionnaires, semi-structured interviews and participant observation were designed to record and analyze the evaluation made by the research participants during the structured activities. The data collected through the questionnaires were organized, classified and quantified in summary tables to facilitate their visualization, interpretation, understanding and analysis. Participant observation was used to contribute to the qualitative analysis of the quantified data, and the interviews were synthetically transcribed and qualitatively analyzed. The analysis ratified our research objectives and contributed to improving, endorsing and recommending the use of the e-book for teaching polynomial equations. We therefore consider that this educational product will bring significant contributions to the teaching of this mathematical content in Basic Education.
Abstract:
Introduction: The assessment of respiratory muscle strength is important in the diagnosis and monitoring of respiratory muscle weakness in respiratory and neuromuscular diseases. However, there are still no studies providing predictive equations and reference values of maximal respiratory pressures for children in our population. Aim: The purpose of this study was to propose predictive equations for maximal respiratory pressures in healthy school children. Method: This is an observational cross-sectional study. 144 healthy children were assessed, students from public and private schools in the city of Natal/RN (63 boys and 81 girls), subdivided into the age groups 7-8 and 9-11 years. The students presented BMI for age and sex between the 5th and 85th percentiles. Maximal respiratory pressures were measured with the MVD300 digital manometer (Globalmed®). Maximal inspiratory pressure (MIP) and maximal expiratory pressure (MEP) were measured from residual volume and total lung capacity, respectively. The data were analyzed using the SPSS Statistics 15.0 software (Statistical Package for the Social Sciences), with a significance level of 5%. Descriptive analysis was expressed as mean and standard deviation. The unpaired Student's t-test was used to compare the means of the variables; the comparison of the measurements obtained with the values predicted by previous studies was performed using the paired Student's t-test. Pearson's correlation test was used to verify the correlation of the maximal respiratory pressures with the independent variables (age, sex, weight and height). For the equations, stepwise linear regression analysis was used (an illustration of the resulting model form follows this abstract). Results: In the age range studied, MIP was significantly higher in boys. MEP did not differ between boys and girls aged 7 to 8 years; the reverse occurred between 9 and 11 years. The boys showed a significant increase in respiratory muscle strength with advancing age. Regardless of sex and age, MEP was always higher than MIP. The reference values found in this study are similar to those of samples of Spanish and Canadian children. The two models proposed in previous studies with children from other countries were not able to consistently predict the values observed in the population studied here. The variables sex, age and weight correlated with MIP, whereas MEP also correlated with height. However, in the regression models proposed in this study, only sex and age were retained as exerting influence on the variability of the maximal inspiratory and expiratory pressures. Conclusion: This study provides reference values and lower limits of normality, and proposes two models that allow predicting, through the independent variables sex and age, the value of the maximal static respiratory pressures in healthy children aged between 7 and 11 years.
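A hedged illustration of the form such stepwise-regression predictive equations take (the coefficients $\beta_j$ are placeholders, not the study's fitted values):

$$\mathrm{MIP} = \beta_0 + \beta_1\,\mathrm{sex} + \beta_2\,\mathrm{age} + \varepsilon, \qquad \mathrm{MEP} = \beta_0' + \beta_1'\,\mathrm{sex} + \beta_2'\,\mathrm{age} + \varepsilon',$$

with sex coded as an indicator variable and only the predictors retained by the stepwise procedure appearing in the final model (here, sex and age, as reported above).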
Abstract:
One of the current challenges of Ubiquitous Computing is the development of complex applications, which are more than simple alarms triggered by sensors or simple systems that configure the environment according to user preferences. Such applications are hard to develop because they are composed of services provided by different middleware platforms, and it is necessary to know the peculiarities of each of them, mainly their communication and context models. This thesis presents OpenCOPI, a platform that integrates various service providers, including context provision middleware. It provides a unified ontology-based context model, as well as an environment that enables the easy development of ubiquitous applications through the definition of semantic workflows containing the abstract description of the application. These semantic workflows are converted into concrete workflows, called execution plans; an execution plan is a workflow instance containing activities that are automated by a set of Web services (a sketch of this mapping follows this abstract). OpenCOPI supports automatic Web service selection and composition, enabling the use of services provided by distinct middleware in an independent and transparent way. Moreover, the platform supports execution adaptation in case of service failures, user mobility and degradation of service quality. OpenCOPI is validated through the development of case studies, specifically applications from the oil industry. In addition, this work evaluates the overhead introduced by OpenCOPI, compares it with the benefits it provides, and assesses the efficiency of OpenCOPI's selection and adaptation mechanisms.
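A minimal sketch of the workflow-to-execution-plan conversion described above, under illustrative assumptions (the activity names, the registry of concrete services and the first-available selection policy are hypothetical; OpenCOPI's actual selection is semantic and quality-aware):

```python
# Abstract activities of a semantic workflow, and the concrete Web services
# (possibly from different middleware) that can automate each of them.
registry = {
    "get_temperature": ["midA.TempService", "midB.WeatherService"],
    "notify_operator": ["midC.SMSService"],
}
semantic_workflow = ["get_temperature", "notify_operator"]

def build_execution_plan(workflow, registry, unavailable=frozenset()):
    # Convert the abstract workflow into an execution plan: one concrete
    # service per activity, skipping services known to be unavailable.
    plan = []
    for activity in workflow:
        providers = [s for s in registry[activity] if s not in unavailable]
        if not providers:
            raise RuntimeError(f"no concrete service for {activity!r}")
        plan.append((activity, providers[0]))   # real selection policy goes here
    return plan

print(build_execution_plan(semantic_workflow, registry))
# Adaptation on failure: rebuild the plan without the failed service.
print(build_execution_plan(semantic_workflow, registry, {"midA.TempService"}))
```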
Abstract:
In this work we apply the technique of Differential Cryptanalysis, introduced in 1990 by Biham and Shamir, to the Papílio cryptosystem, developed by Karla Ramos, in order to test it and, more importantly, to demonstrate the technique's relevance to other block ciphers such as DES, Blowfish and FEAL-N(X). The technique is based on the analysis of the differences between plaintexts and their respective ciphertexts, in search of patterns that help in the discovery of the subkeys and, consequently, of the master key; these differences are obtained by XOR operations (a sketch follows this abstract). Through this analysis, in addition to obtaining the difference patterns of Papílio, we seek to characterize the main features and behavior of Papílio throughout its 16 rounds, identifying and replacing, when necessary, factors that can be improved in accordance with its pre-established definitions, thus providing greater security in the use of the algorithm.
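A minimal sketch of the XOR-difference analysis described above, applied to a toy 4-bit S-box (the S-box and the sizes are illustrative assumptions; Papílio's actual components are not reproduced here). For a fixed input difference, exploitable patterns appear as output differences that occur unusually often:

```python
from collections import Counter

# Toy 4-bit S-box (row 0 of DES S1, reused here purely as an example).
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def difference_distribution(dx):
    # For every input pair (x, x ^ dx), record the output XOR difference.
    return Counter(SBOX[x] ^ SBOX[x ^ dx] for x in range(16))

for dx in (0x1, 0xB):
    dy, count = difference_distribution(dx).most_common(1)[0]
    print(f"input diff {dx:#x}: most likely output diff {dy:#x} "
          f"({count}/16 pairs)")   # high counts are the exploitable patterns
```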
Abstract:
With the advance of the Cloud Computing paradigm, a single service offered by a cloud platform may not be enough to meet all of an application's requirements. To fulfill such requirements, it may be necessary to use, instead of a single service, a composition of services that aggregates services provided by different cloud platforms. In order to generate aggregated value for the user, this composition of services provided by several Cloud Computing platforms requires a solution for platform integration, which encompasses the manipulation of a large number of non-interoperable APIs and protocols from different platform vendors. In this scenario, this work presents Cloud Integrator, a middleware platform for composing services provided by different Cloud Computing platforms. Besides providing an environment that facilitates the development and execution of applications that use such services, Cloud Integrator works as a mediator, providing mechanisms for building applications through the composition and selection of semantic Web services that take into account metadata about the services, such as QoS (Quality of Service) and prices (a hedged sketch of such scoring follows this abstract). Moreover, the proposed middleware platform provides an adaptation mechanism that can be triggered in case of failure or quality degradation of one or more services used by the running application, in order to ensure its quality and availability. Through a case study consisting of an application that uses services provided by different cloud platforms, Cloud Integrator is evaluated in terms of the efficiency of the performed service composition, selection and adaptation processes, as well as the potential of using this middleware in heterogeneous computational cloud scenarios.
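A minimal sketch of metadata-driven service scoring of the kind described above (the attribute names, weights and candidate values are illustrative assumptions, not Cloud Integrator's actual model): each QoS attribute is normalized to [0, 1], cost-type attributes are inverted so that higher is always better, and candidates are ranked by a weighted sum.

```python
candidates = [
    {"name": "storage_A", "availability": 0.999, "latency_ms": 120, "price": 0.10},
    {"name": "storage_B", "availability": 0.990, "latency_ms": 40,  "price": 0.25},
    {"name": "storage_C", "availability": 0.995, "latency_ms": 60,  "price": 0.15},
]
weights = {"availability": 0.5, "latency_ms": 0.3, "price": 0.2}
COSTS = {"latency_ms", "price"}                 # attributes where lower is better

def normalize(values, lower_is_better):
    # Map raw attribute values onto [0, 1], inverting cost-type attributes.
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    return [(hi - v) / (hi - lo) if lower_is_better else (v - lo) / (hi - lo)
            for v in values]

def rank(candidates):
    cols = {a: normalize([c[a] for c in candidates], a in COSTS) for a in weights}
    scores = [sum(weights[a] * cols[a][i] for a in weights)
              for i in range(len(candidates))]
    return sorted(zip(scores, (c["name"] for c in candidates)), reverse=True)

print(rank(candidates))  # on failure/degradation, re-rank the remaining candidates
```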
Abstract:
This study sprang from the hypothesis that spatial variations in the morbidity rate for dengue fever within the municipality of Natal are related to intra-city socioeconomic and environmental variations. The objective was to classify the different districts of Natal according to their living conditions and to establish whether there is any correlation between this classification and the incidence rate of dengue fever, with the aim of enabling public health planners to better control the disease. Data on population density, access to safe drinking water, rubbish collection, sewage disposal facilities, income level, education and the incidence of dengue fever during the years 2001 and 2003 were drawn from the Brazilian Demographic Census 2000 and from the Reportable Disease Notification System (SINAN). The study is presented here in the form of two papers, corresponding to the two types of analysis performed: a classification of the urban districts into quartiles according to their living conditions, in the first paper; and the incidence of dengue fever in each of these quartiles, in the second. By applying factor analysis to the chosen socioeconomic and environmental indicators for the year 2000, a compound living-conditions index (ICV) was obtained, on the basis of which the urban districts were classified into quartiles. This grouping (paper 1) revealed a heterogeneous distribution of living conditions across the city. As for the incidence rate of dengue fever (paper 2), the quartile identified as having the best living conditions presented incidence rates of 15.62 and 15.24 per 1,000 inhabitants in 2001 and 2003, respectively, whereas the quartile with the worst living conditions showed incidence rates of 25.10 and 10.32 for the same periods. The results suggest that dengue fever occurs in all social classes and that its incidence is not related in any evident way to the chosen measure of living conditions.
Abstract:
In this work we present, as a review and in a historical context, the methods most used to solve quadratic equations. We also present the simplest type of change of variables, namely x = Ay + B with A, B ∈ R, together with some changes of variables that have been used to solve quadratic equations throughout history. Finally, a change of variable that the author has used in the classroom as an alternative method is presented (a worked example of such a substitution follows this abstract), and the results of this methodology are illustrated by the students' responses to a classroom test.
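As a worked example of how a change of variables of the form x = Ay + B operates (this is the classical substitution that eliminates the linear term, shown here for context; it is not necessarily the author's alternative method): substituting $x = y - \frac{b}{2a}$ into $ax^{2} + bx + c = 0$ gives

$$a\left(y-\frac{b}{2a}\right)^{2}+b\left(y-\frac{b}{2a}\right)+c \;=\; ay^{2}-\frac{b^{2}}{4a}+c \;=\; 0 \quad\Longrightarrow\quad y=\pm\frac{\sqrt{b^{2}-4ac}}{2a},$$

and undoing the substitution, $x = y - \frac{b}{2a}$, recovers the familiar formula $x = \frac{-b \pm \sqrt{b^{2}-4ac}}{2a}$.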
Abstract:
Currently, there are different definitions of fuzzy implication accepted in the literature. From the theoretical point of view, this lack of consensus shows that there is disagreement about the real meaning of "logical implication" in the Boolean and fuzzy contexts. From the practical point of view, it creates doubt about which "implication operators" software engineers should consider when implementing a Fuzzy Rule-Based System (FRBS). A poor choice of these operators can result in FRBSs that are less accurate and less appropriate to their application domains. One way around this situation is to better understand the fuzzy logical connectives, and for that it is necessary to know which properties such connectives can satisfy. Therefore, in order to contribute to the meaning of fuzzy implication and to the implementation of more appropriate FRBSs, several Boolean laws have been generalized and studied as equations or inequations in fuzzy logics. Such generalizations are called Boolean-like laws, and they do not generally hold in every fuzzy semantics. In this scenario, this dissertation investigates the sufficient and necessary conditions under which three Boolean-like laws — y ≤ I(x, y), I(x, I(y, x)) = 1 and I(x, I(y, z)) = I(I(x, y), I(x, z)) — remain valid in the fuzzy context (a numerical check follows this abstract), considering six classes of fuzzy implications as well as implications generated by automorphisms. Moreover, still with the aim of implementing more appropriate FRBSs, we propose an extension to such systems.
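A minimal numerical check of the three Boolean-like laws above, using the Łukasiewicz implication I(x, y) = min(1, 1 − x + y) as one concrete example (the dissertation analyzes six classes of fuzzy implications; this choice is an illustrative assumption). On a grid over [0, 1]³, the first two laws hold for this implication while the third admits counterexamples:

```python
import itertools

def I(x, y):
    # Lukasiewicz implication, one classical fuzzy implication.
    return min(1.0, 1.0 - x + y)

grid = [i / 20 for i in range(21)]
laws = {
    "y <= I(x,y)":        lambda x, y, z: y <= I(x, y) + 1e-9,
    "I(x, I(y,x)) = 1":   lambda x, y, z: abs(I(x, I(y, x)) - 1.0) < 1e-9,
    "I(x,I(y,z)) = I(I(x,y),I(x,z))":
        lambda x, y, z: abs(I(x, I(y, z)) - I(I(x, y), I(x, z))) < 1e-9,
}
for name, law in laws.items():
    cex = next((t for t in itertools.product(grid, repeat=3) if not law(*t)), None)
    print(f"{name}: " + ("holds on the grid" if cex is None
                         else f"fails, e.g. (x,y,z) = {cex}"))
```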
Abstract:
About 10% of the faults in the electrical system occur in power transformers. Therefore, the protection applied to power transformers is essential to ensure the continuous operation of this device and the efficiency of the electrical system. Among the protection functions applied to power transformers, differential protection stands out as one of the main schemes, presenting reliable discrimination between internal faults and external faults or inrush currents. However, since they use the low-frequency components of the differential currents flowing through the transformer, the main difficulty of conventional differential protection methods is the delay in detecting these events. Internal faults, external faults and other disturbances related to transformer operation present transients and can therefore be appropriately detected by the wavelet transform. This work proposes the development of a wavelet-based differential protection for the detection and identification of faults external to the transformer, internal faults, and transformer energization, using the wavelet coefficient energy of the differential currents (a hedged sketch follows this abstract). The results obtained reveal the advantages of using the wavelet transform in differential protection compared with conventional protection, since it provides reliability and speed in the detection of these events.
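A minimal sketch of the detection quantity described above, assuming a synthetic differential current, a db4 mother wavelet and an ad hoc threshold (all illustrative assumptions, not the thesis's actual settings): the energy of the level-1 wavelet detail coefficients of the differential current spikes at the transient.

```python
import numpy as np
import pywt  # PyWavelets

fs = 3840                                   # samples/s (64 samples per 60 Hz cycle)
t = np.arange(0, 0.2, 1 / fs)
i_diff = 0.05 * np.sin(2 * np.pi * 60 * t)  # small steady-state differential current
i_diff[t >= 0.1] += 5.0 * np.sin(2 * np.pi * 60 * t[t >= 0.1])  # fault at 100 ms

cA, cD = pywt.dwt(i_diff, "db4")            # level-1 wavelet decomposition
energy = cD ** 2                            # per-coefficient detail energy
baseline = energy[: len(energy) // 4].mean()        # pre-fault reference
k = int(np.argmax(energy > 10 * baseline))          # first coefficient over threshold
print(f"transient detected near t = {2 * k / fs * 1000:.1f} ms")
```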
Abstract:
The study of mortality differentials has been an important tool to guide public health policies, since it better describes the occurrence of deaths in a population. This research aims to identify disparities in adult mortality according to educational level and sex in the large Brazilian regions and, consequently, for Brazil as a whole. A vast literature has shown that people with more education tend to have a lower risk of death. Studies on inequalities in mortality by level of education in Brazil are still very specific, and very little is known about mortality by educational level in the country, owing to the poor completion of the schooling field in the death records of the Mortality Information System (MIS) of the Ministry of Health. This data source has shown improvement in the coverage of under-reporting over the last decade; however, the schooling field of the death record is still frequently neglected (left blank for about 30% of the deaths registered in Brazil in 2010). Given this scenario, this work contributes to the national literature on the behavior of adult mortality differentials by using data from the new mortality variable of the 2010 Census (CD 2010) and assuming, for the deaths occurring in a household, the educational characteristics of the head of that household; it is therefore assumed that the probability of dying is homogeneous within the household. Death counts were corrected, through the Adjusted Extinct Generations method, only for records from households whose head had schooling at the levels "No Instruction and Incomplete Elementary Education" or "Complete Elementary and Incomplete Secondary Education". With the corrected deaths, the probabilities of dying between ages 15 and 60 were calculated (see the note after this abstract), as well as life tables by sex and level of education for all regions of Brazil. The results corroborate the literature: the more educated the population, the greater the life expectancy. In all Brazilian regions, the life expectancy of the female population is greater than that of men at all levels of schooling. With respect to the probabilities of death by age between 15 and 60 years, the probabilities follow a gradient, being highest for the least educated. At older ages (from 70 years), this behavior presents another pattern: the lowest level of education has the lowest probabilities in the North, Northeast, South and Midwest regions, though not in the Southeast region.
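A hedged note on the adult-mortality measure used above (standard life-table notation, assumed rather than quoted from the dissertation): the probability of dying between exact ages 15 and 60 is

$$ {}_{45}q_{15} \;=\; 1 - \frac{l_{60}}{l_{15}}, $$

where $l_x$ is the number of life-table survivors at exact age $x$.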
Abstract:
This work aims to understand how cloud computing fits into the government IT context and the decision agenda, in the light of the multiple streams model, considering the current status of public IT policies, the dynamics of agenda setting for the area, the interface between the various institutions, and existing initiatives on the use of cloud computing in government. To this end, a qualitative study was conducted through interviews with two groups, one of policy makers and the other of IT managers. As analysis techniques, this work used content analysis and document analysis, with some results presented as word clouds. Among the main findings is the overregulation of the area, usually scattered across various agencies of the federal government, which hinders the work of managers. A lack of knowledge of standards, government programs, regulations and guidelines was identified, notably a lack of understanding of the TI Maior program, the perceived ineffectiveness of the National Broadband Plan in the view of the respondents, and the influence of the Marco Civil da Internet as an element that can hold back advances in the use of cloud computing in the Brazilian government. Also noteworthy is the bureaucratization of the acquisition of IT goods and services, which in many cases limits technological advances. Regarding the influence of the actors, it was not possible to identify the presence of a policy entrepreneur, and a lack of political force was noticed; the political stream was affected only by changes within the government. Fragmentation was a major factor in weakening the formation of the agenda for the theme. Information security was pointed out by the respondents as the main limitation, coupled with the lack of training of public servants. In terms of benefits, resource savings stand out, followed by improved efficiency. Finally, the discussion about cloud computing needs to advance within the public sphere, whereas international experience is already far ahead, framing cloud computing as an element responsible for the improvement of processes and services and for the economy of public resources.
Abstract:
Metamaterials have attracted great attention in recent decades due to their electromagnetic properties, which are not found in nature. Since metamaterials are synthesized by inserting artificially manufactured inclusions in a specified homogeneous medium, the researcher can work with a wide collection of independent parameters, for example the electromagnetic properties of the material. An investigation of the properties of ring resonators was performed, as well as of metamaterials. A study of the major theories that explain superconductivity is presented: the BCS theory, the London equations and the Two-Fluid Model are the theories that support the application of superconducting microstrip antennas (the London equations are sketched after this abstract). This thesis therefore presents theoretical, numerical and computational analyses using a full-wave formalism, through the application of the Transverse Transmission Line (LTT) method in the Fourier Transform Domain (FTD). The LTT is a full-wave method which, as a rule, obtains the electromagnetic fields in terms of the transverse components of the structure. The superconducting patch is included through the complex resistive boundary condition. Results for the resonant frequency as a function of the antenna parameters are obtained. To validate the analysis, computer programs were developed in Fortran, simulations were performed with commercial software, and curves were drawn with commercial software and MATLAB; the conventional patch is compared with the superconducting one, and a metamaterial substrate with a conventional one, combining the substrate with the patch and observing the improvements in both cases.
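As a hedged reference for the superconductivity models named above (standard textbook forms, not quoted from the thesis): the two London equations relate the supercurrent density $\vec{J}_s$ to the electromagnetic fields,

$$\frac{\partial \vec{J}_s}{\partial t} = \frac{n_s e^{2}}{m}\,\vec{E}, \qquad \nabla\times\vec{J}_s = -\frac{n_s e^{2}}{m}\,\vec{B},$$

where $n_s$ is the superconducting carrier density; the second equation yields the Meissner effect, with penetration depth $\lambda_L = \sqrt{m/(\mu_0 n_s e^{2})}$. In the Two-Fluid Model, the superconducting fraction of carriers is commonly taken as $n_s/n = 1-(T/T_c)^{4}$, which is what makes the surface impedance of the patch temperature-dependent.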