959 results for "Demonstrativos complexos"


Relevance: 10.00%

Abstract:

The auditory system is composed of a set of relays from the outer ear to the cerebral cortex. In mammals, the central auditory system comprises the cochlear nuclei, the superior olivary complex, the inferior colliculus and the medial geniculate body. In this study, the rhombencephalic auditory centers, namely the cochlear nuclear complex and the superior olivary complex, were evaluated from cytoarchitectural and neurochemical perspectives, through Nissl staining and immunohistochemical techniques to reveal neuron-specific nuclear protein (NeuN), glutamate (Glu), glutamic acid decarboxylase (GAD), enkephalin (ENK), serotonin (5-HT), choline acetyltransferase (ChAT) and the calcium-binding proteins calbindin (CB), calretinin (CR) and parvalbumin (PV). The common marmoset (Callithrix jacchus), a small primate native to the Brazilian Atlantic Forest, was used as the experimental animal. The results showed that the cochlear nuclear complex is composed of the anteroventral, posteroventral and dorsal nuclei, and that the superior olivary complex is constituted by the lateral and medial superior olivary nuclei and the nucleus of the trapezoid body. Glu-, GAD-, ENK-, ChAT-, CB-, CR- and PV-immunoreactive cells, fibers and terminals, besides 5-HT-immunoreactive terminals only, were found non-homogeneously distributed in all nuclei of both complexes. The emerging data are discussed in a comparative and functional context and represent an important contribution to the knowledge of the central auditory pathways in the common marmoset, and hence in primates.

Relevance: 10.00%

Abstract:

In this work, the coordination compounds trans-[Co(cyclam)Cl2]Cl, trans-Na[Co(cyclam)(tios)2], trans-[Co(en)2Cl2]Cl and trans-Na[Co(en)2(tios)2], where tios = thiosulfate and en = ethylenediamine, were synthesized and their spectroscopic and electrochemical characteristics were studied. The compounds were characterized by elemental analysis (CHN), absorption spectroscopy in the infrared region (IR), UV-visible absorption spectroscopy, luminescence spectroscopy and electrochemistry (cyclic voltammetry). Elemental analysis (CHN) suggests the following structures for the complexes: trans-[Co(cyclam)Cl2]Cl.6H2O and trans-Na[Co(cyclam)(tios)2].7H2O. The electrochemical analysis, comparing the cathodic potential (Ec) processes of the complexes trans-[Co(cyclam)Cl2]Cl and trans-[Co(en)2Cl2]Cl, indicated a more negative value (-655 mV) for the second complex, suggesting a greater electron donation to the metal center in this complex, which can be attributed to the closer proximity of the ethylenediamine nitrogen atoms to the metal compared with the metal-nitrogen distances in cyclam. Because of the constraining effect of the macrocyclic ring on the metal center, the metal-nitrogen bonds in cyclam are not as close as those of ethylenediamine, which makes these two ligands behave differently. Similar behavior is also observed for the complexes in which the chlorides are replaced by the thiosulfate ligand, trans-Na[Co(en)2(tios)2] (-640 mV) and trans-Na[Co(cyclam)(tios)2] (-376 mV). In the UV-visible absorption spectra, the LMCT charge-transfer band (ligand π → metal d*) appears in trans-Na[Co(cyclam)(tios)2] at 350 nm (π tios → d* Co3+) and in trans-Na[Co(en)2(tios)2] at 333 nm (π tios → d* Co3+), at longer wavelengths than in the precursor complex trans-[Co(cyclam)Cl2]Cl (318 nm, π Cl → d* Co3+), indicating that electron density is more easily transferred to the metal in the complexes with the thiosulfate ligand. The infrared analysis showed the coordination of the thiosulfate ligand to the metal through bands in the 620-635 cm-1 region, features that indicate monodentate coordination via the sulfur atom. The νN-H bands of the complexes with ethylenediamine appear at 3283 and 3267 cm-1, and those of the complex with cyclam at 3213 and 3133 cm-1. The luminescence spectrum of trans-Na[Co(cyclam)(tios)2] presents a charge-transfer band at 397 nm and d-d bands at 438, 450, 467, 481 and 492 nm.

Relevance: 10.00%

Abstract:

This work is a study of coordination compounds by the quantum theory of atoms in molecules (QTAIM), based on the topological analysis of the electron density of molecular systems obtained both theoretically and experimentally. The coordination chemistry topics studied are the chelate effect, bent titanocenes and the chemical bond in coordination complexes. The chelate effect was investigated according to topological and thermodynamic parameters. The exchange of monodentate ligands for polydentate ligands on the same transition metal increases the stability of the complex through both entropic and enthalpic contributions. In some cases, the latter contributed more to the stability of the complex than entropy. This enthalpic contribution is explained by the topological analysis of the M-ligand bonds, where the polydentate complexes had higher values of the electron density at the bond critical point, the Laplacian of the electron density at the bond critical point and the delocalization index (the number of electrons shared between two atoms). The second chapter studied bent titanocenes with bulky cyclopentadienyl-derivative π-ligands. The topological study showed the presence of secondary interactions between atoms of the π-ligands or between atoms of the π-ligand and the σ-ligand. It was found that, for titanocenes with a small difference in point-group symmetry and with bulky ligands, there was a nearly linear relationship between stability and the delocalization index involving the ring carbon atoms (Cp) and the titanium. However, titanocene stability is not related only to the interaction between the Ti and the C atoms of the Cp ring; secondary interactions also play an important role in the stability of voluminous titanocenes. The third chapter deals with the chemical bond in coordination compounds by means of QTAIM. The quantum theory of atoms in molecules has so far classified bonds and chemical interactions into two categories: closed-shell interactions (ionic bond, hydrogen bond, van der Waals interaction, etc.) and shared interactions (covalent bond). Based on topological parameters such as the electron density, the Laplacian of the electron density and the delocalization index, among others, the chemical bond in coordination compounds was classified as intermediate between closed-shell and shared interactions.
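The closed-shell/shared/intermediate distinction described above can be sketched as a toy classifier over bond-critical-point descriptors. This is a minimal illustration only: the numeric cutoffs below are hypothetical, not taken from the thesis, and real QTAIM analyses are carried out with dedicated programs on computed wavefunctions.

```python
# Toy QTAIM-style bond classification. Cutoff values are illustrative
# assumptions chosen to separate the textbook regimes, not published criteria.
def classify_interaction(rho_bcp, laplacian_bcp, deloc_index):
    """Classify a bond critical point (BCP) from three topological descriptors:
    electron density, its Laplacian, and the delocalization index."""
    if laplacian_bcp < 0:
        # charge concentrated between the nuclei: shared interaction
        return "shared (covalent)"
    if rho_bcp < 0.05 and deloc_index < 0.3:   # illustrative cutoffs
        # low density and few shared electrons: closed-shell interaction
        return "closed-shell (ionic / H-bond / vdW)"
    # positive Laplacian but appreciable density/sharing: intermediate case
    return "intermediate (dative / coordination)"

print(classify_interaction(0.30, -0.8, 1.0))   # typical covalent C-C values
print(classify_interaction(0.02, 0.06, 0.1))   # typical hydrogen-bond values
print(classify_interaction(0.09, 0.35, 0.6))   # typical metal-ligand values
```

The third call lands in the "intermediate" branch, which is exactly the regime the thesis proposes for metal-ligand bonds in coordination compounds.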

Relevance: 10.00%

Abstract:

In this work, the quantitative analysis of glucose, triglycerides and cholesterol (total and HDL) in both rat and human blood plasma was performed without any kind of sample pretreatment, using near-infrared (NIR) spectroscopy combined with multivariate methods. For this purpose, different techniques and algorithms for data pre-processing, variable selection and multivariate regression model building were compared, such as partial least squares regression (PLS), nonlinear regression by artificial neural networks (ANN), interval partial least squares regression (iPLS), the genetic algorithm (GA) and the successive projections algorithm (SPA), among others. For the determinations in rat blood plasma samples, the variable selection algorithms showed satisfactory results both for the correlation coefficients (R²) and for the values of the root mean square error of prediction (RMSEP) for the three analytes, especially for triglycerides and cholesterol-HDL. The RMSEP values for glucose, triglycerides and cholesterol-HDL obtained with the best PLS models were 6.08, 16.07 and 2.03 mg dL-1, respectively. For the determinations in human blood plasma, in contrast, the predictions obtained by the PLS models were unsatisfactory, with a nonlinear tendency and the presence of bias. ANN regression was then applied as an alternative to PLS, given its ability to model data from nonlinear systems. The root mean square errors of monitoring (RMSEM) for glucose, triglycerides and total cholesterol, for the best ANN models, were 13.20, 10.31 and 12.35 mg dL-1, respectively. Statistical tests (F and t) suggest that NIR spectroscopy combined with multivariate regression methods (PLS and ANN) is capable of quantifying these analytes (glucose, triglycerides and cholesterol) even when they are present in highly complex biological fluids, such as blood plasma.
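The RMSEP and R² figures of merit used above to compare the calibration models can be computed in a few lines. The values below are hypothetical stand-ins for reference (wet-chemistry) and NIR-predicted glucose concentrations, not data from the study.

```python
import math

def rmsep(y_ref, y_pred):
    """Root mean square error of prediction, in the same units as y (mg/dL)."""
    n = len(y_ref)
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(y_ref, y_pred)) / n)

def r2(y_ref, y_pred):
    """Coefficient of determination of predictions against reference values."""
    mean = sum(y_ref) / len(y_ref)
    ss_res = sum((r - p) ** 2 for r, p in zip(y_ref, y_pred))
    ss_tot = sum((r - mean) ** 2 for r in y_ref)
    return 1 - ss_res / ss_tot

glucose_ref  = [90.0, 110.0, 130.0, 150.0]   # hypothetical reference values
glucose_pred = [95.0, 108.0, 128.0, 155.0]   # hypothetical NIR/PLS predictions
print(round(rmsep(glucose_ref, glucose_pred), 2))   # → 3.81
```

A lower RMSEP on an independent prediction set is what distinguishes the best PLS and ANN models reported in the abstract.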

Relevance: 10.00%

Abstract:

The present work has as its main objective to contribute to the coordination chemistry of the ligand kojic acid, with the synthesis and characterization of the homoleptic compounds [Al(kj)3], [Fe(kj)3], [Fe(kj)2], [Cu(kj)2] and [Ru(kj)3], and the new heteroleptic complexes trans-K2[Fe(kj)2(CN)2] and trans-Na2[Ru(kj)2(CN)2]. The obtained compounds were characterized by vibrational spectroscopy in the infrared region (IR) and electronic spectroscopy in the ultraviolet and visible region (UV-Vis). The infrared results indicated bidentate coordination of the kojic acid ligand, owing to the reduction of the stretching frequencies of the carbonyl and double bonds compared with the free ligand, for all the complexes obtained. The presence of new vibrational modes indicated the change of symmetry of the molecules in the newly synthesized compounds. Additionally, the presence of vibrational modes assigned to metal-oxygen bonds also helped confirm the coordination of the ligand to the metal ions. With this technique, it was also possible to correlate the number of vibrational modes in the 1400-900 cm-1 region with the geometry of the compounds. The heteroleptic compounds exhibited νC≡N at 2065 and 2053 cm-1, respectively, for trans-K2[Fe(kj)2(CN)2] and trans-Na2[Ru(kj)2(CN)2], indicating coordination of the cyano ligand to the FeII and RuII metal ions. By comparing the obtained values with literature data, it was possible to identify the isomerism of the complexes as trans. Concerning the electronic spectroscopy results, studies of the pH variation of kojic acid provided information on the distribution of electron density in the molecule, showing the characteristic spectral profiles of the kojic species in its protonated form (Hkj, kojic acid), with two bands at 215 and 269 nm, and in its deprotonated form (kj-), with bands at 226 and 315 nm. The electronic spectra obtained for all complexes in aqueous medium exhibited, in the ultraviolet region, variations in the energies assigned to kojic acid intraligand transitions, while in the visible region only transitions assigned to charge transfer in the iron and ruthenium complexes were identified.

Relevance: 10.00%

Abstract:

The aim of this study was to produce an asymmetric, biocompatible and biodegradable chitosan membrane modified by contact with a poly(acrylic acid) solution on one of its sides, at room temperature and at 60 °C. The pure chitosan membrane, as well as the ones treated with poly(acrylic acid), were characterized by infrared spectroscopy (FTIR-ATR) at angles of 39°, 45° and 60°, swelling capacity in water, thermal analysis (TG/DTG), scanning electron microscopy (SEM) and permeation experiments using metronidazole at 0.1% and 0.2% as a model drug. The results confirmed the presence of ionic interaction between chitosan and poly(acrylic acid) through the formation of a polyelectrolyte complex (PEC). They also showed that such interactions were more effective at 60 °C, since this temperature is above the glass transition temperature of chitosan, which makes the diffusion of poly(acrylic acid) easier, and that the two treated membranes were asymmetric, more thermally stable and less permeable to metronidazole than the pure chitosan membrane.

Relevance: 10.00%

Abstract:

Nowadays, the importance of using software processes is well consolidated and is considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge in the software industry. It involves adapting software process models to the reality of their projects, and it must also promote the reuse of past experiences in the definition and development of software processes for new projects. The adequate management and execution of software processes can bring better quality and productivity to the produced software systems. This work explores the use and adaptation of consolidated software product line techniques to promote the management of the variabilities of software process families. To achieve this aim: (i) a systematic literature review is conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines is proposed and developed; and finally (iii) empirical studies and a controlled experiment assess and compare the proposed annotative approach against a compositional one. One study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption and systematic variability management. Another study, a comparative quantitative study, considered internal attributes of the specification of software process lines, such as modularity, size and complexity. Finally, the last study, a controlled experiment, evaluated the effort of use and the understandability of the investigated approaches when modeling and evolving specifications of software process lines. The studies bring evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
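The core idea of an annotative approach, as opposed to a compositional one, is that variant elements stay inline in a single process model, tagged with presence conditions that are resolved against a chosen configuration. The sketch below illustrates that idea with a hypothetical process fragment; the activity and feature names are invented for illustration and are not taken from the thesis.

```python
# Hypothetical software process line: each activity carries an optional
# annotation (presence condition). None means the activity is mandatory.
process_line = [
    ("Requirements elicitation", None),             # mandatory element
    ("Formal specification",    "safety_critical"), # annotated variant
    ("Code review",             "large_team"),      # annotated variant
    ("Unit testing",            None),              # mandatory element
]

def resolve(elements, features):
    """Derive a concrete process: keep an activity if it is mandatory
    or if its presence condition is among the enabled features."""
    return [name for name, cond in elements if cond is None or cond in features]

# Deriving a process for a safety-critical project with a small team:
print(resolve(process_line, {"safety_critical"}))
```

In a compositional approach the same variability would instead live in separate fragment modules that are woven together, which is the trade-off (modularity versus traceability and granularity) the studies above compare.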

Relevance: 10.00%

Abstract:

Although some individual techniques of supervised Machine Learning (ML), also known as classifiers or classification algorithms, supply solutions that are, most of the time, considered efficient, experimental results obtained with large pattern sets, or with sets containing an expressive amount of irrelevant or incomplete data, show a decrease in the precision of these techniques. In other words, such techniques cannot recognize patterns efficiently in complex problems. With the intention of improving the performance and efficiency of these ML techniques, the idea arose of making several ML algorithms work jointly, thus giving origin to the term Multi-Classifier System (MCS). An MCS has as its components different ML algorithms, called base classifiers, and combines the results obtained by these algorithms to reach the final result. For an MCS to perform better than its base classifiers, the results obtained by each base classifier must present a certain diversity, in other words, a difference between the results obtained by each classifier that composes the system. It makes no sense to have an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is always the search to improve the results obtained by this type of system. Aiming at this improvement and at better consistency in the results, as well as at a larger diversity among the classifiers of an MCS, methodologies have recently been investigated that are characterized by the use of weights, or confidence values. These weights can describe the importance a certain classifier had when associating each pattern with a determined class. The weights are also used, together with the outputs of the classifiers, during the recognition (use) phase of the MCS. There are different ways of calculating these weights, and they can be divided into two categories: static weights and dynamic weights. The first category is characterized by values that do not change during the classification process, unlike the second category, whose values are modified during the classification process. In this work, an analysis is made to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs in comparison with individual systems. Moreover, an analysis is made of the diversity obtained by the MCSs, in order to verify whether there is some relation between the use of weights in MCSs and different levels of diversity.
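The static-weight combination described above can be sketched as weighted voting over the base classifiers' outputs. The labels and weight values below are hypothetical; with dynamic weights, the weight list would be recomputed per pattern (for example, from each classifier's local accuracy) instead of being fixed in advance.

```python
def weighted_vote(predictions, weights):
    """Combine base-classifier outputs by weighted voting: each classifier
    adds its weight to the score of the label it predicted."""
    scores = {}
    for label, w in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)

# Three hypothetical base classifiers disagree on a pattern:
preds   = ["spam", "ham", "spam"]
weights = [0.5, 0.9, 0.3]   # static weights, e.g. validation accuracies
print(weighted_vote(preds, weights))   # → ham
```

Note that the weighted result ("ham", score 0.9) overturns the unweighted majority ("spam", score 0.8): this is precisely how confidence values can change an MCS's decision relative to plain voting.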

Relevance: 10.00%

Abstract:

OBJECTIVES: to analyse the performance of students from the 2nd to the 5th grade of elementary school in tests of metalinguistic skills and reading, according to psycholinguistic and cognitive-linguistic criteria, and to verify similarities and differences between the analyses. METHODS: 120 students from the 2nd to the 5th grade of municipal schools participated, of both genders, aged seven to 12 years, divided into 4 groups of 30 students per grade. The students were given tests of metalinguistic skills and reading. RESULTS: there was a statistically significant difference between the groups in metalinguistic skills and in the decoding rules for real words and pseudowords for all variables in real-word reading, except for Refusal-type errors, with higher means for Unsuccessful Sound Attempts and Failure to Apply Orthographic Rules, indicating that these error types were the most frequent. In pseudoword reading, there was a statistically significant difference in Unsuccessful Sound Attempts, indicating that the students performed worse when decoding words that required the use of phonological information. CONCLUSION: adopting psycholinguistic or cognitive-linguistic criteria in the assessment of real-word and pseudoword reading, together with the assessment of metalinguistic skills, provides a basis for understanding how students process the complex principles of the Brazilian Portuguese writing system, besides giving the support needed to understand the specific difficulties presented by students, guiding the speech-language pathologist towards precise goals in their practice.

Relevance: 10.00%

Abstract:

Interval mathematics is a mathematical theory that originated in the 1960s to address questions of accuracy and efficiency that arise in the practice of scientific computing and in the solution of numerical problems. Classical approaches to computability theory deal with discrete problems (for example, over the natural numbers, the integers, strings over a finite alphabet, graphs, etc.). However, fields of pure and applied mathematics deal with problems involving real and complex numbers; this happens, for example, in numerical analysis, dynamical systems, computational geometry and optimization theory. Thus, a computational approach to continuous problems is desirable, or even necessary, to deal formally with analog computations and with scientific computing in general. The literature offers different approaches to computability over the real numbers, but an important difference among these approaches lies in how the real number is represented. There are basically two lines of study of computability over the continuum. In the first, an approximation of the output with arbitrary precision is computed from a reasonable approximation of the input [Bra95]. The other line of research on real computability was developed by Blum, Shub and Smale [BSS89]. In this approach, the so-called BSS machines, a real number is viewed as a finished entity, and the computable functions are generated from a class of basic functions (in a manner similar to the partial recursive functions). In this dissertation we study the BSS model, used to characterize a theory of computability over the real numbers, and extend it to model computability over the space of real intervals. We thus arrive at an approach to interval computability that is epistemologically different from the one studied by Bedregal and Acióly [Bed96, BA97a, BA97b], in which a real interval is viewed as the limit of rational intervals, and the computability of a real interval function depends on the computability of a function over the rational intervals.
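The basic interval operations that such a theory builds on can be sketched in a few lines. This is a minimal illustration of interval arithmetic only, not the BSS construction itself, and it omits the outward (directed) rounding that a faithful implementation over floating-point endpoints requires.

```python
class Interval:
    """Closed real interval [lo, hi]; outward rounding omitted for brevity."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a, b] * [c, d]: take the extremes of the four endpoint products
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1, 2)
y = Interval(-1, 3)
print(x + y)   # → [0, 5]
print(x * y)   # → [-2, 6]
```

Every result interval is guaranteed to enclose all values obtainable from points of the operand intervals, which is the enclosure property that makes interval methods useful for rigorous scientific computing.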

Relevance: 10.00%

Abstract:

BACKGROUND AND OBJECTIVES: Recent research has focused on the biochemical and structural plasticity of the nervous system resulting from tissue injury. The mechanisms involved in the transition from acute to chronic pain are complex and involve the interaction of receptor systems and the flow of intracellular ions, second-messenger systems and new synaptic connections. The aim of this article was to discuss the new mechanisms involved in peripheral and central sensitization. CONTENTS: Tissue injury causes an increased response of nociceptors, called sensitization or facilitation. These phenomena begin after the local release of inflammatory mediators and the activation of immune system cells or of specific receptors in the peripheral and central nervous system. CONCLUSIONS: Tissue and neuron injuries result in nociceptor sensitization and in facilitation of central and peripheral nerve conduction.

Relevance: 10.00%

Abstract:

Intending to understand how the human mind operates, some philosophers and psychologists began to study rationality. Theories were built from those studies, and nowadays that interest has been extended to many other areas, such as computer engineering and computer science, with a slight shift of goal: to understand the mind's operational process and apply it to agent modelling, making possible implementations (in software or hardware) under the agent-oriented paradigm, where agents are able to deliberate their own plans of action. In computer science, the sub-area of multiagent systems has progressed through several works concerning artificial intelligence, computational logic, distributed systems, game theory and even philosophy and psychology. This work shows how to obtain a logical formalisation extension of a rational agent architecture model called BDI (based on Bratman's philosophical theory), in which agents are capable of deliberating actions from their beliefs, desires and intentions. The formalisation of this model is called BDI logic, and it is a modal logic (in general, a branching-time logic) with three accessibility relations: B, D and I. Here, two possible extensions are shown that transform BDI logic into a modal-fuzzy logic, where the formulae and the accessibility relations can be evaluated by values from the interval [0,1].
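The move from Boolean to fuzzy evaluation can be illustrated with a tiny sketch in which each attitude assigns a degree in [0,1] to a proposition, and conjunction is evaluated with min, in the Zadeh style. The propositions, degrees and connective choice below are illustrative assumptions, not the thesis's actual semantics.

```python
# Hypothetical fuzzy-BDI snapshot: attitudes map propositions to degrees
# in [0, 1] instead of Boolean truth values.
belief    = {"door_open": 0.8}
desire    = {"leave_room": 0.9}
intention = {"leave_room": 0.6}

def degree_of(attitude, proposition):
    """Degree to which an attitude holds of a proposition (0.0 if absent)."""
    return attitude.get(proposition, 0.0)

# Degree to which the agent both desires and intends to leave the room,
# using min as the fuzzy conjunction (one common choice, not the only one):
commitment = min(degree_of(desire, "leave_room"),
                 degree_of(intention, "leave_room"))
print(commitment)   # → 0.6
```

In the full logic, the accessibility relations B, D and I themselves also become fuzzy, so the modal operators quantify over worlds weighted by degrees rather than over a crisp set of accessible worlds.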

Relevance: 10.00%

Abstract:

The use of increasingly complex software applications is demanding greater investment in the development of such systems to ensure applications of better quality. Therefore, new techniques are being used in Software Engineering to make the development process more effective. Among these approaches, we highlight Formal Methods, which use formal languages that are strongly based on mathematics and have well-defined semantics and syntax. One of these languages is Circus, which can be used to model concurrent systems. It was developed from the union of concepts from two other specification languages: Z, which specifies systems with complex data, and CSP, which is normally used to model concurrent systems. Circus has an associated refinement calculus, which can be used to develop software in a precise and stepwise fashion. Each step is justified by the application of a refinement law (possibly with the discharge of proof obligations). Sometimes, the same laws can be applied in the same manner in different developments, or even in different parts of a single development. A strategy to optimize this calculus is to formalise these applications as a refinement tactic, which can then be used as a single transformation rule. CRefine was developed to support the Circus refinement calculus. However, before the work presented here, it did not provide support for refinement tactics. The aim of this work is to provide tool support for refinement tactics. For that, we develop a new module in CRefine, which automates the process of defining and applying refinement tactics formalised in the tactic language ArcAngelC. Finally, we validate the extension by applying the new module in a case study, which used the refinement tactics in a refinement strategy for the verification of SPARK Ada implementations of control systems. In this work, we apply our module in the first two phases of this strategy.

Relevance: 10.00%

Abstract:

Formal methods and software testing are tools to obtain and control software quality. When used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the process of verification and validation of a system. Model-Based Testing techniques allow tests to be generated from other software artifacts, such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate tests of better quality, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an example of a B specification from industry. Based on this case study, we obtained subsidies to improve it. In our work, we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. Besides, we have implemented a tool to automate the method, and we have submitted it to more complex case studies.
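The boundary value analysis the method builds on can be sketched for a numeric precondition. The precondition below (a counter constrained to a range) is a hypothetical example, not one from the B specifications mentioned in the abstract; the point is only which input values the technique selects.

```python
def boundary_values(lo, hi):
    """Classic boundary-value picks for an integer precondition lo <= x <= hi:
    both boundaries and their inner neighbours feed positive test cases;
    the just-outside values feed negative test cases."""
    positive = [lo, lo + 1, hi - 1, hi]
    negative = [lo - 1, hi + 1]
    return positive, negative

# Hypothetical operation precondition: 0 <= counter <= 10.
pos, neg = boundary_values(0, 10)
print(pos)   # → [0, 1, 9, 10]
print(neg)   # → [-1, 11]
```

Positive cases exercise the operation where its precondition holds; negative cases, derived from the same precondition, check how the machine behaves just outside it, which is exactly the positive/negative split the method derives from the invariant and precondition.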

Relevance: 10.00%

Abstract:

This work aims to develop modules that increase the computational power of the Java-XSC library, where XSC is an acronym for "Language Extensions for Scientific Computation". This library is an extension of the Java programming language that provides standard functions and elementary mathematical routines useful for interval computation. In this study, two modules were added to the library, namely a complex number module and a complex interval module, which, together with the original numerical modules, are designed to allow applications, for example in the engineering field, to run on devices executing Java programs.
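One common way to represent a "complex interval" is as a rectangle in the complex plane: one real interval for the real part and one for the imaginary part. The sketch below illustrates that representation and its addition rule; the class and its names are illustrative (written in Python for consistency with the other sketches here) and are not the Java-XSC API.

```python
class ComplexInterval:
    """Rectangular complex interval: a real-part interval [re_lo, re_hi]
    and an imaginary-part interval [im_lo, im_hi]."""
    def __init__(self, re_lo, re_hi, im_lo, im_hi):
        self.re = (re_lo, re_hi)
        self.im = (im_lo, im_hi)

    def __add__(self, other):
        # Addition works componentwise on the two real intervals.
        return ComplexInterval(self.re[0] + other.re[0], self.re[1] + other.re[1],
                               self.im[0] + other.im[0], self.im[1] + other.im[1])

    def __repr__(self):
        return f"[{self.re[0]}, {self.re[1]}] + [{self.im[0]}, {self.im[1]}]i"

a = ComplexInterval(1, 2, 0, 1)
b = ComplexInterval(-1, 1, 2, 3)
print(a + b)   # → [0, 3] + [2, 4]i
```

Multiplication is more involved (the product of two rectangles is not a rectangle, so an enclosing rectangle must be computed), which is part of what makes a dedicated complex interval module worthwhile.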