941 results for: Modelagem estrutural de ambientes informacionais digitais


Relevance: 30.00%

Publisher:

Abstract:

The qualitative techniques available for scenario modelling have been recognized as severely limited, a limitation already evident at the very start of the process, in the initial conception phase. The main restrictions have been: • the lack of a tool to test the internal structural consistency of the model, either because economic relations with sound theoretical grounding but no clean interface with the environment are used, or because binary variations are adopted for validation tests; • the a priori fixing of the possible scenarios, usually classified under three adjectives (optimistic, most likely and pessimistic) that are biased precisely by the attributes of the people who supply this information. This work deals with a tool for the interaction between a technique that supports model generation, based on relational logic with four-valued variations, and expectations grounded in the decision maker's knowledge of the real world. The aim is to build a qualitative exploratory forecasting system for regional electricity demand, in which the scenarios are obtained by an essentially intuitive and descriptive procedure. This type of approach, presented by J. Gershuny, is intended mainly to provide methodological support for the consistency of qualitatively generated scenarios. The model is developed and structured in stages, starting from a simple relation and proceeding with the inclusion of variables and effects that improve the model's explanatory power. The work presents a set of relations for regional electricity demand in the main consumption sectors (residential, commercial and industrial), as well as the scenarios resulting from the most likely variations of their exogenous components. It concludes that this technique is useful in models that: • include relevant social variables that are difficult to measure; • rely on external consistency between the results generated by the model and those expected for decision making; • assign to the decision maker the responsibility for understanding the foundations of the model's conceptual structure. Once this procedure is adopted, the author recommends that the model be validated through an iterative adjustment procedure with the participation of the decision maker. Quantitative techniques may then be adopted, with the model serving as a consistency reference.
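As a rough illustration of what a relation with four-valued qualitative variations can look like in practice, the sketch below (Python) propagates qualitative changes of hypothetical exogenous drivers to a demand variable. The levels, variable names and weights are assumptions for illustration only and do not reproduce Gershuny's formalism.

# Toy sketch (not Gershuny's actual formalism): propagating four-valued
# qualitative changes through signed relations. Changes take one of four
# levels: -2 (large decrease), -1 (small decrease), +1 (small increase),
# +2 (large increase). Names and weights below are hypothetical.

def combine(effects):
    """Aggregate signed qualitative effects and clamp to the four levels."""
    total = sum(effects)
    if total >= 2:
        return 2
    if total == 1:
        return 1
    if total == -1:
        return -1
    if total <= -2:
        return -2
    # A zero sum is ambiguous in a four-valued scheme; flag it for the
    # decision maker instead of forcing a value.
    return None

# Hypothetical relations: residential demand responds positively to income
# and population, negatively to tariff increases.
relations = {"income": +1, "population": +1, "tariff": -1}

# One qualitative scenario for the exogenous drivers.
scenario = {"income": 1, "population": 2, "tariff": -1}

effects = [relations[v] * scenario[v] for v in relations]
print("residential demand change:", combine(effects))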

Relevance: 30.00%

Publisher:

Abstract:

Prediction models based on non-parametric estimation are still under development and have spread through the quantitative community. Their main characteristic is that they do not assume known probability distributions a priori; instead, past data serve as the basis for constructing the distributions themselves. We implement, for the Brazilian market, the pooled non-parametric estimators of Sam and Jiang (2009) for the drift and diffusion functions of the stochastic process of the instantaneous interest rate, using series of interest rates of different maturities supplied by one-day interbank deposit futures contracts (DI1). The estimators were built within the framework of kernel estimation, whose optimization requires a specific form of the kernel function. In this work the Epanechnikov kernel was used, together with a smoothing parameter (bandwidth) that is essential for finding the optimal probability density function, i.e. the one that yields the most efficient estimate in terms of the MISE (Mean Integrated Squared Error) when the model is tested with the traditional k-fold cross-validation method. Caveats apply when the series are not long enough: the structural break in the diffusion process of the Brazilian interest rate from 2006 onwards forces a reduction in the length of the series at the cost of reducing the model's predictive power. The structural break reflects a maturing of the Brazilian market and largely explains the unsatisfactory performance of the proposed estimator.
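The kernel-regression building block described above can be illustrated with a short sketch. The code below is a minimal sketch, not the pooled estimator of Sam and Jiang (2009): it estimates a conditional mean with the Epanechnikov kernel and selects the bandwidth by k-fold cross-validation, with a synthetic rate series standing in for the DI1 data.

import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75 * (1 - u^2) on |u| <= 1."""
    return 0.75 * np.clip(1.0 - u**2, 0.0, None)

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate of E[y | x = x0] with bandwidth h."""
    w = epanechnikov((x0 - x) / h)
    return np.sum(w * y) / np.sum(w) if np.sum(w) > 0 else np.nan

def cv_mse(x, y, h, k=5):
    """k-fold cross-validation error for bandwidth h (smaller is better)."""
    idx = np.arange(len(x))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        pred = [nw_estimate(xi, x[train], y[train], h) for xi in x[fold]]
        errs.append(np.nanmean((y[fold] - np.array(pred)) ** 2))
    return np.mean(errs)

# Illustrative use on synthetic short-rate increments: drift(r) ~ E[dr | r].
rng = np.random.default_rng(0)
r = rng.uniform(0.05, 0.15, 500)            # hypothetical rate levels
dr = 0.5 * (0.10 - r) / 252 + 0.01 * rng.normal(size=500) * np.sqrt(1 / 252)
best_h = min([0.005, 0.01, 0.02, 0.04], key=lambda h: cv_mse(r, dr, h))
print("selected bandwidth:", best_h)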

Relevance: 30.00%

Publisher:

Abstract:

MOREIRA, Luciana Moreira; SILVA, Armando Malheiro da. Impacto das tecnologias digitais nas bibliotecas universitárias: reflexões sobre o tema. Informação e Sociedade: Estudos, João Pessoa, v. 19, n. 3, p. 125-132, 2009.

Relevance: 30.00%

Publisher:

Abstract:

This study analyses the mediation practices developed by librarians in university libraries in the face of digital technologies. Its general objective is to analyse, comparatively, the impact and mediation of digital technologies in the operation of university libraries in Portugal and in the Northeast region of Brazil. The research covered 10 Brazilian federal universities and 12 Portuguese public universities, with a total of 115 librarians as participating subjects. It is a qualitative study that adopts the quadripolar method, recommended for work in the Applied Social Sciences and, in particular, in Information Science. The interaction among the poles that underpin this method (epistemological, theoretical, technical and morphological) strengthened and gave fluidity to the questions under study. The answers to the questionnaires applied to the librarians, as well as the analysis of the websites of the libraries surveyed, were interpreted through a theoretical framework based on three main points: the paradigmatic questions surrounding Information Science, the analysis of post-custodial informational and scientific mediation, and the Information and Communication Technologies present in the libraries. The main results show that the impact of digital technologies on university libraries is regarded as positive by both Brazilian and Portuguese librarians, with emphasis on two points: the innovation of information carriers and the self-sufficiency of users. The biggest difference appears in the social dimension: Brazilian librarians show greater concern with the informational barriers caused by economic, social and educational conditions, a concern felt less intensely by Portuguese librarians, who access digital technologies more easily. In conclusion, the analysis of the impact and mediation of digital technologies in the libraries surveyed points to an evolution in the mediation practices of the university libraries of Portugal and of the Brazilian Northeast, and to a convergence in the work of Portuguese and Brazilian librarians.

Relevance: 30.00%

Publisher:

Abstract:

Internationalization, as a fundamentally strategic organizational phenomenon, received theoretical contributions from several schools that, throughout the 1960s, 70s and 80s, developed behavioral and economic approaches to explain the process. The behavioral approach treats the phenomenon as a gradual process seen from the perspective of executives' behavior (JOHANSON and VAHLNE, 1977; HALLÉN and WIEDERSHEIM-PAUL, 1979; CZINKOTA, 1985). This phenomenon, in permanent theoretical and managerial evolution, created the opportunity for this investigation, whose goal is to analyse the impact of organizational capabilities and of the external environment on the international performance of exporting firms. Two streams of analysis were used as the theoretical basis for understanding international performance: Strategic Management (Industrial Organization and the Resource-Based View) and International Business (economic and behavioral currents). An explanatory, cross-sectional, survey-based study was carried out with 150 exporting companies operating in the Northeast of Brazil. A conceptual model was built with eight constructs and eight research hypotheses representing the effects of external factors on international performance. The data were processed using Exploratory Factor Analysis and Structural Equation Modeling. The structural equation model was re-specified and estimated by the maximum-likelihood method until adequate goodness-of-fit indexes were achieved. As the main theoretical contribution, organizational and physical resources were identified that show the importance of developing management skills, learning capability and the capability to establish strategic alliances abroad. Knowledge, both from an operational point of view and in its strategic application, gives the organization market-positioning conditions that can create sustainable competitive advantages and impact the performance of international companies.
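As a hedged illustration of the Exploratory Factor Analysis step mentioned above, the sketch below runs an EFA with varimax rotation on a hypothetical 150 x 16 indicator matrix using scikit-learn; the item counts, number of factors and data are assumptions, and the subsequent structural model would typically be estimated with a dedicated SEM package rather than this code.

import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical survey matrix: 150 firms x 16 Likert-type indicators.
rng = np.random.default_rng(1)
X = rng.normal(size=(150, 16))

# Standardize the items, then extract a fixed number of latent factors.
Z = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=1)
scores = fa.fit_transform(Z)      # factor scores per firm
loadings = fa.components_.T       # item-by-factor loading matrix

print(loadings.shape)             # (16, 4)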

Relevance: 30.00%

Publisher:

Abstract:

Knowledge of the native prokaryotes in hazardous locations favors the application of biotechnology for bioremediation. Cultivation-independent strategies and metagenomics contribute to further microbiological knowledge, enabling studies with non-cultivable organisms on the native microbiological status and its potential role in the bioremediation of, for example, polycyclic aromatic hydrocarbons (PAHs). Considering that the mangrove biome is a fragile and critical interface bordering the ocean, this study characterizes the native mangrove microbiota with the potential to biodegrade PAHs, using a molecular biomarker for detection and assessing bacterial diversity by PCR in areas under the influence of oil companies in the Potiguar Petroleum Basin (BPP). PcaF, a metabolic enzyme, was chosen as the molecular biomarker in a PCR-DGGE screening for prokaryotes that degrade PAHs. The PCR-DGGE fingerprints obtained from the Paracuru-CE, Fortim-CE and Areia Branca-RN samples revealed fluctuations in the microbial communities according to the sampling periods and in response to the impact of oil. In the analysis of the oil industry's interference with the microbial communities, it was observed in Areia Branca-RN and Paracuru-CE that oil is a determinant of microbial diversity, whereas Fortim-CE probably suffers no direct influence from oil activity. To obtain data for a better understanding of the transport and biodegradation of PAHs, in silico modeling and simulation studies were conducted, producing 3-D models of proteins involved in the degradation of phenanthrene and in the transport of PAHs, as well as a 3-D model of the PcaF enzyme used as the molecular marker in this study. Docking studies with substrates and products were carried out for a better understanding of the transport mechanism and catalysis of PAHs.

Relevance: 30.00%

Publisher:

Abstract:

In this dissertation, the theoretical principles governing molecular modeling were applied to the electronic characterization of the oligopeptide α3 and its variants (5Q, 7Q)-α3, as well as to the quantum description of the interaction between the aminoglycoside hygromycin B and the 30S subunit of the bacterial ribosome. In the first study, the linear, neutral dipeptides that make up the mentioned oligopeptides were modeled and then optimized toward structures of lower potential energy and appropriate dihedral angles. Three successive geometry-optimization steps, based on classical Newtonian mechanics, semi-empirical methods and density functional theory (DFT), explored the energy landscape of each dipeptide in the search for minimum-energy structures. Finally, the best conformers were described in terms of their electrostatic potential, ionization energy (amino acids), frontier molecular orbitals and hopping term. From the hopping terms described in this study, it was possible in subsequent work to characterize the charge-transport properties of these peptide models, envisioning a new biosensor technology capable of diagnosing amyloid diseases, which are related to the accumulation of misfolded proteins, based on the conductivity displayed by the patient's proteins. In the second part of this dissertation, quantum molecular modeling was used to study the interaction energy of a ribosomal aminoglycoside antibiotic with its receptor. Hygromycin B (hygB) is an aminoglycoside antibiotic that affects ribosomal translocation by direct interaction with the small subunit of the bacterial ribosome (30S), specifically with nucleotides in helix 44 of the 16S ribosomal RNA (16S rRNA). Due to the strongly electrostatic character of this binding, an energetic investigation of the binding mechanism of this complex was proposed using different values of the dielectric constant (ε = 0, 4, 10, 20 and 40), values widely used to study the electrostatic properties of biomolecules. For this, increasing radii centered on the hygB centroid were measured from the 30S-hygB crystal structure (1HNZ.pdb), and the individual interaction energy of each enclosed nucleotide was determined by quantum calculations using the molecular fractionation with conjugate caps (MFCC) strategy. It was noticed that the dielectric constants underestimated the individual interaction energies, allowing the convergence state to be reached quickly; only for ε = 40, however, does the total drug-receptor binding energy stabilize at r = 18 Å, which provides an appropriate binding pocket because it encompasses the main residues that interact most strongly with hygB - C1403, C1404, G1405, A1493, G1494, U1495, U1498 and C1496. Thus, a dielectric constant of about 40 is suitable for the treatment of systems with many electrical charges. By comparing the individual binding energies of the 16S rRNA nucleotides with the experimental assays that determine the minimum inhibitory concentration (MIC) of hygB, it is believed that the residues with high binding energies generate bacterial resistance to the drug when mutated. By the same reasoning, since residues with low interaction energy do not effectively influence the affinity of hygB for its binding site, there would be no loss of effectiveness if they were replaced.
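The radius-convergence procedure described above can be sketched in a few lines: per-residue MFCC interaction energies are accumulated inside growing pocket radii until adding a new shell no longer changes the total. The energies and distances below are illustrative values, not the dissertation's results; only the residue labels come from the abstract, and the last (distant) entry is hypothetical.

# Hypothetical per-residue interaction energies (kcal/mol) and distances (Å)
# from the hygB centroid; values are illustrative only.
residues = [
    ("C1403", 5.2, -8.1), ("C1404", 6.0, -6.7), ("G1405", 6.8, -5.9),
    ("A1493", 7.5, -7.4), ("G1494", 8.3, -6.2), ("U1495", 9.1, -4.8),
    ("C1496", 10.4, -3.9), ("U1498", 11.2, -3.1),
    ("distant residue (hypothetical)", 19.5, -0.3),
]

def binding_energy_within(radius, data):
    """Sum per-residue interaction energies for residues inside the radius."""
    return sum(e for _, d, e in data if d <= radius)

# Grow the pocket radius and watch the cumulative energy converge: when a
# new shell changes the total by less than a tolerance, the pocket is closed.
previous = None
for r in range(4, 21, 2):
    total = binding_energy_within(r, residues)
    converged = previous is not None and abs(total - previous) < 0.5
    print(f"r = {r:2d} Å   E = {total:6.1f} kcal/mol   converged: {converged}")
    previous = total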

Relevance: 30.00%

Publisher:

Abstract:

This study aims to determine the influence of digital inclusion on the school performance of public high school students in the metropolitan area of Natal, through the pedagogical use of computers and the Internet. Throughout the paper we try to answer the question: does the pedagogical use of computers connected to the Internet contribute to improving the academic performance of students in public schools in the Natal metropolitan region (RMNatal)? To answer the research question, we draw on the INEP databases on school infrastructure and on school performance rates. Technical procedures were performed to obtain the relationship between Internet access and school performance; the school settings were then classified with respect to digital inclusion and crossed with the approval, age-grade distortion and failure rates. The survey results indicate that, according to the classification established (included school settings, deficient school settings, adverse school settings and excluded school settings), the settings that prevail in the metropolitan area of Natal are the excluded ones.
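A minimal sketch of the data crossing described above is given below: a hypothetical infrastructure table is merged with a hypothetical performance table and group means are compared for "included" versus other settings. The column names and values are placeholders, not INEP's actual field names.

import pandas as pd

# Hypothetical extracts from the school census and performance tables.
infra = pd.DataFrame({
    "school_id": [1, 2, 3, 4],
    "has_computer_lab": [True, True, False, False],
    "has_internet": [True, False, False, True],
})
rates = pd.DataFrame({
    "school_id": [1, 2, 3, 4],
    "approval_rate": [88.0, 79.5, 71.2, 83.4],
    "failure_rate": [7.1, 12.3, 18.0, 9.8],
    "age_grade_distortion": [21.0, 30.5, 41.2, 25.7],
})

# Cross infrastructure with performance and compare group means, mirroring
# the comparison between digitally included and other school settings.
merged = infra.merge(rates, on="school_id")
merged["included"] = merged["has_computer_lab"] & merged["has_internet"]
print(merged.groupby("included")[["approval_rate", "failure_rate",
                                  "age_grade_distortion"]].mean())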

Relevance: 30.00%

Publisher:

Abstract:

This study investigates teacher training and the cognitive practice of teachers in a basic education school that adopted the One Computer per Student Project (UCA) in its school routine. Its relevance lies in providing directions for the continuation of the project's training activities and in guiding teachers in their pedagogical practice with the one-to-one laptop model. The thesis defended is that educator training for the social use of digital media (especially the laptops of the UCA Project) opens space for establishing new socio-technical relationships, new social and professional practices, new identity components and a process of reflexivity and reconstruction of the knowledge needed to teach. We reaffirm the importance of reflexivity and of the appropriation of digital culture for the better development of teaching practice with Information and Communication Technologies (ICTs), focusing on the social and professional uses of the technology. The study is qualitative and consists of procedural tracking based on principles of ethnographic research. The procedures and methodological tools used were: intensive observation of school environments, documental analysis, a focus group, semi-structured questionnaires and semi-structured individual interviews. The research was held in a public school in the city of Parnamirim - RN. The sample comprises 17 teachers from elementary school I and II, Youth and Adult Education and high school, who went through the UCA training process and introduced the laptops into their teaching. The research corpus is structured from the messages produced during data collection and is analyzed according to the principles of Content Analysis as specified by Laurence Bardin (2011). The theoretical references include studies by Tardif (2000; 2011), Pimenta (2009), Gorz (2004, 2005), Giddens (1991), Dewey (1916), Bourdieu (1994; 1999) and Freire (1996; 2005), among others. The analysis indicates a process of reconstruction and revision of the knowledge needed to teach and work in digital culture, guided by the experience of the subjects investigated. The reconstructed knowledges are revealed through a categorization process: the groups of knowledges "technical knowledges", "didactic-methodological knowledges" and "knowledges of professionalization" were built on the assumption of the appropriation of digital culture in the educational context. The analysis confirms the emergence of new forms of sociability as teachers acquire other ways of acting on and thinking about ICTs, despite an environment adverse to reflexivity shared among the teachers. Based on the concept of appropriation present in the data analysis, it also reveals the construction of meanings of belonging and the transformation of individuals along social routes, from the interweaving of teaching practice with digital culture. Finally, it emphasizes the importance of training for the use of ICTs that goes beyond instrumentation, that is, beyond what we call "technical knowledges", and that takes shared reflection as its structural basis, opening space for resignification and the reconstruction of new knowledges and practices, truly allowing the teacher to live an experience capable of producing socio-technical transformations in their relationships.

Relevance: 30.00%

Publisher:

Abstract:

Masonry structures usually have low tensile strength, so designing for flexural actions can result in high reinforcement ratios, the specification of high unit and prism strengths, structural members with larger cross-sections and changes to the structural arrangement so that masonry members can be used at all. The main objective of this study is to evaluate the stiffness, the distribution of internal forces and the effect of the arrangement of horizontal elements (girders) and vertical elements (counterforts) on the behavior of block masonry retaining walls. For this purpose, numerical models were built for typical retaining wall arrangements, varying the number and placement of horizontal and vertical elements and including elements that simulate the reaction of the soil supporting the wall foundation. The numerical modeling adopts a macro-modeling strategy in which units, mortar and grout are discretized as a standard volume that represents the elastic behavior of the masonry. The numerical results were also compared with those of the simplified models usually adopted in the bending design of masonry elements. The results show horizontal displacements, distributions of principal and shear stresses, and bending moment diagrams. From the analysis it was concluded that the quantity and distribution of the girders are both important factors in the flexural behavior of the panel and that including the foundation significantly changes the behavior of the wall, especially its horizontal displacements; a new way of considering the flange section of the counterforts is also proposed.
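For reference, the simplified bending model that such numerical results are usually compared against can be written in a few lines. The sketch below computes the Rankine active thrust and the base bending moment of a cantilever wall per metre of length; the soil parameters and wall height are hypothetical, and the study's macro-model is not reproduced here.

import math

# Simplified cantilever check (Rankine active pressure), per metre of wall.
gamma = 18.0              # soil unit weight (kN/m^3), hypothetical
phi = math.radians(30)    # soil friction angle, hypothetical
H = 3.0                   # retained height (m), hypothetical

Ka = math.tan(math.pi / 4 - phi / 2) ** 2     # Rankine active coefficient
p_base = Ka * gamma * H                       # pressure at the base (kPa)
thrust = 0.5 * Ka * gamma * H ** 2            # resultant per metre (kN/m)
M_base = Ka * gamma * H ** 3 / 6.0            # base bending moment (kN·m/m)

print(f"Ka = {Ka:.3f}, thrust = {thrust:.1f} kN/m, M = {M_base:.1f} kN·m/m")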

Relevance: 30.00%

Publisher:

Abstract:

Meeting dependability requirements is essential for industrial applications, since faults may cause failures whose consequences result in economic losses, environmental damage or harm to people. Given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase; the proposal can also be easily adapted to the maintenance and expansion stages of a network. The methodology uses graph theory and the fault tree formalism to automatically create, from a given industrial wireless network topology, an analytical model on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), device importance measures, redundancy aspects and common cause failures. The proposal is independent of any particular tool for quantitatively evaluating the target metrics; for validation purposes, however, a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm for generating minimal cut sets, originally applied in graph theory, was adapted to the fault tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the proposed methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster and mesh topologies. Scenarios with common cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To verify the scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results show the applicability of the proposal to networks typically found in industrial environments.
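The core evaluation step, computing a dependability metric from minimal cut sets, can be sketched as follows. The topology, failure probabilities and cut sets below are hypothetical; the sketch shows the rare-event bound and the exact inclusion-exclusion value for independent component failures, whereas the thesis relies on SHARPE for the quantitative evaluation.

from itertools import combinations

# Hypothetical device failure probabilities for a small line topology:
# gateway -> router R1 -> router R2 -> field device.
p_fail = {"GW": 0.01, "R1": 0.02, "R2": 0.02, "DEV": 0.05}

# Minimal cut sets: smallest device groups whose joint failure breaks the
# path from the gateway to the field device (here, every single device).
cut_sets = [{"GW"}, {"R1"}, {"R2"}, {"DEV"}]

def cut_probability(cut):
    """P(all devices in a cut set fail), assuming independent failures."""
    p = 1.0
    for dev in cut:
        p *= p_fail[dev]
    return p

def unreliability(cuts):
    """Exact top-event probability by inclusion-exclusion over cut sets."""
    total = 0.0
    for k in range(1, len(cuts) + 1):
        for combo in combinations(cuts, k):
            union = set().union(*combo)
            total += ((-1) ** (k + 1)) * cut_probability(union)
    return total

print(f"rare-event bound : {sum(map(cut_probability, cut_sets)):.4f}")
print(f"exact (incl-excl): {unreliability(cut_sets):.4f}")
print(f"reliability      : {1 - unreliability(cut_sets):.4f}")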

Relevance: 30.00%

Publisher:

Abstract:

Modern wireless systems employ adaptive techniques to provide high throughput while meeting the desired coverage, Quality of Service (QoS) and capacity. An alternative to further enhance the data rate is to apply cognitive radio concepts, in which a system is able to exploit unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques such as Automatic Modulation Classification (AMC) can be helpful, or even vital, in such scenarios. AMC implementations usually rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g., Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It extracts similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, for example, the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
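A minimal sketch of the correntropy coefficient is given below: centered correntropy estimates with a Gaussian kernel are normalized to produce a similarity index in [-1, 1], and a toy comparison against two candidate symbol templates stands in for the AMC decision. The kernel width, signals and decision rule are illustrative assumptions, not the paper's exact estimator.

import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u**2 / (2 * sigma**2))

def centered_correntropy(x, y, sigma):
    """Centered correntropy estimate U(x, y) with a Gaussian kernel."""
    joint = np.mean(gaussian_kernel(x - y, sigma))
    marginal = np.mean(gaussian_kernel(x[:, None] - y[None, :], sigma))
    return joint - marginal

def correntropy_coefficient(x, y, sigma=1.0):
    """Similarity index in [-1, 1] built on higher-order error statistics."""
    uxy = centered_correntropy(x, y, sigma)
    uxx = centered_correntropy(x, x, sigma)
    uyy = centered_correntropy(y, y, sigma)
    return uxy / np.sqrt(uxx * uyy)

# Toy comparison: a received stream is matched against two candidate
# symbol templates; the larger coefficient wins (illustrative decision only).
rng = np.random.default_rng(2)
bpsk = rng.choice([-1.0, 1.0], size=1000)
received = bpsk + 0.3 * rng.normal(size=1000)          # AWGN channel
other = rng.choice([-1.0, 1.0], size=1000) / np.sqrt(2)
print("vs matching template :", correntropy_coefficient(received, bpsk))
print("vs unrelated template:", correntropy_coefficient(received, other))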

Relevance: 30.00%

Publisher:

Abstract:

The traditional perimeter-based approach to computer network security (the castle-and-moat model) hinders the progress of enterprise systems and fosters, in both administrators and users, the delusion that systems are protected. To deal with the new range of threats, a new data-safety-oriented paradigm, called de-perimeterisation, began to be studied in the last decade. One of the requirements for implementing the de-perimeterised security model is the definition of a safe and effective mechanism for federated identity. This work seeks to fill that gap by presenting the specification, modelling and implementation of a federated identity mechanism based on the combination of SAML and X.509 digital certificates stored on smart cards, following the A3 standard of ICP-Brasil (the Brazilian official certificate authority and PKI).
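As a hedged sketch of the verification step a relying party performs on a signed assertion, the code below loads an X.509 certificate and checks an RSA signature with the Python cryptography package. Real SAML processing verifies an XML Signature (xmldsig) over canonicalized XML rather than raw bytes; the file path, key type and variables here are placeholder assumptions.

# Minimal sketch: assumes an RSA certificate and pre-canonicalized bytes.
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

with open("subject_cert.pem", "rb") as f:          # placeholder path
    cert = x509.load_pem_x509_certificate(f.read())

def assertion_is_authentic(signed_bytes: bytes, signature: bytes) -> bool:
    """Check that the assertion bytes were signed by the certificate holder."""
    try:
        cert.public_key().verify(
            signature, signed_bytes, padding.PKCS1v15(), hashes.SHA256()
        )
        return True
    except InvalidSignature:
        return False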