986 results for Electronic spreadsheets -- Software


Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)


Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)



Relevance:

30.00%

Publisher:

Abstract:

In this work we study the potential for electronic conduction in the polymer BDT (4H-cyclopenta[2,1-b:3,4-b']1,3-benzodithiole). Conjugated polymers are usually studied by computing their density of states under various types and levels of doping. The Hückel method is the most widely used; it relies on the separability of the sigma and pi bonds, which holds when the molecule under study is planar. Most conjugated polymers are planar and fall within this approximation. The BDT monomer, however, has a non-planar geometry because it contains bonds involving sp3 orbitals. To work around this problem, the B3J program was developed, which takes all valence orbitals (s, px, py, and pz) into account and computes the density of states of polymeric systems. The band structure of BDT was studied with this software. We computed the density of states of the neutral system and at several doping levels, with random and ordered distributions of the defects, for both n-type and p-type doping. The behavior of the squared coefficient of the wave-function expansion was obtained for polymers of up to 20 monomers. These calculations used geometries from the AM1 and PM3 methods. We also obtained the absorption spectra of oligomers in order to infer the behavior of the polymer. Geometry optimizations were performed with the semi-empirical methods AM1, PM3, and ZINDO/S, as well as with DFT. A further goal of this monograph is the study of BDT tetramers as electronic devices: these oligomers were optimized at several values of electric potential, with donor and acceptor molecules inserted into their chains to induce an increase in the dipole moment.
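To make the band-structure idea above concrete, here is a minimal tight-binding sketch in the spirit of the Hückel method (this is not the B3J program, which also handles the sigma orbitals): it builds the π-orbital Hamiltonian of a hypothetical 20-site linear chain and histograms its eigenvalues into a crude density of states. The values of α and β are illustrative placeholders, not fitted parameters for BDT.

```python
import numpy as np

def huckel_chain(n_sites, alpha=0.0, beta=-1.0):
    """Hückel Hamiltonian for a linear chain of n_sites pi orbitals."""
    h = np.zeros((n_sites, n_sites))
    np.fill_diagonal(h, alpha)      # Coulomb integral (on-site energy)
    i = np.arange(n_sites - 1)
    h[i, i + 1] = beta              # resonance integral between neighbours
    h[i + 1, i] = beta
    return h

# Energy levels of a 20-monomer chain and a crude density of states
levels = np.linalg.eigvalsh(huckel_chain(20))
dos, edges = np.histogram(levels, bins=10)
```

For a chain the spectrum has the closed form α + 2β cos(kπ/(N+1)), so the sketch can be checked analytically; a real calculation would replace the chain topology with the monomer's actual connectivity.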

Relevance:

30.00%

Publisher:

Abstract:

There is evidence that sport can trigger the onset of postural patterns specific to each modality, regardless of geopolitical aspects, the social and cultural habits of everyday life, and ethnicity. Changes in flexibility are cited as possible precursors of a decreased range of motion, thereby harming the mechanics of the lower limbs and gait. Objective: to analyze changes in posture and flexibility in young soccer players. Methods: 51 youngsters aged 14 to 18 years were assessed, all soccer players registered with the municipality of Presidente Prudente, SP, and with the youth categories of Grêmio of Presidente Prudente. Data were collected through postural assessment software, the sit-and-reach bench flexibility test, and the muscle-length tests proposed by Kendall et al.; anthropometric data were also collected and later compared statistically with the results. The results were organized into spreadsheets, after which the statistical analysis was performed. Values are expressed as measures of central tendency and variability, as well as medians and 95% confidence intervals. The comparison across height and BMI profiles was made by analysis of variance complemented by the Tukey test, with differences considered statistically significant when p < 0.05. Results: 64% of the subjects were classified as having normal posture, as were 70.59% of the athletes for flexibility; regarding the center of gravity, 100% of the sample showed anterior displacement of the trunk and 86.28% a deviation of the center of gravity to the left, showing a tendency toward some postural deviations in the group assessed. Conclusion: from the results we can conclude that there was a significant relationship between the postural angle of the right leg and the left angle of the pelvis with BMI, and also the ankle angle... (Complete abstract: click electronic access below)
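The statistical protocol described (analysis of variance at p < 0.05, complemented by a post-hoc test) can be sketched as follows. The data, group means, and group sizes here are invented for illustration only; the sketch shows just the one-way ANOVA step across three hypothetical height-profile groups.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical flexibility scores (cm) for three height-profile groups;
# the means (28, 30, 25) and the group size of 17 are illustrative only.
groups = [rng.normal(mu, 3.0, size=17) for mu in (28.0, 30.0, 25.0)]

# One-way ANOVA: does at least one group mean differ?
f_stat, p_value = stats.f_oneway(*groups)
significant = bool(p_value < 0.05)   # the study's significance threshold
```

In practice the ANOVA would be followed by Tukey's HSD to identify which pairs of profiles differ, as the abstract describes.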

Relevance:

30.00%

Publisher:

Abstract:

This work aims to give greater visibility to the issue of software security. As is often remarked at security conferences, much of the IT (Information Technology) staff and, more specifically, the IS (Information Security) staff is unfamiliar with the subject, and, thanks to the spread of mobile computing and cloud computing, this lack of deeper knowledge is becoming increasingly worrisome. It also aims to have applications developed in a secure manner, prioritizing the security of the information they process. It attempts to demonstrate secure coding techniques, the principles of software security, the means of identifying software vulnerabilities, cutting-edge software exploitation techniques, and mitigation mechanisms. Nowadays, security staff are in charge of most security testing of applications, audits, and pentests, and it is undeniable that the so-called security experts most often come from the computer networking field, having little experience in software development and programming. As a result, the development process does not consider security, owing to the developer's lack of knowledge of the subject, while security testing could be improved if security experts had greater know-how in application development. Given this problem, the goal here is to integrate information security with software development, spreading the process of secure software development. To achieve this, a Linux distribution with proof-of-concept applicati... (Complete abstract: click electronic access below)
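As one concrete instance of the secure coding techniques the work discusses, the sketch below contrasts string-concatenated SQL with a parameterized query. The table, user, and injection payload are illustrative and are not taken from the distribution described in the abstract.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_role_unsafe(name):
    # VULNERABLE: user input is concatenated straight into the SQL text,
    # so the input can rewrite the query itself.
    query = f"SELECT role FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_role_safe(name):
    # Parameterized query: the input is bound as data, never parsed as SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"   # classic injection string
```

With the payload above, the unsafe version returns every row (the condition collapses to a tautology), while the safe version simply finds no user literally named `' OR '1'='1`.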

Relevance:

30.00%

Publisher:

Abstract:

Producing a building budget quickly and accurately is a challenge faced by companies in the sector. Cost estimation is performed from the quantity takeoff, and this quantification has historically been done through analysis of the design, the scope of work, and project information contained in 2D drawings, text files, and spreadsheets. In many cases this method proves flawed, influencing management decisions, since it is closely coupled to time and cost management. In this scenario, this work makes a critical analysis of the conventional quantity takeoff process, based on quantification from 2D drawings, against the use of the software Autodesk Revit 2016, which applies the concepts of building information modeling to automate quantity takeoff from a 3D model of the construction. It is noted that the 3D modeling process should be aligned with the goals of the budget. The use of BIM programs provides several benefits compared to the traditional quantity takeoff process, representing gains in productivity, transparency, and assertiveness.
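The automated takeoff that BIM tools derive from a 3D model can be illustrated with a toy sketch (this is not the Revit API): each element carries parametric dimensions, and quantities are aggregated per element type the way a model schedule would, instead of being measured by hand from 2D drawings. The wall types and dimensions are invented for illustration.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Wall:
    # Simplified stand-in for a BIM wall element with parametric dimensions
    wall_type: str
    length_m: float
    height_m: float
    thickness_m: float

    def volume_m3(self):
        return self.length_m * self.height_m * self.thickness_m

def quantity_takeoff(elements):
    """Aggregate material volume per wall type, as a BIM schedule would."""
    totals = defaultdict(float)
    for e in elements:
        totals[e.wall_type] += e.volume_m3()
    return dict(totals)

# Hypothetical model: three walls of two types
model = [
    Wall("masonry", 4.0, 2.8, 0.15),
    Wall("masonry", 3.0, 2.8, 0.15),
    Wall("concrete", 5.0, 2.8, 0.20),
]
takeoff = quantity_takeoff(model)
```

The point of the sketch is that the quantities update automatically when the model changes, which is where the productivity and assertiveness gains described above come from.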

Relevance:

30.00%

Publisher:

Abstract:

Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults, and stories abound of spreadsheet faults that have led to multi-million dollar losses. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software.

Relevance:

30.00%

Publisher:

Abstract:

Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software. We have been working to address this problem by finding ways to provide at least some of the benefits of formal software engineering techniques to end-user programmers. In this talk, focusing on the spreadsheet application paradigm, I present several of our approaches, concentrating on methodologies that utilize source-code-analysis techniques to help end-users build more dependable spreadsheets. Behind the scenes, our methodologies use static analyses such as dataflow analysis and slicing, together with dynamic analyses such as execution monitoring, to support user tasks such as validation and fault localization. I show how, to accommodate the user base of spreadsheet languages, an interface to these methodologies can be provided in a manner that does not require an understanding of the theory behind the analyses, yet supports the interactive, incremental process by which spreadsheets are created. Finally, I present empirical results gathered in the use of our methodologies that highlight several cost-benefit trade-offs, and many opportunities for future work.
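The dataflow analysis and slicing mentioned in the talk abstract can be sketched for spreadsheets with a toy backward slice: starting from a cell, collect every cell whose value can transitively flow into it. The sheet contents and the crude lexical reference scan are illustrative simplifications of what a real spreadsheet analysis (which would parse formulas properly) does.

```python
import re

# Toy spreadsheet: each cell maps to a literal or a formula over other cells
sheet = {
    "A1": "10",
    "A2": "20",
    "B1": "=A1+A2",
    "B2": "=B1*2",
    "C1": "=A2-5",
}

def refs(formula):
    """Cells referenced by a formula (crude lexical scan, not a real parser)."""
    if not formula.startswith("="):
        return set()          # literals reference nothing
    return set(re.findall(r"[A-Z]+[0-9]+", formula))

def backward_slice(cell):
    """All cells whose values can flow into `cell` (transitive dataflow)."""
    seen, stack = set(), [cell]
    while stack:
        for dep in refs(sheet[stack.pop()]):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen
```

For example, the slice of B2 is {B1, A1, A2}: a fault observed in B2 can only originate in those cells, which is exactly the kind of information a fault-localization tool surfaces to the end-user.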