980 results for Electronic spreadsheets -- Software
Abstract:
In this work we study the potential for electronic conduction in the polymer BDT (1,3-benzodithiole 4H-cyclopenta[2,1-b:3,4-b']). Conjugated polymers are usually studied by computing their density of states under various types and levels of doping. The Hückel method is the most widely used; it relies on the separability of the sigma and pi bonds, which holds when the molecule under study is planar. Most conjugated polymers are planar and fall within this approximation. The BDT monomer, however, has a non-planar geometry because it contains bonds involving sp3 orbitals. To work around this problem, the B3J program was developed; it takes all valence orbitals (s, px, py and pz) into account and computes the density of states of polymeric systems. The band structure of BDT was studied with this software. We calculated the density of states of the neutral system and at several doping levels, with random and ordered distributions of defects, for both n-type and p-type doping. The behavior of the squared coefficients of the wave-function expansion was obtained for polymers of up to 20 monomers. These calculations used geometries from the AM1 and PM3 methods. We also obtained absorption spectra of oligomers in order to infer the behavior of the polymer. Geometry optimizations were carried out with the semi-empirical methods AM1, PM3 and ZINDO/S, as well as with DFT. A further goal of this monograph is the study of BDT tetramers as electronic devices. These oligomers were optimized at several values of applied electric potential, with donor and acceptor molecules inserted into their chains to induce an increase in the dipole moment.
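The B3J program and its parameters are not reproduced here; as a rough, self-contained sketch of the kind of calculation described (a Hückel/tight-binding Hamiltonian diagonalized to obtain a density of states), one could write something like the following. The chain length, on-site energy, hopping integral and broadening are illustrative assumptions, not values from the work.

```python
# Minimal sketch of a Hückel/tight-binding density-of-states calculation,
# in the spirit of the study described above. All parameters (chain length,
# on-site energy, hopping integral, broadening) are illustrative assumptions.
import numpy as np

N = 200          # number of sites (orbitals in the chain), assumed
alpha = 0.0      # on-site (Coulomb) energy, eV, assumed
beta = -2.5      # nearest-neighbour hopping (resonance) integral, eV, assumed

# Hückel-like Hamiltonian for a linear chain
H = (np.diag([alpha] * N)
     + np.diag([beta] * (N - 1), 1)
     + np.diag([beta] * (N - 1), -1))
eigvals = np.linalg.eigvalsh(H)

def dos(energies, eigenvalues, sigma=0.05):
    """Density of states via Gaussian broadening of the eigenvalues."""
    diff = energies[:, None] - eigenvalues[None, :]
    return np.exp(-0.5 * (diff / sigma) ** 2).sum(axis=1) / (sigma * np.sqrt(2 * np.pi))

E = np.linspace(eigvals.min() - 1.0, eigvals.max() + 1.0, 1000)
g = dos(E, eigvals)
print(f"Band edges: {eigvals.min():.2f} eV to {eigvals.max():.2f} eV")
```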
Abstract:
There is evidence that sport can trigger the onset of postural patterns specific to each modality, regardless of geopolitical aspects, social and cultural habits of everyday life, and ethnicity. Changes in flexibility are cited as possible precursors of decreased range of motion, thereby harming the mechanics of the lower limbs and gait. Objective: The objective of this study was to analyze changes in posture and flexibility in young soccer players. Methods: 51 youngsters, aged between 14 and 18 years, were assessed; all were soccer players registered with the municipality of Presidente Prudente, SP, or with the youth divisions of Grêmio de Presidente Prudente. Data were collected using postural assessment software, flexibility tests (the bench test and posterior-chain tests), and the anterior-chain muscle-length tests proposed by Kendall et al.; anthropometric data were also collected and later compared statistically with the results. The results were organized into spreadsheets so that the statistical analysis could be performed. Values are expressed as measures of central tendency and variability, as well as medians and 95% confidence intervals. The comparison for each profile of height and BMI was made by means of analysis of variance complemented by the Tukey test. Differences were considered statistically significant when P < 0.05. Results: In the sample studied, 64% of the subjects were classified as having normal posture, as were 70.59% of the athletes for flexibility; regarding the center of gravity, 100% of the sample showed anterior displacement of the trunk and 86.28% showed deviation of the center of gravity to the left, indicating a tendency toward certain postural deviations in the group assessed. Conclusion: From the results we can conclude that there was a significant relationship of BMI with the postural angle of the right leg, the angle of the left pelvis, and also the ankle angle... (Complete abstract: click electronic access below)
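The statistical treatment described (analysis of variance complemented by the Tukey test at P < 0.05) can be illustrated with standard Python tooling; the data frame below uses made-up values and column names, not the study's dataset.

```python
# Sketch of the reported statistical analysis: one-way ANOVA followed by
# Tukey's HSD post-hoc test. The data and column names are hypothetical.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical BMI values grouped by postural classification
df = pd.DataFrame({
    "bmi":   [21.4, 22.0, 19.8, 23.1, 24.5, 20.9, 22.7, 25.2, 21.1],
    "group": ["normal", "normal", "normal", "deviation", "deviation",
              "normal", "deviation", "deviation", "normal"],
})

groups = [g["bmi"].values for _, g in df.groupby("group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.3f}, P = {p_value:.3f}")

# Post-hoc pairwise comparisons (Tukey HSD) at alpha = 0.05
tukey = pairwise_tukeyhsd(endog=df["bmi"], groups=df["group"], alpha=0.05)
print(tukey)
```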
Abstract:
This work aims to give greater visibility to the issue of software security. As is often pointed out at security conferences, much of the IT (Information Technology) staff and, more specifically, the IS (Information Security) staff lacks deeper knowledge of this subject, and, with the spread of mobile computing and cloud computing, this gap is becoming increasingly worrisome. The work also aims to encourage applications to be developed in a secure manner, prioritizing the security of the information they process. It attempts to demonstrate secure coding techniques, the principles of software security, the means of identifying software vulnerabilities, cutting-edge software exploitation techniques, and mitigation mechanisms. Nowadays, security staff are in charge of most security tests in applications, audits, and pentests, and it is undeniable that the so-called security experts most often come from the computer networking field, having little experience in software development and programming. As a result, the development process does not take security into account, owing to the developers' lack of knowledge on the subject, while security testing could be improved if security experts had greater know-how in application development. Given this problem, the goal here is to integrate information security with software development, spreading the practice of secure software development. To achieve this, a Linux distribution with proof-of-concept applicati... (Complete abstract: click electronic access below)
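As a small, self-contained illustration of the kind of secure coding technique the abstract alludes to (not an example taken from the work itself), the sketch below contrasts an SQL-injection-prone query with a parameterized one using Python's standard sqlite3 module.

```python
# Illustrative secure-coding example (not from the work itself): avoiding
# SQL injection by using parameterized queries instead of string concatenation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "alice' OR '1'='1"   # a typical injection payload

# Vulnerable: attacker-controlled input is concatenated into the SQL text
vulnerable = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the driver passes the value as a bound parameter, never as SQL
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()

print("vulnerable query returned:", vulnerable)  # leaks every row
print("parameterized query returned:", safe)     # returns no rows
```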
Abstract:
Preparing a building budget quickly and accurately is a challenge faced by companies in the sector. The cost estimation process starts from the quantity takeoff, and this quantification has historically been carried out through analysis of the project, the scope of work, and project information contained in 2D drawings, text files and spreadsheets. In many cases this method proves flawed, influencing management decision-making, since it is closely coupled to time and cost management. In this scenario, this work presents a critical analysis of the conventional quantity takeoff process, based on quantification from 2D drawings, compared with the use of the software Autodesk Revit 2016, which applies the concepts of building information modeling (BIM) to automated quantity takeoff from the 3D model of the construction. It is noted that the 3D modeling process should be aligned with the goals of budgeting. The use of BIM programs provides several benefits compared with the traditional quantity takeoff process, representing gains in productivity, transparency and assertiveness.
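The automated takeoff described amounts to extracting quantities per element category from the model and aggregating them into budget items. A minimal, tool-agnostic sketch of that aggregation step follows; the element list, categories and unit costs are hypothetical, and a real workflow would obtain these quantities from the BIM model via the authoring tool's API or an exported schedule rather than hard-coded data.

```python
# Tool-agnostic sketch of automated quantity takeoff aggregation.
# In practice the elements would come from the BIM model (e.g. a schedule
# exported from Revit); here they are hard-coded, hypothetical values.
from collections import defaultdict

elements = [
    {"category": "Walls",   "quantity": 12.5, "unit": "m3"},
    {"category": "Walls",   "quantity": 8.2,  "unit": "m3"},
    {"category": "Slabs",   "quantity": 30.0, "unit": "m3"},
    {"category": "Windows", "quantity": 6,    "unit": "un"},
]

unit_costs = {"Walls": 420.0, "Slabs": 380.0, "Windows": 650.0}  # assumed costs per unit

totals = defaultdict(float)
for el in elements:
    totals[el["category"]] += el["quantity"]

for category, qty in totals.items():
    cost = qty * unit_costs[category]
    print(f"{category:8s} {qty:8.2f}  R$ {cost:10.2f}")
```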
Abstract:
Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults, and stories abound of spreadsheet faults that have led to multi-million dollar losses. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software.
Abstract:
Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software. We have been working to address this problem by finding ways to provide at least some of the benefits of formal software engineering techniques to end-user programmers. In this talk, focusing on the spreadsheet application paradigm, I present several of our approaches, concentrating on methodologies that utilize source-code-analysis techniques to help end-users build more dependable spreadsheets. Behind the scenes, our methodologies use static analyses such as dataflow analysis and slicing, together with dynamic analyses such as execution monitoring, to support user tasks such as validation and fault localization. I show how, to accommodate the user base of spreadsheet languages, an interface to these methodologies can be provided in a manner that does not require an understanding of the theory behind the analyses, yet supports the interactive, incremental process by which spreadsheets are created. Finally, I present empirical results gathered in the use of our methodologies that highlight several cost-benefit trade-offs and many opportunities for future work.
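The static analyses mentioned (dataflow analysis and slicing over spreadsheet formulas) reduce, at their core, to building a cell dependency graph and traversing it. A minimal sketch follows, assuming a toy formula representation rather than any of the systems presented in the talk.

```python
# Minimal sketch of spreadsheet dataflow analysis: build a cell dependency
# graph from formulas and compute a backward slice (all cells a given cell
# depends on). The formula syntax and sheet contents are toy assumptions.
import re

# Toy sheet: cell -> formula or literal value
sheet = {
    "A1": "10",
    "A2": "20",
    "B1": "=A1+A2",
    "B2": "=B1*2",
    "C1": "=B2-A1",
}

CELL_REF = re.compile(r"[A-Z]+[0-9]+")

def dependencies(cell):
    """Cells directly referenced by this cell's formula."""
    formula = sheet[cell]
    return set(CELL_REF.findall(formula)) if formula.startswith("=") else set()

def backward_slice(cell):
    """All cells that can affect the value of `cell` (transitive closure)."""
    seen, stack = set(), [cell]
    while stack:
        current = stack.pop()
        for dep in dependencies(current):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

print(sorted(backward_slice("C1")))  # ['A1', 'A2', 'B1', 'B2']
```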
Abstract:
Telecommunications have been in constant evolution over the past decades. Among the technological innovations, the use of digital technologies is particularly relevant. Digital communication systems have proven their efficiency and introduced a new element into the signal transmission and reception chain: the digital processor. This device gives new radio equipment the flexibility of a programmable system. Nowadays, the behavior of a communication system can be modified simply by changing its software. This gave rise to a new radio model called Software-Defined Radio (SDR). In this new model, the task of defining the radio's behavior moves to software, leaving to the hardware only the implementation of the RF front-end. The radio is thus no longer static, defined by its circuits; it becomes a dynamic element whose operating characteristics, such as bandwidth, modulation and coding rate, can be modified even at runtime according to the software configuration. This article presents the use of the GNU Radio software, an open-source solution for SDR applications, as a tool for developing configurable digital radios.
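As a minimal illustration of the GNU Radio Python API referred to in the abstract, the sketch below builds a trivial flowgraph: a complex sinusoid source, rate-limited by a throttle block and written to a file sink. The sample rate, frequency and output filename are arbitrary assumptions; a real SDR application would replace the file sink with a hardware source/sink block.

```python
# Minimal GNU Radio flowgraph sketch: a complex sinusoid source, rate-limited
# by a throttle block, written to a file. Parameters are arbitrary assumptions.
from gnuradio import gr, blocks, analog

class ToneFlowgraph(gr.top_block):
    def __init__(self, samp_rate=32000, freq=1000, amplitude=0.5):
        gr.top_block.__init__(self, "Tone demo")

        # Signal source: complex cosine at `freq` Hz
        self.source = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE,
                                          freq, amplitude)
        # Throttle keeps a non-hardware flowgraph from consuming all CPU
        self.throttle = blocks.throttle(gr.sizeof_gr_complex, samp_rate)
        # Sink: raw complex samples to disk (hypothetical output path)
        self.sink = blocks.file_sink(gr.sizeof_gr_complex, "tone.iq")

        self.connect(self.source, self.throttle, self.sink)

if __name__ == "__main__":
    tb = ToneFlowgraph()
    tb.start()        # run the flowgraph in the background
    input("Press Enter to stop...")
    tb.stop()
    tb.wait()
```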
Abstract:
Surveillance Levels (SLs) are categories for medical patients (used in Brazil) that represent different types of medical recommendations. SLs are defined according to risk factors and the medical and developmental history of patients, and each SL is associated with specific educational and clinical measures. The present paper proposes and verifies a computer-aided approach for the automatic recommendation of SLs, based on the classification of information from patient electronic records. For this purpose, a software architecture composed of three layers was developed. The architecture includes a classification layer formed by a linguistic module and machine learning classification modules. The classification layer allows different classification methods to be used, including the use of preprocessed, normalized language data drawn from the linguistic module. We report the verification and validation of the software architecture in a Brazilian pediatric healthcare institution. The results indicate that attribute selection can have a great effect on the performance of the system. Nonetheless, the automatic recommendation of surveillance levels can still benefit from improvements in processing procedures when the linguistic module is applied prior to classification. The results of our efforts can be applied to different types of medical systems, and systems supported by the framework presented in this paper may be used by healthcare and governmental institutions to improve healthcare services in terms of establishing preventive measures and alerting authorities about the possibility of an epidemic.
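The classification layer described (normalized text from a linguistic module feeding machine-learning classifiers, with attribute selection affecting performance) can be sketched with a standard text-classification pipeline. The sample records, labels and parameter choices below are purely illustrative and are not the paper's data or models.

```python
# Illustrative sketch of a text-classification pipeline in the spirit of the
# architecture described: text normalization -> feature extraction ->
# attribute (feature) selection -> classifier. Data and labels are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

records = [
    "premature birth, low weight, respiratory distress",
    "routine visit, normal development, vaccination up to date",
    "family history of cardiac disease, follow-up requested",
    "healthy toddler, no risk factors reported",
]
surveillance_levels = ["high", "low", "high", "low"]  # hypothetical labels

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(lowercase=True)),     # normalized text features
    ("select", SelectKBest(chi2, k=10)),            # attribute selection step
    ("clf", LogisticRegression(max_iter=1000)),
])

pipeline.fit(records, surveillance_levels)
print(pipeline.predict(["low weight and respiratory problems at birth"]))
```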
Abstract:
Background: Through this paper, we present the initial steps in the creation of an integrated platform for the provision of a series of eHealth tools and services to both citizens and travelers in isolated areas of the southeast Mediterranean, and on board ships travelling across it. The platform was created through an INTERREG IIIB ARCHIMED project called INTERMED. Methods: The support of primary healthcare, home care and the continuous education of physicians are the three major issues that the proposed platform is trying to facilitate. The proposed system is based on state-of-the-art telemedicine systems and is able to provide the following healthcare services: i) telecollaboration and teleconsultation services between remotely located healthcare providers, ii) telemedicine services in emergencies, iii) home telecare services for "at risk" citizens such as the elderly and patients with chronic diseases, and iv) eLearning services for the continuous training, through seminars, of both healthcare personnel (physicians, nurses etc.) and persons supporting "at risk" citizens. These systems support data transmission over simple phone lines, internet connections, integrated services digital network/digital subscriber lines, satellite links, mobile networks (GPRS/3G), and wireless local area networks. The data correspond, among other things, to voice, vital biosignals, still medical images, video, and data used by eLearning applications. The proposed platform comprises several systems, each supporting different services; these were integrated using a common data storage and exchange scheme in order to achieve system interoperability in terms of software, language and national characteristics. Results: The platform has been installed and evaluated in different rural and urban sites in Greece, Cyprus and Italy. The evaluation was mainly related to technical issues and user satisfaction. The selected sites include, among others, rural health centers, ambulances, homes of "at-risk" citizens, and a ferry. Conclusions: The results demonstrated the functionality and usefulness of the platform in various rural places in Greece, Cyprus and Italy. However, further actions are needed to enable the local healthcare systems and the different population groups to become familiar with, and use in their everyday lives, mature technological solutions for the provision of healthcare services.
Abstract:
Three-dimensional (3D) models of teeth and soft and hard tissues are tessellated surfaces used for diagnosis, treatment planning, appliance fabrication, outcome evaluation, and research. In scientific publications or communications with colleagues, these 3D data are often reduced to 2-dimensional pictures or need special software for visualization. The portable document format (PDF) offers a simple way to interactively display 3D surface data without additional software other than a recent version of Adobe Reader (Adobe, San Jose, Calif). The purposes of this article were to give an example of how 3D data and their analyses can be interactively displayed in 3 dimensions in electronic publications, and to show how they can be exported from any software for diagnostic reports and communications among colleagues.