877 results for ICL (Computer program language)
Abstract:
The emergence of digital imaging and digital networks has made duplication of original artwork easier. Watermarking techniques, also referred to as digital signatures, sign images by introducing changes that are imperceptible to the human eye but easily recoverable by a computer program. Error-correcting codes are a natural choice for correcting the errors that can occur when extracting the signature. In this paper, we present an error-correction scheme based on a combination of Reed-Solomon codes with an optimal linear code as the inner code. We investigate the strength of the noise the scheme withstands for a fixed image capacity and various signature lengths. Finally, we compare our results with other error-correcting techniques used in watermarking. We have also created a computer program for image watermarking that uses the newly presented scheme for error correction.
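The paper's exact Reed-Solomon/inner-code construction is not reproduced in the abstract. As a minimal illustration of the concatenated encode/decode layering it describes, the sketch below uses a 3x repetition inner code with majority-vote decoding (an assumption for demonstration only — not the optimal linear inner code of the paper, and with the outer Reed-Solomon stage left abstract):

```python
# Illustrative sketch of concatenated coding for watermark payloads.
# The inner code here is a simple 3x repetition code (NOT the paper's
# optimal linear inner code); the outer Reed-Solomon stage is assumed
# to have already produced `payload`. The point is only the layering.

def inner_encode(bits, r=3):
    """Repeat every bit r times (inner code)."""
    return [b for bit in bits for b in [bit] * r]

def inner_decode(bits, r=3):
    """Majority vote over each r-bit group."""
    return [1 if sum(bits[i:i + r]) * 2 > r else 0
            for i in range(0, len(bits), r)]

def flip(bits, positions):
    """Simulate channel noise by flipping the given bit positions."""
    out = list(bits)
    for p in positions:
        out[p] ^= 1
    return out

payload = [1, 0, 1, 1, 0, 0, 1, 0]       # outer-coded signature bits
sent = inner_encode(payload)              # 24 channel bits
received = flip(sent, [0, 7, 13])         # three isolated bit errors
assert inner_decode(received) == payload  # one flip per group is corrected
```

A burst of errors inside one group would defeat the inner code alone; in the concatenated scheme, such residual symbol errors are what the outer Reed-Solomon code is there to repair.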
Abstract:
Purpose: To assess the inter- and intra-observer variability of subjective grading of the retinal arterio-venous ratio (AVR) using visual grading, and to compare the subjectively derived grades to an objective method using a semi-automated computer program. Methods: Following intraocular pressure and blood pressure measurements, all subjects underwent dilated fundus photography. 86 monochromatic retinal images centred on the optic nerve head (52 healthy volunteers) were obtained using a Zeiss FF450+ fundus camera. Arterio-venous ratio (AVR), central retinal artery equivalent (CRAE) and central retinal vein equivalent (CRVE) were calculated on three separate occasions by a single observer semi-automatically using the software VesselMap (Imedos Systems, Jena, Germany). Following the automated grading, three examiners graded the AVR visually on three separate occasions in order to assess their agreement. Results: Reproducibility of the semi-automatic parameters was excellent (ICCs: 0.97 (CRAE), 0.985 (CRVE) and 0.952 (AVR)). However, visual grading of AVR showed inter-grader differences as well as discrepancies between subjectively derived and objectively calculated AVR (all p < 0.000001). Conclusion: Grader education and experience lead to inter-grader differences; more importantly, subjective grading is not capable of picking up subtle differences across healthy individuals and does not represent the true AVR when compared with an objective assessment method. Technology advancements mean we no longer rely on ophthalmoscopic evaluation but can capture and store fundus images with retinal cameras, enabling us to measure vessel calibre more accurately than visual estimation; hence it should be integrated into optometric practice for improved accuracy and reliability of clinical assessments of retinal vessel calibres. © 2014 Spanish General Council of Optometry.
Abstract:
Considering the so-called "multinomial discrete choice" (MDC) model, this paper focuses on the problem of estimating its parameters. In particular, the question is how to carry out point and interval estimation of the parameters when the model is mixed, i.e. includes both individual-specific and choice-specific explanatory variables, and no standard MDC computer program is available. The basic idea behind the solution is to use the Cox proportional-hazards method of survival analysis, which is available in any standard statistical package; provided the data are structured to satisfy certain special requirements, it yields the desired MDC solutions. The paper describes the features of the data set to be analysed.
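The abstract does not spell out the MDC model itself. A minimal sketch of the multinomial (conditional) logit choice probability — the quantity a stratified Cox fit recovers when each choice occasion is treated as its own stratum and the chosen alternative as the "event" — with toy covariates and coefficients invented for illustration:

```python
import math

# Conditional-logit choice probabilities for one choice occasion:
# each alternative j has utility x_j . beta, and
# P(j) = exp(x_j . beta) / sum_k exp(x_k . beta).
# The numbers below are hypothetical, not from the paper.

def choice_probs(X, beta):
    """X: one row of covariates per alternative; beta: coefficients."""
    scores = [math.exp(sum(x * b for x, b in zip(row, beta))) for row in X]
    total = sum(scores)
    return [s / total for s in scores]

# Three alternatives, two choice-specific covariates:
X = [[1.0, 0.5],
     [0.2, 1.0],
     [0.0, 0.0]]      # reference alternative
beta = [0.8, -0.3]

p = choice_probs(X, beta)
assert abs(sum(p) - 1.0) < 1e-12  # probabilities sum to one
assert p[0] == max(p)             # highest utility -> highest probability
```

Individual-specific variables enter this "mixed" model by interacting them with alternative dummies, which is exactly the data-restructuring step the paper's special data layout handles.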
Abstract:
The purpose of this ethnographic study was to describe and explain the congruency of psychological preferences identified by the Myers-Briggs Type Indicator (MBTI) and the human resource development (HRD) role of instructor/facilitator. This investigation was conducted with 23 HRD professionals who worked in the Miami, Florida area as instructors/facilitators with adult learners in job-related contexts. The study was conducted using qualitative strategies of data collection and analysis. The research participants were selected through a purposive sampling strategy. Data collection strategies included: (a) administration and scoring of the MBTI, Form G, (b) open-ended and semi-structured interviews, (c) participant observations of the research subjects at their respective work sites and while conducting training sessions, (d) field notes, and (e) contact summary sheets to record field research encounters. Data analysis was conducted with the use of a computer program for qualitative analysis called FolioViews 3.1 for Windows. This included: (a) coding of transcribed interviews and field notes, (b) theme analysis, (c) memoing, and (d) cross-case analysis. The three major themes that emerged in relation to the congruency of psychological preferences and the role of instructor/facilitator were: (1) designing and preparing instruction/facilitation, (2) conducting training and managing group process, and (3) interpersonal relations and perspectives among instructors/facilitators.
The first two themes were analyzed through the combination of the four Jungian personality functions. These combinations are: sensing-thinking (ST), sensing-feeling (SF), intuition-thinking (NT), and intuition-feeling (NF). The third theme was analyzed through the combination of the attitudes or energy focus and the judgment function. These combinations are: extraversion-thinking (ET), extraversion-feeling (EF), introversion-thinking (IT), and introversion-feeling (IF). A last area uncovered by this ethnographic study was the influence exerted by a training and development culture on the instructor/facilitator role. This professional culture is described and explained in terms of the shared values and expectations reported by the study respondents.
Abstract:
Current reform initiatives recommend that geometry instruction include the study of three-dimensional geometric objects and provide students with opportunities to use spatial skills in problem-solving tasks. Geometer's Sketchpad (GSP) is a dynamic and interactive computer program that enables the user to investigate and explore geometric concepts and manipulate geometric structures. Research using GSP as an instructional tool has focused primarily on teaching and learning two-dimensional geometry. This study explored the effect of a GSP based instructional environment on students' geometric thinking and three-dimensional spatial ability as they used GSP to learn three-dimensional geometry. For 10 weeks, 18 tenth-grade students from an urban school district used GSP to construct and analyze dynamic, two-dimensional representations of three-dimensional objects in a classroom environment that encouraged exploration, discussion, conjecture, and verification. The data were collected primarily from participant observations and clinical interviews and analyzed using qualitative methods of analysis. In addition, pretest and posttest measures of three-dimensional spatial ability and van Hiele level of geometric thinking were obtained. Spatial ability measures were analyzed using standard t-test analysis. The data from this study indicate that GSP is a viable tool to teach students about three-dimensional geometric objects. A comparison of students' pretest and posttest van Hiele levels showed an improvement in geometric thinking, especially for students on lower levels of the van Hiele theory. Evidence at the p < .05 level indicated that students' spatial ability improved significantly. Specifically, the GSP dynamic, visual environment supported students' visualization and reasoning processes as students attempted to solve challenging tasks about three-dimensional geometric objects.
The GSP instructional activities also provided students with an experiential base and an intuitive understanding about three-dimensional objects from which more formal work in geometry could be pursued. This study demonstrates that by designing appropriate GSP based instructional environments, it is possible to help students improve their spatial skills, develop more coherent and accurate intuitions about three-dimensional geometric objects, and progress through the levels of geometric thinking proposed by van Hiele.
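The pretest/posttest comparison described above rests on a paired t statistic. A minimal sketch, with scores invented purely for illustration (not the study's data):

```python
import math

# Paired (pretest/posttest) t statistic for within-subject gains,
# the kind of standard t-test analysis applied to the spatial-ability
# measures. The score lists below are hypothetical.

def paired_t(pre, post):
    d = [b - a for a, b in zip(pre, post)]   # per-student gain
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)         # t with n-1 df

pre  = [12, 15, 9, 14, 11, 13, 10, 16]
post = [15, 17, 12, 15, 14, 16, 11, 18]
t = paired_t(pre, post)
assert t > 2   # large positive t: posttest scores consistently higher
```

The resulting t would then be compared against the critical value at the .05 level for n-1 degrees of freedom, as in the significance claim above.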
Abstract:
There is great demand for advanced engineering tools in biology, biochemistry and medicine. Many of the existing instruments and tools are expensive and require special facilities. With the advent of nanotechnology in the past decade, new approaches to developing devices and tools have been generated by academia and industry. One such technology, NMR spectroscopy, has been used by biochemists for more than two decades to study the molecular structure of chemical compounds. However, NMR spectrometers are very expensive and require special laboratory rooms for proper operation. High magnetic fields with strengths on the order of several Tesla make these instruments unaffordable to most research groups. This doctoral research proposes a new technology for NMR spectrometers that can operate at field strengths of less than 0.5 Tesla using an inexpensive permanent magnet and spin-dependent nanoscale magnetic devices. This portable NMR system is intended to analyze samples as small as a few nanoliters. The main problem to resolve when downscaling the variables is to obtain an NMR signal with high Signal-To-Noise Ratio (SNR). A special Tunneling Magneto-Resistive (TMR) sensor design was developed to achieve this goal. The minimum specifications for each component of the proposed NMR system were established, and a complete NMR system was designed based on these minimum requirements. The goal was always to find cost-effective, realistic components. The novel design of the NMR system uses technologies such as Direct Digital Synthesis (DDS), Digital Signal Processing (DSP) and a special backpropagation neural network that finds the best match of the NMR spectrum. The system was designed, calculated and simulated with excellent results. In addition, a general method to design TMR sensors was developed. The technique was automated, and a computer program was written to help the designer perform this task interactively.
Abstract:
Current reform initiatives recommend that school geometry teaching and learning include the study of three-dimensional geometric objects and provide students with opportunities to use spatial abilities in mathematical tasks. Two ways of using Geometer's Sketchpad (GSP), a dynamic and interactive computer program, in conjunction with manipulatives enable students to investigate and explore geometric concepts, especially when used in a constructivist setting. Research on spatial abilities has focused on visual reasoning to improve visualization skills. This dissertation investigated the hypothesis that connecting visual and analytic reasoning may better improve students' spatial visualization abilities than instruction that makes little or no use of the connection between the two. Data were collected using the Purdue Spatial Visualization Tests (PSVT) administered as a pretest and posttest to a control group and two experimental groups. Sixty-four 10th grade students in three geometry classrooms participated in the study over 6 weeks. Research questions were answered using statistical procedures. An analysis of covariance was used for the quantitative analysis, whereas a description of students' visual-analytic processing strategies was presented using qualitative methods. The quantitative results indicated significant differences by gender, but not by group. However, when analyzing a subsample of 33 participants with pretest scores below the 50th percentile, males in one of the experimental groups significantly benefited from the treatment. A review of previous research also indicated that students with low visualization skills benefited more than those with higher visualization skills. The qualitative results showed that girls were more sophisticated in their visual-analytic processing strategies for solving three-dimensional tasks.
It is recommended that the teaching and learning of spatial visualization start in the middle school, prior to students' more rigorous mathematics exposure in high school. A duration longer than 6 weeks for treatments in similar future research studies is also recommended.
Abstract:
Chemical kinetics is an exciting and active field. The prevailing theories make a number of simplifying assumptions that do not always hold in actual cases. Another current problem concerns the development of efficient numerical algorithms for solving the master equations that arise in the description of complex reactions. The objective of the present work is to furnish a completely general and exact theory of reaction rates, in a form reminiscent of transition state theory and valid for all fluid phases, and to develop a computer program that can solve complex reactions by finding the concentrations of all participating substances as functions of time. To this end, the full quantum scattering theory is used to derive the exact rate law, and the resulting cumulative reaction probability is put into several equivalent forms that take into account all relativistic effects where applicable, including one that is strongly reminiscent of transition state theory but includes corrections from scattering theory. Two programs, one for solving complex reactions and the other for solving first-order linear kinetic master equations, were then developed and tested on simple applications.
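The solver described in the abstract is not reproduced here; as a minimal sketch of what "finding the concentrations of all participating substances as functions of time" means for a first-order linear scheme, the following integrates the rate equations dc/dt = Kc for a toy sequence A -> B -> C with a fourth-order Runge-Kutta step (rate constants chosen arbitrarily):

```python
# RK4 integration of the first-order linear kinetic scheme A -> B -> C
# with illustrative rate constants k1, k2. The dissertation's program
# handles general complex reactions; this only shows the numerical idea.

K1, K2 = 1.0, 0.5

def rhs(c):
    """Right-hand side of dc/dt = K c for concentrations (A, B, C)."""
    a, b, cc = c
    return [-K1 * a, K1 * a - K2 * b, K2 * b]

def rk4_step(c, h):
    def shifted(u, v, s):
        return [x + s * y for x, y in zip(u, v)]
    k1 = rhs(c)
    k2 = rhs(shifted(c, k1, h / 2))
    k3 = rhs(shifted(c, k2, h / 2))
    k4 = rhs(shifted(c, k3, h))
    return [c[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(3)]

c = [1.0, 0.0, 0.0]          # start with pure A
for _ in range(1000):
    c = rk4_step(c, 0.01)    # integrate to t = 10

assert abs(sum(c) - 1.0) < 1e-9   # total mass is conserved
assert c[2] > 0.9                 # nearly everything has reached C
```

For stiff master equations (rate constants spanning many orders of magnitude), an implicit method or the matrix exponential of K would replace this explicit stepper.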
Abstract:
This research focuses on developing a capacity planning methodology for the emerging concurrent engineer-to-order (ETO) operations. The primary focus is placed on capacity planning at the sales stage. This study examines the characteristics of capacity planning in a concurrent ETO operation environment, models the problem analytically, and proposes a practical capacity planning methodology for concurrent ETO operations in industry. A computer program that mimics a concurrent ETO operation environment was written to validate the proposed methodology and to test a set of rules that affect the performance of a concurrent ETO operation. This study takes a systems engineering approach to the problem and employs systems engineering concepts and tools for the modeling and analysis of the problem, as well as for developing a practical solution to it. This study depicts a concurrent ETO environment in which capacity is planned. The capacity planning problem is modeled as a mixed integer program and then solved for smaller-sized applications to evaluate its validity and solution complexity. The objective is to select the best set of available jobs to maximize profit while having sufficient capacity to meet each due date expectation. The nature of capacity planning for concurrent ETO operations is different from other operation modes, and the search for an effective solution to this problem has been an emerging research field. This study characterizes the problem of capacity planning and proposes a solution approach. The mathematical model relates work requirements to capacity over the planning horizon, and the methodology is proposed for solving industry-scale problems. Along with the capacity planning methodology, a set of heuristic rules was evaluated for improving concurrent ETO planning.
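The mixed integer program itself is not given in the abstract. As a deliberately tiny stand-in for the "select the best set of available jobs" objective, the sketch below solves a single-period, single-resource 0/1 selection by exhaustive search (job names, hours and profits are invented; the real model spans a planning horizon with due dates):

```python
# Toy version of the sales-stage selection problem: choose the subset
# of candidate jobs that maximizes profit without exceeding available
# capacity. A stand-in for the paper's mixed integer program, which
# additionally tracks due dates across a planning horizon.

def best_job_set(jobs, capacity):
    """jobs: list of (name, hours, profit). Exhaustive 2^n search."""
    best = (0, [])
    for mask in range(1 << len(jobs)):
        chosen = [j for i, j in enumerate(jobs) if mask >> i & 1]
        hours = sum(j[1] for j in chosen)
        profit = sum(j[2] for j in chosen)
        if hours <= capacity and profit > best[0]:
            best = (profit, [j[0] for j in chosen])
    return best

jobs = [("J1", 40, 700), ("J2", 30, 400), ("J3", 50, 800), ("J4", 20, 350)]
profit, picked = best_job_set(jobs, 90)
assert profit == 1500 and set(picked) == {"J1", "J3"}
```

Exhaustive search is only viable for a handful of jobs; the study's MIP formulation is what makes industry-scale instances tractable with a branch-and-bound solver.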
Abstract:
Software Engineering is one of the most widely researched areas of Computer Science. The ability to reuse software, much like the reuse of hardware components, is one of the key issues in software development. The object-oriented programming methodology is revolutionary in that it promotes software reusability. This thesis describes the development of a tool that helps programmers design and implement software from within the Smalltalk environment (an object-oriented programming environment). The ASDN tool is part of the PEREAM (Programming Environment for the Reuse and Evolution of Abstract Models) system, which advocates incremental development of software. The ASDN tool, along with the PEREAM system, seeks to enhance the Smalltalk programming environment by providing facilities for the structured development of abstractions (concepts). It produces a document that describes the abstractions developed using this tool. The features of the ASDN tool are illustrated by an example.
Abstract:
Students’ mistakes, viewed from a didactic and pedagogical perspective, are a phenomenon inevitably observed in any context in which formal teaching-and-learning processes take place. Researchers have shown that such mistakes are most of the time viewed as undesirable, often attributed to lack of attention or poor commitment on the part of the student, and rarely considered didactically useful. The object of our reflections in this work is precisely those mistakes, which are born within the teaching-and-learning processes. It is our understanding that a mistake constitutes a tool which mediates knowledge and may therefore become a strong ally of the instructor’s teaching actions, and thus should be given the teacher’s best consideration. Understanding a mistake in this way, we postulate that the teacher must face it as a possibility to be exploited rather than as a negative occurrence. Such an attitude on the part of the teacher would undoubtedly yield profitable didactic situations. To deepen our understanding, we conducted a case study of the perceptions of senior undergraduate students in the Mathematics program at UFRN in the second term of 2009. The reason for this choice is that Mathematics is traditionally the field with the poorest records in terms of school grades. In this work we present data associated with the ENEM, the UFRN Vestibular and the undergraduate courses in Mathematics. The theoretical frameworks supporting our reflections in this thesis follow the ideas proposed by Castorina (1988); Davis and Espósito (1990); Aquino (1997); Luckesi (2006); Cury (1994; 2008); Pinto (2000); and Torre (2007). To carry out the study, we applied a semi-structured questionnaire containing 14 questions, 10 of which were open questions.
The questions were methodologically based on Thematic Analysis, one of the techniques for Content Analysis devised by Bardin (1977), and the computer program Modalisa 6.0 (software developed at the University of Paris VIII) was also used. The results indicate that most teacher-training instructors, in their pedagogical practice, view the mistakes made by their students only as a guide for grading and, in this procedure, the student is frequently labeled as guilty. The conclusive analyses therefore point to the need to orient teacher-training instructors toward building a new theoretical view of students’ mistakes and their pedagogical potential, so that these professionals perceive the importance of such mistakes, since they reveal gaps in the learning process and provide valuable avenues for the teaching procedures.
Abstract:
We propose in this work a new method of conceptual organization of the areas involving assistive technology, categorizing them in a logical and simple manner. Furthermore, we propose the implementation of an interface based on electrooculography, capable of generating high-level commands to trigger robotic, computing and electromechanical devices. To validate the eye interface, an electronic circuit was developed, together with a computer program that captured the signals generated by users' eye movements and produced high-level commands capable of triggering an active brace and many other electromechanical systems. The results showed that it was possible to control many electromechanical systems through eye movements alone. The interface is presented as a viable way to perform the proposed task and can be improved at the digital signal-analysis stage. The diagrammatic model developed proved to be a tool that is easy to use and understand, meeting the conceptual-organization needs of assistive technology.
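The dissertation's circuit and software are not reproduced in the abstract. As a hypothetical sketch of the core idea — mapping an electrooculography (EOG) signal to high-level commands — the snippet below classifies a sample by amplitude thresholding; the threshold value, units and command names are invented for illustration:

```python
# Hypothetical EOG-to-command mapping by amplitude thresholding:
# a large positive swing reads as a rightward saccade, a large
# negative swing as leftward, and small values as rest. The 150 mV
# threshold and command names are assumptions, not from the work.

def eog_to_command(sample_mv, threshold_mv=150.0):
    if sample_mv > threshold_mv:
        return "RIGHT"
    if sample_mv < -threshold_mv:
        return "LEFT"
    return "REST"

signal = [10.0, 220.0, 30.0, -180.0, -20.0]   # toy samples
commands = [eog_to_command(s) for s in signal]
assert commands == ["REST", "RIGHT", "REST", "LEFT", "REST"]
```

A real implementation would add filtering and debouncing before thresholding, which is presumably part of the digital signal-analysis stage the abstract says can still be improved.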
Abstract:
The main objective of this work was to enable the recognition of human gestures through the development of a computer program. The program captures the gestures executed by the user through a camera attached to the computer and sends the robot the command corresponding to the gesture. In total, five gestures made by the human hand were interpreted. The software (developed in C++) makes extensive use of computer vision concepts and the open-source library OpenCV, which directly affect the overall efficiency of mobile robot control. The computer vision concepts involved include the use of filters to smooth/blur the image for noise reduction, color spaces chosen to best suit the task, and other information useful for manipulating digital images. The OpenCV library was essential to the project because it provides functions/procedures for filters, image edges, image area, the geometric center of contours, color-space conversion, convex hull and convexity defects, plus all the means necessary for characterizing image features. Several problems appeared during development, such as false positives (noise), poor performance caused by inserting various filters with oversized masks, and problems arising from the choice of color space for processing human skin tones. However, after developing seven versions of the control software, it was possible to minimize the occurrence of false positives through better use of filters combined with a well-dimensioned mask size (tuned at run time), all associated with a programming logic refined over the construction of the seven versions. After all this development, software that met the established requirements was obtained.
After completion of the control software, the overall effectiveness of the various versions was assessed; versions V (84.75%), VI (93.00%) and VII (94.67%) in particular showed that the final program performed well in interpreting gestures, proving that mobile robot control through human gestures is possible without external accessories, giving the system better mobility and lower maintenance cost. The great merit of the program is its capacity to help demystify man/machine interaction, since it uses an easy and intuitive interface for the control of mobile robots. Another important feature observed is that it is not necessary to be close to the mobile robot to control it: the program only needs the address that the Robotino passes to it over the network or Wi-Fi.
Agronomic and bromatological performance and phenotypic stability of silage sorghum in Uberlândia - MG
Abstract:
Sorghum (Sorghum bicolor (L.) Moench) is a good alternative for silage, especially in places with water scarcity and high temperatures, due to its morphological and physiological characteristics. Appropriate management, such as the ideal sowing time, affects both the productivity and the quality of the silage. The work was conducted with the objective of evaluating the agronomic and bromatological performance of varieties and hybrids of silage sorghum and their phenotypic stability in two sowing periods, the main season and the off-season, in the city of Uberlândia, Minas Gerais. The experiments were performed at the Capim Branco Experimental Farm of the Federal University of Uberlândia (UFU), located in that city. There were two sowing dates in the same experimental area, off-season (March to June 2014) and main season (November 2014 to March 2015), and the varieties and hybrids were evaluated in both situations. The design was a randomized block with 25 treatments (hybrids and varieties of sorghum) and three replications. Agronomic and bromatological data were subjected to analysis of variance; means were grouped by the Scott-Knott test at 5% probability using the Genes computer program; and stability was estimated with the Annicchiarico method. Flowering of the cultivars, dry matter productivity, plant height, Acid Detergent Fiber (ADF), Neutral Detergent Fiber (NDF) and Crude Protein (CP) are affected by the environment and the variety. Regarding productivity and fiber quality, the SF11 variety was superior regardless of the environment. Regarding the stability of dry matter performance, the varieties SF15, SF11, SF25, PROG 134 IPA, 1141572, 1141570 and 1141562 stood out. For the stability of fiber quality (ADF and NDF), the variety 1141562 stood out. The environment reduces the expression of the characters "days to flowering", "plant height" and "dry matter productivity" of the hybrids.
Of the 25 hybrids analyzed for productivity and stability of dry matter performance, seven stood out regardless of the environment: the commercial hybrid Volumax and the experimental hybrids 12F39006, 12F39007, 12F37014, 12F39014, 12F38009 and 12F02006.
Abstract:
In this dissertation we present the development and use of radiofrequency pulses simultaneously modulated in frequency, amplitude and phase (Strongly Modulated Pulses, SMP) to create initial states and execute unitary operations that serve as building blocks for quantum information processing using Nuclear Magnetic Resonance (NMR). The experimental implementations were carried out in a 3-qubit system consisting of the nuclear spins of Cesium-133 (nuclear spin 7/2) in a liquid-crystal sample in the nematic phase. The SMP pulses were constructed theoretically using a program specially developed for this purpose, based on the Nelder-Mead Simplex numerical optimization method. Through this program, the SMP pulses were optimized to execute the desired logic operations with durations considerably shorter than those achieved with the usual NMR procedure, i.e., sequences of pulses and free evolutions. This has the advantage of reducing the decoherence effects caused by the relaxation of the system. The theoretical concepts involved in creating the SMPs are presented, and the main difficulties (experimental and theoretical) that can arise from the use of these procedures are discussed. As application examples, the pseudo-pure states used as initial states for logic operations in NMR were produced, as well as logic operations that were subsequently applied to them. Using the SMPs it was also possible to experimentally implement the Grover and Deutsch-Jozsa quantum algorithms for 3 qubits. The fidelity of the experimental implementations was determined using experimental density matrices obtained with a previously developed density-matrix tomography method.
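The dissertation's pulse-design program optimizes SMP parameters with a Nelder-Mead simplex search. As a hedged sketch of that optimization loop, the snippet below minimizes a hypothetical two-parameter "pulse infidelity" with SciPy's Nelder-Mead implementation; the optimizer matches what the dissertation describes, but the toy objective stands in for the real spin-dynamics simulation:

```python
from scipy.optimize import minimize

# Nelder-Mead simplex search over pulse parameters, as in the SMP
# design procedure. The objective below is a made-up stand-in with a
# known minimum at amp = 1.0, phase = 0.25; the real objective would
# be 1 minus the fidelity of the simulated pulse against the target
# unitary.

def infidelity(params):
    amp, phase = params
    return (amp - 1.0) ** 2 + (phase - 0.25) ** 2

result = minimize(infidelity, x0=[0.0, 0.0], method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8})

assert result.success
assert abs(result.x[0] - 1.0) < 1e-4
assert abs(result.x[1] - 0.25) < 1e-4
```

Because the simplex method is derivative-free, it suits objectives evaluated through a numerical simulation of the spin system, where gradients are not readily available.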