948 results for Modula-2 (Computer program language)


Relevance: 100.00%

Publisher:

Abstract:

Particulate solids are complex redundant systems consisting of discrete particles. The interactions between the particles are complex and have been the subject of many theoretical and experimental investigations. Investigations of particulate materials have been restricted by the lack of quantitative information on the mechanisms occurring within an assembly. Laboratory experimentation is limited because information on the internal behaviour can only be inferred from measurements on the assembly boundary or obtained with intrusive measuring devices. In addition, comparisons between test data are uncertain because of the difficulty of reproducing exact replicas of physical systems. Nevertheless, theoretical and technological advances require more detailed material information. Numerical simulation, however, affords access to information on every particle, and hence to the micro-mechanical behaviour within an assembly, and can replicate desired systems. For a computer program to simulate material behaviour accurately, it must incorporate realistic interaction laws. This research programme used the finite difference simulation program `BALL', developed by Cundall (1971), which employed linear spring force-displacement laws; it was thus necessary to incorporate more realistic interaction laws. This research programme was therefore primarily concerned with the implementation of the normal force-displacement law of Hertz (1882) and the tangential force-displacement laws of Mindlin and Deresiewicz (1953). Within this thesis the contact mechanics theories employed in the program are developed and the adaptations necessary to incorporate these laws are detailed. The new contact force-displacement laws were verified by simulating a quasi-static oblique contact and a single-particle oblique impact.
Applications of the program to the simulation of large assemblies of particles are given, and the problems in undertaking quasi-static shear tests, along with the results from two successful shear tests, are described.
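The Hertz normal law that the thesis implements in place of `BALL`'s linear springs makes the contact force grow with the overlap to the power 3/2, through an effective modulus and radius of the two spheres. A minimal sketch of that law (the function name and the material constants in any usage are illustrative, not taken from `BALL`):

```python
import math

def hertz_normal_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Hertz (1882) normal contact force between two elastic spheres.

    delta : normal overlap (m); R1, R2 : sphere radii (m);
    E1, E2 : Young's moduli (Pa); nu1, nu2 : Poisson's ratios.
    Returns the force in newtons.
    """
    if delta <= 0.0:
        return 0.0  # spheres not in contact: no force
    # effective (contact) modulus and radius of the sphere pair
    E_star = 1.0 / ((1.0 - nu1 ** 2) / E1 + (1.0 - nu2 ** 2) / E2)
    R_star = 1.0 / (1.0 / R1 + 1.0 / R2)
    # F = (4/3) * E* * sqrt(R*) * delta^(3/2): non-linear, unlike a spring
    return (4.0 / 3.0) * E_star * math.sqrt(R_star) * delta ** 1.5
```

Quadrupling the overlap multiplies the force by eight, whereas a linear spring would only quadruple it — this is the extra stiffness-with-load that makes the law more realistic.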

Relevance: 100.00%

Publisher:

Abstract:

Many workers have studied the ocular components of eyes exhibiting differing amounts of central refractive error, but few have considered the additional information that could be derived from a study of peripheral refraction. Until now, peripheral refraction has either been measured in real eyes or modelled in schematic eyes of varying levels of sophistication. Several differences occur between measured and modelled results which, if accounted for, could yield more information about the nature of the optical and retinal surfaces and their asymmetries. Measurements of ocular components and peripheral refraction, however, have never been made in the same sample of eyes. In this study, ocular component and peripheral refractive measurements were made in a sample of young near-emmetropic, myopic and hyperopic eyes. The data for each refractive group were averaged. A computer program was written to construct spherical-surfaced schematic eyes from these data. More sophisticated eye models were developed using a linear algebraic ray-tracing program. This method allowed rays to be traced through toroidal aspheric surfaces which were translated or rotated with respect to each other. For simplicity, the gradient-index optical nature of the crystalline lens was neglected. Various alterations were made to these eye models to reproduce the measured peripheral refractive patterns. Excellent agreement was found between the modelled and measured peripheral refractive values over the central 70° of the visual field. This implied that the additional biometric features incorporated in each eye model were representative of those present in the measured eyes. As some of these features are not otherwise obtainable using in vivo techniques, it is proposed that the variation of refraction in the periphery offers a very useful optical method for studying human ocular component dimensions.
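A linear algebraic ray trace of the kind described refracts each skew ray at every surface using the vector form of Snell's law. The thesis' program additionally handled toroidal aspheric surfaces and their decentrations; the sketch below shows only the per-surface refraction step, with illustrative names:

```python
import math

def refract(d, n, n1, n2):
    """Vector form of Snell's law.

    d : unit direction of the incoming ray;
    n : unit surface normal, pointing against the incoming ray;
    n1, n2 : refractive indices before and after the surface.
    Returns the refracted unit direction, or None on total internal
    reflection.
    """
    mu = n1 / n2
    cos_i = -sum(di * ni for di, ni in zip(d, n))      # cosine of incidence
    sin_t2 = mu * mu * (1.0 - cos_i * cos_i)           # sin^2 of refraction
    if sin_t2 > 1.0:
        return None                                    # total internal reflection
    cos_t = math.sqrt(1.0 - sin_t2)
    # t = mu*d + (mu*cos_i - cos_t)*n, still a unit vector
    return tuple(mu * di + (mu * cos_i - cos_t) * ni for di, ni in zip(d, n))
```

Tracing a fan of such rays through the model's surfaces and intersecting them with the retinal surface is what yields the modelled peripheral refraction pattern.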

Relevance: 100.00%

Publisher:

Abstract:

With the competitive challenge facing business today, the need to keep costs down and quality up is a matter of survival. One way in which wire manufacturers can meet this challenge is to possess a thorough understanding of deformation, friction and lubrication during the wire drawing process, and therefore to make good decisions regarding the selection and application of lubricants as well as the die design. Friction, lubrication and die design during wire drawing are thus the subject of this study. Although theoretical and experimental investigations have been carried out ever since the establishment of wire drawing technology, many problems remain unsolved. It is therefore necessary to conduct further research on traditional and fundamental subjects such as the mechanics of deformation, friction, lubrication and die design in wire drawing. Drawing experiments were carried out on an existing bull-block under different cross-sectional area reductions, different speeds and different lubricants. Instrumentation to measure drawing load and drawing speed was set up and connected to the wire drawing machine, together with a data acquisition system. A die box connected to the existing die holder for use with dry soap lubricant was designed and tested. The experimental results, in terms of curves of drawing stress versus percentage area reduction under different drawing conditions, were analysed and compared. The effects of friction, lubrication, drawing speed and the pressure die nozzle on drawing stress are discussed. In order to determine the flow stress of the material during deformation, tensile tests were performed on an Instron universal test machine, using wires drawn under different area reductions. A polynomial function is used to correlate the flow stress of the material with the plastic strain, and a general computer program has been written to determine the coefficients of the stress-strain function.
The residual lubricant film on the steel wire after drawing was examined both radially and longitudinally using an SEM and an optical microscope. The lubricant film on the drawn wire was clearly observed. The micro-analysis by SEM therefore provides a means of assessing friction and lubrication in wire drawing.
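Fitting a polynomial stress-strain function of the kind described reduces to linear least squares on powers of the plastic strain. A stdlib-only sketch solving the normal equations by Gaussian elimination (function name, polynomial degree and any data are illustrative, not the thesis' program):

```python
def polyfit_flow_stress(strain, stress, degree=2):
    """Least-squares fit sigma(eps) = c0 + c1*eps + ... + cd*eps^d.
    Returns the coefficient list [c0, c1, ..., cd]."""
    m = degree + 1
    # normal equations A c = b for the Vandermonde design matrix
    A = [[sum(e ** (i + j) for e in strain) for j in range(m)] for i in range(m)]
    b = [sum(s * e ** i for e, s in zip(strain, stress)) for i in range(m)]
    # Gaussian elimination with partial pivoting
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = b[r] - sum(A[r][c] * coeffs[c] for c in range(r + 1, m))
        coeffs[r] = s / A[r][r]
    return coeffs
```

With tensile-test data the recovered coefficients define the flow stress curve used in the drawing-stress analysis.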

Relevance: 100.00%

Publisher:

Abstract:

The aim of this paper is to determine the network capacity (the number of internal switching lines required) based on detailed user behaviour and the demanded quality-of-service parameters in an overall telecommunication system. We consider a detailed conceptual model, and its corresponding analytical traffic model, of a telecommunication system with (virtual) circuit switching, in stationary state, with generalized input flow, repeated calls, a limited number of homogeneous terminals, and losses due to abandoned and interrupted dialling, blocked and interrupted switching, an unavailable intended terminal, blocked and abandoned ringing (absent called user) and abandoned conversation. We propose an analytical-numerical solution for finding the number of internal switching lines and the values of some basic traffic parameters as a function of the telecommunication system state. These parameters are required to maintain the demanded level of network quality of service (QoS). Dependencies based on the numerical-analytical results are shown graphically. For the proposed conceptual model and its corresponding analytical model, a network dimensioning task (NDT) is formulated, and the solvability of the NDT and the necessary conditions for an analytical solution are investigated as well. A rule (algorithm) and a computer program are proposed for calculating the corresponding number of internal switching lines, as well as the corresponding values of the traffic parameters, making QoS management easier.
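The paper's model, with repeated calls and abandonment at every call stage, is much richer than classical loss models, but the shape of a dimensioning loop — increase the number of lines until the blocking target is met — can be illustrated with the Erlang-B recursion as a deliberately simplified baseline. This is not the paper's algorithm, only an analogous sketch:

```python
def erlang_b(lines, offered_erlangs):
    """Blocking probability of an N-line loss system offered A erlangs
    of Poisson traffic (classical Erlang-B recursion; ignores the
    repeated attempts and abandonment the paper models)."""
    b = 1.0  # with zero lines everything is blocked
    for n in range(1, lines + 1):
        b = (offered_erlangs * b) / (n + offered_erlangs * b)
    return b

def lines_needed(offered_erlangs, target_blocking):
    """Smallest number of internal lines meeting the blocking target."""
    n = 0
    while erlang_b(n, offered_erlangs) > target_blocking:
        n += 1
    return n
```

The paper's NDT solution plays the role of `lines_needed`, but evaluates blocking from its detailed state-dependent traffic model instead of the Erlang-B formula.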

Relevance: 100.00%

Publisher:

Abstract:

Purpose: To assess the inter- and intra-observer variability of subjective grading of the retinal arterio-venous ratio (AVR) using visual grading, and to compare the subjectively derived grades to an objective method using a semi-automated computer program. Methods: Following intraocular pressure and blood pressure measurements, all subjects underwent dilated fundus photography. 86 monochromatic retinal images centred on the optic nerve head (52 healthy volunteers) were obtained using a Zeiss FF450+ fundus camera. The arterio-venous ratio (AVR), central retinal artery equivalent (CRAE) and central retinal vein equivalent (CRVE) were calculated on three separate occasions by a single observer semi-automatically using the software VesselMap (Imedos Systems, Jena, Germany). Following the automated grading, three examiners graded the AVR visually on three separate occasions in order to assess their agreement. Results: Reproducibility of the semi-automatic parameters was excellent (ICCs: 0.97 (CRAE), 0.985 (CRVE) and 0.952 (AVR)). However, visual grading of AVR showed inter-grader differences as well as discrepancies between subjectively derived and objectively calculated AVR (all p < 0.000001). Conclusion: Grader education and experience lead to inter-grader differences but, more importantly, subjective grading is not capable of picking up subtle differences across healthy individuals and does not represent true AVR when compared with an objective assessment method. Technological advancements mean we no longer need to rely on ophthalmoscopic evaluation but can capture and store fundus images with retinal cameras, enabling vessel calibre to be measured more accurately than by visual estimation; hence objective measurement should be integrated into optometric practice for improved accuracy and reliability of clinical assessments of retinal vessel calibres. © 2014 Spanish General Council of Optometry.

Relevance: 100.00%

Publisher:

Abstract:

Considering the so-called "multinomial discrete choice" (MDC) model, the focus of this paper is on the estimation of its parameters. In particular, the basic question is how to carry out point and interval estimation of the parameters when the model is mixed, i.e. includes both individual and choice-specific explanatory variables, and a standard MDC computer program is not available. The basic idea behind the solution is to use the Cox proportional hazards method of survival analysis, which is available in any standard statistical package; provided the data are structured to satisfy certain special requirements, it yields the desired MDC solutions. The paper describes the features of the data set to be analysed.
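The trick rests on restructuring the choice data: one row per (case, alternative), all rows of a case in one stratum with identical "survival times", and the event flag set on the chosen alternative — the stratified Cox partial likelihood then coincides with the conditional-logit likelihood. A sketch of that data layout (field and variable names are illustrative, not the paper's):

```python
def expand_for_cox(cases, alternatives):
    """Rewrite multinomial-choice data into the long format a Cox PH
    routine expects.  Choice-specific covariates vary across the rows of
    a case; individual covariates must be interacted with alternative
    dummies (a constant within a stratum would otherwise cancel out of
    the partial likelihood)."""
    rows = []
    for case_id, case in enumerate(cases):
        for alt in alternatives:
            row = {
                "stratum": case_id,                 # one stratum per case
                "time": 1.0,                        # identical within stratum
                "event": int(case["choice"] == alt),
                "alt": alt,
                # choice-specific covariate, e.g. the cost of this alternative
                "cost": case["cost"][alt],
            }
            # individual covariate x alternative dummies (first level is base)
            for a in alternatives[1:]:
                row[f"income_x_{a}"] = case["income"] * int(alt == a)
            rows.append(row)
    return rows
```

Feeding the resulting table to any stratified Cox routine (stratum as strata, `time`/`event` as the survival pair) reproduces the MDC estimates and their standard errors.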

Relevance: 100.00%

Publisher:

Abstract:

Current reform initiatives recommend that geometry instruction include the study of three-dimensional geometric objects and provide students with opportunities to use spatial skills in problem-solving tasks. Geometer's Sketchpad (GSP) is a dynamic and interactive computer program that enables the user to investigate and explore geometric concepts and manipulate geometric structures. Research using GSP as an instructional tool has focused primarily on teaching and learning two-dimensional geometry. This study explored the effect of a GSP-based instructional environment on students' geometric thinking and three-dimensional spatial ability as they used GSP to learn three-dimensional geometry. For 10 weeks, 18 tenth-grade students from an urban school district used GSP to construct and analyze dynamic, two-dimensional representations of three-dimensional objects in a classroom environment that encouraged exploration, discussion, conjecture, and verification. The data were collected primarily from participant observations and clinical interviews and analyzed using qualitative methods. In addition, pretest and posttest measures of three-dimensional spatial ability and van Hiele level of geometric thinking were obtained. Spatial ability measures were analyzed using standard t-test analysis. The data from this study indicate that GSP is a viable tool to teach students about three-dimensional geometric objects. A comparison of students' pretest and posttest van Hiele levels showed an improvement in geometric thinking, especially for students on the lower levels of the van Hiele theory. Evidence at the p < .05 level indicated that students' spatial ability improved significantly. Specifically, the GSP dynamic, visual environment supported students' visualization and reasoning processes as they attempted to solve challenging tasks about three-dimensional geometric objects.
The GSP instructional activities also provided students with an experiential base and an intuitive understanding of three-dimensional objects from which more formal work in geometry could be pursued. This study demonstrates that by designing appropriate GSP-based instructional environments, it is possible to help students improve their spatial skills, develop more coherent and accurate intuitions about three-dimensional geometric objects, and progress through the levels of geometric thinking proposed by van Hiele.
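The pretest/posttest comparison of spatial-ability scores reduces to a paired-samples t statistic computed from the per-student score differences. A minimal sketch of the computation (the data in any usage are invented for illustration, not the study's):

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom for matched
    pretest/posttest scores.  Positive t means scores increased."""
    d = [b - a for a, b in zip(pre, post)]   # per-student gain
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1
```

The statistic is then compared against the two-tailed critical value for the given degrees of freedom to decide significance at p < .05.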

Relevance: 100.00%

Publisher:

Abstract:

Current reform initiatives recommend that school geometry teaching and learning include the study of three-dimensional geometric objects and provide students with opportunities to use spatial abilities in mathematical tasks. Two ways of using Geometer's Sketchpad (GSP), a dynamic and interactive computer program, in conjunction with manipulatives enable students to investigate and explore geometric concepts, especially when used in a constructivist setting. Research on spatial abilities has focused on visual reasoning to improve visualization skills. This dissertation investigated the hypothesis that connecting visual and analytic reasoning may improve students' spatial visualization abilities more than instruction that makes little or no use of this connection. Data were collected using the Purdue Spatial Visualization Tests (PSVT) administered as a pretest and posttest to a control group and two experimental groups. Sixty-four 10th-grade students in three geometry classrooms participated in the study over 6 weeks. The research questions were answered using statistical procedures: an analysis of covariance was used for the quantitative analysis, whereas students' visual-analytic processing strategies were described using qualitative methods. The quantitative results indicated significant differences by gender, but not by group. However, when analyzing a subsample of 33 participants with pretest scores below the 50th percentile, males in one of the experimental groups benefited significantly from the treatment. A review of previous research also indicated that students with low visualization skills benefit more than those with higher visualization skills. The qualitative results showed that girls were more sophisticated in their visual-analytic processing strategies for solving three-dimensional tasks.
It is recommended that the teaching and learning of spatial visualization start in middle school, prior to students' more rigorous mathematics exposure in high school. A treatment duration longer than 6 weeks is also recommended for similar future research studies.
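The analysis of covariance used here can be expressed as a regression of posttest score on a group indicator plus the pretest covariate; the coefficient on the indicator is the treatment effect adjusted for initial ability. A stdlib-only sketch (not the study's actual code; two groups only, for brevity):

```python
def ancova_effect(pretest, posttest, group):
    """ANCOVA as the regression post = b0 + b1*group + b2*pre.
    `group` is 0/1; returns b1, the pretest-adjusted group effect.
    Solves the normal equations X'X b = X'y by Gaussian elimination."""
    X = [[1.0, g, p] for g, p in zip(group, pretest)]
    m = 3
    A = [[sum(r[i] * r[j] for r in X) for j in range(m)] for i in range(m)]
    b = [sum(r[i] * y for r, y in zip(X, posttest)) for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * m
    for r in range(m - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, m))) / A[r][r]
    return beta[1]
```

Statistical packages report the same adjusted effect together with its F test; the sketch shows only where the number comes from.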

Relevance: 100.00%

Publisher:

Abstract:

The field of chemical kinetics is an exciting and active one. The prevailing theories make a number of simplifying assumptions that do not always hold in actual cases. Another current problem concerns the development of efficient numerical algorithms for solving the master equations that arise in the description of complex reactions. The objective of the present work is to furnish a completely general and exact theory of reaction rates, in a form reminiscent of transition state theory, valid for all fluid phases, and also to develop a computer program that can solve complex reactions by finding the concentrations of all participating substances as a function of time. To this end, full quantum scattering theory is used to derive the exact rate law, and the resulting cumulative reaction probability is put into several equivalent forms that take into account all relativistic effects where applicable, including one that is strongly reminiscent of transition state theory but includes corrections from scattering theory. Two programs, one for solving complex reactions and the other for solving first-order linear kinetic master equations, were then developed and tested on simple applications.
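A first-order linear kinetic master equation is the linear system dp/dt = Kp, where column j of K holds the rates out of state j. The thesis does not detail its solver, so the following is only an illustration, integrating the system with a classical fourth-order Runge-Kutta step:

```python
def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def solve_master(K, p0, t_end, steps=1000):
    """Integrate dp/dt = K p.  K[i][j] is the rate into state i from
    state j; each column of K sums to zero, so probability is conserved."""
    def f(t, p):
        n = len(p)
        return [sum(K[i][j] * p[j] for j in range(n)) for i in range(n)]
    h = t_end / steps
    p, t = list(p0), 0.0
    for _ in range(steps):
        p = rk4_step(f, t, p, h)
        t += h
    return p
```

For the two-state exchange A ⇌ B with forward rate 1 and backward rate 2, the populations relax to the analytic equilibrium (2/3, 1/3), which makes a convenient correctness check.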

Relevance: 100.00%

Publisher:

Abstract:

This research focuses on developing a capacity planning methodology for emerging concurrent engineer-to-order (ETO) operations, with the primary focus on capacity planning at the sales stage. This study examines the characteristics of capacity planning in a concurrent ETO operation environment, models the problem analytically, and proposes a practical capacity planning methodology for concurrent ETO operations in industry. A computer program that mimics a concurrent ETO operation environment was written to validate the proposed methodology and to test a set of rules that affect the performance of a concurrent ETO operation. This study takes a systems engineering approach to the problem and employs systems engineering concepts and tools for modeling and analysing the problem, as well as for developing a practical solution. The study depicts a concurrent ETO environment in which capacity is planned. The capacity planning problem is modeled as a mixed integer program and then solved for smaller-sized applications to evaluate its validity and solution complexity. The objective is to select the best set of available jobs to maximize profit, while having sufficient capacity to meet each due date expectation. The nature of capacity planning for concurrent ETO operations differs from that of other operation modes, and the search for an effective solution to this problem is an emerging research field. This study characterizes the problem of capacity planning and proposes a solution approach. The mathematical model relates work requirements to capacity over the planning horizon, and the methodology is proposed for solving industry-scale problems. Along with the capacity planning methodology, a set of heuristic rules was evaluated for improving concurrent ETO planning.
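At its core, the sales-stage decision is a selection problem: pick the subset of available jobs that maximizes profit without exceeding capacity. For toy instances the mixed-integer model can be solved by exhaustive search over subsets; the single-resource sketch below is deliberately simplified (the thesis' model also tracks due dates across the planning horizon):

```python
def select_jobs(jobs, capacity):
    """Exhaustively solve a single-period job-selection knapsack.
    jobs : list of (profit, work) tuples; capacity : available work.
    Returns (best_profit, tuple of selected job indices)."""
    best_profit, best_set = 0.0, ()
    n = len(jobs)
    for mask in range(1 << n):           # every subset of jobs
        profit = work = 0.0
        for i in range(n):
            if mask >> i & 1:
                profit += jobs[i][0]
                work += jobs[i][1]
        if work <= capacity and profit > best_profit:
            best_profit = profit
            best_set = tuple(i for i in range(n) if mask >> i & 1)
    return best_profit, best_set
```

Industry-scale instances make such enumeration infeasible, which is exactly why the thesis turns to a mixed-integer formulation plus heuristic rules.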

Relevance: 100.00%

Publisher:

Abstract:

Software Engineering is one of the most widely researched areas of Computer Science. The ability to reuse software, much like the reuse of hardware components, is one of the key issues in software development. The object-oriented programming methodology is revolutionary in that it promotes software reusability. This thesis describes the development of a tool that helps programmers design and implement software from within the Smalltalk environment (an object-oriented programming environment). The ASDN tool is part of the PEREAM (Programming Environment for the Reuse and Evolution of Abstract Models) system, which advocates incremental development of software. The ASDN tool, along with the PEREAM system, seeks to enhance the Smalltalk programming environment by providing facilities for the structured development of abstractions (concepts). It produces a document that describes the abstractions developed using the tool. The features of the ASDN tool are illustrated by an example.

Relevance: 100.00%

Publisher:

Abstract:

In this work we propose a new method for the conceptual organization of the areas involving assistive technology, categorizing them in a logical and simple manner. Furthermore, we also propose the implementation of an interface based on electrooculography, able to generate high-level commands to trigger robotic, computing and electromechanical devices. To validate the eye interface, an electronic circuit was developed, associated with a computer program that captured the signals generated by the eye movements of users and produced high-level commands able to trigger an active brace and many other electromechanical systems. The results showed that it was possible to control many electromechanical systems through eye movements alone. The interface is presented as a viable way to perform the proposed task, and its signal analysis at the digital level can still be improved. The diagrammatic model developed proved to be a tool that is easy to use and understand, meeting the conceptual organization needs of assistive technology.
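The mapping from eye movements to high-level commands can be sketched as simple thresholding of the horizontal and vertical electrooculographic channels. The thresholds, units and command names below are illustrative, not the authors' (their digital-level analysis is described only in outline):

```python
def eog_to_command(sample, up=200.0, down=-200.0):
    """Map one (horizontal, vertical) EOG sample, in microvolts, to a
    high-level command.  Vertical deflections take priority; a sample
    inside the dead band means the gaze is at rest."""
    h, v = sample
    if v > up:
        return "UP"
    if v < down:
        return "DOWN"
    if h > up:
        return "RIGHT"
    if h < down:
        return "LEFT"
    return "REST"
```

A real interface would debounce these decisions over a window of samples before driving the brace or other actuator, to avoid triggering on blinks.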

Relevance: 100.00%

Publisher:

Abstract:

The main objective of this work was to enable the recognition of human gestures through the development of a computer program. The program captures the gestures executed by the user through a camera attached to the computer and sends the robot the command corresponding to the gesture. In total, five gestures made by the human hand were interpreted. The software (developed in C++) makes extensive use of computer vision concepts and the open source library OpenCV, which directly affect the overall efficiency of the control of mobile robots. The computer vision concepts include the use of filters to smooth/blur the image for noise reduction, colour spaces suited to the developer's needs, and other information useful for manipulating digital images. The OpenCV library was essential in creating the project because it provides functions/procedures for complete control of filters, image borders, image area, the geometric centre of borders, colour space conversion, convex hull and convexity defects, plus all the means necessary for characterizing image features. During the development of the software several problems appeared, such as false positives (noise), poor performance after the insertion of various filters with oversized masks, and problems arising from the choice of colour space for processing human skin tones. However, after the development of seven versions of the control software, it was possible to minimize the occurrence of false positives through better use of filters combined with a well-dimensioned mask size (tested at run time), all associated with a programming logic that was refined over the construction of the seven versions. At the end of this development, the software met the established requirements.
After completion of the control software, the overall effectiveness of the various versions was assessed; in particular, version V achieved 84.75%, version VI 93.00% and version VII 94.67%, showing that the final program performed well in interpreting gestures. This proved that mobile robot control through human gestures is possible without external accessories, giving better mobility and cost savings in maintaining such a system. The great merit of the program is its capacity to help demystify the man/machine relationship, since it uses an easy and intuitive interface for the control of mobile robots. Another important feature is that it is not necessary to be close to the mobile robot to control it: to control the equipment it is only necessary to receive the address that the Robotino passes to the program via the network or Wi-Fi.
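One filtering idea that suppresses single-frame false positives while preserving real transitions is a sliding median: an isolated spurious detection is outvoted by its neighbours, but a sustained change survives. The actual software applied OpenCV filters to images; this stdlib sketch shows the same principle on a 1-D detection trace, with an illustrative window size:

```python
def median_filter(signal, k=3):
    """Sliding median with odd window size k.  Isolated impulses
    ('false positives') are removed; step edges (real gesture onsets)
    are preserved.  Windows are truncated at the sequence ends."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sorted(window)[len(window) // 2])
    return out
```

Tuning the window (like the mask sizes tested at run time in the thesis) trades responsiveness against noise rejection: a larger window removes longer bursts of noise but delays the reported gesture.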

Relevance: 100.00%

Publisher:

Abstract:

Sorghum (Sorghum bicolor (L.) Moench) is a good alternative for use as silage, especially in places with water scarcity and high temperatures, due to its morphological and physiological characteristics. Appropriate management, such as the ideal sowing time, affects both the productivity and the quality of the silage. The work was conducted with the objective of evaluating the agronomic and bromatological performance of varieties and hybrids of silage sorghum, and their phenotypic stability, in two seasons, the main season and the off-season, in the city of Uberlândia, Minas Gerais. The experiments were performed at the Capim Branco Experimental Farm of the Federal University of Uberlândia (UFU), located in that city. There were two sowing dates in the same experimental area, off-season (March to June 2014) and main season (November 2014 to March 2015), and the varieties and hybrids were evaluated in both situations. The design was a randomized block with 25 treatments (hybrids and varieties of sorghum) and three replications. Agronomic and bromatological data were subjected to analysis of variance; averages were grouped by the Scott-Knott test at 5% probability using the Genes computer program; and stability was estimated with the Annicchiarico method. The flowering of the cultivars, dry matter productivity, plant height, Acid Detergent Fiber (ADF), Neutral Detergent Fiber (NDF) and Crude Protein (CP) are affected by the environment and the variety. Regarding productivity and fiber quality, the SF11 variety was superior, independent of the rated environment. In relation to the stability of dry matter performance, the varieties SF15, SF11, SF25, PROG 134 IPA, 1141572, 1141570 and 1141562 were highlighted. For the stability of fiber quality (ADF and NDF), the variety 1141562 stood out. The environment reduces the expression of the characters days to flowering, plant height and dry matter productivity of the hybrids.
Of the 25 hybrids analyzed for productivity and stability of dry matter performance, seven were highlighted regardless of the rated environment: the commercial hybrid Volumax and the experimental hybrids 12F39006, 12F39007, 12F37014, 12F39014, 12F38009 and 12F02006.
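The Annicchiarico method rescales each genotype's yield to a percentage of the corresponding environment mean, then penalizes variability: the reliability index is the mean percentage minus z times its standard deviation. A sketch under the usual choice of a 75% confidence level (z = 0.6745); whether the study used exactly this z is an assumption:

```python
import math

def annicchiarico(yields, z=0.6745):
    """Annicchiarico (1992) reliability index.

    yields : dict mapping genotype name -> list of yields, one value per
    environment (same environment order for every genotype).
    Returns a dict genotype -> index; values above 100 indicate a
    genotype reliably above the environment average."""
    names = list(yields)
    n_env = len(yields[names[0]])
    env_means = [sum(yields[g][e] for g in names) / len(names)
                 for e in range(n_env)]
    index = {}
    for g in names:
        pct = [100.0 * yields[g][e] / env_means[e] for e in range(n_env)]
        m = sum(pct) / n_env
        sd = math.sqrt(sum((p - m) ** 2 for p in pct) / (n_env - 1))
        index[g] = m - z * sd   # high mean, low variability -> high index
    return index
```

Ranking genotypes by this index is what identifies the stable varieties and hybrids reported above.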

Relevance: 100.00%

Publisher:

Abstract:

Based on close examinations of instant message (IM) interactions, this chapter argues that an interactional sociolinguistic approach to computer-mediated language use could provide explanations for phenomena that previously could not be accounted for in computer-mediated discourse analysis (CMDA). Drawing on the theoretical framework of relational work (Locher, 2006), the analysis focuses on non-task-oriented talk and its function in forming and establishing communication norms in the team, as well as on micro-level phenomena such as hesitation, backchannel signals and emoticons. The conclusions of this preliminary research suggest that the linguistic strategies used for substituting audio-visual signals are deployed strategically in discursive functions and have an important role in relational work.