940 results for MODEL (Computer program language)
Abstract:
We explore the relationships between the construction of a work of art and the crafting of a computer program in Java and suggest that the structure of paintings and drawings may be used to teach the fundamental concepts of computer programming. This movement "from Art to Science", using art to drive computing, complements the common use of computing to inform art. We report on initial experiences using this approach with undergraduate and postgraduate students. An embryonic theory of the correspondence between art and computing is presented and a methodology proposed to develop this project further.
Abstract:
This study was designed to validate a constructivist learning framework, herein referred to as Accessible Immersion Metrics (AIM), for second language acquisition (SLA), and to compare two delivery methods of the same framework. Originally developed in 2009 and piloted in 2010 at Champlain College St-Lambert, the AIM framework is proposed as a "how-to" guide for applying constructivist learning principles in the second language classroom. The AIM model allows language learning to occur free of a fixed schedule, to be socially constructive through the use of task-based assessments, and to remain relevant to the learner's life experience by focusing on students' needs rather than on course content.
Abstract:
The advantages of a COG (Component Object Graphic) approach to the composition of PDF pages were set out in a previous paper [1]. However, if pages are to be composed in this way, the individual graphic objects must have known bounding boxes and must be correctly placed on the page, in a process that resembles the link editing of a multi-module computer program. Ideally, the linker should be able to use all declared resource information attached to each COG. We have investigated the use of an XML application called Personalized Print Markup Language (PPML) to control the link-editing process for PDF COGs. Our experiments, though successful, revealed shortcomings in PPML's resource-handling capabilities, which currently operate at the document and page levels but cannot be elegantly applied to individual graphic objects at a sub-page level. Proposals are put forward for modifications to PPML that would ease any COG-based approach to page composition.
Abstract:
Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement’s structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The nondestructive FWD test drops weights on the pavement to simulate traffic loads and measures the resulting pavement deflection basins. Backcalculation of pavement geometry and layer properties from FWD deflections is a difficult inverse problem, and its solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), as well as numerical analysis techniques to properly simulate the geomechanical system. The widely used layered pavement analysis program ILLI-PAVE was employed to analyze flexible pavements of various types, including full-depth asphalt and conventional flexible pavements, built on either lime-stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs were used as surrogate models to provide faster solutions than the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop SOFTSYS models.
The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered SOFTSYS model performance. Still, SOFTSYS models were shown to work effectively with the synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For SOFTSYS validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime-stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results. The thickness data obtained from Ground Penetrating Radar testing matched reasonably well with predictions from SOFTSYS models. The differences observed in the HMA and lime-stabilized soil layer thicknesses were attributed to deflection data variability from FWD tests. The backcalculated asphalt concrete layer thickness results matched better for full-depth asphalt flexible pavements built on lime-stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
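The surrogate-model backcalculation idea can be sketched in a few lines. This is a hedged illustration only: the closed-form "forward model", the log-log surrogate, and the grid search below are stand-ins for ILLI-PAVE finite element runs, ANN surrogates, and genetic algorithms, and all numbers are invented.

```python
import numpy as np

def forward_model(modulus):
    """Stand-in for an expensive mechanistic run: deflection vs. layer modulus.
    Illustrative closed form only, not the real pavement model."""
    return 50.0 / np.sqrt(modulus)

# Build a cheap surrogate from a handful of "expensive" runs
E_train = np.linspace(100.0, 900.0, 9)
d_train = forward_model(E_train)
coeffs = np.polyfit(np.log(E_train), np.log(d_train), 1)  # log-log linear surrogate

def surrogate(modulus):
    """Fast approximation used in place of the expensive forward model."""
    return np.exp(np.polyval(coeffs, np.log(modulus)))

# Backcalculate: find the modulus whose surrogate deflection matches the
# "measured" FWD deflection (grid search stands in for a genetic algorithm)
d_observed = forward_model(400.0)
E_grid = np.linspace(100.0, 900.0, 8001)
E_back = E_grid[np.argmin(np.abs(surrogate(E_grid) - d_observed))]
```

The design choice mirrors the abstract: the expensive simulator is called only a few times to train the surrogate, and the inverse search then runs entirely against the cheap approximation.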
Abstract:
This work addresses the application of SPEA2, a multiobjective optimization method, to the computation of a dosing schedule for the chemotherapeutic treatment of a tumor mass; a dosing schedule here means the specification of the cytotoxic agent or agents, their doses, and the times at which they must be administered. The optimization problem solved here is a multiobjective one, since the dosing schedule to be computed must minimize not only the tumor size but also the toxicity remaining at the end of treatment, its cost, and so on. SPEA2 is a genetic algorithm that applies the Pareto criterion; what it computes is therefore an approximation to the Pareto front, from among whose solutions the user can choose the “best” one. In the course of this research, SoT-Q, a software tool, was built, consisting of two main modules: an optimizer that computes optimal dosing schedules, and a simulator that applies those schedules to a (simulated) patient with a tumor mass; the simulator is based on a pharmacodynamic model representing the tumor. Once extensively tested and debugged, the SoT-Q program could in the future assist oncologists in decision making about chemotherapy treatments, or serve as a pedagogical aid in the training of new health professionals. The results obtained were very good: in all test cases, both the tumor size and the toxicity remaining at the end of treatment were reduced significantly; in some cases the reduction was of three orders of magnitude.
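The Pareto criterion that SPEA2 applies can be illustrated with a minimal non-dominance filter. This is a sketch only: full SPEA2 also maintains an external archive, strength-based fitness, and density estimation, and the candidate objective values below are hypothetical, not outputs of SoT-Q.

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical (tumor_size, residual_toxicity) pairs for candidate schedules
candidates = [(3.0, 0.5), (2.0, 1.0), (2.5, 0.8), (4.0, 0.4), (2.0, 1.2)]
front = pareto_front(candidates)
# (2.0, 1.2) is dominated by (2.0, 1.0); the other four trade off the
# two objectives and remain for the user to choose among
```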
Abstract:
In the library science field, to develop working tools that allow us to communicate with other information units and clients, we must think about standardizing our processes. Standardization, in the broadest sense, means following rules that make a record uniform anywhere in the world. This is achieved by applying codes and standards that have been accepted internationally by the bodies created for this purpose. The value of standardization lies in avoiding duplicated cataloguing effort across the different documentary information units and in facilitating the exchange of bibliographic data as soon as possible. It also consists of giving the client the ability to locate information in a homogeneous form that removes language barriers.
Abstract:
Master's dissertation, Qualidade em Análises (Quality in Analysis), Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014
Abstract:
In mechanical engineering there is an ever-growing need to use and to predict the behaviour of thermal machines, more precisely internal combustion engines, especially in the area of maintenance and of failure prevention for one of the vital components of a four-stroke engine: the crankshaft. This situation has been widely observed in the marine engineering industry, namely in the Portuguese Navy, and, given its high importance to the performance of any engine, this thesis focuses on the study of the S.E.M.T Pielstick diesel engines of the Portuguese Navy's ships, more specifically the corvettes of the “João Coutinho” and “Baptista de Andrade” classes, owing to the history of crankshaft failures in this class of ships and in others of the Portuguese Navy. For this study, all available data on the failure history of these engines, as well as all available data from their manufacturer, were used to reproduce as faithfully as possible a three-dimensional model of the crankshaft in the CAD modelling program SolidWorks®, enabling a kinematic analysis of the crankshaft. In this way, it was possible to simulate the engine's operating conditions and to analyse and determine the cause of crankshaft failure, with the aim of extending the service life of the crankshafts and contributing not only to lower maintenance costs but also to increased operational availability of these ships.
Abstract:
The authors present a proposal to develop intelligent assisted-living environments for home-based healthcare. These environments unite a semantic representation of the chronic patient's clinical history with the ability to monitor living conditions and events through a fully managed Semantic Web of Things (SWoT). Several levels of acquired knowledge, together with the case-based reasoning made possible by knowledge representation of the health-disease history and the acquisition of scientific evidence, will deliver, through various voice-based natural interfaces, adequate support systems for disease self-management, most prominently by activating the less differentiated caregiver for any specific need. With these capabilities at hand, home-based healthcare becomes a viable possibility, reducing the need for institutionalization. The resulting integrated healthcare framework will provide significant savings while improving general health and satisfaction indicators.
Abstract:
In order to determine the energy needed to artificially dry an agricultural product, the latent heat of vaporization of moisture in the product, H, must be known. Generally, the expressions for H reported in the literature are of the form H = h(T)f(M), where h(T) is the latent heat of vaporization of free water and f(M) is a function of the equilibrium moisture content, M, which is a simplification. In this article, a more general expression for the latent heat of vaporization, namely H = g(M,T), is used to determine H for cowpea (always-green variety). For this purpose, a computer program was developed that automatically fits about 500 functions, with one or two independent variables, embedded in its library to experimental data. The program uses nonlinear regression and classifies the best functions according to the least reduced chi-squared. A set of statistical tests shows that the generalized expression for H used in this work produces better results for cowpea than other equations found in the literature.
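The idea of ranking candidate function forms by least reduced chi-squared can be sketched with two linearized candidates. This is an illustration with synthetic data and invented function forms, not the article's 500-function library or the actual cowpea measurements.

```python
import numpy as np

# Synthetic H(M, T) data (illustrative values only)
rng = np.random.default_rng(0)
M = rng.uniform(0.05, 0.35, 40)           # equilibrium moisture content
T = rng.uniform(20.0, 60.0, 40)           # temperature, deg C
H_true = (2500.0 - 2.4 * T) * (1.0 + 1.5 * np.exp(-20.0 * M))
H_obs = H_true + rng.normal(0.0, 2.0, 40)  # measurement noise, sigma = 2.0

def fit_and_score(design, y, sigma=2.0):
    """Least-squares fit; return the reduced chi-squared of the residuals."""
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ coef
    dof = len(y) - design.shape[1]
    return np.sum((resid / sigma) ** 2) / dof

# Candidate 1: separable simplification, linearized as a + b*T + c*exp(-20*M)
X1 = np.column_stack([np.ones_like(M), T, np.exp(-20.0 * M)])
# Candidate 2: general g(M, T) form with a cross term, matching the true model
X2 = np.column_stack([np.ones_like(M), T, np.exp(-20.0 * M),
                      T * np.exp(-20.0 * M)])

chi1 = fit_and_score(X1, H_obs)
chi2 = fit_and_score(X2, H_obs)
# The general g(M, T) candidate achieves the smaller reduced chi-squared
```

As in the article, the candidate whose reduced chi-squared is closest to the smallest value wins; here the general form captures the M-T interaction that the separable form misses.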
Abstract:
A model is presented in which agents show discrete behavior in their actions but hold continuous opinions that are updated through interaction with other agents. This new updating rule is applied to both the voter and Sznajd models for interaction between neighbors, and its consequences are discussed. The appearance of extremists is naturally observed and seems to be a characteristic of this model.
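An update rule of this kind (continuous opinion, discrete displayed action) can be sketched as a Bayesian update in the style of continuous-opinion/discrete-action models. The parameter ALPHA and the exact rule below are assumptions for illustration, not necessarily the paper's rule.

```python
# Hypothetical parameter: an agent assumes a neighbour's displayed choice
# is the correct one with probability ALPHA
ALPHA = 0.7

def action(p):
    """Discrete behaviour: display choice A when the continuous opinion p > 0.5."""
    return 'A' if p > 0.5 else 'B'

def update(p, neighbour_action):
    """Bayesian update of the continuous opinion after seeing a discrete action."""
    like_A = ALPHA if neighbour_action == 'A' else 1.0 - ALPHA
    like_B = 1.0 - like_A
    return p * like_A / (p * like_A + (1.0 - p) * like_B)

p = 0.6                      # a mildly pro-A agent
for _ in range(10):          # repeatedly observes pro-A neighbours
    p = update(p, 'A')
# the opinion becomes extreme even though the displayed action never changes,
# matching the natural appearance of extremists noted in the abstract
```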
Abstract:
The purpose of this paper is to propose a multiobjective optimization approach for solving the manufacturing cell formation problem, explicitly considering the performance of the manufacturing system. Cells are formed so as to simultaneously minimize three conflicting objectives, namely, the level of work-in-process, the intercell moves, and the total machinery investment. A genetic algorithm performs a search in the design space in order to approximate the Pareto optimal set. The objective values for each candidate solution in a population are assigned by running a discrete-event simulation, in which the model is automatically generated according to the number of machines and their distribution among cells implied by a particular solution. The potential of this approach is evaluated via its application to an illustrative example and a case from the relevant literature, and the obtained results are analyzed and reviewed. It is concluded that this approach is capable of generating a set of alternative manufacturing cell configurations considering the optimization of multiple performance measures, greatly improving the decision-making process involved in planning and designing cellular systems. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Leakage reduction in water supply systems and distribution networks has been an increasingly important issue in the water industry, since leaks and ruptures result in major physical and economic losses. Hydraulic transient solvers can be used in system operational diagnosis, namely for leak detection, due to their capability to describe the dynamic behaviour of the systems and to provide substantial amounts of data. In this research work, the association of hydraulic transient analysis with an optimisation model, through inverse transient analysis (ITA), has been used for leak detection and location in an experimental facility containing PVC pipes. Observed transient pressure data have been used for testing ITA. A key factor for the success of this leak detection technique is the accurate calibration of the transient solver, namely adequate boundary conditions and the description of energy dissipation effects, since PVC pipes are characterised by a viscoelastic mechanical response. Results have shown that leaks were located with an accuracy of 4–15% of the total length of the pipeline, depending on the discretisation of the system model.
Abstract:
The discrete-time neural network proposed by Hopfield can be used for storing and recognizing binary patterns. Here, we investigate how the performance of this network on a pattern recognition task is altered when neurons are removed and the weights of the synapses corresponding to these deleted neurons are divided among the remaining synapses. Five distinct ways of distributing such weights are evaluated. We speculate how this numerical work on synaptic compensation may help to guide experimental studies on memory rehabilitation interventions.
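The deletion-and-compensation experiment can be sketched as follows. This is a minimal illustration: the uniform redistribution below is just one plausible scheme among the five the abstract evaluates, and the network size, pattern count, and deletion fraction are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))   # three random binary patterns

# Hebbian storage rule
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

def recall(W, state, steps=5):
    """Synchronous sign-threshold updates of the Hopfield network."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# Delete some neurons and redistribute their synaptic weight uniformly
# over the surviving synapses (one possible compensation scheme)
removed = rng.choice(N, size=10, replace=False)
lost = W[:, removed].sum(axis=1)              # total incoming weight lost
W2 = W.copy()
W2[:, removed] = 0.0
W2[removed, :] = 0.0
kept = np.setdiff1d(np.arange(N), removed)
W2[np.ix_(kept, kept)] += lost[kept, None] / len(kept)
np.fill_diagonal(W2, 0.0)

# Measure recall quality on the surviving neurons
out = recall(W2, patterns[0].astype(float))
overlap = np.mean(out[kept] * patterns[0][kept])
```

With only three patterns in a 100-neuron network, the stored memory survives a 10% deletion well; the interesting regime studied in such work is how the overlap degrades as deletion grows and how the choice of redistribution scheme changes that curve.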
Abstract:
A new digital computer mock circulatory system has been developed to replicate the physiologic and pathophysiologic characteristics of the human cardiovascular system. The computer performs the acquisition of pressure, flow, and temperature in an open-loop system. A computer program was developed in the LabVIEW programming environment to evaluate all these physical parameters. The acquisition system comprised pressure, flow, and temperature sensors as well as signal-conditioning modules. In this study, results for flow, cardiac frequency, pressure, and temperature were evaluated according to physiologic ventricular states, and compared with literature data. In future work, performance investigations will be conducted on a ventricular assist device and an endoprosthesis; the device should also allow for the evaluation of several kinds of vascular diseases.