41 results for Spanish language -- To 1500 -- Participle -- Congresses
Abstract:
This paper considers experiences from the International Master's in Rural Development at the Technical University of Madrid (Universidad Politécnica de Madrid, UPM), the first Spanish programme to receive a mention as a Registered Education Programme from the International Project Management Association (IPMA). Backed by an educational strategy based on Project-Based Learning and dating back twenty years, the programme has managed to adapt to the competence evaluation requirements proposed by the European Higher Education Area (EHEA). To do so, the training is linked to the professional qualification, using as a reference the competences leading to certification in project management as established by the IPMA.
Abstract:
This paper discusses a novel hybrid approach to text categorization that combines a machine learning algorithm, which provides a base model trained with a labeled corpus, with a rule-based expert system, which improves the results of the base classifier by filtering false positives and dealing with false negatives. The main advantage is that the system can be easily fine-tuned by adding specific rules for noisy or conflicting categories that have not been successfully trained. We also describe an implementation based on k-Nearest Neighbor and a simple rule language that expresses lists of positive, negative and relevant (multiword) terms appearing in the input text. The system is evaluated in several scenarios, including the popular Reuters-21578 news corpus, for comparison with other approaches, and categorization using IPTC metadata, the EUROVOC thesaurus and others. Results show that this approach achieves precision comparable to top-ranked methods, with the added value that it does not require a demanding human expert workload to train it.
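As a hedged illustration of the hybrid scheme (not the paper's actual rule language), the Python sketch below shows how per-category lists of positive and negative terms can post-process a base classifier's decisions, dropping likely false positives and recovering likely false negatives. The rule fields and the example rule set are assumptions made for the sketch.

```python
# Minimal sketch of the hybrid idea described above: a base classifier's
# category decisions are post-processed by per-category term rules.
# The rule fields ("positive", "negative") and all names are illustrative
# assumptions, not the paper's actual rule language.

def apply_rules(text, predicted, rules):
    """Filter a base classifier's predicted categories with term rules.

    rules maps a category to positive terms (whose presence recovers the
    category as a false-negative fix) and negative terms (whose presence
    drops the category as a false positive).
    """
    tokens = text.lower()
    kept = set(predicted)
    for category, rule in rules.items():
        pos = [t for t in rule.get("positive", []) if t in tokens]
        neg = [t for t in rule.get("negative", []) if t in tokens]
        if category in kept and neg:
            kept.discard(category)          # filter a false positive
        elif category not in kept and pos and not neg:
            kept.add(category)              # recover a false negative
    return sorted(kept)

rules = {
    "economy": {"positive": ["interest rate", "inflation"],
                "negative": ["football"]},
}
print(apply_rules("Inflation rose sharply in May.", ["sports"], rules))
# -> ['economy', 'sports']
```

In a full system the base predictions would come from the trained kNN model; the point of the rule layer is that a human can patch one noisy category without retraining anything.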
Abstract:
The Santa Irene flood, at the end of October 1982, is one of the most dramatic and widely reported flood events in Spain. Its renown is mainly attributable to the collapse of the Tous dam, but its main significance is as a paradigm of how maritime/littoral weather, and the temporary sea-level rise it brings, affects inland flooding on coastal plains. The Santa Irene flood was caused by a meteorological phenomenon known as the gota fría (cold drop), a relatively frequent and intense rain-producing phenomenon on the Iberian Peninsula, particularly in the eastern to south-eastern Spanish inlands and coasts. Several circumstances can easily come together there to unleash a cold drop: cold, dry polar air masses moving over the whole Iberian Peninsula and the north of Africa, high sea-water temperatures, and low-pressure (cyclonic) areas in the western Mediterranean basin. These circumstances are quite common during the autumn, both there and in other places around the world (E/SE Africa). Their occurrence, however, shows great spatio-temporal variability, much as hurricanes do in the Caribbean and the western North Atlantic, or typhoons elsewhere; all of these are comparable, though distinct, phenomena whose magnitude differs from event to event. This paper presents the results of a detailed analysis of and reflection on the cold drop phenomenon as a whole, on the generation of its rains, and on the different natures and consequences of its floods. It also explains the ways in which nearby maritime weather, and the sea level it induces, govern floods in different zones of a hydrographic basin. The Santa Irene case can be considered a paradigm for explaining the influence of nearby maritime climatic conditions on flooding not only in coastal areas but also further inland.
Abstract:
We present a new free library for Constraint Logic Programming over Finite Domains, included with the Ciao Prolog system. The library is entirely written in Prolog, leveraging Ciao's module system and code transformation capabilities to achieve a highly modular design without compromising performance. We describe the interface, implementation, and design rationale of each modular component. The library meets several design goals: a high level of modularity, allowing individual components to be replaced by different versions; high efficiency, being competitive with other FD implementations; a glass-box approach, so the user can specify new constraints at different levels; and a Prolog implementation, to ease integration with Ciao's code analysis components. The core is built upon two small libraries which implement integer ranges and closures. On top of these, a finite domain variable datatype is defined, which takes care of constraint re-execution depending on range changes. These three libraries form what we call the FD kernel of the library. This FD kernel is used in turn to implement several higher-level finite domain constraints, specified using indexicals. Together with a labeling module, this layer forms what we name the FD solver. A final level integrates the CLP(FD) paradigm with our FD solver. This is achieved using attributed variables and a compiler from the CLP(FD) language to the set of constraints provided by the solver. It should be noted that the user of the library is encouraged to work at whichever of these levels is convenient: from writing a new range module to enriching the set of FD constraints by writing new indexicals.
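The layering the abstract describes (integer ranges, closures re-executed on range changes, indexical-style propagators) can be sketched outside Prolog. The Python toy below is only an illustration of that design with an invented API; it is not the Ciao library's interface.

```python
# A toy sketch of the layering described above: integer ranges plus
# closures ("watchers") that re-execute constraints when a variable's
# range changes. Names and API are illustrative, not the Ciao library's.

class FDVar:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
        self.watchers = []              # constraint closures to re-run

    def narrow(self, lo, hi):
        lo, hi = max(self.lo, lo), min(self.hi, hi)
        if (lo, hi) == (self.lo, self.hi):
            return                      # no change: nothing to re-execute
        if lo > hi:
            raise ValueError("empty domain: inconsistent constraints")
        self.lo, self.hi = lo, hi
        for wake in list(self.watchers):
            wake()                      # constraint re-execution on change

def less_than(x, y):
    """Post x < y as an indexical-style propagator over bounds."""
    def propagate():
        x.narrow(x.lo, y.hi - 1)        # x must fit below y's upper bound
        y.narrow(x.lo + 1, y.hi)        # y must sit above x's lower bound
    x.watchers.append(propagate)
    y.watchers.append(propagate)
    propagate()

x, y = FDVar(0, 9), FDVar(0, 5)
less_than(x, y)
print((x.lo, x.hi), (y.lo, y.hi))       # -> (0, 4) (1, 5)
```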
Abstract:
Fusarium proliferatum has been reported on garlic in the northwest USA, Spain and Serbia, causing water-soaked, tan-colored lesions on cloves. In this work, Fusarium proliferatum was isolated from 300 symptomatic garlic bulbs. Morphological identification of Fusarium was confirmed using species-specific PCR assays and EF-1α sequencing. Pathogenicity was confirmed for eighteen isolates. Six randomly selected F. proliferatum isolates from garlic were tested for specific pathogenicity and screened for fusaric acid production. Additionally, the pathogenicity of each F. proliferatum isolate was tested on healthy seedlings of onion (Allium cepa), leek (A. porrum), scallion (A. fistulosum), chive (A. schoenoprasum) and garlic (A. sativum). A disease severity index (DSI) was calculated as the mean severity on three plants of each species, with four test replicates. Symptoms on onion and garlic plants were observed three weeks after inoculation. All isolates tested produced symptoms on all varieties inoculated. Inoculating other Allium species with F. proliferatum isolates from diseased garlic provided new information on host range and pathogenicity. The results demonstrated differences in susceptibility with respect to host species and cultivar. All the F. proliferatum isolates tested produced fusaric acid (FA); correlations between FA production and isolate pathogenicity are discussed. Additionally, all isolates carried the FUM1 gene, suggesting that the Spanish isolates are able to produce fumonisins.
Abstract:
Kinetic Monte Carlo (KMC) is a widely used technique to simulate the evolution of radiation damage inside solids. Despite the fact that this technique was developed several decades ago, there is no established, easily accessible simulation tool for researchers in this field, unlike the situation for molecular dynamics or density functional theory calculations. In practice, scientists must develop their own tools or use unmaintained ones to perform these types of simulations. To fill this need we have developed MMonCa, the Modular Monte Carlo simulator. MMonCa has been developed using professional C++ programming techniques and built on top of an interpreted language, yielding a modern simulator that is powerful yet flexible, robust yet customizable, and easy to access. Both non-lattice and lattice KMC modules have been developed. We present MMonCa at this conference for the first time. Along with other (more detailed) contributions at this meeting, the versatility of MMonCa for studying a range of problems in different materials (particularly Fe and W) under a wide range of conditions will be shown. Regarding KMC simulations, we have studied neutron-generated cascade evolution in Fe (as a model material). Starting from a Frenkel pair distribution, we have followed the defect evolution up to 450 K. Comparison with previous simulations and experiments shows excellent agreement. Furthermore, we have studied a more complex system (He-irradiated W:C) using a previous parametrization [1]. He irradiation at 4 K followed by isochronal annealing steps up to 500 K has been simulated with MMonCa, with He energies of 400 eV or 3 keV. In the first case no damage is associated with the He implantation, whereas in the second a significant Frenkel pair concentration (evolving into complex clusters) is associated with the He ions. We have been able to explain He desorption both in the absence and in the presence of Frenkel pairs, and we have also applied MMonCa to high He doses and fluxes at elevated temperatures. He migration and trapping dominate the kinetics of He desorption. These processes will be discussed and compared to experimental results. [1] C.S. Becquart et al., J. Nucl. Mater. 403 (2010) 75.
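For readers unfamiliar with the method itself, the following is a minimal residence-time (BKL-style) KMC step in Python. It illustrates only the generic algorithm the abstract builds on; the event names and rates are arbitrary placeholders, and MMonCa's C++ implementation is of course far more elaborate.

```python
# A minimal residence-time kinetic Monte Carlo step: pick one event with
# probability proportional to its rate, then advance the clock by an
# exponentially distributed residence time. Rates here are arbitrary.
import random

def kmc_step(events, t):
    """Return (chosen event name, new simulation time)."""
    total = sum(rate for _, rate in events)
    r = random.random() * total
    acc = 0.0
    for name, rate in events:
        acc += rate
        if r < acc:
            chosen = name
            break
    t += random.expovariate(total)      # exponential waiting time
    return chosen, t

# e.g. vacancy migration vs. Frenkel-pair recombination (invented rates)
events = [("V_migration", 1e6), ("FP_recombination", 2e5)]
t = 0.0
for _ in range(3):
    ev, t = kmc_step(events, t)
    print(f"{ev} at t = {t:.3e} s")
```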
Abstract:
This dissertation surveys the EMC, radio, SAR and electrical safety test benches used in market surveillance, and describes the tests performed by the Spanish State to monitor and control the equipment placed on the market. It first gives a qualitative description of the test environments, classified by requirement, explaining what each test consists of, how the measurement setup is configured and which equipment must be used. It is illustrated with photographs of the racks, antennas, amplifiers, generators and other equipment that the SETSI keeps in its laboratory in El Casar (Guadalajara). Equipment surveillance is carried out periodically in the member states of the European Union, and the responsible officials also meet periodically at European level to draw conclusions and make projections for the future. The equipment, purchased in retail outlets, is returned to the commercial chain after little more than a month of testing, so the devices cannot be damaged; the laboratory tests must therefore be harmless to the devices, and precautions must be taken when executing them. The core of this final-year project (PFC) is the area of electrical safety. Chapter 4 is a reproduction of the international standard IEC 60950-1:2007/A11 on the safety of information technology equipment, adapted to the requirements of market surveillance; the competent authorities rely on this standard to assess and guarantee user safety in consumer equipment. Once the test procedures had been designed and a bench assembled for them, an operator manual was prepared (Chapter 5) collecting each of the tests and visual inspections to be performed to show that the equipment complies. Finally, in Chapter 6, example tests are run on three test devices and the corresponding test sheets are produced. The aim is to give the operator a complete manual for the electrical safety bench, which is why the operator manual repeats many fragments of the standard, making the specific purpose of each test, and the relevant information, easy to find.
Abstract:
In the context of the Semantic Web, resources on the net can be enriched with well-defined, machine-understandable metadata describing their associated conceptual meaning. These metadata, consisting of natural language descriptions of concepts, are the focus of the activity we describe in this chapter, namely ontology localization. In the framework of the NeOn Methodology, ontology localization is defined as the activity of adapting an ontology to a particular language and culture. This adaptation mainly involves translating the natural language descriptions of the ontology from a source natural language to a target natural language, with the final objective of obtaining a multilingual ontology, that is, an ontology documented in several natural languages. The purpose of this chapter is to provide detailed and prescriptive methodological guidelines to support the performance of this activity.
Abstract:
The purpose of this report is to build a model that represents, as faithfully as possible, the seismic behavior of a pile-cap bridge foundation under a nonlinear static (pushover) analysis procedure. The model reproduces a specimen already built in the laboratory. A pseudo-static lateral pushover test is applied to the pile cap until the structure fails through the formation of a plastic hinge in the piles due to horizontal deformation. The pushover test consists of increasing the horizontal load on the pile cap until the target horizontal displacement at the height of the pile cap is reached. The output of the model is a skeleton curve plotting the lateral load (kN) against the displacement (m), from which the maximum movement the pile-cap foundation can undergo before failure can be calculated. Failure is taken to occur when the load at a given displacement has dropped to 85% of the maximum. The finite element model was based on a pile cap built for a laboratory experiment already carried out by the Master's student Deming Zhang at Tongji University. Two different pile caps were tested, differing in height above ground level: one rises 0.3 m and the other 0.8 m above the ground. The computer model was calibrated using the experimental results. The pile-cap foundation is programmed in a finite element environment called OpenSees (Open System for Earthquake Engineering Simulation [28]). OpenSees is free software developed at the University of California, Berkeley, specialized, as its name says, in the study of earthquakes and their effects on structures; this specialization is the main reason it was chosen, as it makes it possible to build any finite element model and perform the analyses needed to obtain the desired results. The development of OpenSees is sponsored by the Pacific Earthquake Engineering Research Center through the National Science Foundation engineering and education centers program. OpenSees models are programmed in the Tcl scripting language.
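The 85%-of-peak failure criterion on the skeleton curve is easy to make concrete. The Python sketch below, with purely synthetic curve data, finds the first post-peak point at which the lateral load has degraded to 85% of the maximum; it illustrates the criterion only, not the OpenSees model itself.

```python
# A small sketch of the failure criterion described above: on the
# skeleton (load-displacement) curve, failure is taken at the point on
# the descending branch where the load drops to 85% of the peak.
# The curve data here is synthetic, purely for illustration.

def failure_point(displacements, loads, fraction=0.85):
    """Return the first post-peak (displacement, load) pair at which the
    load has fallen to `fraction` of the maximum load."""
    peak = max(loads)
    i_peak = loads.index(peak)
    threshold = fraction * peak
    for d, f in zip(displacements[i_peak:], loads[i_peak:]):
        if f <= threshold:
            return d, f
    return None  # the curve never degrades to the threshold

disp = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06]        # m
load = [0.0, 120.0, 210.0, 260.0, 240.0, 215.0, 190.0]   # kN
print(failure_point(disp, load))
# -> (0.05, 215.0), the first load at or below 0.85 * 260 = 221 kN
```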
Abstract:
From its origins, popular or traditional architecture has lived next door to, and sometimes within, high architecture, in charge of housing and giving expression to the most vital part of human existence while serving what is built for purely representative purposes. The interest in intellectually incorporating the plastic, constructive and functional deposits of that anonymous architecture has been instrumental in the configuration of the Modern Architectural Cycle. This thesis focuses on the architecture made in Spain halfway through the last century, when our way of building and living went through an extraordinary period through the work of architects such as Fisac, Coderch, De la Sota and Fernández del Amo, among others. Their projects, based on a new look at our popular heritage, became a rejoinder from realism to the picturesque model of the so-called Reconstrucción, the monumentalist architecture of the 1940s. From the study of this particular architectural moment, the scope of our reflection widens, on the one hand, to the architecture of the avant-garde of the Second Spanish Republic (to which it somehow refers back) and, on the other, to the branch of the architecture of the 1960s, when an extraordinary development boom took place in Spain, which kept invoking the popular as the essence of the project, although now understood in a broader sense. The architecture of the masters who articulate our work engaged fully in the debate on the role of context, which would eventually be formalized after the crisis of the Modern Movement that occurred on the ruins of World War II. The purpose of this thesis is to bring out the design tools of these architectures committed to their roots, particularly now, when they are experiencing an interesting global revival.
Abstract:
The aim of this paper is to discuss the use of knowledge models to formulate general applications. First, the paper presents the recent evolution of the software field, in which increasing attention is paid to conceptual modeling. Then the current state of knowledge modeling techniques is described, where modern knowledge acquisition techniques and supporting tools provide increased reliability. The KSM (Knowledge Structure Manager) tool is described next. First, the concept of a knowledge area is introduced as a building block that groups methods for performing a collection of tasks together with the bodies of knowledge providing the basic methods to perform the basic tasks. Then the CONCEL language for defining domain vocabularies and the LINK language for formulating methods are introduced. Finally, the object-oriented implementation of a knowledge area is described, and a general methodology for application design and maintenance supported by KSM is proposed. To illustrate the concepts and methods, an example of a system for intelligent traffic management in a road network is described, followed by a proposal to generalize the resulting architecture for reuse. The paper closes with some concluding comments on the feasibility of using the knowledge modeling tools and methods for general application design.
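As an informal reading of the knowledge area concept (not KSM's actual API), the sketch below bundles bodies of knowledge with the methods that use them, echoing the object-oriented implementation the paper describes; the traffic example mirrors the one in the text, but all names and values are invented.

```python
# An illustrative object-oriented reading of the "knowledge area"
# building block described above: a unit that bundles bodies of domain
# knowledge with the methods (task implementations) that use them.
# Class and method names are assumptions for the sketch, not KSM's API.

class KnowledgeArea:
    def __init__(self, name):
        self.name = name
        self.knowledge = {}    # bodies of knowledge, e.g. rules, facts
        self.methods = {}      # task name -> method using the knowledge

    def add_knowledge(self, key, body):
        self.knowledge[key] = body

    def add_method(self, task, fn):
        self.methods[task] = fn

    def perform(self, task, **inputs):
        return self.methods[task](self.knowledge, **inputs)

# e.g. a traffic-management area with one diagnosis task
traffic = KnowledgeArea("urban traffic")
traffic.add_knowledge("congestion_threshold", 0.8)
traffic.add_method(
    "diagnose",
    lambda kb, occupancy: "congested"
    if occupancy > kb["congestion_threshold"] else "free flow",
)
print(traffic.perform("diagnose", occupancy=0.9))   # -> congested
```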
Abstract:
This paper describes the adaptation approach for the reusable knowledge representation components used in the KSM environment for the formulation and operationalisation of structured knowledge models. Reusable knowledge representation components in KSM are called primitives of representation. A primitive of representation provides: (1) a knowledge representation formalism; (2) a set of tasks that use this knowledge, together with several problem-solving methods to carry out these tasks; (3) a knowledge acquisition module that provides different services to acquire and validate this knowledge; and (4) an abstract terminology covering the linguistic categories included in the representation language associated with the primitive. Primitives of representation are usually domain-independent. A primitive of representation can be adapted to support knowledge in a given domain by importing concepts from that domain. The paper describes how this activity can be carried out by means of a terminological importation. Informally, a terminological importation partially populates an abstract terminology with concepts taken from a given domain. The information provided by the importation can be used by the acquisition and validation facilities to constrain the classes of knowledge that can be described using the representation formalism, according to the domain knowledge. KSM provides the LINK-S language to specify a terminological importation from a domain terminology to an abstract one. These terminologies are described in KSM by means of the CONCEL language. Terminological importation is used to adapt reusable primitives of representation in order to increase their usability in these domains. In addition, two primitives of representation can share a common vocabulary by importing common domain CONCEL terminologies (conceptual vocabularies), a necessary condition for interoperability between different, heterogeneous knowledge representation components in the framework of complex knowledge-based architectures.
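A terminological importation can be pictured with a toy example. The Python sketch below partially populates an abstract terminology with domain concepts and lets a validation step accept only imported terms; it is a loose illustration of the idea, not of the LINK-S or CONCEL languages.

```python
# A toy sketch of the terminological importation idea above: an abstract
# terminology's linguistic categories are partially populated with
# domain concepts, and acquisition then accepts only knowledge phrased
# with imported terms. All names are illustrative, not KSM's notation.

abstract_terminology = {"component": set(), "failure_mode": set()}

def import_terms(terminology, category, domain_concepts):
    """Populate one abstract category with concepts from a domain."""
    terminology[category] |= set(domain_concepts)

def validate(terminology, category, term):
    """Acquisition-time check: the term must have been imported."""
    return term in terminology[category]

import_terms(abstract_terminology, "component", ["pump", "valve"])
import_terms(abstract_terminology, "failure_mode", ["leak", "blockage"])

print(validate(abstract_terminology, "component", "pump"))    # -> True
print(validate(abstract_terminology, "component", "piston"))  # -> False
```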
Abstract:
Applications studying travel behavior from the perspective of land use date from the 1990s. Usually, four important components are distinguished: density, diversity and design (the 3Ds of Cervero and Kockelman) and accessibility (introduced by Geurs and van Wee). But there is no general agreement on how to measure each of these four components. Density tends to be measured as population and employment densities, though other authors separate population density into residential and building densities. Many measures have been developed to estimate diversity: among others, a dissimilarity index to indicate the degree to which different land uses lie within one another's surroundings, an entropy index to quantify the degree of balance across various land use types, or proximity to commercial-retail uses. Design has been characterized by site design and by dwelling and street characteristics. Lastly, accessibility has become a frequently used concept, but in the travel behavior field its meaning always refers to the ability 'to reach activities or locations by means of a travel mode', measured as accessibility to jobs, to leisure activities, and so on. Furthermore, the previous evidence is mainly based on data from the US or from northern European countries. This paper therefore adds some new evidence from a Spanish perspective to the research debate. Using a Madrid smartphone-based survey, factor analysis is applied to linearly combine variables into the 3D and accessibility dimensions of the built environment. As a first step for future investigations, land use variables are treated so as to define these four components accurately.
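The factor-analysis step can be sketched as follows. The Python fragment below uses scikit-learn's FactorAnalysis to combine observed land-use indicators into four latent built-environment dimensions; the variable names and the random data are placeholders, not the Madrid survey's actual fields.

```python
# A hedged sketch of the factor-analysis step described above: observed
# land-use variables are linearly combined into a few built-environment
# factors (density, diversity, design, accessibility). The column names
# and random data are placeholders, not the survey's actual fields.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# rows = zones, columns = observed land-use indicators
X = rng.normal(size=(200, 6))
columns = ["pop_density", "job_density", "entropy_mix",
           "street_connectivity", "jobs_accessible", "retail_proximity"]

fa = FactorAnalysis(n_components=4, random_state=0)
factors = fa.fit_transform(X)      # zone scores on the 4 latent dimensions

# loadings show how each observed variable contributes to each factor
for name, row in zip(columns, fa.components_.T):
    print(name, np.round(row, 2))
```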
Abstract:
The idea of endowing a group of robots or artificial agents with a language has been the subject of intense study in recent decades. Naturally, the first attempts focused on the emergence of vocabularies conventionally shared by the group of robots. The advantages a common lexicon can offer are evident, as is the fact that a language with a more complex structure, in which words can be combined, would be even more beneficial. This has given rise to proposals aimed at the emergence of a consensual language with a syntactic structure similar to that of human language, among which this work is placed. Taking human language as a model means adopting some of the hypotheses and theories that disciplines such as philosophy, psychology and linguistics, among others, have put forward. According to these theoretical approaches, language has a double dimension, formal and functional. On its formal side it seems clear that language follows rules, so the use of a grammar has been considered essential for its representation, but also because grammars are a very simple and powerful device for generating symbolic structures. As for the functional dimension, perhaps the most influential theory of recent times has been taken into account: the Theory of Speech Acts. This theory builds on Wittgenstein's idea that meaning lies in the use of language, to the point that language is understood as a way of acting and behaving, in short as a form of life. With these premises in mind, this thesis experiments with computational models that allow a group of robots to reach a common language autonomously, simply through individual interactions between the robots in the form of language games. Three different language models are proposed:
• A model based on probabilistic grammars and reinforcement learning, in which interactions and language use are key to the emergence of the language, using a static generative grammar designed beforehand. This model is applied to two different groups: one formed exclusively by robots and another combining robots and a human, so that in the second case learning is supervised by the human.
• A model based on grammatical evolution that makes it possible to study not only syntactic consensus but also questions relating to the genesis of language, using a universal grammar from which the robots can themselves evolve the most appropriate grammar for the linguistic situation they face at each moment.
• A model based on grammatical evolution and reinforcement learning that takes aspects of the previous models and widens the robots' possibilities, allowing them to develop a language that adapts to dynamic linguistic situations that may change over time, and also enabling the imposition of order restrictions, which are very frequent in complex syntactic structures.
All the models involve a decentralized, self-organized approach, so that none of the robots owns the language and all must cooperate and collaborate in a coordinated way to achieve syntactic consensus. In each case, experiments are designed to validate the proposed models, both with regard to the success of language emergence and with regard to important parallel questions such as human-machine interaction or the very genesis of language.
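The first model above couples a probabilistic grammar with reinforcement learning through repeated language games. As an illustration of that general mechanism only, the sketch below reduces it to a bare naming game in Python: agents hold weighted choices among competing word forms and reinforce whichever form a successful interaction used. The words, weights and update rule are assumptions for the sketch, far simpler than the grammatical models the thesis develops.

```python
# A minimal language-game sketch: agents reinforce the word form used in
# a successful interaction, and the population converges on one shared
# form. This is a bare naming game, not the thesis's grammatical models.
import random

class Agent:
    def __init__(self, words):
        self.weights = {w: 1.0 for w in words}

    def speak(self):
        words, w = zip(*self.weights.items())
        return random.choices(words, weights=w)[0]

    def reinforce(self, word, reward=0.5):
        self.weights[word] += reward    # strengthen the successful form

words = ["ba", "du", "ki"]
group = [Agent(words) for _ in range(10)]
for _ in range(2000):
    speaker, hearer = random.sample(group, 2)
    w = speaker.speak()
    if w == hearer.speak():             # success: both chose the same form
        speaker.reinforce(w)
        hearer.reinforce(w)

# after many games the population tends to converge on one shared word
print(sorted(group[0].weights.items(), key=lambda kv: -kv[1])[0])
```

The positive-feedback loop (successful forms become more likely to be chosen, hence more likely to succeed again) is what drives consensus without any robot owning the language.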
Abstract:
This paper suggests a new strategy for developing CAD applications, taking into account some of the most interesting proposals that have recently appeared in the technology development arena. Programming languages, operating systems, user devices, software architecture, user interfaces and user experience are among the elements considered for a new development framework. The strategy addresses the organizational and architectural aspects of the CAD application together with the development framework. The architectural and organizational aspects are based on the programmed design concept, which can be implemented by means of a three-level software architecture: a conceptual level based on a declarative language, a mathematical level based on the geometric formulation of the product model, and a visual level based on the polyhedral representation of the model as required by the graphics card. The development framework considered is Windows 8. This operating system offers three development environments: one for web applications (HTML5 + CSS3 + JavaScript), another for native applications (C/C++), and yet another for .NET applications (C#, VB, F#, etc.). The user interface and user experience for non-web applications are described with XAML (a well-known declarative XML-based language), and the 3D API for games and design applications is DirectX. Additionally, Windows 8 facilitates hybrid solutions in which native and managed code interoperate easily. Some of the most remarkable advantages of this strategy are the possibility of targeting both desktop and touch-screen devices with the same development framework, the use of several programming paradigms so that the most appropriate language can be applied to each domain, and the multilevel segmentation of developers and designers, which facilitates the implementation of an open network of collaborators.
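The three-level organization (conceptual, mathematical, visual) can be caricatured in a few lines. The Python sketch below turns a declarative description into exact geometry and then into a deliberately incomplete triangle mesh; it is a toy illustration of the layering under invented names, not the paper's framework.

```python
# An illustrative reduction of the three-level organization described
# above. A box stands in for a real product model; all names are
# assumptions for the sketch.

# 1. conceptual level: a declarative description of the design
concept = {"part": "box", "width": 2.0, "height": 1.0, "depth": 0.5}

# 2. mathematical level: exact geometric formulation (8 corner points)
def to_geometry(c):
    w, h, d = c["width"], c["height"], c["depth"]
    return [(x, y, z) for x in (0, w) for y in (0, h) for z in (0, d)]

# 3. visual level: polyhedral (triangle) representation for the GPU
def to_triangles(vertices):
    # a real tessellator would emit 12 triangles for a box; for brevity
    # we emit only the two triangles of a single face
    return [(vertices[0], vertices[1], vertices[2]),
            (vertices[1], vertices[3], vertices[2])]

geometry = to_geometry(concept)
mesh = to_triangles(geometry)
print(len(geometry), "vertices ->", len(mesh), "triangles (one face only)")
```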