981 results for Axiomatic formal system
Abstract:
BACKGROUND: The role of the language network in the pathophysiology of formal thought disorder has yet to be elucidated. AIMS: To investigate whether specific grey-matter deficits in schizophrenic formal thought disorder correlate with resting perfusion in the left-sided language network. METHOD: We investigated 13 right-handed patients with schizophrenia and formal thought disorder of varying severity and 13 matched healthy controls, using voxel-based morphometry and magnetic resonance imaging perfusion measurement (arterial spin labelling). RESULTS: We found positive correlations between perfusion and the severity of formal thought disorder in the left frontal and left temporoparietal language areas. We also observed bilateral deficits in grey-matter volume, positively correlated with the severity of thought disorder in temporoparietal areas and other brain regions. The results of the voxel-based morphometry and the arterial spin labelling measurements overlapped in the left posterior superior temporal gyrus and left angular gyrus. CONCLUSIONS: Specific grey-matter deficits may be a risk factor for state-related dysfunctions of the left-sided language system, leading to local hyperperfusion and formal thought disorder.
Abstract:
More than eighteen percent of the world's population lives without reliable access to clean water, forced to walk long distances to get small amounts of contaminated surface water. Carrying heavy loads of water long distances and ingesting contaminated water can lead to long-term health problems and even death. These problems affect the most vulnerable populations, women, children, and the elderly, more than anyone else. Water access is one of the most pressing issues in development today. Boajibu, a small village in Sierra Leone where the author served in the Peace Corps for two years, lacks access to clean water. Construction of a water distribution system was halted when a civil war broke out in 1992 and has not been continued since. The community currently relies on hand-dug and borehole wells that can become dirty during the dry season, which forces people to drink contaminated water or to travel a long distance to collect clean water. This report is intended to provide a design of the system as it was meant to be built. The water system design was completed based on the taps present, interviews with local community leaders, local surveying, and points taken with a GPS. The design is a gravity-fed branched water system supplied by a natural spring on a hill above Boajibu, but the flow rate of the spring is unknown. There has to be enough flow from the spring over a 24-hour period to meet the demands of the users on a daily basis, which is called providing continuous flow. If the spring has less than this amount of flow, the system must provide intermittent flow, that is, flow restricted to a few hours a day. A minimum flow rate of 2.1 liters per second was found to be necessary to provide continuous flow to the users of Boajibu. If this flow is not met, intermittent flow can be provided to the users.
In order to aid the construction of a distribution system in the absence of someone with formal engineering training, a table was created detailing water storage tank sizing based on possible source flow rates. A builder can interpolate using the measured source flow rate to get the tank size from the table. However, any flow rate below 2.1 liters per second cannot be used in the table. In this case, the builder should size the tank so that it can take in the water that will be supplied overnight, as all of the water will be drained during the day because the users demand more than the spring can supply. In the developing world, there is often a problem collecting enough money to fund large infrastructure projects, such as a water distribution system. Often there is only enough money to add one or two loops to a water distribution system, so it is helpful to know where these loops can be most effectively placed. Various possible loops were designated for the Boajibu water distribution system, and the Adaptive Greedy Heuristic Loop Addition Selection Algorithm (AGHLASA) was used to rank the effectiveness of the possible loops to construct. Loop 1, which was furthest upstream, was selected because it benefited the most people for the least cost, while loops further downstream were found to be less effective because they would benefit fewer people. Further studies should be conducted on the water use habits of the people of Boajibu to more accurately predict the demands that will be placed on the system. Further population surveying should also be conducted to predict population change over time so that the appropriate capacity can be built into the system to accommodate future growth. The flow at the spring should be measured using a V-notch weir and the system adjusted accordingly.
Future studies could adjust the loop-ranking method so that two users who may use the water system for different lengths of time are not counted identically, and so that vulnerable users are weighted more heavily than more robust users.
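The tank-sizing rule described above can be sketched as a small program. The 2.1 L/s threshold comes from the report; the overnight and peak-buffer durations below are illustrative assumptions, not design values from Boajibu.

```python
CONTINUOUS_THRESHOLD_LPS = 2.1  # minimum spring flow for continuous service (from the report)

def tank_volume_liters(spring_flow_lps, overnight_hours=12.0, peak_hours=4.0):
    """Size the storage tank for a given spring flow rate.

    At or above the continuous-flow threshold the tank only needs to
    buffer peak demand (peak_hours is an assumption, not a value from
    the report); below the threshold the tank must capture the full
    overnight inflow, since daytime demand drains the system completely.
    """
    if spring_flow_lps >= CONTINUOUS_THRESHOLD_LPS:
        return spring_flow_lps * peak_hours * 3600
    return spring_flow_lps * overnight_hours * 3600

print(tank_volume_liters(2.5))  # continuous service: 36000.0 liters
print(tank_volume_liters(1.0))  # intermittent: 43200.0 liters of overnight inflow
```

A builder interpolating in the report's table would effectively be evaluating the first branch; the second branch is the fallback rule the report prescribes for flows below 2.1 L/s.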
Abstract:
File system security is fundamental to the security of UNIX and Linux systems, since in these systems almost everything is in the form of a file. To protect the system files and other sensitive user files from unauthorized access, certain security schemes are chosen and used by different organizations in their computer systems. A file system security model provides a formal description of a protection system. Each security model is associated with specified security policies which focus on one or more of the security principles: confidentiality, integrity, and availability. The security policy is not only about "who" can access an object, but also about "how" a subject can access an object. To enforce the security policies, each access request is checked against the specified policies to decide whether it is allowed or rejected. The current protection schemes in UNIX/Linux systems focus on access control. Besides the basic access control scheme of the system itself, which includes permission bits, the setuid and seteuid mechanisms, and the root account, there are other protection models, such as Capabilities, Domain Type Enforcement (DTE), and Role-Based Access Control (RBAC), supported and used in certain organizations. These models protect the confidentiality of the data directly; the integrity of the data is protected indirectly, by only allowing trusted users to operate on the objects. The access control decisions of these models depend either on the identity of the user or on the attributes of the processes the user can execute, together with the attributes of the objects. Adoption of these sophisticated models has been slow; this is likely due to the enormous complexity of specifying controls over a large file system and the need for system administrators to learn a new paradigm for file protection. We propose a new security model: the file system firewall.
It adapts the familiar network firewall model, used to control the data that flows between networked computers, to file system protection. This model can support access control decisions based on any system-generated attributes of the access requests, e.g., time of day. Access control decisions are not based on a single entity, such as the account in traditional discretionary access control or the domain name in DTE. In the file system firewall, access decisions are made based on situations involving multiple entities. A situation is programmable with predicates on the attributes of the subject, the object, and the system. The file system firewall specifies the appropriate actions for these situations. We implemented a prototype of the file system firewall on SUSE Linux. Preliminary results of performance tests on the prototype indicate that the runtime overhead is acceptable. We compared the file system firewall with TE in SELinux to show that the firewall model can accommodate many other access control models. Finally, we show the ease of use of the firewall model. When the firewall system is restricted to a specified part of the system, all other resources are unaffected, which enables a relatively smooth adoption. This, together with the model's familiarity to system administrators, should facilitate adoption and correct use. The user study we conducted on traditional UNIX access control, SELinux, and the file system firewall confirmed this: beginner users found the firewall model easier to use and faster to learn than the traditional UNIX access control scheme and SELinux.
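The situation-based rules the abstract describes can be illustrated with a minimal sketch. This is not the authors' prototype (which runs inside SUSE Linux); the attribute names and rules below are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Request:
    user: str   # subject attribute
    path: str   # object attribute
    op: str     # e.g. "read" or "write"
    hour: int   # a system-generated attribute, such as time of day

@dataclass
class Rule:
    predicate: Callable[[Request], bool]  # a "situation": predicates over subject/object/system
    action: str                           # "allow" or "deny"

def decide(rules, req, default="deny"):
    # First-match semantics, as in a network firewall rule chain.
    for rule in rules:
        if rule.predicate(req):
            return rule.action
    return default

rules = [
    Rule(lambda r: r.path.startswith("/etc") and r.op == "write", "deny"),
    Rule(lambda r: 9 <= r.hour < 17, "allow"),  # only during office hours
]

print(decide(rules, Request("alice", "/etc/passwd", "write", 10)))           # deny
print(decide(rules, Request("alice", "/home/alice/notes.txt", "read", 10)))  # allow
print(decide(rules, Request("alice", "/home/alice/notes.txt", "read", 22)))  # deny (default)
```

Note how the second request is allowed by a predicate on a system attribute (time of day), something identity-only models like permission bits cannot express.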
Abstract:
The central question for this paper is how to improve the production process by closing the gap between industrial designers and software engineers of television (TV)-based user interfaces (UIs) in an industrial environment. Software engineers are highly interested in whether one UI design can be converted into several fully functional UIs for TV products with different screen properties. The aim of the software engineers is to apply automatic layout and scaling in order to speed up and improve the production process. However, the question is whether a UI design lends itself to such automatic layout and scaling. This is investigated by analysing a prototype UI design produced by industrial designers. In a first requirements study, industrial designers created meta-annotations on top of their UI design in order to disclose their design rationale for discussions with software engineers. In a second study, five (out of ten) industrial designers assessed the potential of four different meta-annotation approaches. The question was which annotation method industrial designers would prefer and whether it could satisfy the technical requirements of the software engineering process. One main result is that the industrial designers preferred the method they were already familiar with, which therefore seems to be the most effective one, although the main objective of automatic layout and scaling could still not be achieved.
Abstract:
The existing literature suggests that transitions in software-maintenance offshore outsourcing projects are prone to knowledge transfer blockades, i.e. situations in which the activities that would yield effective knowledge transfer do not occur, and that client management involvement is central to overcome them. However, the theoretical understanding of the knowledge transfer blockade is limited, and the reactive management behavior reported in case studies suggests that practitioners may frequently be astonished by the dynamics that may give rise to the blockade. Drawing on recent research from offshore sourcing and reference theories, this study proposes a system dynamics framework that may explain why knowledge transfer blockades emerge and how and why client management can overcome the blockade. The results suggest that blockades may emerge from a vicious circle of weak learning due to cognitive overload of vendor staff and resulting negative ability attributions that result in reduced helping behavior and thus aggravate cognitive load. Client management may avoid these vicious circles by selecting vendor staff with strong prior related experience. Longer phases of coexistence of vendor staff and subject matter experts and high formal and clan controls may also mitigate vicious circles.
Explaining Emergence and Consequences of Specific Formal Controls in IS Outsourcing – A Process-View
Abstract:
IS outsourcing projects often fail to achieve project goals. To inhibit this failure, managers need to design formal controls that are tailored to the specific contextual demands. However, the dynamic and uncertain nature of IS outsourcing projects makes the design of such specific formal controls at the outset of a project challenging. Hence, the process of translating high-level project goals into specific formal controls becomes crucial for success or failure of IS outsourcing projects. Based on a comparative case study of four IS outsourcing projects, our study enhances current understanding of such translation processes and their consequences by developing a process model that explains the success or failure to achieve high-level project goals as an outcome of two unique translation patterns. This novel process-based explanation for how and why IS outsourcing projects succeed or fail has important implications for control theory and IS project escalation literature.
Abstract:
Semi-presidential systems of democratic governance risk ending up in a stalemate when it is not clear which of the two "heads", the head of state or the head of government, shall take the lead. The current political situation in Romania features some of the commonly observed characteristics of such an institutional blockade. However, after addressing these formal aspects of political Romania, the author argues for not forgetting to take into account the informal, actor-related factors. The nature of the Romanian political parties and party system seems to hinder the finding of the consensus needed to exit the self-imposed blockade. More specifically, it is the Democratic Party (PD) that is the key to understanding the recent developments. The government formed on the third of April has yet to prove its efficiency.
Abstract:
The study of operations on representations of objects is well documented in the realm of spatial engineering. However, the mathematical structure and formal proof of these operational phenomena are not thoroughly explored. Other works have often focused on query-based models that seek to order classes and instances of objects in the form of semantic hierarchies or graphs. In some models, nodes of graphs represent objects and are connected by edges that represent different types of coarsening operators. This work, however, studies how the coarsening operator "simplification" can manipulate partitions of finite sets, independent of objects and their attributes. Partitions that are "simplified" first have a collection of elements filtered (removed), and then the remaining partition is amalgamated (some sub-collections are unified). Simplification has many interesting mathematical properties. A finite composition of simplifications can also be accomplished by some single simplification. Also, if one partition is a simplification of another, the simplified partition is defined to be less than the other partition according to the simp relation. This relation is shown to be a partial-order relation based on simplification. Collections of partitions can be proven not only to have a partial-order structure but also to form a complete lattice. In regard to a geographic information system (GIS), partitions related to subsets of attribute domains for objects are called views. Objects belong to different views based on whether or not their attribute values lie in the underlying view domain. Given a particular view, objects whose attribute n-tuple codings are contained in the view are part of the actualization set on views, and objects are labeled according to the particular subset of the view in which their coding lies.
Though the work does not focus mainly on queries related directly to geographic objects, it provides verification for the existence of particular views in a system with this underlying structure. Given a finite attribute domain, one can say with mathematical certainty that different views of objects are partially ordered by simplification, and that every collection of views has a greatest lower bound and least upper bound, which validates exploring queries in this regard.
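Under one plausible reading of the simplification operator (filter elements, then amalgamate blocks), the simp relation can be checked on small partitions as follows; the set-of-sets representation and the function are illustrative assumptions, not the thesis's formalism.

```python
def is_simplification(q, p):
    """Return True if partition q (a list of disjoint sets) can be obtained
    from partition p by filtering out elements and then amalgamating
    (unioning) some of the remaining blocks."""
    support = set().union(*q) if q else set()
    # Step 1 (filter): restrict every block of p to the surviving elements.
    restricted = [b & support for b in p if b & support]
    covered = set().union(*restricted) if restricted else set()
    if covered != support:
        return False  # q mentions elements that p never had
    # Step 2 (amalgamate): each restricted p-block must land inside a
    # single q-block; merging blocks is allowed, splitting them is not.
    return all(any(rb <= qb for qb in q) for rb in restricted)

P = [{1, 2}, {3}, {4, 5}]
Q = [{1, 3}, {4}]  # drop 2 and 5, then merge the blocks containing 1 and 3
print(is_simplification(Q, P))                        # True
print(is_simplification([{1, 2}, {3}], [{1, 2, 3}]))  # False: splits a block
```

The second call fails because splitting a block is not expressible as filter-then-amalgamate, which is what makes simp a genuine coarsening (partial-order) relation rather than an arbitrary rearrangement.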
Abstract:
Institutional Review Boards (IRBs) are the primary gatekeepers for the protection of ethical standards of federally regulated research on human subjects in this country. This paper focuses on the general, broad measures that may be instituted or enhanced to exemplify a "model IRB". This is done by examining the current regulatory standards of federally regulated IRBs, not private or commercial boards, and how many of those standards have been found either inadequate or not generally understood or followed. The analysis includes suggestions on how to bring about changes in order to make the IRB process more efficient, less subject to litigation, and supported by standardized educational protocols for members. The paper also considers how to include better oversight for multi-center research, increased centralization of IRBs, utilization of Data Safety Monitoring Boards when necessary, payment for research protocol review, voluntary accreditation, and the institution of evaluation/quality assurance programs. This is a policy study utilizing secondary analysis of publicly available data. Therefore, the research for this paper draws on scholarly medical/legal journals, web information from the Department of Health and Human Services, the Food and Drug Administration, and the Office of the Inspector General, accreditation programs, law review articles, and current regulations applicable to the relevant portions of the paper. Two issues are consistently cited by the literature as major concerns. One is the need for basic, standardized educational requirements across all IRBs and their members; the second is much stricter and more informed management of continuing research. There is no federally regulated formal education system currently in place for IRB members, except for certain NIH-based trials. Also, IRBs are not keeping up with research once a study has begun, and although regulated to do so, this does not appear to be a great priority.
This is the area most in danger of increased litigation. Other issues, such as voluntary accreditation and outcomes evaluation, are slowly gaining steam as the processes become more available and more sought after, as with JCAHO accreditation of hospitals. Adopting the principles discussed in this paper should promote better use of a local IRB's time, money, and expertise for protecting the vulnerable population in its care. Without further improvements to the system, there is concern that private and commercial IRBs will attempt to create a monopoly on much of the clinical research in the future, as they are not as heavily regulated and can therefore offer companies quicker and more convenient reviews. IRBs need to consider the advantages of charging for their unique and important services as a cost of doing business. More importantly, there must be a minimum standard of education for all IRB members in the area of the ethical standards of human research, and a greater emphasis placed on the follow-up of ongoing research, as this is the most critical time for study participants and may soon become the largest area for litigation. Additionally, there should be a centralized IRB for multi-site trials, or a study website with important information affecting the trial in real time. Standards and metrics need to be developed to assess the performance of IRBs for quality assurance and outcome evaluations. The boards should not be content to run the business of human subjects research without determining how well that function is actually being carried out. It is important that federally regulated IRBs provide excellence in human research and promote those values most important to the public at large.
Abstract:
One of the advantages of social networks is the possibility to socialize and personalize the content created or shared by the users. In mobile social networks, where the devices have limited capabilities in terms of screen size and computing power, multimedia recommender systems help present the most relevant content to the users, depending on their tastes, relationships, and profile. Previous recommender systems are not able to cope with the uncertainty of automated tagging and are knowledge-domain dependent. In addition, a recommender instantiated in this domain must cope with the problems inherent to collaborative filtering (cold start, the banana problem, the large number of users needed, etc.). The solution presented in this paper addresses the abovementioned problems by proposing a hybrid image recommender system which combines collaborative filtering (social techniques) with content-based techniques, leaving the user free to assign a personal weight to each of these processes. It takes into account aesthetics and the formal characteristics of the images to overcome the problems of current techniques, improving the performance of existing systems to create a mobile social network recommender with a high degree of adaptation to any kind of user.
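The user-weighted blend of collaborative and content-based scores described above can be sketched in a few lines; the scoring function and all numbers are illustrative assumptions, not the paper's actual model.

```python
def hybrid_score(cf_score, cb_score, user_weight):
    """Blend a collaborative-filtering score with a content-based score.

    user_weight = 1.0 means purely social (collaborative) ranking,
    user_weight = 0.0 means purely content-based (aesthetic/formal
    features of the image).  The user chooses this weight.
    """
    if not 0.0 <= user_weight <= 1.0:
        raise ValueError("user_weight must lie in [0, 1]")
    return user_weight * cf_score + (1.0 - user_weight) * cb_score

# A cold-start user has no rating history, so the collaborative score is
# uninformative (0.0 here); weighting toward content-based features keeps
# the recommendation useful despite the cold start.
print(hybrid_score(0.0, 0.8, user_weight=0.2))  # roughly 0.64, content-dominated
```

Letting the user pick the weight is one simple way to sidestep the cold-start problem the abstract mentions: new users can lean on content features until enough social signal accumulates.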
Abstract:
The conceptual design phase is partially supported by product lifecycle management/computer-aided design (PLM/CAD) systems, causing discontinuity in the design information flow: customer needs → functional requirements → key characteristics → design parameters (DPs) → geometric DPs. To address this issue, a knowledge-based approach is proposed to integrate quality function deployment, failure mode and effects analysis, and axiomatic design into a commercial PLM/CAD system. A case study, the main subject of this article, was carried out to validate the proposed process; to evaluate, through a pilot development, how the commercial PLM/CAD modules and application programming interface could support the information flow; and, based on the pilot results, to propose a full development framework.
Abstract:
Considering that there is nothing left untouched by fashion, and going beyond the already exhausted discussion about formal intersections, this research introduces the catwalk as the real arena of mediation between fashion and architecture. By assuming this condition, the catwalk embodies new modes of production that appropriate its space and structure, turning it into a machine for generating multiple, if not infinite, meanings. Fashion, as a creative project, has used the catwalk as a frame for rearranging its visual narrative and renewing itself as a social phenomenon. This research argues, however, that the current typologies of catwalks do not facilitate the understanding of a collection, which is usually their primary goal; instead, they present an environment composed of multi-layered visual formats, becoming a complex construct that collides space, time, and action in the creation of other territories. Departing from the analysis of the catwalk as a system whose many variables can produce diverse combinations, this research presents the hypothesis that a new catwalk system is being formed, built entirely out of layers of information. Such a scenario would lead to fashion's final immersion into the fabrics of virtuality. While the debate about the relevance of fashion shows has become more evident today, this research speculates on how architectural thinking can introduce methodologies of analysis within the framework of fashion shows, proposing a reading of the catwalk as a system of specific procedures inherent to architectural projects and processes. This approach intertwines both practices in a common territory where space, design, behaviour, movement, and bodies are organized in the creation of new visual possibilities, and where interactions trigger the generation of novelty and messages. KEYWORDS: fashion, system, virtual, information, architecture
Abstract:
Concurrent data types are concurrent implementations of classical data abstractions, specifically designed to exploit the great deal of parallelism available in modern multiprocessor and multi-core architectures. The correct manipulation of concurrent data types is essential for the overall correctness of the software systems built using them. A major difficulty in designing and verifying concurrent data types arises from the need to reason about any number of threads invoking the data type simultaneously, which requires considering parametrized systems. In this work we study the formal verification of temporal properties of parametrized concurrent systems, with a special focus on programs that manipulate concurrent data structures. The main difficulty in reasoning about concurrent parametrized systems comes from the combination of their inherently high concurrency and the manipulation of dynamic memory. This parametrized verification problem is very challenging, because it requires reasoning about complex concurrent data structures being accessed and modified by an unbounded number of threads which simultaneously manipulate the heap using unstructured synchronization methods. We present a formal framework based on deductive methods which is capable of verifying safety and liveness properties of concurrent parametrized systems that manipulate complex data structures. Our framework includes proof rules and techniques specially adapted for parametrized systems, which work in collaboration with decision procedures specially designed for complex concurrent data structures. A novel aspect of our framework is that it cleanly separates the analysis of the program control flow from the analysis of the data being manipulated. The program control flow is analyzed using deductive proof rules and verification techniques specifically designed for parametrized systems. Starting from a concurrent program and a temporal specification, our techniques generate a finite collection of verification conditions whose validity entails the satisfaction of the temporal specification by any system, regardless of the number of threads. The generated verification conditions correspond to the data being manipulated. We study the design of specialized decision procedures to discharge these verification conditions fully automatically. We investigate decidable theories capable of describing rich properties of complex pointer-based data types such as stacks, queues, lists, and skiplists. For each of these theories we present a decision procedure and a practical implementation built on top of existing SMT solvers. These decision procedures are ultimately used to automatically verify the verification conditions generated by our parametrized verification techniques. Finally, we show how our framework can be used to prove not only safety but also liveness properties of concurrent versions of some mutual exclusion protocols and of programs that manipulate concurrent data structures. The approach presented in this work is very general and can be applied to verify a wide range of similar concurrent data types.
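As a toy illustration of what a property quantified over "any number of processes" looks like, the sketch below checks an invented mutual-exclusion condition on bounded instances. This is only a bounded sanity check, invented for illustration; the deductive framework described above instead discharges such verification conditions once, for all n, via decision procedures.

```python
def vc_holds(n_procs):
    """Check, on a single instance with n_procs processes, the invented
    condition: in a token-based scheme where at most one process (or
    none) holds the token, no state has two processes in the critical
    section.  States are enumerated exhaustively for this instance."""
    for holder in [None, *range(n_procs)]:     # token held by one process or none
        in_critical = [p == holder for p in range(n_procs)]
        if sum(in_critical) > 1:               # mutual exclusion violated
            return False
    return True

# Bounded check over small instances only; a parametrized proof must
# cover every n at once, which is what the deductive rules provide.
print(all(vc_holds(n) for n in range(1, 6)))  # True
```

The gap between this bounded check and a proof for arbitrary n is exactly the parametrized verification problem the thesis addresses.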
Resumo:
This thesis studies the Romanesque rose windows of the city of Zamora. The topic was chosen in order to deepen the knowledge of these elements, since the existing information about them is very scarce. The rose windows are analysed from a global perspective covering geographic, morphological, functional, compositional, constructive, geometric, ornamental and other aspects. For the development of this research it was also considered necessary to study historical, stylistic, symbolic, religious and cultural issues, among others, which provide the contextual framework that allows a better understanding of the windows. The study of each rose window made it possible to implement and develop an analytical working method based on the particular study of a series of aspects such as those mentioned above, as well as to devise a strategy for the graphic reconstitution of the rose windows based on a system of modules that makes it possible to work according to the proportions of the elements; this allows the real object to be represented with great accuracy when measurements are lacking. Among other things, this research has led to the conclusion that the definition of "circular window" commonly attributed to Romanesque rose windows is not accurate, since the function they fulfil in the religious building is rather symbolic.
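The module-based reconstitution strategy described above can be sketched as follows: one measurable element is taken as the module, every other dimension is recorded as a multiple of it, and the whole drawing is rescaled once a single real measurement becomes available. The proportions and names below are hypothetical illustrations, not values from the thesis.

```python
# Hypothetical module-based reconstitution: each dimension of a rose window
# is stored as a multiple of a base module, so the full set of dimensions
# can be derived from one known measurement. Ratios are illustrative only.
PROPORTIONS = {
    "outer_diameter": 12.0,   # 12 modules
    "oculus_diameter": 4.0,   # 4 modules
    "moulding_width": 1.0,    # 1 module
}

def reconstitute(known_element, known_measure_cm):
    """Derive all dimensions (in cm) from a single measured element."""
    module = known_measure_cm / PROPORTIONS[known_element]
    return {name: ratio * module for name, ratio in PROPORTIONS.items()}

# One real measurement suffices: an outer diameter of 360 cm.
dims = reconstitute("outer_diameter", 360.0)
assert dims["oculus_diameter"] == 120.0
assert dims["moulding_width"] == 30.0
```

The design choice is that proportions, not absolute sizes, carry the geometric information, which is what allows an accurate drawing when the monument cannot be measured directly.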
Resumo:
This paper presents a model of a control system for robot systems, inspired by the functionality and organisation of the human neuroregulatory system. Our model was specified using software agents within a formal framework and implemented through Web Services. This approach makes it relatively easy to implement the control logic of a robot system incrementally, adding new control centres to the system as its behaviour is observed or needs to be specified in greater detail, without modifying existing functionality. The tests performed verify that the proposed model exhibits the general characteristics of biological systems together with desirable software qualities such as robustness, flexibility, reuse and decoupling.
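The incremental-extension property described above can be sketched as a registry of independent control centres: new centres are registered alongside existing ones, and none of the existing code changes when a centre is added. The class and names below are hypothetical; the paper's actual agent and Web Service interfaces are not shown here.

```python
class ControlSystem:
    """Sketch of an incrementally extensible control system: control
    centres are callables registered at runtime (hypothetical design,
    illustrating the paper's idea rather than its implementation)."""
    def __init__(self):
        self._centres = []

    def add_centre(self, centre):
        # Registering a centre never touches previously registered ones.
        self._centres.append(centre)

    def process(self, stimulus):
        # Each centre independently contributes a (possibly empty) response.
        return [r for c in self._centres if (r := c(stimulus)) is not None]

# Existing behaviour is preserved when a new centre refines the response.
cs = ControlSystem()
cs.add_centre(lambda s: "avoid" if s == "obstacle" else None)
assert cs.process("obstacle") == ["avoid"]
cs.add_centre(lambda s: "slow" if s == "obstacle" else None)  # later refinement
assert cs.process("obstacle") == ["avoid", "slow"]
```

The decoupling claimed in the abstract corresponds here to each centre depending only on the stimulus it receives, never on the other centres.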