11 results for community of inquiry model
at Universidad Politécnica de Madrid
Abstract:
A land classification method was designed for the Community of Madrid (CM), whose lands are suitable either for agricultural use or as natural spaces. The process started from an extensive previous CM study that contains sets of land attributes with data for 122 types, together with a minimum-requirements method providing a land quality classification (SQ) for each land. Borrowing some tools from Operations Research (OR) and Decision Science, that SQ has been complemented by an additive valuation method, which analyses a more restricted set of 13 representative attributes using Attribute Valuation Functions to obtain a quality index (QI), and by an original composite method, which uses a fuzzy set procedure to obtain a combined quality index (CQI) that retains relevant information from both the SQ and the QI methods.
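As an illustration of the additive step, a minimal sketch is given below: each raw attribute is mapped to [0, 1] by an attribute valuation function and the results are aggregated with weights. The attribute names, value functions and weights are hypothetical placeholders, not the 13 attributes used in the study.

```python
# Illustrative additive multi-attribute quality index: each attribute value is
# mapped to [0, 1] by an attribute valuation function, then combined with weights.
from typing import Callable, Dict

def additive_quality_index(values: Dict[str, float],
                           vfuncs: Dict[str, Callable[[float], float]],
                           weights: Dict[str, float]) -> float:
    """QI = sum_i w_i * v_i(x_i), with the weights normalised to sum to one."""
    total_w = sum(weights.values())
    return sum(weights[a] * vfuncs[a](values[a]) for a in values) / total_w

# Hypothetical example with two attributes (slope in %, soil depth in cm).
vfuncs = {
    "slope": lambda s: max(0.0, 1.0 - s / 30.0),   # flatter land scores higher
    "depth": lambda d: min(1.0, d / 100.0),        # deeper soil scores higher
}
weights = {"slope": 0.4, "depth": 0.6}
qi = additive_quality_index({"slope": 8.0, "depth": 75.0}, vfuncs, weights)
```

The composite CQI step would then combine the SQ class and the QI score, for instance through fuzzy membership functions; that procedure is specific to the study and is not reproduced here.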
Abstract:
To model strength degradation due to low-cycle fatigue, at least three different approaches can be considered. One possibility is based on the formulation of a new free energy function and damage energy release rate, as proposed by Ju (1989). The second approach uses the notion of a bounding surface, introduced in cyclic plasticity by Dafalias and Popov (1975); from this concept, several models have been proposed to quantify damage in concrete or reinforced concrete (Suaris et al. 1990). The model proposed by the author to include fatigue effects is based essentially on Marigo (1985) and can be included in this approach.
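For orientation, the first approach can be summarized with the standard scalar-damage template below; this is a generic continuum damage mechanics sketch, not the exact formulation of Ju (1989) or Marigo (1985).

```latex
% Generic scalar-damage template (standard continuum damage mechanics notation;
% not the exact model of Ju or Marigo):
\begin{align}
  \psi(\boldsymbol{\varepsilon}, D) &= (1 - D)\,\psi_0(\boldsymbol{\varepsilon}),
  \qquad \psi_0(\boldsymbol{\varepsilon}) = \tfrac{1}{2}\,\boldsymbol{\varepsilon} : \mathbb{C} : \boldsymbol{\varepsilon}, \\
  \boldsymbol{\sigma} &= \frac{\partial \psi}{\partial \boldsymbol{\varepsilon}}
    = (1 - D)\,\mathbb{C} : \boldsymbol{\varepsilon},
  \qquad Y = -\frac{\partial \psi}{\partial D} = \psi_0(\boldsymbol{\varepsilon}), \\
  \dot{D} &= g(Y) \ge 0 \quad \text{while the damage criterion is active,}
\end{align}
```

so that the elastic stiffness, and with it the strength, degrades monotonically as damage accumulates over the loading cycles.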
Abstract:
We consider non-negative solutions of a chemotaxis system with a non-constant chemotaxis sensitivity function X. This system appears as a limit case of a model for morphogenesis proposed by Bollenbach et al. (Phys. Rev. E 75, 2007). Under suitable boundary conditions, modeling the presence of a morphogen source at x = 0, we prove the existence of a global and bounded weak solution using an approximation by problems where diffusion is introduced in the ordinary differential equation. Moreover, we prove the convergence of the solution to the unique steady state provided that ? is small and ? is large enough. Numerical simulations both illustrate these results and give rise to further conjectures on the solution behavior that go beyond the rigorously proved statements.
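A schematic form of such a parabolic-ODE system is sketched below, with χ standing for the sensitivity function denoted X above; the notation, the reaction term g and the treatment of the boundary source are assumptions made only to fix ideas, not the exact equations of the paper.

```latex
% Schematic parabolic--ODE chemotaxis system (illustrative notation only):
\begin{align}
  u_t &= \bigl(u_x - \chi(v)\, u\, v_x \bigr)_x, && x \in (0, L),\; t > 0, \\
  v_t &= g(u, v), && \text{(the ordinary differential equation)},
\end{align}
```

with flux-type boundary conditions at $x = 0$ modeling the morphogen source. The approximating problems mentioned above would then replace the second equation by $v_t = \varepsilon v_{xx} + g(u, v)$ and let $\varepsilon \to 0$.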
Abstract:
This research addressed the development of a consolidated model designed especially to cover the security and usability attributes of a software product. As a starting point, we built a new usability model on the basis of well-known quality standards and models. We then used an existing security model to analyse the relationship between these two approaches. This analysis consisted of a systematic mapping study of the relationship between security and usability as global quality factors. We identified five relationship types: inverse, direct, relative, one-way inverse, and no relationship. Most authors agree that there is an inverse relationship between security and usability. However, this is not a unanimous finding, and this study unveils a number of open questions, like application domain dependency and the need to explore lower-level relationships between attribute subcharacteristics. In order to clarify the questions raised during the research, we conducted a second systematic mapping to further analyse the finer-grained structure of these factors, such as authentication as a subset of security and user efficiency as a subset of usability. The most relevant finding is that efficiency does not depend on the security level during the authentication process. There are other subfactors that require analysis. Accordingly, this research is the first part of a larger project to develop a full-blown consolidated model for security and usability.
Abstract:
It is well known that leading companies all over the world expect university graduates to possess cross-curricular competences as well as technical knowledge. This is the reason why universities have included these competences in every degree curriculum validated since the European Higher Education Area (EHEA) was introduced in the Spanish university context. However, the way in which they have been incorporated has developed without the necessary guidelines to produce a sound model.
Abstract:
This communication presents part of the results of the study DEP2010-19801 / Plan Nacional I+D+i 2010-2013.
Abstract:
The Large Hadron Collider is the world's largest and most powerful particle accelerator, and one of the largest research projects ever undertaken. The project is divided into phases: the first runs from 2009 until 2020, and the second will consist of the implementation of upgrades. One of these upgrades is to increase the collision rate, i.e. the luminosity. This is the main objective of one of the most important projects carrying out the upgrades: the Hi-Lumi LHC project. The luminosity could be increased by using a new material, Nb3Sn, in the superconducting magnets placed at the two main interaction points, instead of the NbTi used at present. Before implementing it, many aspects need to be analysed; one of them is the quality of the induced magnetic field. The tool used so far has been ROXIE, software developed at CERN by S. Russenschuck. One of the main features of the programme is its time-transient analysis, which relies on three magnetization models. These models are quite precise for fields above 1.5 T but not very accurate for lower fields. The aim of this project is therefore to evaluate the implementation of a fourth, more accurate model, the Classical Preisach Model of Hysteresis, in order to better analyse the quality of the field induced in the new material, Nb3Sn.
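To make the last point concrete, a minimal sketch of the classical Preisach model is shown below: the output is a weighted superposition of elementary relay hysterons defined on the half-plane α ≥ β. The grid resolution and the uniform weight density are arbitrary choices made only for illustration; in practice the density would typically be identified from measured magnetization curves.

```python
# Minimal, illustrative classical Preisach model: the output is a weighted sum
# of relay hysterons gamma_{alpha,beta} on the half-plane alpha >= beta.
import numpy as np

class PreisachModel:
    def __init__(self, n=50, h_max=1.0):
        # Discretise the Preisach half-plane alpha >= beta into n*(n+1)/2 relays.
        levels = np.linspace(-h_max, h_max, n)
        alpha, beta = [], []
        for i, a in enumerate(levels):
            for b in levels[:i + 1]:
                alpha.append(a)
                beta.append(b)
        self.alpha = np.array(alpha)
        self.beta = np.array(beta)
        self.state = -np.ones_like(self.alpha)                 # all relays start "down"
        self.weight = np.full_like(self.alpha, 1.0 / self.alpha.size)

    def apply(self, h):
        # Relays switch up when the input exceeds alpha, down when it drops below beta;
        # relays with beta < h < alpha keep their previous state (memory).
        self.state[h >= self.alpha] = 1.0
        self.state[h <= self.beta] = -1.0
        return float(np.dot(self.weight, self.state))

# Sweeping the input up and then down traces a hysteresis loop, branches included.
model = PreisachModel()
loop = [model.apply(h) for h in np.concatenate([np.linspace(-1, 1, 100),
                                                np.linspace(1, -1, 100)])]
```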
Abstract:
In an early paper, Herbert Mohring (J. Polit. Econ. 69, 1961) presented a model for land rent distribution yielding the well-known result that the price of land must fall with distance from the city center to offset transportation costs. Our paper extends Mohring's model by relaxing some of his drastic simplifying assumptions. This extended model has been incorporated in a method for the economic evaluation of city master plans, which has been applied to a Swedish city. In this method, the interdependence among housing, heating, and transportation, the durability of urban structures, and the uncertainty of future demand are explicitly considered within a cost-benefit approach. Some empirical results from this pilot study concerning land rent distributions are also presented here.
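The "well-known result" can be made explicit with a one-line spatial-equilibrium condition; the constant transport cost per unit distance assumed here is only for illustration and is not part of either Mohring's model or the extended one.

```latex
% Minimal spatial-equilibrium illustration, assuming a constant transport cost t:
\begin{equation}
  r(x) + t\,x = r(0)
  \quad\Longrightarrow\quad
  r(x) = r(0) - t\,x ,
\end{equation}
```

so locations are equally attractive only if land rent falls with distance $x$ from the city center at exactly the rate of the additional transportation cost.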
Abstract:
The DNDC (DeNitrification and DeComposition) model was first developed by Li et al. (1992) as a rain event-driven, process-oriented simulation model for nitrous oxide, carbon dioxide and nitrogen gas emissions from agricultural soils in the U.S. Over the last 20 years, the model has been modified and adapted by various research groups around the world to suit specific purposes and circumstances. The Global Research Alliance Modelling Platform (GRAMP) is a UK-led initiative to establish a purposeful and credible web-based platform initially aimed at users of the DNDC model. With the aim of improving predictions of soil C and N cycling in the context of climate change, the objectives of GRAMP are: 1) to document the existing versions of the DNDC model; 2) to create a family tree of the individual DNDC versions; 3) to provide information on model use and development; and 4) to identify strengths, weaknesses and potential improvements for the model.
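As a rough indication of what process-oriented simulation of soil C and N cycling involves, a generic first-order decomposition template of the kind used in such models is shown below; this is an illustrative form, not DNDC's actual equations.

```latex
% Generic first-order decomposition of soil organic carbon pools (illustrative):
\begin{equation}
  \frac{dC_i}{dt} = -\,k_i \, f(T)\, f(W)\, C_i ,
\end{equation}
```

where $C_i$ is the carbon in pool $i$, $k_i$ its specific decomposition rate, and $f(T)$, $f(W)$ dimensionless temperature and moisture reduction factors.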
Abstract:
This doctoral thesis aims to produce a model for an independent, voluntary Hunting Quality Certificate applicable to all types of hunting areas, so that it can later become a methodology used as a valid instrument for measuring Hunting Quality, standardized through a family of standards approved by a standardization body recognized at the national or international level. First, a rigorous and comprehensive justification study was carried out following the methodology proposed by UNE 66172:2003 IN, Guidelines for the justification and development of management system standards (equivalent to the international standard ISO/IEC Guide 72:2001). Next, the Hunting Management parameters common to any hunting area in Spain were identified and developed, and the concept of Hunting Quality was defined. Finally, a model structured into nine Criteria and thirty-four Indicators of Hunting Quality was developed, together with a draft family of standards for Hunting Quality Certification.
Abstract:
PURPOSE The decision-making process plays a key role in organizations. Every decision-making process produces a final choice that may or may not prompt action. Decision makers recurrently face a dichotomous question: follow a traditional sequential decision-making process, where the output of one decision is used as the input of the next stage, or follow a joint decision-making approach, where several decisions are taken simultaneously. The implications of the decision-making process affect different players in the organization, and the choice of approach is difficult to make even with the current literature and practitioners' knowledge. The pursuit of better ways of making decisions has been a common goal for academics and practitioners. Management scientists use different techniques and approaches to improve different types of decisions, the purpose being to use the available resources (data and techniques) as well as possible to achieve the objectives of the organization. Developing and applying models and concepts may help to solve the managerial problems faced every day in different companies. As a result of this research, different decision models are presented to contribute to the body of knowledge of management science. The first models focus on the manufacturing industry and the second group on the health care industry. Although these models are case specific, they serve to show that different approaches to the same problems can provide interesting results. Unfortunately, there is no universal recipe that can be applied to all problems; the same model may deliver good results with certain data and poor results with other data. A framework to analyse the data before selecting the model to be used is therefore presented and tested on the models developed to exemplify these ideas.

METHODOLOGY As the first step of the research, a systematic literature review on joint decision making is presented, together with the opinions and suggestions of different scholars. For the next stage of the thesis, the decision-making processes of more than 50 companies from different sectors were analysed in the production planning area at the job shop level; the data were obtained through surveys and face-to-face interviews. The following part of the research was carried out in two application fields that are highly relevant for our society: manufacturing and health care. The first step was to study the interactions and develop a mathematical model for the replenishment of the car assembly line, combining the vehicle routing problem with inventory management. The next step was to add the car production scheduling (car sequencing) decision and to use metaheuristics such as ant colony optimization and genetic algorithms to assess whether the behaviour holds for problem instances of different sizes. A similar approach is presented for the production of semiconductors and aviation parts, where a hoist has to move from one station to another to process the work and a job schedule has to be produced; for this problem, however, simulation was used for experimentation. In parallel, the scheduling of operating rooms was studied: surgeries were allocated to surgeons and the operating room schedule was analysed. The first part of this work was carried out in a teaching hospital, and for the second part the effect of uncertainty was added. Once the previous problem had been analysed, a general framework to characterize the instance was built. In the final chapter a general conclusion is presented.

FINDINGS AND PRACTICAL IMPLICATIONS The first contribution is an update of the decision-making literature review, together with an analysis of the possible savings resulting from a change in the decision process. The survey results reveal a lack of consistency between what managers believe and the reality of the integration of their decisions. The next stage of the thesis contributes to the body of knowledge of operations research with a joint solution of the replenishment, sequencing and inventory problem in the assembly line, together with parallel work on operating room scheduling in which different solution approaches are presented. Beyond the solution methods based on different techniques, the main contribution is the framework proposed to pre-evaluate a problem before deciding on the techniques to solve it. There is no straightforward answer as to whether joint or sequential solutions are better; following the proposed framework and evaluating factors such as the flexibility of the answer, the number of actors and the tightness of the data gives important hints as to the most suitable way to tackle the problem (see the sketch after this abstract).

RESEARCH LIMITATIONS AND AVENUES FOR FUTURE RESEARCH In the first part of the work it was very difficult to calculate the possible savings of different projects, since many papers do not report these quantities or base the impact on non-quantifiable benefits; another issue is the confidentiality of many projects, whose data cannot be presented. For the car assembly line problem, more computational power would allow bigger instances to be solved. For the operating room problem there was a lack of historical data to perform a parallel analysis in the teaching hospital. To keep testing the decision framework it is necessary to apply it to more case studies in order to generalize the results and make them more evident and less ambiguous. The health care field offers great opportunities: despite recent awareness of the need to improve the decision-making process, there is still much room for improvement. Another big difference with respect to the automotive industry is that the latest improvements are not spread among all the actors. Therefore, future research will focus more on collaboration between academia and the health care sector.
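Purely as an illustration of the kind of pre-evaluation framework described in the findings, the sketch below scores the three factors named in the text (flexibility of the answer, number of actors, tightness of the data) and suggests a joint or sequential approach; the scoring scale, the weights and the threshold are hypothetical.

```python
# Illustrative pre-evaluation of a decision problem: score the factors named in
# the abstract and suggest a joint or sequential solution approach.
from dataclasses import dataclass

@dataclass
class ProblemInstance:
    answer_flexibility: float   # 0 = rigid output required, 1 = very flexible
    num_actors: int             # decision makers / departments involved
    data_tightness: float       # 0 = loose, uncertain data, 1 = tight, reliable data

def suggest_approach(p: ProblemInstance) -> str:
    # More flexibility, fewer actors and tighter data favour solving jointly;
    # the weights and the 0.5 threshold are assumptions made for illustration.
    score = (0.4 * p.answer_flexibility
             + 0.3 * (1.0 / max(p.num_actors, 1))
             + 0.3 * p.data_tightness)
    return "joint" if score >= 0.5 else "sequential"

print(suggest_approach(ProblemInstance(0.8, 2, 0.7)))   # -> "joint"
```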